Spark Configurator


Welcome to the Spark cluster configurator! Given your hardware capacity, this tool recommends the most appropriate settings to maximize cluster efficiency and avoid memory issues.

The configurator assumes that your cluster runs under a resource manager such as YARN.

The recommended parameters can be passed to Spark jobs, for example:
$ spark-submit --class <main-class> --num-executors ? --executor-cores ? --executor-memory ? <application-jar>
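The exact heuristics the configurator applies are not stated here, but a minimal sketch of the kind of calculation involved is shown below. It follows a widely used rule of thumb: reserve some cores and memory on each node for the OS, cap executors at about 5 cores each for good HDFS throughput, leave one executor's worth of resources for the YARN application master, and subtract roughly 7% of executor memory for YARN overhead. The function name and default values are illustrative assumptions, not the tool's actual implementation.

```python
def recommend(nodes, cores_per_node, mem_per_node_gb,
              cores_per_executor=5,      # common rule of thumb for HDFS throughput
              os_reserved_cores=1,       # left free per node for OS/Hadoop daemons
              os_reserved_mem_gb=1,      # memory left free per node for the OS
              overhead_fraction=0.07):   # approximate YARN memory overhead
    """Suggest (--num-executors, --executor-cores, --executor-memory in GB)
    for a uniform cluster. All defaults are illustrative assumptions."""
    usable_cores = cores_per_node - os_reserved_cores
    usable_mem_gb = mem_per_node_gb - os_reserved_mem_gb
    executors_per_node = usable_cores // cores_per_executor
    # Reserve one executor slot cluster-wide for the YARN application master.
    num_executors = executors_per_node * nodes - 1
    # Split each node's usable memory across its executors, then subtract
    # the off-heap overhead YARN adds on top of --executor-memory.
    mem_per_executor_gb = usable_mem_gb // executors_per_node
    executor_mem_gb = int(mem_per_executor_gb * (1 - overhead_fraction))
    return num_executors, cores_per_executor, executor_mem_gb

# Example: 10 nodes, 16 cores and 64 GB RAM each.
n, c, m = recommend(nodes=10, cores_per_node=16, mem_per_node_gb=64)
print(f"spark-submit --num-executors {n} --executor-cores {c} --executor-memory {m}G")
```

With the example hardware above, this yields 29 executors with 5 cores and 19 GB each: 15 usable cores per node give 3 executors per node, 30 total minus one for the application master, and 63 GB split three ways minus ~7% overhead.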