Running multiple Spark Executors per Spark Worker node

https://spark.apache.org/docs/latest/configuration.html
 
add "spark.executor.cores=2" to your spark-defaults.conf file if you want 2 Executors per Worker node JVM. 
 
You want to run only one start-slave.sh (a single Worker JVM) per node. That Worker then launches Executors as separate JVM processes, and each Executor runs its tasks as threads - at most one task per core assigned to that Executor.
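Roughly, the flow looks like this (flag names can vary slightly between Spark versions, and master-host, the core/memory sizes, and my_app.jar are placeholders):

    # run once per Worker node: a single Worker JVM owning all 8 cores
    ./sbin/start-slave.sh spark://master-host:7077 --cores 8 --memory 16g

    # submit with a per-Executor core cap; the Worker then spawns separate
    # Executor JVMs (up to 4 here), each running at most 2 task threads
    ./bin/spark-submit --master spark://master-host:7077 \
      --conf spark.executor.cores=2 \
      --conf spark.executor.memory=2g \
      my_app.jar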
 
Here's a bit more context:
 
http://stackoverflow.com/questions/29955133/spark-standalone-cluster-create-multiple-executors-per-worker-node
 
https://issues.apache.org/jira/browse/SPARK-1706
 
 