Running multiple Spark Executors per Spark Worker node
By default, in standalone mode each Worker launches a single Executor JVM per application, and that Executor uses all cores available on the node; tasks then run as threads inside the Executor JVM, at most one per core. If you want multiple Executors per Worker node, set "spark.executor.cores" in your spark-defaults.conf file to cap the cores each Executor may use. The Worker can then launch multiple Executors for the same application: roughly floor(worker cores / spark.executor.cores) of them. For example, on an 8-core Worker, spark.executor.cores=2 allows up to 4 Executors per Worker.
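As a sketch, assuming a standalone cluster whose Workers each expose 8 cores (the core and memory figures below are illustrative, not recommendations), the relevant spark-defaults.conf entries would look like:

```
# spark-defaults.conf (sketch; values are illustrative)

# Cap each Executor at 2 cores. On an 8-core Worker, the standalone
# master can then schedule up to 4 Executors per Worker per application.
spark.executor.cores    2

# Also cap memory per Executor so several Executors fit within the
# Worker's memory budget (spark.executor.memory defaults to 1g).
spark.executor.memory   2g
```

Note that the same settings can be passed per application on the command line, e.g. `spark-submit --conf spark.executor.cores=2 ...`, which overrides spark-defaults.conf.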
Here's a bit more context: