Must be used before SparkInterpreter (%spark) initialized
Chris Fregly, July 07, 2015 01:58

This error is most likely caused by trying to add a dependency with %dep, such as the following:

%dep
z.addRepo("maven central").url("search.maven.org")
z.load("com.datastax.spark:spark-cassandra-connector_2.10:1.4.0-M1")
z.load("org.elasticsearch:elasticsearch-spark_2.10:2.1.0")

after you've already run a %spark paragraph. Note that %spark is the default interpreter, so even running an empty paragraph with no code at all counts as running %spark.

The workaround is to restart the Spark interpreter. Note that when you do this, you'll lose any notebook-scoped variables as well as interpreter-wide temp tables, etc.
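After restarting the Spark interpreter, run the %dep paragraph before any %spark paragraph. A minimal sketch of the correct ordering, using the same repository and artifact as above (the z.reset() call and the import in the second paragraph are illustrative assumptions, not part of the original post):

%dep
// Paragraph 1: run this FIRST, before any %spark paragraph executes.
z.reset()  // clears any previously loaded dependencies
z.addRepo("maven central").url("search.maven.org")
z.load("com.datastax.spark:spark-cassandra-connector_2.10:1.4.0-M1")

%spark
// Paragraph 2: only now run Spark code; the dependency is on the classpath.
import com.datastax.spark.connector._

Keeping all dependency loading in a single %dep paragraph at the top of the notebook makes it easy to re-run in the right order after an interpreter restart.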