Error: ClassNotFoundException: org.apache.spark.SparkConf
If this exception is thrown when the application is run directly (for example, from an IDE or with java -jar) rather than through spark-submit, check the dependency scope of Spark in pom.xml: changing the scope from "provided" to "compile" puts the Spark classes on the runtime classpath and resolves the error.
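A minimal pom.xml fragment illustrating the change (the artifactId and version below are placeholders; use whichever Spark artifact and version the project actually depends on):

    <!-- Spark core; "compile" scope keeps it on the runtime classpath -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.1.0</version>
        <!-- was: <scope>provided</scope> -->
        <scope>compile</scope>
    </dependency>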
"provided" vs "compile" in Maven
- compile is the default scope; compile dependencies are available on all classpaths of the project and are propagated to dependent projects.
- provided dependencies are available only on the compile and test classpaths, are not transitive, and are expected to be supplied at runtime by the JDK or a container.
- provided dependencies are not packaged into the final artifact, as the example below shows.
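Conversely, when a job is launched with spark-submit, the cluster supplies the Spark jars at runtime, and "provided" is the right choice precisely because such dependencies are left out of the packaged artifact. A sketch using the same placeholder coordinates as above:

    <!-- Spark is supplied by the cluster at runtime via spark-submit,
         so it is compiled against but excluded from the packaged jar -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.1.0</version>
        <scope>provided</scope>
    </dependency>

When building a fat jar (for example, with the shade or assembly plugin), the difference is easy to verify: with "provided", no org/apache/spark classes end up inside the packaged jar.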
Reference:
http://stackoverflow.com/questions/6646959/difference-between-maven-scope-compile-and-provided-for-jar-packaging