
While trying to submit a Spark job from Eclipse to a remote Hadoop (YARN) cluster, I get the following error:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/Logging
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at org.apache.spark.deploy.yarn.ExecutorLauncher$.main(ApplicationMaster.scala:674)
    at org.apache.spark.deploy.yarn.ExecutorLauncher.main(ApplicationMaster.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.Logging
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 14 more

Please help me with this.

  • Start with sharing code. – mtoto Mar 24 '17 at 09:37
  • You are missing some dependency jars. You can check this link: http://stackoverflow.com/questions/40287289/java-lang-noclassdeffounderror-org-apache-spark-logging – Deepan Ram Mar 24 '17 at 10:25
  • Sorry, I forgot to include the Spark version I'm using. I'm using Spark 1.6.0 on the production cluster, and I can't change the version to 1.5.2. – shiva k Mar 24 '17 at 12:31
  • Thanks for your time, mtoto and Deepan Ram. The problem is solved by downloading the spark-assembly jar, saving it onto the HDFS cluster, and adding this property in the Eclipse code: set("spark.yarn.jar", "hdfs://namenode:8020/ – shiva k Mar 27 '17 at 06:34

1 Answer


The problem is solved by downloading the spark-assembly jar, saving it onto the HDFS cluster, and adding this property in the Eclipse code: set("spark.yarn.jar", "hdfs://namenode:8020/
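For reference, a minimal sketch of what that looks like in Spark 1.6 driver code. The assembly jar file name, the HDFS directory, and the application name below are assumptions; use the path where you actually uploaded the jar:

    import org.apache.spark.{SparkConf, SparkContext}

    object RemoteYarnSubmit {
      def main(args: Array[String]): Unit = {
        // Upload the assembly jar to HDFS once beforehand, e.g.:
        //   hdfs dfs -put spark-assembly-1.6.0-hadoop2.6.0.jar /spark/
        // (jar name and target directory are assumptions)
        val conf = new SparkConf()
          .setAppName("RemoteYarnSubmit")   // hypothetical app name
          .setMaster("yarn-client")         // Spark 1.6 syntax for YARN client mode
          // Point YARN at the Spark assembly on HDFS so the ApplicationMaster's
          // classpath contains org.apache.spark.Logging and the other Spark classes
          .set("spark.yarn.jar",
               "hdfs://namenode:8020/spark/spark-assembly-1.6.0-hadoop2.6.0.jar")

        val sc = new SparkContext(conf)
        // ... job logic ...
        sc.stop()
      }
    }

Without spark.yarn.jar (or a matching assembly already on the cluster), the ExecutorLauncher runs against whatever Spark jars YARN finds, and a version mismatch produces exactly this NoClassDefFoundError, since org.apache.spark.Logging was removed from the public API after Spark 1.x.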
