
I wrote this:

import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

object ProcessingApp extends App {
  // Run Spark locally, inside the IDE's own JVM process
  val sparkConf = new SparkConf()
    .setAppName("er")
    .setMaster("local")
  val sparkSession: SparkSession = SparkSession.builder().config(sparkConf).getOrCreate()

  // Sanity check: print the Spark version
  val test = sparkSession.version

  println(test)

}

I want to run it locally in my IntelliJ IDE by right-clicking and choosing "Run ProcessingApp", but this doesn't work. I made my Spark dependencies not provided at the build.sbt level. I am getting this error:

Caused by: java.lang.ClassNotFoundException: org.apache.spark.sql.SparkSession
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass
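
For reference, the Spark dependencies in build.sbt are declared roughly like this (a sketch; the artifact versions here are placeholders, not necessarily the exact ones):

// build.sbt (sketch) — versions are placeholders
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.4.3",
  "org.apache.spark" %% "spark-sql"  % "2.4.3"
)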
  • Can you open `edit configurations` and click on `Include dependencies with "Provided" scope` like [here](https://i.stack.imgur.com/fcWVU.png)? – Krzysztof Atłasik Jul 22 '19 at 15:43
  • Possible duplicate of [IntelliJ: Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/sql/types/DataType](https://stackoverflow.com/questions/55753141/intellij-exception-in-thread-main-java-lang-noclassdeffounderror-org-apache) – Krzysztof Atłasik Jul 22 '19 at 15:44

2 Answers


Change the scope of all the Spark dependencies from `provided` to `compile`. Dependencies marked `provided` are available at compile time but are not put on the runtime classpath when you run the app from the IDE, which is why the class cannot be found.
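
A minimal sketch of the change in build.sbt, assuming the scope was set with sbt's `% "provided"` qualifier (artifact versions are placeholders):

// before: "provided" jars are compiled against but left off the runtime classpath
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.3" % "provided"

// after: dropping the qualifier gives the default compile scope
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.3"

If the jar will later be deployed with spark-submit, an alternative is to keep `provided` and tick the `Include dependencies with "Provided" scope` run-configuration option mentioned in the comments.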

Chitral Verma

Try right-clicking on the jar file in the target directory and running it. If the dependencies are included in your jar, it should pick them up.
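
One way to get such a self-contained jar is the sbt-assembly plugin; a minimal sketch, assuming sbt-assembly is used (the plugin version is an assumption):

// project/plugins.sbt (sketch) — plugin version is an assumption
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.9")

Running `sbt assembly` then writes a jar under target/ that bundles all compile-scope dependencies, including Spark.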

Chida