
I am trying to load application.conf in spark-shell using typesafe-config.

Following is the code and spark-shell command:

scala code:

import com.typesafe.config.{Config, ConfigFactory}

val config: Config = ConfigFactory.load("application.conf")
val env = config.getString("key.key1")

spark-shell commands:

  • spark-shell --jars config-1.3.4.jar --files application.conf --driver-java-options "-Dconfig.file=application.conf" (attempt 1)
  • spark-shell --jars config-1.3.4.jar (attempt 2)

application.conf:

key {

 key1 = "value1"

}

error:

com.typesafe.config.ConfigException$Missing: system properties: No configuration setting found for key 'key'
Atish
  • Does this answer your question? [Submit an application property file with Spark typesafe config](https://stackoverflow.com/questions/53290715/submit-an-application-property-file-with-spark-typesafe-config) – Florian Corzilius Mar 25 '20 at 05:37

2 Answers


Where is your conf file located? Did you try using the full path from the root? E.g.:

val config: Config = ConfigFactory.load("/<root-path>/application.conf")
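The distinction matters here: `ConfigFactory.load` resolves a bare resource name against the classpath, while `ConfigFactory.parseFile` reads from the filesystem. A minimal sketch of the filesystem route (the temporary file below is a stand-in for your application.conf; the path is not from the original question):

```scala
import com.typesafe.config.{Config, ConfigFactory}
import java.io.File
import java.nio.file.Files

// Write a sample config to a temp file (stand-in for your application.conf).
val confFile: File = File.createTempFile("application", ".conf")
Files.write(confFile.toPath, "key { key1 = \"value1\" }".getBytes("UTF-8"))

// parseFile reads from the filesystem, not the classpath, so no
// classpath entry for the file is required.
val config: Config = ConfigFactory.parseFile(confFile)
val env = config.getString("key.key1")
```

With `ConfigFactory.load("application.conf")`, by contrast, the file must be on the driver's classpath, which is why a file merely sitting in the launch directory is not found.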

You can ship the file to the driver and each executor with --files "application.conf" and read it with ConfigFactory.parseFile().

The local path of a file passed via --files is resolved with SparkFiles.get:

import com.typesafe.config.ConfigFactory
import java.io.File
import org.apache.spark.SparkFiles

val config = ConfigFactory.parseFile(new File(SparkFiles.get("application.conf")))

This should load the config you want.
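For completeness, one way to launch the shell so the file is distributed alongside the config library (the jar name and relative paths are assumptions; adjust them to your environment):

```shell
# Sketch only: jar version and file locations are assumed, not from the question.
spark-shell \
  --jars config-1.3.4.jar \
  --files application.conf
```

Inside the shell, SparkFiles.get("application.conf") then returns the node-local copy that Spark staged, which is the path parseFile should receive.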

koiralo
  • Thanks, @Shankar. This is resolving the application.conf path to a /tmp..... location in HDFS which doesn't exist. It returns an error "path does not exist" – Atish Jul 29 '19 at 05:23