LIVY-339

Unable to start Spark session if spark.jars.packages is set in the Spark config.


Details

    • Type: Bug
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: 0.4.0
    • Fix Version/s: 0.9.0
    • Component/s: Core, REPL
    • Labels: None
    • Environment: Spark 2.1 running on Hadoop 2.8

    Description

      I have added the following configuration to spark-defaults.conf:

       spark.jars.packages com.amazonaws:aws-java-sdk:1.11.115,org.apache.hadoop:hadoop-aws:2.8.0
      

      If I start spark-shell with this configuration, it loads these jars and starts without any issue.
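
      To confirm from the Spark side that the packages are actually resolved, a quick check can be run from a shell started against the same spark-defaults.conf. A minimal sketch in PySpark (the s3a path below is a placeholder, not part of the original report):

        # Minimal sketch, assuming a pyspark shell launched with the same
        # spark-defaults.conf; the s3a path is a placeholder.
        from pyspark.sql import SparkSession

        spark = SparkSession.builder.getOrCreate()

        # The configured coordinates should be visible in the session conf.
        print(spark.conf.get("spark.jars.packages"))

        # If hadoop-aws/aws-java-sdk were resolved, an s3a read should at worst
        # fail with a credentials error, not a ClassNotFoundException for
        # org.apache.hadoop.fs.s3a.S3AFileSystem.
        spark.read.text("s3a://some-bucket/some-key").show(1)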

      But when I use Jupyter with Livy, I get the error shown in the attached Error.png.
      The same Jupyter + Livy + Spark combination works fine with the default Spark configuration, as shown in the attached Success.png.
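
      To rule out the Jupyter/sparkmagic layer, the same session can also be created directly against Livy's REST API with the packages passed in the session conf; whether it fails the same way should narrow down where the problem lies. A minimal sketch (the Livy host/port and the session kind are assumptions):

        # Minimal sketch: create a Livy session over REST with the same packages
        # in its Spark conf. Host/port and "kind" are assumptions.
        import json
        import time

        import requests

        LIVY = "http://livy-host:8998"
        payload = {
            "kind": "spark",
            "conf": {
                "spark.jars.packages":
                    "com.amazonaws:aws-java-sdk:1.11.115,"
                    "org.apache.hadoop:hadoop-aws:2.8.0",
            },
        }
        resp = requests.post(LIVY + "/sessions", data=json.dumps(payload),
                             headers={"Content-Type": "application/json"})
        session = resp.json()

        # Poll until the session leaves the startup states; if it dies, the
        # driver output is available under GET /sessions/{id}/log.
        while session["state"] in ("not_started", "starting"):
            time.sleep(5)
            session = requests.get(LIVY + "/sessions/%d" % session["id"]).json()
        print(session["id"], session["state"])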

      Attachments

        1. Error.png (91 kB, uploaded by Alex Bozarth)
        2. Success.png (71 kB, uploaded by Alex Bozarth)


          People

            Assignee: Unassigned
            Reporter: Chetan Kumar Bhatt (ckb)