Livy / LIVY-339

Unable to start Spark session if spark.jars.packages is set in Spark config.


    Details

    • Type: Bug
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: 0.4.0
    • Fix Version/s: None
    • Component/s: Core, REPL
    • Labels:
      None
    • Environment:
      Spark 2.1 running on Hadoop 2.8

      Description

I have added the following configuration to spark-defaults.conf:

       spark.jars.packages com.amazonaws:aws-java-sdk:1.11.115,org.apache.hadoop:hadoop-aws:2.8.0
      

      If I start spark-shell with this configuration, it loads these jars and starts without any issue.

      But when I use Jupyter with Livy, I get the error shown in the attached Error.png.
      The same Jupyter + Livy + Spark combination works fine with the default Spark configuration, as shown in Success.png.
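
      A possible diagnostic step (not from the original report): instead of setting spark.jars.packages globally in spark-defaults.conf, the same packages can be passed per-session through Livy's POST /sessions REST call in the "conf" field. A minimal sketch, assuming a Livy server on the default port 8998 (the URL is hypothetical; adjust for your deployment):

      ```python
      import json

      # Hypothetical Livy endpoint; replace with your server's address.
      LIVY_URL = "http://localhost:8998/sessions"

      # The same packages as in spark-defaults.conf, supplied per-session
      # via the "conf" field of Livy's session-creation request.
      payload = {
          "kind": "spark",
          "conf": {
              "spark.jars.packages": (
                  "com.amazonaws:aws-java-sdk:1.11.115,"
                  "org.apache.hadoop:hadoop-aws:2.8.0"
              )
          },
      }

      body = json.dumps(payload)
      print(body)
      # A real request would then be sent with, e.g.:
      #   requests.post(LIVY_URL, data=body,
      #                 headers={"Content-Type": "application/json"})
      ```

      If the session starts this way but fails with spark-defaults.conf, that would narrow the problem to how Livy propagates the global Spark configuration.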

        Attachments

        1. Success.png
          71 kB
          Alex Bozarth
        2. Error.png
          91 kB
          Alex Bozarth

            People

            • Assignee: Unassigned
            • Reporter: ckb (Chetan Kumar Bhatt)
            • Votes: 0
            • Watchers: 2
