Spark / SPARK-6435

spark-shell --jars option does not add all jars to classpath


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 1.3.0
    • Fix Version/s: 1.4.0
    • Component/s: Spark Shell, Windows
    • Labels: None
    • Environment: Win64

    Description

      Not all jars supplied via the --jars option will be added to the driver (and presumably executor) classpath. The first jar(s) will be added, but not all.

      To reproduce this, just add a few jars (I tested 5) to the --jars option, and then try to import a class from the last jar. This fails. A simple reproducer:

      Create a bunch of dummy jars:
      jar cfM jar1.jar log.txt
      jar cfM jar2.jar log.txt
      jar cfM jar3.jar log.txt
      jar cfM jar4.jar log.txt
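
      (log.txt is just a placeholder payload for the jars; any existing file works, and on Windows something like "echo test > log.txt" will create one if needed.)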

      Start the spark-shell with the dummy jars and guava at the end:
      %SPARK_HOME%\bin\spark-shell --master local --jars jar1.jar,jar2.jar,jar3.jar,jar4.jar,c:\code\lib\guava-14.0.1.jar
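
      (A variation worth trying when reproducing on Windows: cmd.exe batch-file argument handling can treat commas as delimiters, so also passing the list quoted, e.g. --jars "jar1.jar,jar2.jar,jar3.jar,jar4.jar,c:\code\lib\guava-14.0.1.jar", is a reasonable check; whether that changes the result is not covered in this report.)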

      In the shell, try importing a class from Guava; you'll get an error:

      scala> import com.google.common.base.Strings
      <console>:19: error: object Strings is not a member of package com.google.common.base
             import com.google.common.base.Strings
                    ^
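
      A quick diagnostic sketch for checking which jars actually reached the driver and the executors (it assumes at least one loader in the REPL's classloader chain is a java.net.URLClassLoader, and it reuses the Guava class above purely as an example):

      scala> // list every URL registered on any URLClassLoader in the REPL's loader chain
      scala> def loaders(cl: ClassLoader): List[ClassLoader] = if (cl == null) Nil else cl :: loaders(cl.getParent)
      scala> loaders(getClass.getClassLoader).collect { case u: java.net.URLClassLoader => u.getURLs.toList }.flatten.foreach(println)

      scala> // probe the executor side: this fails with ClassNotFoundException if the jar was never shipped
      scala> sc.parallelize(1 to 1).map(_ => Class.forName("com.google.common.base.Strings").getName).collect()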
      


            People

              Assignee: Masayoshi Tsuzuki (tsudukim)
              Reporter: Vijay Garla (vjapache)
              Votes: 0
              Watchers: 7
