Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Fixed
- Affects Version/s: 1.3.0
- Environment: Win64
Description
Not all jars supplied via the --jars option are added to the driver (and presumably executor) classpath; only the first jar(s) are added.
To reproduce, pass a few jars (I tested with 5) to the --jars option and then try to import a class from the last jar; the import fails. A simple reproducer:
Create a bunch of dummy jars:
jar cfM jar1.jar log.txt
jar cfM jar2.jar log.txt
jar cfM jar3.jar log.txt
jar cfM jar4.jar log.txt
Start the spark-shell with the dummy jars and guava at the end:
%SPARK_HOME%\bin\spark-shell --master local --jars jar1.jar,jar2.jar,jar3.jar,jar4.jar,c:\code\lib\guava-14.0.1.jar
In the shell, try importing from guava; you'll get an error:
scala> import com.google.common.base.Strings
<console>:19: error: object Strings is not a member of package com.google.common.base
       import com.google.common.base.Strings
                                     ^
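As a quick sanity check (not part of the original report), the jars Spark registered from --jars can be compared against the jars actually on the driver JVM's classpath from inside the same spark-shell session, where sc is the SparkContext the shell creates:

// Jars the SparkContext registered from the --jars option
sc.jars.foreach(println)

// Jar entries actually present on the driver JVM's classpath
sys.props("java.class.path")
  .split(java.io.File.pathSeparator)
  .filter(_.toLowerCase.endsWith(".jar"))
  .foreach(println)

On an affected setup, guava-14.0.1.jar would be expected to appear in the first list but be missing from the second, which matches the import failure above.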
Issue Links
- is related to: SPARK-4941 Yarn cluster mode does not upload all needed jars to driver node (Spark 1.2.0) (Resolved)
- links to