Details
- Type: Bug
- Status: Resolved
- Priority: Minor
- Resolution: Duplicate
- Affects Version/s: 1.4.0
- Fix Version/s: None
- Environment: Ubuntu, Spark 1.4.0 Standalone
Description
In my spark-defaults.conf I have configured a set of libraries to be uploaded to my Spark 1.4.0 Standalone cluster. The entry appears as:
spark.files libarary.zip,file1.py,file2.py
When I execute spark-submit -v test.py
I see that spark-submit reads the defaults correctly, but that it overwrites the "spark.files" default entry and replaces it with the name of the job script, i.e. "test.py".
This behavior doesn't seem intuitive: test.py should be added to the Spark working folder, but it should not overwrite the "spark.files" defaults.
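A possible workaround (not part of the original report) is to register the dependencies from inside the driver program rather than relying on the spark.files default that spark-submit replaces. A minimal PySpark sketch, assuming the same file names as in the spark-defaults.conf entry above:

from pyspark import SparkContext

sc = SparkContext(appName="test")

# Hypothetical workaround: ship the dependencies from the driver instead of
# relying on the spark.files default that spark-submit overwrites with test.py.
for dep in ["libarary.zip", "file1.py", "file2.py"]:
    sc.addPyFile(dep)  # use sc.addFile(dep) for non-Python resources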
Attachments
Issue Links
- duplicates: SPARK-14845 "spark.files in properties file is not distributed to driver in yarn-cluster mode" (Resolved)