In my spark-defaults.conf I have configured a set of libraries to be uploaded to my Spark 1.4.0 Standalone cluster. The entry appears as:
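(The entry follows the usual spark-defaults.conf format: the `spark.files` key, whitespace, then a comma-separated list of files. The paths below are placeholders for illustration, not my actual libraries.)

```
spark.files    /path/to/lib1.py,/path/to/lib2.py
```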
When I execute spark-submit -v test.py
I see that spark-submit reads the defaults correctly, but that it overwrites the "spark.files" default entry and replaces it with the name of the job script, i.e. "test.py".
This behavior doesn't seem intuitive: test.py should be added to the Spark working directory, but it should not overwrite the "spark.files" defaults.