Details
- Type: Improvement
- Priority: Major
- Status: Resolved
- Resolution: Won't Fix
Description
For applications such as Zeppelin and Livy that use both PySpark and SparkR, we have to duplicate the code in SparkSubmit to figure out where the required resources for PySpark and SparkR are located (pyspark.zip, py4j, sparkr.zip, and the R files in jars). It would be better to provide the options spark.usePython and spark.useR, so that downstream projects that use both PySpark and SparkR in one Spark application don't need to duplicate the code in SparkSubmit and can simply leverage these two options.
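To illustrate the duplication the issue describes, here is a minimal sketch of the kind of resource-discovery logic a downstream project currently has to re-implement outside of SparkSubmit. The helper name and the `$SPARK_HOME/python/lib` layout are assumptions for illustration; this is not a Spark API.

```python
import glob
import os


def pyspark_archives(spark_home):
    """Locate the archives PySpark needs on the executor PYTHONPATH.

    This mirrors (in simplified form) the lookup SparkSubmit performs
    internally: pyspark.zip plus the bundled py4j source zip under
    $SPARK_HOME/python/lib. Downstream projects that launch mixed
    PySpark/SparkR applications end up duplicating logic like this.
    """
    lib_dir = os.path.join(spark_home, "python", "lib")
    archives = [os.path.join(lib_dir, "pyspark.zip")]
    # The py4j zip carries a version in its name, so glob for it.
    archives += glob.glob(os.path.join(lib_dir, "py4j-*-src.zip"))
    return archives
```

With a `spark.usePython` / `spark.useR` option, SparkSubmit itself would attach these resources, and this lookup could be deleted from downstream code.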