Details
- Type: Sub-task
- Status: Resolved
- Priority: Blocker
- Resolution: Fixed
Description
This is not working correctly at present: as the launch command below shows, the value passed with --driver-memory is not reflected in the JVM heap settings. We should assume that SPARK_DRIVER_MEM should be set unless --deploy-mode is explicitly set to "cluster", since in any other mode the driver runs in the locally launched JVM.
patrick@patrick-t430s:~/Documents/spark$ SPARK_PRINT_LAUNCH_COMMAND=1 ./bin/spark-shell --driver-memory 2g
Spark Command: /usr/lib/jvm/jdk1.7.0_25/bin/java -cp ::/home/patrick/Documents/spark/conf:/home/patrick/Documents/spark/assembly/target/scala-2.10/spark-assembly-1.0.0-hadoop1.0.4.jar -Xms512m -Xmx512m org.apache.spark.deploy.SparkSubmit spark-internal --driver-memory 2g --class org.apache.spark.repl.Main
========================================
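For illustration, here is a minimal sketch of the intended behavior, not the actual patch: pre-scan the launcher arguments and, unless --deploy-mode is explicitly "cluster", export the requested driver memory before the JVM is started. The SPARK_DRIVER_MEMORY variable name, the assumption that bin/spark-class reads it when sizing the heap, and the argument scan itself are all illustrative.

  # Hypothetical sketch only, not the actual fix: pre-scan the arguments and
  # export the requested driver memory for non-cluster launches, assuming
  # bin/spark-class honors SPARK_DRIVER_MEMORY when sizing the JVM heap.
  DEPLOY_MODE="client"
  DRIVER_MEMORY=""
  ORIG_ARGS=("$@")
  while (($#)); do
    case "$1" in
      --deploy-mode)   DEPLOY_MODE="$2"; shift ;;
      --driver-memory) DRIVER_MEMORY="$2"; shift ;;
    esac
    shift
  done
  # In client mode (the default) the driver runs inside the JVM launched here,
  # so its heap must be set before that JVM starts.
  if [ -n "$DRIVER_MEMORY" ] && [ "$DEPLOY_MODE" != "cluster" ]; then
    export SPARK_DRIVER_MEMORY="$DRIVER_MEMORY"
  fi
  exec "$SPARK_HOME/bin/spark-class" org.apache.spark.deploy.SparkSubmit "${ORIG_ARGS[@]}"

With something along these lines, the launch command above would be expected to show the requested 2g heap rather than the -Xms512m -Xmx512m defaults.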
Attachments
Issue Links
- is duplicated by: SPARK-1756 Add missing description to spark-env.sh.template (Resolved)