Description
Currently, the app name is hardcoded in pyspark as "PySparkShell" and cannot be changed by the user.
SPARK-8650 fixed this issue for spark-sql.
SPARK-9180 introduced a new option --name for spark-shell.
sparkR is different: SparkContext is not constructed automatically there, so the app name can simply be set when initializing SparkContext.
In summary:

| shell | able to set app name |
|---|---|
| pyspark | no |
| spark-shell | yes, via `--name` |
| spark-sql | yes, via `--conf spark.app.name` |
| sparkR | n/a |
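The fix requested here would mirror SPARK-8650's approach: honor a user-specified `spark.app.name` instead of unconditionally using the hardcoded default. A minimal sketch of that precedence (`resolve_app_name` is a hypothetical helper for illustration, not Spark's actual API):

```python
def resolve_app_name(conf, default="PySparkShell"):
    """Return the app name the shell should use.

    A user-specified spark.app.name (e.g. set via --conf or --name)
    takes priority; otherwise fall back to the shell's hardcoded default.
    """
    return conf.get("spark.app.name", default)

# No user setting: fall back to the hardcoded default.
assert resolve_app_name({}) == "PySparkShell"
# User-specified name wins.
assert resolve_app_name({"spark.app.name": "MyJob"}) == "MyJob"
```

This is the behavior spark-sql already has after SPARK-8650; the issue is that pyspark ignores the user's setting entirely.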
Issue Links
- is related to: SPARK-8650 Use the user-specified app name priority in SparkSQLCLIDriver or HiveThriftServer2 (Resolved)
- relates to: SPARK-9180 Accept --name option in spark-submit (Resolved)