Details
- Type: Improvement
- Status: Resolved
- Priority: Trivial
- Resolution: Fixed
- Affects Version/s: 1.4.1
- Labels: None
Description
Currently, the spark-shell command does not accept an application name via the --name option: the name is hard-coded as "Spark shell" in org.apache.spark.repl.Main. I think it would be better if the app name were user-configurable.
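For illustration, the desired usage would look something like the sketch below (the session name "MyShellSession" is a hypothetical example; today the hard-coded name silently wins):

    $ spark-shell --name "MyShellSession"

With the fix, the Spark UI and monitoring API would report "MyShellSession" instead of the hard-coded "Spark shell".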
The major problem with the default application name "Spark shell" is that the monitoring JSON API (http://<spark-shell-host>:4040/api/v1/applications/app-id/) cannot be used with spark-shell, because the application ID contains a space character.
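As an illustration of the failure mode (localhost:4040 is the default Spark UI address; the space-containing ID mirrors the report above and is not meant as an exact value):

    # List applications known to this spark-shell's UI
    $ curl http://localhost:4040/api/v1/applications

    # A space in the application ID produces an invalid URL path:
    $ curl "http://localhost:4040/api/v1/applications/Spark shell/jobs"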
Issue Links
- is related to: SPARK-9270 Allow --name option in pyspark (Resolved)