Spark / SPARK-29487

Ability to run Spark Kubernetes other than from /opt/spark


    Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Minor
    • Resolution: Duplicate
    • Affects Version/s: 2.4.4
    • Fix Version/s: None
    • Labels:
      None

      Description

      In the Spark Kubernetes Dockerfile, the Spark binaries are copied to /opt/spark.

      If we build our own Dockerfile without using /opt/spark, the resulting image will not run.

      After looking at the source code, it seems that in various places the path is hard-coded to /opt/spark.

      Example:

      Constants.scala:

      // Spark app configs for containers
      val SPARK_CONF_VOLUME = "spark-conf-volume"
      val SPARK_CONF_DIR_INTERNAL = "/opt/spark/conf"

       

      Is it possible to make this configurable so we can put Spark somewhere other than /opt/?
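One possible shape for such a change, sketched here purely as an illustration and not as Spark's actual implementation: derive the internal paths from an overridable base directory instead of a literal. The `SPARK_HOME_OVERRIDE` environment variable below is an assumed name, not an existing Spark setting.

```scala
// Hypothetical sketch of a configurable Spark base path for the
// Kubernetes constants. SPARK_HOME_OVERRIDE is an assumed env var,
// not part of Spark today.
object Constants {
  // Spark app configs for containers
  val SPARK_CONF_VOLUME = "spark-conf-volume"

  // Fall back to the current hard-coded default when no override is set,
  // so existing images keep working unchanged.
  private val sparkHome: String =
    sys.env.getOrElse("SPARK_HOME_OVERRIDE", "/opt/spark")

  val SPARK_CONF_DIR_INTERNAL: String = s"$sparkHome/conf"
}
```

With this shape, an image that installs Spark under a different prefix would only need to export the override variable in its Dockerfile; images built from the stock Dockerfile would see no behavior change.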

       

       

      Attachments

      Issue Links

      Activity

      People

      • Assignee:
        Unassigned
      • Reporter:
        benjaminmiao Benjamin Miao CAI

      Dates

      • Created:
      • Updated:
      • Resolved:
