SPARK-29487: Ability to run Spark Kubernetes other than from /opt/spark


Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Minor
    • Resolution: Duplicate
    • Affects Version/s: 2.4.4
    • Fix Version/s: None
    • Component/s: None

    Description

      In the Spark Kubernetes Dockerfile, the Spark binaries are copied to /opt/spark.

      If we build our own image from a Dockerfile that installs Spark somewhere other than /opt/spark, the image will not run.

      After looking at the source code, it seems the path is hard-coded to /opt/spark in various places.

      Example:

      Constants.scala:

      // Spark app configs for containers
      val SPARK_CONF_VOLUME = "spark-conf-volume"
      val SPARK_CONF_DIR_INTERNAL = "/opt/spark/conf"

      Is it possible to make this configurable so that we can install Spark somewhere other than /opt/?
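
      One way to do this (a minimal sketch, not the actual change tracked by this ticket) would be to resolve the container-side Spark home from an environment variable and fall back to the current default; the variable name SPARK_HOME and the surrounding object are assumptions for illustration:

      object Constants {
        // Resolve the in-container Spark home from the environment instead of
        // hard-coding it; SPARK_HOME is an assumed variable name here, not a
        // mechanism confirmed by this ticket.
        val SPARK_HOME_INTERNAL: String =
          sys.env.getOrElse("SPARK_HOME", "/opt/spark")

        // Spark app configs for containers
        val SPARK_CONF_VOLUME = "spark-conf-volume"
        val SPARK_CONF_DIR_INTERNAL = s"$SPARK_HOME_INTERNAL/conf"
      }

      A fallback like this would keep the default image behaviour unchanged while letting a custom Dockerfile install Spark anywhere.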



            People

              Assignee: Unassigned
              Reporter: Benjamin Miao CAI
              Votes: 0
              Watchers: 4
