In the Spark Kubernetes Dockerfile, the Spark binaries are copied to /opt/spark.
If we try to build our own Dockerfile that installs Spark somewhere other than /opt/spark, the image will not run.
After looking at the source code, it seems the path is hard-coded to /opt/spark in various places, for example:
// Spark app configs for containers
val SPARK_CONF_VOLUME = "spark-conf-volume"
val SPARK_CONF_DIR_INTERNAL = "/opt/spark/conf"
Is it possible to make this configurable so we can put Spark somewhere other than /opt/?
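One way this could look is to derive the conf dir from a configurable Spark home instead of a hard-coded constant. A minimal sketch, assuming a hypothetical property name "spark.kubernetes.sparkHome" (not an existing Spark config) and plain property lookup rather than Spark's internal config machinery:

```scala
// Hypothetical sketch only: "spark.kubernetes.sparkHome" is an
// illustrative property name, not an existing Spark configuration.
object SparkHomeSketch {
  // Fall back to the current default when no override is given.
  def sparkHome(props: Map[String, String]): String =
    props.getOrElse("spark.kubernetes.sparkHome", "/opt/spark")

  // SPARK_CONF_DIR_INTERNAL would then be computed, not hard-coded.
  def confDirInternal(props: Map[String, String]): String =
    s"${sparkHome(props)}/conf"
}
```

With no override this reproduces the current behavior (/opt/spark/conf), while an image built under, say, /usr/lib/spark could set the property and keep the rest of the code path unchanged.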