Description
It would be nice to support multiple versions of Spark. The Livy server doesn't have a direct dependency on Spark, so users could simply specify which Spark version they want when launching a session.
Livy would need a mapping from Spark versions to different SPARK_HOME values, and potentially to different configuration files / directories as well.
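A minimal sketch of what such a mapping could look like in Livy's server configuration. The property names below (`livy.server.spark-home.<version>`, `livy.server.spark-conf-dir.<version>`) and the per-session `sparkVersion` field are hypothetical illustrations of the idea, not an existing Livy API:

```
# livy.conf — hypothetical per-version mapping (property names are illustrative)
livy.server.spark-home.1.6     = /opt/spark-1.6.3
livy.server.spark-home.2.1     = /opt/spark-2.1.1
# each version may also need its own conf directory
livy.server.spark-conf-dir.1.6 = /etc/spark/conf
livy.server.spark-conf-dir.2.1 = /etc/spark2/conf
```

A session creation request would then carry the desired version, e.g. `POST /sessions` with a body like `{"kind": "spark", "conf": {"sparkVersion": "2.1"}}`, and the server would resolve SPARK_HOME from the mapping before launching the driver.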
Issue Links
is a parent of:
- LIVY-246 Support multiple Spark home in runtime (Open)
- LIVY-162 Support Spark 2 (Resolved)
- LIVY-219 Change Livy IT to support different scala version of livy-repl (Resolved)
- LIVY-227 Propose to ship two REPL bundles in one assembly (Resolved)
- LIVY-228 Shade some repl dependencies to mitigate the building difference between Spark1 and 2 (Resolved)
- LIVY-233 Support SparkSession in Job API (Resolved)
- LIVY-234 Make Livy Scala API to support both Scala 2.10 and 2.11 (Resolved)