Details
- Type: Umbrella
- Status: Resolved
- Priority: Major
- Resolution: Fixed
Description
For reasoning, discussion of pros and cons, and other more detailed information, please see the attached doc.
The idea is to be able to build a Spark distribution that ships a directory full of jars instead of the huge assembly files we currently have.
Getting there requires changes in several places; I'll try to list the ones I identified in the document, in the order I think is needed to avoid breaking things:
- make streaming backends not be assemblies
Since people may depend on the current assembly artifacts in their deployments, we can't simply remove them; but we can turn them into dummy jars and rely on dependency resolution to download the actual jars.
PySpark tests would also need some tweaking here.
- make examples jar not be an assembly
Probably requires tweaks to the run-example script. The location of the examples jar would also have to change, since it can no longer live in the same place as the main Spark jars.
- update YARN backend to handle a directory full of jars when launching apps
Currently YARN localizes the Spark assembly (depending on the user configuration); it needs to be modified to localize all needed libraries instead of a single jar.
- Modify launcher library to handle the jars directory
This should be trivial.
- Modify assembly/pom.xml to generate either the assembly or a libs directory, depending on which profile is enabled.
We should keep the assembly build enabled by default, for backwards compatibility, to give people time to prepare.
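As a rough illustration of the launcher-side change, building a classpath from a directory of jars rather than from a single assembly could look something like the sketch below. This is not the actual implementation; the jars-directory layout and the function name are assumptions for illustration only (the real change would live in the launcher library's code):

```shell
# Sketch only: assumes the distribution ships its jars in a single directory
# (e.g. somewhere under SPARK_HOME); the layout is an assumption, not the design.
# Instead of pointing the classpath at one assembly jar, collect every jar in
# the directory into a ':'-separated classpath string.
build_classpath() {
  local jars_dir="$1"
  local cp=""
  for jar in "$jars_dir"/*.jar; do
    [ -e "$jar" ] || continue   # skip if the directory has no jars
    cp="${cp:+$cp:}$jar"        # append, inserting ':' between entries
  done
  printf '%s\n' "$cp"
}
```

The point of the sketch is just the shape of the logic: enumerate the directory once at launch time rather than hard-coding a single assembly path.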
Filing this bug as an umbrella; please file sub-tasks if you plan to work on a specific part of the issue.
Attachments
Issue Links
- is duplicated by
  - SPARK-7009 Build assembly JAR via ant to avoid zip64 problems (Resolved)
- relates to
  - HIVE-15302 Relax the requirement that HoS needs Spark built w/o Hive (Open)
  - SPARK-14601 Minor doc/usage changes related to removal of Spark assembly (Resolved)