Details
- Type: Improvement
- Status: Resolved
- Priority: Minor
- Resolution: Won't Fix
- Affects Version/s: 3.0.0
- Fix Version/s: None
- Component/s: None
Description
Currently, Spark only packs the jars under $SPARK_HOME/jars. How about also packing the user jars when submitting a Spark application? Sometimes an application depends on many jars besides the Spark libraries, and distributing each of them on every submission puts pressure on HDFS and the NodeManager (localizer). Packing them once could reduce that pressure.
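One way to approximate this today (a sketch, not the proposal itself) is to bundle the dependency jars into a single archive, upload it to HDFS once, and let the YARN distributed cache localize that one shared copy instead of shipping each jar from the client on every submit. All paths, the archive name, and the main class below are illustrative assumptions.

```shell
# Bundle the application's dependency jars into one archive (illustrative paths).
zip -j app-deps.zip /path/to/app/libs/*.jar

# Upload the archive to HDFS once so every submission can reuse the cached copy.
hdfs dfs -mkdir -p /shared/spark
hdfs dfs -put -f app-deps.zip /shared/spark/app-deps.zip

# Submit referencing the HDFS copy; YARN unpacks it into a directory named
# "deps" in each container's working directory and localizes it from the
# cache rather than re-uploading the jars per submission.
spark-submit \
  --master yarn \
  --archives hdfs:///shared/spark/app-deps.zip#deps \
  --conf spark.executor.extraClassPath='deps/*' \
  --class com.example.Main \
  app.jar
```

In cluster mode the driver would also need `spark.driver.extraClassPath='deps/*'`; the trade-off versus passing `--jars` individually is a single cached upload instead of one localization entry per jar.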