Details
- Type: Bug
- Status: Resolved
- Priority: Blocker
- Resolution: Fixed
Description
Right now, because the JRE silently swallows the invalid jar, users who hit this see very confusing behavior. We should detect this situation (in either spark-class or compute-classpath) and fail with an explicit error.
We can do something like:
$JAVA_HOME/bin/jar -tf lib/spark-assembly-1.0.0-SNAPSHOT-hadoop1.0.4.jar org/apache/spark/SparkContext
which, when a user is running JRE 6 with a JDK-7-compiled jar, will produce:
java.util.zip.ZipException: invalid CEN header (bad signature)
        at java.util.zip.ZipFile.open(Native Method)
        at java.util.zip.ZipFile.<init>(ZipFile.java:132)
        at java.util.zip.ZipFile.<init>(ZipFile.java:93)
        at sun.tools.jar.Main.list(Main.java:997)
        at sun.tools.jar.Main.run(Main.java:242)
        at sun.tools.jar.Main.main(Main.java:1167)
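The check described above could be sketched roughly as follows. This is a hedged illustration, not the actual spark-class patch: the function name `check_assembly_jar`, the jar path, and the error wording are assumptions; the real logic would live in spark-class or compute-classpath.

```shell
#!/usr/bin/env bash
# Sketch: fail fast if the assembly jar cannot be read by the running JRE.
# Hypothetical helper; spark-class would call it with the resolved jar path.

check_assembly_jar() {
  local jar_file="$1"
  # Prefer the JDK's jar tool, which exercises the same zip code path
  # that triggers the "invalid CEN header" ZipException under JRE 6.
  local jar_cmd="jar"
  if [ -n "$JAVA_HOME" ] && [ -x "$JAVA_HOME/bin/jar" ]; then
    jar_cmd="$JAVA_HOME/bin/jar"
  fi

  # If no jar tool is available, skip the check rather than block startup.
  if ! command -v "$jar_cmd" >/dev/null 2>&1; then
    return 0
  fi

  # Listing a known class fails with ZipException on a JDK-7-built jar
  # containing more than 65536 entries when run under JRE 6.
  if "$jar_cmd" -tf "$jar_file" org/apache/spark/SparkContext >/dev/null 2>&1; then
    return 0
  else
    echo "ERROR: unable to read $jar_file." >&2
    echo "It may be corrupt, or built with JDK 7 and run under JRE 6." >&2
    return 1
  fi
}
```

A caller would invoke `check_assembly_jar "$ASSEMBLY_JAR" || exit 1` before launching the JVM, so the user sees the explicit message instead of the raw ZipException.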
Issue Links
- is related to:
  - SPARK-1520 Assembly Jar with more than 65536 files won't work when compiled on JDK7 and run on JDK6 (Resolved)
  - SPARK-1911 Warn users if their assembly jars are not built with Java 6 (Resolved)
  - SPARK-1698 Improve spark integration (Closed)