Details
Type: Bug
Status: Open
Priority: Major
Resolution: Unresolved
Affects Version/s: 0.6.0
Fix Version/s: None
Component/s: None
Environment:
Linux note-objective 3.16.0-53-generic #72~14.04.1-Ubuntu SMP Fri Nov 6 18:17:23 UTC 2015 x86_64 x86_64 x86_64 GNU/Linux
java version "1.8.0_60"
Java(TM) SE Runtime Environment (build 1.8.0_60-b27)
Java HotSpot(TM) 64-Bit Server VM (build 25.60-b23, mixed mode)
Apache Maven 3.3.3 (7994120775791599e205a5524ec3e0dfe41d4a06; 2015-04-22T08:57:37-03:00)
Description
I'm facing a jar conflict after building Zeppelin 0.6.0-SNAPSHOT with the following command:
mvn -Pspark-1.5 -Phadoop-2.6 -Dhadoop.version=2.6.0 -Phadoop-2.6 -DskipTests -Drat.numUnapprovedLicenses=1000 clean package -P build-distr
The exception is the following:
java.lang.SecurityException: class "javax.servlet.FilterRegistration"'s signer information does not match signer information of other classes in the same package
at java.lang.ClassLoader.checkCerts(ClassLoader.java:895)
at java.lang.ClassLoader.preDefineClass(ClassLoader.java:665)
at java.lang.ClassLoader.defineClass(ClassLoader.java:758)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at org.spark-project.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:136)
at org.spark-project.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:129)
at org.spark-project.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:98)
at org.apache.spark.ui.JettyUtils$.createServletHandler(JettyUtils.scala:110)
at org.apache.spark.ui.JettyUtils$.createServletHandler(JettyUtils.scala:101)
at org.apache.spark.ui.WebUI.attachPage(WebUI.scala:78)
at org.apache.spark.ui.WebUI$$anonfun$attachTab$1.apply(WebUI.scala:62)
at org.apache.spark.ui.WebUI$$anonfun$attachTab$1.apply(WebUI.scala:62)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
at org.apache.spark.ui.WebUI.attachTab(WebUI.scala:62)
at org.apache.spark.ui.SparkUI.initialize(SparkUI.scala:61)
at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:74)
at org.apache.spark.ui.SparkUI$.create(SparkUI.scala:190)
at org.apache.spark.ui.SparkUI$.createLiveUI(SparkUI.scala:141)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:457)
at eleflow.uberdata.core.IUberdataContext.confSetup(IUberdataContext.scala:312)
at eleflow.uberdata.core.IUberdataContext.createSparkContextForProvisionedCluster(IUberdataContext.scala:318)
at eleflow.uberdata.core.IUberdataContext$$anonfun$sparkContext$1.apply(IUberdataContext.scala:194)
at eleflow.uberdata.core.IUberdataContext$$anonfun$sparkContext$1.apply(IUberdataContext.scala:193)
at scala.Option.getOrElse(Option.scala:120)
Workaround:
Looking into the lib folder of the Zeppelin distribution, I found that it contains both the Jetty jars and the javax.servlet jars.
After removing the javax.servlet jars, this exception no longer occurs.
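For reference, a minimal sketch of how the conflicting jars can be located and removed; the distribution path and jar file names below are assumptions and may differ depending on the build profiles used:
# Path is an assumption; adjust to wherever the built distribution was unpacked.
cd zeppelin-distribution/target/zeppelin-*/lib
# Print every jar that bundles javax.servlet.FilterRegistration, the class named in the exception.
for jar in *.jar; do
  unzip -l "$jar" | grep -q 'javax/servlet/FilterRegistration.class' && echo "$jar"
done
# Workaround described above: remove the standalone javax.servlet jars (names are an assumption) and keep the Jetty ones.
rm javax.servlet-*.jar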
This Stack Overflow post describes a similar problem:
http://stackoverflow.com/q/28086520/1791289
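As a follow-up, the offending servlet-api dependency could also be traced back to the module that pulls it in (and then excluded in the pom instead of deleting jars by hand); the coordinate filters below are assumptions about how the jar is declared:
# Show which modules pull in anything under the javax.servlet groupId (run from the Zeppelin source root).
mvn dependency:tree -Dincludes=javax.servlet
# Narrow to the older servlet-api coordinates if the first filter matches too much.
mvn dependency:tree -Dincludes=javax.servlet:servlet-api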