OOZIE-3159: Spark Action fails because of absence of hadoop mapreduce jar(s)

    Details

    • Type: Bug
    • Status: Closed
    • Priority: Blocker
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 5.0.0b1
    • Component/s: None
    • Labels: None

      Description

      OOZIE-2869 removed the mapreduce dependencies from the Spark action's classpath. The Spark action uses
      org.apache.hadoop.filecache.DistributedCache, which is no longer available on that classpath, causing the action to fail:

      java.lang.reflect.InvocationTargetException
      	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
      	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      	at java.lang.reflect.Method.invoke(Method.java:498)
      	at org.apache.oozie.action.hadoop.LauncherAM.runActionMain(LauncherAM.java:412)
      	at org.apache.oozie.action.hadoop.LauncherAM.access$300(LauncherAM.java:56)
      	at org.apache.oozie.action.hadoop.LauncherAM$2.run(LauncherAM.java:225)
      	at java.security.AccessController.doPrivileged(Native Method)
      	at javax.security.auth.Subject.doAs(Subject.java:422)
      	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
      	at org.apache.oozie.action.hadoop.LauncherAM.run(LauncherAM.java:219)
      	at org.apache.oozie.action.hadoop.LauncherAM$1.run(LauncherAM.java:155)
      	at java.security.AccessController.doPrivileged(Native Method)
      	at javax.security.auth.Subject.doAs(Subject.java:422)
      	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
      	at org.apache.oozie.action.hadoop.LauncherAM.main(LauncherAM.java:142)
      Caused by: java.lang.NoClassDefFoundError: org/apache/hadoop/filecache/DistributedCache
      	at org.apache.oozie.action.hadoop.SparkArgsExtractor.extract(SparkArgsExtractor.java:309)
      	at org.apache.oozie.action.hadoop.SparkMain.run(SparkMain.java:74)
      	at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:101)
      	at org.apache.oozie.action.hadoop.SparkMain.main(SparkMain.java:60)
      	... 16 more
      Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.filecache.DistributedCache
      	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
      	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
      	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
      	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
      	... 20 more
      Failing Oozie Launcher, org/apache/hadoop/filecache/DistributedCache
      java.lang.NoClassDefFoundError: org/apache/hadoop/filecache/DistributedCache
      	at org.apache.oozie.action.hadoop.SparkArgsExtractor.extract(SparkArgsExtractor.java:309)
      	at org.apache.oozie.action.hadoop.SparkMain.run(SparkMain.java:74)
      	at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:101)
      	at org.apache.oozie.action.hadoop.SparkMain.main(SparkMain.java:60)
      	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
      	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      	at java.lang.reflect.Method.invoke(Method.java:498)
      	at org.apache.oozie.action.hadoop.LauncherAM.runActionMain(LauncherAM.java:412)
      	at org.apache.oozie.action.hadoop.LauncherAM.access$300(LauncherAM.java:56)
      	at org.apache.oozie.action.hadoop.LauncherAM$2.run(LauncherAM.java:225)
      	at java.security.AccessController.doPrivileged(Native Method)
      	at javax.security.auth.Subject.doAs(Subject.java:422)
      	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
      	at org.apache.oozie.action.hadoop.LauncherAM.run(LauncherAM.java:219)
      	at org.apache.oozie.action.hadoop.LauncherAM$1.run(LauncherAM.java:155)
      	at java.security.AccessController.doPrivileged(Native Method)
      	at javax.security.auth.Subject.doAs(Subject.java:422)
      	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
      	at org.apache.oozie.action.hadoop.LauncherAM.main(LauncherAM.java:142)
      Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.filecache.DistributedCache
      	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
      	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
      	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
      	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
      	... 20 more
      Oozie Launcher, uploading action data to HDFS sequence file: hdfs://localhost:8020/user/saley/oozie-sale/0000009-180112124633268-oozie-sale-W/spark-node--spark/action-data.seq
      
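      For context, the missing class is referenced at SparkArgsExtractor.java:309. The snippet below is not the actual Oozie code, just a minimal sketch of the kind of call that trips over it: in Hadoop 2, org.apache.hadoop.filecache.DistributedCache ships in hadoop-mapreduce-client-core rather than hadoop-common, so once the mapreduce jars are no longer distributed with the launcher, the first reference to the class fails with the NoClassDefFoundError above.

        import java.net.URI;

        import org.apache.hadoop.conf.Configuration;
        // Deprecated in Hadoop 2 and provided by hadoop-mapreduce-client-core, not hadoop-common.
        import org.apache.hadoop.filecache.DistributedCache;

        public class DistributedCacheProbe {
            public static void main(String[] args) throws Exception {
                Configuration conf = new Configuration();
                // Merely resolving DistributedCache forces the class to load; without the
                // mapreduce jars on the classpath this throws
                // java.lang.NoClassDefFoundError: org/apache/hadoop/filecache/DistributedCache.
                URI[] cacheFiles = DistributedCache.getCacheFiles(conf);
                System.out.println("cache files: " + (cacheFiles == null ? 0 : cacheFiles.length));
            }
        }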

      I enabled adding the mapreduce jars by setting oozie.launcher.oozie.action.mapreduce.needed.for to true (a sketch of the action configuration follows the stack trace below). The launcher job was then able to launch the child job, but the child job failed with:

      2018-01-12 15:00:13,301 [Driver] ERROR org.apache.spark.deploy.yarn.ApplicationMaster  - User class threw exception: java.lang.SecurityException: class "javax.servlet.FilterRegistration"'s signer information does not match signer information of other classes in the same package
      java.lang.SecurityException: class "javax.servlet.FilterRegistration"'s signer information does not match signer information of other classes in the same package
      	at java.lang.ClassLoader.checkCerts(ClassLoader.java:898)
      	at java.lang.ClassLoader.preDefineClass(ClassLoader.java:668)
      	at java.lang.ClassLoader.defineClass(ClassLoader.java:761)
      	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
      	at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
      	at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
      	at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
      	at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
      	at java.security.AccessController.doPrivileged(Native Method)
      	at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
      	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
      	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
      	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
      	at org.spark-project.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:136)
      	at org.spark-project.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:129)
      	at org.spark-project.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:98)
      	at org.apache.spark.ui.JettyUtils$.createServletHandler(JettyUtils.scala:126)
      	at org.apache.spark.ui.JettyUtils$.createServletHandler(JettyUtils.scala:113)
      	at org.apache.spark.ui.WebUI.attachPage(WebUI.scala:78)
      	at org.apache.spark.ui.WebUI$$anonfun$attachTab$1.apply(WebUI.scala:62)
      	at org.apache.spark.ui.WebUI$$anonfun$attachTab$1.apply(WebUI.scala:62)
      	at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
      	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
      	at org.apache.spark.ui.WebUI.attachTab(WebUI.scala:62)
      	at org.apache.spark.ui.SparkUI.initialize(SparkUI.scala:63)
      	at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:76)
      	at org.apache.spark.ui.SparkUI$.create(SparkUI.scala:195)
      	at org.apache.spark.ui.SparkUI$.createLiveUI(SparkUI.scala:146)
      	at org.apache.spark.SparkContext.<init>(SparkContext.scala:473)
      	at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:59)
      	at org.apache.oozie.example.SparkFileCopy.main(SparkFileCopy.java:35)
      	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
      	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      	at java.lang.reflect.Method.invoke(Method.java:498)
      	at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:542)
      2018-01-12 15:00:13,303 [Driver] INFO  org.apache.spark.deploy.yarn.ApplicationMaster  - Final app status: FAILED, exitCode: 15, (reason: User class threw exception: java.lang.SecurityException: class "javax.servlet.FilterRegistration"'s signer information does not match signer information of other classes in the same package)
      
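      For reference, the property can be set in the Spark action's configuration in workflow.xml. Only the property name and value are taken from this issue; the rest of the action below is a placeholder sketch (node name and class borrowed from the logs above), and the exact placement may vary:

        <action name="spark-node">
            <spark xmlns="uri:oozie:spark-action:0.2">
                <job-tracker>${jobTracker}</job-tracker>
                <name-node>${nameNode}</name-node>
                <configuration>
                    <property>
                        <!-- ship the mapreduce jars that OOZIE-2869 stopped adding by default -->
                        <name>oozie.launcher.oozie.action.mapreduce.needed.for</name>
                        <value>true</value>
                    </property>
                </configuration>
                <master>yarn-cluster</master>
                <name>SparkFileCopy</name>
                <class>org.apache.oozie.example.SparkFileCopy</class>
                <jar>${nameNode}/user/${wf:user()}/examples/apps/spark/lib/oozie-examples.jar</jar>
            </spark>
            <ok to="end"/>
            <error to="fail"/>
        </action>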

      I looked into this exception: it is caused by servlet-api-2.5.jar, which gets pulled into the Spark sharelib transitively by hadoop-common. We need to revisit the reason for adding hadoop-common as a dependency.
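
      If hadoop-common has to stay in the Spark sharelib, one option would be to exclude the offending transitive jar in the sharelib's pom. The snippet below is only a sketch of that idea (standard Maven coordinates assumed; this is not necessarily the fix that was committed):

        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <exclusions>
                <!-- servlet-api-2.5 brings javax.servlet classes whose signer info clashes
                     with the servlet classes already on Spark's classpath -->
                <exclusion>
                    <groupId>javax.servlet</groupId>
                    <artifactId>servlet-api</artifactId>
                </exclusion>
            </exclusions>
        </dependency>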

        Attachments

        1. OOZIE-3159-004.patch (2 kB, Attila Sasvari)
        2. OOZIE-3159-003.patch (2 kB, Attila Sasvari)
        3. OOZIE-3159-002.patch (1 kB, Attila Sasvari)
        4. OOZIE-3159-001.patch (0.9 kB, Attila Sasvari)

              People

              • Assignee: Attila Sasvari (asasvari)
              • Reporter: Satish Subhashrao Saley (satishsaley)
              • Votes: 0
              • Watchers: 4
