Details

    • Type: Bug
    • Status: Closed
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 4.2.0
    • Fix Version/s: 4.3.0
    • Component/s: core, workflow
    • Labels:
      None
    • Environment:

      Hadoop 2.7.2, Spark 1.6.0 on Yarn, Oozie 4.2.0
      Cluster secured with Kerberos

Description

      Hello,

      I'm trying to run the pi.py example as a PySpark job with Oozie. Every attempt failed for the same reason: key not found: SPARK_HOME.
      Note: a Scala job runs fine in the same environment with Oozie.

      The logs on the executors are:

      SLF4J: Class path contains multiple SLF4J bindings.
      SLF4J: Found binding in [jar:file:/mnt/hd4/hadoop/yarn/local/filecache/145/slf4j-log4j12-1.6.6.jar!/org/slf4j/impl/StaticLoggerBinder.class]
      SLF4J: Found binding in [jar:file:/mnt/hd2/hadoop/yarn/local/filecache/155/spark-assembly-1.6.0-hadoop2.7.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
      SLF4J: Found binding in [jar:file:/opt/application/Hadoop/hadoop-2.7.2/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
      SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
      SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
      log4j:ERROR setFile(null,true) call failed.
      java.io.FileNotFoundException: /mnt/hd7/hadoop/yarn/log/application_1454673025841_13136/container_1454673025841_13136_01_000001 (Is a directory)
              at java.io.FileOutputStream.open(Native Method)
              at java.io.FileOutputStream.<init>(FileOutputStream.java:221)
              at java.io.FileOutputStream.<init>(FileOutputStream.java:142)
              at org.apache.log4j.FileAppender.setFile(FileAppender.java:294)
              at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
              at org.apache.hadoop.yarn.ContainerLogAppender.activateOptions(ContainerLogAppender.java:55)
              at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
              at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
              at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
              at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:809)
              at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:735)
              at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:615)
              at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:502)
              at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:547)
              at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:483)
              at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
              at org.slf4j.impl.Log4jLoggerFactory.getLogger(Log4jLoggerFactory.java:64)
              at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:285)
              at org.apache.commons.logging.impl.SLF4JLogFactory.getInstance(SLF4JLogFactory.java:155)
              at org.apache.commons.logging.impl.SLF4JLogFactory.getInstance(SLF4JLogFactory.java:132)
              at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:275)
              at org.apache.hadoop.service.AbstractService.<clinit>(AbstractService.java:43)
      Using properties file: null
      Parsed arguments:
        master                  yarn-master
        deployMode              cluster
        executorMemory          null
        executorCores           null
        totalExecutorCores      null
        propertiesFile          null
        driverMemory            null
        driverCores             null
        driverExtraClassPath    null
        driverExtraLibraryPath  null
        driverExtraJavaOptions  null
        supervise               false
        queue                   null
        numExecutors            null
        files                   null
        pyFiles                 null
        archives                null
        mainClass               null
        primaryResource         hdfs://hadoopsandbox/User/toto/WORK/Oozie/pyspark/lib/pi.py
        name                    Pysparkpi example
        childArgs               [100]
        jars                    null
        packages                null
        packagesExclusions      null
        repositories            null
        verbose                 true
      
      Spark properties used, including those specified through
       --conf and those from the properties file null:
        spark.executorEnv.SPARK_HOME -> /opt/application/Spark/current
        spark.executorEnv.PYTHONPATH -> /opt/application/Spark/current/python
        spark.yarn.appMasterEnv.SPARK_HOME -> /opt/application/Spark/current
      
      
      Main class:
      org.apache.spark.deploy.yarn.Client
      Arguments:
      --name
      Pysparkpi example
      --primary-py-file
      hdfs://hadoopsandbox/User/toto/WORK/Oozie/pyspark/lib/pi.py
      --class
      org.apache.spark.deploy.PythonRunner
      --arg
      100
      System properties:
      spark.executorEnv.SPARK_HOME -> /opt/application/Spark/current
      spark.executorEnv.PYTHONPATH -> /opt/application/Spark/current/python
      SPARK_SUBMIT -> true
      spark.app.name -> Pysparkpi example
      spark.submit.deployMode -> cluster
      spark.yarn.appMasterEnv.SPARK_HOME -> /opt/application/Spark/current
      spark.yarn.isPython -> true
      spark.master -> yarn-cluster
      Classpath elements:
      
      
      
      Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SparkMain], main() threw exception, key not found: SPARK_HOME
      java.util.NoSuchElementException: key not found: SPARK_HOME
              at scala.collection.MapLike$class.default(MapLike.scala:228)
              at scala.collection.AbstractMap.default(Map.scala:58)
              at scala.collection.MapLike$class.apply(MapLike.scala:141)
              at scala.collection.AbstractMap.apply(Map.scala:58)
              at org.apache.spark.deploy.yarn.Client$$anonfun$findPySparkArchives$2.apply(Client.scala:1045)
              at org.apache.spark.deploy.yarn.Client$$anonfun$findPySparkArchives$2.apply(Client.scala:1044)
              at scala.Option.getOrElse(Option.scala:120)
              at org.apache.spark.deploy.yarn.Client.findPySparkArchives(Client.scala:1044)
              at org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:717)
              at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:142)
              at org.apache.spark.deploy.yarn.Client.run(Client.scala:1016)
              at org.apache.spark.deploy.yarn.Client$.main(Client.scala:1076)
              at org.apache.spark.deploy.yarn.Client.main(Client.scala)
              at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
              at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
              at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
              at java.lang.reflect.Method.invoke(Method.java:606)
              at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
              at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
              at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
              at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
              at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
              at org.apache.oozie.action.hadoop.SparkMain.runSpark(SparkMain.java:104)
              at org.apache.oozie.action.hadoop.SparkMain.run(SparkMain.java:95)
              at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:47)
              at org.apache.oozie.action.hadoop.SparkMain.main(SparkMain.java:38)
              at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
              at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
              at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
              at java.lang.reflect.Method.invoke(Method.java:606)
              at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:236)
              at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
              at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
              at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
              at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.runSubtask(LocalContainerLauncher.java:380)
              at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.runTask(LocalContainerLauncher.java:301)
              at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.access$200(LocalContainerLauncher.java:187)
              at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler$1.run(LocalContainerLauncher.java:230)
              at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
              at java.util.concurrent.FutureTask.run(FutureTask.java:262)
              at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
              at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
              at java.lang.Thread.run(Thread.java:745)
      log4j:WARN No appenders could be found for logger (org.apache.hadoop.mapreduce.v2.app.MRAppMaster).
      log4j:WARN Please initialize the log4j system properly.
      log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
      

      The workflow used for Oozie is the following:

      <workflow-app xmlns='uri:oozie:workflow:0.5' name='PysparkPi-test'>
              <start to='spark-node' />
              <action name='spark-node'>
                      <spark xmlns="uri:oozie:spark-action:0.1">
                              <job-tracker>${jobTracker}</job-tracker>
                              <name-node>${nameNode}</name-node>
                              <master>${master}</master>
                              <mode>${mode}</mode>
                              <name>Pysparkpi example</name>
                              <class></class>
                              <jar>${nameNode}/User/toto/WORK/Oozie/pyspark/lib/pi.py</jar>
                              <spark-opts>--conf spark.yarn.appMasterEnv.SPARK_HOME=/opt/application/Spark/current --conf spark.executorEnv.SPARK_HOME=/opt/application/Spark/current --conf spark.executorEnv.PYTHONPATH=/opt/application/Spark/current/python</spark-opts>
                              <arg>100</arg>
                      </spark>
                      <ok to="end" />
                      <error to="fail" />
              </action>
              <kill name="fail">
                      <message>Workflow failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
              </kill>
              <end name='end' />
      </workflow-app>
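
      For completeness, the parameters referenced by the workflow come from the job configuration. A minimal job.properties sketch could look like the following (host names are placeholders; as the discussion below shows, master must be yarn-cluster or yarn-client, not yarn-master):

      nameNode=hdfs://hadoopsandbox
      jobTracker=<resourcemanager-host>:<port>
      master=yarn-cluster
      mode=cluster
      oozie.use.system.libpath=true
      oozie.wf.application.path=${nameNode}/User/toto/WORK/Oozie/pyspark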
      

      I also created a JIRA for Spark: https://issues.apache.org/jira/browse/SPARK-13679

      1. pyspark.zip
        349 kB
        Satish Subhashrao Saley
      2. py4j-0.9-src.zip
        44 kB
        Satish Subhashrao Saley
      3. OOZIE-2482-zip.patch
        36 kB
        Peter Cseh
      4. OOZIE-2482-6.patch
        523 kB
        Peter Cseh
      5. OOZIE-2482-5.patch
        522 kB
        Peter Cseh
      6. OOZIE-2482-4.patch
        524 kB
        Peter Cseh
      7. OOZIE-2482-3.patch
        535 kB
        Satish Subhashrao Saley
      8. OOZIE-2482-2.patch
        524 kB
        Satish Subhashrao Saley
      9. OOZIE-2482-1.patch
        29 kB
        Satish Subhashrao Saley

          Activity

          murali.msse Murali Ramasami added a comment -

          Can you try setting SPARK_HOME in your hadoop-env.sh file and try again? I faced a similar issue, and after setting SPARK_HOME in hadoop-env.sh the problem was resolved.
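
          For reference, this suggestion amounts to a line like the following in hadoop-env.sh on the cluster nodes (the path is taken from the environment above; adjust it to your installation):

          # hadoop-env.sh -- make SPARK_HOME visible to processes launched by YARN
          export SPARK_HOME=/opt/application/Spark/current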

          murali.msse Murali Ramasami added a comment -

          Also, can you tell me which mode you tried? I tried with yarn-cluster.

          BigDataOrange Alexandre Linte added a comment -

          Hi Murali Ramasami,

          Thank you for your feedback, I'm going to check your solution today.

          I tried both yarn-cluster and yarn-client mode but the result was the same (error key not found: SPARK_HOME).

          I also have a question. Did you set the SPARK_HOME in spark-env.sh?

          murali.msse Murali Ramasami added a comment -

          Alexandre Linte, please specify SPARK_HOME in your hadoop-env.sh, restart the services, and try again.

          BigDataOrange Alexandre Linte added a comment -

          Hi Murali Ramasami, I tried your solution this morning. I set SPARK_HOME in hadoop-env.sh. I no longer get the error "key not found: SPARK_HOME", but now I get the following:

          Using properties file: null
          Parsed arguments:
            master                  yarn-master
            deployMode              cluster
            executorMemory          null
            executorCores           null
            totalExecutorCores      null
            propertiesFile          null
            driverMemory            null
            driverCores             null
            driverExtraClassPath    null
            driverExtraLibraryPath  null
            driverExtraJavaOptions  null
            supervise               false
            queue                   null
            numExecutors            null
            files                   null
            pyFiles                 null
            archives                null
            mainClass               null
            primaryResource         hdfs://sandbox/User/zzqj3827/WORK/Oozie/pyspark/lib/pi.py
            name                    Pysparkpi example
            childArgs               [100]
            jars                    null
            packages                null
            packagesExclusions      null
            repositories            null
            verbose                 true
          
          Spark properties used, including those specified through
           --conf and those from the properties file null:
          
          
          
          Main class:
          org.apache.spark.deploy.yarn.Client
          Arguments:
          --name
          Pysparkpi example
          --primary-py-file
          hdfs://sandbox/User/zzqj3827/WORK/Oozie/pyspark/lib/pi.py
          --class
          org.apache.spark.deploy.PythonRunner
          --arg
          100
          System properties:
          SPARK_SUBMIT -> true
          spark.app.name -> Pysparkpi example
          spark.submit.deployMode -> cluster
          spark.yarn.isPython -> true
          spark.master -> yarn-cluster
          Classpath elements:
          
          
          
          Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SparkMain], main() threw exception, requirement failed: pyspark.zip not found; cannot run pyspark application in YARN mode.
          java.lang.IllegalArgumentException: requirement failed: pyspark.zip not found; cannot run pyspark application in YARN mode.
                  at scala.Predef$.require(Predef.scala:233)
                  at org.apache.spark.deploy.yarn.Client$$anonfun$findPySparkArchives$2.apply(Client.scala:1047)
                  at org.apache.spark.deploy.yarn.Client$$anonfun$findPySparkArchives$2.apply(Client.scala:1044)
                  at scala.Option.getOrElse(Option.scala:120)
                  at org.apache.spark.deploy.yarn.Client.findPySparkArchives(Client.scala:1044)
                  at org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:717)
                  at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:142)
                  at org.apache.spark.deploy.yarn.Client.run(Client.scala:1016)
                  at org.apache.spark.deploy.yarn.Client$.main(Client.scala:1076)
                  at org.apache.spark.deploy.yarn.Client.main(Client.scala)
                  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
                  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
                  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
                  at java.lang.reflect.Method.invoke(Method.java:606)
                  at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
                  at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
                  at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
                  at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
                  at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
                  at org.apache.oozie.action.hadoop.SparkMain.runSpark(SparkMain.java:104)
                  at org.apache.oozie.action.hadoop.SparkMain.run(SparkMain.java:95)
                  at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:47)
                  at org.apache.oozie.action.hadoop.SparkMain.main(SparkMain.java:38)
                  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
                  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
                  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
                  at java.lang.reflect.Method.invoke(Method.java:606)
                  at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:236)
                  at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
                  at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
                  at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
                  at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.runSubtask(LocalContainerLauncher.java:380)
                  at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.runTask(LocalContainerLauncher.java:301)
                  at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.access$200(LocalContainerLauncher.java:187)
                  at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler$1.run(LocalContainerLauncher.java:230)
                  at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
                  at java.util.concurrent.FutureTask.run(FutureTask.java:262)
                  at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
                  at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
                  at java.lang.Thread.run(Thread.java:745)
          log4j:WARN No appenders could be found for logger (org.apache.hadoop.mapreduce.v2.app.MRAppMaster).
          log4j:WARN Please initialize the log4j system properly.
          log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
          

          Do I need to add the "pyspark.zip" in the Oozie sharelibs to make it work?

          satishsaley Satish Subhashrao Saley added a comment -

          Hi Alexandre Linte,
          Could you please check whether you are facing https://issues.apache.org/jira/browse/SPARK-10795? Some comments on that issue describe the situations in which it appears. Internally, Oozie uses spark-submit to submit the Spark job.

          Parsed arguments:
          master yarn-master

          yarn-master is not a valid value for master; the Spark documentation does not mention it.

          BigDataOrange Alexandre Linte added a comment -

          Hi Satish Subhashrao Saley,

          Thank you for the reply. My bad, the argument "yarn-master" was a mistake. I corrected it by setting "yarn-cluster" in my job configuration.

          I checked the comments on SPARK-10795. I can successfully run the following command:

          [toto@client pysparkpi]$ spark-submit -v --master yarn-client ./pi.py 100
          Using properties file: /opt/application/Spark/current/conf/spark-defaults.conf
          Adding default property: spark.serializer=org.apache.spark.serializer.KryoSerializer
          Adding default property: spark.executor.extraJavaOptions=-Djava.library.path=/opt/application/Hadoop/current/lib/native/
          Adding default property: spark.broadcast.compress=true
          Adding default property: spark.io.compression.codec=org.apache.spark.io.SnappyCompressionCodec
          Adding default property: spark.eventLog.enabled=true
          Adding default property: spark.driver.maxResultSize=1200m
          Adding default property: spark.io.compression.snappy.blockSize=32k
          Adding default property: spark.kryoserializer.buffer.max=1500m
          Adding default property: spark.sql.hive.metastore.jars=builtin
          Adding default property: spark.driver.memory=2g
          Adding default property: spark.executor.instances=4
          Adding default property: spark.kryo.referenceTracking=false
          Adding default property: spark.default.parallelism=10
          Adding default property: spark.kryo.classesToRegister=org.apache.hadoop.hive.ql.io.HiveKey,org.apache.hadoop.io.BytesWritable,org.apache.hadoop.hive.ql.exec.vector.VectorizedRowBatch
          Adding default property: spark.kryoserializer.buffer=100m
          Adding default property: spark.master=yarn-client
          Adding default property: spark.broadcast.blockSize=4096
          Adding default property: spark.executor.memory=4g
          Adding default property: spark.eventLog.dir=hdfs:///Products/SPARK/logs/
          Adding default property: spark.eventLog.compress=true
          Adding default property: spark.executor.cores=2
          Adding default property: spark.yarn.scheduler.heartbeat.interval-ms=3000
          Adding default property: spark.akka.frameSize=100
          Adding default property: spark.sql.hive.metastore.version=1.2.1
          Parsed arguments:
            master                  yarn-client
            deployMode              null
            executorMemory          4g
            executorCores           2
            totalExecutorCores      null
            propertiesFile          /opt/application/Spark/current/conf/spark-defaults.conf
            driverMemory            2g
            driverCores             null
            driverExtraClassPath    null
            driverExtraLibraryPath  null
            driverExtraJavaOptions  null
            supervise               false
            queue                   null
            numExecutors            4
            files                   null
            pyFiles                 null
            archives                null
            mainClass               null
            primaryResource         file:/home/toto/workspace/oozie/pyspark/pysparkpi/./pi.py
            name                    pi.py
            childArgs               [100]
            jars                    null
            packages                null
            packagesExclusions      null
            repositories            null
            verbose                 true
          
          Spark properties used, including those specified through
           --conf and those from the properties file /opt/application/Spark/current/conf/spark-defaults.conf:
            spark.io.compression.codec -> org.apache.spark.io.SnappyCompressionCodec
            spark.default.parallelism -> 10
            spark.executor.memory -> 4g
            spark.driver.memory -> 2g
            spark.kryo.referenceTracking -> false
            spark.broadcast.blockSize -> 4096
            spark.executor.instances -> 4
            spark.eventLog.compress -> true
            spark.eventLog.enabled -> true
            spark.kryo.classesToRegister -> org.apache.hadoop.hive.ql.io.HiveKey,org.apache.hadoop.io.BytesWritable,org.apache.hadoop.hive.ql.exec.vector.VectorizedRowBatch
            spark.kryoserializer.buffer -> 100m
            spark.serializer -> org.apache.spark.serializer.KryoSerializer
            spark.executor.extraJavaOptions -> -Djava.library.path=/opt/application/Hadoop/current/lib/native/
            spark.akka.frameSize -> 100
            spark.yarn.scheduler.heartbeat.interval-ms -> 3000
            spark.sql.hive.metastore.version -> 1.2.1
            spark.kryoserializer.buffer.max -> 1500m
            spark.broadcast.compress -> true
            spark.eventLog.dir -> hdfs:///Products/SPARK/logs/
            spark.driver.maxResultSize -> 1200m
            spark.master -> yarn-client
            spark.io.compression.snappy.blockSize -> 32k
            spark.executor.cores -> 2
            spark.sql.hive.metastore.jars -> builtin
          
          
          Main class:
          org.apache.spark.deploy.PythonRunner
          Arguments:
          file:/home/toto/workspace/oozie/pyspark/pysparkpi/./pi.py
          null
          100
          System properties:
          spark.io.compression.codec -> org.apache.spark.io.SnappyCompressionCodec
          spark.default.parallelism -> 10
          spark.kryo.referenceTracking -> false
          spark.driver.memory -> 2g
          spark.executor.memory -> 4g
          spark.broadcast.blockSize -> 4096
          spark.executor.instances -> 4
          spark.eventLog.compress -> true
          spark.eventLog.enabled -> true
          spark.kryo.classesToRegister -> org.apache.hadoop.hive.ql.io.HiveKey,org.apache.hadoop.io.BytesWritable,org.apache.hadoop.hive.ql.exec.vector.VectorizedRowBatch
          SPARK_SUBMIT -> true
          spark.kryoserializer.buffer -> 100m
          spark.serializer -> org.apache.spark.serializer.KryoSerializer
          spark.akka.frameSize -> 100
          spark.executor.extraJavaOptions -> -Djava.library.path=/opt/application/Hadoop/current/lib/native/
          spark.app.name -> pi.py
          spark.yarn.scheduler.heartbeat.interval-ms -> 3000
          spark.sql.hive.metastore.version -> 1.2.1
          spark.submit.deployMode -> client
          spark.kryoserializer.buffer.max -> 1500m
          spark.broadcast.compress -> true
          spark.eventLog.dir -> hdfs:///Products/SPARK/logs/
          spark.driver.maxResultSize -> 1200m
          spark.yarn.isPython -> true
          spark.master -> yarn-client
          spark.io.compression.snappy.blockSize -> 32k
          spark.executor.cores -> 2
          spark.sql.hive.metastore.jars -> builtin
          Classpath elements:
          
          
          
          Pi is roughly 3.142274
          

          But when I run the job through Oozie, it still fails.

          For info, the pi script I'm using is the following:

          from __future__ import print_function
          #
          # Licensed to the Apache Software Foundation (ASF) under one or more
          # contributor license agreements.  See the NOTICE file distributed with
          # this work for additional information regarding copyright ownership.
          # The ASF licenses this file to You under the Apache License, Version 2.0
          # (the "License"); you may not use this file except in compliance with
          # the License.  You may obtain a copy of the License at
          #
          #    http://www.apache.org/licenses/LICENSE-2.0
          #
          # Unless required by applicable law or agreed to in writing, software
          # distributed under the License is distributed on an "AS IS" BASIS,
          # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
          # See the License for the specific language governing permissions and
          # limitations under the License.
          #
          
          import sys
          import os
          from random import random
          from operator import add
          
          from pyspark import SparkContext
          
          
          if __name__ == "__main__":
              """
                  Usage: pi [partitions]
              """
              os.environ["SPARK_HOME"] = "/opt/application/Spark/current"
              sc = SparkContext(appName="PythonPi")
              partitions = int(sys.argv[1]) if len(sys.argv) > 1 else 2
              n = 100000 * partitions
          
              def f(_):
                  x = random() * 2 - 1
                  y = random() * 2 - 1
                  return 1 if x ** 2 + y ** 2 < 1 else 0
          
              count = sc.parallelize(range(1, n + 1), partitions).map(f).reduce(add)
              print("Pi is roughly %f" % (4.0 * count / n))
          
              sc.stop()
          
          satishsaley Satish Subhashrao Saley added a comment -

          Ferenc Denes, have you already resolved this issue (I saw the ticket was reassigned)? If not, I am willing to work on it.

          satishsaley Satish Subhashrao Saley added a comment -

          Hi Alexandre Linte,
          I had set SPARK_HOME incorrectly in hadoop-env.sh and faced the same issue. After setting it correctly, I was able to execute pi.py:
          export SPARK_HOME=/Users/saley/hadoop-stuff/spark-1.6.1-bin-hadoop2.6

          Try setting export PYSPARK_ARCHIVES_PATH=$SPARK_HOME/python/lib/pyspark.zip,$SPARK_HOME/python/lib/py4j-0.9-src.zip

          But it should work even if you don't set the PYSPARK_ARCHIVES_PATH variable; in that case the else branch in Spark's findPySparkArchives code is executed.
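
          Taken together, the suggestion amounts to something like this in hadoop-env.sh (paths are the ones used in this thread; adjust them to your installation):

          export SPARK_HOME=/opt/application/Spark/current
          export PYSPARK_ARCHIVES_PATH=$SPARK_HOME/python/lib/pyspark.zip,$SPARK_HOME/python/lib/py4j-0.9-src.zip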

          fdenes Ferenc Denes added a comment -

          Please feel free to work on it; I'm not close to a solution yet and am tied up with other issues.

          grimesmi Mike Grimes added a comment -

          Based on some of the comments above, it looks like the issue is that spark-defaults.conf is not being pulled in (note the "Using properties file: null" in the output). This is because Oozie launches the Spark action in a container on an arbitrary node, which assumes Spark and its required configuration are set up correctly on every node. Is this a fair assumption to make? It seems to go against how Spark is commonly used: it is much more common to have Spark installed on the master node, with all necessary configuration, and to run jobs from there.

          Would it be better to re-implement the Spark action as an extension of the SshAction rather than the JavaAction, to ensure it runs on the master node?

          rkanter Robert Kanter added a comment -

          The Spark action will actually have the spark-defaults.conf content as long as you provide it to the Oozie server (see OOZIE-2170).
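
          As a sketch, "providing it to the Oozie server" can look like the following oozie-site.xml entry, assuming the SparkConfigurationService property introduced by OOZIE-2170 (verify the exact property name against your Oozie version's documentation):

          <!-- oozie-site.xml: map resource managers to a local Spark conf directory -->
          <property>
            <name>oozie.service.SparkConfigurationService.spark.configurations</name>
            <value>*=/opt/application/Spark/current/conf</value>
          </property>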

          satishsaley Satish Subhashrao Saley added a comment -

          Earlier I tried PySpark with yarn-cluster on a single-node cluster on my Mac, and it was very easy. But running PySpark in yarn-cluster mode on a multi-node cluster needs a few more things.

          1. When we submit a Spark job, Spark checks for PYSPARK_ARCHIVES_PATH. If PYSPARK_ARCHIVES_PATH is not present, it looks for SPARK_HOME instead, so at least one of the two must be set correctly. We can set this environment variable using the oozie.launcher.mapred.child.env property.

          2. py4j-0.9-src.zip and pyspark.zip (versions may vary with the Spark version) are necessary to run a Python script in Spark, so both must be present on the classpath while the script executes. A simple way is to put them under the lib/ directory of the workflow (see the upload sketch after the example settings below).

          3. The --py-files option must be configured and passed in <spark-opts>.

          The settings would look like this:

          <spark>
              <configuration>
                  .....
                  <property>
                      <name>oozie.launcher.mapred.child.env</name>
                      <value>PYSPARK_ARCHIVES_PATH=pyspark.zip</value>
                  </property>
              </configuration>
              <master>yarn-cluster</master>
              <name>pyspark example</name>
              <jar>/hdfs/path/to/pi.py</jar>
              <spark-opts>--queue satishq --conf spark.yarn.historyServer.address=http://spark.yarn.hsaddress.com:#port --conf spark.ui.view.acls=* --conf spark.eventLog.dir=hdfs://hdfspath/mapred/sparkhistory --py-files pyspark.zip,py4j-0.9-src.zip</spark-opts>
          </spark>
          

          Oozie could do some extra work to make users' lives easier: set PYSPARK_ARCHIVES_PATH and add the --py-files option automatically, figuring out the location of pyspark.zip and py4j-0.9-src.zip from the mapping file the user provides in oozie.service.ShareLibService.mapping.file, or from the default sharelib location if the user has not provided a mapping file.
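
          As a sketch of step 2 above, the two zips can be uploaded into the workflow's lib/ directory on HDFS (paths are examples from this thread):

          hdfs dfs -put $SPARK_HOME/python/lib/pyspark.zip      /User/toto/WORK/Oozie/pyspark/lib/
          hdfs dfs -put $SPARK_HOME/python/lib/py4j-0.9-src.zip /User/toto/WORK/Oozie/pyspark/lib/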

          hadoopqa Hadoop QA added a comment -

          Testing JIRA OOZIE-2482

          Cleaning local git workspace

          ----------------------------

          +1 PATCH_APPLIES
          +1 CLEAN
          +1 RAW_PATCH_ANALYSIS
          . +1 the patch does not introduce any @author tags
          . +1 the patch does not introduce any tabs
          . +1 the patch does not introduce any trailing spaces
          . +1 the patch does not introduce any line longer than 132
          . +1 the patch does adds/modifies 3 testcase(s)
          +1 RAT
          . +1 the patch does not seem to introduce new RAT warnings
          +1 JAVADOC
          . +1 the patch does not seem to introduce new Javadoc warnings
          +1 COMPILE
          . +1 HEAD compiles
          . +1 patch compiles
          . +1 the patch does not seem to introduce new javac warnings
          +1 BACKWARDS_COMPATIBILITY
          . +1 the patch does not change any JPA Entity/Colum/Basic/Lob/Transient annotations
          . +1 the patch does not modify JPA files
          -1 TESTS
          . Tests run: 1779
          . Tests failed: 0
          . Tests errors: 1

          . The patch failed the following testcases:

          .

          +1 DISTRO
          . +1 distro tarball builds with the patch

          ----------------------------
          -1 Overall result, please check the reported -1(s)

          The full output of the test-patch run is available at

          . https://builds.apache.org/job/oozie-trunk-precommit-build/2884/

          rkanter Robert Kanter added a comment -

          Thanks Satish Subhashrao Saley for working on this. Peter Cseh has been working on this too and finally got a working version the other day, which I think is very similar to what you have, though IIRC, he was setting SPARK_HOME to . (i.e. the working dir) instead of setting PYSPARK_ARCHIVES_PATH. I'm not sure which is the better env var to set.

          Another concern I have is over the two zip files themselves. Peter Cseh was working on a way to automatically include them in the Spark Sharelib using the maven assembly part of the build. The current patch you posted adds them as test dependencies and seems to leave them up to the user otherwise.

          Peter Cseh, have you been able to figure out the maven assembly stuff? Perhaps we can combine your and Satish Subhashrao Saley's efforts.

          hadoopqa Hadoop QA added a comment -

          Testing JIRA OOZIE-2482

          Cleaning local git workspace

          ----------------------------

          -1 Patch failed to apply to head of branch

          ----------------------------

          satishsaley Satish Subhashrao Saley added a comment -

          Tests are failing because Jenkins is unable to find the py4j and pyspark zips in the test resources. Attaching them here as per discussion with Rohini Palaniswamy. I have added them in sharelib/spark/src/test/resources.

          gezapeti Peter Cseh added a comment -

          I have an ugly way to extract the py files and create the appropriate zip; that could make this work with Spark 1.1.0, where the Python files were packed into the spark-core jar.
          Unfortunately, none of the jars in 1.6.1 contains the py or zip files, so that solution won't work.
          We might try to convince them to propagate the Python files into Maven somehow in a future release. Until then we could either stick with 1.1.0 and grab the files from the old jar, or upgrade and put the zips into the repository. I would prefer the former solution.

          rkanter Robert Kanter added a comment - edited

          I suppose the other option is to make it so that the user has to manually add the two zip files into the Spark sharelib. Given the complexities here, and how Spark keeps changing their packaging, we're probably best off just leaving that up to the user. We can make it clear in the Oozie setup docs; and also, if the user specifies a Python file but the zips are not there, the Spark Action (Executor?) could fail fast with a specific message about adding those zips. We might even be able to have Oozie reject the workflow at submission time if the requirements are not met (though that might require Spark Action-related code outside of the SparkActionExecutor and SparkMain classes, so maybe we shouldn't do it at submission time).

          Satish Subhashrao Saley, Peter Cseh, what do you think?
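          As a rough illustration of the fail-fast idea above (class, method, and file names are hypothetical, not actual Oozie code), the launcher could verify the zips exist before invoking spark-submit:

          import java.io.File;

          final class PySparkPreCheck {
              // Hypothetical pre-check: if the user submitted a Python primary resource
              // but the required zips are not in the container working directory, fail
              // fast with an actionable message instead of a cryptic Spark error.
              static void verifyPySparkZips() {
                  File pyspark = new File("pyspark.zip");
                  File py4j = new File("py4j-0.9-src.zip");
                  if (!pyspark.exists() || !py4j.exists()) {
                      throw new RuntimeException("PySpark job detected, but pyspark.zip and/or "
                          + "py4j-0.9-src.zip are missing; add them to the Spark sharelib, "
                          + "the sharelib mapping file, or the workflow lib/ directory.");
                  }
              }
          }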

          gezapeti Peter Cseh added a comment -

          Thank you for the patch Satish Subhashrao Saley, I've put together another patch based on yours that sets the SPARK_HOME variable instead of the archives and creates a python.zip file which can be added to the DistributedCache as an archive. I haven't finished that part, but I inserted a line into SparkActionExecutor.addShareLib to show how it would look. I don't like my solution because it's very Spark-version-specific and I find it ugly in general.

          Robert Kanter, making the users add these zips to the sharelib may work with some extra checks, but we will need to set one (or both) of the environment variables.
          I will test this out by tomorrow.

          (I had to change the assignee to be able to attach files)
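          For illustration only, shipping a python.zip through the DistributedCache as an archive might look roughly like the sketch below; the HDFS path and the 'python' symlink name are assumptions, not the contents of the attached patch.

          import java.net.URI;

          import org.apache.hadoop.conf.Configuration;
          import org.apache.hadoop.filecache.DistributedCache;

          final class PythonZipCache {
              static void addPythonZip(Configuration conf) throws Exception {
                  // Archives in the DistributedCache are unpacked in the container working
                  // directory under the URI fragment name, so with SPARK_HOME=. the
                  // '#python' symlink would make ./python/lib/* visible to Spark.
                  DistributedCache.addCacheArchive(
                          new URI("hdfs:///user/oozie/share/lib/spark/python.zip#python"), conf);
              }
          }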

          satishsaley Satish Subhashrao Saley added a comment -

          Setting SPARK_HOME=. will work as well, but we need to make sure that the pyspark and py4j zip files are under the $SPARK_HOME/python/lib/ directory, since Spark will look for them there in this code.

          The main reason for moving to Spark 1.6.1 is the version mismatch errors I faced while writing the tests:

          Exception: Python in worker has different version 2.7 than that in driver /Users/saley/src/oozie/sharelib/spark/target/test-data/minicluster/mapred/local/1_0/taskTracker/test/jobcache/job_0001/attempt_0001_m_000000_0/work/tmp/spark-f71bd1cd-72f6-458d-b3c2-930c5a0eeb00, PySpark cannot run with different minor versions
          

          Robert Kanter, I agree with you regarding documenting the change and adding appropriate error messages.
          Also, if users are already using oozie.service.ShareLibService.mapping.file for the Spark sharelib, then we can encourage them to add the paths for the pyspark and py4j zip files there. That way individual users do not need to copy the zip files into the workflow lib/ directory.
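          Restating the layout requirement above as code (a sketch; the zip filenames are the ones discussed in this thread), the archives have to resolve under python/lib relative to whatever SPARK_HOME points at:

          import java.io.File;

          final class SparkHomeLayout {
              // With SPARK_HOME=. (the container working directory), Spark 1.6.x looks
              // for its Python archives under $SPARK_HOME/python/lib.
              static File[] expectedZips(File sparkHome) {
                  File lib = new File(sparkHome, "python/lib");
                  return new File[] {
                          new File(lib, "pyspark.zip"),
                          new File(lib, "py4j-0.9-src.zip")
                  };
              }
          }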

          hadoopqa Hadoop QA added a comment -

          Testing JIRA OOZIE-2482

          Cleaning local git workspace

          ----------------------------

          -1 Patch failed to apply to head of branch

          ----------------------------

          gezapeti Peter Cseh added a comment -

          Regarding environment variables:
          I think the solution with PYSPARK_ARCHIVES_PATH is nicer, because you don't have to care about the python/lib directory structure, but I can't find any documentation about it.

          SPARK_HOME will likely stick around, but the paths needed relative to it, or the zip files themselves, may change, as they also don't show up in the documentation.

          Marcelo Vanzin, which one do you think is more robust? Are there any plans on changing, dropping, or officially supporting PYSPARK_ARCHIVES_PATH or the python/lib/*.zip structure?

          rkanter Robert Kanter added a comment -

          By the way, Satish Subhashrao Saley and Peter Cseh, if you use the --binary argument when generating the patch, it should include the binary content as part of the patch. e.g. git diff --no-prefix --binary ...

          {quote}
          Robert Kanter I agree with you regarding documenting the change and appropriate error messages.
          Also, if users are already using oozie.service.ShareLibService.mapping.file for spark sharelib, then we can encourage them to add paths for pyspark and py4j zip files in there. That way individual user does not need copy over the zip files in workflow lib/ directory.
          {quote}

          Right. Though that only covers the case where you're using the mapping file. We should make sure to also document how to do this if you don't use the mapping file. In CDH, for instance, I'm planning on having us (somehow) put the zip files in the Spark Sharelib as part of our build so users don't even have to worry about this; but we should document the three ways (mapping file, lib/, and sharelib) for other users.

          satishsaley Satish Subhashrao Saley added a comment -

          Including the zip files inside the patch. Thank you, Robert Kanter.

          hadoopqa Hadoop QA added a comment -

          Testing JIRA OOZIE-2482

          Cleaning local git workspace

          ----------------------------

          -1 Patch failed to apply to head of branch

          ----------------------------

          gezapeti Peter Cseh added a comment -

          It seems like the patch command does not handle binary files. I've created OOZIE-2532 with the details.

          vanzin Marcelo Vanzin added a comment -

          `PYSPARK_ARCHIVES_PATH` won't be removed unless there's a replacement for it. So if you want to use it, you can.

          I'm not sure about the location of the zip file, but I see no reason for it to change.

          rkanter Robert Kanter added a comment -

          Here's some feedback on the 2nd patch:

          1. Rename TestPyspark to TestPySpark
          2. The SparkActionExecutor should only add PYSPARK_ARCHIVES_PATH if the user is running a PySpark job (SparkMain already has a check for this and only does its PySpark stuff in that case); see the sketch after the log below
          3. Docs
            1. Add a section to the Install docs page about adding the two zip files to the sharelib dir or by the mapping file
            2. Update the Spark Action docs page to explain how to use PySpark and add a note about the zip files that links to the Install docs page
          4. TestPyspark fails. The stdout from one of the launcher jobs shows this:
            Error from python worker:
              /usr/bin/python: No module named pyspark
            PYTHONPATH was:
              /Users/rkanter/.m2/repository/org/apache/spark/spark-core_2.10/1.6.1/spark-core_2.10-1.6.1.jar
            java.io.EOFException
                    at java.io.DataInputStream.readInt(DataInputStream.java:392)
                    at org.apache.spark.api.python.PythonWorkerFactory.startDaemon(PythonWorkerFactory.scala:164)
                    at org.apache.spark.api.python.PythonWorkerFactory.createThroughDaemon(PythonWorkerFactory.scala:87)
                    at org.apache.spark.api.python.PythonWorkerFactory.create(PythonWorkerFactory.scala:63)
                    at org.apache.spark.SparkEnv.createPythonWorker(SparkEnv.scala:134)
                    at org.apache.spark.api.python.PythonRunner.compute(PythonRDD.scala:101)
                    at org.apache.spark.api.python.PythonRDD.compute(PythonRDD.scala:70)
                    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
                    at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
                    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
                    at org.apache.spark.scheduler.Task.run(Task.scala:89)
                    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
                    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
                    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
                    ... 1 more
            
            Intercepting System.exit(1)
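          Regarding item 2, a minimal sketch of the intended check (isPySpark() and the env map are illustrative stand-ins, not the actual SparkActionExecutor code):

          import java.util.Map;

          final class PySparkEnvSketch {
              // Only export PYSPARK_ARCHIVES_PATH when the primary resource is a
              // Python file; plain Java/Scala Spark jobs should be left untouched.
              static boolean isPySpark(String primaryResource) {
                  return primaryResource != null && primaryResource.endsWith(".py");
              }

              static void maybeSetArchivesPath(Map<String, String> launcherEnv, String primaryResource) {
                  if (isPySpark(primaryResource)) {
                      launcherEnv.put("PYSPARK_ARCHIVES_PATH", "pyspark.zip,py4j-0.9-src.zip");
                  }
              }
          }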
            
          BigDataOrange Alexandre Linte added a comment -

          Hi Satish Subhashrao Saley, sorry for the delay.
          I tried your solution (PYSPARK_ARCHIVES_PATH and py4j-0.9-src.zip + pyspark.zip). The result is better, but it's still not working 100% of the time. When it fails, I see the following logs on the Oozie server.

          2016-05-11 08:43:28,391  WARN CoordActionReadyXCommand:523 - USER[czfv1086] GROUP[-] TOKEN[] APP[coord_app_2ip_loadicxip] JOB[0000024-160426195954711-oozie-C] ACTION[] No actions to start for jobId=0000024-160426195954711-oozie-C as max concurrency reached!
          2016-05-11 08:43:30,971  INFO SparkActionExecutor:520 - USER[czfv1086] GROUP[-] TOKEN[] APP[wf_app_2ip_loadicxip] JOB[0000660-160510172237486-oozie-W] ACTION[0000660-160510172237486-oozie-W@launch_streaming] checking action, hadoop job ID [job_1461692698792_19420] status [RUNNING]
          2016-05-11 08:43:47,663  INFO StatusTransitService$StatusTransitRunnable:520 - USER[-] GROUP[-] TOKEN[-] APP[-] JOB[-] ACTION[-] Acquired lock for [org.apache.oozie.service.StatusTransitService]
          2016-05-11 08:43:47,664  INFO StatusTransitService$StatusTransitRunnable:520 - USER[-] GROUP[-] TOKEN[-] APP[-] JOB[-] ACTION[-] Running coordinator status service from last instance time =  2016-05-11T06:42Z
          2016-05-11 08:43:47,671  INFO StatusTransitService$StatusTransitRunnable:520 - USER[-] GROUP[-] TOKEN[-] APP[-] JOB[-] ACTION[-] Running bundle status service from last instance time =  2016-05-11T06:42Z
          2016-05-11 08:43:47,674  INFO StatusTransitService$StatusTransitRunnable:520 - USER[-] GROUP[-] TOKEN[-] APP[-] JOB[-] ACTION[-] Released lock for [org.apache.oozie.service.StatusTransitService]
          2016-05-11 08:43:52,311  INFO PauseTransitService:520 - USER[-] GROUP[-] TOKEN[-] APP[-] JOB[-] ACTION[-] Acquired lock for [org.apache.oozie.service.PauseTransitService]
          2016-05-11 08:43:52,338  INFO PauseTransitService:520 - USER[-] GROUP[-] TOKEN[-] APP[-] JOB[-] ACTION[-] Released lock for [org.apache.oozie.service.PauseTransitService]
          2016-05-11 08:43:54,799  INFO CallbackServlet:520 - USER[-] GROUP[-] TOKEN[-] APP[-] JOB[0000819-160510172237486-oozie-W] ACTION[0000819-160510172237486-oozie-W@spark-node] callback for action [0000819-160510172237486-oozie-W@spark-node]
          2016-05-11 08:43:55,130  INFO SparkActionExecutor:520 - USER[shfs3453] GROUP[-] TOKEN[] APP[PysparkPi-test] JOB[0000819-160510172237486-oozie-W] ACTION[0000819-160510172237486-oozie-W@spark-node] action completed, external ID [job_1461692698792_19524]
          2016-05-11 08:43:55,136  WARN SparkActionExecutor:523 - USER[shfs3453] GROUP[-] TOKEN[] APP[PysparkPi-test] JOB[0000819-160510172237486-oozie-W] ACTION[0000819-160510172237486-oozie-W@spark-node] Launcher ERROR, reason: Main class [org.apache.oozie.action.hadoop.SparkMain], main() threw exception, Application application_1461692698792_19525 finished with failed status
          2016-05-11 08:43:55,136  WARN SparkActionExecutor:523 - USER[shfs3453] GROUP[-] TOKEN[] APP[PysparkPi-test] JOB[0000819-160510172237486-oozie-W] ACTION[0000819-160510172237486-oozie-W@spark-node] Launcher exception: Application application_1461692698792_19525 finished with failed status
          org.apache.spark.SparkException: Application application_1461692698792_19525 finished with failed status
                  at org.apache.spark.deploy.yarn.Client.run(Client.scala:1034)
                  at org.apache.spark.deploy.yarn.Client$.main(Client.scala:1081)
                  at org.apache.spark.deploy.yarn.Client.main(Client.scala)
                  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
                  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
                  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
                  at java.lang.reflect.Method.invoke(Method.java:606)
                  at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
                  at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
                  at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
                  at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
                  at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
                  at org.apache.oozie.action.hadoop.SparkMain.runSpark(SparkMain.java:104)
                  at org.apache.oozie.action.hadoop.SparkMain.run(SparkMain.java:95)
                  at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:47)
                  at org.apache.oozie.action.hadoop.SparkMain.main(SparkMain.java:38)
                  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
                  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
                  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
                  at java.lang.reflect.Method.invoke(Method.java:606)
                  at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:236)
                  at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
                  at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
                  at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
                  at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.runSubtask(LocalContainerLauncher.java:380)
                  at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.runTask(LocalContainerLauncher.java:301)
                  at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.access$200(LocalContainerLauncher.java:187)
                  at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler$1.run(LocalContainerLauncher.java:230)
                  at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
                  at java.util.concurrent.FutureTask.run(FutureTask.java:262)
                  at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
                  at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
                  at java.lang.Thread.run(Thread.java:745)
          
          2016-05-11 08:43:55,193  INFO ActionEndXCommand:520 - USER[shfs3453] GROUP[-] TOKEN[] APP[PysparkPi-test] JOB[0000819-160510172237486-oozie-W] ACTION[0000819-160510172237486-oozie-W@spark-node] ERROR is considered as FAILED for SLA
          2016-05-11 08:43:55,319  INFO ActionStartXCommand:520 - USER[shfs3453] GROUP[-] TOKEN[] APP[PysparkPi-test] JOB[0000819-160510172237486-oozie-W] ACTION[0000819-160510172237486-oozie-W@fail] Start action [0000819-160510172237486-oozie-W@fail] with user-retry state : userRetryCount [0], userRetryMax [0], userRetryInterval [10]
          2016-05-11 08:43:55,321  INFO ActionStartXCommand:520 - USER[shfs3453] GROUP[-] TOKEN[] APP[PysparkPi-test] JOB[0000819-160510172237486-oozie-W] ACTION[0000819-160510172237486-oozie-W@fail] [***0000819-160510172237486-oozie-W@fail***]Action status=DONE
          2016-05-11 08:43:55,321  INFO ActionStartXCommand:520 - USER[shfs3453] GROUP[-] TOKEN[] APP[PysparkPi-test] JOB[0000819-160510172237486-oozie-W] ACTION[0000819-160510172237486-oozie-W@fail] [***0000819-160510172237486-oozie-W@fail***]Action updated in DB!
          2016-05-11 08:43:55,401  INFO WorkflowNotificationXCommand:520 - USER[-] GROUP[-] TOKEN[-] APP[-] JOB[0000819-160510172237486-oozie-W] ACTION[0000819-160510172237486-oozie-W@fail] No Notification URL is defined. Therefore nothing to notify for job 0000819-160510172237486-oozie-W@fail
          2016-05-11 08:43:55,401  INFO WorkflowNotificationXCommand:520 - USER[-] GROUP[-] TOKEN[-] APP[-] JOB[0000819-160510172237486-oozie-W] ACTION[] No Notification URL is defined. Therefore nothing to notify for job 0000819-160510172237486-oozie-W
          2016-05-11 08:43:55,402  INFO WorkflowNotificationXCommand:520 - USER[-] GROUP[-] TOKEN[-] APP[-] JOB[0000819-160510172237486-oozie-W] ACTION[0000819-160510172237486-oozie-W@spark-node] No Notification URL is defined. Therefore nothing to notify for job 0000819-160510172237486-oozie-W@spark-node
          
          satishsaley Satish Subhashrao Saley added a comment -

          Could you please share the logs for application_1461692698792_19525?

          satishsaley Satish Subhashrao Saley added a comment -

          Thank you for the review, Robert. I set spark.executorEnv.PYTHONPATH=pyspark.zip:py4j-0.9-src.zip and it started working. I am checking whether we should set it by default as well.
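          For reference, the workaround amounts to adding one more --conf pair when building the spark-submit arguments; this is a hedged sketch, with the argument-list handling invented for illustration:

          import java.util.ArrayList;
          import java.util.List;

          final class PythonPathConf {
              // Append the executor-side PYTHONPATH so the Python workers can import
              // pyspark and py4j from the zips shipped with the job.
              static List<String> withExecutorPythonPath(List<String> sparkArgs) {
                  List<String> args = new ArrayList<String>(sparkArgs);
                  args.add("--conf");
                  args.add("spark.executorEnv.PYTHONPATH=pyspark.zip:py4j-0.9-src.zip");
                  return args;
              }
          }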

          BigDataOrange Alexandre Linte added a comment -

          Hi Satish Subhashrao Saley, I won't be able to give you the logs for application_1461692698792_19525; the logs were purged.

          Here are the logs for a PySpark job that fails with the same error (application_1461692698792_29704 / application_1461692698792_29705).

          OOZIE LOGS

          2016-05-20 17:42:00,627  INFO CallbackServlet:520 - USER[-] GROUP[-] TOKEN[-] APP[-] JOB[0012689-160510172237486-oozie-W] ACTION[0012689-160510172237486-oozie-W@spark-node] callback for action [0012689-160510172237486-oozie-W@spark-node]
          2016-05-20 17:42:00,892  INFO SparkActionExecutor:520 - USER[shfs3453] GROUP[-] TOKEN[] APP[PysparkPi-test] JOB[0012689-160510172237486-oozie-W] ACTION[0012689-160510172237486-oozie-W@spark-node] action completed, external ID [job_1461692698792_29704]
          2016-05-20 17:42:00,897  WARN SparkActionExecutor:523 - USER[shfs3453] GROUP[-] TOKEN[] APP[PysparkPi-test] JOB[0012689-160510172237486-oozie-W] ACTION[0012689-160510172237486-oozie-W@spark-node] Launcher ERROR, reason: Main class [org.apache.oozie.action.hadoop.SparkMain], main() threw exception, Application application_1461692698792_29705 finished with failed status
          2016-05-20 17:42:00,897  WARN SparkActionExecutor:523 - USER[shfs3453] GROUP[-] TOKEN[] APP[PysparkPi-test] JOB[0012689-160510172237486-oozie-W] ACTION[0012689-160510172237486-oozie-W@spark-node] Launcher exception: Application application_1461692698792_29705 finished with failed status
          org.apache.spark.SparkException: Application application_1461692698792_29705 finished with failed status
                  at org.apache.spark.deploy.yarn.Client.run(Client.scala:1034)
                  at org.apache.spark.deploy.yarn.Client$.main(Client.scala:1081)
                  at org.apache.spark.deploy.yarn.Client.main(Client.scala)
                  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
                  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
                  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
                  at java.lang.reflect.Method.invoke(Method.java:606)
                  at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
                  at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
                  at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
                  at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
                  at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
                  at org.apache.oozie.action.hadoop.SparkMain.runSpark(SparkMain.java:104)
                  at org.apache.oozie.action.hadoop.SparkMain.run(SparkMain.java:95)
                  at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:47)
                  at org.apache.oozie.action.hadoop.SparkMain.main(SparkMain.java:38)
                  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
                  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
                  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
                  at java.lang.reflect.Method.invoke(Method.java:606)
                  at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:236)
                  at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
                  at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
                  at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
                  at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.runSubtask(LocalContainerLauncher.java:380)
                  at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.runTask(LocalContainerLauncher.java:301)
                  at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.access$200(LocalContainerLauncher.java:187)
                  at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler$1.run(LocalContainerLauncher.java:230)
                  at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
                  at java.util.concurrent.FutureTask.run(FutureTask.java:262)
                  at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
                  at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
                  at java.lang.Thread.run(Thread.java:745)
          
          2016-05-20 17:42:01,017  INFO ActionEndXCommand:520 - USER[shfs3453] GROUP[-] TOKEN[] APP[PysparkPi-test] JOB[0012689-160510172237486-oozie-W] ACTION[0012689-160510172237486-oozie-W@spark-node] ERROR is considered as FAILED for SLA
          2016-05-20 17:42:01,080  INFO ActionStartXCommand:520 - USER[shfs3453] GROUP[-] TOKEN[] APP[PysparkPi-test] JOB[0012689-160510172237486-oozie-W] ACTION[0012689-160510172237486-oozie-W@fail] Start action [0012689-160510172237486-oozie-W@fail] with user-retry state : userRetryCount [0], userRetryMax [0], userRetryInterval [10]
          2016-05-20 17:42:01,081  INFO ActionStartXCommand:520 - USER[shfs3453] GROUP[-] TOKEN[] APP[PysparkPi-test] JOB[0012689-160510172237486-oozie-W] ACTION[0012689-160510172237486-oozie-W@fail] [***0012689-160510172237486-oozie-W@fail***]Action status=DONE
          

          RESOURCE MANAGER LOGS

          2016-05-20 17:41:51,123 WARN org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl: The specific max attempts: 0 for application: 29705 is invalid, because it is out of the range [1, 2]. Use the global max attempts instead.
          2016-05-20 17:41:52,880 WARN org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl: The specific max attempts: 0 for application: 29706 is invalid, because it is out of the range [1, 2]. Use the global max attempts instead.
          2016-05-20 17:41:58,161 WARN org.apache.hadoop.yarn.server.resourcemanager.RMAuditLogger: USER=shfs3453 OPERATION=Application Finished - Failed TARGET=RMAppManager     RESULT=FAILURE  DESCRIPTION=App failed with state: FAILED       PERMISSIONS=Application application_1461692698792_29705 failed 2 times due to AM Container for appattempt_1461692698792_29705_000002 exited with  exitCode: 1
          For more detailed output, check application tracking page:http://uabigrm02.rouen.francetelecom.fr:8088/cluster/app/application_1461692698792_29705Then, click on links to logs of each attempt.
          Diagnostics: Exception from container-launch.
          Container id: container_1461692698792_29705_02_000001
          Exit code: 1
          Stack trace: ExitCodeException exitCode=1:
                  at org.apache.hadoop.util.Shell.runCommand(Shell.java:545)
                  at org.apache.hadoop.util.Shell.run(Shell.java:456)
                  at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:722)
                  at org.apache.hadoop.yarn.server.nodemanager.LinuxContainerExecutor.launchContainer(LinuxContainerExecutor.java:297)
                  at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
                  at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
                  at java.util.concurrent.FutureTask.run(FutureTask.java:262)
                  at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
                  at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
                  at java.lang.Thread.run(Thread.java:745)
          
          Shell output: main : command provided 1
          main : user is shfs3453
          main : requested yarn user is shfs3453
          
          
          Container exited with a non-zero exit code 1
          Failing this attempt. Failing the application.  APPID=application_1461692698792_29705
          2016-05-20 17:41:59,375 INFO SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for appattempt_1461692698792_29706_000001 (auth:SIMPLE)
          2016-05-20 17:41:59,381 INFO SecurityLogger.org.apache.hadoop.security.authorize.ServiceAuthorizationManager: Authorization successful for appattempt_1461692698792_29706_000001 (auth:TOKEN) for protocol=interface org.apache.hadoop.yarn.api.ApplicationMasterProtocolPB
          2016-05-20 17:42:00,745 INFO SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for oozie/uabigord01.rouen.francetelecom.fr@SANDBOX.HADOOP (auth:KERBEROS)
          

          DATANODE LOGS (stderr)

          SLF4J: Class path contains multiple SLF4J bindings.
          SLF4J: Found binding in [jar:file:/mnt/hd3/hadoop/yarn/local/filecache/187/slf4j-log4j12-1.6.6.jar!/org/slf4j/impl/StaticLoggerBinder.class]
          SLF4J: Found binding in [jar:file:/mnt/hd9/hadoop/yarn/local/filecache/249/spark-assembly-1.6.1-hadoop2.7.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
          SLF4J: Found binding in [jar:file:/opt/application/Hadoop/hadoop-2.7.2/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
          SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
          SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
          log4j:ERROR setFile(null,true) call failed.
          java.io.FileNotFoundException: /mnt/hd6/hadoop/yarn/log/application_1461692698792_29704/container_1461692698792_29704_01_000001 (Is a directory)
                  at java.io.FileOutputStream.open(Native Method)
                  at java.io.FileOutputStream.<init>(FileOutputStream.java:221)
                  at java.io.FileOutputStream.<init>(FileOutputStream.java:142)
                  at org.apache.log4j.FileAppender.setFile(FileAppender.java:294)
                  at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
                  at org.apache.hadoop.yarn.ContainerLogAppender.activateOptions(ContainerLogAppender.java:55)
                  at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
                  at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
                  at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
                  at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:809)
                  at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:735)
                  at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:615)
                  at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:502)
                  at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:547)
                  at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:483)
                  at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
                  at org.slf4j.impl.Log4jLoggerFactory.getLogger(Log4jLoggerFactory.java:64)
                  at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:270)
                  at org.apache.commons.logging.impl.SLF4JLogFactory.getInstance(SLF4JLogFactory.java:155)
                  at org.apache.commons.logging.impl.SLF4JLogFactory.getInstance(SLF4JLogFactory.java:132)
                  at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:275)
                  at org.apache.hadoop.service.AbstractService.<clinit>(AbstractService.java:43)
          May 20, 2016 5:41:44 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
          INFO: Registering org.apache.hadoop.mapreduce.v2.app.webapp.JAXBContextResolver as a provider class
          May 20, 2016 5:41:44 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
          INFO: Registering org.apache.hadoop.yarn.webapp.GenericExceptionHandler as a provider class
          May 20, 2016 5:41:44 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
          INFO: Registering org.apache.hadoop.mapreduce.v2.app.webapp.AMWebServices as a root resource class
          May 20, 2016 5:41:44 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
          INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
          May 20, 2016 5:41:44 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
          INFO: Binding org.apache.hadoop.mapreduce.v2.app.webapp.JAXBContextResolver to GuiceManagedComponentProvider with the scope "Singleton"
          May 20, 2016 5:41:44 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
          INFO: Binding org.apache.hadoop.yarn.webapp.GenericExceptionHandler to GuiceManagedComponentProvider with the scope "Singleton"
          May 20, 2016 5:41:45 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
          INFO: Binding org.apache.hadoop.mapreduce.v2.app.webapp.AMWebServices to GuiceManagedComponentProvider with the scope "PerRequest"
          Using properties file: null
          Parsed arguments:
            master                  yarn-cluster
            deployMode              cluster
            executorMemory          null
            executorCores           null
            totalExecutorCores      null
            propertiesFile          null
            driverMemory            null
            driverCores             null
            driverExtraClassPath    null
            driverExtraLibraryPath  null
            driverExtraJavaOptions  null
            supervise               false
            queue                   null
            numExecutors            null
            files                   null
            pyFiles                 hdfs://sandbox/User/shfs3453/WORK/Oozie/pyspark/lib/py4j-0.9-src.zip,hdfs://sandbox/User/shfs3453/WORK/Oozie/pyspark/lib/pyspark.zip
            archives                null
            mainClass               null
            primaryResource         hdfs://sandbox/User/shfs3453/WORK/Oozie/pyspark/lib/pi.py
            name                    Pysparkpi example
            childArgs               [100]
            jars                    null
            packages                null
            packagesExclusions      null
            repositories            null
            verbose                 true
          
          Spark properties used, including those specified through
           --conf and those from the properties file null:
            spark.executorEnv.SPARK_HOME -> /opt/application/Spark/current
            spark.executorEnv.PYTHONPATH -> /opt/application/Spark/current/python
            spark.yarn.appMasterEnv.SPARK_HOME -> /opt/application/Spark/current
          
          
          Main class:
          org.apache.spark.deploy.yarn.Client
          Arguments:
          --name
          Pysparkpi example
          --primary-py-file
          hdfs://sandbox/User/shfs3453/WORK/Oozie/pyspark/lib/pi.py
          --py-files
          hdfs://sandbox/User/shfs3453/WORK/Oozie/pyspark/lib/py4j-0.9-src.zip,hdfs://sandbox/User/shfs3453/WORK/Oozie/pyspark/lib/pyspark.zip
          --class
          org.apache.spark.deploy.PythonRunner
          --arg
          100
          System properties:
          spark.executorEnv.SPARK_HOME -> /opt/application/Spark/current
          spark.executorEnv.PYTHONPATH -> /opt/application/Spark/current/python
          SPARK_SUBMIT -> true
          spark.app.name -> Pysparkpi example
          spark.submit.deployMode -> cluster
          spark.yarn.appMasterEnv.SPARK_HOME -> /opt/application/Spark/current
          spark.yarn.isPython -> true
          spark.master -> yarn-cluster
          Classpath elements:
          
          
          
          Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SparkMain], main() threw exception, Application application_1461692698792_29705 finished with failed status
          org.apache.spark.SparkException: Application application_1461692698792_29705 finished with failed status
                  at org.apache.spark.deploy.yarn.Client.run(Client.scala:1034)
                  at org.apache.spark.deploy.yarn.Client$.main(Client.scala:1081)
                  at org.apache.spark.deploy.yarn.Client.main(Client.scala)
                  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
                  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
                  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
                  at java.lang.reflect.Method.invoke(Method.java:606)
                  at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
                  at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
                  at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
                  at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
                  at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
                  at org.apache.oozie.action.hadoop.SparkMain.runSpark(SparkMain.java:104)
                  at org.apache.oozie.action.hadoop.SparkMain.run(SparkMain.java:95)
                  at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:47)
                  at org.apache.oozie.action.hadoop.SparkMain.main(SparkMain.java:38)
                  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
                  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
                  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
                  at java.lang.reflect.Method.invoke(Method.java:606)
                  at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:236)
                  at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
                  at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
                  at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
                  at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.runSubtask(LocalContainerLauncher.java:380)
                  at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.runTask(LocalContainerLauncher.java:301)
                  at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.access$200(LocalContainerLauncher.java:187)
                  at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler$1.run(LocalContainerLauncher.java:230)
                  at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
                  at java.util.concurrent.FutureTask.run(FutureTask.java:262)
                  at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
                  at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
                  at java.lang.Thread.run(Thread.java:745)
          log4j:WARN No appenders could be found for logger (org.apache.hadoop.mapreduce.v2.app.MRAppMaster).
          log4j:WARN Please initialize the log4j system properly.
          log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
          

          DATANODE LOGS (stdout)

          Oozie Launcher starts
          
          Heart beat
          {"properties":[{"key":"oozie.launcher.job.id","value":"job_1461692698792_29704","isFinal":false,"resource":"programatically"},{"key":"oozie.job.id","value":"0012689-160510172237486-oozie-W","isFinal":false,"resource":"programatically"},{"key":"oozie.action.id","value":"0012689-160510172237486-oozie-W@spark-node","isFinal":false,"resource":"programatically"},{"key":"mapreduce.job.tags","value":"oozie-f217792cb72212277adf42b1fef23939","isFinal":false,"resource":"programatically"}]}Starting the execution of prepare actions
          Completed the execution of prepare actions successfully
          
          Files in current dir:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/.
          ======================
          File: jackson-databind-2.3.1.jar
          File: jets3t-0.7.1.jar
          File: libhadoop.a
          File: propagation-conf.xml
          File: paranamer-2.6.jar
          File: zookeeper-3.4.6.jar
          File: tachyon-client-0.5.0.jar
          File: spark-graphx_2.10-1.6.1.jar
          File: akka-actor_2.10-2.2.3-shaded-protobuf.jar
          File: spark-catalyst_2.10-1.6.1.jar
          File: jetty-webapp-8.1.14.v20131031.jar
          File: slf4j-api-1.6.6.jar
          File: javax.mail.glassfish-1.4.1.v201005082020.jar
          File: spark-mllib_2.10-1.6.1.jar
          File: jsr305-1.3.9.jar
          File: libhadoop.so.1.0.0
          File: jetty-security-8.1.14.v20131031.jar
          File: spark-core_2.10-1.6.1.jar
          File: scala-compiler-2.10.0.jar
          File: protobuf-java-2.4.1-shaded.jar
          File: akka-remote_2.10-2.2.3-shaded-protobuf.jar
          File: metrics-jvm-3.0.2.jar
          File: metrics-graphite-3.0.0.jar
          File: netty-all-4.0.23.Final.jar
          File: container_tokens
          File: objenesis-1.2.jar
          File: curator-recipes-2.5.0.jar
          File: chill-java-0.3.6.jar
          File: json4s-ast_2.10-3.2.10.jar
          File: commons-io-2.1.jar
          File: jetty-io-8.1.14.v20131031.jar
          File: jetty-continuation-8.1.14.v20131031.jar
          File: action.xml
          File: launch_container.sh
          File: stream-2.7.0.jar
          File: py4j-0.9-src.zip
          File: .action.xml.crc
          File: minlog-1.2.jar
          File: libhdfs.a
          File: netty-3.6.6.Final.jar
          File: jetty-util-8.1.14.v20131031.jar
          File: spark-yarn_2.10-1.6.1.jar
          File: scalap-2.10.0.jar
          File: job.xml
          File: json-simple-1.1.jar
          File: jetty-http-8.1.14.v20131031.jar
          File: pyrolite-2.0.1.jar
          File: log4j-1.2.16.jar
          File: jline-0.9.94.jar
          File: uncommons-maths-1.2.2a.jar
          File: slf4j-log4j12-1.6.6.jar
          File: py4j-0.8.2.1.jar
          File: json4s-jackson_2.10-3.2.10.jar
          Dir: jobSubmitDir
            File: job.splitmetainfo
            File: job.split
          File: metrics-core-3.0.2.jar
          File: jblas-1.2.3.jar
          File: libhdfs.so.0.0.0
          File: curator-client-2.7.1.jar
          File: spark-sql_2.10-1.6.1.jar
          File: concurrent-1.3.4.jar
          File: jetty-jndi-8.1.14.v20131031.jar
          File: oozie-sharelib-spark-4.2.0.jar
          File: curator-framework-2.5.0.jar
          File: pyspark.zip
          File: javax.activation-1.1.0.v201105071233.jar
          File: tachyon-0.6.4.jar
          File: reflectasm-1.07-shaded.jar
          File: spark-tools_2.10-1.6.1.jar
          File: GC.log
          File: libhadooppipes.a
          File: spark-assembly-1.6.1-hadoop2.7.2.jar
          File: javax.servlet-3.0.0.v201112011016.jar
          File: jetty-servlet-8.1.14.v20131031.jar
          File: scala-library-2.10.4.jar
          File: jackson-annotations-2.3.0.jar
          File: libhadoop.so
          File: akka-slf4j_2.10-2.2.3-shaded-protobuf.jar
          File: spark-network-yarn_2.10-1.6.1.jar
          File: commons-lang3-3.3.2.jar
          File: compress-lzf-1.0.0.jar
          File: spark-1.6.1-yarn-shuffle.jar
          File: jackson-core-2.3.1.jar
          File: json4s-core_2.10-3.2.10.jar
          File: jetty-plus-8.1.14.v20131031.jar
          File: config-1.0.2.jar
          File: oozie-sharelib-oozie-4.2.0.jar
          File: scala-reflect-2.10.0.jar
          File: jetty-xml-8.1.14.v20131031.jar
          File: jetty-server-8.1.14.v20131031.jar
          File: commons-lang-2.4.jar
          File: snappy-java-1.0.5.3.jar
          File: libhdfs.so
          File: kryo-2.21.jar
          File: spark-streaming_2.10-1.6.1.jar
          File: commons-httpclient-3.1.jar
          File: guava-14.0.1.jar
          File: commons-codec-1.4.jar
          File: libhadooputils.a
          Dir: tmp
            Dir: Jetty_0_0_0_0_54973_mapreduce____.j5f3a5
          File: chill_2.10-0.3.6.jar
          File: lz4-1.2.0.jar
          File: javax.transaction-1.1.1.v201105210645.jar
          File: commons-net-2.2.jar
          File: spark-hive_2.10-1.6.1.jar
          File: metrics-json-3.0.2.jar
          File: colt-1.2.0.jar
          File: oozie-hadoop-utils-hadoop-2-4.2.0.jar
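
          Note: pyspark.zip and py4j-0.9-src.zip are localized into the container (see the listing above), so the Python libraries themselves are shipped with the job; it is only the SPARK_HOME lookup that fails. A workaround that has been suggested for this kind of failure, sketched below and not verified on my cluster (the /dev/null value is just a dummy placeholder, and the action name and ${jobTracker}/${nameNode} placeholders are illustrative), is to inject the variable through Spark's standard YARN environment properties in the workflow's spark action:

              <spark xmlns="uri:oozie:spark-action:0.1">
                  <job-tracker>${jobTracker}</job-tracker>
                  <name-node>${nameNode}</name-node>
                  <master>yarn-cluster</master>
                  <name>Pyspark-pi</name>
                  <jar>pi.py</jar>
                  <spark-opts>--conf spark.yarn.appMasterEnv.SPARK_HOME=/dev/null --conf spark.executorEnv.SPARK_HOME=/dev/null</spark-opts>
                  <arg>100</arg>
              </spark>

          spark.yarn.appMasterEnv.[name] and spark.executorEnv.[name] are regular Spark-on-YARN properties, so the variable reaches both the application master and the executors even though no Spark installation exists on the NodeManagers.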
          
          Oozie Java/Map-Reduce/Pig action launcher-job configuration
          =================================================================
          Workflow job id   : 0012689-160510172237486-oozie-W
          Workflow action id: 0012689-160510172237486-oozie-W@spark-node
          
          Classpath         :
          ------------------------
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001
            job.jar/job.jar
            job.jar/classes/
            job.jar/lib/*
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jackson-databind-2.3.1.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jets3t-0.7.1.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/paranamer-2.6.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/zookeeper-3.4.6.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/tachyon-client-0.5.0.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/spark-graphx_2.10-1.6.1.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/akka-actor_2.10-2.2.3-shaded-protobuf.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/spark-catalyst_2.10-1.6.1.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jetty-webapp-8.1.14.v20131031.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/slf4j-api-1.6.6.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/javax.mail.glassfish-1.4.1.v201005082020.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/spark-mllib_2.10-1.6.1.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jsr305-1.3.9.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jetty-security-8.1.14.v20131031.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/spark-core_2.10-1.6.1.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/scala-compiler-2.10.0.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/protobuf-java-2.4.1-shaded.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/akka-remote_2.10-2.2.3-shaded-protobuf.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/metrics-jvm-3.0.2.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/metrics-graphite-3.0.0.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/netty-all-4.0.23.Final.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/objenesis-1.2.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/curator-recipes-2.5.0.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/chill-java-0.3.6.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/json4s-ast_2.10-3.2.10.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/commons-io-2.1.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jetty-io-8.1.14.v20131031.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jetty-continuation-8.1.14.v20131031.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/stream-2.7.0.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/minlog-1.2.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/netty-3.6.6.Final.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jetty-util-8.1.14.v20131031.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/spark-yarn_2.10-1.6.1.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/scalap-2.10.0.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/json-simple-1.1.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jetty-http-8.1.14.v20131031.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/pyrolite-2.0.1.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/log4j-1.2.16.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jline-0.9.94.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/uncommons-maths-1.2.2a.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/slf4j-log4j12-1.6.6.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/py4j-0.8.2.1.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/json4s-jackson_2.10-3.2.10.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/metrics-core-3.0.2.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jblas-1.2.3.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/curator-client-2.7.1.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/spark-sql_2.10-1.6.1.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/concurrent-1.3.4.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jetty-jndi-8.1.14.v20131031.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/oozie-sharelib-spark-4.2.0.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/curator-framework-2.5.0.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/javax.activation-1.1.0.v201105071233.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/tachyon-0.6.4.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/reflectasm-1.07-shaded.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/spark-tools_2.10-1.6.1.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/spark-assembly-1.6.1-hadoop2.7.2.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/javax.servlet-3.0.0.v201112011016.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jetty-servlet-8.1.14.v20131031.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/scala-library-2.10.4.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jackson-annotations-2.3.0.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/akka-slf4j_2.10-2.2.3-shaded-protobuf.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/spark-network-yarn_2.10-1.6.1.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/commons-lang3-3.3.2.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/compress-lzf-1.0.0.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/spark-1.6.1-yarn-shuffle.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jackson-core-2.3.1.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/json4s-core_2.10-3.2.10.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jetty-plus-8.1.14.v20131031.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/config-1.0.2.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/oozie-sharelib-oozie-4.2.0.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/scala-reflect-2.10.0.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jetty-xml-8.1.14.v20131031.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jetty-server-8.1.14.v20131031.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/commons-lang-2.4.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/snappy-java-1.0.5.3.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/kryo-2.21.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/spark-streaming_2.10-1.6.1.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/commons-httpclient-3.1.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/guava-14.0.1.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/commons-codec-1.4.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/chill_2.10-0.3.6.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/lz4-1.2.0.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/javax.transaction-1.1.1.v201105210645.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/commons-net-2.2.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/spark-hive_2.10-1.6.1.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/metrics-json-3.0.2.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/colt-1.2.0.jar
            /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/oozie-hadoop-utils-hadoop-2-4.2.0.jar
            /opt/application/Hadoop/current/etc/hadoop/
            /opt/application/Hadoop/current/share/hadoop/common/hadoop-common-2.7.2.jar
            /opt/application/Hadoop/current/share/hadoop/common/hadoop-nfs-2.7.2.jar
            /opt/application/Hadoop/current/share/hadoop/common/hadoop-common-2.7.2-tests.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/jersey-server-1.9.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/guava-11.0.2.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/jsp-api-2.1.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/junit-4.11.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/commons-codec-1.4.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/netty-3.6.2.Final.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/httpcore-4.2.5.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/asm-3.2.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/jetty-util-6.1.26.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/curator-framework-2.7.1.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/commons-digester-1.8.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/jetty-6.1.26.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/jackson-mapper-asl-1.9.13.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/httpclient-4.2.5.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/commons-collections-3.2.2.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/hadoop-annotations-2.7.2.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/protobuf-java-2.5.0.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/hadoop-auth-2.7.2.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/paranamer-2.3.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/api-asn1-api-1.0.0-M20.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/zookeeper-3.4.6.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/htrace-core-3.1.0-incubating.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/xmlenc-0.52.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/commons-cli-1.2.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/xz-1.0.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/jsch-0.1.42.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/apacheds-kerberos-codec-2.0.0-M15.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/api-util-1.0.0-M20.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/jersey-core-1.9.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/stax-api-1.0-2.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/jersey-json-1.9.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/avro-1.7.4.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/log4j-1.2.17.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/commons-logging-1.1.3.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/jackson-xc-1.9.13.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/slf4j-api-1.7.10.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/java-xmlbuilder-0.4.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/activation-1.1.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/jettison-1.1.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/servlet-api-2.5.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/jaxb-api-2.2.2.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/apacheds-i18n-2.0.0-M15.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/commons-httpclient-3.1.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/snappy-java-1.0.4.1.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/commons-beanutils-1.7.0.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/commons-io-2.4.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/mockito-all-1.8.5.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/jsr305-3.0.0.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/commons-math3-3.1.1.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/gson-2.2.4.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/hamcrest-core-1.3.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/curator-recipes-2.7.1.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/jets3t-0.9.0.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/jackson-jaxrs-1.9.13.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/commons-net-3.1.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/commons-lang-2.6.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/commons-compress-1.4.1.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/curator-client-2.7.1.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/jackson-core-asl-1.9.13.jar
            /opt/application/Hadoop/current/share/hadoop/common/lib/commons-configuration-1.6.jar
            /opt/application/Hadoop/current/share/hadoop/hdfs/hadoop-hdfs-2.7.2.jar
            /opt/application/Hadoop/current/share/hadoop/hdfs/hadoop-hdfs-2.7.2-tests.jar
            /opt/application/Hadoop/current/share/hadoop/hdfs/hadoop-hdfs-nfs-2.7.2.jar
            /opt/application/Hadoop/current/share/hadoop/hdfs/lib/jersey-server-1.9.jar
            /opt/application/Hadoop/current/share/hadoop/hdfs/lib/guava-11.0.2.jar
            /opt/application/Hadoop/current/share/hadoop/hdfs/lib/commons-codec-1.4.jar
            /opt/application/Hadoop/current/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar
            /opt/application/Hadoop/current/share/hadoop/hdfs/lib/asm-3.2.jar
            /opt/application/Hadoop/current/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar
            /opt/application/Hadoop/current/share/hadoop/hdfs/lib/jetty-6.1.26.jar
            /opt/application/Hadoop/current/share/hadoop/hdfs/lib/jackson-mapper-asl-1.9.13.jar
            /opt/application/Hadoop/current/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar
            /opt/application/Hadoop/current/share/hadoop/hdfs/lib/leveldbjni-all-1.8.jar
            /opt/application/Hadoop/current/share/hadoop/hdfs/lib/netty-all-4.0.23.Final.jar
            /opt/application/Hadoop/current/share/hadoop/hdfs/lib/htrace-core-3.1.0-incubating.jar
            /opt/application/Hadoop/current/share/hadoop/hdfs/lib/xmlenc-0.52.jar
            /opt/application/Hadoop/current/share/hadoop/hdfs/lib/commons-cli-1.2.jar
            /opt/application/Hadoop/current/share/hadoop/hdfs/lib/jersey-core-1.9.jar
            /opt/application/Hadoop/current/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar
            /opt/application/Hadoop/current/share/hadoop/hdfs/lib/log4j-1.2.17.jar
            /opt/application/Hadoop/current/share/hadoop/hdfs/lib/commons-logging-1.1.3.jar
            /opt/application/Hadoop/current/share/hadoop/hdfs/lib/servlet-api-2.5.jar
            /opt/application/Hadoop/current/share/hadoop/hdfs/lib/commons-io-2.4.jar
            /opt/application/Hadoop/current/share/hadoop/hdfs/lib/jsr305-3.0.0.jar
            /opt/application/Hadoop/current/share/hadoop/hdfs/lib/xercesImpl-2.9.1.jar
            /opt/application/Hadoop/current/share/hadoop/hdfs/lib/xml-apis-1.3.04.jar
            /opt/application/Hadoop/current/share/hadoop/hdfs/lib/commons-lang-2.6.jar
            /opt/application/Hadoop/current/share/hadoop/hdfs/lib/jackson-core-asl-1.9.13.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.2.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.7.2.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-client-2.7.2.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/spatial-sdk-hadoop.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.7.2.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.7.2.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-server-common-2.7.2.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-server-applicationhistoryservice-2.7.2.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.7.2.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/spatial-sdk-json.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/esri-geometry-api.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-common-2.7.2.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-server-sharedcachemanager-2.7.2.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-server-tests-2.7.2.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-registry-2.7.2.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/spatial-sdk-hive.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-api-2.7.2.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/lib/jersey-server-1.9.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/lib/guava-11.0.2.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/lib/jaxb-impl-2.2.3-1.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/lib/commons-codec-1.4.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/lib/netty-3.6.2.Final.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/lib/asm-3.2.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/lib/jetty-util-6.1.26.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/lib/jetty-6.1.26.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/lib/jackson-mapper-asl-1.9.13.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/lib/commons-collections-3.2.2.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/lib/guice-servlet-3.0.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/lib/jersey-guice-1.9.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/lib/guice-3.0.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/lib/zookeeper-3.4.6.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/lib/leveldbjni-all-1.8.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/lib/commons-cli-1.2.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/lib/xz-1.0.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/lib/jersey-core-1.9.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/lib/stax-api-1.0-2.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/lib/jersey-json-1.9.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/lib/log4j-1.2.17.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/lib/commons-logging-1.1.3.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/lib/jackson-xc-1.9.13.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/lib/activation-1.1.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/lib/jettison-1.1.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/lib/servlet-api-2.5.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/lib/jaxb-api-2.2.2.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/lib/jersey-client-1.9.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/lib/commons-io-2.4.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/lib/jsr305-3.0.0.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/lib/zookeeper-3.4.6-tests.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/lib/jackson-jaxrs-1.9.13.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/lib/commons-lang-2.6.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/lib/commons-compress-1.4.1.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/lib/jackson-core-asl-1.9.13.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/lib/aopalliance-1.0.jar
            /opt/application/Hadoop/current/share/hadoop/yarn/lib/javax.inject-1.jar
            /opt/application/Hadoop/current/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.7.2.jar
            /opt/application/Hadoop/current/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.7.2.jar
            /opt/application/Hadoop/current/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.7.2.jar
            /opt/application/Hadoop/current/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.7.2.jar
            /opt/application/Hadoop/current/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.7.2.jar
            /opt/application/Hadoop/current/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.7.2-tests.jar
            /opt/application/Hadoop/current/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.7.2.jar
            /opt/application/Hadoop/current/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.2.jar
            /opt/application/Hadoop/current/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.2.jar
            /opt/application/Hadoop/current/share/hadoop/mapreduce/lib/jersey-server-1.9.jar
            /opt/application/Hadoop/current/share/hadoop/mapreduce/lib/junit-4.11.jar
            /opt/application/Hadoop/current/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar
            /opt/application/Hadoop/current/share/hadoop/mapreduce/lib/asm-3.2.jar
            /opt/application/Hadoop/current/share/hadoop/mapreduce/lib/jackson-mapper-asl-1.9.13.jar
            /opt/application/Hadoop/current/share/hadoop/mapreduce/lib/hadoop-annotations-2.7.2.jar
            /opt/application/Hadoop/current/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar
            /opt/application/Hadoop/current/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar
            /opt/application/Hadoop/current/share/hadoop/mapreduce/lib/paranamer-2.3.jar
            /opt/application/Hadoop/current/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar
            /opt/application/Hadoop/current/share/hadoop/mapreduce/lib/guice-3.0.jar
            /opt/application/Hadoop/current/share/hadoop/mapreduce/lib/leveldbjni-all-1.8.jar
            /opt/application/Hadoop/current/share/hadoop/mapreduce/lib/xz-1.0.jar
            /opt/application/Hadoop/current/share/hadoop/mapreduce/lib/jersey-core-1.9.jar
            /opt/application/Hadoop/current/share/hadoop/mapreduce/lib/avro-1.7.4.jar
            /opt/application/Hadoop/current/share/hadoop/mapreduce/lib/log4j-1.2.17.jar
            /opt/application/Hadoop/current/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar
            /opt/application/Hadoop/current/share/hadoop/mapreduce/lib/commons-io-2.4.jar
            /opt/application/Hadoop/current/share/hadoop/mapreduce/lib/hamcrest-core-1.3.jar
            /opt/application/Hadoop/current/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar
            /opt/application/Hadoop/current/share/hadoop/mapreduce/lib/jackson-core-asl-1.9.13.jar
            /opt/application/Hadoop/current/share/hadoop/mapreduce/lib/aopalliance-1.0.jar
            /opt/application/Hadoop/current/share/hadoop/mapreduce/lib/javax.inject-1.jar
          ------------------------
          
          Main class        : org.apache.oozie.action.hadoop.SparkMain
          
          Maximum output    : 2048
          
          Arguments         :
                              100
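
          As far as I understand, the launcher's main class, org.apache.oozie.action.hadoop.SparkMain, calls Spark's SparkSubmit directly in this container's JVM, so the job only sees the environment YARN hands to the container, and there is no SPARK_HOME in it (everything Spark needs is localized instead, as the listing above shows). The "key not found: SPARK_HOME" message is what a Scala sys.env lookup of a missing variable raises. For comparison, the same pi.py submits fine from an edge node where the variable is exported; the install path below is hypothetical:

              export SPARK_HOME=/opt/spark-1.6.0          # hypothetical edge-node install path
              $SPARK_HOME/bin/spark-submit --master yarn-cluster pi.py 100

          Under Oozie no such export exists, which is why the variable has to come from the action configuration instead.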
          
          Java System Properties:
          ------------------------
          #
          #Fri May 20 17:41:46 CEST 2016
          java.runtime.name=OpenJDK Runtime Environment
          sun.boot.library.path=/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.79.x86_64/jre/lib/amd64
          java.vm.version=24.79-b02
          oozie.action.externalChildIDs=/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/externalChildIDs
          hadoop.root.logger=INFO,CLA
          java.vm.vendor=Oracle Corporation
          java.vendor.url=http\://java.oracle.com/
          path.separator=\:
          java.vm.name=OpenJDK 64-Bit Server VM
          file.encoding.pkg=sun.io
          oozie.job.launch.time=1463758882528
          user.country=US
          sun.java.launcher=SUN_STANDARD
          sun.os.patch.level=unknown
          java.vm.specification.name=Java Virtual Machine Specification
          user.dir=/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001
          oozie.action.newId=/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/newId
          java.runtime.version=1.7.0_79-mockbuild_2015_07_24_09_26-b00
          java.awt.graphicsenv=sun.awt.X11GraphicsEnvironment
          java.endorsed.dirs=/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.79.x86_64/jre/lib/endorsed
          os.arch=amd64
          oozie.job.id=0012689-160510172237486-oozie-W
          oozie.action.id=0012689-160510172237486-oozie-W@spark-node
          yarn.app.container.log.dir=/mnt/hd6/hadoop/yarn/log/application_1461692698792_29704/container_1461692698792_29704_01_000001
          java.io.tmpdir=./tmp
          line.separator=\n
          oozie.action.output.properties=/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/output.properties
          java.vm.specification.vendor=Oracle Corporation
          os.name=Linux
          log4j.configuration=container-log4j.properties
          sun.jnu.encoding=UTF-8
          java.library.path=/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001\:/usr/java/packages/lib/amd64\:/usr/lib64\:/lib64\:/lib\:/usr/lib
          oozie.action.conf.xml=/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/action.xml
          java.specification.name=Java Platform API Specification
          java.class.version=51.0
          sun.management.compiler=HotSpot 64-Bit Tiered Compilers
          os.version=2.6.32-573.7.1.el6.x86_64
          oozie.action.error.properties=/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/error.properties
          yarn.app.container.log.filesize=0
          user.home=/home/shfs3453
          user.timezone=Europe/Paris
          java.awt.printerjob=sun.print.PSPrinterJob
          file.encoding=UTF-8
          java.specification.version=1.7
          java.class.path=/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001\:job.jar/job.jar\:job.jar/classes/\:job.jar/lib/*\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jackson-databind-2.3.1.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jets3t-0.7.1.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/paranamer-2.6.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/zookeeper-3.4.6.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/tachyon-client-0.5.0.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/spark-graphx_2.10-1.6.1.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/akka-actor_2.10-2.2.3-shaded-protobuf.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/spark-catalyst_2.10-1.6.1.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jetty-webapp-8.1.14.v20131031.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/slf4j-api-1.6.6.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/javax.mail.glassfish-1.4.1.v201005082020.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/spark-mllib_2.10-1.6.1.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jsr305-1.3.9.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jetty-security-8.1.14.v20131031.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/spark-core_2.10-1.6.1.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/scala-compiler-2.10.0.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/protobuf-java-2.4.1-shaded.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/akka-remote_2.10-2.2.3-shaded-protobuf.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/metrics-jvm-3.0.2.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/metrics-graphite-3.0.0.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/netty-all-4.0.23.Final.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_14616926
98792_29704/container_1461692698792_29704_01_000001/objenesis-1.2.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/curator-recipes-2.5.0.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/chill-java-0.3.6.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/json4s-ast_2.10-3.2.10.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/commons-io-2.1.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jetty-io-8.1.14.v20131031.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jetty-continuation-8.1.14.v20131031.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/stream-2.7.0.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/minlog-1.2.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/netty-3.6.6.Final.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jetty-util-8.1.14.v20131031.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/spark-yarn_2.10-1.6.1.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/scalap-2.10.0.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/json-simple-1.1.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jetty-http-8.1.14.v20131031.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/pyrolite-2.0.1.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/log4j-1.2.16.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jline-0.9.94.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/uncommons-maths-1.2.2a.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/slf4j-log4j12-1.6.6.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/py4j-0.8.2.1.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/json4s-jackson_2.10-3.2.10.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/metrics-core-3.0.2.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jblas-1.2.3.jar\:/mnt
/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/curator-client-2.7.1.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/spark-sql_2.10-1.6.1.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/concurrent-1.3.4.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jetty-jndi-8.1.14.v20131031.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/oozie-sharelib-spark-4.2.0.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/curator-framework-2.5.0.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/javax.activation-1.1.0.v201105071233.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/tachyon-0.6.4.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/reflectasm-1.07-shaded.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/spark-tools_2.10-1.6.1.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/spark-assembly-1.6.1-hadoop2.7.2.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/javax.servlet-3.0.0.v201112011016.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jetty-servlet-8.1.14.v20131031.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/scala-library-2.10.4.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jackson-annotations-2.3.0.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/akka-slf4j_2.10-2.2.3-shaded-protobuf.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/spark-network-yarn_2.10-1.6.1.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/commons-lang3-3.3.2.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/compress-lzf-1.0.0.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/spark-1.6.1-yarn-shuffle.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jackson-core-2.3.1.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/json4s-core_2.10-3.2.10.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_146
1692698792_29704_01_000001/jetty-plus-8.1.14.v20131031.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/config-1.0.2.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/oozie-sharelib-oozie-4.2.0.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/scala-reflect-2.10.0.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jetty-xml-8.1.14.v20131031.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jetty-server-8.1.14.v20131031.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/commons-lang-2.4.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/snappy-java-1.0.5.3.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/kryo-2.21.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/spark-streaming_2.10-1.6.1.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/commons-httpclient-3.1.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/guava-14.0.1.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/commons-codec-1.4.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/chill_2.10-0.3.6.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/lz4-1.2.0.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/javax.transaction-1.1.1.v201105210645.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/commons-net-2.2.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/spark-hive_2.10-1.6.1.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/metrics-json-3.0.2.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/colt-1.2.0.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/oozie-hadoop-utils-hadoop-2-4.2.0.jar\:/opt/application/Hadoop/current/etc/hadoop/\:/opt/application/Hadoop/current/share/hadoop/common/hadoop-common-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/common/hadoop-nfs-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/common/hadoop-common-2.7.2-tests.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/jersey-server-1.9.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/guava-11.0.2.jar\:/opt/appli
cation/Hadoop/current/share/hadoop/common/lib/jsp-api-2.1.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/junit-4.11.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/commons-codec-1.4.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/netty-3.6.2.Final.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/httpcore-4.2.5.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/asm-3.2.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/jetty-util-6.1.26.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/curator-framework-2.7.1.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/commons-digester-1.8.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/jetty-6.1.26.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/jackson-mapper-asl-1.9.13.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/httpclient-4.2.5.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/commons-collections-3.2.2.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/hadoop-annotations-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/protobuf-java-2.5.0.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/hadoop-auth-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/paranamer-2.3.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/api-asn1-api-1.0.0-M20.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/zookeeper-3.4.6.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/htrace-core-3.1.0-incubating.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/xmlenc-0.52.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/commons-cli-1.2.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/xz-1.0.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/jsch-0.1.42.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/apacheds-kerberos-codec-2.0.0-M15.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/api-util-1.0.0-M20.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/jersey-core-1.9.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/stax-api-1.0-2.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/jersey-json-1.9.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/avro-1.7.4.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/log4j-1.2.17.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/commons-logging-1.1.3.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/jackson-xc-1.9.13.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/slf4j-api-1.7.10.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/java-xmlbuilder-0.4.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/activation-1.1.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/jettison-1.1.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/servlet-api-2.5.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/jaxb-api-2.2.2.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/apacheds-i18n-2.0.0-M15.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/commons-httpclient-3.1.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/snappy-java-1.0.4.1.jar\:/opt/a
pplication/Hadoop/current/share/hadoop/common/lib/commons-beanutils-1.7.0.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/commons-io-2.4.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/mockito-all-1.8.5.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/jsr305-3.0.0.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/commons-math3-3.1.1.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/gson-2.2.4.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/hamcrest-core-1.3.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/curator-recipes-2.7.1.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/jets3t-0.9.0.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/jackson-jaxrs-1.9.13.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/commons-net-3.1.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/commons-lang-2.6.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/commons-compress-1.4.1.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/curator-client-2.7.1.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/jackson-core-asl-1.9.13.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/commons-configuration-1.6.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/hadoop-hdfs-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/hadoop-hdfs-2.7.2-tests.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/hadoop-hdfs-nfs-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/jersey-server-1.9.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/guava-11.0.2.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/commons-codec-1.4.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/asm-3.2.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/jetty-6.1.26.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/jackson-mapper-asl-1.9.13.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/leveldbjni-all-1.8.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/netty-all-4.0.23.Final.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/htrace-core-3.1.0-incubating.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/xmlenc-0.52.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/commons-cli-1.2.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/jersey-core-1.9.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/log4j-1.2.17.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/commons-logging-1.1.3.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/servlet-api-2.5.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/commons-io-2.4.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/jsr305-3.0.0.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/xercesImpl-2.9.1.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/xml-apis-1.3.04.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/commons-lang-2.6.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/jackson-core-asl-1.9.13.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.2.jar\:/opt/application/Hadoop/curre
nt/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-client-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/spatial-sdk-hadoop.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-server-common-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-server-applicationhistoryservice-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/spatial-sdk-json.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/esri-geometry-api.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-common-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-server-sharedcachemanager-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-server-tests-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-registry-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/spatial-sdk-hive.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-api-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jersey-server-1.9.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/guava-11.0.2.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jaxb-impl-2.2.3-1.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/commons-codec-1.4.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/netty-3.6.2.Final.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/asm-3.2.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jetty-util-6.1.26.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jetty-6.1.26.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jackson-mapper-asl-1.9.13.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/commons-collections-3.2.2.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/guice-servlet-3.0.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jersey-guice-1.9.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/guice-3.0.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/zookeeper-3.4.6.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/leveldbjni-all-1.8.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/commons-cli-1.2.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/xz-1.0.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jersey-core-1.9.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/stax-api-1.0-2.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jersey-json-1.9.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/log4j-1.2.17.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/commons-logging-1.1.3.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jackson-xc-1.9.13.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/activation-1.1.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jettison-1.1.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/servlet-api-2.5.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jaxb-api-2.2.2.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jersey-client-1.9.jar\:/opt/application/Hadoop/curr
ent/share/hadoop/yarn/lib/commons-io-2.4.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jsr305-3.0.0.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/zookeeper-3.4.6-tests.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jackson-jaxrs-1.9.13.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/commons-lang-2.6.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/commons-compress-1.4.1.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jackson-core-asl-1.9.13.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/aopalliance-1.0.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/javax.inject-1.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.7.2-tests.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/jersey-server-1.9.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/junit-4.11.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/asm-3.2.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/jackson-mapper-asl-1.9.13.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/hadoop-annotations-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/paranamer-2.3.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/guice-3.0.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/leveldbjni-all-1.8.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/xz-1.0.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/jersey-core-1.9.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/avro-1.7.4.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/log4j-1.2.17.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/commons-io-2.4.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/hamcrest-core-1.3.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/jackson-core-asl-1.9.13.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/aopalliance-1.0.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/javax.inject-1.jar
          user.name=shfs3453
          java.vm.specification.version=1.7
          sun.java.command=org.apache.hadoop.mapreduce.v2.app.MRAppMaster
          java.home=/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.79.x86_64/jre
          sun.arch.data.model=64
          user.language=en
          java.specification.vendor=Oracle Corporation
          awt.toolkit=sun.awt.X11.XToolkit
          java.vm.info=mixed mode
          java.version=1.7.0_79
          java.ext.dirs=/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.79.x86_64/jre/lib/ext\:/usr/java/packages/lib/ext
          sun.boot.class.path=/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.79.x86_64/jre/lib/resources.jar\:/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.79.x86_64/jre/lib/rt.jar\:/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.79.x86_64/jre/lib/sunrsasign.jar\:/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.79.x86_64/jre/lib/jsse.jar\:/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.79.x86_64/jre/lib/jce.jar\:/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.79.x86_64/jre/lib/charsets.jar\:/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.79.x86_64/jre/lib/rhino.jar\:/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.79.x86_64/jre/lib/jfr.jar\:/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.79.x86_64/jre/classes
          java.vendor=Oracle Corporation
          file.separator=/
          oozie.launcher.job.id=job_1461692698792_29704
          oozie.action.stats.properties=/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/stats.properties
          java.vendor.url.bug=http\://bugreport.sun.com/bugreport/
          sun.io.unicode.encoding=UnicodeLittle
          sun.cpu.endian=little
          sun.cpu.isalist=
          ------------------------
          
          =================================================================
          
          >>> Invoking Main class now >>>
          
          Fetching child yarn jobs
          tag id : oozie-f217792cb72212277adf42b1fef23939
          Child yarn jobs are found -
          Spark Action Main class        : org.apache.spark.deploy.SparkSubmit
          
          Oozie Spark action configuration
          =================================================================
          
          
                              --master
                              yarn-cluster
                              --deploy-mode
                              cluster
                              --name
                              Pysparkpi example
                              --conf
                              spark.yarn.appMasterEnv.SPARK_HOME=/opt/application/Spark/current
                              --conf
                              spark.executorEnv.SPARK_HOME=/opt/application/Spark/current
                              --conf
                              spark.executorEnv.PYTHONPATH=/opt/application/Spark/current/python
                              --py-files
                              hdfs://sandbox/User/shfs3453/WORK/Oozie/pyspark/lib/py4j-0.9-src.zip,hdfs://sandbox/User/shfs3453/WORK/Oozie/pyspark/lib/pyspark.zip
                              --verbose
                              hdfs://sandbox/User/shfs3453/WORK/Oozie/pyspark/lib/pi.py
                              100
          
          =================================================================
          
          >>> Invoking Spark class now >>>
          
          <<< Invocation of Main class completed <<<
          
          Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SparkMain], main() threw exception, Application application_1461692698792_29705 finished with failed status
          org.apache.spark.SparkException: Application application_1461692698792_29705 finished with failed status
                  at org.apache.spark.deploy.yarn.Client.run(Client.scala:1034)
                  at org.apache.spark.deploy.yarn.Client$.main(Client.scala:1081)
                  at org.apache.spark.deploy.yarn.Client.main(Client.scala)
                  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
                  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
                  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
                  at java.lang.reflect.Method.invoke(Method.java:606)
                  at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
                  at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
                  at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
                  at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
                  at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
                  at org.apache.oozie.action.hadoop.SparkMain.runSpark(SparkMain.java:104)
                  at org.apache.oozie.action.hadoop.SparkMain.run(SparkMain.java:95)
                  at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:47)
                  at org.apache.oozie.action.hadoop.SparkMain.main(SparkMain.java:38)
                  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
                  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
                  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
                  at java.lang.reflect.Method.invoke(Method.java:606)
                  at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:236)
                  at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
                  at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
                  at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
                  at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.runSubtask(LocalContainerLauncher.java:380)
                  at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.runTask(LocalContainerLauncher.java:301)
                  at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.access$200(LocalContainerLauncher.java:187)
                  at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler$1.run(LocalContainerLauncher.java:230)
                  at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
                  at java.util.concurrent.FutureTask.run(FutureTask.java:262)
                  at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
                  at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
                  at java.lang.Thread.run(Thread.java:745)
          
          Oozie Launcher failed, finishing Hadoop job gracefully
          
          Oozie Launcher, uploading action data to HDFS sequence file: hdfs://sandbox/user/shfs3453/oozie/0012689-160510172237486-oozie-W/spark-node--spark/action-data.seq
          
          Oozie Launcher ends
          
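For reference, the spark-submit invocation logged above corresponds to a Spark action of roughly the following shape. This is a minimal sketch reconstructed from the logged arguments, not the reporter's actual workflow.xml; the ${jobTracker} and ${nameNode} parameters and the kill message are assumed, while the master, mode, name, spark-opts, script path and argument are taken verbatim from the log:

    <workflow-app name="PysparkPi-test" xmlns="uri:oozie:workflow:0.5">
        <start to="spark-node"/>
        <action name="spark-node">
            <spark xmlns="uri:oozie:spark-action:0.1">
                <job-tracker>${jobTracker}</job-tracker>
                <name-node>${nameNode}</name-node>
                <master>yarn-cluster</master>
                <mode>cluster</mode>
                <name>Pysparkpi example</name>
                <jar>hdfs://sandbox/User/shfs3453/WORK/Oozie/pyspark/lib/pi.py</jar>
                <spark-opts>--conf spark.yarn.appMasterEnv.SPARK_HOME=/opt/application/Spark/current
                    --conf spark.executorEnv.SPARK_HOME=/opt/application/Spark/current
                    --conf spark.executorEnv.PYTHONPATH=/opt/application/Spark/current/python
                    --py-files hdfs://sandbox/User/shfs3453/WORK/Oozie/pyspark/lib/py4j-0.9-src.zip,hdfs://sandbox/User/shfs3453/WORK/Oozie/pyspark/lib/pyspark.zip
                    --verbose</spark-opts>
                <arg>100</arg>
            </spark>
            <ok to="end"/>
            <error to="fail"/>
        </action>
        <kill name="fail">
            <message>Spark action failed</message>
        </kill>
        <end name="end"/>
    </workflow-app>

Note that spark.yarn.appMasterEnv.SPARK_HOME and spark.executorEnv.SPARK_HOME only inject environment variables into the containers that Spark itself launches. The "key not found: SPARK_HOME" message is Scala's standard error for a missing sys.env entry, raised inside the spark-submit JVM — here the Oozie launcher container — whose own process environment these properties never modify. That would explain why the job still fails even with this configuration in place.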
Alexandre Linte added a comment - Hi Satish Subhashrao Saley, I won't be able to give you the logs for application 1461692698792_19525; they were purged. Here are the logs for a pyspark job that fails with the same error (application 1461692698792_29704 / 1461692698792_29705).

OOZIE LOGS

2016-05-20 17:42:00,627 INFO CallbackServlet:520 - USER[-] GROUP[-] TOKEN[-] APP[-] JOB[0012689-160510172237486-oozie-W] ACTION[0012689-160510172237486-oozie-W@spark-node] callback for action [0012689-160510172237486-oozie-W@spark-node]
2016-05-20 17:42:00,892 INFO SparkActionExecutor:520 - USER[shfs3453] GROUP[-] TOKEN[] APP[PysparkPi-test] JOB[0012689-160510172237486-oozie-W] ACTION[0012689-160510172237486-oozie-W@spark-node] action completed, external ID [job_1461692698792_29704]
2016-05-20 17:42:00,897 WARN SparkActionExecutor:523 - USER[shfs3453] GROUP[-] TOKEN[] APP[PysparkPi-test] JOB[0012689-160510172237486-oozie-W] ACTION[0012689-160510172237486-oozie-W@spark-node] Launcher ERROR, reason: Main class [org.apache.oozie.action.hadoop.SparkMain], main() threw exception, Application application_1461692698792_29705 finished with failed status
2016-05-20 17:42:00,897 WARN SparkActionExecutor:523 - USER[shfs3453] GROUP[-] TOKEN[] APP[PysparkPi-test] JOB[0012689-160510172237486-oozie-W] ACTION[0012689-160510172237486-oozie-W@spark-node] Launcher exception: Application application_1461692698792_29705 finished with failed status
org.apache.spark.SparkException: Application application_1461692698792_29705 finished with failed status
        at org.apache.spark.deploy.yarn.Client.run(Client.scala:1034)
        at org.apache.spark.deploy.yarn.Client$.main(Client.scala:1081)
        at org.apache.spark.deploy.yarn.Client.main(Client.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
        at org.apache.oozie.action.hadoop.SparkMain.runSpark(SparkMain.java:104)
        at org.apache.oozie.action.hadoop.SparkMain.run(SparkMain.java:95)
        at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:47)
        at org.apache.oozie.action.hadoop.SparkMain.main(SparkMain.java:38)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:236)
        at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
        at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.runSubtask(LocalContainerLauncher.java:380)
        at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.runTask(LocalContainerLauncher.java:301)
        at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.access$200(LocalContainerLauncher.java:187)
        at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler$1.run(LocalContainerLauncher.java:230)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
2016-05-20 17:42:01,017 INFO ActionEndXCommand:520 - USER[shfs3453] GROUP[-] TOKEN[] APP[PysparkPi-test] JOB[0012689-160510172237486-oozie-W] ACTION[0012689-160510172237486-oozie-W@spark-node] ERROR is considered as FAILED for SLA
2016-05-20 17:42:01,080 INFO ActionStartXCommand:520 - USER[shfs3453] GROUP[-] TOKEN[] APP[PysparkPi-test] JOB[0012689-160510172237486-oozie-W] ACTION[0012689-160510172237486-oozie-W@fail] Start action [0012689-160510172237486-oozie-W@fail] with user-retry state : userRetryCount [0], userRetryMax [0], userRetryInterval [10]
2016-05-20 17:42:01,081 INFO ActionStartXCommand:520 - USER[shfs3453] GROUP[-] TOKEN[] APP[PysparkPi-test] JOB[0012689-160510172237486-oozie-W] ACTION[0012689-160510172237486-oozie-W@fail] [***0012689-160510172237486-oozie-W@fail***]Action status=DONE

RESOURCE MANAGER LOGS

2016-05-20 17:41:51,123 WARN org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl: The specific max attempts: 0 for application: 29705 is invalid, because it is out of the range [1, 2]. Use the global max attempts instead.
2016-05-20 17:41:52,880 WARN org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl: The specific max attempts: 0 for application: 29706 is invalid, because it is out of the range [1, 2]. Use the global max attempts instead.
2016-05-20 17:41:58,161 WARN org.apache.hadoop.yarn.server.resourcemanager.RMAuditLogger: USER=shfs3453 OPERATION=Application Finished - Failed TARGET=RMAppManager RESULT=FAILURE DESCRIPTION=App failed with state: FAILED PERMISSIONS=Application application_1461692698792_29705 failed 2 times due to AM Container for appattempt_1461692698792_29705_000002 exited with exitCode: 1
For more detailed output, check application tracking page:http://uabigrm02.rouen.francetelecom.fr:8088/cluster/app/application_1461692698792_29705Then, click on links to logs of each attempt.
Diagnostics: Exception from container-launch.
Container id: container_1461692698792_29705_02_000001
Exit code: 1
Stack trace: ExitCodeException exitCode=1:
        at org.apache.hadoop.util.Shell.runCommand(Shell.java:545)
        at org.apache.hadoop.util.Shell.run(Shell.java:456)
        at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:722)
        at org.apache.hadoop.yarn.server.nodemanager.LinuxContainerExecutor.launchContainer(LinuxContainerExecutor.java:297)
        at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
        at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
Shell output: main : command provided 1
main : user is shfs3453
main : requested yarn user is shfs3453
Container exited with a non-zero exit code 1
Failing this attempt. Failing the application. APPID=application_1461692698792_29705
2016-05-20 17:41:59,375 INFO SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for appattempt_1461692698792_29706_000001 (auth:SIMPLE)
2016-05-20 17:41:59,381 INFO SecurityLogger.org.apache.hadoop.security.authorize.ServiceAuthorizationManager: Authorization successful for appattempt_1461692698792_29706_000001 (auth:TOKEN) for protocol=interface org.apache.hadoop.yarn.api.ApplicationMasterProtocolPB
2016-05-20 17:42:00,745 INFO SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for oozie/uabigord01.rouen.francetelecom.fr@SANDBOX.HADOOP (auth:KERBEROS)

DATANODE LOGS (stderr)

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/mnt/hd3/hadoop/yarn/local/filecache/187/slf4j-log4j12-1.6.6.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/mnt/hd9/hadoop/yarn/local/filecache/249/spark-assembly-1.6.1-hadoop2.7.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/application/Hadoop/hadoop-2.7.2/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
log4j:ERROR setFile(null,true) call failed.
java.io.FileNotFoundException: /mnt/hd6/hadoop/yarn/log/application_1461692698792_29704/container_1461692698792_29704_01_000001 (Is a directory)
        at java.io.FileOutputStream.open(Native Method)
        at java.io.FileOutputStream.<init>(FileOutputStream.java:221)
        at java.io.FileOutputStream.<init>(FileOutputStream.java:142)
        at org.apache.log4j.FileAppender.setFile(FileAppender.java:294)
        at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
        at org.apache.hadoop.yarn.ContainerLogAppender.activateOptions(ContainerLogAppender.java:55)
        at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
        at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
        at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
        at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:809)
        at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:735)
        at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:615)
        at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:502)
        at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:547)
        at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:483)
        at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
        at org.slf4j.impl.Log4jLoggerFactory.getLogger(Log4jLoggerFactory.java:64)
        at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:270)
        at org.apache.commons.logging.impl.SLF4JLogFactory.getInstance(SLF4JLogFactory.java:155)
        at org.apache.commons.logging.impl.SLF4JLogFactory.getInstance(SLF4JLogFactory.java:132)
        at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:275)
        at org.apache.hadoop.service.AbstractService.<clinit>(AbstractService.java:43)
May 20, 2016 5:41:44 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.mapreduce.v2.app.webapp.JAXBContextResolver as a provider class
May 20, 2016 5:41:44 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.webapp.GenericExceptionHandler as a provider class
May 20, 2016 5:41:44 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.mapreduce.v2.app.webapp.AMWebServices as a root resource class
May 20, 2016 5:41:44 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
May 20, 2016 5:41:44 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.mapreduce.v2.app.webapp.JAXBContextResolver to GuiceManagedComponentProvider with the scope "Singleton"
May 20, 2016 5:41:44 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.webapp.GenericExceptionHandler to GuiceManagedComponentProvider with the scope "Singleton"
May 20, 2016 5:41:45 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.mapreduce.v2.app.webapp.AMWebServices to GuiceManagedComponentProvider with the scope "PerRequest"
Using properties file: null
Parsed arguments:
  master                  yarn-cluster
  deployMode              cluster
  executorMemory          null
  executorCores           null
  totalExecutorCores      null
  propertiesFile          null
  driverMemory            null
  driverCores             null
  driverExtraClassPath    null
  driverExtraLibraryPath  null
  driverExtraJavaOptions  null
  supervise               false
  queue                   null
  numExecutors            null
  files                   null
  pyFiles                 hdfs://sandbox/User/shfs3453/WORK/Oozie/pyspark/lib/py4j-0.9-src.zip,hdfs://sandbox/User/shfs3453/WORK/Oozie/pyspark/lib/pyspark.zip
  archives                null
  mainClass               null
  primaryResource         hdfs://sandbox/User/shfs3453/WORK/Oozie/pyspark/lib/pi.py
  name                    Pysparkpi example
  childArgs               [100]
  jars                    null
  packages                null
  packagesExclusions      null
  repositories            null
  verbose                 true
Spark properties used, including those specified through --conf and those from the properties file null:
  spark.executorEnv.SPARK_HOME -> /opt/application/Spark/current
  spark.executorEnv.PYTHONPATH -> /opt/application/Spark/current/python
  spark.yarn.appMasterEnv.SPARK_HOME -> /opt/application/Spark/current
Main class:
org.apache.spark.deploy.yarn.Client
Arguments:
--name
Pysparkpi example
--primary-py-file
hdfs://sandbox/User/shfs3453/WORK/Oozie/pyspark/lib/pi.py
--py-files
hdfs://sandbox/User/shfs3453/WORK/Oozie/pyspark/lib/py4j-0.9-src.zip,hdfs://sandbox/User/shfs3453/WORK/Oozie/pyspark/lib/pyspark.zip
--class
org.apache.spark.deploy.PythonRunner
--arg
100
System properties:
spark.executorEnv.SPARK_HOME -> /opt/application/Spark/current
spark.executorEnv.PYTHONPATH -> /opt/application/Spark/current/python
SPARK_SUBMIT -> true
spark.app.name -> Pysparkpi example
spark.submit.deployMode -> cluster
spark.yarn.appMasterEnv.SPARK_HOME -> /opt/application/Spark/current
spark.yarn.isPython -> true
spark.master -> yarn-cluster
Classpath elements:

Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SparkMain], main() threw exception, Application application_1461692698792_29705 finished with failed status
org.apache.spark.SparkException: Application application_1461692698792_29705 finished with failed status
        at org.apache.spark.deploy.yarn.Client.run(Client.scala:1034)
        at org.apache.spark.deploy.yarn.Client$.main(Client.scala:1081)
        at org.apache.spark.deploy.yarn.Client.main(Client.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
        at org.apache.oozie.action.hadoop.SparkMain.runSpark(SparkMain.java:104)
        at org.apache.oozie.action.hadoop.SparkMain.run(SparkMain.java:95)
        at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:47)
        at org.apache.oozie.action.hadoop.SparkMain.main(SparkMain.java:38)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:236)
        at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
        at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.runSubtask(LocalContainerLauncher.java:380)
        at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.runTask(LocalContainerLauncher.java:301)
        at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.access$200(LocalContainerLauncher.java:187)
        at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler$1.run(LocalContainerLauncher.java:230)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
log4j:WARN No appenders could be found for logger (org.apache.hadoop.mapreduce.v2.app.MRAppMaster).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.

DATANODE LOGS (stdout)

Oozie Launcher starts
Heart beat
{"properties":[{"key":"oozie.launcher.job.id","value":"job_1461692698792_29704","isFinal":false,"resource":"programatically"},{"key":"oozie.job.id","value":"0012689-160510172237486-oozie-W","isFinal":false,"resource":"programatically"},{"key":"oozie.action.id","value":"0012689-160510172237486-oozie-W@spark-node","isFinal":false,"resource":"programatically"},{"key":"mapreduce.job.tags","value":"oozie-f217792cb72212277adf42b1fef23939","isFinal":false,"resource":"programatically"}]}
Starting the execution of prepare actions
Completed the execution of prepare actions successfully
Files in current dir:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/.
====================== File: jackson-databind-2.3.1.jar File: jets3t-0.7.1.jar File: libhadoop.a File: propagation-conf.xml File: paranamer-2.6.jar File: zookeeper-3.4.6.jar File: tachyon-client-0.5.0.jar File: spark-graphx_2.10-1.6.1.jar File: akka-actor_2.10-2.2.3-shaded-protobuf.jar File: spark-catalyst_2.10-1.6.1.jar File: jetty-webapp-8.1.14.v20131031.jar File: slf4j-api-1.6.6.jar File: javax.mail.glassfish-1.4.1.v201005082020.jar File: spark-mllib_2.10-1.6.1.jar File: jsr305-1.3.9.jar File: libhadoop.so.1.0.0 File: jetty-security-8.1.14.v20131031.jar File: spark-core_2.10-1.6.1.jar File: scala-compiler-2.10.0.jar File: protobuf-java-2.4.1-shaded.jar File: akka-remote_2.10-2.2.3-shaded-protobuf.jar File: metrics-jvm-3.0.2.jar File: metrics-graphite-3.0.0.jar File: netty-all-4.0.23.Final.jar File: container_tokens File: objenesis-1.2.jar File: curator-recipes-2.5.0.jar File: chill-java-0.3.6.jar File: json4s-ast_2.10-3.2.10.jar File: commons-io-2.1.jar File: jetty-io-8.1.14.v20131031.jar File: jetty-continuation-8.1.14.v20131031.jar File: action.xml File: launch_container.sh File: stream-2.7.0.jar File: py4j-0.9-src.zip File: .action.xml.crc File: minlog-1.2.jar File: libhdfs.a File: netty-3.6.6.Final.jar File: jetty-util-8.1.14.v20131031.jar File: spark-yarn_2.10-1.6.1.jar File: scalap-2.10.0.jar File: job.xml File: json-simple-1.1.jar File: jetty-http-8.1.14.v20131031.jar File: pyrolite-2.0.1.jar File: log4j-1.2.16.jar File: jline-0.9.94.jar File: uncommons-maths-1.2.2a.jar File: slf4j-log4j12-1.6.6.jar File: py4j-0.8.2.1.jar File: json4s-jackson_2.10-3.2.10.jar Dir: jobSubmitDir File: job.splitmetainfo File: job.split File: metrics-core-3.0.2.jar File: jblas-1.2.3.jar File: libhdfs.so.0.0.0 File: curator-client-2.7.1.jar File: spark-sql_2.10-1.6.1.jar File: concurrent-1.3.4.jar File: jetty-jndi-8.1.14.v20131031.jar File: oozie-sharelib-spark-4.2.0.jar File: curator-framework-2.5.0.jar File: pyspark.zip File: javax.activation-1.1.0.v201105071233.jar File: tachyon-0.6.4.jar File: reflectasm-1.07-shaded.jar File: spark-tools_2.10-1.6.1.jar File: GC.log File: libhadooppipes.a File: spark-assembly-1.6.1-hadoop2.7.2.jar File: javax.servlet-3.0.0.v201112011016.jar File: jetty-servlet-8.1.14.v20131031.jar File: scala-library-2.10.4.jar File: jackson-annotations-2.3.0.jar File: libhadoop.so File: akka-slf4j_2.10-2.2.3-shaded-protobuf.jar File: spark-network-yarn_2.10-1.6.1.jar File: commons-lang3-3.3.2.jar File: compress-lzf-1.0.0.jar File: spark-1.6.1-yarn-shuffle.jar File: jackson-core-2.3.1.jar File: json4s-core_2.10-3.2.10.jar File: jetty-plus-8.1.14.v20131031.jar File: config-1.0.2.jar File: oozie-sharelib-oozie-4.2.0.jar File: scala-reflect-2.10.0.jar File: jetty-xml-8.1.14.v20131031.jar File: jetty-server-8.1.14.v20131031.jar File: commons-lang-2.4.jar File: snappy-java-1.0.5.3.jar File: libhdfs.so File: kryo-2.21.jar File: spark-streaming_2.10-1.6.1.jar File: commons-httpclient-3.1.jar File: guava-14.0.1.jar File: commons-codec-1.4.jar File: libhadooputils.a Dir: tmp Dir: Jetty_0_0_0_0_54973_mapreduce____.j5f3a5 File: chill_2.10-0.3.6.jar File: lz4-1.2.0.jar File: javax.transaction-1.1.1.v201105210645.jar File: commons-net-2.2.jar File: spark-hive_2.10-1.6.1.jar File: metrics-json-3.0.2.jar File: colt-1.2.0.jar File: oozie-hadoop-utils-hadoop-2-4.2.0.jar Oozie Java/Map-Reduce/Pig action launcher-job configuration ================================================================= Workflow job id : 0012689-160510172237486-oozie-W Workflow action id: 
0012689-160510172237486-oozie-W@spark-node Classpath : ------------------------ /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001 job.jar/job.jar job.jar/classes/ job.jar/lib/* /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jackson-databind-2.3.1.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jets3t-0.7.1.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/paranamer-2.6.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/zookeeper-3.4.6.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/tachyon-client-0.5.0.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/spark-graphx_2.10-1.6.1.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/akka-actor_2.10-2.2.3-shaded-protobuf.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/spark-catalyst_2.10-1.6.1.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jetty-webapp-8.1.14.v20131031.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/slf4j-api-1.6.6.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/javax.mail.glassfish-1.4.1.v201005082020.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/spark-mllib_2.10-1.6.1.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jsr305-1.3.9.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jetty-security-8.1.14.v20131031.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/spark-core_2.10-1.6.1.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/scala-compiler-2.10.0.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/protobuf-java-2.4.1-shaded.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/akka-remote_2.10-2.2.3-shaded-protobuf.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/spark-catalyst_2.10-1.6.1.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jetty-webapp-8.1.14.v20131031.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/slf4j-api-1.6.6.jar 
/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/javax.mail.glassfish-1.4.1.v201005082020.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/spark-mllib_2.10-1.6.1.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jsr305-1.3.9.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jetty-security-8.1.14.v20131031.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/spark-core_2.10-1.6.1.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/scala-compiler-2.10.0.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/protobuf-java-2.4.1-shaded.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/akka-remote_2.10-2.2.3-shaded-protobuf.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/metrics-jvm-3.0.2.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/metrics-graphite-3.0.0.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/netty-all-4.0.23.Final.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/objenesis-1.2.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/curator-recipes-2.5.0.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/chill-java-0.3.6.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/json4s-ast_2.10-3.2.10.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/commons-io-2.1.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jetty-io-8.1.14.v20131031.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jetty-continuation-8.1.14.v20131031.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/stream-2.7.0.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/minlog-1.2.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/netty-3.6.6.Final.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jetty-util-8.1.14.v20131031.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/spark-yarn_2.10-1.6.1.jar 
/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/scalap-2.10.0.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/json-simple-1.1.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jetty-http-8.1.14.v20131031.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/pyrolite-2.0.1.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/log4j-1.2.16.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jline-0.9.94.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/uncommons-maths-1.2.2a.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/slf4j-log4j12-1.6.6.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/py4j-0.8.2.1.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/json4s-jackson_2.10-3.2.10.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/metrics-core-3.0.2.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jblas-1.2.3.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/curator-client-2.7.1.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/spark-sql_2.10-1.6.1.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/concurrent-1.3.4.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jetty-jndi-8.1.14.v20131031.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/oozie-sharelib-spark-4.2.0.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/curator-framework-2.5.0.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/javax.activation-1.1.0.v201105071233.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/tachyon-0.6.4.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/reflectasm-1.07-shaded.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/spark-tools_2.10-1.6.1.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/spark-assembly-1.6.1-hadoop2.7.2.jar 
/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/javax.servlet-3.0.0.v201112011016.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jetty-servlet-8.1.14.v20131031.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/scala-library-2.10.4.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jackson-annotations-2.3.0.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/akka-slf4j_2.10-2.2.3-shaded-protobuf.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/spark-network-yarn_2.10-1.6.1.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/commons-lang3-3.3.2.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/compress-lzf-1.0.0.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/spark-1.6.1-yarn-shuffle.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jackson-core-2.3.1.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/json4s-core_2.10-3.2.10.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jetty-plus-8.1.14.v20131031.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/config-1.0.2.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/oozie-sharelib-oozie-4.2.0.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/scala-reflect-2.10.0.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jetty-xml-8.1.14.v20131031.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jetty-server-8.1.14.v20131031.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/commons-lang-2.4.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/snappy-java-1.0.5.3.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/kryo-2.21.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/spark-streaming_2.10-1.6.1.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/commons-httpclient-3.1.jar 
/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/guava-14.0.1.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/commons-codec-1.4.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/chill_2.10-0.3.6.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/lz4-1.2.0.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/javax.transaction-1.1.1.v201105210645.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/commons-net-2.2.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/spark-hive_2.10-1.6.1.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/metrics-json-3.0.2.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/colt-1.2.0.jar /mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/oozie-hadoop-utils-hadoop-2-4.2.0.jar /opt/application/Hadoop/current/etc/hadoop/ /opt/application/Hadoop/current/share/hadoop/common/hadoop-common-2.7.2.jar /opt/application/Hadoop/current/share/hadoop/common/hadoop-nfs-2.7.2.jar /opt/application/Hadoop/current/share/hadoop/common/hadoop-common-2.7.2-tests.jar /opt/application/Hadoop/current/share/hadoop/common/lib/jersey-server-1.9.jar /opt/application/Hadoop/current/share/hadoop/common/lib/guava-11.0.2.jar /opt/application/Hadoop/current/share/hadoop/common/lib/jsp-api-2.1.jar /opt/application/Hadoop/current/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar /opt/application/Hadoop/current/share/hadoop/common/lib/junit-4.11.jar /opt/application/Hadoop/current/share/hadoop/common/lib/commons-codec-1.4.jar /opt/application/Hadoop/current/share/hadoop/common/lib/netty-3.6.2.Final.jar /opt/application/Hadoop/current/share/hadoop/common/lib/httpcore-4.2.5.jar /opt/application/Hadoop/current/share/hadoop/common/lib/asm-3.2.jar /opt/application/Hadoop/current/share/hadoop/common/lib/jetty-util-6.1.26.jar /opt/application/Hadoop/current/share/hadoop/common/lib/curator-framework-2.7.1.jar /opt/application/Hadoop/current/share/hadoop/common/lib/commons-digester-1.8.jar /opt/application/Hadoop/current/share/hadoop/common/lib/jetty-6.1.26.jar /opt/application/Hadoop/current/share/hadoop/common/lib/jackson-mapper-asl-1.9.13.jar /opt/application/Hadoop/current/share/hadoop/common/lib/httpclient-4.2.5.jar /opt/application/Hadoop/current/share/hadoop/common/lib/commons-collections-3.2.2.jar /opt/application/Hadoop/current/share/hadoop/common/lib/hadoop-annotations-2.7.2.jar /opt/application/Hadoop/current/share/hadoop/common/lib/protobuf-java-2.5.0.jar /opt/application/Hadoop/current/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar /opt/application/Hadoop/current/share/hadoop/common/lib/hadoop-auth-2.7.2.jar /opt/application/Hadoop/current/share/hadoop/common/lib/paranamer-2.3.jar /opt/application/Hadoop/current/share/hadoop/common/lib/api-asn1-api-1.0.0-M20.jar 
/opt/application/Hadoop/current/share/hadoop/common/lib/zookeeper-3.4.6.jar /opt/application/Hadoop/current/share/hadoop/common/lib/htrace-core-3.1.0-incubating.jar /opt/application/Hadoop/current/share/hadoop/common/lib/xmlenc-0.52.jar /opt/application/Hadoop/current/share/hadoop/common/lib/commons-cli-1.2.jar /opt/application/Hadoop/current/share/hadoop/common/lib/xz-1.0.jar /opt/application/Hadoop/current/share/hadoop/common/lib/jsch-0.1.42.jar /opt/application/Hadoop/current/share/hadoop/common/lib/apacheds-kerberos-codec-2.0.0-M15.jar /opt/application/Hadoop/current/share/hadoop/common/lib/api-util-1.0.0-M20.jar /opt/application/Hadoop/current/share/hadoop/common/lib/jersey-core-1.9.jar /opt/application/Hadoop/current/share/hadoop/common/lib/stax-api-1.0-2.jar /opt/application/Hadoop/current/share/hadoop/common/lib/jersey-json-1.9.jar /opt/application/Hadoop/current/share/hadoop/common/lib/avro-1.7.4.jar /opt/application/Hadoop/current/share/hadoop/common/lib/log4j-1.2.17.jar /opt/application/Hadoop/current/share/hadoop/common/lib/commons-logging-1.1.3.jar /opt/application/Hadoop/current/share/hadoop/common/lib/jackson-xc-1.9.13.jar /opt/application/Hadoop/current/share/hadoop/common/lib/slf4j-api-1.7.10.jar /opt/application/Hadoop/current/share/hadoop/common/lib/java-xmlbuilder-0.4.jar /opt/application/Hadoop/current/share/hadoop/common/lib/activation-1.1.jar /opt/application/Hadoop/current/share/hadoop/common/lib/jettison-1.1.jar /opt/application/Hadoop/current/share/hadoop/common/lib/servlet-api-2.5.jar /opt/application/Hadoop/current/share/hadoop/common/lib/jaxb-api-2.2.2.jar /opt/application/Hadoop/current/share/hadoop/common/lib/apacheds-i18n-2.0.0-M15.jar /opt/application/Hadoop/current/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar /opt/application/Hadoop/current/share/hadoop/common/lib/commons-httpclient-3.1.jar /opt/application/Hadoop/current/share/hadoop/common/lib/snappy-java-1.0.4.1.jar /opt/application/Hadoop/current/share/hadoop/common/lib/commons-beanutils-1.7.0.jar /opt/application/Hadoop/current/share/hadoop/common/lib/commons-io-2.4.jar /opt/application/Hadoop/current/share/hadoop/common/lib/mockito-all-1.8.5.jar /opt/application/Hadoop/current/share/hadoop/common/lib/jsr305-3.0.0.jar /opt/application/Hadoop/current/share/hadoop/common/lib/commons-math3-3.1.1.jar /opt/application/Hadoop/current/share/hadoop/common/lib/gson-2.2.4.jar /opt/application/Hadoop/current/share/hadoop/common/lib/hamcrest-core-1.3.jar /opt/application/Hadoop/current/share/hadoop/common/lib/curator-recipes-2.7.1.jar /opt/application/Hadoop/current/share/hadoop/common/lib/jets3t-0.9.0.jar /opt/application/Hadoop/current/share/hadoop/common/lib/jackson-jaxrs-1.9.13.jar /opt/application/Hadoop/current/share/hadoop/common/lib/commons-net-3.1.jar /opt/application/Hadoop/current/share/hadoop/common/lib/commons-lang-2.6.jar /opt/application/Hadoop/current/share/hadoop/common/lib/commons-compress-1.4.1.jar /opt/application/Hadoop/current/share/hadoop/common/lib/curator-client-2.7.1.jar /opt/application/Hadoop/current/share/hadoop/common/lib/jackson-core-asl-1.9.13.jar /opt/application/Hadoop/current/share/hadoop/common/lib/commons-configuration-1.6.jar /opt/application/Hadoop/current/share/hadoop/hdfs/hadoop-hdfs-2.7.2.jar /opt/application/Hadoop/current/share/hadoop/hdfs/hadoop-hdfs-2.7.2-tests.jar /opt/application/Hadoop/current/share/hadoop/hdfs/hadoop-hdfs-nfs-2.7.2.jar /opt/application/Hadoop/current/share/hadoop/hdfs/lib/jersey-server-1.9.jar 
/opt/application/Hadoop/current/share/hadoop/hdfs/lib/guava-11.0.2.jar /opt/application/Hadoop/current/share/hadoop/hdfs/lib/commons-codec-1.4.jar /opt/application/Hadoop/current/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar /opt/application/Hadoop/current/share/hadoop/hdfs/lib/asm-3.2.jar /opt/application/Hadoop/current/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar /opt/application/Hadoop/current/share/hadoop/hdfs/lib/jetty-6.1.26.jar /opt/application/Hadoop/current/share/hadoop/hdfs/lib/jackson-mapper-asl-1.9.13.jar /opt/application/Hadoop/current/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar /opt/application/Hadoop/current/share/hadoop/hdfs/lib/leveldbjni-all-1.8.jar /opt/application/Hadoop/current/share/hadoop/hdfs/lib/netty-all-4.0.23.Final.jar /opt/application/Hadoop/current/share/hadoop/hdfs/lib/htrace-core-3.1.0-incubating.jar /opt/application/Hadoop/current/share/hadoop/hdfs/lib/xmlenc-0.52.jar /opt/application/Hadoop/current/share/hadoop/hdfs/lib/commons-cli-1.2.jar /opt/application/Hadoop/current/share/hadoop/hdfs/lib/jersey-core-1.9.jar /opt/application/Hadoop/current/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar /opt/application/Hadoop/current/share/hadoop/hdfs/lib/log4j-1.2.17.jar /opt/application/Hadoop/current/share/hadoop/hdfs/lib/commons-logging-1.1.3.jar /opt/application/Hadoop/current/share/hadoop/hdfs/lib/servlet-api-2.5.jar /opt/application/Hadoop/current/share/hadoop/hdfs/lib/commons-io-2.4.jar /opt/application/Hadoop/current/share/hadoop/hdfs/lib/jsr305-3.0.0.jar /opt/application/Hadoop/current/share/hadoop/hdfs/lib/xercesImpl-2.9.1.jar /opt/application/Hadoop/current/share/hadoop/hdfs/lib/xml-apis-1.3.04.jar /opt/application/Hadoop/current/share/hadoop/hdfs/lib/commons-lang-2.6.jar /opt/application/Hadoop/current/share/hadoop/hdfs/lib/jackson-core-asl-1.9.13.jar /opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.2.jar /opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.7.2.jar /opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-client-2.7.2.jar /opt/application/Hadoop/current/share/hadoop/yarn/spatial-sdk-hadoop.jar /opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.7.2.jar /opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.7.2.jar /opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-server-common-2.7.2.jar /opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-server-applicationhistoryservice-2.7.2.jar /opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.7.2.jar /opt/application/Hadoop/current/share/hadoop/yarn/spatial-sdk-json.jar /opt/application/Hadoop/current/share/hadoop/yarn/esri-geometry-api.jar /opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-common-2.7.2.jar /opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-server-sharedcachemanager-2.7.2.jar /opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-server-tests-2.7.2.jar /opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-registry-2.7.2.jar /opt/application/Hadoop/current/share/hadoop/yarn/spatial-sdk-hive.jar /opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-api-2.7.2.jar /opt/application/Hadoop/current/share/hadoop/yarn/lib/jersey-server-1.9.jar /opt/application/Hadoop/current/share/hadoop/yarn/lib/guava-11.0.2.jar /opt/application/Hadoop/current/share/hadoop/yarn/lib/jaxb-impl-2.2.3-1.jar 
/opt/application/Hadoop/current/share/hadoop/yarn/lib/commons-codec-1.4.jar /opt/application/Hadoop/current/share/hadoop/yarn/lib/netty-3.6.2.Final.jar /opt/application/Hadoop/current/share/hadoop/yarn/lib/asm-3.2.jar /opt/application/Hadoop/current/share/hadoop/yarn/lib/jetty-util-6.1.26.jar /opt/application/Hadoop/current/share/hadoop/yarn/lib/jetty-6.1.26.jar /opt/application/Hadoop/current/share/hadoop/yarn/lib/jackson-mapper-asl-1.9.13.jar /opt/application/Hadoop/current/share/hadoop/yarn/lib/commons-collections-3.2.2.jar /opt/application/Hadoop/current/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar /opt/application/Hadoop/current/share/hadoop/yarn/lib/guice-servlet-3.0.jar /opt/application/Hadoop/current/share/hadoop/yarn/lib/jersey-guice-1.9.jar /opt/application/Hadoop/current/share/hadoop/yarn/lib/guice-3.0.jar /opt/application/Hadoop/current/share/hadoop/yarn/lib/zookeeper-3.4.6.jar /opt/application/Hadoop/current/share/hadoop/yarn/lib/leveldbjni-all-1.8.jar /opt/application/Hadoop/current/share/hadoop/yarn/lib/commons-cli-1.2.jar /opt/application/Hadoop/current/share/hadoop/yarn/lib/xz-1.0.jar /opt/application/Hadoop/current/share/hadoop/yarn/lib/jersey-core-1.9.jar /opt/application/Hadoop/current/share/hadoop/yarn/lib/stax-api-1.0-2.jar /opt/application/Hadoop/current/share/hadoop/yarn/lib/jersey-json-1.9.jar /opt/application/Hadoop/current/share/hadoop/yarn/lib/log4j-1.2.17.jar /opt/application/Hadoop/current/share/hadoop/yarn/lib/commons-logging-1.1.3.jar /opt/application/Hadoop/current/share/hadoop/yarn/lib/jackson-xc-1.9.13.jar /opt/application/Hadoop/current/share/hadoop/yarn/lib/activation-1.1.jar /opt/application/Hadoop/current/share/hadoop/yarn/lib/jettison-1.1.jar /opt/application/Hadoop/current/share/hadoop/yarn/lib/servlet-api-2.5.jar /opt/application/Hadoop/current/share/hadoop/yarn/lib/jaxb-api-2.2.2.jar /opt/application/Hadoop/current/share/hadoop/yarn/lib/jersey-client-1.9.jar /opt/application/Hadoop/current/share/hadoop/yarn/lib/commons-io-2.4.jar /opt/application/Hadoop/current/share/hadoop/yarn/lib/jsr305-3.0.0.jar /opt/application/Hadoop/current/share/hadoop/yarn/lib/zookeeper-3.4.6-tests.jar /opt/application/Hadoop/current/share/hadoop/yarn/lib/jackson-jaxrs-1.9.13.jar /opt/application/Hadoop/current/share/hadoop/yarn/lib/commons-lang-2.6.jar /opt/application/Hadoop/current/share/hadoop/yarn/lib/commons-compress-1.4.1.jar /opt/application/Hadoop/current/share/hadoop/yarn/lib/jackson-core-asl-1.9.13.jar /opt/application/Hadoop/current/share/hadoop/yarn/lib/aopalliance-1.0.jar /opt/application/Hadoop/current/share/hadoop/yarn/lib/javax.inject-1.jar /opt/application/Hadoop/current/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.7.2.jar /opt/application/Hadoop/current/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.7.2.jar /opt/application/Hadoop/current/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.7.2.jar /opt/application/Hadoop/current/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.7.2.jar /opt/application/Hadoop/current/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.7.2.jar /opt/application/Hadoop/current/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.7.2-tests.jar /opt/application/Hadoop/current/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.7.2.jar /opt/application/Hadoop/current/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.2.jar /opt/application/Hadoop/current/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.2.jar 
/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/jersey-server-1.9.jar /opt/application/Hadoop/current/share/hadoop/mapreduce/lib/junit-4.11.jar /opt/application/Hadoop/current/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar /opt/application/Hadoop/current/share/hadoop/mapreduce/lib/asm-3.2.jar /opt/application/Hadoop/current/share/hadoop/mapreduce/lib/jackson-mapper-asl-1.9.13.jar /opt/application/Hadoop/current/share/hadoop/mapreduce/lib/hadoop-annotations-2.7.2.jar /opt/application/Hadoop/current/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar /opt/application/Hadoop/current/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar /opt/application/Hadoop/current/share/hadoop/mapreduce/lib/paranamer-2.3.jar /opt/application/Hadoop/current/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar /opt/application/Hadoop/current/share/hadoop/mapreduce/lib/guice-3.0.jar /opt/application/Hadoop/current/share/hadoop/mapreduce/lib/leveldbjni-all-1.8.jar /opt/application/Hadoop/current/share/hadoop/mapreduce/lib/xz-1.0.jar /opt/application/Hadoop/current/share/hadoop/mapreduce/lib/jersey-core-1.9.jar /opt/application/Hadoop/current/share/hadoop/mapreduce/lib/avro-1.7.4.jar /opt/application/Hadoop/current/share/hadoop/mapreduce/lib/log4j-1.2.17.jar /opt/application/Hadoop/current/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar /opt/application/Hadoop/current/share/hadoop/mapreduce/lib/commons-io-2.4.jar /opt/application/Hadoop/current/share/hadoop/mapreduce/lib/hamcrest-core-1.3.jar /opt/application/Hadoop/current/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar /opt/application/Hadoop/current/share/hadoop/mapreduce/lib/jackson-core-asl-1.9.13.jar /opt/application/Hadoop/current/share/hadoop/mapreduce/lib/aopalliance-1.0.jar /opt/application/Hadoop/current/share/hadoop/mapreduce/lib/javax.inject-1.jar
------------------------
Main class : org.apache.oozie.action.hadoop.SparkMain
Maximum output : 2048
Arguments : 100

Java System Properties:
------------------------
#
#Fri May 20 17:41:46 CEST 2016
java.runtime.name=OpenJDK Runtime Environment
sun.boot.library.path=/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.79.x86_64/jre/lib/amd64
java.vm.version=24.79-b02
oozie.action.externalChildIDs=/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/externalChildIDs
hadoop.root.logger=INFO,CLA
java.vm.vendor=Oracle Corporation
java.vendor.url=http\://java.oracle.com/
path.separator=\:
java.vm.name=OpenJDK 64-Bit Server VM
file.encoding.pkg=sun.io
oozie.job.launch.time=1463758882528
user.country=US
sun.java.launcher=SUN_STANDARD
sun.os.patch.level=unknown
java.vm.specification.name=Java Virtual Machine Specification
user.dir=/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001
oozie.action.newId=/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/newId
java.runtime.version=1.7.0_79-mockbuild_2015_07_24_09_26-b00
java.awt.graphicsenv=sun.awt.X11GraphicsEnvironment
java.endorsed.dirs=/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.79.x86_64/jre/lib/endorsed
os.arch=amd64
oozie.job.id=0012689-160510172237486-oozie-W
oozie.action.id=0012689-160510172237486-oozie-W@spark-node
yarn.app.container.log.dir=/mnt/hd6/hadoop/yarn/log/application_1461692698792_29704/container_1461692698792_29704_01_000001
java.io.tmpdir=./tmp
line.separator=\n
oozie.action.output.properties=/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/output.properties
java.vm.specification.vendor=Oracle Corporation
os.name=Linux
log4j.configuration=container-log4j.properties
sun.jnu.encoding=UTF-8
java.library.path=/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001\:/usr/java/packages/lib/amd64\:/usr/lib64\:/lib64\:/lib\:/usr/lib
oozie.action.conf.xml=/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/action.xml
java.specification.name=Java Platform API Specification
java.class.version=51.0
sun.management.compiler=HotSpot 64-Bit Tiered Compilers
os.version=2.6.32-573.7.1.el6.x86_64
oozie.action.error.properties=/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/error.properties
yarn.app.container.log.filesize=0
user.home=/home/shfs3453
user.timezone=Europe/Paris
java.awt.printerjob=sun.print.PSPrinterJob
file.encoding=UTF-8
java.specification.version=1.7
java.class.path=/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001\:job.jar/job.jar\:job.jar/classes/\:job.jar/lib/*\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jackson-databind-2.3.1.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jets3t-0.7.1.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/paranamer-2.6.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/zookeeper-3.4.6.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/tachyon-client-0.5.0.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/spark-graphx_2.10-1.6.1.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/akka-actor_2.10-2.2.3-shaded-protobuf.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/spark-catalyst_2.10-1.6.1.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jetty-webapp-8.1.14.v20131031.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/slf4j-api-1.6.6.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/javax.mail.glassfish-1.4.1.v201005082020.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/spark-mllib_2.10-1.6.1.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jsr305-1.3.9.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jetty-security
-8.1.14.v20131031.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/spark-core_2.10-1.6.1.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/scala-compiler-2.10.0.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/protobuf-java-2.4.1-shaded.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/akka-remote_2.10-2.2.3-shaded-protobuf.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/metrics-jvm-3.0.2.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/metrics-graphite-3.0.0.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/netty-all-4.0.23.Final.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/objenesis-1.2.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/curator-recipes-2.5.0.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/chill-java-0.3.6.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/json4s-ast_2.10-3.2.10.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/commons-io-2.1.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jetty-io-8.1.14.v20131031.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jetty-continuation-8.1.14.v20131031.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/stream-2.7.0.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/minlog-1.2.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/netty-3.6.6.Final.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jetty-util-8.1.14.v20131031.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/spark-yarn_2.10-1.6.1.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/scalap-2.10.0.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/json-simple-1.1.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jetty-http-8.1.14.v20131031.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/pyrolite-2.0.1.jar\:/mnt
/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/log4j-1.2.16.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jline-0.9.94.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/uncommons-maths-1.2.2a.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/slf4j-log4j12-1.6.6.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/py4j-0.8.2.1.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/json4s-jackson_2.10-3.2.10.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/metrics-core-3.0.2.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jblas-1.2.3.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/curator-client-2.7.1.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/spark-sql_2.10-1.6.1.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/concurrent-1.3.4.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jetty-jndi-8.1.14.v20131031.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/oozie-sharelib-spark-4.2.0.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/curator-framework-2.5.0.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/javax.activation-1.1.0.v201105071233.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/tachyon-0.6.4.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/reflectasm-1.07-shaded.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/spark-tools_2.10-1.6.1.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/spark-assembly-1.6.1-hadoop2.7.2.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/javax.servlet-3.0.0.v201112011016.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jetty-servlet-8.1.14.v20131031.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/scala-library-2.10.4.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jackson-annotations-2.3.0.jar\:/mn
t/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/akka-slf4j_2.10-2.2.3-shaded-protobuf.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/spark-network-yarn_2.10-1.6.1.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/commons-lang3-3.3.2.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/compress-lzf-1.0.0.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/spark-1.6.1-yarn-shuffle.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jackson-core-2.3.1.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/json4s-core_2.10-3.2.10.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jetty-plus-8.1.14.v20131031.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/config-1.0.2.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/oozie-sharelib-oozie-4.2.0.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/scala-reflect-2.10.0.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jetty-xml-8.1.14.v20131031.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/jetty-server-8.1.14.v20131031.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/commons-lang-2.4.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/snappy-java-1.0.5.3.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/kryo-2.21.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/spark-streaming_2.10-1.6.1.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/commons-httpclient-3.1.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/guava-14.0.1.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/commons-codec-1.4.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/chill_2.10-0.3.6.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/lz4-1.2.0.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/javax.transaction-1.1.1.v201105210645.jar\:/mnt/hd0
/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/commons-net-2.2.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/spark-hive_2.10-1.6.1.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/metrics-json-3.0.2.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/colt-1.2.0.jar\:/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/oozie-hadoop-utils-hadoop-2-4.2.0.jar\:/opt/application/Hadoop/current/etc/hadoop/\:/opt/application/Hadoop/current/share/hadoop/common/hadoop-common-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/common/hadoop-nfs-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/common/hadoop-common-2.7.2-tests.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/jersey-server-1.9.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/guava-11.0.2.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/jsp-api-2.1.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/junit-4.11.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/commons-codec-1.4.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/netty-3.6.2.Final.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/httpcore-4.2.5.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/asm-3.2.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/jetty-util-6.1.26.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/curator-framework-2.7.1.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/commons-digester-1.8.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/jetty-6.1.26.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/jackson-mapper-asl-1.9.13.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/httpclient-4.2.5.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/commons-collections-3.2.2.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/hadoop-annotations-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/protobuf-java-2.5.0.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/hadoop-auth-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/paranamer-2.3.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/api-asn1-api-1.0.0-M20.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/zookeeper-3.4.6.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/htrace-core-3.1.0-incubating.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/xmlenc-0.52.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/commons-cli-1.2.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/xz-1.0.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/jsch-0.1.42.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/apacheds-kerberos-codec-2.0.0-M15.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/api-util-1.0.0-M20.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/jersey-core-1.9.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/stax-api-1.0-2.ja
r\:/opt/application/Hadoop/current/share/hadoop/common/lib/jersey-json-1.9.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/avro-1.7.4.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/log4j-1.2.17.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/commons-logging-1.1.3.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/jackson-xc-1.9.13.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/slf4j-api-1.7.10.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/java-xmlbuilder-0.4.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/activation-1.1.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/jettison-1.1.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/servlet-api-2.5.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/jaxb-api-2.2.2.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/apacheds-i18n-2.0.0-M15.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/commons-httpclient-3.1.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/snappy-java-1.0.4.1.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/commons-beanutils-1.7.0.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/commons-io-2.4.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/mockito-all-1.8.5.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/jsr305-3.0.0.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/commons-math3-3.1.1.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/gson-2.2.4.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/hamcrest-core-1.3.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/curator-recipes-2.7.1.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/jets3t-0.9.0.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/jackson-jaxrs-1.9.13.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/commons-net-3.1.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/commons-lang-2.6.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/commons-compress-1.4.1.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/curator-client-2.7.1.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/jackson-core-asl-1.9.13.jar\:/opt/application/Hadoop/current/share/hadoop/common/lib/commons-configuration-1.6.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/hadoop-hdfs-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/hadoop-hdfs-2.7.2-tests.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/hadoop-hdfs-nfs-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/jersey-server-1.9.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/guava-11.0.2.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/commons-codec-1.4.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/asm-3.2.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/jetty-6.1.26.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/jackson-mapper-asl-1.9.13.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/leveldbjni-all-1.8.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/netty-all-4.0.23.Final.jar\:/opt/applicatio
n/Hadoop/current/share/hadoop/hdfs/lib/htrace-core-3.1.0-incubating.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/xmlenc-0.52.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/commons-cli-1.2.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/jersey-core-1.9.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/log4j-1.2.17.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/commons-logging-1.1.3.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/servlet-api-2.5.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/commons-io-2.4.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/jsr305-3.0.0.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/xercesImpl-2.9.1.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/xml-apis-1.3.04.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/commons-lang-2.6.jar\:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/jackson-core-asl-1.9.13.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-client-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/spatial-sdk-hadoop.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-server-common-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-server-applicationhistoryservice-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/spatial-sdk-json.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/esri-geometry-api.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-common-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-server-sharedcachemanager-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-server-tests-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-registry-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/spatial-sdk-hive.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-api-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jersey-server-1.9.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/guava-11.0.2.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jaxb-impl-2.2.3-1.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/commons-codec-1.4.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/netty-3.6.2.Final.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/asm-3.2.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jetty-util-6.1.26.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jetty-6.1.26.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jackson-mapper-asl-1.9.13.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/commons-collections-3.2.2.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/guice-servlet-3.0.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jersey-guice-1.9.jar\:/opt/application/Hado
op/current/share/hadoop/yarn/lib/guice-3.0.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/zookeeper-3.4.6.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/leveldbjni-all-1.8.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/commons-cli-1.2.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/xz-1.0.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jersey-core-1.9.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/stax-api-1.0-2.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jersey-json-1.9.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/log4j-1.2.17.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/commons-logging-1.1.3.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jackson-xc-1.9.13.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/activation-1.1.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jettison-1.1.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/servlet-api-2.5.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jaxb-api-2.2.2.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jersey-client-1.9.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/commons-io-2.4.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jsr305-3.0.0.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/zookeeper-3.4.6-tests.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jackson-jaxrs-1.9.13.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/commons-lang-2.6.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/commons-compress-1.4.1.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jackson-core-asl-1.9.13.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/aopalliance-1.0.jar\:/opt/application/Hadoop/current/share/hadoop/yarn/lib/javax.inject-1.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.7.2-tests.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/jersey-server-1.9.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/junit-4.11.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/asm-3.2.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/jackson-mapper-asl-1.9.13.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/hadoop-annotations-2.7.2.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/paranamer-2.3.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar\
:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/guice-3.0.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/leveldbjni-all-1.8.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/xz-1.0.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/jersey-core-1.9.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/avro-1.7.4.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/log4j-1.2.17.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/commons-io-2.4.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/hamcrest-core-1.3.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/jackson-core-asl-1.9.13.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/aopalliance-1.0.jar\:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/javax.inject-1.jar
user.name=shfs3453
java.vm.specification.version=1.7
sun.java.command=org.apache.hadoop.mapreduce.v2.app.MRAppMaster
java.home=/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.79.x86_64/jre
sun.arch.data.model=64
user.language=en
java.specification.vendor=Oracle Corporation
awt.toolkit=sun.awt.X11.XToolkit
java.vm.info=mixed mode
java.version=1.7.0_79
java.ext.dirs=/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.79.x86_64/jre/lib/ext\:/usr/java/packages/lib/ext
sun.boot.class.path=/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.79.x86_64/jre/lib/resources.jar\:/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.79.x86_64/jre/lib/rt.jar\:/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.79.x86_64/jre/lib/sunrsasign.jar\:/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.79.x86_64/jre/lib/jsse.jar\:/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.79.x86_64/jre/lib/jce.jar\:/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.79.x86_64/jre/lib/charsets.jar\:/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.79.x86_64/jre/lib/rhino.jar\:/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.79.x86_64/jre/lib/jfr.jar\:/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.79.x86_64/jre/classes
java.vendor=Oracle Corporation
file.separator=/
oozie.launcher.job.id=job_1461692698792_29704
oozie.action.stats.properties=/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1461692698792_29704/container_1461692698792_29704_01_000001/stats.properties
java.vendor.url.bug=http\://bugreport.sun.com/bugreport/
sun.io.unicode.encoding=UnicodeLittle
sun.cpu.endian=little
sun.cpu.isalist=
------------------------

=================================================================

>>> Invoking Main class now >>>

Fetching child yarn jobs
tag id : oozie-f217792cb72212277adf42b1fef23939
Child yarn jobs are found -

Spark Action Main class : org.apache.spark.deploy.SparkSubmit

Oozie Spark action configuration
=================================================================
--master yarn-cluster --deploy-mode cluster --name Pysparkpi example --conf spark.yarn.appMasterEnv.SPARK_HOME=/opt/application/Spark/current --conf spark.executorEnv.SPARK_HOME=/opt/application/Spark/current --conf spark.executorEnv.PYTHONPATH=/opt/application/Spark/current/python --py-files hdfs://sandbox/User/shfs3453/WORK/Oozie/pyspark/lib/py4j-0.9-src.zip,hdfs://sandbox/User/shfs3453/WORK/Oozie/pyspark/lib/pyspark.zip --verbose hdfs://sandbox/User/shfs3453/WORK/Oozie/pyspark/lib/pi.py 100
=================================================================

>>> Invoking Spark class now >>>

<<< Invocation of Main class completed <<<

Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SparkMain], main() threw exception, Application application_1461692698792_29705 finished with failed status
org.apache.spark.SparkException: Application application_1461692698792_29705 finished with failed status
        at org.apache.spark.deploy.yarn.Client.run(Client.scala:1034)
        at org.apache.spark.deploy.yarn.Client$.main(Client.scala:1081)
        at org.apache.spark.deploy.yarn.Client.main(Client.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
        at org.apache.oozie.action.hadoop.SparkMain.runSpark(SparkMain.java:104)
        at org.apache.oozie.action.hadoop.SparkMain.run(SparkMain.java:95)
        at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:47)
        at org.apache.oozie.action.hadoop.SparkMain.main(SparkMain.java:38)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:236)
        at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
        at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.runSubtask(LocalContainerLauncher.java:380)
        at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.runTask(LocalContainerLauncher.java:301)
        at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.access$200(LocalContainerLauncher.java:187)
        at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler$1.run(LocalContainerLauncher.java:230)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)

Oozie Launcher failed, finishing Hadoop job gracefully

Oozie Launcher, uploading action data to HDFS sequence file: hdfs://sandbox/user/shfs3453/oozie/0012689-160510172237486-oozie-W/spark-node--spark/action-data.seq

Oozie Launcher ends
          satishsaley Satish Subhashrao Saley added a comment -
          • referring to hdfs location for pyspark dependencies in --py-files option (see the sketch after this list)
          • setting PYTHONPATH in case of local mode
          • documentation
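          A rough sketch of what the first two bullets amount to when the launcher builds the spark-submit argument list. This is illustrative only, not the attached patch; the class name is made up, and the paths are the ones from the launcher log above:

            import java.util.ArrayList;
            import java.util.List;

            // Hypothetical sketch of the approach described above: point --py-files
            // at the HDFS copies of the pyspark dependencies, and expose PYTHONPATH
            // for local mode, where no YARN localization happens.
            public class PySparkArgsSketch {
                public static void main(String[] args) {
                    List<String> sparkArgs = new ArrayList<>();
                    sparkArgs.add("--py-files");
                    sparkArgs.add("hdfs://sandbox/User/shfs3453/WORK/Oozie/pyspark/lib/py4j-0.9-src.zip,"
                            + "hdfs://sandbox/User/shfs3453/WORK/Oozie/pyspark/lib/pyspark.zip");
                    // local mode only: make the pyspark modules importable directly
                    sparkArgs.add("--conf");
                    sparkArgs.add("spark.executorEnv.PYTHONPATH=/opt/application/Spark/current/python");
                    System.out.println(String.join(" ", sparkArgs));
                }
            }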
          hadoopqa Hadoop QA added a comment -

          Testing JIRA OOZIE-2482

          Cleaning local git workspace

          ----------------------------

          -1 Patch failed to apply to head of branch

          ----------------------------

          gezapeti Peter Cseh added a comment -

          Thank you for the great work Satish Subhashrao Saley!
          I managed to modify your patch and get it to work on my machine. I changed the following:

          • SparkActionExecutor sets SPARK_HOME instead of PYSPARK_ARCHIVES_PATH
          • SparkMain creates the folder python/lib under the current working directory and copies the needed zip files there (see the sketch below).
            I've attached my solution. I've included your documentation changes in it.
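          To illustrate the second bullet, a minimal sketch of the copy step, assuming the zip files have already been localized into the container's working directory and that SPARK_HOME is pointed at that directory. It uses commons-io, which becomes an explicit dependency later in this thread; the class and method names are made up, so this is a sketch, not the attached patch:

            import java.io.File;
            import java.io.IOException;
            import org.apache.commons.io.FileUtils;

            // Sketch: rebuild the python/lib layout Spark expects under the current
            // working directory, so that with SPARK_HOME pointed at the cwd,
            // spark-submit can find the pyspark runtime in $SPARK_HOME/python/lib.
            public class PySparkLibSetupSketch {
                static void preparePySparkLib() throws IOException {
                    File pythonLib = new File("python/lib"); // relative to the container cwd
                    FileUtils.forceMkdir(pythonLib);
                    // assumes the launcher localized these zips into the cwd
                    FileUtils.copyFileToDirectory(new File("pyspark.zip"), pythonLib);
                    FileUtils.copyFileToDirectory(new File("py4j-0.9-src.zip"), pythonLib);
                }
            }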
          hadoopqa Hadoop QA added a comment -

          Testing JIRA OOZIE-2482

          Cleaning local git workspace

          ----------------------------

          -1 Patch failed to apply to head of branch

          ----------------------------

          rkanter Robert Kanter added a comment -

          Now that OOZIE-2532 is in, the patch should apply, so I've kicked off Jenkins.

          rkanter Robert Kanter added a comment -

          Thanks Satish Subhashrao Saley and Peter Cseh for working on this. It's proved to be very tricky.
          Here's some feedback on the -4 patch:

          1. In AG_Install.twiki, where it says

            These files can be added either to workflow's lib/ directory or in sharelib mapping file.

            I think it should also mention that they can be added to the spark directory if using the "old" sharelib configuration.
          2. In DG_SparkActionExtension.twiki, where it says

            please refer to installation document

            that should link to the AG_Install.twiki section about the sharelib (there should be an anchor there already because it’s a heading).
          3. I don’t think it’s necessary to repeat all of the standard "The prepare element...", "The job-xml element...", etc. in the PySpark section in DG_SparkActionExtension.twiki. That’s already mentioned earlier. It’s only necessary to mention that the python file goes in the <jar> element.
          4. In SparkActionExecutor, something seems funny here:
                    String mapredChildEnv = conf.get("oozie.launcher." + MAPRED_CHILD_ENV);
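                    // note: this get() reads the launcher-prefixed key, while the
                    // set() calls below write the unprefixed MAPRED_CHILD_ENV key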
            
                    if (mapredChildEnv == null) {
                        conf.set(MAPRED_CHILD_ENV, sparkHome);
                    }
                    else if (!mapredChildEnv.contains("SPARK_HOME")) {
                        conf.set(MAPRED_CHILD_ENV, mapredChildEnv + "," + sparkHome);
                    }
                    return conf;
            

            We’re getting oozie.launcher.mapred.child.env from conf, but we’re setting mapred.child.env in conf. Shouldn’t these match?

          5. There’s an extra blank line added in SparkMain before the if statement for the VERBOSE_OPTION
          6. In SparkMain, there’s a Javadoc that says "... pyspark.zip an py4j-VERSION-src.zip files…". "An" should be "an" here.
          7. We don’t need to do it for this JIRA, but it might be nice to have a new PySpark example workflow. Can you file a new related JIRA for that?
          rkanter Robert Kanter added a comment -

          On #6, I meant that it should be "and", not "an".

          hadoopqa Hadoop QA added a comment -

          Testing JIRA OOZIE-2482

          Cleaning local git workspace

          ----------------------------

          +1 PATCH_APPLIES
          +1 CLEAN
          -1 RAW_PATCH_ANALYSIS
          . +1 the patch does not introduce any @author tags
          . +1 the patch does not introduce any tabs
          . +1 the patch does not introduce any trailing spaces
          . -1 the patch contains 1 line(s) longer than 132 characters
          . +1 the patch does adds/modifies 3 testcase(s)
          +1 RAT
          . +1 the patch does not seem to introduce new RAT warnings
          +1 JAVADOC
          . +1 the patch does not seem to introduce new Javadoc warnings
          +1 COMPILE
          . +1 HEAD compiles
          . +1 patch compiles
          . +1 the patch does not seem to introduce new javac warnings
          +1 BACKWARDS_COMPATIBILITY
          . +1 the patch does not change any JPA Entity/Colum/Basic/Lob/Transient annotations
          . +1 the patch does not modify JPA files
          -1 TESTS
          . Tests run: 1781
          . Tests failed: 1
          . Tests errors: 0

          . The patch failed the following testcases:

          . testIDGeneration(org.apache.oozie.service.TestZKUUIDService)

          +1 DISTRO
          . +1 distro tarball builds with the patch

          ----------------------------
          -1 Overall result, please check the reported -1(s)

          The full output of the test-patch run is available at

          . https://builds.apache.org/job/oozie-trunk-precommit-build/2901/

          gezapeti Peter Cseh added a comment - edited

          Thank you for the review Robert Kanter.
          1) changed to: These files can be added either to workflow's lib/ directory, to the sharelib or in sharelib mapping file.
          2) link added with anchor
          3) duplications removed.
          4) Fixed, just like in JavaActionExecutor.injectLauncherProperties(), we now add the original keys and the ones without the oozie.launcher prefix (sketched below).
          5) deleted
          6) typo fixed
          7) good idea, will do!
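          For reference, a rough sketch of the prefix handling described in 4), modeled on this description and on the snippet quoted in Robert's review; the class and method names are made up, so this is not the literal patch:

            import org.apache.hadoop.conf.Configuration;

            // Sketch: keep the oozie.launcher.-prefixed key and the plain key in
            // sync when appending the SPARK_HOME entry to the child environment.
            public class SparkHomeEnvSketch {
                static final String MAPRED_CHILD_ENV = "mapred.child.env";

                static Configuration injectSparkHome(Configuration conf, String sparkHome) {
                    // sparkHome is an entry like "SPARK_HOME=/opt/application/Spark/current"
                    String launcherKey = "oozie.launcher." + MAPRED_CHILD_ENV;
                    String childEnv = conf.get(launcherKey);
                    String value;
                    if (childEnv == null) {
                        value = sparkHome;
                    } else if (!childEnv.contains("SPARK_HOME")) {
                        value = childEnv + "," + sparkHome;
                    } else {
                        value = childEnv;
                    }
                    // write both keys, as JavaActionExecutor.injectLauncherProperties() does
                    conf.set(launcherKey, value);
                    conf.set(MAPRED_CHILD_ENV, value);
                    return conf;
                }
            }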

          hadoopqa Hadoop QA added a comment -

          Testing JIRA OOZIE-2482

          Cleaning local git workspace

          ----------------------------

          +1 PATCH_APPLIES
          +1 CLEAN
          -1 RAW_PATCH_ANALYSIS
          . +1 the patch does not introduce any @author tags
          . +1 the patch does not introduce any tabs
          . +1 the patch does not introduce any trailing spaces
          . -1 the patch contains 1 line(s) longer than 132 characters
          . +1 the patch does adds/modifies 3 testcase(s)
          +1 RAT
          . +1 the patch does not seem to introduce new RAT warnings
          +1 JAVADOC
          . +1 the patch does not seem to introduce new Javadoc warnings
          +1 COMPILE
          . +1 HEAD compiles
          . +1 patch compiles
          . +1 the patch does not seem to introduce new javac warnings
          +1 BACKWARDS_COMPATIBILITY
          . +1 the patch does not change any JPA Entity/Colum/Basic/Lob/Transient annotations
          . +1 the patch does not modify JPA files
          -1 TESTS
          . Tests run: 1781
          . Tests failed: 3
          . Tests errors: 0

          . The patch failed the following testcases:

          . testActionKillCommandDate(org.apache.oozie.command.coord.TestCoordActionsKillXCommand)
          . testMemoryUsageAndSpeed(org.apache.oozie.service.TestPartitionDependencyManagerService)
          . testBundleStatusTransitWithLock(org.apache.oozie.service.TestStatusTransitService)

          +1 DISTRO
          . +1 distro tarball builds with the patch

          ----------------------------
          -1 Overall result, please check the reported -1(s)

          The full output of the test-patch run is available at

          . https://builds.apache.org/job/oozie-trunk-precommit-build/2902/

          rkanter Robert Kanter added a comment -

          One last thing on the -5 patch: A dependency on commons-io is added to SparkMain, so we should be adding it to the spark sharelib explicitly instead of relying on Spark to pull it in.

          I tried running a PySpark job and a Spark job on a nonsecure and a secure cluster, with and without the zip files, and everything seems to be behaving as expected. However, it's not working with yarn-client mode, only yarn-cluster.

          Also, Satish Subhashrao Saley, what do you think of the latest patch (other than the yarn-client issue)? The approach is roughly the same, but it's setting different env vars.

          gezapeti Peter Cseh added a comment - edited

          Fixed up the documentation to explain that py files should be referred to locally; PySpark does not accept full hdfs paths in all running modes. After changing the path, pi.py ran in all 3 modes.
          Also added the commons-io dependency.
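          For context, here is a minimal sketch of the documented usage. It is a hedged example: the action name and properties are illustrative. pi.py is placed in the workflow's lib/ directory on HDFS and referenced by file name only, not by a full hdfs:// URI:

          <!-- Illustrative sketch only: the py file is referenced locally by name;
               a full hdfs:// path breaks in some running modes. -->
          <action name="spark-pi">
              <spark xmlns="uri:oozie:spark-action:0.1">
                  <job-tracker>${jobTracker}</job-tracker>
                  <name-node>${nameNode}</name-node>
                  <master>yarn-cluster</master>
                  <name>PySpark Pi</name>
                  <jar>pi.py</jar>
              </spark>
              <ok to="end"/>
              <error to="fail"/>
          </action>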

          hadoopqa Hadoop QA added a comment -

          Testing JIRA OOZIE-2482

          Cleaning local git workspace

          ----------------------------

          +1 PATCH_APPLIES
          +1 CLEAN
          -1 RAW_PATCH_ANALYSIS
          . +1 the patch does not introduce any @author tags
          . +1 the patch does not introduce any tabs
          . +1 the patch does not introduce any trailing spaces
          . -1 the patch contains 1 line(s) longer than 132 characters
          . +1 the patch does adds/modifies 3 testcase(s)
          +1 RAT
          . +1 the patch does not seem to introduce new RAT warnings
          +1 JAVADOC
          . +1 the patch does not seem to introduce new Javadoc warnings
          +1 COMPILE
          . +1 HEAD compiles
          . +1 patch compiles
          . +1 the patch does not seem to introduce new javac warnings
          +1 BACKWARDS_COMPATIBILITY
          . +1 the patch does not change any JPA Entity/Colum/Basic/Lob/Transient annotations
          . +1 the patch does not modify JPA files
          -1 TESTS
          . Tests run: 1781
          . Tests failed: 2
          . Tests errors: 0

          . The patch failed the following testcases:

          . testIDGeneration(org.apache.oozie.service.TestZKUUIDService)
          . testMultipleIDGeneration(org.apache.oozie.service.TestZKUUIDService)

          +1 DISTRO
          . +1 distro tarball builds with the patch

          ----------------------------
          -1 Overall result, please check the reported -1(s)

          The full output of the test-patch run is available at

          . https://builds.apache.org/job/oozie-trunk-precommit-build/2904/

          satishsaley Satish Subhashrao Saley added a comment -

          Hi Robert Kanter and Peter Cseh,
          The SPARK_HOME approach is nice and saves us from passing zip files through the --py-files option.
          But I am worried about Spark copying the zip files over again, since they reside on the local file system. We are also copying them over from lib/ to python/lib/.
          If we instead pass HDFS paths to pyspark.zip and py4j-0.9-src.zip through the --py-files option, this copy could be avoided (i.e. if these files are part of the sharelib, we can pass in that path; see the sketch after the logs below). For local mode, we would just use the files available under the local directory of the launcher job.

          I think it's a trade-off: simpler code with extra copying vs. more involved code to avoid copying.

          With SPARK_HOME in yarn-cluster mode:

          2016-05-25 15:44:22,873 INFO [uber-SubtaskRunner] org.apache.spark.deploy.yarn.Client: Uploading resource file:/private/tmp/hadoop-saley/nm-local-dir/usercache/saley/appcache/application_1464215511846_0003/container_1464215511846_0003_01_000001/python/lib/pyspark.zip -> hdfs://localhost:8020/user/saley/.sparkStaging/application_1464215511846_0004/pyspark.zip
          2016-05-25 15:44:22,880 INFO [uber-SubtaskRunner] org.apache.spark.deploy.yarn.Client: Uploading resource file:/private/tmp/hadoop-saley/nm-local-dir/usercache/saley/appcache/application_1464215511846_0003/container_1464215511846_0003_01_000001/python/lib/py4j-0.9-src.zip -> hdfs://localhost:8020/user/saley/.sparkStaging/application_1464215511846_0004/py4j-0.9-src.zip
          
          

          With PYSPARK_ARCHIVES_PATH in yarn-cluster mode (I have set up the zip files inside the sharelib):

          2016-05-25 23:35:43,440 INFO [uber-SubtaskRunner] org.apache.spark.deploy.yarn.Client: Source and destination file systems are the same. Not copying hdfs:/tmp/sharelib_dir/spark_yarn/share/spark/python/lib/pyspark.zip
          
          2016-05-25 23:35:43,460 INFO [uber-SubtaskRunner] org.apache.spark.deploy.yarn.Client: Source and destination file systems are the same. Not copying hdfs:/tmp/sharelib_dir/spark_yarn/share/spark/python/lib/py4j-0.9-src.zip
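
          To make the alternative concrete, here is a hedged sketch of passing the sharelib zips through --py-files in the action definition. The HDFS locations mirror the example logs above and are not fixed or default paths:

          <!-- Hypothetical sketch of the --py-files alternative discussed above;
               the zip locations are the ones from the example logs, not defaults. -->
          <spark-opts>
              --py-files hdfs:///tmp/sharelib_dir/spark_yarn/share/spark/python/lib/pyspark.zip,hdfs:///tmp/sharelib_dir/spark_yarn/share/spark/python/lib/py4j-0.9-src.zip
          </spark-opts>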
          
          
          rkanter Robert Kanter added a comment -

          These two zip files are tiny compared to all of the jars that end up having to be copied/reuploaded by the Spark action because of how Spark handles its jars and classpath (this whole nonsense), which also causes the action to reupload the Hadoop jars. One of the zip files is ~350 KB and the other is only ~50 KB, so they are hardly going to slow things down; meanwhile, the Spark sharelib jars are over 100 MB and the Hadoop jars are probably around there somewhere.

          The SPARK_HOME approach seems simpler, which is good. I think Peter Cseh is going to take another look at the PYSPARK_ARCHIVES_PATH approach. Anyway, we can discuss this in person tomorrow.

          rkanter Robert Kanter added a comment -

          We discussed this in person. We're going to use the SPARK_HOME approach for now (patch -6) and can revisit this later if it makes sense to switch; that would be invisible to the user, so it's not a big deal to change later.

          +1 on the -6 patch
          Test failures are unrelated. The long line is a twiki thing.

          rkanter Robert Kanter added a comment -

          Thanks Satish Subhashrao Saley and Peter Cseh for working on this. It's been a big problem and I'm glad we got it working now.

          Committed to master!

          rkanter Robert Kanter added a comment -

          Closing issue; Oozie 4.3.0 is released.


            People

            • Assignee:
              satishsaley Satish Subhashrao Saley
            • Reporter:
              BigDataOrange Alexandre Linte
            • Votes:
              1
            • Watchers:
              13
