Oozie / OOZIE-1643

Oozie doesn't parse Hadoop Job Id from the Hive action


Details

    • Type: Bug
    • Status: Closed
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: trunk
    • Fix Version/s: 4.1.0
    • Component/s: action
    • Labels: None

    Description

      I'm not sure how long this has been going on (possibly for quite a while), but the Hive action isn't able to parse the Hadoop Job Ids of the jobs launched by Hive.

      The way it's supposed to work is that HiveMain creates a hive-log4j.properties file, which redirects the output from the HiveCLI to the console (for easy viewing in the launcher), and a hive-exec-log4j.properties file, which redirects the output from one of the hive-exec classes to a log file; Oozie would then parse that log file for the Hadoop Job Ids.
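      For illustration, the intended hive-exec-log4j.properties would look roughly like the following (standard log4j 1.x syntax; the appender name and file path here are placeholders, not the values HiveMain actually generates):

      ```properties
      # Sketch only: route all hive-exec logging to a file the launcher can
      # read back and parse for Hadoop Job Ids. Path is illustrative.
      log4j.rootLogger=INFO, file
      log4j.appender.file=org.apache.log4j.FileAppender
      log4j.appender.file.File=hive-oozie.log
      log4j.appender.file.layout=org.apache.log4j.PatternLayout
      log4j.appender.file.layout.ConversionPattern=%d{ISO8601} %-5p %c: %m%n
      ```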

      What's happening instead is that the HiveCLI is picking up the hive-log4j.properties file bundled in hive-common.jar, which makes it log everything to stderr, so Oozie can't parse the Hadoop Job Id.
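      The kind of parsing involved can be sketched as follows; this is a minimal illustration, not Oozie's actual parser, and the class name and regex are assumptions:

      ```java
      import java.util.ArrayList;
      import java.util.Arrays;
      import java.util.List;
      import java.util.regex.Matcher;
      import java.util.regex.Pattern;

      // Hypothetical sketch: scan hive-exec log lines for Hadoop Job Ids of the
      // form "job_<cluster timestamp>_<sequence>" on "Starting Job = ..." lines.
      public class HiveLogJobIdScanner {
          private static final Pattern JOB_ID =
                  Pattern.compile("Starting Job = (job_\\d+_\\d+)");

          public static List<String> scan(Iterable<String> logLines) {
              List<String> ids = new ArrayList<>();
              for (String line : logLines) {
                  Matcher m = JOB_ID.matcher(line);
                  if (m.find()) {
                      ids.add(m.group(1));
                  }
              }
              return ids;
          }

          public static void main(String[] args) {
              List<String> ids = scan(Arrays.asList(
                      "Launching Job 1 out of 3",
                      "Starting Job = job_201312161418_0008, Tracking URL = http://localhost:50030/...",
                      "Ended Job = job_201312161418_0008"));
              System.out.println(ids);  // prints [job_201312161418_0008]
          }
      }
      ```

      Because the HiveCLI writes these lines to stderr instead of the expected log file, a scan like this finds nothing, which matches the empty "hadoopJobs=" capture shown below.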

      stdout
      ...
      <<< Invocation of Hive command completed <<<
      
       Hadoop Job IDs executed by Hive: 
      
      
      <<< Invocation of Main class completed <<<
      
      
      Oozie Launcher, capturing output data:
      =======================
      #
      #Mon Dec 16 16:01:34 PST 2013
      hadoopJobs=
      
      
      =======================
      
      stderr
      Picked up _JAVA_OPTIONS: -Djava.awt.headless=true
      2013-12-16 16:01:20.884 java[59363:1703] Unable to load realm info from SCDynamicStore
      WARNING: org.apache.hadoop.metrics.jvm.EventCounter is deprecated. Please use org.apache.hadoop.log.metrics.EventCounter in all the log4j.properties files.
      Logging initialized using configuration in jar:file:/Users/rkanter/dev/hadoop-1.2.0/dirs/mapred/taskTracker/distcache/-4202506229388278450_-1489127056_2111515407/localhost/user/rkanter/share/lib/lib_20131216160106/hive/hive-common-0.10.0.jar!/hive-log4j.properties
      Hive history file=/tmp/rkanter/hive_job_log_rkanter_201312161601_851054619.txt
      OK
      Time taken: 5.444 seconds
      Total MapReduce jobs = 3
      Launching Job 1 out of 3
      Number of reduce tasks is set to 0 since there's no reduce operator
      Starting Job = job_201312161418_0008, Tracking URL = http://localhost:50030/jobdetails.jsp?jobid=job_201312161418_0008
      Kill Command = /Users/rkanter/dev/hadoop-1.2.0/libexec/../bin/hadoop job  -kill job_201312161418_0008
      Hadoop job information for Stage-1: number of mappers: 0; number of reducers: 0
      2013-12-16 16:01:33,409 Stage-1 map = 0%,  reduce = 0%
      2013-12-16 16:01:34,415 Stage-1 map = 100%,  reduce = 100%
      Ended Job = job_201312161418_0008
      Ended Job = 1084818925, job is filtered out (removed at runtime).
      Ended Job = -956386500, job is filtered out (removed at runtime).
      Moving data to: hdfs://localhost:8020/tmp/hive-rkanter/hive_2013-12-16_16-01-28_168_4802779111653057155/-ext-10000
      Moving data to: /user/rkanter/examples/output-data/hive
      MapReduce Jobs Launched: 
      Job 0:  HDFS Read: 0 HDFS Write: 0 SUCCESS
      Total MapReduce CPU Time Spent: 0 msec
      OK
      Time taken: 6.284 seconds
      Log file: /Users/rkanter/dev/hadoop-1.2.0/dirs/mapred/taskTracker/rkanter/jobcache/job_201312161418_0007/attempt_201312161418_0007_m_000000_0/work/hive-oozie-job_201312161418_0007.log  not present. Therefore no Hadoop jobids found
      

      Attachments

        1. OOZIE-1643.patch
          0.9 kB
          Robert Kanter



            People

              Assignee: Robert Kanter (rkanter)
              Reporter: Robert Kanter (rkanter)
              Votes: 0
              Watchers: 3
