Hive / HIVE-22757

NullPointerException when executing SQLs


Details

    • Type: Bug
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: 2.3.6
    • Fix Version/s: None
    • Component/s: None

    Description

      When executing the following SQL:

      insert overwrite table ods.ods_1 partition(stat_day='20191209')
      select
      id
      ,user_id
      ,teacher_user_id
      ,partner_user_id
      ,order_id
      ,barcode
      ,sub_order_id
      ,item_id
      ,sales
      ,refund
      ,teacher_profit
      ,partner_profit
      ,teacher_refund_profit
      ,partner_refund_profit
      ,teacher_commission_value
      ,partner_commission_value
      ,biz_type
      ,pay_time
      ,item_profit_type
      ,black_mark
      ,is_deleted
      ,create_time
      ,modify_time
      from src.src_1
      where partition_date='20191209'
      union all
      select
      t1.id
      ,t1.user_id
      ,t1.teacher_user_id
      ,t1.partner_user_id
      ,t1.order_id
      ,t1.barcode
      ,t1.sub_order_id
      ,t1.item_id
      ,t1.sales
      ,t1.refund
      ,t1.teacher_profit
      ,t1.partner_profit
      ,t1.teacher_refund_profit
      ,t1.partner_refund_profit
      ,t1.teacher_commission_value
      ,t1.partner_commission_value
      ,t1.biz_type
      ,t1.pay_time
      ,t1.item_profit_type
      ,t1.black_mark
      ,t1.is_deleted
      ,t1.create_time
      ,t1.modify_time
      from
      (select *
      from ods.ods_1
      where stat_day='20191208'
      ) t1
      left join
      ( select order_id
      ,sub_order_id
      from src.src_1
      where partition_date='20191209'
      ) t2
      on t1.order_id=t2.order_id
      and t1.sub_order_id=t2.sub_order_id
      where t2.order_id is null
      

      A `java.lang.NullPointerException` is thrown because the list `neededNestedColumnPaths` has not been initialized when the `addAll` method is invoked.
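      The failure mode can be sketched as below. This is a hypothetical reduction of the logic around `ProjectionPusher.pushProjectionsAndFilters`, not the actual Hive code: the class and method names here are illustrative, and the guard shown is only the general shape of a null-check fix.

```java
import java.util.HashSet;
import java.util.List;
import java.util.Set;

// Hypothetical reduction of the failing path; names are illustrative.
public class NeededNestedColumnPathsSketch {

    // Stands in for the accessor whose result was never initialized for
    // this query, so it returns null instead of an empty list.
    static List<String> getNeededNestedColumnPaths() {
        return null;
    }

    public static void main(String[] args) {
        // AbstractCollection.addAll (inherited by HashSet) iterates its
        // argument, so a null argument throws NullPointerException,
        // matching the AbstractCollection.addAll frame in the stack trace.
        Set<String> neededNestedColumnPaths = new HashSet<>();
        try {
            neededNestedColumnPaths.addAll(getNeededNestedColumnPaths());
        } catch (NullPointerException e) {
            System.out.println("NPE, as in the failing task");
        }

        // The general shape of a one-line fix: guard against null
        // before calling addAll.
        List<String> paths = getNeededNestedColumnPaths();
        if (paths != null) {
            neededNestedColumnPaths.addAll(paths);
        }
        System.out.println("size=" + neededNestedColumnPaths.size());
    }
}
```

      Guarding the argument (or initializing the collection to an empty list) avoids the NPE and leaves the projection pushdown a no-op for the affected table scan, which matches the behavior of queries that do carry nested-column info.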

      Launching Job 5 out of 5
      Number of reduce tasks is set to 0 since there's no reduce operator
      Starting Job = job_1566481621886_4925755, Tracking URL = http://TXIDC65-bigdata-resourcemanager1:8042/proxy/application_1566481621886_4925755/
      Kill Command = /usr/local/yunji/hadoop/bin/hadoop job  -kill job_1566481621886_4925755
      Hadoop job information for Stage-4: number of mappers: 1; number of reducers: 0
      2019-12-24 16:00:40,584 Stage-4 map = 0%,  reduce = 0%
      2019-12-24 16:01:40,956 Stage-4 map = 0%,  reduce = 0%
      2019-12-24 16:02:41,451 Stage-4 map = 0%,  reduce = 0%
      2019-12-24 16:02:45,550 Stage-4 map = 100%,  reduce = 0%
      Ended Job = job_1566481621886_4925755 with errors
      Error during job, obtaining debugging information...
      Examining task ID: task_1566481621886_4925755_m_000000 (and more) from job job_1566481621886_4925755
      
      Task with the most failures(4):
      -----
      Task ID:
        task_1566481621886_4925755_m_000000
      
      URL:
        http://TXIDC65-bigdata-resourcemanager1:8088/taskdetails.jsp?jobid=job_1566481621886_4925755&tipid=task_1566481621886_4925755_m_000000
      -----
      Diagnostic Messages for this Task:
      Error: java.io.IOException: java.lang.reflect.InvocationTargetException
      	at org.apache.hadoop.hive.io.HiveIOExceptionHandlerChain.handleRecordReaderCreationException(HiveIOExceptionHandlerChain.java:97)
      	at org.apache.hadoop.hive.io.HiveIOExceptionHandlerUtil.handleRecordReaderCreationException(HiveIOExceptionHandlerUtil.java:57)
      	at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.initNextRecordReader(HadoopShimsSecure.java:271)
      	at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.<init>(HadoopShimsSecure.java:217)
      	at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileInputFormatShim.getRecordReader(HadoopShimsSecure.java:345)
      	at org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getRecordReader(CombineHiveInputFormat.java:695)
      	at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.<init>(MapTask.java:169)
      	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:438)
      	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
      	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
      	at java.security.AccessController.doPrivileged(Native Method)
      	at javax.security.auth.Subject.doAs(Subject.java:422)
      	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1924)
      	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
      Caused by: java.lang.reflect.InvocationTargetException
      	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
      	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
      	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
      	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
      	at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.initNextRecordReader(HadoopShimsSecure.java:257)
      	... 11 more
      Caused by: java.lang.NullPointerException
      	at java.util.AbstractCollection.addAll(AbstractCollection.java:343)
      	at org.apache.hadoop.hive.ql.io.parquet.ProjectionPusher.pushProjectionsAndFilters(ProjectionPusher.java:118)
      	at org.apache.hadoop.hive.ql.io.parquet.ProjectionPusher.pushProjectionsAndFilters(ProjectionPusher.java:189)
      	at org.apache.hadoop.hive.ql.io.parquet.ParquetRecordReaderBase.getSplit(ParquetRecordReaderBase.java:75)
      	at org.apache.hadoop.hive.ql.io.parquet.read.ParquetRecordReaderWrapper.<init>(ParquetRecordReaderWrapper.java:75)
      	at org.apache.hadoop.hive.ql.io.parquet.read.ParquetRecordReaderWrapper.<init>(ParquetRecordReaderWrapper.java:60)
      	at org.apache.hadoop.hive.ql.io.parquet.MapredParquetInputFormat.getRecordReader(MapredParquetInputFormat.java:75)
      	at org.apache.hadoop.hive.ql.io.CombineHiveRecordReader.<init>(CombineHiveRecordReader.java:99)
      	... 16 more
      FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
      MapReduce Jobs Launched:
      Stage-Stage-11: Map: 2   Cumulative CPU: 8.06 sec   HDFS Read: 30968 HDFS Write: 192 SUCCESS
      Stage-Stage-2: Map: 3   Cumulative CPU: 11.67 sec   HDFS Read: 38059 HDFS Write: 2079 SUCCESS
      Stage-Stage-4: Map: 1   HDFS Read: 0 HDFS Write: 0 FAIL
      Total MapReduce CPU Time Spent: 19 seconds 730 msec
      

      Attachments

        1. HIVE-22757.patch (1 kB, by Deegue)


      People

        Assignee: Unassigned
        Reporter: Deegue
        Votes: 0
        Watchers: 1


      Time Tracking

        Original Estimate: Not Specified
        Remaining Estimate: 0h
        Time Spent: 20m