
HIVE-8430: Enable parquet_join.q [Spark Branch]


Details

    • Type: Test
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 1.1.0
    • Component/s: Spark
    • Labels: None

Description

      Currently, even with HIVE-8412 applied, parquet_join.q fails with the following exception:

      org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.IllegalArgumentException: TaskAttemptId string : 000000_52 is not properly formed
      	at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getHiveRecordWriter(HiveFileFormatUtils.java:274)
      	at org.apache.hadoop.hive.ql.exec.FileSinkOperator.createBucketForFileIdx(FileSinkOperator.java:544)
      	at org.apache.hadoop.hive.ql.exec.FileSinkOperator.createBucketFiles(FileSinkOperator.java:488)
      	at org.apache.hadoop.hive.ql.exec.FileSinkOperator.processOp(FileSinkOperator.java:597)
      	at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:799)
      	at org.apache.hadoop.hive.ql.exec.SelectOperator.processOp(SelectOperator.java:84)
      	at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:799)
      	at org.apache.hadoop.hive.ql.exec.TableScanOperator.processOp(TableScanOperator.java:95)
      	at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:799)
      	at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:536)
      	at org.apache.hadoop.hive.ql.exec.spark.SparkMapRecordHandler.processRow(SparkMapRecordHandler.java:139)
      	at org.apache.hadoop.hive.ql.exec.spark.HiveMapFunctionResultList.processNextRecord(HiveMapFunctionResultList.java:47)
      	at org.apache.hadoop.hive.ql.exec.spark.HiveMapFunctionResultList.processNextRecord(HiveMapFunctionResultList.java:28)
      	at org.apache.hadoop.hive.ql.exec.spark.HiveBaseFunctionResultList$ResultIterator.hasNext(HiveBaseFunctionResultList.java:108)
      	at scala.collection.convert.Wrappers$JIteratorWrapper.hasNext(Wrappers.scala:41)
      	at scala.collection.Iterator$class.foreach(Iterator.scala:727)
      	at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
      	at org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:763)
      	at org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:763)
      	at org.apache.spark.SparkContext$$anonfun$runJob$3.apply(SparkContext.scala:1138)
      	at org.apache.spark.SparkContext$$anonfun$runJob$3.apply(SparkContext.scala:1138)
      	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
      	at org.apache.spark.scheduler.Task.run(Task.scala:56)
      	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:181)
      	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
      	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
      	at java.lang.Thread.run(Thread.java:745)
      Caused by: java.lang.IllegalArgumentException: TaskAttemptId string : 000000_52 is not properly formed
      	at org.apache.hadoop.mapreduce.TaskAttemptID.forName(TaskAttemptID.java:201)
      	at org.apache.hadoop.hive.ql.io.parquet.write.ParquetRecordWriterWrapper.<init>(ParquetRecordWriterWrapper.java:49)
      	at org.apache.hadoop.hive.ql.io.parquet.MapredParquetOutputFormat.getParquerRecordWriterWrapper(MapredParquetOutputFormat.java:122)
      	at org.apache.hadoop.hive.ql.io.parquet.MapredParquetOutputFormat.getHiveRecordWriter(MapredParquetOutputFormat.java:113)
      	at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getRecordWriter(HiveFileFormatUtils.java:284)
      	at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getHiveRecordWriter(HiveFileFormatUtils.java:271)
      	... 26 more
      

      We need to investigate this.
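      For context, here is a minimal standalone sketch of why the parse fails. Assumptions: per the stack trace, ParquetRecordWriterWrapper feeds the task attempt ID string from the job conf straight into TaskAttemptID.forName, and the well-formed example ID below is the one from the Hadoop TaskAttemptID javadoc.

      {code:java}
      import org.apache.hadoop.mapreduce.TaskAttemptID;

      public class TaskAttemptIdFormatCheck {
        public static void main(String[] args) {
          // A well-formed attempt ID follows the pattern
          // attempt_<jtIdentifier>_<jobId>_(m|r)_<taskId>_<attemptId>.
          // This example is taken from the Hadoop TaskAttemptID javadoc:
          TaskAttemptID ok = TaskAttemptID.forName("attempt_200707121733_0003_m_000005_0");
          System.out.println("parsed: " + ok);

          // The string from the exception above is a bare <taskId>_<attemptId>
          // suffix, which does not match that pattern, so forName throws
          // "TaskAttemptId string : 000000_52 is not properly formed".
          TaskAttemptID.forName("000000_52"); // throws IllegalArgumentException
        }
      }
      {code}

      If that assumption holds, the Spark branch appears to be handing the record writer a bare <taskId>_<attemptId> string rather than a full MapReduce-style attempt ID, so any output format that round-trips the ID through TaskAttemptID.forName would fail the same way.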


People

    • Assignee: Unassigned
    • Reporter: Chao Sun (csun)
    • Votes: 0
    • Watchers: 1
