0: jdbc:hive2://nc-h04:10000/casino> insert into table foo7 VALUES(1);
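
The log below records an ACID insert: DbTxnManager opens transaction 556 and the result is eventually moved into a delta_0000556_0000556/bucket_00000 file. The target table's definition is not shown anywhere in this capture, so the following is only a plausible sketch of what foo7 would have to look like for a Hive 0.14 ACID write (bucketed, ORC, transactional); the column is taken from the later "struct foo7 { i32 id}" lines and the bucket count from the single bucket file, while the clustering column is a guess.

    -- Hypothetical DDL for the target table; not taken from this log.
    CREATE TABLE foo7 (id INT)
    CLUSTERED BY (id) INTO 1 BUCKETS
    STORED AS ORC
    TBLPROPERTIES ('transactional' = 'true');
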
14/09/30 21:51:11 DEBUG parse.VariableSubstitution: Substitution is on: insert into table foo7 VALUES(1)
14/09/30 21:51:11 INFO log.PerfLogger:
14/09/30 21:51:11 DEBUG parse.VariableSubstitution: Substitution is on: insert into table foo7 VALUES(1)
14/09/30 21:51:11 INFO log.PerfLogger:
14/09/30 21:51:11 INFO parse.ParseDriver: Parsing command: insert into table foo7 VALUES(1)
14/09/30 21:51:11 INFO parse.ParseDriver: Parse Completed
14/09/30 21:51:11 INFO log.PerfLogger:
14/09/30 21:51:11 INFO log.PerfLogger:
14/09/30 21:51:11 INFO parse.SemanticAnalyzer: Starting Semantic Analysis
14/09/30 21:51:11 DEBUG hdfs.DFSClient: /tmp/hive/hduser/0d8ccf6b-84e1-4008-a8d6-a349c7d5ff3b/_tmp_space.db/Values__Tmp__Table__5: masked=rwxr-xr-x
14/09/30 21:51:11 DEBUG ipc.Client: The ping interval is 60000 ms.
14/09/30 21:51:11 DEBUG ipc.Client: Connecting to nc-h04/192.168.20.6:8020
14/09/30 21:51:11 DEBUG ipc.ProtobufRpcEngine: Call: mkdirs took 5ms
14/09/30 21:51:11 DEBUG hdfs.DFSClient: /tmp/hive/hduser/0d8ccf6b-84e1-4008-a8d6-a349c7d5ff3b/_tmp_space.db/Values__Tmp__Table__5/data_file: masked=rw-r--r--
14/09/30 21:51:11 DEBUG ipc.ProtobufRpcEngine: Call: create took 3ms
14/09/30 21:51:11 DEBUG hdfs.DFSClient: computePacketChunkSize: src=/tmp/hive/hduser/0d8ccf6b-84e1-4008-a8d6-a349c7d5ff3b/_tmp_space.db/Values__Tmp__Table__5/data_file, chunkSize=516, chunksPerPacket=127, packetSize=65532
14/09/30 21:51:11 DEBUG hdfs.DFSClient: DFSClient writeChunk allocating new packet seqno=0, src=/tmp/hive/hduser/0d8ccf6b-84e1-4008-a8d6-a349c7d5ff3b/_tmp_space.db/Values__Tmp__Table__5/data_file, packetSize=65532, chunksPerPacket=127, bytesCurBlock=0
14/09/30 21:51:11 DEBUG hdfs.DFSClient: Queued packet 0
14/09/30 21:51:11 DEBUG hdfs.DFSClient: Queued packet 1
14/09/30 21:51:11 DEBUG hdfs.DFSClient: Waiting for ack for: 1
14/09/30 21:51:11 DEBUG ipc.ProtobufRpcEngine: Call: complete took 2ms
14/09/30 21:51:11 DEBUG exec.Utilities: Use session specified class loader
14/09/30 21:51:11 DEBUG exec.Utilities: Use session specified class loader
14/09/30 21:51:11 DEBUG exec.Utilities: Use session specified class loader
14/09/30 21:51:11 DEBUG exec.Utilities: Use session specified class loader
14/09/30 21:51:11 DEBUG exec.Utilities: Use session specified class loader
14/09/30 21:51:11 DEBUG hive.log: DDL: struct Values__Tmp__Table__5 { string tmp_values_col1}
14/09/30 21:51:11 DEBUG lazy.LazySimpleSerDe: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe initialized with: columnNames=[tmp_values_col1] columnTypes=[string] separator=[[B@214add6d] nullstring=\N lastColumnTakesRest=false
14/09/30 21:51:11 DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 1ms
14/09/30 21:51:11 INFO parse.SemanticAnalyzer: Completed phase 1 of Semantic Analysis
14/09/30 21:51:11 INFO parse.SemanticAnalyzer: Get metadata for source tables
14/09/30 21:51:11 DEBUG exec.Utilities: Use session specified class loader
14/09/30 21:51:11 DEBUG exec.Utilities: Use session specified class loader
14/09/30 21:51:11 INFO parse.SemanticAnalyzer: Get metadata for subqueries
14/09/30 21:51:11 INFO parse.SemanticAnalyzer: Get metadata for destination tables
14/09/30 21:51:11 DEBUG exec.Utilities: Use session specified class loader
14/09/30 21:51:11 INFO parse.SemanticAnalyzer: Completed getting MetaData in Semantic Analysis
14/09/30 21:51:11 INFO parse.SemanticAnalyzer: Can not invoke CBO; query contains operators not supported for CBO.
14/09/30 21:51:11 DEBUG hive.log: DDL: struct values__tmp__table__5 { string tmp_values_col1}
14/09/30 21:51:11 DEBUG lazy.LazySimpleSerDe: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe initialized with: columnNames=[tmp_values_col1] columnTypes=[string] separator=[[B@3b2439c9] nullstring=\N lastColumnTakesRest=false
14/09/30 21:51:11 DEBUG parse.SemanticAnalyzer: Created Table Plan for values__tmp__table__5 TS[16]
14/09/30 21:51:11 DEBUG parse.SemanticAnalyzer: tree: (TOK_SELECT (TOK_SELEXPR TOK_ALLCOLREF))
14/09/30 21:51:11 DEBUG parse.SemanticAnalyzer: genSelectPlan: input = values__tmp__table__5{(tmp_values_col1,tmp_values_col1: string)(block__offset__inside__file,BLOCK__OFFSET__INSIDE__FILE: bigint)(input__file__name,INPUT__FILE__NAME: string)(row__id,ROW__ID: struct)}
14/09/30 21:51:11 DEBUG parse.SemanticAnalyzer: Created Select Plan row schema: values__tmp__table__5{(tmp_values_col1,_col0: string)}
14/09/30 21:51:11 DEBUG parse.SemanticAnalyzer: Created Select Plan for clause: insclause-0
14/09/30 21:51:11 DEBUG hdfs.DFSClient: /tmp/hive/hduser/0d8ccf6b-84e1-4008-a8d6-a349c7d5ff3b/hive_2014-09-30_21-51-11_143_7150781650032843032-1: masked=rwx------
14/09/30 21:51:11 DEBUG ipc.ProtobufRpcEngine: Call: mkdirs took 2ms
14/09/30 21:51:11 DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 1ms
14/09/30 21:51:11 DEBUG hive.log: DDL: struct foo7 { i32 id}
14/09/30 21:51:11 DEBUG hive.log: DDL: struct foo7 { i32 id}
14/09/30 21:51:11 DEBUG exec.Utilities: Use session specified class loader
14/09/30 21:51:11 INFO parse.SemanticAnalyzer: Turning off vectorization for acid write operation
14/09/30 21:51:11 INFO parse.SemanticAnalyzer: Modifying config values for ACID write
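
The analyzer notes it is adjusting configuration for an ACID write and has just turned vectorization off. For the statement to get this far, the session normally has the standard Hive 0.14 transaction settings enabled; they are not visible in this log, so the values below are an assumption rather than a capture:

    -- Assumed session settings for ACID inserts (not shown in this log):
    SET hive.support.concurrency = true;
    SET hive.enforce.bucketing = true;
    SET hive.exec.dynamic.partition.mode = nonstrict;
    SET hive.txn.manager = org.apache.hadoop.hive.ql.lockmgr.DbTxnManager;
    -- and on the metastore side, typically:
    -- hive.compactor.initiator.on = true, hive.compactor.worker.threads = 1
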
14/09/30 21:51:11 DEBUG exec.Utilities: Use session specified class loader
14/09/30 21:51:11 DEBUG exec.Utilities: Use session specified class loader
14/09/30 21:51:11 DEBUG exec.Utilities: Use session specified class loader
14/09/30 21:51:11 DEBUG exec.Utilities: Use session specified class loader
14/09/30 21:51:11 DEBUG exec.FunctionRegistry: Method didn't match: passed = [string] accepted = [bigint] method = public org.apache.hadoop.io.IntWritable org.apache.hadoop.hive.ql.udf.UDFToInteger.evaluate(org.apache.hadoop.io.LongWritable)
14/09/30 21:51:11 DEBUG exec.FunctionRegistry: Method didn't match: passed = [string] accepted = [struct] method = public org.apache.hadoop.io.IntWritable org.apache.hadoop.hive.ql.udf.UDFToInteger.evaluate(org.apache.hadoop.hive.ql.io.RecordIdentifier)
14/09/30 21:51:11 DEBUG exec.FunctionRegistry: Method did match: passed = [string] accepted = [double] method = public org.apache.hadoop.io.IntWritable org.apache.hadoop.hive.ql.udf.UDFToInteger.evaluate(org.apache.hadoop.hive.serde2.io.DoubleWritable)
14/09/30 21:51:11 DEBUG exec.FunctionRegistry: Method did match: passed = [string] accepted = [string] method = public org.apache.hadoop.io.IntWritable org.apache.hadoop.hive.ql.udf.UDFToInteger.evaluate(org.apache.hadoop.io.Text)
14/09/30 21:51:11 DEBUG exec.Utilities: Use session specified class loader
14/09/30 21:51:11 DEBUG parse.SemanticAnalyzer: Created FileSink Plan for clause: insclause-0dest_path: hdfs://nc-h04/user/hive/warehouse/casino.db/foo7 row schema: {(_col0,_col0: int)}
14/09/30 21:51:11 DEBUG parse.SemanticAnalyzer: Created Body Plan for Query Block null
14/09/30 21:51:11 DEBUG parse.SemanticAnalyzer: Created Plan for Query Block null
14/09/30 21:51:11 DEBUG parse.SemanticAnalyzer: Before logical optimization
TS[16]-SEL[17]-SEL[18]-FS[19]
14/09/30 21:51:11 INFO ppd.OpProcFactory: Processing for FS(19)
14/09/30 21:51:11 INFO ppd.OpProcFactory: Processing for SEL(18)
14/09/30 21:51:11 INFO ppd.OpProcFactory: Processing for SEL(17)
14/09/30 21:51:11 INFO ppd.OpProcFactory: Processing for TS(16)
14/09/30 21:51:11 DEBUG ppd.PredicatePushDown: After PPD:
TS[16]-SEL[17]-SEL[18]-FS[19]
14/09/30 21:51:11 DEBUG optimizer.ConstantPropagateProcCtx: Getting constants of op:TS[16] with rr:values__tmp__table__5{(tmp_values_col1,tmp_values_col1: string)(block__offset__inside__file,BLOCK__OFFSET__INSIDE__FILE: bigint)(input__file__name,INPUT__FILE__NAME: string)(row__id,ROW__ID: struct)}
14/09/30 21:51:11 DEBUG optimizer.ConstantPropagateProcCtx: Offerring constants [] to operator TS[16]
14/09/30 21:51:11 DEBUG optimizer.ConstantPropagateProcCtx: Getting constants of op:SEL[17] with rr:values__tmp__table__5{(tmp_values_col1,_col0: string)}
14/09/30 21:51:11 DEBUG optimizer.ConstantPropagateProcCtx: Offerring constants [] to operator SEL[17]
14/09/30 21:51:11 DEBUG optimizer.ConstantPropagateProcFactory: New column list:(Column[tmp_values_col1])
14/09/30 21:51:11 DEBUG optimizer.ConstantPropagateProcCtx: Getting constants of op:SEL[18] with rr:{(_col0,_col0: int)}
14/09/30 21:51:11 DEBUG optimizer.ConstantPropagateProcCtx: Offerring constants [] to operator SEL[18]
14/09/30 21:51:11 DEBUG exec.Utilities: Use session specified class loader
14/09/30 21:51:11 DEBUG exec.Utilities: Use session specified class loader
14/09/30 21:51:11 DEBUG optimizer.ConstantPropagateProcFactory: New column list:(GenericUDFBridge(Column[_col0]))
14/09/30 21:51:11 DEBUG optimizer.ConstantPropagateProcCtx: Getting constants of op:FS[19] with rr:{(_col0,_col0: int)}
14/09/30 21:51:11 DEBUG optimizer.ConstantPropagateProcCtx: Offerring constants [] to operator FS[19]
14/09/30 21:51:11 DEBUG index.RewriteGBUsingIndex: No Valid Index Found to apply Rewrite, skipping RewriteGBUsingIndex optimization
14/09/30 21:51:11 DEBUG parse.SemanticAnalyzer: After logical optimization
TS[16]-SEL[17]-FS[19]
14/09/30 21:51:11 DEBUG optimizer.ConstantPropagateProcCtx: Getting constants of op:TS[16] with rr:values__tmp__table__5{(tmp_values_col1,tmp_values_col1: string)(block__offset__inside__file,BLOCK__OFFSET__INSIDE__FILE: bigint)(input__file__name,INPUT__FILE__NAME: string)(row__id,ROW__ID: struct)}
14/09/30 21:51:11 DEBUG optimizer.ConstantPropagateProcCtx: Offerring constants [] to operator TS[16]
14/09/30 21:51:11 DEBUG optimizer.ConstantPropagateProcCtx: Getting constants of op:SEL[17] with rr:{(_col0,_col0: int)}
14/09/30 21:51:11 DEBUG optimizer.ConstantPropagateProcCtx: Offerring constants [] to operator SEL[17]
14/09/30 21:51:11 DEBUG exec.Utilities: Use session specified class loader
14/09/30 21:51:11 DEBUG exec.Utilities: Use session specified class loader
14/09/30 21:51:11 ERROR optimizer.ConstantPropagateProcFactory: The UDF implementation class 'org.apache.hadoop.hive.ql.udf.UDFToInteger' is not present in the class path
14/09/30 21:51:11 DEBUG optimizer.ConstantPropagateProcFactory: Function class org.apache.hadoop.hive.ql.udf.generic.GenericUDFBridge undeterministic, quit folding.
14/09/30 21:51:11 DEBUG optimizer.ConstantPropagateProcFactory: New column list:(GenericUDFBridge(Column[tmp_values_col1]))
14/09/30 21:51:11 DEBUG optimizer.ConstantPropagateProcCtx: Getting constants of op:FS[19] with rr:{(_col0,_col0: int)}
14/09/30 21:51:11 DEBUG optimizer.ConstantPropagateProcCtx: Offerring constants [] to operator FS[19]
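
The ERROR from ConstantPropagateProcFactory above is noisy but harmless in this run: FunctionRegistry already resolved UDFToInteger.evaluate(Text) for the string-to-int conversion a few lines earlier, and the statement goes on to complete. The optimizer merely declines to constant-fold the cast that the INSERT ... VALUES temp table introduces ("quit folding"). The conversion it was trying to fold is just the ordinary cast below, shown only as an illustration:

    -- The implicit conversion behind tmp_values_col1 (string) -> id (int):
    SELECT CAST('1' AS INT);
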
14/09/30 21:51:11 INFO log.PerfLogger:
No rows affected (7.405 seconds)
14/09/30 21:51:11 INFO log.PerfLogger:
14/09/30 21:51:11 DEBUG ipc.ProtobufRpcEngine: Call: getContentSummary took 1ms
14/09/30 21:51:11 DEBUG stats.StatsUtils: Estimated average row size: 100
14/09/30 21:51:11 DEBUG exec.TableScanOperator: Setting stats (Num rows: 0 Data size: 2 Basic stats: PARTIAL Column stats: NONE) on TS[16]
14/09/30 21:51:11 DEBUG annotation.StatsRulesProcFactory: [0] STATS-TS[16] (values__tmp__table__5): numRows: 0 dataSize: 2 basicStatsState: PARTIAL colStatsState: NONE colStats: {}
14/09/30 21:51:11 DEBUG exec.SelectOperator: Setting stats (Num rows: 0 Data size: 2 Basic stats: PARTIAL Column stats: NONE) on SEL[17]
14/09/30 21:51:11 DEBUG annotation.StatsRulesProcFactory: [1] STATS-SEL[17]: numRows: 0 dataSize: 2 basicStatsState: PARTIAL colStatsState: NONE colStats: {}
14/09/30 21:51:11 DEBUG annotation.StatsRulesProcFactory: [0] STATS-FS[19]: numRows: 0 dataSize: 2 basicStatsState: PARTIAL colStatsState: NONE colStats: {}
14/09/30 21:51:11 DEBUG ipc.ProtobufRpcEngine: Call: getListing took 1ms
14/09/30 21:51:11 DEBUG exec.TableScanOperator: Setting traits (org.apache.hadoop.hive.ql.plan.OpTraits@33b8f40d) on TS[16]
14/09/30 21:51:11 DEBUG exec.SelectOperator: Setting traits (org.apache.hadoop.hive.ql.plan.OpTraits@339e9c02) on SEL[17]
14/09/30 21:51:11 DEBUG exec.FileSinkOperator: Setting traits (org.apache.hadoop.hive.ql.plan.OpTraits@339e9c02) on FS[19]
14/09/30 21:51:11 DEBUG parse.TezCompiler: Component:
14/09/30 21:51:11 DEBUG parse.TezCompiler: Operator: TS, 16
14/09/30 21:51:11 DEBUG parse.TezCompiler: Component:
14/09/30 21:51:11 DEBUG parse.TezCompiler: Operator: SEL, 17
14/09/30 21:51:11 DEBUG parse.TezCompiler: Component:
14/09/30 21:51:11 DEBUG parse.TezCompiler: Operator: FS, 19
14/09/30 21:51:11 INFO parse.TezCompiler: Cycle free: true
14/09/30 21:51:11 DEBUG parse.GenTezWork: Root operator: TS[16]
14/09/30 21:51:11 DEBUG parse.GenTezWork: Leaf operator: FS[19]
14/09/30 21:51:11 DEBUG parse.GenTezUtils: Adding map work (Map 1) for TS[16]
14/09/30 21:51:11 DEBUG hive.log: DDL: struct values__tmp__table__5 { string tmp_values_col1}
14/09/30 21:51:11 DEBUG hive.log: DDL: struct values__tmp__table__5 { string tmp_values_col1}
14/09/30 21:51:11 DEBUG exec.Utilities: Use session specified class loader
14/09/30 21:51:11 DEBUG exec.Utilities: Use session specified class loader
14/09/30 21:51:11 DEBUG hive.log: DDL: struct values__tmp__table__5 { string tmp_values_col1}
14/09/30 21:51:11 DEBUG optimizer.GenMapRedUtils: Adding hdfs://nc-h04/tmp/hive/hduser/0d8ccf6b-84e1-4008-a8d6-a349c7d5ff3b/_tmp_space.db/Values__Tmp__Table__5 of tablevalues__tmp__table__5
14/09/30 21:51:11 DEBUG optimizer.GenMapRedUtils: Information added for path hdfs://nc-h04/tmp/hive/hduser/0d8ccf6b-84e1-4008-a8d6-a349c7d5ff3b/_tmp_space.db/Values__Tmp__Table__5
14/09/30 21:51:11 DEBUG parse.GenTezWork: First pass. Leaf operator: FS[19]
14/09/30 21:51:11 DEBUG parse.TezCompiler: There are 0 app master events.
14/09/30 21:51:11 DEBUG physical.NullScanTaskDispatcher: Looking at: Map 1
14/09/30 21:51:11 INFO physical.NullScanTaskDispatcher: Looking for table scans where optimization is applicable
14/09/30 21:51:11 INFO physical.NullScanTaskDispatcher: Found 0 null table scans
14/09/30 21:51:11 DEBUG physical.NullScanTaskDispatcher: Looking at: Map 1
14/09/30 21:51:11 INFO physical.NullScanTaskDispatcher: Looking for table scans where optimization is applicable
14/09/30 21:51:11 INFO physical.NullScanTaskDispatcher: Found 0 null table scans
14/09/30 21:51:11 DEBUG physical.NullScanTaskDispatcher: Looking at: Map 1
14/09/30 21:51:11 INFO physical.NullScanTaskDispatcher: Looking for table scans where optimization is applicable
14/09/30 21:51:11 INFO physical.NullScanTaskDispatcher: Found 0 null table scans
14/09/30 21:51:11 DEBUG parse.TezCompiler: Skipping vectorization
14/09/30 21:51:11 DEBUG parse.TezCompiler: Skipping stage id rearranger
14/09/30 21:51:11 INFO parse.SemanticAnalyzer: Completed plan generation
14/09/30 21:51:11 INFO ql.Driver: Semantic Analysis Completed
14/09/30 21:51:11 DEBUG parse.SemanticAnalyzer: validation start
14/09/30 21:51:11 DEBUG parse.SemanticAnalyzer: Not a partition.
14/09/30 21:51:11 INFO log.PerfLogger:
14/09/30 21:51:11 INFO ql.Driver: Returning Hive schema: Schema(fieldSchemas:[FieldSchema(name:_col0, type:int, comment:null)], properties:null)
14/09/30 21:51:11 INFO log.PerfLogger:
14/09/30 21:51:11 INFO log.PerfLogger:
14/09/30 21:51:11 INFO log.PerfLogger:
14/09/30 21:51:11 INFO log.PerfLogger:
14/09/30 21:51:11 DEBUG lockmgr.DbTxnManager: Opened txn 556
14/09/30 21:51:11 DEBUG lockmgr.DbTxnManager: Setting lock request transaction to 556
14/09/30 21:51:11 DEBUG lockmgr.DbTxnManager: Adding lock component to lock request LockComponent(type:SHARED_READ, level:TABLE, dbname:casino, tablename:values__tmp__table__5)
14/09/30 21:51:11 DEBUG lockmgr.DbTxnManager: output is null false
14/09/30 21:51:11 DEBUG lockmgr.DbTxnManager: Adding lock component to lock request LockComponent(type:SHARED_READ, level:TABLE, dbname:casino, tablename:foo7)
14/09/30 21:51:11 DEBUG lockmgr.DbLockManager: Requesting lock
14/09/30 21:51:11 DEBUG ql.Driver: Encoding valid txns info 556:396:399:402:410:411:412:414:418:445:462:468:469:470:471:472:483:503:543:555
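
DbTxnManager opens transaction 556 and requests SHARED_READ table locks on both the temporary values table and casino.foo7; the encoded string above appears to be the valid-transactions list (high-water mark followed by open/aborted transaction ids that readers of this query must exclude). From another session the same state can be inspected with the standard statements below, included here only as a hedged aside:

    -- Inspect ACID locks and open transactions from another beeline session:
    SHOW LOCKS;
    SHOW TRANSACTIONS;
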
14/09/30 21:51:11 INFO log.PerfLogger:
14/09/30 21:51:11 INFO log.PerfLogger:
14/09/30 21:51:11 INFO ql.Driver: Starting command: insert into table foo7 VALUES(1)
14/09/30 21:51:11 INFO ql.Driver: Query ID = hduser_20140930215151_ad9b1a41-cea9-435c-bb19-c980f0012401
14/09/30 21:51:11 INFO ql.Driver: Total jobs = 1
14/09/30 21:51:11 INFO log.PerfLogger:
14/09/30 21:51:11 INFO log.PerfLogger:
14/09/30 21:51:11 INFO log.PerfLogger:
14/09/30 21:51:11 INFO ql.Driver: Launching Job 1 out of 1
14/09/30 21:51:11 INFO ql.Driver: Starting task [Stage-1:MAPRED] in serial mode
14/09/30 21:51:11 INFO tez.TezSessionPoolManager: The current user: hduser, session user: hduser
14/09/30 21:51:11 DEBUG hdfs.DFSClient: /tmp/hive/hduser/0d8ccf6b-84e1-4008-a8d6-a349c7d5ff3b/hive_2014-09-30_21-51-11_143_7150781650032843032-6: masked=rwx------
14/09/30 21:51:11 DEBUG ipc.ProtobufRpcEngine: Call: mkdirs took 2ms
14/09/30 21:51:11 DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 1ms
14/09/30 21:51:11 INFO ql.Context: New scratch dir is hdfs://nc-h04/tmp/hive/hduser/0d8ccf6b-84e1-4008-a8d6-a349c7d5ff3b/hive_2014-09-30_21-51-11_143_7150781650032843032-6
14/09/30 21:51:11 DEBUG tez.DagUtils: TezDir path set hdfs://nc-h04/tmp/hive/hduser/0d8ccf6b-84e1-4008-a8d6-a349c7d5ff3b/hive_2014-09-30_21-51-11_143_7150781650032843032-6/hduser/_tez_scratch_dir for user: hduser
14/09/30 21:51:11 DEBUG hdfs.DFSClient: /tmp/hive/hduser/0d8ccf6b-84e1-4008-a8d6-a349c7d5ff3b/hive_2014-09-30_21-51-11_143_7150781650032843032-6/hduser/_tez_scratch_dir: masked=rwxr-xr-x
14/09/30 21:51:11 DEBUG ipc.ProtobufRpcEngine: Call: mkdirs took 2ms
14/09/30 21:51:11 INFO exec.Task: Session is already open
14/09/30 21:51:11 DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 1ms
14/09/30 21:51:11 DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 1ms
14/09/30 21:51:11 DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 1ms
14/09/30 21:51:11 INFO tez.DagUtils: Resource modification time: 1412106082871
14/09/30 21:51:11 DEBUG exec.Task: Adding local resource: scheme: "hdfs" host: "nc-h04" port: -1 file: "/tmp/hive/hduser/_tez_session_dir/a1373579-8c20-4721-b8d0-0377200882b5/postgresql-8.4-703.jdbc4.jar"
14/09/30 21:51:11 INFO log.PerfLogger:
14/09/30 21:51:11 INFO log.PerfLogger:
14/09/30 21:51:11 INFO ql.Context: New scratch dir is hdfs://nc-h04/tmp/hive/hduser/0d8ccf6b-84e1-4008-a8d6-a349c7d5ff3b/hive_2014-09-30_21-51-11_143_7150781650032843032-6
14/09/30 21:51:11 DEBUG hdfs.DFSClient: /tmp/hive/hduser/0d8ccf6b-84e1-4008-a8d6-a349c7d5ff3b/hive_2014-09-30_21-51-11_143_7150781650032843032-6/hduser/_tez_scratch_dir/7f26199a-921d-44d8-a6f3-6e622ace0b11: masked=rwxr-xr-x
14/09/30 21:51:11 DEBUG ipc.ProtobufRpcEngine: Call: mkdirs took 2ms
14/09/30 21:51:11 DEBUG hdfs.DFSClient: /tmp/hive/hduser/0d8ccf6b-84e1-4008-a8d6-a349c7d5ff3b/hive_2014-09-30_21-51-11_143_7150781650032843032-1/_tmp.-ext-10000: masked=rwxr-xr-x
14/09/30 21:51:11 DEBUG ipc.ProtobufRpcEngine: Call: mkdirs took 1ms
14/09/30 21:51:11 INFO tez.DagUtils: Vertex has custom input? false
14/09/30 21:51:11 INFO log.PerfLogger:
14/09/30 21:51:11 INFO exec.Utilities: Serializing MapWork via kryo
14/09/30 21:51:11 INFO log.PerfLogger:
14/09/30 21:51:11 INFO exec.Utilities: Setting plan: /tmp/hive/hduser/0d8ccf6b-84e1-4008-a8d6-a349c7d5ff3b/hive_2014-09-30_21-51-11_143_7150781650032843032-6/hduser/_tez_scratch_dir/7f26199a-921d-44d8-a6f3-6e622ace0b11/map.xml
14/09/30 21:51:11 DEBUG hadoop.MRHelpers: Setting mapreduce.map.output.key.class for stage: unknown based on job level configuration. Value: org.apache.hadoop.hive.ql.io.HiveKey
14/09/30 21:51:11 DEBUG hadoop.MRHelpers: Setting mapreduce.map.output.value.class for stage: unknown based on job level configuration. Value: org.apache.hadoop.io.BytesWritable
14/09/30 21:51:11 DEBUG hadoop.MRHelpers: Config: mr(unset):mapreduce.reduce.shuffle.merge.percent, mr initial value=0.66, tez:tez.runtime.shuffle.merge.percent=0.66
14/09/30 21:51:11 DEBUG hadoop.MRHelpers: Config: mr(unset):mapreduce.reduce.shuffle.input.buffer.percent, mr initial value=0.70, tez:tez.runtime.shuffle.fetch.buffer.percent=0.70
14/09/30 21:51:11 DEBUG hadoop.MRHelpers: Config: mr(unset):mapreduce.task.io.sort.mb, mr initial value=100, tez:tez.runtime.io.sort.mb=100
14/09/30 21:51:11 DEBUG hadoop.MRHelpers: Config: mr(unset):mapreduce.reduce.shuffle.memory.limit.percent, mr initial value=0.25, tez:tez.runtime.shuffle.memory.limit.percent=0.25
14/09/30 21:51:11 DEBUG hadoop.MRHelpers: Config: mr(unset):mapreduce.task.io.sort.factor, mr initial value=10, tez:tez.runtime.io.sort.factor=10
14/09/30 21:51:11 DEBUG hadoop.MRHelpers: Config: mr(unset):mapreduce.reduce.shuffle.connect.timeout, mr initial value=180000, tez:tez.runtime.shuffle.connect.timeout=180000
14/09/30 21:51:11 DEBUG hadoop.MRHelpers: Config: mr(unset):map.sort.class, mr initial value=org.apache.hadoop.util.QuickSort, tez:tez.runtime.internal.sorter.class=org.apache.hadoop.util.QuickSort
14/09/30 21:51:11 DEBUG hadoop.MRHelpers: Config: mr(unset):mapreduce.task.merge.progress.records, mr initial value=10000, tez:tez.runtime.merge.progress.records=10000
14/09/30 21:51:11 DEBUG hadoop.MRHelpers: Config: mr(unset):mapreduce.map.output.compress, mr initial value=false, tez:tez.runtime.compress=false
14/09/30 21:51:11 DEBUG hadoop.MRHelpers: Config: mr(unset):mapreduce.map.output.key.class, mr initial value=org.apache.hadoop.hive.ql.io.HiveKey, tez:tez.runtime.key.class=org.apache.hadoop.hive.ql.io.HiveKey
14/09/30 21:51:11 DEBUG hadoop.MRHelpers: Config: mr(unset):mapreduce.map.sort.spill.percent, mr initial value=0.80, tez:tez.runtime.sort.spill.percent=0.80
14/09/30 21:51:11 DEBUG hadoop.MRHelpers: Config: mr(unset):mapreduce.shuffle.ssl.enabled, mr initial value=false, tez:tez.runtime.shuffle.ssl.enable=false
14/09/30 21:51:11 DEBUG hadoop.MRHelpers: Config: mr(unset):mapreduce.ifile.readahead, mr initial value=true, tez:tez.runtime.ifile.readahead=true
14/09/30 21:51:11 DEBUG hadoop.MRHelpers: Config: mr(unset):mapreduce.reduce.shuffle.parallelcopies, mr initial value=5, tez:tez.runtime.shuffle.parallel.copies=5
14/09/30 21:51:11 DEBUG hadoop.MRHelpers: Config: mr(unset):mapreduce.ifile.readahead.bytes, mr initial value=4194304, tez:tez.runtime.ifile.readahead.bytes=4194304
14/09/30 21:51:11 DEBUG hadoop.MRHelpers: Config: mr(unset):mapreduce.reduce.input.buffer.percent, mr initial value=0.0, tez:tez.runtime.task.input.post-merge.buffer.percent=0.0
14/09/30 21:51:11 DEBUG hadoop.MRHelpers: Config: mr(unset):mapreduce.reduce.shuffle.read.timeout, mr initial value=180000, tez:tez.runtime.shuffle.read.timeout=180000
14/09/30 21:51:11 DEBUG hadoop.MRHelpers: Config: mr(unset):mapreduce.map.output.value.class, mr initial value=org.apache.hadoop.io.BytesWritable, tez:tez.runtime.value.class=org.apache.hadoop.io.BytesWritable
14/09/30 21:51:11 DEBUG hadoop.MRHelpers: Config: mr(unset):mapreduce.map.output.compress.codec, mr initial value=org.apache.hadoop.io.compress.DefaultCodec, tez:tez.runtime.compress.codec=org.apache.hadoop.io.compress.DefaultCodec
14/09/30 21:51:11 DEBUG exec.Utilities: Use session specified class loader
14/09/30 21:51:11 DEBUG tez.DagUtils: Marking URI as needing credentials: hdfs://nc-h04/tmp/hive/hduser/0d8ccf6b-84e1-4008-a8d6-a349c7d5ff3b/_tmp_space.db/Values__Tmp__Table__5
14/09/30 21:51:11 INFO log.PerfLogger:
14/09/30 21:51:11 INFO log.PerfLogger:
14/09/30 21:51:11 INFO log.PerfLogger:
14/09/30 21:51:11 INFO client.TezClient: Submitting dag to TezSession, sessionName=HIVE-a1373579-8c20-4721-b8d0-0377200882b5, applicationId=application_1412065018660_0039, dagName=hduser_20140930215151_ad9b1a41-cea9-435c-bb19-c980f0012401:5
14/09/30 21:51:11 DEBUG client.TezClientUtils: #sessionTokens=1, Services: application_1412065018660_0039,
14/09/30 21:51:11 DEBUG api.DAG: #dagTokens=1, Services: application_1412065018660_0039,
14/09/30 21:51:11 DEBUG ipc.Client: The ping interval is 60000 ms.
14/09/30 21:51:11 DEBUG ipc.Client: Connecting to nc-h04/192.168.20.6:8032
14/09/30 21:51:11 DEBUG ipc.ProtobufRpcEngine: Call: getApplicationReport took 4ms
14/09/30 21:51:11 DEBUG client.TezClientUtils: Connecting to Tez AM at nc-h04/192.168.20.6:53731
14/09/30 21:51:11 DEBUG security.UserGroupInformation: PrivilegedAction as:hduser (auth:SIMPLE) from:org.apache.tez.client.TezClientUtils.getAMProxy(TezClientUtils.java:829)
14/09/30 21:51:11 DEBUG ipc.Client: getting client out of cache: org.apache.hadoop.ipc.Client@35b102dd
14/09/30 21:51:11 DEBUG ipc.Client: The ping interval is 60000 ms.
14/09/30 21:51:11 DEBUG ipc.Client: Connecting to nc-h04/192.168.20.6:53731
14/09/30 21:51:11 DEBUG ipc.ProtobufRpcEngine: Call: submitDAG took 62ms
14/09/30 21:51:11 INFO client.TezClient: Submitted dag to TezSession, sessionName=HIVE-a1373579-8c20-4721-b8d0-0377200882b5, applicationId=application_1412065018660_0039, dagName=hduser_20140930215151_ad9b1a41-cea9-435c-bb19-c980f0012401:5
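
The DAG is handed to an already-running Tez session ("Session is already open" earlier in the log). The choice of Tez comes from configuration rather than from the query text; presumably something like the following was set for the session, although the setting itself is not captured here:

    -- Assumed, not shown in this log:
    SET hive.execution.engine = tez;
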
14/09/30 21:51:11 DEBUG service.AbstractService: Service: org.apache.hadoop.yarn.client.api.impl.YarnClientImpl entered state INITED
14/09/30 21:51:11 INFO client.RMProxy: Connecting to ResourceManager at nc-h04/192.168.20.6:8032
14/09/30 21:51:11 DEBUG security.UserGroupInformation: PrivilegedAction as:hduser (auth:SIMPLE) from:org.apache.hadoop.yarn.client.RMProxy.getProxy(RMProxy.java:130)
14/09/30 21:51:11 DEBUG ipc.YarnRPC: Creating YarnRPC for org.apache.hadoop.yarn.ipc.HadoopYarnProtoRPC
14/09/30 21:51:11 DEBUG ipc.HadoopYarnProtoRPC: Creating a HadoopYarnProtoRpc proxy for protocol interface org.apache.hadoop.yarn.api.ApplicationClientProtocol
14/09/30 21:51:11 DEBUG ipc.Client: getting client out of cache: org.apache.hadoop.ipc.Client@35b102dd
14/09/30 21:51:11 DEBUG service.AbstractService: Service org.apache.hadoop.yarn.client.api.impl.YarnClientImpl is started
14/09/30 21:51:11 INFO log.PerfLogger:
14/09/30 21:51:11 INFO tez.TezJobMonitor:
14/09/30 21:51:11 INFO log.PerfLogger:
14/09/30 21:51:11 INFO log.PerfLogger:
14/09/30 21:51:11 DEBUG ipc.ProtobufRpcEngine: Call: getApplicationReport took 2ms
14/09/30 21:51:11 DEBUG rpc.DAGClientRPCImpl: App: application_1412065018660_0039 in state: RUNNING
14/09/30 21:51:11 DEBUG client.TezClientUtils: Connecting to Tez AM at nc-h04/192.168.20.6:53731
14/09/30 21:51:11 DEBUG security.UserGroupInformation: PrivilegedAction as:hduser (auth:SIMPLE) from:org.apache.tez.client.TezClientUtils.getAMProxy(TezClientUtils.java:829)
14/09/30 21:51:11 DEBUG ipc.Client: getting client out of cache: org.apache.hadoop.ipc.Client@35b102dd
14/09/30 21:51:11 DEBUG rpc.DAGClientRPCImpl: GetDAGStatus via AM for app: application_1412065018660_0039 dag:dag_1412065018660_0039_5
14/09/30 21:51:11 DEBUG ipc.Client: The ping interval is 60000 ms.
14/09/30 21:51:11 DEBUG ipc.Client: Connecting to nc-h04/192.168.20.6:53731
14/09/30 21:51:11 DEBUG ipc.ProtobufRpcEngine: Call: getDAGStatus took 5ms
14/09/30 21:51:11 DEBUG exec.Heartbeater: heartbeating
14/09/30 21:51:11 DEBUG lockmgr.DbTxnManager: Heartbeating lock and transaction 556
14/09/30 21:51:11 INFO log.PerfLogger:
14/09/30 21:51:11 INFO tez.TezJobMonitor: Status: Running (Executing on YARN cluster with App id application_1412065018660_0039)
14/09/30 21:51:11 INFO log.PerfLogger:
14/09/30 21:51:11 INFO tez.TezJobMonitor: Map 1: -/-
14/09/30 21:51:11 DEBUG rpc.DAGClientRPCImpl: GetDAGStatus via AM for app: application_1412065018660_0039 dag:dag_1412065018660_0039_5
14/09/30 21:51:11 DEBUG ipc.ProtobufRpcEngine: Call: getDAGStatus took 3ms
14/09/30 21:51:11 INFO tez.TezJobMonitor: Map 1: 0/1
14/09/30 21:51:11 DEBUG rpc.DAGClientRPCImpl: GetDAGStatus via AM for app: application_1412065018660_0039 dag:dag_1412065018660_0039_5
14/09/30 21:51:11 DEBUG ipc.ProtobufRpcEngine: Call: getDAGStatus took 3ms
14/09/30 21:51:12 DEBUG rpc.DAGClientRPCImpl: GetDAGStatus via AM for app: application_1412065018660_0039 dag:dag_1412065018660_0039_5
14/09/30 21:51:12 DEBUG ipc.ProtobufRpcEngine: Call: getDAGStatus took 2ms
14/09/30 21:51:12 DEBUG rpc.DAGClientRPCImpl: GetDAGStatus via AM for app: application_1412065018660_0039 dag:dag_1412065018660_0039_5
14/09/30 21:51:12 DEBUG ipc.ProtobufRpcEngine: Call: getDAGStatus took 3ms
14/09/30 21:51:12 DEBUG rpc.DAGClientRPCImpl: GetDAGStatus via AM for app: application_1412065018660_0039 dag:dag_1412065018660_0039_5
14/09/30 21:51:12 DEBUG ipc.ProtobufRpcEngine: Call: getDAGStatus took 2ms
14/09/30 21:51:12 DEBUG rpc.DAGClientRPCImpl: GetDAGStatus via AM for app: application_1412065018660_0039 dag:dag_1412065018660_0039_5
14/09/30 21:51:12 DEBUG ipc.ProtobufRpcEngine: Call: getDAGStatus took 3ms
14/09/30 21:51:12 DEBUG rpc.DAGClientRPCImpl: GetDAGStatus via AM for app: application_1412065018660_0039 dag:dag_1412065018660_0039_5
14/09/30 21:51:12 DEBUG ipc.ProtobufRpcEngine: Call: getDAGStatus took 2ms
14/09/30 21:51:13 DEBUG rpc.DAGClientRPCImpl: GetDAGStatus via AM for app: application_1412065018660_0039 dag:dag_1412065018660_0039_5
14/09/30 21:51:13 DEBUG ipc.ProtobufRpcEngine: Call: getDAGStatus took 2ms
14/09/30 21:51:13 DEBUG rpc.DAGClientRPCImpl: GetDAGStatus via AM for app: application_1412065018660_0039 dag:dag_1412065018660_0039_5
14/09/30 21:51:13 DEBUG ipc.ProtobufRpcEngine: Call: getDAGStatus took 3ms
14/09/30 21:51:13 DEBUG rpc.DAGClientRPCImpl: GetDAGStatus via AM for app: application_1412065018660_0039 dag:dag_1412065018660_0039_5
14/09/30 21:51:13 DEBUG ipc.ProtobufRpcEngine: Call: getDAGStatus took 2ms
14/09/30 21:51:13 DEBUG rpc.DAGClientRPCImpl: GetDAGStatus via AM for app: application_1412065018660_0039 dag:dag_1412065018660_0039_5
14/09/30 21:51:13 DEBUG ipc.ProtobufRpcEngine: Call: getDAGStatus took 3ms
14/09/30 21:51:13 DEBUG rpc.DAGClientRPCImpl: GetDAGStatus via AM for app: application_1412065018660_0039 dag:dag_1412065018660_0039_5
14/09/30 21:51:13 DEBUG ipc.ProtobufRpcEngine: Call: getDAGStatus took 2ms
14/09/30 21:51:14 DEBUG rpc.DAGClientRPCImpl: GetDAGStatus via AM for app: application_1412065018660_0039 dag:dag_1412065018660_0039_5
14/09/30 21:51:14 DEBUG ipc.ProtobufRpcEngine: Call: getDAGStatus took 2ms
14/09/30 21:51:14 DEBUG rpc.DAGClientRPCImpl: GetDAGStatus via AM for app: application_1412065018660_0039 dag:dag_1412065018660_0039_5
14/09/30 21:51:14 DEBUG ipc.ProtobufRpcEngine: Call: getDAGStatus took 3ms
14/09/30 21:51:14 DEBUG rpc.DAGClientRPCImpl: GetDAGStatus via AM for app: application_1412065018660_0039 dag:dag_1412065018660_0039_5
14/09/30 21:51:14 DEBUG ipc.ProtobufRpcEngine: Call: getDAGStatus took 2ms
14/09/30 21:51:14 DEBUG rpc.DAGClientRPCImpl: GetDAGStatus via AM for app: application_1412065018660_0039 dag:dag_1412065018660_0039_5
14/09/30 21:51:14 DEBUG ipc.ProtobufRpcEngine: Call: getDAGStatus took 3ms
14/09/30 21:51:14 INFO tez.TezJobMonitor: Map 1: 0/1
14/09/30 21:51:15 DEBUG rpc.DAGClientRPCImpl: GetDAGStatus via AM for app: application_1412065018660_0039 dag:dag_1412065018660_0039_5
14/09/30 21:51:15 DEBUG ipc.ProtobufRpcEngine: Call: getDAGStatus took 2ms
14/09/30 21:51:15 DEBUG rpc.DAGClientRPCImpl: GetDAGStatus via AM for app: application_1412065018660_0039 dag:dag_1412065018660_0039_5
14/09/30 21:51:15 DEBUG ipc.ProtobufRpcEngine: Call: getDAGStatus took 2ms
14/09/30 21:51:15 DEBUG rpc.DAGClientRPCImpl: GetDAGStatus via AM for app: application_1412065018660_0039 dag:dag_1412065018660_0039_5
14/09/30 21:51:15 DEBUG ipc.ProtobufRpcEngine: Call: getDAGStatus took 3ms
14/09/30 21:51:15 INFO tez.TezJobMonitor: Map 1: 0(+1)/1
14/09/30 21:51:15 DEBUG rpc.DAGClientRPCImpl: GetDAGStatus via AM for app: application_1412065018660_0039 dag:dag_1412065018660_0039_5
14/09/30 21:51:15 DEBUG ipc.ProtobufRpcEngine: Call: getDAGStatus took 2ms
14/09/30 21:51:15 DEBUG rpc.DAGClientRPCImpl: GetDAGStatus via AM for app: application_1412065018660_0039 dag:dag_1412065018660_0039_5
14/09/30 21:51:15 DEBUG ipc.ProtobufRpcEngine: Call: getDAGStatus took 3ms
14/09/30 21:51:16 DEBUG rpc.DAGClientRPCImpl: GetDAGStatus via AM for app: application_1412065018660_0039 dag:dag_1412065018660_0039_5
14/09/30 21:51:16 DEBUG ipc.ProtobufRpcEngine: Call: getDAGStatus took 2ms
14/09/30 21:51:16 DEBUG rpc.DAGClientRPCImpl: GetDAGStatus via AM for app: application_1412065018660_0039 dag:dag_1412065018660_0039_5
14/09/30 21:51:16 DEBUG ipc.ProtobufRpcEngine: Call: getDAGStatus took 3ms
14/09/30 21:51:16 DEBUG rpc.DAGClientRPCImpl: GetDAGStatus via AM for app: application_1412065018660_0039 dag:dag_1412065018660_0039_5
14/09/30 21:51:16 DEBUG ipc.ProtobufRpcEngine: Call: getDAGStatus took 2ms
14/09/30 21:51:16 DEBUG rpc.DAGClientRPCImpl: GetDAGStatus via AM for app: application_1412065018660_0039 dag:dag_1412065018660_0039_5
14/09/30 21:51:16 DEBUG ipc.ProtobufRpcEngine: Call: getDAGStatus took 3ms
14/09/30 21:51:16 DEBUG rpc.DAGClientRPCImpl: GetDAGStatus via AM for app: application_1412065018660_0039 dag:dag_1412065018660_0039_5
14/09/30 21:51:16 DEBUG ipc.ProtobufRpcEngine: Call: getDAGStatus took 2ms
14/09/30 21:51:17 DEBUG rpc.DAGClientRPCImpl: GetDAGStatus via AM for app: application_1412065018660_0039 dag:dag_1412065018660_0039_5
14/09/30 21:51:17 DEBUG ipc.ProtobufRpcEngine: Call: getDAGStatus took 3ms
14/09/30 21:51:17 DEBUG rpc.DAGClientRPCImpl: GetDAGStatus via AM for app: application_1412065018660_0039 dag:dag_1412065018660_0039_5
14/09/30 21:51:17 DEBUG ipc.ProtobufRpcEngine: Call: getDAGStatus took 2ms
14/09/30 21:51:17 DEBUG rpc.DAGClientRPCImpl: GetDAGStatus via AM for app: application_1412065018660_0039 dag:dag_1412065018660_0039_5
14/09/30 21:51:17 DEBUG ipc.ProtobufRpcEngine: Call: getDAGStatus took 3ms
14/09/30 21:51:17 DEBUG rpc.DAGClientRPCImpl: GetDAGStatus via AM for app: application_1412065018660_0039 dag:dag_1412065018660_0039_5
14/09/30 21:51:17 DEBUG ipc.ProtobufRpcEngine: Call: getDAGStatus took 3ms
14/09/30 21:51:17 DEBUG rpc.DAGClientRPCImpl: GetDAGStatus via AM for app: application_1412065018660_0039 dag:dag_1412065018660_0039_5
14/09/30 21:51:17 DEBUG ipc.ProtobufRpcEngine: Call: getDAGStatus took 2ms
14/09/30 21:51:18 DEBUG rpc.DAGClientRPCImpl: GetDAGStatus via AM for app: application_1412065018660_0039 dag:dag_1412065018660_0039_5
14/09/30 21:51:18 DEBUG ipc.ProtobufRpcEngine: Call: getDAGStatus took 2ms
14/09/30 21:51:18 INFO log.PerfLogger:
14/09/30 21:51:18 INFO tez.TezJobMonitor: Map 1: 1/1
14/09/30 21:51:18 INFO tez.TezJobMonitor: Status: Finished successfully in 6.54 seconds
14/09/30 21:51:18 INFO log.PerfLogger:
14/09/30 21:51:18 DEBUG rpc.DAGClientRPCImpl: GetDAGStatus via AM for app: application_1412065018660_0039 dag:dag_1412065018660_0039_5
14/09/30 21:51:18 DEBUG ipc.ProtobufRpcEngine: Call: getDAGStatus took 6ms
14/09/30 21:51:18 INFO exec.Task: org.apache.tez.common.counters.DAGCounter:
14/09/30 21:51:18 INFO exec.Task: TOTAL_LAUNCHED_TASKS: 1
14/09/30 21:51:18 INFO exec.Task: RACK_LOCAL_TASKS: 1
14/09/30 21:51:18 INFO exec.Task: File System Counters:
14/09/30 21:51:18 INFO exec.Task: HDFS: BYTES_READ: 3
14/09/30 21:51:18 INFO exec.Task: HDFS: BYTES_WRITTEN: 611
14/09/30 21:51:18 INFO exec.Task: HDFS: READ_OPS: 4
14/09/30 21:51:18 INFO exec.Task: HDFS: LARGE_READ_OPS: 0
14/09/30 21:51:18 INFO exec.Task: HDFS: WRITE_OPS: 3
14/09/30 21:51:18 INFO exec.Task: org.apache.tez.common.counters.TaskCounter:
14/09/30 21:51:18 INFO exec.Task: GC_TIME_MILLIS: 98
14/09/30 21:51:18 INFO exec.Task: CPU_MILLISECONDS: 4530
14/09/30 21:51:18 INFO exec.Task: PHYSICAL_MEMORY_BYTES: 186900480
14/09/30 21:51:18 INFO exec.Task: VIRTUAL_MEMORY_BYTES: 899563520
14/09/30 21:51:18 INFO exec.Task: COMMITTED_HEAP_BYTES: 106954752
14/09/30 21:51:18 INFO exec.Task: INPUT_RECORDS_PROCESSED: 1
14/09/30 21:51:18 INFO exec.Task: OUTPUT_RECORDS: 0
14/09/30 21:51:18 INFO exec.Task: HIVE:
14/09/30 21:51:18 INFO exec.Task: CREATED_FILES: 1
14/09/30 21:51:18 INFO exec.Task: org.apache.hadoop.hive.ql.exec.MapOperator$Counter:
14/09/30 21:51:18 INFO exec.Task: DESERIALIZE_ERRORS: 0
14/09/30 21:51:18 DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 2ms
14/09/30 21:51:18 DEBUG ipc.ProtobufRpcEngine: Call: getListing took 1ms
14/09/30 21:51:18 DEBUG exec.Utilities: TaskId for 000000_0 = 000000
14/09/30 21:51:18 INFO exec.FileSinkOperator: Moving tmp dir: hdfs://nc-h04/tmp/hive/hduser/0d8ccf6b-84e1-4008-a8d6-a349c7d5ff3b/hive_2014-09-30_21-51-11_143_7150781650032843032-1/_tmp.-ext-10000 to: hdfs://nc-h04/tmp/hive/hduser/0d8ccf6b-84e1-4008-a8d6-a349c7d5ff3b/hive_2014-09-30_21-51-11_143_7150781650032843032-1/-ext-10000
14/09/30 21:51:18 DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 1ms
14/09/30 21:51:18 DEBUG ipc.ProtobufRpcEngine: Call: rename took 2ms
14/09/30 21:51:18 DEBUG ipc.ProtobufRpcEngine: Call: delete took 2ms
14/09/30 21:51:18 INFO log.PerfLogger:
14/09/30 21:51:18 INFO ql.Driver: Starting task [Stage-2:DEPENDENCY_COLLECTION] in serial mode
14/09/30 21:51:18 INFO log.PerfLogger:
14/09/30 21:51:18 INFO ql.Driver: Starting task [Stage-0:MOVE] in serial mode
14/09/30 21:51:18 INFO exec.Task: Loading data to table casino.foo7 from hdfs://nc-h04/tmp/hive/hduser/0d8ccf6b-84e1-4008-a8d6-a349c7d5ff3b/hive_2014-09-30_21-51-11_143_7150781650032843032-1/-ext-10000
14/09/30 21:51:18 DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 2ms
14/09/30 21:51:18 DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 17ms
14/09/30 21:51:18 DEBUG metadata.Hive: Acid move Looking for original buckets in hdfs://nc-h04/tmp/hive/hduser/0d8ccf6b-84e1-4008-a8d6-a349c7d5ff3b/hive_2014-09-30_21-51-11_143_7150781650032843032-1/-ext-10000
14/09/30 21:51:18 DEBUG ipc.ProtobufRpcEngine: Call: getListing took 1ms
14/09/30 21:51:18 DEBUG metadata.Hive: Acid move found 1 original buckets
14/09/30 21:51:18 DEBUG metadata.Hive: Acid move looking for delta files in bucket hdfs://nc-h04/tmp/hive/hduser/0d8ccf6b-84e1-4008-a8d6-a349c7d5ff3b/hive_2014-09-30_21-51-11_143_7150781650032843032-1/-ext-10000/000000_0
14/09/30 21:51:18 DEBUG ipc.ProtobufRpcEngine: Call: getListing took 1ms
14/09/30 21:51:18 DEBUG metadata.Hive: Acid move found 1 delta files
14/09/30 21:51:18 DEBUG hdfs.DFSClient: /user/hive/warehouse/casino.db/foo7/delta_0000556_0000556: masked=rwxr-xr-x
14/09/30 21:51:18 DEBUG ipc.ProtobufRpcEngine: Call: mkdirs took 2ms
14/09/30 21:51:18 DEBUG ipc.ProtobufRpcEngine: Call: getListing took 1ms
14/09/30 21:51:18 DEBUG metadata.Hive: Acid move found 1 bucket files
14/09/30 21:51:18 INFO metadata.Hive: Moving bucket hdfs://nc-h04/tmp/hive/hduser/0d8ccf6b-84e1-4008-a8d6-a349c7d5ff3b/hive_2014-09-30_21-51-11_143_7150781650032843032-1/-ext-10000/000000_0/delta_0000556_0000556/bucket_00000 to hdfs://nc-h04/user/hive/warehouse/casino.db/foo7/delta_0000556_0000556/bucket_00000
14/09/30 21:51:18 DEBUG ipc.ProtobufRpcEngine: Call: rename took 2ms
14/09/30 21:51:18 DEBUG hive.log: DDL: struct foo7 { i32 id}
14/09/30 21:51:18 DEBUG exec.Utilities: Use session specified class loader
14/09/30 21:51:18 DEBUG exec.Utilities: Use session specified class loader
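
The MOVE task renames the single bucket file produced by the Tez job into delta_0000556_0000556 under the table directory, so every committed insert leaves one more delta behind. Those deltas are normally merged by the background compactor; if needed, compaction can also be requested by hand (standard HiveQL, included only as a follow-up sketch):

    -- Optional manual compaction of the accumulated deltas:
    ALTER TABLE foo7 COMPACT 'minor';   -- merge deltas into a larger delta
    ALTER TABLE foo7 COMPACT 'major';   -- rewrite base plus deltas
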
14/09/30 21:51:18 INFO log.PerfLogger:
14/09/30 21:51:18 INFO ql.Driver: Starting task [Stage-3:STATS] in serial mode
14/09/30 21:51:18 INFO exec.StatsTask: Executing stats task
14/09/30 21:51:18 DEBUG exec.Utilities: Use session specified class loader
14/09/30 21:51:18 DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 1ms
14/09/30 21:51:18 DEBUG ipc.ProtobufRpcEngine: Call: getListing took 1ms
14/09/30 21:51:18 DEBUG ipc.ProtobufRpcEngine: Call: getListing took 1ms
14/09/30 21:51:18 DEBUG hive.log: DDL: struct foo7 { i32 id}
14/09/30 21:51:18 DEBUG exec.Utilities: Use session specified class loader
14/09/30 21:51:18 DEBUG exec.Utilities: Use session specified class loader
14/09/30 21:51:18 INFO exec.Task: Table casino.foo7 stats: [numFiles=1, numRows=1, totalSize=607, rawDataSize=0]
14/09/30 21:51:18 INFO log.PerfLogger:
14/09/30 21:51:18 INFO log.PerfLogger:
14/09/30 21:51:18 INFO ql.Driver: OK
14/09/30 21:51:18 INFO log.PerfLogger:
14/09/30 21:51:18 DEBUG lockmgr.DbTxnManager: Committing txn 556
14/09/30 21:51:18 INFO log.PerfLogger:
14/09/30 21:51:18 INFO log.PerfLogger:
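
The driver reports OK, the StatsTask records numRows=1 for casino.foo7, and transaction 556 is committed. As a quick follow-up (not part of the captured session), the write can be confirmed with:

    -- Verify the inserted row and the table's ACID-related properties:
    SELECT * FROM foo7;
    DESCRIBE FORMATTED foo7;
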