2016-09-29 04:53:32,277 [281361b2-f089-3e97-458b-6ea737b61d7e:foreman] INFO o.a.drill.exec.work.foreman.Foreman - Query text for query id 281361b2-f089-3e97-458b-6ea737b61d7e: select emp.lvl2_prod_nm FROM hive.fct_cust_terr_spclty_sls as emp join hive.fct_ims_plntrk_sls as dept on emp.lvl2_prod_nm = dept.lvl2_prod_nm limit 1000
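(The query from the Foreman entry above, laid out for readability; schema, table, and column names are exactly as logged:

    select emp.lvl2_prod_nm
    FROM hive.fct_cust_terr_spclty_sls as emp
    join hive.fct_ims_plntrk_sls as dept
        on emp.lvl2_prod_nm = dept.lvl2_prod_nm
    limit 1000
)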
2016-09-29 04:53:32,441 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:2:0] INFO o.a.d.e.w.fragment.FragmentExecutor - 281361b2-f089-3e97-458b-6ea737b61d7e:2:0: State change requested AWAITING_ALLOCATION --> RUNNING
2016-09-29 04:53:32,441 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:2:0] INFO o.a.d.e.w.f.FragmentStatusReporter - 281361b2-f089-3e97-458b-6ea737b61d7e:2:0: State to report: RUNNING
2016-09-29 04:53:32,445 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:2:1] INFO o.a.d.e.w.fragment.FragmentExecutor - 281361b2-f089-3e97-458b-6ea737b61d7e:2:1: State change requested AWAITING_ALLOCATION --> RUNNING
2016-09-29 04:53:32,445 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:2:1] INFO o.a.d.e.w.f.FragmentStatusReporter - 281361b2-f089-3e97-458b-6ea737b61d7e:2:1: State to report: RUNNING
2016-09-29 04:53:32,446 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:2:5] INFO o.a.d.e.w.fragment.FragmentExecutor - 281361b2-f089-3e97-458b-6ea737b61d7e:2:5: State change requested AWAITING_ALLOCATION --> RUNNING
2016-09-29 04:53:32,446 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:2:5] INFO o.a.d.e.w.f.FragmentStatusReporter - 281361b2-f089-3e97-458b-6ea737b61d7e:2:5: State to report: RUNNING
2016-09-29 04:53:32,446 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:2:4] INFO o.a.d.e.w.fragment.FragmentExecutor - 281361b2-f089-3e97-458b-6ea737b61d7e:2:4: State change requested AWAITING_ALLOCATION --> RUNNING
2016-09-29 04:53:32,446 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:2:4] INFO o.a.d.e.w.f.FragmentStatusReporter - 281361b2-f089-3e97-458b-6ea737b61d7e:2:4: State to report: RUNNING
2016-09-29 04:53:32,448 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:2:3] INFO o.a.d.e.w.fragment.FragmentExecutor - 281361b2-f089-3e97-458b-6ea737b61d7e:2:3: State change requested AWAITING_ALLOCATION --> RUNNING
2016-09-29 04:53:32,448 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:2:3] INFO o.a.d.e.w.f.FragmentStatusReporter - 281361b2-f089-3e97-458b-6ea737b61d7e:2:3: State to report: RUNNING
2016-09-29 04:53:32,449 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:2:2] INFO o.a.d.e.w.fragment.FragmentExecutor - 281361b2-f089-3e97-458b-6ea737b61d7e:2:2: State change requested AWAITING_ALLOCATION --> RUNNING
2016-09-29 04:53:32,449 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:2:2] INFO o.a.d.e.w.f.FragmentStatusReporter - 281361b2-f089-3e97-458b-6ea737b61d7e:2:2: State to report: RUNNING
2016-09-29 04:53:32,496 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:1:3] INFO o.a.d.e.w.fragment.FragmentExecutor - 281361b2-f089-3e97-458b-6ea737b61d7e:1:3: State change requested AWAITING_ALLOCATION --> RUNNING
2016-09-29 04:53:32,496 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:1:3] INFO o.a.d.e.w.f.FragmentStatusReporter - 281361b2-f089-3e97-458b-6ea737b61d7e:1:3: State to report: RUNNING
2016-09-29 04:53:32,500 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:1:2] INFO o.a.d.e.w.fragment.FragmentExecutor - 281361b2-f089-3e97-458b-6ea737b61d7e:1:2: State change requested AWAITING_ALLOCATION --> RUNNING
2016-09-29 04:53:32,500 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:1:2] INFO o.a.d.e.w.f.FragmentStatusReporter - 281361b2-f089-3e97-458b-6ea737b61d7e:1:2: State to report: RUNNING
2016-09-29 04:53:32,500 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:1:1] INFO o.a.d.e.w.fragment.FragmentExecutor - 281361b2-f089-3e97-458b-6ea737b61d7e:1:1: State change requested AWAITING_ALLOCATION --> RUNNING
2016-09-29 04:53:32,500 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:1:1] INFO o.a.d.e.w.f.FragmentStatusReporter - 281361b2-f089-3e97-458b-6ea737b61d7e:1:1: State to report: RUNNING
2016-09-29 04:53:32,511 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:1:5] INFO o.a.d.e.w.fragment.FragmentExecutor - 281361b2-f089-3e97-458b-6ea737b61d7e:1:5: State change requested AWAITING_ALLOCATION --> RUNNING
2016-09-29 04:53:32,511 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:1:5] INFO o.a.d.e.w.f.FragmentStatusReporter - 281361b2-f089-3e97-458b-6ea737b61d7e:1:5: State to report: RUNNING
2016-09-29 04:53:32,515 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:1:4] INFO o.a.d.e.w.fragment.FragmentExecutor - 281361b2-f089-3e97-458b-6ea737b61d7e:1:4: State change requested AWAITING_ALLOCATION --> RUNNING
2016-09-29 04:53:32,515 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:1:4] INFO o.a.d.e.w.f.FragmentStatusReporter - 281361b2-f089-3e97-458b-6ea737b61d7e:1:4: State to report: RUNNING
2016-09-29 04:53:32,522 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:0:0] INFO o.a.d.e.w.fragment.FragmentExecutor - 281361b2-f089-3e97-458b-6ea737b61d7e:0:0: State change requested AWAITING_ALLOCATION --> RUNNING
2016-09-29 04:53:32,522 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:0:0] INFO o.a.d.e.w.f.FragmentStatusReporter - 281361b2-f089-3e97-458b-6ea737b61d7e:0:0: State to report: RUNNING
2016-09-29 04:53:32,525 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:1:0] INFO o.a.d.e.w.fragment.FragmentExecutor - 281361b2-f089-3e97-458b-6ea737b61d7e:1:0: State change requested AWAITING_ALLOCATION --> RUNNING
2016-09-29 04:53:32,525 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:1:0] INFO o.a.d.e.w.f.FragmentStatusReporter - 281361b2-f089-3e97-458b-6ea737b61d7e:1:0: State to report: RUNNING
2016-09-29 04:53:37,861 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:2:3] INFO o.a.d.e.w.fragment.FragmentExecutor - 281361b2-f089-3e97-458b-6ea737b61d7e:2:3: State change requested RUNNING --> FINISHED
2016-09-29 04:53:37,861 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:2:3] INFO o.a.d.e.w.f.FragmentStatusReporter - 281361b2-f089-3e97-458b-6ea737b61d7e:2:3: State to report: FINISHED
2016-09-29 04:53:37,880 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:2:5] INFO o.a.d.e.w.fragment.FragmentExecutor - 281361b2-f089-3e97-458b-6ea737b61d7e:2:5: State change requested RUNNING --> FINISHED
2016-09-29 04:53:37,880 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:2:5] INFO o.a.d.e.w.f.FragmentStatusReporter - 281361b2-f089-3e97-458b-6ea737b61d7e:2:5: State to report: FINISHED
2016-09-29 04:53:37,910 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:2:4] INFO o.a.d.e.w.fragment.FragmentExecutor - 281361b2-f089-3e97-458b-6ea737b61d7e:2:4: State change requested RUNNING --> FINISHED
2016-09-29 04:53:37,910 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:2:4] INFO o.a.d.e.w.f.FragmentStatusReporter - 281361b2-f089-3e97-458b-6ea737b61d7e:2:4: State to report: FINISHED
2016-09-29 04:53:38,010 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:2:2] INFO o.a.d.e.w.fragment.FragmentExecutor - 281361b2-f089-3e97-458b-6ea737b61d7e:2:2: State change requested RUNNING --> FINISHED
2016-09-29 04:53:38,010 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:2:2] INFO o.a.d.e.w.f.FragmentStatusReporter - 281361b2-f089-3e97-458b-6ea737b61d7e:2:2: State to report: FINISHED
2016-09-29 04:53:38,754 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:2:1] INFO o.a.d.e.w.fragment.FragmentExecutor - 281361b2-f089-3e97-458b-6ea737b61d7e:2:1: State change requested RUNNING --> FINISHED
2016-09-29 04:53:38,754 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:2:1] INFO o.a.d.e.w.f.FragmentStatusReporter - 281361b2-f089-3e97-458b-6ea737b61d7e:2:1: State to report: FINISHED
2016-09-29 04:53:39,530 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:1:3] INFO o.a.d.e.p.i.u.UnorderedReceiverBatch - Informing senders of request to terminate sending.
2016-09-29 04:53:39,531 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:2:0] INFO o.a.d.e.w.fragment.FragmentExecutor - 281361b2-f089-3e97-458b-6ea737b61d7e:2:0: State change requested RUNNING --> FINISHED
2016-09-29 04:53:39,531 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:1:1] INFO o.a.d.e.p.i.u.UnorderedReceiverBatch - Informing senders of request to terminate sending.
2016-09-29 04:53:39,531 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:2:0] INFO o.a.d.e.w.f.FragmentStatusReporter - 281361b2-f089-3e97-458b-6ea737b61d7e:2:0: State to report: FINISHED
2016-09-29 04:53:39,531 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:1:4] INFO o.a.d.e.p.i.u.UnorderedReceiverBatch - Informing senders of request to terminate sending.
2016-09-29 04:53:39,532 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:1:5] INFO o.a.d.e.p.i.u.UnorderedReceiverBatch - Informing senders of request to terminate sending.
2016-09-29 04:53:39,532 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:1:0] INFO o.a.d.e.p.i.u.UnorderedReceiverBatch - Informing senders of request to terminate sending.
2016-09-29 04:53:39,532 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:1:2] INFO o.a.d.e.p.i.u.UnorderedReceiverBatch - Informing senders of request to terminate sending.
2016-09-29 04:53:39,538 [CONTROL-rpc-event-queue] WARN o.a.d.e.w.b.ControlMessageHandler - Dropping request for early fragment termination for path 281361b2-f089-3e97-458b-6ea737b61d7e:2:0 -> 281361b2-f089-3e97-458b-6ea737b61d7e:1:3 as path to executor unavailable.
2016-09-29 04:53:39,538 [CONTROL-rpc-event-queue] WARN o.a.d.e.w.b.ControlMessageHandler - Dropping request for early fragment termination for path 281361b2-f089-3e97-458b-6ea737b61d7e:2:1 -> 281361b2-f089-3e97-458b-6ea737b61d7e:1:3 as path to executor unavailable.
2016-09-29 04:53:39,538 [CONTROL-rpc-event-queue] WARN o.a.d.e.w.b.ControlMessageHandler - Dropping request for early fragment termination for path 281361b2-f089-3e97-458b-6ea737b61d7e:2:2 -> 281361b2-f089-3e97-458b-6ea737b61d7e:1:3 as path to executor unavailable.
2016-09-29 04:53:39,538 [CONTROL-rpc-event-queue] WARN o.a.d.e.w.b.ControlMessageHandler - Dropping request for early fragment termination for path 281361b2-f089-3e97-458b-6ea737b61d7e:2:3 -> 281361b2-f089-3e97-458b-6ea737b61d7e:1:3 as path to executor unavailable.
2016-09-29 04:53:39,538 [CONTROL-rpc-event-queue] WARN o.a.d.e.w.b.ControlMessageHandler - Dropping request for early fragment termination for path 281361b2-f089-3e97-458b-6ea737b61d7e:2:4 -> 281361b2-f089-3e97-458b-6ea737b61d7e:1:3 as path to executor unavailable.
2016-09-29 04:53:39,538 [CONTROL-rpc-event-queue] WARN o.a.d.e.w.b.ControlMessageHandler - Dropping request for early fragment termination for path 281361b2-f089-3e97-458b-6ea737b61d7e:2:5 -> 281361b2-f089-3e97-458b-6ea737b61d7e:1:3 as path to executor unavailable.
2016-09-29 04:53:39,538 [CONTROL-rpc-event-queue] WARN o.a.d.e.w.b.ControlMessageHandler - Dropping request for early fragment termination for path 281361b2-f089-3e97-458b-6ea737b61d7e:2:0 -> 281361b2-f089-3e97-458b-6ea737b61d7e:1:1 as path to executor unavailable.
2016-09-29 04:53:39,538 [CONTROL-rpc-event-queue] WARN o.a.d.e.w.b.ControlMessageHandler - Dropping request for early fragment termination for path 281361b2-f089-3e97-458b-6ea737b61d7e:2:1 -> 281361b2-f089-3e97-458b-6ea737b61d7e:1:1 as path to executor unavailable.
2016-09-29 04:53:39,539 [CONTROL-rpc-event-queue] WARN o.a.d.e.w.b.ControlMessageHandler - Dropping request for early fragment termination for path 281361b2-f089-3e97-458b-6ea737b61d7e:2:2 -> 281361b2-f089-3e97-458b-6ea737b61d7e:1:1 as path to executor unavailable.
2016-09-29 04:53:39,540 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:0:0] INFO o.a.d.e.p.i.u.UnorderedReceiverBatch - Informing senders of request to terminate sending.
2016-09-29 04:53:39,540 [CONTROL-rpc-event-queue] WARN o.a.d.e.w.b.ControlMessageHandler - Dropping request for early fragment termination for path 281361b2-f089-3e97-458b-6ea737b61d7e:2:3 -> 281361b2-f089-3e97-458b-6ea737b61d7e:1:1 as path to executor unavailable.
2016-09-29 04:53:39,545 [CONTROL-rpc-event-queue] WARN o.a.d.e.w.b.ControlMessageHandler - Dropping request for early fragment termination for path 281361b2-f089-3e97-458b-6ea737b61d7e:2:4 -> 281361b2-f089-3e97-458b-6ea737b61d7e:1:1 as path to executor unavailable.
2016-09-29 04:53:39,545 [CONTROL-rpc-event-queue] WARN o.a.d.e.w.b.ControlMessageHandler - Dropping request for early fragment termination for path 281361b2-f089-3e97-458b-6ea737b61d7e:2:5 -> 281361b2-f089-3e97-458b-6ea737b61d7e:1:1 as path to executor unavailable.
2016-09-29 04:53:39,545 [CONTROL-rpc-event-queue] WARN o.a.d.e.w.b.ControlMessageHandler - Dropping request for early fragment termination for path 281361b2-f089-3e97-458b-6ea737b61d7e:2:0 -> 281361b2-f089-3e97-458b-6ea737b61d7e:1:4 as path to executor unavailable.
2016-09-29 04:53:39,545 [CONTROL-rpc-event-queue] WARN o.a.d.e.w.b.ControlMessageHandler - Dropping request for early fragment termination for path 281361b2-f089-3e97-458b-6ea737b61d7e:2:0 -> 281361b2-f089-3e97-458b-6ea737b61d7e:1:5 as path to executor unavailable.
2016-09-29 04:53:39,545 [CONTROL-rpc-event-queue] WARN o.a.d.e.w.b.ControlMessageHandler - Dropping request for early fragment termination for path 281361b2-f089-3e97-458b-6ea737b61d7e:2:1 -> 281361b2-f089-3e97-458b-6ea737b61d7e:1:4 as path to executor unavailable.
2016-09-29 04:53:39,545 [CONTROL-rpc-event-queue] WARN o.a.d.e.w.b.ControlMessageHandler - Dropping request for early fragment termination for path 281361b2-f089-3e97-458b-6ea737b61d7e:2:1 -> 281361b2-f089-3e97-458b-6ea737b61d7e:1:5 as path to executor unavailable.
2016-09-29 04:53:39,545 [CONTROL-rpc-event-queue] WARN o.a.d.e.w.b.ControlMessageHandler - Dropping request for early fragment termination for path 281361b2-f089-3e97-458b-6ea737b61d7e:2:2 -> 281361b2-f089-3e97-458b-6ea737b61d7e:1:4 as path to executor unavailable.
2016-09-29 04:53:39,545 [CONTROL-rpc-event-queue] WARN o.a.d.e.w.b.ControlMessageHandler - Dropping request for early fragment termination for path 281361b2-f089-3e97-458b-6ea737b61d7e:2:0 -> 281361b2-f089-3e97-458b-6ea737b61d7e:1:0 as path to executor unavailable.
2016-09-29 04:53:39,546 [CONTROL-rpc-event-queue] WARN o.a.d.e.w.b.ControlMessageHandler - Dropping request for early fragment termination for path 281361b2-f089-3e97-458b-6ea737b61d7e:2:2 -> 281361b2-f089-3e97-458b-6ea737b61d7e:1:5 as path to executor unavailable.
2016-09-29 04:53:39,546 [CONTROL-rpc-event-queue] WARN o.a.d.e.w.b.ControlMessageHandler - Dropping request for early fragment termination for path 281361b2-f089-3e97-458b-6ea737b61d7e:2:3 -> 281361b2-f089-3e97-458b-6ea737b61d7e:1:4 as path to executor unavailable.
2016-09-29 04:53:39,546 [CONTROL-rpc-event-queue] WARN o.a.d.e.w.b.ControlMessageHandler - Dropping request for early fragment termination for path 281361b2-f089-3e97-458b-6ea737b61d7e:2:0 -> 281361b2-f089-3e97-458b-6ea737b61d7e:1:2 as path to executor unavailable.
2016-09-29 04:53:39,546 [CONTROL-rpc-event-queue] WARN o.a.d.e.w.b.ControlMessageHandler - Dropping request for early fragment termination for path 281361b2-f089-3e97-458b-6ea737b61d7e:2:3 -> 281361b2-f089-3e97-458b-6ea737b61d7e:1:5 as path to executor unavailable.
2016-09-29 04:53:39,546 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:0:0] INFO o.a.d.e.w.fragment.FragmentExecutor - 281361b2-f089-3e97-458b-6ea737b61d7e:0:0: State change requested RUNNING --> FINISHED
2016-09-29 04:53:39,546 [CONTROL-rpc-event-queue] WARN o.a.d.e.w.b.ControlMessageHandler - Dropping request for early fragment termination for path 281361b2-f089-3e97-458b-6ea737b61d7e:2:1 -> 281361b2-f089-3e97-458b-6ea737b61d7e:1:0 as path to executor unavailable.
2016-09-29 04:53:39,546 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:0:0] INFO o.a.d.e.w.f.FragmentStatusReporter - 281361b2-f089-3e97-458b-6ea737b61d7e:0:0: State to report: FINISHED
2016-09-29 04:53:39,546 [CONTROL-rpc-event-queue] WARN o.a.d.e.w.b.ControlMessageHandler - Dropping request for early fragment termination for path 281361b2-f089-3e97-458b-6ea737b61d7e:2:4 -> 281361b2-f089-3e97-458b-6ea737b61d7e:1:4 as path to executor unavailable.
2016-09-29 04:53:39,546 [CONTROL-rpc-event-queue] WARN o.a.d.e.w.b.ControlMessageHandler - Dropping request for early fragment termination for path 281361b2-f089-3e97-458b-6ea737b61d7e:2:4 -> 281361b2-f089-3e97-458b-6ea737b61d7e:1:5 as path to executor unavailable.
2016-09-29 04:53:39,546 [CONTROL-rpc-event-queue] WARN o.a.d.e.w.b.ControlMessageHandler - Dropping request for early fragment termination for path 281361b2-f089-3e97-458b-6ea737b61d7e:2:1 -> 281361b2-f089-3e97-458b-6ea737b61d7e:1:2 as path to executor unavailable.
2016-09-29 04:53:39,546 [CONTROL-rpc-event-queue] WARN o.a.d.e.w.b.ControlMessageHandler - Dropping request for early fragment termination for path 281361b2-f089-3e97-458b-6ea737b61d7e:2:2 -> 281361b2-f089-3e97-458b-6ea737b61d7e:1:0 as path to executor unavailable.
2016-09-29 04:53:39,546 [CONTROL-rpc-event-queue] WARN o.a.d.e.w.b.ControlMessageHandler - Dropping request for early fragment termination for path 281361b2-f089-3e97-458b-6ea737b61d7e:2:5 -> 281361b2-f089-3e97-458b-6ea737b61d7e:1:4 as path to executor unavailable.
2016-09-29 04:53:39,546 [CONTROL-rpc-event-queue] WARN o.a.d.e.w.b.ControlMessageHandler - Dropping request for early fragment termination for path 281361b2-f089-3e97-458b-6ea737b61d7e:2:5 -> 281361b2-f089-3e97-458b-6ea737b61d7e:1:5 as path to executor unavailable.
2016-09-29 04:53:39,547 [CONTROL-rpc-event-queue] WARN o.a.d.e.w.b.ControlMessageHandler - Dropping request for early fragment termination for path 281361b2-f089-3e97-458b-6ea737b61d7e:2:3 -> 281361b2-f089-3e97-458b-6ea737b61d7e:1:0 as path to executor unavailable.
2016-09-29 04:53:39,547 [CONTROL-rpc-event-queue] WARN o.a.d.e.w.b.ControlMessageHandler - Dropping request for early fragment termination for path 281361b2-f089-3e97-458b-6ea737b61d7e:2:2 -> 281361b2-f089-3e97-458b-6ea737b61d7e:1:2 as path to executor unavailable.
2016-09-29 04:53:39,547 [CONTROL-rpc-event-queue] WARN o.a.d.e.w.b.ControlMessageHandler - Dropping request for early fragment termination for path 281361b2-f089-3e97-458b-6ea737b61d7e:2:3 -> 281361b2-f089-3e97-458b-6ea737b61d7e:1:2 as path to executor unavailable.
2016-09-29 04:53:39,548 [CONTROL-rpc-event-queue] WARN o.a.d.e.w.b.ControlMessageHandler - Dropping request for early fragment termination for path 281361b2-f089-3e97-458b-6ea737b61d7e:2:4 -> 281361b2-f089-3e97-458b-6ea737b61d7e:1:2 as path to executor unavailable.
2016-09-29 04:53:39,548 [CONTROL-rpc-event-queue] WARN o.a.d.e.w.b.ControlMessageHandler - Dropping request for early fragment termination for path 281361b2-f089-3e97-458b-6ea737b61d7e:2:5 -> 281361b2-f089-3e97-458b-6ea737b61d7e:1:2 as path to executor unavailable.
2016-09-29 04:53:39,548 [CONTROL-rpc-event-queue] WARN o.a.d.e.w.b.ControlMessageHandler - Dropping request for early fragment termination for path 281361b2-f089-3e97-458b-6ea737b61d7e:2:4 -> 281361b2-f089-3e97-458b-6ea737b61d7e:1:0 as path to executor unavailable.
2016-09-29 04:53:39,548 [CONTROL-rpc-event-queue] WARN o.a.d.e.w.b.ControlMessageHandler - Dropping request for early fragment termination for path 281361b2-f089-3e97-458b-6ea737b61d7e:2:5 -> 281361b2-f089-3e97-458b-6ea737b61d7e:1:0 as path to executor unavailable.
2016-09-29 04:53:39,553 [CONTROL-rpc-event-queue] INFO o.a.d.e.w.fragment.FragmentExecutor - Applying request for early sender termination for 281361b2-f089-3e97-458b-6ea737b61d7e:1:0 -> 0:0.
2016-09-29 04:53:39,553 [CONTROL-rpc-event-queue] INFO o.a.d.e.w.fragment.FragmentExecutor - Applying request for early sender termination for 281361b2-f089-3e97-458b-6ea737b61d7e:1:1 -> 0:0.
2016-09-29 04:53:39,555 [CONTROL-rpc-event-queue] INFO o.a.d.e.w.fragment.FragmentExecutor - Applying request for early sender termination for 281361b2-f089-3e97-458b-6ea737b61d7e:1:2 -> 0:0.
2016-09-29 04:53:39,555 [CONTROL-rpc-event-queue] INFO o.a.d.e.w.fragment.FragmentExecutor - Applying request for early sender termination for 281361b2-f089-3e97-458b-6ea737b61d7e:1:3 -> 0:0.
2016-09-29 04:53:39,557 [CONTROL-rpc-event-queue] INFO o.a.d.e.w.fragment.FragmentExecutor - Applying request for early sender termination for 281361b2-f089-3e97-458b-6ea737b61d7e:1:4 -> 0:0.
2016-09-29 04:53:39,557 [CONTROL-rpc-event-queue] INFO o.a.d.e.w.fragment.FragmentExecutor - Applying request for early sender termination for 281361b2-f089-3e97-458b-6ea737b61d7e:1:5 -> 0:0.
2016-09-29 04:54:26,573 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:1:0] INFO o.a.d.e.w.fragment.FragmentExecutor - 281361b2-f089-3e97-458b-6ea737b61d7e:1:0: State change requested RUNNING --> FINISHED
2016-09-29 04:54:26,573 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:1:0] INFO o.a.d.e.w.f.FragmentStatusReporter - 281361b2-f089-3e97-458b-6ea737b61d7e:1:0: State to report: FINISHED
2016-09-29 04:58:53,243 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:1:3] INFO o.a.d.e.w.fragment.FragmentExecutor - 281361b2-f089-3e97-458b-6ea737b61d7e:1:3: State change requested RUNNING --> FINISHED
2016-09-29 04:58:53,243 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:1:3] INFO o.a.d.e.w.f.FragmentStatusReporter - 281361b2-f089-3e97-458b-6ea737b61d7e:1:3: State to report: FINISHED
2016-09-29 05:04:50,094 [DATA-rpc-event-queue] ERROR o.a.drill.exec.rpc.data.DataServer - Failure while getting fragment manager. 281361b2-f089-3e97-458b-6ea737b61d7e:0:0
org.apache.drill.exec.exception.FragmentSetupException: Failed to receive plan fragment that was required for id: 281361b2-f089-3e97-458b-6ea737b61d7e:0:0
    at org.apache.drill.exec.rpc.control.WorkEventBus.getFragmentManager(WorkEventBus.java:108) ~[drill-java-exec-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.rpc.data.DataServer.submit(DataServer.java:121) [drill-java-exec-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.rpc.data.DataServer.handle(DataServer.java:159) [drill-java-exec-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.rpc.data.DataServer.handle(DataServer.java:52) [drill-java-exec-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.rpc.RpcBus$RequestEvent.run(RpcBus.java:363) [drill-rpc-1.8.0.jar:1.8.0]
    at org.apache.drill.common.SerializedExecutor$RunnableProcessor.run(SerializedExecutor.java:89) [drill-rpc-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.rpc.RpcBus$SameExecutor.execute(RpcBus.java:240) [drill-rpc-1.8.0.jar:1.8.0]
    at org.apache.drill.common.SerializedExecutor.execute(SerializedExecutor.java:123) [drill-rpc-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.rpc.RpcBus$InboundHandler.decode(RpcBus.java:274) [drill-rpc-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.rpc.RpcBus$InboundHandler.decode(RpcBus.java:245) [drill-rpc-1.8.0.jar:1.8.0]
    at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:89) [netty-codec-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.handler.timeout.ReadTimeoutHandler.channelRead(ReadTimeoutHandler.java:150) [netty-handler-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103) [netty-codec-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:242) [netty-codec-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:86) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:847) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111) [netty-common-4.0.27.Final.jar:4.0.27.Final]
    at java.lang.Thread.run(Thread.java:745) [na:1.8.0_101]
2016-09-29 05:04:50,095 [DATA-rpc-event-queue] ERROR o.a.drill.exec.ops.StatusHandler - Data not accepted downstream. Stopping future sends.
2016-09-29 05:04:50,095 [DATA-rpc-event-queue] INFO o.a.d.e.w.fragment.FragmentExecutor - 281361b2-f089-3e97-458b-6ea737b61d7e:1:2: State change requested RUNNING --> FAILED
2016-09-29 05:04:50,099 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:1:2] INFO o.a.d.e.w.fragment.FragmentExecutor - 281361b2-f089-3e97-458b-6ea737b61d7e:1:2: State change requested FAILED --> FINISHED
2016-09-29 05:04:50,100 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:1:2] ERROR o.a.d.e.w.fragment.FragmentExecutor - SYSTEM ERROR: RpcException: Data not accepted downstream.
Fragment 1:2
[Error Id: d2cfcbe2-6b59-4847-85ec-7cd7fb312051 on ip-172-31-16-222.us-west-2.compute.internal:31010]
org.apache.drill.common.exceptions.UserException: SYSTEM ERROR: RpcException: Data not accepted downstream.
Fragment 1:2
[Error Id: d2cfcbe2-6b59-4847-85ec-7cd7fb312051 on ip-172-31-16-222.us-west-2.compute.internal:31010]
    at org.apache.drill.common.exceptions.UserException$Builder.build(UserException.java:543) ~[drill-common-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.work.fragment.FragmentExecutor.sendFinalState(FragmentExecutor.java:293) [drill-java-exec-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.work.fragment.FragmentExecutor.cleanup(FragmentExecutor.java:160) [drill-java-exec-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.work.fragment.FragmentExecutor.run(FragmentExecutor.java:262) [drill-java-exec-1.8.0.jar:1.8.0]
    at org.apache.drill.common.SelfCleaningRunnable.run(SelfCleaningRunnable.java:38) [drill-common-1.8.0.jar:1.8.0]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_101]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_101]
    at java.lang.Thread.run(Thread.java:745) [na:1.8.0_101]
Caused by: org.apache.drill.exec.rpc.RpcException: Data not accepted downstream.
    at org.apache.drill.exec.ops.StatusHandler.success(StatusHandler.java:54) ~[drill-java-exec-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.ops.StatusHandler.success(StatusHandler.java:29) ~[drill-java-exec-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.rpc.ListeningCommand$DeferredRpcOutcome.success(ListeningCommand.java:55) ~[drill-rpc-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.rpc.ListeningCommand$DeferredRpcOutcome.success(ListeningCommand.java:46) ~[drill-rpc-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.rpc.data.DataTunnel$ThrottlingOutcomeListener.success(DataTunnel.java:134) ~[drill-java-exec-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.rpc.data.DataTunnel$ThrottlingOutcomeListener.success(DataTunnel.java:117) ~[drill-java-exec-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.rpc.RequestIdMap$RpcListener.set(RequestIdMap.java:129) ~[drill-rpc-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.rpc.RpcBus$ResponseEvent.run(RpcBus.java:415) ~[drill-rpc-1.8.0.jar:1.8.0]
    at org.apache.drill.common.SerializedExecutor$RunnableProcessor.run(SerializedExecutor.java:89) ~[drill-rpc-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.rpc.RpcBus$SameExecutor.execute(RpcBus.java:240) ~[drill-rpc-1.8.0.jar:1.8.0]
    at org.apache.drill.common.SerializedExecutor.execute(SerializedExecutor.java:123) ~[drill-rpc-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.rpc.RpcBus$InboundHandler.decode(RpcBus.java:279) ~[drill-rpc-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.rpc.RpcBus$InboundHandler.decode(RpcBus.java:245) ~[drill-rpc-1.8.0.jar:1.8.0]
    at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:89) ~[netty-codec-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339) ~[netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324) ~[netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:254) ~[netty-handler-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339) ~[netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324) ~[netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103) ~[netty-codec-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339) ~[netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324) ~[netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:242) ~[netty-codec-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339) ~[netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324) ~[netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:86) ~[netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339) ~[netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324) ~[netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:847) ~[netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131) ~[netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511) ~[netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468) ~[netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382) ~[netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354) ~[netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111) ~[netty-common-4.0.27.Final.jar:4.0.27.Final]
    ... 1 common frames omitted
2016-09-29 05:04:50,116 [USER-rpc-event-queue] ERROR o.a.d.exec.server.rest.QueryWrapper - Query Failed
org.apache.drill.common.exceptions.UserRemoteException: SYSTEM ERROR: RpcException: Data not accepted downstream.
Fragment 1:2
[Error Id: d2cfcbe2-6b59-4847-85ec-7cd7fb312051 on ip-172-31-16-222.us-west-2.compute.internal:31010]
    at org.apache.drill.exec.rpc.user.QueryResultHandler.resultArrived(QueryResultHandler.java:123) [drill-java-exec-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.rpc.user.UserClient.handleReponse(UserClient.java:134) [drill-java-exec-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.rpc.BasicClientWithConnection.handle(BasicClientWithConnection.java:46) [drill-rpc-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.rpc.BasicClientWithConnection.handle(BasicClientWithConnection.java:31) [drill-rpc-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.rpc.RpcBus.handle(RpcBus.java:65) [drill-rpc-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.rpc.RpcBus$RequestEvent.run(RpcBus.java:363) [drill-rpc-1.8.0.jar:1.8.0]
    at org.apache.drill.common.SerializedExecutor$RunnableProcessor.run(SerializedExecutor.java:89) [drill-rpc-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.rpc.RpcBus$SameExecutor.execute(RpcBus.java:240) [drill-rpc-1.8.0.jar:1.8.0]
    at org.apache.drill.common.SerializedExecutor.execute(SerializedExecutor.java:123) [drill-rpc-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.rpc.RpcBus$InboundHandler.decode(RpcBus.java:274) [drill-rpc-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.rpc.RpcBus$InboundHandler.decode(RpcBus.java:245) [drill-rpc-1.8.0.jar:1.8.0]
    at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:89) [netty-codec-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:254) [netty-handler-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103) [netty-codec-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:242) [netty-codec-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:86) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:847) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111) [netty-common-4.0.27.Final.jar:4.0.27.Final]
    at java.lang.Thread.run(Thread.java:745) [na:1.8.0_101]
2016-09-29 05:04:50,117 [CONTROL-rpc-event-queue] INFO o.a.d.e.w.fragment.FragmentExecutor - 281361b2-f089-3e97-458b-6ea737b61d7e:1:1: State change requested RUNNING --> CANCELLATION_REQUESTED
2016-09-29 05:04:50,117 [CONTROL-rpc-event-queue] INFO o.a.d.e.w.f.FragmentStatusReporter - 281361b2-f089-3e97-458b-6ea737b61d7e:1:1: State to report: CANCELLATION_REQUESTED
2016-09-29 05:04:50,117 [CONTROL-rpc-event-queue] WARN o.a.d.e.w.b.ControlMessageHandler - Dropping request to cancel fragment. 281361b2-f089-3e97-458b-6ea737b61d7e:1:2 does not exist.
2016-09-29 05:04:50,118 [CONTROL-rpc-event-queue] INFO o.a.d.e.w.fragment.FragmentExecutor - 281361b2-f089-3e97-458b-6ea737b61d7e:1:4: State change requested RUNNING --> CANCELLATION_REQUESTED
2016-09-29 05:04:50,118 [CONTROL-rpc-event-queue] INFO o.a.d.e.w.f.FragmentStatusReporter - 281361b2-f089-3e97-458b-6ea737b61d7e:1:4: State to report: CANCELLATION_REQUESTED
2016-09-29 05:04:50,118 [qtp613131516-2022] ERROR o.a.d.e.server.rest.QueryResources - Query from Web UI Failed
org.apache.drill.common.exceptions.UserRemoteException: SYSTEM ERROR: RpcException: Data not accepted downstream.
Fragment 1:2
[Error Id: d2cfcbe2-6b59-4847-85ec-7cd7fb312051 on ip-172-31-16-222.us-west-2.compute.internal:31010]
    at org.apache.drill.exec.rpc.user.QueryResultHandler.resultArrived(QueryResultHandler.java:123) ~[drill-java-exec-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.rpc.user.UserClient.handleReponse(UserClient.java:134) ~[drill-java-exec-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.rpc.BasicClientWithConnection.handle(BasicClientWithConnection.java:46) ~[drill-rpc-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.rpc.BasicClientWithConnection.handle(BasicClientWithConnection.java:31) ~[drill-rpc-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.rpc.RpcBus.handle(RpcBus.java:65) ~[drill-rpc-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.rpc.RpcBus$RequestEvent.run(RpcBus.java:363) ~[drill-rpc-1.8.0.jar:1.8.0]
    at org.apache.drill.common.SerializedExecutor$RunnableProcessor.run(SerializedExecutor.java:89) ~[drill-rpc-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.rpc.RpcBus$SameExecutor.execute(RpcBus.java:240) ~[drill-rpc-1.8.0.jar:1.8.0]
    at org.apache.drill.common.SerializedExecutor.execute(SerializedExecutor.java:123) ~[drill-rpc-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.rpc.RpcBus$InboundHandler.decode(RpcBus.java:274) ~[drill-rpc-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.rpc.RpcBus$InboundHandler.decode(RpcBus.java:245) ~[drill-rpc-1.8.0.jar:1.8.0]
    at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:89) ~[netty-codec-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339) ~[netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324) ~[netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:254) ~[netty-handler-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339) ~[netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324) ~[netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103) ~[netty-codec-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339) ~[netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324) ~[netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:242) ~[netty-codec-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339) ~[netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324) ~[netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:86) ~[netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339) ~[netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324) ~[netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:847) ~[netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131) ~[netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511) ~[netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468) ~[netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382) ~[netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354) ~[netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111) ~[netty-common-4.0.27.Final.jar:4.0.27.Final]
    at java.lang.Thread.run(Thread.java:745) ~[na:1.8.0_101]
2016-09-29 05:04:50,118 [CONTROL-rpc-event-queue] INFO o.a.d.e.w.fragment.FragmentExecutor - 281361b2-f089-3e97-458b-6ea737b61d7e:1:5: State change requested RUNNING --> CANCELLATION_REQUESTED
2016-09-29 05:04:50,119 [CONTROL-rpc-event-queue] INFO o.a.d.e.w.f.FragmentStatusReporter - 281361b2-f089-3e97-458b-6ea737b61d7e:1:5: State to report: CANCELLATION_REQUESTED
2016-09-29 05:04:50,119 [CONTROL-rpc-event-queue] WARN o.a.d.exec.rpc.control.WorkEventBus - A fragment message arrived but there was no registered listener for that message: profile { state: CANCELLATION_REQUESTED minor_fragment_id: 1 operator_profile { input_profile { records: 4000 batches: 1 schemas: 1 } operator_id: 7 operator_type: 22 setup_nanos: 0 process_nanos: 30611038 peak_local_memory_allocated: 118784 wait_nanos: 0 } operator_profile { input_profile { records: 4607818 batches: 1156 schemas: 1 } operator_id: 8 operator_type: 11 setup_nanos: 0 process_nanos: 34375864 peak_local_memory_allocated: 0 metric { metric_id: 0 long_value: 0 } metric { metric_id: 1 long_value: 6 } wait_nanos: 5998663260 } operator_profile { input_profile { records: 4607818 batches: 1156 schemas: 1 } operator_id: 6 operator_type: 10 setup_nanos: 215717 process_nanos: 41483457 peak_local_memory_allocated: 131072 wait_nanos: 0 } operator_profile { input_profile { records: 8000 batches: 2 schemas: 2 } input_profile { records: 4603818 batches: 1155 schemas: 0 } operator_id: 5 operator_type: 4 setup_nanos: 0 process_nanos: 635695463342 peak_local_memory_allocated: 105018368 metric { metric_id: 0 long_value: 65536 } metric { metric_id: 1 long_value: 22 } metric { metric_id: 2 long_value: 0 } metric { metric_id: 3 long_value: 0 } wait_nanos: 0 } operator_profile { input_profile { records: 3556512000 batches: 889129 schemas: 1 } operator_id: 4 operator_type: 10 setup_nanos: 124913 process_nanos: 14114200238 peak_local_memory_allocated: 172032 wait_nanos: 0 } operator_profile { input_profile { records: 3556512000 batches: 889129 schemas: 1 } operator_id: 3 operator_type: 10 setup_nanos: 104281 process_nanos: 12163332291 peak_local_memory_allocated: 86016 wait_nanos: 0 } operator_profile { input_profile { records: 3556512000 batches: 889129 schemas: 1 } operator_id: 2 operator_type: 7 setup_nanos: 7757 process_nanos: 5286309389 peak_local_memory_allocated: 94208 wait_nanos: 0 } operator_profile { input_profile { records: 1000 batches: 2 schemas: 1 } operator_id: 1 operator_type: 14 setup_nanos: 216544 process_nanos: 488781 peak_local_memory_allocated: 53248 wait_nanos: 0 } operator_profile { input_profile { records: 1000 batches: 2 schemas: 1 } operator_id: 0 operator_type: 0 setup_nanos: 0 process_nanos: 58473 peak_local_memory_allocated: 53248 metric { metric_id: 0 long_value: 14004 } wait_nanos: 52521 } start_time: 1475124812430 end_time: 1475125490117 memory_used: 113018880 max_memory_used: 113018880 endpoint { address: "ip-172-31-16-222.us-west-2.compute.internal" user_port: 31010 control_port: 31011 data_port: 31012 } } handle { query_id { part1: 2887759207242219159 part2: 5011220674853084542 } major_fragment_id: 1 minor_fragment_id: 1 } .
2016-09-29 05:04:50,120 [CONTROL-rpc-event-queue] WARN o.a.d.exec.rpc.control.WorkEventBus - A fragment message arrived but there was no registered listener for that message: profile { state: CANCELLATION_REQUESTED minor_fragment_id: 4 operator_profile { input_profile { records: 4000 batches: 1 schemas: 1 } operator_id: 7 operator_type: 22 setup_nanos: 0 process_nanos: 90812177 peak_local_memory_allocated: 118784 wait_nanos: 0 } operator_profile { input_profile { records: 4607818 batches: 1156 schemas: 1 } operator_id: 8 operator_type: 11 setup_nanos: 0 process_nanos: 45929863 peak_local_memory_allocated: 0 metric { metric_id: 0 long_value: 0 } metric { metric_id: 1 long_value: 6 } wait_nanos: 5879799936 } operator_profile { input_profile { records: 4607818 batches: 1156 schemas: 1 } operator_id: 6 operator_type: 10 setup_nanos: 198520 process_nanos: 54769694 peak_local_memory_allocated: 131072 wait_nanos: 0 } operator_profile { input_profile { records: 8000 batches: 2 schemas: 2 } input_profile { records: 4603818 batches: 1155 schemas: 0 } operator_id: 5 operator_type: 4 setup_nanos: 0 process_nanos: 637675643248 peak_local_memory_allocated: 45274112 metric { metric_id: 0 long_value: 65536 } metric { metric_id: 2 long_value: 0 } metric { metric_id: 3 long_value: 0 } metric { metric_id: 1 long_value: 22 } wait_nanos: 0 } operator_profile { input_profile { records: 3243848000 batches: 810963 schemas: 1 } operator_id: 4 operator_type: 10 setup_nanos: 123882 process_nanos: 13361035638 peak_local_memory_allocated: 172032 wait_nanos: 0 } operator_profile { input_profile { records: 3243844000 batches: 810962 schemas: 1 } operator_id: 3 operator_type: 10 setup_nanos: 105411 process_nanos: 11528686553 peak_local_memory_allocated: 86016 wait_nanos: 0 } operator_profile { input_profile { records: 3243844000 batches: 810962 schemas: 1 } operator_id: 2 operator_type: 7 setup_nanos: 7235 process_nanos: 5007512565 peak_local_memory_allocated: 94208 wait_nanos: 0 } operator_profile { input_profile { records: 1000 batches: 2 schemas: 1 } operator_id: 1 operator_type: 14 setup_nanos: 217792 process_nanos: 181050 peak_local_memory_allocated: 53248 wait_nanos: 0 } operator_profile { input_profile { records: 1000 batches: 2 schemas: 1 } operator_id: 0 operator_type: 0 setup_nanos: 0 process_nanos: 50171 peak_local_memory_allocated: 53248 metric { metric_id: 0 long_value: 14004 } wait_nanos: 787724 } start_time: 1475124812431 end_time: 1475125490118 memory_used: 53221120 max_memory_used: 53274368 endpoint { address: "ip-172-31-16-222.us-west-2.compute.internal" user_port: 31010 control_port: 31011 data_port: 31012 } } handle { query_id { part1: 2887759207242219159 part2: 5011220674853084542 } major_fragment_id: 1 minor_fragment_id: 4 } .
2016-09-29 05:04:50,120 [CONTROL-rpc-event-queue] WARN o.a.d.exec.rpc.control.WorkEventBus - A fragment message arrived but there was no registered listener for that message: profile { state: CANCELLATION_REQUESTED minor_fragment_id: 5 operator_profile { input_profile { records: 4000 batches: 1 schemas: 1 } operator_id: 7 operator_type: 22 setup_nanos: 0 process_nanos: 79903843 peak_local_memory_allocated: 118784 wait_nanos: 0 } operator_profile { input_profile { records: 4607818 batches: 1156 schemas: 1 } operator_id: 8 operator_type: 11 setup_nanos: 0 process_nanos: 39479052 peak_local_memory_allocated: 0 metric { metric_id: 0 long_value: 0 } metric { metric_id: 1 long_value: 6 } wait_nanos: 5966311033 } operator_profile { input_profile { records: 4607818 batches: 1156 schemas: 1 } operator_id: 6 operator_type: 10 setup_nanos: 115384 process_nanos: 47432552 peak_local_memory_allocated: 131072 wait_nanos: 0 } operator_profile { input_profile { records: 8000 batches: 2 schemas: 2 } input_profile { records: 4603818 batches: 1155 schemas: 0 } operator_id: 5 operator_type: 4 setup_nanos: 0 process_nanos: 637010660695 peak_local_memory_allocated: 43947008 metric { metric_id: 0 long_value: 65536 } metric { metric_id: 3 long_value: 0 } metric { metric_id: 1 long_value: 22 } metric { metric_id: 2 long_value: 0 } wait_nanos: 0 } operator_profile { input_profile { records: 3331388000 batches: 832848 schemas: 1 } operator_id: 4 operator_type: 10 setup_nanos: 130058 process_nanos: 13546364856 peak_local_memory_allocated: 172032 wait_nanos: 0 } operator_profile { input_profile { records: 3331388000 batches: 832848 schemas: 1 } operator_id: 3 operator_type: 10 setup_nanos: 114558 process_nanos: 11767178245 peak_local_memory_allocated: 86016 wait_nanos: 0 } operator_profile { input_profile { records: 3331388000 batches: 832848 schemas: 1 } operator_id: 2 operator_type: 7 setup_nanos: 8253 process_nanos: 5107129680 peak_local_memory_allocated: 94208 wait_nanos: 0 } operator_profile { input_profile { records: 1000 batches: 2 schemas: 1 } operator_id: 1 operator_type: 14 setup_nanos: 228726 process_nanos: 191943 peak_local_memory_allocated: 53248 wait_nanos: 0 } operator_profile { input_profile { records: 1000 batches: 2 schemas: 1 } operator_id: 0 operator_type: 0 setup_nanos: 0 process_nanos: 46075 peak_local_memory_allocated: 53248 metric { metric_id: 0 long_value: 14004 } wait_nanos: 29694 } start_time: 1475124812431 end_time: 1475125490118 memory_used: 51914496 max_memory_used: 51947264 endpoint { address: "ip-172-31-16-222.us-west-2.compute.internal" user_port: 31010 control_port: 31011 data_port: 31012 } } handle { query_id { part1: 2887759207242219159 part2: 5011220674853084542 } major_fragment_id: 1 minor_fragment_id: 5 } .
2016-09-29 05:04:50,121 [DATA-rpc-event-queue] ERROR o.a.drill.exec.rpc.data.DataServer - Failure while getting fragment manager. 281361b2-f089-3e97-458b-6ea737b61d7e:0:0
org.apache.drill.exec.exception.FragmentSetupException: Failed to receive plan fragment that was required for id: 281361b2-f089-3e97-458b-6ea737b61d7e:0:0
    at org.apache.drill.exec.rpc.control.WorkEventBus.getFragmentManager(WorkEventBus.java:108) ~[drill-java-exec-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.rpc.data.DataServer.submit(DataServer.java:121) [drill-java-exec-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.rpc.data.DataServer.handle(DataServer.java:159) [drill-java-exec-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.rpc.data.DataServer.handle(DataServer.java:52) [drill-java-exec-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.rpc.RpcBus$RequestEvent.run(RpcBus.java:363) [drill-rpc-1.8.0.jar:1.8.0]
    at org.apache.drill.common.SerializedExecutor$RunnableProcessor.run(SerializedExecutor.java:89) [drill-rpc-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.rpc.RpcBus$SameExecutor.execute(RpcBus.java:240) [drill-rpc-1.8.0.jar:1.8.0]
    at org.apache.drill.common.SerializedExecutor.execute(SerializedExecutor.java:123) [drill-rpc-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.rpc.RpcBus$InboundHandler.decode(RpcBus.java:274) [drill-rpc-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.rpc.RpcBus$InboundHandler.decode(RpcBus.java:245) [drill-rpc-1.8.0.jar:1.8.0]
    at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:89) [netty-codec-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.handler.timeout.ReadTimeoutHandler.channelRead(ReadTimeoutHandler.java:150) [netty-handler-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103) [netty-codec-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:242) [netty-codec-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:86) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:847) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111) [netty-common-4.0.27.Final.jar:4.0.27.Final]
    at java.lang.Thread.run(Thread.java:745) [na:1.8.0_101]
2016-09-29 05:04:50,121 [DATA-rpc-event-queue] ERROR o.a.drill.exec.ops.StatusHandler - Data not accepted downstream. Stopping future sends.
2016-09-29 05:04:50,121 [DATA-rpc-event-queue] INFO o.a.d.e.w.fragment.FragmentExecutor - 281361b2-f089-3e97-458b-6ea737b61d7e:1:4: State change requested CANCELLATION_REQUESTED --> FAILED
2016-09-29 05:04:50,124 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:1:1] INFO o.a.d.e.w.fragment.FragmentExecutor - 281361b2-f089-3e97-458b-6ea737b61d7e:1:1: State change requested CANCELLATION_REQUESTED --> FINISHED
2016-09-29 05:04:50,124 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:1:1] INFO o.a.d.e.w.f.FragmentStatusReporter - 281361b2-f089-3e97-458b-6ea737b61d7e:1:1: State to report: CANCELLED
2016-09-29 05:04:50,124 [drill-executor-347] WARN o.a.d.exec.rpc.control.WorkEventBus - Fragment 281361b2-f089-3e97-458b-6ea737b61d7e:1:1 not found in the work bus.
2016-09-29 05:04:50,124 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:1:5] INFO o.a.d.e.w.fragment.FragmentExecutor - 281361b2-f089-3e97-458b-6ea737b61d7e:1:5: State change requested CANCELLATION_REQUESTED --> FINISHED
2016-09-29 05:04:50,124 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:1:5] INFO o.a.d.e.w.f.FragmentStatusReporter - 281361b2-f089-3e97-458b-6ea737b61d7e:1:5: State to report: CANCELLED
2016-09-29 05:04:50,125 [drill-executor-345] WARN o.a.d.exec.rpc.control.WorkEventBus - Fragment 281361b2-f089-3e97-458b-6ea737b61d7e:1:5 not found in the work bus.
2016-09-29 05:04:50,125 [CONTROL-rpc-event-queue] WARN o.a.d.exec.rpc.control.WorkEventBus - A fragment message arrived but there was no registered listener for that message: profile { state: CANCELLED minor_fragment_id: 1 operator_profile { input_profile { records: 4000 batches: 1 schemas: 1 } operator_id: 7 operator_type: 22 setup_nanos: 0 process_nanos: 30611038 peak_local_memory_allocated: 118784 wait_nanos: 0 } operator_profile { input_profile { records: 4607818 batches: 1156 schemas: 1 } operator_id: 8 operator_type: 11 setup_nanos: 0 process_nanos: 34375864 peak_local_memory_allocated: 0 metric { metric_id: 0 long_value: 0 } metric { metric_id: 1 long_value: 6 } wait_nanos: 5998663260 } operator_profile { input_profile { records: 4607818 batches: 1156 schemas: 1 } operator_id: 6 operator_type: 10 setup_nanos: 215717 process_nanos: 41483457 peak_local_memory_allocated: 131072 wait_nanos: 0 } operator_profile { input_profile { records: 8000 batches: 2 schemas: 2 } input_profile { records: 4603818 batches: 1155 schemas: 0 } operator_id: 5 operator_type: 4 setup_nanos: 0 process_nanos: 635697389121 peak_local_memory_allocated: 105018368 metric { metric_id: 0 long_value: 65536 } metric { metric_id: 1 long_value: 22 } metric { metric_id: 2 long_value: 0 } metric { metric_id: 3 long_value: 0 } wait_nanos: 0 } operator_profile { input_profile { records: 3556516000 batches: 889130 schemas: 1 } operator_id: 4 operator_type: 10 setup_nanos: 124913 process_nanos: 14114222140 peak_local_memory_allocated: 172032 wait_nanos: 0 } operator_profile { input_profile { records: 3556516000 batches: 889130 schemas: 1 } operator_id: 3 operator_type: 10 setup_nanos: 104281 process_nanos: 12163348436 peak_local_memory_allocated: 86016 wait_nanos: 0 } operator_profile { input_profile { records: 3556516000 batches: 889130 schemas: 1 } operator_id: 2 operator_type: 7 setup_nanos: 7757 process_nanos: 5286316684 peak_local_memory_allocated: 94208 wait_nanos: 0 } operator_profile { input_profile { records: 1000 batches: 2 schemas: 1 } operator_id: 1 operator_type: 14 setup_nanos: 216544 process_nanos: 489800 peak_local_memory_allocated: 53248 wait_nanos: 0 } operator_profile { input_profile { records: 1000 batches: 2 schemas: 1 } operator_id: 0 operator_type: 0 setup_nanos: 0 process_nanos: 79360 peak_local_memory_allocated: 53248 metric { metric_id: 0 long_value: 14004 } wait_nanos: 82880 } start_time: 1475124812430 end_time: 1475125490124 memory_used: 0 max_memory_used: 113018880 endpoint { address: "ip-172-31-16-222.us-west-2.compute.internal" user_port: 31010 control_port: 31011 data_port: 31012 } } handle { query_id { part1: 2887759207242219159 part2: 5011220674853084542 } major_fragment_id: 1 minor_fragment_id: 1 } . 
2016-09-29 05:04:50,125 [CONTROL-rpc-event-queue] WARN o.a.d.exec.rpc.control.WorkEventBus - A fragment message arrived but there was no registered listener for that message: profile { state: CANCELLED minor_fragment_id: 5 operator_profile { input_profile { records: 4000 batches: 1 schemas: 1 } operator_id: 7 operator_type: 22 setup_nanos: 0 process_nanos: 79903843 peak_local_memory_allocated: 118784 wait_nanos: 0 } operator_profile { input_profile { records: 4607818 batches: 1156 schemas: 1 } operator_id: 8 operator_type: 11 setup_nanos: 0 process_nanos: 39479052 peak_local_memory_allocated: 0 metric { metric_id: 0 long_value: 0 } metric { metric_id: 1 long_value: 6 } wait_nanos: 5966311033 } operator_profile { input_profile { records: 4607818 batches: 1156 schemas: 1 } operator_id: 6 operator_type: 10 setup_nanos: 115384 process_nanos: 47432552 peak_local_memory_allocated: 131072 wait_nanos: 0 } operator_profile { input_profile { records: 8000 batches: 2 schemas: 2 } input_profile { records: 4603818 batches: 1155 schemas: 0 } operator_id: 5 operator_type: 4 setup_nanos: 0 process_nanos: 637012243816 peak_local_memory_allocated: 43947008 metric { metric_id: 0 long_value: 65536 } metric { metric_id: 3 long_value: 0 } metric { metric_id: 1 long_value: 22 } metric { metric_id: 2 long_value: 0 } wait_nanos: 0 } operator_profile { input_profile { records: 3331392000 batches: 832849 schemas: 1 } operator_id: 4 operator_type: 10 setup_nanos: 130058 process_nanos: 13546386516 peak_local_memory_allocated: 172032 wait_nanos: 0 } operator_profile { input_profile { records: 3331392000 batches: 832849 schemas: 1 } operator_id: 3 operator_type: 10 setup_nanos: 114558 process_nanos: 11767196031 peak_local_memory_allocated: 86016 wait_nanos: 0 } operator_profile { input_profile { records: 3331392000 batches: 832849 schemas: 1 } operator_id: 2 operator_type: 7 setup_nanos: 8253 process_nanos: 5107137715 peak_local_memory_allocated: 94208 wait_nanos: 0 } operator_profile { input_profile { records: 1000 batches: 2 schemas: 1 } operator_id: 1 operator_type: 14 setup_nanos: 228726 process_nanos: 193088 peak_local_memory_allocated: 53248 wait_nanos: 0 } operator_profile { input_profile { records: 1000 batches: 2 schemas: 1 } operator_id: 0 operator_type: 0 setup_nanos: 0 process_nanos: 66640 peak_local_memory_allocated: 53248 metric { metric_id: 0 long_value: 14004 } wait_nanos: 56345 } start_time: 1475124812431 end_time: 1475125490124 memory_used: 0 max_memory_used: 51947264 endpoint { address: "ip-172-31-16-222.us-west-2.compute.internal" user_port: 31010 control_port: 31011 data_port: 31012 } } handle { query_id { part1: 2887759207242219159 part2: 5011220674853084542 } major_fragment_id: 1 minor_fragment_id: 5 } .
2016-09-29 05:04:50,148 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:1:4] INFO o.a.d.e.w.fragment.FragmentExecutor - 281361b2-f089-3e97-458b-6ea737b61d7e:1:4: State change requested FAILED --> FINISHED
2016-09-29 05:04:50,149 [281361b2-f089-3e97-458b-6ea737b61d7e:frag:1:4] ERROR o.a.d.e.w.fragment.FragmentExecutor - SYSTEM ERROR: RpcException: Data not accepted downstream.
Fragment 1:4
[Error Id: 34ca7b90-eba6-47a5-9a3a-9046285c2617 on ip-172-31-16-222.us-west-2.compute.internal:31010]
org.apache.drill.common.exceptions.UserException: SYSTEM ERROR: RpcException: Data not accepted downstream.
Fragment 1:4
[Error Id: 34ca7b90-eba6-47a5-9a3a-9046285c2617 on ip-172-31-16-222.us-west-2.compute.internal:31010]
    at org.apache.drill.common.exceptions.UserException$Builder.build(UserException.java:543) ~[drill-common-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.work.fragment.FragmentExecutor.sendFinalState(FragmentExecutor.java:293) [drill-java-exec-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.work.fragment.FragmentExecutor.cleanup(FragmentExecutor.java:160) [drill-java-exec-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.work.fragment.FragmentExecutor.run(FragmentExecutor.java:262) [drill-java-exec-1.8.0.jar:1.8.0]
    at org.apache.drill.common.SelfCleaningRunnable.run(SelfCleaningRunnable.java:38) [drill-common-1.8.0.jar:1.8.0]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_101]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_101]
    at java.lang.Thread.run(Thread.java:745) [na:1.8.0_101]
Caused by: org.apache.drill.exec.rpc.RpcException: Data not accepted downstream.
    at org.apache.drill.exec.ops.StatusHandler.success(StatusHandler.java:54) ~[drill-java-exec-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.ops.StatusHandler.success(StatusHandler.java:29) ~[drill-java-exec-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.rpc.ListeningCommand$DeferredRpcOutcome.success(ListeningCommand.java:55) ~[drill-rpc-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.rpc.ListeningCommand$DeferredRpcOutcome.success(ListeningCommand.java:46) ~[drill-rpc-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.rpc.data.DataTunnel$ThrottlingOutcomeListener.success(DataTunnel.java:134) ~[drill-java-exec-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.rpc.data.DataTunnel$ThrottlingOutcomeListener.success(DataTunnel.java:117) ~[drill-java-exec-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.rpc.RequestIdMap$RpcListener.set(RequestIdMap.java:129) ~[drill-rpc-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.rpc.RpcBus$ResponseEvent.run(RpcBus.java:415) ~[drill-rpc-1.8.0.jar:1.8.0]
    at org.apache.drill.common.SerializedExecutor$RunnableProcessor.run(SerializedExecutor.java:89) ~[drill-rpc-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.rpc.RpcBus$SameExecutor.execute(RpcBus.java:240) ~[drill-rpc-1.8.0.jar:1.8.0]
    at org.apache.drill.common.SerializedExecutor.execute(SerializedExecutor.java:123) ~[drill-rpc-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.rpc.RpcBus$InboundHandler.decode(RpcBus.java:279) ~[drill-rpc-1.8.0.jar:1.8.0]
    at org.apache.drill.exec.rpc.RpcBus$InboundHandler.decode(RpcBus.java:245) ~[drill-rpc-1.8.0.jar:1.8.0]
    at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:89) ~[netty-codec-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339) ~[netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324) ~[netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:254) ~[netty-handler-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339) ~[netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324) ~[netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103) ~[netty-codec-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339) ~[netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324) ~[netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:242) ~[netty-codec-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339) ~[netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324) ~[netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:86) ~[netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339) ~[netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324) ~[netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:847) ~[netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131) ~[netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511) ~[netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468) ~[netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382) ~[netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354) ~[netty-transport-4.0.27.Final.jar:4.0.27.Final]
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111) ~[netty-common-4.0.27.Final.jar:4.0.27.Final]
    ... 1 common frames omitted
2016-09-29 05:04:50,149 [drill-executor-336] WARN o.a.d.exec.rpc.control.WorkEventBus - Fragment 281361b2-f089-3e97-458b-6ea737b61d7e:1:4 not found in the work bus.
2016-09-29 05:04:50,150 [CONTROL-rpc-event-queue] WARN o.a.d.exec.rpc.control.WorkEventBus - A fragment message arrived but there was no registered listener for that message: profile { state: FAILED error { error_id: "34ca7b90-eba6-47a5-9a3a-9046285c2617" endpoint { address: "ip-172-31-16-222.us-west-2.compute.internal" user_port: 31010 control_port: 31011 data_port: 31012 } error_type: SYSTEM message: "SYSTEM ERROR: RpcException: Data not accepted downstream.\n\nFragment 1:4\n\n[Error Id: 34ca7b90-eba6-47a5-9a3a-9046285c2617 on ip-172-31-16-222.us-west-2.compute.internal:31010]" exception { exception_class: "org.apache.drill.exec.rpc.RpcException" message: "Data not accepted downstream."
stack_trace { class_name: "org.apache.drill.exec.ops.StatusHandler" file_name: "StatusHandler.java" line_number: 54 method_name: "success" is_native_method: false } stack_trace { class_name: "org.apache.drill.exec.ops.StatusHandler" file_name: "StatusHandler.java" line_number: 29 method_name: "success" is_native_method: false } stack_trace { class_name: "org.apache.drill.exec.rpc.ListeningCommand$DeferredRpcOutcome" file_name: "ListeningCommand.java" line_number: 55 method_name: "success" is_native_method: false } stack_trace { class_name: "org.apache.drill.exec.rpc.ListeningCommand$DeferredRpcOutcome" file_name: "ListeningCommand.java" line_number: 46 method_name: "success" is_native_method: false } stack_trace { class_name: "org.apache.drill.exec.rpc.data.DataTunnel$ThrottlingOutcomeListener" file_name: "DataTunnel.java" line_number: 134 method_name: "success" is_native_method: false } stack_trace { class_name: "org.apache.drill.exec.rpc.data.DataTunnel$ThrottlingOutcomeListener" file_name: "DataTunnel.java" line_number: 117 method_name: "success" is_native_method: false } stack_trace { class_name: "org.apache.drill.exec.rpc.RequestIdMap$RpcListener" file_name: "RequestIdMap.java" line_number: 129 method_name: "set" is_native_method: false } stack_trace { class_name: "org.apache.drill.exec.rpc.RpcBus$ResponseEvent" file_name: "RpcBus.java" line_number: 415 method_name: "run" is_native_method: false } stack_trace { class_name: "org.apache.drill.common.SerializedExecutor$RunnableProcessor" file_name: "SerializedExecutor.java" line_number: 89 method_name: "run" is_native_method: false } stack_trace { class_name: "org.apache.drill.exec.rpc.RpcBus$SameExecutor" file_name: "RpcBus.java" line_number: 240 method_name: "execute" is_native_method: false } stack_trace { class_name: "org.apache.drill.common.SerializedExecutor" file_name: "SerializedExecutor.java" line_number: 123 method_name: "execute" is_native_method: false } stack_trace { class_name: "org.apache.drill.exec.rpc.RpcBus$InboundHandler" file_name: "RpcBus.java" line_number: 279 method_name: "decode" is_native_method: false } stack_trace { class_name: "org.apache.drill.exec.rpc.RpcBus$InboundHandler" file_name: "RpcBus.java" line_number: 245 method_name: "decode" is_native_method: false } stack_trace { class_name: "io.netty.handler.codec.MessageToMessageDecoder" file_name: "MessageToMessageDecoder.java" line_number: 89 method_name: "channelRead" is_native_method: false } stack_trace { class_name: "io.netty.channel.AbstractChannelHandlerContext" file_name: "AbstractChannelHandlerContext.java" line_number: 339 method_name: "invokeChannelRead" is_native_method: false } stack_trace { class_name: "io.netty.channel.AbstractChannelHandlerContext" file_name: "AbstractChannelHandlerContext.java" line_number: 324 method_name: "fireChannelRead" is_native_method: false } stack_trace { class_name: "io.netty.handler.timeout.IdleStateHandler" file_name: "IdleStateHandler.java" line_number: 254 method_name: "channelRead" is_native_method: false } stack_trace { class_name: "io.netty.channel.AbstractChannelHandlerContext" file_name: "AbstractChannelHandlerContext.java" line_number: 339 method_name: "invokeChannelRead" is_native_method: false } stack_trace { class_name: "io.netty.channel.AbstractChannelHandlerContext" file_name: "AbstractChannelHandlerContext.java" line_number: 324 method_name: "fireChannelRead" is_native_method: false } stack_trace { class_name: "io.netty.handler.codec.MessageToMessageDecoder" file_name: "MessageToMessageDecoder.java" 
line_number: 103 method_name: "channelRead" is_native_method: false } stack_trace { class_name: "io.netty.channel.AbstractChannelHandlerContext" file_name: "AbstractChannelHandlerContext.java" line_number: 339 method_name: "invokeChannelRead" is_native_method: false } stack_trace { class_name: "io.netty.channel.AbstractChannelHandlerContext" file_name: "AbstractChannelHandlerContext.java" line_number: 324 method_name: "fireChannelRead" is_native_method: false } stack_trace { class_name: "io.netty.handler.codec.ByteToMessageDecoder" file_name: "ByteToMessageDecoder.java" line_number: 242 method_name: "channelRead" is_native_method: false } stack_trace { class_name: "io.netty.channel.AbstractChannelHandlerContext" file_name: "AbstractChannelHandlerContext.java" line_number: 339 method_name: "invokeChannelRead" is_native_method: false } stack_trace { class_name: "io.netty.channel.AbstractChannelHandlerContext" file_name: "AbstractChannelHandlerContext.java" line_number: 324 method_name: "fireChannelRead" is_native_method: false } stack_trace { class_name: "io.netty.channel.ChannelInboundHandlerAdapter" file_name: "ChannelInboundHandlerAdapter.java" line_number: 86 method_name: "channelRead" is_native_method: false } stack_trace { class_name: "io.netty.channel.AbstractChannelHandlerContext" file_name: "AbstractChannelHandlerContext.java" line_number: 339 method_name: "invokeChannelRead" is_native_method: false } stack_trace { class_name: "io.netty.channel.AbstractChannelHandlerContext" file_name: "AbstractChannelHandlerContext.java" line_number: 324 method_name: "fireChannelRead" is_native_method: false } stack_trace { class_name: "io.netty.channel.DefaultChannelPipeline" file_name: "DefaultChannelPipeline.java" line_number: 847 method_name: "fireChannelRead" is_native_method: false } stack_trace { class_name: "io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe" file_name: "AbstractNioByteChannel.java" line_number: 131 method_name: "read" is_native_method: false } stack_trace { class_name: "io.netty.channel.nio.NioEventLoop" file_name: "NioEventLoop.java" line_number: 511 method_name: "processSelectedKey" is_native_method: false } stack_trace { class_name: "io.netty.channel.nio.NioEventLoop" file_name: "NioEventLoop.java" line_number: 468 method_name: "processSelectedKeysOptimized" is_native_method: false } stack_trace { class_name: "io.netty.channel.nio.NioEventLoop" file_name: "NioEventLoop.java" line_number: 382 method_name: "processSelectedKeys" is_native_method: false } stack_trace { class_name: "io.netty.channel.nio.NioEventLoop" file_name: "NioEventLoop.java" line_number: 354 method_name: "run" is_native_method: false } stack_trace { class_name: "io.netty.util.concurrent.SingleThreadEventExecutor$2" file_name: "SingleThreadEventExecutor.java" line_number: 111 method_name: "run" is_native_method: false } stack_trace { class_name: "..." line_number: 0 method_name: "..." 
is_native_method: false } } } minor_fragment_id: 4 operator_profile { input_profile { records: 4000 batches: 1 schemas: 1 } operator_id: 7 operator_type: 22 setup_nanos: 0 process_nanos: 90812177 peak_local_memory_allocated: 118784 wait_nanos: 0 } operator_profile { input_profile { records: 4607818 batches: 1156 schemas: 1 } operator_id: 8 operator_type: 11 setup_nanos: 0 process_nanos: 45929863 peak_local_memory_allocated: 0 metric { metric_id: 0 long_value: 0 } metric { metric_id: 1 long_value: 6 } wait_nanos: 5879799936 } operator_profile { input_profile { records: 4607818 batches: 1156 schemas: 1 } operator_id: 6 operator_type: 10 setup_nanos: 198520 process_nanos: 54769694 peak_local_memory_allocated: 131072 wait_nanos: 0 } operator_profile { input_profile { records: 8000 batches: 2 schemas: 2 } input_profile { records: 4603818 batches: 1155 schemas: 0 } operator_id: 5 operator_type: 4 setup_nanos: 0 process_nanos: 637675643248 peak_local_memory_allocated: 120758272 metric { metric_id: 0 long_value: 65536 } metric { metric_id: 2 long_value: 0 } metric { metric_id: 3 long_value: 0 } metric { metric_id: 1 long_value: 22 } wait_nanos: 0 } operator_profile { input_profile { records: 3243848000 batches: 810963 schemas: 1 } operator_id: 4 operator_type: 10 setup_nanos: 123882 process_nanos: 13361051874 peak_local_memory_allocated: 172032 wait_nanos: 0 } operator_profile { input_profile { records: 3243848000 batches: 810963 schemas: 1 } operator_id: 3 operator_type: 10 setup_nanos: 105411 process_nanos: 11528699155 peak_local_memory_allocated: 86016 wait_nanos: 0 } operator_profile { input_profile { records: 3243848000 batches: 810963 schemas: 1 } operator_id: 2 operator_type: 7 setup_nanos: 7235 process_nanos: 5007518032 peak_local_memory_allocated: 94208 wait_nanos: 0 } operator_profile { input_profile { records: 1000 batches: 2 schemas: 1 } operator_id: 1 operator_type: 14 setup_nanos: 217792 process_nanos: 181895 peak_local_memory_allocated: 53248 wait_nanos: 0 } operator_profile { input_profile { records: 1000 batches: 2 schemas: 1 } operator_id: 0 operator_type: 0 setup_nanos: 0 process_nanos: 79153 peak_local_memory_allocated: 53248 metric { metric_id: 0 long_value: 14004 } wait_nanos: 3935275 } start_time: 1475124812431 end_time: 1475125490149 memory_used: 0 max_memory_used: 128758528 endpoint { address: "ip-172-31-16-222.us-west-2.compute.internal" user_port: 31010 control_port: 31011 data_port: 31012 } } handle { query_id { part1: 2887759207242219159 part2: 5011220674853084542 } major_fragment_id: 1 minor_fragment_id: 4 } .
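Even though these last profile messages never reached a registered listener, they are plain protobuf text, so the per-operator timings they carry can still be salvaged straight from the log. A minimal Python sketch, assuming the field order printed above (operator_id, operator_type, setup_nanos, process_nanos inside each operator_profile block); the sample string is shortened to a single operator here:

import re

# Shortened stand-in for one "profile { ... }" dump copied from the log above.
profile_text = (
    "operator_profile { input_profile { records: 8000 batches: 2 schemas: 2 } "
    "input_profile { records: 4603818 batches: 1155 schemas: 0 } "
    "operator_id: 5 operator_type: 4 setup_nanos: 0 process_nanos: 637675643248 }"
)

# Pull (operator_id, process_nanos) pairs and convert nanoseconds to seconds.
pair = re.compile(
    r"operator_id: (\d+) operator_type: \d+ setup_nanos: \d+ process_nanos: (\d+)"
)
for op_id, nanos in pair.findall(profile_text):
    print(f"operator {op_id}: {int(nanos) / 1e9:.1f} s process time")

For fragment 1:4 this attributes roughly 637 s of the fragment's ~678 s lifetime (start_time 1475124812431 to end_time 1475125490149) to operator 5, the two-input operator that is presumably the join in this plan, which is consistent with the query spending essentially all of its time in the join before the cancellation landed.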