2016-10-26 16:27:36,623 DEBUG org.apache.hadoop.ipc.Server: Socket Reader #1 for port 47855: responding to null from 192.168.200.219:38779 Call#-33 Retry#-1 Wrote 166 bytes.
2016-10-26 16:27:36,625 DEBUG org.apache.hadoop.ipc.Server: got #-33
2016-10-26 16:27:36,625 DEBUG org.apache.hadoop.ipc.Server: Have read input token of size 346 for processing by saslServer.evaluateResponse()
2016-10-26 16:27:36,655 DEBUG org.apache.hadoop.yarn.server.security.BaseNMTokenSecretManager: creating password for appattempt_1475850791417_0147_000002 for user shfs3453 to run on NM datanode05.bigdata.fr:47855
2016-10-26 16:27:36,701 DEBUG org.apache.hadoop.yarn.security.NMTokenIdentifier: Writing NMTokenIdentifier to RPC layer: appAttemptId { application_id { id: 147 cluster_timestamp: 1475850791417 } attemptId: 2 } nodeId { host: "datanode05.bigdata.fr" port: 47855 } appSubmitter: "shfs3453" keyId: -646854807
2016-10-26 16:27:36,702 DEBUG org.apache.hadoop.yarn.server.nodemanager.security.NMTokenSecretManagerInNM: NMToken password retrieved successfully!!
2016-10-26 16:27:36,703 DEBUG org.apache.hadoop.security.SaslRpcServer: SASL server DIGEST-MD5 callback: setting password for client: appattempt_1475850791417_0147_000002 (auth:SIMPLE)
2016-10-26 16:27:36,704 DEBUG org.apache.hadoop.security.SaslRpcServer: SASL server DIGEST-MD5 callback: setting canonicalized client ID: appattempt_1475850791417_0147_000002
2016-10-26 16:27:36,705 DEBUG org.apache.hadoop.ipc.Server: Will send SUCCESS token of size 40 from saslServer.
2016-10-26 16:27:36,705 DEBUG org.apache.hadoop.ipc.Server: SASL server context established. Negotiated QoP is auth
2016-10-26 16:27:36,705 DEBUG org.apache.hadoop.ipc.Server: SASL server successfully authenticated client: appattempt_1475850791417_0147_000002 (auth:SIMPLE)
2016-10-26 16:27:36,705 INFO SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for appattempt_1475850791417_0147_000002 (auth:SIMPLE)
2016-10-26 16:27:36,706 DEBUG org.apache.hadoop.ipc.Server: Sending sasl message state: SUCCESS token: "rspauth=8883a2ea93944f64555a89e67e98fc8a"
2016-10-26 16:27:36,706 DEBUG org.apache.hadoop.ipc.Server: Socket Reader #1 for port 47855: responding to null from 192.168.200.219:38779 Call#-33 Retry#-1
2016-10-26 16:27:36,706 DEBUG org.apache.hadoop.ipc.Server: Socket Reader #1 for port 47855: responding to null from 192.168.200.219:38779 Call#-33 Retry#-1 Wrote 64 bytes.
2016-10-26 16:27:36,708 DEBUG org.apache.hadoop.ipc.Server: got #-3
2016-10-26 16:27:36,718 INFO SecurityLogger.org.apache.hadoop.security.authorize.ServiceAuthorizationManager: Authorization successful for appattempt_1475850791417_0147_000002 (auth:TOKEN) for protocol=interface org.apache.hadoop.yarn.api.ContainerManagementProtocolPB
2016-10-26 16:27:36,722 DEBUG org.apache.hadoop.ipc.Server: Successfully authorized userInfo { } protocol: "org.apache.hadoop.yarn.api.ContainerManagementProtocolPB"
2016-10-26 16:27:36,722 DEBUG org.apache.hadoop.ipc.Server: got #2166
2016-10-26 16:27:36,723 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 4 on 47855: org.apache.hadoop.yarn.api.ContainerManagementProtocolPB.startContainers from 192.168.200.219:38779 Call#2166 Retry#0 for RpcKind RPC_PROTOCOL_BUFFER
2016-10-26 16:27:36,724 DEBUG org.apache.hadoop.security.UserGroupInformation: PrivilegedAction as:appattempt_1475850791417_0147_000002 (auth:TOKEN) from:org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043)
2016-10-26 16:27:36,855 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.monitor.ContainersMonitorImpl: Current ProcessTree list : [ ]
2016-10-26 16:27:36,949 DEBUG org.apache.hadoop.yarn.server.security.BaseContainerTokenSecretManager: Retrieving password for container_1475850791417_0147_02_000001 for user container_1475850791417_0147_02_000001 (auth:SIMPLE) to be run on NM datanode05.bigdata.fr:47855
2016-10-26 16:27:36,963 DEBUG org.apache.hadoop.yarn.security.ContainerTokenIdentifier: Writing ContainerTokenIdentifier to RPC layer: containerId { app_attempt_id { application_id { id: 147 cluster_timestamp: 1475850791417 } attemptId: 2 } id: 1 } nmHostAddr: "datanode05.bigdata.fr:47855" appSubmitter: "shfs3453" resource { memory: 4096 virtual_cores: 1 } expiryTimeStamp: 1477492656599 masterKeyId: -1972402786 rmIdentifier: 1475850791417 priority { priority: 0 } creationTime: 1477492056598
2016-10-26 16:27:36,964 DEBUG org.apache.hadoop.yarn.server.nodemanager.security.NMTokenSecretManagerInNM: NMToken key updated for application attempt : appattempt_1475850791417_0147_000002
2016-10-26 16:27:36,965 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.ContainerManagerImpl: Start request for container_1475850791417_0147_02_000001 by user shfs3453
2016-10-26 16:27:36,978 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.ContainerManagerImpl: = Kind: YARN_AM_RM_TOKEN, Service: , Ident: (appAttemptId { application_id { id: 147 cluster_timestamp: 1475850791417 } attemptId: 2 } keyId: -998955696)
2016-10-26 16:27:36,978 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.ContainerManagerImpl: ha-hdfs:sandbox = Kind: HDFS_DELEGATION_TOKEN, Service: ha-hdfs:sandbox, Ident: (HDFS_DELEGATION_TOKEN token 12521 for shfs3453)
2016-10-26 16:27:36,979 WARN org.apache.hadoop.security.token.Token: Cannot find class for token kind HBASE_AUTH_TOKEN
2016-10-26 16:27:36,979 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.ContainerManagerImpl: e7142baa-7bed-4c65-8ed5-6594470a05bd = Kind: HBASE_AUTH_TOKEN, Service: e7142baa-7bed-4c65-8ed5-6594470a05bd, Ident: 00 00 00 2e 08 00 12 17 73 68 66 73 33 34 35 33 40 53 41 4e 44 42 4f 58 2e 48 41 44 4f 4f 50 18 c6 04 20 8b e4 8d 8b 80 2b 28 8b ec bf ab 82 2b 30 59
2016-10-26 16:27:37,027 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.ContainerManagerImpl: Creating a new application reference for app application_1475850791417_0147
2016-10-26 16:27:37,040 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.application.ApplicationInitEvent.EventType: INIT_APPLICATION
2016-10-26 16:27:37,040 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.application.ApplicationImpl: Processing application_1475850791417_0147 of type INIT_APPLICATION
2016-10-26 16:27:37,043 INFO org.apache.hadoop.yarn.server.nodemanager.NMAuditLogger: USER=shfs3453 IP=192.168.200.219 OPERATION=Start Container Request TARGET=ContainerManageImpl RESULT=SUCCESS APPID=application_1475850791417_0147 CONTAINERID=container_1475850791417_0147_02_000001
2016-10-26 16:27:37,044 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.application.ApplicationImpl: Application application_1475850791417_0147 transitioned from NEW to INITING
2016-10-26 16:27:37,044 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.application.ApplicationContainerInitEvent.EventType: INIT_CONTAINER
2016-10-26 16:27:37,044 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.application.ApplicationImpl: Processing application_1475850791417_0147 of type INIT_CONTAINER
2016-10-26 16:27:37,044 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.application.ApplicationImpl: Adding container_1475850791417_0147_02_000001 to application application_1475850791417_0147
2016-10-26 16:27:37,045 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.loghandler.event.LogHandlerAppStartedEvent.EventType: APPLICATION_STARTED
2016-10-26 16:27:37,065 DEBUG org.apache.hadoop.ipc.Server: Served: startContainers queueTime= 89 procesingTime= 254
2016-10-26 16:27:37,067 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 4 on 47855: responding to org.apache.hadoop.yarn.api.ContainerManagementProtocolPB.startContainers from 192.168.200.219:38779 Call#2166 Retry#0
2016-10-26 16:27:37,067 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 4 on 47855: responding to org.apache.hadoop.yarn.api.ContainerManagementProtocolPB.startContainers from 192.168.200.219:38779 Call#2166 Retry#0 Wrote 80 bytes.
2016-10-26 16:27:37,207 DEBUG org.apache.hadoop.hdfs.BlockReaderLocal: dfs.client.use.legacy.blockreader.local = false
2016-10-26 16:27:37,208 DEBUG org.apache.hadoop.hdfs.BlockReaderLocal: dfs.client.read.shortcircuit = false
2016-10-26 16:27:37,208 DEBUG org.apache.hadoop.hdfs.BlockReaderLocal: dfs.client.domain.socket.data.traffic = false
2016-10-26 16:27:37,208 DEBUG org.apache.hadoop.hdfs.BlockReaderLocal: dfs.domain.socket.path =
2016-10-26 16:27:37,256 DEBUG org.apache.hadoop.hdfs.HAUtil: No HA service delegation token found for logical URI hdfs://sandbox
2016-10-26 16:27:37,256 DEBUG org.apache.hadoop.hdfs.BlockReaderLocal: dfs.client.use.legacy.blockreader.local = false
2016-10-26 16:27:37,256 DEBUG org.apache.hadoop.hdfs.BlockReaderLocal: dfs.client.read.shortcircuit = false
2016-10-26 16:27:37,257 DEBUG org.apache.hadoop.hdfs.BlockReaderLocal: dfs.client.domain.socket.data.traffic = false
2016-10-26 16:27:37,257 DEBUG org.apache.hadoop.hdfs.BlockReaderLocal: dfs.domain.socket.path =
2016-10-26 16:27:37,262 DEBUG org.apache.hadoop.io.retry.RetryUtils: multipleLinearRandomRetry = null
2016-10-26 16:27:37,269 DEBUG org.apache.hadoop.ipc.Client: getting client out of cache: org.apache.hadoop.ipc.Client@6e756936
2016-10-26 16:27:37,598 DEBUG org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Node's health-status : true,
2016-10-26 16:27:37,611 DEBUG org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Sending out 1 container statuses: [ContainerStatus: [ContainerId: container_1475850791417_0147_02_000001, State: RUNNING, Diagnostics: , ExitStatus: -1000, ]]
2016-10-26 16:27:37,612 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to resourcemanager02.bigdata.fr/192.168.200.219:8031 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP sending #434
2016-10-26 16:27:37,613 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to resourcemanager02.bigdata.fr/192.168.200.219:8031 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP got value #434
2016-10-26 16:27:37,613 DEBUG org.apache.hadoop.ipc.ProtobufRpcEngine: Call: nodeHeartbeat took 1ms
2016-10-26 16:27:37,852 DEBUG org.apache.hadoop.net.unix.DomainSocketWatcher: org.apache.hadoop.net.unix.DomainSocketWatcher$2@3dee5864: starting with interruptCheckPeriodMs = 60000
2016-10-26 16:27:37,881 DEBUG org.apache.hadoop.util.PerformanceAdvisory: Both short-circuit local reads and UNIX domain socket are disabled.
2016-10-26 16:27:37,892 DEBUG org.apache.hadoop.hdfs.protocol.datatransfer.sasl.DataTransferSaslUtil: DataTransferProtocol not using SaslPropertiesResolver, no QOP found in configuration for dfs.data.transfer.protection
2016-10-26 16:27:37,899 DEBUG org.apache.hadoop.ipc.Client: The ping interval is 60000 ms.
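The BlockReaderLocal lines above are the HDFS client-side short-circuit read settings as this NodeManager sees them; with dfs.client.read.shortcircuit = false and an empty dfs.domain.socket.path, PerformanceAdvisory correctly reports that both short-circuit local reads and the UNIX domain socket are disabled. As a minimal Java sketch (not from this log), this is how those same keys could be inspected, and hypothetically enabled, through Hadoop's Configuration API; the property names come from the log, while the socket path used when enabling is an assumption, not a value from this cluster:

    import org.apache.hadoop.conf.Configuration;

    public class ShortCircuitSettings {
        public static void main(String[] args) {
            // Loads core-site.xml / hdfs-site.xml found on the classpath.
            Configuration conf = new Configuration();

            // The same keys BlockReaderLocal prints at DEBUG level above.
            System.out.println("dfs.client.use.legacy.blockreader.local = "
                    + conf.getBoolean("dfs.client.use.legacy.blockreader.local", false));
            System.out.println("dfs.client.read.shortcircuit = "
                    + conf.getBoolean("dfs.client.read.shortcircuit", false));
            System.out.println("dfs.client.domain.socket.data.traffic = "
                    + conf.getBoolean("dfs.client.domain.socket.data.traffic", false));
            System.out.println("dfs.domain.socket.path = " + conf.get("dfs.domain.socket.path", ""));

            // Hypothetical: enabling short-circuit reads also needs a domain socket path
            // shared with the DataNode (the path below is an example, not from the log).
            conf.setBoolean("dfs.client.read.shortcircuit", true);
            conf.set("dfs.domain.socket.path", "/var/run/hadoop-hdfs/dn._PORT");
        }
    }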
2016-10-26 16:27:37,899 DEBUG org.apache.hadoop.ipc.Client: Connecting to namenode01.bigdata.fr/192.168.200.23:8020 2016-10-26 16:27:37,901 DEBUG org.apache.hadoop.security.UserGroupInformation: PrivilegedAction as:nm/datanode05.bigdata.fr@SANDBOX.HADOOP (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:724) 2016-10-26 16:27:37,902 DEBUG org.apache.hadoop.security.SaslRpcClient: Sending sasl message state: NEGOTIATE 2016-10-26 16:27:37,904 DEBUG org.apache.hadoop.security.SaslRpcClient: Received SASL message state: NEGOTIATE auths { method: "TOKEN" mechanism: "DIGEST-MD5" protocol: "" serverId: "default" challenge: "realm=\"default\",nonce=\"2i4gbPeAc/ePQUa8H1P3eUUrlocnW/1zfcF/3aB5\",qop=\"auth\",charset=utf-8,algorithm=md5-sess" } auths { method: "KERBEROS" mechanism: "GSSAPI" protocol: "nn" serverId: "namenode01.bigdata.fr" } 2016-10-26 16:27:37,904 DEBUG org.apache.hadoop.security.SaslRpcClient: Get token info proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB info:@org.apache.hadoop.security.token.TokenInfo(value=class org.apache.hadoop.hdfs.security.token.delegation.DelegationTokenSelector) 2016-10-26 16:27:37,904 DEBUG org.apache.hadoop.security.SaslRpcClient: Get kerberos info proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB info:@org.apache.hadoop.security.KerberosInfo(clientPrincipal=, serverPrincipal=dfs.namenode.kerberos.principal) 2016-10-26 16:27:37,907 DEBUG org.apache.hadoop.security.SaslRpcClient: RPC Server's Kerberos principal name for protocol=org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB is nn/namenode01.bigdata.fr@SANDBOX.HADOOP 2016-10-26 16:27:37,907 DEBUG org.apache.hadoop.security.SaslRpcClient: Creating SASL GSSAPI(KERBEROS) client to authenticate to service at namenode01.bigdata.fr 2016-10-26 16:27:37,908 DEBUG org.apache.hadoop.security.SaslRpcClient: Use KERBEROS authentication for protocol ClientNamenodeProtocolPB 2016-10-26 16:27:37,934 DEBUG org.apache.hadoop.security.SaslRpcClient: Sending sasl message state: INITIATE token: "`\202\002\263\006\t*\206H\206\367\022\001\002\002\001\000n\202\002\2420\202\002\236\240\003\002\001\005\241\003\002\001\016\242\a\003\005\000 \000\000\000\243\202\001\222a\202\001\2160\202\001\212\240\003\002\001\005\241\020\033\016SANDBOX.HADOOP\242301\240\003\002\001\000\241*0(\033\002nn\033\"namenode01.bigdata.fr\243\202\001:0\202\0016\240\003\002\001\022\241\003\002\001\v\242\202\001(\004\202\001$\343\340\366\366M\275\210Zt\262/\245\a\206\215\002\003s\255\r\361\320\037\016\220\030\3306f\371\243\006\025-\326\025\333\202\362\236\312\a\331\216M\354T\365Xvw\253\304\210\025\016\260\247\215\213\221R\333!\232nf\004\373\210b\037\3515,+\326Z\253\023\304\222Ct\v\026\307a\313\361\002S\034\032\246\274\0374d6;lT+,\2230\006\235\230\307\220j[\376\330;\021pJ\311M{\230\367\035\204\235\026\340=\356\232\304Z\322J\354\315\331\212\304mi\311\004\246\231\006\231\322lx0\354\345\2446 \201\233x\371\342\230~l\tNo\016\327\310\3454\226\316G\305>\234\034\260\021\037\326\265\341=\202I\244\344\371\324c\340J\353\302-\251\260\244C\310\0022\242\v\265\352bz\240]\263\313EJ8\364\2623\355\321q\317\261\327\206B\302FW\002\355\376[\231\037\236,\004\216\332 s\365\265u\313\322\037\3777@?e\030&\025\352\300\327!\275\210 
\375*\277\2230b\322(\372\a\005\321w\375\305[\2570\017,\223R\244\201\3620\201\357\240\003\002\001\022\242\201\347\004\201\344\006|<\342\260,\360\363\226(u\002P\206\020:;\207R\312\267x\336o\375\267\210\274\224\200\033\247\037\005\302\301x\212|\241U\374\304g+#\3422O\256]\246n\027\001d\205.c\302W\233\324\027\223\027j-\033\002w\223A\020\234\3364\277\317\272\374\3656\034CQ\343\371V66\234a[t\030\237\233}\215\374 \353\224A\252\342\341\341\373)\352\257\325\214\276\025@9\376\265\376\033+\217\366]\'b\352U\370g\324\344\n\226\264O\362\216\312\242\376$8\253\377\262\002\354\361\352\335\3635\241\363\347\340w49\312\005\242N\330\343\246\306\265\214xY\362\025\021\207d\204\025\307w\034<)\227\222\270\025\347\265\350:\253\342\377\255GM\375\345}_<\275!\263\242\377\2515\345\243\236\334:\\x\352z\230\276\025\305\\\221" auths { method: "KERBEROS" mechanism: "GSSAPI" protocol: "nn" serverId: "namenode01.bigdata.fr" } 2016-10-26 16:27:37,936 DEBUG org.apache.hadoop.security.SaslRpcClient: Received SASL message state: CHALLENGE token: "`j\006\t*\206H\206\367\022\001\002\002\002\000o[0Y\240\003\002\001\005\241\003\002\001\017\242M0K\240\003\002\001\022\242D\004B\244\200\326|q\211*\004\323\222\223\377=\366\315L\3210\206\354\236\242\366\000Xa\021\340\331\25341\3359\323\317\374T\321y\337\207\244\245\314UuFd\235\243s-\3507\321S\207f_)\024%\320\203g" 2016-10-26 16:27:37,938 DEBUG org.apache.hadoop.security.SaslRpcClient: Sending sasl message state: RESPONSE token: "" 2016-10-26 16:27:37,942 DEBUG org.apache.hadoop.security.SaslRpcClient: Received SASL message state: CHALLENGE token: "\005\004\001\377\000\f\000\000\000\000\000\000\vl\3031\001\001\000\000QZk>\016\342\205\275\214\376\207\030" 2016-10-26 16:27:37,944 DEBUG org.apache.hadoop.security.SaslRpcClient: Sending sasl message state: RESPONSE token: "\005\004\000\377\000\f\000\000\000\000\000\000\030.qM\001\001\000\000\230\273P%Jt~Y\360\236x\324" 2016-10-26 16:27:37,945 DEBUG org.apache.hadoop.security.SaslRpcClient: Received SASL message state: SUCCESS 2016-10-26 16:27:37,945 DEBUG org.apache.hadoop.ipc.Client: Negotiated QOP is :auth 2016-10-26 16:27:37,946 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to namenode01.bigdata.fr/192.168.200.23:8020 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP: starting, having connections 2 2016-10-26 16:27:37,946 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to namenode01.bigdata.fr/192.168.200.23:8020 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP sending #435 2016-10-26 16:27:37,948 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to namenode01.bigdata.fr/192.168.200.23:8020 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP got value #435 2016-10-26 16:27:37,948 DEBUG org.apache.hadoop.ipc.ProtobufRpcEngine: Call: getFileInfo took 49ms 2016-10-26 16:27:38,008 DEBUG org.apache.hadoop.security.UserGroupInformation: PrivilegedAction as:nm/datanode05.bigdata.fr@SANDBOX.HADOOP (auth:KERBEROS) from:org.apache.hadoop.fs.FileContext.getAbstractFileSystem(FileContext.java:331) 2016-10-26 16:27:38,011 WARN org.apache.hadoop.yarn.server.nodemanager.containermanager.logaggregation.AppLogAggregatorImpl: rollingMonitorInterval is set as -1. The log rolling mornitoring interval is disabled. The logs will be aggregated after this application is finished. 
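The AppLogAggregatorImpl WARN just above only means that rolling log aggregation is off, so container logs are shipped to HDFS once the application finishes rather than periodically while it runs. A minimal sketch of how that interval is read and interpreted, assuming the standard yarn-site.xml key yarn.nodemanager.log-aggregation.roll-monitoring-interval-seconds that backs this message in Hadoop 2.7 (the snippet itself is illustrative, not part of the log):

    import org.apache.hadoop.conf.Configuration;

    public class RollingIntervalCheck {
        public static void main(String[] args) {
            // Picks up yarn-site.xml from the classpath.
            Configuration conf = new Configuration();
            // -1, the default, disables rolling aggregation and produces the WARN seen above.
            long rollingMonitorInterval = conf.getLong(
                    "yarn.nodemanager.log-aggregation.roll-monitoring-interval-seconds", -1);
            if (rollingMonitorInterval <= 0) {
                System.out.println("Rolling aggregation disabled; logs aggregate when the application finishes.");
            } else {
                System.out.println("Logs will be rolled up every " + rollingMonitorInterval + " seconds.");
            }
        }
    }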
2016-10-26 16:27:38,012 DEBUG org.apache.hadoop.security.UserGroupInformation: PrivilegedAction as:shfs3453 (auth:SIMPLE) from:org.apache.hadoop.yarn.server.nodemanager.containermanager.logaggregation.LogAggregationService.createAppDir(LogAggregationService.java:261)
2016-10-26 16:27:38,013 DEBUG org.apache.hadoop.hdfs.BlockReaderLocal: dfs.client.use.legacy.blockreader.local = false
2016-10-26 16:27:38,013 DEBUG org.apache.hadoop.hdfs.BlockReaderLocal: dfs.client.read.shortcircuit = false
2016-10-26 16:27:38,013 DEBUG org.apache.hadoop.hdfs.BlockReaderLocal: dfs.client.domain.socket.data.traffic = false
2016-10-26 16:27:38,013 DEBUG org.apache.hadoop.hdfs.BlockReaderLocal: dfs.domain.socket.path =
2016-10-26 16:27:38,017 DEBUG org.apache.hadoop.security.SecurityUtil: Acquired token Kind: HDFS_DELEGATION_TOKEN, Service: 192.168.200.23:8020, Ident: (HDFS_DELEGATION_TOKEN token 12521 for shfs3453)
2016-10-26 16:27:38,017 DEBUG org.apache.hadoop.hdfs.HAUtil: Mapped HA service delegation token for logical URI hdfs://sandbox to namenode namenode01.bigdata.fr/192.168.200.23:8020
2016-10-26 16:27:38,017 DEBUG org.apache.hadoop.security.SecurityUtil: Acquired token Kind: HDFS_DELEGATION_TOKEN, Service: 192.168.200.24:8020, Ident: (HDFS_DELEGATION_TOKEN token 12521 for shfs3453)
2016-10-26 16:27:38,017 DEBUG org.apache.hadoop.hdfs.HAUtil: Mapped HA service delegation token for logical URI hdfs://sandbox to namenode namenode02.bigdata.fr/192.168.200.24:8020
2016-10-26 16:27:38,018 DEBUG org.apache.hadoop.hdfs.BlockReaderLocal: dfs.client.use.legacy.blockreader.local = false
2016-10-26 16:27:38,018 DEBUG org.apache.hadoop.hdfs.BlockReaderLocal: dfs.client.read.shortcircuit = false
2016-10-26 16:27:38,018 DEBUG org.apache.hadoop.hdfs.BlockReaderLocal: dfs.client.domain.socket.data.traffic = false
2016-10-26 16:27:38,018 DEBUG org.apache.hadoop.hdfs.BlockReaderLocal: dfs.domain.socket.path =
2016-10-26 16:27:38,018 DEBUG org.apache.hadoop.io.retry.RetryUtils: multipleLinearRandomRetry = null
2016-10-26 16:27:38,019 DEBUG org.apache.hadoop.ipc.Client: getting client out of cache: org.apache.hadoop.ipc.Client@6e756936
2016-10-26 16:27:38,020 DEBUG org.apache.hadoop.hdfs.protocol.datatransfer.sasl.DataTransferSaslUtil: DataTransferProtocol not using SaslPropertiesResolver, no QOP found in configuration for dfs.data.transfer.protection
2016-10-26 16:27:38,021 DEBUG org.apache.hadoop.ipc.Client: The ping interval is 60000 ms.
2016-10-26 16:27:38,021 DEBUG org.apache.hadoop.ipc.Client: Connecting to namenode01.bigdata.fr/192.168.200.23:8020
2016-10-26 16:27:38,022 DEBUG org.apache.hadoop.security.UserGroupInformation: PrivilegedAction as:shfs3453 (auth:SIMPLE) from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:724)
2016-10-26 16:27:38,022 DEBUG org.apache.hadoop.security.SaslRpcClient: Sending sasl message state: NEGOTIATE
2016-10-26 16:27:38,023 DEBUG org.apache.hadoop.security.SaslRpcClient: Received SASL message state: NEGOTIATE auths { method: "TOKEN" mechanism: "DIGEST-MD5" protocol: "" serverId: "default" challenge: "realm=\"default\",nonce=\"tRUblacLEoF5gSUEjgCgkVpgJN3m4feWuPga+zmT\",qop=\"auth\",charset=utf-8,algorithm=md5-sess" } auths { method: "KERBEROS" mechanism: "GSSAPI" protocol: "nn" serverId: "namenode01.bigdata.fr" }
2016-10-26 16:27:38,023 DEBUG org.apache.hadoop.security.SaslRpcClient: Get token info proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB info:@org.apache.hadoop.security.token.TokenInfo(value=class org.apache.hadoop.hdfs.security.token.delegation.DelegationTokenSelector)
2016-10-26 16:27:38,025 DEBUG org.apache.hadoop.security.SaslRpcClient: Creating SASL DIGEST-MD5(TOKEN) client to authenticate to service at default
2016-10-26 16:27:38,026 DEBUG org.apache.hadoop.security.SaslRpcClient: Use TOKEN authentication for protocol ClientNamenodeProtocolPB
2016-10-26 16:27:38,027 DEBUG org.apache.hadoop.security.SaslRpcClient: SASL client callback: setting username: ABdzaGZzMzQ1M0BTQU5EQk9YLkhBRE9PUAJybQCKAVgBY3KMigFYJW/2jI4w6Y4GHg==
2016-10-26 16:27:38,027 DEBUG org.apache.hadoop.security.SaslRpcClient: SASL client callback: setting userPassword
2016-10-26 16:27:38,027 DEBUG org.apache.hadoop.security.SaslRpcClient: SASL client callback: setting realm: default
2016-10-26 16:27:38,028 DEBUG org.apache.hadoop.security.SaslRpcClient: Sending sasl message state: INITIATE token: "charset=utf-8,username=\"ABdzaGZzMzQ1M0BTQU5EQk9YLkhBRE9PUAJybQCKAVgBY3KMigFYJW/2jI4w6Y4GHg==\",realm=\"default\",nonce=\"tRUblacLEoF5gSUEjgCgkVpgJN3m4feWuPga+zmT\",nc=00000001,cnonce=\"V++QXknDBgKr0reRy7MCzQy5qwCLtx+rlcZQP99v\",digest-uri=\"/default\",maxbuf=65536,response=9c7f9f4756a7c9389989af9c3d56cfe2,qop=auth" auths { method: "TOKEN" mechanism: "DIGEST-MD5" protocol: "" serverId: "default" }
2016-10-26 16:27:38,030 DEBUG org.apache.hadoop.security.SaslRpcClient: Received SASL message state: SUCCESS token: "rspauth=a53b272477020accb96079c0cf450a16"
2016-10-26 16:27:38,031 DEBUG org.apache.hadoop.ipc.Client: Negotiated QOP is :auth
2016-10-26 16:27:38,031 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to namenode01.bigdata.fr/192.168.200.23:8020 from shfs3453: starting, having connections 3
2016-10-26 16:27:38,031 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to namenode01.bigdata.fr/192.168.200.23:8020 from shfs3453 sending #436
2016-10-26 16:27:38,033 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to namenode01.bigdata.fr/192.168.200.23:8020 from shfs3453 got value #436
2016-10-26 16:27:38,033 DEBUG org.apache.hadoop.ipc.ProtobufRpcEngine: Call: getFileInfo took 13ms
2016-10-26 16:27:38,035 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.application.ApplicationEvent.EventType: APPLICATION_LOG_HANDLING_INITED
2016-10-26 16:27:38,035 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.application.ApplicationImpl: Processing application_1475850791417_0147 of type APPLICATION_LOG_HANDLING_INITED
2016-10-26 16:27:38,036 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.event.ApplicationLocalizationEvent.EventType: INIT_APPLICATION_RESOURCES
2016-10-26 16:27:38,039 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.application.ApplicationInitedEvent.EventType: APPLICATION_INITED
2016-10-26 16:27:38,039 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.application.ApplicationImpl: Processing application_1475850791417_0147 of type APPLICATION_INITED
2016-10-26 16:27:38,040 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.application.ApplicationImpl: Application application_1475850791417_0147 transitioned from INITING to RUNNING
2016-10-26 16:27:38,040 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.container.ContainerInitEvent.EventType: INIT_CONTAINER
2016-10-26 16:27:38,041 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.container.ContainerImpl: Processing container_1475850791417_0147_02_000001 of type INIT_CONTAINER
2016-10-26 16:27:38,049 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.container.ContainerImpl: Container container_1475850791417_0147_02_000001 transitioned from NEW to LOCALIZING
2016-10-26 16:27:38,049 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.AuxServicesEvent.EventType: CONTAINER_INIT
2016-10-26 16:27:38,049 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.AuxServices: Got event CONTAINER_INIT for appId application_1475850791417_0147
2016-10-26 16:27:38,051 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.event.ContainerLocalizationRequestEvent.EventType: INIT_CONTAINER_RESOURCES
2016-10-26 16:27:38,066 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Processing hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/hive-hbase-handler-2.1.0.jar of type REQUEST
2016-10-26 16:27:38,068 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Resource hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/hive-hbase-handler-2.1.0.jar transitioned from INIT to DOWNLOADING
2016-10-26 16:27:38,068 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Processing hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/metrics-core-2.2.0.jar of type REQUEST
2016-10-26 16:27:38,068 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Resource hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/metrics-core-2.2.0.jar transitioned from INIT to DOWNLOADING
2016-10-26 16:27:38,068 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Processing hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/hive-exec-2.1.0.jar of type REQUEST
2016-10-26 16:27:38,068 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Resource hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/hive-exec-2.1.0.jar transitioned from INIT to DOWNLOADING
2016-10-26 16:27:38,068 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Processing hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/hbase-hadoop-compat-1.1.2.jar of type REQUEST
2016-10-26 16:27:38,068 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Resource hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/hbase-hadoop-compat-1.1.2.jar transitioned from INIT to DOWNLOADING
2016-10-26 16:27:38,068 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Processing hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/hbase-common-1.1.2.jar of type REQUEST
2016-10-26 16:27:38,068 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Resource hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/hbase-common-1.1.2.jar transitioned from INIT to DOWNLOADING
2016-10-26 16:27:38,069 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Processing hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/netty-all-4.0.23.Final.jar of type REQUEST
2016-10-26 16:27:38,069 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Resource hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/netty-all-4.0.23.Final.jar transitioned from INIT to DOWNLOADING
2016-10-26 16:27:38,069 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Processing hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/hbase-client-1.1.2.jar of type REQUEST
2016-10-26 16:27:38,069 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Resource hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/hbase-client-1.1.2.jar transitioned from INIT to DOWNLOADING
2016-10-26 16:27:38,069 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Processing hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/hive-jdbc-2.1.0-standalone.jar of type REQUEST
2016-10-26 16:27:38,069 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Resource hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/hive-jdbc-2.1.0-standalone.jar transitioned from INIT to DOWNLOADING
2016-10-26 16:27:38,069 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Processing hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/guava-14.0.1.jar of type REQUEST
2016-10-26 16:27:38,069 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Resource hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/guava-14.0.1.jar transitioned from INIT to DOWNLOADING
2016-10-26 16:27:38,069 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Processing hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/hbase-protocol-1.1.2.jar of type REQUEST
2016-10-26 16:27:38,069 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Resource hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/hbase-protocol-1.1.2.jar transitioned from INIT to DOWNLOADING
2016-10-26 16:27:38,070 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Processing hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/spark-assembly-1.4.1-hadoop2.7.1.jar of type REQUEST
2016-10-26 16:27:38,070 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Resource hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/spark-assembly-1.4.1-hadoop2.7.1.jar transitioned from INIT to DOWNLOADING
2016-10-26 16:27:38,070 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Processing hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/hbase-server-1.1.2.jar of type REQUEST
2016-10-26 16:27:38,070 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Resource hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/hbase-server-1.1.2.jar transitioned from INIT to DOWNLOADING
2016-10-26 16:27:38,070 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Processing hdfs://sandbox:8020/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/job.jar of type REQUEST
2016-10-26 16:27:38,070 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Resource hdfs://sandbox:8020/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/job.jar transitioned from INIT to DOWNLOADING
2016-10-26 16:27:38,070 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Processing hdfs://sandbox:8020/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/job.splitmetainfo of type REQUEST
2016-10-26 16:27:38,070 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Resource hdfs://sandbox:8020/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/job.splitmetainfo transitioned from INIT to DOWNLOADING
2016-10-26 16:27:38,070 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Processing hdfs://sandbox:8020/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/job.split of type REQUEST
2016-10-26 16:27:38,071 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Resource hdfs://sandbox:8020/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/job.split transitioned from INIT to DOWNLOADING
2016-10-26 16:27:38,071 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Processing hdfs://sandbox:8020/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/job.xml of type REQUEST
2016-10-26 16:27:38,071 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Resource hdfs://sandbox:8020/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/job.xml transitioned from INIT to DOWNLOADING
2016-10-26 16:27:38,071 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event
org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.event.LocalizerResourceRequestEvent.EventType: REQUEST_RESOURCE_LOCALIZATION 2016-10-26 16:27:38,071 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ResourceLocalizationService: Created localizer for container_1475850791417_0147_02_000001 2016-10-26 16:27:38,075 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.event.LocalizerResourceRequestEvent.EventType: REQUEST_RESOURCE_LOCALIZATION 2016-10-26 16:27:38,075 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.event.LocalizerResourceRequestEvent.EventType: REQUEST_RESOURCE_LOCALIZATION 2016-10-26 16:27:38,075 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.event.LocalizerResourceRequestEvent.EventType: REQUEST_RESOURCE_LOCALIZATION 2016-10-26 16:27:38,075 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.event.LocalizerResourceRequestEvent.EventType: REQUEST_RESOURCE_LOCALIZATION 2016-10-26 16:27:38,075 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.event.LocalizerResourceRequestEvent.EventType: REQUEST_RESOURCE_LOCALIZATION 2016-10-26 16:27:38,075 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.event.LocalizerResourceRequestEvent.EventType: REQUEST_RESOURCE_LOCALIZATION 2016-10-26 16:27:38,075 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.event.LocalizerResourceRequestEvent.EventType: REQUEST_RESOURCE_LOCALIZATION 2016-10-26 16:27:38,075 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.event.LocalizerResourceRequestEvent.EventType: REQUEST_RESOURCE_LOCALIZATION 2016-10-26 16:27:38,075 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.event.LocalizerResourceRequestEvent.EventType: REQUEST_RESOURCE_LOCALIZATION 2016-10-26 16:27:38,075 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.event.LocalizerResourceRequestEvent.EventType: REQUEST_RESOURCE_LOCALIZATION 2016-10-26 16:27:38,075 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.event.LocalizerResourceRequestEvent.EventType: REQUEST_RESOURCE_LOCALIZATION 2016-10-26 16:27:38,075 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.event.LocalizerResourceRequestEvent.EventType: REQUEST_RESOURCE_LOCALIZATION 2016-10-26 16:27:38,075 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.event.LocalizerResourceRequestEvent.EventType: REQUEST_RESOURCE_LOCALIZATION 2016-10-26 16:27:38,075 DEBUG 
org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.event.LocalizerResourceRequestEvent.EventType: REQUEST_RESOURCE_LOCALIZATION 2016-10-26 16:27:38,075 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.event.LocalizerResourceRequestEvent.EventType: REQUEST_RESOURCE_LOCALIZATION 2016-10-26 16:27:38,134 DEBUG org.apache.hadoop.security.UserGroupInformation: PrivilegedAction as:nm/datanode05.bigdata.fr@SANDBOX.HADOOP (auth:KERBEROS) from:org.apache.hadoop.fs.FileContext.getAbstractFileSystem(FileContext.java:331) 2016-10-26 16:27:38,146 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ResourceLocalizationService: Writing credentials to the nmPrivate file /mnt/hd0/hadoop/yarn/local/nmPrivate/container_1475850791417_0147_02_000001.tokens. Credentials list: 2016-10-26 16:27:38,147 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ResourceLocalizationService: : GwoOCgoIkwEQ-dO__PkqEAIQ0MrUo_z_____ARSQhEqFnIFAl3ivPEC2IKS9zNq3cRBZQVJOX0FNX1JNX1RPS0VOAA 2016-10-26 16:27:38,147 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ResourceLocalizationService: ha-hdfs:sandbox : MQAXc2hmczM0NTNAU0FOREJPWC5IQURPT1ACcm0AigFYAWNyjIoBWCVv9oyOMOmOBh4UoU6kOA3YV-0Uq2TESPSVJsujJNIVSERGU19ERUxFR0FUSU9OX1RPS0VOD2hhLWhkZnM6c2FuZGJveA 2016-10-26 16:27:38,148 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ResourceLocalizationService: e7142baa-7bed-4c65-8ed5-6594470a05bd : MgAAAC4IABIXc2hmczM0NTNAU0FOREJPWC5IQURPT1AYxgQgi-SNi4ArKIvsv6uCKzBZFBEQe7buSwL4V2cbFsO-Ih-blVVFEEhCQVNFX0FVVEhfVE9LRU4kZTcxNDJiYWEtN2JlZC00YzY1LThlZDUtNjU5NDQ3MGEwNWJk 2016-10-26 16:27:38,614 DEBUG org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Node's health-status : true, 2016-10-26 16:27:38,614 DEBUG org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Sending out 1 container statuses: [ContainerStatus: [ContainerId: container_1475850791417_0147_02_000001, State: RUNNING, Diagnostics: , ExitStatus: -1000, ]] 2016-10-26 16:27:38,615 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to resourcemanager02.bigdata.fr/192.168.200.219:8031 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP sending #437 2016-10-26 16:27:38,616 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to resourcemanager02.bigdata.fr/192.168.200.219:8031 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP got value #437 2016-10-26 16:27:38,616 DEBUG org.apache.hadoop.ipc.ProtobufRpcEngine: Call: nodeHeartbeat took 1ms 2016-10-26 16:27:38,740 DEBUG org.apache.hadoop.yarn.server.nodemanager.LinuxContainerExecutor: initApplication: [/opt/application/Hadoop/current/bin/container-executor, shfs3453, shfs3453, 0, application_1475850791417_0147, /mnt/hd0/hadoop/yarn/local/nmPrivate/container_1475850791417_0147_02_000001.tokens, /mnt/hd8/hadoop/yarn/local,/mnt/hd3/hadoop/yarn/local,/mnt/hd2/hadoop/yarn/local,/mnt/hd1/hadoop/yarn/local,/mnt/hd9/hadoop/yarn/local,/mnt/hd10/hadoop/yarn/local,/mnt/hd4/hadoop/yarn/local,/mnt/hd5/hadoop/yarn/local,/mnt/hd6/hadoop/yarn/local,/mnt/hd0/hadoop/yarn/local,/mnt/hd11/hadoop/yarn/local,/mnt/hd7/hadoop/yarn/local, 
/mnt/hd8/hadoop/yarn/log,/mnt/hd3/hadoop/yarn/log,/mnt/hd2/hadoop/yarn/log,/mnt/hd1/hadoop/yarn/log,/mnt/hd9/hadoop/yarn/log,/mnt/hd10/hadoop/yarn/log,/mnt/hd4/hadoop/yarn/log,/mnt/hd5/hadoop/yarn/log,/mnt/hd6/hadoop/yarn/log,/mnt/hd0/hadoop/yarn/log,/mnt/hd11/hadoop/yarn/log,/mnt/hd7/hadoop/yarn/log, /usr/lib/jvm/java-1.7.0-openjdk-1.7.0.111.x86_64/jre/bin/java, -classpath, /opt/application/Hadoop/current/etc/hadoop/:/opt/application/Hadoop/current/etc/hadoop/:/opt/application/Hadoop/current/etc/hadoop/:/opt/application/Hadoop/current/share/hadoop/common/lib/apacheds-i18n-2.0.0-M15.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/netty-3.6.2.Final.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/curator-framework-2.7.1.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/curator-client.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/servlet-api-2.5.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/stax-api-1.0-2.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/commons-net-3.1.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/activation-1.1.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/java-xmlbuilder-0.4.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/hadoop-annotations-2.7.2.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/jsch-0.1.42.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/httpclient-4.2.5.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/jackson-jaxrs-1.9.13.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/htrace-core-3.1.0-incubating.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/jetty-6.1.26.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/api-asn1-api-1.0.0-M20.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/curator-framework.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/jetty-util-6.1.26.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/mockito-all-1.8.5.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/jackson-core-asl-1.9.13.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/junit-4.11.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/hadoop-auth-2.7.2.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/commons-math3-3.1.1.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/commons-digester-1.8.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/slf4j-api-1.7.10.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/xmlenc-0.52.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/commons-collections-3.2.2.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/hadoop-auth.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/jsr305-3.0.0.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/paranamer-2.3.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/commons-io-2.4.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/commons-beanutils-1.7.0.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/xz-1.0.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/jackson-xc-1.9.13.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/hamcrest-core-1.3.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/commons-compress-1.4.1.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/asm-3.2.jar:
/opt/application/Hadoop/current/share/hadoop/common/lib/commons-codec-1.4.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/gson-2.2.4.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/log4j-1.2.17.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/jersey-json-1.9.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/jettison-1.1.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/curator-client-2.7.1.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/commons-cli-1.2.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/commons-httpclient-3.1.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/jersey-server-1.9.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/hadoop-annotations.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/snappy-java-1.0.4.1.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/commons-configuration-1.6.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/jersey-core-1.9.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/jackson-mapper-asl-1.9.13.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/jets3t-0.9.0.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/commons-lang-2.6.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/api-util-1.0.0-M20.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/guava-11.0.2.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/httpcore-4.2.5.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/jaxb-api-2.2.2.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/avro-1.7.4.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/curator-recipes-2.7.1.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/curator-recipes.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/zookeeper-3.4.6.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/jsp-api-2.1.jar:/opt/application/Hadoop/current/share/hadoop/common/lib/commons-logging-1.1.3.jar:/opt/application/Hadoop/current/share/hadoop/common/hadoop-nfs-2.7.2.jar:/opt/application/Hadoop/current/share/hadoop/common/hadoop-common.jar:/opt/application/Hadoop/current/share/hadoop/common/hadoop-common-2.7.2.jar:/opt/application/Hadoop/current/share/hadoop/common/hadoop-common-2.7.2-tests.jar:/opt/application/Hadoop/current/share/hadoop/hdfs:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/servlet-api-2.5.jar:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/htrace-core-3.1.0-incubating.jar:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/jetty-6.1.26.jar:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/jackson-core-asl-1.9.13.jar:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/netty-all-4.0.23.Final.jar:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/xmlenc-0.52.jar:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/jsr305-3.0.0.jar:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/commons-io-2.4.jar:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/asm-3.2.jar:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/commons-codec-1.4.jar:/opt/application/Hadoop/current/share/h
adoop/hdfs/lib/log4j-1.2.17.jar:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/xml-apis-1.3.04.jar:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/jersey-server-1.9.jar:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/leveldbjni-all-1.8.jar:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/jersey-core-1.9.jar:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/jackson-mapper-asl-1.9.13.jar:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/commons-lang-2.6.jar:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/guava-11.0.2.jar:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/xercesImpl-2.9.1.jar:/opt/application/Hadoop/current/share/hadoop/hdfs/lib/commons-logging-1.1.3.jar:/opt/application/Hadoop/current/share/hadoop/hdfs/hadoop-hdfs.jar:/opt/application/Hadoop/current/share/hadoop/hdfs/hadoop-hdfs-nfs-2.7.2.jar:/opt/application/Hadoop/current/share/hadoop/hdfs/hadoop-hdfs-2.7.2-tests.jar:/opt/application/Hadoop/current/share/hadoop/hdfs/hadoop-hdfs-2.7.2.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jaxb-impl-2.2.3-1.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/netty-3.6.2.Final.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/servlet-api-2.5.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/stax-api-1.0-2.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/activation-1.1.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jackson-jaxrs-1.9.13.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/javax.inject-1.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jetty-6.1.26.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jetty-util-6.1.26.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jackson-core-asl-1.9.13.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/aopalliance-1.0.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/commons-collections-3.2.2.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jsr305-3.0.0.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/guice-3.0.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/commons-io-2.4.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/xz-1.0.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/guice-servlet-3.0.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jackson-xc-1.9.13.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jersey-guice-1.9.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/commons-compress-1.4.1.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/asm-3.2.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/zookeeper-3.4.6-tests.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/commons-codec-1.4.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jersey-client-1.9.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/log4j-1.2.17.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jersey-json-1.9.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jettison-1.1.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/commons-cli-1.2.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jersey-server-1.9.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/leveldbjni-all-1.8.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jersey-core-1.9.jar:/o
pt/application/Hadoop/current/share/hadoop/yarn/lib/jackson-mapper-asl-1.9.13.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/commons-lang-2.6.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/guava-11.0.2.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jaxb-api-2.2.2.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/zookeeper-3.4.6.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/commons-logging-1.1.3.jar:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-client.jar:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-server-applicationhistoryservice.jar:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-server-resourcemanager.jar:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-registry-2.7.2.jar:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.7.2.jar:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-common-2.7.2.jar:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-server-common.jar:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-registry.jar:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.7.2.jar:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.2.jar:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.7.2.jar:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-server-nodemanager.jar:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-server-common-2.7.2.jar:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-server-web-proxy.jar:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-server-applicationhistoryservice-2.7.2.jar:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-api.jar:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-server-tests-2.7.2.jar:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.7.2.jar:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-server-sharedcachemanager-2.7.2.jar:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-common.jar:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-applications-distributedshell.jar:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-client-2.7.2.jar:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-api-2.7.2.jar:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/hadoop-annotations-2.7.2.jar:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/javax.inject-1.jar:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/jackson-core-asl-1.9.13.jar:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/junit-4.11.jar:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/aopalliance-1.0.jar:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/paranamer-2.3.jar:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/guice-3.0.jar:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/commons-io-2.4.jar:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/xz-1.0.jar:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/
jersey-guice-1.9.jar:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/hamcrest-core-1.3.jar:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/asm-3.2.jar:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/log4j-1.2.17.jar:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/jersey-server-1.9.jar:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/leveldbjni-all-1.8.jar:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/jersey-core-1.9.jar:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/opt/application/Hadoop/current/share/hadoop/mapreduce/lib/avro-1.7.4.jar:/opt/application/Hadoop/current/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.7.2-tests.jar:/opt/application/Hadoop/current/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.7.2.jar:/opt/application/Hadoop/current/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/opt/application/Hadoop/current/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.7.2.jar:/opt/application/Hadoop/current/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle.jar:/opt/application/Hadoop/current/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.7.2.jar:/opt/application/Hadoop/current/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.2.jar:/opt/application/Hadoop/current/share/hadoop/mapreduce/hadoop-mapreduce-client-common.jar:/opt/application/Hadoop/current/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.7.2.jar:/opt/application/Hadoop/current/share/hadoop/mapreduce/hadoop-mapreduce-client-core.jar:/opt/application/Hadoop/current/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.7.2.jar:/opt/application/Hadoop/current/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.7.2.jar:/opt/application/Hadoop/current/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient.jar:/opt/application/Hadoop/current/share/hadoop/mapreduce/hadoop-mapreduce-client-app.jar:/opt/application/Hadoop/current/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.2.jar:/opt/application/Hadoop/current/share/hadoop/mapreduce/hadoop-mapreduce-client-hs.jar::/opt/application/Tez/current/conf:/opt/application/Tez/current/tez-runtime-library-0.5.1.jar:/opt/application/Tez/current/tez-common-0.5.1.jar:/opt/application/Tez/current/tez-examples-0.5.1.jar:/opt/application/Tez/current/tez-dag-0.5.1.jar:/opt/application/Tez/current/tez-yarn-timeline-history-0.5.1.jar:/opt/application/Tez/current/tez-runtime-internals-0.5.1.jar:/opt/application/Tez/current/tez-tests-0.5.1.jar:/opt/application/Tez/current/tez-api-0.5.1.jar:/opt/application/Tez/current/tez-mapreduce-0.5.1.jar:/opt/application/Tez/current/lib/commons-collections-3.2.1.jar:/opt/application/Tez/current/lib/guava-11.0.2.jar:/opt/application/Tez/current/lib/commons-logging-1.1.3.jar:/opt/application/Tez/current/lib/commons-cli-1.2.jar:/opt/application/Tez/current/lib/commons-collections4-4.0.jar:/opt/application/Tez/current/lib/commons-io-2.4.jar:/opt/application/Tez/current/lib/hadoop-mapreduce-client-common-2.4.1.jar:/opt/application/Tez/current/lib/hadoop-mapreduce-client-core-2.4.1.jar:/opt/application/Tez/current/lib/commons-codec-1.4.jar:/opt/application/Tez/current/lib/commons-math3-3.1.1.jar:/opt/application/Tez/current/lib/jsr30
5-2.0.3.jar:/opt/application/Tez/current/lib/protobuf-java-2.5.0.jar:/opt/application/Tez/current/lib/jettison-1.3.4.jar:/opt/application/Tez/current/lib/commons-lang-2.6.jar:/opt/application/Tez/current/lib/hadoop-annotations-2.4.1.jar:/opt/application/Tez/current/lib/log4j-1.2.17.jar:/opt/application/Hadoop/current/contrib/capacity-scheduler/*.jar:/opt/application/Hadoop/current/contrib/capacity-scheduler/*.jar:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-client.jar:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-server-applicationhistoryservice.jar:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-server-resourcemanager.jar:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-registry-2.7.2.jar:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.7.2.jar:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-common-2.7.2.jar:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-server-common.jar:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-registry.jar:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.7.2.jar:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.2.jar:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.7.2.jar:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-server-nodemanager.jar:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-server-common-2.7.2.jar:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-server-web-proxy.jar:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-server-applicationhistoryservice-2.7.2.jar:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-api.jar:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-server-tests-2.7.2.jar:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.7.2.jar:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-server-sharedcachemanager-2.7.2.jar:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-common.jar:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-applications-distributedshell.jar:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-client-2.7.2.jar:/opt/application/Hadoop/current/share/hadoop/yarn/hadoop-yarn-api-2.7.2.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jaxb-impl-2.2.3-1.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/netty-3.6.2.Final.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/servlet-api-2.5.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/stax-api-1.0-2.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/activation-1.1.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jackson-jaxrs-1.9.13.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/javax.inject-1.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jetty-6.1.26.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jetty-util-6.1.26.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jackson-core-asl-1.9.13.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/aopalliance-1.0.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/commons-collections-3.2.2.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jsr305-3.0.0.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/guice
-3.0.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/commons-io-2.4.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/xz-1.0.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/guice-servlet-3.0.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jackson-xc-1.9.13.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jersey-guice-1.9.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/commons-compress-1.4.1.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/asm-3.2.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/zookeeper-3.4.6-tests.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/commons-codec-1.4.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jersey-client-1.9.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/log4j-1.2.17.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jersey-json-1.9.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jettison-1.1.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/commons-cli-1.2.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jersey-server-1.9.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/leveldbjni-all-1.8.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jersey-core-1.9.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jackson-mapper-asl-1.9.13.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/commons-lang-2.6.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/guava-11.0.2.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/jaxb-api-2.2.2.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/zookeeper-3.4.6.jar:/opt/application/Hadoop/current/share/hadoop/yarn/lib/commons-logging-1.1.3.jar:/opt/application/Hadoop/current/etc/hadoop//nm-config/log4j.properties, -Djava.library.path=/opt/application/Hadoop/hadoop-2.7.2/lib/native, org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer, shfs3453, application_1475850791417_0147, container_1475850791417_0147_02_000001, datanode05.bigdata.fr, 8040, /mnt/hd8/hadoop/yarn/local, /mnt/hd3/hadoop/yarn/local, /mnt/hd2/hadoop/yarn/local, /mnt/hd1/hadoop/yarn/local, /mnt/hd9/hadoop/yarn/local, /mnt/hd10/hadoop/yarn/local, /mnt/hd4/hadoop/yarn/local, /mnt/hd5/hadoop/yarn/local, /mnt/hd6/hadoop/yarn/local, /mnt/hd0/hadoop/yarn/local, /mnt/hd11/hadoop/yarn/local, /mnt/hd7/hadoop/yarn/local] 2016-10-26 16:27:39,617 DEBUG org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Node's health-status : true, 2016-10-26 16:27:39,617 DEBUG org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Sending out 1 container statuses: [ContainerStatus: [ContainerId: container_1475850791417_0147_02_000001, State: RUNNING, Diagnostics: , ExitStatus: -1000, ]] 2016-10-26 16:27:39,618 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to resourcemanager02.bigdata.fr/192.168.200.219:8031 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP sending #438 2016-10-26 16:27:39,620 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to resourcemanager02.bigdata.fr/192.168.200.219:8031 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP got value #438 2016-10-26 16:27:39,620 DEBUG org.apache.hadoop.ipc.ProtobufRpcEngine: Call: nodeHeartbeat took 2ms 2016-10-26 16:27:39,856 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.monitor.ContainersMonitorImpl: Current ProcessTree list : [ ] 2016-10-26 16:27:40,621 DEBUG 
org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Node's health-status : true, 2016-10-26 16:27:40,621 DEBUG org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Sending out 1 container statuses: [ContainerStatus: [ContainerId: container_1475850791417_0147_02_000001, State: RUNNING, Diagnostics: , ExitStatus: -1000, ]] 2016-10-26 16:27:40,622 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to resourcemanager02.bigdata.fr/192.168.200.219:8031 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP sending #439 2016-10-26 16:27:40,623 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to resourcemanager02.bigdata.fr/192.168.200.219:8031 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP got value #439 2016-10-26 16:27:40,623 DEBUG org.apache.hadoop.ipc.ProtobufRpcEngine: Call: nodeHeartbeat took 1ms 2016-10-26 16:27:41,624 DEBUG org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Node's health-status : true, 2016-10-26 16:27:41,624 DEBUG org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Sending out 1 container statuses: [ContainerStatus: [ContainerId: container_1475850791417_0147_02_000001, State: RUNNING, Diagnostics: , ExitStatus: -1000, ]] 2016-10-26 16:27:41,625 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to resourcemanager02.bigdata.fr/192.168.200.219:8031 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP sending #440 2016-10-26 16:27:41,626 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to resourcemanager02.bigdata.fr/192.168.200.219:8031 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP got value #440 2016-10-26 16:27:41,626 DEBUG org.apache.hadoop.ipc.ProtobufRpcEngine: Call: nodeHeartbeat took 1ms 2016-10-26 16:27:41,676 DEBUG org.apache.hadoop.ipc.Server: IPC Server idle connection scanner for port 47855: task running 2016-10-26 16:27:41,766 DEBUG org.apache.hadoop.ipc.Server: IPC Server idle connection scanner for port 8040: task running 2016-10-26 16:27:42,626 DEBUG org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Node's health-status : true, 2016-10-26 16:27:42,627 DEBUG org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Sending out 1 container statuses: [ContainerStatus: [ContainerId: container_1475850791417_0147_02_000001, State: RUNNING, Diagnostics: , ExitStatus: -1000, ]] 2016-10-26 16:27:42,628 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to resourcemanager02.bigdata.fr/192.168.200.219:8031 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP sending #441 2016-10-26 16:27:42,629 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to resourcemanager02.bigdata.fr/192.168.200.219:8031 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP got value #441 2016-10-26 16:27:42,629 DEBUG org.apache.hadoop.ipc.ProtobufRpcEngine: Call: nodeHeartbeat took 2ms 2016-10-26 16:27:42,856 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.monitor.ContainersMonitorImpl: Current ProcessTree list : [ ] 2016-10-26 16:27:43,490 DEBUG org.apache.hadoop.ipc.Server: Server connection from 192.168.200.29:47238; # active connections: 1; # queued calls: 0 2016-10-26 16:27:43,504 DEBUG org.apache.hadoop.ipc.Server: got #-33 2016-10-26 16:27:43,504 DEBUG org.apache.hadoop.security.SaslRpcServer: Created SASL server with mechanism = DIGEST-MD5 2016-10-26 16:27:43,505 DEBUG org.apache.hadoop.ipc.Server: Sending sasl message state: NEGOTIATE auths { method: "TOKEN" mechanism: "DIGEST-MD5" protocol: "" serverId: "default" 
challenge: "realm=\"default\",nonce=\"y4oqjtRH8KVyarAE6UO3r3HRlyGdgCidPHEsjCHV\",qop=\"auth\",charset=utf-8,algorithm=md5-sess" } auths { method: "KERBEROS" mechanism: "GSSAPI" protocol: "nm" serverId: "datanode05.bigdata.fr" } 2016-10-26 16:27:43,505 DEBUG org.apache.hadoop.ipc.Server: Socket Reader #1 for port 8040: responding to null from 192.168.200.29:47238 Call#-33 Retry#-1 2016-10-26 16:27:43,506 DEBUG org.apache.hadoop.ipc.Server: Socket Reader #1 for port 8040: responding to null from 192.168.200.29:47238 Call#-33 Retry#-1 Wrote 226 bytes. 2016-10-26 16:27:43,528 DEBUG org.apache.hadoop.ipc.Server: got #-33 2016-10-26 16:27:43,528 DEBUG org.apache.hadoop.ipc.Server: Have read input token of size 246 for processing by saslServer.evaluateResponse() 2016-10-26 16:27:43,529 DEBUG org.apache.hadoop.security.SaslRpcServer: SASL server DIGEST-MD5 callback: setting password for client: testing (auth:SIMPLE) 2016-10-26 16:27:43,530 DEBUG org.apache.hadoop.security.SaslRpcServer: SASL server DIGEST-MD5 callback: setting canonicalized client ID: testing 2016-10-26 16:27:43,530 DEBUG org.apache.hadoop.ipc.Server: Will send SUCCESS token of size 40 from saslServer. 2016-10-26 16:27:43,530 DEBUG org.apache.hadoop.ipc.Server: SASL server context established. Negotiated QoP is auth 2016-10-26 16:27:43,531 DEBUG org.apache.hadoop.ipc.Server: SASL server successfully authenticated client: testing (auth:SIMPLE) 2016-10-26 16:27:43,531 INFO SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for testing (auth:SIMPLE) 2016-10-26 16:27:43,532 DEBUG org.apache.hadoop.ipc.Server: Sending sasl message state: SUCCESS token: "rspauth=4d7874e0ea6405f7f12c663bd64f4ca0" 2016-10-26 16:27:43,532 DEBUG org.apache.hadoop.ipc.Server: Socket Reader #1 for port 8040: responding to null from 192.168.200.29:47238 Call#-33 Retry#-1 2016-10-26 16:27:43,532 DEBUG org.apache.hadoop.ipc.Server: Socket Reader #1 for port 8040: responding to null from 192.168.200.29:47238 Call#-33 Retry#-1 Wrote 64 bytes. 
2016-10-26 16:27:43,629 DEBUG org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Node's health-status : true, 2016-10-26 16:27:43,630 DEBUG org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Sending out 1 container statuses: [ContainerStatus: [ContainerId: container_1475850791417_0147_02_000001, State: RUNNING, Diagnostics: , ExitStatus: -1000, ]] 2016-10-26 16:27:43,631 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to resourcemanager02.bigdata.fr/192.168.200.219:8031 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP sending #442 2016-10-26 16:27:43,631 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to resourcemanager02.bigdata.fr/192.168.200.219:8031 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP got value #442 2016-10-26 16:27:43,632 DEBUG org.apache.hadoop.ipc.ProtobufRpcEngine: Call: nodeHeartbeat took 2ms 2016-10-26 16:27:43,817 DEBUG org.apache.hadoop.ipc.Server: got #-3 2016-10-26 16:27:43,818 INFO SecurityLogger.org.apache.hadoop.security.authorize.ServiceAuthorizationManager: Authorization successful for testing (auth:TOKEN) for protocol=interface org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB 2016-10-26 16:27:43,819 DEBUG org.apache.hadoop.ipc.Server: Successfully authorized userInfo { } protocol: "org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB" 2016-10-26 16:27:43,819 DEBUG org.apache.hadoop.ipc.Server: got #0 2016-10-26 16:27:43,819 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 0 on 8040: org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#0 Retry#0 for RpcKind RPC_PROTOCOL_BUFFER 2016-10-26 16:27:43,820 DEBUG org.apache.hadoop.security.UserGroupInformation: PrivilegedAction as:testing (auth:TOKEN) from:org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043) 2016-10-26 16:27:43,918 DEBUG org.apache.hadoop.ipc.Server: Served: heartbeat queueTime= 17 procesingTime= 82 2016-10-26 16:27:43,919 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 0 on 8040: responding to org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#0 Retry#0 2016-10-26 16:27:43,920 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 0 on 8040: responding to org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#0 Retry#0 Wrote 253 bytes. 
2016-10-26 16:27:44,632 DEBUG org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Node's health-status : true, 2016-10-26 16:27:44,633 DEBUG org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Sending out 1 container statuses: [ContainerStatus: [ContainerId: container_1475850791417_0147_02_000001, State: RUNNING, Diagnostics: , ExitStatus: -1000, ]] 2016-10-26 16:27:44,634 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to resourcemanager02.bigdata.fr/192.168.200.219:8031 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP sending #443 2016-10-26 16:27:44,634 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to resourcemanager02.bigdata.fr/192.168.200.219:8031 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP got value #443 2016-10-26 16:27:44,635 DEBUG org.apache.hadoop.ipc.ProtobufRpcEngine: Call: nodeHeartbeat took 2ms 2016-10-26 16:27:45,383 DEBUG org.apache.hadoop.ipc.Server: got #2 2016-10-26 16:27:45,384 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 4 on 8040: org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#2 Retry#0 for RpcKind RPC_PROTOCOL_BUFFER 2016-10-26 16:27:45,384 DEBUG org.apache.hadoop.security.UserGroupInformation: PrivilegedAction as:testing (auth:TOKEN) from:org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043) 2016-10-26 16:27:45,403 DEBUG org.apache.hadoop.ipc.Server: Served: heartbeat queueTime= 13 procesingTime= 6 2016-10-26 16:27:45,403 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 4 on 8040: responding to org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#2 Retry#0 2016-10-26 16:27:45,404 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 4 on 8040: responding to org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#2 Retry#0 Wrote 248 bytes. 
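Annotation: each RPC that is served leaves a "Served: <method> queueTime= X procesingTime= Y" entry ("procesingTime" is spelled exactly as it appears in the log). To spot calls that queued or ran unusually long, a minimal sketch, assuming Python 3 and the dump saved as nodemanager.log (both assumptions):

import re

# Matches the per-call timing entries; "procesingTime" mirrors the log output.
SERVED_RE = re.compile(
    r"Served: (?P<method>\w+) queueTime= (?P<queue>\d+) procesingTime= (?P<proc>\d+)"
)

# "nodemanager.log" is an assumed file name for this dump.
with open("nodemanager.log", encoding="utf-8") as fh:
    samples = [(m.group("method"), int(m.group("queue")), int(m.group("proc")))
               for m in SERVED_RE.finditer(fh.read())]

# Flag anything that spent more than, say, 50 ms queued or being processed.
for method, queue_ms, proc_ms in samples:
    if queue_ms > 50 or proc_ms > 50:
        print(f"slow {method}: queued {queue_ms} ms, processed in {proc_ms} ms")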
2016-10-26 16:27:45,598 DEBUG org.apache.hadoop.ipc.Server: got #7 2016-10-26 16:27:45,598 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 2 on 8040: org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#7 Retry#0 for RpcKind RPC_PROTOCOL_BUFFER 2016-10-26 16:27:45,599 DEBUG org.apache.hadoop.security.UserGroupInformation: PrivilegedAction as:testing (auth:TOKEN) from:org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043) 2016-10-26 16:27:45,602 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Processing hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/hive-hbase-handler-2.1.0.jar of type LOCALIZED 2016-10-26 16:27:45,605 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Resource hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/hive-hbase-handler-2.1.0.jar(->/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/filecache/10/hive-hbase-handler-2.1.0.jar) transitioned from DOWNLOADING to LOCALIZED 2016-10-26 16:27:45,605 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.container.ContainerResourceLocalizedEvent.EventType: RESOURCE_LOCALIZED 2016-10-26 16:27:45,605 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.container.ContainerImpl: Processing container_1475850791417_0147_02_000001 of type RESOURCE_LOCALIZED 2016-10-26 16:27:45,612 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Processing hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/metrics-core-2.2.0.jar of type LOCALIZED 2016-10-26 16:27:45,612 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Resource hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/metrics-core-2.2.0.jar(->/mnt/hd11/hadoop/yarn/local/usercache/shfs3453/filecache/11/metrics-core-2.2.0.jar) transitioned from DOWNLOADING to LOCALIZED 2016-10-26 16:27:45,613 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.container.ContainerResourceLocalizedEvent.EventType: RESOURCE_LOCALIZED 2016-10-26 16:27:45,613 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.container.ContainerImpl: Processing container_1475850791417_0147_02_000001 of type RESOURCE_LOCALIZED 2016-10-26 16:27:45,614 DEBUG org.apache.hadoop.ipc.Server: Served: heartbeat queueTime= 1 procesingTime= 15 2016-10-26 16:27:45,615 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 2 on 8040: responding to org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#7 Retry#0 2016-10-26 16:27:45,615 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 2 on 8040: responding to org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#7 Retry#0 Wrote 244 bytes. 
2016-10-26 16:27:45,617 DEBUG org.apache.hadoop.ipc.Server: got #8 2016-10-26 16:27:45,617 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 3 on 8040: org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#8 Retry#0 for RpcKind RPC_PROTOCOL_BUFFER 2016-10-26 16:27:45,617 DEBUG org.apache.hadoop.security.UserGroupInformation: PrivilegedAction as:testing (auth:TOKEN) from:org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043) 2016-10-26 16:27:45,619 DEBUG org.apache.hadoop.ipc.Server: Served: heartbeat queueTime= 0 procesingTime= 2 2016-10-26 16:27:45,619 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 3 on 8040: responding to org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#8 Retry#0 2016-10-26 16:27:45,619 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 3 on 8040: responding to org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#8 Retry#0 Wrote 254 bytes. 2016-10-26 16:27:45,635 DEBUG org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Node's health-status : true, 2016-10-26 16:27:45,636 DEBUG org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Sending out 1 container statuses: [ContainerStatus: [ContainerId: container_1475850791417_0147_02_000001, State: RUNNING, Diagnostics: , ExitStatus: -1000, ]] 2016-10-26 16:27:45,636 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to resourcemanager02.bigdata.fr/192.168.200.219:8031 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP sending #444 2016-10-26 16:27:45,637 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to resourcemanager02.bigdata.fr/192.168.200.219:8031 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP got value #444 2016-10-26 16:27:45,637 DEBUG org.apache.hadoop.ipc.ProtobufRpcEngine: Call: nodeHeartbeat took 1ms 2016-10-26 16:27:45,856 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.monitor.ContainersMonitorImpl: Current ProcessTree list : [ ] 2016-10-26 16:27:45,958 DEBUG org.apache.hadoop.ipc.Server: got #11 2016-10-26 16:27:45,958 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 4 on 8040: org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#11 Retry#0 for RpcKind RPC_PROTOCOL_BUFFER 2016-10-26 16:27:45,959 DEBUG org.apache.hadoop.security.UserGroupInformation: PrivilegedAction as:testing (auth:TOKEN) from:org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043) 2016-10-26 16:27:45,959 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Processing hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/hive-exec-2.1.0.jar of type LOCALIZED 2016-10-26 16:27:45,960 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Resource hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/hive-exec-2.1.0.jar(->/mnt/hd7/hadoop/yarn/local/usercache/shfs3453/filecache/12/hive-exec-2.1.0.jar) transitioned from DOWNLOADING to LOCALIZED 2016-10-26 16:27:45,960 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.container.ContainerResourceLocalizedEvent.EventType: RESOURCE_LOCALIZED 2016-10-26 16:27:45,960 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.container.ContainerImpl: 
Processing container_1475850791417_0147_02_000001 of type RESOURCE_LOCALIZED 2016-10-26 16:27:45,962 DEBUG org.apache.hadoop.ipc.Server: Served: heartbeat queueTime= 1 procesingTime= 3 2016-10-26 16:27:45,962 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 4 on 8040: responding to org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#11 Retry#0 2016-10-26 16:27:45,962 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 4 on 8040: responding to org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#11 Retry#0 Wrote 247 bytes. 2016-10-26 16:27:45,988 DEBUG org.apache.hadoop.ipc.Server: got #14 2016-10-26 16:27:45,988 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 2 on 8040: org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#14 Retry#0 for RpcKind RPC_PROTOCOL_BUFFER 2016-10-26 16:27:45,988 DEBUG org.apache.hadoop.security.UserGroupInformation: PrivilegedAction as:testing (auth:TOKEN) from:org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043) 2016-10-26 16:27:45,989 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Processing hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/hbase-hadoop-compat-1.1.2.jar of type LOCALIZED 2016-10-26 16:27:45,989 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Resource hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/hbase-hadoop-compat-1.1.2.jar(->/mnt/hd8/hadoop/yarn/local/usercache/shfs3453/filecache/13/hbase-hadoop-compat-1.1.2.jar) transitioned from DOWNLOADING to LOCALIZED 2016-10-26 16:27:45,989 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.container.ContainerResourceLocalizedEvent.EventType: RESOURCE_LOCALIZED 2016-10-26 16:27:45,989 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.container.ContainerImpl: Processing container_1475850791417_0147_02_000001 of type RESOURCE_LOCALIZED 2016-10-26 16:27:45,991 DEBUG org.apache.hadoop.ipc.Server: Served: heartbeat queueTime= 1 procesingTime= 2 2016-10-26 16:27:45,991 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 2 on 8040: responding to org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#14 Retry#0 2016-10-26 16:27:45,992 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 2 on 8040: responding to org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#14 Retry#0 Wrote 251 bytes. 
2016-10-26 16:27:46,022 DEBUG org.apache.hadoop.ipc.Server: got #17 2016-10-26 16:27:46,023 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 3 on 8040: org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#17 Retry#0 for RpcKind RPC_PROTOCOL_BUFFER 2016-10-26 16:27:46,023 DEBUG org.apache.hadoop.security.UserGroupInformation: PrivilegedAction as:testing (auth:TOKEN) from:org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043) 2016-10-26 16:27:46,024 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Processing hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/hbase-common-1.1.2.jar of type LOCALIZED 2016-10-26 16:27:46,024 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Resource hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/hbase-common-1.1.2.jar(->/mnt/hd3/hadoop/yarn/local/usercache/shfs3453/filecache/14/hbase-common-1.1.2.jar) transitioned from DOWNLOADING to LOCALIZED 2016-10-26 16:27:46,024 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.container.ContainerResourceLocalizedEvent.EventType: RESOURCE_LOCALIZED 2016-10-26 16:27:46,024 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.container.ContainerImpl: Processing container_1475850791417_0147_02_000001 of type RESOURCE_LOCALIZED 2016-10-26 16:27:46,025 DEBUG org.apache.hadoop.ipc.Server: Served: heartbeat queueTime= 0 procesingTime= 2 2016-10-26 16:27:46,025 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 3 on 8040: responding to org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#17 Retry#0 2016-10-26 16:27:46,025 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 3 on 8040: responding to org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#17 Retry#0 Wrote 247 bytes. 
2016-10-26 16:27:46,063 DEBUG org.apache.hadoop.ipc.Server: got #20 2016-10-26 16:27:46,064 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 1 on 8040: org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#20 Retry#0 for RpcKind RPC_PROTOCOL_BUFFER 2016-10-26 16:27:46,064 DEBUG org.apache.hadoop.security.UserGroupInformation: PrivilegedAction as:testing (auth:TOKEN) from:org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043) 2016-10-26 16:27:46,065 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Processing hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/netty-all-4.0.23.Final.jar of type LOCALIZED 2016-10-26 16:27:46,065 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Resource hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/netty-all-4.0.23.Final.jar(->/mnt/hd2/hadoop/yarn/local/usercache/shfs3453/filecache/15/netty-all-4.0.23.Final.jar) transitioned from DOWNLOADING to LOCALIZED 2016-10-26 16:27:46,065 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.container.ContainerResourceLocalizedEvent.EventType: RESOURCE_LOCALIZED 2016-10-26 16:27:46,065 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.container.ContainerImpl: Processing container_1475850791417_0147_02_000001 of type RESOURCE_LOCALIZED 2016-10-26 16:27:46,067 DEBUG org.apache.hadoop.ipc.Server: Served: heartbeat queueTime= 1 procesingTime= 3 2016-10-26 16:27:46,067 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 1 on 8040: responding to org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#20 Retry#0 2016-10-26 16:27:46,067 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 1 on 8040: responding to org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#20 Retry#0 Wrote 256 bytes. 
2016-10-26 16:27:46,099 DEBUG org.apache.hadoop.ipc.Server: got #23 2016-10-26 16:27:46,100 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 0 on 8040: org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#23 Retry#0 for RpcKind RPC_PROTOCOL_BUFFER 2016-10-26 16:27:46,100 DEBUG org.apache.hadoop.security.UserGroupInformation: PrivilegedAction as:testing (auth:TOKEN) from:org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043) 2016-10-26 16:27:46,101 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Processing hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/hbase-client-1.1.2.jar of type LOCALIZED 2016-10-26 16:27:46,101 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Resource hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/hbase-client-1.1.2.jar(->/mnt/hd1/hadoop/yarn/local/usercache/shfs3453/filecache/16/hbase-client-1.1.2.jar) transitioned from DOWNLOADING to LOCALIZED 2016-10-26 16:27:46,101 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.container.ContainerResourceLocalizedEvent.EventType: RESOURCE_LOCALIZED 2016-10-26 16:27:46,101 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.container.ContainerImpl: Processing container_1475850791417_0147_02_000001 of type RESOURCE_LOCALIZED 2016-10-26 16:27:46,103 DEBUG org.apache.hadoop.ipc.Server: Served: heartbeat queueTime= 1 procesingTime= 2 2016-10-26 16:27:46,103 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 0 on 8040: responding to org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#23 Retry#0 2016-10-26 16:27:46,103 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 0 on 8040: responding to org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#23 Retry#0 Wrote 241 bytes. 
2016-10-26 16:27:46,250 DEBUG org.apache.hadoop.ipc.Server: got #27 2016-10-26 16:27:46,250 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 4 on 8040: org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#27 Retry#0 for RpcKind RPC_PROTOCOL_BUFFER 2016-10-26 16:27:46,251 DEBUG org.apache.hadoop.security.UserGroupInformation: PrivilegedAction as:testing (auth:TOKEN) from:org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043) 2016-10-26 16:27:46,252 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Processing hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/hive-jdbc-2.1.0-standalone.jar of type LOCALIZED 2016-10-26 16:27:46,252 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Resource hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/hive-jdbc-2.1.0-standalone.jar(->/mnt/hd9/hadoop/yarn/local/usercache/shfs3453/filecache/17/hive-jdbc-2.1.0-standalone.jar) transitioned from DOWNLOADING to LOCALIZED 2016-10-26 16:27:46,252 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.container.ContainerResourceLocalizedEvent.EventType: RESOURCE_LOCALIZED 2016-10-26 16:27:46,252 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.container.ContainerImpl: Processing container_1475850791417_0147_02_000001 of type RESOURCE_LOCALIZED 2016-10-26 16:27:46,253 DEBUG org.apache.hadoop.ipc.Server: Served: heartbeat queueTime= 1 procesingTime= 2 2016-10-26 16:27:46,254 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 4 on 8040: responding to org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#27 Retry#0 2016-10-26 16:27:46,254 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 4 on 8040: responding to org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#27 Retry#0 Wrote 249 bytes. 
2016-10-26 16:27:46,293 DEBUG org.apache.hadoop.ipc.Server: got #29 2016-10-26 16:27:46,293 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 2 on 8040: org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#29 Retry#0 for RpcKind RPC_PROTOCOL_BUFFER 2016-10-26 16:27:46,294 DEBUG org.apache.hadoop.security.UserGroupInformation: PrivilegedAction as:testing (auth:TOKEN) from:org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043) 2016-10-26 16:27:46,295 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Processing hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/guava-14.0.1.jar of type LOCALIZED 2016-10-26 16:27:46,295 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Resource hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/guava-14.0.1.jar(->/mnt/hd10/hadoop/yarn/local/usercache/shfs3453/filecache/18/guava-14.0.1.jar) transitioned from DOWNLOADING to LOCALIZED 2016-10-26 16:27:46,295 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.container.ContainerResourceLocalizedEvent.EventType: RESOURCE_LOCALIZED 2016-10-26 16:27:46,295 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.container.ContainerImpl: Processing container_1475850791417_0147_02_000001 of type RESOURCE_LOCALIZED 2016-10-26 16:27:46,297 DEBUG org.apache.hadoop.ipc.Server: Served: heartbeat queueTime= 1 procesingTime= 3 2016-10-26 16:27:46,297 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 2 on 8040: responding to org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#29 Retry#0 2016-10-26 16:27:46,297 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 2 on 8040: responding to org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#29 Retry#0 Wrote 262 bytes. 
2016-10-26 16:27:46,351 DEBUG org.apache.hadoop.ipc.Server: got #32 2016-10-26 16:27:46,351 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 3 on 8040: org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#32 Retry#0 for RpcKind RPC_PROTOCOL_BUFFER 2016-10-26 16:27:46,351 DEBUG org.apache.hadoop.security.UserGroupInformation: PrivilegedAction as:testing (auth:TOKEN) from:org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043) 2016-10-26 16:27:46,352 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Processing hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/hbase-protocol-1.1.2.jar of type LOCALIZED 2016-10-26 16:27:46,352 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Resource hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/hbase-protocol-1.1.2.jar(->/mnt/hd4/hadoop/yarn/local/usercache/shfs3453/filecache/19/hbase-protocol-1.1.2.jar) transitioned from DOWNLOADING to LOCALIZED 2016-10-26 16:27:46,352 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.container.ContainerResourceLocalizedEvent.EventType: RESOURCE_LOCALIZED 2016-10-26 16:27:46,352 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.container.ContainerImpl: Processing container_1475850791417_0147_02_000001 of type RESOURCE_LOCALIZED 2016-10-26 16:27:46,353 DEBUG org.apache.hadoop.ipc.Server: Served: heartbeat queueTime= 0 procesingTime= 2 2016-10-26 16:27:46,353 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 3 on 8040: responding to org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#32 Retry#0 2016-10-26 16:27:46,354 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 3 on 8040: responding to org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#32 Retry#0 Wrote 247 bytes. 2016-10-26 16:27:46,638 DEBUG org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Node's health-status : true, 2016-10-26 16:27:46,638 DEBUG org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Sending out 1 container statuses: [ContainerStatus: [ContainerId: container_1475850791417_0147_02_000001, State: RUNNING, Diagnostics: , ExitStatus: -1000, ]] 2016-10-26 16:27:46,639 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to resourcemanager02.bigdata.fr/192.168.200.219:8031 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP sending #445 2016-10-26 16:27:46,640 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to resourcemanager02.bigdata.fr/192.168.200.219:8031 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP got value #445 2016-10-26 16:27:46,640 DEBUG org.apache.hadoop.ipc.ProtobufRpcEngine: Call: nodeHeartbeat took 2ms 2016-10-26 16:27:46,709 DEBUG org.apache.hadoop.ipc.Server: Socket Reader #1 for port 47855: disconnecting client 192.168.200.219:38779. 
Number of active connections: 0 2016-10-26 16:27:47,078 DEBUG org.apache.hadoop.ipc.Server: got #35 2016-10-26 16:27:47,079 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 0 on 8040: org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#35 Retry#0 for RpcKind RPC_PROTOCOL_BUFFER 2016-10-26 16:27:47,079 DEBUG org.apache.hadoop.security.UserGroupInformation: PrivilegedAction as:testing (auth:TOKEN) from:org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043) 2016-10-26 16:27:47,080 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Processing hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/spark-assembly-1.4.1-hadoop2.7.1.jar of type LOCALIZED 2016-10-26 16:27:47,080 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Resource hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/spark-assembly-1.4.1-hadoop2.7.1.jar(->/mnt/hd5/hadoop/yarn/local/usercache/shfs3453/filecache/20/spark-assembly-1.4.1-hadoop2.7.1.jar) transitioned from DOWNLOADING to LOCALIZED 2016-10-26 16:27:47,080 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.container.ContainerResourceLocalizedEvent.EventType: RESOURCE_LOCALIZED 2016-10-26 16:27:47,080 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.container.ContainerImpl: Processing container_1475850791417_0147_02_000001 of type RESOURCE_LOCALIZED 2016-10-26 16:27:47,082 DEBUG org.apache.hadoop.ipc.Server: Served: heartbeat queueTime= 1 procesingTime= 3 2016-10-26 16:27:47,082 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 0 on 8040: responding to org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#35 Retry#0 2016-10-26 16:27:47,083 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 0 on 8040: responding to org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#35 Retry#0 Wrote 276 bytes. 
2016-10-26 16:27:47,107 DEBUG org.apache.hadoop.ipc.Server: got #38 2016-10-26 16:27:47,107 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 4 on 8040: org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#38 Retry#0 for RpcKind RPC_PROTOCOL_BUFFER 2016-10-26 16:27:47,107 DEBUG org.apache.hadoop.security.UserGroupInformation: PrivilegedAction as:testing (auth:TOKEN) from:org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043) 2016-10-26 16:27:47,108 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Processing hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/hbase-server-1.1.2.jar of type LOCALIZED 2016-10-26 16:27:47,108 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Resource hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/hbase-server-1.1.2.jar(->/mnt/hd6/hadoop/yarn/local/usercache/shfs3453/filecache/21/hbase-server-1.1.2.jar) transitioned from DOWNLOADING to LOCALIZED 2016-10-26 16:27:47,108 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.container.ContainerResourceLocalizedEvent.EventType: RESOURCE_LOCALIZED 2016-10-26 16:27:47,108 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.container.ContainerImpl: Processing container_1475850791417_0147_02_000001 of type RESOURCE_LOCALIZED 2016-10-26 16:27:47,109 DEBUG org.apache.hadoop.ipc.Server: Served: heartbeat queueTime= 0 procesingTime= 2 2016-10-26 16:27:47,110 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 4 on 8040: responding to org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#38 Retry#0 2016-10-26 16:27:47,110 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 4 on 8040: responding to org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#38 Retry#0 Wrote 266 bytes. 
2016-10-26 16:27:47,640 DEBUG org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Node's health-status : true, 2016-10-26 16:27:47,641 DEBUG org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Sending out 1 container statuses: [ContainerStatus: [ContainerId: container_1475850791417_0147_02_000001, State: RUNNING, Diagnostics: , ExitStatus: -1000, ]] 2016-10-26 16:27:47,642 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to resourcemanager02.bigdata.fr/192.168.200.219:8031 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP sending #446 2016-10-26 16:27:47,642 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to resourcemanager02.bigdata.fr/192.168.200.219:8031 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP got value #446 2016-10-26 16:27:47,643 DEBUG org.apache.hadoop.ipc.ProtobufRpcEngine: Call: nodeHeartbeat took 2ms 2016-10-26 16:27:47,747 DEBUG org.apache.hadoop.ipc.Server: got #41 2016-10-26 16:27:47,748 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 1 on 8040: org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#41 Retry#0 for RpcKind RPC_PROTOCOL_BUFFER 2016-10-26 16:27:47,749 DEBUG org.apache.hadoop.security.UserGroupInformation: PrivilegedAction as:testing (auth:TOKEN) from:org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043) 2016-10-26 16:27:47,750 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Processing hdfs://sandbox:8020/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/job.jar of type LOCALIZED 2016-10-26 16:27:47,750 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Resource hdfs://sandbox:8020/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/job.jar(->/mnt/hd0/hadoop/yarn/local/usercache/shfs3453/appcache/application_1475850791417_0147/filecache/10/job.jar) transitioned from DOWNLOADING to LOCALIZED 2016-10-26 16:27:47,750 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.container.ContainerResourceLocalizedEvent.EventType: RESOURCE_LOCALIZED 2016-10-26 16:27:47,751 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.container.ContainerImpl: Processing container_1475850791417_0147_02_000001 of type RESOURCE_LOCALIZED 2016-10-26 16:27:47,752 DEBUG org.apache.hadoop.ipc.Server: Served: heartbeat queueTime= 1 procesingTime= 3 2016-10-26 16:27:47,752 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 1 on 8040: responding to org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#41 Retry#0 2016-10-26 16:27:47,753 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 1 on 8040: responding to org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#41 Retry#0 Wrote 257 bytes. 
2016-10-26 16:27:47,778 DEBUG org.apache.hadoop.ipc.Server: got #44 2016-10-26 16:27:47,779 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 0 on 8040: org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#44 Retry#0 for RpcKind RPC_PROTOCOL_BUFFER 2016-10-26 16:27:47,779 DEBUG org.apache.hadoop.security.UserGroupInformation: PrivilegedAction as:testing (auth:TOKEN) from:org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043) 2016-10-26 16:27:47,780 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Processing hdfs://sandbox:8020/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/job.splitmetainfo of type LOCALIZED 2016-10-26 16:27:47,780 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Resource hdfs://sandbox:8020/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/job.splitmetainfo(->/mnt/hd11/hadoop/yarn/local/usercache/shfs3453/appcache/application_1475850791417_0147/filecache/11/job.splitmetainfo) transitioned from DOWNLOADING to LOCALIZED 2016-10-26 16:27:47,781 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.container.ContainerResourceLocalizedEvent.EventType: RESOURCE_LOCALIZED 2016-10-26 16:27:47,781 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.container.ContainerImpl: Processing container_1475850791417_0147_02_000001 of type RESOURCE_LOCALIZED 2016-10-26 16:27:47,782 DEBUG org.apache.hadoop.ipc.Server: Served: heartbeat queueTime= 1 procesingTime= 3 2016-10-26 16:27:47,783 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 0 on 8040: responding to org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#44 Retry#0 2016-10-26 16:27:47,783 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 0 on 8040: responding to org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#44 Retry#0 Wrote 255 bytes. 
2016-10-26 16:27:47,946 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to namenode01.bigdata.fr/192.168.200.23:8020 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP: closed 2016-10-26 16:27:47,946 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to namenode01.bigdata.fr/192.168.200.23:8020 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP: stopped, remaining connections 2 2016-10-26 16:27:48,032 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to namenode01.bigdata.fr/192.168.200.23:8020 from shfs3453: closed 2016-10-26 16:27:48,032 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to namenode01.bigdata.fr/192.168.200.23:8020 from shfs3453: stopped, remaining connections 1 2016-10-26 16:27:48,467 DEBUG org.apache.hadoop.ipc.Server: got #47 2016-10-26 16:27:48,468 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 1 on 8040: org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#47 Retry#0 for RpcKind RPC_PROTOCOL_BUFFER 2016-10-26 16:27:48,468 DEBUG org.apache.hadoop.security.UserGroupInformation: PrivilegedAction as:testing (auth:TOKEN) from:org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043) 2016-10-26 16:27:48,469 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Processing hdfs://sandbox:8020/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/job.split of type LOCALIZED 2016-10-26 16:27:48,469 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Resource hdfs://sandbox:8020/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/job.split(->/mnt/hd7/hadoop/yarn/local/usercache/shfs3453/appcache/application_1475850791417_0147/filecache/12/job.split) transitioned from DOWNLOADING to LOCALIZED 2016-10-26 16:27:48,469 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.container.ContainerResourceLocalizedEvent.EventType: RESOURCE_LOCALIZED 2016-10-26 16:27:48,469 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.container.ContainerImpl: Processing container_1475850791417_0147_02_000001 of type RESOURCE_LOCALIZED 2016-10-26 16:27:48,469 DEBUG org.apache.hadoop.ipc.Server: Served: heartbeat queueTime= 0 procesingTime= 1 2016-10-26 16:27:48,469 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 1 on 8040: responding to org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#47 Retry#0 2016-10-26 16:27:48,469 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 1 on 8040: responding to org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#47 Retry#0 Wrote 34 bytes. 
2016-10-26 16:27:48,490 DEBUG org.apache.hadoop.ipc.Server: got #50 2016-10-26 16:27:48,490 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 0 on 8040: org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#50 Retry#0 for RpcKind RPC_PROTOCOL_BUFFER 2016-10-26 16:27:48,490 DEBUG org.apache.hadoop.security.UserGroupInformation: PrivilegedAction as:testing (auth:TOKEN) from:org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043) 2016-10-26 16:27:48,491 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Processing hdfs://sandbox:8020/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/job.xml of type LOCALIZED 2016-10-26 16:27:48,491 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Resource hdfs://sandbox:8020/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/job.xml(->/mnt/hd8/hadoop/yarn/local/usercache/shfs3453/appcache/application_1475850791417_0147/filecache/13/job.xml) transitioned from DOWNLOADING to LOCALIZED 2016-10-26 16:27:48,491 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.container.ContainerResourceLocalizedEvent.EventType: RESOURCE_LOCALIZED 2016-10-26 16:27:48,491 DEBUG org.apache.hadoop.ipc.Server: Served: heartbeat queueTime= 1 procesingTime= 0 2016-10-26 16:27:48,491 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.container.ContainerImpl: Processing container_1475850791417_0147_02_000001 of type RESOURCE_LOCALIZED 2016-10-26 16:27:48,491 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 0 on 8040: responding to org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#50 Retry#0 2016-10-26 16:27:48,491 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 0 on 8040: responding to org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#50 Retry#0 Wrote 34 bytes. 
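Annotation: by this point every libjars entry plus job.jar, job.splitmetainfo, job.split and job.xml has moved from DOWNLOADING to LOCALIZED, and each INFO line records the mapping from the HDFS source to its local filecache path. A minimal sketch to extract that mapping, assuming Python 3 and the dump saved as nodemanager.log (both assumptions):

import re

# Matches the LocalizedResource INFO lines above:
#   Resource <hdfs-uri>(-><local-path>) transitioned from DOWNLOADING to LOCALIZED
LOCALIZED_RE = re.compile(
    r"Resource (?P<src>\S+)\(->(?P<dst>\S+)\) "
    r"transitioned from DOWNLOADING to LOCALIZED"
)

# "nodemanager.log" is an assumed file name for this dump.
with open("nodemanager.log", encoding="utf-8") as fh:
    for m in LOCALIZED_RE.finditer(fh.read()):
        print(m.group("src"))
        print("    ->", m.group("dst"))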
2016-10-26 16:27:48,493 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.container.ContainerImpl: Container container_1475850791417_0147_02_000001 transitioned from LOCALIZING to LOCALIZED 2016-10-26 16:27:48,493 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.event.ContainerLocalizationEvent.EventType: CONTAINER_RESOURCES_LOCALIZED 2016-10-26 16:27:48,494 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainersLauncherEvent.EventType: LAUNCH_CONTAINER 2016-10-26 16:27:48,499 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.sharedcache.SharedCacheUploadEvent.EventType: UPLOAD 2016-10-26 16:27:48,503 DEBUG org.apache.hadoop.security.UserGroupInformation: PrivilegedAction as:nm/datanode05.bigdata.fr@SANDBOX.HADOOP (auth:KERBEROS) from:org.apache.hadoop.fs.FileContext.getAbstractFileSystem(FileContext.java:331) 2016-10-26 16:27:48,532 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.container.ContainerEvent.EventType: CONTAINER_LAUNCHED 2016-10-26 16:27:48,532 DEBUG org.apache.hadoop.yarn.server.nodemanager.LinuxContainerExecutor: launchContainer: [/opt/application/Hadoop/current/bin/container-executor, shfs3453, shfs3453, 1, application_1475850791417_0147, container_1475850791417_0147_02_000001, /mnt/hd10/hadoop/yarn/local/usercache/shfs3453/appcache/application_1475850791417_0147/container_1475850791417_0147_02_000001, /mnt/hd4/hadoop/yarn/local/nmPrivate/application_1475850791417_0147/container_1475850791417_0147_02_000001/launch_container.sh, /mnt/hd0/hadoop/yarn/local/nmPrivate/application_1475850791417_0147/container_1475850791417_0147_02_000001/container_1475850791417_0147_02_000001.tokens, /mnt/hd3/hadoop/yarn/local/nmPrivate/application_1475850791417_0147/container_1475850791417_0147_02_000001/container_1475850791417_0147_02_000001.pid, /mnt/hd8/hadoop/yarn/local,/mnt/hd3/hadoop/yarn/local,/mnt/hd2/hadoop/yarn/local,/mnt/hd1/hadoop/yarn/local,/mnt/hd9/hadoop/yarn/local,/mnt/hd10/hadoop/yarn/local,/mnt/hd4/hadoop/yarn/local,/mnt/hd5/hadoop/yarn/local,/mnt/hd6/hadoop/yarn/local,/mnt/hd0/hadoop/yarn/local,/mnt/hd11/hadoop/yarn/local,/mnt/hd7/hadoop/yarn/local, /mnt/hd8/hadoop/yarn/log,/mnt/hd3/hadoop/yarn/log,/mnt/hd2/hadoop/yarn/log,/mnt/hd1/hadoop/yarn/log,/mnt/hd9/hadoop/yarn/log,/mnt/hd10/hadoop/yarn/log,/mnt/hd4/hadoop/yarn/log,/mnt/hd5/hadoop/yarn/log,/mnt/hd6/hadoop/yarn/log,/mnt/hd0/hadoop/yarn/log,/mnt/hd11/hadoop/yarn/log,/mnt/hd7/hadoop/yarn/log, cgroups=none] 2016-10-26 16:27:48,532 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.container.ContainerImpl: Processing container_1475850791417_0147_02_000001 of type CONTAINER_LAUNCHED 2016-10-26 16:27:48,533 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.container.ContainerImpl: Container container_1475850791417_0147_02_000001 transitioned from LOCALIZED to RUNNING 2016-10-26 16:27:48,533 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.monitor.ContainerStartMonitoringEvent.EventType: START_MONITORING_CONTAINER 2016-10-26 16:27:48,643 DEBUG org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Node's health-status : true, 
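Annotation: the block above shows the container state machine advancing: LOCALIZING to LOCALIZED once the last resource arrives, a LAUNCH_CONTAINER event handed to the launcher, the LinuxContainerExecutor invocation of container-executor with the launch script, tokens and pid-file paths, and finally LOCALIZED to RUNNING. To reconstruct that timeline per container from a full log, a minimal sketch, assuming Python 3 and the dump saved as nodemanager.log (both assumptions):

import re
from collections import defaultdict

# Matches ContainerImpl INFO lines such as
#   Container container_... transitioned from LOCALIZED to RUNNING
TRANSITION_RE = re.compile(
    r"(?P<ts>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3}) INFO "
    r"\S*ContainerImpl: Container (?P<cid>container_\S+) transitioned "
    r"from (?P<src>[A-Z_]+) to (?P<dst>[A-Z_]+)"
)

# "nodemanager.log" is an assumed file name for this dump.
timelines = defaultdict(list)
with open("nodemanager.log", encoding="utf-8") as fh:
    for m in TRANSITION_RE.finditer(fh.read()):
        timelines[m.group("cid")].append(
            (m.group("ts"), m.group("src"), m.group("dst")))

for cid, steps in timelines.items():
    print(cid)
    for ts, src, dst in steps:
        print(f"  {ts}  {src} -> {dst}")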
2016-10-26 16:27:48,644 DEBUG org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Sending out 1 container statuses: [ContainerStatus: [ContainerId: container_1475850791417_0147_02_000001, State: RUNNING, Diagnostics: , ExitStatus: -1000, ]] 2016-10-26 16:27:48,644 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to resourcemanager02.bigdata.fr/192.168.200.219:8031 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP sending #447 2016-10-26 16:27:48,645 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to resourcemanager02.bigdata.fr/192.168.200.219:8031 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP got value #447 2016-10-26 16:27:48,645 DEBUG org.apache.hadoop.ipc.ProtobufRpcEngine: Call: nodeHeartbeat took 1ms 2016-10-26 16:27:48,856 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.monitor.ContainersMonitorImpl: Current ProcessTree list : [ ] 2016-10-26 16:27:48,857 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.monitor.ContainersMonitorImpl: Starting resource-monitoring for container_1475850791417_0147_02_000001 2016-10-26 16:27:48,858 DEBUG org.apache.hadoop.yarn.server.nodemanager.util.ProcessIdFileReader: Accessing pid from pid file /mnt/hd3/hadoop/yarn/local/nmPrivate/application_1475850791417_0147/container_1475850791417_0147_02_000001/container_1475850791417_0147_02_000001.pid 2016-10-26 16:27:48,858 DEBUG org.apache.hadoop.yarn.server.nodemanager.util.ProcessIdFileReader: Got pid 24883 from path /mnt/hd3/hadoop/yarn/local/nmPrivate/application_1475850791417_0147/container_1475850791417_0147_02_000001/container_1475850791417_0147_02_000001.pid 2016-10-26 16:27:48,858 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.monitor.ContainersMonitorImpl: Tracking ProcessTree 24883 for the first time 2016-10-26 16:27:48,862 DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl: ContainerResource_container_1475850791417_0147_02_000001, Metrics for container: container_1475850791417_0147_02_000001 2016-10-26 16:27:48,862 DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig: poking parent 'PropertiesConfiguration' for key: source.source.start_mbeans 2016-10-26 16:27:48,862 DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig: poking parent 'MetricsConfig' for key: source.start_mbeans 2016-10-26 16:27:48,862 DEBUG org.apache.hadoop.metrics2.impl.MetricsConfig: poking parent 'PropertiesConfiguration' for key: *.source.start_mbeans 2016-10-26 16:27:48,862 DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: Updating attr cache... 2016-10-26 16:27:48,862 DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: Done. # tags & metrics=0 2016-10-26 16:27:48,862 DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: Updating info cache... 2016-10-26 16:27:48,862 DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl: [] 2016-10-26 16:27:48,862 DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: Done 2016-10-26 16:27:48,862 DEBUG org.apache.hadoop.metrics2.util.MBeans: Registered Hadoop:service=NodeManager,name=ContainerResource_container_1475850791417_0147_02_000001 2016-10-26 16:27:48,862 DEBUG org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source ContainerResource_container_1475850791417_0147_02_000001 registered. 
2016-10-26 16:27:48,862 DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Registered source ContainerResource_container_1475850791417_0147_02_000001 2016-10-26 16:27:48,862 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.monitor.ContainersMonitorImpl: Constructing ProcessTree for : PID = 24883 ContainerId = container_1475850791417_0147_02_000001 2016-10-26 16:27:49,065 DEBUG org.apache.hadoop.yarn.util.ProcfsBasedProcessTree: [ 24883 24902 ] 2016-10-26 16:27:49,074 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.monitor.ContainersMonitorImpl: Memory usage of ProcessTree 24883 for container-id container_1475850791417_0147_02_000001: 58.1 MB of 4 GB physical memory used; 4.5 GB of 8.4 GB virtual memory used 2016-10-26 16:27:49,492 DEBUG org.apache.hadoop.ipc.Server: got #51 2016-10-26 16:27:49,493 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 4 on 8040: org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#51 Retry#0 for RpcKind RPC_PROTOCOL_BUFFER 2016-10-26 16:27:49,493 DEBUG org.apache.hadoop.security.UserGroupInformation: PrivilegedAction as:testing (auth:TOKEN) from:org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043) 2016-10-26 16:27:49,493 DEBUG org.apache.hadoop.ipc.Server: Served: heartbeat queueTime= 0 procesingTime= 0 2016-10-26 16:27:49,494 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 4 on 8040: responding to org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#51 Retry#0 2016-10-26 16:27:49,494 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 4 on 8040: responding to org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#51 Retry#0 Wrote 34 bytes. 2016-10-26 16:27:49,495 DEBUG org.apache.hadoop.ipc.Server: got #52 2016-10-26 16:27:49,495 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 2 on 8040: org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#52 Retry#0 for RpcKind RPC_PROTOCOL_BUFFER 2016-10-26 16:27:49,495 DEBUG org.apache.hadoop.security.UserGroupInformation: PrivilegedAction as:testing (auth:TOKEN) from:org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043) 2016-10-26 16:27:49,496 DEBUG org.apache.hadoop.ipc.Server: Served: heartbeat queueTime= 0 procesingTime= 1 2016-10-26 16:27:49,496 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 2 on 8040: responding to org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#52 Retry#0 2016-10-26 16:27:49,496 DEBUG org.apache.hadoop.ipc.Server: IPC Server handler 2 on 8040: responding to org.apache.hadoop.yarn.server.nodemanager.api.LocalizationProtocolPB.heartbeat from 192.168.200.29:47238 Call#52 Retry#0 Wrote 34 bytes. 
2016-10-26 16:27:49,646 DEBUG org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Node's health-status : true,
2016-10-26 16:27:49,646 DEBUG org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Sending out 1 container statuses: [ContainerStatus: [ContainerId: container_1475850791417_0147_02_000001, State: RUNNING, Diagnostics: , ExitStatus: -1000, ]]
2016-10-26 16:27:49,647 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to resourcemanager02.bigdata.fr/192.168.200.219:8031 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP sending #448
2016-10-26 16:27:49,648 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to resourcemanager02.bigdata.fr/192.168.200.219:8031 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP got value #448
2016-10-26 16:27:49,648 DEBUG org.apache.hadoop.ipc.ProtobufRpcEngine: Call: nodeHeartbeat took 2ms
2016-10-26 16:27:49,845 DEBUG org.apache.hadoop.ipc.Server: Socket Reader #1 for port 8040: disconnecting client 192.168.200.29:47238. Number of active connections: 0
2016-10-26 16:27:49,846 INFO org.apache.hadoop.yarn.server.nodemanager.ContainerExecutor: main : command provided 0
2016-10-26 16:27:49,846 INFO org.apache.hadoop.yarn.server.nodemanager.ContainerExecutor: main : user is shfs3453
2016-10-26 16:27:49,846 INFO org.apache.hadoop.yarn.server.nodemanager.ContainerExecutor: main : requested yarn user is shfs3453
2016-10-26 16:27:50,479 WARN org.apache.hadoop.yarn.server.nodemanager.LinuxContainerExecutor: Exit code from container container_1475850791417_0147_02_000001 is : 1
2016-10-26 16:27:50,480 WARN org.apache.hadoop.yarn.server.nodemanager.LinuxContainerExecutor: Exception from container-launch with container ID: container_1475850791417_0147_02_000001 and exit code: 1
ExitCodeException exitCode=1:
        at org.apache.hadoop.util.Shell.runCommand(Shell.java:545)
        at org.apache.hadoop.util.Shell.run(Shell.java:456)
        at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:722)
        at org.apache.hadoop.yarn.server.nodemanager.LinuxContainerExecutor.launchContainer(LinuxContainerExecutor.java:297)
        at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
        at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
2016-10-26 16:27:50,481 INFO org.apache.hadoop.yarn.server.nodemanager.ContainerExecutor: Exception from container-launch.
2016-10-26 16:27:50,481 INFO org.apache.hadoop.yarn.server.nodemanager.ContainerExecutor: Container id: container_1475850791417_0147_02_000001
2016-10-26 16:27:50,481 INFO org.apache.hadoop.yarn.server.nodemanager.ContainerExecutor: Exit code: 1
2016-10-26 16:27:50,481 INFO org.apache.hadoop.yarn.server.nodemanager.ContainerExecutor: Stack trace: ExitCodeException exitCode=1:
2016-10-26 16:27:50,481 INFO org.apache.hadoop.yarn.server.nodemanager.ContainerExecutor: at org.apache.hadoop.util.Shell.runCommand(Shell.java:545)
2016-10-26 16:27:50,481 INFO org.apache.hadoop.yarn.server.nodemanager.ContainerExecutor: at org.apache.hadoop.util.Shell.run(Shell.java:456)
2016-10-26 16:27:50,481 INFO org.apache.hadoop.yarn.server.nodemanager.ContainerExecutor: at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:722)
2016-10-26 16:27:50,481 INFO org.apache.hadoop.yarn.server.nodemanager.ContainerExecutor: at org.apache.hadoop.yarn.server.nodemanager.LinuxContainerExecutor.launchContainer(LinuxContainerExecutor.java:297)
2016-10-26 16:27:50,481 INFO org.apache.hadoop.yarn.server.nodemanager.ContainerExecutor: at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
2016-10-26 16:27:50,481 INFO org.apache.hadoop.yarn.server.nodemanager.ContainerExecutor: at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
2016-10-26 16:27:50,481 INFO org.apache.hadoop.yarn.server.nodemanager.ContainerExecutor: at java.util.concurrent.FutureTask.run(FutureTask.java:262)
2016-10-26 16:27:50,481 INFO org.apache.hadoop.yarn.server.nodemanager.ContainerExecutor: at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
2016-10-26 16:27:50,481 INFO org.apache.hadoop.yarn.server.nodemanager.ContainerExecutor: at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
2016-10-26 16:27:50,481 INFO org.apache.hadoop.yarn.server.nodemanager.ContainerExecutor: at java.lang.Thread.run(Thread.java:745)
2016-10-26 16:27:50,482 INFO org.apache.hadoop.yarn.server.nodemanager.ContainerExecutor:
2016-10-26 16:27:50,482 INFO org.apache.hadoop.yarn.server.nodemanager.ContainerExecutor: Shell output: main : command provided 1
2016-10-26 16:27:50,482 INFO org.apache.hadoop.yarn.server.nodemanager.ContainerExecutor: main : user is shfs3453
2016-10-26 16:27:50,482 INFO org.apache.hadoop.yarn.server.nodemanager.ContainerExecutor: main : requested yarn user is shfs3453
2016-10-26 16:27:50,482 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.container.ContainerImpl: Processing container_1475850791417_0147_02_000001 of type UPDATE_DIAGNOSTICS_MSG
2016-10-26 16:27:50,483 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch: Container container_1475850791417_0147_02_000001 completed with exit code 1
2016-10-26 16:27:50,483 WARN org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch: Container exited with a non-zero exit code 1
2016-10-26 16:27:50,483 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.container.ContainerExitEvent.EventType: CONTAINER_EXITED_WITH_FAILURE
2016-10-26 16:27:50,483 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.container.ContainerImpl: Processing container_1475850791417_0147_02_000001 of type CONTAINER_EXITED_WITH_FAILURE
2016-10-26 16:27:50,484 INFO
org.apache.hadoop.yarn.server.nodemanager.containermanager.container.ContainerImpl: Container container_1475850791417_0147_02_000001 transitioned from RUNNING to EXITED_WITH_FAILURE 2016-10-26 16:27:50,484 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainersLauncherEvent.EventType: CLEANUP_CONTAINER 2016-10-26 16:27:50,484 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch: Cleaning up container container_1475850791417_0147_02_000001 2016-10-26 16:27:50,484 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch: Marking container container_1475850791417_0147_02_000001 as inactive 2016-10-26 16:27:50,484 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch: Getting pid for container container_1475850791417_0147_02_000001 to kill from pid file /mnt/hd3/hadoop/yarn/local/nmPrivate/application_1475850791417_0147/container_1475850791417_0147_02_000001/container_1475850791417_0147_02_000001.pid 2016-10-26 16:27:50,484 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch: Accessing pid for container container_1475850791417_0147_02_000001 from pid file /mnt/hd3/hadoop/yarn/local/nmPrivate/application_1475850791417_0147/container_1475850791417_0147_02_000001/container_1475850791417_0147_02_000001.pid 2016-10-26 16:27:50,484 DEBUG org.apache.hadoop.yarn.server.nodemanager.util.ProcessIdFileReader: Accessing pid from pid file /mnt/hd3/hadoop/yarn/local/nmPrivate/application_1475850791417_0147/container_1475850791417_0147_02_000001/container_1475850791417_0147_02_000001.pid 2016-10-26 16:27:50,485 DEBUG org.apache.hadoop.yarn.server.nodemanager.util.ProcessIdFileReader: Got pid 24883 from path /mnt/hd3/hadoop/yarn/local/nmPrivate/application_1475850791417_0147/container_1475850791417_0147_02_000001/container_1475850791417_0147_02_000001.pid 2016-10-26 16:27:50,485 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch: Got pid 24883 for container container_1475850791417_0147_02_000001 2016-10-26 16:27:50,485 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch: Sending signal to pid 24883 as user shfs3453 for container container_1475850791417_0147_02_000001 2016-10-26 16:27:50,485 DEBUG org.apache.hadoop.yarn.server.nodemanager.LinuxContainerExecutor: signalContainer: [/opt/application/Hadoop/current/bin/container-executor, shfs3453, shfs3453, 2, 24883, 15] 2016-10-26 16:27:50,511 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch: Sent signal SIGTERM to pid 24883 as user shfs3453 for container container_1475850791417_0147_02_000001, result=failed 2016-10-26 16:27:50,512 DEBUG org.apache.hadoop.security.UserGroupInformation: PrivilegedAction as:nm/datanode05.bigdata.fr@SANDBOX.HADOOP (auth:KERBEROS) from:org.apache.hadoop.fs.FileContext.getAbstractFileSystem(FileContext.java:331) 2016-10-26 16:27:50,528 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.event.ContainerLocalizationCleanupEvent.EventType: CLEANUP_CONTAINER_RESOURCES 2016-10-26 16:27:50,529 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Processing 
hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/hive-hbase-handler-2.1.0.jar of type RELEASE 2016-10-26 16:27:50,529 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Processing hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/metrics-core-2.2.0.jar of type RELEASE 2016-10-26 16:27:50,529 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Processing hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/hive-exec-2.1.0.jar of type RELEASE 2016-10-26 16:27:50,529 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Processing hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/hbase-hadoop-compat-1.1.2.jar of type RELEASE 2016-10-26 16:27:50,529 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Processing hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/hbase-common-1.1.2.jar of type RELEASE 2016-10-26 16:27:50,529 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Processing hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/netty-all-4.0.23.Final.jar of type RELEASE 2016-10-26 16:27:50,529 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Processing hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/hbase-client-1.1.2.jar of type RELEASE 2016-10-26 16:27:50,529 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Processing hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/hive-jdbc-2.1.0-standalone.jar of type RELEASE 2016-10-26 16:27:50,529 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Processing hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/guava-14.0.1.jar of type RELEASE 2016-10-26 16:27:50,529 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Processing hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/hbase-protocol-1.1.2.jar of type RELEASE 2016-10-26 16:27:50,529 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Processing hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/spark-assembly-1.4.1-hadoop2.7.1.jar of type RELEASE 2016-10-26 16:27:50,529 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Processing hdfs://sandbox/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/libjars/hbase-server-1.1.2.jar of type RELEASE 2016-10-26 16:27:50,529 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Processing hdfs://sandbox:8020/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/job.jar of type RELEASE 2016-10-26 16:27:50,529 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Processing hdfs://sandbox:8020/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/job.splitmetainfo of type RELEASE 2016-10-26 16:27:50,529 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Processing 
hdfs://sandbox:8020/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/job.split of type RELEASE 2016-10-26 16:27:50,529 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: Processing hdfs://sandbox:8020/Products/MR/staging/shfs3453/.staging/job_1475850791417_0147/job.xml of type RELEASE 2016-10-26 16:27:50,533 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.container.ContainerEvent.EventType: CONTAINER_RESOURCES_CLEANEDUP 2016-10-26 16:27:50,533 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.container.ContainerImpl: Processing container_1475850791417_0147_02_000001 of type CONTAINER_RESOURCES_CLEANEDUP 2016-10-26 16:27:50,533 WARN org.apache.hadoop.yarn.server.nodemanager.NMAuditLogger: USER=shfs3453 OPERATION=Container Finished - Failed TARGET=ContainerImpl RESULT=FAILURE DESCRIPTION=Container failed with state: EXITED_WITH_FAILURE APPID=application_1475850791417_0147 CONTAINERID=container_1475850791417_0147_02_000001 2016-10-26 16:27:50,535 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.container.ContainerImpl: Container container_1475850791417_0147_02_000001 transitioned from EXITED_WITH_FAILURE to DONE 2016-10-26 16:27:50,535 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.application.ApplicationContainerFinishedEvent.EventType: APPLICATION_CONTAINER_FINISHED 2016-10-26 16:27:50,535 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.application.ApplicationImpl: Processing application_1475850791417_0147 of type APPLICATION_CONTAINER_FINISHED 2016-10-26 16:27:50,535 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.application.ApplicationImpl: Removing container_1475850791417_0147_02_000001 from application application_1475850791417_0147 2016-10-26 16:27:50,535 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.monitor.ContainerStopMonitoringEvent.EventType: STOP_MONITORING_CONTAINER 2016-10-26 16:27:50,535 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.loghandler.event.LogHandlerContainerFinishedEvent.EventType: CONTAINER_FINISHED 2016-10-26 16:27:50,535 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.logaggregation.AppLogAggregatorImpl: Considering container container_1475850791417_0147_02_000001 for log-aggregation 2016-10-26 16:27:50,535 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.AuxServicesEvent.EventType: CONTAINER_STOP 2016-10-26 16:27:50,535 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.AuxServices: Got event CONTAINER_STOP for appId application_1475850791417_0147 2016-10-26 16:27:50,648 DEBUG org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Node's health-status : true, 2016-10-26 16:27:50,649 DEBUG org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Sending out 1 container statuses: [ContainerStatus: [ContainerId: container_1475850791417_0147_02_000001, State: COMPLETE, Diagnostics: Exception from container-launch. 
Container id: container_1475850791417_0147_02_000001
Exit code: 1
Stack trace: ExitCodeException exitCode=1:
        at org.apache.hadoop.util.Shell.runCommand(Shell.java:545)
        at org.apache.hadoop.util.Shell.run(Shell.java:456)
        at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:722)
        at org.apache.hadoop.yarn.server.nodemanager.LinuxContainerExecutor.launchContainer(LinuxContainerExecutor.java:297)
        at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
        at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
Shell output: main : command provided 1
main : user is shfs3453
main : requested yarn user is shfs3453
Container exited with a non-zero exit code 1
, ExitStatus: 1, ]]
2016-10-26 16:27:50,650 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to resourcemanager02.bigdata.fr/192.168.200.219:8031 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP sending #449
2016-10-26 16:27:50,650 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to resourcemanager02.bigdata.fr/192.168.200.219:8031 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP got value #449
2016-10-26 16:27:50,651 DEBUG org.apache.hadoop.ipc.ProtobufRpcEngine: Call: nodeHeartbeat took 2ms
2016-10-26 16:27:50,762 DEBUG org.apache.hadoop.yarn.server.nodemanager.LinuxContainerExecutor: signalContainer: [/opt/application/Hadoop/current/bin/container-executor, shfs3453, shfs3453, 2, 24883, 9]
2016-10-26 16:27:51,651 DEBUG org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Node's health-status : true,
2016-10-26 16:27:51,652 DEBUG org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Sending out 0 container statuses: []
2016-10-26 16:27:51,652 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to resourcemanager02.bigdata.fr/192.168.200.219:8031 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP sending #450
2016-10-26 16:27:51,653 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to resourcemanager02.bigdata.fr/192.168.200.219:8031 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP got value #450
2016-10-26 16:27:51,653 DEBUG org.apache.hadoop.ipc.ProtobufRpcEngine: Call: nodeHeartbeat took 1ms
2016-10-26 16:27:51,654 INFO org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Removed completed containers from NM context: [container_1475850791417_0147_02_000001]
2016-10-26 16:27:51,656 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.CMgrCompletedAppsEvent.EventType: FINISH_APPS
2016-10-26 16:27:51,659 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.application.ApplicationFinishEvent.EventType: FINISH_APPLICATION
2016-10-26 16:27:51,659 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.application.ApplicationImpl: Processing application_1475850791417_0147 of type FINISH_APPLICATION
2016-10-26 16:27:51,659 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.application.ApplicationImpl: Application application_1475850791417_0147 transitioned from RUNNING to
APPLICATION_RESOURCES_CLEANINGUP 2016-10-26 16:27:51,659 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.event.ApplicationLocalizationEvent.EventType: DESTROY_APPLICATION_RESOURCES 2016-10-26 16:27:51,667 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.AuxServicesEvent.EventType: APPLICATION_STOP 2016-10-26 16:27:51,667 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.AuxServices: Got event APPLICATION_STOP for appId application_1475850791417_0147 2016-10-26 16:27:51,672 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.application.ApplicationEvent.EventType: APPLICATION_RESOURCES_CLEANEDUP 2016-10-26 16:27:51,672 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.application.ApplicationImpl: Processing application_1475850791417_0147 of type APPLICATION_RESOURCES_CLEANEDUP 2016-10-26 16:27:51,673 DEBUG org.apache.hadoop.yarn.server.nodemanager.security.NMTokenSecretManagerInNM: Removing application attempts NMToken keys for application application_1475850791417_0147 2016-10-26 16:27:51,673 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.application.ApplicationImpl: Application application_1475850791417_0147 transitioned from APPLICATION_RESOURCES_CLEANINGUP to FINISHED 2016-10-26 16:27:51,673 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.loghandler.event.LogHandlerAppFinishedEvent.EventType: APPLICATION_FINISHED 2016-10-26 16:27:51,674 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.logaggregation.AppLogAggregatorImpl: Application just finished : application_1475850791417_0147 2016-10-26 16:27:51,676 DEBUG org.apache.hadoop.ipc.Server: IPC Server idle connection scanner for port 47855: task running 2016-10-26 16:27:51,677 DEBUG org.apache.hadoop.security.UserGroupInformation: PrivilegedAction as:shfs3453 (auth:SIMPLE) from:org.apache.hadoop.yarn.logaggregation.AggregatedLogFormat$LogWriter.(AggregatedLogFormat.java:378) 2016-10-26 16:27:51,678 DEBUG org.apache.hadoop.security.UserGroupInformation: PrivilegedAction as:shfs3453 (auth:SIMPLE) from:org.apache.hadoop.fs.FileContext.getAbstractFileSystem(FileContext.java:331) 2016-10-26 16:27:51,683 DEBUG org.apache.hadoop.hdfs.BlockReaderLocal: dfs.client.use.legacy.blockreader.local = false 2016-10-26 16:27:51,683 DEBUG org.apache.hadoop.hdfs.BlockReaderLocal: dfs.client.read.shortcircuit = false 2016-10-26 16:27:51,683 DEBUG org.apache.hadoop.hdfs.BlockReaderLocal: dfs.client.domain.socket.data.traffic = false 2016-10-26 16:27:51,683 DEBUG org.apache.hadoop.hdfs.BlockReaderLocal: dfs.domain.socket.path = 2016-10-26 16:27:51,686 DEBUG org.apache.hadoop.security.SecurityUtil: Acquired token Kind: HDFS_DELEGATION_TOKEN, Service: 192.168.200.23:8020, Ident: (HDFS_DELEGATION_TOKEN token 12521 for shfs3453) 2016-10-26 16:27:51,686 DEBUG org.apache.hadoop.hdfs.HAUtil: Mapped HA service delegation token for logical URI hdfs://sandbox to namenode namenode01.bigdata.fr/192.168.200.23:8020 2016-10-26 16:27:51,687 DEBUG org.apache.hadoop.security.SecurityUtil: Acquired token Kind: HDFS_DELEGATION_TOKEN, Service: 192.168.200.24:8020, Ident: (HDFS_DELEGATION_TOKEN token 12521 for shfs3453) 2016-10-26 16:27:51,687 DEBUG org.apache.hadoop.hdfs.HAUtil: 
Mapped HA service delegation token for logical URI hdfs://sandbox to namenode namenode02.bigdata.fr/192.168.200.24:8020 2016-10-26 16:27:51,687 DEBUG org.apache.hadoop.hdfs.BlockReaderLocal: dfs.client.use.legacy.blockreader.local = false 2016-10-26 16:27:51,688 DEBUG org.apache.hadoop.hdfs.BlockReaderLocal: dfs.client.read.shortcircuit = false 2016-10-26 16:27:51,688 DEBUG org.apache.hadoop.hdfs.BlockReaderLocal: dfs.client.domain.socket.data.traffic = false 2016-10-26 16:27:51,688 DEBUG org.apache.hadoop.hdfs.BlockReaderLocal: dfs.domain.socket.path = 2016-10-26 16:27:51,688 DEBUG org.apache.hadoop.io.retry.RetryUtils: multipleLinearRandomRetry = null 2016-10-26 16:27:51,689 DEBUG org.apache.hadoop.ipc.Client: getting client out of cache: org.apache.hadoop.ipc.Client@6e756936 2016-10-26 16:27:51,690 DEBUG org.apache.hadoop.hdfs.protocol.datatransfer.sasl.DataTransferSaslUtil: DataTransferProtocol not using SaslPropertiesResolver, no QOP found in configuration for dfs.data.transfer.protection 2016-10-26 16:27:51,694 DEBUG org.apache.hadoop.ipc.Client: The ping interval is 60000 ms. 2016-10-26 16:27:51,694 DEBUG org.apache.hadoop.ipc.Client: Connecting to namenode01.bigdata.fr/192.168.200.23:8020 2016-10-26 16:27:51,695 DEBUG org.apache.hadoop.security.UserGroupInformation: PrivilegedAction as:shfs3453 (auth:SIMPLE) from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:724) 2016-10-26 16:27:51,696 DEBUG org.apache.hadoop.security.SaslRpcClient: Sending sasl message state: NEGOTIATE 2016-10-26 16:27:51,697 DEBUG org.apache.hadoop.security.SaslRpcClient: Received SASL message state: NEGOTIATE auths { method: "TOKEN" mechanism: "DIGEST-MD5" protocol: "" serverId: "default" challenge: "realm=\"default\",nonce=\"cIk2ILp1ulpbmj7zkJgK2dR1AdYkAaxsXAzT1XyX\",qop=\"auth\",charset=utf-8,algorithm=md5-sess" } auths { method: "KERBEROS" mechanism: "GSSAPI" protocol: "nn" serverId: "namenode01.bigdata.fr" } 2016-10-26 16:27:51,697 DEBUG org.apache.hadoop.security.SaslRpcClient: Get token info proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB info:@org.apache.hadoop.security.token.TokenInfo(value=class org.apache.hadoop.hdfs.security.token.delegation.DelegationTokenSelector) 2016-10-26 16:27:51,698 DEBUG org.apache.hadoop.security.SaslRpcClient: Creating SASL DIGEST-MD5(TOKEN) client to authenticate to service at default 2016-10-26 16:27:51,698 DEBUG org.apache.hadoop.security.SaslRpcClient: Use TOKEN authentication for protocol ClientNamenodeProtocolPB 2016-10-26 16:27:51,699 DEBUG org.apache.hadoop.security.SaslRpcClient: SASL client callback: setting username: ABdzaGZzMzQ1M0BTQU5EQk9YLkhBRE9PUAJybQCKAVgBY3KMigFYJW/2jI4w6Y4GHg== 2016-10-26 16:27:51,699 DEBUG org.apache.hadoop.security.SaslRpcClient: SASL client callback: setting userPassword 2016-10-26 16:27:51,699 DEBUG org.apache.hadoop.security.SaslRpcClient: SASL client callback: setting realm: default 2016-10-26 16:27:51,701 DEBUG org.apache.hadoop.security.SaslRpcClient: Sending sasl message state: INITIATE token: "charset=utf-8,username=\"ABdzaGZzMzQ1M0BTQU5EQk9YLkhBRE9PUAJybQCKAVgBY3KMigFYJW/2jI4w6Y4GHg==\",realm=\"default\",nonce=\"cIk2ILp1ulpbmj7zkJgK2dR1AdYkAaxsXAzT1XyX\",nc=00000001,cnonce=\"iGT8uYztHCjDvS/cLXmwtUEjLKJWP7zXHfKgMpWd\",digest-uri=\"/default\",maxbuf=65536,response=aba2a592869f8461cec7f6f0ac3004c3,qop=auth" auths { method: "TOKEN" mechanism: "DIGEST-MD5" protocol: "" serverId: "default" } 2016-10-26 16:27:51,703 DEBUG org.apache.hadoop.security.SaslRpcClient: Received SASL message 
state: SUCCESS token: "rspauth=4f6cf47babec9a25447971192942630f" 2016-10-26 16:27:51,703 DEBUG org.apache.hadoop.ipc.Client: Negotiated QOP is :auth 2016-10-26 16:27:51,704 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to namenode01.bigdata.fr/192.168.200.23:8020 from shfs3453: starting, having connections 2 2016-10-26 16:27:51,704 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to namenode01.bigdata.fr/192.168.200.23:8020 from shfs3453 sending #451 2016-10-26 16:27:51,706 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to namenode01.bigdata.fr/192.168.200.23:8020 from shfs3453 got value #451 2016-10-26 16:27:51,706 DEBUG org.apache.hadoop.ipc.ProtobufRpcEngine: Call: getServerDefaults took 12ms 2016-10-26 16:27:51,766 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to namenode01.bigdata.fr/192.168.200.23:8020 from shfs3453 sending #452 2016-10-26 16:27:51,766 DEBUG org.apache.hadoop.ipc.Server: IPC Server idle connection scanner for port 8040: task running 2016-10-26 16:27:51,769 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to namenode01.bigdata.fr/192.168.200.23:8020 from shfs3453 got value #452 2016-10-26 16:27:51,769 DEBUG org.apache.hadoop.ipc.ProtobufRpcEngine: Call: create took 4ms 2016-10-26 16:27:51,781 DEBUG org.apache.hadoop.hdfs.DFSClient: computePacketChunkSize: src=/Products/YARN/logs/shfs3453/logs/application_1475850791417_0147/datanode05.bigdata.fr_47855.tmp, chunkSize=516, chunksPerPacket=126, packetSize=65016 2016-10-26 16:27:51,796 DEBUG org.apache.hadoop.hdfs.LeaseRenewer: Lease renewer daemon for [DFSClient_NONMAPREDUCE_962226214_327] with renew id 1 started 2016-10-26 16:27:51,827 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.logaggregation.AppLogAggregatorImpl: Uploading logs for container container_1475850791417_0147_02_000001. Current good log dirs are /mnt/hd8/hadoop/yarn/log,/mnt/hd3/hadoop/yarn/log,/mnt/hd2/hadoop/yarn/log,/mnt/hd1/hadoop/yarn/log,/mnt/hd9/hadoop/yarn/log,/mnt/hd10/hadoop/yarn/log,/mnt/hd4/hadoop/yarn/log,/mnt/hd5/hadoop/yarn/log,/mnt/hd6/hadoop/yarn/log,/mnt/hd0/hadoop/yarn/log,/mnt/hd11/hadoop/yarn/log,/mnt/hd7/hadoop/yarn/log 2016-10-26 16:27:51,879 DEBUG org.apache.hadoop.io.nativeio.NativeIO: Got UserName shfs3453 for ID 10012 from the native implementation 2016-10-26 16:27:51,882 DEBUG org.apache.hadoop.io.nativeio.NativeIO: Got GroupName hadoop for ID 500 from the native implementation 2016-10-26 16:27:51,895 DEBUG org.apache.hadoop.ipc.Client: The ping interval is 60000 ms. 
2016-10-26 16:27:51,895 DEBUG org.apache.hadoop.ipc.Client: Connecting to namenode01.bigdata.fr/192.168.200.23:8020 2016-10-26 16:27:51,897 DEBUG org.apache.hadoop.security.UserGroupInformation: PrivilegedAction as:nm/datanode05.bigdata.fr@SANDBOX.HADOOP (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:724) 2016-10-26 16:27:51,897 DEBUG org.apache.hadoop.security.SaslRpcClient: Sending sasl message state: NEGOTIATE 2016-10-26 16:27:51,903 DEBUG org.apache.hadoop.security.SaslRpcClient: Received SASL message state: NEGOTIATE auths { method: "TOKEN" mechanism: "DIGEST-MD5" protocol: "" serverId: "default" challenge: "realm=\"default\",nonce=\"cBIt/tjJtjrmclIZofIb/inC4GBAE/1Xl9rd+zNd\",qop=\"auth\",charset=utf-8,algorithm=md5-sess" } auths { method: "KERBEROS" mechanism: "GSSAPI" protocol: "nn" serverId: "namenode01.bigdata.fr" } 2016-10-26 16:27:51,903 DEBUG org.apache.hadoop.security.SaslRpcClient: Get token info proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB info:@org.apache.hadoop.security.token.TokenInfo(value=class org.apache.hadoop.hdfs.security.token.delegation.DelegationTokenSelector) 2016-10-26 16:27:51,904 DEBUG org.apache.hadoop.security.SaslRpcClient: Get kerberos info proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB info:@org.apache.hadoop.security.KerberosInfo(clientPrincipal=, serverPrincipal=dfs.namenode.kerberos.principal) 2016-10-26 16:27:51,904 DEBUG org.apache.hadoop.security.SaslRpcClient: RPC Server's Kerberos principal name for protocol=org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB is nn/namenode01.bigdata.fr@SANDBOX.HADOOP 2016-10-26 16:27:51,904 DEBUG org.apache.hadoop.security.SaslRpcClient: Creating SASL GSSAPI(KERBEROS) client to authenticate to service at namenode01.bigdata.fr 2016-10-26 16:27:51,905 DEBUG org.apache.hadoop.security.SaslRpcClient: Use KERBEROS authentication for protocol ClientNamenodeProtocolPB 2016-10-26 16:27:51,910 DEBUG org.apache.hadoop.security.SaslRpcClient: Sending sasl message state: INITIATE token: "`\202\002\263\006\t*\206H\206\367\022\001\002\002\001\000n\202\002\2420\202\002\236\240\003\002\001\005\241\003\002\001\016\242\a\003\005\000 \000\000\000\243\202\001\222a\202\001\2160\202\001\212\240\003\002\001\005\241\020\033\016SANDBOX.HADOOP\242301\240\003\002\001\000\241*0(\033\002nn\033\"namenode01.bigdata.fr\243\202\001:0\202\0016\240\003\002\001\022\241\003\002\001\v\242\202\001(\004\202\001$\343\340\366\366M\275\210Zt\262/\245\a\206\215\002\003s\255\r\361\320\037\016\220\030\3306f\371\243\006\025-\326\025\333\202\362\236\312\a\331\216M\354T\365Xvw\253\304\210\025\016\260\247\215\213\221R\333!\232nf\004\373\210b\037\3515,+\326Z\253\023\304\222Ct\v\026\307a\313\361\002S\034\032\246\274\0374d6;lT+,\2230\006\235\230\307\220j[\376\330;\021pJ\311M{\230\367\035\204\235\026\340=\356\232\304Z\322J\354\315\331\212\304mi\311\004\246\231\006\231\322lx0\354\345\2446 \201\233x\371\342\230~l\tNo\016\327\310\3454\226\316G\305>\234\034\260\021\037\326\265\341=\202I\244\344\371\324c\340J\353\302-\251\260\244C\310\0022\242\v\265\352bz\240]\263\313EJ8\364\2623\355\321q\317\261\327\206B\302FW\002\355\376[\231\037\236,\004\216\332 s\365\265u\313\322\037\3777@?e\030&\025\352\300\327!\275\210 
\375*\277\2230b\322(\372\a\005\321w\375\305[\2570\017,\223R\244\201\3620\201\357\240\003\002\001\022\242\201\347\004\201\3447\213?\246\304\316\004*{h\a,~2\352\200\030.\324q\r\t5\236\304\336Eai9\241\360\021)\202\303\311K\024\263\004\375>\303\301\207]\326\346\001g\204\r:\310\a\210\"G\302\345\032\264\312R\r\232\255R\364H8\257e\335\2052}/\310\3204\005\351\225\353\0304\306c\r=\336&\b\231\204\233\207\363G\227\262\250\333\347|H8\334\221|8\227\303\265X\024m\377z@\211T_\344?\346\327a\250\245\\\315\216(\271\036\251\376l-\276i\377o\260\2250$\021u94\232\247\311\367\364\265\327\303\321\224\302\034\265\2443(6\364\230\025\327\262\350{\316\a\230\277C\373\334\317\372\356\262\t\255\016\247\nN\231\263\204u\236\322\375\274\270\363f\211\223\374\022\276\231k\200\021G\325jX\374\205\205\257\2230-\203\305" auths { method: "KERBEROS" mechanism: "GSSAPI" protocol: "nn" serverId: "namenode01.bigdata.fr" } 2016-10-26 16:27:51,913 DEBUG org.apache.hadoop.security.SaslRpcClient: Received SASL message state: CHALLENGE token: "`j\006\t*\206H\206\367\022\001\002\002\002\000o[0Y\240\003\002\001\005\241\003\002\001\017\242M0K\240\003\002\001\022\242D\004B&\243\236A\331F\303;\211v\243\b\035&;\023\234z\366f\254\300_\243l\345\027\346/K\001\033\303\340\000\247\023\303\264/\214\b:M\351\346b\312;K1\3037~^\256h.\204\3305\243\2300jG" 2016-10-26 16:27:51,915 DEBUG org.apache.hadoop.security.SaslRpcClient: Sending sasl message state: RESPONSE token: "" 2016-10-26 16:27:51,916 DEBUG org.apache.hadoop.security.SaslRpcClient: Received SASL message state: CHALLENGE token: "\005\004\001\377\000\f\000\000\000\000\000\000/^\241+\001\001\000\000\362\037R\345\264\341^\247\206\3363\f" 2016-10-26 16:27:51,917 DEBUG org.apache.hadoop.security.SaslRpcClient: Sending sasl message state: RESPONSE token: "\005\004\000\377\000\f\000\000\000\000\000\000#\tmZ\001\001\000\000\263\275\024\0236\316\000\267\350P\025\362" 2016-10-26 16:27:51,918 DEBUG org.apache.hadoop.security.SaslRpcClient: Received SASL message state: SUCCESS 2016-10-26 16:27:51,918 DEBUG org.apache.hadoop.ipc.Client: Negotiated QOP is :auth 2016-10-26 16:27:51,919 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to namenode01.bigdata.fr/192.168.200.23:8020 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP: starting, having connections 3 2016-10-26 16:27:51,919 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to namenode01.bigdata.fr/192.168.200.23:8020 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP sending #453 2016-10-26 16:27:51,921 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to namenode01.bigdata.fr/192.168.200.23:8020 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP got value #453 2016-10-26 16:27:51,921 DEBUG org.apache.hadoop.ipc.ProtobufRpcEngine: Call: getListing took 26ms 2016-10-26 16:27:51,933 DEBUG org.apache.hadoop.hdfs.DFSClient: DFSClient writeChunk allocating new packet seqno=0, src=/Products/YARN/logs/shfs3453/logs/application_1475850791417_0147/datanode05.bigdata.fr_47855.tmp, packetSize=65016, chunksPerPacket=126, bytesCurBlock=0 2016-10-26 16:27:51,938 DEBUG org.apache.hadoop.hdfs.DFSClient: Queued packet 0 2016-10-26 16:27:51,938 DEBUG org.apache.hadoop.hdfs.DFSClient: Queued packet 1 2016-10-26 16:27:51,938 DEBUG org.apache.hadoop.hdfs.DFSClient: Allocating new block 2016-10-26 16:27:51,939 DEBUG org.apache.hadoop.hdfs.DFSClient: Waiting for ack for: 1 2016-10-26 16:27:51,951 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to namenode01.bigdata.fr/192.168.200.23:8020 from shfs3453 sending 
#454 2016-10-26 16:27:51,955 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to namenode01.bigdata.fr/192.168.200.23:8020 from shfs3453 got value #454 2016-10-26 16:27:51,955 DEBUG org.apache.hadoop.ipc.ProtobufRpcEngine: Call: addBlock took 4ms 2016-10-26 16:27:51,973 DEBUG org.apache.hadoop.hdfs.DFSClient: pipeline = DatanodeInfoWithStorage[192.168.200.29:1004,DS-6885cd16-1d1f-4592-ac7b-40cb63ea60f4,DISK] 2016-10-26 16:27:51,973 DEBUG org.apache.hadoop.hdfs.DFSClient: pipeline = DatanodeInfoWithStorage[192.168.200.25:1004,DS-0c41cb0a-630e-4aee-9036-0fa96292b11c,DISK] 2016-10-26 16:27:51,973 DEBUG org.apache.hadoop.hdfs.DFSClient: pipeline = DatanodeInfoWithStorage[192.168.200.30:1004,DS-44679fa4-e735-4019-9479-c0858e6cb271,DISK] 2016-10-26 16:27:51,973 DEBUG org.apache.hadoop.hdfs.DFSClient: Connecting to datanode 192.168.200.29:1004 2016-10-26 16:27:51,974 DEBUG org.apache.hadoop.hdfs.DFSClient: Send buf size 131072 2016-10-26 16:27:51,974 DEBUG org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient: SASL client skipping handshake in secured configuration with privileged port for addr = /192.168.200.29, datanodeId = DatanodeInfoWithStorage[192.168.200.29:1004,DS-6885cd16-1d1f-4592-ac7b-40cb63ea60f4,DISK] 2016-10-26 16:27:52,074 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.monitor.ContainersMonitorImpl: Current ProcessTree list : [ 24883 ] 2016-10-26 16:27:52,075 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.monitor.ContainersMonitorImpl: Stopping resource-monitoring for container_1475850791417_0147_02_000001 2016-10-26 16:27:52,134 DEBUG org.apache.hadoop.hdfs.DFSClient: DataStreamer block BP-476779567-192.168.200.23-1386176889420:blk_1076631054_1099514520190 sending packet packet seqno: 0 offsetInBlock: 0 lastPacketInBlock: false lastByteOffsetInBlock: 2988 2016-10-26 16:27:52,182 DEBUG org.apache.hadoop.hdfs.DFSClient: DFSClient seqno: 0 reply: SUCCESS reply: SUCCESS reply: SUCCESS downstreamAckTimeNanos: 1069580 flag: 0 flag: 0 flag: 0 2016-10-26 16:27:52,184 DEBUG org.apache.hadoop.hdfs.DFSClient: DataStreamer block BP-476779567-192.168.200.23-1386176889420:blk_1076631054_1099514520190 sending packet packet seqno: 1 offsetInBlock: 2988 lastPacketInBlock: true lastByteOffsetInBlock: 2988 2016-10-26 16:27:52,187 DEBUG org.apache.hadoop.hdfs.DFSClient: DFSClient seqno: 1 reply: SUCCESS reply: SUCCESS reply: SUCCESS downstreamAckTimeNanos: 1613606 flag: 0 flag: 0 flag: 0 2016-10-26 16:27:52,188 DEBUG org.apache.hadoop.hdfs.DFSClient: Closing old block BP-476779567-192.168.200.23-1386176889420:blk_1076631054_1099514520190 2016-10-26 16:27:52,192 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to namenode01.bigdata.fr/192.168.200.23:8020 from shfs3453 sending #455 2016-10-26 16:27:52,195 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to namenode01.bigdata.fr/192.168.200.23:8020 from shfs3453 got value #455 2016-10-26 16:27:52,195 DEBUG org.apache.hadoop.ipc.ProtobufRpcEngine: Call: complete took 4ms 2016-10-26 16:27:52,200 DEBUG org.apache.hadoop.security.UserGroupInformation: PrivilegedAction as:shfs3453 (auth:SIMPLE) from:org.apache.hadoop.yarn.server.nodemanager.containermanager.logaggregation.AppLogAggregatorImpl.uploadLogsForContainers(AppLogAggregatorImpl.java:304) 2016-10-26 16:27:52,201 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to namenode01.bigdata.fr/192.168.200.23:8020 from shfs3453 sending #456 2016-10-26 16:27:52,202 DEBUG 
org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to namenode01.bigdata.fr/192.168.200.23:8020 from shfs3453 got value #456 2016-10-26 16:27:52,202 DEBUG org.apache.hadoop.ipc.ProtobufRpcEngine: Call: getFileInfo took 2ms 2016-10-26 16:27:52,205 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to namenode01.bigdata.fr/192.168.200.23:8020 from shfs3453 sending #457 2016-10-26 16:27:52,208 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to namenode01.bigdata.fr/192.168.200.23:8020 from shfs3453 got value #457 2016-10-26 16:27:52,208 DEBUG org.apache.hadoop.ipc.ProtobufRpcEngine: Call: rename took 4ms 2016-10-26 16:27:52,214 DEBUG org.apache.hadoop.yarn.event.AsyncDispatcher: Dispatching the event org.apache.hadoop.yarn.server.nodemanager.containermanager.application.ApplicationEvent.EventType: APPLICATION_LOG_HANDLING_FINISHED 2016-10-26 16:27:52,214 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.application.ApplicationImpl: Processing application_1475850791417_0147 of type APPLICATION_LOG_HANDLING_FINISHED 2016-10-26 16:27:52,214 DEBUG org.apache.hadoop.ipc.Client: stopping client from cache: org.apache.hadoop.ipc.Client@6e756936 2016-10-26 16:27:52,656 DEBUG org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Node's health-status : true, 2016-10-26 16:27:52,656 DEBUG org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Sending out 0 container statuses: [] 2016-10-26 16:27:52,657 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to resourcemanager02.bigdata.fr/192.168.200.219:8031 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP sending #458 2016-10-26 16:27:52,658 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to resourcemanager02.bigdata.fr/192.168.200.219:8031 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP got value #458 2016-10-26 16:27:52,658 DEBUG org.apache.hadoop.ipc.ProtobufRpcEngine: Call: nodeHeartbeat took 1ms 2016-10-26 16:27:53,658 DEBUG org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Node's health-status : true, 2016-10-26 16:27:53,659 DEBUG org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Sending out 0 container statuses: [] 2016-10-26 16:27:53,659 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to resourcemanager02.bigdata.fr/192.168.200.219:8031 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP sending #459 2016-10-26 16:27:53,660 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to resourcemanager02.bigdata.fr/192.168.200.219:8031 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP got value #459 2016-10-26 16:27:53,660 DEBUG org.apache.hadoop.ipc.ProtobufRpcEngine: Call: nodeHeartbeat took 1ms 2016-10-26 16:27:54,660 DEBUG org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Node's health-status : true, 2016-10-26 16:27:54,661 DEBUG org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Sending out 0 container statuses: [] 2016-10-26 16:27:54,661 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to resourcemanager02.bigdata.fr/192.168.200.219:8031 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP sending #460 2016-10-26 16:27:54,662 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to resourcemanager02.bigdata.fr/192.168.200.219:8031 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP got value #460 2016-10-26 16:27:54,662 DEBUG org.apache.hadoop.ipc.ProtobufRpcEngine: Call: nodeHeartbeat took 1ms 2016-10-26 16:27:55,075 
DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.monitor.ContainersMonitorImpl: Current ProcessTree list : [ ] 2016-10-26 16:27:55,662 DEBUG org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Node's health-status : true, 2016-10-26 16:27:55,663 DEBUG org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Sending out 0 container statuses: [] 2016-10-26 16:27:55,664 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to resourcemanager02.bigdata.fr/192.168.200.219:8031 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP sending #461 2016-10-26 16:27:55,665 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to resourcemanager02.bigdata.fr/192.168.200.219:8031 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP got value #461 2016-10-26 16:27:55,665 DEBUG org.apache.hadoop.ipc.ProtobufRpcEngine: Call: nodeHeartbeat took 2ms 2016-10-26 16:27:56,666 DEBUG org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Node's health-status : true, 2016-10-26 16:27:56,666 DEBUG org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Sending out 0 container statuses: [] 2016-10-26 16:27:56,666 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to resourcemanager02.bigdata.fr/192.168.200.219:8031 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP sending #462 2016-10-26 16:27:56,667 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to resourcemanager02.bigdata.fr/192.168.200.219:8031 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP got value #462 2016-10-26 16:27:56,667 DEBUG org.apache.hadoop.ipc.ProtobufRpcEngine: Call: nodeHeartbeat took 1ms 2016-10-26 16:27:57,668 DEBUG org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Node's health-status : true, 2016-10-26 16:27:57,668 DEBUG org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Sending out 0 container statuses: [] 2016-10-26 16:27:57,668 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to resourcemanager02.bigdata.fr/192.168.200.219:8031 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP sending #463 2016-10-26 16:27:57,669 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to resourcemanager02.bigdata.fr/192.168.200.219:8031 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP got value #463 2016-10-26 16:27:57,669 DEBUG org.apache.hadoop.ipc.ProtobufRpcEngine: Call: nodeHeartbeat took 1ms 2016-10-26 16:27:58,075 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.monitor.ContainersMonitorImpl: Current ProcessTree list : [ ] 2016-10-26 16:27:58,670 DEBUG org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Node's health-status : true, 2016-10-26 16:27:58,670 DEBUG org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Sending out 0 container statuses: [] 2016-10-26 16:27:58,670 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to resourcemanager02.bigdata.fr/192.168.200.219:8031 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP sending #464 2016-10-26 16:27:58,671 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to resourcemanager02.bigdata.fr/192.168.200.219:8031 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP got value #464 2016-10-26 16:27:58,671 DEBUG org.apache.hadoop.ipc.ProtobufRpcEngine: Call: nodeHeartbeat took 1ms 2016-10-26 16:27:59,672 DEBUG org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Node's health-status : true, 2016-10-26 16:27:59,672 DEBUG 
org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Sending out 0 container statuses: [] 2016-10-26 16:27:59,672 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to resourcemanager02.bigdata.fr/192.168.200.219:8031 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP sending #465 2016-10-26 16:27:59,673 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to resourcemanager02.bigdata.fr/192.168.200.219:8031 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP got value #465 2016-10-26 16:27:59,674 DEBUG org.apache.hadoop.ipc.ProtobufRpcEngine: Call: nodeHeartbeat took 1ms 2016-10-26 16:28:00,674 DEBUG org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Node's health-status : true, 2016-10-26 16:28:00,674 DEBUG org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Sending out 0 container statuses: [] 2016-10-26 16:28:00,675 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to resourcemanager02.bigdata.fr/192.168.200.219:8031 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP sending #466 2016-10-26 16:28:00,675 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to resourcemanager02.bigdata.fr/192.168.200.219:8031 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP got value #466 2016-10-26 16:28:00,676 DEBUG org.apache.hadoop.ipc.ProtobufRpcEngine: Call: nodeHeartbeat took 2ms 2016-10-26 16:28:01,075 DEBUG org.apache.hadoop.yarn.server.nodemanager.containermanager.monitor.ContainersMonitorImpl: Current ProcessTree list : [ ] 2016-10-26 16:28:01,676 DEBUG org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Node's health-status : true, 2016-10-26 16:28:01,676 DEBUG org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Sending out 0 container statuses: [] 2016-10-26 16:28:01,676 DEBUG org.apache.hadoop.ipc.Server: IPC Server idle connection scanner for port 47855: task running 2016-10-26 16:28:01,677 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to resourcemanager02.bigdata.fr/192.168.200.219:8031 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP sending #467 2016-10-26 16:28:01,677 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to resourcemanager02.bigdata.fr/192.168.200.219:8031 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP got value #467 2016-10-26 16:28:01,678 DEBUG org.apache.hadoop.ipc.ProtobufRpcEngine: Call: nodeHeartbeat took 2ms 2016-10-26 16:28:01,767 DEBUG org.apache.hadoop.ipc.Server: IPC Server idle connection scanner for port 8040: task running 2016-10-26 16:28:01,920 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to namenode01.bigdata.fr/192.168.200.23:8020 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP: closed 2016-10-26 16:28:01,920 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to namenode01.bigdata.fr/192.168.200.23:8020 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP: stopped, remaining connections 2 2016-10-26 16:28:02,075 DEBUG org.apache.hadoop.metrics2.util.MBeans: Unregistering Hadoop:service=NodeManager,name=ContainerResource_container_1475850791417_0147_02_000001 2016-10-26 16:28:02,204 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to namenode01.bigdata.fr/192.168.200.23:8020 from shfs3453: closed 2016-10-26 16:28:02,204 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to namenode01.bigdata.fr/192.168.200.23:8020 from shfs3453: stopped, remaining connections 1 2016-10-26 16:28:02,678 DEBUG 
org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Node's health-status : true, 2016-10-26 16:28:02,678 DEBUG org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl: Sending out 0 container statuses: [] 2016-10-26 16:28:02,679 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to resourcemanager02.bigdata.fr/192.168.200.219:8031 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP sending #468 2016-10-26 16:28:02,679 DEBUG org.apache.hadoop.ipc.Client: IPC Client (507956019) connection to resourcemanager02.bigdata.fr/192.168.200.219:8031 from nm/datanode05.bigdata.fr@SANDBOX.HADOOP got value #468
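Note: the NodeManager log above only records that the ApplicationMaster container exited with code 1; the actual application error is in the container's own stdout/stderr. Per the AppLogAggregatorImpl entries, those files were aggregated to HDFS under /Products/YARN/logs/shfs3453/logs/application_1475850791417_0147/. Assuming log aggregation completed as shown, they can typically be retrieved with the standard YARN CLI (run as the submitting user shfs3453, or a user with read access to that directory):

    yarn logs -applicationId application_1475850791417_0147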