Details
- Type: Bug
- Status: Open
- Priority: Minor
- Resolution: Unresolved
- Affects Version/s: 2.7.1
- Fix Version/s: None
- Component/s: None
Description
When a client creates a socket connection to the DataNode and sends an empty message, the DataNode logs show exceptions like these:
2015-07-08 20:00:55,427 ERROR datanode.DataNode (DataXceiver.java:run(278)) - bidev17.rtp.ibm.com:50010:DataXceiver error processing unknown operation src: /127.0.0.1:41508 dst: /127.0.0.1:50010
java.io.EOFException
at java.io.DataInputStream.readShort(DataInputStream.java:315)
at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.readOp(Receiver.java:58)
at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:227)
at java.lang.Thread.run(Thread.java:745)
2015-07-08 20:00:56,671 ERROR datanode.DataNode (DataXceiver.java:run(278)) - bidev17.rtp.ibm.com:50010:DataXceiver error processing unknown operation src: /127.0.0.1:41509 dst: /127.0.0.1:50010
java.io.EOFException
at java.io.DataInputStream.readShort(DataInputStream.java:315)
at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.readOp(Receiver.java:58)
at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:227)
at java.lang.Thread.run(Thread.java:745)
These exceptions can fill up the logs; this was recently noticed with an Ambari 2.1-based install, which tries to check whether the DataNode is up.
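For context, the stack trace above originates in Receiver.readOp(), which reads the 2-byte protocol version from the connection before dispatching the operation. Below is a minimal, self-contained sketch of that behavior (the ReadOpSketch class name is made up for illustration; this is not the actual HDFS code):

import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.EOFException;
import java.io.IOException;

public class ReadOpSketch {
    public static void main(String[] args) throws IOException {
        // An empty stream stands in for a connection that is closed before
        // the client writes anything.
        DataInputStream in = new DataInputStream(new ByteArrayInputStream(new byte[0]));
        try {
            // Like Receiver.readOp(), read the 2-byte version first;
            // with zero bytes available, readShort() throws EOFException.
            in.readShort();
        } catch (EOFException e) {
            System.out.println("Got EOFException, as seen in the DataXceiver log: " + e);
        }
    }
}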
This can be easily reproduced with a simple Java client that opens a Socket connection and closes it without sending any data:
import java.io.IOException;
import java.net.Socket;
import java.net.UnknownHostException;

public static void main(String[] args) {
    Socket DNClient;
    try {
        // Connect to the DataNode data transfer port (taken from the log
        // above) and close the socket without writing any data.
        DNClient = new Socket("127.0.0.1", 50010);
        DNClient.close();
    } catch (UnknownHostException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
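Running this client against a live DataNode writes an EOFException entry like the ones above to the DataNode log each time the connection is opened and closed.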
Attachments
Issue Links
- duplicates HDFS-9572: Prevent DataNode log spam if a client connects on the data transfer port but sends no data. (Resolved)