Details
- Type: Sub-task
- Status: Resolved
- Priority: Major
- Resolution: Fixed
Description
When running the cat example tool (/hadoop-hdfs-native-client/src/main/native/libhdfspp/examples/cat/c/cat.c), I get the following error:
[libprotobuf ERROR google/protobuf/wire_format.cc:1053] String field contains invalid UTF-8 data when serializing a protocol buffer. Use the 'bytes' type if you intend to send raw bytes.
However, the tool executes correctly. This error appears to occur while serializing the client name in ClientOperationHeaderProto::SerializeWithCachedSizes (/hadoop-hdfs-native-client/target/main/native/libhdfspp/lib/proto/datatransfer.pb.cc).
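For reference, protobuf enforces UTF-8 validity only on fields declared as string. As I recall, the clientName field in datatransfer.proto is declared that way (excerpt quoted from memory; field numbers may differ):
{code}
message ClientOperationHeaderProto {
  required BaseHeaderProto baseHeader = 1;
  required string clientName = 2;
}
{code}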
Possibly the problem is caused by generating the client name as a UUID in GetRandomClientName (/hadoop-hdfs-native-client/src/main/native/libhdfspp/lib/common/util.cc): the 16 raw bytes of a UUID are arbitrary binary data and are almost never a valid UTF-8 sequence, yet protobuf requires string fields to hold valid UTF-8.
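One way to keep a UUID-style random name while avoiding the warning is to hex-encode the random bytes, since hex digits are plain ASCII and therefore always valid UTF-8. A minimal sketch follows; the function name matches util.cc, but the body and the "libhdfs++_" prefix are hypothetical, not the current implementation:
{code:cpp}
#include <random>
#include <string>

// Hypothetical replacement for GetRandomClientName(): rather than using
// 16 raw random bytes directly (almost never valid UTF-8), encode them
// as a 32-character hex string, which is plain ASCII.
std::string GetRandomClientName() {
  std::random_device rd;
  std::mt19937_64 gen(rd());
  std::uniform_int_distribution<unsigned> dist(0, 255);

  static const char kHex[] = "0123456789abcdef";
  std::string name = "libhdfs++_";  // prefix is an assumption, not settled naming
  for (int i = 0; i < 16; ++i) {
    unsigned byte = dist(gen);
    name.push_back(kHex[byte >> 4]);
    name.push_back(kHex[byte & 0x0F]);
  }
  return name;
}
{code}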
In the Java client it looks like there are two different unique client identifiers, ClientName and ClientId.
The client name is generated as:
clientName = "DFSClient_" + dfsClientConf.getTaskId() + "" + ThreadLocalRandom.current().nextInt() + "" + Thread.currentThread().getId(); (/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/DFSClient.java)
The ClientId is generated as a UUID in /hadoop-common/src/main/java/org/apache/hadoop/ipc/ClientId.java; since it is raw bytes, it is carried in a protobuf bytes field, so the UTF-8 check does not apply to it.
In libhdfs++ we possibly also need two unique client identifiers, or we need to fix the current client name so that it serializes without protobuf warnings/errors; one possible naming scheme is sketched below.
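If we follow the Java scheme, the client name can be built from printable components the way DFSClient builds it, while a raw UUID is kept separately for protocol fields declared as bytes. A rough sketch of the name half; the "libhdfspp" stand-in for Java's task id and the helper name MakeClientName are assumptions:
{code:cpp}
#include <random>
#include <sstream>
#include <string>
#include <thread>

// Hypothetical DFSClient-style client name: printable ASCII only, so it
// serializes into a protobuf string field without UTF-8 warnings.
std::string MakeClientName() {
  std::random_device rd;
  std::ostringstream ss;
  ss << "DFSClient_libhdfspp_"        // "libhdfspp" stands in for Java's task id
     << static_cast<int>(rd())        // analogous to ThreadLocalRandom.nextInt()
     << "_"
     << std::this_thread::get_id();   // analogous to Thread.currentThread().getId()
  return ss.str();
}
{code}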