Hadoop Common / HADOOP-10669

Avro serialization does not flush buffered serialized values, causing data loss


Details

    • Type: Bug
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: 2.4.0
    • Fix Version/s: None
    • Component/s: io
    • Labels: None
    • Environment: avro serialization

    Description

      Found this while debugging Nutch.

      MapTask serializes keys and values to the same stream, in pairs:

      keySerializer.serialize(key);
      .....
      valSerializer.serialize(value);
      .....
      bb.write(b0, 0, 0);

      AvroSerializer does not flush its buffer after each serialization. So if it is used as the valSerializer, values are only partially written, or not written at all, to the output stream before the record is marked as complete (the last line above).
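      The failure mode can be reproduced with nothing but the JDK: a buffered wrapper stands in for AvroSerializer's internal buffer, and the underlying stream stands in for MapTask's record buffer. This is a minimal sketch of the principle, not Hadoop's or Avro's actual classes.

```java
import java.io.BufferedOutputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;

public class FlushDemo {
    public static void main(String[] args) throws IOException {
        // Plays the role of MapTask's record stream (bb above).
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        // Plays the role of the serializer-side buffer inside AvroSerializer.
        BufferedOutputStream buffered = new BufferedOutputStream(sink, 8192);

        buffered.write("serialized-value".getBytes("UTF-8"));

        // Nothing has reached the record stream yet: if the record is marked
        // complete at this point, the value is lost.
        System.out.println("before flush: " + sink.size() + " bytes"); // 0 bytes

        buffered.flush();
        System.out.println("after flush: " + sink.size() + " bytes"); // 16 bytes
    }
}
```

      The 16 bytes only become visible to the record stream once flush() runs, which is exactly the step AvroSerializer omits.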

      <EDIT> Added HADOOP-10669_alt.patch. This is a less intrusive fix, as it does not try to flush the MapTask stream. Instead, we write serialized values directly to the MapTask stream and avoid using a buffer on the Avro side.
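      The shape of the alternate fix can be sketched in plain Java: the serializer keeps a reference to the caller's stream and writes each record straight through, so no bytes ever sit in a serializer-side buffer waiting for a flush. All names here are illustrative, not the actual patch.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;

// Hypothetical direct-write serializer: no internal buffer, so the
// "mark record complete" step in MapTask can never observe partial output.
public class DirectSerializer {
    private OutputStream out;

    public void open(OutputStream out) {
        // Hold the caller's stream (MapTask's) and write to it directly.
        this.out = out;
    }

    public void serialize(String value) throws IOException {
        // Every byte reaches the target stream before serialize() returns.
        out.write(value.getBytes("UTF-8"));
    }

    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        DirectSerializer s = new DirectSerializer();
        s.open(sink);
        s.serialize("record");
        // No flush needed: the full record is already in the sink.
        System.out.println(sink.size() + " bytes"); // 6 bytes
    }
}
```

      The trade-off is losing the batching a buffer provides, but it sidesteps any need to coordinate flushes with MapTask.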

      Attachments

        1. HADOOP-10669.patch
          0.7 kB
          Mikhail Bernadsky
        2. HADOOP-10669_alt.patch
          0.8 kB
          Mikhail Bernadsky


            People

              Assignee: Unassigned
              Reporter: Mikhail Bernadsky (mikebern)
              Votes: 0
              Watchers: 3
