Hadoop Common / HADOOP-4243

Serialization framework should use SequenceFile/TFile/other metadata to instantiate the deserializer


Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Major
    • Resolution: Duplicate
    • Component: contrib/serialization

    Description

      SequenceFile metadata is useful for storing additional information about the serialized data: for example, for RecordIO, whether the data is CSV or binary. For Thrift, the same applies: binary, JSON, etc.

      For Hive, this may be especially important, because it has a dynamic generic serializer/deserializer that takes its DDL at runtime (as opposed to RecordIO and Thrift, which require pre-compilation into a specific class whose name can be stored in the sequence file key or value class). In this case, the class name is like Record.java in RecordIO: it doesn't tell you anything without the DDL.

      One way to address this would be to pass the sequence file metadata to the getDeserializer call in the Serialization interface. The API would then be something like getDeserializer(Class<?> c, Map<Text, Text> metadata), or with Properties for the metadata.

      But I am open to proposals.

      This also means that knowing a class implements Writable is not necessarily enough to deserialize it, since the deserializer may take specific actions based on the metadata; e.g., RecordIO might choose CSV rather than the default binary deserialization.

      There is also the converse issue: getSerializer would need to return the metadata to be written to the SequenceFile/TFile.
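      A minimal sketch of how the proposed metadata-aware getDeserializer might look. This is illustrative only, not the actual Hadoop API: it uses plain Map<String, String> in place of Hadoop's Map<Text, Text> to stay self-contained, and the names (MetadataDeserializer, RecordSerialization, FORMAT_KEY) are assumptions.

      ```java
      import java.util.Map;

      // Hypothetical deserializer handle returned by the framework.
      interface MetadataDeserializer<T> {
          T deserialize(String raw);
      }

      // Hypothetical serialization that consults file metadata, in the
      // spirit of RecordIO choosing CSV vs. the default binary form.
      class RecordSerialization {
          static final String FORMAT_KEY = "serialization.format"; // assumed key

          static MetadataDeserializer<String[]> getDeserializer(
                  Class<?> c, Map<String, String> metadata) {
              String format = metadata.getOrDefault(FORMAT_KEY, "Binary");
              if ("CSV".equals(format)) {
                  // CSV path: split the record on commas.
                  return raw -> raw.split(",");
              }
              // Placeholder "binary" path: treat the record as one field.
              return raw -> new String[] { raw };
          }
      }
      ```

      The point of the sketch is that the same class can deserialize the same bytes differently depending on what the container file's metadata says, which is exactly why the class name alone is not enough.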


              People

                Assignee: Unassigned
                Reporter: Pete Wyckoff (wyckoff)
                Votes: 0
                Watchers: 3
