Hadoop Common / HADOOP-474

support compressed text files as input and output


Details

    • Type: Improvement
    • Status: Closed
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 0.5.0
    • Fix Version/s: 0.6.0
    • Component/s: None
    • Labels: None

    Description

      I'd like TextInputFormat and TextOutputFormat to automatically compress and uncompress text files when they are read and written. Furthermore, I'd like to be able to use custom compressors as defined in HADOOP-441. Therefore, I propose:

      Adding a map of compression codecs in the server config files:

      io.compression.codecs = "<suffix>=<codec class>,..."

      so the default would be something like:

      <property>
        <name>io.compression.codecs</name>
        <value>.gz=org.apache.hadoop.io.GZipCodec,.Z=org.apache.hadoop.io.ZipCodec</value>
        <description>A list of file suffixes and the codecs for them.</description>
      </property>

      Note that the suffix can include multiple "." characters, so you could support suffixes like ".tar.gz"; each suffix is simply matched as a literal against the end of the filename.
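
      As an illustration only, here is a minimal sketch of the suffix lookup this map implies; the CodecMap class, its parsing, and the longest-suffix-wins rule are assumptions for the example, not code from the attached patches:

        import java.util.LinkedHashMap;
        import java.util.Map;

        // Hypothetical helper: resolves a filename to a codec class name
        // using the io.compression.codecs suffix map described above.
        public class CodecMap {
          private final Map<String, String> suffixToCodec = new LinkedHashMap<String, String>();

          // Parses a value like ".gz=org.apache.hadoop.io.GZipCodec,.Z=org.apache.hadoop.io.ZipCodec"
          public CodecMap(String configValue) {
            for (String entry : configValue.split(",")) {
              String[] parts = entry.split("=", 2);
              if (parts.length == 2) {
                suffixToCodec.put(parts[0].trim(), parts[1].trim());
              }
            }
          }

          // Matches each suffix literally against the end of the filename, so
          // multi-dot suffixes like ".tar.gz" work; assumes the longest match
          // wins. Returns null for a filename with no compressed suffix.
          public String codecFor(String filename) {
            String best = null;
            int bestLen = -1;
            for (Map.Entry<String, String> e : suffixToCodec.entrySet()) {
              String suffix = e.getKey();
              if (filename.endsWith(suffix) && suffix.length() > bestLen) {
                best = e.getValue();
                bestLen = suffix.length();
              }
            }
            return best;
          }
        }

      With the default value above, codecFor("part-00000.gz") would return org.apache.hadoop.io.GZipCodec, while a plain "part-00000" would return null.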

      If the TextInputFormat is dealing with such a file, it:
      1. makes a single split, since a compressed stream cannot be split at arbitrary byte offsets
      2. decompresses automatically (see the sketch below)
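
      The following sketch shows both behaviors for a ".gz" file, using plain java.util.zip for brevity; the method names isSplitable and openDecompressed are illustrative, and the real implementation would go through the HADOOP-441 codec interface rather than hard-coding gzip:

        import java.io.FileInputStream;
        import java.io.IOException;
        import java.io.InputStream;
        import java.util.zip.GZIPInputStream;

        public class CompressedTextInput {

          // 1. A gzip stream cannot be split at arbitrary byte offsets, so a
          //    compressed file becomes one split covering the whole file.
          static boolean isSplitable(String filename) {
            return !filename.endsWith(".gz");
          }

          // 2. Decompression is transparent: the record reader wraps the raw
          //    stream, and callers read lines as if the file were plain text.
          static InputStream openDecompressed(String filename) throws IOException {
            InputStream raw = new FileInputStream(filename);
            return filename.endsWith(".gz") ? new GZIPInputStream(raw) : raw;
          }
        }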

      On the output side, if mapred.output.compress is true, then TextOutputFormat would use a new property, mapred.output.compression.codec, that defines the codec used to compress the outputs, defaulting to gzip.
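
      A job could then request compressed output roughly as follows; this is a sketch against the mapred API of that era, using the property names proposed above (the GZipCodec class name follows the earlier example and is not guaranteed):

        import org.apache.hadoop.mapred.JobConf;
        import org.apache.hadoop.mapred.TextOutputFormat;

        public class CompressedOutputJob {
          public static JobConf configure() {
            JobConf conf = new JobConf();
            conf.setOutputFormat(TextOutputFormat.class);
            // Proposed switch from this issue: turn on output compression.
            conf.set("mapred.output.compress", "true");
            // Proposed codec property; defaults to gzip when unset.
            conf.set("mapred.output.compression.codec", "org.apache.hadoop.io.GZipCodec");
            return conf;
          }
        }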

      Attachments

        1. text-gz-3.patch (59 kB, Owen O'Malley)
        2. text-gz-2.patch (54 kB, Owen O'Malley)
        3. text-gz.patch (41 kB, Owen O'Malley)


            People

              Assignee: Owen O'Malley (omalley)
              Reporter: Owen O'Malley (omalley)
              Votes: 0
              Watchers: 0
