Hadoop Common / HADOOP-16452

Increase ipc.maximum.data.length default from 64MB to 128MB


    Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 2.6.0
    • Fix Version/s: 3.3.0
    • Component/s: ipc
    • Labels:
      None
    • Release Note:
      Default ipc.maximum.data.length is now 128 MB in order to accommodate huge block reports.

      Description

      Reason for bumping the default:
      Denser DataNodes are becoming the norm; a DataNode with more than 7 million blocks is no longer unusual these days.

      With such a high number of blocks, the block report message can exceed the 64 MB limit (defined by ipc.maximum.data.length). The block reports are then rejected, causing missing blocks in HDFS. We had to double this configuration value in order to work around the issue.

      We are seeing an increasing number of these cases. I think it's time to revisit some of these default values as the hardware evolves.
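
      As an illustration of the workaround described above, the limit can be raised on affected clusters by setting ipc.maximum.data.length in core-site.xml. The value is in bytes; 134217728 (128 MB, double the old 64 MB default) is shown here as an example and matches the new default this issue introduces:

      ```xml
      <!-- core-site.xml: example override of the IPC message size limit -->
      <property>
        <name>ipc.maximum.data.length</name>
        <!-- 128 MB in bytes; the pre-3.3.0 default was 67108864 (64 MB) -->
        <value>134217728</value>
      </property>
      ```

      A NameNode restart is required for the new limit to take effect on the server side.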

        Attachments

        1. HADOOP-16452.001.patch
          0.9 kB
          Siyao Meng
        2. HADOOP-16452.002.patch
          2 kB
          Siyao Meng


              People

              • Assignee: smeng Siyao Meng
              • Reporter: weichiu Wei-Chiu Chuang
              • Votes: 0
              • Watchers: 10

                Dates

                • Created:
                • Updated:
                • Resolved: