Hadoop HDFS / HDFS-1314

dfs.blocksize accepts only absolute value

    Details

    • Type: Bug
    • Status: Closed
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 0.23.0
    • Fix Version/s: 0.23.1
    • Component/s: None
    • Labels:
    • Hadoop Flags:
      Reviewed
    • Release Note:
      The default blocksize property 'dfs.blocksize' now accepts unit suffixes in place of a raw byte count. Values such as "10k", "128m", and "1g" may now be provided instead of the exact number of bytes, as was required before.
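For illustration, here is how a suffixed value described by this release note might be set in hdfs-site.xml (a sketch; only the property name 'dfs.blocksize' and the value format come from this issue):

```xml
<property>
  <name>dfs.blocksize</name>
  <!-- 128m = 128 * 1024 * 1024 = 134217728 bytes -->
  <value>128m</value>
</property>
```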

    Description

      Using "dfs.block.size=8388608" works
      but "dfs.block.size=8mb" does not.

      Using "dfs.block.size=8mb" should at least log a WARNING about the NumberFormatException.
      (http://pastebin.corp.yahoo.com/56129)
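A minimal sketch of the kind of suffix-aware parsing this issue asks for. This is not Hadoop's actual implementation (Hadoop uses its own utility classes for this); the class and method names here are hypothetical, and like Hadoop's traditional binary prefixes it accepts a single-letter suffix, so "8m" parses but "8mb" would still fail:

```java
// Hypothetical sketch: parse a blocksize string such as "8m" or "8388608"
// into a byte count. Single-letter binary suffixes only.
public class BlockSizeParser {
    public static long parse(String value) {
        String v = value.trim().toLowerCase();
        long multiplier;
        char last = v.charAt(v.length() - 1);
        switch (last) {
            case 'k': multiplier = 1L << 10; break;
            case 'm': multiplier = 1L << 20; break;
            case 'g': multiplier = 1L << 30; break;
            case 't': multiplier = 1L << 40; break;
            default:
                // No recognized suffix: the whole string must be a plain number.
                return Long.parseLong(v);
        }
        // Strip the suffix and scale the numeric part.
        return Long.parseLong(v.substring(0, v.length() - 1)) * multiplier;
    }

    public static void main(String[] args) {
        System.out.println(parse("8m"));      // 8388608
        System.out.println(parse("8388608")); // 8388608
    }
}
```

With this scheme, "dfs.block.size=8mb" would still throw a NumberFormatException ("8m" is not a plain number after stripping 'b' is not attempted), which is why surfacing a clear WARNING for unparseable values matters alongside suffix support.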

    Attachments

    1. hdfs-1314.txt (7 kB, Harsh J)
    2. hdfs-1314.txt (6 kB, Sho Shimauchi)
    3. hdfs-1314.txt (7 kB, Sho Shimauchi)
    4. hdfs-1314.txt (7 kB, Sho Shimauchi)
    5. hdfs-1314.txt (8 kB, Sho Shimauchi)
    6. hdfs-1314.txt (6 kB, Sho Shimauchi)

    People

    • Assignee: Sho Shimauchi (sho.shimauchi)
    • Reporter: Karim Saadah (karims)
    • Votes: 1
    • Watchers: 6

    Dates

    • Created:
    • Updated:
    • Resolved: