Hadoop HDFS / HDFS-1314

dfs.blocksize accepts only absolute values


Details

    • Type: Bug
    • Status: Closed
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 0.23.0
    • Fix Version/s: 0.23.1
    • Component/s: None
    • Hadoop Flags: Reviewed
    • Release Note: The default block size property 'dfs.blocksize' now accepts unit suffixes in place of a raw byte count. Values such as "10k", "128m", and "1g" can now be provided instead of the exact number of bytes.
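
    For illustration, the sketch below shows how a client-side Configuration resolves
    the new-style value. It assumes the parsing goes through Configuration.getLongBytes
    (the committed change itself is in the attached hdfs-1314.txt), which understands
    the single-letter binary suffixes listed in the release note.

    {code:java}
    import org.apache.hadoop.conf.Configuration;

    public class BlockSizeUnitsExample {
      public static void main(String[] args) {
        // Stand-alone Configuration; no cluster is needed just to parse the value.
        Configuration conf = new Configuration(false);

        // Post-HDFS-1314 style: a unit suffix instead of a raw byte count.
        conf.set("dfs.blocksize", "128m");

        // getLongBytes() accepts single-letter binary suffixes (k, m, g, ...)
        // and returns the fallback when the key is unset. The 64 MB fallback
        // here is arbitrary, chosen only for this example.
        long blockSize = conf.getLongBytes("dfs.blocksize", 64L * 1024 * 1024);

        System.out.println(blockSize); // prints 134217728 (128 * 1024 * 1024)
      }
    }
    {code}

    The examples in the release note all use single-letter suffixes ("10k", "128m",
    "1g"); "8mb" as spelled in the original report is not among them.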

    Description

      Using "dfs.block.size=8388608" works,
      but "dfs.block.size=8mb" does not.

      At a minimum, setting "dfs.block.size=8mb" should log a WARNING for the
      resulting NumberFormatException.
      (http://pastebin.corp.yahoo.com/56129)
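
      The failure described above is plain Long.parseLong() choking on a unit suffix.
      The hypothetical sketch below only illustrates the reported behaviour and the
      warn-and-fall-back handling the report asks for; parseBlockSize and its 64 MB
      default are invented here and are not the actual HDFS client code.

      {code:java}
      public class BlockSizeParseSketch {
        static final long DEFAULT_BLOCK_SIZE = 64L * 1024 * 1024;

        static long parseBlockSize(String raw) {
          try {
            // Pre-fix behaviour: only a plain byte count parses.
            return Long.parseLong(raw);            // "8388608" -> 8388608
          } catch (NumberFormatException e) {
            // "8mb" lands here; before the fix the exception surfaced with no
            // hint, instead of at least a WARN plus a sane fallback.
            System.err.println("WARN: cannot parse block size '" + raw
                + "', falling back to " + DEFAULT_BLOCK_SIZE);
            return DEFAULT_BLOCK_SIZE;
          }
        }

        public static void main(String[] args) {
          System.out.println(parseBlockSize("8388608")); // 8388608
          System.out.println(parseBlockSize("8mb"));     // warns, then 67108864
        }
      }
      {code}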

      Attachments

        1. hdfs-1314.txt
          7 kB
          Harsh J
        2. hdfs-1314.txt
          6 kB
          Sho Shimauchi
        3. hdfs-1314.txt
          7 kB
          Sho Shimauchi
        4. hdfs-1314.txt
          7 kB
          Sho Shimauchi
        5. hdfs-1314.txt
          8 kB
          Sho Shimauchi
        6. hdfs-1314.txt
          6 kB
          Sho Shimauchi

    People

      Assignee: Sho Shimauchi (sho.shimauchi)
      Reporter: Karim Saadah (karims)
      Votes: 1
      Watchers: 6
