Hadoop Map/Reduce / MAPREDUCE-1194

mapred.reduce.slowstart.completed.maps allows values more than 1.0 and less than 0.0


Details

    • Type: Bug
    • Status: Open
    • Priority: Minor
    • Resolution: Unresolved
    • Affects Version/s: 0.20.1
    • Fix Version/s: None
    • Component/s: None

    Description

      When run with a value less than 0.0 (e.g. -1.0), the job behaves as if the value were 0.0, but with a value greater than 1.0 (e.g. 50.0) the reducers never start at all. Is there a reason such values are allowed? The parameter's description does make clear that it is a fraction, but some people may forget that, confuse it with a percentage, and run into problems as a result. Why not throw an error in this case?
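      A minimal sketch of the fail-fast behavior the report asks for. This is a hypothetical illustration, not the actual Hadoop code or the attached patch: the class and method names are invented, and only the property key comes from the issue.

```java
// Hypothetical sketch: reject out-of-range values for
// mapred.reduce.slowstart.completed.maps instead of silently
// treating them as 0.0 or never starting reducers.
public class SlowstartValidator {
    static final String KEY = "mapred.reduce.slowstart.completed.maps";

    // Returns the fraction unchanged if it lies in [0.0, 1.0];
    // otherwise throws, naming the offending key and value.
    public static float validate(float fraction) {
        if (fraction < 0.0f || fraction > 1.0f) {
            throw new IllegalArgumentException(
                KEY + " must be in [0.0, 1.0], but was " + fraction);
        }
        return fraction;
    }

    public static void main(String[] args) {
        System.out.println(validate(0.05f)); // in range, accepted
        try {
            validate(50.0f); // percent-style mistake, rejected
        } catch (IllegalArgumentException e) {
            System.out.println("rejected: " + e.getMessage());
        }
    }
}
```

      With a check like this, a user who writes 50.0 meaning "50%" would get an immediate configuration error rather than a job whose reducers never launch.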

      Attachments

        1. MAPREDUCE-1194.rev1.patch
          2 kB
          Gera Shegalov


          People

            Assignee: Unassigned
            Reporter: Maxim Zizin (mzizin)
            Votes: 0
            Watchers: 6
