Hadoop Map/Reduce
MAPREDUCE-62

job.xml should have high replication factor by default

    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Not a Problem
    • Affects Version/s: None
    • Fix Version/s: None
    • Component/s: None
    • Labels: None

      Description

      job.xml is given the default replication factor of 3 by the JobClient. The same config file is read from DFS by the JobTracker as well as all the TaskTrackers, so it should be given a higher replication factor, say 10, just like job.jar.
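
      For illustration only (not part of this issue), a minimal sketch of how the replication factor of a job.xml already sitting in DFS could be raised by hand through the standard FileSystem API; the path passed on the command line is a hypothetical example:

          import org.apache.hadoop.conf.Configuration;
          import org.apache.hadoop.fs.FileSystem;
          import org.apache.hadoop.fs.Path;

          public class RaiseJobXmlReplication {
            public static void main(String[] args) throws Exception {
              Configuration conf = new Configuration();
              FileSystem fs = FileSystem.get(conf);
              // Hypothetical path to a submitted job's job.xml in the system directory.
              Path jobXml = new Path(args[0]);
              // Raise the replication factor so the JT and TTs have more datanodes
              // to read the file from, mirroring what is already done for job.jar.
              fs.setReplication(jobXml, (short) 10);
            }
          }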

        Activity

        Vinod Kumar Vavilapalli added a comment -

        Setting this would help ameliorate JobTracker's behaviour in situations like HADOOP-5285, as JT would now have more datanodes to download job.xml from.

        Harsh J added a comment -

        This is now done via mapred.submit.replication and the like. The default is 10.

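        For illustration only (not part of this issue), a minimal sketch of a client raising the submission-file replication factor through that property, assuming the old mapred API; the job setup details are placeholders:

            import org.apache.hadoop.mapred.JobClient;
            import org.apache.hadoop.mapred.JobConf;

            public class SubmitWithHighReplication {
              public static void main(String[] args) throws Exception {
                JobConf conf = new JobConf(SubmitWithHighReplication.class);
                // mapred.submit.replication controls the replication factor used for
                // job submission files such as job.xml and job.jar; the default is 10.
                conf.setInt("mapred.submit.replication", 10);
                // ... configure mapper, reducer, and input/output paths here ...
                JobClient.runJob(conf);
              }
            }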

          People

          • Assignee: Unassigned
          • Reporter: Vinod Kumar Vavilapalli
          • Votes: 0
          • Watchers: 2
