Hadoop Common / HADOOP-6846

Scripts for building Hadoop 0.22.0 release

    Details

    • Type: Task
    • Status: Closed
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 0.22.0
    • Fix Version/s: 0.22.0
    • Component/s: build
    • Labels:
      None
    • Hadoop Flags:
      Reviewed
    Attachments

    1. release-scripts.tar.gz
      4 kB
      Tom White
    2. hadoop-install.tar.gz
      42 kB
      Joep Rottinghuis
    3. HADOOP-6846.patch
      2 kB
      Patrick Hunt

        Activity

        Tom White added a comment -

        Here are the scripts I used to build the release candidate in conjunction with the instructions at http://wiki.apache.org/hadoop/HowToRelease. I used an EC2 instance (AMI ami-c38d6caa) for the build.
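
        A minimal usage sketch, assuming the archive unpacks build-hadoop.sh at its top level (the script name appears in a later comment; the archive layout and working directory are assumptions):

            # Hypothetical invocation of the attached release scripts; the
            # unpack layout is an assumption, not documented behavior.
            tar xzf release-scripts.tar.gz
            ./build-hadoop.sh   # builds hadoop-common, hadoop-hdfs, and hadoop-mapreduce in turn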

        Robert Demb added a comment -

        Hi,

        I'm reporting a bug I'm encountering while using these release scripts.

        Since there is no build.xml file for 0.21.0, I searched and found these scripts. Executing the build-hadoop.sh file successfully builds hadoop-common and hadoop-hdfs, but produces the following error for hadoop-mapreduce (see below).

        I apologize if this has gone out to a wide audience who are already familiar with this issue and have it solved. I could not find any documentation on how to build 0.21.0, aside from the JIRA posts concerning the absence of the build.xml and the JIRA posts on these scripts.

        If there is a 0.21.0 release that has the coordinated build.xml, please let me know.

        Thank you,

        Robert

        [javac] /home/hadoop_devel/hadoop-mapreduce/build/src/org/apache/hadoop/mapred/jobdetailshistory_jsp.java:268: cannot find symbol
        [javac] symbol  : class FilteredJob
        [javac] location: class org.apache.hadoop.mapred.HistoryViewer
        [javac]     HistoryViewer.FilteredJob filter = new HistoryViewer.FilteredJob(job, TaskStatus.State.FAILED.toString());
        [javac]     ^
        [javac] /home/hadoop_devel/hadoop-mapreduce/build/src/org/apache/hadoop/mapred/jobdetailshistory_jsp.java:317: cannot find symbol
        [javac] symbol  : class FilteredJob
        [javac] location: class org.apache.hadoop.mapred.HistoryViewer
        [javac]     filter = new HistoryViewer.FilteredJob(job, TaskStatus.State.KILLED.toString());
        [javac]     ^
        [javac] /home/hadoop_devel/hadoop-mapreduce/build/src/org/apache/hadoop/mapred/jobhistory_jsp.java:186: cannot find symbol
        [javac] symbol  : variable OLD_SUFFIX
        [javac] location: class org.apache.hadoop.mapred.JobHistory
        [javac]     path.getName().endsWith(JobHistory.OLD_SUFFIX)) &&
        [javac]     ^
        [javac] /home/hadoop_devel/hadoop-mapreduce/build/src/org/apache/hadoop/mapred/jobhistory_jsp.java:316: cannot find symbol
        [javac] symbol  : method getJobIDFromHistoryFilePath(org.apache.hadoop.fs.Path)
        [javac] location: class org.apache.hadoop.mapred.JobHistory
        [javac]     String jobId = JobHistory.getJobIDFromHistoryFilePath(jobFile).toString();
        [javac]     ^
        [javac] /home/hadoop_devel/hadoop-mapreduce/build/src/org/apache/hadoop/mapred/jobhistory_jsp.java:317: cannot find symbol
        [javac] symbol  : method getUserFromHistoryFilePath(org.apache.hadoop.fs.Path)
        [javac] location: class org.apache.hadoop.mapred.JobHistory
        [javac]     String userName = JobHistory.getUserFromHistoryFilePath(jobFile);
        [javac]     ^
        [javac] /home/hadoop_devel/hadoop-mapreduce/build/src/org/apache/hadoop/mapred/taskdetailshistory_jsp.java:52: cannot find symbol
        [javac] symbol  : method getTaskLogsUrl(org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.TaskAttemptInfo)
        [javac] location: class org.apache.hadoop.mapred.HistoryViewer
        [javac]     String taskLogsUrl = HistoryViewer.getTaskLogsUrl(taskAttempt);
        [javac]     ^
        [javac] Note: Some input files use or override a deprecated API.
        [javac] Note: Recompile with -Xlint:deprecation for details.
        [javac] 13 errors

        BUILD FAILED
        /home/rdemb/hadoop_devel/hadoop-mapreduce/build.xml:378: Compile failed; see the compiler error output for details.

        Tom White added a comment -

        > Since there is no build.xml file for 0.21.0

        The 0.21 release has three build.xml files, one in each of the common, hdfs, and mapred directories. You can use these to build independent tarballs, then create a single unified tarball using the script attached to this issue, as sketched below.
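
        A rough sketch of that flow (the ant target name and the combining step are assumptions; the actual script is in the attached release-scripts.tar.gz):

            # Hypothetical: build one tarball per component, then combine them.
            for component in common hdfs mapred; do
              (cd "$component" && ant tar)   # "tar" target assumed; check each build.xml
            done
            ./tar-munge                      # assumed entry point for unifying the tarballs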

        Konstantin Boudnik added a comment -

        I just got an idea (perhaps it has been discussed earlier, though): shall we try using a tool like crepo for complex releases like this?

        Tom White added a comment -

        These scripts should be useful for the 0.22.0 release too, so I suggest they get committed after HADOOP-7106 is done.

        Celina d´ Ávila Samogin added a comment -

        I have followed the instructions in the README and applied these patches to 0.22.0 common, and everything has worked fine on my notebook. I edited the tar-munge file and made a few modifications to make it work. I've installed the generated tarball for version 0.22.0, and the Hadoop daemons (NameNode, SecondaryNameNode, DataNode, JobTracker, TaskTracker) have worked well.

        Celina d´ Ávila Samogin added a comment -

        I have applied patches from https://issues.apache.org/jira/browse/HADOOP-6342.

        Patrick Hunt added a comment -

        I pulled the tar-munge file out of release-scripts.tar.gz and attached it as a patch.

        Commit it to hadoop/nightly and ensure that the file is executable; see the sketch below for an illustration.
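
        A hypothetical version of those commit steps (the checkout path, patch level, and svn usage are assumptions; the commit message is taken from the later Hudson notifications):

            cd nightly                        # assumed checkout of hadoop/nightly
            patch -p0 < HADOOP-6846.patch     # patch level is an assumption
            chmod +x tar-munge
            svn add tar-munge                 # only needed if the file is new to the tree
            svn commit -m "HADOOP-6846. Scripts for building Hadoop 0.22.0 release."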

        Patrick Hunt added a comment -

        Tom White and I (phunt) worked on this as part of the hackathon. It allows a hadoop-0.22.0 tar.gz file to be generated by the Jenkins build, which can be seen here:

        https://builds.apache.org/hudson/view/G-L/view/Hadoop/job/Hadoop-22-Build/9/

        (hadoop-0.22.0-SNAPSHOT.tar.gz includes core/hdfs/mapred tar.gz's)

        After committing this change (hadoop/nightly), edit the build configuration on Jenkins to remove this section from "execute shell":

            # for the moment pull my version
            wget --no-check-certificate http://github.com/phunt/hadoop-nightly/raw/master/tar-munge
            chmod a+x tar-munge
            mv tar-munge ${WORKSPACE}/nightly/tar-munge

        Patrick Hunt added a comment -

        Tom, I just noticed that you did not check "grant for inclusion" on your release-scripts.tar.gz attachment. Can you please comment on this JIRA granting such (given I pulled the patch from that archive)? Thanks.

        Hadoop QA added a comment -

        -1 overall. Here are the results of testing the latest attachment
        http://issues.apache.org/jira/secure/attachment/12478884/HADOOP-6846.patch
        against trunk revision 1102093.

        +1 @author. The patch does not contain any @author tags.

        -1 tests included. The patch doesn't appear to include any new or modified tests.
        Please justify why no new tests are needed for this patch.
        Also please list what manual steps were performed to verify this patch.

        +1 javadoc. The javadoc tool did not generate any warning messages.

        +1 javac. The applied patch does not increase the total number of javac compiler warnings.

        -1 findbugs. The patch appears to introduce 1 new Findbugs (version 1.3.9) warning.

        +1 release audit. The applied patch does not increase the total number of release audit warnings.

        +1 core tests. The patch passed core unit tests.

        +1 contrib tests. The patch passed contrib unit tests.

        +1 system test framework. The patch passed system test framework compile.

        Test results: https://builds.apache.org/hudson/job/PreCommit-HADOOP-Build/436//testReport/
        Findbugs warnings: https://builds.apache.org/hudson/job/PreCommit-HADOOP-Build/436//artifact/trunk/build/test/findbugs/newPatchFindbugsWarnings.html
        Console output: https://builds.apache.org/hudson/job/PreCommit-HADOOP-Build/436//console

        This message is automatically generated.

        Tom White added a comment -

        I grant the patch for inclusion.

        Hudson added a comment -

        Integrated in Hadoop-Common-trunk-Commit #592 (See https://builds.apache.org/hudson/job/Hadoop-Common-trunk-Commit/592/)
        HADOOP-6846. Scripts for building Hadoop 0.22.0 release.

        Hudson added a comment -

        Integrated in Hadoop-Mapreduce-trunk-Commit #664 (See https://builds.apache.org/hudson/job/Hadoop-Mapreduce-trunk-Commit/664/)
        HADOOP-6846. Scripts for building Hadoop 0.22.0 release.

        Hudson added a comment -

        Integrated in Hadoop-Mapreduce-22-branch #47 (See https://builds.apache.org/hudson/job/Hadoop-Mapreduce-22-branch/47/)
        HADOOP-6846. Scripts for building Hadoop 0.22.0 release.

        Hudson added a comment -

        Integrated in ZooKeeper-trunk #1180 (See https://builds.apache.org/hudson/job/ZooKeeper-trunk/1180/)
        HADOOP-6846. Scripts for building Hadoop 0.22.0 release.

        Hudson added a comment -

        Integrated in Hadoop-Common-trunk #686 (See https://builds.apache.org/hudson/job/Hadoop-Common-trunk/686/)
        HADOOP-6846. Scripts for building Hadoop 0.22.0 release.

        Hudson added a comment -

        Integrated in Hadoop-Mapreduce-trunk #679 (See https://builds.apache.org/hudson/job/Hadoop-Mapreduce-trunk/679/)

        Hudson added a comment -

        Integrated in Hadoop-Hdfs-trunk-Commit #658 (See https://builds.apache.org/hudson/job/Hadoop-Hdfs-trunk-Commit/658/)

        Hudson added a comment -

        Integrated in Hadoop-Common-22-branch #49 (See https://builds.apache.org/hudson/job/Hadoop-Common-22-branch/49/)

        Hudson added a comment -

        Integrated in Hadoop-Hdfs-22-branch #49 (See https://builds.apache.org/hudson/job/Hadoop-Hdfs-22-branch/49/)

        Hudson added a comment -

        Integrated in Hadoop-Hdfs-trunk #673 (See https://builds.apache.org/hudson/job/Hadoop-Hdfs-trunk/673/)

        Celina d´ Ávila Samogin added a comment -

        After many tests, I'm submitting some scripts to generate the Hadoop tarball without documentation and without source code, only 32-bit binary code. I have used them in my master's research.

        With these scripts, you can get a hadoop tarball with modified and compiled source code for testing.

        The scripts were based on patches HADOOP-6342 and HADOOP-6846.

        https://github.com/celinasam/Generate-Hadoop-Tarball

        Joep Rottinghuis added a comment -

        I used a slightly different approach. My requirements were:
        1) Build everything from source.
        2) Build everything together into one bundle (I did not want to deal with "this machine has that version of such-and-such rpm; is it compatible with that other version?", etc.).
        3) Ability to push a deployment to a machine and stage it without taking down the cluster.
        4) Ability to have multiple installations on a server in parallel.
        5) Ship configuration from source control.
        6) Have a simple install script that creates the appropriate symlinks.
        7) Avoid /etc/alternatives and other indirections (see 4).
        8) Do not rely on existing environment variables.

        What I did was create a Jenkins job for each component. That includes the hadoop-config, which is simply a tarball with config directories wrapped in a nice Eclipse project. The installer will link the server or client config in. A sample is provided.

        Then one assembly build pulls all the tarballs from the Jenkins server, untars them, places the jars in the proper locations, and generates the bin templates (injecting variable names where needed).

        See attached for an example.
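
        As a minimal sketch of the install step described above (all paths, names, and the bundle layout here are illustrative assumptions, not the contents of the attached hadoop-install.tar.gz):

            # Hypothetical installer: unpack a versioned bundle and flip symlinks,
            # so multiple installs coexist (req. 4) and no environment variables
            # are required (req. 8).
            VERSION=0.22.0-build47                               # assumed version tag
            tar xzf hadoop-${VERSION}.tar.gz -C /opt/hadoop/installs
            ln -sfn /opt/hadoop/installs/hadoop-${VERSION} /opt/hadoop/current
            ln -sfn /opt/hadoop/current/conf.server /opt/hadoop/current/conf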


          People

          • Assignee: Tom White
          • Reporter: Tom White
          • Votes: 0
          • Watchers: 7
