Details

    • Type: Bug
    • Status: Closed
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 0.21.0
    • Fix Version/s: 0.21.0
    • Component/s: build
    • Labels:
      None
    • Hadoop Flags:
      Reviewed

      Description

      Two problems found:

    • ${build.src} is not included in the ant target "compile-core-classes"; thus o.a.h.package-info.java, which contains the version annotation, is not compiled.

    • bin/hadoop-config.sh attempts to include jar files matching the pattern hadoop-*core.jar rather than hadoop-core*.jar.
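
      A minimal sketch of the kind of change implied for the second problem is shown below; the loop and variable names are assumptions for illustration, not the committed patch (the first problem is addressed in build.xml by including ${build.src} in the compile-core-classes source list):

      # Hypothetical sketch of the hadoop-config.sh classpath loop: match the
      # new hadoop-core*.jar names instead of the old hadoop-*core.jar pattern.
      for f in "$HADOOP_HOME"/hadoop-core*.jar; do
        CLASSPATH=${CLASSPATH}:$f
      done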

        Issue Links

          Activity

          Hudson added a comment -

          Integrated in Hadoop-Common-trunk #41 (See http://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/41/)
          . Fix jar file names in hadoop-config.sh and include ${build.src} as a part of the source list in build.xml. Contributed by Hong Tang

          Tsz Wo Nicholas Sze added a comment -

          Tested manually as shown below:

          bash-3.2$ ./bin/hadoop version
          Hadoop Unknown
          Subversion Unknown -r Unknown
          Compiled by Unknown on Unknown
          From source with checksum Unknown
          
          # apply patch and recompile
          bash-3.2$ patch -p0 -i ../hadoop-6172-20090731.patch 
          ...
          
          bash-3.2$ ./bin/hadoop version
          Hadoop 0.21.0-dev
          Subversion https://svn.apache.org/repos/asf/hadoop/common/trunk -r 798247
          Compiled by tsz on Fri Jul 31 14:40:51 PDT 2009
          From source with checksum 4cec386d7ec492be92e14fc12a5a50a4
          

          I have committed this. Thanks, Hong!

          Konstantin Boudnik added a comment -

          Fine with me. Let's commit this first. To track the script testing, I've created HADOOP-6172.

          Tsz Wo Nicholas Sze added a comment -

          > Also, I think whatever tests might be automated with a relatively low effort should be automated. ...

          Since our build process does not support script unit tests, adding a test for this issue is not a relatively low effort. The fixes in this issue are simple but important. Even if we decide to create unit tests for scripts, how about doing it in a separate issue, so that we can commit this first?

          Hong Tang added a comment -

          @Cos, in this particular case, I want to conduct two tests:

          • $hadoop-src-dir/bin/hadoop version
          • $hadoop-src-dir/build/hadoop-xxx/bin/hadoop version

          The second test requires hadoop to be copied to build/bin, which does not happen until "ant tar" is called. So the second test would depend on the "tar" target.
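
          A rough sketch of automating both checks is shown below; the paths, the glob, and the grep for "Unknown" are assumptions for illustration, not an actual test in the tree:

          #!/usr/bin/env bash
          # Hypothetical sketch: run 'hadoop version' from the source tree and from
          # the packaged layout, and fail if the build info was not compiled in
          # (i.e. the output still reports "Unknown").
          status=0
          for h in bin/hadoop build/hadoop-*/bin/hadoop; do
            if "$h" version 2>&1 | grep -q Unknown; then
              echo "FAIL: $h reports Unknown version info"
              status=1
            else
              echo "PASS: $h"
            fi
          done
          exit $status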

          Konstantin Boudnik added a comment -

          I'm not sure why the dependency would be changed - it seems that the scripts come directly from SVN and nothing special is required to run

          bin/hadoop version
          

          except compilation. Thus certain script tests won't require anything special either. Am I mistaken?

          Also, I think whatever tests can be automated with relatively low effort should be automated. Manual tests aren't that efficient nor error-proof. E.g. in the current common trunk, bin/hadoop version produces a lot of 'Unknown' version tags, which would have been caught by a routinely executed test.

          Shall we start adding tests for scripts right now? Why not - this day isn't worse than any subsequent one.

          Hong Tang added a comment -

          Such a test could change the dependency, because it tests whether the tar target runs correctly. Alternatively, we could probably incorporate it into the "test-patch" target (because that builds a tar in the process).
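
          For example, a wrapper along these lines (target and path names are assumptions, not existing build machinery) would make the packaged-layout check depend on the tar build:

          # Hypothetical sketch: the packaged-layout check only makes sense after
          # the tarball has been assembled, so the build step has to run first.
          ant tar && build/hadoop-*/bin/hadoop version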

          Tsz Wo Nicholas Sze added a comment -

          > Shouldn't we have a primitive unit test to check bin/hadoop version result? ..

          I think this is a good idea, but we do not have any unit test that tests a script (correct me if I am wrong). I am not sure why we do not create unit tests for scripts; I guess we usually test scripts manually. Are you proposing to add unit tests for scripts?

          Holding off on the commit.

          Konstantin Boudnik added a comment -

          Shouldn't we have a primitive unit test to check the bin/hadoop version result? I believe this is a very basic sanity test too.

          Tsz Wo Nicholas Sze added a comment -

          Since this is a script change, test-patch and unit tests are not applicable. I am going to commit this without submitting it to Hudson.

          Tsz Wo Nicholas Sze added a comment -

          +1 patch looks good.

          Hong Tang added a comment -

          @sreekanth, I am not sure if they are the same problem. For HADOOP-4503, calling "ant jar" twice would resolve the problem. But for this one, a java file (build/src/org/apache/hadoop/package-info.java) is simply not compiled and deposited into the build/classes directory.
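
          For reference, a quick way to check whether package-info.java made it into the compiled classes (the path is an assumption based on the default build layout):

          # Hypothetical check: after 'ant compile-core-classes', the annotation
          # class should be present under build/classes.
          ls build/classes/org/apache/hadoop/package-info.class \
            && echo "package-info compiled" \
            || echo "package-info missing"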

          Hong Tang added a comment -

          Trivial patch that fixes both problems. No junit test required. Manually tested and confirmed to resolve the problems.

          Sreekanth Ramakrishnan added a comment -

          HADOOP-4503 actually talks about the first problem which is mentioned in this JIRA.

          Hong Tang added a comment -

          Problem symptoms:
          % bin/hadoop version
          Hadoop Unknown
          Subversion Unknown -r Unknown
          Compiled by Unknown on Unknown
          From source with checksum Unknown

          % build/hadoop-core-0.21.0-dev/bin/hadoop version
          Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/util/VersionInfo
          Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.util.VersionInfo
          at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
          at java.security.AccessController.doPrivileged(Native Method)
          at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
          at java.lang.ClassLoader.loadClass(ClassLoader.java:319)
          at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:330)
          at java.lang.ClassLoader.loadClass(ClassLoader.java:254)
          at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:402)


            People

            • Assignee:
              Hong Tang
              Reporter:
              Hong Tang
            • Votes:
              0
              Watchers:
              3

              Dates

              • Created:
                Updated:
                Resolved:
