Details

    • Type: Improvement
    • Status: Closed
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 0.17.0
    • Component/s: build
    • Labels: None

      Description

      I've created Eclipse project files for Hadoop (to be attached). I've found them very useful for exploring Hadoop and running the unit tests.

      The project files can be included in the source repository to make it easy to import Hadoop into Eclipse.

      A few features:

      • Eclipse automatically calls the Ant build to generate some of the necessary source files
      • Single unit tests can be run from inside Eclipse
      • Basic Java code style formatter settings for the Hadoop conventions (still needs some work)

      The following VM arguments must be specified in the run configuration to get unit tests to run:

      -Xms256m -Xmx256m -Dtest.build.data=${project_loc}\build\test\data

      Some of the unit tests don't run yet, possibly due to some missing VM flags, the fact that I'm running Windows, or some other reason(s).

      TODO:

      • Specify native library location(s) once I investigate building of Hadoop's native library
      • Get all the unit tests to run
      Attachments:

      1. hadoop-1228-v2.patch (5 kB, Tom White)
      2. hadoop-1228.patch (5 kB, Tom White)
      3. eclipse.patch (10 kB, Mark Butler)
      4. hadoop-eclipse.zip (5 kB, Albert Strasheim)
      5. .project (0.7 kB, Albert Strasheim)
      6. .classpath (2 kB, Albert Strasheim)

        Activity

        Albert Strasheim created issue -
        Albert Strasheim added a comment -

        Eclipse project file

        Albert Strasheim made changes -
        Field Original Value New Value
        Attachment .classpath [ 12355133 ]
        Albert Strasheim added a comment -

        Eclipse project file

        Albert Strasheim made changes -
        Attachment .project [ 12355134 ]
        Albert Strasheim added a comment -

        All the pieces needed by Eclipse

        Albert Strasheim made changes -
        Attachment hadoop-eclipse.zip [ 12355135 ]
        Mark Butler added a comment -

        Unfortunately it is not possible to have a standard project file for Eclipse, because

        org.apache.hadoop.record.compiler.ant.RccTask

        needs ant.jar. This was removed in https://issues.apache.org/jira/browse/HADOOP-1726,

        so within Eclipse we do not know where it is, as I am not aware of a way to reference the default Java classpath in Eclipse.

        I am guessing this is not a big problem, because most people on the project do not use Eclipse. Users who do should follow the advice here

        http://wiki.apache.org/lucene-hadoop/EclipseEnvironment

        I also enclose a patch that does all of the Eclipse configuration except the ant.jar part.

        Mark Butler made changes -
        Attachment eclipse.patch [ 12367760 ]
        Mark Butler made changes -
        Attachment eclipse.patch [ 12367760 ]
        Mark Butler added a comment -

        Patch file that provides files needed to compile Hadoop in Eclipse, although the location of ant.jar still needs configuring.

        Mark Butler made changes -
        Attachment eclipse.patch [ 12367761 ]
        Mark Butler made changes -
        Comment [ Patch file that adds Eclipse files doing most of the required configuration, except for the ant.jar issue. ]
        Tom White added a comment -

        Here's a patch (hadoop-1228.patch) that takes a different approach. It stores template files in a directory called .eclipse.templates, and there is an ant task to copy the files to the right place for Eclipse to find them. Since the .classpath, .project and other Eclipse files are not checked into Subversion, it's possible to tweak your Eclipse setup without seeing modified flags. In particular, you can check out multiple copies of Hadoop into the same workspace with different names.

        To use it:

        1. Set up an ANT_HOME Classpath Variable in Preferences. (This is a global Eclipse setting so you only need to do this once.)
        2. Checkout Hadoop.
        3. Apply this patch.
        4. Run the generate-eclipse-files ant target.
        5. Refresh the project.
        6. Select Project | Build Project.

        If folks find this useful we could check it in.

        Tom White made changes -
        Attachment hadoop-1228.patch [ 12374878 ]
        Tom White made changes -
        Status: Open [ 1 ] → Patch Available [ 10002 ]
        Hadoop QA added a comment -

        -1 overall. Here are the results of testing the latest attachment
        http://issues.apache.org/jira/secure/attachment/12374878/hadoop-1228.patch
        against trunk revision 616796.

        @author +1. The patch does not contain any @author tags.

        javadoc +1. The javadoc tool did not generate any warning messages.

        javac +1. The applied patch does not generate any new javac compiler warnings.

        release audit +1. The applied patch does not generate any new release audit warnings.

        findbugs +1. The patch does not introduce any new Findbugs warnings.

        core tests -1. The patch failed core unit tests.

        contrib tests +1. The patch passed contrib unit tests.

        Test results: http://hudson.zones.apache.org/hudson/job/Hadoop-Patch/1750/testReport/
        Findbugs warnings: http://hudson.zones.apache.org/hudson/job/Hadoop-Patch/1750/artifact/trunk/build/test/findbugs/newPatchFindbugsWarnings.html
        Checkstyle results: http://hudson.zones.apache.org/hudson/job/Hadoop-Patch/1750/artifact/trunk/build/test/checkstyle-errors.html
        Console output: http://hudson.zones.apache.org/hudson/job/Hadoop-Patch/1750/console

        This message is automatically generated.

        Owen O'Malley added a comment -

        I kind of got this to work, but I had to import the project. (Otherwise, there was nothing to refresh...)

        Could I suggest that the target be "ant eclipse-files", since the "generate" is implied by ant? Other than that, my only complaint is that it isn't the way that I like Eclipse set up (building using the Java builder instead of ant, Subversive hooked up).

        Tom White added a comment -

        Renamed target to "eclipse-files" following Owen's suggestion.

        The Java builder is only invoked to build the record io files for tests, so everything is compiled for Eclipse - you can still use ant. Regarding Subversive - I'm using Subclipse, but I don't see why we can't add whatever's needed for Subversive at some point.

        Tom White made changes -
        Attachment hadoop-1228-v2.patch [ 12378490 ]
        Tom White added a comment -

        I've just committed this.

        Tom White made changes -
        Assignee: Tom White [ tomwhite ]
        Resolution: Fixed [ 1 ]
        Status: Patch Available [ 10002 ] → Resolved [ 5 ]
        Tom White made changes -
        Fix Version/s 0.17.0 [ 12312913 ]
        Hudson added a comment -
        Integrated in Hadoop-trunk #444 (See http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/444/)
        Nigel Daley made changes -
        Status: Resolved [ 5 ] → Closed [ 6 ]
        lili added a comment -

        I have created a Map/Reduce project in Eclipse. My program is working properly. I measure the time the job takes by putting two long variables, start and finish, before and after job.waitForCompletion(true), as follows:
        start = System.currentTimeMillis();
        boolean b = job.waitForCompletion(true);
        finish = System.currentTimeMillis();
        But the problem is that the job takes longer than the normal program I wrote without Map/Reduce. I do not know if I am measuring the time correctly. Also, how can I reach the job's history file when I am running the program from Eclipse?
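        The timing pattern described above can be sketched as a self-contained example; the Thread.sleep call here is only a stand-in for job.waitForCompletion(true), which needs a real job:

        ```java
        // Self-contained sketch of wall-clock timing around a blocking call.
        // Thread.sleep is a placeholder for job.waitForCompletion(true).
        public class TimingSketch {
            public static void main(String[] args) throws InterruptedException {
                long start = System.currentTimeMillis();
                Thread.sleep(100); // placeholder for the blocking job call
                long finish = System.currentTimeMillis();
                System.out.println("elapsed ms: " + (finish - start));
            }
        }
        ```

        The measurement itself is reasonable for a job that runs for seconds or longer; for short intervals System.nanoTime() is more precise. Note also that a small MapReduce job is often slower than an equivalent plain program because of per-job startup overhead (task JVM launch, scheduling), not because of a mistimed measurement.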
        I have three text files, each containing about 80,000 binary string records. Given a string like A=(1001001001111......), I would like to search all the text files and check whether the zero bits in A are also zero in a record. I would also like to know whether a Map/Reduce job always has to work on text files and in String format or not.
        while (itr.hasMoreTokens()) {
            word.set(itr.nextToken());
            boolean match = true; // assume a match until a zero bit of A differs
            for (int i = 0; i < queryIndex.length() && match; i++) {
                if (queryIndex.charAt(i) == '0') {
                    if (word.charAt(i) != queryIndex.charAt(i)) {
                        match = false;
                    }
                }
            }
            if (match) {
                Text temp = new Text("user" + counter);
                context.write(temp, one);
            }
        }
        counter++;
        }

        Thanks a lot


          People

          • Assignee: Tom White
          • Reporter: Albert Strasheim
          • Votes: 1
          • Watchers: 2

            Dates

            • Created:
              Updated:
              Resolved:
