Hadoop Common
HADOOP-3305

Publish hadoop-core to the apache repository with an appropriate POM file

    Details

    • Type: New Feature
    • Status: Resolved
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 0.16.2, 0.16.3
    • Fix Version/s: None
    • Component/s: build
    • Labels:
      None
    • Release Note:
      This build.xml pulls the ivy.jar by itself and doesn't require any environment change except for ant-1.7.0. Builds can proceed as they do now, except that the machine which runs the build has to be online to resolve the dependencies from the maven repository.

      Minimum ant version requirement: 1.7.0

      Description

      To let people downstream build/test with hadoop, using Apache Ivy or Apache Maven2 to pull it down, hadoop-core needs to be published to the apache repository with a .pom file that lists its mandatory dependencies.

      In an automated build process, this means
      - having a template XML POM defining all included dependencies (and excluded transitive dependency artifacts)
      - having a property file driving the version numbering of all artifacts
      - copying this template with property expansion to create the release POM file (a minimal sketch of such a template follows below)
      - public releases only: sticking this POM file up on people.apache.org in the right place, along with the JAR and some .md5 checksums
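
      For illustration, a minimal sketch of what such a template POM might look like; the @token@ placeholders, the group/artifact ids and the excluded artifacts are assumptions, not the actual published metadata:

      <!-- hypothetical hadoop-core POM template; @hadoop.version@ and friends are
           placeholders the build would expand, not real values -->
      <project xmlns="http://maven.apache.org/POM/4.0.0">
        <modelVersion>4.0.0</modelVersion>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-core</artifactId>
        <version>@hadoop.version@</version>
        <dependencies>
          <dependency>
            <groupId>commons-logging</groupId>
            <artifactId>commons-logging</artifactId>
            <version>@commons-logging.version@</version>
            <exclusions>
              <!-- block transitive artifacts that downstream builds don't need -->
              <exclusion>
                <groupId>avalon-framework</groupId>
                <artifactId>avalon-framework</artifactId>
              </exclusion>
              <exclusion>
                <groupId>logkit</groupId>
                <artifactId>logkit</artifactId>
              </exclusion>
            </exclusions>
          </dependency>
          <dependency>
            <groupId>log4j</groupId>
            <artifactId>log4j</artifactId>
            <version>@log4j.version@</version>
            <optional>true</optional>
          </dependency>
        </dependencies>
      </project>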

      There's a risk that if the hadoop team don't do this, someone else will (as mahout is doing under http://people.apache.org/~kalle/mahout/maven2/org/apache/hadoop/).
      This is bad, as hadoop ends up fielding the support calls for someone else's files.

      Before automating the process, existing hadoop-core JARs can be pushed out with hand-encoded POM files. The repository police don't allow POM files ever to be changed, so supporting existing releases (0.16.2, 0.16.3, ...) is a way of beta testing the POMs.

      1. rmlib-v3.sh
        1 kB
        Giridharan Kesavan
      2. rmlib.sh
        1 kB
        Giridharan Kesavan
      3. rmlib_V2.sh
        1.0 kB
        Giridharan Kesavan
      4. ivy-support-first-pass.zip
        849 kB
        steve_l
      5. ivysupport.zip
        850 kB
        steve_l
      6. ivysupport.zip
        805 kB
        steve_l
      7. hadoop-core-0.16.2.pom
        4 kB
        steve_l
      8. HADOOP-3305-v3.patch
        276 kB
        Giridharan Kesavan
      9. HADOOP-3305.patch
        396 kB
        Giridharan Kesavan
      10. HADOOP-3305.patch
        257 kB
        Giridharan Kesavan
      11. HADOOP-3305_V2.patch
        380 kB
        Giridharan Kesavan

        Issue Links

          Activity

          Doug Cutting added a comment -

          I just committed this, with two changes:

          • add 'unless="offline"' to the downloads of Ivy, so that developers can work offline by specifying 'ant -Doffline=true' (a sketch of such a guarded target follows below).
          • include jars in lib/ directory of release builds, for back-compatibility.
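
          A minimal sketch of how such a guarded Ivy download might look in Ant 1.7; the target name, property names and repository URL are illustrative, not necessarily what is in Hadoop's build.xml:

            <!-- illustrative only: skipped when the build is started with ant -Doffline=true -->
            <property name="ivy.jar" location="${build.dir}/ivy/ivy-${ivy.version}.jar"/>
            <target name="ivy-download" unless="offline"
                    description="Fetch the ivy jar from the maven repository unless building offline">
              <mkdir dir="${build.dir}/ivy"/>
              <get src="http://repo1.maven.org/maven2/org/apache/ivy/ivy/${ivy.version}/ivy-${ivy.version}.jar"
                   dest="${ivy.jar}" usetimestamp="true"/>
            </target>
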
          Giridharan Kesavan added a comment -

          Here is the final version of the patch, which enables ivy for most of the targets, except for findbugs and the target which uses forrest for doc generation.

          Thanks,
          Giri

          Giridharan Kesavan added a comment -

          I'm resubmitting the patch.
          Thanks,
          Giri

          Hadoop QA added a comment -

          -1 overall. Here are the results of testing the latest attachment
          http://issues.apache.org/jira/secure/attachment/12396071/rmlib-v3.sh
          against trunk revision 726129.

          +1 @author. The patch does not contain any @author tags.

          -1 tests included. The patch doesn't appear to include any new or modified tests.
          Please justify why no tests are needed for this patch.

          -1 patch. The patch command could not apply the patch.

          Console output: http://hudson.zones.apache.org/hudson/job/Hadoop-Patch/3744/console

          This message is automatically generated.

          Giridharan Kesavan added a comment -

          The v3 version of the patch comprises:

          • the comments/suggestions made by Steve.
          • all the ant xml files being called build.xml and not ivybuild.xml, as suggested by Doug.

          Thanks,
          Giri

          Doug Cutting added a comment -

          Shouldn't this replace build.xml, not add ivybuild.xml?

          Giridharan Kesavan added a comment -

          This patch incorporates the comments suggested by Steve. It also has fixes for most of the ant targets, except for forrest, as the forrest jar is yet to be published.

          Please review the patch; I'm also supplying a supporting script for removing the jar files which would be resolved by ivy.

          Thanks again to Steve for his comments.

          Giri

          steve_l added a comment -

          Giri, here are the core ivy settings updated to Jetty6 and the last ivy rc out the door. There is another Ivy release being voted on today, but I haven't used that yet.

          I think the POM is adequate in excluding lots of stuff that the logging tools pull in: stuff that isn't in the repository and which will break downstream builds if you don't block it. Jetty uses SLF4J for logging, and the log4j bindings for that can live alongside log4j.

          steve_l added a comment -

          >Other thing that I'm not sure is "How to decide on the version of components that doesn't have the version # as part of their jar name?"

          Sticking the checksums into google works most reliably.
          - JSON is 1.0
          - junit is 3.8.1
          - ant is 1.7.0
          - servlet API and servlet are 5.5.12
          - jsp-api and taglibs don't exist

          Giridharan Kesavan added a comment - edited

          Thanks Steve for your comments!

          Here is the approach to address point 1

          We have a top-level libraries.properties file and component-level libraries.properties files.
          In a component-level libraries.properties file we can define the version of any dependency that differs from the one defined at the top level (the global libraries.properties file); a sketch of how this override could be wired up follows below.
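
          A minimal sketch of how that override could be wired up in a component's build file, relying on Ant's rule that the first value assigned to a property wins; the paths here are assumptions:

            <!-- illustrative only: load the component-level versions first so they
                 take precedence over the global defaults -->
            <property file="${basedir}/ivy/libraries.properties"/>
            <property file="${hadoop.root}/ivy/libraries.properties"/>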

          Other thing that I'm not sure is "How to decide on the version of components that doesn't have the version # as part of their jar name?"

          For example, servlet-api.jar inside the chukwa/lib folder doesn't seem to have a version. This is one such example; we have a lot more like this.

          We have other dependencies inside Chukwa/lib which don't seem to have a version # either.
          How do we define versions for them?

          To answer the 2nd point:
          I see that Chukwa is importing the hadoop/fs, hadoop/io and hadoop/conf packages, which means that Chukwa depends on hadoop.
          I looked at the smartfrog ivy files but couldn't make out the cross-referencing idea. Could you please elaborate on that?

          I will address comments 1, 3 and 4 in my next patch.

          Thanks again for your comments.
          Giri

          steve_l added a comment -

          I've had a quick flick through and I'm impressed by the effort that Giri has gone to here. This looks like the basis for moving everything in the core to ivy.

          1. It would be good to have everything driven by a single master properties file, with each project having the option to override it (via their own libraries.properties) but not being required to. This makes it easy to push every app up to a new version of, say, log4j, by changing one file, and it keeps things consistent. Giri's patch shows how inconsistent the projects are regarding versions of things, and that just leads to trouble down the line.

          2. Does Chukwa depend on hadoop-core? If it does, there's a good case for the ivy config file of hadoop-core to contain some specific configurations for jetty and jsp support, so that Chukwa can pull them in without having to repeat them (see the sketch below for one way this could look). This is what we do in smartfrog by cross-referencing component packages: only one package is allowed to import a third-party library; everything else has to depend on that package. It works very well when you move to RPM distribution, as the same dependencies and ownership rules apply there.

          3. We'd need to go through the ivy reports of everything and make sure that nothing is pulling in transitive dependencies you don't want. If they pull in transitive dependencies you do want, it is safer to declare them and the version you desire. commons-logging is a notorious source of problems here; you should only ever depend on its "master" version to avoid stuff you don't need, like avalon-logkit and bits of JMX.

          4. src/contrib/hdfsproxy/ivybuild.xml has hard-coded version numbers in the build file. It should be driven from the .properties file.

          I need to play with this some more, by patching a clean version of the source tree and seeing how it goes. It's a good design; there are just a few more tweaks we need to get in there.
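
          As a sketch of the cross-referencing idea in point 2 (module, configuration and version names are illustrative, not the actual files): hadoop-core's ivy.xml could expose a dedicated configuration for the jetty/jsp stack, and a downstream module such as Chukwa would map onto that configuration instead of redeclaring the jars:

            <!-- illustrative only: hadoop-core/ivy.xml exposing a 'jetty' configuration -->
            <ivy-module version="2.0">
              <info organisation="org.apache.hadoop" module="hadoop-core"/>
              <configurations>
                <conf name="default"/>
                <conf name="jetty" description="jetty and jsp support for downstream web apps"/>
              </configurations>
              <dependencies>
                <dependency org="org.mortbay.jetty" name="jetty" rev="${jetty.version}" conf="jetty->default"/>
                <dependency org="tomcat" name="jasper-runtime" rev="${jasper.version}" conf="jetty->default"/>
              </dependencies>
            </ivy-module>

            <!-- a downstream ivy.xml (e.g. Chukwa's) would then pull the same stack with: -->
            <dependency org="org.apache.hadoop" name="hadoop-core" rev="${hadoop.version}"
                        conf="default->default;jetty->jetty"/>
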

          steve_l added a comment -

          This is what I'm working on right now; it publishes the core hadoop artifacts to the local repository, where my other build can grab them. I also take the new artifacts and stick them in an SVN-managed repository so that our hudson build runs against the releases I make; some work is needed to make sure that my local build doesn't pick up those (usually dated) artifacts, but instead stays up to date with whatever I build.

          The versions I have for things are:

          commons-cli.version=2.0-SNAPSHOT
          commons-codec.version=1.3
          commons-httpclient.version=3.0.1
          commons-net.version=1.4.1
          commons-logging.version=1.0.4
          commons-el.version=1.0
          hsqldb.version=1.8.0.7
          ivy.version=2.0.0.rc1_20080519182948
          jasper.version=5.5.12
          jsp-api.version=${jasper.version}

          jets3t.version=0.6.0
          jetty5.version=5.1.4
          junit.version=3.8.1
          kfs.version=0.1
          log4j.version=1.2.15
          oro.version=2.0.8
          servletapi.version=2.4
          xmlenc.version=0.52

          This hasn't migrated to jetty6 yet, and it used an early release of Ivy; it should work with the latest official release now.

          commons-cli versioning is a problem: there hasn't been an official release there for a while. The solution is to persuade them to release one.
          json is a different problem, in that there isn't an official release at all; you get to build your own, and often the builds have different MD5 checksums just from manifest.mf metadata. We need a better idea of what they built against, and maybe stick something unofficial in the repository that works.

          Nigel Daley added a comment -

          Thanks Giri!

          Steve, can you review this? Does this look like it's on the right track?

          Giridharan Kesavan added a comment - edited

          This patch, HADOOP-3305.patch, is developed from the patch submitted by Steve; thanks to Steve.
          rmlib.sh is the supporting script to clean up the jar files from the lib folder.

          With this patch we should be able to use ivy to resolve dependencies through the maven repository, and we can get rid of the local lib folder once all the dependencies are available in the m2 repository.

          As Steve mentioned there are certain missing libraries in the m2 repo.

          Those missing dependencies are still added to the classpath from the local lib folder, while the other dependencies that are available in the m2 repository are resolved/retrieved by Ivy from the maven repository.
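
          A minimal sketch of how such a mixed classpath could be assembled, with Ivy-resolved jars plus whatever still lives in lib/; the path ids and configuration name are assumptions:

            <!-- illustrative only: requires the ivy antlib, e.g.
                 xmlns:ivy="antlib:org.apache.ivy.ant" on the <project> element -->
            <ivy:resolve conf="default"/>
            <ivy:cachepath pathid="ivy.lib.path" conf="default"/>
            <path id="classpath">
              <path refid="ivy.lib.path"/>
              <!-- jars not yet published to the m2 repository stay in lib/ -->
              <fileset dir="${basedir}/lib" includes="*.jar"/>
            </path>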

          Here I've provided the list of missing dependencies for the different components.

          Hadoop-core
          -commons-cli-2.0-SNAPSHOT.jar - 20040117.000000 snapshot available..
          -kfs-0.2.2.jar - NOT AVAILABLE

          Thrift
          -hadoopthriftapi.jar - NOT AVAILABLE
          -libthrift.jar - NOT AVAILABLE

          Chukwa
          -json.jar - Not sure of the jar file version, though json is available in the m2 repo.

          This patch contains a new set of ivybuild.xml & ivy.xml files for the different components,
          which resemble the existing build.xml files except for the ivy implementation.
          The ivy.xml file for each component lists all of its dependencies, with versions
          pulled from the corresponding component's libraries.properties file.

          Also this patch contains a top level ivysettings.xml file which has the details of all the resolvers and the url for resolving the dependencies.
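
          A minimal sketch of what such an ivysettings.xml could contain; the resolver names and URL are illustrative:

            <!-- illustrative only: a maven2-compatible resolver chain -->
            <ivysettings>
              <settings defaultResolver="default"/>
              <resolvers>
                <chain name="default">
                  <ibiblio name="maven2" m2compatible="true"
                           root="http://repo1.maven.org/maven2/"/>
                </chain>
              </resolvers>
            </ivysettings>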


          Instructions to test the patch:

          1. Apply the patch.
          2. Run the rmlib.sh script from the hadoop-core base folder.
             [This removes from the local lib folder the jar files that can be resolved/retrieved from the m2 repository.]

          As of now the patch supports 3 main targets:

          * compile
          * package
          * releaseaudit

          To execute the targets

          ant -f ivybuild.xml releaseaudit

          This has compile and package targets as dependencies.

          Work is in progress for the other ant targets. Meanwhile, please review this patch.

          Thanks,
          Giri

          steve_l added a comment -

          This is a zip file containing nearly everything needed to
          - pull in all the hadoop-core dependencies via Ivy
          - publish the built file to a local ivy repository
          - generate a maven2-compatible JAR and POM, both with MD5 signatures

          It doesn't make any changes to the existing build; there is a new file, ivybuild.xml, that lives alongside it to do the ivy work.
          Files:
          - ivy.xml: lists the various classpaths that get set up and their dependencies; based on some trial and error.
          - ivybuild.xml: publishes the artifacts.
          - ivy/ivysettings.xml: configuration file for ivy.
          - ivy/libraries.properties: list of the versions of all artifacts.
          - ivy/ivy-2.0.0.rc1_20080519182948.jar: build of SVN_HEAD from monday.
          - ivy/hadoop-core.pom: template POM file for hadoop. This is copied with property expansion to put in the correct version information, and excludes all unneeded artifacts.

          I'm publishing this for people who want to integrate hadoop builds with local Ivy builds, and to start a process of sticking hadoop artifacts up on the apache repositories. It also shows that Ivy can be used to set up Hadoop's classpath, but it doesn't make a strong case for actually doing so:
          - there's no kfs.jar in the central repositories
          - there are no commons-cli 2.0 artifacts in the central or snapshot repositories

          What is useful for many other projects is to put the hadoop-core artifacts into the maven repository, starting with the snapshot. That could be done using a small subset of what we have here, though there's still the problem of no commons-cli, which the command line tools use.
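
          A minimal sketch of the publish step described above, assuming the ivy antlib is loaded (xmlns:ivy="antlib:org.apache.ivy.ant"); the target, artifact and directory names are placeholders:

            <!-- illustrative only: generate a maven2-compatible POM, checksum the artifacts
                 and publish them to a local ivy repository -->
            <target name="ivy-publish-local" depends="jar">
              <ivy:makepom ivyfile="${basedir}/ivy.xml"
                           pomfile="${build.dir}/hadoop-core-${version}.pom"/>
              <checksum file="${build.dir}/hadoop-core-${version}.jar" algorithm="md5"/>
              <checksum file="${build.dir}/hadoop-core-${version}.pom" algorithm="md5"/>
              <ivy:publish resolver="local" pubrevision="${version}" overwrite="true">
                <artifacts pattern="${build.dir}/[artifact]-[revision].[ext]"/>
              </ivy:publish>
            </target>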

          steve_l added a comment -

          >> -having a property file driving version numbering of all artifacts

          >Lucene does this by using the version property from build.xml, so that we don't have to maintain another version file.

          Yes, hadoop should do that too. Even so, you need another file to drive the versions of all your dependencies: those that are currently encoded in the filenames in /lib (jetty, log4j) or not documented at all (servlet-api.jar).

          >> -public releases only: sticking this POM file up on people.apache.org in the right place, along with the JAR and some .md5 checksums

          >If these are officially released artifacts, can't we just post them with the release, to www.apache.org/dist/hadoop/core? Why do we need to alter our distribution mechanism for Maven?

          It's something that could be done on the side (in a separate build.xml), which takes the signed-off release artifacts and scps them up to people.apache.org. The repository police do check that the JARs put up are officially released, although they don't audit the POMs so thoroughly.
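
          For that side build, a minimal sketch of the upload using Ant's scp task (which needs the optional jsch library on Ant's classpath); the remote path, user property and key file are assumptions:

            <!-- illustrative only: push the released jar, POM and checksums to people.apache.org -->
            <target name="stage-to-apache">
              <scp todir="${apache.user}@people.apache.org:/www/people.apache.org/repo/m2-ibiblio-rsync-repository/org/apache/hadoop/hadoop-core/${version}/"
                   keyfile="${user.home}/.ssh/id_dsa" passphrase="" trust="true">
                <fileset dir="${dist.dir}">
                  <include name="hadoop-core-${version}.jar*"/>
                  <include name="hadoop-core-${version}.pom*"/>
                </fileset>
              </scp>
            </target>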

          Doug Cutting added a comment -

          > -having a property file driving version numbering of all artifacts

          Lucene does this by using the version property from build.xml, so that we don't have to maintain another version file.

          > -public releases only: sticking this POM file up on people.apache.org in the right place, along with the JAR and some .md5 checksums

          If these are officially released artifacts, can't we just post them with the release, to www.apache.org/dist/hadoop/core? Why do we need to alter our distribution mechanism for Maven?

          steve_l added a comment -

          Prototype hadoop-0.16.2 POM. This should be reviewed, and we can consider whether to publish it to the central repository.

          Some notes:
          - versions were taken from the 0.16.2 lib directory
          - dropped the kfs dependency, as there is no such artifact in the repository
          - marked everything but commons-logging as optional
          - there is no hadoop-compatible commons-cli release in the repository (marked one as optional)
          - jetS3t pulls in way too much stuff; does it really need all of those? If so, then adding AWS authentication to http-client and doing RESTful S3 operations would be better. (I use restlet for this, incidentally.)

          steve_l added a comment -

          There is already a draft pom for 0.17.0:
          http://people.apache.org/~kalle/mahout/maven2/org/apache/hadoop/core/0.17.0-SNAPSHOT/core-0.17.0-20080315.201857-1.pom

          This declares an explicit dependency on most of hadoop-core/lib, and does not indicate which are optional. It is not ideal.


            People

            • Assignee:
              Giridharan Kesavan
            • Reporter:
              Steve Loughran
            • Votes:
              3
            • Watchers:
              11
