Details

    • Type: Task
    • Status: Closed
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 0.22.0
    • Component/s: None
    • Labels: None
    • Hadoop Flags: Reviewed

      Issue Links

        Activity

        Nigel Daley added a comment -

        Owen, thoughts on how we should rename these jar files in the mapred and hdfs projects?

        hadoop-0.21.0-dev-ant.jar
        hadoop-0.21.0-dev-core.jar
        hadoop-0.21.0-dev-examples.jar
        hadoop-0.21.0-dev-test.jar
        hadoop-0.21.0-dev-tools.jar

        Giridharan Kesavan added a comment -

        With this split, I'm also thinking of changing the folder structure to follow the Maven standards. Any comments?
        Thanks!

        Nigel Daley added a comment -

        What would those standards be?

        dhruba borthakur added a comment -

        I am assuming that HadoopQA will run all unit tests (both dfs and map-reduce) when a patch is submitted for the dfs sub-project (or for the map-reduce subproject).

        Giridharan Kesavan added a comment -

        What would those standards be?

        Java sources would go here:
        src/main/java
        Test sources would go here:
        src/test/java
        For details: http://maven.apache.org/guides/introduction/introduction-to-the-standard-directory-layout.html
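
        To make the layout concrete, here is a hypothetical sketch of the standard Maven directory structure applied to one of the split projects (the hadoop-hdfs module name is illustrative, not something decided in this issue):

```shell
# Sketch only: create the standard Maven layout for one split project.
mkdir -p hadoop-hdfs/src/main/java       # production sources
mkdir -p hadoop-hdfs/src/main/resources  # config files bundled into the jar
mkdir -p hadoop-hdfs/src/test/java       # test sources
mkdir -p hadoop-hdfs/src/test/resources  # test fixtures
```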

        Tom White added a comment -

        As far as jar naming goes, how about

        hadoop-{core,hdfs,mapred}-0.21.0-dev.jar
        hadoop-{core,hdfs,mapred}-{ant,examples,test,tools}-0.21.0-dev.jar

        Having the version number at the end seems to be the standard way to name things in the Ivy/Maven world.
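
        For clarity, the brace patterns above expand as follows (the patterns are Tom's proposal; the echo commands are just an illustration using shell brace expansion):

```shell
# Expand the proposed jar-name patterns; version stays at the end.
echo hadoop-{core,hdfs,mapred}-0.21.0-dev.jar
# prints: hadoop-core-0.21.0-dev.jar hadoop-hdfs-0.21.0-dev.jar hadoop-mapred-0.21.0-dev.jar

echo hadoop-{core,hdfs,mapred}-{ant,examples,test,tools}-0.21.0-dev.jar
# prints all twelve project/artifact combinations, e.g. hadoop-hdfs-test-0.21.0-dev.jar
```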

        Nigel Daley added a comment -

        +1 on maven layout.

        +1 on Tom's jar name suggestions.

        Regarding jar names, we'll need to update bin scripts. Here are current references in the top-level bin scripts:

        hadoop-config.sh:for f in $HADOOP_CORE_HOME/hadoop-*-core.jar; do
        hadoop-config.sh:for f in $HADOOP_CORE_HOME/hadoop-*-tools.jar; do
        hadoop-config.sh:for f in $HADOOP_CORE_HOME/build/hadoop-*-tools.jar; do
        hadoop-config.sh:  for f in $HADOOP_HDFS_HOME/hadoop-*-hdfs.jar; do
        hdfs:for f in $HADOOP_HDFS_HOME/hadoop-*-hdfs.jar; do
        mapred:for f in $HADOOP_MAPRED_HOME/hadoop-*-mapred.jar; do
        rcc:for f in $HADOOP_HOME/hadoop-*-core.jar; do

        Contrib bin scripts, Makefiles, etc. also need to be checked and updated. Owen, this seems beyond what Giri should have to do to get the build working. Should we file a separate Jira for that once the names are settled on?
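
        If the jar names change as proposed, the classpath loops quoted above could be updated along these lines; this is a hypothetical sketch (the add_jars_to_classpath helper and its patterns are assumptions, not part of the actual hadoop-config.sh):

```shell
# Sketch: a helper mirroring the classpath loops in hadoop-config.sh,
# adapted for the proposed hadoop-<project>-<version>.jar naming
# (version at the end instead of the middle).
add_jars_to_classpath() {
  # $1: project home directory, $2: jar name prefix (e.g. hadoop-hdfs)
  local f
  for f in "$1"/"$2"-*.jar; do
    if [ -e "$f" ]; then
      CLASSPATH="${CLASSPATH:+$CLASSPATH:}$f"
    fi
  done
}

# Usage sketch:
# add_jars_to_classpath "$HADOOP_HDFS_HOME" hadoop-hdfs
# add_jars_to_classpath "$HADOOP_MAPRED_HOME" hadoop-mapred
```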

        Iyappan Srinivasan added a comment -

        Please look at MAPREDUCE-1004: the ant binary is not copying jar files properly.

        Owen O'Malley added a comment -

        This should have been resolved a long time ago.


          People

          • Assignee:
            Owen O'Malley
            Reporter:
            Owen O'Malley
          • Votes:
            1
          • Watchers:
            19
