ACCUMULO-1244

commons-io version conflict with CDH4

    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 1.5.0
    • Component/s: None
    • Labels: None
    • Environment:

      Hadoop version 2.0.0-CDH4.2.0

      Description

      CDH4 appears to rely on commons-io version 2.0 or greater. Accumulo currently packages in version 1.4. We should bump this up to achieve compatibility.

      Workaround: put the hadoop dependency libraries before the accumulo dependency libraries in the general.classpaths variable in accumulo-site.xml.
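      For example (the paths below are illustrative and vary by Hadoop installation and version), the Hadoop entries can be listed ahead of Accumulo's lib directory in accumulo-site.xml:

      ```xml
      <!-- accumulo-site.xml: Hadoop jars listed before Accumulo's lib directory,
           so Hadoop's commons-io 2.x wins over the bundled 1.4 -->
      <property>
        <name>general.classpaths</name>
        <value>
          $HADOOP_PREFIX/share/hadoop/common/.*.jar,
          $HADOOP_PREFIX/share/hadoop/common/lib/.*.jar,
          $ACCUMULO_HOME/lib/.*.jar,
        </value>
      </property>
      ```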

      2013-04-04 22:27:13,868 [tabletserver.Tablet] ERROR: Unknown error during minor compaction for extent: !0;~;!0<
      java.lang.RuntimeException: java.lang.NoSuchMethodError: org.apache.commons.io.IOUtils.closeQuietly(Ljava/io/Closeable;)V
        at org.apache.accumulo.server.tabletserver.Tablet.minorCompact(Tablet.java:2152)
        at org.apache.accumulo.server.tabletserver.Tablet.access$4400(Tablet.java:152)
        at org.apache.accumulo.server.tabletserver.Tablet$MinorCompactionTask.run(Tablet.java:2219)
        at org.apache.accumulo.core.util.LoggingRunnable.run(LoggingRunnable.java:34)
        at org.apache.accumulo.trace.instrument.TraceRunnable.run(TraceRunnable.java:47)
        at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
        at org.apache.accumulo.trace.instrument.TraceRunnable.run(TraceRunnable.java:47)
        at org.apache.accumulo.core.util.LoggingRunnable.run(LoggingRunnable.java:34)
        at java.lang.Thread.run(Thread.java:662)
      Caused by: java.lang.NoSuchMethodError: org.apache.commons.io.IOUtils.closeQuietly(Ljava/io/Closeable;)V
        at org.apache.hadoop.hdfs.DFSInputStream.getBlockReader(DFSInputStream.java:941)
        at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:471)
        at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:662)
        at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:706)
        at java.io.DataInputStream.read(DataInputStream.java:132)
        at java.io.DataInputStream.readFully(DataInputStream.java:178)
        at java.io.DataInputStream.readLong(DataInputStream.java:399)
        at org.apache.accumulo.core.file.rfile.bcfile.BCFile$Reader.<init>(BCFile.java:608)
        at org.apache.accumulo.core.file.blockfile.impl.CachableBlockFile$Reader.init(CachableBlockFile.java:246)
        at org.apache.accumulo.core.file.blockfile.impl.CachableBlockFile$Reader.getBCFile(CachableBlockFile.java:257)
        at org.apache.accumulo.core.file.blockfile.impl.CachableBlockFile$Reader.access$000(CachableBlockFile.java:143)
        at org.apache.accumulo.core.file.blockfile.impl.CachableBlockFile$Reader$MetaBlockLoader.get(CachableBlockFile.java:212)
        at org.apache.accumulo.core.file.blockfile.impl.CachableBlockFile$Reader.getBlock(CachableBlockFile.java:313)
        at org.apache.accumulo.core.file.blockfile.impl.CachableBlockFile$Reader.getMetaBlock(CachableBlockFile.java:367)
        at org.apache.accumulo.core.file.blockfile.impl.CachableBlockFile$Reader.getMetaBlock(CachableBlockFile.java:143)
        at org.apache.accumulo.core.file.rfile.RFile$Reader.<init>(RFile.java:834)
        at org.apache.accumulo.core.file.rfile.RFileOperations.openReader(RFileOperations.java:79)
        at org.apache.accumulo.core.file.DispatchingFileFactory.openReader(FileOperations.java:72)
        at org.apache.accumulo.server.tabletserver.Compactor.call(Compactor.java:317)
        at org.apache.accumulo.server.tabletserver.MinorCompactor.call(MinorCompactor.java:96)
        at org.apache.accumulo.server.tabletserver.Tablet.minorCompact(Tablet.java:2138)
        ... 9 more
      

        Issue Links

          Activity

          Billie Rinaldi added a comment -

          Would commons-io 2.1 be sufficient? That appears to be the version used by Apache Hadoop 1 and 2. Maybe we should review all our common dependencies with Hadoop.

          John Vines added a comment -

          I believe they all come from the apache pom parent, so we should double
          check that we're still in line with them.

          Sent from my phone, please pardon the typos and brevity.

          Billie Rinaldi added a comment -

          These are the ones I've found with lower versions (1.5 branch dependency -> hadoop 1.0.3 dependency -> hadoop-common 2.0.2-alpha dependency). The only one with a major version difference is commons-io.

          commons-codec          1.2    -> 1.3    -> 1.4
          commons-collections    3.2    -> 3.2.1  -> 3.2.1
          commons-configuration  1.5    -> 1.6    -> 1.6
          commons-io             1.4    -> 2.1    -> 2.1
          commons-lang           2.4    -> 2.4    -> 2.5
          commons-logging        1.0.4  -> 1.1.1  -> 1.1.1
          log4j                  1.2.16 -> 1.2.15 -> 1.2.17
          
          Billie Rinaldi added a comment -

          No, these are specified explicitly in our pom.

          Christopher Tubbs added a comment -

          Should these just be listed as "provided" then, if we're assuming they will come in with the hadoop installation? Then, we can just specify a range of acceptable versions for our code, and let maven do the dependency resolution, based on the hadoop dependencies.

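          A minimal sketch of what that would look like in a module pom (the version shown is illustrative; the actual range would be chosen to match Hadoop's dependencies):

          ```xml
          <!-- Marked "provided": present at compile time, but supplied by the
               Hadoop installation at runtime rather than bundled with Accumulo -->
          <dependency>
            <groupId>commons-io</groupId>
            <artifactId>commons-io</artifactId>
            <version>2.1</version>
            <scope>provided</scope>
          </dependency>
          ```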
          John Vines added a comment -

          Sounds good to me.

          Keith Turner added a comment -

          Should these just be listed as "provided" then, if we're assuming they will come in with the hadoop installation?

          that follows the pattern of other dependencies Accumulo shares with Hadoop. Why should these be different? I do not know of a reason, making the change is ok w/ me.

          Hudson added a comment -

          Integrated in Accumulo-1.5 #72 (See https://builds.apache.org/job/Accumulo-1.5/72/)
          ACCUMULO-1244 Make more libraries provided, because they must be. So, we'll depend on the version given by Hadoop. Remove all provided jars and source jars from lib directory. (Revision 1466685)

          Result = UNSTABLE
          ctubbsii :
          Files :

          • /accumulo/branches/1.5/assemble/pom.xml
          • /accumulo/branches/1.5/bin/accumulo
          • /accumulo/branches/1.5/bin/bootstrap_hdfs.sh
          • /accumulo/branches/1.5/core/pom.xml
          • /accumulo/branches/1.5/examples/simple/pom.xml
          • /accumulo/branches/1.5/fate/pom.xml
          • /accumulo/branches/1.5/pom.xml
          • /accumulo/branches/1.5/proxy/pom.xml
          • /accumulo/branches/1.5/server/pom.xml
          • /accumulo/branches/1.5/start/pom.xml
          • /accumulo/branches/1.5/test/pom.xml
          • /accumulo/branches/1.5/trace/pom.xml
          David Medinets added a comment -

+1, although I seem to be late to the voting booth.

          Hudson added a comment -

          Integrated in Accumulo-Trunk-Hadoop-2.0 #184 (See https://builds.apache.org/job/Accumulo-Trunk-Hadoop-2.0/184/)
          ACCUMULO-1244 fix log4 lookup for hadoop2 (Revision 1467052)
          ACCUMULO-1265, ACCUMULO-1250, ACCUMULO-1244, ACCUMULO-1063 merged to trunk from 1.5 branch (Revision 1467036)

          Result = SUCCESS
          ecn :
          Files :

          • /accumulo/trunk
          • /accumulo/trunk/assemble
          • /accumulo/trunk/bin/accumulo
          • /accumulo/trunk/core
          • /accumulo/trunk/examples
          • /accumulo/trunk/fate/src/main/java/org/apache/accumulo/fate/ZooStore.java
          • /accumulo/trunk/fate/src/main/java/org/apache/accumulo/fate/zookeeper/ZooSession.java
          • /accumulo/trunk/server
          • /accumulo/trunk/src

          ctubbsii :
          Files :

          • /accumulo/trunk
          • /accumulo/trunk/assemble
          • /accumulo/trunk/assemble/pom.xml
          • /accumulo/trunk/bin/accumulo
          • /accumulo/trunk/bin/bootstrap_hdfs.sh
          • /accumulo/trunk/core
          • /accumulo/trunk/core/pom.xml
          • /accumulo/trunk/core/src/main/scripts/generate-thrift.sh
          • /accumulo/trunk/examples
          • /accumulo/trunk/examples/simple/pom.xml
          • /accumulo/trunk/fate/pom.xml
          • /accumulo/trunk/fate/src/main/java/org/apache/accumulo/fate/ZooStore.java
          • /accumulo/trunk/fate/src/main/java/org/apache/accumulo/fate/zookeeper/ZooSession.java
          • /accumulo/trunk/pom.xml
          • /accumulo/trunk/proxy/pom.xml
          • /accumulo/trunk/server
          • /accumulo/trunk/server/pom.xml
          • /accumulo/trunk/src
          • /accumulo/trunk/start/pom.xml
          • /accumulo/trunk/test/pom.xml
          • /accumulo/trunk/trace/pom.xml
          Hudson added a comment -

          Integrated in Accumulo-1.5 #73 (See https://builds.apache.org/job/Accumulo-1.5/73/)
          ACCUMULO-1244 fix log4 lookup for hadoop2 (Revision 1467051)

          Result = SUCCESS
          ecn :
          Files :

          • /accumulo/branches/1.5/bin/accumulo
          Hudson added a comment -

          Integrated in Accumulo-Trunk #826 (See https://builds.apache.org/job/Accumulo-Trunk/826/)
          ACCUMULO-1244 fix log4 lookup for hadoop2 (Revision 1467052)
          ACCUMULO-1265, ACCUMULO-1250, ACCUMULO-1244, ACCUMULO-1063 merged to trunk from 1.5 branch (Revision 1467036)

          Result = SUCCESS
          ecn :
          Files :

          • /accumulo/trunk
          • /accumulo/trunk/assemble
          • /accumulo/trunk/bin/accumulo
          • /accumulo/trunk/core
          • /accumulo/trunk/examples
          • /accumulo/trunk/fate/src/main/java/org/apache/accumulo/fate/ZooStore.java
          • /accumulo/trunk/fate/src/main/java/org/apache/accumulo/fate/zookeeper/ZooSession.java
          • /accumulo/trunk/server
          • /accumulo/trunk/src

          ctubbsii :
          Files :

          • /accumulo/trunk
          • /accumulo/trunk/assemble
          • /accumulo/trunk/assemble/pom.xml
          • /accumulo/trunk/bin/accumulo
          • /accumulo/trunk/bin/bootstrap_hdfs.sh
          • /accumulo/trunk/core
          • /accumulo/trunk/core/pom.xml
          • /accumulo/trunk/core/src/main/scripts/generate-thrift.sh
          • /accumulo/trunk/examples
          • /accumulo/trunk/examples/simple/pom.xml
          • /accumulo/trunk/fate/pom.xml
          • /accumulo/trunk/fate/src/main/java/org/apache/accumulo/fate/ZooStore.java
          • /accumulo/trunk/fate/src/main/java/org/apache/accumulo/fate/zookeeper/ZooSession.java
          • /accumulo/trunk/pom.xml
          • /accumulo/trunk/proxy/pom.xml
          • /accumulo/trunk/server
          • /accumulo/trunk/server/pom.xml
          • /accumulo/trunk/src
          • /accumulo/trunk/start/pom.xml
          • /accumulo/trunk/test/pom.xml
          • /accumulo/trunk/trace/pom.xml
          John Vines added a comment -

For the record, this breaks compatibility with Hadoop releases at or before hadoop-1.0.1.

          Christopher Tubbs added a comment -

          So, of the 1.0.x line, can we just say we support >= 1.0.3?

          Josh Elser added a comment -

          No complaints here. Is there something with 1.0.0 through 1.0.2 that we don't support?

          John Vines added a comment -

          Like 0.20, 1.0.1 does not contain commons-io at all. I didn't take the time
          to peg its introduction.

          Sent from my phone, please pardon the typos and brevity.

          Christopher Tubbs added a comment -

          Okay, so I'd hate to package it by default.

          The real issue here is whether it should be included in our distribution, and I think it should not be. If one is using an older version of hadoop, they can simply download this dependency and include it themselves. Either way, we mark it as "provided"... whether it is provided by Hadoop, or provided alongside Hadoop, I think should be a user-consideration. As "provided", though, we should not include it at all.

          Christopher Tubbs added a comment -

          Re-opening until we decide whether to include commons-io in our distribution or not.

          Keith Turner added a comment -

          I think should be a user-consideration. As "provided", though, we should not include it at all.

          If we go down this path, then it should be documented in the README what the user should do if using hadoop 0.20

          Josh Elser added a comment -

          Either way, we mark it as "provided"... whether it is provided by Hadoop, or provided alongside Hadoop, I think should be a user-consideration. As "provided", though, we should not include it at all.

          I think that's the best thing to do. That, and what Keith noted about updating documentation on the subject of running against 0.20

          Christopher Tubbs added a comment -

          Closed, and opened a documentation ticket to address this (ACCUMULO-1320).

          Billie Rinaldi added a comment -

          Is it a necessity for the dependencies to be marked provided in the module poms, or could we move the provided markings to the top-level pom? It would be useful for me to be able to see which ones are provided at the top level. Also, I don't know how to run Accumulo now. When I try to init, it throws an exception because it can't find log4j. I even dropped a log4j jar in the lib directory, but it still doesn't work.

          Eric Newton added a comment -

          If you are running hadoop-2.0, you need to extend the classpath using the instructions in the example accumulo-site.xml files. Hadoop moved the locations of its many jar files around.

          Christopher Tubbs added a comment -

          Billie Rinaldi

          Is it a necessity for the dependencies to be marked provided in the module poms, or could we move the provided markings to the top-level pom?

          I would argue that scopes are best put in module poms, because the same artifact may have two different scopes in two different modules of a multi-module project, and it gets confusing when the default "compile" scope is explicit in some cases (to override the parent) and implicit in others.

          Specifying scopes this way, also helps simplify the copy-dependencies plugin configuration a lot, and makes the behavior of plugins that use the scopes more predictable (vs. the extra configuration in our assembly to get around the messiness of picking and choosing what to include). It also makes our integration testing environment much more realistic. The biggest downside I'm aware of is that provided scope is not resolved transitively in the unit testing phase of the build lifecycle.
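          As an illustration of the first point (the module names are hypothetical), the same artifact can legitimately carry different scopes in different module poms:

          ```xml
          <!-- server module pom: commons-io comes from the Hadoop installation -->
          <dependency>
            <groupId>commons-io</groupId>
            <artifactId>commons-io</artifactId>
            <scope>provided</scope>
          </dependency>

          <!-- a standalone tooling module pom: the same artifact with the
               default "compile" scope, bundled with that module -->
          <dependency>
            <groupId>commons-io</groupId>
            <artifactId>commons-io</artifactId>
          </dependency>
          ```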

          When I try to init, it throws an exception because it can't find log4j.

          I don't get that. Have you figured this out yet?

          Christopher Tubbs added a comment -

          Billie Rinaldi Also, your comment seems to relate to ACCUMULO-935.

          Billie Rinaldi added a comment -

          If you are running hadoop-2.0, you need to extend the classpath using the instructions in the example accumulo-site.xml files.

          It's not 2.0, but it's a later 1.x version that has moved the jar files too. I added the actual jar directory to accumulo-site.xml and it didn't help. I dropped the log4j jar in the accumulo/lib directory and it also didn't help. I copied the jar to the hadoop/lib directory and then it could find it.

          Is it possible that the classloader needs log4j before it is able to find out about the modification to general.classpaths? I don't know why it would be able to find the jar in hadoop/lib and not accumulo/lib, though. This is the error:

          java.lang.NoClassDefFoundError: org/apache/log4j/Logger
          	at org.apache.accumulo.start.classloader.AccumuloClassLoader.<clinit>(AccumuloClassLoader.java:65)
          	at org.apache.accumulo.start.Main.main(Main.java:37)
          
          John Vines added a comment -

          Line 84 of the accumulo script:

          LOG4J_JAR=$(find $HADOOP_PREFIX/lib $HADOOP_PREFIX/share/hadoop/common/lib -name 'log4j*.jar' -print 2>/dev/null | head -1)

          Here's the culprit, I think. We need to make this a bit more tolerant for the various releases. But it should probably be a new ticket and/or under the hadoop2 ticket.
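
          A sketch of a more tolerant lookup (the function name and the extra candidate directory are hypothetical, not from the accumulo script) that tries each known Hadoop lib layout in turn instead of hard-coding two paths:

          ```shell
          #!/bin/sh
          # Hypothetical helper: search several candidate Hadoop lib layouts for a
          # log4j jar and print the first match found.
          find_log4j_jar() {
            prefix="$1"
            for dir in "$prefix/lib" \
                       "$prefix/share/hadoop/common/lib" \
                       "$prefix/libexec/share/hadoop/common/lib"; do
              [ -d "$dir" ] || continue
              jar=$(find "$dir" -name 'log4j*.jar' -print 2>/dev/null | head -1)
              if [ -n "$jar" ]; then
                echo "$jar"
                return 0
              fi
            done
            return 1
          }
          ```

          Skipping directories that don't exist keeps `find` from printing errors on layouts a given Hadoop release doesn't use.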

          Billie Rinaldi added a comment -

          Good find, John. I'll open a ticket.

          ASF subversion and git services added a comment -

          Commit 5e3967fa08a36b386cdd0b40a04b5cd7c1e331de in branch refs/heads/1.4.5-SNAPSHOT from Sean Busbey
          [ https://git-wip-us.apache.org/repos/asf?p=accumulo.git;h=5e3967f ]

          ACCUMULO-1792 Update commons-io dependency for Hadoop2.

          Based on the discussion around ACCUMULO-1244, we can update to 2.1 while not marking commons-io as provided to eliminated classpath issues on hadoop 2 and bring a copy in lib/ for hadoop 0.20.

          Signed-off-by: Eric Newton <eric.newton@gmail.com>

          ASF subversion and git services added a comment -

          Commit 5e3967fa08a36b386cdd0b40a04b5cd7c1e331de in branch refs/heads/1.5.1-SNAPSHOT from Sean Busbey
          [ https://git-wip-us.apache.org/repos/asf?p=accumulo.git;h=5e3967f ]

          ACCUMULO-1792 Update commons-io dependency for Hadoop2.

          Based on the discussion around ACCUMULO-1244, we can update to 2.1 while not marking commons-io as provided to eliminated classpath issues on hadoop 2 and bring a copy in lib/ for hadoop 0.20.

          Signed-off-by: Eric Newton <eric.newton@gmail.com>

          ASF subversion and git services added a comment -

          Commit 5e3967fa08a36b386cdd0b40a04b5cd7c1e331de in branch refs/heads/1.6.0-SNAPSHOT from Sean Busbey
          [ https://git-wip-us.apache.org/repos/asf?p=accumulo.git;h=5e3967f ]

          ACCUMULO-1792 Update commons-io dependency for Hadoop2.

          Based on the discussion around ACCUMULO-1244, we can update to 2.1 while not marking commons-io as provided to eliminated classpath issues on hadoop 2 and bring a copy in lib/ for hadoop 0.20.

          Signed-off-by: Eric Newton <eric.newton@gmail.com>


            People

            • Assignee: Christopher Tubbs
            • Reporter: Adam Fuchs
            • Votes: 0
            • Watchers: 9