Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Not A Problem
- Affects Version/s: 1.0.0
- Fix Version/s: None
- Environment: pig 12.1 on Cloudera Hadoop, CDH3
Description
Upgrading from Spark-0.9.1 to Spark-1.0.0 causes all Pig scripts to fail when they "register" a jar containing Spark. The error appears to be related to org.slf4j.spi.LocationAwareLogger.log.
Caused by: java.lang.RuntimeException: Could not resolve error that occured when launching map reduce job: java.lang.NoSuchMethodError: org.slf4j.spi.LocationAwareLogger.log(Lorg/slf4j/Marker;Ljava/lang/String;ILjava/lang/String;[Ljava/lang/Object;Ljava/lang/Throwable;)V
        at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher$JobControlThreadExceptionHandler.uncaughtException(MapReduceLauncher.java:598)
        at java.lang.Thread.dispatchUncaughtException(Thread.java:1874)
To reproduce: compile Spark via SPARK_HADOOP_VERSION=0.20.2-cdh3u4 sbt/sbt assembly and register the resulting jar in a Pig script, e.g.:
REGISTER /usr/share/spark-1.0.0/assembly/target/scala-2.10/spark-assembly-1.0.0-SNAPSHOT-hadoop0.20.2-cdh3u4.jar;
data0 = LOAD 'data' USING PigStorage();
ttt = LIMIT data0 10;
DUMP ttt;
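For completeness, the end-to-end reproduction might look like the sketch below; the script filename repro.pig is only an illustration, and the build command is the one from the description above.

# Build the Spark assembly against CDH3's Hadoop
SPARK_HADOOP_VERSION=0.20.2-cdh3u4 sbt/sbt assembly

# Save the four-line script above as repro.pig and run it; the failure surfaces
# when the DUMP launches the MapReduce job (cf. MapReduceLauncher in the stack trace)
pig repro.pig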
The Spark 1.0.0 assembly jar bundles slf4j-related classes that were not present in the 0.9.1 assembly:
rfcompton@node19 /u/s/o/s/a/t/scala-2.10> jar tvf spark-assembly-1.0.0-SNAPSHOT-hadoop0.20.2-cdh3u4.jar | grep -i "slf" | grep LocationAware
3259 Mon Mar 25 21:49:34 PDT 2013 org/apache/commons/logging/impl/SLF4JLocationAwareLog.class
455 Mon Mar 25 21:49:22 PDT 2013 org/slf4j/spi/LocationAwareLogger.class
479 Fri Dec 13 16:44:40 PST 2013 parquet/org/slf4j/spi/LocationAwareLogger.class
vs.
rfcompton@node19 /u/s/o/s/a/t/scala-2.10> jar tvf spark-assembly-0.9.1-hadoop0.20.2-cdh3u3.jar | grep -i "slf" | grep LocationAware
455 Mon Mar 25 21:49:22 PDT 2013 org/slf4j/spi/LocationAwareLogger.class
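The descriptor in the stack trace, (Lorg/slf4j/Marker;Ljava/lang/String;ILjava/lang/String;[Ljava/lang/Object;Ljava/lang/Throwable;)V, decodes to log(Marker, String, int, String, Object[], Throwable), i.e. the six-argument form that newer slf4j versions (1.6+) declare on LocationAwareLogger; older slf4j-api releases declare log(...) without the Object[] parameter, so a classpath that resolves one of those first would produce exactly this NoSuchMethodError. As a diagnostic sketch (using the jar paths from the listings above), javap can show which form a given jar actually ships:

# Print the log(...) signature declared by each assembly's bundled LocationAwareLogger
javap -classpath spark-assembly-1.0.0-SNAPSHOT-hadoop0.20.2-cdh3u4.jar org.slf4j.spi.LocationAwareLogger
javap -classpath spark-assembly-0.9.1-hadoop0.20.2-cdh3u3.jar org.slf4j.spi.LocationAwareLogger

# Running the same command against the slf4j-api jar already on the Pig/CDH3 classpath
# (path varies by install) shows which signature actually wins at runtime.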