HIVE-8854 (sub-task of HIVE-8548: Integrate with remote Spark context after HIVE-8528 [Spark Branch])

Guava dependency conflict between Hive driver and remote Spark context [Spark Branch]


Details

    • Type: Sub-task
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 1.1.0
    • Component/s: Spark

    Description

      The Hive driver loads Guava 11.0.2 from Hadoop/Tez, while the remote Spark context depends on Guava 14.0.1. As a result, JobMetrics deserialization fails on the Hive driver side, since Absent is used in Metrics. Here is the Hive driver log:

      java.lang.IllegalAccessError: tried to access method com.google.common.base.Optional.<init>()V from class com.google.common.base.Absent
              at com.google.common.base.Absent.<init>(Absent.java:35)
              at com.google.common.base.Absent.<clinit>(Absent.java:33)
              at sun.misc.Unsafe.ensureClassInitialized(Native Method)
              at sun.reflect.UnsafeFieldAccessorFactory.newFieldAccessor(UnsafeFieldAccessorFactory.java:43)
              at sun.reflect.ReflectionFactory.newFieldAccessor(ReflectionFactory.java:140)
              at java.lang.reflect.Field.acquireFieldAccessor(Field.java:1057)
              at java.lang.reflect.Field.getFieldAccessor(Field.java:1038)
              at java.lang.reflect.Field.getLong(Field.java:591)
              at java.io.ObjectStreamClass.getDeclaredSUID(ObjectStreamClass.java:1663)
              at java.io.ObjectStreamClass.access$700(ObjectStreamClass.java:72)
              at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:480)
              at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:468)
              at java.security.AccessController.doPrivileged(Native Method)
              at java.io.ObjectStreamClass.<init>(ObjectStreamClass.java:468)
              at java.io.ObjectStreamClass.lookup(ObjectStreamClass.java:365)
              at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:602)
              at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1622)
              at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)
              at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
              at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
              at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
              at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
              at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
              at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
              at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
              at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
              at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
              at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
              at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
              at akka.serialization.JavaSerializer$$anonfun$1.apply(Serializer.scala:136)
              at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
              at akka.serialization.JavaSerializer.fromBinary(Serializer.scala:136)
              at akka.serialization.Serialization$$anonfun$deserialize$1.apply(Serialization.scala:104)
              at scala.util.Try$.apply(Try.scala:161)
              at akka.serialization.Serialization.deserialize(Serialization.scala:98)
              at akka.remote.serialization.MessageContainerSerializer.fromBinary(MessageContainerSerializer.scala:63)
              at akka.serialization.Serialization$$anonfun$deserialize$1.apply(Serialization.scala:104)
              at scala.util.Try$.apply(Try.scala:161)
              at akka.serialization.Serialization.deserialize(Serialization.scala:98)
              at akka.remote.MessageSerializer$.deserialize(MessageSerializer.scala:23)
              at akka.remote.DefaultMessageDispatcher.payload$lzycompute$1(Endpoint.scala:58)
              at akka.remote.DefaultMessageDispatcher.payload$1(Endpoint.scala:58)
              at akka.remote.DefaultMessageDispatcher.dispatch(Endpoint.scala:76)
              at akka.remote.EndpointReader$$anonfun$receive$2.applyOrElse(Endpoint.scala:937)
              at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
              at akka.remote.EndpointActor.aroundReceive(Endpoint.scala:415)
              at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
              at akka.actor.ActorCell.invoke(ActorCell.scala:487)
              at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
              at akka.dispatch.Mailbox.run(Mailbox.scala:220)
              at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
              at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
              at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
              at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
              at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
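
      This failure pattern is typical of a split classpath: com.google.common.base.Optional exists in both jars and wins from guava-11.0.2 (earlier on the driver classpath), while the top-level Absent class only exists in newer Guava, so it resolves from guava-14.0.1 and then cannot access the old Optional's private constructor. Below is a small diagnostic sketch (GuavaOrigin is a hypothetical class, not part of any patch) in the spirit of the attached hive-dirver-classloader-info.output, printing which jar each class was actually resolved from:

      // GuavaOrigin.java -- hypothetical diagnostic, run inside the Hive driver JVM.
      // Prints which jar the classloader resolved each Guava class from.
      // initialize=false avoids re-triggering the IllegalAccessError that
      // Absent's static initializer would throw against the older Optional.
      import java.security.CodeSource;

      public class GuavaOrigin {
          private static void printOrigin(String name) throws ClassNotFoundException {
              Class<?> cls = Class.forName(name, false, GuavaOrigin.class.getClassLoader());
              CodeSource src = cls.getProtectionDomain().getCodeSource();
              System.out.println(name + " loaded from: "
                      + (src == null ? "<bootstrap/unknown>" : src.getLocation()));
          }

          public static void main(String[] args) throws Exception {
              printOrigin("com.google.common.base.Optional");
              // a top-level class only in newer Guava; nested inside Optional in 11.0.2
              printOrigin("com.google.common.base.Absent");
          }
      }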
      
      

      And here is the remote Spark context log:

      2014-11-13 17:16:28,481 INFO  [task-result-getter-1]: scheduler.TaskSetManager (Logging.scala:logInfo(59)) - Finished task 0.0 in stage 1.0 (TID 1) in 439 ms on node14-4 (1/1)
      2014-11-13 17:16:28,482 INFO  [sparkDriver-akka.actor.default-dispatcher-8]: scheduler.DAGScheduler (Logging.scala:logInfo(59)) - Stage 1 (foreachAsync at RemoteHiveSparkClient.java:121) finished in 0.452 s
      2014-11-13 17:16:28,482 INFO  [task-result-getter-1]: scheduler.TaskSchedulerImpl (Logging.scala:logInfo(59)) - Removed TaskSet 1.0, whose tasks have all completed, from pool
      2014-11-13 17:16:28,486 INFO  [08592e9f-19a2-413d-bc48-c871259c4d2e-akka.actor.default-dispatcher-4]: remote.RemoteActorRefProvider$RemoteDeadLetterActorRef (Slf4jLogger.scala:apply$mcV$sp(74)) - Message [org.apache.hive.spark.client.Protocol$JobMetrics] from Actor[akka://08592e9f-19a2-413d-bc48-c871259c4d2e/user/RemoteDriver#-893697064] to Actor[akka://08592e9f-19a2-413d-bc48-c871259c4d2e/deadLetters] was not delivered. [3] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'akka.log-dead-letters' and 'akka.log-dead-letters-during-shutdown'.
      2014-11-13 17:16:28,494 INFO  [08592e9f-19a2-413d-bc48-c871259c4d2e-akka.actor.default-dispatcher-4]: remote.RemoteActorRefProvider$RemoteDeadLetterActorRef (Slf4jLogger.scala:apply$mcV$sp(74)) - Message [org.apache.hive.spark.client.Protocol$JobResult] from Actor[akka://08592e9f-19a2-413d-bc48-c871259c4d2e/user/RemoteDriver#-893697064] to Actor[akka://08592e9f-19a2-413d-bc48-c871259c4d2e/deadLetters] was not delivered. [4] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'akka.log-dead-letters' and 'akka.log-dead-letters-during-shutdown'.
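
      A common mitigation for this kind of conflict is to relocate Guava inside the spark-client artifact with the maven-shade-plugin, so the remote-context classes can no longer be shadowed by the hadoop/tez-provided guava-11.0.2 on the driver classpath. The snippet below is only a sketch of that approach (the shadedPattern is illustrative), not necessarily what HIVE-8854.1-spark.patch does:

      <!-- pom.xml sketch: rewrite com.google.common references in the shaded
           jar so both sides see a private, consistent copy of Guava. -->
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-shade-plugin</artifactId>
        <executions>
          <execution>
            <phase>package</phase>
            <goals><goal>shade</goal></goals>
            <configuration>
              <relocations>
                <relocation>
                  <pattern>com.google.common</pattern>
                  <shadedPattern>org.apache.hive.shaded.com.google.common</shadedPattern>
                </relocation>
              </relocations>
            </configuration>
          </execution>
        </executions>
      </plugin>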
      

      Attachments

        1. hive-dirver-classloader-info.output
          282 kB
          Chengxiang Li
        2. HIVE-8854.1-spark.patch
          12 kB
          Marcelo Masiero Vanzin
        3. HIVE-8854.1-spark.patch
          12 kB
          Szehon Ho

        Issue Links

        Activity


          People

            Assignee: vanzin (Marcelo Masiero Vanzin)
            Reporter: chengxiang li (Chengxiang Li)
            Votes: 0
            Watchers: 6

            Dates

              Created:
              Updated:
              Resolved:
