SPARK-2263: Can't insert Map<K, V> values to Hive tables


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 1.0.0
    • Fix Version/s: 1.0.1, 1.1.0
    • Component/s: SQL
    • Labels: None

    Description

      Scala Map[K, V] values are not converted to their Java counterparts (java.util.Map) before being handed to Hive's SerDe, so the insert below fails with a ClassCastException (see the conversion sketch after the stack trace):

      scala> loadTestTable("src")
      
      scala> hql("create table m(value map<int, string>)")
      res1: org.apache.spark.sql.SchemaRDD = 
      SchemaRDD[0] at RDD at SchemaRDD.scala:100
      == Query Plan ==
      <Native command: executed by Hive>
      
      scala> hql("insert overwrite table m select map(key, value) from src")
      org.apache.spark.SparkException: Job aborted due to stage failure: Task 0.0:0 failed 1 times, most recent failure: Exception failure in TID 0 on host localhost: java.lang.ClassCastException: scala.collection.immutable.Map$Map1 cannot be cast to java.util.Map
              org.apache.hadoop.hive.serde2.objectinspector.StandardMapObjectInspector.getMap(StandardMapObjectInspector.java:82)
              org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe.serialize(LazySimpleSerDe.java:515)
              org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe.serializeField(LazySimpleSerDe.java:439)
              org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe.serialize(LazySimpleSerDe.java:423)
              org.apache.spark.sql.hive.execution.InsertIntoHiveTable$$anonfun$2$$anonfun$apply$1.apply(InsertIntoHiveTable.scala:200)
              org.apache.spark.sql.hive.execution.InsertIntoHiveTable$$anonfun$2$$anonfun$apply$1.apply(InsertIntoHiveTable.scala:192)
              scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
              org.apache.spark.sql.hive.execution.InsertIntoHiveTable.org$apache$spark$sql$hive$execution$InsertIntoHiveTable$$writeToFile$1(InsertIntoHiveTable.scala:149)
              org.apache.spark.sql.hive.execution.InsertIntoHiveTable$$anonfun$saveAsHiveFile$1.apply(InsertIntoHiveTable.scala:158)
              org.apache.spark.sql.hive.execution.InsertIntoHiveTable$$anonfun$saveAsHiveFile$1.apply(InsertIntoHiveTable.scala:158)
              org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:112)
              org.apache.spark.scheduler.Task.run(Task.scala:51)
              org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:187)
              java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
              java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
              java.lang.Thread.run(Thread.java:745)
      Driver stacktrace:
              at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1040)
              at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1024)
              at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1022)
              at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
              at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
              at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1022)
              at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:640)
              at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:640)
              at scala.Option.foreach(Option.scala:236)
              at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:640)
              at org.apache.spark.scheduler.DAGSchedulerEventProcessActor$$anonfun$receive$2.applyOrElse(DAGScheduler.scala:1214)
              at akka.actor.ActorCell.receiveMessage(ActorCell.scala:498)
              at akka.actor.ActorCell.invoke(ActorCell.scala:456)
              at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:237)
              at akka.dispatch.Mailbox.run(Mailbox.scala:219)
              at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)
              at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
              at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
              at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
              at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
      
      
      scala> 
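      The failure happens because Catalyst hands Scala collections straight to Hive's ObjectInspectors, which expect java.util.Map / java.util.List. Below is a minimal sketch of the missing wrapping step, assuming a recursive pre-serialization conversion applied before LazySimpleSerDe.serialize; the helper name toHiveValue is hypothetical, not the actual Spark SQL API:
      
      import scala.collection.JavaConverters._
      
      // Hive's StandardMapObjectInspector casts the incoming value to
      // java.util.Map, but Spark hands it a scala.collection.immutable.Map,
      // hence the ClassCastException above.
      def toHiveValue(value: Any): Any = value match {
        case m: Map[_, _] =>
          // Recursively convert nested keys/values, then expose a Java view.
          m.asInstanceOf[Map[Any, Any]]
            .map { case (k, v) => toHiveValue(k) -> toHiveValue(v) }
            .asJava
        case s: Seq[_] =>
          // Hive list object inspectors expect java.util.List.
          s.map(toHiveValue).asJava
        case other => other   // primitives, strings, etc. pass through unchanged
      }
      
      With such a wrapper applied to each field before serialization, e.g. toHiveValue(Map(1 -> "one")), StandardMapObjectInspector.getMap receives a java.util.Map and the cast no longer fails.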
      

          People

            Assignee: Cheng Lian
            Reporter: Cheng Lian
