Spark / SPARK-7947

SerDe command not working


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Incomplete
    • Affects Version/s: 1.3.1
    • Fix Version/s: None
    • Component/s: Spark Shell, Windows
    • Environment: Windows 8.1, Hadoop 2.5.2, Hive 1.1.0, Spark 1.3.1, Scala 2.10.4

    Description

      I have configured Spark SQL and executed the following Hive SerDe command:

      val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)
      hiveContext.sql("ALTER TABLE event_db.sample SET SERDEPROPERTIES ('serialization.encoding'='GBK')")

      The command works fine in the Hive shell, but it is not supported in the Spark shell, where it fails with the error below.

      org.apache.hadoop.hive.ql.parse.HiveParser.statement(HiveParser.java:1036)
              at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:199)
              at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:166)
              at org.apache.spark.sql.hive.HiveQl$.getAst(HiveQl.scala:227)
              at org.apache.spark.sql.hive.HiveQl$.createPlan(HiveQl.scala:241)
              at org.apache.spark.sql.hive.ExtendedHiveQlParser$$anonfun$hiveQl$1.apply(ExtendedHiveQlParser.scala:41)
              at org.apache.spark.sql.hive.ExtendedHiveQlParser$$anonfun$hiveQl$1.apply(ExtendedHiveQlParser.scala:40)
              at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:136)
              at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:135)
              at scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
              at scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
              at scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)
              at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1$$anonfun$apply$2.apply(Parsers.scala:254)
              at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1$$anonfun$apply$2.apply(Parsers.scala:254)
              at scala.util.parsing.combinator.Parsers$Failure.append(Parsers.scala:202)
              at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
              at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
              at scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)
              at scala.util.parsing.combinator.Parsers$$anon$2$$anonfun$apply$14.apply(Parsers.scala:891)
              at scala.util.parsing.combinator.Parsers$$anon$2$$anonfun$apply$14.apply(Parsers.scala:891)
              at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
              at scala.util.parsing.combinator.Parsers$$anon$2.apply(Parsers.scala:890)
              at scala.util.parsing.combinator.PackratParsers$$anon$1.apply(PackratParsers.scala:110)
              at org.apache.spark.sql.catalyst.AbstractSparkSQLParser.apply(AbstractSparkSQLParser.scala:38)
              at org.apache.spark.sql.hive.HiveQl$$anonfun$3.apply(HiveQl.scala:138)
              at org.apache.spark.sql.hive.HiveQl$$anonfun$3.apply(HiveQl.scala:138)
              at org.apache.spark.sql.SparkSQLParser$$anonfun$org$apache$spark$sql$SparkSQLParser$$others$1.apply(SparkSQLParser.scala:96)
              at org.apache.spark.sql.SparkSQLParser$$anonfun$org$apache$spark$sql$SparkSQLParser$$others$1.apply(SparkSQLParser.scala:95)
              at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:136)
              at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:135)
              at scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
              at scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
              at scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)
              at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1$$anonfun$apply$2.apply(Parsers.scala:254)
              at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1$$anonfun$apply$2.apply(Parsers.scala:254)
              at scala.util.parsing.combinator.Parsers$Failure.append(Parsers.scala:202)
              at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
              at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
              at scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)
              at scala.util.parsing.combinator.Parsers$$anon$2$$anonfun$apply$14.apply(Parsers.scala:891)
              at scala.util.parsing.combinator.Parsers$$anon$2$$anonfun$apply$14.apply(Parsers.scala:891)
              at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
              at scala.util.parsing.combinator.Parsers$$anon$2.apply(Parsers.scala:890)
              at scala.util.parsing.combinator.PackratParsers$$anon$1.apply(PackratParsers.scala:110)
              at org.apache.spark.sql.catalyst.AbstractSparkSQLParser.apply(AbstractSparkSQLParser.scala:38)
              at org.apache.spark.sql.hive.HiveQl$.parseSql(HiveQl.scala:234)
              at org.apache.spark.sql.hive.HiveContext$$anonfun$sql$1.apply(HiveContext.scala:101)
              at org.apache.spark.sql.hive.HiveContext$$anonfun$sql$1.apply(HiveContext.scala:101)
              at scala.Option.getOrElse(Option.scala:120)
              at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:101)
              at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:24)
              at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:29)
              at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:31)
              at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:33)
              at $line35.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:35)
              at $line35.$read$$iwC$$iwC$$iwC.<init>(<console>:37)
              at $line35.$read$$iwC$$iwC.<init>(<console>:39)
              at $line35.$read$$iwC.<init>(<console>:41)
              at $line35.$read.<init>(<console>:43)
              at $line35.$read$.<init>(<console>:47)
              at $line35.$read$.<clinit>(<console>)
              at $line35.$eval$.<init>(<console>:7)
              at $line35.$eval$.<clinit>(<console>)
              at $line35.$eval.$print(<console>)
              at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
              at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
              at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
              at java.lang.reflect.Method.invoke(Method.java:606)
              at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
              at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
              at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
              at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
              at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
              at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:856)
              at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:901)
              at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:813)
              at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:656)
              at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:664)
              at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:669)
              at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:996)
              at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
              at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
              at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
              at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:944)
              at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1058)
              at org.apache.spark.repl.Main$.main(Main.scala:31)
              at org.apache.spark.repl.Main.main(Main.scala)
              at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
              at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
              at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
              at java.lang.reflect.Method.invoke(Method.java:606)
              at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
              at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
              at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
              at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
              at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
      org.apache.spark.sql.AnalysisException: mismatched input 'SET' expecting KW_EXCHANGE near 'sample' in alter exchange partition; line 1 pos 28
              at org.apache.spark.sql.hive.HiveQl$.createPlan(HiveQl.scala:254)
              at org.apache.spark.sql.hive.ExtendedHiveQlParser$$anonfun$hiveQl$1.apply(ExtendedHiveQlParser.scala:41)
              at org.apache.spark.sql.hive.ExtendedHiveQlParser$$anonfun$hiveQl$1.apply(ExtendedHiveQlParser.scala:40)
              at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:136)
              at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:135)
              at scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
              at scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
              at scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)
              at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1$$anonfun$apply$2.apply(Parsers.scala:254)
              at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1$$anonfun$apply$2.apply(Parsers.scala:254)
              at scala.util.parsing.combinator.Parsers$Failure.append(Parsers.scala:202)
              at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
              at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
              at scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)
              at scala.util.parsing.combinator.Parsers$$anon$2$$anonfun$apply$14.apply(Parsers.scala:891)
              at scala.util.parsing.combinator.Parsers$$anon$2$$anonfun$apply$14.apply(Parsers.scala:891)
              at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
              at scala.util.parsing.combinator.Parsers$$anon$2.apply(Parsers.scala:890)
              at scala.util.parsing.combinator.PackratParsers$$anon$1.apply(PackratParsers.scala:110)
              at org.apache.spark.sql.catalyst.AbstractSparkSQLParser.apply(AbstractSparkSQLParser.scala:38)
              at org.apache.spark.sql.hive.HiveQl$$anonfun$3.apply(HiveQl.scala:138)
              at org.apache.spark.sql.hive.HiveQl$$anonfun$3.apply(HiveQl.scala:138)
              at org.apache.spark.sql.SparkSQLParser$$anonfun$org$apache$spark$sql$SparkSQLParser$$others$1.apply(SparkSQLParser.scala:96)
              at org.apache.spark.sql.SparkSQLParser$$anonfun$org$apache$spark$sql$SparkSQLParser$$others$1.apply(SparkSQLParser.scala:95)
              at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:136)
              at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:135)
              at scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
              at scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
              at scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)
              at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1$$anonfun$apply$2.apply(Parsers.scala:254)
              at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1$$anonfun$apply$2.apply(Parsers.scala:254)
              at scala.util.parsing.combinator.Parsers$Failure.append(Parsers.scala:202)
              at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
              at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
              at scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)
              at scala.util.parsing.combinator.Parsers$$anon$2$$anonfun$apply$14.apply(Parsers.scala:891)
              at scala.util.parsing.combinator.Parsers$$anon$2$$anonfun$apply$14.apply(Parsers.scala:891)
              at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
              at scala.util.parsing.combinator.Parsers$$anon$2.apply(Parsers.scala:890)
              at scala.util.parsing.combinator.PackratParsers$$anon$1.apply(PackratParsers.scala:110)
              at org.apache.spark.sql.catalyst.AbstractSparkSQLParser.apply(AbstractSparkSQLParser.scala:38)
              at org.apache.spark.sql.hive.HiveQl$.parseSql(HiveQl.scala:234)
              at org.apache.spark.sql.hive.HiveContext$$anonfun$sql$1.apply(HiveContext.scala:101)
              at org.apache.spark.sql.hive.HiveContext$$anonfun$sql$1.apply(HiveContext.scala:101)
              at scala.Option.getOrElse(Option.scala:120)
              at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:101)
              at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:24)
              at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:29)
              at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:31)
              at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:33)
              at $iwC$$iwC$$iwC$$iwC.<init>(<console>:35)
              at $iwC$$iwC$$iwC.<init>(<console>:37)
              at $iwC$$iwC.<init>(<console>:39)
              at $iwC.<init>(<console>:41)
              at <init>(<console>:43)
              at .<init>(<console>:47)
              at .<clinit>(<console>)
              at .<init>(<console>:7)
              at .<clinit>(<console>)
              at $print(<console>)
              at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
              at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
              at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
              at java.lang.reflect.Method.invoke(Method.java:606)
              at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
              at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
              at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
              at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
              at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
              at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:856)
              at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:901)
              at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:813)
              at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:656)
              at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:664)
              at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:669)
              at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:996)
              at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
              at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
              at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
              at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:944)
              at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1058)
              at org.apache.spark.repl.Main$.main(Main.scala:31)
              at org.apache.spark.repl.Main.main(Main.scala)
              at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
              at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
              at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
              at java.lang.reflect.Method.invoke(Method.java:606)
              at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
              at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
              at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
              at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
              at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

      Please help me resolve this error.
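      Until Spark's HiveQL parser recognizes this statement form, one workaround is to issue the ALTER TABLE through the Hive shell (or Hive's JDBC driver) instead of hiveContext.sql. The sketch below is illustrative only, not a Spark API: the object and helper names are hypothetical, and it merely builds the statement that Hive accepts but Spark 1.3.1 rejects.

```scala
object SerdeWorkaround {
  // Hypothetical helper: builds an ALTER TABLE ... SET SERDEPROPERTIES
  // statement from a table name and a property map. Hive's shell accepts
  // this statement; Spark 1.3.1's parser does not.
  def alterSerdeProps(table: String, props: Map[String, String]): String = {
    val kvs = props.map { case (k, v) => s"'$k'='$v'" }.mkString(", ")
    s"ALTER TABLE $table SET SERDEPROPERTIES ($kvs)"
  }

  def main(args: Array[String]): Unit = {
    val sql = alterSerdeProps("event_db.sample", Map("serialization.encoding" -> "GBK"))
    // Run the generated statement in the Hive shell (or via beeline /
    // Hive JDBC) rather than through hiveContext.sql.
    println(sql)
  }
}
```

      This keeps the SerDe change in Hive, where the DDL is supported, while Spark continues to read the table through the metastore.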


          People

            Assignee: Unassigned
            Reporter: Mallieswari
            Votes: 0
            Watchers: 2

            Dates

              Created:
              Updated:
              Resolved:
