Spark / SPARK-3004

HiveThriftServer2 throws exception when the result set contains NULL


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Blocker
    • Resolution: Fixed
    • Affects Version/s: 1.0.2
    • Fix Version/s: 1.1.0
    • Component/s: SQL
    • Labels: None

    Description

      To reproduce this issue with beeline:

      $ cd $SPARK_HOME
      $ ./bin/beeline -u jdbc:hive2://localhost:10000 -n lian
      ...
      0: jdbc:hive2://localhost:10000> create table src1 (key int, value string);
      ...
      0: jdbc:hive2://localhost:10000> load data local inpath './sql/hive/src/test/resources/data/files/kv3.txt' into table src1;
      ...
      0: jdbc:hive2://localhost:10000> select * from src1 where key is null;
      Error:  (state=,code=0)
      

      Exception thrown from HiveThriftServer2:

      java.lang.RuntimeException: Failed to check null bit for primitive int value.
              at scala.sys.package$.error(package.scala:27)
              at org.apache.spark.sql.catalyst.expressions.GenericRow.getInt(Row.scala:145)
              at org.apache.spark.sql.hive.thriftserver.server.SparkSQLOperationManager$$anon$1.getNextRowSet(SparkSQLOperationManager.scala:80)
              at org.apache.hive.service.cli.operation.OperationManager.getOperationNextRowSet(OperationManager.java:170)
              at org.apache.hive.service.cli.session.HiveSessionImpl.fetchResults(HiveSessionImpl.java:417)
              at org.apache.hive.service.cli.CLIService.fetchResults(CLIService.java:306)
              at org.apache.hive.service.cli.thrift.ThriftCLIService.FetchResults(ThriftCLIService.java:386)
              at org.apache.hive.service.cli.thrift.TCLIService$Processor$FetchResults.getResult(TCLIService.java:1373)
              at org.apache.hive.service.cli.thrift.TCLIService$Processor$FetchResults.getResult(TCLIService.java:1358)
              at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
              at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
              at org.apache.hive.service.auth.TUGIContainingProcessor$1.run(TUGIContainingProcessor.java:58)
              at org.apache.hive.service.auth.TUGIContainingProcessor$1.run(TUGIContainingProcessor.java:55)
              at java.security.AccessController.doPrivileged(Native Method)
              at javax.security.auth.Subject.doAs(Subject.java:415)
              at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
              at org.apache.hadoop.hive.shims.HadoopShimsSecure.doAs(HadoopShimsSecure.java:526)
              at org.apache.hive.service.auth.TUGIContainingProcessor.process(TUGIContainingProcessor.java:55)
              at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:206)
              at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
              at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
              at java.lang.Thread.run(Thread.java:745)
      

      The cause is that SparkSQLOperationManager.getNextRowSet doesn't check isNullAt before calling the typed primitive getters (such as getInt) on the result rows.
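
      For illustration, a minimal sketch of the missing guard (the object and method names below are hypothetical stand-ins, not the actual patch): check isNullAt first, and only call the primitive getter when the null bit is clear.

      import org.apache.spark.sql.catalyst.expressions.Row

      object NullSafeColumn {
        // Hypothetical helper: if the null bit is set, return null so the
        // Thrift row set carries a SQL NULL; otherwise it is safe to read
        // the primitive value.
        def intColumnValue(row: Row, ordinal: Int): Any =
          if (row.isNullAt(ordinal)) null else row.getInt(ordinal)
      }

      The same check is needed for every primitive getter used while converting a row into a Thrift column value (getLong, getDouble, getBoolean, and so on).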

Attachments

Activity

People

    Assignee: Cheng Lian
    Reporter: Cheng Lian
    Votes: 0
    Watchers: 2
