Spark / SPARK-21686

spark.sql.hive.convertMetastoreOrc is causing NullPointerException while reading ORC tables


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Duplicate
    • Affects Version/s: 1.6.1
    • Fix Version/s: None
    • Component/s: Spark Shell
    • Labels: None
    • Environment:
      spark_2_4_2_0_258-1.6.1.2.4.2.0-258.el6.noarch
      spark_2_4_2_0_258-python-1.6.1.2.4.2.0-258.el6.noarch
      spark_2_4_2_0_258-yarn-shuffle-1.6.1.2.4.2.0-258.el6.noarch
      RHEL-7 (64-Bit)
      JDK 1.8

    Description

      The issue is very similar to SPARK-10304.

      The Spark query throws a NullPointerException:

      >>> sqlContext.sql('select * from core_next.spark_categorization').show(57)
      17/06/19 11:26:54 ERROR Executor: Exception in task 2.0 in stage 21.0 (TID 48)
      java.lang.NullPointerException
      at org.apache.spark.sql.hive.HiveInspectors$class.unwrapperFor(HiveInspectors.scala:488)
      at org.apache.spark.sql.hive.orc.OrcTableScan.unwrapperFor(OrcRelation.scala:244)
      at org.apache.spark.sql.hive.orc.OrcTableScan$$anonfun$org$apache$spark$sql$hive$orc$OrcTableScan$$fillObject$1$$anonfun$6.apply(OrcRelation.scala:275)
      at org.apache.spark.sql.hive.orc.OrcTableScan$$anonfun$org$apache$spark$sql$hive$orc$OrcTableScan$$fillObject$1$$anonfun$6.apply(OrcRelation.scala:275)

      Turning off the ORC conversion optimization resolves the issue:

      sqlContext.setConf("spark.sql.hive.convertMetastoreOrc", "false")
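
      The setConf call above only affects the current session. As a sketch of an alternative, the same property can be set cluster-wide in spark-defaults.conf, so every shell and job picks it up (the property name is taken from the workaround above; the file path is the standard Spark conf location and may differ per installation):

      ```
      # $SPARK_HOME/conf/spark-defaults.conf
      # Read Hive metastore ORC tables through the Hive SerDe instead of
      # Spark's built-in ORC reader, avoiding the NPE described above.
      spark.sql.hive.convertMetastoreOrc    false
      ```

      The same property can also be passed per invocation, e.g. `pyspark --conf spark.sql.hive.convertMetastoreOrc=false`.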


            People

              Assignee: Unassigned
              Reporter: Ernani Pereira de Mattos Junior (emattosHWX)
              Votes: 0
              Watchers: 3
