Details
Type: Bug
Status: Resolved
Priority: Major
Resolution: Duplicate
Affects Version/s: 1.6.1
Fix Version/s: None
Component/s: None
Environment:
spark_2_4_2_0_258-1.6.1.2.4.2.0-258.el6.noarch
spark_2_4_2_0_258-python-1.6.1.2.4.2.0-258.el6.noarch
spark_2_4_2_0_258-yarn-shuffle-1.6.1.2.4.2.0-258.el6.noarch
RHEL-7 (64-Bit)
JDK 1.8
Description
The issue is very similar to SPARK-10304: a Spark SQL query throws a NullPointerException.
>>> sqlContext.sql('select * from core_next.spark_categorization').show(57)
17/06/19 11:26:54 ERROR Executor: Exception in task 2.0 in stage 21.0 (TID 48)
java.lang.NullPointerException
at org.apache.spark.sql.hive.HiveInspectors$class.unwrapperFor(HiveInspectors.scala:488)
at org.apache.spark.sql.hive.orc.OrcTableScan.unwrapperFor(OrcRelation.scala:244)
at org.apache.spark.sql.hive.orc.OrcTableScan$$anonfun$org$apache$spark$sql$hive$orc$OrcTableScan$$fillObject$1$$anonfun$6.apply(OrcRelation.scala:275)
at org.apache.spark.sql.hive.orc.OrcTableScan$$anonfun$org$apache$spark$sql$hive$orc$OrcTableScan$$fillObject$1$$anonfun$6.apply(OrcRelation.scala:275)
Turning off ORC optimizations resolved the issue:
sqlContext.setConf("spark.sql.hive.convertMetastoreOrc", "false")
Issue Links
- blocks: SPARK-20901 Feature parity for ORC with Parquet (Open)
- duplicates: SPARK-18355 Spark SQL fails to read data from a ORC hive table that has a new column added to it (Resolved; a reproduction sketch of this scenario follows below)
- relates to: SPARK-16628 OrcConversions should not convert an ORC table represented by MetastoreRelation to HadoopFsRelation if metastore schema does not match schema stored in ORC files (Resolved)
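Based on the title of the duplicate SPARK-18355 (a new column added to an existing ORC Hive table), a hypothetical reproduction sketch in PySpark 1.6; the table and column names are illustrative, and the failure assumes spark.sql.hive.convertMetastoreOrc is enabled, as it was in the environment above:

# Hypothetical reproduction of the scenario from SPARK-18355; all names
# are illustrative, not taken from the original report.
from pyspark import SparkContext
from pyspark.sql import HiveContext

sc = SparkContext(appName="orc-schema-evolution-repro")
sqlContext = HiveContext(sc)

# Create an ORC-backed Hive table and write one row into it.
sqlContext.sql("CREATE TABLE orc_npe_demo (id INT) STORED AS ORC")
sqlContext.createDataFrame([(1,)], ["id"]).write.insertInto("orc_npe_demo")

# Add a column after data exists; the ORC files on disk do not contain it.
sqlContext.sql("ALTER TABLE orc_npe_demo ADD COLUMNS (label STRING)")

# With spark.sql.hive.convertMetastoreOrc enabled, scanning the table can
# fail with the NullPointerException in HiveInspectors.unwrapperFor,
# since the new column has no corresponding field in the ORC files.
sqlContext.sql("SELECT * FROM orc_npe_demo").show()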