Details
Description
This is my SQL:
create table tmp.tmp_test_6387_1224_spark stored as ORCFile as select 0.00 as a
select a from tmp.tmp_test_6387_1224_spark
The resulting table definition is:
CREATE TABLE `tmp.tmp_test_6387_1224_spark`(
`a` decimal(2,2))
ROW FORMAT SERDE
'org.apache.hadoop.hive.ql.io.orc.OrcSerde'
STORED AS INPUTFORMAT
'org.apache.hadoop.hive.ql.io.orc.OrcInputFormat'
OUTPUTFORMAT
'org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat'
When I query this table (with either Hive or Spark SQL, the exception is the same), the following exception is thrown:
Caused by: java.io.EOFException: Reading BigInteger past EOF from compressed stream Stream for column 1 kind DATA position: 0 length: 0 range: 0 offset: 0 limit: 0
at org.apache.hadoop.hive.ql.io.orc.SerializationUtils.readBigInteger(SerializationUtils.java:176)
at org.apache.hadoop.hive.ql.io.orc.TreeReaderFactory$DecimalTreeReader.next(TreeReaderFactory.java:1264)
at org.apache.hadoop.hive.ql.io.orc.TreeReaderFactory$StructTreeReader.next(TreeReaderFactory.java:2004)
at org.apache.hadoop.hive.ql.io.orc.RecordReaderImpl.next(RecordReaderImpl.java:1039)
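The literal `0.00` is inferred as `DECIMAL(2,2)`, and the written ORC file ends up with a zero-length DATA stream for the column (compare HIVE-13083, linked below). As a possible workaround (a sketch, not confirmed in this report; the `_fixed` table name is hypothetical), casting the literal to an explicit, wider decimal type avoids relying on the inferred type:

```sql
-- Hypothetical workaround: give the literal an explicit decimal type
-- instead of letting it be inferred as DECIMAL(2,2).
CREATE TABLE tmp.tmp_test_6387_1224_spark_fixed STORED AS ORC AS
SELECT CAST(0.00 AS DECIMAL(10,2)) AS a;

SELECT a FROM tmp.tmp_test_6387_1224_spark_fixed;
```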
Issue Links
- Blocked: SPARK-40253 Data read exception in orc format (Resolved)
- blocks: SPARK-20901 Feature parity for ORC with Parquet (Open)
- is superseded by: SPARK-25271 Creating parquet table with all the column null throws exception (Resolved)
- relates to: SPARK-22977 DataFrameWriter operations do not show details in SQL tab (Resolved)
- relates to: HIVE-13083 Writing HiveDecimal to ORC can wrongly suppress present stream (Closed)