Description
PARQUET
scala> Seq(Some(1), None).toDF("col.dots").write.parquet("/tmp/parquet_dot")
scala> spark.read.parquet("/tmp/parquet_dot").show
+--------+
|col.dots|
+--------+
|       1|
|    null|
+--------+
ORC
scala> Seq(Some(1), None).toDF("col.dots").write.orc("/tmp/orc_dot")
scala> spark.read.orc("/tmp/orc_dot").show
org.apache.spark.sql.catalyst.parser.ParseException:
mismatched input '.' expecting ':'(line 1, pos 10)

== SQL ==
struct<col.dots:int>
----------^^^
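Until the ORC reader accepts dotted field names, one common workaround is to rename such columns before writing. A minimal sketch; the `sanitize` helper and the underscore convention are illustrative assumptions, not part of this issue:

```scala
// Workaround sketch (illustrative, not from the issue): replace dots in
// column names before writing to ORC, since the ORC schema parser rejects
// '.' inside struct field names.
object ColumnNameSanitizer {
  def sanitize(name: String): String = name.replace(".", "_")

  // With a SparkSession in scope, the rename could look like:
  //   val safe = df.columns.foldLeft(df)((d, c) => d.withColumnRenamed(c, sanitize(c)))
  //   safe.write.orc("/tmp/orc_no_dots")
}
```

The rename is lossy (the original dotted name is not preserved in the file), so it only helps when downstream readers can tolerate the substituted names.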
Issue Links
- blocks
  - SPARK-20901 Feature parity for ORC with Parquet (Open)
- is related to
  - SPARK-35274 old hive table's all columns are read when column pruning applies in spark3.0 (Open)
- is superseded by
  - SPARK-20682 Add new ORCFileFormat based on Apache ORC (Resolved)