Details
- Type: Sub-task
- Status: Resolved
- Priority: Minor
- Resolution: Fixed
- Affects Version/s: 2.0.0, 2.4.1
- Fix Version/s: None
Description
When reading a Parquet file whose column names contain characters that Spark considers invalid, the reader fails with this exception:
Name: org.apache.spark.sql.AnalysisException
Message: Attribute name "..." contains invalid character(s) among " ,;{}()\n\t=". Please use alias to rename it.
Spark rightly refuses to write such files, but it should still be able to read them (and allow the user to correct the column names afterwards). However, the obvious workarounds (using an alias to rename the offending column, or forcing a different schema) do not work, because the check is performed on the input schema before any renaming can take effect.
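To make the failure mode concrete, here is a minimal Python sketch of the kind of field-name validation Spark applies (this is an illustration of the behavior, not Spark's actual implementation; the function name check_field_name is hypothetical):

```python
# Characters Spark rejects in Parquet attribute names, per the
# exception message: " ,;{}()\n\t="
INVALID_CHARS = set(" ,;{}()\n\t=")

def check_field_name(name: str) -> None:
    """Raise if `name` contains a character rejected for Parquet columns.

    Mirrors the error text from the reported AnalysisException; this is
    an illustrative sketch, not Spark's code.
    """
    if any(c in INVALID_CHARS for c in name):
        raise ValueError(
            f'Attribute name "{name}" contains invalid character(s) '
            'among " ,;{}()\\n\\t=". Please use alias to rename it.'
        )

check_field_name("valid_name")   # passes silently
# check_field_name("bad name")   # would raise, as in the bug report
```

Because this check runs against the schema read from the file itself, renaming via alias happens too late to help, which is the crux of the issue.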
(Possible fix: remove the superfluous ParquetWriteSupport.setSchema(requiredSchema, hadoopConf) call from buildReaderWithPartitionValues?)