Details
- Type: Task
- Status: Closed
- Priority: Blocker
- Resolution: Fixed
Description
From the community:
"I am observing that HUDI bulk insert in version 0.9.0 is not honoring the
spark.sql.parquet.writeLegacyFormat=true
config. Can you suggest a way to set this config?
Reason for using this config:
The current bulk insert uses the Spark DataFrame writer and does not perform Avro conversion, so the decimal columns in my DataFrame are written as INT32 in Parquet. The upsert path, which does use Avro conversion, generates fixed-length byte arrays for decimal types, and the two fail with a datatype mismatch."
The root cause is that the config is hardcoded in the bulk insert path. We can make it configurable.
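Once the value is configurable rather than hardcoded, a user would pass it alongside the other Hudi write options. A minimal sketch of what that could look like, assuming the setting is exposed as a storage config named `hoodie.parquet.writelegacyformat.enabled` (the lowercase name suggested by the linked HUDI-3245; the table name and path are hypothetical):

```python
# Hypothetical Hudi write options for a bulk insert that should honor
# legacy Parquet decimal encoding. The config key is an assumption based
# on the linked HUDI-3245 storage-config cleanup.
hudi_options = {
    "hoodie.table.name": "my_table",  # hypothetical table name
    "hoodie.datasource.write.operation": "bulk_insert",
    # Previously hardcoded; exposing it lets bulk insert write decimals
    # as fixed-length byte arrays, matching the Avro-based upsert path:
    "hoodie.parquet.writelegacyformat.enabled": "true",
}

# With a SparkSession and DataFrame `df` in scope, this would be applied as:
# df.write.format("hudi").options(**hudi_options).mode("append").save("/tmp/my_table")
```

This keeps the decimal representation consistent between the bulk-insert and upsert code paths, which is what the reporter needs.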
Issue Links
- is related to HUDI-3245 Convert uppercase letters to lowercase in storage configs (Closed)