In Hive, the CHAR and VARCHAR data types must be parameterized with the maximum length of the character sequence when used as a column type. Refer https://github.com/apache/hive/blob/f37c5de6c32b9395d1b34fa3c02ed06d1bfbf6eb/serde/src/test/org/apache/hadoop/hive/serde2/typeinfo/TestTypeInfoUtils.java#L68
In addition, the DECIMAL data type accepts custom precision and scale as parameters.
A user reported an exception while reading data from a table created in Hive with a parameterized DECIMAL data type. Refer https://lists.apache.org/thread.html/r159012fbefce24d734096e3ec24ecd112de5f89b8029e57147d233b0%40%3Cuser.beam.apache.org%3E.
This ticket is created to support converting HCatRecords to Beam Rows when the HCatRecords contain fields with parameterized types.
To support parameterized data types, we can make use of HCatFieldSchema, which exposes the type parameters (length, precision, scale) for each field. Refer https://github.com/apache/hive/blob/f37c5de6c32b9395d1b34fa3c02ed06d1bfbf6eb/hcatalog/core/src/main/java/org/apache/hive/hcatalog/data/schema/HCatFieldSchema.java#L34
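To illustrate the shape of the information the conversion needs, the sketch below parses Hive type strings such as varchar(10) and decimal(10,2) into a base type name plus its numeric parameters. This is only a minimal, self-contained illustration; the class name HiveTypeParams and its string-parsing approach are assumptions for this sketch, and the actual fix would read the parameters from HCatFieldSchema's type information rather than from strings.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

/**
 * Minimal sketch (hypothetical helper, not part of Hive or Beam):
 * extract the base type name and any parameters from a Hive type
 * string, e.g. length for char(n)/varchar(n), or precision and
 * scale for decimal(p,s).
 */
public class HiveTypeParams {
    // Matches e.g. "varchar(10)" or "decimal(10,2)"; group 3 is
    // only present for two-parameter types such as decimal.
    private static final Pattern PARAMETERIZED =
        Pattern.compile("(\\w+)\\s*\\(\\s*(\\d+)\\s*(?:,\\s*(\\d+)\\s*)?\\)");

    public final String base;   // base type name, e.g. "decimal"
    public final int[] params;  // [], [length], or [precision, scale]

    private HiveTypeParams(String base, int[] params) {
        this.base = base;
        this.params = params;
    }

    public static HiveTypeParams parse(String typeString) {
        Matcher m = PARAMETERIZED.matcher(typeString.trim());
        if (m.matches()) {
            if (m.group(3) != null) {
                // Two parameters: decimal(precision, scale).
                return new HiveTypeParams(m.group(1), new int[] {
                    Integer.parseInt(m.group(2)), Integer.parseInt(m.group(3))});
            }
            // One parameter: char(n) or varchar(n).
            return new HiveTypeParams(m.group(1),
                new int[] {Integer.parseInt(m.group(2))});
        }
        // Unparameterized type, e.g. "int" or "string".
        return new HiveTypeParams(typeString.trim(), new int[0]);
    }
}
```

A converter could use these parameters to build the matching Beam logical type (for example, a fixed-length string for char(n) or a decimal with the declared precision and scale) instead of failing on the parameterized type string.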