Details
- Type: Bug
- Status: Closed
- Priority: Major
- Resolution: Fixed
- Fix Version: 1.9.0
Description
Hit the following exception when accessing a Hive table with a decimal column:
Caused by: org.apache.flink.table.api.TableException: TableSource of type org.apache.flink.batch.connectors.hive.HiveTableSource returned a DataSet of data type ROW<`x` LEGACY(BigDecimal)> that does not match with the data type ROW<`x` DECIMAL(10, 0)> declared by the TableSource.getProducedDataType() method. Please validate the implementation of the TableSource.
    at org.apache.flink.table.plan.nodes.dataset.BatchTableSourceScan.translateToPlan(BatchTableSourceScan.scala:118)
    at org.apache.flink.table.api.internal.BatchTableEnvImpl.translate(BatchTableEnvImpl.scala:303)
    at org.apache.flink.table.api.internal.BatchTableEnvImpl.translate(BatchTableEnvImpl.scala:281)
    at org.apache.flink.table.api.internal.BatchTableEnvImpl.writeToSink(BatchTableEnvImpl.scala:117)
    at org.apache.flink.table.api.internal.TableEnvImpl.insertInto(TableEnvImpl.scala:564)
    at org.apache.flink.table.api.internal.TableEnvImpl.insertInto(TableEnvImpl.scala:516)
    at org.apache.flink.table.api.internal.BatchTableEnvImpl.insertInto(BatchTableEnvImpl.scala:59)
    at org.apache.flink.table.api.internal.TableImpl.insertInto(TableImpl.java:428)
Attachments
Issue Links
- contains
  - FLINK-13549 Unable to query Hive table with char or varchar columns (Closed)
- requires
  - FLINK-13495 blink-planner should support decimal precision to table source (Resolved)