Details
- Type: Improvement
- Priority: Major
- Status: Closed
- Resolution: Duplicate
Description
Environment:
- Flink 1.9.1
- table-planner-blink
Question:
If I have a Kafka sink table with this JSON schema:

String jsonSchema = "{ type:'object', properties:{ name: { type: 'string' }, age: { type: 'integer' }, sex: { type: 'string' } } }";
JsonRowDeserializationSchema deserializationSchema = new JsonRowDeserializationSchema(jsonSchema);
TypeInformation fieldTypes = deserializationSchema.getProducedType();
Kafka kafka = new Kafka.......
Schema schema = new Schema......
tableEnvironment.connect(kafka)
    .withFormat(new Json().jsonSchema(jsonSchema))
    .withSchema(schema)
    .inAppendMode()
    .registerTableSink("sink_example2");

String sinksql = "insert into sink_example2 select * from table2";
tableEnvironment.sqlUpdate(sinksql);
Error:
Query result schema: [name: String, age: BigDecimal, sex: String]
TableSink schema:    [name: String, age: BigDecimal, sex: String]
The schema of table `table2`:

[2019-12-19 18:10:16,937] INFO t2: root
 |-- name: STRING
 |-- age: DECIMAL(10, 0)
 |-- sex: STRING
When I use Kafka to read data with this JSON schema, I understand that the JSON integer type is mapped to DECIMAL(38, 18) in the Flink table:

 |-- name: STRING
 |-- age: DECIMAL(38, 18)
 |-- sex: STRING

That is why I know to cast to the expected decimal precision explicitly:

String sinksql = "insert into sink_example2 select name, CAST(age as DECIMAL(38, 18)) as age, sex from table2";
Users of the JSON format need to pay attention to this problem.
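To illustrate the mismatch outside of Flink (a minimal plain-Java sketch; the class name and sample value are made up for illustration): both DECIMAL(10, 0) and DECIMAL(38, 18) surface as java.math.BigDecimal at runtime, so the planner's error is about the declared logical types, not the values. The explicit CAST in the workaround above simply makes the declared types line up; the value itself converts losslessly.

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

public class DecimalPrecisionDemo {
    public static void main(String[] args) {
        // A JSON "integer" such as 25 carries no declared precision, so the
        // blink planner assigns it the widest default type: DECIMAL(38, 18).
        BigDecimal fromJson = new BigDecimal("25");

        // Modeling DECIMAL(38, 18): scale 18 (18 digits after the point).
        // RoundingMode.UNNECESSARY asserts that no rounding actually occurs.
        BigDecimal asDecimal38_18 = fromJson.setScale(18, RoundingMode.UNNECESSARY);
        System.out.println(asDecimal38_18);          // 25.000000000000000000
        System.out.println(asDecimal38_18.scale());  // 18

        // Modeling DECIMAL(10, 0): scale 0. The runtime value narrows
        // losslessly, but the *declared* types still differ — and the
        // declared types are what the sink's schema check compares.
        BigDecimal asDecimal10_0 = asDecimal38_18.setScale(0, RoundingMode.UNNECESSARY);
        System.out.println(asDecimal10_0);           // 25
    }
}
```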
Issue Links
- is duplicated by: FLINK-15313 Can not insert decimal with precision into sink using TypeInformation (Resolved)
- relates to: FLINK-15313 Can not insert decimal with precision into sink using TypeInformation (Resolved)