Details
- Type: Bug
- Priority: Major
- Status: Resolved
- Resolution: Fixed
Description
https://apachenifi.slack.com/archives/C0L9UPWJZ/p1602145019023300
I use ConvertRecord to convert from JSON to Avro.
For a field declared as "int" in the Avro schema, if the JSON payload contains a number that is too big, NiFi does not throw an error but writes "crap" into the Avro file. Is that intended? When doing the same conversion with avro-tools, it throws an exception: org.codehaus.jackson.JsonParseException: Numeric value (2156760545) out of range of int
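For comparison, here is a minimal sketch of the strict behaviour that exception comes from, using the plain Apache Avro library rather than NiFi's record readers; the schema, payload, and class name are illustrative, not taken from the flow:

    import org.apache.avro.Schema;
    import org.apache.avro.generic.GenericDatumReader;
    import org.apache.avro.generic.GenericRecord;
    import org.apache.avro.io.DecoderFactory;

    public class StrictIntRepro {
        public static void main(String[] args) throws Exception {
            // Record schema with a single "int" field
            Schema schema = new Schema.Parser().parse(
                "{\"type\":\"record\",\"name\":\"r\","
                + "\"fields\":[{\"name\":\"f\",\"type\":\"int\"}]}");
            String json = "{\"f\": 2156760545}"; // too big for a signed 32-bit int
            GenericDatumReader<GenericRecord> reader = new GenericDatumReader<>(schema);
            // Avro's JSON decoder delegates to Jackson, which rejects the value
            // with the "Numeric value (...) out of range of int" exception above
            reader.read(null, DecoderFactory.get().jsonDecoder(schema, json));
        }
    }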
The ConvertRecord is configured with JsonTreeReader (Infer Schema strategy) and AvroRecordSetWriter (Use 'Schema Text' Property).
So I guess NiFi converts an inferred long to an explicitly specified int?
How can I make NiFi less lenient? I would prefer a failure to wrong data in the output.
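If that guess is right, the wrong output would simply be a silent narrowing conversion. A minimal sketch of what a long-to-int cast does with this value (hypothetical class, not NiFi code):

    public class NarrowingDemo {
        public static void main(String[] args) {
            long inferred = 2156760545L;   // value inferred as long from the JSON
            int narrowed = (int) inferred; // silent two's-complement wrap-around
            System.out.println(narrowed);  // prints -2138206751, no error raised
            // A strict conversion would fail instead, e.g.:
            // Math.toIntExact(inferred)   // throws ArithmeticException
        }
    }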
Workaround: use ValidateRecord.
I'm also wondering if the ConsumeKafkaRecord processors could be affected.