Apache NiFi / NIFI-2531

SQL-to-Avro processors do not convert BIGINT correctly


    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 1.0.0, 0.7.0
    • Fix Version/s: 1.0.0
    • Component/s: None
    • Labels: None

      Description

      For the SQL-to-Avro processors that use JdbcCommon (such as ExecuteSQL), a BigInteger object retrieved from a result set is put into the Avro record as a String. However, when the Avro schema is created and the SQL type of the column is BIGINT, the schema declares the expected type "long" (actually a union of null and long, to allow for null values). This causes errors such as:
      UnresolvedUnionException: not in union: ["null", "long"]

      If a BigInteger is retrieved from the result set and the SQL type is BIGINT, then its value is expected to fit into 8 bytes and should thus be converted to a long before storing in the Avro record.
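
      A minimal sketch of the conversion described above, written directly against the Avro GenericRecord API (the record and field names are illustrative, not the actual JdbcCommon code):

      import java.math.BigInteger;

      import org.apache.avro.Schema;
      import org.apache.avro.SchemaBuilder;
      import org.apache.avro.generic.GenericData;
      import org.apache.avro.generic.GenericRecord;

      public class BigIntFieldSketch {

          public static void main(String[] args) {
              // Schema as created for a BIGINT column: a union of null and long
              // so that NULL database values remain representable.
              final Schema schema = SchemaBuilder.record("row").fields()
                      .name("id").type().unionOf().nullType().and().longType().endUnion().noDefault()
                      .endRecord();

              final GenericRecord record = new GenericData.Record(schema);

              // Some JDBC drivers hand back a BigInteger for a BIGINT column.
              final Object value = new BigInteger("9223372036854775806");

              // Buggy path: record.put("id", value.toString()) stores a String, which later
              // fails during serialization with
              //   UnresolvedUnionException: not in union: ["null", "long"]
              // Fixed path: a BIGINT value fits in 8 bytes, so narrow it to a primitive long first.
              if (value instanceof BigInteger) {
                  record.put("id", ((BigInteger) value).longValue());
              } else {
                  record.put("id", value);
              }

              System.out.println(record); // {"id": 9223372036854775806}
          }
      }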

              People

              • Assignee: mattyb149 Matt Burgess
              • Reporter: mattyb149 Matt Burgess
