Apache NiFi / NIFI-2531

SQL-to-Avro processors do not convert BIGINT correctly


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 1.0.0, 0.7.0
    • Fix Version/s: 1.0.0
    • Component/s: None
    • Labels: None

    Description

      For the SQL-to-Avro processors that use JdbcCommon (such as ExecuteSQL), if a BigInteger object is put into an Avro record, it is stored as a String. However, when the Avro schema is created and the SQL type of the column is BIGINT, the schema contains the expected type "long" (actually a union of null and long, to allow for null values). This causes errors such as:
      UnresolvedUnionException: not in union: ["null", "long"]

      If a BigInteger is retrieved from the result set and the SQL type is BIGINT, its value is expected to fit into 8 bytes and should therefore be converted to a long before being stored in the Avro record.
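      The conversion described above can be sketched as follows. This is a minimal standalone illustration, not the actual JdbcCommon code; the class and method names here are hypothetical. The idea is simply that a BigInteger whose column's SQL type is BIGINT is narrowed to a primitive long, so the value matches the schema's ["null", "long"] union instead of being stored as a String.

```java
import java.math.BigInteger;
import java.sql.Types;

// Hypothetical sketch of the fix: narrow BigInteger to long when the
// column's SQL type is BIGINT, so the value fits the Avro ["null", "long"] union.
public class BigIntConversionSketch {

    // Returns the object to store in the Avro record for the given
    // ResultSet value and its java.sql.Types type code.
    static Object coerceForAvro(Object value, int sqlType) {
        if (value instanceof BigInteger && sqlType == Types.BIGINT) {
            // A BIGINT column fits in 8 bytes, so longValueExact() is safe for
            // valid data; it throws ArithmeticException on overflow rather than
            // silently truncating.
            return ((BigInteger) value).longValueExact();
        }
        // Other values pass through unchanged.
        return value;
    }

    public static void main(String[] args) {
        Object coerced = coerceForAvro(new BigInteger("9007199254740993"), Types.BIGINT);
        System.out.println(coerced.getClass().getSimpleName() + " " + coerced);
    }
}
```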


    People

      Assignee: Matt Burgess (mattyb149)
      Reporter: Matt Burgess (mattyb149)
      Votes: 0
      Watchers: 3
