Kafka / KAFKA-9067

BigDecimal conversion unnecessarily enforces the scale


Details

    • Type: Bug
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: 2.3.0
    • Fix Version/s: None
    • Component/s: connect

    Description

      In the Kafka Connect schema framework, fixed-point decimal numbers can be represented with the logical type Decimal, which corresponds to the Avro-defined logical type. When this type is used, the scale is stored in the schema definition (and may later end up in an Avro schema), while the unscaled value is stored as an arbitrary-precision integer.

      The problem arises when the decimal value to be converted has a different scale than the one declared in the schema. During conversion to Avro or JSON using the standard converters, the operation fails with a DataException.
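
      The failure mode can be sketched in plain Java. The method and exception names below are illustrative, not the exact Connect source (Connect throws DataException); the point is that only the unscaled value is serialized, so a value whose scale differs from the schema's is rejected rather than adapted:

```java
import java.math.BigDecimal;
import java.math.BigInteger;

public class DecimalEncodeSketch {
    // Hedged sketch of the converter-side check; Connect stores only the
    // unscaled value, the scale lives in the schema definition.
    static byte[] encode(int schemaScale, BigDecimal value) {
        if (value.scale() != schemaScale)
            throw new IllegalArgumentException(
                "BigDecimal has mismatching scale for the given Decimal schema");
        return value.unscaledValue().toByteArray();
    }

    public static void main(String[] args) {
        // Scale 2 matches the schema: encoding succeeds.
        byte[] ok = encode(2, new BigDecimal("12.34"));
        System.out.println(new BigInteger(ok)); // unscaled value 1234

        // Scale 3 does not match: the conversion is rejected outright.
        try {
            encode(2, new BigDecimal("12.345"));
        } catch (IllegalArgumentException e) {
            System.out.println("rejected: " + e.getMessage());
        }
    }
}
```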

      The proposed solution is to use the setScale method to adapt the value to the declared scale, with the rounding mode supplied as a parameter on the schema:
      https://docs.oracle.com/javase/8/docs/api/java/math/BigDecimal.html#setScale-int-java.math.RoundingMode-
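
      A minimal sketch of the proposed normalization, using only java.math (the schemaScale variable stands in for the scale read from the schema, and HALF_UP stands in for whatever rounding mode the schema parameter would select):

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

public class DecimalSetScaleSketch {
    public static void main(String[] args) {
        int schemaScale = 2; // scale declared in the Decimal schema

        // Value arrives with scale 4; instead of failing, normalize it.
        BigDecimal value = new BigDecimal("12.3456");
        BigDecimal adjusted = value.setScale(schemaScale, RoundingMode.HALF_UP);
        System.out.println(adjusted); // 12.35, now matches the schema

        // Widening the scale is lossless: a trailing zero is padded.
        BigDecimal widened = new BigDecimal("7.5").setScale(schemaScale, RoundingMode.HALF_UP);
        System.out.println(widened); // 7.50
    }
}
```

      Narrowing the scale is lossy (hence the need for an explicit, configurable RoundingMode), while widening is always exact.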


          People

            Assignee: Unassigned
            Reporter: Piotr Smolinski (psmolinski)

            Dates

              Created:
              Updated:
