PHOENIX-3504: Spark integration doesn't work with decimal columns that are using default precision


Details

    • Type: Bug
    • Status: Patch Available
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: 4.8.0
    • Fix Version/s: None
    • Component/s: None
    • Labels: None

Description

    Not sure when this issue was introduced or whether this code ever worked correctly, but in PhoenixRDD.phoenixTypeToCatalystType we check decimal precision with
    (columnInfo.getPrecision < 0)
    which fails for decimal columns that were created with the default precision and scale, because precision is null in that case.
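
    A minimal Scala sketch of a null-safe variant of that check, for illustration only; it is not the actual patch, and it assumes ColumnInfo also exposes getScale as a boxed Integer (only getPrecision appears above):

        // Hedged sketch, not the actual Phoenix fix: map a Phoenix DECIMAL column
        // to a Spark Catalyst type without unboxing a possibly-null precision.
        import org.apache.phoenix.util.ColumnInfo
        import org.apache.spark.sql.types.{DataType, DecimalType}

        def decimalCatalystType(columnInfo: ColumnInfo): DataType = {
          // getPrecision returns a boxed Integer; for a DECIMAL created with the
          // default precision and scale it is null, so a bare comparison like
          // (columnInfo.getPrecision < 0) unboxes null and fails.
          val precision: Integer = columnInfo.getPrecision
          val scale: Integer = columnInfo.getScale // assumed accessor
          if (precision == null || scale == null || precision < 0 || scale < 0) {
            // No usable precision/scale: fall back to Spark's DecimalType(38, 18).
            DecimalType.SYSTEM_DEFAULT
          } else {
            DecimalType(precision, scale)
          }
        }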

Attachments

Activity


People

    Assignee: Sergey Soldatov
    Reporter: Sergey Soldatov

Dates

    Created:
    Updated:
