SPARK-38846: Teradata's NUMBER is converted to either its floor or ceiling value, losing its fractional part


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 2.3.0, 3.2.1
    • Fix Version/s: 3.4.0
    • Component/s: SQL
    • Labels: None
    • Environment: Spark 2.3.0 / Spark 3.2.1 on YARN; Teradata 16.20.32.59

    Description

I'm trying to load data from Teradata; the code I'm using is (with `hostname`, `username`, `password`, and `table_name` defined elsewhere):

          sparkSession.read
            .format("jdbc")
            .options(
              Map(
                "url" -> s"jdbc:teradata://$hostname, user=$username, password=$password",
                "MAYBENULL" -> "ON",
                "SIP_SUPPORT" -> "ON",
                "driver" -> "com.teradata.jdbc.TeraDriver",
                "dbtable" -> table_name
              )
            )
            .load()

However, some values lose their fractional part after loading. To be more precise, the column in Teradata is of the NUMBER type, and after loading, its data type in Spark is `DecimalType(38,0)`; the scale of 0 means no digits are kept after the decimal point.
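
      The truncation is visible in the inferred schema; assuming the result of the read above is bound to `df`, the output looks like:

          df.printSchema()
          // root
          //  |-- id: long (nullable = true)
          //  |-- column1: decimal(38,0) (nullable = true)
          //  |-- column2: decimal(38,0) (nullable = true)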

The data in Teradata is something like:

          id column1 column2
          1   50.23    100.23
          2   25.8     20.669
          3   30.2     19.23

The loaded Spark DataFrame looks like (note that 25.8 becomes 26 and 20.669 becomes 21, i.e. the values are rounded to a scale of 0, not merely truncated):

          id column1 column2
          1   50     100
          2   26     21
          3   30     19

The metadata of the table in Teradata is:

          CREATE SET TABLE table_name (id BIGINT, column1 NUMBER, column2 NUMBER) PRIMARY INDEX (id);
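
      A possible workaround on the affected versions is the JDBC source's `customSchema` option, which overrides the types Spark infers for the listed columns; a minimal sketch, where the scale of 6 is an assumption about how many fractional digits the data needs:

          sparkSession.read
            .format("jdbc")
            .options(
              Map(
                "url" -> s"jdbc:teradata://$hostname, user=$username, password=$password",
                "driver" -> "com.teradata.jdbc.TeraDriver",
                "dbtable" -> table_name,
                // Read the NUMBER columns as decimals that keep fractional
                // digits instead of the inferred DecimalType(38,0).
                "customSchema" -> "column1 DECIMAL(38,6), column2 DECIMAL(38,6)"
              )
            )
            .load()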

      The Spark version is 2.3.0/3.2.1 and Teradata is 16.20.32.59. 
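
      Alternatively, a job pinned to an affected version can register a custom `JdbcDialect` that maps Teradata's NUMBER to a decimal type with a nonzero scale; a rough sketch, assuming the driver reports NUMBER as `java.sql.Types.NUMERIC` and that a scale of 6 fits the data (the actual fix that shipped in 3.4.0 may map the type differently):

          import java.sql.Types

          import org.apache.spark.sql.jdbc.{JdbcDialect, JdbcDialects}
          import org.apache.spark.sql.types.{DataType, DecimalType, MetadataBuilder}

          // Hypothetical dialect; registerDialect prepends it to the dialect
          // list, so it takes precedence over the built-in Teradata dialect.
          object TeradataNumberDialect extends JdbcDialect {
            override def canHandle(url: String): Boolean =
              url.toLowerCase.startsWith("jdbc:teradata")

            override def getCatalystType(
                sqlType: Int, typeName: String, size: Int,
                md: MetadataBuilder): Option[DataType] = {
              if (sqlType == Types.NUMERIC && typeName.equalsIgnoreCase("NUMBER")) {
                // Keep fractional digits instead of defaulting to scale 0.
                Some(DecimalType(38, 6))
              } else {
                None
              }
            }
          }

          JdbcDialects.registerDialect(TeradataNumberDialect)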

       

People

    Assignee: eugeneple eugene
    Reporter: eugeneple eugene
