SPARK-34212

For a Parquet table, after changing the precision and scale of a decimal column in Hive, Spark reads an incorrect value


    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Blocker
    • Resolution: Fixed
    • Affects Version/s: 2.4.5, 3.0.1, 3.1.1
    • Fix Version/s: 2.4.8, 3.0.2, 3.1.1
    • Component/s: SQL
    • Labels:

      Description

      In Hive, 

      create table test_decimal(amt decimal(18,2)) stored as parquet; 
      insert into test_decimal select 100;
      alter table test_decimal change amt amt decimal(19,3);
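
      Note that the ALTER TABLE only changes the schema recorded in the Hive metastore; the Parquet file that was already written keeps its original decimal(18,2) footer schema. One way to confirm this (a sketch run from spark-shell, using a hypothetical warehouse path) is to read the data files directly, bypassing the metastore:

      // The warehouse location below is an example; substitute the table's actual path.
      // The file footer still records the original type, decimal(18,2).
      spark.read.parquet("/user/hive/warehouse/test_decimal").printSchema()
      // root
      //  |-- amt: decimal(18,2) (nullable = true)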
      

      In Spark,

      select * from test_decimal;
      
      +--------+
      |    amt |
      +--------+
      | 10.000 |
      +--------+
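
      The expected result is 100.000, but Spark returns 10.000. This is consistent with how Parquet stores fixed-precision decimals: the file holds only the unscaled integer (10000 for 100.00 at scale 2), and the reader applies a scale taken from the schema. After the ALTER, the new scale 3 from the metastore is applied to the old unscaled value, which gives 10.000. A minimal sketch of that arithmetic in plain Scala (illustration only, not Spark's reader code):

      import java.math.{BigDecimal, BigInteger}

      // 100.00 at decimal(18,2) is written to the Parquet file as the unscaled integer 10000.
      val unscaled = BigInteger.valueOf(10000L)

      // Interpreted with the scale recorded in the file schema (2): 100.00
      println(new BigDecimal(unscaled, 2))   // prints 100.00

      // Interpreted with the new scale from the altered metastore schema (3): 10.000
      println(new BigDecimal(unscaled, 3))   // prints 10.000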
      


    People

    • Assignee: Dongjoon Hyun (dongjoon)
    • Reporter: Yahui Liu (jack86596)
    • Votes: 0
    • Watchers: 5
