Spark / SPARK-36773

Fix the UTs to check the Parquet compression


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 3.3.0
    • Fix Version/s: 3.3.0
    • Component/s: SQL
    • Labels: None

    Description

      In my own test environment I pulled in the wrong jar, zstd-jni-1.4.4-3.jar, together with Parquet 1.12.0. The full set of UTs still passed, but after I released that Spark build to my environment, reading a Parquet table compressed with zstd threw a NoClassDefFoundError. Checking the Parquet source code, I found that the correct zstd-jni version for Parquet 1.12.0 is 1.4.9-1; once I upgraded zstd-jni, everything worked well.

      So I think we should add UTs that verify every pluggable compression codec still works whenever we upgrade those compression jars or the Parquet version; a sketch of such a round-trip check follows.
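
      Below is a minimal sketch of the kind of check this issue asks for, assuming a plain SparkSession in local mode; the object name and codec list are illustrative and not Spark's actual test suite. It writes and reads a small Parquet table with each codec so that a missing or incompatible codec jar (such as zstd-jni) surfaces as a failed check instead of a runtime NoClassDefFoundError.

{code:scala}
import java.nio.file.Files

import org.apache.spark.sql.SparkSession

// Hypothetical round-trip check: one write/read per Parquet compression codec.
object ParquetCodecRoundTrip {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[2]")
      .appName("parquet-codec-check")
      .getOrCreate()
    import spark.implicits._

    // Codecs that are usually on Spark's default classpath; lz4, brotli and
    // lzo may need extra native support, so they are left out of this sketch.
    val codecs = Seq("uncompressed", "snappy", "gzip", "zstd")
    val df = (1 to 100).map(i => (i, s"row_$i")).toDF("id", "value")

    codecs.foreach { codec =>
      val path = Files.createTempDirectory(s"parquet_$codec")
        .resolve("data").toString
      // Write with the codec under test, then read back and verify the rows.
      df.write.option("compression", codec).parquet(path)
      val readBack = spark.read.parquet(path)
      assert(readBack.count() == 100, s"codec '$codec' failed the round trip")
    }

    spark.stop()
  }
}
{code}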


          People

            Assignee: jingpan xiong (jp.xiong)
            Reporter: jingpan xiong (jp.xiong)
            Votes: 0
            Watchers: 2
