Parquet / PARQUET-2025

Bump snappy to 1.1.8.3 to support Mac M1


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 1.12.0
    • Fix Version/s: 1.13.0
    • Component/s: None
    • Labels: None

    Description

      When running Iceberg unit tests on a Mac M1 (Apple Silicon), the following error is thrown:

      Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.xerial.snappy.Snappy
          at org.apache.parquet.hadoop.codec.SnappyCompressor.compress(SnappyCompressor.java:67)
          at org.apache.hadoop.io.compress.CompressorStream.compress(CompressorStream.java:81)
          at org.apache.hadoop.io.compress.CompressorStream.finish(CompressorStream.java:92)
          at org.apache.parquet.hadoop.CodecFactory$HeapBytesCompressor.compress(CodecFactory.java:165)
          at org.apache.parquet.hadoop.ColumnChunkPageWriteStore$ColumnChunkPageWriter.writePage(ColumnChunkPageWriteStore.java:122)
          at org.apache.parquet.column.impl.ColumnWriterV1.writePage(ColumnWriterV1.java:53)
          at org.apache.parquet.column.impl.ColumnWriterBase.writePage(ColumnWriterBase.java:315)
          at org.apache.parquet.column.impl.ColumnWriteStoreBase.flush(ColumnWriteStoreBase.java:152)
          at org.apache.parquet.column.impl.ColumnWriteStoreV1.flush(ColumnWriteStoreV1.java:27)
          at org.apache.parquet.hadoop.InternalParquetRecordWriter.flushRowGroupToStore(InternalParquetRecordWriter.java:172)
          at org.apache.parquet.hadoop.InternalParquetRecordWriter.close(InternalParquetRecordWriter.java:114)
          at org.apache.parquet.hadoop.ParquetRecordWriter.close(ParquetRecordWriter.java:165)
          at org.apache.spark.sql.execution.datasources.parquet.ParquetOutputWriter.close(ParquetOutputWriter.scala:42)
          at org.apache.spark.sql.execution.datasources.FileFormatDataWriter.releaseResources(FileFormatDataWriter.scala:57)
          at org.apache.spark.sql.execution.datasources.FileFormatDataWriter.commit(FileFormatDataWriter.scala:74)
          at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$3.apply(FileFormatWriter.scala:247)
          at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask$3.apply(FileFormatWriter.scala:242)
          at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1394)
          at org.apache.spark.sql.execution.datasources.FileFormatWriter$.org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask(FileFormatWriter.scala:248)
          ... 10 more
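
      The root cause is that snappy-java releases prior to the 1.1.8.x line do not bundle a
      native library for macOS on aarch64, so org.xerial.snappy.Snappy fails to initialize on
      Apple Silicon. Below is a minimal sketch (not part of this issue) for verifying that the
      bundled native library loads on a given machine; it assumes org.xerial.snappy:snappy-java
      1.1.8.3 or later is on the classpath, and the class name SnappyNativeCheck is made up for
      illustration.

          import java.io.IOException;
          import java.nio.charset.StandardCharsets;
          import java.util.Arrays;

          import org.xerial.snappy.Snappy;

          // Minimal check that the snappy-java native library initializes and can round-trip data.
          // With a pre-1.1.8.x snappy-java on an Apple Silicon Mac, the first Snappy call fails
          // while initializing the native binding, which surfaces in Parquet as the
          // NoClassDefFoundError shown in the stack trace above.
          public class SnappyNativeCheck {
              public static void main(String[] args) throws IOException {
                  byte[] input = "parquet snappy native check".getBytes(StandardCharsets.UTF_8);

                  byte[] compressed = Snappy.compress(input);   // triggers loading of the native library
                  byte[] restored = Snappy.uncompress(compressed);

                  if (!Arrays.equals(input, restored)) {
                      throw new IllegalStateException("Snappy round-trip mismatch");
                  }
                  System.out.println("snappy-java native library loaded; round-trip succeeded");
              }
          }

      Bumping the org.xerial.snappy:snappy-java dependency to 1.1.8.3, which includes a macOS
      aarch64 native binary, makes this check (and the Parquet Snappy codec) work on Mac M1.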

      People

          • Assignee: Unassigned
          • Reporter: Junjie Chen