Details
- Type: Bug
- Status: Open
- Priority: P3
- Resolution: Unresolved
Description
When run on a remote Spark or Flink cluster, ParquetIOIT fails with the following stack trace:
org.apache.beam.sdk.io.parquet.ParquetIOIT > writeThenReadAll FAILED
    org.apache.beam.sdk.Pipeline$PipelineExecutionException: java.lang.NoSuchMethodError: org.apache.parquet.hadoop.ParquetWriter$Builder.<init>(Lorg/apache/parquet/io/OutputFile;)V
        at org.apache.beam.runners.spark.SparkPipelineResult.beamExceptionFrom(SparkPipelineResult.java:66)
        at org.apache.beam.runners.spark.SparkPipelineResult.waitUntilFinish(SparkPipelineResult.java:99)
        at org.apache.beam.runners.spark.SparkPipelineResult.waitUntilFinish(SparkPipelineResult.java:87)
        at org.apache.beam.runners.spark.TestSparkRunner.run(TestSparkRunner.java:116)
        at org.apache.beam.runners.spark.TestSparkRunner.run(TestSparkRunner.java:61)
        at org.apache.beam.sdk.Pipeline.run(Pipeline.java:313)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:350)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:331)
        at org.apache.beam.sdk.io.parquet.ParquetIOIT.writeThenReadAll(ParquetIOIT.java:133)
    Caused by: java.lang.NoSuchMethodError: org.apache.parquet.hadoop.ParquetWriter$Builder.<init>(Lorg/apache/parquet/io/OutputFile;)V
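A NoSuchMethodError like this usually means the runner's classpath carries an older parquet-hadoop jar (the OutputFile-based Builder constructor was introduced in newer Parquet releases) that shadows the version Beam was compiled against. A minimal sketch of one way to diagnose this is to print which jar a class was actually loaded from; the class name `ClasspathProbe` and the self-probe in `main` are illustrative assumptions, not part of the test suite:

```java
import java.net.URL;
import java.security.CodeSource;

public class ClasspathProbe {
    // Returns the jar or directory the named class was loaded from,
    // or null for JDK classes (their CodeSource is typically null).
    static URL sourceOf(String className) throws ClassNotFoundException {
        CodeSource cs = Class.forName(className)
                .getProtectionDomain().getCodeSource();
        return cs == null ? null : cs.getLocation();
    }

    public static void main(String[] args) throws Exception {
        // On the failing cluster one would probe the conflicting class, e.g.:
        //   sourceOf("org.apache.parquet.hadoop.ParquetWriter$Builder")
        // Here we probe this class itself so the sketch runs anywhere.
        System.out.println(sourceOf(ClasspathProbe.class.getName()));
    }
}
```

Running this (or an equivalent one-liner) inside a pipeline step on the cluster would show whether the Parquet classes come from the Spark/Flink distribution's jars rather than from the staged Beam dependencies.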
Issue Links
- relates to: BEAM-7979 "Avro incompatibilities with Spark 2.2 and Spark 2.3" (Open)
- links to