Description
With the Scala-2.12 profile, Spark itself builds and runs fine, but Spark applications built against it fail. For example, our documented `SimpleApp` example compiles successfully but fails at runtime because it resolves paranamer 2.7 instead of 2.8 and hits SPARK-22128.
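For context, `SimpleApp` here is essentially the word-count example from the Spark quick-start guide; the sketch below is a minimal reproduction under that assumption (the input path is illustrative). Under Scala 2.12, its closures compile to Java 8 lambdas, which is the class shape that trips paranamer 2.7's BytecodeReadingParanamer (SPARK-22128):

import org.apache.spark.sql.SparkSession

object SimpleApp {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("Simple Application").getOrCreate()
    // Any readable text file works; the quick-start uses Spark's README.md.
    val logData = spark.read.textFile("README.md").cache()
    // Under Scala 2.12 these closures become Java 8 lambdas, the pattern
    // that paranamer 2.7 fails to parse (ArrayIndexOutOfBoundsException).
    val numAs = logData.filter(line => line.contains("a")).count()
    val numBs = logData.filter(line => line.contains("b")).count()
    println(s"Lines with a: $numAs, Lines with b: $numBs")
    spark.stop()
  }
}

The dependency tree below confirms the application ends up with paranamer 2.7, pulled in transitively through Avro 1.8.2: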
$ mvn dependency:tree -Dincludes=com.thoughtworks.paranamer
[INFO] --- maven-dependency-plugin:2.8:tree (default-cli) @ simple ---
[INFO] my.test:simple:jar:1.0-SNAPSHOT
[INFO] \- org.apache.spark:spark-sql_2.12:jar:3.0.0-SNAPSHOT:compile
[INFO]    \- org.apache.spark:spark-core_2.12:jar:3.0.0-SNAPSHOT:compile
[INFO]       \- org.apache.avro:avro:jar:1.8.2:compile
[INFO]          \- com.thoughtworks.paranamer:paranamer:jar:2.7:compile
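Until this is fixed on the Spark side, a minimal sketch of a user-side workaround (assuming a standard Maven build like the `my.test:simple` project above) is to pin paranamer 2.8 in the application's own pom.xml so it wins dependency mediation:

<dependencyManagement>
  <dependencies>
    <!-- Override the paranamer 2.7 pulled in transitively via Avro 1.8.2. -->
    <dependency>
      <groupId>com.thoughtworks.paranamer</groupId>
      <artifactId>paranamer</artifactId>
      <version>2.8</version>
    </dependency>
  </dependencies>
</dependencyManagement>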
Issue Links
- is duplicated by SPARK-26819: ArrayIndexOutOfBoundsException while loading a CSV to a Dataset with dependencies spark-core_2.12 and spark-sql_2.12 (with spark-core_2.11 and spark-sql_2.11: working fine) (Closed)
- relates to SPARK-22128: Update paranamer to 2.8 to avoid BytecodeReadingParanamer ArrayIndexOutOfBoundsException with Scala 2.12 + Java 8 lambda (Resolved)