Details
- Type: Sub-task
- Status: Resolved
- Priority: Minor
- Resolution: Fixed
- Affects Version/s: 2.3.0
- Labels: None
Description
Minor, but probably needs a JIRA.
After updating to Scala 2.12 I encountered this issue pretty quickly in tests: https://github.com/paul-hammant/paranamer/issues/17
java.lang.ArrayIndexOutOfBoundsException: .... at com.thoughtworks.paranamer.BytecodeReadingParanamer$ClassReader.accept(BytecodeReadingParanamer.java:563) ...
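The failure can be hit outside Spark as well. Below is a hypothetical minimal sketch (the `Point` case class and `Repro` object are illustrative, not from the issue), assuming Scala 2.12 with jackson-module-scala on the classpath, backed by the old paranamer 2.6:

```scala
import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.module.scala.DefaultScalaModule

// Illustrative case class; any case class deserialized via Jackson will do.
case class Point(x: Int, y: Int)

object Repro {
  def main(args: Array[String]): Unit = {
    val mapper = new ObjectMapper().registerModule(DefaultScalaModule)
    // Deserializing into a case class makes jackson-module-scala call
    // paranamer to recover constructor parameter names; with paranamer 2.6
    // reading Scala 2.12 bytecode, BytecodeReadingParanamer throws the
    // ArrayIndexOutOfBoundsException shown above.
    val p = mapper.readValue("""{"x":1,"y":2}""", classOf[Point])
    println(p)
  }
}
```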
Spark depends on jackson-module-paranamer 2.6.7 to match the other Jackson dependencies, and that in turn depends on paranamer 2.6. The bug above is fixed in paranamer 2.8.
However, I noticed that Spark actually uses jackson-module-scala 2.6.7.1, which depends on jackson-module-paranamer 2.7.9 (kind of odd), and that already depends on paranamer 2.8.
So it seemed prudent to simply stop managing jackson-module-paranamer down to 2.6.7. We do still need to manage paranamer itself up to 2.8, because Avro 1.7 pulls in 2.3.
But it all seems to work in a quick test. And it's necessary to get 2.12 working.
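In pom.xml terms, the change described above amounts to dropping the explicit jackson-module-paranamer pin and managing paranamer itself. A sketch of the managed entry (the coordinates are paranamer's standard ones; the surrounding pom layout is an assumption, not copied from Spark's build):

```xml
<dependencyManagement>
  <dependencies>
    <!-- Avro 1.7 transitively pulls paranamer 2.3, which predates the
         ClassReader fix; manage it up to 2.8, matching what
         jackson-module-paranamer 2.7.9 already expects. -->
    <dependency>
      <groupId>com.thoughtworks.paranamer</groupId>
      <artifactId>paranamer</artifactId>
      <version>2.8</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```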
Issue Links
- is duplicated by SPARK-26819 "ArrayIndexOutOfBoundsException while loading a CSV to a Dataset with dependencies spark-core_2.12 and spark-sql_2.12 (with spark-core_2.11 and spark-sql_2.11: working fine)" (Closed)
- is related to SPARK-26583 "Add `paranamer` dependency to `core` module" (Resolved)
- relates to BEAM-14345 "Hadoop version tests are failing for Spark 3" (Triage Needed)