Spark / SPARK-33090

Upgrade Google Guava


Details

    • Type: Improvement
    • Status: In Progress
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: 3.0.1
    • Fix Version/s: None
    • Component/s: Build
    • Labels: None

    Description

      Hadoop versions newer than 3.2.0 (such as 3.2.1 and 3.3.0) have started using features from newer versions of Google Guava.

      This leads to NoSuchMethodError failures (and similar errors) in Spark builds that specify newer versions of Hadoop. I believe this is due to the use of new methods in com.google.common.base.Preconditions.
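As an illustrative sketch (the class name and diagnostic logic below are my own, not part of Spark or Hadoop), the mismatch can be probed reflectively: Guava 20 introduced primitive-specialised Preconditions.checkArgument overloads, so looking one up on a classpath that only has guava-14.0.1 fails, which is what Hadoop 3.2.1+ trips over at runtime:

```java
import java.lang.reflect.Method;

// Hypothetical diagnostic (not part of Spark): probe for a
// Preconditions.checkArgument overload that exists in Guava 20+
// but not in guava-14.0.1. Code compiled against newer Guava
// that calls such an overload fails on old Guava at runtime
// with NoSuchMethodError.
public class GuavaCheck {
    static String probe() {
        try {
            Class<?> pre = Class.forName("com.google.common.base.Preconditions");
            // Primitive-specialised overload added in Guava 20:
            Method m = pre.getMethod("checkArgument",
                    boolean.class, String.class, long.class);
            return "present: " + m.getName();
        } catch (ClassNotFoundException e) {
            return "guava not on classpath";
        } catch (NoSuchMethodException e) {
            return "old guava: overload missing";
        }
    }

    public static void main(String[] args) {
        System.out.println(probe());
    }
}
```

Running this with guava-14.0.1 on the classpath reports the overload as missing; with guava-27.0-jre or later it is found.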

      The above versions of Hadoop use guava-27.0-jre, whereas Spark is currently pinned to guava-14.0.1.

      I have been running a Spark cluster with the version bumped to guava-29.0-jre without issue.

      Partly due to the way Spark is built, this change is a little more complicated than just changing the version, because newer versions of Guava have a new dependency on com.google.guava:failureaccess:1.0.
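As a rough sketch of the build change (the `guava.version` property name is my assumption about how Spark's pom.xml is organised; the failureaccess coordinates come from the description above, and 29.0-jre is the version reported working on a live cluster):

```xml
<!-- Hypothetical sketch of the pom.xml change, not the actual patch -->
<properties>
  <!-- was 14.0.1 -->
  <guava.version>29.0-jre</guava.version>
</properties>

<dependencies>
  <dependency>
    <groupId>com.google.guava</groupId>
    <artifactId>guava</artifactId>
    <version>${guava.version}</version>
  </dependency>
  <!-- Newer Guava additionally requires failureaccess at runtime -->
  <dependency>
    <groupId>com.google.guava</groupId>
    <artifactId>failureaccess</artifactId>
    <version>1.0</version>
  </dependency>
</dependencies>
```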


      People

        Assignee: Unassigned
        Reporter: Stephen Coy (sfcoy)