SPARK-3794

Building Spark core fails due to inadvertent dependency on Commons IO


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 1.2.0
    • Fix Version/s: 1.2.0
    • Component/s: Spark Core
    • Environment: Mac OS X 10.9.5

    Description

      At commit cf1d32e3e1071829b152d4b597bf0a0d7a5629a2, building Spark core results in a compilation error when certain Hadoop versions are specified.

      To reproduce this issue, execute the following commands with <hadoop.version> set to 1.1.0, 1.1.1, 1.1.2, 1.2.0, 1.2.1, or 2.2.0.

      $ cd ./core
      $ mvn -Dhadoop.version=<hadoop.version> -DskipTests clean compile
      ...
      [ERROR] /Users/tomohiko/MyRepos/Scala/spark/core/src/main/scala/org/apache/spark/util/Utils.scala:720: value listFilesAndDirs is not a member of object org.apache.commons.io.FileUtils
      [ERROR]       val files = FileUtils.listFilesAndDirs(dir, TrueFileFilter.TRUE, TrueFileFilter.TRUE)
      [ERROR]                             ^
      

      Because this compilation uses commons-io version 2.1, while the FileUtils#listFilesAndDirs method was only added in commons-io version 2.2, the compilation always fails.

      FileUtils#listFilesAndDirs → http://commons.apache.org/proper/commons-io/apidocs/org/apache/commons/io/FileUtils.html#listFilesAndDirs%28java.io.File,%20org.apache.commons.io.filefilter.IOFileFilter,%20org.apache.commons.io.filefilter.IOFileFilter%29
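
      As a minimal sketch of a workaround that needs no commons-io 2.2 API, the failing call could be replaced with a plain java.io traversal. Note that listFilesRecursively is a hypothetical helper name for illustration, not the change that actually resolved this issue:

      import java.io.File

      // Hypothetical replacement sketch: return dir itself plus every file and
      // directory beneath it, mirroring what
      // FileUtils.listFilesAndDirs(dir, TrueFileFilter.TRUE, TrueFileFilter.TRUE)
      // returns, but using only java.io so commons-io 2.1 on the classpath suffices.
      def listFilesRecursively(dir: File): Seq[File] = {
        val children = Option(dir.listFiles).map(_.toSeq).getOrElse(Seq.empty)
        dir +: children.flatMap { child =>
          if (child.isDirectory) listFilesRecursively(child) else Seq(child)
        }
      }

      // Usage at the failing call site:
      // val files = listFilesRecursively(dir)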

      Because hadoop-client in those problematic versions depends on commons-io 2.1, not 2.4, we should assume that only commons-io 2.1 is available.
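
      To check which commons-io version hadoop-client actually puts on the compile classpath, the Maven dependency plugin can help (hadoop.version 1.1.0 here is just one of the problematic versions; the transitive path in the output will vary by Hadoop version):

      $ mvn -Dhadoop.version=1.1.0 dependency:tree -Dincludes=commons-io:commons-io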


People

    Assignee: Unassigned
    Reporter: cocoatomo (Tomohiko K.)
