Details
- Type: Bug
- Status: Resolved
- Priority: Trivial
- Resolution: Fixed
- Affects Version/s: 3.1.1, 3.1.2, 3.2.0
- Fix Version/s: None
Description
When following the documented steps for building Spark against a specific Hadoop version, the build fails with the compiler errors below.
[INFO] Compiling 560 Scala sources and 99 Java sources to /Users/puigcalvachef/Documents/os/spark/core/target/scala-2.12/classes ...
[ERROR] [Error] /Users/puigcalvachef/Documents/os/spark/core/src/main/scala/org/apache/spark/ui/HttpSecurityFilter.scala:107: type mismatch;
 found   : K where type K
 required: String
[ERROR] [Error] /Users/puigcalvachef/Documents/os/spark/core/src/main/scala/org/apache/spark/ui/HttpSecurityFilter.scala:107: value map is not a member of V
[ERROR] [Error] /Users/puigcalvachef/Documents/os/spark/core/src/main/scala/org/apache/spark/ui/HttpSecurityFilter.scala:107: missing argument list for method stripXSS in class XssSafeRequest
Unapplied methods are only converted to functions when a function type is expected.
You can make this conversion explicit by writing `stripXSS _` or `stripXSS(_)` instead of `stripXSS`.
[ERROR] [Error] /Users/puigcalvachef/Documents/os/spark/core/src/main/scala/org/apache/spark/ui/PagedTable.scala:307: value startsWith is not a member of K
[ERROR] four errors found
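The third error ("missing argument list for method stripXSS") is Scala 2.12's standard complaint when a method reference appears in a position where no function type is expected; once the earlier type mismatches leave K and V unresolved, the automatic conversion no longer applies. A minimal, self-contained sketch of that rule, using a hypothetical stripXSS that is unrelated to Spark's actual XssSafeRequest implementation:

```scala
// Sketch of Scala 2.12's eta-expansion rule, with a hypothetical stripXSS.
// (Spark's real XssSafeRequest.stripXSS differs; this only illustrates the
// compiler message quoted above.)
object EtaDemo {
  def stripXSS(value: String): String = value.replaceAll("<", "&lt;")

  def main(args: Array[String]): Unit = {
    // A method is converted to a function only where a function type is
    // expected, e.g. an annotated val or an argument to map:
    val f: String => String = stripXSS
    assert(f("<b>") == "&lt;b>")
    assert(Seq("<a>").map(stripXSS) == Seq("&lt;a>"))

    // Without an expected function type, the conversion must be made
    // explicit, exactly as the error message suggests:
    val g = stripXSS _
    assert(g("<b>") == "&lt;b>")
  }
}
```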
A minimal reproduction is to build just the Spark core module with:
./build/mvn -Dhadoop.version=2.7.4 -pl :spark-core_2.12 -DskipTests clean install
After some testing, it appears that a change between 3.1.1-rc1 and 3.1.1-rc2 introduced the failure.
The build fails with several Hadoop versions: 2.7.4, 2.8.1, and 2.8.5.
It succeeds with Hadoop 3.0.0.