Details
Type: Bug
Status: Closed
Priority: Minor
Resolution: Invalid
Affects Version/s: 3.0.0
Fix Version/s: None
Component/s: None
Environment: The build was tested in two environments: Debian 10 with OpenJDK 11 and Scala 2.12, and Debian 9.1 with OpenJDK 8 and Scala 2.12. The same error occurred in both. Both environments ran Linux kernel 4.19 and were VirtualBox VMs hosted on a MacBook.
Description
The build fails at the Spark Core stage when using Maven with Hadoop version 3.2.0 specified. The build command is:
./build/mvn -DskipTests -Dhadoop.version=3.2.0 package
The build error output is:
[INFO]
[INFO] --- scala-maven-plugin:4.2.0:testCompile (scala-test-compile-first) @ spark-core_2.12 ---
[INFO] Using incremental compilation using Mixed compile order
[INFO] Compiling 262 Scala sources and 27 Java sources to /usr/local/src/spark/core/target/scala-2.12/test-classes ...
[ERROR] [Error] /usr/local/src/spark/core/src/test/scala/org/apache/spark/util/PropertiesCloneBenchmark.scala:23: object lang is not a member of package org.apache.commons
[ERROR] [Error] /usr/local/src/spark/core/src/test/scala/org/apache/spark/util/PropertiesCloneBenchmark.scala:49: not found: value SerializationUtils
[ERROR] two errors found
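For context, both compiler errors trace back to a commons-lang (2.x) import in the test source: line 23 fails because the package org.apache.commons.lang cannot be resolved, and line 49 then fails because SerializationUtils was never imported. The following is a minimal, hypothetical sketch of what the failing lines likely look like, reconstructed only from the error messages; the actual code in PropertiesCloneBenchmark.scala may differ:

import java.util.Properties
// Line 23 in the real file: resolves only if commons-lang 2.x is on the classpath.
import org.apache.commons.lang.SerializationUtils

object PropertiesCloneSketch {
  def main(args: Array[String]): Unit = {
    val props = new Properties()
    props.setProperty("spark.app.name", "demo")
    // Line 49 in the real file: SerializationUtils.clone makes a deep copy via
    // Java serialization. In commons-lang 2.x it returns Object, hence the cast.
    val cloned = SerializationUtils.clone(props).asInstanceOf[Properties]
    println(cloned.getProperty("spark.app.name"))
  }
}

This suggests the failure is a missing transitive commons-lang 2.x dependency on the test classpath when hadoop.version is overridden, rather than a problem in the test code itself.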
The problem does not occur when building without specifying the Hadoop version, i.e. when running:
./build/mvn -DskipTests package