Details
- Type: Bug
- Status: Closed
- Priority: Blocker
- Resolution: Duplicate
Description
Running the following Maven command line:
mvn -Dspark.version=2.2.2 -Dscala.version=2.11.7 -Dscala.binary.version=2.11 -Dcheckstyle.skip=true -Dmaven.test.skip=true clean install
returns this error:
[ERROR] [Error] F:\hbase-connectors\spark\hbase-spark\src\main\scala\org\apache\hadoop\hbase\spark\datasources\HBaseTableScanRDD.scala:216: overloaded method value addTaskCompletionListener with alternatives:
(f: org.apache.spark.TaskContext => Unit)org.apache.spark.TaskContext <and>
(listener: org.apache.spark.util.TaskCompletionListener)org.apache.spark.TaskContext
does not take type parameters
[ERROR] one error found
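The error occurs because later Spark versions add a type-parameterized overload addTaskCompletionListener[U], while Spark 2.2.x only exposes the two non-generic alternatives listed above, so the connector's call site, which passes a type parameter, cannot compile against 2.2.x. A minimal, Spark-free sketch (the Ctx and listener types below are stand-ins, not the real Spark API) of the 2.2-style overload set:

```scala
// Stand-in types (NOT the real Spark API) mirroring the two alternatives
// reported by the compiler for Spark 2.2.x.
trait TaskCompletionListener {
  def onTaskCompletion(ctx: Ctx): Unit
}

class Ctx {
  // The two non-generic alternatives from the error message:
  def addTaskCompletionListener(f: Ctx => Unit): Ctx = { f(this); this }
  def addTaskCompletionListener(listener: TaskCompletionListener): Ctx = {
    listener.onTaskCompletion(this); this
  }
}

object Demo {
  def run(): Int = {
    var calls = 0
    val ctx = new Ctx
    // A plain Ctx => Unit value resolves to the first alternative and compiles:
    val f: Ctx => Unit = _ => calls += 1
    ctx.addTaskCompletionListener(f)
    // The connector's call site passes a type parameter, which only works once
    // a generic addTaskCompletionListener[U] overload exists; against the
    // 2.2-style API above, the next line fails with
    // "does not take type parameters":
    // ctx.addTaskCompletionListener[Unit] { _ => calls += 1 }
    calls
  }
}
```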
With spark.version=2.4.0, however, the build succeeds:
mvn -Dspark.version=2.4.0 -Dscala.version=2.11.7 -Dscala.binary.version=2.11 -Dcheckstyle.skip=true -Dmaven.test.skip=true clean install
Another attempt, against Spark 3.0.0:
mvn -Dspark.version=3.0.0 -Dscala.version=2.12.12 -Dscala.binary.version=2.12 -Dcheckstyle.skip=true -Dmaven.test.skip=true clean install
returns these errors:
[ERROR] [Error] F:\hbase-connectors\spark\hbase-spark\src\main\scala\org\apache\hadoop\hbase\spark\HBaseContext.scala:439: object SparkHadoopUtil in package deploy cannot be accessed in package org.apache.spark.deploy
[ERROR] [Error] F:\hbase-connectors\spark\hbase-spark\src\main\scala\org\apache\hadoop\hbase\spark\HBaseContext.scala:487: not found: value SparkHadoopUtil
[ERROR] two errors found
Checking the Spark source on GitHub: SparkHadoopUtil is declared private[spark], so it cannot be accessed from outside the org.apache.spark package:
private[spark] class SparkHadoopUtil extends Logging {}
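A self-contained sketch of the access rule behind the error (the org.example packages and class names below are hypothetical, not the real Spark classes): a private[spark] class is visible only inside the enclosing spark package, so the connector's own package cannot reference it.

```scala
package org.example.spark {
  // Mirrors Spark's declaration style: visible only within org.example.spark.
  private[spark] class SparkHadoopUtilLike {
    def confValue: String = "fs.defaultFS=file:///"
  }

  // Same package, so access is allowed; exposes a public accessor.
  object SparkHadoopUtilAccess {
    def read(): String = new SparkHadoopUtilLike().confValue
  }
}

package org.example.connector {
  object Connector {
    // Direct use from this package fails to compile, matching the reported
    // error:
    //   new org.example.spark.SparkHadoopUtilLike
    //   => "class SparkHadoopUtilLike in package spark cannot be accessed"
    // Only the public surface of the spark package is usable:
    def run(): String = org.example.spark.SparkHadoopUtilAccess.read()
  }
}
```

In practice a connector can often avoid the private class entirely, e.g. the Hadoop Configuration is also reachable through the public SparkContext.hadoopConfiguration method; the actual resolution for this project was tracked in HBASE-25326.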
Issue Links
- duplicates HBASE-25326 (Allow hbase-connector to be used with Apache Spark 3.0) - Resolved