Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Duplicate
- Affects Version/s: 2.3.0, 2.4.1
- Fix Version/s: None
Description
[ERROR] $SPARK_HOME/mllib-local/src/test/scala/org/apache/spark/ml/linalg/MatricesSuite.scala:866: value size is not a member of Object
[ERROR] $SPARK_HOME/mllib-local/src/test/scala/org/apache/spark/ml/linalg/MatricesSuite.scala:869: value size is not a member of Object
[ERROR] two errors found
Below is the related code:
test("toString") {
  val empty = Matrices.ones(0, 0)
  empty.toString(0, 0)

  val mat = Matrices.rand(5, 10, new Random())
  mat.toString(-1, -5)
  mat.toString(0, 0)
  mat.toString(Int.MinValue, Int.MinValue)
  mat.toString(Int.MaxValue, Int.MaxValue)

  var lines = mat.toString(6, 50).lines.toArray
  assert(lines.size == 5 && lines.forall(_.size <= 50))

  lines = mat.toString(5, 100).lines.toArray
  assert(lines.size == 5 && lines.forall(_.size <= 100))
}
test("numNonzeros and numActives") {
  val dm1 = Matrices.dense(3, 2, Array(0, 0, -1, 1, 0, 1))
  assert(dm1.numNonzeros === 3)
  assert(dm1.numActives === 6)

  val sm1 = Matrices.sparse(3, 2, Array(0, 2, 3), Array(0, 2, 1), Array(0.0, -1.2, 0.0))
  assert(sm1.numNonzeros === 1)
  assert(sm1.numActives === 3)
}
What should I do to solve this problem, and when will Spark support JDK 11?
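Note: the errors above are consistent with compiling under JDK 11, where `java.lang.String` gained a `lines()` method returning `java.util.stream.Stream[String]`. That method shadows Scala's `StringOps.lines`, so `.lines.toArray` produces an `Array[Object]`, which has no `size` member. A minimal sketch of a workaround (not the official Spark fix) is to use Scala's `linesIterator`, which is unambiguous on both JDK 8 and JDK 11:

```scala
// Sketch: avoid the JDK 11 name clash between java.lang.String.lines()
// (returns java.util.stream.Stream[String]) and Scala's StringOps.lines
// by using linesIterator, which always yields an Iterator[String].
object LinesWorkaround {
  def main(args: Array[String]): Unit = {
    val text = "row one\nrow two\nrow three"

    // linesIterator resolves to StringOps on any JDK, so toArray
    // gives Array[String] rather than Array[Object].
    val lines = text.linesIterator.toArray

    assert(lines.size == 3)
    assert(lines.forall(_.size <= 10))
    println(lines.mkString("|"))
  }
}
```

Expected output: `row one|row two|row three`. The official JDK 11 work is tracked in SPARK-24417, linked below.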
Issue Links
- duplicates: SPARK-24417 Build and Run Spark on JDK11 (Resolved)