Details
- Type: Bug
- Status: Resolved
- Priority: Minor
- Resolution: Fixed
- Fix Version/s: 2.0.2, 2.1.2, 2.2.0
- Labels: None
Description
$ spark-shell
…

scala> import org.apache.spark.Partition
import org.apache.spark.Partition

scala> class P(p: Partition)
<console>:11: error: not found: type Partition
       class P(p: Partition)
                  ^

scala> class P(val index: Int) extends Partition
<console>:11: error: not found: type Partition
       class P(val index: Int) extends Partition
                                       ^
Any class that I import gives "not found: type ___" when it is used as a parameter to a class or in an extends clause. This applies to classes imported from JARs I provide via --jars as well as to core Spark classes, as shown above.
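For illustration, a session with a user-supplied JAR hits the same error; the names mylib.jar and com.example.Widget below are hypothetical and stand in for any class on the --jars classpath:

$ spark-shell --jars mylib.jar

scala> import com.example.Widget
import com.example.Widget

scala> class Holder(w: Widget)
<console>:11: error: not found: type Widget
       class Holder(w: Widget)
                       ^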
This worked in 1.6.3 but has been broken since 2.0.0.