Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Duplicate
- Affects Version/s: 0.9.1, 1.0.0
- Fix Version/s: None
- Environment: Ubuntu 14.04 with spark-x.x.x-bin-hadoop2
Description
Ordinary Scala expressions are interpreted in a strange way in the Spark shell. For instance:
case class Foo(x: Int)
def print(f: Foo) = f.x
val f = Foo(3)
print(f)
<console>:24: error: type mismatch;
 found   : Foo
 required: Foo
For another example (with `import scala.util.Random.nextInt` in scope):
trait Currency
case object EUR extends Currency
case object USD extends Currency
def nextCurrency: Currency = nextInt(2) match {
  case 0 => EUR
  case _ => USD
}
<console>:22: error: type mismatch;
 found   : EUR.type
 required: Currency
       case 0 => EUR
<console>:24: error: type mismatch;
 found   : USD.type
 required: Currency
       case _ => USD
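A commonly reported workaround for this class of REPL type-mismatch errors is to keep the related definitions in a single compilation unit, for example by wrapping them in a container object (or entering them together via `:paste`). The sketch below is illustrative only; `Holder` and `show` are made-up names, not anything from the original report.

```scala
// Hedged workaround sketch: defining the case class and the method
// that uses it inside one object means the shell compiles them as a
// single unit, so both sides see the same Foo class.
object Holder {
  case class Foo(x: Int)
  def show(f: Foo): Int = f.x
}

import Holder._

// Usage: show(Foo(3)) evaluates without a type-mismatch error.
println(show(Foo(3)))
```

Entering the definitions in one `:paste` block in the shell has the same effect, since the whole block is wrapped and compiled together.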
Issue Links
- duplicates: SPARK-1199 "Type mismatch in Spark shell when using case class defined in shell" (Resolved)