[SPARK-2330] Spark shell has weird Scala semantics


    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Duplicate
    • Affects Version/s: 0.9.1, 1.0.0
    • Fix Version/s: None
    • Component/s: Spark Core
    • Environment: Ubuntu 14.04 with spark-x.x.x-bin-hadoop2

    Description

      Normal Scala expressions are interpreted in a strange way in the Spark shell. For instance:

      case class Foo(x: Int)
      def print(f: Foo) = f.x
      val f = Foo(3)
      print(f)
      <console>:24: error: type mismatch;
       found   : Foo
       required: Foo
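
      A plausible reading, though not confirmed anywhere in this ticket, is that spark-shell wraps each REPL line in its own wrapper, so the Foo expected by print and the Foo of f are path-dependent types belonging to different wrappers, hence "found: Foo, required: Foo". A workaround commonly suggested for this kind of REPL problem is to put the type and the code that uses it in a single compilation unit, e.g. one object entered as a single block (Demo and show are illustrative names, not from the original report):

      // Hypothetical workaround sketch: Foo and show live in one compilation
      // unit, so the REPL cannot split them across different wrappers.
      object Demo {
        case class Foo(x: Int)
        def show(f: Foo): Int = f.x
      }

      Demo.show(Demo.Foo(3))   // expected: 3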
      

      Here is another example:

      trait Currency
      case object EUR extends Currency
      case object USD extends Currency
      
      import scala.util.Random.nextInt

      def nextCurrency: Currency = nextInt(2) match {
        case 0 => EUR
        case _ => USD
      }
      
      <console>:22: error: type mismatch;
       found   : EUR.type
       required: Currency
               case 0 => EUR
      
      <console>:24: error: type mismatch;
       found   : USD.type
       required: Currency
               case _ => USD
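
      The same reading applies: EUR.type and USD.type fail to conform to Currency, presumably because the trait and the case objects end up in different REPL wrappers. Entered as a single block (via the REPL's :paste mode, or wrapped in one object as sketched below; CurrencyDemo is an illustrative name), the definitions share a scope and the method compiles:

      // Hypothetical workaround sketch: one compilation unit for the trait,
      // its case objects, and the method that returns them.
      import scala.util.Random.nextInt

      object CurrencyDemo {
        trait Currency
        case object EUR extends Currency
        case object USD extends Currency

        def nextCurrency: Currency = nextInt(2) match {
          case 0 => EUR
          case _ => USD
        }
      }

      CurrencyDemo.nextCurrency   // returns EUR or USD at random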
      


    People

    • Assignee: Unassigned
    • Reporter: Andrea Ferretti (andrea.ferretti)
    • Votes: 0
    • Watchers: 2
