SPARK-21390: Dataset filter API inconsistency

Details

    • Type: Bug
    • Status: Resolved
    • Priority: Minor
    • Resolution: Incomplete
    • Affects Version/s: 2.0.1, 2.1.0, 2.2.0
    • Fix Version/s: None
    • Component/s: SQL

    Description

      Hello everybody,

      I've encountered a strange situation with the spark-shell.
      When I run the code below in my IDE, the second test case prints the expected count of 1. However, when I run the same code in the spark-shell, the second test case returns a count of 0.
      I've made sure that I'm running Scala 2.11.8 and Spark 2.0.1 in both my IDE and the spark-shell.

        import org.apache.spark.sql.Dataset
        import spark.implicits._  // needed for .toDS when not in the spark-shell

        case class SomeClass(field1: String, field2: String)

        val filterCondition: Seq[SomeClass] = Seq(SomeClass("00", "01"))

        // Test 1: filter a Dataset of the same case class.
        val filterMe1: Dataset[SomeClass] = Seq(SomeClass("00", "01")).toDS

        println("Works fine! " + filterMe1.filter(filterCondition.contains(_)).count)

        // Test 2: filter a Dataset of a different case class, rebuilding
        // SomeClass inside the filter lambda.
        case class OtherClass(field1: String, field2: String)

        val filterMe2 = Seq(OtherClass("00", "01"), OtherClass("00", "02")).toDS

        println("Fail, count should return 1: " +
          filterMe2.filter(x => filterCondition.contains(SomeClass(x.field1, x.field2))).count)
      

      Note that if I transform the dataset first, I get 1 back as expected.

        println(filterMe2.map(x => SomeClass(x.field1, x.field2)).filter(filterCondition.contains(_)).count)
      

      Is this a bug? I can see that this filter function is marked as experimental: https://spark.apache.org/docs/2.1.0/api/java/org/apache/spark/sql/Dataset.html#filter(scala.Function1)
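
      A possible workaround sketch (an assumption, not part of the original report): if the mismatch stems from how the spark-shell wraps REPL-defined case classes, comparing plain tuples of the fields instead of constructing SomeClass inside the closure should sidestep it, since tuple equality comes from the Scala standard library.

        // Hypothetical workaround, assuming the spark-shell's wrapping of
        // REPL-defined case classes breaks equality inside the closure:
        // compare standard-library tuples of the raw fields instead.
        val allowed: Set[(String, String)] =
          filterCondition.map(c => (c.field1, c.field2)).toSet

        println(filterMe2.filter(x => allowed.contains((x.field1, x.field2))).count)  // expected: 1

      Under that assumption, this should print 1 in both the IDE and the spark-shell.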

    People

      Assignee: Unassigned
      Reporter: Gheorghe Gheorghe (gheo21)
      Votes: 0
      Watchers: 7
