Spark / SPARK-16994

Filter and limit are illegally permuted.


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 2.0.0
    • Fix Version/s: 2.0.1, 2.1.0
    • Component/s: SQL

    Description

      scala> spark.createDataset(1 to 100).limit(10).filter($"value" % 10 === 0).explain
      == Physical Plan ==
      CollectLimit 10
      +- *Filter ((value#875 % 10) = 0)
         +- LocalTableScan [value#875]
      
      scala> spark.createDataset(1 to 100).limit(10).filter($"value" % 10 === 0).collect
      res23: Array[Int] = Array(10, 20, 30, 40, 50, 60, 70, 80, 90, 100)
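
      The returned array shows the bug: limit(10) should restrict the query to its
      first 10 rows (here the values 1 through 10, since the local data is produced
      in order) before the filter runs, but the plan above has pushed the Filter
      below CollectLimit, so the filter scans all 100 rows and the limit is applied
      to its output. A minimal plain-Scala sketch of the intended limit-then-filter
      semantics (an illustration of the expected answer, not Spark code):

      // Take the first 10 values, then keep multiples of 10: only 10 remains.
      (1 to 100).take(10).filter(_ % 10 == 0)

      With the correct ordering, the collect above would be expected to return
      Array(10) rather than every multiple of 10 up to 100.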
      


          People

            Assignee: Reynold Xin (rxin)
            Reporter: TobiasP (TPolzer)
            Votes: 1
            Watchers: 4
