Spark / SPARK-37171

Addition of forany and forall semantics to Spark Dataframes


Details

    • Type: New Feature
    • Status: Resolved
    • Priority: Minor
    • Resolution: Won't Fix
    • Affects Version/s: 3.2.0
    • Fix Version/s: None
    • Component/s: SQL
    • Labels: None

    Description

      Spark DataFrames currently lack forany and forall semantics.

      Scala collections already provide these semantics (as `exists` and `forall`).

      It would be useful to expose them as an API, because without one, users are likely to implement them themselves in unoptimized form.

      E.g. forany might be emulated as df.filter(<condition>).count > 0, which is suboptimal: it counts every matching row instead of stopping at the first match.
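
      A minimal sketch of what such an API could look like as extension methods. The names `forany`/`forall`, the implicit class, and the `limit(1)` short-circuit are all illustrative assumptions, not part of the Spark API:

      ```scala
      import org.apache.spark.sql.{Column, DataFrame}

      object ForAnySyntax {
        // Hypothetical extension methods on DataFrame.
        implicit class ForAnyOps(df: DataFrame) {
          // True if at least one row satisfies cond. limit(1) lets Spark
          // stop scanning after the first match instead of counting
          // every matching row.
          def forany(cond: Column): Boolean =
            !df.filter(cond).limit(1).isEmpty

          // True if every row satisfies cond, i.e. no row violates it.
          def forall(cond: Column): Boolean =
            df.filter(!cond).limit(1).isEmpty
        }
      }
      ```

      With this in scope, a caller could write `df.forany($"x" > 2)` or `df.forall($"x" > 0)`. A native implementation inside Spark could presumably push the short-circuit deeper into the physical plan than this user-level workaround can.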


          People

            Assignee: Unassigned
            Reporter: Dhiren Navani (dhirennavani)
            Votes: 0
            Watchers: 2
