Details
- Type: New Feature
- Status: Resolved
- Priority: Minor
- Resolution: Won't Fix
- Affects Version: 3.2.0
- Fix Version: None
- Component: None
Description
Spark DataFrames currently lack `forany` and `forall` semantics, which Scala collections provide.
It would be nice to offer them as an API, since without one users implement them in unoptimized form.
E.g. `forany` might be implemented as `df.filter(<condition>).count > 0`, which is not optimal: it counts every matching row instead of stopping at the first match.
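A minimal sketch of what such helpers could look like as an extension method, assuming the names `forany`/`forall` and an implicit-class wrapper (both hypothetical, not part of Spark's API). Using `limit(1).isEmpty` instead of `count` lets Spark stop scanning after the first matching row:

```scala
import org.apache.spark.sql.{Column, DataFrame}

object DataFrameOps {
  // Hypothetical enrichment; names and shape are illustrative only.
  implicit class RichDataFrame(val df: DataFrame) extends AnyVal {
    // forany: true if at least one row satisfies the predicate.
    // limit(1) allows early termination, unlike filter(...).count > 0.
    def forany(cond: Column): Boolean = !df.filter(cond).limit(1).isEmpty

    // forall: true if every row satisfies the predicate,
    // i.e. no row violates it.
    def forall(cond: Column): Boolean = df.filter(!cond).limit(1).isEmpty
  }
}
```

Even this is only a workaround at the API surface; a native implementation could push the short-circuit into the physical plan rather than relying on `limit(1)`.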