In Spark 3, we should remove interfaces like org.apache.spark.api.java.function.Function and replace them with their java.util.function equivalents, for better compatibility with Java 8. This would let callers, in more cases, pass an existing functional object directly rather than wrap it in a lambda.
It's possible to have the functional interfaces in Spark simply extend the Java 8 functional interfaces, to interoperate better with existing code, but it may be just as well to remove them in Spark 3 as a cleanup.
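The "extend the Java 8 interfaces" option could look roughly like the following sketch. The interface name and the bridging of call() to apply() here are illustrative assumptions, not Spark's actual declarations; the point is that one lambda would then satisfy both the Spark-style and the java.util.function contracts.

```java
import java.io.Serializable;
import java.util.function.Function;

public class Bridge {
    // Hypothetical sketch: a Spark-style functional interface that extends
    // java.util.function.Function, with a default apply() delegating to call().
    @FunctionalInterface
    interface SparkFunction<T, R> extends Function<T, R>, Serializable {
        R call(T t) throws Exception;

        @Override
        default R apply(T t) {
            try {
                return call(t);
            } catch (Exception e) {
                throw new RuntimeException(e);
            }
        }
    }

    public static void main(String[] args) throws Exception {
        SparkFunction<Integer, Integer> square = x -> x * x;
        // The same object works through either entry point:
        System.out.println("call: " + square.call(3));
        System.out.println("apply: " + square.apply(3));
    }
}
```

Because apply() gets a default implementation, the interface keeps a single abstract method and remains lambda-friendly, while also being accepted anywhere a java.util.function.Function is expected.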
A partial list of transitions from Spark to Java interfaces:
- Function -> Function
- Function0 -> Supplier
- Function2 -> BiFunction
- VoidFunction -> Consumer
- FlatMapFunction etc -> extends Function<T,Iterator<R>> etc (FlatMapFunction.call returns an Iterator)
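The caller-side benefit of the mapping above can be sketched without Spark itself. The map helper below is a hypothetical stand-in for an RDD/Dataset-style method that accepts java.util.function.Function directly; with such a signature, an existing functional object like a method reference needs no wrapping lambda.

```java
import java.util.List;
import java.util.function.Function;
import java.util.stream.Collectors;

public class MappingDemo {
    // Hypothetical stand-in for a Spark map-style API that takes
    // java.util.function.Function directly (names are illustrative).
    static <T, R> List<R> map(List<T> data, Function<T, R> f) {
        return data.stream().map(f).collect(Collectors.toList());
    }

    public static void main(String[] args) {
        // An existing functional object passed as-is — no adapter lambda
        // to convert it into a Spark-specific interface.
        Function<String, Integer> length = String::length;
        System.out.println(map(List.of("spark", "java"), length));
    }
}
```

With today's Spark-specific interfaces, the same call site would have to wrap `length` as `s -> length.apply(s)` (or equivalent), which is exactly the friction this proposal removes.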