Description
A scenario mentioned several times on the Spark mailing list is that users who write Scala/Java Spark applications (not SparkR) want to use R functions in some transformations. Typically this can be achieved by calling pipe() on an RDD, but pipe() has limitations. We could therefore support applying an R function, supplied in source-code form, to a Dataset/DataFrame, so that SparkR is not needed to serialize the R function.
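For context, the existing workaround streams data through an external process with RDD.pipe(), which exchanges only newline-delimited strings over stdin/stdout. A minimal sketch (the R script path is hypothetical):

```scala
import org.apache.spark.sql.SparkSession

object PipeExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("pipe-example").getOrCreate()
    val rdd = spark.sparkContext.parallelize(Seq("1", "2", "3"))

    // Each partition's elements are written line-by-line to the external
    // process's stdin; its stdout lines become the resulting RDD.
    // The script path is a placeholder; R must be installed on every executor.
    val result = rdd.pipe("Rscript /path/to/transform.R")
    result.collect().foreach(println)
  }
}
```

This illustrates the limitations the proposal targets: pipe() forces manual string serialization on both sides, requires an R runtime on each executor, and operates on RDDs rather than integrating with the Dataset/DataFrame API.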