Description
Hello all,
I have been using the next_day function from the Spark SQL functions object and found it very useful: https://github.com/apache/spark/blob/master/sql/core/src/main/scala/org/apache/spark/sql/functions.scala#L3077
Currently the signature of this function is: def next_day(date: Column, dayOfWeek: String): Column.
It accepts the dayOfWeek parameter as a String. In my case, however, the day of week comes from a Column, so it can have a different value for each row of the DataFrame. As a workaround I had to build the NextDay Catalyst expression by hand, along the lines of NextDay(dateCol.expr, dayOfWeekCol.expr).
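For illustration, a minimal sketch of that workaround (the DataFrame df, the helper name nextDayPerRow, and the column names "d" and "dow" are just assumptions for the example; the Catalyst expression has to be wrapped back into a Column to be usable in a select):

import org.apache.spark.sql.{Column, DataFrame}
import org.apache.spark.sql.catalyst.expressions.NextDay

// df is an existing DataFrame with a date column "d" and a per-row day-of-week string column "dow".
def nextDayPerRow(df: DataFrame): Column =
  new Column(NextDay(df("d").expr, df("dow").expr))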
My proposal is to add another overload of this function: def next_day(date: Column, dayOfWeek: Column): Column
This is already the case for some other functions in this Scala object, for example:
def date_sub(start: Column, days: Int): Column = date_sub(start, lit(days))
def date_sub(start: Column, days: Column): Column = withExpr { DateSub(start.expr, days.expr) }
or
def add_months(startDate: Column, numMonths: Int): Column = add_months(startDate, lit(numMonths))
def add_months(startDate: Column, numMonths: Column): Column = withExpr
{ AddMonths(startDate.expr, numMonths.expr) }
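Following the same pattern, a rough sketch of what the change could look like inside org.apache.spark.sql.functions (where withExpr, lit and the NextDay expression are already in scope); this is only a sketch, not the final implementation, and Scaladoc/@since tags would follow project conventions:

// Existing String-based signature, now delegating to the new overload.
def next_day(date: Column, dayOfWeek: String): Column = next_day(date, lit(dayOfWeek))

// Proposed overload accepting the day of week as a Column, so it can vary per row.
def next_day(date: Column, dayOfWeek: Column): Column = withExpr { NextDay(date.expr, dayOfWeek.expr) }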
I hope I have explained my idea clearly. Let me know what you think. If you agree, I can submit a pull request with the necessary change.
Kind regards,
Chongguang