Description
I have a UDF that takes a java.sql.Timestamp and a String as input column types and returns an Array of (Seq[case class], Double) as output. Since some values in the input columns can be null, I wrapped the UDF in a when($input.isNull, null).otherwise(UDF) guard. This works fine when I test it in the spark-shell. However, when the same code runs as a Scala jar via spark-submit in YARN cluster mode, it raises a NullPointerException pointing at the UDF. If I remove the when().otherwise() condition and instead do the null check inside the UDF itself, the function works without issue in spark-submit.
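
A minimal sketch of the two patterns, with the column names (ts, tag), the placeholder case class Item, and the compute body all being hypothetical stand-ins for my actual code:

```scala
import java.sql.Timestamp
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, lit, udf, when}

// Hypothetical case class standing in for my actual result type
case class Item(name: String, weight: Double)

object NullGuardExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("udf-null-guard").getOrCreate()
    import spark.implicits._

    // Hypothetical input: a Timestamp column and a String column, some rows null
    val df = Seq(
      (Some(Timestamp.valueOf("2020-01-01 00:00:00")), Some("a")),
      (None: Option[Timestamp], None: Option[String])
    ).toDF("ts", "tag")

    // The UDF body itself, with no null handling
    def compute(ts: Timestamp, tag: String): Array[(Seq[Item], Double)] =
      Array((Seq(Item(tag, 1.0)), ts.getTime.toDouble))

    // Pattern 1: null guard outside the UDF -- works in spark-shell,
    // but throws NullPointerException under spark-submit / YARN cluster mode
    val rawUdf = udf(compute _)
    val out1 = df.withColumn(
      "result",
      when(col("ts").isNull || col("tag").isNull, lit(null))
        .otherwise(rawUdf(col("ts"), col("tag")))
    )

    // Pattern 2: null check inside the UDF -- works in both modes
    val guardedUdf = udf { (ts: Timestamp, tag: String) =>
      if (ts == null || tag == null) null
      else compute(ts, tag)
    }
    val out2 = df.withColumn("result", guardedUdf(col("ts"), col("tag")))

    out2.show(false)
    spark.stop()
  }
}
```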