Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Invalid
- Affects Version/s: 3.0.0
- Fix Version/s: None
- Component/s: None
- Labels: Patch, Important
Description
I have started experimenting with Spark 3.0's new SQL functions and along the way found an issue with the transform_keys function. It raises a "Cannot use null as map key" exception even though the map does not actually contain any null keys.
The Spark code below reproduces the error:

val df = Seq(Map("EID_1" -> 10000, "EID_2" -> 25000)).toDF("employees")
df.withColumn("employees", transform_keys($"employees", (k, v) => lit(k.+("XYX"))))
  .show
Exception in thread "main" java.lang.RuntimeException: Cannot use null as map key.
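A likely explanation for the "Invalid" resolution: `k.+("XYX")` invokes Spark's arithmetic `+` on a string column, which casts the operands to a numeric type and evaluates to null, so every transformed key becomes null. Assuming a local SparkSession, a sketch of the same transformation using string `concat` instead (hypothetical app/object names):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{concat, lit, transform_keys}

object TransformKeysDemo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("transform_keys-demo")
      .getOrCreate()
    import spark.implicits._

    val df = Seq(Map("EID_1" -> 10000, "EID_2" -> 25000)).toDF("employees")

    // concat yields a non-null string key, unlike arithmetic `+` on strings,
    // so no "Cannot use null as map key" exception is thrown
    df.withColumn("employees",
        transform_keys($"employees", (k, v) => concat(k, lit("XYX"))))
      .show(false)

    spark.stop()
  }
}
```

Under this reading, the exception is caused by the expression passed to transform_keys, not by transform_keys itself.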