Details
- Type: New Feature
- Status: Resolved
- Priority: Major
- Resolution: Fixed
- 3.5.1
Description
Currently, several SQL functions accept both native types and Columns, but only accept native types in some of their Scala/Python APIs:
- array_remove (works in SQL and Scala, not in Python)
- array_position (works in SQL and Scala, not in Python)
- map_contains_key (works in SQL and Scala, not in Python)
- substring (works only in SQL)
For example, this is possible in SQL:
spark.sql("select array_remove(col1, col2) from values(array(1,2,3), 2)")
But not in Python:
df.select(F.array_remove(F.col("col1"), F.col("col2")))
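For reference, a minimal reproduction sketch in PySpark (assuming Spark 3.5.x; the createDataFrame setup and the expr()-based workaround are illustrative, not part of this issue):

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
df = spark.createDataFrame([([1, 2, 3], 2)], ["col1", "col2"])

# Works: the SQL function resolves col2 as a column reference.
spark.sql("select array_remove(col1, col2) from values (array(1, 2, 3), 2)").show()

# Does not work before this change: the Python API only accepts a native
# value for the element argument.
# df.select(F.array_remove(F.col("col1"), F.col("col2"))).show()

# Interim workaround: go through the SQL parser with expr().
df.select(F.expr("array_remove(col1, col2)")).show()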