Details
- Type: New Feature
- Status: Resolved
- Priority: Major
- Resolution: Done
Description
As of Spark 2.0, Spark falls back to Hive only for the following built-in functions:
- "elt", "hash", "java_method", "histogram_numeric", "map_keys", "map_values", "parse_url", "percentile", "percentile_approx", "reflect", "sentences", "stack", "str_to_map", "xpath", "xpath_boolean", "xpath_double", "xpath_float", "xpath_int", "xpath_long", "xpath_number", "xpath_short", "xpath_string"
- table generating functions: "inline", "posexplode"
The goal of this ticket is to implement all of these natively in Spark so we no longer need to fall back to Hive's UDFs.
Issue Links
- contains
  - SPARK-16270 Implement xpath user defined functions (Resolved)