Description
Spark's `reflect` function fails if the underlying method call throws an exception, which causes the whole job to fail.
In Hive, however, the exception is caught and NULL is returned. A simple test to reproduce the behavior:
select reflect('java.net.URLDecoder', 'decode', '%')
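For context, the failure originates in the reflected Java call itself: a bare `%` is an incomplete escape sequence, so `URLDecoder.decode` throws an `IllegalArgumentException`, which Spark currently propagates. A minimal sketch of the underlying call outside of Spark (names here are illustrative):

```scala
import java.net.URLDecoder

object ReflectFailureRepro {
  def main(args: Array[String]): Unit = {
    try {
      // This is the call that reflect() dispatches to; "%" is an
      // incomplete escape sequence, so decode() throws.
      URLDecoder.decode("%", "UTF-8")
    } catch {
      case e: IllegalArgumentException =>
        println(s"reflect fails with: ${e.getMessage}")
    }
  }
}
```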
A workaround would be to wrap the reflective invocation in a try/catch here:
https://github.com/apache/spark/blob/master/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/CallMethodViaReflection.scala#L136
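A minimal sketch of what that wrapping could look like, written as a standalone example rather than the actual Spark code (the `safeInvoke` helper is hypothetical; note that reflection wraps the callee's exception in `InvocationTargetException`, and returning null matches Hive's semantics):

```scala
import java.lang.reflect.{InvocationTargetException, Method}
import java.net.URLDecoder

object TryReflectSketch {
  // Hypothetical helper: invoke reflectively and return null instead of
  // propagating the callee's exception, mimicking Hive's reflect().
  def safeInvoke(method: Method, obj: AnyRef, args: AnyRef*): Any =
    try {
      method.invoke(obj, args: _*)
    } catch {
      // Reflection wraps the target method's exception.
      case _: InvocationTargetException => null
    }

  def main(args: Array[String]): Unit = {
    val decode =
      classOf[URLDecoder].getMethod("decode", classOf[String], classOf[String])
    // decode() is static, so the receiver is null; "%" makes it throw.
    println(safeInvoke(decode, null, "%", "UTF-8")) // prints: null
  }
}
```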
We could support this by adding a new UDF `try_reflect` that mimics Hive's behavior. Please share your thoughts on this.
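Usage would then mirror the failing query above but return NULL instead of erroring (hypothetical, since `try_reflect` does not exist yet):

select try_reflect('java.net.URLDecoder', 'decode', '%')  -- returns NULL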