Details
- Type: Improvement
- Status: Resolved
- Priority: Major
- Resolution: Fixed
- Affects Version/s: 3.5.0
Description
Currently, when a UDF fails, the Spark Connect client does not receive the actual error that caused the failure.
For example, the error message looks like:
Exception in thread "main" org.apache.spark.SparkException: grpc_shaded.io.grpc.StatusRuntimeException: INTERNAL: Job aborted due to stage failure: Task 2 in stage 0.0 failed 4 times, most recent failure: Lost task 2.3 in stage 0.0 (TID 10) (10.68.141.158 executor 0): org.apache.spark.SparkException: [FAILED_EXECUTE_UDF] Failed to execute user defined function (` (Main$$$Lambda$4770/1714264622)`: (int) => int). SQLSTATE: 39000
In this case, the actual error was a java.lang.NoClassDefFoundError.
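For context, below is a minimal sketch of one way this failure can arise with the Spark Connect Scala client. The endpoint URL and the Helper object are illustrative assumptions, not taken from this report; the point is that the executor-side java.lang.NoClassDefFoundError is swallowed and only the wrapped FAILED_EXECUTE_UDF / StatusRuntimeException shown above reaches the client.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, udf}

// Illustrative helper that exists only on the client application's classpath.
// If the application jar is never registered with the session (for example via
// spark.addArtifact), the executors cannot load this class when the UDF runs
// and fail with java.lang.NoClassDefFoundError.
object Helper {
  def addOne(x: Int): Int = x + 1
}

object Main {
  def main(args: Array[String]): Unit = {
    // Placeholder Spark Connect endpoint (assumption, not from the report).
    val spark = SparkSession.builder()
      .remote("sc://localhost:15002")
      .getOrCreate()

    // An (int) => int UDF, matching the signature in the error message above.
    val plusOne = udf((x: Int) => Helper.addOne(x))

    // Executing the UDF fails on the executor. Today the client only sees the
    // wrapped FAILED_EXECUTE_UDF StatusRuntimeException, not the underlying
    // NoClassDefFoundError that actually caused the task failure.
    spark.range(3).select(plusOne(col("id").cast("int")).as("y")).show()
  }
}
```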