I've been trying to test UDFs that use code from another module, which makes AddArtifact necessary.
I got the following error:
This gives no clue about what actually happened.
Only after considerable investigation did I find the problem: I was specifying the wrong path, so the artifact failed to upload. Specifically, ArtifactManager does not read the file immediately; instead it creates an iterator object that incrementally generates the requests to send. This iterator is passed to grpc's stream_unary to consume and actually send, and while grpc catches the error (see above), it suppresses the underlying exception.
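To illustrate the failure mode, here is a minimal sketch (not the actual pyspark or grpc code; all names are made up) of how an exception raised lazily inside an iterator can be swallowed by the consumer, leaving the caller with only a generic error:

```python
def create_requests(path):
    # Stand-in for the lazy request builder: nothing is validated here,
    # because the body only runs on the first next() call.
    with open(path, "rb") as f:  # a bad path raises FileNotFoundError lazily
        while chunk := f.read(8192):
            yield chunk

def stream_unary(request_iterator):
    # Stand-in for the grpc machinery: it consumes the iterator and
    # replaces any failure with its own generic error, dropping the cause.
    try:
        return sum(len(c) for c in request_iterator)
    except Exception:
        raise RuntimeError("Error in request iterator") from None

try:
    stream_unary(create_requests("/no/such/file"))
except RuntimeError as e:
    print(e)            # "Error in request iterator" -- generic message
    print(e.__cause__)  # None -- the FileNotFoundError is gone
```

The user-visible error carries no trace of the original FileNotFoundError, which matches the behavior described above.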
I think we should improve the PySpark user experience here. One possible fix is to wrap the iterator returned by ArtifactManager._create_requests in a wrapper that logs the throwable to the Spark Connect logger, so that the user would at least see something like the output below when debug mode is on.
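A minimal sketch of what that wrapper could look like (the logger name and message are assumptions, and `logged_iterator` is a hypothetical helper, not an existing PySpark API):

```python
import logging

# Assumed logger name for illustration; the real Spark Connect logger
# may differ.
logger = logging.getLogger("pyspark.sql.connect")

def logged_iterator(it):
    # Hypothetical wrapper for the iterator produced by
    # ArtifactManager._create_requests: yield items through unchanged,
    # but log any throwable (with traceback) before re-raising, so the
    # root cause reaches the log before grpc suppresses it.
    try:
        yield from it
    except BaseException:
        logger.exception("Error while generating AddArtifacts requests")
        raise
```

From grpc's point of view nothing changes: iteration yields the same items and raises the same exception, but the underlying error is now recorded where a user running with debug logging can find it.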