In the embedded driver, the insertion is aborted if the stream used as the source is shorter than the specified length.
In the client driver, the stream is padded and the insertion completes. Note that an exception is thrown here as well, but the data is inserted into the database and can be retrieved after the exception has been caught.
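The two behaviors described above can be sketched in isolation. This is a hypothetical illustration, not Derby's actual driver code: `readStrict` models the embedded driver's abort-on-short-stream behavior, and `readPadded` models the client driver's zero-padding behavior.

```java
import java.io.ByteArrayInputStream;
import java.io.EOFException;
import java.io.IOException;
import java.io.InputStream;

public class StreamLengthCheck {

    // Embedded-style: fail if the stream ends before `length` bytes are read.
    static byte[] readStrict(InputStream in, int length) throws IOException {
        byte[] buf = new byte[length];
        int off = 0;
        while (off < length) {
            int n = in.read(buf, off, length - off);
            if (n == -1) {
                throw new EOFException("stream shorter than specified length");
            }
            off += n;
        }
        return buf;
    }

    // Client-style: accept a short stream and leave the remainder as zero bytes.
    static byte[] readPadded(InputStream in, int length) throws IOException {
        byte[] buf = new byte[length]; // trailing bytes stay 0x00
        int off = 0;
        int n;
        while (off < length && (n = in.read(buf, off, length - off)) != -1) {
            off += n;
        }
        return buf;
    }

    public static void main(String[] args) throws IOException {
        byte[] data = {1, 2, 3};  // actual stream: 3 bytes
        int specified = 5;        // caller claims 5 bytes

        byte[] padded = readPadded(new ByteArrayInputStream(data), specified);
        System.out.println(padded.length);                // 5: data kept, padded

        try {
            readStrict(new ByteArrayInputStream(data), specified);
        } catch (EOFException e) {
            System.out.println("aborted: " + e.getMessage());
        }
    }
}
```

With the same 3-byte stream and a specified length of 5, the padded variant keeps the data and fills the tail with zeros, while the strict variant aborts, mirroring the client/embedded difference.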
If anyone has information about why this was done, it would be appreciated. Could it be that the DRDA protocol can't handle this case easily?
This difference was discovered while writing tests, and it complicates the test code, since special handling is needed when running with the client driver.