We recently hit an error when deleting full rows from a table with 64 columns using Spark SQL; however, if we drop one column from the table, the error no longer appears. The error is:
Failed to write at least 1000 rows to Kudu; Sample errors: Not implemented: Unknown row operation type (error 0)
I also reproduced this by deleting a full row from a 64-column table using the Java client (1.12.0 and 1.13.0). If some columns of the row are set to NULL, I get:
Row error for primary key=[-128, 0, 0, 1], tablet=null, server=d584b3407ea444519e91b32f2744b162, status=Invalid argument: DELETE should not have a value for column: c63 STRING NULLABLE (error 0)
If values are set for all columns, I get an error like:
Row error for primary key=[-128, 0, 0, 1], tablet=null, server=null, status=Corruption: Not enough data for column: c63 STRING NULLABLE (error 0)
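For reference, this is roughly how I trigger it with the Java client. It is a minimal sketch, assuming a reachable master at "master:7051" and an existing 64-column table named "test_64_cols" with an INT32 primary key "c0" and STRING columns "c1".."c63" (those names/addresses are placeholders, not from the real cluster); it needs a live Kudu cluster to actually run:

```java
import org.apache.kudu.client.Delete;
import org.apache.kudu.client.KuduClient;
import org.apache.kudu.client.KuduException;
import org.apache.kudu.client.KuduSession;
import org.apache.kudu.client.KuduTable;
import org.apache.kudu.client.OperationResponse;
import org.apache.kudu.client.PartialRow;

public class DeleteFullRowRepro {
    public static void main(String[] args) throws KuduException {
        KuduClient client =
            new KuduClient.KuduClientBuilder("master:7051").build();
        try {
            KuduTable table = client.openTable("test_64_cols");
            KuduSession session = client.newSession();

            Delete delete = table.newDelete();
            PartialRow row = delete.getRow();
            row.addInt("c0", 1); // primary key

            // Setting values for the non-key columns is what triggers the
            // failure on 64/128-column tables; a DELETE carrying only the
            // primary key succeeds regardless of the column count.
            for (int i = 1; i < 64; i++) {
                row.addString("c" + i, "v");
            }

            OperationResponse resp = session.apply(delete);
            if (resp.hasRowError()) {
                System.out.println(resp.getRowError());
            }
            session.close();
        } finally {
            client.close();
        }
    }
}
```

Leaving the loop out (so the DELETE carries only "c0") makes the same operation succeed on the same table.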
I also tested tables with different numbers of columns. The strange thing is that I could delete full rows from tables with 8, 16, 32, 63, or 65 columns, but not from tables with 64 or 128 columns.