Details
Type: Bug
Status: Resolved
Priority: Blocker
Resolution: Fixed
Affects Version/s: 2.2.0
Fix Version/s: None
Description
This problem was observed after the changes made for SPARK-17931.
In my use case I'm sending very long insert statements to the Spark Thrift server, and they fail at TaskDescription.scala:89 because writeUTF cannot write strings whose encoded form exceeds 64 KB (see https://www.drillio.com/en/2009/java-encoded-string-too-long-64kb-limit/ for a description of the limitation).
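The underlying limitation is in the JDK, not in Spark itself: DataOutputStream.writeUTF stores the encoded length in an unsigned 16-bit field, so it throws UTFDataFormatException for any string whose modified UTF-8 encoding exceeds 65535 bytes. A minimal standalone sketch (the string length is illustrative):

import java.io.{ByteArrayOutputStream, DataOutputStream, UTFDataFormatException}

val out = new DataOutputStream(new ByteArrayOutputStream())
try {
  // The encoded length is written as an unsigned 16-bit value, hence the 65535-byte cap.
  out.writeUTF("x" * 70000)
} catch {
  case e: UTFDataFormatException => println("writeUTF rejects strings over 65535 bytes: " + e)
}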
As suggested by Imran Rashid, I tracked down the offending key: it is "spark.job.description", and it contains the complete SQL statement.
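For illustration (a hedged sketch, not the Thrift server code itself): "spark.job.description" is an ordinary Spark local property, so it can be set and inspected from the driver in a Spark shell:

scala> spark.sparkContext.setJobDescription("short description")
scala> spark.sparkContext.getLocalProperty("spark.job.description")  // => "short description"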
The problem can be reproduced by creating a table like:
create table test (a int) using parquet
and by sending an insert statement generated like this in the Scala REPL:
scala> val r = 1 to 128000
scala> println("insert into table test values (" + r.mkString("),(") + ")")
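For context, the statement generated above is far larger than writeUTF's 65535-byte ceiling, so tasks whose properties carry it as "spark.job.description" fail to serialize. Where you control the SparkContext directly (i.e., outside the Thrift server path), one hedged workaround sketch is to cap the description yourself before executing; the truncation length below is illustrative:

scala> val stmt = "insert into table test values (" + r.mkString("),(") + ")"
scala> stmt.getBytes("UTF-8").length > 65535  // true: well over the writeUTF limit
scala> spark.sparkContext.setJobDescription(stmt.take(1024) + "...")  // keep it far under 64 KB
scala> spark.sql(stmt)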
Issue Links
relates to: SPARK-17931 taskScheduler has some unneeded serialization (Resolved)
links to