SPARK-19796: taskScheduler fails serializing long statements received by thrift server


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Blocker
    • Resolution: Fixed
    • Affects Version/s: 2.2.0
    • Fix Version/s: 2.2.0
    • Component/s: Spark Core
    • Labels: None

    Description

      This problem was observed after the changes made for SPARK-17931.

      In my use case I am sending very long INSERT statements to the Spark thrift server, and they fail at TaskDescription.scala:89 because writeUTF cannot write strings whose encoding exceeds 64 KB (see https://www.drillio.com/en/2009/java-encoded-string-too-long-64kb-limit/ for a description of the limitation).

      As suggested by Imran Rashid, I tracked down the offending property: it is "spark.job.description", and it contains the complete SQL statement.

      The problem can be reproduced by creating a table:

          create table test (a int) using parquet

      and sending an insert statement generated like this in the Scala REPL:

          scala> val r = 1 to 128000
          scala> println("insert into table test values (" + r.mkString("),(") + ")")
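      The underlying limitation is in java.io.DataOutputStream.writeUTF, whose two-byte length prefix caps the encoded string at 65535 bytes of modified UTF-8. A minimal sketch of that JDK behavior, independent of Spark (the class name and string contents are illustrative):

```java
import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.UTFDataFormatException;

public class WriteUtfLimit {
    public static void main(String[] args) throws Exception {
        // Build a string whose modified-UTF-8 encoding exceeds 65535 bytes,
        // the maximum writeUTF can record in its two-byte length prefix.
        String longStatement = "insert into table test values " + "x".repeat(70_000);

        DataOutputStream out = new DataOutputStream(new ByteArrayOutputStream());
        try {
            out.writeUTF(longStatement);
            System.out.println("wrote OK");
        } catch (UTFDataFormatException e) {
            // writeUTF rejects the oversized value instead of truncating it.
            System.out.println("UTFDataFormatException: " + e.getMessage());
        }
    }
}
```

      Any task property serialized with writeUTF, such as "spark.job.description" carrying the full SQL text, hits this same ceiling.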


People

    Assignee: irashid (Imran Rashid)
    Reporter: gbloisi (Giambattista)
    Votes: 0
    Watchers: 6
