SPARK-19796

taskScheduler fails serializing long statements received by thrift server


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Blocker
    • Resolution: Fixed
    • Affects Version/s: 2.2.0
    • Fix Version/s: 2.2.0
    • Component/s: Spark Core
    • Labels: None

    Description

      This problem was observed after the changes made for SPARK-17931.

      In my use case I'm sending very long insert statements to the Spark thrift server, and they fail at TaskDescription.scala:89 because writeUTF throws an exception when asked to write a string whose encoded form is longer than 64 KB (see https://www.drillio.com/en/2009/java-encoded-string-too-long-64kb-limit/ for a description of the issue).
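
      For reference, here is a minimal, self-contained Scala sketch of the underlying JDK limitation (not part of the original report; the 70000-character string is just an arbitrary value past the limit):

        import java.io.{ByteArrayOutputStream, DataOutputStream, UTFDataFormatException}

        // DataOutputStream.writeUTF prefixes the modified-UTF-8 bytes with a
        // 2-byte length field, so any string whose encoding exceeds 65535
        // bytes cannot be written and raises UTFDataFormatException.
        val out = new DataOutputStream(new ByteArrayOutputStream())
        try {
          out.writeUTF("x" * 70000) // ~70 KB, past the 64 KB limit
        } catch {
          case e: UTFDataFormatException => println(s"writeUTF failed: ${e.getMessage}")
        }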

      As suggested by Imran Rashid, I tracked down the offending key: it is "spark.job.description", and it contains the complete SQL statement.
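
      For context, a job's description reaches the task scheduler through a SparkContext local property; the thrift server sets the full statement text as the job description, so the text ends up under that key. A hedged sketch, assuming a local Spark session (the long statement here is built the same way as in the reproduction below):

        import org.apache.spark.sql.SparkSession

        val spark = SparkSession.builder().master("local[*]").appName("repro").getOrCreate()
        val sc = spark.sparkContext
        val veryLongSqlStatement =
          "insert into table test values (" + (1 to 128000).mkString("),(") + ")"

        // setJobDescription stores the statement under the local property
        // "spark.job.description", which is serialized along with every task.
        sc.setJobDescription(veryLongSqlStatement)
        assert(sc.getLocalProperty("spark.job.description") == veryLongSqlStatement)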

      The problem can be reproduced by creating a table:

        create table test (a int) using parquet

      and by sending an insert statement generated like this (the result is roughly a megabyte of text, far beyond the 64 KB limit):

        scala> val r = 1 to 128000
        scala> println("insert into table test values (" + r.mkString("),(") + ")")
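
      One way to drive this reproduction end to end (an assumption about the setup; the issue does not say how the statement was submitted) is plain JDBC against a running thrift server, with the Hive JDBC driver on the classpath:

        import java.sql.DriverManager

        object Repro extends App {
          val insertStatement =
            "insert into table test values (" + (1 to 128000).mkString("),(") + ")"
          // Assumes a Spark thrift server listening on the default port 10000.
          val conn = DriverManager.getConnection("jdbc:hive2://localhost:10000", "", "")
          val stmt = conn.createStatement()
          stmt.execute("create table if not exists test (a int) using parquet")
          stmt.execute(insertStatement) // fails with the 64 KB writeUTF error before the fix
          conn.close()
        }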

          People

            Assignee: Imran Rashid (irashid)
            Reporter: Giambattista (gbloisi)
            Votes: 0
            Watchers: 6

