Spark / SPARK-1916

SparkFlumeEvent with a body bigger than 1020 bytes is not read properly


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 0.9.0
    • Fix Version/s: 0.9.2, 1.0.1
    • Component/s: DStreams
    • Labels: None

    Description

      The readExternal implementation on SparkFlumeEvent reads only the first 1020 bytes of the actual body when streaming data from Flume.

      This means that an event sent to Spark via Flume is processed properly if its body is small, but fails if the body is bigger than 1020 bytes. Considering that the default maximum size for a Flume Avro event is 32 KB, the implementation should be updated to read the full body.

      The following thread is related: http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Streaming-using-Flume-body-size-limitation-tt6127.html
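
      For context, ObjectInput.read(byte[]) makes no guarantee of filling the buffer in a single call; like InputStream.read, it can return after copying whatever bytes are currently available (apparently 1020 at a time here), leaving the rest of the body unread. A minimal sketch of the suspected pattern and a fix using readFully (available because ObjectInput extends DataInput); the method names below, other than the java.io calls, are illustrative and assume a deserializer shaped like SparkFlumeEvent.readExternal:

        import java.io.ObjectInput

        // Suspected buggy pattern: a single read() may return fewer bytes
        // than requested, truncating any body larger than one internal chunk.
        def readBodyTruncating(in: ObjectInput): Array[Byte] = {
          val bodyLength = in.readInt()
          val bodyBuff = new Array[Byte](bodyLength)
          in.read(bodyBuff) // may stop short; the rest of the body is never copied
          bodyBuff
        }

        // Fix: readFully() loops until the whole buffer is filled,
        // or throws EOFException if the stream ends early.
        def readBodyFull(in: ObjectInput): Array[Byte] = {
          val bodyLength = in.readInt()
          val bodyBuff = new Array[Byte](bodyLength)
          in.readFully(bodyBuff)
          bodyBuff
        }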

      Attachments

        Issue Links

        Activity


          People

            Assignee: David Lemieux (lemieud)
            Reporter: David Lemieux (lemieud)
            Votes: 0
            Watchers: 2

            Dates

              Created:
              Updated:
              Resolved:
