Spark / SPARK-1916

SparkFlumeEvent with a body bigger than 1020 bytes is not read properly


    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 0.9.0
    • Fix Version/s: 0.9.2, 1.0.1
    • Component/s: DStreams
    • Labels: None

      Description

      The readExternal implementation on SparkFlumeEvent reads only the first 1020 bytes of the event body when streaming data from Flume.

      This means that an event sent to Spark via Flume is processed correctly when the body is small, but fails when the body is bigger than 1020 bytes.
      Considering that the default maximum size for a Flume Avro event is 32 KB, the implementation should be updated to read the full body, as sketched below.
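
      The readExternal source isn't quoted in this report, but the symptom matches the classic partial-read pattern: ObjectInput.read(buf) may return after filling only part of the buffer (here roughly one 1020-byte chunk), whereas DataInput.readFully loops until the whole buffer is populated. A minimal sketch, assuming readExternal reads an int length prefix followed by the raw body (the method names below are illustrative, not Spark's actual API):

          import java.io.ObjectInput

          // Buggy pattern: read(buf) is allowed to stop after a partial read,
          // leaving the tail of bodyBuff zero-filled for large bodies.
          def readBodyBuggy(in: ObjectInput): Array[Byte] = {
            val bodyLength = in.readInt()
            val bodyBuff = new Array[Byte](bodyLength)
            in.read(bodyBuff) // may return after ~1020 bytes
            bodyBuff
          }

          // Fixed pattern: readFully (inherited from DataInput) keeps reading
          // until the buffer is full, or throws EOFException.
          def readBodyFixed(in: ObjectInput): Array[Byte] = {
            val bodyLength = in.readInt()
            val bodyBuff = new Array[Byte](bodyLength)
            in.readFully(bodyBuff)
            bodyBuff
          }

      Swapping the single read for readFully in SparkFlumeEvent.readExternal would let bodies of any length, up to the Flume maximum, deserialize correctly.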

      A related mailing-list thread: http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Streaming-using-Flume-body-size-limitation-tt6127.html


    People

    • Assignee: David Lemieux (lemieud)
    • Reporter: David Lemieux (lemieud)

