[SPARK-5037] Support dynamic loading of input DStreams in PySpark Streaming


Details

    • Type: New Feature
    • Status: Resolved
    • Priority: Major
    • Resolution: Auto Closed
    • Affects Version/s: 1.2.0
    • Fix Version/s: None
    • Component/s: DStreams, PySpark

    Description

      The Scala and Java streaming APIs support "external" InputDStreams (e.g. the ZeroMQReceiver example) through a number of mechanisms, for instance by overriding ActorReceiver or by subclassing Receiver directly. The PySpark streaming API currently offers no comparable flexibility: it is limited to file-backed text and binary streams and socket text streams, along the lines of the snippet below.
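
      For concreteness, the text-based built-ins look like this (the host, port, and path are illustrative):

          from pyspark import SparkContext
          from pyspark.streaming import StreamingContext

          sc = SparkContext(appName="builtin-sources")
          ssc = StreamingContext(sc, 1)  # 1-second batches

          # The built-in input sources exposed to Python today:
          socket_lines = ssc.socketTextStream("localhost", 9999)
          file_lines = ssc.textFileStream("hdfs:///tmp/incoming")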

      It would be great to open up the PySpark streaming API to other stream sources, bringing it closer to parity with the JVM APIs.

      One way of doing this could be to support dynamically loading InputDStream implementations through reflection at the JVM level, analogous to what the regular PySpark context.py Hadoop methods already do for Hadoop InputFormats; a sketch of what that could look like follows.
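
      For reference, the existing Hadoop methods already accept fully qualified JVM class names from Python and resolve them reflectively on the JVM side (reusing sc from the snippet above):

          # Existing PySpark API: the InputFormat, key, and value classes
          # are passed by name and instantiated reflectively on the JVM.
          rdd = sc.newAPIHadoopFile(
              "hdfs:///data/events",
              "org.apache.hadoop.mapreduce.lib.input.TextInputFormat",
              "org.apache.hadoop.io.LongWritable",
              "org.apache.hadoop.io.Text")

      A streaming analogue might look like the sketch below. The JVM-side entry point (here called PythonDStreamLoader.loadInputDStream) is a hypothetical placeholder rather than anything that exists in Spark today, and the sketch assumes the loaded stream yields UTF-8 strings:

          from pyspark.serializers import UTF8Deserializer
          from pyspark.streaming.dstream import DStream

          def external_text_stream(ssc, dstream_class, *ctor_args):
              # Hypothetical JVM helper that would reflectively instantiate
              # `dstream_class` (a fully qualified InputDStream subclass
              # name), wrap it as a JavaDStream[String], and return it.
              jvm = ssc._sc._jvm
              loader = jvm.org.apache.spark.streaming.api.python.PythonDStreamLoader
              jdstream = loader.loadInputDStream(ssc._jssc, dstream_class,
                                                 list(ctor_args))
              # Wrap the returned JavaDStream the same way the built-in
              # Python sources do.
              return DStream(jdstream, ssc, UTF8Deserializer())

      With something like that in place, hooking up e.g. a ZeroMQ-backed stream could be a one-liner (class name again hypothetical):

          zmq_lines = external_text_stream(
              ssc, "org.example.streaming.ZeroMQInputDStream",
              "tcp://localhost:5563")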

      I'll submit a PR shortly with my shot at this. Comments and alternative approaches are more than welcome.

People

    Assignee: Unassigned
    Reporter: Jascha Swisher (jswisher)
    Votes: 0
    Watchers: 4
