Apache NiFi / NIFI-710

Enumerate available Hadoop library versions for GetHDFS and PutHDFS


Details

    • Type: Improvement
    • Status: Open
    • Priority: Minor
    • Resolution: Unresolved
    • Affects Version/s: None
    • Fix Version/s: None
    • Component/s: Extensions
    • Environment: Unix, Hadoop

Description

    As far as I'm aware, it is only possible to use a single Hadoop library version per NiFi instance/cluster.

    Would it be possible to enumerate the available Hadoop library versions in the /lib directory and give the user the option to select, per HDFS processor, which version of Hadoop cluster they are writing to or reading from? The intent is to allow a single NiFi instance/cluster to write to multiple HDFS clusters that may be running different versions, e.g. QA and Prod. This would eliminate the need to run multiple HDFS feeder NiFi instances solely because the Hadoop versions differ.

    I would try to figure this out myself, but I don't know much about class loading in NiFi or how I would go about starting this.
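    Purely as an illustration of what such a feature might look like (not an existing NiFi API), the sketch below enumerates version-specific Hadoop library directories under lib/ and builds an isolated class loader per version, which an HDFS processor could then select at configuration time. The HadoopVersionEnumerator class and the lib/hadoop-<version>/ directory layout are assumptions made for the example.

{code:java}
import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

/**
 * Illustrative sketch only: enumerates version-specific Hadoop library
 * directories (e.g. lib/hadoop-2.6.0/, lib/hadoop-2.7.1/) and builds an
 * isolated class loader for each, so a processor could pick a Hadoop
 * version at configuration time. Directory layout and class names are
 * assumptions, not an existing NiFi API.
 */
public class HadoopVersionEnumerator {

    /** Maps a version label (the directory name) to a class loader over its JARs. */
    public static Map<String, URLClassLoader> enumerate(File libDir) throws Exception {
        Map<String, URLClassLoader> loaders = new LinkedHashMap<>();
        File[] versionDirs = libDir.listFiles(f -> f.isDirectory() && f.getName().startsWith("hadoop-"));
        if (versionDirs == null) {
            return loaders;
        }
        for (File versionDir : versionDirs) {
            File[] jars = versionDir.listFiles(f -> f.getName().endsWith(".jar"));
            if (jars == null || jars.length == 0) {
                continue;
            }
            List<URL> jarUrls = new ArrayList<>();
            for (File jar : jars) {
                jarUrls.add(jar.toURI().toURL());
            }
            // Each Hadoop version gets its own loader so conflicting client
            // classes from different versions never mix on one classpath.
            URLClassLoader loader = new URLClassLoader(
                    jarUrls.toArray(new URL[0]), HadoopVersionEnumerator.class.getClassLoader());
            loaders.put(versionDir.getName(), loader);
        }
        return loaders;
    }

    public static void main(String[] args) throws Exception {
        Map<String, URLClassLoader> loaders = enumerate(new File("lib"));
        System.out.println("Available Hadoop versions: " + loaders.keySet());

        // A processor configured for "hadoop-2.6.0" would resolve the HDFS
        // client classes through that version's loader, for example:
        URLClassLoader chosen = loaders.get("hadoop-2.6.0");
        if (chosen != null) {
            Class<?> fsClass = Class.forName("org.apache.hadoop.fs.FileSystem", true, chosen);
            System.out.println("Loaded " + fsClass.getName() + " from " + chosen.getURLs()[0]);
        }
    }
}
{code}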

Attachments

Activity


People

    Assignee: Oleg Zhurakousky
    Reporter: Nathan Gough

Dates

    Created:
    Updated:
