Hadoop HDFS / HDFS-9651

All web UIs should include a robots.txt file


Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 2.9.0, 3.0.0-alpha4
    • Component/s: None
    • Labels: None
    • Hadoop Flags: Reviewed

    Description

      Similar to HDFS-330, so that public UIs don't get crawled.

      I can provide a patch that includes a simple robots.txt. An alternative would be a Filter that provides one automatically for all UIs, but I don't have time to do that.
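The attached patch isn't reproduced here; as an illustration, a minimal robots.txt that asks all well-behaved crawlers to stay out of every path would look like this:

```
User-agent: *
Disallow: /
```

Served at the root of each daemon's web UI, crawlers fetch it as /robots.txt before indexing. Note that robots.txt is advisory only: it keeps public UIs out of search indexes but is not an access control.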

      If anyone wants to take over please go ahead.
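To illustrate the Filter-style alternative mentioned above, here is a minimal sketch of serving robots.txt from a dedicated handler. It uses the JDK's built-in com.sun.net.httpserver for self-containment, not Hadoop's actual Jetty-based HttpServer2; class and method names are illustrative, not from the patch:

```java
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

public class RobotsTxtDemo {
    // Disallow all paths for all crawlers.
    static final String ROBOTS = "User-agent: *\nDisallow: /\n";

    /** Starts a server that answers /robots.txt; port 0 picks a free port. */
    public static HttpServer start(int port) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        server.createContext("/robots.txt", exchange -> {
            byte[] body = ROBOTS.getBytes(StandardCharsets.UTF_8);
            exchange.getResponseHeaders().set("Content-Type", "text/plain");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });
        server.start();
        return server;
    }

    public static void main(String[] args) throws Exception {
        HttpServer s = start(0);
        System.out.println("robots.txt served on port " + s.getAddress().getPort());
        s.stop(0);
    }
}
```

In a real servlet container the same logic would live in a javax.servlet.Filter (or a context registered on every daemon's web UI), so each UI answers /robots.txt without shipping a static file per webapp.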

      Attachments

        1. HDFS-9651.1.patch
          0.9 kB
          Lars Francke
        2. HDFS-9651.2.patch
          2 kB
          Lars Francke


          People

            Assignee: Lars Francke (larsfrancke)
            Reporter: Lars Francke (larsfrancke)
            Votes: 0
            Watchers: 5

            Dates

              Created:
              Updated:
              Resolved:
