Hadoop HDFS / HDFS-330

Datanode Web UIs should provide robots.txt

    Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Trivial
    • Resolution: Fixed
    • Affects Version/s: 0.22.0
    • Fix Version/s: 0.22.0
    • Component/s: datanode
    • Labels:
      None
    • Hadoop Flags:
      Incompatible change
    • Release Note:
      A robots.txt is now in place which will prevent well-behaved crawlers from perusing Hadoop web interfaces.

    Description

    There is a potential issue that someone might have an internal corporate crawler that accidentally crawls the HDFS browser. It might be a good idea to provide a default robots.txt file that disables crawling. [No, this didn't happen to us. :) ]
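
    The robots.txt approach is the standard Robots Exclusion Protocol: a plain-text file served from the root of the web UI that well-behaved crawlers fetch before indexing anything. A minimal sketch that disallows all crawling is shown below; the example URL and port are illustrative, and the exact contents of the attached HDFS-330.txt are not reproduced here.

        # robots.txt served from the root of the daemon's web UI,
        # e.g. http://<datanode-host>:<http-port>/robots.txt (illustrative URL)
        User-agent: *      # applies to every crawler
        Disallow: /        # disallow crawling of the entire UI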

    Attachments

    1. HDFS-330.txt (0.2 kB, Allen Wittenauer)


    People

    • Assignee: Allen Wittenauer
    • Reporter: Allen Wittenauer
    • Votes: 1
    • Watchers: 3
