- Type: Improvement
- Status: Resolved
- Priority: Trivial
- Resolution: Fixed
- Affects Version/s: 0.22.0
- Fix Version/s: 0.22.0
- Component/s: datanode
- Labels: None
- Hadoop Flags: Incompatible change
- Release Note: A robots.txt file is now in place that prevents well-behaved crawlers from perusing Hadoop web interfaces.

Description: There is a potential issue that someone might have an internal corporate crawler that accidentally crawls the HDFS browser. It might be a good idea to provide a default robots.txt file that disables crawling. [No, this didn't happen to us. :) ]
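
For reference, a minimal robots.txt that disallows all crawling uses the standard exclusion syntax below (the exact file shipped with this fix may differ):

    User-agent: *
    Disallow: /

Serving this file from the root of each web interface causes compliant crawlers to skip every path on that server.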