Details
Description
There is a potential issue where an internal corporate crawler could accidentally crawl the HDFS browser web UI. It might be a good idea to serve a default robots file that disallows crawling. [No, this didn't happen to us. :) ]
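A minimal default robots.txt served from the web UI root could look like the following (a sketch of the standard robots exclusion format, not an existing file in the codebase):

```
User-agent: *
Disallow: /
```

This tells all well-behaved crawlers not to index any path on the server; it does not, of course, stop crawlers that ignore the robots exclusion protocol.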