Details
- Type: Bug
- Status: Closed
- Priority: Major
- Resolution: Fixed
- Labels: None
Description
The public endpoints get crawled, leading to lots of 403s. We should serve a /robots.txt that tells crawlers not to bother looking any further.
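A minimal robots.txt that tells well-behaved crawlers to skip the whole site could look like the following (a sketch; the actual paths to disallow depend on which endpoints are public):

```
User-agent: *
Disallow: /
```

Served at the web root as /robots.txt, this asks all crawlers to avoid every path, which should eliminate the bulk of the 403s from crawler traffic.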