Details
- Type: New Feature
- Status: Resolved
- Priority: Minor
- Resolution: Duplicate
Description
It would be nice to be able to modify the URL rules while a crawler is running. This would let me dynamically exclude areas from being crawled based on the results coming back. Basically, I want to look for certain markers inside a page and then stop crawling those pages without having to update a robots file. Different paths of our site are going to enter the index through a different method than the main crawl, so I can skip them once I find them.
Having a modifiable filter would allow people to load their rules from places other than a file without having to write their own implementation or extension. I'll try to work up a patch sometime this week.
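As a rough illustration of the idea, here is a minimal sketch of a URL filter whose exclusion rules can be added to while crawler threads are running. The class name and methods are hypothetical (this is not the existing Droids RegexURLFilter API); a copy-on-write list is one way to let crawler threads read the rules safely while another thread appends new ones.

```java
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.regex.Pattern;

// Hypothetical sketch, not the actual Droids API: a URL filter whose
// exclusion rules can be modified while the crawl is in progress.
public class MutableUrlFilter {
    // CopyOnWriteArrayList allows lock-free iteration by crawler threads
    // while rules are added concurrently.
    private final List<Pattern> excludes = new CopyOnWriteArrayList<Pattern>();

    // Add a new exclusion rule at runtime, e.g. after spotting a
    // marker inside a fetched page.
    public void exclude(String regex) {
        excludes.add(Pattern.compile(regex));
    }

    // Return true if the URL should still be crawled.
    public boolean accept(String url) {
        for (Pattern p : excludes) {
            if (p.matcher(url).find()) {
                return false;
            }
        }
        return true;
    }
}
```

With something like this, a worker that detects a marker in a page could call `exclude("/archive/")` and every URL matching that pattern would be rejected from then on, without restarting the crawl or touching a robots file.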
Attachments
Issue Links
- duplicates
  - DROIDS-111: Open the API of RegexURLFilter to other types of input (Closed)