Droids / DROIDS-77

Be able to modify URL rules while crawler is running

    Details

    • Type: New Feature
    • Status: Resolved
    • Priority: Minor
    • Resolution: Duplicate
    • Affects Version/s: None
    • Fix Version/s: None
    • Component/s: core
    • Labels: None

      Description

      It would be nice to be able to modify the URL rules while a crawler is running. This would allow me to dynamically exclude areas from being crawled based on the results being returned. Basically, I want to look for certain markers inside a page and then not crawl those pages, without having to update a robots file. Different paths of our site are going to enter the index through a different method than the main crawl, so I can skip them once I find them.

      Having a modifiable filter would allow people to load their rules from places other than a file without having to write their own implementation or extension. I'll try to work up a patch sometime this week.
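A modifiable filter along the lines the reporter describes could be sketched as below. This is a hypothetical illustration, not the actual Droids API: the class and method names are assumptions, and the filter signature (`accept(String)`) is chosen for simplicity. The key point is that exclusion rules live in a thread-safe collection, so the coordinator can add rules while worker threads keep filtering.

```java
import java.util.Set;
import java.util.concurrent.CopyOnWriteArraySet;

// Hypothetical sketch of a URL filter whose rules can change while
// the crawler is running. Names are illustrative, not the Droids API.
public class MutableUrlFilter {
    // CopyOnWriteArraySet allows worker threads to iterate safely
    // while another thread adds new exclusion prefixes.
    private final Set<String> excludedPrefixes = new CopyOnWriteArraySet<>();

    /** Called by worker threads before a URL is fetched. */
    public boolean accept(String url) {
        for (String prefix : excludedPrefixes) {
            if (url.startsWith(prefix)) {
                return false; // subtree was excluded at runtime
            }
        }
        return true;
    }

    /** Called when a page marker indicates a subtree should be skipped. */
    public void exclude(String prefix) {
        excludedPrefixes.add(prefix);
    }
}
```

A crawler thread that finds a marker in a fetched page would call `exclude()` with that page's path prefix, and subsequent URLs under that path would be rejected without any change to a robots file.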

          Activity

          No work has yet been logged on this issue.

            People

            • Assignee:
              Unassigned
            • Reporter:
              Richard Frovarp
            • Votes:
              0
            • Watchers:
              0
