Nutch / NUTCH-1032

Delegate parsing of robots.txt to crawler-commons

    Details

    • Type: Task
    • Status: Closed
    • Priority: Minor
    • Resolution: Duplicate
    • Affects Version/s: None
    • Fix Version/s: 1.4, nutchgora
    • Component/s: None
    • Labels: None

      Description

      We're about to release the first version of Crawler-Commons (http://code.google.com/p/crawler-commons/), which contains a parser for robots.txt files. This parser should also be better than the one we currently have in Nutch. I will delegate this functionality to CC as soon as it is publicly available.
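As a rough sketch of what the proposed delegation could look like, the snippet below parses a robots.txt body with crawler-commons' `SimpleRobotRulesParser` and checks URLs against the resulting `BaseRobotRules`. The class and method names come from crawler-commons; the exact `parseContent` signature has varied across releases (later versions take a collection of agent names rather than a string), and the robots.txt body and agent name here are made up for illustration.

```java
import crawlercommons.robots.BaseRobotRules;
import crawlercommons.robots.SimpleRobotRulesParser;

import java.nio.charset.StandardCharsets;

public class RobotsDelegationSketch {
    public static void main(String[] args) {
        // Toy robots.txt body; in Nutch this would be the bytes fetched
        // from the host's /robots.txt.
        String robotsTxt = "User-agent: *\nDisallow: /private/\n";

        SimpleRobotRulesParser parser = new SimpleRobotRulesParser();
        // parseContent(url, content, contentType, robotNames) as in early
        // crawler-commons releases; "nutch-crawler" is a placeholder agent name.
        BaseRobotRules rules = parser.parseContent(
                "http://example.com/robots.txt",
                robotsTxt.getBytes(StandardCharsets.UTF_8),
                "text/plain",
                "nutch-crawler");

        // The crawler consults the parsed rules before fetching each URL.
        System.out.println(rules.isAllowed("http://example.com/public/page.html"));
        System.out.println(rules.isAllowed("http://example.com/private/page.html"));
    }
}
```

Delegating to a shared library like this means robots.txt edge cases (mixed agents, wildcard rules, crawl-delay directives) are fixed once in crawler-commons rather than separately in each crawler.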

          People

          • Assignee: Julien Nioche
          • Reporter: Julien Nioche
          • Votes: 0
          • Watchers: 0

