Nutch / NUTCH-1031

Delegate parsing of robots.txt to crawler-commons

    Details

    • Type: Task
    • Status: Closed
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 1.7, 2.2
    • Component/s: None
    • Labels:

      Description

      We're about to release the first version of Crawler-Commons (http://code.google.com/p/crawler-commons/), which contains a parser for robots.txt files. This parser should also be better than the one we currently have in Nutch. I will delegate this functionality to crawler-commons as soon as it is publicly available.
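
      For reference, a minimal sketch of what the delegation looks like against the crawler-commons robots.txt API (SimpleRobotRulesParser and BaseRobotRules are crawler-commons classes; the URL, agent name and sample robots.txt content below are illustrative only and are not taken from the attached patches):

          import crawlercommons.robots.BaseRobotRules;
          import crawlercommons.robots.SimpleRobotRulesParser;

          import java.nio.charset.StandardCharsets;

          public class RobotsDelegationSketch {
              public static void main(String[] args) {
                  // Raw robots.txt bytes, as Nutch's protocol layer would fetch them.
                  byte[] robotsTxt = ("User-agent: *\n"
                          + "Disallow: /private/\n"
                          + "Crawl-delay: 5\n").getBytes(StandardCharsets.UTF_8);

                  SimpleRobotRulesParser parser = new SimpleRobotRulesParser();

                  // Parse the rules for a given agent name; in Nutch this string would come
                  // from configuration (e.g. http.agent.name / http.robots.agents).
                  BaseRobotRules rules = parser.parseContent(
                          "http://example.com/robots.txt",  // URL the robots.txt was fetched from
                          robotsTxt,
                          "text/plain",
                          "nutch-crawler");                 // illustrative agent name

                  System.out.println(rules.isAllowed("http://example.com/public/page.html"));   // true
                  System.out.println(rules.isAllowed("http://example.com/private/page.html"));  // false
                  System.out.println(rules.getCrawlDelay());  // crawl delay as reported by crawler-commons
              }
          }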

      Attachments

      1. CC.robots.multiple.agents.patch
        3 kB
        Tejas Patil
      2. CC.robots.multiple.agents.v2.patch
        5 kB
        Tejas Patil
      3. NUTCH-1031.v1.patch
        30 kB
        Tejas Patil
      4. NUTCH-1031-2.x.v1.patch
        62 kB
        Tejas Patil
      5. NUTCH-1031-trunk.v2.patch
        47 kB
        Tejas Patil
      6. NUTCH-1031-trunk.v3.patch
        55 kB
        Tejas Patil
      7. NUTCH-1031-trunk.v4.patch
        55 kB
        Tejas Patil
      8. NUTCH-1031-trunk.v5.patch
        55 kB
        Tejas Patil


            People

            • Assignee:
              Tejas Patil
            • Reporter:
              Julien Nioche
            • Votes:
              0
            • Watchers:
              9
