Nutch / NUTCH-1031

Delegate parsing of robots.txt to crawler-commons


Details

    • Type: Task
    • Status: Closed
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 1.7, 2.2
    • Component/s: None

    Description

We're about to release the first version of Crawler-Commons (http://code.google.com/p/crawler-commons/), which contains a parser for robots.txt files. This parser should also be better than the one we currently have in Nutch. I will delegate this functionality to crawler-commons as soon as it is publicly available.
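      The delegation described above can be sketched with crawler-commons' `SimpleRobotRulesParser`. This is a minimal illustration, not Nutch's actual integration code; it assumes a crawler-commons version whose `parseContent` accepts a single agent-name string, and the URLs and agent name are made up for the example:

```java
import java.nio.charset.StandardCharsets;

import crawlercommons.robots.BaseRobotRules;
import crawlercommons.robots.SimpleRobotRulesParser;

public class RobotsCheck {
    public static void main(String[] args) {
        // A toy robots.txt body standing in for a fetched file.
        String robotsTxt = "User-agent: *\nDisallow: /private/\n";

        SimpleRobotRulesParser parser = new SimpleRobotRulesParser();
        // parseContent(url, rawBytes, contentType, robotName) returns the
        // parsed rule set for the named agent.
        BaseRobotRules rules = parser.parseContent(
                "http://example.com/robots.txt",
                robotsTxt.getBytes(StandardCharsets.UTF_8),
                "text/plain",
                "mybot");

        // Query the rules instead of re-implementing robots.txt logic.
        System.out.println(rules.isAllowed("http://example.com/index.html"));
        System.out.println(rules.isAllowed("http://example.com/private/secret.html"));
    }
}
```

      The point of the change is exactly this shape: Nutch fetches the robots.txt bytes and asks the shared `BaseRobotRules` object whether a URL is allowed, rather than maintaining its own parser.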

      Attachments

        1. NUTCH-1031.v1.patch
          30 kB
          Tejas Patil
        2. CC.robots.multiple.agents.patch
          3 kB
          Tejas Patil
        3. CC.robots.multiple.agents.v2.patch
          5 kB
          Tejas Patil
        4. NUTCH-1031-trunk.v2.patch
          47 kB
          Tejas Patil
        5. NUTCH-1031-trunk.v3.patch
          55 kB
          Tejas Patil
        6. NUTCH-1031-trunk.v4.patch
          55 kB
          Tejas Patil
        7. NUTCH-1031-trunk.v5.patch
          55 kB
          Tejas Patil
        8. NUTCH-1031-2.x.v1.patch
          62 kB
          Tejas Patil

People

    • Assignee: Tejas Patil (tejasp)
    • Reporter: Julien Nioche (jnioche)
    • Votes: 0
    • Watchers: 9

Dates

    • Created:
    • Updated:
    • Resolved: