Nutch / NUTCH-1067

Configure minimum throughput for fetcher


Details

    • Type: New Feature
    • Status: Closed
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 1.4
    • Component/s: fetcher
    • Labels: None
    • Patch Info: Patch Available

    Description

      Large fetches can contain a lot of URLs for the same domain. These can be very slow to crawl due to politeness delays from robots.txt, e.g. 10s per URL. Once all other URLs have been fetched, these queues can stall the entire fetcher: 60 URLs can then take 10 minutes or even more. This can usually be dealt with using the time bomb, but the time-bomb value is hard to determine.

      This patch adds a fetcher.throughput.threshold setting: the minimum number of pages per second below which the fetcher gives up. Rather than using a global average (total pages divided by running time), it records the number of pages actually processed in the previous second and compares that value with the configured threshold.
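      As a sketch, the setting could be enabled in conf/nutch-site.xml as below. The property name follows the description above; the value of 2 pages/second is only an illustrative example, not a recommended default.

      ```xml
      <!-- Illustrative nutch-site.xml fragment (value is an example only). -->
      <property>
        <name>fetcher.throughput.threshold</name>
        <value>2</value>
        <description>Minimum number of pages per second before the fetcher
        gives up. The check compares the pages actually processed in the
        previous second against this threshold.</description>
      </property>
      ```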

      Besides this check, the fetcher's status is also updated with the actual number of pages per second and bytes per second.
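      The per-second check described above can be sketched as follows. Class, method, and field names here are illustrative only and do not come from the patch itself; the point is that the decision uses the previous second's count, not a global average.

      ```java
      // Minimal sketch of the throughput check, assuming a caller invokes
      // belowThreshold() once per second with that second's page count.
      public class ThroughputCheck {
        private final int threshold;   // fetcher.throughput.threshold (pages/sec)
        private int pagesLastSecond;   // pages processed in the previous second

        public ThroughputCheck(int threshold) {
          this.threshold = threshold;
        }

        // Returns true when the fetcher should give up: the threshold is
        // enabled (> 0) and last second's throughput fell below it.
        public boolean belowThreshold(int pagesThisSecond) {
          pagesLastSecond = pagesThisSecond;
          return threshold > 0 && pagesLastSecond < threshold;
        }
      }
      ```

      A non-positive threshold disables the check entirely, so the fetcher behaves as before when the setting is unset.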

      Attachments

        1. NUTCH-1067-1.4-1.patch
          4 kB
          Markus Jelsma
        2. NUTCH-1067-1.4-2.patch
          5 kB
          Markus Jelsma
        3. NUTCH-1067-1.4-3.patch
          7 kB
          Markus Jelsma
        4. NUTCH-1067-1.4-4.patch
          7 kB
          Markus Jelsma
        5. NUTCH-1045-1.4-v2.patch
          144 kB
          Markus Jelsma

        Activity

          People

            Assignee: markus17 Markus Jelsma
            Reporter: markus17 Markus Jelsma
            Votes: 0
            Watchers: 0

            Dates

              Created:
              Updated:
              Resolved: