In the past we have encountered situations where crawling specific broken sites produced ridiculously long URLs that stalled tasks. The regex plugins (normalizing/filtering) spent hours processing a single URL, if they did not hang indefinitely.
My suggestion is to limit the outlink URL target length as early as possible. The limit is configurable, with a default of 3000. This should be long enough for most uses, yet strict enough to ensure the regex plugins do not choke on overly long URLs. Please see the attached patch for the Nutchgora implementation.
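To illustrate the idea (not the attached patch itself), here is a minimal sketch of length-based outlink filtering. The class name, the property name "db.max.outlink.length", and the way the limit is read are all hypothetical; only the default of 3000 comes from this proposal.

```java
/**
 * Minimal sketch: drop outlink targets longer than a configurable limit
 * before they ever reach the regex normalizer/filter plugins.
 * "db.max.outlink.length" is a hypothetical property name for this proposal.
 */
public class OutlinkLengthFilter {

  public static final int DEFAULT_MAX_OUTLINK_LENGTH = 3000;

  private final int maxLength;

  public OutlinkLengthFilter(int maxLength) {
    this.maxLength = maxLength;
  }

  /** Returns the URL unchanged if within the limit, or null to drop it. */
  public String filter(String url) {
    if (url == null || url.length() > maxLength) {
      return null; // dropped early, so regex plugins never see it
    }
    return url;
  }

  public static void main(String[] args) {
    OutlinkLengthFilter f = new OutlinkLengthFilter(DEFAULT_MAX_OUTLINK_LENGTH);

    // A normal URL passes through untouched.
    if (f.filter("http://example.com/page") == null) {
      throw new AssertionError("short URL should pass");
    }

    // A pathological URL (e.g. from a broken site) is rejected.
    StringBuilder sb = new StringBuilder("http://example.com/?q=");
    for (int i = 0; i < 4000; i++) {
      sb.append('a');
    }
    if (f.filter(sb.toString()) != null) {
      throw new AssertionError("overlong URL should be dropped");
    }
  }
}
```

Dropping (rather than truncating) over-limit URLs seems safer, since a truncated URL would usually point at a different, likely invalid, resource.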
I'd like to hear what you think about this.