
From: Andrew J. Schorr
Subject: Re: [bug-gawk] 4.1.3->4.1.4 = Linux-libre's deblob-check grows huge and takes forever
Date: Thu, 13 Jul 2017 22:33:27 -0400
User-agent: Mutt/1.5.21 (2010-09-15)

On Thu, Jul 13, 2017 at 10:12:03PM -0400, Andrew J. Schorr wrote:
> And yes, the attached patch seems to fix the performance regression.
> But I don't understand the deeper issues here about why this change was
> made in the first place. I imagine that there are some cases where this
> was helping performance...

My first thought was that, since dfa search is supposed to be faster,
maybe replacing '! need_start' with 'start == 0' was excluding some cases
from the dfa path. So I thought we should perhaps try using the union of
those two conditions, i.e.

      if (rp->dfareg != NULL && ! no_bol && (! need_start || start == 0)) {

But with that patch, the code still fails this performance test. So I guess
the dfa search itself causes the performance problem in this case. Or I may
be totally confused; I have no idea how this portion of the code works...
