procmail

Avoiding concurrent mail processing (massive fgrep)?

1998-03-24 14:47:04

Situation: I use fgrep to scan headers against a massive list of keywords,
spamdomains, twit addresses, etc.  This works fine on regular traffic mail
levels.

It doesn't work well if I either get a flurry of mail by chance, or if my
server connectivity has been unavailable and is then restored (at which
point I get loads of mail that had queued at my mail secondary).  What
happens then is that my system quickly uses up all 64MB of RAM processing a
half-dozen of the massive fgreps, and slows to a crawl.

So, I tried wrapping the twit recipes in a lockfile:

:0:$TEMP/FGREPTWITS$LOGEXT
{
INCLUDERC=$PMDIR/spam/notwit.rc
}

(the notwit.rc file first checks for certain non-twits, then checks for
twits with various rules).
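For context, the recipes in notwit.rc are of this general shape -- this is a
simplified illustration only, with placeholder filenames, not the actual
contents of my rc file:

```procmail
# Hypothetical, simplified twit check: extract a couple of headers with
# formail and match them against a (large) pattern file with fgrep.
# $PMDIR/spam/twitlist is a placeholder name for the keyword file.
:0
* ? formail -zxFrom: -zxTo: | fgrep -i -f $PMDIR/spam/twitlist
/dev/null
```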

The log says "Extraneous locallockfile ignored", and the recipes are
executed concurrently anyway.

Two questions:

        If lockfiling is how one would go about it, what am I doing wrong?

        If not, HOW would one limit the execution of multiple copies of the
        fgrep recipes?  Or do I have to go to each individual recipe and
        put a local lockfile on each of them?
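(One thing I have NOT tried: procmailrc(5) also documents a "regional"
lockfile, set by assigning the LOCKFILE variable inside a block, as opposed
to the local lockfile on the recipe's flags line.  An untested sketch of
what I imagine that would look like here -- LOCKEXT is procmail's standard
lock suffix, and the lock is released when LOCKFILE is reassigned or
processing ends:)

```procmail
# Untested sketch: serialize the whole include with a regional lockfile,
# since the local lockfile on the nesting { } recipe gets ignored.
:0
{
  LOCKFILE=$TEMP/fgreptwits$LOCKEXT
  INCLUDERC=$PMDIR/spam/notwit.rc
}
```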

---
 Please DO NOT carbon me on list replies.  I'll get my copy from the list.

 Sean B. Straw / Professional Software Engineering
 Post Box 2395 / San Rafael, CA  94912-2395
