
Re: how to delay (following) tasks on filtered messages

2005-03-19 13:08:41
Please note that your message posts to the list as a MIME attachment. Generally, I don't read these things, but the subject tickled my curiosity on this otherwise boring morning.

At 13:21 2005-03-19 +0100, Michelle Konzack wrote:
> Date: Sat, 19 Mar 2005 13:21:46 +0100
> From: Michelle Konzack <linux4michelle(_at_)freenet(_dot_)de>
> Content-type: multipart/mixed; boundary="Boundary_(ID_H4rzT1JPRuXRPZfbFgTgMg)"

[snip]

> but now I would like to download the set with "apt-get source $SOURCE"...
>
> Which works fine, but blocks procmail and fetchmail until the download
> finishes.

This would be different from your idea of wrapping the fetchmail invocation with a lockfile script how? I realize there are obvious differences, but the end result seems to be pretty much the same.

> Does someone have an idea how to delay the task?

> Doing
>
>         apt-get source $SOURCE &
>
> does not work.  :-(

'man at'

Sort of the poor (wo)man's cron.
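To make that suggestion concrete, here's a hypothetical sketch (the helper name is mine, not anything from the original setup): hand the slow command to at(1) so the procmail recipe returns immediately. The helper only builds the command line; the actual hand-off to at is shown in the trailing comment so the example stays self-contained.

```shell
#!/bin/sh
# Hypothetical sketch of the at(1) approach: build the download command
# and hand it to at so the procmail recipe returns immediately.
# queue_download only prints the command; it does not run anything.
queue_download() {
    # $1: source package name (e.g. from procmail's $SOURCE variable)
    printf 'apt-get source %s\n' "$1"
}

# In the recipe's action line you would run something like:
#   queue_download "$SOURCE" | at now
```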

> Oh yes, please note that I am using a wrapper script for 'fetchmail'
> which downloads $USER after $USER and fetchmailrc after fetchmailrc,
> which means I have no parallel instances of fetchmail running.

> This reduces the load on my file server enormously... because I have
> heavy filtering rules: several different sbl/xbl/dul lists,
> spamassassin, f-prot, and my own filters...

Multiple fetchmailrc per $USER...

> Should I code a fetchmail wrapper which creates a lockfile while
> fetchmail downloads for the $USER, and then use a cronjob (each minute)

Do you REALLY need it invoked that often? IOW, is something coming down on that notification list really so critical that you need to have it the minute it shows up, or wouldn't 15 minutes work just as well (and be less of a 90%-useless task on your machine)? See your own reference to the load on your server...

> which checks for the lockfile and, if there is none, downloads the
> $SOURCE?

Why would the fetchmail need to be within a lockfile wrapper? Seems that would still manage to hold up your mail processing.

Why not create a subdir into which you place files named according to the file you want to retrieve. A cron-invoked script looks at that directory and invokes a retrieval (such as apt-get) for each filename found there, deleting the marker file once the file has been downloaded, and progressing to the next filename found in that dir (which, FTR, wouldn't be where the downloaded files are stored). The very existence of the files would be atomic - they're either there or they are not, and procmail wouldn't be writing anything INTO the files, merely creating them, so no lockfile would be necessary (save for the invocation of the retrieval script itself).
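A minimal sketch of that cron-invoked script (directory layout, lock scheme, and the fetch command are my assumptions, not anything from the original setup):

```shell
#!/bin/sh
# process_queue DIR FETCH_CMD
# Each file in DIR names one package to retrieve. Run "FETCH_CMD name"
# and remove the marker only on success, so a failed download is simply
# retried on the next cron invocation.
process_queue() {
    queue=$1
    fetch=$2
    lock="$queue/.lock"
    # mkdir is atomic: if a previous run is still active, bail out.
    mkdir "$lock" 2>/dev/null || return 0
    for marker in "$queue"/*; do
        [ -f "$marker" ] || continue
        pkg=$(basename "$marker")
        if $fetch "$pkg"; then
            rm -f "$marker"     # success: dequeue the request
        fi                      # failure: marker stays for a retry
    done
    rmdir "$lock"
}

# A crontab entry might look like (every 15 minutes, per the load
# concern above):
#   */15 * * * * /usr/local/bin/process-apt-queue
# where that (hypothetical) script calls:
#   process_queue "$HOME/.apt-queue" "apt-get source"
```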

This way, your formail could be invoked whatever number of times, and you could match whatever number of notification messages that you take action on - they'll just get queued up for retrieval.
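On the procmail side, the recipe that used to run apt-get inline would merely touch the marker file. A hypothetical sketch (the Subject pattern, character class, and queue path are invented for illustration):

```
QUEUEDIR=$HOME/.apt-queue

:0
* ^Subject:.*Accepted \/[a-z0-9.+-]+
| touch "$QUEUEDIR/$MATCH"
```

Since touch only creates an empty file named after the package, the message body never touches the queue, and duplicate notifications just re-touch the same marker.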

Using a cron-invoked script allows for simpler error recovery than using at - if the script simply doesn't delete the stub file from the request directory unless it succeeds, then the next invocation can attempt the retrieval again, without any special retry coding on your part. With at, you'd need to set it up to re-queue the request, and unless you're also creating a request file with the at approach, you won't have automatic recovery if the at process dies. Having the request files also gives you an easily viewed "queue", and the file timestamps show when each request was added.

---
 Sean B. Straw / Professional Software Engineering

 Procmail disclaimer: <http://www.professional.org/procmail/disclaimer.html>
 Please DO NOT carbon me on list replies.  I'll get my copy from the list.


____________________________________________________________
procmail mailing list   Procmail homepage: http://www.procmail.org/
procmail(_at_)lists(_dot_)RWTH-Aachen(_dot_)DE
http://MailMan.RWTH-Aachen.DE/mailman/listinfo/procmail
