OK.. thank you Era, Philip and Jeff for your suggestions (I hope I got
everybody). It turns out perl was in the path, but procmail wasn't
handing perl the body of the message the way I thought it would: my
script was looking for an argument in $ARGV[0], and nothing was
appearing there. So, with a bit of modification, I made the procmail
recipe create data.$DATE (where DATE is %d%m%y%H%M%S), fed that file
name as an argument to the perl script, modified the perl to look for
a file name in $ARGV[0], and hey presto, it works. The perl script
takes care of removing the data file once it's done with it.
However, I have just realised that if I get 2 mails in the same
second, I'm up a creek. Can anyone suggest a better way of doing what
I'm trying to do?
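One possibility, assuming mktemp(1) is available on the system: let it
generate the unique name instead of building one from the date, since it
creates the file atomically and never hands out the same name twice. A
quick sketch (the /tmp/data prefix is just for illustration):

```shell
# mktemp replaces the XXXXXX template with a unique suffix and creates
# the file atomically, so two concurrent deliveries can never be handed
# the same filename, even within the same second.
f1=$(mktemp /tmp/data.XXXXXX)
f2=$(mktemp /tmp/data.XXXXXX)
if [ "$f1" != "$f2" ]; then
    echo "unique"
fi
rm -f "$f1" "$f2"
```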
New recipe is:
DATE=`date +%d%m%y%H%M%S`

:0
* ^FROM(_dot_)*dhill(_at_)pct(_dot_)edu
{
  :0 bc
  /home/dhill/data.$DATE

  :0 b
  | /home/dhill/mailparse.pl /home/dhill/data.$DATE
}
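An aside: if the data file exists only to ferry the body to the script,
the filename race could be sidestepped entirely by piping the body
straight in and having the perl read STDIN instead of opening $ARGV[0].
An untested sketch of such a recipe:

```
:0 b
* ^FROM(_dot_)*dhill(_at_)pct(_dot_)edu
| /home/dhill/mailparse.pl
```

The script would then grab the first line with `$line = <STDIN>;` and
there would be nothing on disk to collide or clean up.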
and the perl looks something like:
#!/usr/bin/perl -w
use Mail::Sendmail;

$infile = $ARGV[0];
$linecount = 1;

open (OUTFILE, ">> /home/dhill/mailparse.log") or die "$!\n";
open (INFILE, $infile) or die "$!\n";

# Only the first line of the body carries the data we want.
while (($line = <INFILE>) && $linecount == 1)
{
    chomp $line;
    print OUTFILE "$line\n";
    ($address, $first, $last, $number) = split /,/, $line;
    $linecount++;
}

close INFILE;
close OUTFILE;
unlink $infile;
I suppose I should at least have the error messages dumped to a file,
in case I run out of disk space.
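On the error-logging front, procmail can do much of this itself: point
LOGFILE at a writable path near the top of the rcfile and procmail will
record its own diagnostics and deliveries there (and, I believe, spawned
programs inherit it for their stderr as well). A minimal fragment,
assuming the home directory path:

```
LOGFILE=/home/dhill/procmail.log
```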
Any ideas?
--
Duncan Hill Sapere aude
One net to rule them all, One net to find them,
One net to bring them all, and using Unix bind them.