Ok, I just thought of this while walking the dog, so I don't claim it to be
anything but a stray thought....
I was trying to think of an absolutely bulletproof method of handling
duplicates. Of course this costs some on the 'pain in the neck' scale, but
perhaps someone can improve on it. Hang on... it's ugly.
It requires several pieces:
- a plaintext database of Message-IDs
- a process to check the incoming Message-ID against the database
- a (nightly?) cron job to trim the database to the desired size
- a change of all delivery recipes to include the 'c' flag
- an INCLUDERC to be used after the delivery recipe, called with the
'a' flag (to be included only if the recipe succeeded) which adds the new
Message-ID to the database
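Strictly as a sketch, the pieces might fit together something like this
(every path, folder name, and the sample list recipe are invented for
illustration):

    # grab the incoming Message-ID once, up front
    MSGID=`formail -zx Message-ID:`

    # if we already have it in the database, file the message as a
    # duplicate instead of delivering it again (the '@' check skips
    # messages that carry no Message-ID at all)
    :0 :
    * MSGID ?? @
    * ? grep -F -qs -- "$MSGID" $HOME/.msgid.db
    duplicates

    # an ordinary delivery recipe, now with the 'c' flag so the
    # message also falls through to the recipe below
    :0 c:
    * ^TO_procmail-list
    procmail-list

    # 'a' flag: this fires only if the recipe above succeeded
    :0 a
    { INCLUDERC=$HOME/.procmail/add-msgid.rc }

    # --- $HOME/.procmail/add-msgid.rc ---
    # append the Message-ID to the database; 'c' keeps the message
    # moving, 'i' ignores echo not reading its stdin
    :0 hwic: $HOME/.msgid.db.lock
    | echo "$MSGID" >> $HOME/.msgid.db

Every delivery recipe would need the same 'c' + 'a' pair, and you would
still want some terminal recipe so the fall-through copy doesn't also land
in $DEFAULT.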
Simple enough, yes, but of course it requires an extra step for each
delivery recipe.... still, it seems that this would ensure that no messages
are lost due to being mistaken for duplicates when they have not actually
been delivered anywhere.
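The nightly cron job itself is only a tail and a rename. This sketch fakes
up a database of 2500 IDs just to show the effect; the path and the
1000-entry cap are arbitrary choices:

```shell
# sketch of the nightly trim: keep only the newest 1000 Message-IDs.
# For demonstration we build a throwaway database of 2500 fake IDs;
# in real use $DB would be the live file and only the last line runs.
DB=$(mktemp)
seq 1 2500 | sed 's/^/<msgid-/;s/$/@example.com>/' > "$DB"
KEEP=1000
tail -n "$KEEP" "$DB" > "$DB.new" && mv "$DB.new" "$DB"
```
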
Another random thought: set LASTFOLDER to something bogus, then check at
the end to see if LASTFOLDER has changed.... I'm not sure this would be a
bulletproof answer, however....
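For what it's worth, that check might look roughly like this (the sentinel
value and the fallback folder name are invented):

    # at the top of the rcfile: prime LASTFOLDER with a sentinel
    LASTFOLDER=_nowhere_

    # ... all the normal delivery recipes go here ...

    # at the very end: if LASTFOLDER still holds the sentinel,
    # nothing above delivered the message, so save it somewhere safe
    :0 :
    * LASTFOLDER ?? ^^_nowhere_^^
    fallback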
I still think that if I wanted to do this I'd have the duplicate messages
delivered to /tmp/duplicates.USER (chmod 600 of course) and just let the
reboot take care of cleaning that out.
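Using formail's built-in Message-ID cache instead of the hand-rolled
database above, that could be as simple as (cache size and filenames are
whatever you like; procmail's default UMASK of 077 should give you the 600
mode):

    # remember recent Message-IDs in a small cache; the 'c' flag keeps
    # the message, and the 'a'-flagged recipe files anything formail
    # recognized as a duplicate
    :0 Whc: msgid.lock
    | formail -D 8192 msgid.cache

    :0 a:
    /tmp/duplicates.$LOGNAME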
Or, if you don't like /tmp/ solutions, put them in a file in your $HOME and
remove the file on logout.
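For the $HOME variant, the cleanup is one line in your logout script, e.g.
~/.bash_logout (the shell, the dotfile, and the ~/.duplicates name here are
all just examples -- use whatever your duplicates recipe delivers to):

```shell
# pretend procmail delivered some duplicates during the session
touch "$HOME/.duplicates"

# the actual logout cleanup line
rm -f "$HOME/.duplicates"
```
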
TjL