While I agree with you that it is a serious problem that e-mail addresses
are collected this way, I doubt this is the solution. First, I doubt that
these robots, of all robots, honor the robots meta-tag. Second, it is
important to me that my archives are indexed by search robots; they play a
role in getting the points we want across to the public.
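For reference, the meta-tag being discussed is a snippet like the following in a page's head (the "noindex, nofollow" values are the usual way to ask a robot to skip the page entirely):

```html
<!-- Asks compliant robots not to index this page or follow its links.
     A harvester that never reads the tag is unaffected. -->
<meta name="robots" content="noindex, nofollow">
```

As noted below, plenty of robots simply never look at it.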
Many search engines don't honor META tags -- last I checked, Excite
ignored them entirely and would only work with a robots.txt file. Even
a robots.txt file is no guarantee, however. It's just a guideline
that a homegrown robot doesn't have to use.
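To illustrate how little enforcement there is: the strongest thing a site can say in robots.txt is a blanket exclusion like the sketch below, and even that is only a request that a well-behaved robot chooses to honor.

```
# Ask all robots to stay out of the entire site.
# A homegrown harvester is free to ignore this file completely.
User-agent: *
Disallow: /
```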
The message also says that addresses are harvested from the Net, not
just the Web. I'd say it's more likely that newsgroups are scanned for
addresses than web sites, but the latter has been known to happen as well.
One way to make your addresses appear more protected is to print them
using the printable Latin-1 character codes. E.g. l&#64;m in a page's
source shows up as 'l@m' in the browser, which will attempt to mail the
correct address, but a scan for @-delimited text will fail.
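A tiny sketch of that trick (the function name entity_encode is mine, not anything standard): each character of the address is replaced by its decimal character reference, so a browser renders and mails the real address while a literal scan of the source for '@' finds nothing.

```python
def entity_encode(addr):
    """Encode every character of addr as a decimal HTML character
    reference (&#NNN;), hiding the literal '@' from naive scanners."""
    return "".join("&#%d;" % ord(c) for c in addr)

# The '@' (code 64) never appears literally in the output:
print(entity_encode("l@m"))  # -> &#108;&#64;&#109;
```

The obvious caveat is that any harvester that bothers to decode character references defeats this, so it only raises the bar slightly.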
Fortunately, I never saw the original post until I went looking for
it. The non-compliant Message-Id: tagged it as spam for my filters.
What about limiting posts to the list to actual subscribers instead
of the general public?