
Re: We need an IETF BCP for GREY LISTING

2011-10-24 17:14:27


There is no suggestion to change the framework. There is a growing issue with the increasing deployment of greylisting. Sure, mileage will vary. Is it a universal issue? No. Nevertheless, it is a problem waiting for a solution. There is a recognition of the wasted overhead and increasing delays. Fortunately, SMTP is robust enough to temper the "pain" because, for the most part, the mail will make it through - albeit with more attempts and longer delays.

That is something I wish to address.  That's all. It's that simple.
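For readers unfamiliar with the mechanics under discussion: classic greylisting temporarily rejects mail from an unknown (client IP, envelope sender, envelope recipient) triplet and accepts it only on a retry after a delay window, which is exactly where the "more attempts and longer delays" come from. A minimal illustrative sketch (the 300-second delay is an assumed value, not a standard):

```python
import time

# Minimal greylisting sketch (illustrative only): track the classic
# (client IP, MAIL FROM, RCPT TO) triplet and issue a temporary 4xx
# rejection until a retry arrives after the delay window.
GREYLIST_DELAY = 300   # seconds a new triplet must wait (assumed value)
_seen = {}             # triplet -> timestamp of first delivery attempt

def check_greylist(client_ip, mail_from, rcpt_to, now=None):
    """Return an SMTP reply code: 451 (retry later) or 250 (accept)."""
    now = time.time() if now is None else now
    triplet = (client_ip, mail_from, rcpt_to)
    first = _seen.setdefault(triplet, now)
    if now - first < GREYLIST_DELAY:
        return 451  # temporary failure: a compliant sender will retry
    return 250      # retry arrived after the delay window: accept
```

A legitimate queueing MTA retries and eventually gets 250; the cost is the delay, which is the overhead being discussed here.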


Richard Kulawiec wrote:
On Sun, Oct 16, 2011 at 06:37:47PM -0400, Hector Santos wrote:
My long 30 years of mail engineering experience has given me the
position and view that non-timely delivery (i.e. unreasonable
hindrance) is the exception and not the rule.

My equally-long years of mail engineering experience suggest that
non-timely delivery is only an issue when there is an outage,
a misconfiguration, an attack, or a major overload due to external events.
Then again, I don't see mail as a real-time service, and I don't see any
need for it to try to be.  (Other than, perhaps, on a localized basis --
where it already pretty much is, within the limits imposed by the
throughput of the system(s) handling it.)

I'm not arguing, in case it isn't clear, that we shouldn't try to
make mail systems better, faster, more reliable, etc.  Of course
we should.  But even if we could design, develop, and deploy the
perfect greylisting solution Internet-wide tomorrow...we would
achieve very little.  There are other much more serious problems
that remain, that have far more serious negative effects.  So if the
goal is overall improvement, then I think our attention is far better
directed at the elephants rather than the gnats.

Moreover: our hypothetical improved greylisting system will be
subject to scrutiny and attack from the same adversaries who've
already demonstrated considerable ingenuity and remarkable adaptation
skills.  Merely engineering it so that it works nicely between
friendly cooperating parties is insufficient; we would need to engineer
it so that it can't be turned into a weapon. (See below for
an example of a very poorly-thought-out idea that has long since
been weaponized in just this fashion.)

Perhaps that's possible.  But -- so far -- I don't think so.
It looks to me, from here, like optimizing-for-throughput is
functionally equivalent to optimizing-for-vulnerability.  I would
like to be convinced otherwise, but I don't see a cogent argument
to that effect being made, yet.

It's the same thing with CBV (Callback Verification). CBV works
wonderfully, but it's not something I would recommend be used
widely. It would not scale.  It's not something everyone can do.
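For context, CBV means connecting back to the purported sender's MX and probing, with a null-sender transaction, whether the MAIL FROM address would be accepted as a recipient. A hedged sketch of the idea (the host names and addresses below are placeholders, and this is not an endorsement of the technique):

```python
import smtplib

# Illustrative callback-verification (CBV) sketch -- NOT a recommendation.
# Connects back to the purported sender's MX and asks, via a null-sender
# probe, whether the original MAIL FROM address would be accepted.
def callback_verify(mx_host, mail_from, timeout=10):
    """Return True if mx_host accepts RCPT TO:<mail_from>, else False."""
    try:
        with smtplib.SMTP(mx_host, 25, timeout=timeout) as s:
            s.helo("verifier.example.org")  # placeholder verifying host
            s.mail("<>")                    # null reverse-path probe
            code, _ = s.rcpt(mail_from)     # 250 suggests address exists
            s.rset()                        # abandon the transaction
            return 200 <= code < 300
    except (smtplib.SMTPException, OSError):
        return False                        # treat any failure as unverified
```

The scaling objection follows directly: every inbound message can trigger an extra outbound SMTP transaction against a third party's server.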

CBV is inherently abusive, as has previously been discussed at length
on forums such as the old spam-l.  Anyone using it is providing a
spam support/enabling service and (possibly, IANAL) in violation of the
applicable statutes which (in some jurisdictions) make it illegal to
bypass a security mechanism.  Thus it should never be used. [1]

Just imagine if GMAIL.COM began to do greylisting. One can stand
maybe YAHOO with its GL derivative, but with more of the bigger ESPs
doing the same thing, it will get out of hand.

(a) We could imagine anything, but that doesn't mean it will happen.

(b) Greylisting has been around long enough that if there was going
to be mass adoption, it would have already happened.

(c) But if it does, and if a sufficient number of sufficiently large
sites begin to do greylisting, then spammers will respond by making it
ineffective.  (There is some evidence that this is already happening,
albeit in a scattered way.)

(d) If (c) comes to pass, then it's only a matter of time until greylisting
becomes as useless as "greet pause" techniques -- which had their day,
and still have some marginal utility, but really don't seem terribly
useful now that most spammers have fixed their SMTP engines.  I therefore
question whether it's worth putting all this effort into something
that spammers can defeat on a whim, and that some already have.


[1] The exception being purely-internal operations.  Although in
that case, there is always a better way.  Even static files rsync'd
by cron every few minutes are a vastly superior solution.