spf-discuss

Re: TXT Records

2003-11-21 06:42:28
On Tuesday 18 November 2003 10:09 pm, Marc wrote:
> I believe that there are a lot of domain owners
> that cannot conform to the TXT requirement for one reason or another.  For
> example, I know many 'hobbyists' (people who run their own domains 'for
> fun', such as myself) and tightly-run businesses that cannot run their own
> DNS and are unwilling/unable to pay for a full-featured DNS provider that
> supports TXT DNS records.

I think we can divide domain holders into two categories here:

(1) Those who are here mainly for web publishing. These will often have low 
technical literacy and cheap infrastructure, as you point out.
(2) Those who do only non-web stuff. Very rare, and they will almost certainly 
have good technical skills and equipment - enough to run a custom DNS server, 
for example.

Let's assume group (2) can handle anything we throw at them.

For group (1) we can assume that a web server is available and that they know 
how to publish files to it. They (should) already have been introduced to the 
concept of 'special control files' in the form of robots.txt. 

The simplest way for them to publish SPF would be in a similar form, e.g. a 
file called 'smtp-spf.txt' in their webserver root. They may not have the 
ability to set the 'Expires' header, so they would have to be able to specify 
a TTL in the file.

The tricky bit is finding the right server to query for each domain. I would 
suggest:
(a) look for an 'A' record for the domain itself, then try to connect to port 
80 there.
(b) if (a) fails, look for an 'A' record for 'www.' + domain, then try to 
connect to port 80 there.
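The two-step search above could be sketched roughly like this (a hypothetical 
helper, names of my own invention; only Python's standard socket module is 
assumed):

```python
import socket

def candidate_hosts(domain):
    """Hosts to probe for a policy file, in the order suggested above:
    (a) the domain itself, then (b) the 'www' subdomain."""
    return [domain, "www." + domain]

def find_spf_webserver(domain, timeout=5):
    """Return the first candidate that has an 'A' record and accepts a
    TCP connection on port 80, or None if both steps fail."""
    for host in candidate_hosts(domain):
        try:
            addr = socket.gethostbyname(host)        # the 'A' record lookup
            with socket.create_connection((addr, 80), timeout):
                return host
        except OSError:
            continue                                 # try the next candidate
    return None
```

A receiving MTA would of course want a short timeout here, since this runs in 
the mail-acceptance path.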

This vector will work for countless 'simple domains' run by little guys on the 
cheap. For security reasons it should only be searched if the DNS TXT lookup 
fails to provide authoritative SPF records. Trust the hostmaster more than 
the webmaster.

It will not work for more complex setups, but again people with complex setups 
will generally have good control of their DNS and can use TXT records.

It could be implemented by adding an 'http' mechanism to the SPF spec, and 
suggesting that receiving MTAs have a configurable default SPF record that 
is applied when a TXT lookup fails. This would contain something like:

"v=spf1 mx http http:www.%{d} default=softdeny"

The http mechanism just looks for an 'smtp-spf.txt' file in the root of a 
webserver. Simply concatenate the lines in this file, and add an optional TTL 
specifier to the spec. If no TTL is specified in the file, the default TTL 
should come from the 'Expires' or 'Cache-Control' HTTP headers. If neither is 
present, use the receiving MTA's administrative default TTL.
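The record-assembly step might look like the following sketch. The 'ttl=' 
line syntax is purely a hypothetical spelling of the in-file TTL specifier 
proposed above, not anything from the SPF draft:

```python
def parse_spf_file(body, default_ttl=3600):
    """Concatenate the lines of a fetched smtp-spf.txt body into one SPF
    record string, honouring an optional in-file 'ttl=' specifier.
    Returns (record, ttl_seconds)."""
    ttl = default_ttl
    parts = []
    for line in body.splitlines():
        line = line.strip()
        if not line:
            continue                      # ignore blank lines
        if line.startswith("ttl="):
            ttl = int(line[4:], 10)       # hypothetical TTL specifier
        else:
            parts.append(line)
    return " ".join(parts), ttl
```

For example, a file containing 'v=spf1 mx', 'ttl=7200' and '-all' on three 
lines would yield the record 'v=spf1 mx -all' with a 7200-second TTL.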

This will certainly make rollout easier, at the cost of (possibly) several 
additional lookups. But those lookups only happen for domains which would 
otherwise not be SPF compliant, so it is better than 'unknown'. Admins that 
do not like this can change their default SPF record as they see fit.

It also adds the administrative 'role' of webmaster to our mix, which 
currently requires only the 'hostmaster' and 'postmaster' roles to cooperate. 
This is an unavoidable consequence of the fact that many domain owners have 
little more than basic webmaster capabilities. In fact most are not even 
privileged users of the hosting servers.

Anyone know of any domain hosting service that does not permit the publishing 
of a 'robots.txt' file? The only ones I can think of are 'parking' services - 
which I suppose _are_ the administrators that we need to convince to 
cooperate. I for one would be happy to reject mail from badly parked domains.

- Dan

-------
Sender Permitted From: http://spf.pobox.com/
Archives at http://archives.listbox.com/spf-discuss/current/
Latest draft at http://spf.pobox.com/draft-mengwong-spf-02.6.txt
To unsubscribe, change your address, or temporarily deactivate your 
subscription, 
please go to 
http://v2.listbox.com/member/?

