Remember, the idea wasn't to have a 'global' list of 'good domains', but
ones which the *user* has whitelisted, so the user recognises them.
OK - I see what is meant now.
Still, how often does the average user visit a domain they've not visited
before? If they constantly have to approve 'new' websites, they're either
going to turn the warnings off or they're going to ignore them, either of
which defeats the point.
Note that it is not uncommon for example.com to use iframes that load content
from the apparently unrelated example.org. The same technique is commonly used
by malicious sites. If a user visits a 'new' site that uses iframes, should
such a system warn about all the domains the iframes load from?
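To make the iframe question concrete, here is a minimal sketch (not from the thread; the page, the function names, and the whitelist contents are all hypothetical) of what such a check would have to do: parse a page's HTML, collect the hosts of its iframes, and report which ones the user's whitelist does not cover.

```python
# Hypothetical sketch: find iframe hosts on a page that the user has
# not whitelisted. Uses only the Python standard library.
from html.parser import HTMLParser
from urllib.parse import urlparse

class IframeCollector(HTMLParser):
    """Collect the host of every <iframe src=...> on a page."""
    def __init__(self):
        super().__init__()
        self.hosts = set()

    def handle_starttag(self, tag, attrs):
        if tag == "iframe":
            src = dict(attrs).get("src", "")
            host = urlparse(src).hostname
            if host:
                self.hosts.add(host)

def unapproved_iframe_hosts(html, whitelist):
    """Return iframe hosts that are not in the user's whitelist."""
    parser = IframeCollector()
    parser.feed(html)
    return parser.hosts - set(whitelist)

page = (
    '<html><body>'
    '<iframe src="http://example.org/ad"></iframe>'
    '<iframe src="http://example.com/widget"></iframe>'
    '</body></html>'
)
# The user has only ever approved example.com, so example.org is flagged.
print(sorted(unapproved_iframe_hosts(page, {"example.com"})))
```

Even this toy version illustrates the usability problem above: every third-party iframe host on a 'new' page becomes another prompt the user must approve or learn to ignore.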
Again, there are a lot of web filters already out there. Can someone explain
either why the problem we're trying to solve here is different from the one
they attempt to solve or, if it isn't, why our solution will be better?
Martijn.
________________________________
Virus Bulletin Ltd, The Pentagon, Abingdon, OX14 3YP, England.
Company Reg No: 2388295. VAT Reg No: GB 532 5598 33.
_______________________________________________
Asrg mailing list
Asrg(_at_)irtf(_dot_)org
http://www.irtf.org/mailman/listinfo/asrg