spf-discuss

FTC 20 questions

2005-06-24 00:42:11
Here are the 20 questions being posed on the terrible FTC webform, extracted and posted here for clarity.

It has to be said that an organisation which omits the only protocol with a million records already published and in use by several large operators, and which lets such an incompetent website and webform go public, is very suspect in my book. What else are they incapable of doing?
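(For context, the omitted protocol is SPF, whose policies are plain TXT records published in DNS. Below is a minimal sketch of looking one up; it assumes the third-party dnspython package is installed, and the function name and example domain are purely illustrative.)

```python
import dns.resolver  # third-party package: dnspython


def lookup_spf(domain: str):
    """Return the domain's published SPF policy (a 'v=spf1 ...' TXT record), or None."""
    try:
        answers = dns.resolver.resolve(domain, "TXT")
    except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer):
        return None
    for rdata in answers:
        # A TXT record may be split into several character-strings; join them.
        txt = b"".join(rdata.strings).decode("ascii", errors="replace")
        if txt.lower().startswith("v=spf1"):
            return txt
    return None


if __name__ == "__main__":
    # Prints something like "v=spf1 -all" for a domain that sends no mail.
    print(lookup_spf("example.com"))
```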


https://secure.commentworks.com/FTC-EmailAuthenticationQuestionnaire/submitcomment1.aspx

<quote>
In November 2004, the FTC and the Department of Commerce's National Institute for Standards and Technology ("NIST") held a two-day Email Authentication Summit, at which proponents of various domain-level authentication standards and other interested parties discussed the utility of and issues surrounding domain-level authentication. At the Summit, the proponents of five of the proposed standards agreed to make their testing results public in order to assist in the evaluation of their standards. These five standards are the Internet Protocol ("IP")-based proposals "Sender ID" and Client SMTP Validation ("CSV"), and the cryptographic-based approaches Bounce Address Tag Validation ("BATV"), "DomainKeys" and Identified Internet Mail ("IIM"). This website contains several technologically-based questions that address issues such as the functionality, interoperability, scalability, and effectiveness of these standards.

We invite companies that are testing email authentication standards to answer the following questions about their testing and implementation efforts, and to update their responses as new information becomes available. By submitting this information, companies understand and agree that the information may be posted publicly on the FTC's website.



1. Identify the Email Authentication Standard tested by the Company.

2. Describe any modifications you made to the specification when you tested it, and explain why the modification was made. Do you believe that the test results would be the same if you used the published specification? If not, how do you believe the results would differ?

3. Describe what you tested (e.g., product functionality, interoperability, etc.), and the process that you used to test it. Include an analysis of how your testing process measured the capabilities being tested, and an interpretation of the results.

4. Describe the test environment. In particular, how many servers were used, and what was the configuration of the computers and servers (size and speed of disk, type and speed of the central processing unit (“CPU”), amount of memory), etc.?

5. What special software and/or hardware were required to conduct the testing, and how were all relevant components of the system (software and hardware) configured for the test?

6. How many email messages were tested?

7. From where did the tested emails originate? Were they live emails transmitted in real time or fabricated “test” messages? If they were live email messages, how did you treat them if they were not successfully authenticated?

8. Over what time frame did you conduct the testing?

9. Were prior tests conducted? If so, describe the parameters of the prior testing and how the current testing differed, if at all. Also, explain whether there were any reasons for modifying the testing parameters and whether the results varied.

10. Identify what percentage of email traffic was successfully authenticated.

11. If possible, explain whether any of the successfully authenticated email messages were sent by spammers, phishers, or zombie drones. If so, explain how you were able to make this determination.

12. Identify what percentage of email traffic was not successfully authenticated. Describe what part(s) of the authentication process failed. Please explain why you believe it failed.

13. Describe whether your results indicate that the methodology or methodologies were scalable, why you believe they were scalable, and to what scale (that is, how big).

14. Describe the CPU usage and network usage observed during the test. How did this affect other uses of the system and network, if any?

15. If you manage or facilitate a tool for public testing, please describe it and summarize any results of the public testing to date.

16. Describe any costs (e.g., financial, time required, personnel required, etc.) involved in testing the email authentication standard(s), including any costs associated with making modifications to your existing system.

17. If you have tested or plan to test Sender ID, did you check or will you check both Purported Responsible Address (“PRA”) and “MAIL FROM”?

18. If you have not checked the PRA records and will not in the future, why not?

19. Are you planning to test additional standards?  If so, when and which ones?

20. Describe what steps are required before your testing program is completed.

</quote>
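As an aside on question 17: Sender ID's PRA is derived from message headers, while the MAIL FROM check uses the SMTP envelope sender, so the two checks can authorise different domains. Here is a simplified, illustrative sketch of the header-precedence part only; the function and constant names are mine, and the real RFC 4407 algorithm handles resent blocks and malformed headers more carefully.

```python
from email import message_from_string
from email.message import Message

# Header precedence used to pick the PRA (simplified from RFC 4407).
PRA_HEADER_ORDER = ("Resent-Sender", "Resent-From", "Sender", "From")


def simplified_pra(msg: Message):
    """Return the first address-bearing header in PRA precedence order, or None."""
    for name in PRA_HEADER_ORDER:
        value = msg.get(name)
        if value:
            return value
    return None


raw = "From: alice@example.org\nSender: bulk@esp.example.net\nSubject: hi\n\nbody\n"
msg = message_from_string(raw)
print(simplified_pra(msg))  # -> "bulk@esp.example.net" (Sender outranks From)
# The MAIL FROM identity, by contrast, comes from the SMTP envelope,
# not from these headers.
```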