Russ,
My view on this would depend on which algorithms (mandatory or otherwise)
are to be listed in the suite.
If CMS comes up with a very short list of those algorithms that are very highly
regarded by the cryptographic community, AND have been (or promise to be)
widely accepted by the vendor community, then yes, I would like to see a stable
list of algorithms everyone could implement and interoperate with. If not, I
would hope that the individual standards committees would be more selective.
On the other hand, the track record of the IETF in general, and the S/MIME
group in particular, is very disappointing in this area, and so I don't hold
out much hope overall.
Unfortunately, other criteria are given more weight than the factors I listed,
including in particular intellectual property considerations. Then there is
what I call the "pet rock" problem, where everyone's favorite algorithm gets
included regardless of its proven robustness (or lack thereof -- the proof, not
necessarily the robustness) or commercial viability, just to avoid offending
someone, or having to make a hard choice.
That way lies madness, IMHO, not to mention fat, bloated code that is
unnecessarily slow and unreliable, along with horrendous interoperability
problems.
Moore's law predicts that hardware speeds will double every 18 months or so,
but "Gates' law" says that software will run slower and slower to compensate.
(I'm not being fair to Microsoft -- it is the entire software industry that is
causing this problem, including the so-called standards groups.)
If the number of algorithms included in the standard set is N, then the
variations on the theme (key wrapping, etc.) tend to go up as N-squared, and
the interoperability problems go up as N-squared times M, or maybe M-squared,
where M is the number of different implementations.
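To make the scaling concrete, here is a rough back-of-the-envelope sketch. The
function names and the growth exponents are my own toy assumptions, not
measurements of any actual CMS implementation; the point is only how quickly
the combinatorics get out of hand:

```python
# Toy model (assumptions, not measurements): with N algorithms in the
# standard set, the algorithm-combination variants (key wrapping, etc.)
# grow roughly as N^2, and cross-testing M independent implementations
# against each other multiplies that by roughly M^2.

def variants(n):
    """Rough count of algorithm-combination variants: ~N^2."""
    return n * n

def interop_cases(n, m):
    """Rough count of cross-implementation interop test cases: ~N^2 * M^2."""
    return variants(n) * m * m

for n in (2, 4, 8):
    print(n, variants(n), interop_cases(n, 5))
# Doubling N quadruples the variants; adding implementations
# multiplies the interop test matrix on top of that.
```

Even at N = 8 algorithms and M = 5 implementations, the toy model gives 1600
interop cases to worry about, which is the "we can't afford this" point in
numbers.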
We can't afford this sort of nonsense.
Bob