
Re: What is at stake?

2002-02-04 09:00:02
Ed Gerck <egerck@nma.com> wrote:

> In this scenario, and with all due respect to everyone's opinions,
> policies that might have been justifiable some 10 or 15 years ago,
> such as laissez-faire interoperation, conformance verification and
> trust, cannot be justified by saying "the existing system is quite
> effective" or "in doing this for the last 10 years, I've yet to suffer a
> mishap because of this..."  What was, ain't any more.
>
> In addition, within the last ten years the Internet has changed radically
> from a centrally controlled network to a network of networks -- with no
> control point whatsoever.  There is, thus, further reason to doubt the
> assertion that what worked ten years ago will work today in the same
> way.

I'm coming in late to this thread, but it seems to me that the Internet
was not particularly centrally controlled about 10 years ago.  The ARPAnet
was retired; regional research networks were in full force; commercial
networks were starting to appear.  If you'd said 15 years ago, I'd be more
likely to agree.

However, I find it interesting that this thread seems to have grown out
of a complaint about MIME (non)interoperability.  It strikes me as ironic
that even back in the 1980s, there was a lot of noninteroperability, at
least among mail systems.  I'm sure many remember mail messages that would
bounce because they were sent to "networks" (note quotes) that didn't (or
did) treat % or @ or ! as special characters.  But leaving those types
of problems aside, there were still sizable pockets of noninteroperability.
(I remember being flamed on header-people many years ago because I used
an emacs feature that let me edit mail headers in ways that were not
RFC 822 compliant.)

Issues such as "who got to attach to the Internet" don't seem to be that
relevant here, because as far as I can recall, neither ARPA nor its
contract agencies were in the protocol standardization business.
Interoperability came about mostly from implementors' desire to have
implementations that would work together, rather than from a mandate
from on high.  (There was also a fair amount of peer pressure: as an
implementor, you didn't want a reputation for noninteroperable
software if you subscribed to the IETF ethos of interoperability.)

Seems to me what's been overlooked in comparing now to "back in the day"
is that the notion of interoperability now comes primarily from the creators
of the most widely used software, which for the most part interoperates with
itself rather than with other vendors' products (and very rarely with
reference implementations built from IETF specs).  Furthermore, the value of
interoperability with another vendor's products is weighed (by the
consumer) against the cost; another vendor may offer a more interoperable
product, but may charge more for it (understandably, since it takes more
time to develop).  What is the consumer's perception of paying more for
a product that's, say, certified by the IETF Underwriters Labs, if they
can get something that works nearly 100% of the time for all of their
needs, costs less, and in some cases is bundled with the rest of the
system?

My guess is that at this point, certification of most software isn't
going to matter much to consumers unless the software costs less.
There are some exceptions -- consumers may pay more money for more secure
versions of software that are also interoperable.  But I don't think
distributing lists of nonconformant products is going to matter much at
this point.

--gregbo
gds at best.com


