On Behalf Of Michael Thomas:
Are we to believe that they are largely self-healing problems
as bad p2p apps will eventually correct themselves since it's
in their interest? Is it reasonable to believe that there is
enough general clue that they could be expected to do that?
And the collective clue of the ietf is not really needed to
help this along?
Last time I heard that type of talk, it was from all those folk who just
had to tell us why the network hypertext project we were working on at
the time was certain to be a colossal failure.
In terms of bandwidth headroom, the Internet seems a heck of a lot better
off today than in the early days of the Web, when people thought a T1 was
the ultimate in network connection.
Sure, BitTorrent is probably not great design by many standards. Neither
is NNTP, a protocol that used to rip up about half the Internet's
bandwidth churning bits around that almost nobody would ever read.
Pre-emptive flood fill is an 'interesting' strategy, to say the least.
Many ISPs no longer support it. Comcast charges extra.
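The cost of flood fill is easy to see in a toy model. Here is a hedged sketch (not NNTP itself, just an illustration of the principle) in which every server offers each new article to all of its peers, whether or not they already have it; the function names and topology are invented for the example.

```python
from collections import deque

def flood_fill_traffic(adjacency, origin):
    """Simulate flood-fill propagation: each node that receives an
    article offers it to every peer. Returns (offers_sent, minimum_needed),
    where the minimum is one copy per node other than the origin."""
    seen = {origin}
    offers = 0
    queue = deque([origin])
    while queue:
        node = queue.popleft()
        for peer in adjacency[node]:
            offers += 1            # every link carries an offer/copy
            if peer not in seen:   # only first arrival is useful
                seen.add(peer)
                queue.append(peer)
    return offers, len(adjacency) - 1

# A small full mesh of 4 servers: 12 directed links carry offers,
# but only 3 copies are strictly needed.
mesh = {n: [m for m in range(4) if m != n] for n in range(4)}
print(flood_fill_traffic(mesh, 0))  # → (12, 3)
```

Even in this tiny mesh the traffic is 4x the minimum, and the ratio grows with connectivity, which is roughly why carrying a full feed got expensive enough that ISPs dropped it.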
I think that the reasonable question to ask here is 'does MBONE have a
future?' If I were asked to design a new media distribution protocol from
scratch, I certainly would not choose MBONE as a model. The technical and
political problems both appear insoluble to me.
To return to the original question: the IETF is certainly capable of
solving complex problems. What it does not appear to be capable of is
letting go of failed experiments. Take multicast out to the woodshed;
it's long overdue.