
Re: Facts, please, not handwaving

2006-09-19 04:30:08
Dear Frank,
the main problem in a human debate is that the different protagonists tend to see the world, the debate, and the vision of others through their own vision. If they are reasonably clever, which is something I take for granted here, and sufficiently informed (which RFC 3935 requires), the whole issue is actually for them to progressively understand the others' points of view. Once they have understood each other, the solution is usually obvious to all, or the differences become manageable.

The whole issue is therefore to adopt a system where:
- one can understand the others' point of view (a methodology is needed to make sure that no lobby excludes other positions);
- one can check the competence of the proponents. The Internet standards process goes in that direction when it calls for network experts to be involved when new areas come to network engineering. The education process can also help. However, humility is probably the best way to learn from others, and this too can be helped by some methodology: a WG should produce a document in common, not revise someone's proposition;
- one must have the appropriate medium to express the common understanding. The IETF currently proceeds like Nature: RFCs are articles, officially calling for comments. This is research, not engineering. Engineering would result in an IETF Good Book. It would address reality, not document its own virtuality (cf. below).

At 04:32 19/09/2006, Frank Ellermann wrote:
Jefsey_Morfin wrote:

> The Internet has dramatically increased this to the point we
> have accepted it as a virtual and a global world, i.e. a
> conceptual and geographical equivalent coverage to reality.
> The IETF is therefore in the core of this

But not alone, googlebot, wikipedia, and some other companies
are nearer to that core.

?
These are content services, not the network architecture; not the same layer as the IETF. They are nearer to people's core, not to the network's core.

They build over the IETF vision. Reread RFC 3935. The IETF technology is not a technology, but the technology the IETF members want, derived from their core values. This is the problem: the reference is not reality, but the virtuality made of the previous RFCs.

> the support of what people are to believe to be their
> _unique_ virtuality.

I don't believe in "unique", and I don't believe in arbitrary
borders between "real" and "virtual".

This is not a question of faith. In writing this you just prove that you are in your own virtuality. Reality is what makes electrons exist and move. Virtuality is everything human brains can devise to document it. Physical laws structure a virtuality which tries to best document reality. For centuries we believed in Ptolemy's system, then in Copernicus's and Newton's laws, then in Einstein's laws; now we know from corrections that most probably everything is fractal (and I think there is a simple, good, and universal reason for that, which I call syllodata, and which I explained to Sam Hartman). Use the Excel paradigm: metadata are the columns, syllodata are the formulas.
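The spreadsheet analogy can be sketched in a few lines of code. This is a purely hypothetical illustration of the metadata/syllodata distinction as described here (the names `metadata`, `syllodata`, and the columns are mine, not anything defined in an RFC): metadata name what each value is, while "syllodata" play the role of the formulas relating values to each other.

```python
# Hypothetical sketch of the "Excel paradigm": metadata are the columns,
# syllodata are the formulas. None of these names come from an RFC.

metadata = ["quantity", "unit_price", "total"]  # column headers: what each value means

rows = [
    {"quantity": 3, "unit_price": 2.5},
    {"quantity": 10, "unit_price": 1.2},
]

# the "formula" layer: relations between columns, kept apart from the values
syllodata = {"total": lambda row: row["quantity"] * row["unit_price"]}

for row in rows:
    for column, formula in syllodata.items():
        row[column] = formula(row)

print(rows[0]["total"])  # 7.5
```

The point of the analogy is that the same data (cells) can be documented by what it is (metadata) and, separately, by how it relates (syllodata).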

Reality IS. Virtuality is when you start saying you BELIEVE. A context is when you say "we believe".

> The RFC system is not accompanied by a network ontology RFCs
> would update.

Evolving as needed.  Today you can get human-readable metadata
for RFCs with the rfc-editor.org search engine, use ietf.tools,
Bill's additional dependency tools, etc.  Some years ago I had
only a CD-ROM and grep.

This is not an ontology. It certainly could help in building an IETF-vision ontology, but that is still to be done. Then, the referent being the IETF and not reality, it would not be an attempt at a universal model but at an IETF model.

> There is therefore no description of the virtuality the IETF
> develops and the world is to believe in.

If we're in a sub-sub-sub-thread of "newtrk" (and not "NomCom")
here, then IIRC one conclusion was that everybody is free to
write "overview" documents about everything (s)he cares about.

Getting rough consensus for a publication as an IETF RFC is of
course a separate issue.

The target is not to pile up virtualities. It is to document their aggregation, as multiple faces of reality. This can only be done in a concerted way. Rough consensus is not a concerted system.

> reality is diverse, so the virtuality must be diverse

Yes, therefore please don't write "unique" outside of contexts
where it's clear / necessary / desirable / ... (roll your own).

I do not call for uniqueness. I oppose the uniqueness of the IETF born from rough consensus. At the end of the day the IETF has only one standard, one doctrine, one RFC series, one single root, one single IP addressing plan, one single pivotal language, one single IANA, etc. This is why I call the result the "mono-Internet".

> IETF wants to influence THE way people design, use, and
> manage the Internet.

There's no "THE way" in RFC 3935.

Oh! yes there is. There is a full section describing why there is only one way, along the IETF core values. Read RFC 3935 in detail. It is extremely well thought out and done, as are RFC 3869 and RFC 3774, which are well written. Clever people have analysed the issue and proposed their comments and solutions. This should not be disregarded, but worked on, to understand what all this means and implies.

My own conclusion, also from study, analysis, and validation, is that "rough consensus" was a step ahead but led to an approximative unique virtuality, instead of leading to a precise, diversified, better description of reality. This impacts the whole Internet architecture, culture, and engineering. IMHO this is why the Internet's new generation is blocked: intrinsically, the "mono" nature of the IETF architecture is inadequate to the "multi" nature of the need.

> the way RFC 4646 is disrespected and therefore not
> interoperable.

You can bury that troll now, it's dead and begins to smell.

Calling the core problem you do not know how to address a troll certainly helps you build a virtuality, but not a virtuality which tries to match reality; a virtuality which cannot scale. This is why they called Galileo a troll.

We wrote RFC 4646, which the IESG further documented. I made it acceptable to me from an external point of view. You are now embarrassed with it. You disrespect it. You want to mend it. Your problem. Except that it makes the whole mono-Internet non-interoperable with the main purpose of the multi-Internet: to support user diversity (see below).

> IMHO this comes from its decision method (rough consensus).
> It is a major step _ahead_ over "democratic" votes, but
> there is still a long cultural way to reach the adequate
> "concerted consensus" necessary to the subsidiarity of our
> networked technical, societial, industrial, political
> diversified world.

The models you've proposed were apparently based on national
agencies,

????
Please document what leads you to think this.

and I think it's ridiculous if individual experts can
hide themselves behind smoke like "Iceland does not support
adding xyz to standard abc", and then sell the results of such
dubious activities as wannabe standards.  They almost certainly
have no mandate from (in this example) the people of Iceland.

This is not what I propose.
However, it would offer some guarantee of reflecting diversity, which rough consensus among lobbies does not.

And even if they had that mandate, why should this be
better than the mandate of a comparable town which happens to
be no nation?

True. And why would that be better than a group of people mandated by their own PAC?

As Lessig wrote, "the constitution is in the code", meaning the standards which define the code. The IETF wants to be the place where these standards are written. Its members are, and act as, the founding fathers of what structures our world today. They have no mandate for that except their dedication and competence. As in every human group, there are leaders (read RFC 3935). These leaders claim to lead through "influence" mechanisms, based upon "core values", to document the "best way" the Internet can work.

Read RFC 1958. This has a fundamental flaw: it does not scale. In my opinion they should work through concerted mechanisms, based upon consensual attempts to best observe and use reality, to document the diverse and hopefully consistent ways the Internet users can choose.

> This is more complex, but this is the way we live, in
> intergovernance.

I'd not be surprised if "rough consensus" comes to a grinding
halt at some point.  Making it more complex isn't attractive.

A human process halts when it is no longer appropriate; in IETF parlance, when it does not scale. Rough consensus was an appropriate step ahead when the IETF was a user group. It has not been one for a long time: it is an RFC-knowledgeable group. There are two possibilities: either it reforms itself to become a user group once again, or the grassroots process develops elsewhere and the IETF stays interoperable with it.

I tend to think that the IETF cannot reform itself, because such a reform would touch not only its inner culture but also its technology (cf. RFC 3935: the IETF develops a technology along with its own values, i.e. its own culture). However, I also observe the result: most work is actually done outside of the IETF, with the same kind of culture, and with NAT-like reactions as a result. This is not the best we could hope for.

So we seem to be in a dead-end. That dead-end is precisely RFC 4646, because it closes, for the IETF, the opening towards the solution.

The solution is simple enough and quite usual, in perfect line with the RFC 1958 recipes. It is to consider that the IETF virtuality (the mono-Internet) is not unique, and therefore can be embedded in the virtuality diversity of the "multi-Internet" (I suggest it is the current default). This is all the easier since the whole Internet layer itself is to be embedded in a more global user model.

The IETF only considers the Internet. The users consider the Internet as one solution among others, among other layers of interest. A user-centric convergence makes it possible to force the Internet to scale; however, the Internet must stay compatible with the user. Users are at a layer above the Internet. Telecoms are the hardware-infrastructure layer: plug-to-plug interconnection, electrical signals, centralised protocols. The Internet is the software-superstructure layer: end-to-end interoperability, digital data, decentralised protocols. People are the brainware-metastructure layer: brain-to-brain interintelligibility, meaning and language, distributed protocols.

RFC 4646 documents the way the Internet wants to curb the upper layer protocols. This is a layer violation.
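For readers who have not followed the RFC 4646 thread: that document defines the structure of language tags, the upper-layer identifiers under dispute here. A minimal sketch of the common tag shape, language[-script][-region], follows; this toy parser is my own illustration and handles only that common case, not the full RFC 4646 grammar (digit-based region subtags, extensions, and private use are omitted).

```python
# Toy illustration of the RFC 4646 language-tag structure: language[-script][-region].
# This is a simplified sketch, not a conformant implementation of the full grammar.

def parse_langtag(tag):
    """Split a simple RFC 4646-style tag into its main subtags."""
    subtags = tag.split("-")
    result = {"language": subtags[0].lower()}
    for subtag in subtags[1:]:
        if len(subtag) == 4 and subtag.isalpha():
            result["script"] = subtag.title()   # e.g. "Hant"
        elif len(subtag) == 2 and subtag.isalpha():
            result["region"] = subtag.upper()   # e.g. "TW"
        else:
            result.setdefault("variants", []).append(subtag.lower())
    return result

print(parse_langtag("zh-Hant-TW"))
# {'language': 'zh', 'script': 'Hant', 'region': 'TW'}
```

Whether such tags belong to the network layer or to the "people" layer above it is exactly the disagreement in this thread.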

The way out is standard. Either the IETF adopts a concerted-consensus approach and supports all the upper-layer protocols (extending the TCP/IP stack), or the IETF consolidates its own virtuality and makes it interoperable with the upper-layer support of its own protocols (an OSI-like model). I said I opposed RFC 4646 being approved until Tunis, because the ecology of the network made the Tunis deal an obvious conclusion (the current Internet belongs to the USA; its diversification is to be discussed by the IGF). This settled the second approach. I find it significant that the IESG approved RFC 4646 only a very few hours after Gross agreed to the Tunis deal, making the US Congress's architectural decision the one approved by the world.

What the IGF now has to learn is how its own intergovernance will work, and how concerted consensus is to work. These concepts are new to most researchers but so simple and obvious to lay people ... Concerted consensus (real consensus on the outcome, not necessarily on the decision or the solutions) is the key to concertation (in that "en-EU" language they do not want) in subsidiarity, which is the very nature of our current global world process.

The question is: does the IETF need, want, and is it able to adapt? For which option?
jfc

_______________________________________________
Ietf mailing list
Ietf(_at_)ietf(_dot_)org
https://www1.ietf.org/mailman/listinfo/ietf