
3. Focus on linking open standards to code, operationals, and interoperability.

I think trying to discuss all four topics in one thread is doomed. I want
to focus on just one:

3. Focus on linking open standards to code, operationals, and
interoperability.

One of the reasons I have been pushing for a change in the RFC format is
that putting links into specifications allows them to become more
effective. In the past I have spent hours pulling schemas, code, etc. out
of specifications, removing headers, footers, and the like so I can feed
them into tools. That isn't just a waste of time; it cripples the
development process:

A) Using the schemas in specifications as the basis for code becomes the
exception, not the rule.

B) As a result of A, fewer tools are built, tested, used.

C) As a result of B, the schemas don't get exercised in running code and
they don't guarantee interop.

D) As a result of C, the value proposition of publishing formal schemas
decreases, reinforcing B, and the cycle continues.


Fixing the RFC format is in hand, but we also need tools. This is what I
have been working on: a complete toolset that allows a new specification,
reference code, and production code to be developed in less than a week.

I produced the following specification from scratch in three working days:

https://tools.ietf.org/html/draft-hallambaker-lurk-02

And by that I mean 24 hours' work, not 72.

The reference section in the draft is generated from the schema. The
examples in the draft are generated by running example code that is itself
generated from the schema. So when you read one of my specifications, you
can be as certain as is reasonably possible that the specification and the
examples are consistent.
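
To make the workflow concrete, here is a minimal sketch of the kind of C a
schema compiler might emit for a hypothetical two-field message, together
with the serializer whose output would be pasted into the draft's examples
section. The message name, field names, and values here are invented for
illustration; they are not taken from the LURK draft or from my actual
generators.

    /* Hypothetical sketch of schema-generated C: a struct for the
     * message and a serializer whose output becomes the draft's
     * example section. */
    #include <stdio.h>

    typedef struct {
        const char *Account;   /* account identifier        */
        int         Retry;     /* retry interval in seconds */
    } HelloRequest;

    static void HelloRequest_Serialize(const HelloRequest *m, FILE *out) {
        fprintf(out, "{\n  \"Account\": \"%s\",\n  \"Retry\": %d\n}\n",
                m->Account, m->Retry);
    }

    int main(void) {
        HelloRequest req = { "alice@example.com", 300 };
        HelloRequest_Serialize(&req, stdout);  /* example text for draft */
        return 0;
    }

Because the example text is program output rather than hand-typed JSON, it
cannot drift out of sync with the schema.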


Where I would like the IETF to get to is this: for new protocols, we start
with the expectation that from the outset there will be

* A specification with valid reference and examples sections

* A schema that permits
  * Automated generation of production code in at minimum C.
  * Automated generation of reference code

* A set of test vectors (a known-answer check is sketched below).
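
As a sketch of what a test vector buys you, here is a minimal known-answer
check: fixed inputs plus the exact bytes the schema requires, compared
against what the encoder under test actually produces. The encoder,
message, and values are all invented for illustration.

    /* Minimal known-answer test: encode fixed inputs and compare the
     * result byte-for-byte against the vector from the specification. */
    #include <stdio.h>
    #include <string.h>

    typedef struct { const char *Account; int Retry; } HelloRequest;

    static void HelloRequest_Encode(const HelloRequest *m,
                                    char *buf, size_t len) {
        snprintf(buf, len, "{\"Account\":\"%s\",\"Retry\":%d}",
                 m->Account, m->Retry);
    }

    int main(void) {
        HelloRequest in = { "alice@example.com", 300 };
        const char *expected =
            "{\"Account\":\"alice@example.com\",\"Retry\":300}";
        char buf[256];
        HelloRequest_Encode(&in, buf, sizeof buf);
        if (strcmp(buf, expected) != 0) {
            puts("FAIL: encoding does not match test vector");
            return 1;
        }
        puts("PASS");
        return 0;
    }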

In this context, reference code is not the same as production code,
because reference code should be conservative about what it accepts and
permissive in what it sends, inverting the usual robustness principle. In
fact, reference code should intentionally send malformed commands to
provide test cases.
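
For instance, a reference client might carry a table of deliberately
broken variants of a valid command and check that the peer rejects each
one. A minimal sketch, with invented command strings and the transport
stubbed out as printing:

    /* Sketch: deliberately malformed variants of a command, each a
     * single mutation of a valid message, used to exercise the peer's
     * error handling. Printing stands in for the real transport. */
    #include <stdio.h>

    static const char *malformed[] = {
        "{\"Account\":\"alice@example.com\"",      /* truncated JSON */
        "{\"Account\":42,\"Retry\":300}",          /* wrong type     */
        "{\"Account\":\"alice\",\"Bogus\":true}",  /* unknown field  */
    };

    int main(void) {
        for (size_t i = 0; i < sizeof malformed / sizeof malformed[0]; i++)
            printf("send: %s\n", malformed[i]);  /* expect an error back */
        return 0;
    }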


Yes, the tool is a bit rough and could be improved. At the moment it only
generates specifications for application protocols that use the JSON or
TLS schema formats, and it only targets C# and C. The example code
generator has also advanced somewhat ahead of the production generator.
But those are fixable problems. The code generators and the meta-generator
are all on GitHub and SourceForge under an MIT license:

https://sourceforge.net/projects/phb-build-tools/

I also have generators that build code for what I regard as 'legacy'
encodings. There is an RFC 822-style header parser, an FTP/SMTP/IMAP-style
command parser, and even an ASN.1 encoder.
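
To give a flavor of the header-parser case, here is a minimal hand-written
sketch of the kind of loop such a generator might emit: it reads
"Name: value" lines until a blank line, splitting at the first colon.
Folded continuation lines and most error handling are omitted for brevity.

    /* Minimal RFC 822-style header parser sketch: one header per line,
     * name and value separated by the first colon, blank line ends the
     * header block. Folded (continuation) lines are not handled. */
    #include <stdio.h>
    #include <string.h>

    int main(void) {
        char line[1024];
        while (fgets(line, sizeof line, stdin)) {
            line[strcspn(line, "\r\n")] = '\0';  /* strip line ending */
            if (line[0] == '\0')
                break;                           /* end of headers    */
            char *colon = strchr(line, ':');
            if (!colon) {
                fprintf(stderr, "malformed header: %s\n", line);
                continue;
            }
            *colon = '\0';
            char *value = colon + 1;
            while (*value == ' ' || *value == '\t')
                value++;                         /* trim leading space */
            printf("name=[%s] value=[%s]\n", line, value);
        }
        return 0;
    }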

The code is believed to run on Linux and OS X under Mono. The main reason
for not working on that recently is that, following Microsoft's
acquisition of Xamarin and the .NET Core initiative, support for Linux and
OS X is in a state of flux. There is now a new option that is expected to
improve the situation dramatically in the very near future.

The tools are implemented as command-line tools and, on Windows,
additionally as VSIX extensions that are fully integrated into Visual
Studio.


Working this way has obvious benefits, but it also has a few constraints.
One consequence of working through tools that build tools that build tools
for the past 25 years is that I am used to working at a very high level of
abstraction. And that means that when my high-level description of how a
protocol works has to start dictating very low-level details of the
implementation, I see that as an architectural failure.