ietf-openpgp

Re: [openpgp] "OpenPGP Simple"

2015-03-17 11:30:13
On Tue, Mar 17, 2015 at 2:48 AM, Jon Callas <jon@callas.org> wrote:


On Mar 16, 2015, at 7:04 PM, Peter Gutmann <pgut001@cs.auckland.ac.nz> wrote:

Jon Callas <jon@callas.org> writes:

Certainly the ASCII Armor checksum is something that could go, since we don't need to worry so much about modem line noise. :-) But you have to know enough to ignore it.

It's not just the checksum; the entire ASCII armoring should have been discarded years, no decades, ago. The whole thing was originally implemented because facilities like FidoNet and Usenet didn't handle binary messages, and the checksum was because things like 2400bps modems (pre-MNP) on the DOS PCs that PGP 1 was written for wouldn't cancel out line noise, so it was useful to check for inadvertent message corruption before you warned about invalid signatures.
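For reference, the armor checksum being discussed is the CRC-24 that RFC 4880 (section 6.1) specifies for Radix-64 blocks. A minimal sketch of it, including the "=XXXX" line that carries the base64-encoded 3-byte CRC:

```python
import base64

CRC24_INIT = 0xB704CE   # initial value from RFC 4880 section 6.1
CRC24_POLY = 0x1864CFB  # generator polynomial from RFC 4880 section 6.1

def crc24(data: bytes) -> int:
    """Compute the OpenPGP Radix-64 CRC-24 over the raw (pre-base64) data."""
    crc = CRC24_INIT
    for byte in data:
        crc ^= byte << 16
        for _ in range(8):
            crc <<= 1
            if crc & 0x1000000:
                crc ^= CRC24_POLY
    return crc & 0xFFFFFF

def armor_crc_line(data: bytes) -> str:
    """The armor checksum line: '=' plus base64 of the CRC as 3 big-endian bytes."""
    return "=" + base64.b64encode(crc24(data).to_bytes(3, "big")).decode()
```

An implementation that wants to "know enough to ignore it" can simply skip any final line beginning with "=" rather than recomputing this.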

The MIME standard (going back to RFC 1341) is over 20 years old and pretty much everything supports it, but PGP persists with something from even earlier (PEM, from 1987, that's nearly 30 years ago). It's not just "a museum of 1990s crypto" (thanks to Matthew Green for the great quote), it's also a museum of 1980s and 1990s everything-else. The entire discussion of "ASCII armour" should have been replaced with "use a mechanism like MIME" years ago.

(Oh, and by "MIME" I mean proper use of MIME, not "wrap PGP-PEM in MIME headers and pretend it's MIME", RFC 2015/3156.)


Maybe not decades.

ASCII armor as it exists now uses the same encoding as MIME for base64,
purely by chance. It is one of the things that makes me least crazy because
it’s mostly standard and actually kinda useful. There are a lot of semantic
places where it’s nice to know that something is an OpenPGP object in
human-readable form.

Something that seems to be forgotten all over the place is that email is
actually one of the least interesting places to use OpenPGP. ASCII armor
ends up being a nice way to encode something so you don’t have to play
"guess the binary format."
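As a sketch of that point, wrapping a binary blob in armor makes it self-describing: the BEGIN/END lines name what the object is, so a consumer never has to guess the format. A minimal enarmoring routine (the CRC-24 checksum line is omitted here for brevity; the label and 64-column body follow common PGP practice):

```python
import base64
import textwrap

def enarmor(data: bytes, label: str = "PGP MESSAGE") -> str:
    """Wrap binary OpenPGP data in a self-describing ASCII armor block.

    Sketch only: real armor also carries optional armor headers and a
    CRC-24 checksum line before the END line (RFC 4880 section 6.2).
    """
    b64 = base64.b64encode(data).decode()
    body = "\n".join(textwrap.wrap(b64, 64))  # PGP traditionally wraps at 64 columns
    return (f"-----BEGIN {label}-----\n\n"
            f"{body}\n"
            f"-----END {label}-----\n")
```

The human-readable framing is exactly the property being praised here: you can tell from the first line alone that the payload is an OpenPGP object.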


We have been having a similar discussion in ACME, which is for the issuance of certificates for use in TLS, email, etc.

The body of the message is going to be JSON. But the message needs to be signed. After a number of proposals we seem to have settled on a scheme in which the message starts with a JSON header carrying the signature, followed by a JSON body carrying the transaction request or response.
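For illustration only, that header-then-body layout might look like the following. The field names ("header", "alg", "sig", "body") are my own invention, not the ACME draft's, and an HMAC stands in for the real public-key signature:

```python
import base64
import hashlib
import hmac
import json

def b64url(raw: bytes) -> str:
    """Base64url without padding, as JSON-signature schemes typically use."""
    return base64.urlsafe_b64encode(raw).rstrip(b"=").decode()

def sign_request(payload: dict, key: bytes) -> str:
    # Serialize the body deterministically so signer and verifier agree.
    body = json.dumps(payload, separators=(",", ":"), sort_keys=True)
    sig = hmac.new(key, body.encode(), hashlib.sha256).digest()
    # Signature header first, then the transaction body it covers.
    return json.dumps({"header": {"alg": "HS256", "sig": b64url(sig)},
                       "body": payload})

def verify_request(message: str, key: bytes) -> bool:
    env = json.loads(message)
    body = json.dumps(env["body"], separators=(",", ":"), sort_keys=True)
    expected = hmac.new(key, body.encode(), hashlib.sha256).digest()
    return hmac.compare_digest(b64url(expected), env["header"]["sig"])
```

The design point is that the whole envelope stays ordinary parseable JSON, rather than a bespoke binary framing with a JSON payload inside.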



Relatively recently, I was opining to someone that it would be useful to
come up with a JSON encoding for OpenPGP that would give an easy-to-parse
thing that’s not just ASCII armor. But some years ago, I said the same
thing but it was XML, not JSON. And a few years before that, it was
S-Expressions, most recently in SPKI format, and more Common LISP-ish
before that even. JSON is what the cool kids are using this decade, don’t
you know.

And *that* is the reason to just stick with ASCII armor.


Well, you go to MIT, you get S-Expressions... I am kind of surprised the code made it out of 545 Tech Square without them.

When I was backing XML it was essentially just S-expressions with angle brackets and the initial tag duplicated on the end. JSON gets us back to what we were sold when XML was first offered, before the namespace-prefix idiocy was introduced and the schema was botched.

I think I will actually disagree with you, Jon, even though I started out thinking I was in agreement. I think the IT world has in fact picked winners and stuck with them, but for different purposes.

There is at least convergence at the lower levels of the stack. I can't imagine any new protocol using ASN.1 unless it is directly coupled to PKIX. The IETF has converged on the TLS notation and approach. It works. Above that, XML is the only viable choice for a document format, and JSON is emerging as the natural choice for Web Services. There is no consensus on a binary version of JSON, but there are many applications making use of one.


Given where we are today, there are two approaches that make sense. One is to stay with the current approach; the other is to pick an existing approach, adding essential features if absolutely required.

To be avoided at all costs is abandoning the current approach to invent a new encoding. Yes, JSON does have limitations that make it unsuited for some applications, and it is therefore inevitable that there will be some other encoding at some point in the future. But that does not mean the successor will look completely different from JSON, and we can be virtually certain that a new format proposed for PGP that looks completely different from both JSON and the TLS encoding is not going to be picked up anywhere else.
_______________________________________________
openpgp mailing list
openpgp@ietf.org
https://www.ietf.org/mailman/listinfo/openpgp