
Re: [Json] Consensus on JSON-text (WAS: JSON: remove gap between Ecma-404 and IETF draft)

2013-12-02 15:30:35
On Wed, Nov 27, 2013 at 11:51 PM, Tim Berners-Lee <timbl@w3.org> wrote:

> JSON is interesting in being a subset of ECMAscript. That is a big
> dependency -- will it be preserved?
> However, as it is unwise to feed JSON into an ECMAscript processor for
> security reasons, that dependency may not affect code, just mean that JSON
> and ECMAscript parsers can share parts at the moment.


As I see it, encoding X is a subset of encoding Y if and only if an encoder
for X will only produce outputs that are valid inputs for encoding Y.

If an issue were to occur, it would be because encoding Y itself has changed
or because the definition of Y has changed.
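
To make that definition concrete, here is a small sketch of my own (not part
of the original exchange): JSON permits an unescaped U+2028 LINE SEPARATOR
inside a string, while ECMAscript string literals did not accept one until
ES2019, so for years a conforming JSON encoder could produce output that was
not valid ECMAscript.

    import json

    # Valid JSON text containing a raw U+2028 LINE SEPARATOR in a string.
    doc = json.dumps({"note": "line one\u2028line two"}, ensure_ascii=False)
    print(doc)              # any JSON decoder accepts this
    print(json.loads(doc))  # round-trips cleanly as JSON
    # Before ES2019, eval(doc) in a JavaScript engine raised a SyntaxError:
    # an encoder for X (JSON) produced output invalid in Y (ECMAscript).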


> One could imagine that the arc of ECMAscript's evolution could end up
> having all kinds of impact on the data structure syntax and semantics
> (unordered sets as an alternative to lists? who knows). So in that case
> one could imagine pressure to make a new version of JSON to match.


Since we are talking about a serialization format, the distinction between
unordered sets and lists cannot appear at the wire level, and the wire level
is where we need interoperation.

I do in fact have a schema compiler for JSON that allows an interface to
specify a set of entries rather than a list, but such sets are only ever
serialized as lists.
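
A minimal sketch of that arrangement, with made-up helper names rather than
anything from my actual compiler: the schema-level type is a set, but the
wire form is always an ordinary JSON list.

    import json

    def encode_tags(tags: set) -> str:
        # Sort for a deterministic wire form; the set semantics live in
        # the schema binding, not in the serialization.
        return json.dumps(sorted(tags))

    def decode_tags(text: str) -> set:
        return set(json.loads(text))

    wire = encode_tags({"b", "a", "c"})
    print(wire)               # ["a", "b", "c"] -- a plain JSON list
    print(decode_tags(wire))  # set restored at the language binding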

> Yes, literal ISO dates and dateTimes -- I added them to my own N3/turtle
> parsers without much fanfare, wish they had been put in the Turtle language
> too. Maybe they will.


And you probably do exactly what I do and represent DateTime as a subset of
String, just as byte, int16, int32, int64, and uint* are all subsets of
Integer.
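
As a rough sketch of what that looks like in practice (my own illustration):
the wire sees only a string, and the binding validates and converts it.

    import json
    from datetime import datetime, timezone

    def encode_event(when: datetime) -> str:
        return json.dumps({"when": when.isoformat()})

    def decode_event(text: str) -> datetime:
        # fromisoformat rejects anything that is not well-formed ISO 8601,
        # which is what makes DateTime a constrained subset of String.
        return datetime.fromisoformat(json.loads(text)["when"])

    wire = encode_event(datetime(2013, 12, 2, 15, 30, tzinfo=timezone.utc))
    print(wire)           # {"when": "2013-12-02T15:30:00+00:00"}
    print(decode_event(wire))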

One of the things I think we have learned from JSON is that a
self-describing format only needs to specify the abstract type of the datum
and not the representation.

For convenience, I allow a schema to specify the size of an integer and
whether it is signed or unsigned so that the code generator can create
appropriate code bindings. But none of that shows up on the wire, nor would
it be particularly helpful if it did.
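
For instance, something along these lines (a sketch with made-up names, not
my generator's actual output): the binding enforces the declared int32 range
on decode while the wire carries a plain JSON number.

    import json

    INT32_MIN, INT32_MAX = -2**31, 2**31 - 1

    def decode_int32(text: str) -> int:
        value = json.loads(text)
        if not isinstance(value, int) or not INT32_MIN <= value <= INT32_MAX:
            raise ValueError("value does not fit the declared int32 binding")
        return value

    print(decode_int32("1024"))   # fine: an ordinary JSON number
    # decode_int32(str(2**40))    # would raise: fits Integer, not int32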


What we are doing at this point is to fix a version of the JSON encoding in
time so that when a client and server are negotiating use of JSON encoding,
both sides know what is being negotiated.

So hopefully JSON does not change in future, only the description of JSON.


That is not the same as saying that JSON meets all possible current and
future protocol needs. It does not. For a start, it is not possible to pass
binary data efficiently, and using decimal for floating point representation
is likely to make the format unacceptable for serious data applications
since it introduces conversion errors.
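
Two small demonstrations of those limits (my own examples, not benchmarks):
base64 wrapping inflates binary data by roughly a third, and a decimal
literal such as 0.1 has no exact binary floating-point equivalent.

    import base64
    from decimal import Decimal

    # Binary data has to be wrapped, e.g. in base64, at about 4/3 the size.
    blob = bytes(range(256)) * 4                    # 1024 bytes of binary
    print(len(blob), len(base64.b64encode(blob)))   # 1024 -> 1368

    # The nearest IEEE 754 double to the decimal text "0.1" is not 1/10,
    # so decimal wire forms and binary floats cannot round-trip exactly.
    print(Decimal(0.1))   # 0.1000000000000000055511151231257827...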

The next question is whether those unmet needs should be addressed by an
entirely new encoding, with a completely new syntax and structure, or
whether we could extend the JSON model. My view is that we should do the
latter.

XML is fine for documents but it is not meeting my needs as a protocol
designer. I find XML Schema to be unnecessarily confusing and baroque, and
the schema validation features it supports don't help me in application
protocols. XML does not support binary encoding of cryptographic data or
floating point.

There are many data encodings on offer, but I would like to be able to write
one decoder that can consume a data stream containing both basic JSON data
and JSON with extensions. This makes negotiating an encoding in a Web
service easy: the consumer states which encodings are acceptable, and the
sender makes sure that what is sent is compatible, downgrading the encoding
to the accepted level if necessary.
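
A toy sketch of that negotiation (the profile labels here are placeholders,
not a proposal): the sender picks the richest encoding the consumer has said
it accepts, falling back to plain JSON.

    # Ordered from basic to extended; these labels are illustrative only.
    PROFILES = ["json", "json-ext1", "json-ext2"]

    def negotiate(acceptable: list) -> str:
        # The sender downgrades to the richest profile the consumer accepts.
        for profile in reversed(PROFILES):
            if profile in acceptable:
                return profile
        raise ValueError("no common encoding")

    print(negotiate(["json"]))               # plain JSON fallback
    print(negotiate(["json", "json-ext1"]))  # extension accepted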

-- 
Website: http://hallambaker.com/