
Re: Requesting comments on draft-cheney-safe-02.txt

2009-08-01 12:56:37


Let's start with the basic definitions here:

   Web 1.0  - HTML only, no JavaScript
   Web 2.0  - HTML plus JavaScript and Ajax
   Web 3.0  - Web 2.0 + rich graphics

I know this is likely tangential to the current subject, but I have always
viewed Web 3.0 as the fabled "semantic Web," where markup exists in
a serialization, presumably XML, with a descriptive structural
grammar and not merely a presentational one.  Such a markup convention
would allow rapid integration of RDF for ontological parsing through a
language like OWL or KAON.  The intention of such an elaborate
processing scheme is to permit a simpler manner of organizing
human-consumable content, where "simple" refers both to immediate
understandability for humans and to the efficiency of parsing grammar
definitions into syntax for computer evaluation of logic.
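To make that contrast concrete, here is a sketch of the kind of serialization I have in mind, written as RDF in Turtle notation.  The URIs and the ex: vocabulary are hypothetical, invented purely for illustration; only the dc: (Dublin Core) prefix is a real, widely used vocabulary:

```turtle
@prefix dc: <http://purl.org/dc/elements/1.1/> .
@prefix ex: <http://example.org/terms#> .

# The document is described structurally (what it is, who made it),
# not presentationally (fonts, layout, scripting behavior).
<http://example.org/some-document>
    dc:title   "An example document" ;
    dc:creator "A hypothetical author" ;
    ex:status  "draft" .
```

Statements in this form can be loaded directly into an ontology reasoner, which is the whole point: the grammar describes structure that a machine can evaluate, rather than appearance that only a browser can render.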

The overall problem is that the industry is no longer concerned with Web
1.0 compatibility.  In fact, JavaScript is being enforced at many web
sites. They don't bother to accommodate Web 1.0 users (those who choose
to turn off JavaScript in their browser).
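For illustration, accommodating those users costs very little.  A minimal sketch, not taken from any particular site, of a form that degrades gracefully when scripting is off (the /checkout path is invented for the example):

```html
<!-- Works identically with or without JavaScript: the form is a
     plain HTTP POST, so script-disabled users are not locked out. -->
<form action="/checkout" method="post">
  <input type="submit" value="Check out">
</form>
<noscript>
  <p>JavaScript is disabled in your browser. The form above still
     works; you will simply see a new page instead of an in-page
     update.</p>
</noscript>
```

Sites that enforce JavaScript are declining even this small effort, which is the behavior being complained about here.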

The greatest limitation to the creation of solutions for technology
problems, at least for the internet end user, is the perceived value of
usability among those who most develop upon those technologies.
Usability testing has historically shown, and continues to show, that
users find beautification and interaction to be factors of extremely low
priority with regard to consuming and accessing data.  Companies that
make beautification and usability their top priorities typically score
significantly lower on usability tests than competitors whose
focus is on other factors, such as perceived speed of delivery, clarity
of content, and fewer pages/documents/emails necessary to complete a
transaction.  This paradox of usability is obvious to end users
unfamiliar with, or uncaring of, the technologies and features required
to make these experiences available.  Developers, on the other hand,
tend to hold beautification and usability in the absolute highest
regard, completely displacing any competing qualities of design
or development, because beautification and usability do more to
illustrate the skills and technological competency of the developer.  I
find this to be a marketing problem: a fallacy of proportion.

In other words, those who should be most responsible for knowing better
are often the very people most at fault for the conditions of failure to
solve end-user technology problems.  The result of this failure is
technology stagnation, loss of trust, incurable corner cases, and loss
of value.  This is quite obvious with JavaScript specifically, where the
only solution to its problems is to disable it completely, to the point
of ensuring it cannot be allowed to exist.  Even as a new version of
JavaScript is being drafted, these problems remain unresolved, and this
lack of a solution to the technology's most important failings is
known to those drafting the new version.

The 2nd problem is that newer browsers are not even making it a user
option to turn off JavaScript. For example, Google Chrome. This
browser is a prime example of the problems you are concerned about.
Microsoft and others are following this lead.

This is an important step in the failure of the web.  It is mandatory
execution of that which is commonly known to fail, from a security
perspective, without regard for the complete inability to solve the
problems posed by executing the technology.  If this path continues,
security vulnerabilities will continue to increase to such an extent
that trust and value will be lost to the end user, and this platform
of technology will be abandoned in favor of any less faulty solution,
based purely on a cost/savings decision and without regard for the
proprietary nature of that replacement.  Realistically, any given user
will tolerate failure only to a certain extent, even if that failure is
not immediately perceivable.  This is a threshold nobody has bothered to
measure, despite the incredible investment in this technology platform.
That is simply bad business, and technology concerns of open standards
must not rest on the perceived value of such faulty business logic.

In other words, never trust the software to make the correct decision
without independent consideration of the costs of that decision to the
technology as a platform.  Vendors of software are self-serving even if
the technology they develop is open source.

The 3rd problem, and an alternative to the 2nd issue, is that where
users turn off JavaScript and/or the browser doesn't support cross-domain
requests, client-installed plugins bypass these
restrictions, i.e., Flash and Silverlight.
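The bypass mechanism is worth spelling out: the browser's same-origin policy never applies to the plugin, which instead consults a policy file served by the target domain.  A permissive crossdomain.xml, such as the following commonly deployed configuration, waives the cross-domain restriction entirely for Flash:

```xml
<?xml version="1.0"?>
<!DOCTYPE cross-domain-policy SYSTEM
  "http://www.adobe.com/xml/dtds/cross-domain-policy.dtd">
<cross-domain-policy>
  <!-- Any origin may issue cross-domain requests to this host,
       regardless of the browser's same-origin restrictions. -->
  <allow-access-from domain="*"/>
</cross-domain-policy>
```

So even a user who disables JavaScript gains nothing if a plugin is installed and the remote host publishes a wildcard policy like this one.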

I completely agree that forcing a flawed technology down a user's throat,
when the only solution is to bury the technology, is simply a bad idea.
Mandating the execution of that technology in the face of that bad idea
is a worse idea.  That logic is a complete failure, and yet it is the
current position of interaction on the web.  This is why I wrote this
draft.

I guess I am trying to see how your draft proposal using SMTP will
help here, and which part of the above issue it solves?

The idea is that interaction on the web is becoming a failure due to
security.  The only solution to this problem is to abandon that
interaction.  My draft aims to help that solution become less bitter by
proposing an alternate method of interaction.

Peter's point is that the draft proposal would conflict with the
dynamics here.

I agree.  Event-oriented execution is not allowed by the draft.  This
harsh limitation is necessary to prevent abuse.  The history of the
development of internet technologies suggests that if you give a
developer the opportunity to abuse a technology to expand usability, the
technology will be so abused, completely without regard for the harm
caused by that abuse.

Well, all bets are off.  That is why I think you may be blowing
against the wind here.

Perhaps the clarity and costs of failure of the web as a technology
platform are not yet evident.  At some point they will become painfully
evident.  Nobody else has proposed a solution to this problem.  If my
solution is too bitter in its usability limitations, then the only
options are to propose a competing solution or to watch the web as a
platform fail.  Honestly, I believe this is all smoke.  Documents worked
perfectly fine on the web before there was JavaScript, and JavaScript
worked perfectly fine before there was a DOM.  This is even true for
ecommerce and the exchange of data with a server-side application.

Anyway, it sounds to me like this is more about having a secured,
certificate-based "SAFE" proxy that people can use AJAX or
Flash/Silverlight with.

This is entirely about security.  Flash/Silverlight are not a solution
to JavaScript.

At the end of the day, either you allow the site to run as it was
designed if you want to be part of it, or you just ignore it if you are
concerned about its cross-domain behavior.

Demeaning the user is not a technology solution.

