
Re: Is letting the browser transform XML to XHTML using XSLT a good choice?

2006-03-04 04:29:53
Jesper,

Firstly, you have no idea what you are talking about, and it shows.

See my comments below:

On 3/4/06, Jesper Tverskov <jesper(_at_)tverskov(_dot_)dk> wrote:

Case for client-side/server-side XSLT transformation

Mr Peterson is telling us that XSLT support in browsers has been good enough
for at least a year or so,

You've taken what I said, "a year ago, that was true, but that is no
longer the case," and decided to restate my case as "for at least a
year or so."

Don't do that.  It makes me upset when people twist my words into
lies and attribute them to me.  Again, don't do that.  It shows that
you are unwilling to reason and argue, and instead twist, turn, and
lie at my apparent expense.

Don't do that.

and that many websites could benefit from making
use of client-side transformation.

They can and do.

 This I accept could be true for a small
minority of websites,

It has nothing to do with web sites.  It has everything to do with web
browser support, something you have shown throughout this thread you
know nothing about.

More below as to why:

 but not and probably never in general.

You've taken your misguided opinion, "generalized" it, and used this
as the basis for your argument.  Shallow, pointless, useless; get a
clue.

I think most of us have a strong feeling that web design must be as simple
as possible to develop and maintain in the long run. We simply hate
the idea of having to test each request for a webpage and serve webcrawlers
like Google one page transformed at the server, then to test if browsers
need an XSLT 1.0 or an XSLT 2.0 stylesheet, and then to send both some xml
data store file and the proper XSLT stylesheet to the browser.

Wow, if I allowed myself to think of it like that, I too would
probably dread the idea.  Here's how it actually works:

You create a transformation file that fits your needs, test it, and
when all is said and done, that's it -- all is said and done.  That's
the nature of side-effect-free programming... bugs proliferate when
unseen or unknown forces make unexpected changes.  In XSLT 1.0, that
is simply not going to happen.  If you do not understand and realize
this, then that is in and of itself part of the problem.

In regards to XML: as long as you don't decide to make changes to your
XML file format -- which, if you sat down and designed it properly,
you wouldn't need to -- then once both pieces are in place, that's it,
you're done.  If no changes take place for a year, guess what?  Both
files are cached; when a client hits your site the browser checks for
updated files, and if there are none, it uses the cached files.  And
if you believe it takes LONGER for the client to transform the same
data on every visit, consider that, depending on the browser, a
transformed copy may be cached as well, in which case the cached copy
is used instead.
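
To make that concrete, here is a minimal sketch of the two pieces.
The file names and structure are hypothetical, not taken from any
particular site:

<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="articles.xsl"?>
<articles>
  <article date="2006-03-04">
    <title>Client-side XSLT</title>
    <body>Article text goes here.</body>
  </article>
</articles>

The browser fetches articles.xml and articles.xsl once, caches both,
and on later visits simply revalidates its cache; if neither file has
changed, the cached copies are used.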

Some pretty smart people wrote, and still write, these browsers.  Some
of these folks, in fact, I know at a personal level, as for various
reasons our development paths cross quite often.  Example: >
http://www.amazon.com/gp/product/159059536X/ref=pd_sbs_b_1/104-8254139-4166325?%5Fencoding=UTF8&v=glance&n=283155 <



Transformation server-side is not just one thing. We should do it the smart
way, that is, we only transform our data store to an XHTML/HTML webpage each
time the data store has changed.

Once again you have just shown you don't understand how browser
caching mechanisms work.  That said, it is true that if your web pages
rarely change, then sending them to the browser pre-rendered every
once in a while holds the potential to cost fewer CPU cycles over the
course of a cached file's lifetime.  If in fact this turns out to be
the case, you then need to ask yourself: does a client-side processor
which sits idle for 90% of its life, and takes, quite literally,
milliseconds to transform each file (if that's even necessary; in many
cases it won't be, as the browsers are smart enough to know that the
cached transformation result is unchanged and therefore doesn't need
to be rendered again) -- is that the sticking point that is going to
force you into hardcoding your output, and as such, be left scrambling
when the world suddenly becomes less interested in XHTML designed for
a 1024x768 screen and more interested in XHTML designed for a mobile
device with a screen less than 1/4 the size?  And what about SVG, and
what about XUL, and what about XAML, and what about taking a step back
and realizing the world is becoming less and less dependent on static
HTML/XHTML files, and more and more dependent on *data*?
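
One data file can already serve several of those targets.  As a
sketch (the stylesheet names here are made up, and browser support for
the media pseudo-attribute has historically been uneven), the
xml-stylesheet processing instruction defined in the W3C "Associating
Style Sheets with XML documents" recommendation lets the same document
carry more than one association:

<?xml-stylesheet type="text/xsl" href="screen.xsl" media="screen"?>
<?xml-stylesheet type="text/xsl" href="mobile.xsl" media="handheld"?>

One data file, two or more presentations, and the server never renders
either one.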

Here's what's happening.  Folks are realizing that with Atom and RSS
feeds they can use various readers, tools, and the like to view the
data contained within, in any way they want.  The GreaseMonkey
phenomenon shows that LOTS and LOTS of folks don't like to look at
your files the way you want them to, and instead hack together
solutions that rip out what they don't like and replace it with what
they do like.  This trend is building more momentum (speaking more to
Atom and RSS, but to GreaseMonkey and other similar projects as well),
not less.  So then what?  You're left scrambling to adapt to the new
way... you lose site visitors, your competition gains those visitors,
and while you're still scrambling they're making their services even
better and you lose even more.  < This != Success; This == Failure;

For the majority of webpages the
transformation server-side only takes place once or twice in a lifetime.

We've already established as much.  And, by the way, the days of the
static web page that is "good enough to last for X" are in fact
numbered.  For archiving, sure, that will be enough, as long as your
site doesn't get a ton of hits to your archived sections.  If it does,
then no, it's not good enough.

 For
other pages a couple of times a year, a month, a week, a day, an hour.

Could it really be better to transform each and every webpage every time a
browser requests it, in the browser, compared to once or twice in a
lifetime server-side? This is what we are talking about for the majority of
webpages.

No, we're not.  Or are you talking about your personal web site that
gets a limited amount of traffic?  If yes, then don't confuse your
personal web site with those of professional developers.  You're
confusing the issues for the sake of personal reasons that have
nothing to do with the business world, and everything to do with your
lack of desire to design something well.


Even when client-side transformation has some advantages, these must really
be big advantages to make me set up a more complex website.

It's not more complex.  In fact, if built correctly, it becomes less
complex, as you never have to worry about transforming your files to
HTML server-side, and instead let the client handle this.  You just
feed the client data -- that's it.  Just data.
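
And the client-side half of the bargain stays small.  Here is a
minimal, hypothetical XSLT 1.0 stylesheet matching the articles.xml
sketch above -- a rough outline, not a drop-in implementation:

<?xml version="1.0" encoding="UTF-8"?>
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns="http://www.w3.org/1999/xhtml">

  <xsl:output method="xml" indent="yes"/>

  <!-- Render the whole data file as one XHTML page. -->
  <xsl:template match="/articles">
    <html>
      <head><title>Articles</title></head>
      <body>
        <!-- One heading and one paragraph per article element. -->
        <xsl:for-each select="article">
          <h2><xsl:value-of select="title"/></h2>
          <p><xsl:value-of select="body"/></p>
        </xsl:for-each>
      </body>
    </html>
  </xsl:template>

</xsl:stylesheet>

Side-effect free, cacheable, and the server's only job is handing out
the data file.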

My conclusion is that the potential benefits of client-side transformation
will in most cases not be great enough to be worth considering.

For you.  Not for folks who are doing this at the professional level.
If you are doing this at the professional level, then you need to
reconsider things... you're flat out, undeniably, so far off, it's not
even funny.


But I will implement it for at least one of my websites as a test.

Why bother?  If you've gone through all the trouble to try to prove
why it doesn't make sense, your mind has already been made up.

Jesper, my guess is that you're probably a pretty decent guy.  But it
seems that you have taken what is obviously a small site (if you're
only updating your content once in a blue moon, why would I come back
more often than once in a blue moon?) -- the example of what seems
like a personal home page -- and attempted to use it as your blanket
argument.  That's shortsighted, and it confuses the issues for people
trying to make decisions for much larger, more complex sites.  There
is an entire industry preparing for the not-too-distant future in
which grid computing is fairly standard issue.  Do you believe that
these same folks adhere to the static server content notion?

They don't.  They understand that once you have, for example,
downloaded and installed a particular software application, the next
step is to sling data from one node to the next.  They also understand
that software can easily be updated with patches, instead of
downloading the full application all over again.  Think of these same
general ideas when you design your web applications: send small data
updates, instead of downloading the entire HTML page again because of
one tiny little spelling change.  Partial HTTP content is possible,
and in fact this is getting better and better as the years move
forward.  But in these same years, static HTML is being replaced by
dynamically generated HTML/XHTML.  Those are the facts, like it or
not.
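
To illustrate both points (a schematic exchange, not taken from any
real server log): a conditional request lets the client skip a
download entirely when nothing has changed,

  GET /articles.xml HTTP/1.1
  If-Modified-Since: Sat, 04 Mar 2006 04:00:00 GMT

  HTTP/1.1 304 Not Modified

and a byte-range request pulls down only part of a resource rather
than the whole thing:

  GET /articles.xml HTTP/1.1
  Range: bytes=2048-4095

  HTTP/1.1 206 Partial Content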

Take a look at the following comments:

The Stats

We peaked at approximately 103,000 simultaneous web visitors and 6,000
IRC viewers during the Keynote speech and transmitted over 32 GB of
data in a three hour period. If not for the efficiency of the
MacRumorsLive AJAX update system, the same webcast would have required
approximately twice as many servers and would have had to transfer
almost 6 times as much data (196 GB).

The full report is here. >
http://www.macrumors.com/events/mwsf2006-stats.php <  When they refer
to Ajax, I assume you understand that the process taking place is
asynchronous requests for small pieces of XML data, instead of a new
request for a new page, which would (as it normally does) suffocate
their servers.
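
In other words, each poll pulls down a fragment on the order of a few
hundred bytes -- something like this hypothetical payload (not
MacRumors' actual format) -- rather than the full page:

<update seq="412">
  <item time="10:42">Keynote update text goes here.</item>
</update>

The page's script merges the fragment into the live DOM; nothing else
is re-requested or re-rendered.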

If your entire point is to point out that "Well, for some of us this
doesn't apply...", great!  But don't inject the exception to the rule
into a conversation among folks who are interested in learning more
about how best to serve the needs of data-intensive sites that are
updated on a fairly consistent and regular basis.

And don't call me Mr. Peterson either.


Best regards,
Jesper Tverskov






--
<M:D/>

M. David Peterson
http://www.xsltblog.com/