I think most of us have a strong feeling that web design must be as simple as
possible to develop and maintain in the long run. We simply hate the idea of
having to test each request for a web page, serve web crawlers like Google one
page transformed at the server, then test whether the browser needs an XSLT
1.0 or an XSLT 2.0 stylesheet, and then send it both some XML data store file
and the proper XSLT stylesheet.
I reply: I won't comment on the business reasons for choosing client-side or
server-side transformation; that is a matter of business choice and target
market. I will instead reply to the comment about version 1.0 versus 2.0. For
the moment, and it seems for a while yet, browsers like Firefox/Mozilla or IE
support only 1.0, so I focus on supporting only XSLT 1.0 stylesheets even if
2.0 may bring a lot of benefits. For me, then, the least common denominator
for an XSLT transformation is XSLT 1.0. At least, according to the latest logs
on several servers, more than 95% of the agents requesting content were XSLT
1.0 enabled and none supported 2.0.
It seems that current tools like Cocoon:
a) are barely used
b) don't do a good job of partitioning the transformation process between the
server side and the client side depending on the agent's capabilities. For
instance, automatically recognizing a search engine crawler and performing a
server-side transform, but recognizing an IE user agent and in that case
performing a client-side transform.
I am amazed that still today we do not have such a tool.
In 2000, in a previous life and in a failed startup, we built an ISAPI
extension for IIS doing precisely that. A pattern matcher recognized the
incoming request and accordingly performed the transform on the server side
when the user agent didn't support the capability, and left the transform to
the client side otherwise. The only thing it had to do in the latter case was
insert a processing instruction in the document.
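The client-side case really is that small. A minimal Python sketch of the PI insertion (the function name and details are my own illustration, not the original ISAPI code):

```python
def add_stylesheet_pi(xml_doc: str, xslt_href: str) -> str:
    """Insert an xml-stylesheet processing instruction so the
    requesting browser performs the XSLT transform itself."""
    pi = '<?xml-stylesheet type="text/xsl" href="%s"?>' % xslt_href
    if xml_doc.startswith("<?xml"):
        # Keep the XML declaration first; the PI goes right after it.
        decl_end = xml_doc.index("?>") + 2
        return xml_doc[:decl_end] + "\n" + pi + "\n" + xml_doc[decl_end:]
    return pi + "\n" + xml_doc
```

The untouched XML and the stylesheet are then sent as-is, and the browser does the rest.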
I am, today, very surprised that this basic functionality is still not there.
Is it because:
a) current players have vested interests in a mainframe-like architecture?
b) people lack imagination with XSLT-based technologies and nobody thought of
this simple feature?
c) software producers are asleep at the switch?
d) XSLT is unpopular?
e) an asteroid fell on Earth and anybody with the will to do it was wiped out?
Comment targeted at the general audience:
I remember that in 2000 we ran tests with this kind of server and got at
least half of the requests transformed on the client side by browsers with
XSLT enabled (in this case IE 5). It wasn't more work for us because we wrote
only a single stylesheet, executed either on the server side or the client
side. The HTTP server add-on was:
a) performing a pattern match on the requesting agent
b) looking up the agent's capabilities in a DB
c) performing the transformation on the server side if the agent was listed
as unable to execute such a transformation, and including a processing
instruction otherwise.
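The three steps above can be sketched roughly like this in Python (the patterns and the capability table are illustrative assumptions standing in for the DB lookup, not the original add-on):

```python
import re

# Hypothetical capability table: pattern -> can the agent run XSLT 1.0?
AGENT_CAPABILITIES = [
    (re.compile(r"Googlebot", re.I), False),  # crawlers: transform on the server
    (re.compile(r"MSIE [5-9]", re.I), True),  # IE 5+ shipped a client-side processor
    (re.compile(r"Gecko", re.I), True),       # Mozilla/Firefox
]

def client_can_transform(user_agent: str) -> bool:
    """Steps a) and b): pattern-match the agent and look up its capability."""
    for pattern, capable in AGENT_CAPABILITIES:
        if pattern.search(user_agent):
            return capable
    return False  # unknown agents get the safe server-side path

def dispatch(user_agent: str) -> str:
    """Step c): decide where the transformation happens."""
    return "client" if client_can_transform(user_agent) else "server"
```

Unknown agents default to the server-side path, since sending raw XML to an agent that cannot transform it is the only failure mode that actually breaks the page.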
Simple? Obviously yes, and this is why I am so amazed to see that today, with
a much higher probability of receiving an XSLT-enabled request from a client,
we still do not have such technology. From the last logs I got on my servers,
more than 90% of the requests are coming from XSLT-enabled agents. In other
words, more than 90% of the requests could potentially be transformed on the
client side.
Can several of us check their logs and count the actual percentage of
requests coming from XSLT-enabled browsers?
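For anyone who wants to try, here is a rough Python sketch of such a count against an Apache combined-format log (the capability pattern is my assumption; adjust it for the agents you care about):

```python
import re

# Assumed heuristic: agents we believe can run XSLT 1.0 client-side.
XSLT_CAPABLE = re.compile(r"MSIE [5-9]|Gecko", re.I)
# In the combined log format the user agent is the last quoted field.
UA_FIELD = re.compile(r'"[^"]*" "([^"]*)"\s*$')

def xslt_share(log_lines):
    """Return the fraction of requests coming from XSLT-capable agents."""
    total = capable = 0
    for line in log_lines:
        m = UA_FIELD.search(line)
        if not m:
            continue  # malformed or non-combined-format line
        total += 1
        if XSLT_CAPABLE.search(m.group(1)):
            capable += 1
    return capable / total if total else 0.0
```

Run it over a log file with `xslt_share(open("access.log"))` and compare your percentage with mine.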
Funny to see that basic distributed computing principles are still not
applied in the marketplace and that partitioning of processes is out of scope
for mainstream web developers. It's even funnier to see machines with 2 or 3
GHz processors, more powerful than most ex-Cray machines, reduced to the role
of dumb terminals (great laugh :-) ;-) :-)
I am at least reassured to see some great and useful tools like Google Maps
not reduced to such short-sightedness. Recent efforts on mashups and AJAX
give me some hope that evolution is still happening. Maybe we will finally
get out of the regression we fell into and let these processing monsters
sitting on our desktops (or on our knees, on the floor, or any other place)
do more than be dumb terminals. As stock markets go through bubbles and
recessions, the software development domain seems to oscillate between
centralization and decentralization, between a PC model and a mainframe
model, between server-centric and client-centric. I am patient and still
waiting for the client-centric era. Unfortunately it seems that this will
come from Microsoft and XAML. And as usual, all the other vendors who slept
at the switch with a comfortable mainframe-like model will scream bloody
murder about a monopoly when in fact they were too short-sighted to lead the
evolution themselves. History repeats itself again and again, and people
learn very little from its lessons. Amazing, no? No, just a bad
Didier PH Martin
XSL-List info and archive: http://www.mulberrytech.com/xsl/xsl-list
To unsubscribe, go to: http://lists.mulberrytech.com/xsl-list/
or e-mail: <mailto:xsl-list-unsubscribe(_at_)lists(_dot_)mulberrytech(_dot_)com>