There are people who have 100-megabyte product catalogs in XML files. They
obviously don't serve the whole thing to a web page, but they do use XSLT to
process those catalogs and produce small web pages or XML output for reports.
So, there are several questions you can ask here. First, what does it take
for an XSLT processor to handle XML too big to live in memory all at once?
[That was Kevin Jones's question.] Second, what other techniques will help
beat the problem? XML databases? What else?
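One technique outside XSLT proper is event-driven (streaming) parsing, where each record is processed and then discarded so memory use stays flat regardless of file size. Here is a minimal sketch using Python's xml.etree.ElementTree.iterparse; the catalog structure and element names are hypothetical, invented just for illustration:

```python
# Sketch: stream over a large product catalog without loading it all into
# memory. The <catalog>/<product> schema here is made up for the example.
import io
import xml.etree.ElementTree as ET

SAMPLE = b"""<catalog>
  <product><name>Widget</name><price>9.99</price></product>
  <product><name>Gadget</name><price>19.99</price></product>
</catalog>"""

def summarize(stream):
    """Yield (name, price) for each product, freeing each subtree afterwards."""
    context = ET.iterparse(stream, events=("start", "end"))
    _, root = next(context)  # the first event delivers the root element
    for event, elem in context:
        if event == "end" and elem.tag == "product":
            yield elem.findtext("name"), float(elem.findtext("price"))
            root.clear()  # discard the processed subtree to keep memory flat

for name, price in summarize(io.BytesIO(SAMPLE)):
    print(name, price)
```

The same pattern works against a real multi-hundred-megabyte file opened in binary mode; only one product subtree lives in memory at a time. (XSLT 3.0 later standardized an analogous capability with streaming modes, but that is a different toolchain than the classic in-memory processors discussed here.)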
jeff
David(_dot_)Pawson(_at_)rnib(_dot_)org(_dot_)uk wrote:
Speaking only for myself, and as a reader of this list for some time, I
might comment that 'large' has meant (mostly) large human-targeted
document instances, often up to a few megabytes, occasionally ten or more,
seldom more.
I might suggest that the processing environment for many users is a desktop
PC or a server, rarely dedicated hardware.
If serving up web pages the options are there, but in terms of loading
a page in a user's browser, anything more than a few hundred KB is a bit
of a waste of time, since users get teed off and go elsewhere.
--
-------------------------------------------------------------------------
= Jeff Kenton Consulting and software development =
= http://home.comcast.net/~jeffrey.kenton =
-------------------------------------------------------------------------