xsl-list

Re: coping with huge xml-saxon

2003-06-17 19:37:12
On Tue, Jun 17, 2003 at 02:42:15PM -0700, 
david_n_bertoni(_at_)us(_dot_)ibm(_dot_)com wrote:

> There are many factors to consider.  Markup-heavy documents might take up
> less space than content-heavy documents.  But really, it's very
> implementation-specific.  I think jd-xslt has an option to page parts of
> the document to disk, which would certainly help with large documents.
> Xalan-C's default implementation of the source tree keeps the entire
> document in memory but tries to be as compact as possible.  However,
> someone could write a different implementation that keeps the majority of
> the document in a database, etc.
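The paging idea above can be approximated with plain streaming: handle each
element as its end tag arrives and then discard it, so peak memory is bounded
by one subtree rather than the whole document. A minimal sketch using Python's
standard library (purely illustrative; this is not how jd-xslt or Xalan-C are
implemented internally):

```python
import io
import xml.etree.ElementTree as ET

def count_items(source):
    """Count <item> elements without ever holding the full tree in memory."""
    count = 0
    for _, elem in ET.iterparse(source, events=("end",)):
        if elem.tag == "item":
            count += 1
        elem.clear()   # discard the finished subtree; memory stays bounded
    return count

doc = io.StringIO("<list>" + "<item/>" * 1000 + "</list>")
print(count_items(doc))  # → 1000
```

The same pattern works on a multi-gigabyte file, since only the element
currently being closed is ever live.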


I thought as much (as Michael Kay confirmed below). That means I could
have a document 5,000 pages long, and XSLT should be able to handle it
easily. 

(I'm making an assumption here. My largest document is 2,000 pages long and
about 2 MB, so a 5,000 page document would easily fall within reason on a
machine with 256 MB of memory. Anything larger than that would probably be
broken up. Actually, even the 2,000 page document should probably be split
up!)

Paul

-- 

************************
*Paul Tremblay         *
*phthenry(_at_)earthlink(_dot_)net*
************************

 XSL-List info and archive:  http://www.mulberrytech.com/xsl/xsl-list