Gunther Schadow wrote:
> Hi, what do you guys do if you have HUGE XML instances that your
> transforms generate? Let's assume for a moment you have a transform
> that replicates the input document 2 billion times, or something
> similarly silly that really creates a HUGE string of output tree.
> Is there anything in Saxon (or other XSLT processors) that would try
> to keep a hold on the whole tree even though it dumps tags to the
> output stream? Are there any switches I need to set to make sure that
> memory for result trees is freed after a result node has been output?
I don't expect this to be a problem for most processors. Just don't try to
read that output document back in as the input to another transformation:
the input side does tend to keep the entire document in memory, often in an
expanded internal form.
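To make the asymmetry concrete, here is a minimal, hypothetical sketch using the standard JAXP API (not any Saxon-specific switch). Writing to a StreamResult lets the processor serialize result nodes to the stream as they are produced, while the input source is still normally parsed into a complete in-memory tree first. The stylesheet and input strings are placeholders, not anyone's actual transform:

```java
import java.io.StringReader;
import java.io.StringWriter;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

public class StreamedTransform {
    // Identity stylesheet standing in for the real (huge-output) transform.
    static final String XSL =
        "<xsl:stylesheet version='1.0' "
      + "xmlns:xsl='http://www.w3.org/1999/XSL/Transform'>"
      + "<xsl:template match='/'><xsl:copy-of select='.'/></xsl:template>"
      + "</xsl:stylesheet>";

    static String run(String inputXml) throws Exception {
        Transformer t = TransformerFactory.newInstance()
            .newTransformer(new StreamSource(new StringReader(XSL)));
        // A StreamResult is serialized as the result tree is produced,
        // so already-written output nodes need not be retained in memory.
        // The *input*, by contrast, is normally built as a full tree first.
        StringWriter out = new StringWriter();
        t.transform(new StreamSource(new StringReader(inputXml)),
                    new StreamResult(out));
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(run("<rows><row>1</row></rows>"));
    }
}
```

If the result instead went into a DOMResult, the whole output tree would live in memory at once, which is exactly what you want to avoid for billions of nodes.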
> I am having a problem with an XSLT-based database dumper that ends
> up hanging after a while. I'm not sure if it is the JDBC driver or
> the server that hangs, but it's possible that Saxon accumulates memory
> allocations. I never get an out-of-memory error ...
More details about exactly what you're running may help here.
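In the meantime, one way to tell whether memory really is accumulating (as opposed to the JDBC driver or the server simply hanging) is to log JVM heap usage periodically while the dump runs. A minimal sketch using the standard Runtime API; the class name is made up for illustration:

```java
public class HeapWatch {
    // Returns currently used heap in bytes. Call this periodically
    // (e.g. from a timer thread, or between result batches) while the
    // dump runs: steadily growing usage that never drops after GC
    // suggests accumulation; flat usage points at a hang elsewhere.
    static long usedHeap() {
        Runtime rt = Runtime.getRuntime();
        return rt.totalMemory() - rt.freeMemory();
    }

    public static void main(String[] args) {
        System.out.println("used heap: " + usedHeap() + " bytes");
    }
}
```

Running the JVM with -verbose:gc gives the same picture without code changes.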
--
--------------------------
Jeff Kenton
DataPower Technology, Inc.
XSL-List info and archive: http://www.mulberrytech.com/xsl/xsl-list