
Re: [xsl] Transforming large XML docs in small amounts of memory

2007-04-30 02:34:36
On 4/30/07, Ronan Klyne <ronan(_dot_)klyne(_at_)groupbc(_dot_)com> wrote:
Hi all,

I am trying to find ways of reducing the memory requirements of our
transforms.
The main factor driving the memory usage up is the size of the input
document (up to about 150 MB), but this is out of our control at this point.
So, the question: Is there anything which can be done (or avoided) in
the XSL to decrease the amount of memory used in the transform?

(I appreciate that this question is very abstract, and I apologise - I'm
mostly fishing for ideas, or a confirmation of my suspicion that not
much can be done...)

Much can be done, but your available options all depend on the
processor and environment you're running in, and on how flexible you
are - is it a pure XSLT 1.0/2.0 solution you're after, or can you use
extensions or modify the processing pipeline?

Also you need to let us know:

- Is the input made up of uniform chunks of data in a single file
(likely if it's a "data-centric" XML file), or does the processing
require access to the whole input for the whole transform?

- What is your current memory usage?  What's the limit, and what
would be an acceptable bound?

- How are you measuring memory usage?  Is it simply the input XML that
is using up all available memory, or do other parts of the pipeline
use a lot of memory too?
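If the answer to the first question is "uniform chunks", and you can
step outside pure XSLT, one common workaround is to stream the input
and handle each chunk separately rather than building the whole tree.
A minimal sketch of that idea using Python's standard-library
iterparse (the element name `record` and the sample input are purely
hypothetical - substitute whatever your data-centric file actually
contains):

```python
import io
import xml.etree.ElementTree as ET

def iter_records(source, tag):
    """Yield each <tag> element from `source` without loading the
    whole document; clear processed elements to bound memory."""
    for event, elem in ET.iterparse(source, events=("end",)):
        if elem.tag == tag:
            yield elem
            elem.clear()  # release the subtree we just handled

# Hypothetical data-centric input: many small, uniform <record> chunks.
sample = io.StringIO(
    "<records>"
    + "".join(f"<record id='{i}'/>" for i in range(3))
    + "</records>"
)

# Each record could be fed to a small per-chunk transform here;
# we just collect an attribute to show the streaming access pattern.
ids = [rec.get("id") for rec in iter_records(sample, "record")]
print(ids)  # ['0', '1', '2']
```

Peak memory then scales with the largest single chunk instead of the
whole 150 MB document, at the cost of losing access to the rest of the
input during each per-chunk transform.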

cheers
andrew

--~------------------------------------------------------------------
XSL-List info and archive:  http://www.mulberrytech.com/xsl/xsl-list
To unsubscribe, go to: http://lists.mulberrytech.com/xsl-list/
or e-mail: <mailto:xsl-list-unsubscribe(_at_)lists(_dot_)mulberrytech(_dot_)com>
--~--