I think that transforming 150MB of data in 400MB of RAM would be a
sensible target (is this sensible?)
That's ambitious. To achieve that, you're going to have to do
something that condenses the input document before transformation.
What would you say was a reasonable target? I expect it will
be dependent on many factors.
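One common way to condense the input is to strip whitespace-only text nodes (indentation and newlines) in a streaming pre-pass, so the tree built later is smaller. This is a minimal sketch of that idea using Python's standard SAX API; it is not from the thread, and how much it saves depends entirely on how much of the 150MB is indentation.

```python
import io
import xml.sax
from xml.sax.saxutils import XMLGenerator

class WhitespaceStripper(xml.sax.ContentHandler):
    """SAX filter that drops whitespace-only text nodes while copying
    everything else through - one cheap way to 'condense' a large
    document before a transformation builds an in-memory tree."""

    def __init__(self, out):
        self._gen = XMLGenerator(out, encoding="utf-8")

    def startDocument(self):
        self._gen.startDocument()

    def endDocument(self):
        self._gen.endDocument()

    def startElement(self, name, attrs):
        self._gen.startElement(name, attrs)

    def endElement(self, name):
        self._gen.endElement(name)

    def characters(self, content):
        # Skip indentation/newline-only text; keep real content.
        if content.strip():
            self._gen.characters(content)

src = "<root>\n  <a>text</a>\n  <b></b>\n</root>"
buf = io.StringIO()
xml.sax.parseString(src.encode("utf-8"), WhitespaceStripper(buf))
print(buf.getvalue())
```

The same filtering could be done with a small identity-transform stylesheet (`xsl:strip-space`), but doing it in a streaming pass keeps the pre-pass itself at constant memory.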
I reckon a factor of 5x is usually achievable - so 750MB. But it does
depend. The biggest variable is the proportion of the 150MB that's taken up
with long element names - sometimes, e.g. in FpML, this is a huge proportion
of the total, in which case you can do much better than 5x, because repeated
names are shared in the in-memory tree rather than stored once per occurrence.
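As a back-of-the-envelope check, the rule of thumb above can be written down directly. The 5x expansion factor is the estimate from this thread, not a guarantee; the function and its default are illustrative only.

```python
def peak_tree_estimate(input_mb, expansion=5.0):
    """Rough estimate of in-memory tree size for an XML document,
    using the ~5x expansion factor suggested in the thread. Highly
    input-dependent: documents dominated by repeated long element
    names (e.g. FpML) typically expand by much less, since names
    are shared in the tree."""
    return input_mb * expansion

print(peak_tree_estimate(150))  # 750.0 - well over the 400MB budget
```

By this estimate, a 150MB input needs roughly 750MB for the tree alone, which is why staying within 400MB requires condensing the input (or streaming) rather than just tuning the transformation.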
Michael Kay
http://www.saxonica.com/
--~------------------------------------------------------------------
XSL-List info and archive: http://www.mulberrytech.com/xsl/xsl-list
To unsubscribe, go to: http://lists.mulberrytech.com/xsl-list/
or e-mail: <mailto:xsl-list-unsubscribe(_at_)lists(_dot_)mulberrytech(_dot_)com>