At 2005-07-11 11:43 +0200, Hondros, Constantine wrote:
I am pre-processing batches of about 1000 XML files at a time using Saxon.
Part of the pre-process involves aggregating linked XML documents into the
current document. Naturally, I use the document() function for this:
...
How would you optimise this? Would a deep-copy with <xsl:copy-of> be faster?
Or am I better off writing my own processor for this aggregating step (easy
enough).
Since the XSLT processor is obliged to keep every document() node tree
around somehow (it cannot know when you might need a given tree again
later in the transform, and any generated identifiers for those nodes
must remain stable for the duration of the transform), I would recommend
a simple SAX process for this kind of aggregation. Not only would it be
faster (no node tree to build), it would also have a very small memory
footprint (no input document needs to be kept around).
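A minimal sketch of such a SAX aggregator using only JAXP, assuming the
documents to merge are known up front (the wrapper element name "batch"
and the sample inputs are illustrative, not from the original post). An
identity TransformerHandler serves as the serializer, and a filter
suppresses the inner startDocument/endDocument events so each input is
spliced inside one wrapper element:

```java
import java.io.StringReader;
import java.io.StringWriter;
import javax.xml.parsers.SAXParserFactory;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.sax.SAXTransformerFactory;
import javax.xml.transform.sax.TransformerHandler;
import javax.xml.transform.stream.StreamResult;
import org.xml.sax.InputSource;
import org.xml.sax.XMLReader;
import org.xml.sax.helpers.AttributesImpl;
import org.xml.sax.helpers.XMLFilterImpl;

public class SaxAggregator {
    public static String aggregate(String... documents) throws Exception {
        // An identity TransformerHandler acts as a SAX-to-text serializer.
        SAXTransformerFactory stf =
            (SAXTransformerFactory) TransformerFactory.newInstance();
        TransformerHandler serializer = stf.newTransformerHandler();
        StringWriter out = new StringWriter();
        serializer.setResult(new StreamResult(out));

        // Open one wrapper element around all inputs.
        serializer.startDocument();
        serializer.startElement("", "batch", "batch", new AttributesImpl());

        SAXParserFactory spf = SAXParserFactory.newInstance();
        spf.setNamespaceAware(true);
        for (String doc : documents) {
            XMLReader reader = spf.newSAXParser().getXMLReader();
            // Forward every event except start/endDocument, so each
            // parsed document lands inside the wrapper element.
            XMLFilterImpl filter = new XMLFilterImpl(reader) {
                @Override public void startDocument() { }
                @Override public void endDocument() { }
            };
            filter.setContentHandler(serializer);
            filter.parse(new InputSource(new StringReader(doc)));
        }

        serializer.endElement("", "batch", "batch");
        serializer.endDocument();
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(aggregate("<a>1</a>", "<b>2</b>"));
    }
}
```

No tree is ever built: events stream from the parser, through the
filter, straight to the serializer, so memory use stays flat no matter
how many documents are merged.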
Some XSLT processors will let you feed your output SAX events directly
in as the input of an XSLT transform, in which case you wouldn't even
have an aggregated file sitting around on disk to worry about.
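With JAXP, for instance, SAXTransformerFactory can hand you a
TransformerHandler for a stylesheet; pushing the aggregation events into
that handler runs the transform with no intermediate file. A small
sketch, with a deliberately trivial inline stylesheet (the element names
and stylesheet are illustrative only):

```java
import java.io.StringReader;
import java.io.StringWriter;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.sax.SAXTransformerFactory;
import javax.xml.transform.sax.TransformerHandler;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;
import org.xml.sax.helpers.AttributesImpl;

public class SaxIntoXslt {
    // A trivial stylesheet that just counts the items it receives.
    private static final String XSL =
        "<xsl:stylesheet version='1.0'"
      + " xmlns:xsl='http://www.w3.org/1999/XSL/Transform'>"
      + " <xsl:output method='text'/>"
      + " <xsl:template match='/batch'>"
      + "  <xsl:value-of select='count(item)'/>"
      + " </xsl:template>"
      + "</xsl:stylesheet>";

    public static String run(int items) throws Exception {
        SAXTransformerFactory stf =
            (SAXTransformerFactory) TransformerFactory.newInstance();
        // The handler *is* the transform: SAX events in, result out.
        TransformerHandler th =
            stf.newTransformerHandler(new StreamSource(new StringReader(XSL)));
        StringWriter out = new StringWriter();
        th.setResult(new StreamResult(out));

        // Push synthetic aggregation events straight into the stylesheet;
        // a real aggregator would forward its parser events here instead.
        AttributesImpl noAttrs = new AttributesImpl();
        th.startDocument();
        th.startElement("", "batch", "batch", noAttrs);
        for (int i = 0; i < items; i++) {
            th.startElement("", "item", "item", noAttrs);
            th.endElement("", "item", "item");
        }
        th.endElement("", "batch", "batch");
        th.endDocument();
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(run(3));
    }
}
```

The aggregated "document" here exists only as a stream of events; the
stylesheet's source tree is built by the processor itself, and nothing
is ever written to disk between the aggregation and the transform.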
I hope this helps.
. . . . . . . . . Ken
--
World-wide on-site corporate, govt. & user group XML/XSL training.
G. Ken Holman mailto:gkholman@CraneSoftwrights.com
Crane Softwrights Ltd. http://www.CraneSoftwrights.com/s/
Box 266, Kars, Ontario CANADA K0A-2E0 +1(613)489-0999 (F:-0995)
Male Breast Cancer Awareness http://www.CraneSoftwrights.com/s/bc
Legal business disclaimers: http://www.CraneSoftwrights.com/legal