Hi all,
I'm trying to create a reference table for a ridiculously large document. The
resulting table is so huge that it's causing Java out-of-memory errors when I
try to churn it into XML, because of the degree of recursion in the
table-processing scripts. So I'm trying to read all of the nodes into a
node-set and process them out into a series of 100-row tables.
I'm sure there's a simple way of doing this; I'm also sure that I've
abjectly failed to find it. Are there any good examples of processing
node-sets in batches out there? The main issue I'm having is closing off a
table at regular intervals and starting a new one; all of my tables currently
end up nested within each other.
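To illustrate, here's the shape of what I'm after — just a sketch, assuming the rows are sibling <row> elements under a <data> root (all names invented), using position() mod 100 to start each batch:

```xml
<!-- Sketch only: element names (data, row) are placeholders for my source. -->
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:template match="/data">
    <!-- Take every 100th row as the first row of a new batch... -->
    <xsl:for-each select="row[position() mod 100 = 1]">
      <table>
        <!-- ...then emit it together with the 99 rows that follow it,
             so each <table> is closed before the next one opens. -->
        <xsl:for-each select=". | following-sibling::row[position() &lt; 100]">
          <tr><td><xsl:value-of select="."/></td></tr>
        </xsl:for-each>
      </table>
    </xsl:for-each>
  </xsl:template>
</xsl:stylesheet>
```

The point being that each batch is driven by iteration over the batch-leader nodes rather than by recursion, so the tables come out as siblings instead of nested.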
Cheers,
Jeff.
--~------------------------------------------------------------------
XSL-List info and archive: http://www.mulberrytech.com/xsl/xsl-list
To unsubscribe, go to: http://lists.mulberrytech.com/xsl-list/
or e-mail: <mailto:xsl-list-unsubscribe@lists.mulberrytech.com>
--~--