Any XSLT processor is going to struggle with 160 MB of input, though it is
just about feasible if you have masses of memory.
What is the nature of the transformation? Can it be broken up into a
series of smaller transformations, each of which deals with one subtree?
If so, you could break up the document using a SAX filter, or use
Saxon's saxon:preview mechanism.
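As a rough sketch of the SAX-filter idea (class, element, and stylesheet names here are purely illustrative, not part of any existing API): a ContentHandler can buffer one top-level subtree at a time and run the stylesheet against each subtree independently, so the whole 160 MB document is never in memory at once.

```java
import java.io.*;
import javax.xml.parsers.*;
import javax.xml.transform.*;
import javax.xml.transform.stream.*;
import org.xml.sax.*;
import org.xml.sax.helpers.DefaultHandler;

// Hypothetical example: transform each child of the root element
// separately, keeping only one subtree in memory at a time.
public class SubtreeSplitter extends DefaultHandler {
    private final Templates templates;  // compiled stylesheet, reused per subtree
    private final Writer out;
    private StringBuilder buf;          // serialization of the current subtree
    private int depth = 0;

    public SubtreeSplitter(Templates templates, Writer out) {
        this.templates = templates;
        this.out = out;
    }

    @Override public void startElement(String uri, String local,
                                       String qName, Attributes atts) {
        depth++;
        if (depth == 2) buf = new StringBuilder();  // a new top-level subtree begins
        if (depth >= 2) {
            buf.append('<').append(qName);
            for (int i = 0; i < atts.getLength(); i++)
                buf.append(' ').append(atts.getQName(i)).append("=\"")
                   .append(atts.getValue(i).replace("&", "&amp;")
                                           .replace("\"", "&quot;"))
                   .append('"');
            buf.append('>');
        }
    }

    @Override public void characters(char[] ch, int start, int len) {
        if (depth >= 2)
            buf.append(new String(ch, start, len)
                .replace("&", "&amp;").replace("<", "&lt;"));
    }

    @Override public void endElement(String uri, String local, String qName)
            throws SAXException {
        if (depth >= 2) buf.append("</").append(qName).append('>');
        if (depth == 2) {               // subtree complete: transform it now
            try {
                Transformer t = templates.newTransformer();
                t.setOutputProperty(OutputKeys.OMIT_XML_DECLARATION, "yes");
                t.transform(new StreamSource(new StringReader(buf.toString())),
                            new StreamResult(out));
            } catch (TransformerException e) {
                throw new SAXException(e);
            }
        }
        depth--;
    }

    public static void main(String[] args) throws Exception {
        // Toy stylesheet and input, standing in for the real 160 MB file.
        String xslt = "<xsl:stylesheet version='1.0' "
            + "xmlns:xsl='http://www.w3.org/1999/XSL/Transform'>"
            + "<xsl:template match='/record'>"
            + "<out><xsl:value-of select='.'/></out>"
            + "</xsl:template></xsl:stylesheet>";
        Templates tmpl = TransformerFactory.newInstance()
            .newTemplates(new StreamSource(new StringReader(xslt)));
        StringWriter out = new StringWriter();
        String xml = "<root><record>a</record><record>b</record></root>";
        SAXParserFactory.newInstance().newSAXParser().parse(
            new InputSource(new StringReader(xml)),
            new SubtreeSplitter(tmpl, out));
        System.out.println(out);
    }
}
```

Note this only works when the stylesheet never needs to look across subtree boundaries; each <record> must be transformable in isolation.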
If the transformation really requires direct access to the whole
document, you should be thinking about putting the data into a database,
for example an XML database such as Software AG's Tamino.
Michael Kay
Software AG
home: Michael.H.Kay@ntlworld.com
work: Michael.Kay@softwareag.com
-----Original Message-----
From: owner-xsl-list@lists.mulberrytech.com
[mailto:owner-xsl-list@lists.mulberrytech.com] On Behalf Of
Jan Mendling
Sent: 11 February 2003 17:35
To: xsl-list@lists.mulberrytech.com
Subject: [xsl] HUGE xml input files
Hello to all!
I have problems with XSL and huge XML input files; hopefully
some of you have solved similar problems before ;-) I am using
Xalan for XSL transformations. My input file is 160 MB in size.
This causes trouble, as I receive java.lang.OutOfMemoryError
errors. I have already tried (without success) increasing the
Java VM heap size up to 5 GB.
Does anyone know what I can do to run that XSL transformation
with Xalan, or does anyone know another XSLT processor that is
better able to handle such huge input files? Thank you
in advance,
Jan Mendling
--
~~~~~~~~~~~~~
~ Jan Mendling
~ Güterstr.53
~ 54295 Trier
~~~~~~~~~~~~~
XSL-List info and archive: http://www.mulberrytech.com/xsl/xsl-list