RE: performance issues saxon

2003-02-17 05:34:10
As I replied to another post on this subject today, some of your options
are:

(a) buy lots of memory (at least 10 times the source file size), and
carefully configure the JVM to make sure it is being used (see the
example after this list)

(b) use a SAX filter to break the document up into small pieces before
transformation (a sketch in Java follows this list)

(c) use saxon:preview to transform the document one piece at a time

(d) load the data into an XML or SQL database

(e) use STX
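
For (a), note that the JVM's default heap ceiling is quite small; you
raise it with the -Xmx flag. By the 10x rule of thumb above, a 250MB
source would need something like the following (the class name is Saxon
6.5's command-line entry point; adjust it for your version):

    java -Xmx2500m com.icl.saxon.StyleSheet source.xml sheet.xsl >out.xml

For (b), below is a minimal sketch of such a filter using SAX2 and JAXP.
It is illustrative only: "record" stands in for whatever element repeats
in your document, the output file names are invented, and it assumes a
TrAX processor such as Saxon is the default TransformerFactory on the
classpath. Each repeating element is fed to the stylesheet as a little
document of its own, so only one piece is ever in memory at a time:

    import java.io.File;

    import javax.xml.parsers.SAXParserFactory;
    import javax.xml.transform.Templates;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.sax.SAXTransformerFactory;
    import javax.xml.transform.sax.TransformerHandler;
    import javax.xml.transform.stream.StreamResult;
    import javax.xml.transform.stream.StreamSource;

    import org.xml.sax.Attributes;
    import org.xml.sax.SAXException;
    import org.xml.sax.XMLReader;
    import org.xml.sax.helpers.XMLFilterImpl;

    public class ChunkingFilter extends XMLFilterImpl {

        // cast assumes the JAXP factory supports SAX (Saxon's does)
        private final SAXTransformerFactory stf =
            (SAXTransformerFactory) TransformerFactory.newInstance();
        private final Templates templates; // compiled stylesheet, reused per piece
        private TransformerHandler piece;  // non-null while inside a "record"
        private int depth = 0;             // element nesting within the current piece
        private int count = 0;             // invents an output file name per piece

        public ChunkingFilter(Templates templates) {
            this.templates = templates;
        }

        public void startElement(String uri, String local, String qName,
                                 Attributes atts) throws SAXException {
            if (piece == null && "record".equals(qName)) { // hypothetical name
                try {
                    piece = stf.newTransformerHandler(templates);
                    piece.setResult(new StreamResult(
                        new File("piece-" + (count++) + ".xml")));
                    piece.startDocument();
                } catch (javax.xml.transform.TransformerConfigurationException e) {
                    throw new SAXException(e);
                }
            }
            if (piece != null) {
                depth++;
                piece.startElement(uri, local, qName, atts);
            }
        }

        public void characters(char[] ch, int start, int len)
                throws SAXException {
            if (piece != null) piece.characters(ch, start, len);
        }

        public void endElement(String uri, String local, String qName)
                throws SAXException {
            if (piece != null) {
                piece.endElement(uri, local, qName);
                if (--depth == 0) { // piece complete: run it and discard it
                    piece.endDocument();
                    piece = null;
                }
            }
        }

        public static void main(String[] args) throws Exception {
            SAXTransformerFactory stf =
                (SAXTransformerFactory) TransformerFactory.newInstance();
            Templates templates =
                stf.newTemplates(new StreamSource(new File(args[1])));

            SAXParserFactory spf = SAXParserFactory.newInstance();
            spf.setNamespaceAware(true); // TrAX needs namespace-aware events
            XMLReader reader = spf.newSAXParser().getXMLReader();

            ChunkingFilter filter = new ChunkingFilter(templates);
            filter.setParent(reader);
            filter.parse(args[0]);       // the huge source document
        }
    }

One caveat: the stylesheet must be written against the shape of a single
piece rather than the whole document, and anything that needs to see two
pieces at once (keys or sorting across the whole file) won't work with
this approach. Namespace declarations in scope above the repeating
element are also not replayed into each piece in this sketch.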

Many people have found that saxon:preview works well in this situation.
It's not a feature I'm very fond of (it's rather fragile if you try to
do anything too clever with it), but it does enable you to process large
documents using small amounts of memory, without learning how to write
in Java.

Michael Kay
Software AG
home: Michael.H.Kay@ntlworld.com
work: Michael.Kay@softwareag.com

-----Original Message-----
From: owner-xsl-list@lists.mulberrytech.com
[mailto:owner-xsl-list@lists.mulberrytech.com] On Behalf Of
Vasu Chakkera
Sent: 17 February 2003 11:37
To: xsl-list@lists.mulberrytech.com
Subject: [xsl] performance issues saxon


Hi all,
I have a bit of a problem running Saxon on my XML, which is as
huge as 250MB (monster markup language :) ). The transformer
fails as it runs out of memory. Are there any suggestions for
situations like this? The XML is designed by a different team,
and I want to look into it to see if there are any ways of
optimising. I also looked at the -X options of java to get
round the out-of-memory exception. While I do this, it would
be quite helpful if the Gurus here could let me know some tips
on how to deal with situations like this. I am looking at ways
to reduce the size of the file at the moment.
Thanks a lot,
Vasu

 XSL-List info and archive:  http://www.mulberrytech.com/xsl/xsl-list