Kevin Jones wrote:
Thanks to the people who responded to my original post.
I have to say I was a little disappointed/puzzled by the lack of responses. I
am not sure why, but I can only assume there is either little interest in
transforms over a few hundred MB in size, or little interest in discussing the
problems this presents.
It is not often I get the chance to discuss stuff we are doing on the
processor, for obvious commercial reasons. I thought this subject might be
suitable for a wider audience, as anecdotal evidence suggests that large XML
file usage is growing, and along with that comes some expectation that the
tools can cope.
I use transforms on large data files (~50-100MB, perhaps more) using
XSLT fairly often for data conversions.
I have a web/database application which can ingest objects represented
in XML. This makes my life simpler: I only have to transform someone
else's XML export (which, in some cases, can be X00,000 records) into my
representation for import.
It's been a flexible solution for me thus far, but I haven't needed any
really complex processing yet.
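For the curious, that kind of streaming conversion can be sketched in a few lines. This is a hypothetical example, not the poster's actual setup: the element names (`record`, `item`) and the output shape are assumptions. The point is that `iterparse` hands you each record as its end tag arrives, so memory stays flat even on multi-hundred-MB exports:

```python
import io
import xml.etree.ElementTree as ET
from xml.sax.saxutils import quoteattr

def convert(source, out):
    """Stream <record> elements from `source`, writing <item> elements to `out`.

    Hypothetical element names; a real conversion would map many fields.
    """
    out.write('<import>\n')
    # iterparse yields each element as its end tag is seen, so the whole
    # input tree is never held in memory at once.
    for _event, elem in ET.iterparse(source, events=('end',)):
        if elem.tag == 'record':
            out.write('  <item name=%s/>\n' % quoteattr(elem.findtext('name', '')))
            elem.clear()  # drop the processed subtree to keep memory flat
    out.write('</import>\n')

# Tiny demonstration input; a real export would be streamed from disk.
src = io.StringIO('<export><record><name>a</name></record>'
                  '<record><name>b</name></record></export>')
dst = io.StringIO()
convert(src, dst)
result = dst.getvalue()
```

The `elem.clear()` call is the piece that matters at the sizes discussed here; without it, `iterparse` still builds the full tree behind your back.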
Makes me wonder if you could use some simple database (MySQL or
PostgreSQL?) as a backend for the project you're intending.
i.e., scan the structure of the stylesheet and the XML doc, create some
temporary tables to hold the overly large XML document, and then
transform the stylesheet into a set of calls to the database... uggg...
makes my mind dizzy just thinking about it...
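The first half of that idea (shredding the big document into tables so later steps become SQL) is less dizzying than the stylesheet-to-SQL compilation part. A minimal sketch, with a made-up `records` schema and assumed element names, using SQLite for the demonstration:

```python
import io
import sqlite3
import xml.etree.ElementTree as ET

def shred(source, conn):
    """Load each <record> of a (possibly huge) XML stream into a SQLite row.

    Hypothetical schema; a real tool would derive the tables from the
    document structure.
    """
    conn.execute('CREATE TABLE IF NOT EXISTS records (name TEXT, value TEXT)')
    for _event, elem in ET.iterparse(source, events=('end',)):
        if elem.tag == 'record':
            conn.execute('INSERT INTO records (name, value) VALUES (?, ?)',
                         (elem.findtext('name'), elem.findtext('value')))
            elem.clear()  # free each record once it is in the database
    conn.commit()

# Demonstration: a sort that xsl:sort would do in memory becomes an
# ORDER BY that the database can handle on disk.
conn = sqlite3.connect(':memory:')
src = io.StringIO('<export>'
                  '<record><name>b</name><value>2</value></record>'
                  '<record><name>a</name><value>1</value></record>'
                  '</export>')
shred(src, conn)
names = [row[0] for row in conn.execute('SELECT name FROM records ORDER BY name')]
```

The hard part the email alludes to is the other direction: compiling an arbitrary stylesheet's template matches and axes into queries, which is where the dizziness comes in.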
Guess I need to learn more about the guts of XSLT engines.
Regards,
Kev
--+------------------------------------------------------------------
XSL-List info and archive: http://www.mulberrytech.com/xsl/xsl-list
To unsubscribe, go to: http://lists.mulberrytech.com/xsl-list/
or e-mail: <mailto:xsl-list-unsubscribe(_at_)lists(_dot_)mulberrytech(_dot_)com>
--+--