xsl-list

Memory problem when tokenizing big data

2006-01-10 07:30:00
Thanks for your reply to my prior question about breaking down strings.

Now I am trying to use str:tokenize to break down some big data.

The big input data looks like this:

         <textdata sep=" &#x000A;&#x000D;">
           5.1 4.9 4.7 4.6 5 5.4 4.6 5 4.4 4.9
           ...
           ...
         </textdata>
           ...
           ...
         <textdata sep=" &#x000A;&#x000D;">
           5.1 4.9 4.7 4.6 5 5.4 4.6 5 4.4 4.9
           ...
           ...
         </textdata>

and my XSL template looks like this:

 <xsl:template match="textdata">
   <data>
   <xsl:for-each select="str:tokenize(.,' &#x000A;&#x000D;')">
     <e>
     <xsl:value-of select="."/>
     </e>
   </xsl:for-each>
   </data>
 </xsl:template>

The textdata elements can be very big. My question is: will str:tokenize have problems handling big data? If yes, how big is the data that str:tokenize can handle? When I ran the transformation in JBuilder, it reported something like '10mb heap left'.
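(For context: EXSLT's str:tokenize has to materialize every token as a node in a result node-set before the for-each can iterate over it, so memory use grows with the number of tokens, on top of the in-memory source tree. A rough Java-side sketch of that difference, not your XSLT processor's actual implementation: counting tokens one at a time with java.util.StringTokenizer keeps memory flat, whereas collecting them into a list, as str:tokenize effectively must, allocates one object per token.)

```java
import java.util.ArrayList;
import java.util.List;
import java.util.StringTokenizer;

public class TokenizeSketch {
    // Streaming: visits each token without keeping them all in memory.
    static int countTokens(String text) {
        StringTokenizer st = new StringTokenizer(text, " \n\r");
        int n = 0;
        while (st.hasMoreTokens()) {
            st.nextToken();
            n++;
        }
        return n;
    }

    // Materializing: one String object per token, roughly what a
    // tokenize extension must build as a node-set.
    static List<String> collectTokens(String text) {
        List<String> tokens = new ArrayList<>();
        StringTokenizer st = new StringTokenizer(text, " \n\r");
        while (st.hasMoreTokens()) {
            tokens.add(st.nextToken());
        }
        return tokens;
    }

    public static void main(String[] args) {
        String data = "5.1 4.9 4.7 4.6 5 5.4 4.6 5 4.4 4.9";
        System.out.println(countTokens(data));          // prints 10
        System.out.println(collectTokens(data).size()); // prints 10
    }
}
```

If the node-set itself is the problem, raising the JVM heap (e.g. passing -Xmx to the VM that JBuilder launches) is usually the quickest workaround.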

Thanks a lot.
Richard

--~------------------------------------------------------------------
XSL-List info and archive:  http://www.mulberrytech.com/xsl/xsl-list
To unsubscribe, go to: http://lists.mulberrytech.com/xsl-list/
or e-mail: <mailto:xsl-list-unsubscribe(_at_)lists(_dot_)mulberrytech(_dot_)com>
--~--


