
Re: [xsl] Splitting file into N-sized chunks

2009-08-09 20:07:45
Michael Kay wrote:
I suspect that level of accuracy isn't needed. A heuristic that says 500Kb
of serialized XHTML = 250K characters in text nodes is probably quite
adequate for the purpose.

Indeed, I've produced .epubs with thousands of chunks of minimal size
(less than 5 kBytes), and they work well. (Smaller chunks speed up
page turning, and the end of a chunk forces a page break.)

The real challenge was to keep these chunks in sync with the EPUB's .opf
and .ncx files, where every chunk must be registered, and to handle
internal links (consider <a href="#something">...</a>) and other
references such as footnotes. I'm afraid there is no simple answer to
how to split a file for EPUB, because it depends to some extent on the
input and affects other parts of the workflow.
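
For the internal links, one possible approach (again only a sketch,
added to the same stylesheet as above and reusing its $chunk-chars
parameter, bucket formula and invented chunk-*.xhtml names) is to
recompute, for each link target, which chunk its enclosing top-level
block will land in:

  <!-- id lookup that does not rely on DTD-declared ID attributes -->
  <xsl:key name="by-id" match="*[@id]" use="@id"/>

  <!-- rewrite same-document fragment links so they point at the
       chunk file that will contain the link target -->
  <xsl:template match="xhtml:a[starts-with(@href, '#')]">
    <xsl:variable name="target"
        select="key('by-id', substring(@href, 2))[1]"/>
    <!-- the top-level block (child of body) holding the target -->
    <xsl:variable name="block"
        select="$target/ancestor-or-self::*[parent::xhtml:body]"/>
    <!-- same bucket formula as in the splitting template -->
    <xsl:variable name="chunk" select="xs:integer(floor(
        sum(($block/preceding-sibling::*, $block)/string-length(.))
        div $chunk-chars))"/>
    <a href="chunk-{$chunk}.xhtml{@href}">
      <xsl:apply-templates select="@* except @href, node()"/>
    </a>
  </xsl:template>

Footnotes and the .opf/.ncx registration need the same kind of
bookkeeping, which is where it stops being simple.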

Stefan


