[texhax] using larger units to determine breaks?
D. R. Evans
doc.evans at gmail.com
Wed May 6 16:37:00 CEST 2009
William Adams said the following at 05/06/2009 06:06 AM:
>
>> It's probably w/in reach of modern equipment, but it won't be a fast
>> typesetting run.
>
Yes, I'd realised that when I asked the question :-)
However, I remember the days when it took several seconds to typeset a
single page, and if the technique were very slow I wouldn't expect to need
it until very late in the process, so I wouldn't really mind if it took a
few minutes to typeset a book. (On my main computer here, typesetting a
600-page novel, even with pdfTeX's font expansion in use, takes a little
more than a second, so even if an improved breaking algorithm took 100
times as long, it would still be faster than Ye Olde Dayes; I used to quite
enjoy watching the slow progress as the [n] \count0 markers appeared one by
one --- a visceral pleasure that modern equipment denies me :-) )
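(Rough arithmetic, taking "several seconds" to mean about 5 s/page: the old
runs were on the order of 5 s x 600 pages = 3000 s, i.e. 50 minutes for the
book, whereas 100 x ~1 s is under two minutes, so the margin is
comfortable.)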
>
> I'm beginning to suspect that the above statement was premature, but
> would love to be proven wrong, esp. by a working algorithm.
I wouldn't be surprised if a perfect algorithm doesn't exist, or in theory
takes infinite time; but if a nearly-perfect algorithm fixed 90% of the
current bad breaks while running 100 times more slowly than current TeX, I
would still regard that as a worthwhile improvement.
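Just to make the shape of the problem concrete, here is a minimal sketch of
"globally optimal breaking over larger units" --- the same
dynamic-programming idea that Knuth-Plass uses for line breaks, applied one
level up to whole pages. Everything in it is invented for illustration (the
squared-deviation cost standing in for TeX's badness/demerits, the hard
cut-off on overfull pages, the sample heights); it is not TeX's actual page
builder:

# Sketch: globally optimal page breaking by dynamic programming.
# All numbers and the cost function are illustrative stand-ins; real
# glue has stretch and shrink, and real demerits are subtler.

def optimal_page_breaks(line_heights, page_height):
    """Return the indices at which new pages should start so that the
    total cost over all pages is minimal.  Cost of a page is the squared
    deviation of its height from the target (a stand-in for badness).
    Assumes no single line is taller than the page."""
    n = len(line_heights)
    INF = float("inf")
    best = [INF] * (n + 1)   # best[i]: minimal cost of breaking lines[0:i]
    best[0] = 0.0
    prev = [0] * (n + 1)     # back-pointers to recover the breaks

    for i in range(1, n + 1):
        height = 0.0
        # Consider every feasible start j for the page ending at line i-1.
        for j in range(i - 1, -1, -1):
            height += line_heights[j]
            if height > page_height:   # overfull: stop looking further back
                break
            cost = (page_height - height) ** 2
            if best[j] + cost < best[i]:
                best[i] = best[j] + cost
                prev[i] = j

    # Walk the back-pointers to recover where each page starts.
    breaks, i = [], n
    while i > 0:
        breaks.append(prev[i])
        i = prev[i]
    return list(reversed(breaks))

if __name__ == "__main__":
    heights = [12, 12, 14, 12, 12, 12, 13, 12, 12, 12, 12, 14]
    print(optimal_page_breaks(heights, 40))
    # -> [0, 3, 6, 9]: four pages of heights 38, 36, 37, 38

The double loop is O(n^2) in the worst case, though the early cut-off keeps
it nearly linear here; a real version, juggling glue with stretch and
shrink and interacting with the line breaker, would cost much more.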
Doc
--
Web: http://www.sff.net/people/N7DR