Re: Rational

From: metachromatic
Subject: Re: Rational
Date: Tue, 22 May 2018 22:34:39 -0700

  To get back to practicalities, here is an absolutely minimal example
of the alleged problems with using rationals as internal
representations of durations/positions in Lilypond.

  Please note that I use the term "alleged problems" because it might
not be clear to everyone that this internal integer representation
issue critically affects real music in the real world in a crippling
way. With any luck, this elementary example should illustrate

  1) that the internal use of rationals in Lilypond is a genuine
problem, not some deluded fantasy;

   2) that the use of rationals internally in Lilypond isn't some airy
abstract issue of concern only to the most abstruse programming
honchos, but instead blows away actual composers working with real
music in the real world, and does it in a brutal way. And I do mean
brutal. As in "sorry, you can't enter more than 7 tuplets without
Lilypond blowing up and crashing" depending on the tuplets.

   Here's some dead-simple Lilypond code that illustrates the problem:


\version "2.18.2"

\score {


\new Staff {
    \clef "treble"
    \override TupletNumber.text = #tuplet-number::calc-fraction-text
    \override Staff.TimeSignature #'stencil = ##f
    \omit Score.BarLine
\relative c'' {
\tuplet 53/37 { r4 }
\tuplet 43/29 { c8 }
\tuplet 3/2 { d8 }
\tuplet 19/13 { e8[ d8 c8] }
\tuplet 11/7 { b4. }
\tuplet 17/13 { g8 }
\tuplet 61/47 { r4 }
%\tuplet 31/23 { a8 }
%\tuplet 89/79 { b4 }
%\tuplet 97/41 { r4 }
}
}
}

   "So what's wrong?" you say. "What's the problem?"

   Here's the problem: uncomment the third to last tuplet and rerun Lilypond.

   Then uncomment the second to last tuplet and rerun Lilypond.

   Now uncomment the last tuplet and rerun Lilypond.

   Attached as a PNG is the output for:
   A) last 3 tuplets commented out.

   B) last 2 tuplets commented out.

   C) no tuplets commented out.

   Let's assume that I'm a moron with a room-temperature IQ and, as
our friend Kieran McMullen has remarked, "You have no idea what you're
talking about." Fine, assume I'm not only dumb as a box of rocks, but
also hopelessly ignorant.

   So since we can't take my word for it, how do we actually _know_
that the internal exact-rational (integer-ratio) representation is
making Lilypond blow up when it tries to generate a score with more
than the first 7 tuplets in the above example?

   Well, Lilypond actually gives us information about the internal
representation of durations/positions of the notes. It does this when
you position the cursor in front of the various tuplets, and before
you run Lilypond to produce the score.

  Placing the cursor in front of the 7th tuplet \tuplet 61/47{r4}
shows the position Lilypond has assigned to that note: a rational of
roughly 115 million over 97 million.

  Now let's uncomment the 8th tuplet \tuplet 31/23{a8} and place the
cursor in front of it, without running Lilypond to generate a score.
What does Lilypond say the position of that note is? Roughly 203
million over 1.4 billion.

   Now, why did Lilypond's internal position jump to such large
numbers merely by uncommenting one measly little tuplet?

   Because the tuplet ratios are built from distinct prime numbers.
Evidently, no programmer ever imagined any composer would want to use
a bunch of tuplets whose ratios are prime. (Why not? Well...further,
deponent sayeth not.)  To generate an internal position for the notes,
Lilypond clearly uses some kind of Least Common Multiple computation.
Trouble is, when the numerators and denominators of your tuplets are
distinct primes, that LCM is simply their product, and it grows _very_
fast. In fact, the LCM for a set of tuplets built from distinct primes
grows even faster than the factorial function...and that's pretty
fast.
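To watch the denominators blow up, here is a minimal Python sketch (not Lilypond's actual code) that accumulates the first seven tuplet durations from the example as exact fractions, on the assumption that \tuplet n/m scales the written durations by m/n:

```python
from fractions import Fraction

# Durations of the first seven tuplet entries in the example above,
# assuming standard \tuplet n/m semantics (written duration times m/n):
durations = [
    Fraction(1, 4) * Fraction(37, 53),   # \tuplet 53/37 { r4 }
    Fraction(1, 8) * Fraction(29, 43),   # \tuplet 43/29 { c8 }
    Fraction(1, 8) * Fraction(2, 3),     # \tuplet 3/2  { d8 }
    Fraction(3, 8) * Fraction(13, 19),   # \tuplet 19/13 { e8 d8 c8 }
    Fraction(3, 8) * Fraction(7, 11),    # \tuplet 11/7 { b4. }
    Fraction(1, 8) * Fraction(13, 17),   # \tuplet 17/13 { g8 }
    Fraction(1, 4) * Fraction(47, 61),   # \tuplet 61/47 { r4 }
]

position = Fraction(0)
for d in durations:
    position += d
    print(position)   # the denominator picks up every new prime
```

After only seven tuplets the running position already has a denominator in the billions.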

  So the problem Lilypond runs into is that after the first 7 tuplets
it must already work over a common multiple of

  61*53*47*43*37*29*19*17*13*11*7*3*2 = 1.3600648 times 10^16.

  You can see the problem, and it only gets worse. Uncommenting the
8th tuplet multiplies in 31*23, for roughly

  9.6973 times 10^18.

  The 9th tuplet multiplies in 89*79, for roughly

  6.8181 times 10^22.

  And the 10th multiplies in 97*41, for roughly

  2.7116 times 10^26.

  And so it goes, as Vonnegut was wont to say.  Bottom line?  Using
different prime numbers in both numerator and denominator of your
tuplets very quickly runs Lilypond out of integer headroom to
represent the notes internally.
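The LCM arithmetic is easy to verify with a few lines of Python (a throwaway check, nothing to do with Lilypond's internals):

```python
from math import lcm, prod

# The thirteen distinct primes appearing in the first seven tuplet
# ratios of the example (53/37, 43/29, 3/2, 19/13, 11/7, 17/13, 61/47):
primes = [2, 3, 7, 11, 13, 17, 19, 29, 37, 43, 47, 53, 61]

# For distinct primes, the LCM is simply their product:
assert lcm(*primes) == prod(primes)

print(prod(primes))  # 13600647672894282, i.e. about 1.36 x 10^16
```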

  How quickly is "very quickly"?

  7 tuplets.

  Count 'em, ladies and gents: 7 (that's seven) tuplets.

  Now, call me overly picky, but it seems to me that in the 21st
century, with multicore CPUs and 16 gigs of RAM and a modern language
like Scheme and expert programmers, we really seriously ought to be
able to do better with a program like Lilypond than....

   ...Blowing up if the program tries to generate a score with more
than seven tuplets.

   I mean...folks. Seriously?  Really?  "No, no, you spoiled-rotten
users, you foolish musicians!  Expecting Lilypond to generate a score
with more than 7 tuplets is simply too much to ask!"



   Expecting a computer notation program to generate a score for more
than 7 tuplets does not, in all honesty, seem like asking for the
world.  Solving NP problems in polynomial time?  Sure, that's
unreasonable.  True hard AI?  Yes, that seems unrealistic.  Parsing
real-world literature and summarizing its semantic meaning in a way
that makes sense to humans?  That surely does seem (to me, just
speaking for myself here) like it's asking too much given the current
state of computer science.

   But the current state of computer science really seriously ought to
allow Lilypond to generate a score for more than 7 (seven) tuplets.

   Am I naïve or stupid or so hopelessly clueless that I'm blithely
exhibiting a classic case of the Dunning-Kruger Effect?  Well, you
know, that might be. Maybe it really is asking too much for a program
like Lilypond to generate a score with more than 7 tuplets.

   But boy, it's just really hard for me to believe that.  Maybe
that's my ignorance or my stupidity talking, but it just seems like it
ought to be possible for Lilypond to generate a score even if the
input has (blue skying it here, just wild crazy possibilities), oh, 8
tuplets. Or maybe even 9 tuplets, or (gasp!) 10 tuplets.

   Moreover, it cannot have escaped anyone's notice how this could be
done. The internal representation Lilypond uses is an integer. Fine,
that's limited. Don't change that -- it's too much work! Instead, just
test that integer internally. When it threatens to get too large, you
invoke Euclid's algorithm and approximate the large-integer ratio with
a smaller integer ratio that falls within Lilypond's bounds.  The nice
thing about Euclid's algorithm is that you can get excellent accuracy
with relatively small integers.  To give you an idea, you can get an
accuracy of about 1 part in a million simply by going up to integers
in the range 10,000 in numerator and denominator.
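Python's standard library happens to ship exactly this machinery: Fraction.limit_denominator finds the best rational approximation with a bounded denominator using the continued-fraction (Euclidean-algorithm) expansion. A quick sketch, fed a large exact ratio of the kind the tuplet example produces (the specific value is illustrative, not taken from Lilypond):

```python
from fractions import Fraction

# A large exact ratio of the kind that accumulates after many prime
# tuplets (illustrative value, not read out of Lilypond's internals):
big = Fraction(13343231230, 11854428168)

# Best rational approximation with denominator <= 10,000, computed via
# the continued-fraction / Euclidean-algorithm expansion:
small = big.limit_denominator(10_000)

rel_err = abs(small - big) / big
print(small, float(rel_err))  # relative error well under 1 part in a million
```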

   So let's ask ourselves, as a practical matter, what kind of
accuracy does Lilypond _really_ need internally?

  The largest common paper size is ANSI D, 22 inches by 34 inches, and
let's assume we've got a printer running at 600 dpi. That means we'd
want an accuracy of at least (say) 100 times 34*600 = 2,040,000
positions along the page. At that accuracy, no note would be printed
more than 1/100 of a dot away from its proper position on an ANSI D
page. Probably good enough for government work, right?

   After 100 pages of a score, at worst-case inaccuracy, a note or
rest might be off by 1/600 of an inch.  Can we live with that in the
real world?  I'm going to go out on a limb here and say "Hell yes we
can."
  Now let's ask "What kind of auditory accuracy do we need?"  And the
answer to that question is absolutely dead simple. We need an accuracy
of only 1/31,250 of a second, because 31,250 bits per second is the
transmission rate of MIDI. You cannot get better timing resolution
than that if you use MIDI to play your Lilypond score. Argue as much
as you like, but sorry, folks, the bitrate of MIDI was set back in the
early 80s and it is crap. 1/31,250 of a second is the best you get.

  So what we'd like is for no note in a Lilypond score to be off by
more than 1/31,250 of a second over (oh, let's say) 100 pages of
score. That means we need a timing accuracy of (1/100)*(1/31,250)
second = 1/3,125,000 of a second, or 0.32 microseconds. That works out
to an accuracy of (ballpark) 3 parts in 10 million.
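The page-resolution and timing arithmetic above takes only a few lines to check (plain Python, nothing Lilypond-specific):

```python
# Print-accuracy target: 34-inch page length at 600 dpi, with the
# 100x safety factor used above:
print_steps = 100 * 34 * 600
assert print_steps == 2_040_000

# MIDI timing target: one MIDI bit time (1/31,250 s) spread over 100 pages:
midi_target = (1 / 31_250) / 100        # seconds
assert abs(midi_target - 3.2e-7) < 1e-15
# i.e. roughly 3 parts in 10 million, as claimed above
print(print_steps, midi_target)
```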

  As long as Lilypond gives us an internal representation of 3 parts
in 10 million accuracy, we're good as far as MIDI output and score
printing goes. I mean, c'mon!  Can you _really_ hear a difference of
1/31,250 of a second after 100 pages of score has gone by?  I've heard
of golden ears, but, folks...that's really pushing it.

  So all Lilypond needs is an internal accuracy good to within 3 parts
in 10 million.  That's all. We can easily achieve a precision that
fine using 5-digit integers in both numerator and denominator via the
Euclidean algorithm for approximating a value by a rational fraction.
In fact, 5-digit integers are almost certainly huge overkill. I would
be willing to bet that 16-bit integers would give better than 3 parts
in 10 million of accuracy via the Euclidean algorithm for almost any
large-integer ratio you'd care to approximate.

  So when we get close to running out of headroom in the size of
integers in the ratio, just turn it into a float, then approximate it
as a smaller integer ratio using the Euclidean algorithm.  All you
need to make sure of is that the resulting approximation falls closer
than 3 parts in 10 million to the float you're approximating.

  This does not sound like rocket science.

  Switching to some different representation of integers is not the
answer. Kastrup is dead right about that. The way to solve this
problem is to think smarter, not just throw brute-force bit size at
the problem. Just test the integers in the ratio with which Lilypond
represents note durations/positions to see whether they're nearing the
internal limit; if so, turn the ratio temporarily into a float, then
run a Euclidean-algorithm approximation that falls within 3 parts in
10 million of the float. I guarantee you that you can do that with
integers no bigger than 5 digits in numerator and denominator.
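As a sanity check on the proposal, here is a hedged Python sketch of the clamping step (the names and thresholds are my own, not Lilypond's). One nicety: Fraction.limit_denominator works on the exact ratio directly, running the same Euclidean/continued-fraction machinery, so the detour through a float isn't even strictly necessary:

```python
from fractions import Fraction

MAX_PART = 99_999                      # "5-digit" headroom (my assumption)
TOL = Fraction(3, 10_000_000)          # 3 parts in 10 million

def clamp(pos: Fraction) -> Fraction:
    """Sketch of the proposed fix: when a position outgrows the
    headroom, snap it to the nearest fraction with a 5-digit
    denominator and verify the relative error stays in tolerance.
    (For positions near 1, the numerator then stays small too.)"""
    if abs(pos.numerator) <= MAX_PART and pos.denominator <= MAX_PART:
        return pos                      # still within headroom
    approx = pos.limit_denominator(MAX_PART)
    assert abs(approx - pos) <= TOL * abs(pos), "tolerance exceeded"
    return approx

# A large exact position of the kind the prime tuplets produce
# (illustrative value):
big = Fraction(13343231230, 11854428168)
print(clamp(big))                       # a compact ratio close to the original
```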

Attachment: Lilypond_tuplets_crash_full.png
Description: PNG image
