lilypond-devel

MIDI restructuring


From: Han-Wen Nienhuys
Subject: MIDI restructuring
Date: Tue, 20 Apr 2004 21:55:22 +0200

address@hidden writes:
> I am looking at restructuring some of the MIDI code, in the hopes of
> letting the user have a little more control over the output.  I have
> looked carefully at the way things are currently done, and have a few
> ideas how to go about doing this.  But I have several questions first:
> 
> 1) How tied is the performer abstraction to MIDI?  For example, we
>    currently have Audio_staffs, which are mapped to channels.  I want
>    to be able to be able to tweak (as a user) the definitions as to
>    whether a particular voice, staff, staffgroup or score gets its
>    own channel.  If performers are tied tightly to midi, I could
>    rename these Audio_channels or Audio_tracks.  If not, maybe
>    something more generic can be considered.  Ideas?

No, unfortunately, I don't have specific ideas. The audio output has
never been seriously considered, and it's not clear what it should
offer. Hence, there has never been a generic design (a half-hearted
attempt was made to separate audio from MIDI, though).

> 
> 2) It is friendlier, in many ways, for fundamental objects in LilyPond
>    to be Scheme objects, when possible.  But how specific should these
>    objects be?  For example, I am thinking about having a generic MIDI
>    controller event, which holds the controller number and the value
>    for a controller change.  Should the object be structured
>    like this,
> 
>    (controller . value)
> 
>    or like this:
> 
>    ('midi-controller-object controller . value)
> 
>    The former is a little more terse, while the latter makes it clear
>    what type of object this is.  Personally, I lean toward the
>    former, because I don't think there is enough structure to warrant
>    labelling, but I really want your opinions.

I'm not sure. There is no infrastructure at all for putting Scheme into
the audio back end, so an answer would lack context. Design-wise, it
would be nice if MIDI were handled by a special (Scheme?) library, and
if LilyPond created a device-independent audio description that used
the MIDI backend to do the dumping.  This is similar to how things
work on the notation side.


--
 Han-Wen Nienhuys   |   address@hidden   |   http://www.xs4all.nl/~hanwen 
