lilypond-devel

Re: MIDI remapping


From: David Santamauro
Subject: Re: MIDI remapping
Date: Sat, 22 Jan 2011 15:08:19 -0500

On Sat, 22 Jan 2011 12:09:26 +0000
Bernard Hurley <address@hidden> wrote:

> On Sat, Jan 22, 2011 at 10:34:29AM +0000, c.m.bryan wrote:
> > Hi, I have an interesting question.  I know lilypond is not really
> > meant for playback.  HOWEVER :)
> > 
> Personally I would like lily to have much more sophisticated midi
> capabilities.

Me too.

> For instance, it would be nice if markup could be tied to program
> change events, so that markup like "pizz" could not only be printed
> in the score but also change the midi instrument. It would also be
> nice if things like staccato were renderable in midi, and perhaps
> "tunable".

This isn't as hard as it seems from a programming standpoint. Although
I am still learning the internals of lilypond (and haven't had much
time of late), I did, for kicks, write a skeleton Expression_performer
(MIDI CC 11) that handles expression spanner events (not committed or
committable, but a great learning experience). It takes a bit of
digging, but the programmers' reference, previous threads[1][2] and,
of course, hours of source-code study were invaluable to me.

For program changes (as with expression), I'm pretty sure you would
need to get into C++ down to the Audio_Event level (the Doxygen
documentation was also a major help in understanding the class
hierarchy) and then work your way back up to the Scheme and lily init
levels.
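In the meantime, part of what you describe is already reachable from
the .ly side, since midiInstrument changes are turned into program
changes in the MIDI output. A small music function can couple the
printed "pizz." with the instrument switch -- a rough, untested sketch
(the function name and markup text are mine; only midiInstrument and
the "pizzicato strings" instrument name are real):

pizz = #(define-music-function (parser location) ()
  #{
    \set Staff.midiInstrument = #"pizzicato strings"
    <>^\markup { \italic "pizz." }
  #})

{ c'4 c' \pizz c' c' }

That only covers instrument-name switches, though; arbitrary program
changes and continuous controllers like CC 11 still have no user-level
hook as far as I can tell.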

There is also articulate[3], which is a good starting point for Scheme
development, but again, I think the program changes would need to be
implemented at the C++ core level (I could be wrong and more
knowledgeable folks should correct me).
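If I remember its README correctly, articulate is used by including
the file and wrapping the music that feeds the \midi block, roughly
like this (untested):

\include "articulate.ly"

music = \relative c' { c4.\mf d8 e4-. r }

\score {
  \music
  \layout { }
}

\score {
  \unfoldRepeats \articulate \music
  \midi { }
}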

[1] http://www.mail-archive.com/address@hidden/msg32077.html
[2] http://www.mail-archive.com/address@hidden/msg32380.html
[3] http://www.nicta.com.au/people/chubbp/articulate

David

PS Although intercepting 'pizz', 'marcato' etc. is interesting, I
personally would much rather see a special MIDI syntax coupled with a
special voice context (a controller lane, in DAW parlance).

Like this:

\new MidiStaff <<
  \new Voice { c1\> f1\! }
  \new MidiControlVoice {
    \set MidiControlVoice.midiProgChange = #"violin"
    \set MidiControlVoice.midiVolume = #101
    \set MidiControlVoice.midiExpression = #64

    % sequence of CC 11 events spanning a measure
    m1\midiExpressionSpanner { <start> <end> <curveType> }
  }
>>

The 'm' is just a placeholder to attach stream-positional information.
But with most "continuous" controllers (expression, modulation,
pitch-bend, etc.) you will want to span a range, and each spanner
would need start, end and curveType parameters at the least -- maybe
even a density parameter.
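Just to make the curveType idea concrete, here is a throwaway Guile
sketch of the values such a spanner might interpolate between its
start and end (steps playing the role of density). Nothing here exists
in lilypond; it's plain Scheme arithmetic:

#(define (expression-ramp start end steps curve)
   ;; CC 11 values a hypothetical \midiExpressionSpanner could emit,
   ;; given start/end values, a step count (>= 2) and a curve type
   (map (lambda (i)
          (let* ((t (/ i (- steps 1)))
                 (shaped (case curve
                           ((linear) t)
                           ((exponential) (* t t))
                           (else t))))
            (round (+ start (* shaped (- end start))))))
        (iota steps)))

#(display (expression-ramp 40 100 8 'linear))
% => (40 49 57 66 74 83 91 100)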

Sorry for rambling / brainstorming ... 




