Re: [fluid-dev] Midi track functions

From: Pedro Lopez-Cabanillas
Subject: Re: [fluid-dev] Midi track functions
Date: Tue, 6 Dec 2011 14:20:03 +0100
User-agent: KMail/1.13.5 (Linux/; KDE/4.4.4; i686; ; )


On Monday 05 December 2011, David Henningsson wrote:
> There seems to have been a few requests lately about midi tracks and 
> being able to read more of midi events. What do you think of the 
> following outline, would that fulfil your wishes?
>   * Let's add a midi event type named "MIDI_META" or similar. We use it 
> to store all meta events we don't store currently. We'll dump the varlen 
> of copyright, lyrics, etc in there - just like we do for sysex.
>   * We'll also add another callback to fluid_track_send_events, with 
> more info in it. Preferably here:
>          track->ticks += event->dtime;
>       /* Add callback here */
>          if (!player || event->type == MIDI_EOT) {
> The callback function could look like:
> int handle_extended_playback_event(int track_index, const char* 
> track_name, unsigned int ticks, fluid_midi_event_t* event, void* userdata)
> (Is there more stuff we need?)
> The callback can modify the current event if necessary, and could return 
> FLUID_OK for fluidsynth to continue to process the event, or 
> FLUID_FAILED to ignore the current event.
> If we like, we could also add the same type of callback at parsing time 
> (I think Pedro suggested something like this a while ago?) for players 
> to be able to pick up lyric information etc ahead of time, which could 
> be useful for some midi players.
> What do you think?
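As a sketch of the proposal quoted above: the example below assumes the real FluidSynth return codes (FLUID_OK = 0, FLUID_FAILED = -1) but uses a simplified stand-in struct for fluid_midi_event_t, since the real type is opaque and accessed through accessor functions. The muting logic and the `player_state_t` type are illustrative, not part of any existing API.

```c
#include <stddef.h>

/* Real FluidSynth return codes. */
#define FLUID_OK      0
#define FLUID_FAILED (-1)

/* Simplified stand-in for fluid_midi_event_t; the real struct is
 * opaque and read through fluid_midi_event_get_*() accessors. */
typedef struct {
    int type;     /* status, e.g. 0x90 = note-on */
    int channel;
    int param1;   /* key */
    int param2;   /* velocity */
} midi_event_sketch_t;

/* Hypothetical per-player state passed through the proposed userdata. */
typedef struct {
    int muted_track;  /* index of a track to silence, -1 for none */
} player_state_t;

/* Sketch of the proposed callback: return FLUID_OK to let the player
 * process the event, FLUID_FAILED to make it skip the event. */
int handle_extended_playback_event(int track_index, const char *track_name,
                                   unsigned int ticks,
                                   midi_event_sketch_t *event, void *userdata)
{
    player_state_t *state = (player_state_t *)userdata;
    (void)track_name; (void)ticks; (void)event;

    if (state != NULL && track_index == state->muted_track)
        return FLUID_FAILED;   /* drop every event of the muted track */

    return FLUID_OK;           /* everything else plays normally */
}
```

Because the callback sees the event before the player acts on it, the same hook would also cover in-place modification, as described in the proposal.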

FluidSynth is important to me as a real-time synthesizer. The MIDI player is 
a nice-to-have, low-priority feature. Nevertheless, several of my apps 
offer MIDI player functions implemented with other engines and frameworks, 
because FluidSynth has several limitations:
* SMF format. Currently, the FluidSynth MIDI player supports the SMF format 
up to a point, but nothing else. My program "KMidimon" also plays Cakewalk/Sonar 
and Overture song formats, which are parsed by my Drumstick file library. I'm 
not saying that FluidSynth should offer more song formats, but external 
parsers often make sense. In fact, VLC already uses its own SMF parser, with 
FluidSynth as a synthesizer only.
* Sequencer features. There is already a nice Sequencer API in FluidSynth, 
very interesting for algorithmic composition and pattern-based apps. Its main 
limitation for MIDI songs is that it doesn't support the tempo meta-event that 
often occurs inside MIDI songs. There is a nice time-scale function, but it 
is not the same concept and it is not "schedulable". By adding several queue 
control events (stop, change tempo, time position, ...) to the sequencer API, 
it would be possible to implement the SMF player on top of it, turning the SMF 
parser into an independent/optional modular component that could be replaced 
by the user with an external parser for other formats. 
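To illustrate why the tempo meta-event matters for scheduling: an SMF player must redo the tick-to-millisecond conversion every time a Set Tempo (FF 51) meta-event occurs. The formula below is standard MIDI; the function name is mine, a sketch rather than any FluidSynth API.

```c
/* Convert a delta time in MIDI ticks to milliseconds.
 *   division:        ticks per quarter note (from the SMF header)
 *   tempo_us_per_qn: microseconds per quarter note, the payload of
 *                    the FF 51 Set Tempo meta-event (500000 = 120 BPM).
 * A fixed time-scale cannot express this, because tempo_us_per_qn may
 * change many times in the middle of a song. */
double ticks_to_ms(unsigned int ticks, unsigned int division,
                   unsigned int tempo_us_per_qn)
{
    return (double)ticks * tempo_us_per_qn / (division * 1000.0);
}
```

At 120 BPM with a division of 480, one quarter note (480 ticks) lasts 500 ms; halve the tempo to 60 BPM (tempo_us_per_qn = 1000000) and the same 480 ticks last 1000 ms, which is exactly the recomputation a "change tempo" queue control event would have to trigger.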

During song file parsing, and before actual playback, the user application may 
want to perform several kinds of processing:
* process meta-events like lyrics, texts, names, markers, cue points, ... 
These events are not meaningful to the MIDI synth, but are valuable to the 
application user.
* insert/remove/change MIDI tracks. My program "KMid2" inserts a new track of 
user events (similar to the timer events of FluidSynth's sequencer API) so the 
application can receive feedback at certain song points.
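A parse-time hook for such meta-events can be prototyped outside FluidSynth. As a sketch, assuming the raw track bytes are already in memory, the following extracts the text payload of one meta-event (FF type length bytes, where the length is a standard MIDI variable-length quantity); the function names are mine, and 0x05 is the standard Lyric meta type.

```c
#include <string.h>

/* Read a MIDI variable-length quantity (7 data bits per byte, the
 * high bit set on every byte except the last), advancing *pos. */
unsigned int read_varlen(const unsigned char *buf, size_t *pos)
{
    unsigned int value = 0;
    unsigned char c;
    do {
        c = buf[(*pos)++];
        value = (value << 7) | (c & 0x7F);
    } while (c & 0x80);
    return value;
}

/* Copy the payload of a meta-event (FF <type> <varlen> <bytes>) into
 * out as a NUL-terminated string, truncating to out_size - 1 bytes.
 * Returns the meta type, or -1 if buf[*pos] is not a meta event. */
int read_meta_text(const unsigned char *buf, size_t *pos,
                   char *out, size_t out_size)
{
    if (buf[*pos] != 0xFF)
        return -1;
    (*pos)++;
    int type = buf[(*pos)++];
    unsigned int len = read_varlen(buf, pos);
    size_t n = len < out_size - 1 ? len : out_size - 1;
    memcpy(out, buf + *pos, n);
    out[n] = '\0';
    *pos += len;
    return type;
}
```

An application collecting lyrics ahead of time would simply call this for every FF 05 event while walking the track, storing each string with its tick position.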

During playback, the user application may want to receive and transform any 
MIDI event before sending it to the synth, for instance to implement pitch 
shift/key transposition of note events, to mute channels or tracks, and so on. 
It should also be possible to insert real-time events, like CC volume events, 
in the middle of playback.
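As a sketch of one such transformation stage (the function is hypothetical; in the proposal above FluidSynth would hand the event to a playback callback), transposing note events while leaving every other message type untouched:

```c
/* MIDI channel voice message status nibbles. */
#define NOTE_OFF 0x80
#define NOTE_ON  0x90

/* Transpose the key of a note-on/note-off event by `semitones`,
 * clamping the result to the valid 0..127 range.
 * Returns the new key, or -1 for non-note events (leave those alone). */
int transpose_note(int type, int key, int semitones)
{
    if (type != NOTE_ON && type != NOTE_OFF)
        return -1;               /* CC, pitch bend, etc.: pass through */
    key += semitones;
    if (key < 0)   key = 0;
    if (key > 127) key = 127;
    return key;
}
```

Note that both note-on and note-off must be transposed identically, or transposed notes would never be released; clamping keeps extreme transpositions from producing invalid key numbers.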

Finally, for some programs, like consumer games, it may be enough to support 
only a soft synth and no MIDI hardware. For other musical applications, 
supporting external hardware MIDI instruments is a must: the users are 
musicians who own MIDI instruments, and want to use those instruments 
connected to the computer and their applications. A (negative) example of this 
is "MuseScore", which uses a heavily customized FluidSynth for playback, but 
doesn't support external MIDI output devices, a highly desired feature among 
its users. 

FluidSynth can already be used by applications developed with frameworks 
like the AudioToolbox on Mac OS X, or the ALSA Sequencer API. I wonder if it 
makes sense for application developers to adopt a competing, non-compatible 
API in this scenario. On the other hand, it makes perfect sense to allow 
porting applications to limited platforms like phones, tablets and assorted 
operating systems.

