Re: [OT] Vivi, the Virtual Violinist, plays LilyPond music

From: Graham Percival
Subject: Re: [OT] Vivi, the Virtual Violinist, plays LilyPond music
Date: Thu, 17 Mar 2011 13:11:09 +0000
User-agent: Mutt/1.5.20 (2009-06-14)

On Mon, Mar 14, 2011 at 04:16:35PM -0400, Shane Brandes wrote:
> I suppose since I have spent so much
> of my life attempting to master keyboard instruments and having watched
> so many students progress in their own studies that it seems to me
> that one could never hope to replicate a human at an instrument.

You underestimate the power of the Dark Side (aka machine learning).

I can't give a good argument, other than "download marsyas and
weka, and try doing some audio classification.  Look at academic
projects over the past 10 years.  Look at the amazing amount of
progress we've made".  Machine learning is *incredibly* powerful.
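To give a flavour of what "try doing some audio classification" means, here's a toy sketch in Python (not marsyas or weka; those do far more). It synthesizes two classes of signals, extracts one simple feature -- the zero-crossing rate -- and classifies by nearest centroid. The sample rate and function names are my own choices for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
SR = 8000  # sample rate in Hz, an arbitrary choice for this demo

def zcr(x):
    """Zero-crossing rate: fraction of adjacent samples that change sign."""
    return np.mean(np.signbit(x[:-1]) != np.signbit(x[1:]))

def tonal(freq):
    """One second of a pure sine wave at freq Hz -- the 'tonal' class."""
    t = np.arange(SR) / SR
    return np.sin(2 * np.pi * freq * t)

def noisy():
    """One second of white noise -- the 'noisy' class."""
    return rng.standard_normal(SR)

# Tiny training set: (feature, label) pairs, label 0 = tonal, 1 = noisy.
train = [(zcr(tonal(f)), 0) for f in (220, 330, 440)] + \
        [(zcr(noisy()), 1) for _ in range(3)]

# Nearest-centroid "model": mean feature value per class.
centroids = [np.mean([f for f, y in train if y == c]) for c in (0, 1)]

def classify(x):
    f = zcr(x)
    return min((abs(f - c), i) for i, c in enumerate(centroids))[1]

print(classify(tonal(550)))  # 0: an unseen pitch is still recognized as tonal
print(classify(noisy()))     # 1: noise lands near the noisy centroid
```

Real systems extract dozens of spectral features per frame and feed them to proper classifiers, but the pipeline -- features in, labels out -- is exactly this shape.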

> There are all sorts of odd philosophical ramifications of trying
> and already certain deficits are occurring especially in the
> film industry on account of such efforts.

The philosophical ramification is this: who *deserves* to create
good music?
A) only people who are healthy and athletic (in terms of fine
muscle control), have spent 10,000 hours practicing, and still
practice for 2 hours a day.
B) only people who have spent 1,000 hours practicing, but do not
practice daily any more, and might not be healthy.
C) only people who have spent 100 hours learning their instrument,
and might not have any physical ability at all.
D) only people who have spent 10 hours learning,
and might not have any physical ability at all.
E) everybody.

I believe that the answer is E.  Everybody "deserves" to create
good music.  Consider this: Stephen Hawking has advanced ALS.  It
is physically impossible for him to lift a violin or press piano
keys.  So does he not "deserve" to create violin music?  I think
he does.

Now, at the moment, Vivi doesn't create "good" music, and probably
requires about 10 hours of learning.  I mean, you have to write a
lilypond file (that could be between 1 and 5 hours, for simple
music at least), and then if you know nothing about violin, you'd
need to experiment a bit with slurs and articulations to hear how
they sound.  But she's only 4 months old, and I'd put her in a
competition against 4-month-old human players any day.

> As a tool and a method of rationalizing musical praxis it is
> certainly useful and convenient, but where will the limits be?

Where *will* the limits be, or where *should* the limits be?

> One of my favorite
> examples is that of vibrato. It never would have occurred to me that
> it is possible or even relevant to piano until it was demonstrated to
> me, but yet at the same time it can be achieved simply by the action
> of your fingers upon the keys after they have been struck. The
> difference in tone is of course not terribly obvious but yet it can
> yield a completely different character to the chords thus being
> treated. There are certainly other examples, but that is the one that
> I find least likely to ever be replicated.

- can we measure this "post-strike" finger action?  I mean, can we
  build devices which digitize those actions?  (yes)
- can we describe the actions of a string with mathematics?  (yes)
- can we describe the actions of a piano key -> lever -> hammer ->
  felt -> string, with mathematics?  (yes-ish; I've only seen two
  academic papers about the effect of different types of felt on
  the piano hammers, so a bit more research might be needed here)

Other than the slight quibble in the last point, it's done.  We
can replicate this electronically.
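As one concrete instance of "describing the actions of a string with mathematics": the classic Karplus-Strong algorithm models a plucked string as a noise burst circulating through a damped delay line. This is just an illustrative sketch of the general point, not the model Vivi uses; the parameter values are my own.

```python
import numpy as np

SR = 44100  # sample rate in Hz, assumed for the demo

def pluck(freq, seconds=0.5, decay=0.996, sr=SR):
    """Karplus-Strong plucked-string synthesis at freq Hz."""
    n = int(sr / freq)  # delay-line length determines the pitch
    # The "pluck" is a burst of random noise filling the delay line.
    buf = np.random.default_rng(0).uniform(-1, 1, n)
    out = np.empty(int(sr * seconds))
    for i in range(len(out)):
        out[i] = buf[i % n]
        # Averaging adjacent samples is a low-pass filter: high harmonics
        # die out first, just as they do on a physical string; `decay`
        # models the overall energy loss per round trip.
        buf[i % n] = decay * 0.5 * (buf[i % n] + buf[(i + 1) % n])
    return out

note = pluck(440.0)  # half a second of a decaying A440 pluck
```

Write `note` to a WAV file and it sounds unmistakably like a plucked string, from about a dozen lines of arithmetic. Hammer-on-felt piano excitation is harder (hence the quibble above), but it is the same kind of problem.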

I completely agree with David's description of this as an
"obscure physical phenomena" -- just think of Wendy Carlos'
"Switched-On Bach".  Is that expressive?  If so, then clearly you
don't need a real piano to create expressive keyboard music!

But if "piano vibrato" _were_ necessary for creating expressive
music, then we could certainly synthesize that, too.

- Graham
