[sdiy] user interfaces, was Radio Shack catalogs
Paul Cunningham
paul at cometway.com
Fri May 14 23:39:32 CEST 2010
The difference is that you are often resting your fingers on the keys, and you can tell where you are by your position in front of the piano and by the black keys.

For me, when I play keyboards in the dark, I really don't have too much trouble after hitting the first key and getting some sonic feedback. My fingers already know the relative distances between keys and find them without much trouble; when I miss, the physical key channels them back in the right direction. I even found early on that playing without being able to look made me a better player.

I also suspect good mallet players -- and drummers -- get very familiar with where their strike points are, independent of vision. Ever watch Neil Peart play? He's not looking behind him when he hits the stuff back there.

I don't expect this kind of thing is easy to learn on something like an iPad, but you can probably learn it with the right GUI layout (good instrument design). Haptic feedback is an important part of touch-screen development, in my opinion. Feedback is important in any kind of GUI, but it doesn't have to be only visual. Fortunately, most musical instruments provide very specific auditory feedback if nothing else: if you do the wrong thing, it's quite obvious to everyone.

-pc
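
For what it's worth, here's a minimal sketch of the kind of immediate non-visual feedback I mean, assuming a browser-based touch instrument. The ".pad" class, the data-freq attribute, and the pitch values are made up for illustration; it uses the Web Audio API for the sonic side and the Vibration API where the device supports it (an iPad doesn't, so there the sound has to do the work):

// Sketch: answer every touch on an on-screen pad with an immediate
// sound blip and, where supported, a short vibration pulse.
const ctx = new AudioContext();

function playPad(frequency: number): void {
  // Sonic feedback: a short decaying sine blip at the pad's pitch.
  const osc = ctx.createOscillator();
  const gain = ctx.createGain();
  osc.frequency.value = frequency;
  gain.gain.setValueAtTime(0.3, ctx.currentTime);
  gain.gain.exponentialRampToValueAtTime(0.001, ctx.currentTime + 0.3);
  osc.connect(gain).connect(ctx.destination);
  osc.start();
  osc.stop(ctx.currentTime + 0.3);

  // Haptic feedback: a 20 ms pulse on devices that expose the Vibration API.
  if ("vibrate" in navigator) {
    navigator.vibrate(20);
  }
}

// Wire each on-screen pad to the handler; data-freq is a hypothetical
// attribute carrying the pad's frequency in Hz.
document.querySelectorAll<HTMLElement>(".pad").forEach((pad) => {
  pad.addEventListener("touchstart", () => {
    // Browsers only allow audio to start after a user gesture.
    if (ctx.state === "suspended") {
      void ctx.resume();
    }
    playPad(Number(pad.dataset.freq ?? "440"));
  });
});

The point isn't this particular code, just that the feedback arrives on the touch itself, not from looking at the screen.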
On May 14, 2010, at 5:23 PM, Rainer Buchty wrote:
> On Fri, 14 May 2010, Ingo Debus wrote:
>
>> Thinking of it, a mallet instrument (vibraphone, marimba) hardly
>> provides a "physical frame of reference", does it? Of course you
>> get into touch with the keys while playing, but once you have hit
>> the wrong one, it's too late. And there are many mallet virtuosos.
>
> And where's the difference to e.g. a piano?
>
> You hit the wrong key, you lose.
>
> Rainer
>