Archive of the former Yahoo!Groups mailing list: MOTM


Subject: OT: Analog vs. Digital Virtual Rave (was:"This just isn't right")

From: "Tkacs, Ken" <ken.tkacs@...>
Date: 2003-03-12

This is actually similar to issues between digital and analog computers in
general. We live in a society that currently thinks "digital is better"
because digital has given us so many cheap, powerful tools & toys so
quickly. And because we generally like the sound of our CDs over the lousy
separation, frequency response, frailty, and surface noise of the old LPs.

But anyone who's read into cybernetics, chaos, neural networks, fuzzy logic,
and general "analog computing" begins to get a real respect for the analog
approach. You know, the one nature decided to use when she designed our
*brains!* :) Analog computers work instantaneously, in complete parallel
(forget 'dual processor'--in an analog computer, EVERY computational cell is
working in 100% parallel), and excel at pattern-based "thinking," which is a
leap that digital computers just haven't made (and possibly cannot), hence
the reason that the Artificial Intelligence everyone thought was "around the
corner" four decades ago is more out of reach than ever.

To counter this, digital computers double their speed every year & a half or
so, with the idea that faster & faster serial thinking will at some point be
equivalent to real-time parallel thinking. But it's a "half life" kind of
situation that gets you closer but never hits the mark! Why not just go look
into "that other" technology?

Why? Because "analog=primitive" in the modern mind. We throw the word
"digital" around the way that people overused the word "atom" half a century
ago. I had an old organ that had atomic symbols all over it, and the
brochure went on and on about how it was based on the "atomic power of...
THE TRANSISTOR!" Laughable today, but so is the word "digital"; it's just that
nobody sees it yet.

The other reason that analog computing was so crippled as a research subject
is that a famous critique was published back in the 60s (Minsky and Papert's
1969 book "Perceptrons") which "proved" that a single neuron, the basic
element of the analog computer, could not compute the XOR function,
considered necessary to any computer. So everyone got depressed, turned out
the lights, and walked away.

Years later, it was shown that just TWO neurons, one feeding into the other
as a second layer, DO produce the XOR function! But that earlier,
well-publicized dismissal destroyed an entire line of exploration for decades,
and it still has not recovered.

(Kind of like what will happen to our manned space program if we 'stop
now').
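
Back to the XOR business, here's a minimal sketch in Python of the two-neuron
trick (the weights are hand-picked purely for illustration, not taken from any
of the original papers): no single threshold neuron can compute XOR, but an
AND unit feeding into a second unit does it.

    def neuron(inputs, weights, bias):
        # Classic threshold unit: fire if the weighted sum clears the bias.
        total = sum(i * w for i, w in zip(inputs, weights)) + bias
        return 1 if total > 0 else 0

    def xor_net(a, b):
        h_and = neuron([a, b], [1, 1], -1.5)            # fires only when a AND b
        return neuron([a, b, h_and], [1, 1, -2], -0.5)  # a OR b, vetoed by the AND

    for a in (0, 1):
        for b in (0, 1):
            print(a, b, "->", xor_net(a, b))  # 0 0 -> 0, 0 1 -> 1, 1 0 -> 1, 1 1 -> 0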

The reason I went into all of this is that if you do any research on neural
networks, searching the web, and so on, you will find all of this research
on analog computing being done... with virtual simulations on digital PCs!!!
These people are using the easy tool---the desktop PC---to do this work,
oblivious to the simple fact that it puts everything they do on the
"wrong" foundation! To investigate the neuron, you need to get out
the soldering iron and build an artificial neuron! You can't model it
digitally because of the MASSIVE feedback and parallelism issues, similar to
what was expressed below with regard to audio cross-modulation. You're
working with discrete steps and not a continuum, and this completely changes
your outcome (especially with regard to chaos mathematics experimentation,
which has become increasingly important for secure communications research).

It's the wrong tool for the job!!

Sorry for the rave. But that post just reminded me of how frustrating it is
trying to learn anything about these cool subjects because of the 'digital'
concept hovering over everything.





-----Original Message-----
From: Harry [mailto:motm@...]
Sent: Wednesday, 12 March, 2003 1:55 AM
To: motm@yahoogroups.com
Subject: [motm] Re: OT: This just isn't right

> No matter how good a modular simulation can get, it is not going to be
> real.
> At least as far as I'm concerned.

Well, I think that you're dead on from an interface point of view. From
a sound point of view, you're also right - not forever, but probably for
a good few years yet. Let me explain why I think so. I apologise in
advance for the length of this, but this sort of issue is a large part
of why I just forked out a few thousand for a MOTM rig, so I care about
it a lot...


It's not the sound of an oscillator, filter or whatever that can't be
handled digitally; all of these can be replicated if you throw enough
effort at them (though arguably, no one has yet). The real shortcoming is
a little less obvious - software modulars suck at audio rate modulation.

The obvious way to write a software modular is to represent each module
as an object. However, calling an object's method that generates the
next bit of audio incurs an overhead (saving registers, setting up stack
pointers etc...). In order to amortise the cost of each method call, these
synths typically pass audio around in blocks of around 50 to 100 samples at a
time.
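
To make that concrete, here's a rough sketch in Python (the names, the block
size and the sine oscillator are all made up for illustration - a real synth
would be doing this in C++ - but the shape is the same): each module is an
object, and one method call produces a whole block of samples, so the call
overhead is paid once per block instead of once per sample.

    import math

    BLOCK = 64        # samples per block, somewhere in that 50-100 range
    RATE = 44100.0    # output sample rate in Hz

    class SineOsc:
        def __init__(self, freq):
            self.freq = freq
            self.phase = 0.0

        def process(self, fm=None):
            # One call -> BLOCK samples; fm, if given, is a block of
            # frequency-modulation values from some other module.
            out = []
            for n in range(BLOCK):
                f = self.freq + (fm[n] if fm else 0.0)
                self.phase = (self.phase + f / RATE) % 1.0
                out.append(math.sin(2.0 * math.pi * self.phase))
            return out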

None of that makes audio rate modulation impossible - in fact it's easy
- but it all falls down when you consider audio modulation in feedback
loops (for example where A modulates B, which in turn modulates A,
though it may be less direct than this). In software, the only way to
handle this is to spot that there is a feedback loop and to insert a
delay into the feedback chain which buffers the audio at that point
until the next time that the synth evaluates all the modules. It's this
delay - where all your audio modulation in feedback loops is delayed by
100 samples or so - that kills you.
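
Continuing the sketch above, this is where that delay comes from: when A and B
cross-modulate, the engine can only hand A the block that B produced on the
*previous* pass, so the modulation path carries a built-in BLOCK-sample delay.

    osc_a = SineOsc(220.0)
    osc_b = SineOsc(331.0)
    b_prev = [0.0] * BLOCK    # B's output from the previous pass

    for _ in range(1000):     # run 1000 blocks of audio
        a_out = osc_a.process(fm=[s * 50.0 for s in b_prev])  # A hears *old* B
        b_out = osc_b.process(fm=[s * 50.0 for s in a_out])   # B hears new A
        b_prev = b_out        # the block of delay that kills you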

Of course, if you throw enough CPU at it, you can just brute-force the
problem and swallow the method call overhead. Then you can pass audio
round in single samples instead of blocks.

That's still not good enough.

When I was still unsure as to whether I was going to buy a MOTM, I went
round to Robert Rich's house (since I live in the same town and he's a
VERY nice guy who didn't mind a total stranger phoning him up) and he
showed me a chaotic patch with two oscillators soft-synced to each
other, each being additionally modulated by noise. (Great patch,
BTW!). To do this properly in the digital world, you not only have to
pass single samples around, you probably have to do it at something
extreme like 200 kHz, converting back to 44.1 kHz at output. (The exact
freq you'd need for this would depend on how fast the soft-sync part of
the MOTM's oscillator reacts).
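
For what it's worth, the brute-force version would look roughly like this
(again just a sketch: the 192 kHz internal rate, the naive sample-dropping
decimation and the missing soft-sync model are all simplifications of what
you'd really need):

    import math

    INTERNAL = 192000.0              # oversampled internal rate
    DECIM = int(INTERNAL // 44100)   # keep 1 sample in 4 (~48 kHz); real code
                                     # would filter and resample to 44.1 kHz

    class SampleOsc:
        def __init__(self, freq):
            self.freq, self.phase = freq, 0.0
        def tick(self, fm=0.0):
            # One call -> one sample, so cross-modulation is only one sample late.
            self.phase = (self.phase + (self.freq + fm) / INTERNAL) % 1.0
            return math.sin(2.0 * math.pi * self.phase)

    a, b = SampleOsc(220.0), SampleOsc(331.0)
    b_last, out = 0.0, []
    for n in range(int(INTERNAL)):         # one second of internal samples
        a_s = a.tick(fm=50.0 * b_last)     # A hears B's previous sample
        b_s = b.tick(fm=50.0 * a_s)        # B hears A's current sample
        b_last = b_s
        if n % DECIM == 0:
            out.append(b_s)                # naive decimation of the output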

Now, modern CPUs are fast, but they're not fast enough to run really
expensive oscillator and filter models at 200 kHz, passing round a
single sample at a time. And they won't get there for a few years
yet...


Harry

p.s. I love digital too, but I don't want it trying to be analog when
analog does it better.