[sdiy] Re: Information Content of Signals
Magnus Danielson
cfmd at swipnet.se
Sat May 17 15:47:30 CEST 2003
From: Grant Richter <grichter at asapnet.net>
Subject: Information Content of Signals
Date: Sat, 17 May 2003 02:36:44 -0500
Grant,
> As a point of interest, I have been trying to understand the theoretical
> information content of signals.
55 years ago Claude Elwood Shannon published a lovely article on the subject,
which I really hope you have read. It is "A Mathematical Theory of
Communication", which you can get from:
http://cm.bell-labs.com/cm/ms/what/shannonday/paper.html
This is admittedly not the popular edition of the theory, but it is a very
fresh and crisp read which still makes very much sense half a century later,
like any good text should.
He works out the formulas for both time-discrete and time-continuous signals,
and the information content both with and without the presence of noise.
In relation to the other stuff you are talking about, this text should set the
reference floor for any further discussion.
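To make the notion of information content concrete, here is a small sketch of
my own (not taken from Shannon's paper directly, but implementing his entropy
formula H = -sum p*log2(p) for a discrete source):

```python
import math

def entropy_bits(probs):
    """Shannon entropy H = -sum p*log2(p), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit per toss; a biased one carries less,
# because its outcomes are more predictable.
print(entropy_bits([0.5, 0.5]))   # 1.0
print(entropy_bits([0.9, 0.1]))   # ~0.469
```

The same idea extends to continuous signals and to channels with noise, which
is where the paper goes next.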
As for the noise-floor of sound, this is difficult. It is highly temperature-
dependent, since the energy distribution of the atoms/molecules bumping around
in the air depends on temperature. Just to make the whole matter even more
interesting, a sound wave is really a deviation in pressure and particle
velocity (both of which are zero at equilibrium). There are several good books
on sound, and I tend to drop into the works of Olson, Beranek and Morse for a
good reference.
> Contemplation of this has led me to a
> paradox, and I wonder if anyone can shed light on this.
>
> Information and thermodynamics are linked together by the expression of an
> information "bit" in a thermal sense. The best definition I have found uses
> Boltzman's constant (minimum energy required to create a new thermal state
> at temperature).
>
> Using room temperature and the reference of 0 dB = 1 milliwatt, the
> theoretical information content of a 1 milliwatt-second signal would be
> (approx) 2^64 bits per second. So for sample rate of 192 kHz (~2^18)
> theoretical bit depth would not exceed 2^46 (64-18).
0 dBm <=> 1 mW
You need to properly define things like the energy of a sound wave, etc. For
instance, the Sound Pressure Level (SPL) is really the excess pressure relative
to a reference pressure.
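As a sanity check on the quoted figure, here is a back-of-the-envelope sketch
of my own, assuming the minimum-energy-per-bit definition you mention is the
Landauer bound of k_B * T * ln 2 joules per bit at temperature T:

```python
import math

k_B = 1.380649e-23               # Boltzmann's constant, J/K
T = 300.0                        # room temperature, K
E_bit = k_B * T * math.log(2)    # Landauer bound: minimum energy per bit

energy = 1e-3                    # 0 dBm for one second = 1 millijoule
bits = energy / E_bit
print(f"~2^{math.log2(bits):.0f} bits per milliwatt-second")
```

For what it is worth, with these assumptions the count comes out closer to
2^58 than 2^64, so the definitions really do need pinning down before the
arithmetic.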
The number of bits per second (the bitrate or information rate per unit of
time, but NOT the bandwidth, even though that usage is popular within the
clueless network community) comes from the relationship between noise and
bandwidth. This is a more complex analysis than the bits * samples-per-second
that we usually do.
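The relationship in question is the Shannon-Hartley capacity theorem,
C = B * log2(1 + S/N). A minimal sketch (the 20 kHz bandwidth and 96 dB SNR
are my own illustrative numbers, roughly corresponding to 16-bit audio):

```python
import math

def shannon_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley channel capacity C = B*log2(1 + S/N), in bit/s."""
    return bandwidth_hz * math.log2(1 + snr_linear)

snr = 10 ** (96 / 10)            # 96 dB SNR as a linear power ratio
c = shannon_capacity(20_000, snr)
print(f"capacity ~ {c / 1000:.0f} kbit/s")
```

Note that capacity scales only logarithmically with SNR but linearly with
bandwidth, which is why the naive bits * samples-per-second figure and the
true information rate diverge.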
> The paradox comes from the idea of over-sampling a thermally maxed out
> information channel. What information would then be gathered? Would it be
> merely redundant information, or would the waste heat of the conversion
> apparatus become a new signal generator?
Your encoder generates heat, which contributes to the entropy of the acoustical
medium, which effectively means more noise. Noise is best understood as an
unwanted information source. The characteristics of a noise source may be
known, but you may not have a perfect estimator to compensate for it, so then
statistics have to work for you instead.
> Note that from an information theory standpoint, thermal noise is a
> naturally occurring signal (reverb tail of the Big Bang?) mixed with the
> human generated signal.
There are many non-human sources... I can still hear birds sing, for instance.
Anyway, as for your paradox, I wonder if you just haven't gone astray in the
theory.
Cheers,
Magnus