[sdiy] Information Content of Signals

Grant Richter grichter at asapnet.net
Sat May 17 09:36:44 CEST 2003


As a point of interest, I have been trying to understand the theoretical
information content of signals. Contemplation of this has led me to a
paradox, and I wonder if anyone can shed light on this.

Information and thermodynamics are linked together by the thermal cost of an
information "bit". The best definition I have found uses Boltzmann's constant:
the minimum energy required to create (or erase) one bit at temperature T is
kT ln 2 (the Landauer limit).

Using room temperature and the reference of 0 dB = 1 milliwatt, the
theoretical information content of a 1 milliwatt signal works out to roughly
2^58 bits per second (1 mW divided by kT ln 2, which is about 2.8e-21 joules
per bit at ~293 K). So for a sample rate of 192 kHz (~2^17.6), the
theoretical bit depth would not exceed about 41 bits (58 - 17.6).
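The arithmetic above can be sketched in a few lines of Python. This is just the Landauer bound kT ln 2 applied to a 1 mW power budget; the temperature of 293.15 K (20 C) is an assumption for "room temperature":

```python
import math

K_B = 1.380649e-23           # Boltzmann constant, J/K (CODATA exact value)
T = 293.15                   # assumed room temperature, K
E_BIT = K_B * T * math.log(2)  # Landauer limit: minimum energy per bit, ~2.8e-21 J

power = 1e-3                 # 1 milliwatt (the 0 dBm reference)
bits_per_second = power / E_BIT
print(f"max rate: 2^{math.log2(bits_per_second):.1f} bits/s")   # ~2^58

sample_rate = 192_000        # Hz, ~2^17.6
bits_per_sample = math.log2(bits_per_second) - math.log2(sample_rate)
print(f"max bit depth: ~{bits_per_sample:.0f} bits per sample")  # ~41
```

Dividing the per-second bit budget by the sample rate (a subtraction in log2 terms) gives the ceiling on usable bit depth.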

The paradox comes from the idea of over-sampling a thermally maxed out
information channel. What information would then be gathered? Would it be
merely redundant information, or would the waste heat of the conversion
apparatus become a new signal generator?

Note that from an information theory standpoint, thermal noise is a
naturally occurring signal (the reverb tail of the Big Bang?) mixed with the
human-generated signal.

Any speculation is appreciated.
