[sdiy] Re: Information Content of Signals

John L Marshall john.l.marshall at gte.net
Mon May 19 02:37:38 CEST 2003


But of all of the signals, only one is of interest at a time. All of the
other signals are noise to the desired signal.

This is in part how CDMA works. The desired signal is actually below the
noise floor. All of the other signals are noise contributors. But we know how
to look for the desired signal.
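
Here is a minimal sketch of the idea in Python (made-up chip rate and
spreading code, not any real CDMA standard): correlating against a known
spreading code pulls the wanted bits back out of noise that buries each
individual chip.

import numpy as np

rng = np.random.default_rng(0)

n_bits = 200    # data bits to send
spread = 64     # chips per bit (processing gain ~ 10*log10(64) = 18 dB)

bits = rng.integers(0, 2, n_bits) * 2 - 1    # data as +/-1
code = rng.integers(0, 2, spread) * 2 - 1    # this user's spreading code

# Spread: each bit becomes `spread` chips multiplied by the code.
tx = np.repeat(bits, spread) * np.tile(code, n_bits)

# Channel: noise power 10x the signal power, so each chip sits ~10 dB
# below the noise.
rx = tx + rng.normal(0.0, np.sqrt(10.0), tx.size)

# Despread: correlate each bit period against the known code, take the sign.
decided = np.sign(rx.reshape(n_bits, spread) @ code)

print("bit errors:", int(np.count_nonzero(decided != bits)), "of", n_bits)

Even with the per-chip SNR at -10 dB, the correlation gain recovers almost
every bit.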

Take care,
John
------------------------------------------------------------------
Pacific Northwest Synthesizer Meeting
August 9, 2003
www.sound-photo.com
------------------------------------------------------------------

----- Original Message ----- 
From: "Grant Richter" <grichter at asapnet.net>
To: "Magnus Danielson" <cfmd at swipnet.se>
Cc: <synth-diy at dropmix.xs4all.nl>
Sent: Sunday, May 18, 2003 1:44 PM
Subject: [sdiy] Re: Information Content of Signals


> My question actually comes from page 47 of that paper. Channel capacity is
> expressed as
>
> Capacity = Bandwidth * log2((Signal + Noise) / Noise)
>
> But this is not defined as Noise approaches the limit of zero, since
> log(infinity) = infinity.
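>
> To make the divergence concrete, a quick numerical sketch in Python
> (made-up bandwidth and signal power, log base 2):
>
> import math
>
> bandwidth = 20_000.0   # Hz, made-up figure
> signal = 1e-3          # W, made-up figure
>
> for noise in (1e-3, 1e-6, 1e-9, 1e-12, 1e-15):
>     capacity = bandwidth * math.log2((signal + noise) / noise)
>     print(f"N = {noise:.0e} W -> C = {capacity / 1e3:8.1f} kbit/s")
>
> Every decade of noise reduction adds the same fixed log2(10) bits per
> second per hertz, so C grows without bound as Noise goes to zero.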
>
> Philosophically, if we are interested in the ABSOLUTE channel capacity, the
> absolute signal power is the total of all signal sources, natural and
> man-made. There is no Noise term because it is included in the Signal term.
>
> So the idea that channel capacity approaches infinity as noise power
> approaches zero is not intuitive. Quantum mechanics should indicate there
> is a limiting factor, and that is what I am trying to figure out.
>
> Boltzmann's constant would seem to suggest that there is a minimum
> thermodynamic bit energy at any given temperature. This could possibly be
> used to compute an absolute thermodynamic channel capacity from temperature
> and power?
>
> (A joule is the same as a watt-second, according to my reference; an erg
> is 10^-7 joule.)
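>
> As a rough sanity check on that, a sketch assuming the minimum energy per
> bit is k*T*ln(2) (the Landauer limit; whether that is the right bound to
> use here is exactly what I am unsure of):
>
> import math
>
> k = 1.380649e-23   # Boltzmann's constant, J/K
> T = 300.0          # room temperature, K
> P = 1e-3           # 1 milliwatt of signal power
>
> e_bit = k * T * math.log(2)   # ~2.87e-21 J per bit
> rate = P / e_bit              # bits per second at power P
> print(f"min energy per bit: {e_bit:.3e} J")
> print(f"max rate at 1 mW:   {rate:.3e} bit/s (~2^{math.log2(rate):.1f})")
>
> This lands nearer 2^58 per second than the 2^64 I used below, so my
> earlier figure should be read as order-of-magnitude at best.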
>
> I am way out of my depth, but thrashing around in deep mathematical waters
> can be very enlightening. And I'm counting on Magnus to throw me a life
> preserver.
>
> > From: Magnus Danielson <cfmd at swipnet.se>
> > Date: Sat, 17 May 2003 15:47:30 +0200 (CEST)
> > To: grichter at asapnet.net
> > Cc: synth-diy at dropmix.xs4all.nl
> > Subject: Re: Information Content of Signals
> >
> > From: Grant Richter <grichter at asapnet.net>
> > Subject: Information Content of Signals
> > Date: Sat, 17 May 2003 02:36:44 -0500
> >
> > Grant,
> >
> >> As a point of interest, I have been trying to understand the
> >> theoretical information content of signals.
> >
> > 55 years ago Claude Elwood Shannon published a lovely article on the
> > subject, which I really hope you have read. It is "A Mathematical Theory
> > of Communication", which you can get from:
> >
> > http://cm.bell-labs.com/cm/ms/what/shannonday/paper.html
> >
> > This is admittedly not the popular edition of the theory, but it is a
> > very fresh and crisp read which still makes very much sense half a
> > century later, like any good text should.
> >
> > He works out the formulas for both time-discrete and time-continuous
> > signals, and the information content both with and without the presence
> > of noise.
> >
> > In relation to the other stuff you are talking about, this text should
> > give the reference floor for any further discussion.
> >
> > As for the noise floor of sound, this is difficult. It is highly
> > temperature-dependent, since the energy distribution of the
> > atoms/molecules bumping around in the air is temperature-dependent. Just
> > to make the whole matter even more interesting, sound waves are really
> > the deviations in pressure and particle velocity (which are zero in
> > equilibrium). There are several good books on sound, and I tend to drop
> > into the works of Olsen, Beranek and Morse for a good reference.
> >
> >> Contemplation of this has led me to a
> >> paradox, and I wonder if anyone can shed light on this.
> >>
> >> Information and thermodynamics are linked together by the expression of
> >> an information "bit" in a thermal sense. The best definition I have
> >> found uses Boltzmann's constant (the minimum energy required to create
> >> a new thermal state at a given temperature).
> >>
> >> Using room temperature and the reference of 0 dB = 1 milliwatt, the
> >> theoretical information content of a 1 milliwatt-second signal would be
> >> (approx) 2^64 bits per second. So for a sample rate of 192 kHz (~2^18),
> >> the theoretical bit depth would not exceed 46 bits (64 - 18).
> >
> > 0 dBm <=> 1 mW
> >
> > You need to properly define things: the energy of a sound wave, etc. For
> > instance, the Sound Pressure Level (SPL) is really the excess pressure
> > relative to a reference pressure.
> >
> > The amount of bits per second (bitrate, or information rate per
> > time-unit, but NOT bandwidth, even if that usage is popular within the
> > clueless network community) comes from the relationship of noise and
> > bandwidth. This is a more complex analysis than the bits * samples per
> > second that we usually do.
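> >
> > To illustrate the difference, a small sketch assuming CD-style figures
> > and the usual 6.02 * bits + 1.76 dB quantization-SNR rule of thumb:
> >
> > import math
> >
> > fs = 44_100.0       # samples per second
> > bits = 16           # bits per sample
> > naive = fs * bits   # the plain bits * samples/s figure
> >
> > snr_db = 6.02 * bits + 1.76          # quantization SNR rule of thumb
> > snr = 10 ** (snr_db / 10)
> > shannon = (fs / 2) * math.log2(1 + snr)   # capacity of an fs/2 Hz channel
> >
> > print(f"naive:   {naive / 1e3:.1f} kbit/s")    # ~705.6 kbit/s
> > print(f"Shannon: {shannon / 1e3:.1f} kbit/s")  # ~718 kbit/s
> >
> > The two land close here only because 16-bit PCM happens to sit near the
> > capacity of a channel with that SNR and bandwidth; change the noise
> > spectrum and the simple product stops being meaningful.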
> >
> >> The paradox comes from the idea of over-sampling a thermally maxed-out
> >> information channel. What information would then be gathered? Would it
> >> be merely redundant information, or would the waste heat of the
> >> conversion apparatus become a new signal generator?
> >
> > Your encoder generates heat, which contributes to the entropy in the
> > acoustical medium, which effectively means more noise. Noise is best
> > understood as an unwanted information source. The characteristics of a
> > noise source may be known, but you may not have a perfect estimator for
> > compensation, so then statistics have to work for you instead.
> >
> >> Note that from an information theory standpoint, thermal noise is a
> >> naturally occurring signal (reverb tail of the Big Bang?) mixed with
> >> the human-generated signal.
> >
> > There are many non-human sources... I can still hear birds sing, for
> > instance.
> >
> > Anyway, as for your paradox, I wonder if you just haven't gone astray
> > in the theory.
> >
> > Cheers,
> > Magnus
> >
>


