[sdiy] Re: Information Content of Signals
Magnus Danielson
cfmd at swipnet.se
Mon May 19 01:57:10 CEST 2003
From: Grant Richter <grichter at asapnet.net>
Subject: Re: Information Content of Signals
Date: Sun, 18 May 2003 15:44:19 -0500
> My question actually comes from page 47 of that paper. Channel capacity is
> expressed as
>
> Capacity = Bandwidth * log (Signal + Noise / Noise)
The real formula is
    C = W * log2((P + N) / N)
Where
    C is the capacity in bits per second
    W is the bandwidth in Hertz
    P is the signal power in watts
    N is the noise power in watts
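To make the formula concrete, here is a small Python sketch. The bandwidth and power figures are made-up illustrative values, not anything from the original discussion:

```python
import math

def capacity(bandwidth_hz, signal_w, noise_w):
    """Shannon channel capacity in bits per second: C = W * log2((P + N) / N)."""
    return bandwidth_hz * math.log2((signal_w + noise_w) / noise_w)

# Example (assumed numbers): a 3 kHz channel where the signal power is
# three times the noise power, so (P + N) / N = 4 and log2(4) = 2.
print(capacity(3000.0, 3.0, 1.0))  # -> 6000.0 bit/s
```

Note how the capacity only depends on the power *ratio*, not on the absolute powers, which is why the split between "signal" and "noise" matters so much below.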
> But is not defined as Noise approaches the limit of zero. Assuming log
> (infinity) = infinity.
Yes, but that is not a real problem in practice: you *ALWAYS* have noise, one
way or another.
> Philosophically, if we are interested in the ABSOLUTE channel capacity, the
> absolute signal power is the total of all signal sources natural and
> manmade. There is no Noise term because it is included in the Signal term.
I think your error is in the "manmade" term. Here you should distinguish
between the "useful" or "intended" signal and that which is not useful.
That distinction is actually what the whole of information theory is about.
> So the idea that channel capacity approaches infinity as noise power
> approaches zero, is not intuitive. Quantum mechanics should indicate there
> is a limiting factor, and that is what I am trying to figure out.
But there is a limiting factor, and it lies exactly in the noise term; you
just seem to be confusing what belongs in the noise term with what belongs in
the signal term.
> Boltzman's constant would seem to suggest that there is minimum
> thermodynamic bit at any temperature. This could possibly be used to compute
> an absolute thermodynamic channel capacity from temperature and power?
If you lower the temperature you lower the thermal noise term and your
channel capacity increases. Optimum channel capacity is reached at 0 K;
however, you never really get there, and noise is induced through other
paths which are not at 0 K. In fact, the background temperature of the
universe is a couple of kelvins, a leftover of the Big Bang. If you are dying
to learn more about that, I suggest digging into NASA's website for the WMAP
satellite, which has just published the most astounding findings as a result
of measuring the background noise of the universe.
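A rough Python sketch of that idea, using the Johnson-Nyquist thermal noise floor N = k*T*W (the Boltzmann-constant link the quoted question was groping for). The bandwidth and signal power below are arbitrary illustrative values:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def thermal_noise_power(temp_k, bandwidth_hz):
    """Johnson-Nyquist thermal noise power in watts: N = k * T * B."""
    return K_B * temp_k * bandwidth_hz

def capacity(bandwidth_hz, signal_w, noise_w):
    """Shannon channel capacity in bits per second."""
    return bandwidth_hz * math.log2((signal_w + noise_w) / noise_w)

# Example (assumed numbers): 1 kHz of bandwidth, 1 pW of signal power,
# at room temperature, liquid-nitrogen and liquid-helium temperatures.
for temp in (290.0, 77.0, 4.0):
    n = thermal_noise_power(temp, 1e3)
    print(f"{temp:6.1f} K -> C = {capacity(1e3, 1e-12, n):.0f} bit/s")
```

Cooling the receiver shrinks N and so raises C, and the formula diverges only in the unreachable limit T -> 0, which is exactly the point being made above.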
Recall that there are _OTHER_ noise sources than pure thermal noise.
In short, you always have background noise, and most of the time you just have
to accept it as a result of temperatures well above 0 K. We can tweak the
formulas all we like, but they mean nothing if we can't use them under those
conditions. In a few rare applications people do cool down their sensors to
reduce noise, but that is just to be able to get any information out at all,
since their signal power is too low to start with.
> (An erg is the same as a watt-second, according to my reference)
>
> I am way out of my depth, but thrashing around in deep mathematical waters
> can be very enlightening.
Indeed it can. I also have this particular paper in pocket-book form, and I
have a book with the collected works of Claude Elwood Shannon. I like to
constantly bend my mind. Trouble is, I do it in too many fields ;O)
> And I'm counting on Magnus to throw me a life preserver.
;O)
Did I throw you anything to keep you afloat? ;O)
Cheers,
Magnus - protector of all life - well almost then... ;O)