[sdiy] Information Content of Signals
Magnus Danielson
cfmd at swipnet.se
Tue May 20 23:46:42 CEST 2003
From: Grant Richter <grichter at asapnet.net>
Subject: Re: [sdiy] Information Content of Signals
Date: Tue, 20 May 2003 01:50:42 -0500
Dear Grant,
> Your point is well taken, that in the sense of the original Shannon paper,
> information content is defined as the reduction of entropy from a maximal
> entropy state. That is to say, the information content is defined by the
> ratio of the entropy of a finite measured symbol group (signal) and the
> maximum possible entropy of the entire symbol set (noise).
You increase the entropy or, if you so like, the lack of order or
predictability.
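To put a number on it: for a discrete source with symbol probabilities p_i,
Shannon's entropy is H = -sum_i p_i * log2(p_i) bits per symbol, and the
uniform distribution maximizes it. A small Python sketch (the distributions
are just illustrative examples of mine):

    import math

    def shannon_entropy(probs):
        # H = -sum(p * log2(p)), in bits per symbol
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(shannon_entropy([0.5, 0.5]))   # fair coin: 1.0 bit
    print(shannon_entropy([0.9, 0.1]))   # biased coin: ~0.47 bits
    print(shannon_entropy([0.25] * 4))   # four equally likely symbols: 2.0 bits

Any structure or predictability pulls H below the maximum, which is exactly
the "reduction of entropy" your paragraph above speaks of.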
However, the entropy measure does not depend on a finite set of "symbols" but
does in fact exist for continuous-value and continuous-time signals, i.e. where
no quantization of value or time (or both) is performed; this is the
differential entropy. A wall of sine tones with constantly and freely varying
amplitudes is also such a system.
In fact, this email takes its third hop in the IP network over a system not
that much different (it does use 64 sine carriers and it does use quantized
levels, but the effective quantization adapts as needed to the residual noise).
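For the continuous case the corresponding quantity is the differential entropy
h(X) = -integral f(x) log2 f(x) dx; for a Gaussian source of variance sigma^2
it works out to h = (1/2) log2(2*pi*e*sigma^2) bits. A rough numerical
cross-check in Python (numpy assumed available; the histogram estimator is
only a crude sketch):

    import numpy as np

    rng = np.random.default_rng(0)
    sigma = 1.0
    x = rng.normal(0.0, sigma, 1_000_000)

    # Closed form: h = 0.5 * log2(2*pi*e*sigma^2)
    h_exact = 0.5 * np.log2(2 * np.pi * np.e * sigma**2)

    # Crude histogram estimate of -integral f(x) log2 f(x) dx
    counts, edges = np.histogram(x, bins=200, density=True)
    width = edges[1] - edges[0]
    nz = counts[counts > 0]
    h_est = -np.sum(nz * np.log2(nz)) * width

    print(h_exact, h_est)   # both close to ~2.05 bits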
> But in an absolute thermodynamic sense, the maximal entropy symbol group
> still represents a heat source. And again in an absolute thermodynamic
> sense, any heat source could contain information. Easily decodable or
> otherwise.
This is really where the distinction between "signal" and "noise" comes in.
We must accept noise to be unwanted disturbance of our expected signal. This
might seem like a strange definition, but just as thermal noise is a source of
reduced quality, so will we consider interference from other sources such as
TV transmitters, radio stations or whatever. Part of the benefit of modulation
techniques is that it becomes harder for CW (non-modulated) and various
modulated signals to interfere with the wanted signal. Various forms of CDM
excel at this: non-correlating signals get strongly damped, and the wanted
signal looks nothing like the disturbing signals. In this sense a CW sine
signal can be just as much noise as the thermal noise, and this is also true
since they both do equally little to help convey any useful signal, in the
terms we might expect from, say, a talking clock.
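To make the CDM damping concrete, here is a toy direct-sequence sketch in
Python (the spreading factor, code and interferer are arbitrary choices of
mine): a data bit spread by a +/-1 pseudo-noise code survives despreading
intact, while an uncorrelated CW tone is damped by roughly sqrt(N):

    import numpy as np

    rng = np.random.default_rng(1)
    N = 1024                                  # chips per bit (spreading factor)
    code = rng.choice([-1.0, 1.0], size=N)    # pseudo-noise spreading code

    bit = +1.0
    tx = bit * code                           # spread the data bit

    cw = 3.0 * np.sin(2 * np.pi * 0.1 * np.arange(N))  # strong CW interferer
    rx = tx + cw

    # Despread: correlate with the known code and normalize.
    data_out = np.dot(rx, code) / N           # ~ +1, the wanted bit survives
    cw_leak = np.dot(cw, code) / N            # ~ 0, interferer averaged away

    print(data_out, cw_leak)

The same correlation either recovers the wanted bit or averages a
non-correlating disturbance toward zero, which is the high damping I mean.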
So, this is the confusing part, I agree... to understand that the entropy
must be meaningful in some sense. But then again, a signal which one receiver
views as noise may be meaningful to another receiver, which will in turn
consider the first receiver's signal meaningless noise.
If only you were able, you could see some inner logical information in thermal
noise, but since you can't (except to realize that it is just continuing,
boring noise) it remains a disturbance to you.
> > Interestingly enough, this view on information sources proves very useful
> > in cryptology.
>
> I would offer this as a thought experiment.
>
> Encryption is the process of taking a symbol group of arbitrary entropy, and
> using an algorithm to generate a new symbol group of maximal entropy. To be
> useful, the new maximal entropy symbol group must retain the information
> content of the original symbol group.
See it rather such that the transmitted signal appears as a system of unknown,
maximal entropy, so that monitoring it yields nothing useful or structured to
an eavesdropper. Just the slightest hint of structure gives a trace of the
original structure, such that more or less advanced cryptanalysis (such as
differential cryptanalysis) may break into the crypto.
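As an illustration of how ciphering pushes the output toward maximal entropy,
one can compare the empirical byte entropy of readable text against the same
text after encryption. A Python sketch using only the standard library; the
SHA-256 counter-mode keystream below is a toy stand-in for a real cipher, not
something to use in earnest:

    import hashlib, math
    from collections import Counter

    def byte_entropy(data):
        # Empirical entropy in bits per byte (maximum is 8.0).
        counts = Counter(data)
        n = len(data)
        return -sum(c / n * math.log2(c / n) for c in counts.values())

    def toy_stream_cipher(key, data):
        # XOR with a SHA-256-in-counter-mode keystream (toy, NOT secure).
        stream = b""
        counter = 0
        while len(stream) < len(data):
            stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
            counter += 1
        return bytes(a ^ b for a, b in zip(data, stream))

    text = ("The quick brown fox jumps over the lazy dog. " * 500).encode()
    print(byte_entropy(text))                             # ~4 bits/byte
    print(byte_entropy(toy_stream_cipher(b"key", text)))  # ~8 bits/byte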
> In any arbitrarily finite time span, you can not be sure that a measured
> maximal entropy signal group is noise, and not a high information content
> signal encrypted with a "near perfect" entropy maximizing algorithm.
This is true. When you have done it perfectly, you can take all the time you
want, but you won't crack it.
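This is the one-time pad situation: given only the ciphertext, every plaintext
of the same length is consistent with some key, so no amount of computation
singles one out. A minimal Python demonstration (the messages are my own
examples):

    import os

    def xor(a, b):
        return bytes(x ^ y for x, y in zip(a, b))

    plaintext = b"ATTACK AT DAWN"
    key = os.urandom(len(plaintext))           # truly one-time, uniform key
    ciphertext = xor(plaintext, key)

    # An attacker with unlimited time can "decrypt" to ANY same-length text:
    guess = b"RETREAT AT SIX"
    fake_key = xor(ciphertext, guess)
    assert xor(ciphertext, fake_key) == guess  # equally consistent with rx data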
> So while I agree with your insight about the method the Shannon paper uses
> to define information, I humbly suggest there may be new insight to be
> gained by considering information that has become available since
> publication of the expanded Shannon paper in 1963.
To the best of my knowledge Shannon only published one set of papers in 1963,
and those were in Russian, which I do not speak or read. A clarified reference
would be appreciated. Shannon's original paper is from 1948.
I would kindly ask you to detail what material you are referring to, or the
discussion becomes meaningless.
> It may be useful to take a more inclusive or absolute view of information,
> for the purposes of experimental thought, than Shannon did in his paper on
> practical engineering applications.
The thing is, to the best of my knowledge, that no research done after his
paper actually contradicts his work, but rather just extends its meaning and
fills in thousands upon thousands of empty pages with uses of such a theory.
I.e. I have yet to learn of anything, if there is any, that seriously puts his
theories out in the cold. Rather, they have proved themselves as fundamental
laws when applied to much more than the specific engineering problems at hand
for Shannon at the time. If you do have any such material I would kindly like
to see it.
IMHO the theories are both well established, sound and working, but, just like
relativity, hard to comprehend in full. Why would a bit be a useful measure of
information content? It has actually been proved to be!
Also, the concept of entropy was brought into electrical engineering problems
by Shannon from physics and the laws of thermodynamics, since they
conceptually fit and made the same predictions.
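The formal parallel is indeed direct:

    S = -k_B * sum_i p_i * ln(p_i)      (Gibbs entropy, statistical mechanics)
    H = -sum_i p_i * log2(p_i)          (Shannon entropy, bits)

They differ only in Boltzmann's constant and the base of the logarithm.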
Cheers,
Magnus