[sdiy] Information Content of Signals

Grant Richter grichter at asapnet.net
Tue May 20 08:50:42 CEST 2003


Your point is well taken: in the sense of the original Shannon paper,
information content is defined as the reduction of entropy from a
maximal-entropy state. That is to say, the information content is given by
the ratio of the entropy of a finite measured symbol group (signal) to the
maximum possible entropy of the entire symbol set (noise).
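As a rough illustration of that ratio (my own sketch, not anything from the Shannon paper itself; the function and variable names are made up for the example), the empirical entropy of a measured symbol group can be compared against the maximum possible entropy of its symbol set:

```python
from collections import Counter
from math import log2

def entropy_bits(symbols):
    """Empirical Shannon entropy H = -sum(p * log2(p)), in bits per symbol."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# A redundant "signal": a repeated English sentence.
msg = "the quick brown fox jumps over the lazy dog " * 50

h = entropy_bits(msg)             # measured entropy of the symbol group
h_max = log2(len(set(msg)))       # maximal entropy over the observed symbol set
redundancy = 1 - h / h_max        # the "reduction from maximal entropy"
```

Because the symbol frequencies are skewed (spaces and vowels dominate), h falls short of h_max, and that shortfall is the redundancy Shannon associates with structure in the signal.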

But in an absolute thermodynamic sense, the maximal-entropy symbol group
still represents a heat source. And again in an absolute thermodynamic
sense, any heat source could contain information, easily decodable or
otherwise.

> Interestingly enough, this view on information sources proves very useful in
> cryptology.

I would offer this as a thought experiment.

Encryption is the process of taking a symbol group of arbitrary entropy and
using an algorithm to generate a new symbol group of maximal entropy. To be
useful, the new maximal-entropy symbol group must retain the information
content of the original symbol group.

In any finite time span, you cannot be sure that a measured maximal-entropy
symbol group is noise rather than a high-information-content signal
encrypted with a "near perfect" entropy-maximizing algorithm.

So while I agree with your insight about the method the Shannon paper uses
to define information, I humbly suggest there may be new insight to be
gained by considering what has become available since publication of the
expanded Shannon paper in 1963. For the purposes of experimental thought,
it may be useful to take a more inclusive, or absolute, view of information
than Shannon did in his paper on practical engineering applications.
