[sdiy] Information Content of Signals
Czech Martin
Martin.Czech at micronas.com
Mon May 19 09:43:41 CEST 2003
In contrast to what SciFi movies tell us (androids
making chirping or other small-bandwidth noises), efficient computer
"language" sounds like noise, i.e. a broadband signal
that fills the complete available bandwidth.
It is easy to understand that more information can be sent
in this case.
Older schemes like FSK work with narrow bandwidth, and therefore
also slow modulation. But you could think of many FSK
machines running in parallel to fill the bandwidth. This
would increase signal throughput.
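As a rough sketch of the FSK idea, here is a minimal tone-burst modulator. The parameters (the Bell 103 tone pair of 1070/1270 Hz at 300 baud) are illustrative choices, not something from the post:

```python
import math

def fsk_modulate(bits, f_space=1070.0, f_mark=1270.0, baud=300, fs=8000):
    """Map each bit to a tone burst: 0 -> f_space, 1 -> f_mark."""
    samples_per_bit = fs // baud
    out = []
    for b in bits:
        f = f_mark if b else f_space
        for n in range(samples_per_bit):
            out.append(math.sin(2.0 * math.pi * f * n / fs))
    return out

sig = fsk_modulate([1, 0, 1, 1])
print(len(sig))  # 4 bits x 26 samples per bit = 104 samples
```

Each such modulator only occupies a few hundred Hz, which is why you could stack many of them side by side in the channel.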
To achieve a good, noise-like signal,
phase and amplitude modulation are used at the same time:
the complex signal pointer gets longer and shorter
while its angle is rotated at the same time.
Usually the pointer has a set of NxN fixed points, called symbols,
where it will settle until the next symbol is expressed.
I have heard of 16x16 symbols. They sit on a rectangular grid
in the complex plane around the origin, a square so to say.
This modulation method seems to be very robust against noise.
Communication with very distant space probes also seems to be done
this way.
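A toy sketch of such a constellation: a 16x16 grid is 256 symbols, so each symbol carries 8 bits. The mapping below is a plain row-by-row ordering for illustration, not the Gray coding a real modem would use:

```python
# Build a 16x16 QAM constellation: points on a square grid around the origin.
N = 16
constellation = []
for i in range(N):        # real (in-phase) axis
    for q in range(N):    # imaginary (quadrature) axis
        # Center the grid on the origin: levels -15, -13, ..., +13, +15
        constellation.append(complex(2 * i - (N - 1), 2 * q - (N - 1)))

bits_per_symbol = (N * N).bit_length() - 1  # 256 symbols -> 8 bits each

def modulate(byte):
    """Map one byte (0..255) to one complex symbol (row-by-row ordering)."""
    return constellation[byte]

print(len(constellation), bits_per_symbol)
```

Transmitting a symbol means setting the carrier's amplitude to the pointer's length and its phase to the pointer's angle until the next symbol comes along.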
And: modern modems analyse the incoming stream and try to compress
it with classical methods before modulation.
You can notice this if you try to transmit a compressed archive.
It cannot be squeezed any further, so it takes longer than expected.
m.c.
-----Original Message-----
From: Tim Ressel [mailto:madhun2001 at yahoo.com]
Sent: Saturday, 17 May 2003 13:27
To: Grant Richter; Magnus Danielson; synth-diy at dropmix.xs4all.nl
Subject: Re: [sdiy] Information Content of Signals
I have also wondered about this, but from a different
direction. I want to know how they can get 53kbps on a
phone line with 3 kHz bandwidth. Oh, I know they use
this funky phase-constellation thingie.
I have heard that they stopped at 53kbps not because
of the limitations of physics, but because the FCC set
a limit there.
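For reference, the physics limit here is the Shannon capacity C = B * log2(1 + SNR). A quick check (the 40 dB SNR figure is an assumed value for a good analog line, not from the post) shows why 53 kbps in 3 kHz needs an unusually clean channel:

```python
import math

def shannon_capacity(bandwidth_hz, snr_db):
    """Shannon-Hartley channel capacity in bits per second."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

# Assumed figures: 3 kHz bandwidth, 40 dB SNR for a good analog line.
c = shannon_capacity(3000, 40)
print(round(c), "bit/s")  # roughly 40 kbps

# SNR needed to reach 53 kbps in 3 kHz:
needed_db = 10 * math.log10(2 ** (53000 / 3000) - 1)
print(round(needed_db), "dB")  # roughly 53 dB
```

So an ordinary analog connection tops out around 35-40 kbps; the 53 kbps modems rely on a mostly-digital downstream path with better effective SNR.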
This kind of thing makes my brain hurt.
--Tim
--- Grant Richter <grichter at asapnet.net> wrote:
> As a point of interest, I have been trying to
> understand the theoretical
> information content of signals. Contemplation of
> this has led me to a
> paradox, and I wonder if anyone can shed light on
> this.
>
> Information and thermodynamics are linked together
> by the expression of an
> information "bit" in a thermal sense. The best
> definition I have found uses
> Boltzmann's constant (minimum energy required to
> create a new thermal state
> at temperature).
>
> Using room temperature and the reference of 0 dB = 1
> milliwatt, the
> theoretical information content of a 1
> milliwatt-second signal would be
> (approx) 2^64 bits per second. So for sample rate of
> 192 kHz (~2^18)
> theoretical bit depth would not exceed 2^46 (64-18).
>
> The paradox comes from the idea of over-sampling a
> thermally maxed out
> information channel. What information would then be
> gathered? Would it be
> merely redundant information, or would the waste
> heat of the conversion
> apparatus become a new signal generator?
>
> Note that from an information theory standpoint,
> thermal noise is a
> naturally occurring signal (reverb tail of the Big
> Bang?) mixed with the
> human generated signal.
>
> Any speculation is appreciated.
>
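For what it's worth, Grant's figure can be checked numerically. Assuming the "minimum energy per bit" is the Landauer limit kT*ln(2) (one common definition; the post doesn't say exactly which was used), 1 milliwatt at room temperature works out to roughly 2^58 bits per second rather than 2^64, but the shape of the argument stands:

```python
import math

k_B = 1.380649e-23   # Boltzmann's constant, J/K
T = 300.0            # room temperature, K
P = 1e-3             # 1 milliwatt (0 dBm), J/s

e_bit = k_B * T * math.log(2)   # Landauer limit: minimum energy per bit
bits_per_second = P / e_bit
print(f"{bits_per_second:.2e} bits/s (~2^{math.log2(bits_per_second):.1f})")
```

Either way, a thermally limited 1 mW channel carries an enormous but finite number of bits per second, which is what makes the over-sampling paradox interesting.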