[sdiy] IN your mind, what is ....

Magnus Danielson cfmd at bredband.net
Sun Feb 1 14:37:01 CET 2004


From: Rainer Buchty <buchty at cs.tum.edu>
Subject: Re: [sdiy] IN your mind, what is ....
Date: Sun, 1 Feb 2004 13:29:41 +0100 (CET)
Message-ID: <Pine.LNX.4.56.0402011312460.15581 at atbode100.informatik.tu-muenchen.de>

> > Zilog's Z-80 was yet another 8080 spinoff, basically, and it was everywhere
> > in Sinclairs and such.
> > IIRC, the progression went 4004 > 8008 > 8080 > Z-80 > 8085 > 8086 > 80186 >
> > '286 > '386 > '486 > Pentium and Celeron.
> 
> Not entirely true, the Z80 is a "sidearm" of the 8080/8085; the Z80
> features did not evolve into the 8086, which in turn is not really a
> direct offspring of the 8085 but rather started a new family.

Actually, Zilog's Z-80 came from essentially the same design team as the 8080,
but now working under their own flag, that is, Zilog.

There is an interesting line of development around the 8080/8085 successor, the
8088/8086. Not only did they have the well-known floating-point unit, the 8087
(which is stack-oriented despite the fact that stack-oriented processing was
already out of fashion; the escaped instruction space simply wasn't large
enough for anything else), but there was also the I/O processor 8089 and the
object-oriented, dynamic-memory-supporting OS chips 80130 and 80150. Amazing
stuff, really. That was Intel's attempt to bridge the semantic gap between what
the processors did and the abstractions happening in the software industry
(object orientation was not big in real applications, but ultra-cool around
1980). This was also the forerunner of the iAPX 432 family, the 32-bit
architecture with about the same features, but also adding redundancy and
multiprocessing. That really didn't fly, though, and then the RISC thing
happened too.

I still lack a lot of info on all this stuff, but I do know the basic plot.

> Just as the 68000 is not really an architectural successor to 6809.

Indeed.

> The 80186 in turn is basically a microcontroller built around the 8086,
> i.e. containing timers and a UART; it also introduced some new commands
> (e.g. ENTER/LEAVE).

I've always considered the 80186/80188 an embedded version of the 8086/8088,
one that fits various needs outside those of normal "desktop" computing. For
instance, I have one sitting in the graphical engine of my Tek 11402 scope,
which has an 80286 as its main processor. Whoaaa! Now that's computing power! ;O)

As a side note, my HP 4195A Network/Spectrum Analyser has a dual 68000 setup,
one as the main processor and one for DSP-style processing. It also has a NEC
7220 display controller, which greatly speeds up things like drawing circles
and lines.

> It is also noteworthy, that the Pentium I was the last classic CISC
> machine. With the Pentium Pro (which then led to the Pentium II) those
> CPUs just look like CISC to the programmer, but internally they are
> completely different (depending on the generation either some sort of RISC
> machine or VLIW).

Indeed. They did VLIW in the i860 project, and the i960 stuff was quite
impressive for its relatively slow clock. However, the i960 used register
scoreboarding to keep unique resources from causing trouble (an instruction
locks the resources it needs, and another instruction can only execute in
parallel if it doesn't use the same resources), while the Pentiums not only use
register renaming (the registers as seen from the programmer's point of view
are renamed on the fly onto a much larger set of physical registers) but also
memory renaming (think of stack-based variables as extra registers and you see
why this is a wise move). With tricks like that, a small register file like the
one inherent in the x86 family (and traceable back to the 4004 and 8008!) is
not as disastrous as it used to be; in fact, the compact instruction set
becomes an advantage, since it doesn't trash caches as much as 32/64-bit
instructions or, worse yet, VLIW. Add a number of other issues and you really
start to wonder what is so great about IA-64, except the amount of money spent
on the project. Here the Alpha stuff seems more logical. But those
architectures, I must confess, I know too little about.
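To make the contrast concrete, here is a toy sketch of the two schemes
described above (all names and structures are purely illustrative, not any
real pipeline): a scoreboard stalls a second write to a busy register, while
renaming gives each write a fresh physical register so both can be in flight.

```python
# Toy models of the two hazard-handling schemes mentioned above.
# Names and structure are illustrative only, not any real pipeline.

# i960-style scoreboarding: an instruction may issue alongside another
# only if it touches none of the registers already locked (busy).
def can_issue(busy, dst, srcs):
    return dst not in busy and all(s not in busy for s in srcs)

# Pentium Pro-style register renaming: every architectural write is
# mapped to a fresh physical register, so two writes to "eax" no
# longer conflict with each other.
class Renamer:
    def __init__(self, initial_map):
        self.map = dict(initial_map)  # architectural -> physical
        self.next_phys = 0

    def rename(self, dst, srcs):
        phys_srcs = [self.map[s] for s in srcs]  # read current mapping
        phys_dst = f"p{self.next_phys}"          # fresh destination
        self.next_phys += 1
        self.map[dst] = phys_dst
        return phys_dst, phys_srcs

# Scoreboard: a second write to eax stalls while the first is in flight.
stalled = not can_issue(busy={"eax"}, dst="eax", srcs=["ebx"])

# Renaming: the same two writes get distinct physical registers,
# so the false write-after-write dependency disappears.
r = Renamer({"eax": "a0", "ebx": "b0"})
d1, _ = r.rename("eax", ["ebx"])   # eax = f(ebx)
d2, _ = r.rename("eax", ["ebx"])   # eax = g(ebx)
independent = d1 != d2
```

The point of the sketch: under a scoreboard the architectural register itself
is the contended resource, while renaming makes the architectural name just a
label, which is why a tiny register file hurts so much less than it appears to.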

> > Further off topic, intel maintains 2 separate CPU design teams; they
> > work on alternate generations in a game of R&D leapfrog (of course, they
> > share information, too). Maybe this is why only IBM is keeping up with
> > intel in the CPU race.
> 
> Plus, it's the money Intel can afford to pump into R&D. I remember a press
> conference in Munich where Hans Geyer of Intel was asked if they fear the
> competition with AMD (who had recent success with the K6 back then) and
> his answer was something like "No, why? We put more money into research
> than they [AMD] have revenues..."

The amount of money they can spend on R&D does not necessarily mean that they
will end up with a good product. It can even be argued that it works against
them. It boils down to management ability and the mentality of the staff to
handle it well. The bigger your R&D spending, the less accurate each project
can be while things still work out one way or another. It's a paradox, but
there you have it; that's life.

> > There will be pure analog DIY as long as parts are available. But this
> > "digital stuff" is an increasing part of DIY.
> 
> One of the more interesting developments is that at some stage everybody
> started to diss analog design, also in education. It was just not
> fashionable anymore, somewhat tainted in the new, clean digital ages where
> everyone could build machines from Lego parts ... erm ... 74xx TTL chips.

Indeed. Now both Lego and CMOS/TTL chips are becoming less important (some
CMOS/TTL stuff will still be needed as simple glue logic for quite some time).

> Then frequencies started to rise higher and higher, and all of a sudden
> all this analog wickedness started to pop up again playing a major role
> in digital design...

You end up teaching engineers basic analog properties, and as frequencies rise
you see multiple gigahertz on the PCB; all of a sudden transmission-line
knowledge and PCB-layout details become acutely important. Being able to pass
EMC tests poses another hurdle, yet another form of "black magic" to digital
designers. Oh, if they only paid attention to the right details. Then they
think all the wrong things about time and frequency too. Things like jitter
and wander are also "strange", and when they get up to speed they learn about
them from app notes and design rules, which they follow. They eventually learn
to read the small print of the datasheets, but they don't really understand
why it is written the way it is. They start to need IBIS simulations at about
100 MHz of system clock for a digital interface, and for the MGTs they are
looking at full-blown SPICE simulations. What used to be simple static timing
analysis has blown up in their faces. Toss hot-swap into this brew and you
will see more of the analog properties putting digital thinking into a defunct
state. Multiple clock domains are another field that causes headaches. Digital
I/O like fibre optics also opens a gate to the "analogue hell" ;O)
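As one small example of the transmission-line arithmetic that suddenly matters
in PCB layout, here is a sketch using the classic IPC-2141 approximation for
the characteristic impedance of a surface microstrip (the stackup numbers
below are purely illustrative):

```python
import math

def microstrip_z0(h, w, t, er):
    """Approximate characteristic impedance (ohms) of a surface
    microstrip trace: h = dielectric height, w = trace width,
    t = trace thickness (all in the same unit), er = relative
    permittivity of the board material. Classic IPC-2141 formula,
    valid roughly for 0.1 < w/h < 2.0 and 1 < er < 15."""
    return (87.0 / math.sqrt(er + 1.41)) * math.log(5.98 * h / (0.8 * w + t))

# An illustrative FR-4 outer layer: 0.2 mm prepreg, 0.3 mm wide
# trace, 35 um copper; lands in the neighbourhood of 50 ohms.
z0 = microstrip_z0(h=0.2, w=0.3, t=0.035, er=4.3)
```

This is exactly the kind of back-of-the-envelope check a "purely digital"
designer never needed at 10 MHz but cannot avoid once edges get fast enough
for the trace to act as a transmission line.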

So, analog is not dead; it will never die. It has just changed shape in a
digital-oriented fashion, such that you can hide many of the analog details and
work with more and more lumped aspects for a large part of the design, meaning
quicker designs. But there is still a need for knowledge of the details,
especially when you are outside of this controlled environment.

Then some people juggle their clock to "solve" EMC problems - what a broken
attitude... That only works for a certain set of designs, and if you have
another design you must learn how to do it properly anyway, so why not do it
properly to start with?

Cheers,
Magnus - looking with analogue eyes at a digitalized world


