[sdiy] CD sound quality evolution

Mike Bryant mbryant at futurehorizons.com
Mon Feb 15 23:40:03 CET 2021

No, they weren't.  I don't know which year you are referring to, but early BB DACs were at best 14 bits, maybe 15 for good ones.  And they sounded dreadful - almost everyone working in pro audio agreed.  The Sony 1-bit DACs released in 1987 quickly replaced them in most digital audio installations.

In 1983 we had already demonstrated near-20-bit performance at 80 kHz sampling at HP.  It wasn't commercially viable at the time as it couldn't be integrated into a single IC, but it showed it was possible if Philips and Sony had just waited a little.  Our work was based on a Philips paper anyway, so I suspect their engineers also had a 20-bit system working, but marketing probably pushed to get it out of the door before its time.  The other problem was the lasers weren't good enough to get 45 minutes of 20-bit data at the time, so we would have had to wait for those as well.

Totally agree with your later comments.  A lot of the work on dithering was done by lots of people, not just in the audio field.  In fact audio tends to follow other fields, not lead.  The headroom issue only arises when you have just 16 bits.  But even now most CDs are over-compressed for my liking.  Super Audio CD solved that problem but never took off.  

The Phil Collins problem was infamous at the time, but that was long before the Internet.  It was caused by a particular pattern confusing the error correction on some, but not all, CD players.  They recalled them all and replaced them with a new mastering that removed the problem sequence.  And of course record companies then realised they had to test CDs on more than one machine - a new experience for them, of course.

-----Original Message-----
From: Brian Willoughby [mailto:brianw at audiobanshee.com] 
Sent: 15 February 2021 21:52
To: Mike Bryant
Cc: synth-diy at synth-diy.org
Subject: Re: [sdiy] CD sound quality evolution

On Feb 15, 2021, at 10:32, Mike Bryant wrote:
> No.  Because the 16 (sic) bit Burr-Brown DACs in every CD player were crap,

I don't know ;-)  The Burr-Brown 16-bit DAC chips were quite good ... a lot better than the 14-bit DAC chips that shifted through four quantization step sizes to make up for the fact that full 16-bit resolution wasn't affordable yet.

I've heard some great BB 16-bit DAC setups. It probably has more to do with source material than with chip quality. Burr-Brown were the golden reference for about a decade or more. Admittedly, the digital, conversion, and analog reconstruction circuits have all improved massively since those days.
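The multi-step-size idea is essentially gain ranging. Here's a toy sketch of the concept - purely a hypothetical illustration, not the actual chip's architecture:

```python
# Toy sketch of gain ranging: represent a signed 16-bit sample as a
# 14-bit code plus one of four step sizes (bit shifts 0..3).  Only an
# illustration of the concept; not how the real DAC chips were built.

def encode_gain_ranged(sample16):
    """Map a signed 16-bit sample to (code14, shift)."""
    for shift in range(4):            # four steps, though 0..2 already cover 16 bits
        code = sample16 >> shift      # arithmetic shift preserves the sign
        if -8192 <= code <= 8191:     # fits in 14 signed bits?
            return code, shift
    raise ValueError("sample out of 16-bit range")

def decode_gain_ranged(code14, shift):
    """Reconstruct the (quantized) value the 14-bit DAC would output."""
    return code14 << shift

print(encode_gain_ranged(100))     # small signals keep full 14-bit resolution
print(encode_gain_ranged(23456))   # large signals trade step size for range
```

Small signals get the finest step size, so low-level detail (where the ear is most sensitive) isn't thrown away; only loud passages pay the coarser-step penalty.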

> the ADCs in the digitization studio weren't much better, and good compressors were rare, so a lot of CDs were recorded with so much headroom that a lot of the time the output was down towards the noise floor.  That's why many CDs were remastered in the 1990s.

It took the industry quite a while to learn how digital really works. The headroom is not really the issue; the real issue is the lack of good dithering. Modern 24-bit mastering with proper dithering can achieve excellent audio quality without dynamics compression. You just have to decorrelate the quantization noise by using proper dither. The human hearing system finds correlated quantization noise highly objectionable, but uncorrelated noise appears all around us in nature, and our brains can easily ignore it. See just about any of the Audio Engineering Society papers by John Vanderkooy and/or Stanley P. Lipshitz.
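To make the dither point concrete, here's a minimal numpy sketch: a tone of only 0.4 LSB amplitude vanishes entirely under plain rounding, but survives (buried in benign, uncorrelated noise) once TPDF dither is added before the quantizer:

```python
# Why dither matters: quantize a tone of only 0.4 LSB amplitude to
# 16 bits with and without TPDF dither, then look at the 1 kHz FFT bin.
import numpy as np

rng = np.random.default_rng(0)
n = 44100
t = np.arange(n) / 44100.0
amp = 0.4 / 32767.0                        # tone peaks at just 0.4 LSB
x = amp * np.sin(2 * np.pi * 1000 * t)

def quantize16(signal, dither=False):
    scaled = signal * 32767.0
    if dither:
        # TPDF dither: sum of two uniform noises, 2 LSB peak-to-peak
        scaled = scaled + rng.uniform(-0.5, 0.5, n) + rng.uniform(-0.5, 0.5, n)
    return np.round(scaled) / 32767.0

plain = quantize16(x)                      # every sample rounds to zero: tone gone
dithered = quantize16(x, dither=True)      # noisy, but the tone survives

bin_1k = 1000                              # 1 kHz lands exactly on bin 1000
print("undithered 1 kHz energy:", abs(np.fft.rfft(plain))[bin_1k])
print("dithered   1 kHz energy:", abs(np.fft.rfft(dithered))[bin_1k])
```

Undithered, the quantizer's error is a deterministic (highly correlated) function of the signal; dithered, the error becomes signal-independent noise, which is exactly the property the Vanderkooy/Lipshitz papers formalise.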

> Also remember the Phil Collins debacle where the CD stopped playing half way through because a particular bit sequence matched the end of track flag.

Can you provide any detailed references for this?

I have studied the CD Red Book specification for years, and there is no possibility of an audio bit sequence getting confused with end of track flags. The TOC (Table Of Contents) is one source of track start time stamps, and it's completely separate from the audio. There is also the 8-bit subcode control data that includes a bit for track audio versus silence (track start and end), but this is also completely separate from the audio data.

In other words, CDDA can include literally any pattern of 16-bit stereo 44.1 kHz digital audio without any risk of "end of track" issues. If there is an issue, it's in the non-audio meta data. Of course, there are a few CD players, at least old ones, that misinterpret the CDDA spec, and might have problems. The CDDA spec itself is a bit vague, while also somewhat redundant, leading to manufacturers misinterpreting things that weren't 100% spelled out.
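For reference, here's roughly where that track metadata actually lives - a sketch of decoding a mode-1 Q subcode frame, with the field layout as I read it from the spec (CRC verification and full control-bit decoding omitted):

```python
# Sketch of decoding a mode-1 Q subcode frame: the 12 bytes per sector
# carrying track/index/timing metadata, entirely separate from the
# audio samples.  Layout per my reading of the Red Book; CRC check omitted.

def bcd(b):
    """Decode one binary-coded-decimal byte (e.g. 0x59 -> 59)."""
    return (b >> 4) * 10 + (b & 0x0F)

def parse_q_mode1(q):
    assert len(q) == 12
    return {
        "control": q[0] >> 4,    # audio/data, pre-emphasis, copy-permit bits
        "adr": q[0] & 0x0F,      # 1 = mode-1 (position) data
        "track": bcd(q[1]),      # 0xAA in the lead-out
        "index": bcd(q[2]),      # index 0 = pre-gap/pause
        "rel_time": (bcd(q[3]), bcd(q[4]), bcd(q[5])),  # MIN:SEC:FRAME in track
        "abs_time": (bcd(q[7]), bcd(q[8]), bcd(q[9])),  # disc time (q[6] is zero)
        # q[10:12] hold a CRC-16 over the preceding bytes (not checked here)
    }

# Hypothetical frame: track 3, index 1, 0:12:34 into the track
frame = bytes([0x41, 0x03, 0x01, 0x00, 0x12, 0x34, 0x00, 0x02, 0x50, 0x34, 0, 0])
print(parse_q_mode1(frame))
```

Note there isn't a single audio byte in there: a compliant player finds track boundaries entirely from this side channel (and the TOC), never from the sample data.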

It's entirely possible that the Phil Collins CD was not Red Book compliant.

Where you can get into trouble with audio data is on encodings that go beyond CDDA. I'm a huge fan of DTS Surround discs. I licensed the DTS encoder so I can produce my own surround mixes. The problem here is that the "digital audio" is no longer real audio. There are bit patterns that identify DTS, and restrictions on the amplitude, so there can be problems with decoders that support surround if they get confused by pure CDDA that happens to include a special sequence of samples. In these cases, you have to take care to turn off the surround detection on CDDA sources to avoid the rare chance of glitches.
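A naive illustration of that detection problem, assuming the usual 16-bit big-endian DTS core sync word 0x7FFE8001 (a sketch only; real detectors are more careful than a bare byte search):

```python
# Why a surround decoder can be fooled: it scans incoming "PCM" for a
# DTS core sync word (0x7FFE8001 in the 16-bit big-endian framing).
# Any CDDA that happens to contain these bytes could trigger a false
# DTS detection.  A sketch, not a real detector.

DTS_SYNC = bytes([0x7F, 0xFE, 0x80, 0x01])

def looks_like_dts(pcm_bytes):
    """Return the offset of the first apparent DTS sync word, or -1."""
    return pcm_bytes.find(DTS_SYNC)

music = bytes([0x12, 0x34, 0x7F, 0xFE, 0x80, 0x01, 0x00, 0x00])
print(looks_like_dts(music))   # an innocent run of samples matches at offset 2
```

Real decoders also validate the frame header that follows the sync word and wait for the pattern to repeat at the expected frame interval, which makes false triggers rare - but with arbitrary CDDA they can never be impossible, hence the advice to disable surround detection on plain CD sources.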

> Everybody was saying "why the f*** didn't they just wait until 20 bits was available".  Quite a few artistes refused to do CDs until the quality improved when ADC dithering and 1 bit DACs came along.

Yep. It actually took a while for hardware to evolve from 14-bit to full 16-bit. I don't think anyone anticipated 20-bit or 24-bit improvements, especially not the ability to use dither to squeeze extra perceived resolution out of 16-bit CDDA. But the industry has learned so much as digital has matured.

