[sdiy] Can't find 1-bit ADCs

Mike Bryant mbryant at futurehorizons.com
Sat Jun 11 14:10:36 CEST 2022


Sorry, but whilst Dan has his opinions, they are not widely shared, notably his advocacy of lower sampling rates.

In this case, the quality of an internal clock can never exceed the performance of an external clock.  With very careful design it might approach it, but it can never be better.  The reason is simple: the XIN pin on the crystal oscillator is the most critical node in any timing generator.  More specifically, it is the resistor from the XIN pin to ground that performs the current-to-voltage conversion at this point.  Any noise on the ground end of that resistor will flow through the resistor and modulate the frequency of oscillation, causing jitter.  In an external clock generator there is no audio signal present that can leak onto the ground, and the multiple outputs are usually transformer-isolated from the drivers, giving even more isolation.  Hence there is never any audio-induced jitter in the 96kHz sync signal.  This will be PLL'ed up to a higher frequency for the ADCs and DACs, but it is only the 96kHz sync signal that matters for audio performance.  The only way to match this with an internal clock is to provide an isolated supply and ground to the clock generator, and then transformer- or optically-couple its output to the DAC, which is in effect an external generator that just happens to be in the same box.
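The jitter mechanism described above is easy to demonstrate numerically: perturb the sample instants with Gaussian timing noise and a sampled tone picks up an error whose RMS works out to roughly 2*pi*f0*t_jitter/sqrt(2) relative to full scale. A minimal NumPy sketch (the 1 ns jitter figure is an arbitrary illustrative value, not taken from this thread):

```python
import numpy as np

fs = 96_000        # sample rate (Hz)
f0 = 1_000         # full-scale test tone (Hz)
n = np.arange(8192)
rng = np.random.default_rng(0)

jitter_rms = 1e-9  # 1 ns RMS timing noise -- assumed for illustration

t_ideal = n / fs
t_actual = t_ideal + rng.normal(0.0, jitter_rms, n.size)

x_ideal = np.sin(2 * np.pi * f0 * t_ideal)
x_jittered = np.sin(2 * np.pi * f0 * t_actual)

# For small timing errors dt, the sample error is ~ 2*pi*f0*dt*cos(...),
# so its RMS is about 2*pi*f0*jitter_rms/sqrt(2) relative to full scale.
err_rms = np.sqrt(np.mean((x_jittered - x_ideal) ** 2))
print(f"jitter-induced error floor: {20 * np.log10(err_rms):.1f} dBFS")
```

Note that the error scales with the signal frequency f0, not the sample rate, which is why ground-coupled audio leakage into the clock node is so audible.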

In any system you cannot have more than one clock generator, otherwise they will never be in sync and you will need sample rate convertors, which generate huge amounts of jitter.  But if you allow each following device to regenerate its clock from its AES, SPDIF or Ethernet input, and then use that device's output to clock the next one, the jitter will accumulate across each device, and by the end of the studio or live audio chain the resulting audio will be truly horrible.  Hence an external clock generator driving every digital device in parallel is the correct, and indeed only, solution for any studio or large-scale live audio installation.
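The accumulation described above behaves like a random walk: if each clock-recovery stage adds its own independent timing error on top of the error it received, the RMS jitter grows with the square root of the number of cascaded devices, rather than staying flat as it does with a parallel distribution. A rough sketch, with an assumed (purely illustrative) 100 ps contribution per stage:

```python
import numpy as np

rng = np.random.default_rng(1)
per_device_jitter = 100e-12   # 100 ps RMS added per recovery stage -- assumed
n_devices = 8
n_samples = 100_000

# Each device re-derives its clock from the previous device's output,
# so its timing error is the previous error plus its own contribution.
jitter = np.zeros(n_samples)
for stage in range(1, n_devices + 1):
    jitter = jitter + rng.normal(0.0, per_device_jitter, n_samples)
    print(f"after device {stage}: {np.std(jitter) * 1e12:6.1f} ps RMS")
```

After eight stages the accumulated jitter is around sqrt(8) ~ 2.8 times the per-device figure, whereas devices clocked in parallel from one generator each see only their own single contribution.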

But I totally agree with the TASCAM engineer.  Getting the signal and ground routing correct is paramount in any good design, and something most designers get horribly wrong far too often.

Regarding the use of delta-sigma versus multi-bit, you seem to be confusing a full multi-bit convertor, which I don't think anybody even makes nowadays, with the multi-bit delta-sigmas currently used in '32 bit(sic)' ADCs.  Whilst the three to five bits used will give better performance than a single-bit design, the types of noise aren't eliminated, just reduced by 3dB (not 6dB) per extra bit.  In any case, full 16-bit convertors not using delta-sigma, such as the PCM5x devices from the 1980s, sound horrible, and I am sure the OP wants better quality than that even if he is also after DC capability.




-----Original Message-----
From: Synth-diy [mailto:synth-diy-bounces at synth-diy.org] On Behalf Of brianw
Sent: 11 June 2022 05:12
To: synth-diy at synth-diy.org
Subject: Re: [sdiy] Can't find 1-bit ADCs

As Dan Lavry points out, no external clock can possibly exceed the quality of a properly-designed internal clock. The key is that the internal circuitry can introduce jitter if it's not properly designed, and in those situations a "studio quality clock" cannot possibly help, because the clock still passes through the internal circuitry on the way to the DAC chip. No magic exists to get around this requirement, no matter who makes the studio quality clock or how much it costs.

Thus, you really have to understand PCB design and layout to avoid grounding and other issues that might introduce jitter into the sample clock. I once talked to a TASCAM design engineer who pointed out that the quality of the company's digital audio products improved dramatically once they understood the importance of the clock circuits, including the layout of the traces that carry the clock signals.


There are other sources of noise besides clock jitter, though, and delta-sigma ADCs have particular kinds of noise that aren't present in multi-bit ADCs. Every ADC needs a low-pass filter before the converter to prevent aliasing. A common technique in modern ADCs is to oversample and then implement the anti-aliasing filter digitally. However, a 1-bit delta-sigma doesn't have enough bits to properly implement the dither for the sample rate conversion in the digital filter, and thus distortion and idle tones are present. Although it is indeed possible to convert between DSD and PCM and vice versa, it's the original ADC process where the distortion is introduced.
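The effect of dither on a coarse quantiser is easy to see in simulation: without it, the quantisation error is correlated with the signal and shows up as harmonic distortion; with TPDF dither the same error becomes benign broadband noise. A hedged sketch (the step size and tone level are arbitrary, chosen only to make the effect obvious on a very coarse quantiser):

```python
import numpy as np

fs, nfft = 48_000, 8192
k = 171                        # FFT bin of the test tone (~1 kHz, bin-aligned)
f0 = k * fs / nfft
t = np.arange(nfft) / fs
x = 0.4 * np.sin(2 * np.pi * f0 * t)

q = 0.25                       # very coarse quantiser step, roughly 3-bit
rng = np.random.default_rng(2)
# TPDF dither: sum of two uniform variables spanning +/- q/2
tpdf = rng.uniform(-q/2, q/2, nfft) + rng.uniform(-q/2, q/2, nfft)

plain = q * np.round(x / q)               # undithered quantisation
dithered = q * np.round((x + tpdf) / q)   # TPDF-dithered quantisation

def band_db(sig, centre, half=2):
    """Power in a few bins around `centre`, relative to the fundamental."""
    s = np.abs(np.fft.rfft(sig * np.hanning(nfft)))
    p = np.sum(s[centre - half:centre + half + 1] ** 2)
    ref = np.sum(s[k - half:k + half + 1] ** 2)
    return 10 * np.log10(p / ref)

h3_plain = band_db(plain, 3 * k)
h3_dith = band_db(dithered, 3 * k)
print(f"3rd harmonic, undithered: {h3_plain:6.1f} dBc")
print(f"3rd harmonic, dithered:   {h3_dith:6.1f} dBc")
```

The undithered case shows a distinct third-harmonic spur, while the dithered case trades it for a slightly raised but signal-independent noise floor, which is the behaviour a properly dithered multi-bit front end relies on.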

The idle tones completely kill any hope of DC accuracy, and the distortion doesn't help much, either.

It's for this reason that the best implementations use at least five bits for the front end of delta-sigma ADC, and then implement the anti-aliasing filter in a way that can be properly dithered without distortion or idle tones. Once the ADC process is completed (including anti-aliasing), the digital samples can be output in DSD or PCM via appropriate conversion.


From the description of the project, it seems like there is no reason to prefer a 1-bit ADC. Where DC accuracy is important, a multi-bit ADC is necessary. 16-bit is certainly sufficient for CD-quality audio.

If the interface out of the ADC and into the DAC needs to be serial, that does not mean the ADC and DAC need to be 1-bit delta-sigma. There are plenty of I2S interfaces that handle multi-bit samples (zero-padded in the LSBs if less than 24-bit, or even less than 16-bit).
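As an illustration of that zero-padding, here is how a signed 16-bit sample is typically left-justified in a 24-bit I2S slot, with zeros in the low bits (the function names are mine, for illustration, not from any particular driver API):

```python
def pack_16_in_24(sample_16: int) -> int:
    """Left-justify a signed 16-bit sample in a 24-bit I2S slot.

    The sample's MSB lands in the slot's MSB and the bottom 8 bits
    are zero, which is how narrower samples ride in a wider frame.
    """
    if not -32768 <= sample_16 <= 32767:
        raise ValueError("not a 16-bit sample")
    # Masking to 16 bits preserves the two's-complement pattern.
    return (sample_16 & 0xFFFF) << 8

def unpack_24_to_16(slot_24: int) -> int:
    """Recover the 16-bit sample, discarding the zero-padded LSBs."""
    val = (slot_24 >> 8) & 0xFFFF
    return val - 0x10000 if val & 0x8000 else val
```

For example, full-scale negative (-32768) packs to 0x800000, and any 16-bit value survives a pack/unpack round trip unchanged.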

Brian Willoughby


On Jun 10, 2022, at 4:32 PM, Mike Bryant <mbryant at futurehorizons.com> wrote:
> You have to have the same clock everywhere no matter how it is distributed, otherwise you'll need flexible length buffers and possibly sample rate convertors.
> 
> The jitter noise I am thinking of is where the audio signal itself gets into the PLL inside the ADC or DAC, usually due to incorrect grounding.  That noise will differ on a channel by channel, and sample by sample basis, but is a common problem with digital audio systems.
> 
> But a studio quality clock is always a good investment.  I use one from Drawmer, but others are available.
> 
> -----Original Message-----
> From: cheater cheater
> 
> All ADC/DAC pairs could/would be connected to the same homogenous clock source spread around the room via coax, and there would be no delay between the output of the ADC and the input to the DAC - think this alleviates some jitter worries? If you call the delay of clock pulses compared to the perfect clock at sample n d[n], my guess is that variance between d[n] and d[n+1] would be much lower than general variance between d[n] and d[n+k] for some k that's up to, say, 64.
> 
> I could just have a single clock for all those ADC/DAC pairs - which means I could blow out on a high precision clock - think that would help?
> 
> On Sat, Jun 11, 2022 at 1:15 AM Mike Bryant <mbryant at futurehorizons.com> wrote:
>> If you are sampling at 500kHz, then the out-of-band noise extends from say 18kHz up to 250kHz.   Using a higher sampling rate does mean there is less noise from the additive jitter in the audio region, as it is spread more thinly, but other sources of noise come into play at higher sampling rates, especially if you are using a lower number of bits, something the advocates of SACD tend to ignore.  For example, a few ns of clock jitter due, say, to poor crosstalk on a 48kHz sampling signal isn't too bad, but the same absolute jitter on a 2.3 MHz clock will affect the audio quality.
>> 
>> On Sat, Jun 11, 2022 at 12:28 AM Mike Bryant <mbryant at futurehorizons.com> wrote:
>>> Well I googled " bergkristall cable elevator pyramids" and didn't 
>>> find anything :-)
>>> 
>>> The digitised audio signal contains the original signal plus various noise and distortion products, which hopefully, if the correct dither has been applied at the ADC, are out of band.  But some will be only just out of band, and hence even if you are sampling at 96/192/384 kHz, you still need a fast-cutoff filter.  This inherently has more group delay (not sample delay, which can actually be as low as a single sample) than one with a slower cutoff.   The best ADC-DAC systems used to achieve about 2 to 3 ms, but in recent years this seems to be down to as low as 800 us.  However I am not convinced they sound quite as good in live audio systems, but that's a subjective opinion of course.
>>> 
>>> On Fri, Jun 10, 2022 at 11:02 PM Mike Bryant <mbryant at futurehorizons.com> wrote:
>>>> You can use DSD at any frequency if it's a self-contained system.  Bear in mind that you will still need to move the quantisation noise and distortion products out of band at the D-A conversion, at which point you will have to introduce some delay.
>>>> 
>>>> Also you need error correction on the transmission medium because DSD does propagate errors, whereas with PCM they just affect one sample.  Of course if it's the MSB of a PCM sample that error will sound like a click, but with DSD the error effect fades away over a much longer time.   But with a perfect transmission medium, there is in truth no real performance difference between the methods for the same bit rate, despite all the numerous webpages written on the subject.


_______________________________________________
Synth-diy mailing list
Synth-diy at synth-diy.org
http://synth-diy.org/mailman/listinfo/synth-diy
Selling or trading? Use marketplace at synth-diy.org
