[sdiy] CV input op-amp circuit

Brian Willoughby brianw at audiobanshee.com
Mon Dec 7 05:56:51 CET 2020


No, I don't really stick to the same chip for each design, but I do tend to end up with Texas Instruments converters more than any other. So, I guess that counts as a go-to brand.

There are many categories of ADC, and you will note that even Texas Instruments has different teams for High Precision versus High Speed converters. Sometimes I can't even figure out which category my application falls into. I actually make a habit of searching Mouser - casting a wide net and then narrowing down to the task at hand.

The biggest division in my view is between "audio" ADC and "control" ADC.

Practically all 24-bit ADC chips use delta-sigma conversion, which makes it impractical to place an analog multiplexer in front of the ADC to expand the channel count. Because a ∆Σ converter's modulator and decimation filter carry history from one sample to the next, you can't swap in an unrelated channel without getting erroneous conversions until the filter settles. Thus, if you think you need 24-bit, then it had better be dedicated audio channels only, not the typical control signals. The advantage of these is that you often get a full CODEC rather than just an ADC, and the sample clock control is much more accurate. I'd love to design with Asahi-Kasei, but I tend to leave audio conversion to existing products rather than design new ones. All of my favorite existing audio conversion products use AK chips.

ADC chips that have built-in multiplexers and/or are compatible with external multiplexers are limited to 8-bit, 10-bit, 12-bit, or 16-bit at the most. I don't think I've seen anything with more than 16 bits of accuracy, and far more often 12 bits is a practical limit. The advantage of these is that sample rate can be highly flexible, as well as channel count. Even the chips with an internal mux can be expanded with an additional external mux.
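
The expansion pattern in firmware is nothing fancy. Here's a rough, untested sketch; mux_select(), delay_us() and adc_read_raw() are placeholders for whatever GPIO/SPI layer and settling time your particular mux and ADC actually need:

#include <stdint.h>

/* Hypothetical board-support calls -- substitute your own GPIO/SPI layer. */
extern void     mux_select(uint8_t channel);   /* drive the external mux address lines */
extern void     delay_us(uint32_t us);         /* settling time before conversion      */
extern uint16_t adc_read_raw(void);            /* one conversion from the ADC          */

/* Scan an 8-channel external mux feeding a single ADC input. */
void scan_mux_channels(uint16_t results[8])
{
    for (uint8_t ch = 0; ch < 8; ch++) {
        mux_select(ch);
        delay_us(10);              /* let the mux output and any RC filter settle */
        results[ch] = adc_read_raw();
    }
}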

For one design, I needed 16 channels of high-sample-rate ADC, and I went with the ADS7951 or ADS7953. These are in the ADS79xx family, and have a compatible command structure. They're rather expensive, though. They bring out the link between mux and ADC to external pins, so you can add buffering or filtering - or simply access the ADC without going through the mux. These guys can convert at 1 MHz, which is probably more than is needed for most projects.
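
The software side of the ADS79xx parts is just 16-bit SPI frames. Something like the loop below; build_manual_mode_command() stands in for the command-word bit layout, which you should copy from the datasheet rather than trust my memory, and the two-frame pipeline is how I recall it working - verify before relying on it:

#include <stdint.h>

/* Hypothetical full-duplex 16-bit SPI transfer -- replace with your driver. */
extern uint16_t spi_transfer16(uint16_t tx);

/* Placeholder: pack the ADS79xx manual-mode command word for the given
 * channel.  Exact bit positions come from the datasheet. */
extern uint16_t build_manual_mode_command(uint8_t channel);

/* Read all 16 channels.  As I recall, the result returned in each frame
 * corresponds to the command sent two frames earlier, so the first two
 * frames just prime the pipeline. */
void ads79xx_scan(uint16_t results[16])
{
    spi_transfer16(build_manual_mode_command(0));  /* prime frame 1 */
    spi_transfer16(build_manual_mode_command(1));  /* prime frame 2 */

    for (uint8_t ch = 0; ch < 16; ch++) {
        uint8_t next = (uint8_t)((ch + 2) % 16);
        /* Low 12 bits are the conversion result; check the output format
         * in the datasheet for your configuration. */
        results[ch] = spi_transfer16(build_manual_mode_command(next)) & 0x0FFF;
    }
}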

I've also selected Microchip ADC chips for low cost, but I ended up not using the ADC expander in the final product, so I have no experience with how well that works.

Strangely, I've seen way more variety in external DAC chips. There, one can find both parallel and serial interfacing, and both current output and voltage output. I have one design that uses a TI DAC that runs at 125 MHz (that's right, not merely 125 kHz) with a 14-bit parallel interface and current output. After converting from current to voltage, this DAC can be multiplexed to multiple channels, which makes the high sample rate more useful.
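
The refresh loop for that kind of multiplexed DAC is simple; here's a sketch only, with dac_write(), sample_hold_select() and the acquisition delay as placeholders for the parallel-bus write, the demux/sample-and-hold address lines, and whatever timing your hold capacitors need:

#include <stdint.h>

#define NUM_CV_CHANNELS 8

/* Hypothetical hardware hooks -- replace with your parallel-bus write and
 * the address lines of the demultiplexer / sample-and-hold bank. */
extern void dac_write(uint16_t code);           /* 14-bit code to the fast DAC   */
extern void sample_hold_select(uint8_t chan);   /* route DAC output to channel n */
extern void delay_us(uint32_t us);              /* acquisition time for the S&H  */

static uint16_t cv_value[NUM_CV_CHANNELS];      /* current target for each channel */

/* Called periodically (e.g. from a timer): refresh every sample-and-hold in
 * turn.  The DAC's high update rate is what lets you walk all the channels
 * fast enough that droop on the hold capacitors never matters. */
void refresh_cv_outputs(void)
{
    for (uint8_t ch = 0; ch < NUM_CV_CHANNELS; ch++) {
        dac_write(cv_value[ch]);
        sample_hold_select(ch);
        delay_us(5);               /* let the hold capacitor charge */
    }
}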

Bottom line: I wouldn't hesitate to rely on Mouser to search for an appropriate ADC, and just stick to established brands. I've noticed that smart chip designs place the digital interface pins on the opposite side of the chip from the analog interface pins, so you can even divide your PCB between analog and digital domains. There's still a lot of circuit design necessary to obtain clean ADC performance, but getting rid of the CPU clock in the same chip can get you a long way towards the goal.

Brian Willoughby

p.s. Some processors have the ability to shut down the execution unit during an A/D conversion to reduce noise. That kind of feature isn't compatible with USB - at least not where I wanted to use it - because USB demands a reasonably low response time. It's doubtful that the STM32 can shut down for the duration of an A/D conversion, but it might be worth looking into whether that's an option.
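
If anyone wants to experiment, the general pattern on an STM32 would look something like the sketch below (assumes a HAL/CubeMX project with hadc1 already configured for an end-of-conversion interrupt; whether idling the core this way actually lowers the ADC noise on a given part is exactly the open question):

#include "stm32f4xx_hal.h"   /* adjust the header for your STM32 family */

extern ADC_HandleTypeDef hadc1;          /* configured elsewhere by CubeMX/HAL */
static volatile uint8_t conversion_done; /* set from the ADC interrupt          */

/* HAL calls this from the ADC IRQ when the conversion completes. */
void HAL_ADC_ConvCpltCallback(ADC_HandleTypeDef *hadc)
{
    if (hadc == &hadc1) {
        conversion_done = 1;
    }
}

/* Start a conversion, then put the core to sleep until the ADC interrupt
 * wakes it.  The hope is that stopping instruction fetches quiets the
 * supply and ground while the sample is taken. */
uint32_t read_adc_quietly(void)
{
    conversion_done = 0;
    HAL_ADC_Start_IT(&hadc1);
    while (!conversion_done) {
        __WFI();   /* sleep; any interrupt (including the ADC's) wakes us */
    }
    return HAL_ADC_GetValue(&hadc1);
}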


On Dec 6, 2020, at 19:25, Chris McDowell <declareupdate at gmail.com> wrote:
> Brian, 
> 
> Do you have a go-to dedicated ADC chip or series or brand? I spend too much time fighting against the STM32 internal ADC being just a little bit too noisy.
> 
> Cheers,
> Chris 
> 
> On Dec 5, 2020, at 3:50 PM, Brian Willoughby <brianw at audiobanshee.com> wrote:
>> On Dec 5, 2020, at 13:42, Mike Beauchamp <list at mikebeauchamp.com> wrote:
>>> On 12/5/20 4:01 PM, Brian Willoughby wrote:
>>>> Even with all of that potential complexity taken care of, there's still the possibility that hitting a chip with all it can take will affect sensitive peripherals like the ADC. Personally, I prefer off-chip ADC, even going to the extent of placing the stand-alone ADC chip near the signals being read - such as pots and faders - so that ground and voltage references are more controlled than they might be near a digital processor switching at tens or hundreds of megaHertz.
>>>> Brian
>>> 
>>> That sounds very smart Brian. I've had good luck using the internal ADC on some Teensy 3.2's but only when averaging many samples, etc.
>> 
>> After lots of problems with noise related to pots and faders, I decided to stick to dedicated ADC chips. After making that decision, I started to notice that lots of modern synths do the same. There's invariably a separate PCB for the front panel controls like pots and faders, and you'll probably notice that many of them have a dedicated 8-channel (or more) ADC, and then use a serial interface from the front panel PCB to the main processor. Any ground loops or other ground / reference issues are eliminated because the analog voltages are all local to a single board, where the only digital component is the ADC.




