[sdiy] ADC & DAC voltage dividers.. fixed or trimmer adjustables ??

Brian Willoughby brianw at audiobanshee.com
Sat Feb 13 23:34:51 CET 2021

On Feb 13, 2021, at 12:24, Jean-Pierre Desrochers wrote:
> I’m in the process of designing 16-bit ADC/DAC circuits.
> This circuit will receive a 0-8 Vdc CV, drop it down to 0-5 Vdc, and process it through a MAX11100 16-bit ADC (reference voltage: REF02 +5.00 V),
> then pass the CPU data out to an LTC2641 16-bit DAC (reference voltage: REF02 +5.00 V).
> The DAC’s 0-5 Vdc will then be amplified back to 0-8 Vdc.
> To keep as much LSB precision as possible, I’m wondering if the voltage dividers
> should be fixed or adjustable.
> See the attached schematic. The 5 V to 8 V amplification on the DAC section
> is made using U19B, R3 & R4 (fixed gain of 1.6),
> and the ADC attenuation is made using U19A, R1, P1, R2 (adjustable attenuation of 5/8).
> What would you use for a 16-bit ADC/DAC transfer like this?
> Fixed resistor values (as shown in the DAC) or two resistors with a center trimmer (as shown in the ADC)?

You might consider using the Analog Devices AD5761 as your DAC. It can be configured for +10V range voltages, and is available with an internal 2.5V reference of 0.1% accuracy. Because the chip generates the 0..10V output directly inside its own feedback loop, the accuracy doesn't depend upon external resistor tolerances or manual adjustment.

I realize this uses only 80% of the 16-bit range if you only need a 0..8V CV, but (a) it's a lot easier to attain excellent accuracy and (b) perhaps a little extra headroom would be nice. I have designed with this chip, and it's very handy to be able to set the voltage range for the 16-bit value from 8 presets. Check out the data sheet.
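As a sanity check on that 80% figure (my own back-of-the-envelope arithmetic, not from the original post), here's what using 0..8V of a 0..10V range costs in resolution:

```python
import math

FULL_SCALE_V = 10.0   # AD5761 configured for a 0..10 V output range
CV_RANGE_V = 8.0      # portion actually used by the 0..8 V CV
CODES = 1 << 16       # 16-bit converter

codes_used = CODES * CV_RANGE_V / FULL_SCALE_V
effective_bits = math.log2(codes_used)
lsb_mv = FULL_SCALE_V / CODES * 1000

print(f"codes used: {codes_used:.0f}")          # ~52429 of 65536
print(f"effective bits: {effective_bits:.2f}")  # ~15.68
print(f"LSB size: {lsb_mv:.3f} mV")             # ~0.153 mV
```

So the price of the unused headroom is only about a third of a bit.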

The question is how to achieve similar accuracy for the ADC over the 0..8V range. Something like the ADuM7703 might be viable, since it can handle analog inputs up to 20V. I have not used that chip at all, though.

One thought that occurs to me is to build around a DAC that can handle the 8V range directly, and implement successive approximation in software for the ADC based on that DAC. As long as the comparator has a voltage supply above 8V, it can work with 8V analog inputs. Such a design would be relatively slow compared to a purpose-built ADC chip, but it doesn't seem easy or cheap to select an ADC that handles 8V analog inputs.
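For illustration, the software SAR loop is only a few lines. This is a sketch under assumed hardware access: `dac_write` and `comp_read` are hypothetical callbacks standing in for your actual DAC write and comparator read routines.

```python
def sar_convert(n_bits, dac_write, comp_read):
    """Successive approximation, MSB first.

    dac_write(code) sets the trial DAC output; comp_read() returns
    True while the unknown input voltage is >= the DAC output.
    Both are hypothetical hardware-access callbacks.
    """
    code = 0
    for bit in reversed(range(n_bits)):
        trial = code | (1 << bit)   # tentatively set this bit
        dac_write(trial)
        if comp_read():             # input still above DAC output: keep the bit
            code = trial
    return code

# Quick check against a simulated ideal 16-bit DAC and comparator
vin, vref = 3.3, 8.0
dac_out = [0.0]
dac_write = lambda c: dac_out.__setitem__(0, c * vref / 65536)
comp_read = lambda: vin >= dac_out[0]
print(sar_convert(16, dac_write, comp_read))  # 27033, i.e. floor(3.3/8 * 65536)
```

In a real circuit you'd also insert a settling delay after each `dac_write` before reading the comparator.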

One way around the ADC speed issue would be to use a parallel input DAC and discrete logic (or FPGA) to automatically drive the SAR without software intervention. Of course, this means you're designing your own ADC.

To answer your specific questions, I'd recommend fixed resistor dividers for both ADC input and DAC output. You're starting with 0.3% accuracy with the REF02, and a couple of 0.1% resistors should only degrade that to about 0.5% (would you just add the worst-case errors?).

It seems like having any sort of manual calibration makes it more likely that the circuit will end up much worse than 0.5% error. The small chance of calibrating better than 0.5%, and of that adjustment never getting knocked out, seems counterproductive.

In other words, I'd personally recommend removing P1 so that you effectively remove manual calibration errors.

In my estimation, the best design would use a reference of 0.3% or even 0.1% that includes the full-scale 8V analog voltages in the feedback loop. Some converter chips have built-in references; some require external references; and a few chip families are available both with and without an internal reference. The AD5761 mentioned above is one of the latter. You can pay more for the variant with the 0.1% reference built in, or provide one externally. A register in the chip selects between internal and external references, so you can even pit them against each other. Unfortunately, this is only easy for the DAC; I've not worked with an ADC that has the same analog range.

Brian Willoughby
