[sdiy] Calibration procedures for digital CV modules?

Spiros Makris spirosmakris92 at gmail.com
Thu Apr 22 11:22:47 CEST 2021


Hello,
I am designing the firmware for a quantizer/microtuner that uses a 16-bit
ADC and DAC (which I hope to share with you all in a few weeks!), and I'm
having trouble nailing down the process by which it will be calibrated.
I'll preface by saying that I don't have an accurate meter yet (it's still
in the mail), but the following has been validated by another person with a
high-count DMM.

The input and output ranges of the module are +/-10 V, so one bit of the
16-bit converters corresponds to roughly 0.3 mV. Calibrating the input is
fairly easy, and given the lack of an accurate meter, I just do it using a
commercial CV generator. I figured a millivolt or so of potential error
doesn't matter much when identifying the input note, since you only need to
land within half a semitone (about 41.7 mV), and this seems correct in
practice: the module correctly identifies all incoming notes across its
whole range.
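
Roughly, the input side reduces to rounding to the nearest semitone, which
is why a millivolt of error is harmless. A minimal sketch, assuming
input_mv is already the calibrated input voltage in millivolts (the
function name is a placeholder, not the actual firmware):

    #include <math.h>

    /* Identify the nearest semitone for a calibrated input voltage.
     * At 1 V/octave one semitone is 1000/12 = ~83.3 mV, so any input
     * error under ~41.7 mV still resolves to the correct note. */
    static int nearest_semitone(double input_mv)
    {
        const double semitone_mv = 1000.0 / 12.0;
        return (int)lround(input_mv / semitone_mv); /* 0 = the 0 V note */
    }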

The scheme for producing the output value is (sketched in code below):
- Decide which note to output
- Multiply that by a constant "gain" value
- Add the output offset compensation value
- Add any tuning adjustments, if set up (the module gives the option of
detuning individual notes of the scale)
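
In code, that pipeline might look something like this (a sketch only; the
names and the calibration values are placeholders, not the actual
firmware):

    #include <math.h>
    #include <stdint.h>

    static double  semitone_gain = 273.07; /* DAC codes per semitone (illustrative) */
    static int32_t output_offset = 3;      /* offset compensation, DAC codes */
    static int16_t detune[12];             /* per-note microtuning, DAC codes */

    /* note: semitone index relative to the 0 V center note. */
    static uint16_t note_to_dac_code(int note)
    {
        double code = note * semitone_gain              /* gain      */
                    + output_offset                     /* offset    */
                    + detune[((note % 12) + 12) % 12];  /* detuning  */
        long c = lround(code) + 32768;  /* 32768 = 0 V on a +/-10 V DAC */
        if (c < 0)     c = 0;
        if (c > 65535) c = 65535;
        return (uint16_t)c;
    }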

At first I was using an integer gain value, but it quickly became apparent
that this won't work properly over the whole range. An error of 0.5 mV per
semitone becomes 6 mV after an octave, which is already too much, and it
keeps growing the further you get from the center note, so I abandoned
that approach.
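
To make the accumulation concrete, here is a small stand-alone demo with
my numbers from above, assuming the 0.5 mV is a calibration error baked
into a gain stored as a whole number of DAC codes:

    #include <stdio.h>
    #include <math.h>

    int main(void)
    {
        const double mv_per_lsb  = 20000.0 / 65536.0;  /* ~0.3052 mV/code  */
        const double semitone_mv = 1000.0 / 12.0;      /* ideal: 83.333 mV */
        const double measured_mv = semitone_mv + 0.5;  /* 0.5 mV meter error */
        const long   gain_codes  = lround(measured_mv / mv_per_lsb);

        /* The per-semitone error repeats, so it grows linearly with
         * the distance from the center note. */
        for (int note = 0; note <= 48; note += 12)
            printf("note %2d: error %+6.2f mV\n", note,
                   note * gain_codes * mv_per_lsb - note * semitone_mv);
        return 0;
    }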

I then tried floating-point arithmetic, calibrating at 2 V:
- Adjust the output until you measure exactly 2 V
- Divide that DAC code by 24 to find the floating-point value that
corresponds to a single semitone.
This has the advantage (in theory) that any error you introduce is divided
by 24, so tracking stays good for +/-4 octaves, but beyond that the errors
become noticeable (see the sketch below).
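
A sketch of that step, assuming the calibration routine has already found
cal_code_2v, the number of DAC codes (counted from the 0 V center code)
that made the meter read exactly 2.000 V:

    #include <math.h>
    #include <stdint.h>

    static double semitone_gain;  /* DAC codes per semitone, fractional */

    static void calibrate_from_2v(int32_t cal_code_2v)
    {
        /* 2 V = 24 semitones at 1 V/octave; dividing by 24 also
         * divides the measurement error by 24. */
        semitone_gain = (double)cal_code_2v / 24.0;
    }

    static uint16_t note_to_code(int note)  /* note relative to center */
    {
        return (uint16_t)(32768 + lround(note * semitone_gain));
    }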

A solution that was suggested to me, but which I have yet to implement and
try out, is to calibrate a value for each octave and build a look-up table,
which is then used to interpolate the output values as required. This
should be very accurate, but it has the disadvantage of requiring a lot of
measurement points, ideally 20 (the total output range in octaves).
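
Something like the following is what I have in mind for the table and the
interpolation (placeholder names, untested):

    #include <stdint.h>

    #define NUM_OCTAVES 20  /* +/-10 V at 1 V/octave */

    /* Calibrated DAC code at each exact volt from -10 V to +10 V;
     * 21 entries bracket the 20 octaves. Filled in at calibration. */
    static uint16_t octave_code[NUM_OCTAVES + 1];

    /* note: semitone index 0..240, counted from the -10 V end. */
    static uint16_t note_to_code_lut(int note)
    {
        int oct  = note / 12;
        int semi = note % 12;
        if (oct >= NUM_OCTAVES)
            return octave_code[NUM_OCTAVES];

        int32_t lo = octave_code[oct];
        int32_t hi = octave_code[oct + 1];
        /* Linear interpolation between the two calibrated endpoints;
         * the +6 rounds to the nearest code instead of truncating. */
        return (uint16_t)(lo + ((hi - lo) * semi + 6) / 12);
    }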
A final thought, which is still very fuzzy, is to calibrate the output DAC
for a single semitone, then calibrate the ADC against that, and finally let
the module find the codes for 1 V, 2 V, etc. on its own and produce the
aforementioned table (see the sketch below). This still requires an
accurate meter, but the process is more or less automatic.
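
In rough pseudo-C, the automatic part could be a read-back search per
calibration point, with the DAC output patched into the ADC input.
dac_write(), adc_read_code() and adc_code_for_volts() are stand-ins for
whatever the hardware layer ends up being, and the ADC is assumed to be
calibrated already:

    #include <stdint.h>

    extern void     dac_write(uint16_t code);          /* hypothetical  */
    extern uint16_t adc_read_code(void);               /* averaged read */
    extern uint16_t adc_code_for_volts(double volts);  /* from ADC cal  */

    /* Binary-search the DAC code whose output, read back through the
     * calibrated ADC, lands on the target voltage. One search per
     * octave point fills the interpolation table automatically. */
    static uint16_t find_dac_code(double target_volts)
    {
        uint16_t target = adc_code_for_volts(target_volts);
        uint16_t lo = 0, hi = 65535;
        while (lo < hi) {
            uint16_t mid = lo + (hi - lo) / 2;
            dac_write(mid);
            /* (allow the output to settle before reading back) */
            if (adc_read_code() < target)
                lo = mid + 1;
            else
                hi = mid;
        }
        return lo;
    }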

What other options do I have that wouldn't require more than a couple of
samples from the calibrator? I'm using an ATmega328 and still have ample
processing time and memory to spare. I know a millivolt or two of final
error is not the end of the world, but since the hardware offers the
accuracy, I'd like to use it to its full extent. Any additional advice and
suggestions are greatly appreciated.

PS. This is a collaboration project that has not been officially revealed
yet, which is why I am not giving away more info right now. Please bear
with me; this will eventually be released as a DIY, open-source project.