[sdiy] Digital Delay - smoothly varying the delay time
tom at electricdruid.net
Thu May 19 17:23:20 CEST 2016
A couple of weeks ago I was asking about how to reduce noise in my digital delay.
I've got that all sorted now (thanks!), and I've been busy working on the firmware, which now has almost all the features implemented. However, I've got one big thing left to do.
The problem is this - how best to smoothly vary the delay time?
Now, in reality, I know pretty much *how* this has to be done - or at least, I've worked out one way: I've got "write" and "read" pointers chasing each other round and round a circular buffer. The distance between the two pointers sets the delay time. If I want the delay to get longer, I have to slow down the read pointer, so that the write pointer starts to get further ahead of it. Likewise, if I want to shorten the delay, the read pointer has to speed up and start catching up with the write pointer.
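
Here's a rough sketch of the integer-delay version of that, just to pin down what I mean (buffer length, sample type and names are placeholders, not my actual firmware):

#include <stdint.h>

#define BUF_LEN  262144u              /* power of two, so wrapping is just a mask */
#define BUF_MASK (BUF_LEN - 1u)

int16_t  buf[BUF_LEN];                /* the circular buffer */
uint32_t writePos = 0;

/* Called once per sample: store the input, read from delaySamples behind. */
int16_t delay_tick(int16_t in, uint32_t delaySamples)
{
    buf[writePos] = in;
    uint32_t readPos = (writePos - delaySamples) & BUF_MASK;
    int16_t  out = buf[readPos];
    writePos = (writePos + 1u) & BUF_MASK;
    return out;
}

The distance between the pointers is the delay; jump delaySamples around abruptly and you get clicks, which is why the read pointer has to glide instead.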
To limit the potential aliasing when the read pointer speeds up (and to keep the buffer size down), and to limit the amount of interpolation done between samples when it slows down, I've decided to restrict the speed-up/slow-down to two octaves in either direction. A pitch shift of 2 octaves when you twiddle the delay knob still sounds like quite a bit to me.
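
Carrying on from the sketch above, the gliding read looks something like this (double and these names are just for clarity here; on the real hardware it'd be fixed point):

static double readPos = 0.0;          /* fractional read position */

/* Read one output sample, moving the read pointer at 'rate' samples
   per output sample, clamped to the 2-octave range. */
int16_t delay_read(double rate)
{
    if (rate < 0.25) rate = 0.25;     /* -2 octaves: delay lengthening */
    if (rate > 4.0)  rate = 4.0;      /* +2 octaves: delay shortening */

    uint32_t i0   = (uint32_t)readPos & BUF_MASK;
    uint32_t i1   = (i0 + 1u) & BUF_MASK;
    double   frac = readPos - (uint32_t)readPos;

    /* linear interpolation between the two neighbouring samples */
    int16_t out = (int16_t)((1.0 - frac) * buf[i0] + frac * buf[i1]);

    readPos += rate;
    if (readPos >= (double)BUF_LEN)
        readPos -= (double)BUF_LEN;
    return out;
}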
This means that the read pointer can move at a minimum of 0.25 samples per output sample up to a maximum of 4 samples per output sample. So it actually takes a reasonable amount of time for the delay to change from the minimum setting (a few milliseconds) to the maximum (4 seconds), because the write pointer can only get 0.75 s further ahead of the read pointer every second. Changing from zero to max delay therefore takes 4/0.75 = 5.33 seconds.
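
For what it's worth, the control side of that can be as crude as a bang-bang rate chooser, which is enough to show where the 5.33 s comes from (names are placeholders again):

/* Per sample, pick a read rate that slews the pointer gap (the delay,
   in samples) toward the target. Pinned at 0.25, the gap grows at
   1 - 0.25 = 0.75 samples per sample, so ~0 to 4 s takes 4/0.75 = 5.33 s;
   pinned at 4.0 it shrinks at 4 - 1 = 3 samples per sample, so max to
   zero only takes 4/3 = 1.33 s. */
double choose_rate(double currentDelay, double targetDelay)
{
    if (targetDelay > currentDelay + 0.5)
        return 0.25;  /* slow the read down: delay grows */
    if (targetDelay < currentDelay - 0.5)
        return 4.0;   /* speed the read up: delay shrinks */
    return 1.0;       /* close enough: track the write pointer */
}

In practice I'd ramp the rate itself rather than switching it instantly, so the pitch glides between the extremes instead of jumping.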
Does this sound right? Is there a better (quicker?) way to smoothly alter the delay time? Do commercial delays work like this? Do you notice a lag when you twiddle the delay time?