[sdiy] Digital Delay - smoothly varying the delay time

Steven Cook stevenpaulcook at tiscali.co.uk
Thu May 19 20:42:07 CEST 2016


Probably not relevant, but another way to change the delay time without 
unpleasant artefacts is to have two read pointers, one set to the current 
delay and one to the new delay, then linearly crossfade between them. You 
only need the second pointer while the delay is changing. You also need to 
prevent the delay time being updated while the crossfade is in progress.
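A minimal sketch of that crossfade scheme, assuming a simple mono buffer 
(the buffer length, fade length, and all names below are illustrative, not 
from any real firmware):

```c
#include <assert.h>
#include <stddef.h>

#define BUF_LEN   48000        /* illustrative: 1 s of buffer at 48 kHz */
#define XFADE_LEN 480          /* illustrative: 10 ms crossfade         */

static float  buf[BUF_LEN];
static size_t write_pos = 0;
static size_t read_old  = 0;           /* pointer at the old delay     */
static size_t read_new  = 0;           /* pointer at the new delay     */
static size_t fade_pos  = XFADE_LEN;   /* >= XFADE_LEN means "no fade" */

/* Request a new delay; ignored while a crossfade is still running. */
static void set_delay(size_t delay_samples)
{
    if (fade_pos < XFADE_LEN)
        return;                        /* fade in progress: hold off   */
    read_old = read_new;
    read_new = (write_pos + BUF_LEN - delay_samples) % BUF_LEN;
    fade_pos = 0;                      /* start fading old -> new      */
}

static float process(float in)
{
    float out;
    buf[write_pos] = in;
    if (fade_pos < XFADE_LEN) {
        float t = (float)fade_pos / XFADE_LEN;   /* ramps 0 -> 1 */
        out = (1.0f - t) * buf[read_old] + t * buf[read_new];
        fade_pos++;
    } else {
        out = buf[read_new];   /* second pointer no longer needed */
    }
    write_pos = (write_pos + 1) % BUF_LEN;
    read_old  = (read_old  + 1) % BUF_LEN;
    read_new  = (read_new  + 1) % BUF_LEN;
    return out;
}
```

Once the fade completes, only read_new is live, so the next set_delay() call 
is free to start another crossfade from it.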

Steven Cook.

-----Original Message----- 
From: Tom Wiltshire
Sent: Thursday, May 19, 2016 4:23 PM
To: synthdiy diy
Subject: [sdiy] Digital Delay - smoothly varying the delay time

Hi All,

A couple of weeks ago I was asking about how to reduce noise in my digital 
delay:

http://electricdruid.net/diy-digital-delay/

I've got that all sorted now (thanks!), and I've been busy working on the 
firmware, which now has almost all the features implemented. However, I've 
got one big thing left to do.

The problem is this - how best to smoothly vary the delay time?

Now, in reality, I know pretty much *how* this has to be done - or at least, 
I've worked out one way: I've got "write" and "read" pointers chasing each 
other round and round a circular buffer. The distance between the two 
pointers sets the delay time. If I want the delay to get longer, I have to 
slow down the read pointer, so that the write pointer starts to get further 
ahead of it. Likewise, if I want to shorten the delay, the read pointer has 
to speed up and start catching up with the write pointer.
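That chasing-pointers arrangement might be sketched something like this (a 
rough illustration only, with a fractional read pointer and linear 
interpolation between samples; the buffer length and names are made up):

```c
#include <assert.h>

#define N 16384                 /* illustrative buffer length */

static float    dbuf[N];
static unsigned wr = 0;         /* write pointer: always 1 sample/sample */
static double   rd = 0.0;       /* fractional read pointer, 0 <= rd < N  */

/* One sample tick: write the input, then read at 'rate' samples per
 * output sample.  rate = 1.0 holds the delay steady; rate < 1 lets the
 * write pointer pull ahead (delay grows); rate > 1 catches up (delay
 * shrinks).  Linear interpolation handles fractional read positions. */
static float tick(float in, double rate)
{
    dbuf[wr] = in;

    unsigned i0   = (unsigned)rd;          /* integer part    */
    double   frac = rd - i0;               /* fractional part */
    unsigned i1   = (i0 + 1) % N;
    float    out  = (float)((1.0 - frac) * dbuf[i0] + frac * dbuf[i1]);

    wr = (wr + 1) % N;
    rd += rate;
    if (rd >= N)
        rd -= N;                /* rate is small, one wrap is enough */
    return out;
}
```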

To limit the potential aliasing when the read pointer speeds up (and to keep 
the buffer size down), and to limit the amount of interpolation needed 
between samples when it slows down, I've decided to limit the 
speed-up/slow-down to a couple of octaves in either direction. A two-octave 
pitch shift when you twiddle the delay knob still sounds like quite a bit to me.
This means that the read pointer can move at a minimum of 0.25 samples per 
output sample and a maximum of 4 samples per output sample. So it actually 
takes a reasonable amount of time for the delay to change from the minimum 
setting (a few milliseconds) to the maximum (4 seconds), because the write 
pointer can only get 0.75 seconds further ahead of the read pointer for each 
second that passes. Changing from zero to max delay therefore takes 
4/0.75 = 5.33 seconds.
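That arithmetic can be checked with a tiny helper (the function and the 
single worst-case formula are just for illustration, under the assumption 
that the write pointer always advances one sample per sample):

```c
#include <assert.h>

/* Worst-case time in seconds to slew the delay from 'from_s' to 'to_s'
 * seconds, given a read pointer limited to rate_min..rate_max samples
 * per output sample.  While the read pointer runs at rate r, the delay
 * changes by (1 - r) seconds per second of real time. */
static double slew_time(double from_s, double to_s,
                        double rate_min, double rate_max)
{
    if (to_s > from_s)                         /* lengthening: run at rate_min */
        return (to_s - from_s) / (1.0 - rate_min);
    return (from_s - to_s) / (rate_max - 1.0); /* shortening: run at rate_max  */
}
```

With the 0.25..4 limits above, lengthening from zero to 4 seconds takes 
4/0.75 = 5.33 seconds as described, but shortening from 4 seconds back to 
zero takes only 4/3 = 1.33 seconds, because the read pointer can close the 
gap at 3 samples per sample.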

Does this sound right? Is there a better (quicker?) way to smoothly alter the 
delay time? Do commercial delays work like this? Do you notice a delay when 
you twiddle the delay time?

Thanks,
Tom



_______________________________________________
Synth-diy mailing list
Synth-diy at dropmix.xs4all.nl
http://dropmix.xs4all.nl/mailman/listinfo/synth-diy 



