[sdiy] Digital Delay - smoothly varying the delay time
Richie Burnett
rburnett at richieburnett.co.uk
Thu May 19 21:11:38 CEST 2016
A first-order lag (a one-pole IIR filter) theoretically takes forever to
settle after a step change, but your method of imposing a "speed limit" at
least gives you a bounded catch-up time for any given step in the "delay"
control position.
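For illustration, something like the following C sketch is what I have in
mind: a one-pole lag on the delay-time control plus a hard clamp on how far
the smoothed value may move per sample. The names and the lag coefficient
are made up, and the clamp values just mirror the 0.25x/4x read-rate limits
from Tom's post below.

/* One-pole lag on the delay-time control, plus a clamp on the change per
   sample so the effective read-pointer rate stays inside 0.25x..4x.
   LAG_COEFF is an arbitrary illustrative value, not a tuned one. */

#define LAG_COEFF   0.0005f   /* one-pole smoothing coefficient, per sample */
#define MAX_GROW    0.75f     /* delay growing by 0.75 smp/smp -> read rate 0.25x */
#define MAX_SHRINK  3.0f      /* delay shrinking by 3 smp/smp  -> read rate 4x    */

static float delay_smoothed = 0.0f;   /* current delay length, in samples */

float update_delay(float delay_target) /* call once per output sample */
{
    float step = LAG_COEFF * (delay_target - delay_smoothed);

    /* speed limit on how fast the delay length is allowed to change */
    if (step >  MAX_GROW)   step =  MAX_GROW;
    if (step < -MAX_SHRINK) step = -MAX_SHRINK;

    delay_smoothed += step;
    return delay_smoothed;
}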
You seem to be grappling with two conflicting issues:
1. Aliasing caused by the read pointer racing through the circular buffer
and playing back audio at a faster rate than it was recorded, and...
2. The read pointer taking too long to reach its new position when you
change the delay setting (i.e. how long the audio's pitch is disturbed
before it plays back at the right pitch again with the new delay
setting).
I'm not really sure how you can resolve this, and it probably depends on
which of the two things you find most objectionable and how you set the
delay time. Also bear in mind that as soon as you play back the audio even
slightly faster than it was recorded (whilst the read pointer is catching
up) it will incur aliasing. So if you want the audio output to be
"alias-free" while the read pointer is catching up after a sudden reduction
in the delay setting you're gonna need some good band-limited interpolation.
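For what it's worth, a fractional read with plain linear interpolation looks
something like the sketch below (buffer size and names are invented). Linear
interpolation is cheap but nowhere near band-limited, so on its own it won't
stop the aliasing while the pointer is racing:

#include <math.h>

#define BUF_LEN 65536u               /* power-of-two buffer size, illustrative */
static float    buf[BUF_LEN];
static unsigned write_idx = 0;       /* advanced by 1 for every sample written */

/* Read the delay line a fractional number of samples behind the write
   pointer.  Plain linear interpolation between the two nearest samples. */
float delay_read(float delay_samples)
{
    float pos  = (float)write_idx - delay_samples;
    int   i0   = (int)floorf(pos);
    float frac = pos - (float)i0;

    float a = buf[ (unsigned)i0       & (BUF_LEN - 1u)];
    float b = buf[((unsigned)i0 + 1u) & (BUF_LEN - 1u)];

    return a + frac * (b - a);       /* linear interpolation */
}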
But does it really matter if it aliases badly right after the delay has been
changed? I guess one way to look at this is to ask "How screwed up are you
prepared to accept the audio getting whilst the read pointer catches up?"
The more disturbance you're prepared to accept in the audio the faster you
can let it catch up and the shorter this disturbed period will last.
Also, if you're setting the delay with some kind of digital readout, it will
probably only update about three times per second anyway for good
readability, so maybe you can just make the pointer jump every time the
display is updated. At least then you never have to wait more than 330ms
for the new delay setting to take effect in the audio output. If three
audible clicks per second disturbs you, you can crossfade between "old" and
"new" read pointers. A crossfade over 330ms is totally click free, but you
will hear some comb-filtering constructive/destructive artefacts as you go
through the mix. If this bothers you then crossfade quicker!
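A linear crossfade is enough, something along these lines (reusing the
delay_read() sketch above; the names and the fade length are only example
figures):

/* Crossfade from the "old" read position to the "new" one over FADE_LEN
   samples whenever the delay setting jumps.  330 ms at 48 kHz is roughly
   16000 samples; the exact figure here is just illustrative. */

#define FADE_LEN 16000

static float old_delay  = 0.0f;      /* delay lengths, in samples */
static float new_delay  = 0.0f;
static int   fade_count = FADE_LEN;  /* >= FADE_LEN means "no fade running" */

void set_delay(float delay_samples)  /* call whenever the readout updates */
{
    old_delay  = new_delay;          /* keep playing from the old position... */
    new_delay  = delay_samples;      /* ...while the new one fades in */
    fade_count = 0;
}

float crossfade_read(void)           /* call once per output sample */
{
    if (fade_count >= FADE_LEN)
        return delay_read(new_delay);

    float mix = (float)fade_count++ / (float)FADE_LEN;   /* ramps 0 -> 1 */
    return (1.0f - mix) * delay_read(old_delay)
         +          mix * delay_read(new_delay);
}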
If you're feeling *really* adventurous I guess you could try doing something
with AMDF or autocorrelation to make the read pointer jump towards the
desired new position in multiples of the input signal's period. This is
what real-time pitch shifters do to suppress the splicing glitch. It is
quite computationally intensive doing the autocorrelation though, because
you have to search a significant area around where you want to jump to in
order to find a good match with where you're currently at in the waveform.
The less space you search, the less chance you have of finding a good
match, and the worse it will sound unless you crossfade to smooth over the
poor splice. Whilst this works great for monophonic instruments and speech,
it works far less well with complex polyphonic audio.
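A rough sketch of the AMDF search might look like this: compare a short
window of audio at the current read position against candidate positions
around the desired jump target, and jump to whichever candidate gives the
smallest total difference. The window and search sizes here are plucked out
of the air, not tuned values:

#include <math.h>

/* Find a good splice point near 'target' by minimising the average
   magnitude difference (AMDF) against a short window of audio at the
   current read position. */

#define WINDOW  256    /* samples compared for each candidate position  */
#define SEARCH  600    /* candidates tried either side of the target    */

unsigned find_splice(const float *buf, unsigned len,
                     unsigned current, unsigned target)
{
    unsigned best_pos  = target;
    float    best_amdf = 1e30f;

    for (int off = -SEARCH; off <= SEARCH; off++) {
        long pos = (long)target + off;
        while (pos < 0) pos += (long)len;          /* wrap into the buffer */
        unsigned cand = (unsigned)(pos % (long)len);

        float sum = 0.0f;
        for (unsigned n = 0; n < WINDOW; n++)
            sum += fabsf(buf[(current + n) % len] - buf[(cand + n) % len]);

        if (sum < best_amdf) {                     /* smaller AMDF = better match */
            best_amdf = sum;
            best_pos  = cand;
        }
    }
    return best_pos;   /* jump the read pointer here, ideally with a short crossfade */
}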
Hope this gives you some ideas,
-Richie,
-----Original Message-----
From: Tom Wiltshire
Sent: Thursday, May 19, 2016 7:23 PM
To: Declare Update
Cc: synthdiy diy
Subject: Re: [sdiy] Digital Delay - smoothly varying the delay time
Hi Chris,
Adding a 1st order LP provides a pretty effective speed limit, since you
know what the maximum possible input range can be. If the input jumps
instantly from minimum to maximum, what does the filter do? That's your
maximum slope - the maximum speed limit.
How fast did you set your filter? Am I being too cautious with my
many-seconds-required?!
How long is your delay? How long does it take the filter to rise from
minimum to maximum? Sorry for so many questions, but I'm curious to hear
what others have done, since I'm (by far) not the first person to have come
up against this.
Thanks,
Tom
On 19 May 2016, at 17:46, Declare Update <declareupdate at gmail.com> wrote:
> I'm super interested in this too. I've had recent success with feeding the
> delay time control through a first order low pass. Hadn't considered
> enforcing a speed limit like you have, Tom. Would love to hear from some
> more experienced folks on this!
>
> cheers,
> Chris
>
> Sent from my iPhone
>
>> On May 19, 2016, at 10:23 AM, Tom Wiltshire <tom at electricdruid.net>
>> wrote:
>>
>> Hi All,
>>
>> A couple of weeks ago I was asking about how to reduce noise in my
>> digital delay:
>>
>> http://electricdruid.net/diy-digital-delay/
>>
>> I've got that all sorted now (thanks!), and I've been busy working on
>> the firmware, which now has almost all the features implemented. However,
>> I've got one big thing left to do.
>>
>> The problem is this - how best to smoothly vary the delay time?
>>
>> Now, in reality, I know pretty much *how* this has to be done - or at
>> least, I've worked out one way: I've got "write" and "read" pointers
>> chasing each other round and round a circular buffer. The distance
>> between the two pointers sets the delay time. If I want the delay to get
>> longer, I have to slow down the read pointer, so that the write pointer
>> starts to get further ahead of it. Likewise, if I want to shorten the
>> delay, the read pointer has to speed up and start catching up with the
>> write pointer.
>>
>> In order to limit the potential aliasing when the read pointer speeds up
>> (and the buffer size) and to limit the amount of interp done between
>> samples when slowing down, I've decided to limit the speed-up/slow-down
>> to a couple of octaves in either direction. Having the pitch shift when
>> you twiddle the delay knob limited to 2 octaves still sounds like quite a
>> bit to me.
>> This means that the read pointer can move at a minimum of 0.25
>> sample/output sample up to a maximum of 4 samples/output sample. So it
>> actually takes a reasonable amount of time for the delay to change from
>> the minimum setting (a few milliseconds) to the maximum (4 seconds) because
>> the write pointer can only get 0.75 secs ahead of the read pointer every
>> second. So changing from zero to max delay takes 4/0.75 = 5.33 seconds.
>>
>> Does this sound right? Is there a better (quicker?) way to smoothly alter
>> the delay time? Do commercial delays work like this? Do you notice a
>> delay when you twiddle the delay time?
>>
>> Thanks,
>> Tom
>>
>>
>>
_______________________________________________
Synth-diy mailing list
Synth-diy at dropmix.xs4all.nl
http://dropmix.xs4all.nl/mailman/listinfo/synth-diy