On Oct 29, 2003, at 1:10 PM, paulhaneberg wrote:

> I'm not convinced that a DSP is the best way to implement this. I would
> think you would want a continuously variable clock rather than quantized.

Yeah, I had thought about not using a DSP. At its very simplest, all one needs is a memory buffer, an input write pointer, and an output read pointer. The delta between the two pointers in the buffer is the delay; the pointers just advance through the buffer in a loop at the sampling frequency. Maximum delay time is a function of the size of the memory buffer and the sampling rate. The problem with this approach is that it produces nasty clipping when the head deltas change.

> You would also need to get the clock rate quite high if you were to have a
> large clock speed ratio and keep the minimum clock rate above 44.1 kHz. It
> would be cool if the clock were capable of a 1 V/octave response. Then
> again, if you wanted multiple taps and reverb algorithms in addition to a
> delay, DSP might be the only choice. I would think you would want to vary
> the clock speed rather than the tap points in response to a control signal,
> to eliminate zipper noise. Maybe you'd want a send and return for the
> feedback signal so you could mangle it as well. I would think you'd want a
> huge range of delay times, possibly 100 uSec to 10 sec. That's a 100,000:1
> range. Obviously you're not going to run your clock at 4 GHz, so I guess
> you'd have to vary the tap placement.

As I (think I) understand it, what you're talking about is a fixed buffer length (or multiple "ranges"), with the clock scaled to produce a variable delay based on a multiple of the buffer length. This makes bandwidth directly dependent on delay time, though: shorter delay times get higher bandwidth, longer delay times get lower bandwidth, with lots of harmonic distortion and all that. That's not necessarily a bad thing, and it would produce a smooth transition in delay times.
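For anyone following along, the buffer-plus-two-pointers scheme described above can be sketched in a few lines. This is just a minimal illustration of the idea (the buffer length and delay values here are made up for the example, not from any actual module):

```python
class DelayLine:
    """Minimal circular-buffer delay: one write pointer, with the read
    pointer trailing it by `delay` samples. Both advance once per sample
    and wrap around the end of the buffer."""

    def __init__(self, buf_len, delay):
        assert 0 <= delay < buf_len
        self.buf = [0.0] * buf_len
        self.write = 0
        self.delay = delay

    def tick(self, sample):
        # Read the sample written `delay` ticks ago, then overwrite
        # the oldest slot with the new input and advance.
        read = (self.write - self.delay) % len(self.buf)
        out = self.buf[read]
        self.buf[self.write] = sample
        self.write = (self.write + 1) % len(self.buf)
        return out


d = DelayLine(buf_len=8, delay=3)
impulse = [1.0, 0.0, 0.0, 0.0, 0.0, 0.0]
print([d.tick(x) for x in impulse])  # → [0.0, 0.0, 0.0, 1.0, 0.0, 0.0]
```

The impulse re-emerges three samples later, which is the whole trick; the clicks the post mentions come from jumping `delay` to a new value between ticks, which splices the output to a different point in the buffer mid-stream.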
I imagine one could have all kinds of fun with the pitch-shifting side effect from this ;) You could get a ~4:1 ratio between the buffer length and the max delay time. 96 kHz would be the 0 V level for the clock speed, and a CV signal would swing it by a factor of two in either direction. This would certainly make for a delay with really, really bad jitter ;) Transparent it would not be, but it might sound cool.

> Excuse the thinking out loud. I think a module along these lines has great
> potential. I've just had a lot of bad experience with noisy delays,
> including some ProTools software delays which sound terrible when changing
> parameters in real-time.

Amen! Thank you for thinking out loud ^_^
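To put rough numbers on the clock-scaling tradeoff discussed above: with a fixed-length buffer, both the full-buffer delay time and the best-case (Nyquist) bandwidth follow the sample clock directly. A quick sketch, where the 65536-sample buffer length is a hypothetical figure chosen for illustration, not a value from the thread:

```python
# Fixed-length buffer clocked at a variable rate: delay time goes up
# exactly as bandwidth comes down, since both are set by the clock.
BUF_LEN = 65536  # hypothetical buffer size, in samples

def delay_and_bandwidth(clock_hz):
    delay_s = BUF_LEN / clock_hz  # time to traverse the whole buffer
    nyquist = clock_hz / 2        # best-case audio bandwidth at this clock
    return delay_s, nyquist

for clock in (192_000, 96_000, 48_000, 24_000):
    d, bw = delay_and_bandwidth(clock)
    print(f"{clock:>7} Hz clock -> {d:6.3f} s delay, {bw:6.0f} Hz bandwidth")
```

So halving the clock doubles the maximum delay but also halves the bandwidth, which is exactly the "shorter delays are higher bandwidth" behavior described above, plus the pitch-shift artifact whenever the clock sweeps while audio is in the buffer.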
Re: [motm] Re: delay's digital
2003-10-29 by Mike Estee