[sdiy] Frequency shifted from BBD?
brianw
brianw at audiobanshee.com
Sat Oct 26 04:31:28 CEST 2024
On Oct 25, 2024, at 1:47 PM, Didier Leplae wrote:
>> My Flangelicious design uses a uP to generate an LFO-modulated BBD clock. The LFO is done in firmware and the clock is an NCO hardware module on the microcontroller. It makes for a very simple circuit compared to most flangers, but it has a weird watery R2D2-like background noise caused by (I think) frequency spurs from the NCO. A genuine VCO for the BBD would sweep smoothly and eliminate this source of noise.
>
> Do you think this is specifically a problem with the PIC’s hardware NCO or do you think this is inherent to any digital clock?
Tom can probably confirm, but I assume that this NCO is using spread spectrum techniques to prevent causing problems with other devices.
A common problem in digital circuits with clocks is that the power of the clock signal is highly concentrated at one frequency (plus the harmonics of that frequency, of course, since it's a square wave). That high power ends up bleeding into other circuits, especially analog circuits, and it can be a big challenge to shield and filter clocks so they don't cause problems.
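To make that concrete, here's a quick Python sketch (all values arbitrary, not tied to any particular BBD clock) showing that the energy of a 50% duty square wave sits at the fundamental and its odd harmonics:

```python
import numpy as np

fs = 1_000_000              # simulation sample rate, Hz (arbitrary)
f_clk = 10_000              # clock fundamental, Hz (arbitrary)
period = fs // f_clk        # 100 samples per clock cycle
n = period * 100            # an integer number of cycles, so bins line up
square = np.where(np.arange(n) % period < period // 2, 1.0, -1.0)

spectrum = np.abs(np.fft.rfft(square)) / n

def bin_of(f):
    """FFT bin index for frequency f, given n samples at rate fs."""
    return int(round(f * n / fs))

# Nearly all the energy sits at f_clk and its odd harmonics (3f, 5f, ...);
# the even-harmonic bins are essentially zero for a 50% duty cycle.
```

The magnitude at the fundamental comes out near 2/pi, the third harmonic near one third of that, and the even harmonics vanish.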
In digital processing circuits where the precise clock rate doesn't really matter, a modern technique is to purposely vary the frequency of the clock in a random pattern across some range of frequencies. Thus the power is spread out across many frequencies, and there's no concentration of power at any one frequency. For a CPU, it doesn't really matter what the exact clock rate is, so long as it's not too fast for the CPU to operate. Even with a varying clock, a CPU still gets the job done *very fast*.
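As a rough illustration of the spread spectrum idea (a generic sketch, not any particular chip's implementation), the Python below builds one clock with a fixed period and one whose period is randomly dithered each cycle, then compares their spectral peaks:

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 1_000_000                      # simulation sample rate, Hz
n = 200_000                         # total samples to generate

def clock(periods):
    """Concatenate 50% duty square cycles with the given per-cycle periods."""
    out = np.empty(n)
    i, k = 0, 0
    while i < n:
        p = periods[k % len(periods)]
        out[i:i + p // 2] = 1.0
        out[i + p // 2:i + p] = -1.0
        i += p
        k += 1
    return out

fixed = clock([100])                            # steady 10 kHz clock
# Even periods drawn from 80..120 samples, i.e. roughly 8.3 to 12.5 kHz.
dithered = clock(2 * rng.integers(40, 61, size=5000))

peak_fixed = np.abs(np.fft.rfft(fixed)).max() / n
peak_dithered = np.abs(np.fft.rfft(dithered)).max() / n
# Same total energy, but smeared across many bins, so the peak drops a lot.
```

The total emitted energy is the same in both cases; what changes is the height of the worst single-frequency peak, which is what EMI limits care about.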
If this NCO is using spread spectrum techniques to reduce EMI, then it might be affecting the audio, since the sample rate has a huge effect on noise, especially if the DAC rate doesn't precisely match the ADC rate, and also if there is jitter in the sample clock. The NCO might be introducing jitter on purpose, to solve a problem completely different from the audio problems.
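The jitter mechanism can be shown by sampling a sine at slightly randomized instants instead of perfectly regular ones; the 5 microsecond RMS jitter below is an exaggerated value picked just to make the effect obvious:

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 48_000                 # nominal sample rate, Hz
f = 1_000                   # test tone, Hz
n = 48_000                  # one second of samples

t = np.arange(n) / fs
jitter = rng.normal(0.0, 5e-6, size=n)           # 5 us RMS clock jitter (exaggerated)
clean = np.sin(2 * np.pi * f * t)                # tone sampled at ideal instants
jittered = np.sin(2 * np.pi * f * (t + jitter))  # same tone, jittered instants

noise = jittered - clean
snr_db = 10 * np.log10(np.mean(clean ** 2) / np.mean(noise ** 2))
# Jitter-limited SNR is roughly -20*log10(2*pi*f*sigma), about 30 dB here.
```

Note the noise scales with signal frequency as well as jitter, which is consistent with chorus/flange (short delays, fast clocks) being more sensitive than long delays.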
> There seem to be a bunch of BBD based devices that have tap tempo for example, which I imagine must be done with a uP.
When the delay time is mostly constant, and the design avoids spread spectrum (purposeful jitter), there should be no problem using a uP to measure tap tempo and apply that to the delay time.
I imagine that there are ways to design a digital circuit, one that's not uP based, that could measure the timing of a tap tempo input and apply it directly to the delay time. The way that the Roland GR-300 guitar synth uses digital circuits to measure the period of the guitar string waveform and then synthesize a note with the same period (with optional shift or detune) could probably be adapted to non-uP tap-tempo-to-delay-time conversion. It's probably not worth the effort unless you're already designing with an FPGA or something.
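For the uP approach, here is a hypothetical Python sketch (the stage count, tap timestamps, and function names are all made up); it assumes the usual relation that an n-stage BBD delays by stages / (2 * f_clk):

```python
def tap_tempo_delay(tap_times):
    """Average the intervals between successive tap timestamps (seconds)."""
    intervals = [b - a for a, b in zip(tap_times, tap_times[1:])]
    return sum(intervals) / len(intervals)

def bbd_clock_for_delay(delay_s, stages=4096):
    """Delay through an n-stage BBD is stages / (2 * f_clk),
    so the required clock is f_clk = stages / (2 * delay)."""
    return stages / (2.0 * delay_s)

taps = [0.0, 0.52, 1.01, 1.53]      # hypothetical tap timestamps, seconds
delay = tap_tempo_delay(taps)        # average tap interval, about 0.51 s
f_clk = bbd_clock_for_delay(delay)   # roughly 4 kHz for a 4096-stage BBD
```

Once f_clk is known, the uP only has to program its clock generator once per tap; as long as that clock then runs steadily (no spread spectrum), the delay itself stays clean.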
> In fact I already make a delay module like this, but it seems to present more of a problem with chorus and flanger circuits because of the higher frequencies.
When you say higher frequencies, do you mean that the BBD is running at shorter delay times for chorus and flange, and thus the BBD clock is a higher frequency than longer delays?
Brian Willoughby