[sdiy] Getting dsPICs talking

Seb Francis seb at burnit.co.uk
Thu May 28 22:09:15 CEST 2009


Hi Tom,

A quick thought about this: it makes sense to transfer the 
modulation data directly in the digital domain, so long as you don't 
want to add/modify modulation from analog sources.

The CODEC/DCI interface would be a convenient way of transferring 
multi-channel streams, or you could use one of the other serial 
interfaces (SPI/I2C/etc) without many extra instructions to encode/decode.
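
For example (just a sketch of the sending side, assuming a plain SPI 
link and a hypothetical spi_write16() helper rather than real dsPIC 
register code), the six modulation channels can simply be shifted out 
in a fixed slot order once per modulation tick, so there's nothing to 
encode or decode beyond counting words:

#include <stdint.h>

#define NUM_MOD_CHANNELS 6

/* Latest modulation values, updated by the modulation code once per
   modulation sample period. */
static volatile int16_t mod_values[NUM_MOD_CHANNELS];

/* Hypothetical helper: write one 16-bit word to the SPI (or DCI)
   peripheral and wait for the transfer to complete. */
extern void spi_write16(uint16_t word);

/* Called once per modulation sample tick (~10 kHz): send all six
   channels in a fixed slot order, so the receiver reassembles the
   frame just by counting words. */
void send_mod_frame(void)
{
    for (int i = 0; i < NUM_MOD_CHANNELS; i++) {
        spi_write16((uint16_t)mod_values[i]);
    }
}

With the DCI you'd get much the same thing in hardware, with the six 
words sitting in six time slots of one frame sync period.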

Clock the DCI interface at the modulation sample rate, then act on each 
new value as it comes into the audio dsPIC.  It will probably work 
better if the sample rates are exact multiples, so perhaps use 62.5/6 
kHz (about 10.4 kHz) for the modulation sample rate.  You could make 
the audio dsPIC the DCI master and clock the modulation dsPIC off the 
DCI interface, and thus keep the two exactly in sync.
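
To make that concrete, here's a minimal receive-side sketch for the 
audio dsPIC, assuming the 6:1 rate ratio above and two hypothetical 
helpers (dci_frame_ready() / dci_read16()) standing in for the real 
DCI receive buffers.  The linear ramp between successive modulation 
values is just one optional way of smoothing the ~10.4 kHz steps; 
acting directly on each new value as it arrives would also work:

#include <stdbool.h>
#include <stdint.h>

#define NUM_MOD_CHANNELS 6
#define RATE_RATIO       6   /* 62.5 kHz audio / (62.5/6) kHz modulation */

/* Hypothetical helpers wrapping the DCI (or SPI) receive side. */
extern bool    dci_frame_ready(void);  /* a complete 6-word frame has arrived */
extern int16_t dci_read16(int slot);   /* read one received time slot */

static int32_t mod_current[NUM_MOD_CHANNELS]; /* 16.16 fixed-point value in use */
static int32_t mod_step[NUM_MOD_CHANNELS];    /* per-audio-sample increment */

/* Called once per 62.5 kHz audio sample (e.g. from the audio ISR).
   When a new modulation frame arrives -- once every RATE_RATIO audio
   samples if the modulation dsPIC is clocked off the DCI master --
   set up a linear ramp towards it; in between, keep ramping. */
void update_modulation(void)
{
    if (dci_frame_ready()) {
        for (int i = 0; i < NUM_MOD_CHANNELS; i++) {
            int32_t target = (int32_t)dci_read16(i) * 65536L;  /* to 16.16 */
            mod_step[i] = (target - mod_current[i]) / RATE_RATIO;
        }
    }
    for (int i = 0; i < NUM_MOD_CHANNELS; i++) {
        mod_current[i] += mod_step[i];
        /* (mod_current[i] >> 16) is the value to apply to pitch or
           waveshape for this audio sample. */
    }
}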

Should work I guess ... haven't thought about it in detail though.

Seb


Tom Wiltshire wrote:
> Hi All,
>
> I'm finally into the construction stage of my "Protowave" monosynth 
> project. The synth uses two dsPIC 33FJ128GP802 processors for its 
> voice, followed by analog VCF/VCA.
> One of these uPs is dedicated to production of audio and can generate 
> two oscillators at 62.5 kHz (the "Sources uP"). The second uP is 
> dedicated to modulation and deals with all the envelopes, LFOs etc. 
> The final sample rate for this hasn't been determined, but is likely 
> to be around 10 kHz. Both processors are controlled via an SPI link 
> from a third processor which deals with MIDI, scans the keyboard and 
> reads the panel controls.
>
> The modulation processor has a 2-channel DAC on chip. The output from 
> this DAC is multiplexed out to sample-and-hold buffers in the usual 
> way. These analogue control voltages control the VCF and VCA.
>
> Now my query:
>
> I have to get modulation information (pitch and waveshape modulations) 
> from the Mod uP to the Sources uP: How best to do it?
>
> I had thought to also feed analogue control voltages from the Mod uP 
> to the ADC inputs on the Sources uP. Taking the signals from the 
> digital domain into the analogue domain and back again conveniently 
> gets around the fact that the two uPs run at different sample rates 
> (off different clocks, actually), since the anti-alias filtering acts 
> as interpolation between sample values.
>
> However, this conversion digital->analogue->digital does seem a bit 
> crazy. Especially so when you consider that the chips have a codec 
> interface on chip. Having realised this, I wondered whether I could 
> send the modulation signals to the Sources uP via the codec interface 
> instead. Presumably I'd be able to send multichannel data (I've got 6 
> channels to transmit) over the digital connection. But how would I 
> handle the different sample rates? Would I need to synchronise the two 
> uPs? Do I need to digitally filter the outgoing/incoming modulation 
> data to deal with the different sample rates? Am I crazy having 
> different rates? Would I be better having one as a multiple of the other?
> Does anyone have any experience making two uPs talk to each other like 
> this? One is essentially pretending to be an audio codec, and the 
> other is processing that signal and outputting it. The difference from 
> the usual situation is that normally the codec rate and the output 
> sample rate would be the same, whereas here they differ.
>
> If I can make this work, it'll be really great. But there's a fair 
> mountain of work still. I'd hoped to finish before this year's SDIY 
> UK, but it'll have to be next year!
>
> Any suggestions appreciated.
>
> Thanks,
> Tom
>
>




