[sdiy] MIDI Clock sync advice

brianw brianw at audiobanshee.com
Sat Mar 9 00:32:30 CET 2024


Commentary inline...

On Mar 8, 2024, at 2:52 PM, Tom Wiltshire wrote:
> Hi Brian,
> 
> Yes, thanks. I'm not sure at exactly what point I realised that the whole design of the MIDI protocol was arranged around low-power processors (the only type they had at the time!) but it might well have been when I started thinking about realtime messages. The structure of the protocol makes it pretty clear that the best way to deal with them is to strip them out in the interrupt so that they get the most time-accurate handling and also don't get in the way of the parsing of other messages. It's a really clever design.

Agreed. This is equally important for higher-power processors, because it reduces jitter significantly. Even a 1.2 GHz processor has perhaps hundreds of cycles of latency between a physical event and the software response, so just running things faster doesn't mean that sloppiness is acceptable.

Although Arduino is convenient and easy, it seems built around a rather poorly designed "MIDI" class object. Looking at MIDI software written for Arduino, it's apparent that the "MIDI" class simply buffers all MIDI bytes in the same queue, and then the client of the "MIDI" object pulls the clock messages out after a great amount of time (from the processor's point of view) has passed. The Arduino design will have a lot of jitter that varies according to software activity. I'm actually surprised that nobody has created a more extensive Arduino "MIDI" object that handles MIDI clock directly. Then again, that's getting way more advanced than most Arduino users are prepared for.
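To make the alternative concrete, here's a rough sketch of the interrupt-level approach in C. The function and variable names are mine, not any real Arduino or PIC API: the point is just that System Real Time bytes (0xF8-0xFF) are single bytes, so they can be recognized and acted on inside the RX interrupt, while everything else goes into the queue for later parsing.

```c
#include <assert.h>
#include <stdint.h>

#define QUEUE_SIZE 64  /* power of two for cheap wraparound */

/* Illustrative names only -- a sketch of the idea, not a real API. */
static volatile uint8_t queue[QUEUE_SIZE];
static volatile uint8_t q_head;
static volatile uint32_t clock_ticks;   /* bumped instantly on 0xF8 */

/* Called from the UART RX interrupt with each incoming byte.
   System Real Time messages are single bytes 0xF8..0xFF, so they
   can be handled right here with minimal jitter, and they never
   enter the queue that the main loop parses at its leisure. */
void midi_rx_isr(uint8_t byte)
{
    if (byte >= 0xF8) {              /* System Real Time */
        if (byte == 0xF8)            /* Timing Clock */
            clock_ticks++;           /* retime the sequencer here */
        /* 0xFA Start, 0xFB Continue, 0xFC Stop handled similarly */
        return;                      /* never queued */
    }
    queue[q_head] = byte;            /* everything else: queue it */
    q_head = (q_head + 1) & (QUEUE_SIZE - 1);
}
```

Note that a real-time byte can legally arrive in the middle of another message, which is exactly why stripping it out at the interrupt level keeps the parser in the main loop simple.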


> My current project is in C on a PIC 18F processor, so it's pretty "bare metal". I've got this far on a 16F chip, but I've decided to give it an upgrade because the 18F chips offer many significant enhancements, particularly DMA for data transfers and the hardware multiply.

I have designed USB-MIDI peripherals around the PIC18F. I subsequently tried designing around a "smaller" PIC, and found it too limiting (never finished that project). The PIC18F has at least two compilers, but the faster one is limited by the PIC's hardware stack depth of just 15 levels of subroutine calls. It's difficult to design C code around such a finite stack, but the faster compiler has tools that will report the depth of your subroutine calls (note: no recursion!).

The good thing is that the PIC18F has plenty of timer peripherals that you can use to make a really tight sequencer.


> I *was* thinking of mirroring hardware +5V triggers (for analog drum circuits) as MIDI note output. Obviously the timing won't be as tight on the MIDI output, but that's just the nature of the beast. All the hardware triggers can be accurately synchronised to all rise together, which is clearly impossible with MIDI Note On messages which take roughly 1msec for the first one and 600msecs for subsequent ones (because of running status). In practice, most human ears aren't even *that* good. You'd be able to hear the latency if you played the internal sound sources against samples of them triggered via MIDI, for example, but I was mostly thinking of this as a way to use the sequencer to trigger external sound modules with *different* sounds, so I don't see it being too much of a problem. Perhaps if I get that far I could even add in a bit of "pre-compensation" so the MIDI notes start going out slightly ahead of time so that although some might be early, the worst of them are never so late? That might need to account for how many notes were about to be played, which all starts to get more complicated. There's a lot of ways to deal with this stuff, depending how involved you want to get.

Lots to consider here.

One thing to keep in mind is that the human brain automatically adjusts for significant latency, so long as the jitter is minimal. Thus, you might not actually need to pre-compensate any notes to be early, since musicians will automatically adjust for that. Instead, focus on making sure that the timing is as tight as possible and the jitter is minimized.

You should definitely try to design +5V triggers that can be updated simultaneously. Actually, with the PIC18F, I recall that you can only update 8 GPIO at once. If you have more than 8 triggers, then some will have to be updated on a separate CPU clock. This should not be a problem, given that the venerable TR-808 operates on a 2 millisecond tick rate. On the topic of GPIO, take care to select GPIO Ports that have all 8 bits implemented. There are a few Ports where only 6 or 4 or fewer bits actually make it out to physical pins on the chip. I usually spend a significant amount of time planning the GPIO bits and pins so that they're grouped smartly.
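The simultaneous-update point boils down to one instruction. A sketch, with a shadow variable standing in for the PIC's LAT register (on real hardware you'd write to LATB or whichever full 8-bit port you picked):

```c
#include <stdint.h>

/* latb_shadow stands in for a PIC18F LAT register here, so this
   compiles on a desktop; on hardware the write goes to LATB. */
static volatile uint8_t latb_shadow;

/* Bit i of mask drives trigger output i. A single 8-bit write
   means every trigger edge lands on the same instruction cycle,
   with no bit-by-bit skew between drum voices. */
void fire_triggers(uint8_t mask)
{
    latb_shadow = mask;   /* on hardware: LATB = mask; */
}
```

With more than 8 triggers you'd do one such write per port, a few CPU cycles apart, which is still far inside a 2 ms tick.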

As for MIDI, think of MIDI output as having two types of destinations.

If your MIDI output is driving another MIDI sequencer, then you don't really need to send notes. Just let the remote sequencer store the melody, and use MIDI clock to synchronize the two. If you implement MIDI output smartly, with separate queues for System Real Time versus "other," then your output timing can be very tight.
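The two-queue output scheme might look something like this (names are illustrative, not any particular library's API). The TX interrupt always drains the real-time queue first, so a clock byte never waits behind a burst of note traffic -- and since System Real Time bytes may legally be interleaved anywhere in the stream, jumping the queue is safe:

```c
#include <stdint.h>

#define QSIZE 32   /* power of two; uint8_t indices wrap cleanly */

static uint8_t rt_q[QSIZE], rt_head, rt_tail;   /* clock/start/stop */
static uint8_t tx_q[QSIZE], tx_head, tx_tail;   /* notes, CCs, ...  */

/* Route each outgoing byte to the appropriate queue. */
void midi_send(uint8_t byte)
{
    if (byte >= 0xF8) rt_q[rt_head++ & (QSIZE - 1)] = byte;
    else              tx_q[tx_head++ & (QSIZE - 1)] = byte;
}

/* Called by the UART TX-empty interrupt: real-time bytes first.
   Returns 1 with a byte in *out, or 0 if nothing is pending. */
int midi_tx_next(uint8_t *out)
{
    if (rt_tail != rt_head) { *out = rt_q[rt_tail++ & (QSIZE - 1)]; return 1; }
    if (tx_tail != tx_head) { *out = tx_q[tx_tail++ & (QSIZE - 1)]; return 1; }
    return 0;
}
```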

For non-sequencer MIDI destinations, you can design limitations like only allowing a single MIDI channel, such that the running status will cut one third of the MIDI data payload. Or you can have multiple serial ports so each track of your sequencer feeds a different MIDI output jack, and thus each MIDI output jack has less of a bottleneck.
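The running-status saving is easy to see in code. A minimal encoder sketch (my own names): when the status byte matches the previous one, it is simply omitted, so a repeated 3-byte Note On shrinks to 2 bytes -- the one-third saving mentioned above.

```c
#include <stdint.h>

static uint8_t last_status;   /* 0 = no status sent yet */

/* Writes one channel message into buf, omitting the status byte
   when it matches the previously sent status (running status).
   Returns the number of bytes written (3 or 2). */
int midi_encode(uint8_t status, uint8_t d1, uint8_t d2, uint8_t *buf)
{
    int n = 0;
    if (status != last_status) {
        buf[n++] = status;
        last_status = status;
    }
    buf[n++] = d1;
    buf[n++] = d2;
    return n;
}
```

This is also why restricting a destination to a single channel helps: every channel change forces a fresh status byte and defeats the compression.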

Meanwhile, how did you calculate the 1 msec and 600 msecs values? Are those for USB-MIDI? By my calculations, a 3-byte MIDI Note On should take 0.96 milliseconds, and a 2-byte running status message would be an additional 0.64 milliseconds. I guess that's 1 ms and 640 µs. Never mind, I guess it's just a difference in abbreviations...
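For anyone following along, the arithmetic is just the wire speed: MIDI runs at 31250 baud, and each byte is framed as 10 bits (1 start + 8 data + 1 stop), so one byte occupies 10 / 31250 s = 320 µs on the wire.

```c
/* MIDI wire timing: 31250 baud, 10 bits per byte on the wire. */
#define MIDI_BAUD      31250
#define BITS_PER_BYTE  10                                    /* start + 8 + stop */
#define US_PER_BYTE    (BITS_PER_BYTE * 1000000 / MIDI_BAUD) /* 320 us */

/* Microseconds for a message of the given byte count:
   3 bytes (full Note On) -> 960 us; 2 bytes (running status) -> 640 us. */
int message_us(int bytes) { return bytes * US_PER_BYTE; }
```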


> Anyway, many thanks for your thoughts.
> 
> Tom
> 
> On 8 Mar 2024, at 21:33, brianw <brianw at audiobanshee.com> wrote:
>> Hi Tom,
>> 
>> You may have already designed with the following in mind, but I highly recommend taking advantage of the design of MIDI to handle clock messages in an efficient manner.
>> 
>> You will note that all of the System Real Time messages in MIDI are single-byte. This allows you to handle timing inside the interrupt for the serial input, without even involving the usual queue of MIDI data. Basically, your interrupt handler should extract all of the System Real Time messages out of the stream and handle them immediately, tightly integrated with your local time reference. All the remaining MIDI messages (that aren't System Real Time) can then go into the queue for non-interrupt processing.
>> 
>> This will require that you write your firmware as "bare metal" or at least using a Real Time Operating System (RTOS). Admittedly, I'm not familiar with the support for MIDI in higher level operating systems with serial drivers, so perhaps there are correct solutions out there.
>> 
>> In any case, having a good design for interrupt handling, and writing the code so that System Real Time messages are processed instantly, will ensure that your derived clock will be as accurate and stable as possible. Any slack here will result in more slop on your resulting clock.
>> 
>> Brian Willoughby
>> 
>> p.s. If your device also sends MIDI, then it will help to similarly design your MIDI output code to prioritize MIDI sync output without any latency due to queuing. You may end up needing a queue, but keep the sync queue separate from the non-real-time queue. But, being a drum sequencer, maybe you don't have MIDI output?
>> 
>> 
>> On Mar 8, 2024, at 7:10 AM, Tom Wiltshire wrote:
>>> Hi All,
>>> 
>>> Has anyone got any experience dealing with writing software to sync to MIDI clock that they can share?
>>> 
>>> I'm working on a drum sequencer which will run at 96PPQN, and it'd be nice if it could sync to incoming 24PPQN MIDI Clock messages.
>>> 
>>> I can see a couple of ways to do this:
>>> 
>>> 1) Some sort of PID controller, where we compare the internal timing and the incoming clock timing and derive some error signals.
>>> 2) IIR filtering. We measure the time between incoming clocks and then use an IIR filter to provide some averaging and smoothing. We then set the internal clock based on the filter's output.
>>> 
>>> (2) seems like the simpler approach. Clearly it will introduce some lag when changing tempo, but I'm not sure I see this as a fault - smooth tempo changes could be a feature. And depending on how much filtering is required, that lag might actually be quite short. What's a reasonable time constant for such a thing?
>>> 
>>> How has this been approached in the past? I know that I'm not the first person to do this, so I'm just trying to avoid re-inventing the stone-age MIDI wheel!
>>> 
>>> Many thanks for any ideas/pointers offered,
>>> Tom




More information about the Synth-diy mailing list