Quote:

MIDI resolution increased as processor speed did, if that is your question. A resolution of 120 ticks per quarter note at 120 BPM means about 4 ms per tick is as accurate as you can get (at 120 BPM a quarter note lasts 500 ms, and 500 / 120 is roughly 4.2 ms).

Change that to a resolution of 1920 and the accuracy becomes 0.26 ms. Quite a bit more accurate!
Now resolutions twice as accurate as that are available.

As for how this affects RealTracks: my guess is they get chopped up at no more than the possible chord changes, so what may appear as a 'flam' could simply be the human performances. Remember, most of these tracks were recorded in separate sessions, not by a group playing together. When I need that level of accuracy I go to RealBand, where you can slide the audio around with pretty exacting resolution.

Just my thoughts.
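
For anyone who wants to play with those numbers, here is a quick Python sketch of the tick arithmetic quoted above. The function name and the list of resolutions are only illustrative; the inputs are just the PPQ (ticks per quarter note) and the tempo.

Code:

def ms_per_tick(ppq, bpm):
    """Duration of one MIDI tick in milliseconds."""
    quarter_note_ms = 60_000.0 / bpm   # one beat (quarter note) in ms
    return quarter_note_ms / ppq       # split the beat into ppq ticks

# 120 PPQ -> ~4.2 ms, 1920 PPQ -> ~0.26 ms, 3840 PPQ -> ~0.13 ms
for ppq in (120, 1920, 3840):
    print(f"{ppq:>5} PPQ at 120 BPM = {ms_per_tick(ppq, 120):.2f} ms per tick")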




I'd like to add that this is only true in theory, even with precise MIDI timing.

In reality there are other weird obstacles to keeping audio and MIDI in tight sync. The audio itself is sent to the D/A converters in buffers. Say you had a buffer setting of 44,100 samples: then the audio would be one second behind real time (at a 44.1 kHz sampling frequency). Real buffers are smaller, though, and the smaller the better. BUT: small buffers eat processor time, so we have to find a good compromise depending on our own CPU, motherboard and so on. This delay is called "latency", and it is on the order of a couple of milliseconds. That obviously doesn't make life easier for the programmers of music software.
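
To put some numbers on that, here is a small sketch of the buffer-latency arithmetic: buffer size divided by sample rate. The function name and the buffer sizes below are just typical examples, not values from any particular audio interface or driver.

Code:

def buffer_latency_ms(buffer_samples, sample_rate_hz=44_100):
    """How far (in ms) the audio buffer puts playback behind real time."""
    return 1000.0 * buffer_samples / sample_rate_hz

# 44100 samples -> 1000 ms, 1024 -> ~23 ms, 256 -> ~5.8 ms, 64 -> ~1.5 ms
for buf in (44_100, 1024, 256, 64):
    print(f"{buf:>6} samples at 44.1 kHz = {buffer_latency_ms(buf):.2f} ms of latency")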