Best Practices For Sharing MIDI Files Between DAWs

If you're working between multiple DAWs, it's fairly easy to transfer audio. But what about MIDI files? Joe Albano explains the how and why of sharing MIDI.

When MIDI first came out, people quickly realized that it was much easier to share MIDI files than audio files—audio files are large, while MIDI files are tiny, containing only the data that describes the performance (notes, timing, velocity, etc.) and not the actual sound waves. MIDI file sharing is now widely done, but the very aspect that makes these files so convenient—the absence of megabytes of audio data—can also make sharing them a bit of a crapshoot at times, especially when both parties need the sound quality to be up to professional levels.

General MIDI (or not)  

For hobbyists and casual MIDI users, there's a standard in place called General MIDI (GM), which specifies a standardized set of virtual instrument sounds. Someone who receives MIDI files from, say, a collaborator can drop them into virtual instrument tracks, and embedded patch change messages in the MIDI tracks will automatically call up the correct sound—for example, program change #1 is always Grand Piano, #34 is always Bass, etc. But just because a patch in one user's General MIDI set says "Grand Piano", that doesn't mean that patch will sound anything like the Grand Piano the other collaborator was hearing when he created the MIDI performance. Not only is the sound likely to be different, but the patch may respond to the MIDI performance data differently, to boot.

For example, a dynamic piano performance captured as a MIDI file may be almost completely lacking in dynamics if it's set to trigger a different piano patch—one programmed to have little or no dynamic response. And if the original piano had a dark, moody, ambient tone that suited the music, and the substitute was more of a bright, tinkly, almost honky-tonk sound, the musician using that mismatched sound as a reference would be likely to come up with parts that really won't fit when his collaborator hears them against the proper piano sound. When two (or more) musicians are using what they each hear from the shared MIDI files as a reference for adding their own parts and building a (long-distance) arrangement, the music may suffer if the sounds they each hear are significantly different, especially in terms of musical nuance.

So, for casual projects, General MIDI is probably fine, but professional users usually have to put a little more effort into ensuring good, appropriate instrument sounds on both ends. They need to either invest in the same sound libraries (not always a practical $$ option!) or be ready to invest a bit of time and effort in matching MIDI performance data to the sounds they do have, so the parts they each add to the arrangement won't be compromised by the lack of instrumental equivalence. Let's take a look at the steps a pair of MIDI collaborators might take to exchange MIDI files back & forth on a shared project without undue compromise.

Standard MIDI Files 

First, the basics. There's a universal format for exchanging MIDI files, and it usually works flawlessly. The data is stored in a Standard MIDI File (SMF), which contains the notes, timing, and all other performance gestures (note velocity, wheel data, pedal data, etc.)—everything you'd see in a standard MIDI editor window (piano roll or event list). The SMF also includes tempo information. Creating an SMF is easy—depending on the specific DAW, you typically select all the MIDI tracks/regions and then just invoke the appropriate "Export as SMF" or "Export MIDI" command.
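
For the curious, here's roughly what the DAW is writing under the hood when it exports. This is a minimal sketch using the Python mido library (my choice for illustration—no DAW workflow requires it); the track name, tempo, notes, and filename are just placeholder values.

```python
import mido
from mido import MidiFile, MidiTrack, Message, MetaMessage

# Build a one-track (Type 1) SMF at 480 ticks per quarter note.
mid = MidiFile(type=1, ticks_per_beat=480)
track = MidiTrack()
mid.tracks.append(track)

# Track name, tempo, and patch are stored right in the file...
track.append(MetaMessage('track_name', name='Piano', time=0))
track.append(MetaMessage('set_tempo', tempo=mido.bpm2tempo(96), time=0))
track.append(Message('program_change', program=0, time=0))  # GM program #1: Grand Piano

# ...along with the notes themselves (delta times are in ticks).
track.append(Message('note_on', note=60, velocity=100, time=0))
track.append(Message('note_off', note=60, velocity=0, time=480))

mid.save('piano_sketch.mid')  # note the .mid extension, not .smf
```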

Fig 1 Exporting a Standard MIDI File in Logic

But don't look for a file on the HD with a .smf extension—SMFs actually use the extension .mid. To import an SMF, you can usually either use the DAW's Import command and choose MIDI File, or simply drag one into the session. What happens then depends on what type of SMF it is. If the SMF contains only one track, it was likely saved as a Type 0 file (single track); if multiple tracks were included, the SMF will be Type 1—it'll contain individual tracks for each part, and on import, the receiving DAW will create corresponding instrument tracks for each part—sometimes with the appropriate General MIDI sound in place for each (with a test file, Logic instantiated the correct virtual instruments and was ready to play, while Pro Tools created only empty instrument tracks).
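
Before importing, it can also help to peek inside the file and confirm what you've actually been sent. Here's a quick sketch—again using the Python mido library, which is an assumption on my part, not something any DAW needs—that reports the SMF type, track names, tempo, and program changes; the filename is a placeholder.

```python
import mido

mid = mido.MidiFile('shared_session.mid')  # placeholder filename
print(f'SMF Type {mid.type}, {len(mid.tracks)} track(s), {mid.ticks_per_beat} PPQ')

for i, track in enumerate(mid.tracks):
    note_count = sum(1 for msg in track if msg.type == 'note_on' and msg.velocity > 0)
    print(f'Track {i}: "{track.name}" ({note_count} notes)')
    for msg in track:
        if msg.type == 'set_tempo':
            print(f'  tempo: {mido.tempo2bpm(msg.tempo):.1f} BPM')
        elif msg.type == 'program_change':
            print(f'  program change: #{msg.program + 1}')  # shown 1-indexed, GM style
```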

Fig 2 Importing a Standard MIDI File into Logic

Sounds, sounds, sounds 

But whether or not the DAW goes the extra mile and instantiates the appropriate virtual instruments, more demanding musicians and producers will likely want to choose their own. If two collaborators don't both have the same libraries, then it's important for each to include a stereo bounce of the rough mix they've been working with, as a reference not only of what instruments are in use, but of what the performances should sound like when the collaborator assembles a set of instruments on his end. It would also be wise to carefully name the tracks before exporting the SMF (with useful names, not fanciful ones like "big whoosh" or "space pad"), and to include a text document as well, documenting any relevant details about the instrumentation and arrangement that might be likely to get lost in translation.

Remapping drums

So what kinds of tweaks would typically have to be made to properly translate music in Standard MIDI Files to a new set of virtual instruments? OK, first the more nuts-and-bolts stuff. Obviously, the correct instruments need to be assigned (that's why good track-naming and documentation habits are key). Drums can present a particular problem. The General MIDI standard specifies which key triggers which drum sound—C1 is kick, D1 is snare, F#1 is closed hi-hat, etc.

Fig 3 The General MIDI standard drum map (L) vs a custom drum map (R)

This is fine, and it does work, but most of the best drum instruments/plug-ins find this standard too limiting—it doesn't provide enough note assignments for the multiple samples and articulations needed for more expressive drum parts (whether played or programmed). So if two collaborators use different drum engines, at least one of them will have to remap the drum parts. As long as he's provided with a chart of the correct note assignments, this is usually not too difficult, though it can be a bit time-consuming. The best strategy is to create a map and save it as a MIDI plug-in or transposition preset, using whatever means the DAW or drum instrument provides (many of the best drum instruments let you save maps, making it a breeze to apply them to remap drum files once the initial remapping has been done and saved).
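
If you'd rather script the remap than click through it note by note, the same idea fits in a few lines. This sketch uses the Python mido library (an assumption, not part of any DAW or drum plug-in); the target note numbers are purely hypothetical—take the real ones from the chart your collaborator sends.

```python
import mido

# GM source notes -> custom drum instrument's notes.
# Target values below are hypothetical placeholders.
NOTE_MAP = {
    36: 24,  # kick (GM C1)
    38: 26,  # snare (GM D1)
    42: 31,  # closed hi-hat (GM F#1)
}

# Assumes the file contains only the drum part.
mid = mido.MidiFile('drums_gm.mid')  # placeholder filename
for track in mid.tracks:
    for msg in track:
        if msg.type in ('note_on', 'note_off') and msg.note in NOTE_MAP:
            msg.note = NOTE_MAP[msg.note]

mid.save('drums_remapped.mid')  # original file is left untouched
```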

Hitting the curve 

But then you get to the potentially more tweaky part—matching the performance nuances to the musical response of the substitute instruments. Some issues may not be immediately apparent, but may show up as odd-sounding bits in the playback. For example, the most common pitchbend range—how many semitones the sound will be bent up & down at the maximum throw of the pitchbend wheel—is ±2 semitones. But if a part was recorded with an instrument that had a different response programmed into it, then subtle pitch gestures played in (little bends, blue notes, or manual vibratos) might either get lost, making for a bland performance, or turn into wild-sounding warbling. The same thing can happen with modwheel effects, and even sustain pedal information, let alone instruments where less-common MIDI controllers are in use (like many wind and brass sounds, especially in some of the bigger libraries). Once again, this kind of problem can be avoided with good documentation of any deviations from agreed-upon standards between the collaborators.
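
The cleanest fix is simply to set the substitute instrument's bend range to match the original, but if that's not possible you can rescale the recorded pitchbend data itself. Here's a sketch of that idea in Python with the mido library (again, just one way to do it); the ±2 and ±12 semitone ranges and the filename are assumptions for the sake of the example.

```python
import mido

ORIG_RANGE = 2    # semitones: the range the part was performed with (assumed)
NEW_RANGE = 12    # semitones: the substitute instrument's setting (assumed)

mid = mido.MidiFile('lead_part.mid')  # placeholder filename
for track in mid.tracks:
    for msg in track:
        if msg.type == 'pitchwheel':
            # Shrink the bend values so the audible pitch movement stays the same size.
            scaled = round(msg.pitch * ORIG_RANGE / NEW_RANGE)
            msg.pitch = max(-8192, min(8191, scaled))

mid.save('lead_part_rescaled.mid')
```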

The velocity of MIDI

Velocity data is usually the trickiest to get just right. As we all know, Velocity is part of each MIDI note—it indicates how hard (fast) the note was played, which normally translates to how loud, bright, and sharp the sound will be, reflecting the usual response of real instruments. But the MIDI data doesn't really know what's going to happen in response to the embedded Velocities in the notes—that's a function of the instrument's programming.

When a substitute instrument doesn't respond the way the original did, a great deal of the nuance and expression of the performance can be lost, so it's important to ensure a good translation. This can't really be described in words—you have to hear what the original sounded like, and then use that as a reference to tweak the new instrument's velocity response to match the original's (if you're up to that task!), or (much more likely) non-destructively scale the Velocity data in the MIDI track itself, by ear, until you hear that the substitute instrument has a reasonably close response to the original's.
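
In a DAW you'd use the kinds of transform/velocity tools shown in Fig 4 below, but the same scale-and-offset idea can be sketched in code—here in Python with the mido library (an assumption), where the scale and offset amounts are placeholders you'd arrive at by ear.

```python
import mido

SCALE = 0.85   # compress the velocity range (placeholder value, set by ear)
OFFSET = 12    # then shift it up slightly (placeholder value, set by ear)

mid = mido.MidiFile('piano_part.mid')  # placeholder filename
for track in mid.tracks:
    for msg in track:
        if msg.type == 'note_on' and msg.velocity > 0:
            msg.velocity = max(1, min(127, round(msg.velocity * SCALE + OFFSET)))

mid.save('piano_part_velocity_scaled.mid')  # original file is left untouched
```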

Fig 4 Some tools (in Logic) for adjusting/scaling Velocity data to match a MIDI performance to a different virtual instrument

That’s why a reference (audio) mix is important—it may be the only way to know exactly what the other musician was hearing, and to get your setup to match that as closely as possible, to ensure the most musical results as you play along with those imported, redirected MIDI tracks. 

In control 

So, after all this, you may ask, "Why go to all that trouble?" Instead of exchanging MIDI files, collaborating musicians could always render (bounce) each of the MIDI tracks as audio files, preserving the correct sound and response on the other end—after all, what's high-speed internet for? But often both collaborators may want to tweak each other's individual performances, and only MIDI file exchange and editing really allows that. I've done a lot of this kind of MIDI file swapping, and it's not really that tough, or that time-consuming, once you get used to it, establish a protocol with each collaborator, and get a routine going. Many (!) artists are real control freaks when it comes to their music (and can't abide the limitations of the General MIDI standard), and they feel the extra effort to get the translation just right when exchanging MIDI files is time well spent, when there just can't be any compromise that could affect the quality of the musical performances and arrangement.

