Audio & Music Production Concepts: Understanding Phase

You've probably been warned about "phase" when recording with multiple microphones, or discovered phase issues when mixing down. But what is phase, actually, and how can you make use of it when producing?

You may have heard the term ‘phase’ being used a lot in the audio production world. Phase cancellation, out of phase, phase invert and so on are part of the everyday vocabulary of a practicing audio engineer. In fact, most hardware mixers and many plug-ins have a dedicated switch (ø) to invert the incoming signal’s phase.

Understanding the phase relationship between different audio signals is very important for getting a better mix. This applies especially to low-frequency content, where phase problems are most audible. Let's take a look at this concept of phase in the audio world.

Phase Theory

Most musical waveforms are cyclical in nature, alternating between the positive and negative parts of the cycle. You may have heard the phrase A4 = 440 Hz. This simply means that the note A in the fourth octave cycles 440 times per second. In other words, the note goes through all the phase values from 0° to 360°, 440 times in one second. Phase describes a specific point in the waveform during its cycle, measured in degrees, starting at 0° and reaching the end of one complete cycle at the 360° mark (see image below).

Phase.
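
To make this concrete, here is a minimal sketch in Python (assuming NumPy is available; the names and values are purely illustrative) that generates a 440 Hz sine wave starting at any phase expressed in degrees:

import numpy as np

sample_rate = 44100                        # samples per second
t = np.arange(sample_rate) / sample_rate   # one second of time values
freq = 440.0                               # A4 cycles 440 times per second

def sine_at_phase(phase_degrees):
    # 0°-360° of phase maps to 0-2π radians of the sine's cycle
    phase_radians = np.deg2rad(phase_degrees)
    return np.sin(2 * np.pi * freq * t + phase_radians)

wave_start = sine_at_phase(0)      # begins at 0°, the start of the cycle
wave_quarter = sine_at_phase(90)   # begins a quarter of the way into the cycle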

By itself, phase doesn’t really mean much, but when two or more waveforms interact, the relative phase between them is very important to consider, as it can lead to constructive or destructive interference.

Constructive Interference

Let's say we have two copies of the exact same waveform playing together at the same time, with the same phase. This will lead to perfect constructive interference, and the result will be the same waveform, just louder. It's simple math: say the highest value at the positive part of the cycle is +1, the lowest value at the negative part is -1, and the middle line sits at 0. When the two waveforms interact they are summed, so the two positive peaks add up to +2 and the two negative peaks add up to -2. Because the two waveforms are identical, all the intermediate values add up in the same way, and the output is a waveform that sounds the same tonally but has twice the amplitude (a 6 dB boost). When this happens, the signals are referred to as being ‘in phase’.

Constructive Interference.
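
If you want to check the arithmetic yourself, a small Python/NumPy sketch (hypothetical names, same assumptions as above) sums two identical, in-phase sine waves and confirms that the peaks double:

import numpy as np

sample_rate = 44100
t = np.arange(sample_rate) / sample_rate

wave_a = np.sin(2 * np.pi * 440 * t)   # peaks at +1, dips to -1
wave_b = np.sin(2 * np.pi * 440 * t)   # an identical copy, starting at the same phase

summed = wave_a + wave_b               # sample-by-sample addition, like two faders feeding one bus
print(summed.max(), summed.min())      # roughly +2.0 and -2.0: the same tone at twice the amplitude (+6 dB)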

Destructive Interference

Another scenario is when one of the waveforms starts at 180° of phase relative to the first. This will lead to destructive interference, and the result will be complete silence. Again using our values of +1 for the highest and -1 for the lowest part of the waveform, the highest part of one waveform lines up with the lowest part of the other. Mathematically this sums to 0, all the intermediate values cancel each other out as well, and the output is complete silence. When this happens, the signals are referred to as being ‘out of phase’.

Destructive Interference.
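
The same kind of sketch, with the second waveform started 180° (π radians) later, shows the cancellation numerically:

import numpy as np

sample_rate = 44100
t = np.arange(sample_rate) / sample_rate

wave_a = np.sin(2 * np.pi * 440 * t)
wave_b = np.sin(2 * np.pi * 440 * t + np.pi)   # the same waveform, but starting at 180°

summed = wave_a + wave_b
print(np.abs(summed).max())   # effectively 0: +1 meets -1 at every sample, leaving silence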

Now in real life this rarely happens, as you will not usually combine two or more waveforms that are exactly the same. Even the slightest difference between the interacting waveforms will prevent perfect constructive or destructive interference. That said, the effect is still noticeable: if a waveform is out of phase with another waveform of similar tonality, there will be some degree of cancellation. The cancellation may not be absolute, but it can be clearly audible, usually as a loss of low-frequency content.
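
As a rough illustration of partial cancellation (again a hedged Python/NumPy sketch, with a 60 Hz tone standing in for low-end content), shifting one copy by 120° instead of 180° still costs a noticeable amount of level:

import numpy as np

sample_rate = 44100
t = np.arange(sample_rate) / sample_rate

low = np.sin(2 * np.pi * 60 * t)                             # a 60 Hz tone, standing in for low-end content
low_shifted = np.sin(2 * np.pi * 60 * t + np.deg2rad(120))   # a similar signal, 120° out of phase

summed = low + low_shifted
print(summed.max())   # about 1.0 instead of the in-phase 2.0: roughly 6 dB of low end gone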

Practical Application of Waveform Phase Knowledge

Now how can we make use of this knowledge in a real-world scenario? Think about a stereo miking example: let's say you record an acoustic guitar with two microphones. Since both signals come from the same guitar, there is a very high chance that they will interact and cancel each other out to some degree, depending on their phase relationship. The two mics will not capture exactly the same signal, especially if they are placed some distance apart, but signal cancellation is a definite possibility. In stereo miking this cancellation is desired to an extent, as it adds to the stereo widening effect, especially when the two signals are panned to either side of the stereo field. But it should not become so extreme that you actually hear a loss in the signal. This loss is generally noticeable in the low-frequency content, but can also be audible in the mids and highs depending on how sharp your ears are.
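
To get a feel for why mic spacing matters, here is a back-of-the-envelope sketch (plain Python; the 0.5 m path difference is a made-up figure for illustration) that converts the extra distance to the second mic into a delay and works out which frequencies it cancels:

speed_of_sound = 343.0   # metres per second, approximately
extra_path = 0.5         # hypothetical: the second mic sits 0.5 m further from the guitar

delay = extra_path / speed_of_sound   # extra travel time to the second mic
first_notch = 1 / (2 * delay)         # the frequency whose half-cycle equals that delay
print(round(delay * 1000, 2), "ms")   # about 1.46 ms
print(round(first_notch), "Hz")       # about 343 Hz: the first frequency to cancel when the mics are summed

# At that frequency the two mics capture the same tone 180° apart, so summing them
# (or collapsing the stereo pair to mono) thins it out. Odd multiples of the notch
# (about 1029 Hz, 1715 Hz, ...) cancel too - the familiar 'comb filter' sound.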

Another example is when you layer two kick drum sounds together in a mix. Generally you will never use the exact same kick drum, so there is no need to worry about perfect destructive interference, but even with contrasting kick drum sounds there is some possibility of constructive or destructive interference depending on the phase relationship, because the sounds are ‘similar’.

We generally want to avoid destructive interference and achieve constructive interference as much as possible, so the best approach is to ensure that the signals are in phase. The phase (ø) switch on a mixer or plug-in can invert a signal if required. Adjusting the distance between the two mics in a stereo miking scenario can avoid cancellation as well. Or subtly moving one of the waveforms forward or backward in your DAW to phase-align them can be a very effective way of ensuring proper phase interaction.
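
In a DAW you would do this by ear or by looking at the waveforms, but both fixes can be sketched in code. The following is a hedged Python/NumPy illustration, not any particular plug-in's algorithm: flip the polarity of one signal, or work out how far to nudge it so it lines up with the other:

import numpy as np

def invert_polarity(signal):
    # What the ø switch does: every positive sample becomes negative and vice versa
    return -signal

def best_alignment_offset(reference, other, max_shift=2000):
    # Slide 'other' against 'reference' by up to max_shift samples each way and
    # return the shift where the two waveforms line up best (highest correlation).
    shifts = range(-max_shift, max_shift + 1)
    scores = [np.dot(reference, np.roll(other, s)) for s in shifts]
    return shifts[int(np.argmax(scores))]

# Hypothetical example: the 'far' mic hears the same thump 120 samples later.
sr = 44100
t = np.arange(sr) / sr
close_mic = np.sin(2 * np.pi * 100 * t) * np.exp(-5 * t)   # a decaying 100 Hz thump
far_mic = np.roll(close_mic, 120)                          # same thump, arriving 120 samples later

print(best_alignment_offset(close_mic, far_mic))   # -120: nudge the far mic 120 samples earlier
aligned = np.roll(far_mic, -120)                    # now the two signals sum constructively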

There is no need to worry about every possible signal being in phase with every other signal; in a production with over 100 tracks that would be an unnecessary waste of time. The key factor for phase interaction is ‘similar signals’. The top and bottom mics on a snare drum will pick up similar signals. The kick drum mic and the overhead mics (if the overheads are also picking up the kick drum) will pick up similar signals. Any kind of stereo miking scenario will produce similar audio from the two mics. Layering multiple sounds to achieve a different tonality (two kicks or two basses) can also result in similar signals being mixed together. Basically, in any scenario where ‘similar signals’ are being mixed together, you need to check the phase interaction.

Interested in learning more? Check out this video course by Joe Albano on Audio Concepts 103: Acoustics, which goes in depth into phase vs polarity and much more!

Rishabh Rajan is an award-winning music producer & educator currently based in New York. He produces electronic music under the name code:MONO & hosts a YouTube channel featuring music and live mashup videos using performance controllers like the Ableton Push. He is also a sample library developer having worked with companies like Bela...
