When listening to a song you’ve never heard before, what is it that draws your attention? Is it the beat that gets you moving? Maybe it’s the lyrics tugging at your heartstrings? Or is it the arrangement and interplay of the instruments?
A great song has a bit of all of these, but what sets a song apart from the rest often comes down to the hidden science and art of audio engineering that puts those pieces together.
With the diversity in music today, we can all find something that draws our attention. Let's take a look at how the Musical Trinity of engineering, songwriting, and arrangement creates those songs that linger with us.
Some people may hear "engineering" as a dirty word when it comes to music. They start to think about the dreaded auto-tune and how someone has manipulated the recordings to make flawed artists sound like premier musicians. Sadly, this has become commonplace among famous artists who want absolute perfection in their recordings. But what about the up-and-comers? They can't all afford the studio time or the equipment needed to put out that top-notch production. This has created an interesting gap between self-produced albums and label productions. Is one better than the other?
Sure, you can hear better fidelity in a label production, but that doesn't take anything away from the quality of self-produced albums. The key thing to listen for in the production of an album is balance: a discerning ear notices where instruments are placed in space, whether they all sound like they are playing in the same room, and whether you can close your eyes and actually picture it, as if you were sitting in front of the group watching them perform.
The idea behind a modern recording studio is to record each sound as closely as possible, so the microphone picks up as little of the surrounding environment as it can. During the production and mixing of the album, the engineer then builds the room those sounds should be placed in, creating width so we hear something from the far left to the far right.
A lot of inexperienced engineers, though, forget that we don't hear in just two dimensions, and so they neglect to take depth into account. When we listen to a concert, are all of the musicians right at the front of the stage, hitting us at the same volume? No, of course they're not. The drummer is usually towards the back of the stage, the singer is front and center, and the guitars and bass sit between the two. We hear with depth; we recognize that a sound is far away or right in front of us (if you want to know more about that, look up auditory distance perception, which relies on cues like loudness and the balance of direct sound to room reflections).
It's that principle of placing instruments in a three-dimensional field that is the foundation of balance in a recording. I say the foundation because once you've figured out where in the environment you're placing the instruments, you have to consider the listening environment and ask yourself: if a sound is directly in front of me or far off, how much ambient noise and how many room reflections will I hear? The answer tells you how much reverb and delay to use so that all the instruments sound like they are playing together.
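For the curious, those distance cues can even be put into rough numbers. Here's a minimal sketch (the function name and figures are mine, not from the article, and real mixing decisions are made by ear): a source farther back is quieter, following the inverse-distance law for sound pressure, and its sound arrives a little later, which engineers often mimic with a reverb pre-delay.

```python
import math

SPEED_OF_SOUND_M_S = 343.0  # approximate speed of sound in air at 20 °C


def depth_cues(distance_m, reference_m=1.0):
    """Rough cues for placing a source distance_m meters back in a mix.

    Returns (gain, predelay_ms):
    - gain: level relative to a source at reference_m, using the
      inverse-distance (1/r) law for sound pressure in a free field.
    - predelay_ms: extra milliseconds the direct sound takes to travel,
      a starting point for a reverb pre-delay setting.
    """
    gain = reference_m / distance_m
    predelay_ms = (distance_m - reference_m) / SPEED_OF_SOUND_M_S * 1000.0
    return gain, predelay_ms


# A drummer 8 m back vs. a singer 1 m away:
drum_gain, drum_predelay = depth_cues(8.0)
```

With these made-up numbers, the drummer comes back at one-eighth the level of the singer and roughly 20 ms later, which is why pulling a fader down and nudging the pre-delay up can push an instrument toward the back of the imagined stage.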
Article originally appeared on Hypebot (http://www.hypebot.com) and was written by Josh Srago.