This AUD210 series is part of a college assessment.
When I received the song to remix, I opened the Pro Tools session and, without thinking to listen to the stems individually, hit play. As the song progressed, the first reference that came to mind was 'Long to Live' by Metric and Howard Shore. Since my initial idea was a soundscape-style remix, I thought the reference would be perfect.
However, to my surprise, I was presented with a big challenge: when I ran through the vocal stem on its own, I found a reverberant electric guitar bleed (or perhaps the guitar was simply recorded together with the vocal). I tried separating one instrument from the other in iZotope RX but got nowhere, as the harmonics of the electric guitar clashed with those of the vocal.
Nevertheless, I decided to go with it and build the remix around the vocal track. Since reverb was already baked in, there wasn't much I could do other than warp the track to fit a faster tempo.
Because music is my biggest weakness in the audio universe, I wanted to challenge myself by creating the vast majority of the song from scratch; the only elements I took from the original song were the vocal (with the guitar) and the bass section from the chorus. Once I had the vocal and the bass appropriately warped to 150 bpm, I moved on to figuring out the melody. The song is in the key of B major.
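Warping boils down to a simple stretch ratio between tempos. A minimal sketch of the arithmetic, assuming a hypothetical source tempo (the session's original tempo isn't stated above; only the 150 bpm target is):

```python
# Hypothetical warp calculation. Only target_bpm = 150 comes from the
# session; original_bpm is an assumed value for illustration.
original_bpm = 100.0  # assumption, not the real source tempo
target_bpm = 150.0    # the remix tempo

# Warping time-stretches audio by this ratio; < 1 means shorter/faster.
stretch_factor = original_bpm / target_bpm

# A 4-bar phrase keeps its bar count, but its duration changes:
bars, beats_per_bar = 4, 4
duration_s = bars * beats_per_bar * 60.0 / target_bpm

print(round(stretch_factor, 3), round(duration_s, 2))
```

The same ratio is what Ableton applies internally once a clip's original tempo is detected correctly, which is why getting the warp markers right matters before anything else.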
To help me with the harmony, I began by creating the rhythm. I took the advice to progress in baby steps and started with only the kick and snare, then moved to hi-hats. When the drums were in a decent state, I looked for a groove that would fit the style well. Suddenly, I realised I was moving toward a bluesy style rather than an atmospheric one. Despite being different from what I had planned, I embraced the challenge.
Once the drums and groove were roughly established, I began working on the bass. At first, I MIDI-sliced the original bass and experimented with it for a while, only to find that it wouldn't work well in the rest of the song. So I took a somewhat risky approach and wrote a different bassline for the rest of the remix, experimenting until I found something that was cool to dance to. Figuring out the bassline made it easier to come up with the other elements, as I soon realised I needed to spread out the frequency content. Even though I was working in a different genre, I kept referencing classical music for the harmony.
Creating the sounds
- Marimba: The idea for the marimba came from an actual marimba sample. Before working with Operator, I searched for a marimba sample and played it in the remix to get a feel for how it would sound. In Operator, I started by modifying the fundamental's envelope to have a sharp attack, a short decay, no sustain and a fast release. Next, I added a second oscillator with a coarse value of 4 and a third with a coarse value of 8, using the same sustain and release settings but varying short decay times.
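As a rough sketch of what that Operator patch does: coarse values multiply the oscillator frequency, so the patch stacks partials at 4x and 8x the fundamental, each with a percussive envelope. A hypothetical additive rendering in Python with NumPy (partial levels and decay times are guesses, not the actual patch settings):

```python
import numpy as np

SR = 44100  # sample rate in Hz

def perc_env(n_samples, decay_s, sr=SR):
    """Sharp attack, exponential decay: no sustain, fast release."""
    t = np.arange(n_samples) / sr
    return np.exp(-t / decay_s)

def marimba_note(freq, dur=1.0, sr=SR):
    """Fundamental plus partials at 4x and 8x the frequency, mirroring
    the coarse-4 and coarse-8 oscillators. Levels and decay times are
    illustrative guesses, not the patch values."""
    n = int(dur * sr)
    t = np.arange(n) / sr
    note = (1.0 * np.sin(2 * np.pi * freq * t)     * perc_env(n, 0.25)
          + 0.4 * np.sin(2 * np.pi * 4 * freq * t) * perc_env(n, 0.12)
          + 0.2 * np.sin(2 * np.pi * 8 * freq * t) * perc_env(n, 0.06))
    return note / np.abs(note).max()

tone = marimba_note(246.94)  # B3, since the song is in B major
```

Giving the higher partials shorter decays than the fundamental is what produces the woody, mallet-like transient.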
- Analog Pluck: The analog pluck grew out of the original bass slice created to harmonise the violin solo. At first, the bass slice sounded quite good, but as I kept listening to it in context, it felt muddy, as if it didn't really belong there. Eventually, while listening to the Assassin's Creed 2 soundtrack, the track 'Home in Florence' caught my attention. What if I could create a pluck in Analog and position it somewhere in the remix? To create the sound in Analog, I referenced the section in the video above and watched some tutorials on both YouTube and Lynda.com. And as practically every element in the remix lands on the downbeat, I decided to play the pluck in syncopation to give it a better swing.
I used two oscillators, one with a filter cutoff at 1.1 kHz and the other at 522 Hz. For better control over the overall brightness of the sound, I assigned the cutoff frequency, decay and envelope amount of both oscillators to a single macro.
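Conceptually, a macro is just one control fanned out to several parameter ranges at once. A hypothetical sketch of that mapping (only the 1.1 kHz and 522 Hz cutoff values come from the patch; every other range below is invented):

```python
def map_macro(value, lo, hi):
    """Linearly map a 0-127 macro value into a parameter range."""
    return lo + (hi - lo) * value / 127.0

def brightness_macro(value):
    """One 'brightness' macro driving both oscillators' filter cutoff,
    decay and envelope amount together. The 1.1 kHz and 522 Hz ceilings
    match the patch; the floors and remaining ranges are guesses."""
    return {
        "osc1_cutoff_hz": map_macro(value, 300.0, 1100.0),
        "osc2_cutoff_hz": map_macro(value, 150.0, 522.0),
        "filter_decay_s": map_macro(value, 0.05, 0.40),
        "env_amount":     map_macro(value, 0.0, 1.0),
    }

full = brightness_macro(127)  # macro fully open
```

Turning one knob then sweeps all four parameters in proportion, which is exactly why a single "brightness" macro feels more musical than automating each parameter separately.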
- Pad: The pad was created by sampling a "boing" sound I sourced from Freesound.org. I had used the same sound in class, but in an entirely different way, and as I liked the original sound I wanted to challenge myself to build a pad out of it. To do so, I watched plenty of YouTube tutorials on creating pads with Sampler. I applied the 'Note Length' MIDI effect to make the note last four seconds, along with reverb, delay, a bit of compression and an EQ filtered at 520 Hz, and experimented with other MIDI effects such as 'Random' and 'Chord' to taste. Alongside the sample, I instantiated Analog with two oscillators an octave lower and slightly detuned from each other, with a cutoff at 1.4 kHz and a slow attack. That gave the sound an ethereal feel.
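The Analog layer under the pad can be sketched the same way: two oscillators an octave below the played note, detuned a few cents apart, under a slow attack. In the sketch below, the octave drop and the four-second note length follow the patch description; the detune amount and exact attack time are assumptions:

```python
import numpy as np

SR = 44100  # sample rate in Hz

def pad_layer(freq, dur=4.0, detune_cents=7.0, attack_s=1.5, sr=SR):
    """Two sine oscillators an octave below the note, detuned by a few
    cents, with a slow linear attack. Detune and attack values are
    illustrative; the octave drop and 4 s length follow the patch."""
    n = int(dur * sr)
    t = np.arange(n) / sr
    f = freq / 2.0                          # an octave lower
    ratio = 2.0 ** (detune_cents / 1200.0)  # cents -> frequency ratio
    osc = np.sin(2 * np.pi * f * t) + np.sin(2 * np.pi * f * ratio * t)
    env = np.minimum(t / attack_s, 1.0)     # slow attack, then full level
    return 0.5 * osc * env

layer = pad_layer(246.94)  # B3
```

The few-cent detune makes the two oscillators slowly beat against each other, which is where the ethereal, slowly shifting character of a pad comes from.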
When all the elements were ready, it was time to add expression to some instruments and effects. The most noticeable is the violin. Though not an electronic reference, the Schindler's List soundtrack guided the violin expression; while modest in comparison, I wanted the violin to pop out of the mix while the other elements stayed secondary.

In terms of insert and send effects, I set up dedicated tracks for certain instruments, such as the drums. I wanted to apply reverb only to the snare, toms and hi-hat, and to enhance the kick with extra samples and compression. To do that, I inserted an instrument rack in the C1 slot and added three different kick drum samples: the first to accentuate the crack, the second for the body and the third just to sweeten the overall sound a little, each with its own effects inside the rack. For the reverb, I created a Drums Room return track and sent to it only the elements I wanted from the drum rack.
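The three-layer kick idea (crack, body and sweetener summed at different gains) can be sketched in Python. The synthesized layers below are crude stand-ins for the actual samples, and every parameter value is invented for illustration:

```python
import numpy as np

SR = 44100  # sample rate in Hz

def sine_kick(n_samples, start_hz, pitch_drop, amp_decay_s, sr=SR):
    """Crude stand-in for a kick sample: a sine with falling pitch
    and an exponential amplitude decay."""
    t = np.arange(n_samples) / sr
    freq = start_hz * np.exp(-t * pitch_drop)
    phase = 2.0 * np.pi * np.cumsum(freq) / sr
    return np.sin(phase) * np.exp(-t / amp_decay_s)

n = SR // 2  # half a second per hit
# Three layers mirroring the rack: crack, body, sweetener.
crack = sine_kick(n, 400.0, 30.0, 0.03)  # accentuates the attack
body  = sine_kick(n, 120.0, 8.0, 0.20)   # carries the low-end weight
sweet = sine_kick(n, 60.0, 4.0, 0.35)    # rounds out the overall tone

# Sum at different gains, then normalise, as the rack's mixer would.
kick = 0.6 * crack + 1.0 * body + 0.3 * sweet
kick /= np.abs(kick).max()
```

Because each layer occupies a different pitch range and decay length, they reinforce different parts of the hit rather than masking each other, which is the point of layering in the first place.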
Conclusion & Feedback
First and foremost, I wanted to understand the DAW and what its tools do. There is still a lot more to discover, but I managed to learn more than I needed to complete the song in a reasonable amount of time. Session View, however, is still quite confusing to me, so I didn't make use of it; instead, I created the loops and worked them out in Arrangement View.
Before I started mixing, I met with Gabriel and asked him for feedback. He advised me to polish the piano chords, as they sounded somewhat dissonant, though his overall impression was positive. He really liked the end result and had no other comments aside from the one mentioned above.
All in all, I'm very satisfied with the result I was able to achieve, especially considering my lack of musical knowledge. The assignment was both challenging and rewarding, and I now feel much more comfortable with Ableton and mesmerised by the possibilities it opens up for sound design.