Sound editing vs. sound mixing

Sound editing is the vehicle that moves a project’s audio from the production stage to post-production. The main difference between sound editing and sound mixing is that editing focuses on production, while mixing takes the edited product and hones it during post-production.

The majority of sound editing takes place during production. This covers all stages of production, from shooting and recording on set to sound effects, Foley and ADR. Any process that generates new audio content requires editing before it is ready for use in post-production, so it is all treated as editing. Likening this process to food production makes sound editing the harvesting of a fresh audio crop. The recordings are cut down to size, cleaned and shipped off for further assembly and packaging.

Sound mixing takes those raw elements and combines them to produce a cohesive, self-contained world made up of dialogue, footfalls, effects and music. In the scope of the previous analogy, this is the art of cooking, canning and packaging.

Sound Editing

At a glance, editing and mixing appear quite similar because the same general tools are used for both processes. That being said, they produce significantly different results. The segregation of duties between editing and mixing is a bit more nuanced than the clear-cut choice of when to use Adobe Premiere versus After Effects.

Tools – Less Is More

It is easy to perceive the production phase as focused on the shots, when in truth production covers all content generation. For the sound department, editing covers sound stage recording, ADR, Foley, effects and music. This content is edited and added to the project file pool.

Sound editing still uses the traditional toolset from the days when tape was the medium of choice. Back then, editing involved cutting, gluing and splicing takes and scenes together. Creating a fade between two parts meant cutting two pieces of tape at an angle and gluing them together. This is the origin of the razor blade, glue and fade tools seen in today’s digital audio workstations (DAWs).

The administrative purpose of editing today is assembling all the approved takes into a new project and lining them up with the video proxy file. The early stages of sound editing involve lots of cutting and deleting to clean up the project.

Digitization has made editing a more visual process, mainly by introducing waveforms. There is absolutely nothing wrong with using visual cues. However, an overreliance on waveforms and a lack of listening will result in abrupt cuts, awkward pacing and clipped breaths between words. For example, don’t just run Pro Tools’ Strip Silence on an aggressive setting. Fortunately, most software editing is non-destructive, so cut regions can be extended.
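
To see why aggressive settings bite, here is a minimal sketch of threshold-based silence stripping in Python with NumPy. It assumes mono float samples and invented parameter names; it is a rough approximation of the idea, not Pro Tools’ actual implementation.

    import numpy as np

    def strip_silence(samples, rate, threshold_db=-45.0, pad_ms=150):
        # Convert the dBFS threshold to a linear amplitude (1.0 = full scale).
        threshold = 10 ** (threshold_db / 20)
        loud = np.abs(samples) > threshold        # per-sample gate decision
        pad = int(rate * pad_ms / 1000)           # padding protects breaths

        regions, start = [], None
        for i, is_loud in enumerate(loud):
            if is_loud and start is None:
                start = max(0, i - pad)           # open the region a touch early
            elif not is_loud and start is not None:
                regions.append((start, min(len(samples), i + pad)))
                start = None
        if start is not None:
            regions.append((start, len(samples)))

        # Merge regions whose padding overlaps, acting like a hold time.
        merged = []
        for s, e in regions:
            if merged and s <= merged[-1][1]:
                merged[-1] = (merged[-1][0], max(e, merged[-1][1]))
            else:
                merged.append((s, e))
        return merged  # list of (start, end) sample ranges to keep

Shrink pad_ms toward zero and the returned regions start cutting straight into breaths and word tails – exactly the failure mode described above.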

Timing

Editing is more than the removal of background noise. A good sense of rhythm, timing and aesthetics plays a big part in crafting compelling cuts, fades and edits. With tape editing, the physical mechanics of the process meant there was little room to defer a fade to a later time. Splicing two lengths of tape together only leaves two options: cut and fade. The angle and direction of the splice can be changed to make the audio fade in or out at different speeds, but ultimately these are all fades.
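
As a hedged sketch of the same choice in the digital domain, the snippet below splices two mono NumPy clips with a short crossfade. The linear ramp is the direct analog of an angled tape cut, while the equal-power option is a common modern alternative; function and parameter names are illustrative.

    import numpy as np

    def crossfade(a, b, rate, length_ms=20, equal_power=True):
        # Splice two mono clips with a short crossfade at the joint.
        n = int(rate * length_ms / 1000)
        t = np.linspace(0.0, 1.0, n)
        if equal_power:
            # Equal-power curves hold perceived loudness steady mid-fade.
            fade_out, fade_in = np.cos(t * np.pi / 2), np.sin(t * np.pi / 2)
        else:
            # A linear ramp mirrors the angled razor cut on tape.
            fade_out, fade_in = 1.0 - t, t
        joint = a[-n:] * fade_out + b[:n] * fade_in
        return np.concatenate([a[:-n], joint, b[n:]])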

What has changed is that editing work no longer needs to be front-loaded. This progression in workflow frees the engineer to defer some editing and mixing decisions until later.

Getting your hands dirty

The editing world is a sandbox and should be treated like one. When the opportunity to try something new comes along, make the effort to try it out. John Roesch, a professional Foley artist who worked on “Raiders of the Lost Ark,” has maintained that you can’t tell good Foley from the real thing. Nearly all of the audio effects in “Raiders” were Foley, since much of the sound recorded on set was full of noise from effects machines and practical sequences.

Foley is absolutely worth getting involved with, and building up a collection of props only improves your Foley abilities. The same applies to found sound and capturing new sounds as they come up; having a portable recorder on hand never hurts.

Sound Mixing

Sound mixing focuses on refining those freshly edited tracks and working toward an end point. Direction may come from you or, when working with a team, from others. This stage benefits greatly from a production plan and a destination. For the purposes of this article, sound mixing consolidates the work of the sound designer and the mixing engineer, once credited simply as “sound men.”

Tools of the Trade

The editing stage uses a modest set of tools compared to the considerably more lavish approach of sound mixing. The digitization of tools does not stop at the editing stage. All of the outboard gear used in audio production nowadays has digital equivalents, and the same can be said for sampled instruments. Few people today bat an eyelid at hearing a string-section-in-a-box, quite possibly because the gap between sampled and real has grown narrow enough.

Common tools in the signal processing chain are EQ, compressors, limiters, noise gates, reverb, delay, chorus, flanger and phaser. Automation is the final piece that ties everything together and sets all the changes in stone.
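
As a rough sketch of how three of those dynamics stages chain together, plus automation as a gain lane drawn against time: the hard-knee compressor, thresholds and curve points below are simplified assumptions, not any particular plug-in’s algorithm.

    import numpy as np

    def gate(x, threshold=0.02):
        # Noise gate: mute samples below the threshold.
        return np.where(np.abs(x) < threshold, 0.0, x)

    def compress(x, threshold=0.5, ratio=4.0):
        # Hard-knee compressor: scale back anything above the threshold.
        mag = np.abs(x)
        target = np.where(mag > threshold,
                          threshold + (mag - threshold) / ratio, mag)
        return x * target / np.maximum(mag, 1e-9)

    def limit(x, ceiling=0.95):
        # Brick-wall limiter: clamp whatever still gets past the ceiling.
        return np.clip(x, -ceiling, ceiling)

    def process(x, rate):
        # Order matters: clean up first, control dynamics, then catch peaks.
        for stage in (gate, compress, limit):
            x = stage(x)
        # Automation is a parameter drawn against time: here, a gain lane
        # that rides the level down between seconds 2 and 4.
        seconds = np.arange(len(x)) / rate
        lane = np.interp(seconds, [0.0, 2.0, 4.0], [1.0, 1.0, 0.3])
        return x * lane

Reordering the stages changes the result, which is the point of thinking in chains rather than in isolated effects.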

In Practice

Mixing is more than an exercise in balancing levels, trimming EQs and adding effects. It requires imagination, intent, patience and a good ear. Decisions have purpose and, if you squint hard enough, follow good theory.

Equalizers are used to sculpt sounds and bring out clarity. That is why we trim the resonance out of the lower end of vocal performances and give the midrange a bump. Delays and reverbs are equally capable of creating ambience, atmosphere or outright hypnotic sequences. Compressors can give a sound punch or lend consistency to dialogue.
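
A hedged sketch of that dialogue treatment, assuming SciPy is available: the low trim is a plain high-pass, and the midrange bump is approximated as a parallel band-pass boost rather than a proper peaking biquad. The corner frequencies and gain are illustrative starting points, not rules.

    import numpy as np
    from scipy.signal import butter, sosfilt

    def highpass(x, rate, cutoff=100.0):
        # Roll off rumble and low-end resonance below the cutoff.
        sos = butter(2, cutoff, btype="highpass", fs=rate, output="sos")
        return sosfilt(sos, x)

    def mid_bump(x, rate, low=2000.0, high=5000.0, gain_db=3.0):
        # Crude presence boost: add a band-passed copy back in parallel.
        sos = butter(2, [low, high], btype="bandpass", fs=rate, output="sos")
        amount = 10 ** (gain_db / 20) - 1.0
        return x + amount * sosfilt(sos, x)

    def treat_dialogue(x, rate):
        return mid_bump(highpass(x, rate), rate)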

Work is completed in chunks and sections, even when working alone. Do not try to lift everything at once; complete sections are much easier to manage as exported stems or groups – more on that later. Granted, track limits have grown: Pro Tools supports a maximum of 768 tracks, while Logic Pro offers north of 1,000. Groups keep your project tidy, and a tidy project makes automation much easier. Automation is similar to scripting in the sense that it’s typical to automate any manual task performed more than a handful of times.
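
A toy illustration of groups and stems, with placeholder sine-wave “tracks” so it runs on its own: each bus is just a gain-weighted sum, and a stem is a bus bounced to its own file. The session layout and gain values are invented for the example.

    import numpy as np

    RATE = 48_000

    def tone(freq, secs=1.0):
        # Placeholder "track" so the sketch is self-contained.
        t = np.arange(int(RATE * secs)) / RATE
        return 0.1 * np.sin(2 * np.pi * freq * t)

    def bus(tracks, gains):
        # A bus is a gain-weighted sum of its member tracks.
        return sum(g * t for g, t in zip(gains, tracks))

    # Group related tracks, then mix the groups rather than hundreds of faders.
    dialogue_stem = bus([tone(220), tone(330)], [1.0, 0.9])
    music_stem    = bus([tone(440), tone(550)], [0.8, 1.0])
    effects_stem  = bus([tone(660), tone(110)], [1.0, 0.7])
    master        = bus([dialogue_stem, music_stem, effects_stem],
                        [1.0, 0.8, 0.9])

Riding three group faders into the master is the tidy version of riding dozens of individual tracks, and the same structure is what gets exported as stems.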

Moments from history

There comes a moment where you realize that certain sound designers and editors played a very special part in some of your favorite shows and films. Their productions raised the bar to new heights and gave new memories to countless audiences.

These films are now decades old and still share a common thread with today’s blockbuster productions. It is like going back in time and seeing the first uses of tools and methods considered standard practice today. Yes, today’s films are louder and bolder, but the roots remain the same.

The San Francisco Boys

Ben Burtt and Richard L. Anderson graduated from the USC film school in 1975 and worked together on the first “Star Wars” and Indiana Jones films. Both have had incredibly successful careers in the industry, and they shared an Oscar for their work on “Raiders of the Lost Ark.”

The two were part of a group of two dozen young sound editors whom Frank Warner dubbed the San Francisco Boys. The combination of fresh talent and visionary directors like Francis Ford Coppola and George Lucas helped drive the medium to new heights. It was the directors’ attention to sound that enabled and drove this new generation of sound designers and editors.

The sound design for “Apocalypse Now” is the work of sound designer Walter Murch, another USC alumnus who studied with George Lucas. The effects featured recordings of guns firing live ammunition, helicopters, and explosions.

“Raiders of the Lost Ark” still feels more modern than its age suggests, and it’s not alone. The years surrounding 1981 saw a number of films pushing the boundaries of sound and picture. “Star Wars,” “Apocalypse Now” and “Raiders of the Lost Ark” appeared in the span of four years. They challenged perceptions of what was possible and went on to inspire generations of filmmakers and editors alike.

Learning from the best

Multitrack recording to the tune of 24 tracks or more, sometimes with multiple tape machines slaved together, became a necessity. The early stages use higher track counts, and those numbers come down as more sections are mixed and split into groups. Burtt has discussed needing to control the levels of the brass and woodwinds separately, doing so with track groups, so as not to lose the musical elements inside the sound effects. Today this is both normal and necessary, as tracks are routinely mixed and bounced in place as stems. Tracks can also be sent to busses and managed that way.

The passion and attention this era brought to sound was incredibly positive. The impactful audio in those films was made possible by the craft that people like Burtt helped establish, one that is still alive and well today.

References

  • The Movies’ Golden Ears
    • https://www.washingtonpost.com/archive/lifestyle/1984/07/03/the-movies-golden-ears/fd0a2636-86e6-498c-810b-52e3657c92d9/
  • Raiders of the Lost Ark sound interviews with Ben Burtt and John Roesch
    • https://youtu.be/WYuxYkL_AHM
  • Indiana Jones Sound Design Featurette, Part 1
    • https://www.youtube.com/watch?v=YWDSFihqyH8

Blag Ivanov is a contributing editor at Videomaker and works at a software company.