Should I use Avid or Premiere (or Resolve)?

As a professional editor of over 15 years, the most common technical question I hear from serious beginners is: what software should I use for editing?  Recently, people have started talking about DaVinci Resolve a lot more.  I use DaVinci Resolve all the time, but mostly for color.  It does seem like a promising program to learn for the future, especially because it has a free version, which always brings in more users, and hence more clients who are used to the program.  But I don’t edit in Resolve yet, as I don’t think its shortcuts and organization fit editing quite as well as the two dominant editing systems in the industry right now: Adobe Premiere and Avid Media Composer.

Author Justin Joseph Hall in 2013

This post covers the strengths of each of these applications, in my opinion, and when I would choose one over the other.

Top 4 Features in Adobe Premiere

1. A modern, simple workflow that is customizable and accessible.  You can take almost any codec, timecode, and type of video file and just start editing.  There are some limitations, but overall it is much more flexible, and you can decide whether or not to make proxies.  It adapts to your workflow and lets you adapt to the system you’re working on and the footage you have.

2. The Adobe Suite integration is amazing.  No other company does so many visual tools as well as Adobe.  Photoshop is the industry standard; it is what everyone has used for years for still images.  It is to still photography what Avid Media Composer is to video, but with an even stronger hold on the industry due to its flexibility and ease of use.  The same goes for After Effects and simple motion graphics: it’s included in the suite, and you can do so much with it.  On top of that there are other great programs, and all of this comes at the same price as Avid Media Composer alone.

3. Editing still photos and simple motion graphics in Premiere is much more flexible and intuitive than the very old system of add-ons and nesting that Avid Media Composer makes you use.  Premiere took the ideas of Final Cut Pro 7 to the next level when Apple went a different direction with its video editing program.

Exports are also much easier thanks to Adobe Media Encoder, a separate program, and its easily editable output presets.  It’s much simpler than Avid’s confusing export console, where exports tie up your whole system and encoding is very slow.  Even when I work in Avid I usually create a quick export (maybe even a reference export, where no render is made) and then re-encode in Media Encoder.

Media Encoder does leave out some options for the sake of simplicity, which can be frustrating at times.  Avid, for example, makes different color spaces much easier to navigate.  But timecode and resolution changes in Premiere are as straightforward as typing in the values you want.

4. Finally, reconnecting and managing media is much easier in Premiere.  It’s the main reason to edit there.  Not only can it handle nearly any codec in the timeline natively (although making proxies is still recommended no matter how powerful your computer is), you also have the option to cut natively, and it’s easy to connect and reconnect footage.  I should mention that Resolve is even better at reconnecting footage, as it’s almost automatic.  But Premiere is modern and intuitive, while Avid’s system is outdated and protective: it erases most of your original file names and folder structure, which makes it difficult to figure things out for yourself in the Finder, and that is a hell of a pain.

Because of this ease, Premiere doesn’t require a technical person on the project the way Avid does, which can save money.  You need someone experienced in Avid to run the technical aspects of that system; Premiere is much more accessible and a quicker gateway into creating something in post without studying the very technical facets of moviemaking.


Top 4 Features in Avid Media Composer

1. Avid has all the bells and whistles you can imagine needing for the job.  Although Avid can feel slower, it is more thoughtful, and as you learn its tools they make you work faster, so you can get your ideas from your head to the timeline in the least amount of time.  It’s customizable and is the standard in editing because it seems they thought of everything.  I hear all the time about editors who discover a new Avid tool after working for 15-20 years.  Avid really does seem to listen to editors and be inclusive with ideas, never throwing away old ones, just adding to the toolbox you can use.

2. Markers and ScriptSync are a godsend to anyone who uses them in their workflow.  Scripts and written information about footage are so easy to manage, export outside of Avid, and compile into different visual ways of looking at your footage.  For anyone who does paper cuts (a workflow I really don’t love, but a common one), ScriptSync makes it easy to find footage quickly from a transcript: click on a word and it brings you directly to that moment in the audio and video.

On top of that, the marker system is easily editable, color coded, and navigable in many ways.  You can export markers and send them to producers, or use Avid’s search engine internally to create databases you can search right within Avid or bring elsewhere.  When you’re making a program with hundreds of hours of footage, this capability can vastly improve your show.  If you prepare properly and stay organized, then when crunch time comes at the end of a project (as it always does), you can fix small problems in 15 minutes from your database, versus taking a day to find a phrase or a specific B-roll shot.

On top of that, because markers are so easy to edit, they can be used as visual indicators on the timeline.  You can show where interesting sections of the footage are, or color code by person or place to see different information at a glance.  This does take time to prepare, but it’s excellent, and it isn’t as frustrating as Premiere’s uncomfortable marker panel.

3. Avid’s system seems to be everlasting.  Jon Alpert, whom I’ve worked with, made two movies that each spanned about 30 years on and off in the edit room, and the Avid projects could always be recovered because Avid still uses the same system it started with.  The project files still open many versions later.  Once editors learn Avid, they will always feel comfortable there.

4. The main reason Avid is the best is that no matter how large the project gets, it’s still usable and can still be kept together in one project or a small set of projects.  This is because Avid uses bins that each hold some of the project’s information, so the computer isn’t taxed with opening the entire project’s information at all times.  Only the bins you have open are read into memory.  This makes it feel like you’re always working on just a small piece of the project, and it keeps everything manageable.

If you do use multiple projects, you can easily transfer bins between them as long as the media is available wherever you’re bringing it.  This makes Avid easy to use with a server on huge television projects, or series of any kind.  If you have full teams working together on complicated archival, or cutting multiple shows at the same time, Avid is a no-brainer.  Premiere is a sports car meant for one or two people, whereas Avid is a vehicle where you can always add an extra seat for anyone you want, at the standard, predictable cost of a computer and a license.

In the end, both systems have carved out clear spaces for themselves in the industry.  I’d use Premiere on projects under 20 minutes, for its speed of use, exporting ease, and ability to work within the Adobe Suite.  For commercial projects it’s just so much simpler when you’re exporting a lot and switching shoots and footage often enough that you want to quickly edit, send a project away on a drive, and be ready to work, with a single person able to execute everything in post-production.

Avid, I’d use for projects over 15 minutes where organization of the footage matters most: where you expect to spend a lot of time molding the footage into perfection, and where you’re likely working in a larger team, especially on projects that will last longer than two or three months.  It’s so worth it.

As a final note: DaVinci Resolve, again, doesn’t beat either of these programs for editing, but it’s getting closer to competing with Premiere.  Because it’s an industry standard in color correction and media management, Resolve can speed up finishing, especially if you have an editor color the footage.  It removes one more round of switching software, which is great.  It’s just not as comfortable in managing screen space for editing; it’s organized for finishing, not for sifting through lots of clips.  So it is definitely not on its way to replacing Avid, but look out, Premiere.  Work in Resolve if you want to future-proof your workflow or add a skill to your résumé.

If you have any other questions about Post-Production, please contact me at justin.joseph.hall@fourwindfilms.com

Writer’s biography

Justin Joseph Hall has held positions as an editor for networks such as HBO, NBCUniversal, and PBS.  At Downtown Community Television he helped pitch and develop Axios (2018-2021), the Emmy-winning documentary series.  Abuela’s Luck (2018) was picked up by all of HBO’s streaming platforms and slated to be adapted into a full-length feature.  His mastery of post-production and the visual arts has earned him opportunities to work with Major League Baseball, Dwayne “The Rock” Johnson, National Geographic, Discovery, and BMW, to name a few.

Choosing Music that Enhances your Film or Commercial Video

Music offers an experience all on its own, but it is also a powerful tool that can enhance other experiences. A song or score is an easy cheat to enhance emotional storytelling because we have such a strong reaction to it. In my opinion, music can often be more effectively used when creating branding, a story, or most things for that matter.

I am a storyteller who directs and edits films as well as a business owner who hosts events. In all of these activities, I use music to enhance the experience of what I am creating.


How Wearing Many Hats Led Me to the Director’s Chair

By Cat Tassini

Photo by Albany Capture on Unsplash

“If you can think of literally anything else to do with your life, go do that.” This was the mantra that I heard many times during my first year of acting training. “You have to be obsessed with your character,” was another slogan, this time from my contemporary scene study teacher. As a nervous freshman in college, I took these words literally. Growing up, I had been enthusiastic about visual art, dance, theater, filmmaking, writing, music, and sports, but now I turned with laser-like focus to acting. Any time another desire entered my head, I felt agonizingly conflicted. I had auditioned and secured my place in the second most competitive undergraduate theater program in the country. Was I going to blow this opportunity by being unfocused and undisciplined? I was determined to give it my best shot. But I couldn’t keep all of my doubts — or passions — from creeping in.

I stuck it out for the two consecutive years at a professional acting studio required to graduate. However, once I had that under my belt I looked into other opportunities for learning. I ended up interning at a multidisciplinary art space in Brooklyn for credit. That was my introduction to the North Brooklyn DIY music and art scene, which indelibly molded my artistic perspective. It’s where I truly came of age. It felt like I had wandered into a creative wonderland—inspired, intimidated, and elated that I finally found a place that felt right. I even put up my own theater piece there, composing it with my theater troupe, and doing the sound, costume, and set design myself. It felt like I had arrived. 

My time interning opened up my mind and I felt confident enough to keep exploring. By graduation, I had designed costumes for a short film, taken art direction and set design classes, studied directing and producing, interned for a special event production company, stage-managed a show, attained a minor in art history, studied abroad, and put up multiple original theater pieces. 

Photo by Isi Parente on Unsplash


However, once the anticlimactic reality of postgraduate life set in, I looked back on my many experiences and wondered whether they actually added up to anything cohesive and meaningful. It didn’t help that I graduated into the 2010 job market. It was easy to feel like all the effort I’d put into my undergraduate education didn’t amount to much of anything in the real world. As I wandered through postcollegiate disorientation, hopping from city to city, and trying out different jobs in and out of the entertainment industry, I felt weighed down by nagging doubts. Would I ever be good enough at anything if I couldn’t concentrate on one thing? Would I ever be able to support myself without a “day job?” Would I ever be able to get a day job outside of the service industry? I felt restless, but I still felt guilty about it.

These doubts still haunt me, but less so than when I was a bit younger and greener. I now have the knowledge and perspective of someone who has written, directed, and edited a body of work, screened short films at festivals and racked up years of experience working in film, television, and event production. What I didn’t realize before is that it’s common to bounce from department to department or take time off from one career to pursue another.  It is also totally okay to take time off from filmmaking because you need to work a day job, care for a child or sick loved one, or take care of your own health. In a field as unstable and full of financial barriers as filmmaking, changes are inevitable. Managing your passion for your craft with real-world demands is a balancing act. As circumstances and priorities change, a career will inevitably go through any number of evolutions.

Multimedia is a constantly changing field, and one must make a conscious effort to keep up throughout one’s career. Along with that learning comes paying for classes, trading something you already know and are adept at, and learning on the job. If you’re trying to work your way up starting as a production assistant, it’s great to have multiple skillsets since you never know quite what you’ll end up doing. It is also valuable to have lots of skills in your back pocket to offer in exchange for someone else teaching you the skills you lack. A mentor of mine once described trading art direction work for an After Effects lesson. Finally, there is the practical reality that until you are locked into a union, if that’s the path you choose, it can be easier to get freelance work when there are more roles you can fill.

Photo by Julio Rionaldo on Unsplash

Now that I am directing my first feature, I can see how my varied experience has prepared me for this. It’s essential to be able to wear multiple hats in independent filmmaking. On a typical day, when working on my own work and freelance projects, I utilize some combination of the following: social media, graphic design, grant writing, crowdfunding, blogging, research, correspondence, scheduling, and video editing. These involve wildly different yet interconnected skill sets. On set, I’ve worked in the following departments: camera, sound, art, locations, wardrobe, makeup, transportation, and of course good ol’ fashioned general production assistance (PA). Having many tools in your toolbelt and a spirit of adventure makes you an asset to any production. 

If there’s one thing I could tell my younger filmmaker self, I would say: don’t be afraid of having multiple interests. Embrace it! And don’t worry so much. Pursue knowledge for the pure love of learning, don’t try to force yourself into something because of its perceived market value. Something that you’re not sure about now could end up being one of your greatest assets in the future. “Follow your bliss,” as Joseph Campbell would say, and try not to be too preoccupied with how it will all turn out. Life doesn’t follow a linear path and that’s okay. Real life isn’t compressed into two hours and doesn’t have to follow the audience’s expectations for continuity of logic. Real life is messy and strange and beautiful in its own way.

Follow Cat Tassini on Instagram @disco_nap_art and check out her website. Follow her current project, a feature-length documentary about Trish Keenan, the visionary creative force behind the English experimental band Broadcast, on Instagram @echos_answer, Facebook, and YouTube.

If there are questions you want to be answered in a blog post, let us know at info@fourwindfilms.com or visit our website. Also, we work with a large, diverse community of crew and artists working in most aspects of the filmmaking process and are always happy to help make connections. And we are always building our community! Send us your work for review or feedback.

Sound Mixing 101: Compressors and Limiters

By Justin Joseph Hall


Compressors and Limiters are audio effects that control the volume, or amplitude, of sound.  They are used to even out what you hear.  They are often used in the mixing and mastering stages of music, and when mixing dialogue for film, video, and radio.

Let’s start with the compressor.  A compressor is designed to reduce the dynamic range of a recorded sound.  For example, if you are recording a guitar input and someone accidentally bumps the pickup (which is like a microphone for a guitar: it “picks up” the vibration of the strings so they can be amplified), there will be a spike in the waveform that is much louder than the rest of the recording.  The compressor dampens that spike so it doesn’t stick out as much.  It diminishes the amplitude of the sound wave by “compressing” any sound that registers above a certain threshold, at the compression ratio you set.

When using a compressor there are a few key terms.  

The threshold refers to the amplitude at which the compressor kicks in.  If you set the threshold at -16 dB, then anything that goes over -16 decibels will be affected by the compressor.

The ratio refers to the amount of compression.  With a 3:1 ratio, for every 3 decibels the sound rises above the threshold, only 1 decibel comes through: the amount over the threshold is reduced to one third, not the whole sound.

The attack, or attack time, is how fast the compressor kicks in after the amplitude goes over the threshold.  Too fast an attack may compress small peaks that barely cross the threshold, which can sound odd unless the sound is sustained for at least a short period.  Attack time is usually measured in milliseconds.  Adjust it to make sure the compressor isn’t turning off and on too often, which can be noticeably irritating.

The release, or release time, is the delay before the compressor shuts off once the amplitude drops under the threshold.  This prevents the compressor from flickering off and on when the amplitude wavers around the threshold.  Release and attack times are adjusted to make the transition into the effect smoother and less noticeable.  The factory presets often work well, but play around with the settings to see if you can make things sound smoother, especially if the effect sounds choppy.

The knee is a gradual curve in how the compressor affects the amplitude.  If your compressor is set at -16 decibels with a 3:1 ratio, the knee prevents the full 3:1 ratio from being applied right at -16 dB.  Instead, the compression is applied on a gradient around the threshold, adjusted by the knee, which makes the effect more gradual and less noticeable.
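To make the threshold, ratio, and knee concrete, here is a minimal Python sketch of a static compressor gain curve.  The function name and default values are illustrative only, not any particular plugin’s controls, and a real compressor also applies the attack and release smoothing described above.

```python
def compress_db(level_db, threshold_db=-16.0, ratio=3.0, knee_db=0.0):
    """Static compressor curve: input level in dB -> output level in dB."""
    over = level_db - threshold_db
    if knee_db > 0 and abs(over) <= knee_db / 2:
        # Soft knee: blend gradually into full compression around the threshold
        return level_db + (1 / ratio - 1) * (over + knee_db / 2) ** 2 / (2 * knee_db)
    if over > 0:
        # Hard knee: every `ratio` dB over the threshold becomes 1 dB over it
        return threshold_db + over / ratio
    # Below the threshold the signal passes through untouched
    return level_db
```

With the defaults above, a peak at -10 dB (6 dB over the -16 dB threshold, at 3:1) comes out at -14 dB, only 2 dB over the threshold.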

Compressors limit the range of a sound’s amplitude, which is key to making recorded sound easier to listen to on speakers.  There are as many different types of speakers as there are flavors of ice cream.  Compressing sound to a smaller range means listeners won’t hurt their ears if they turn up the volume during a quiet moment in a film that’s followed by a loud explosion scene.  Done well, compression means people don’t have to touch the volume at all.

Often, the more a sound is compressed the more pleasant the listening experience.  Many podcasts, audiobooks, and radio shows use compressors on the vocals so that voices sound similar throughout the program and listeners don’t have to fiddle with their knobs.

In music, pop songs are highly compressed, which is pleasant and gives all songs a similar dynamic range.  Some of the least compressed music is classical, where the full range of the instruments cannot be heard without a large dynamic range.  In a recording of an orchestral concert, you want to hear the contrast between a piccolo solo, the string section, and the full orchestra playing, without sacrificing the uniqueness of each sound.

Photo of author.


Limiters

Compressors are often used in conjunction with Limiters; the two audio effects work very well in tandem.  As the Compressor shrinks the dynamic range, it softens the loudest sounds.  Think of audio as starting from silence and rising like a bar graph in amplitude: if a sound peaks at -10 decibels, the peak of its waveform sits at -10 decibels, and when we compress it, that mountain moves to a lower peak.

A Limiter is often applied after a compressor.  Once you have the desired ratio, all peaks have been compressed down toward a certain amplitude; a limiter then raises or lowers those peaks equally across your sound, without letting anything go over its ceiling.

The Limiter earned its name because even though it may increase the amplitude, it limits the amplitude to its threshold.  Like the Compressor’s threshold, a Limiter’s threshold is a cutoff point: no sound peaks will go over this amount.  This prevents peaking, which is when a sound is too loud for the medium and distorts.  For digital audio the ceiling is 0 decibels.  For analog audio it varies depending on what you’re working with, but it is often somewhere from +6 to +12 decibels.

Limiters also have attack and release controls, which work the same way as a compressor’s.  Measured in milliseconds, they set when the Limiter kicks in (attack) and when the effect drops out (release).
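The limiter’s core behavior can be sketched in a few lines of Python.  This is a toy instant clamp for illustration; a real limiter uses the attack and release envelopes described above rather than hard clipping, and the gain values here are made up.

```python
def limit(samples, threshold=1.0, makeup_gain=2.0):
    """Toy limiter: apply makeup gain, then clamp peaks to the threshold.

    `samples` are linear amplitudes in the -1.0..1.0 range; 1.0 plays
    the role of 0 dBFS, the digital ceiling.
    """
    out = []
    for s in samples:
        boosted = s * makeup_gain  # raise the whole signal
        # Never let a peak exceed the ceiling in either direction
        out.append(max(-threshold, min(threshold, boosted)))
    return out
```

So a quiet sample gets twice as loud, while anything that would exceed the ceiling is stopped exactly at it.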

Why use a Limiter? 

So that the sound you are creating is loud enough to hear after a compressor is applied.  It would be very annoying to listen to a song compressed down to be very quiet, followed by a song that is really loud, because you’d have to keep adjusting the volume.

For example, if you listen to a classical piece that barely uses a compressor and has a huge dynamic range, from -60 decibels all the way to -3 decibels, and follow it with a pop song, you want the loudest part of each to hit the exact same peak.  That way, if you set the music at a house party to a certain volume, that volume is never exceeded, and you don’t suddenly scare your neighbors with O Fortuna blasting right after a compressed pop song.  Most music players have a setting that levels the peaks of songs, so you may already be familiar with this automated process.
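That “level the peaks” behavior amounts to simple peak normalization, sketched below.  The function and its target value are illustrative, not any actual music player’s algorithm.

```python
def normalize_peaks(track, target_peak=0.9):
    """Scale a track so its loudest sample lands exactly at target_peak."""
    peak = max(abs(s) for s in track)
    if peak == 0:
        return list(track)  # silent track: nothing to scale
    gain = target_peak / peak
    return [s * gain for s in track]
```

Run two different tracks through this and their loudest moments come out at the same level, which is exactly what keeps the house party volume predictable.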


Pop songs actually use Limiters a lot.  This mixing trend grew with the rise of rock ’n’ roll, which wanted loud music, and the metal of the 1970s pushed it further.  Sound mixers kept making mixes louder by compressing them to a small dynamic range and then raising the peaks to the maximum volume, putting songs near the top of the possible amplitude without going over 0 decibels and distorting.  Eventually radio ads started competing with the music for the listener’s attention and were mixed even louder than the songs.  This became known as the “Loudness Wars,” which really peaked (pardon the pun) in the 1990s and 2000s.  You can hear the dynamic range shrinking, especially in pop music of that era.

Compression and limiting are powerful tools that help you create the listening experience you want, whether it’s for a song, a radio show, or a movie mix.  Compressors and Limiters are used in virtually every professional sound mix of any sort.  Learn the ins and outs, figure out what sounds good to your own ear, practice a lot, and you’ll have mastered one of the basics of sound mixing.

— — — — — — — — — — — — — — — — — — —

If there are other questions you want to be answered in a blog post, let us know at info@fourwindfilms.com. In addition, we work with a large, diverse community of crew and artists working in most aspects of the filmmaking process and are always happy to help make connections. And of course, we are always building our community! Send us your work for review or feedback.

Six Steps To Finish a Video in Post-Production

By Justin Joseph Hall

I have been working as a professional editor for ten years on commercial, documentary, and narrative films, and this guide is for producers deciding what to include in video post-production. Many companies skip about three of the steps listed below, so I will explain what each step is and why completing each one will make your video more professional.

1. Editing

Editing is sequencing clips that are provided to increase the effectiveness of the video. Commercial videos pull a viewer towards an action or a specific emotional response to a product. A great editor knows this and uses timing and visuals to capture the viewer’s emotional and mental attention.

Author’s editing station. Photo by Author.


Modern editing tends toward getting the highest emotional response in the shortest amount of time. Often ads and commercials are limited to exact second counts. An editor will use the given assets to pace the film so it climaxes emotionally toward the end of the ad, where there’s usually a reveal of a brand, product, or call to action. This leaves the viewer with the greatest emotional impact at the end of the piece. If the climax happens before this, the edit still needs work.

An editor uses every cut either to hide the edit seamlessly, so viewers can concentrate on the story or visuals, or to call attention to it so the viewer notices a certain moment. Editors think about when not to cut just as much as when to cut to the next shot. Too many cuts can distract from the story, or they can help tell it more effectively to the target audience. MTV-style editing from the 2000s is an example of a cut-heavy style that worked well for its generally younger audience.

Editing is about controlling the emotional response with the tools given and knowing how each clip relates to one another. Think of an edited video as an emotional roller coaster for the audience.

2. Placing Music

Placing music is one of the most difficult things to do in post-production because everyone has an opinion about it, and people can have a wide range of emotional reactions to the same song. One of the goals in creating a video in post-production is making sure it will affect the majority of the target audience. Think about who you’re creating the video for, and choose music that will give the emotional reaction you want from them.

For example, if you’re creating an ad for a Spanish-speaking audience, playing bachata (or music that sounds like that genre) may remind listeners of times and places they heard that music. It’s your job to understand what feelings or memories this type of music will evoke for the majority of your audience. For me, music with an AC/DC-like guitar sound recalls listening to AC/DC on the radio in my friend’s garage as a teenager, which evokes a certain nostalgia and might do the same for many people my age who also grew up in the Midwest. Sometimes there are associations with instrument choices: if an ad uses ukulele music like that Apple iPad Christmas ad, viewers may be reminded of it, which could add to or detract from your storytelling.

If you do choose a song with lyrics, make sure they don’t conflict with dialogue, voiceovers, or any audio in the video. Also, only choose music in a foreign language if you know what is being said.

Music is not always needed, but when you match the right music to the right scene, it will enhance whatever emotional impact you want to leave with your audience. It can help with the difficult job of keeping the audience’s attention and making them feel something at the same time.

3. Sound Design

Stop reading for a moment and listen to all the sounds in the room: most likely a heater or air conditioner, noise from people around you, maybe a computer fan. We are rarely listening to silence, even when the room is quiet. So to make a video a full experience, we need to re-create the world we are watching. We do this by placing sound effects that replicate our experience of the real world. This is sound design.

Sound equipment.  Photo by Author.


When humans see something move, there is almost always a sound involved. Move around and your clothes rustle. Scratching your skin makes sound. Lightning is always followed by thunder. When a plane flies, we hear the propulsion. So when we see something move in motion graphics or animation, we expect to hear something.

If you don’t hear something when it moves, it can feel creepy or otherworldly. Horror films use this for ghosts. Things are eerily silent and then they scare you by breaking that silence or isolating sounds. If that’s not an effect you are going for, then not hearing sound can seem cheap and inorganic. Putting in sound effects like footsteps, clothing moving, and doors closing is a way to make your video come alive.

Sound Design can be a fairly cheap process as long as you have access to a Sound Effects library. Editors or Sound Mixers can usually do it. Keep in mind that you want to have isolated sound effects to make it easier for the Sound Mix. What is a Sound Mix? Well, let’s move on.

4. Sound Mixing

Much of the audio recorded and placed in a video was recorded in many different places, and it is brought into a project that may have motion graphics, multiple days of shooting, or any number of other things. To unify the sound, we hire a sound mixer.

In addition to unifying the sound, Sound Mixers also prepare the sound to perform best for the platform on which you are showing it. A theater often has great speakers and a high dynamic range. That is a very different environment from viewing a seven second Facebook ad on a phone.

Sound Mixers may also be called Sound Re-Recordists. These days we “Bounce” the file in a computer, or “Export” the final delivery, but the old wording sticks around. Some effects Sound Mixers can add include EQ (Equalization), Compression, Limiting, and Reverb or Echo.

An editor can mix sound, but being a great editor and a great sound mixer are two very different skills. Clients often don’t want to work with multiple people, but if your editor doesn’t have the technical expertise needed to mix sound, hiring a separate sound mixer may help you get the best final product. As an editor I prefer to have professional Sound Mixers finish the audio. They are more efficient with the tools needed to mix audio properly, as it’s often done with separate software from editing software. Sound Mixers often charge the same rate I do, but can improve the audio more in less time.

Sound Mixing can make a huge impact on your end product. The beloved resonance of a voice-over actor or podcast host’s voice is often aided by the sound mixer bringing out its best qualities.

5. Color Correction

You may think, “My video looks fine, why would I need to color correct it? Isn’t correction only for when you have problems?”

Yes, color correction was originally done when there were problems with the film and editors had to match shots that looked a bit different. That still exists, but color work now also adds the small details that make your image pop or enhance the feeling and goal of the video. It’s the final touch on the visual medium of a movie. Along with an unmixed soundtrack, uncorrected color is one of the first signs that a video was not finished professionally. Here are some examples of video before and after it is colored.

Often you’ll hear an editor respond to a request for color correction with “I’ll throw a LUT on it.” What is a LUT? LUT stands for “Look Up Table,” a preset like an Instagram filter but applied to video images. This is a cheap way to complete the process, but it can be problematic if the video contains clips from various parts of the day. Color Correction evenly matches the images to the time of day as well as to the style and mood of the video. This step fully immerses the viewer in the world of the video while also enhancing the visual aspects.
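
For the technically curious, the idea behind a LUT can be shown in a few lines. This is a simplified, hypothetical one-channel 8-bit “contrast boost” curve; real video LUTs (such as .cube files) are three-dimensional tables applied to every pixel, but the lookup principle is the same:

```python
def build_contrast_lut(strength=0.2):
    """Build a 256-entry table that pushes values away from mid-gray."""
    lut = []
    for v in range(256):
        # Normalize to -1..1 around mid-gray, stretch, clamp, map back to 0..255
        centered = (v - 127.5) / 127.5
        stretched = max(-1.0, min(1.0, centered * (1 + strength)))
        lut.append(round(stretched * 127.5 + 127.5))
    return lut

def apply_lut(pixels, lut):
    """Applying a LUT is just an index lookup per pixel value."""
    return [lut[p] for p in pixels]

lut = build_contrast_lut()
print(apply_lut([0, 64, 128, 192, 255], lut))  # → [0, 51, 128, 205, 255]
```

Because the table is precomputed, applying it is extremely fast, which is exactly why a LUT is the “cheap” option compared to shot-by-shot correction.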

Color correction in process for Silent Forests, a documentary by Mariah Wilson. Photo by Author.

You can do many fun things in color correction like enhance sunsets, make people’s eyes crisper, and blur out backgrounds. Colorists know tricks to make sure the viewer is focused on what the colorist wants them to be focused on in every frame.

Color Correction can be expensive because it is a highly technical art form. However, it doesn’t take long for a Colorist to color a short video: a professional can finish up to fifteen minutes of video in one day. Color Correction is also worth ordering in bulk, since finishing several videos at once helps with these costs. Even something quick like a three-minute video is typically billed as a half day (four hours), which is usually the minimum a professional will accept for a project.

6. Deliverables

Finally, your video is completed. Your colored video is matched with your mixed sound. How should you receive your video?

Many clients only want a video ready for YouTube or Facebook. However, those are very low quality files, and it’s important to also obtain the highest quality version possible. If you’re not sure what that is, consult your editor, but a safe bet is a ProRes HQ file at the highest resolution. The main reason you want a high quality file is that if you ever have to deliver the video somewhere else or make changes, it saves you or the editor the headache of tracking down all the old files. For example, you might later want a longer ad cut down for a different platform, like YouTube. With a high quality file, an editor can shorten the video easily.

One other option is to request the project files used to create the video. Project files are very small and most editors will deliver these, but it’s great to request them ahead of time as some editors feel their editing work is proprietary and may be wary about another editor re-editing the video in the future.

In addition to the high quality video file and project files, you should request high quality audio with “splits.” “Splits” or “stems” are different audio files the length of your video that contain:

  1. Dialogue

  2. Sound Effects (Most of Sound Design)

  3. Ambient Sounds (part of sound design sometimes)

  4. Music

This way, if your video is used for a TV spot in the future but needs new music, you can replace it without losing your wonderful sound mix. It gives your video a long shelf life. You invested time and money; it’s important to keep good records.
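
The value of splits can be shown with a toy sketch, assuming (unrealistically simply) that each stem is a list of audio samples. Real stems are full-length WAV files, but the principle is the same: the final mix is the sum of the stems, so swapping one stem leaves the others untouched.

```python
def mix(*stems):
    """The final mix is the sample-wise sum of the stems."""
    return [sum(samples) for samples in zip(*stems)]

dialogue = [3, 3, 0, 0]
sfx      = [0, 1, 1, 0]
music    = [1, 1, 1, 1]

original_mix = mix(dialogue, sfx, music)  # → [4, 5, 2, 1]

# Years later, a TV spot needs different music: swap only that stem.
new_music = [2, 0, 2, 0]
tv_mix = mix(dialogue, sfx, new_music)    # → [5, 4, 3, 0]
```

Because only the music stem changed, the dialogue and sound effects keep the balance the original Sound Mix gave them.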

If you do these six steps properly, video professionals and clients will experience the difference, and you’ll see the effectiveness of your video increase.

If you have any more questions about specifics or other options when creating a video, please contact Fourwind Films at info@fourwindfilms.com