If you spend any time editing or outputting video, you will come across the term Codec. Because there are so many of them, and it is difficult to tell the difference between them, we put together a quick look to help you get started. If you can understand certain terms, you can better decide which one fits your needs. Let’s start at the beginning with a simple definition.

Codec is really the meshing of two words: coder and decoder (co/dec). What do they do? In the simplest terms, because video files are so large, you need a way to make them smaller. The codec encodes, compressing the data for storage or sending, then decompresses for playback or editing.

A codec is a computer code that performs its function whenever the file is called up by a piece of software. Codecs can also be used in a physical piece of hardware, like your camera, turning incoming video and audio into a digital format. This happens in real time, either at the point of capture or the point of playback. The codec also reverses the function and turns digital video and audio signals into a playback format. Unless you’re a broadcast engineer, however, you will rely on your computer or device to select a codec. The hardware compresses your video and audio data into a manageable size for viewing, transfer or storage.
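To make the coder/decoder idea concrete, here is a minimal sketch using Python's built-in zlib as a stand-in compressor. zlib is a general-purpose, lossless compressor, not a real video codec, but the encode-then-decode round trip is the same basic idea:

```python
import zlib

# Pretend this is a run of uncompressed video data.
raw = b"frame data " * 1000

encoded = zlib.compress(raw)        # the "coder": shrink for storage or sending
decoded = zlib.decompress(encoded)  # the "decoder": restore for playback

assert decoded == raw               # lossless: the round trip is exact
print(f"{len(raw)} bytes -> {len(encoded)} bytes")
```

Real video codecs work on frames rather than raw bytes and exploit patterns specific to images and motion, but the encode/decode pairing is the same.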



Types of Codecs

Now that you know what codecs are, let’s look at the variety of codecs out there. Then you can decide which one is the best fit for your needs.

You’ll find thousands of codecs, grouped under a variety of umbrellas. Lossless codecs are just what they sound like: they reproduce video exactly, without any loss in quality. Lossy codecs, on the other hand, lose a small amount of information but can compress material into a much smaller file. Lossy codecs are great for compressing data that needs to be sent via email or uploaded to the internet. Use caution when choosing a lossy codec, though; some formats introduce visible color shifting.
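A toy illustration of the difference, using Python's lossless zlib on one hand and simple quantization as a hypothetical stand-in for lossy compression on the other (real lossy codecs are far more sophisticated):

```python
import zlib

samples = [10, 12, 13, 200, 201, 7, 8, 8]  # toy "pixel" values

# Lossless: every value survives the round trip exactly.
packed = zlib.compress(bytes(samples))
assert list(zlib.decompress(packed)) == samples

# Lossy (illustrative only): quantize to the nearest multiple of 8
# before storing. The values get simpler and more compressible,
# but the discarded detail can never be recovered.
lossy = [round(s / 8) * 8 for s in samples]
assert lossy != samples  # some information is gone for good
```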

Overall, all codecs work toward the same end: put your data into a manageable file type with as little loss of quality as possible.

Transformative codecs cut the material into smaller chunks before actually compressing it. Predictive codecs compare the data you’re compressing with adjacent data and get rid of unnecessary data, creating a smaller file.
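The core trick behind predictive coding can be sketched with simple delta encoding, where each value is stored as its difference from its neighbor. This is a much-simplified, hypothetical illustration, not how any production codec is actually implemented:

```python
def delta_encode(samples):
    # Store each value as its difference from the previous one. In smooth
    # data the differences are small and compress far better than raw values.
    prev, out = 0, []
    for s in samples:
        out.append(s - prev)
        prev = s
    return out

def delta_decode(deltas):
    # Rebuild the original values by accumulating the differences.
    prev, out = 0, []
    for d in deltas:
        prev += d
        out.append(prev)
    return out

samples = [100, 101, 103, 103, 104]
deltas = delta_encode(samples)      # [100, 1, 2, 0, 1]
assert delta_decode(deltas) == samples
```

Note how the deltas cluster near zero even though the raw values do not; that redundancy between neighbors is exactly what predictive codecs exploit.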


The most widely recognized family of codecs is based on MPEG standards. MPEG is an acronym for Moving Picture Experts Group, the organization that sets and codifies the standards. There are a number of primary MPEG formats and a multitude of derivative types.

MPEG-1 is a data stream that reproduces video with high quality. The MP3 (MPEG-1 Audio Layer 3) standard for audio compression, developed by Fraunhofer, is an application of the MPEG-1 standard; note that MPEG-1 video does not always include MP3 audio.

Almost all computers and consumer DVD players support both MPEG-1 and the MP3 digital audio encoding formats. One drawback is that MPEG-1 allows only for progressive scanning. Progressive scanning is a method of storing and displaying moving images where all of the lines of the image are drawn in sequence. This is in contrast to interlaced scanning, where all the odd lines of an image are drawn first, then all of the even lines are drawn. MP3, while lossy and quite small, is the standard for nearly all digital music storage devices, audio players and retail sites. The typical MP3 audio file is encoded at 128 kilobits per second, around 1/11th the size of the original audio data on a CD.
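The 1/11th figure checks out with quick arithmetic: CD audio is 44,100 samples per second, 16 bits per sample, across two channels:

```python
cd_bitrate = 44_100 * 16 * 2   # Hz x bits x channels = 1,411,200 bit/s
mp3_bitrate = 128_000          # a typical 128 kbit/s MP3

ratio = cd_bitrate / mp3_bitrate
print(f"MP3 is about 1/{ratio:.0f} the size of CD audio")
```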


MPEG-4 supports both progressive and interlaced video. It employs better compression techniques than MPEG-1 and is a widely accepted compression standard. In fact, a number of codecs are derived from MPEG-4. One is the H.264 codec, another option for encoding video for Blu-ray Disc, as well as for videos found on the iTunes Store. H.264 is a family of standards with great flexibility and a wide variety of applications: it enables compression at both high and low bit rates, and for both high and low video resolutions. This flexibility lets users apply the same standard when compressing for broadcast, multimedia usage and large file storage.


ProRes is another widely used codec. Developed by Apple as Apple ProRes, it is found in Apple products like Final Cut and iMovie. It comes in several variants, such as ProRes 422, ProRes 4444 and ProRes RAW. Developers boast that it will handle up to 8K media with superior playback, and superior color resolution is also a main feature.


Another well-known codec, or family of codecs, is WMV, or Windows Media Video. With the glut of Windows users out there, it is no wonder this codec family is so popular.

Originally designed to compress files for internet streaming, WMV was introduced as a competitor to the RealVideo compression codec. Microsoft’s WMV 9 has been around for quite some time at this point, and Microsoft claims that it provides a compression ratio that is two times better than MPEG-4 and three times better than MPEG-2. WMV 9 is also the basis of the SMPTE VC-1 video compression standard, which is another format that can be used for encoding video for Blu-ray Disc.

Other Codecs and Containers

Be sure to note the difference between a codec and a container. So what is a container? It is a lot like the wrapping on a present: it refers to the way in which information is stored, but not how it is coded. For example, QuickTime is a container that can be wrapped around a variety of compression codecs, like MPEG-4, k3g, skm and others.

What Do YOU Need?

So, which codec do you choose? You will need a little trial and error to discover the right one. Ask yourself a few questions: Are you compressing for storage or for high-quality viewing? Are you okay with a little data loss, or do the finished files need to be clean and pristine? Work backwards and do some research. Find out what the pros are doing to get the same results; following their lead is a good way to find the codec that is right for the job.

So now that you have at least a slightly better understanding of some of the more popular codecs, it might also be helpful to know which formats utilize those codecs. Read our associated story Transmission Formats for more.



  1. Thanks for this article, but I get frustrated with Videomaker’s unwillingness to do a deep dive into its subject matter. While camera reviews are always in-depth, instructional articles like this one never seem to do the subject matter justice. I find myself begging for more in-depth and technical content.

    For example, in this article, a blaring omission is the discussion of how codec choice affects editing. Does less compression really mean improved editing speed when jumping from one part of a clip to another? (Interframe compression puts more strain on the CPU when jumping around, as opposed to intraframe compression, right? Or not?) And there was no mention of bit depth or color subsampling, and which codecs, such as ProRes with its different flavors, support what ranges of values.

    Now take the article on color grading in the July 2019 issue. There was no mention of the array of tools that colorists can use, like waveform, RGB parade, Vectorscope, and even the Hue vs. Hue, Hue vs. Saturation tools. Also, as a good example of grading vs. correcting, you could have mentioned the dozens of films and TV shows for which the colorist chooses to grade shadows with a blue hue.

    Finally, take the article on video editing workflow in the June 2019 issue. Again, very shallow. In the section on audio post, why not mention some of the tasks that the audio engineer has to undertake beside mixing, such as compression, limiting, and EQ?

    I hope your editors will see fit to permit your contributors to do more in-depth (and yes, more technical) treatment of their subjects, and not leave your readers hungering for more. Thank you.

  2. Thanks for the comment Steve. We typically try to restrict the scope of an article to make information easier for searchers to find. Take the color grading article you mentioned for example; we intended that article to be a reference for searchers who are looking for a top-level understanding of what color grading is. We have a separate article on the tools which colorists use (scopes, etc) for people who are looking for that information. This Codecs article is similar. We have a separate article for how codecs influenced editing workflow, which you can read here. I can’t promise that it answers your specific questions, but it illustrates our philosophy on breaking articles into what we call discrete learning objectives.

    I will admit that we haven’t been very good about cross-linking between articles for readers who want to go down the rabbit hole on a topic (so to speak). I expect you’ll see improvement on that front in the coming months.