When to shoot RAW (and when to skip it)

Your camera sensor captures a bewildering amount of data, and by default, your camera makes some decisions on the fly and throws most of this data out. A RAW file keeps it, which is why a RAW image frame is like a film negative: different processing can give you different results.

Using a non-RAW format, like MPEG, is sort of like sending your film to the local drug store for processing. Maybe most of the time it’s fine, but sometimes you want the ability to go back and make your own decisions about things like white balance, sharpening, exposure, highlights, shadows, lens corrections and more.
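
If you want to see what "making your own decisions" looks like in practice, here's a minimal sketch, assuming Python with the rawpy and imageio libraries and a placeholder file name (frame_0001.dng could be a RAW still or one frame pulled from a CinemaDNG sequence). The same sensor data gets developed twice with different white-balance and brightness choices:

```python
import rawpy    # wraps LibRaw; pip install rawpy
import imageio  # pip install imageio

# "frame_0001.dng" is a placeholder path for this sketch.
with rawpy.imread("frame_0001.dng") as raw:
    # Develop 1: trust the white balance the camera recorded at capture time.
    as_shot = raw.postprocess(use_camera_wb=True, output_bps=16)

    # Develop 2: same raw data, different decisions. The white-balance
    # multipliers (R, G, B, G) and the brightness lift are illustrative only.
    reinterpreted = raw.postprocess(
        use_camera_wb=False,
        user_wb=[2.2, 1.0, 1.6, 1.0],
        bright=1.3,
        output_bps=16,
    )

# Two different "prints" from the same negative.
imageio.imwrite("as_shot.tiff", as_shot)
imageio.imwrite("reinterpreted.tiff", reinterpreted)
```

An MPEG file gives you no equivalent second chance; the camera's decisions are already baked into the pixels.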

Furthermore, uncompressed RAW video is everything coming off the sensor. When your camera throws out data to save on file size, that compression can (and does) lead to "artifacting" (also called "compression artifacts"). This most often appears as blocky patches in the shadow areas of your video.

Since your video is probably going to end up in a compressed format eventually, some artifacting is inevitable. Like so many things, it's a balancing act between convenience and appearance.

When to skip RAW

If my mom texts and says “send me some video of that adorable cat of yours playing with that toy I knitted her,” I’d be a real jerk to send a six-gigabyte RAW camera file back. Your camera is fairly good at making average decisions about average lighting conditions for average use.

Similarly, if you're shooting video simply for recreational purposes, shooting in MPEG is fine. RAW files are enormous, and disk space, however cheap, is not free. So, if you're sure you'll never want to use this footage for something important, you're probably okay not shooting RAW.
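
To put a rough number on "enormous" (back-of-the-envelope arithmetic only; real cameras vary with resolution, bit depth, frame rate and whatever light compression the RAW format applies):

```python
# Rough, uncompressed RAW data-rate estimate; the figures are illustrative.
width, height = 4096, 2160       # 4K DCI frame
bits_per_photosite = 12          # a common RAW bit depth
fps = 24
minutes = 10

bytes_per_frame = width * height * bits_per_photosite / 8
bytes_per_second = bytes_per_frame * fps
gigabytes = bytes_per_second * 60 * minutes / 1e9

print(f"{bytes_per_second / 1e6:.0f} MB/s, about {gigabytes:.0f} GB for {minutes} minutes")
# -> roughly 318 MB/s and about 191 GB for ten minutes of footage,
#    versus single-digit gigabytes for a typical MPEG/H.264 recording.
```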

RAW is really good at fixing video that was poorly shot in the first place. So if you have a lot of control over your video setup (for example, if you're able to set your white balance in advance and you know the lighting isn't going to change), you're probably okay shooting MPEG.

Apart from being much larger, RAW requires post-production work to make it viewable; this is often a hindrance if you just want to get something out the door.

Likewise, if you've never shot RAW before, don't begin with an important project. The hardware requirements for RAW, especially at high resolutions, are significantly greater than for MPEG. SD cards that work fine for one format may not have the throughput for RAW. Test everything out first, which means shooting 30 minutes or so of video to make sure your card and your camera's buffer can handle all the data.
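
As a rough sanity check before that test, compare the data rate against your card's sustained (not peak) write rating; the numbers below are placeholders, not the specs of any particular camera or card:

```python
# Will the card's sustained write speed keep up? Placeholder figures only.
raw_rate_mb_s = 318            # e.g. uncompressed 4K 12-bit 24 fps (see above)
mpeg_rate_mb_s = 12.5          # e.g. a 100 Mb/s H.264 recording
card_sustained_mb_s = 90       # the sustained rating, not the burst figure

for label, rate in [("RAW", raw_rate_mb_s), ("MPEG", mpeg_rate_mb_s)]:
    verdict = "OK" if rate <= card_sustained_mb_s else "will drop frames"
    print(f"{label}: needs {rate} MB/s -> {verdict}")
```

Even when the arithmetic says the card is fast enough, the 30-minute test is still worth doing; sustained real-world performance is what the buffer actually sees.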

When to absolutely use RAW

RAW gives you so many options for changing the way your movie looks after you've shot it that any time you think you might want to combine footage shot in different places at different times, or exert any creative control over what the final product looks like, RAW is what you should use. Other times when you'll want to use RAW include when you're unsure of your lighting conditions, whether because you have mixed lighting sources or because you'll be moving between indoors and outdoors.

Make shooting RAW your default plan; you can always change it.

Occasionally I've shot video thinking I had my camera set up to shoot RAW footage, only to come home and find a bunch of .mp4 files. If you spot Bigfoot lumbering through the forests of the Pacific Northwest, gingerly eating daffodil flowers, you'll definitely want to grab RAW footage of that. But if you don't start shooting immediately, you may miss it completely. You don't want to have to fiddle with the camera.

For that reason, I try to leave all my cameras set to record RAW, so that I have to make a conscious decision to switch (just remember to set it back when you're done). That way you're defaulting to the format with the most options. If you find out you accidentally shot RAW when you meant to shoot MPEG, you can convert it.
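
The conversion itself is straightforward. Here's a hedged sketch that assumes the RAW footage has already been debayered or exported to a mezzanine file such as ProRes (camera-native RAW formats usually go through the manufacturer's tool or your editing software first), that ffmpeg is installed, and that master.mov and for_mom.mp4 are placeholder names:

```python
import subprocess

# Transcode a large master file down to a small, widely playable .mp4.
subprocess.run(
    [
        "ffmpeg",
        "-i", "master.mov",     # placeholder: a ProRes (or similar) export of the RAW
        "-c:v", "libx264",      # H.264 video, plays nearly everywhere
        "-crf", "20",           # quality-targeted compression
        "-preset", "slow",
        "-c:a", "aac",          # AAC audio
        "for_mom.mp4",          # placeholder output name
    ],
    check=True,
)
```

Going the other direction, from an .mp4 back to RAW, is impossible; the discarded sensor data is gone.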

Is there a middle ground?

There is a middle ground between RAW and MPEG, called log; you can read about it here.

In conclusion

Certainly, if it’s important, shoot RAW. Similarly, if you think it might be important later, shoot RAW. If disk space is not an issue, shoot RAW. If you’re not sure, shoot RAW. You can always convert RAW to a format with a smaller file size, but you can’t go the other way. If it’s not earth-shatteringly important, or you know exactly how your lights will be set up, or you don’t want to do any post-processing work, MPEG is fine.

Kyle Cassidy is a writer and artist living in Philadelphia.

3 COMMENTS

  1. I thought I was going to get some real insight into an issue that has been a “hot button” among still shooters for over a decade. The “artifacting” argument has been grossly overused in the still shooter world and isn’t washing well for me with video either. What we have here is an opinion piece. My two cents is: get it right in camera and you’ll spend more time shooting and less time editing. After all, having more time to shoot is golden.

  2. I think RAW really is for bigger or high end productions. Even then it’s not necessary depending on your content. ProRes 4444 is fine and can be found in cameras under $2000-$3000.

    If you're shooting an effects-laden flick like, say, a Marvel Studios picture such as Doctor Strange 2, Black Panther 2, or Guardians of the Galaxy Vol. 3, then RAW will make things easier in post, because those films have superheroes, destruction, and vast space, ground, or inter-dimensional battles, complete with nonexistent ships, vehicles, etc. RAW will come in handy for the FX houses to manipulate with ease without compression* (I'll return to this in a second).

    But for a film like Lady Bird, ProRes is fine, because it's a personal piece about a young woman going through teenage issues with her family, friends, etc., in a real-world setting with no real VFX. You just need to make sure there's no breakage of imagery in color correction, which ProRes handles fine.

    However, this brings me to my next point that I said I’d come back to. As much as I hate to say it, the dismissive phrase “We’ll fix it in Post” has never been more valid than it is today.

    You can manipulate and change footage from a smartphone and format it for a movie house with ease nowadays. The rules of RAW are shrinking as VFX programs become stronger. We shoot RAW now not because we have to, but because it's easier and doesn't slow the process down. Lady Gaga just shot a music video on the iPhone 11, and they previewed a bit of it in an ad before a movie. You can't tell a major difference anymore, especially for the non-pixel people (aka the general public). They can remove noise and grain without sacrificing crispness and pristine imagery (aka looking airbrushed); they can do a lot.

    Shoot in ProRes 4444 or 10 bit 422. You’ll be fine.

  3. To contribute my two cents' worth: shooting RAW also requires a camera capable of generating the required output, such as a 4:4:4 color space and 10 bits per color of video data. For some cameras, shooting RAW means purchasing an external video recorder with an SSD as the storage medium and a sufficiently fast sustained write speed.

    For cameras that can't generate RAW output but can still output a 4:2:2 color space at 8 bit, it may be possible to use an external recorder to record in either ProRes 422 or DNxHD/R at 10 bit. During recording, the 8-bit color data may theoretically be stored in 10-bit registers in the recorder by setting the two leading bits to 00₂, possibly giving more headroom flexibility during color grading in post-production.

    Based on other cinematographers' and professional broadcasters' opinions, a 4:2:2 color space is sufficient for effective chroma-key green-screen work.

    The only computer hardware issues with RAW, ProRes or DNxHD/R files are having a fast enough CPU (>3 GHz), a fast GPU with support for rendering 10-bit to 14-bit color, sufficient RAM, and a fast HDD or SSD (minimum recommended capacity 4 TB), so that frames can be rendered in real time during video editing and color grading.

    As for the GPU's ability to render 10-bit to 14-bit color, there isn't much research material available online showing any real difference in color-grading flexibility between an 8-bit, a 10-bit and a 14-bit GPU working from the same RAW, ProRes or DNxHD/R source file.

    The editing software requirement is the ability to read and render the RAW, ProRes or DNxHD/R files.

    I'm currently an amateur filmmaker and have been lazily experimenting with my Atomos Ninja V's DNxHD/R files; I haven't yet tested whether the 8-bit 4:2:2 data coming out of my Sony NEX FS100's HDMI port (recorded in the Ninja V using the 10-bit color setting) can be color graded with greater flexibility than the AVCHD 4:2:0 8-bit files stored on the SD card. I have recently acquired a Sony NEX FS700 and am fixated on its super-slow-motion capabilities.

    I have an HP laptop with an Intel i7-4510U CPU (2.0GHz base / 3.1GHz turbo boost) with its turbo boost speed limited to 2.6GHz because this CPU was designed for larger form factor laptops to provide for greater thermal dissipation. I upgraded the RAM to 16GB and swapped out the HDD for a 2TB SSD; next year I’m considering upgrading to a 4TB SSD as the DNxHD/R files take up a much larger amount of storage than originally anticipated.

    To read the DNxHD/R video files, I use the Movavi video editor to adjust the gamma and then the color grading before converting the video to the highest possible quality mp4 for final editing and color-grading fine tuning with Cyberlink Power Director and Color Director software.

    Disclaimer(s): I don’t get paid any commissions by HP, Intel, Atomos, Sony, Cyberlink, Movavi or the owners of the copyrights to the ProRes, DNxHD/R or AVCHD codecs.
