It seems like a new camera is announced every day, each one smaller and offering higher resolution than the last. Before you rush to preorder a new 6K camera, stop to consider what moving to a higher resolution really means. Will the benefits of the technology really justify spending upwards of $3,000?

Who’s watching?

If you’re considering a 6K camera, one of the first things to think about is how your customers will view the footage. There aren’t any 6K screens available, and even if there were, how would your client know the difference? Think about when 4K first came out and we scaled it down to HD with no one the wiser. How many people actually make it to the HD channels on cable, or have their playback quality turned all the way up when they watch YouTube?

We are constantly scaling up HD footage for our 4K projects, yet I have never had a client point out that there is an HD clip in a 4K video. Most of my clients can’t tell whether I delivered the project in 4K or HD unless I mention it. The next time you are on YouTube, watch some comparison videos; I bet you won’t be able to tell the difference either. Don’t believe me? Try it.

Time to upgrade everything

Another thing to consider is that upgrading to a higher-resolution camera requires more storage space, more computing power, longer upload times — and the list goes on.

A couple of years ago, I decided to upgrade to 4K. I had clients constantly asking for it (probably because it was a buzzword at the time). It seemed logical to meet their demands. 

After my first 4K shoot, I realized that my project folders went from 15–20 GB to 150–200 GB. My upload time to YouTube increased from two to three minutes to five to eight minutes. My render and export time for a three-minute marketing video soared from about five minutes to three hours! To fix these problems, I had to purchase 20 TB of extra storage space and a new iMac Pro, and I upgraded to a faster Internet service. So, at the end of the day, my $2,200 camera cost me nearly $13,000.
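
If you want a rough sense of why the jump is so large, here is a back-of-the-envelope sketch in Python. The bitrates and the hour of footage per project are assumptions for illustration, not my exact camera settings, but the ratio lands close to what I saw:

    # Rough sketch: how per-project footage size scales with bitrate.
    # The bitrates below are illustrative assumptions, not exact camera specs.

    def footage_size_gb(bitrate_mbps, minutes):
        """Approximate file size in gigabytes for a given bitrate and duration."""
        return bitrate_mbps * 60 * minutes / 8 / 1000  # megabits -> megabytes -> gigabytes

    shoot_minutes = 60  # assume roughly an hour of raw footage per project

    hd_gb = footage_size_gb(50, shoot_minutes)    # ~50 Mbps HD codec
    uhd_gb = footage_size_gb(400, shoot_minutes)  # ~400 Mbps 4K all-intra codec

    print(f"HD: ~{hd_gb:.0f} GB per project")   # ~22 GB
    print(f"4K: ~{uhd_gb:.0f} GB per project")  # ~180 GB, roughly the tenfold jump above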

8K on the way

I know one of the most common arguments for moving to higher resolution cameras is that you’re future-proofing your gear. I reasoned likewise when I upgraded mine. Yet, the truth is that once 8K is available, we probably still won’t have 6K screens. Instead of upgrading to the latest and greatest, invest in equipment that will make your life easier and add production value. Keep your HD (or 4K) setup and buy some extra lights, a better tripod or an amazing audio setup. These will make a difference in your videos that your clients will notice, and so will you.

Bill Baraona
I am a video production and marketing professional living in Cleveland, Ohio. I have been working in the video industry since 2004. I'm married to a beautiful graphic designer, and we're living our best creative lives. When I'm not working and running the day-to-day operations of Flex Media, you can find me on the shores of Lake Erie, enjoying local restaurants, and searching for the perfect cup of coffee.

18 COMMENTS

  1. I agree totally. If my client can’t view a 4K video, there is no point in even buying the camera I want to shoot at that resolution.
    Personally, I suspect that the whole 4K–6K–8K resolution race is a ploy to sell more gear, no more, no less. Your comment about the file sizes says a lot about the downside, plus the computer thing… I see no real value in shooting above UHD.
    Content, focus and good sound: those are the points to ponder, and goals to work towards. Carry on, and do good.

  2. This is some of the most commonsense and timely advice I have read in a long time. My Lumix GH5 features 4K, but I have never once been tempted to use it. The sole justification which would induce me to do so is to enable me to have some ‘wriggle-room’ when it comes to re-planned motion within a series of video frames. I have a kit of very carefully chosen lenses, mostly from the mid-1980s and, where possible, ‘Tamrons’, and I focus everything with meticulous care. I have no problem at all with magnifying many of my shots up to 120 percent and recovering any lost sharpness (there usually is very little) by using an excellent ‘sharpness’ facility built into my editing software. My stuff is all natural history with an emphasis on the bird-life of wetland areas. I am also acutely conscious that if reasonable sharpness is not there from the outset, no added ‘sharpness’ which I know of will enhance what simply was not there to begin with.

    I shudder to think what the data-storage requirements would be for 4K, when my present shots in ‘Full-HD’ are usually into the gigabits for a duration of 16 seconds or so. As with both the author of the article and the first respondent to it, I absolutely agree that any attention over and above what I have already stated should be lavished upon the sound, especially that recorded in the field. I don’t think I am being pedantic when I state that investing craftsmanship in the ‘footage’ I shoot outstrips the use of 4K per se, unless the audio sees a similar investment of care and attention to detail. It might pay to bear in mind that any perceived shortcomings in what we see on TV (which is the Gold Standard by which most people judge image quality, whether we like it or not) come about not from any aspect of HD which leaves much to be desired, but from the fact that cramming it into the limited bandwidths offered by most terrestrial services squeezes the life out of it through heavy-handed over-compression.

  3. I’ve had 4K-capable cameras for several years (Panasonic super zooms and now a Nikon Z6). The only time I shoot in 4K is when I’m recording the ballet performances for my daughter’s ballet school. I just set up so I can see the whole stage and hit record for each dance. I then go into Premiere and do my editing there, adding different zoom levels and doing camera pans. The end result looks like I shot with 2 or 3 cameras, and the school is very happy.

    But other than that, everything else I record is at 1080p.

  4. This article was very timely, Bill! 3 of my 4 cameras are 4K and I’ve often thought about shooting in the higher res to be able to “punch in” on images and keep sharpness. But in the end, with a 2017-edition PC (even with lots of memory, a monster SSD and an 8GB GPU), trying to ingest 4K footage would probably grind my computer to a halt. So I’m staying in HD for a while longer! 😉 Now, if someone comes along with a project and a big budget, well, then maybe I’d consider stepping up. ~TW

  5. I own a Panasonic GH5 and I just started shooting in what they call 6K anamorphic but without the anamorphic lens. This gives me a 4:3 aspect ratio.

    Here is one reason I LOVE shooting in the higher resolution.

    Since I release my content in Full HD, shooting in 6K anamorphic mode gives me TONS of room to recompose the shot by panning up or down, zooming in and out, etc. using Adobe Premiere and its motion effects in combination with keyframes.

    I bring the 6K video, which is 4:3 aspect ratio, into a Full HD sequence on the timeline.

    I shoot everything handheld in a run-and-gun style, so being able to have all that extra resolution to work with in post is AWESOME!

    Drew
    Las Vegas, NV

  6. I couldn’t agree more!
    I have a Zcam E2 which shoots 4K at 160 fps! I never thought I’d have to shell out this much in money and resources just to process and store that.
    Lots of people can’t differentiate between 720p and 1080p on their phone screens, let alone 4K!
    Also, I switched back to 1080p after using up 3 TB of HDDs.

  7. There is a benefit not mentioned here. 4K and 6K give you crop options (if you finish in 1080) which, when shooting interview or concert footage, are like having 2 cameras in one package. If you shoot wide, you can get a medium shot or a close-up just via the crop (with complete flexibility as to the position of the crop) while maintaining 1080p resolution. I often have to travel to shoot interviews, so being able to have essentially 2 cameras in one package is really nice. Storage continually gets cheaper and cheaper. It may not be the deciding factor to upgrade, but I thought it should at least be mentioned. Could be very helpful to some.
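
    For anyone doing the math on how much crop room that gives you, here is a rough sketch in Python. The frame sizes are the standard UHD and Full HD dimensions; everything else is just illustration:

        # How much "punch-in" a UHD (4K) frame allows on a 1080p timeline.
        # Frame sizes are standard UHD and Full HD; the numbers are illustrative.

        uhd_w, uhd_h = 3840, 2160   # UHD "4K" acquisition frame
        hd_w, hd_h = 1920, 1080     # Full HD delivery frame

        max_crop = uhd_w / hd_w     # 2.0 -- crop to half the frame width/height
                                    # and still deliver a native 1080p image
        print(f"Maximum punch-in before upscaling: {max_crop:.1f}x")

        # e.g. one wide interview framing can yield a medium shot and a close-up
        # simply by choosing where the 1920x1080 window sits inside the 4K frame,
        # much like cutting between two cameras.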

  8. I agree in principle with Al Falaschi that ‘4K’ as an acquisition format has quite a bit going for it. I would happily use it in isolated cases for the purposes he mentions, but for one or two requirements which might be uniquely my own. Firstly, there is the hassle of paging through the menu to make changes which might well apply to just one shot, while my subject, perched fortuitously on a fence post, is showing all the signs of an imminent take-off. When I am confronted by a situation I have literally ‘walked-in-on’, time is very much ‘of-the-essence’, as might well be the time taken to switch back to HD again for the next shot.

    Secondly, mine is a shoe-string effort; a retirement hobby got out-of-hand and now ‘on-steroids’, so to speak. It is a non-profit contribution towards better understanding of conservation issues. Each time a fresh 4 TB hard drive is needed for archiving or other purposes (which is at least twice a year), it means a further expenditure of NZ$219 at my best source. Therefore there is a natural limit upon how much data I am able to store at any given time. Hard-drive storage is both expensive and, looked at in a longer time-frame, of doubtful reliability, but archiving all material, both as shot and in graded form ready for use, is as much insurance as I have. The cost of storage alone is a huge disincentive; for instance, given my retirement income and the fact that I do not wish for outside finance, due to the tendency of such assistance to dictate outcomes, it is just not possible to back up everything separately. But finally, why employ four pixels, at much greater cost, to accomplish what a single pixel is capable of doing more than adequately, in my view?

  9. I have one 4K 1-inch Sony FX 100 camera that I use for wide shots mixed with two to four HD cameras. I hoped I could crop to effectively get a 2X zoom factor in an HD timeline, but I find it also magnifies noise, so it is not as good as an HD camera zoomed in optically for the same field of view. However, on theatricals, if the HD cameras miss something it is a backup that is much better than nothing. The only thing I have found is that the cropped 4K did not look as good as the HD when encoded down to SD.

  10. There is one aspect about all of this which causes me to pause and draw breath. Many images can be too sharp ‘for purpose’, in my view. Yesterday, being in town, I went to a department store which has a great range of current TV sets, up to and beyond 4K, Super HD and similar, with up to 80-inch screens. Since the ‘party-of-the-second-part’ was at the supermarket at the time, pre-Christmas, I was not short on time to take everything in and make comparisons. My first impression was that I did not like many of the images at all, but that was likely due to the fact that the colour was ‘screwed-up’ in both senses, in order, no doubt, to tempt the very people who ought to heed Bill Baraona’s warning. Yes, detail did stand out, but in a way likely to confuse instead of enlighten. I viewed at several distances in order to make fair comparisons. Across distances similar to those of viewing in an average room, that detail was largely superfluous to any activity taking place on-screen.

    However, I have, coming up, my annual trip to a point on the South Otago coastline (New Zealand) where Royal Spoonbills nest over the summer months and raise their young to the point of fledging, on a spectacular, but exposed, rock outcrop. I can see some merit in shooting that material in 4K and will give it a try. That combined with a 500mm lens, plus a 2X converter, might ‘do-the-business’. Of course, even a moderate breeze could turn the whole thing to custard. But if it comes off, the shots should downscale to HD very nicely. Wish me luck…

  11. In the early ’90s I wrote a research paper for college about HD. At the time, there were no consumer outlets for it, broadcasters & cablecasters had no infrastructure or bandwidth for it (pre-internet video), and it died. Sort of. It was a “back up and punt” situation. It took close to 20 years of development on all fronts before it became a norm. I have watched 4K, 6K & 8K with the same fascination. I have a couple of 4K-capable cameras. I rarely use the 4K capabilities. I almost always shoot HD and, more often than not, livestream at 720 instead of 1080 because the bandwidth capabilities of our international audience vary greatly. Maybe in another 5, 10 or 15 years 4K will be the norm. I will be over here shooting HD until it really makes sense for me to change to 4K and beyond.

  12. I frequently use our 4K 3-camera switcher feed for large-screen presentations.
    The difference compared to HD is very noticeable.
    Worth the money?
    When you find out, let me know.

    Sort of reminds me of the days when we wondered
    if people could really tell the difference between 720p and 1080p;
    eventually they did, and will, I’m sure, continue to see finer details as time goes along.
    But, oye, that’s a big investment.

    The crop options/virtual camera situation has, for us,
    come through to great benefit in post.
    We are restricted from getting too close with our camera positions.
    4K has saved us time and again to end up with cleaner, more useful shoots.

    Yes, 4K really slows down the processors.
    Yikes, does it use up a ton of memory, of all sorts.
    But then, I thought OS 9.2.2 never should have been changed, too.

    Maybe Maslow was on to something:
    “In any given moment we have two options: to step forward into growth or step back into safety.”
    I certainly move between those two options daily.
    Sort of glad I’m closer to the end of this video journey rather than having to begin again.
