Video Rendering: CPU, GPU, Memory, Bus, 64, 32–WHAT?


Viewing 8 reply threads
  • Author
    • #49512

      In today’s world of video editing, a videographer must also be a computer tech to get what he needs in rendering. There have been a number of questions about computer hardware and rendering, so I thought I would lay it all out in one thread. I hope I have not already done this; if so, sorry for the repeat.

      Today most people want the fastest processor they can afford to render video. 4.0GHz is awesome, but the CPU is not the only chip carrying the rendering load; the GPU matters just as much. On computers with built-in video, the graphics logic is integrated into the chipset (historically the North Bridge) and shares system memory with the CPU. That setup is fine for basic Internet use, photos, playing a DVD, and office work. It is not good for video rendering.

      A stand-alone video card is needed for rendering video, and it needs to be a good one. When you think stand-alone video card, think cards marketed to gamers, in at least the $200.00 price range (for the card alone). In fact, to be sure you are getting a computer that will meet the needs of video rendering, buy a computer targeted at gamers. Why?

      A gaming computer focuses on rendering video at the highest processing ability possible. This happens in the video card’s GPU: since a video game is controlled by the player, the direction the player takes is dynamic, so the GPU must be able to redraw everything on the screen in milliseconds. Each and every frame of a game is drawn at the time of its display.
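      To see why this kind of per-frame work favors a GPU: each frame is the same small operation repeated over millions of independent pixels. Here is a minimal Python/NumPy sketch (my own illustration, not from this thread; the brighten operation is a made-up stand-in for a render step) contrasting pixel-at-a-time work with whole-frame work:

```python
import numpy as np

# One 1080p frame: height x width x RGB channels.
frame = np.random.randint(0, 256, size=(1080, 1920, 3), dtype=np.uint8)

def brighten_loop(img, gain):
    """Per-pixel loop: one pixel at a time, like a single CPU core."""
    out = np.empty_like(img)
    for y in range(img.shape[0]):
        for x in range(img.shape[1]):
            out[y, x] = np.clip(img[y, x].astype(np.int32) * gain, 0, 255)
    return out

def brighten_vectorized(img, gain):
    """Whole-frame operation: every pixel at once, the way a GPU works."""
    return np.clip(img.astype(np.int32) * gain, 0, 255).astype(np.uint8)

# The vectorized form handles a full HD frame easily.
bright = brighten_vectorized(frame, 2)
```

      On a full 1920x1080 frame the vectorized version runs orders of magnitude faster than the Python loop; a GPU takes the same idea further by processing thousands of pixels truly in parallel, which is why a strong video card matters for rendering.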

      OEM systems sold at places like Walmart are NOT aimed at gamers, but at Internet junkies. So where to look? There are great resources online that review systems monthly.

      At the current time AMD/ATi holds the market for better gaming/video systems. Intel is soon to release a CPU that performs better in some rendering benchmarks, but their GPU chipsets are still a Moore’s-Law generation behind ATi.

      So are laptops good for rendering? Yes, if you get a laptop made for gaming.

      It is noteworthy that if you can turn a screwdriver, you may be able to build your dream video-rendering machine, and it may be more cost effective. The resources mentioned above include step-by-step guides to building your own system.

      Words to avoid when picking your system:
      “Onboard or built-in video”
      “32bit Operating System”
      “Under 4GB RAM Memory”
      “SHARED anything when referring to hardware”

    • #202682

       Here’s, I think, a related question. I’m aware of the Intel i3, i5 and i7 CPUs. I know you said CPUs don’t matter so much. But would going higher, i5 over i3 or i7 over i5, generally mean higher-end everything else? Can I assume that a computer with an Intel i7 CPU would have a proportionally good graphics card? And while I’m asking, how do AMD processors match up (low to high) against Intel processors?

    • #202683

      You should never assume the computer will have a great graphics card. In fact, unless you custom-build one, the chances are the GPU is pretty basic. I will say that my GTX 580 renders video over 2x faster in Vegas Pro 11 than my GTX 285 did, and that is still a decent card.

    • #202684

      Loaded fanboy questions (the fans of AMD and Intel)

      From Intel, look at the i5–i7 (K) family only; the S and T suffixes mark lower-power processors aimed at other targets.

      When Intel came out with this naming system they had one objective, IMO: confuse the poo out of not just the general public but techs as well. BUT all processors, Intel and AMD alike, have code names. AMD’s current family is Bulldozer; Intel’s is Sandy Bridge, and Sandy Bridge is where an Intel purchase needs to lie. Although the GPU does handle rendering, the CPU works just as hard. So yes, go for the best processor, and then focus on the best GPU you can afford.

      NO, do not assume anything when it comes to graphics cards. Intel is the worst company for pushing its own GPU chipset, and Intel has the worst track record in the graphics market. That is why there are two real players, nVidia and ATi. If you see “Intel Graphics Chipset,” you will choke and sputter in rendering. Intel makes a great CPU but struggles in the GPU market. Intel and ATi hate each other, ATi and nVidia hate each other, and Intel and nVidia get along like step-kids. That is why most Intel-based systems ship with nVidia GPUs. Regardless, you want the best stand-alone GPU card you can afford.

      At the current time we are seeing a major change in CPU and GPU performance. Intel’s Sandy Bridge-E will be the best processor made for some time to come. However, nVidia has fallen behind ATi in the video market.

      If I were to spec a new system at this moment and did not want to overclock it (hardware chipsets need to match to overclock), I would go for an Intel Sandy Bridge-E processor and an ATi (made by AMD) Radeon HD 7970, which supports four monitors with an extended desktop and carries a price tag of $550.00 (for the card alone). If you wanted to overclock the GPU to push it harder (faster rendering), you would need to go with an AMD processor as well.

      AMD will always be better at gaming and rendering; it is in the architectural design of the processors.

      Think of it this way: Intel was born in the IBM era (International BUSINESS Machines), so its focus was on developing computers for the business world. At first, home computing was not part of Intel’s vision, but Texas Instruments did aim at the home user. Over the years processor companies came and went, and then along came a little company called AMD. AMD saw a future in gamers, and looked at Apple and saw a future in video rendering. Intel still focused on business.

      Both companies built their basic house, and ever since they have been adding rooms (improvements), but the basic foundation is still there. Intel will always focus on the business/server market; that is where its investment and money are targeted. So although Intel would like to capture the gaming market, it will not risk too big an investment and lose the business market.

      AMD entered the market with gamers and video rendering in mind, and from the ground up its entire company (ATi as well) was built on that basis. Intel will every once in a while come along with a processor that rocks, but for the current time AMD is the processor of choice for gaming and rendering. It is not a fanboy issue; it all comes down to what you want your processor to do for you: add up your customers’ receipts, or render your video?

    • #202685

      Keep in mind, though, that your NLE MUST SUPPORT GPU power for rendering. MANY do not. Vegas Pro, for example: GPU acceleration was introduced in Pro 10, but it was pretty weak. It was not until Pro 11, just a few months old, that it really made use of a GPU.

    • #202686

      @doubleham I concur, and likewise VSX4 was the first version to make use of the GPU and multicore CPUs.

    • #202687

      Now if they would start to harness the power of SLI…

    • #202688

      With USB 3.0, I tend to think that SLI may be phased out. Heat has been the biggest holdback to kicking video cards up a notch. The heat inside the case, no matter what cooling is chosen, can get too high for many builds. I use an nVidia 8800, and I have to run 8 fans with the side panel off.

      An external video-card box connected to the computer via some sort of cable has long been dreamed of, and USB 3.0 in theory can do just that. Maximum PC has been running some great what-if articles from ATi and nVidia about such a build. If it happens, video power will reach new heights.

    • #202689

      Just saying, my nVidia 580 GTX is double the speed of my old 285 GTX. Right now I can render AVCHD Blu-ray files in near real time (with just some color correction). I am always hungry for more speed, especially when it comes to adding more complex effects. My motherboard could support up to three of these cards and could really streamline my workflow if every bit of that power were used.
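      As a rough back-of-the-envelope sketch (my own illustration; the timeline length and speed factors are made-up numbers, with only the 2x claim taken from the post above), here is how a doubled GPU speed moves a render toward real time:

```python
def render_minutes(timeline_min, realtime_factor):
    """Wall-clock minutes to render, given rendering speed as a multiple of real time."""
    return timeline_min / realtime_factor

# Hypothetical: a 10-minute timeline rendering at half real time on the old card.
old_card = render_minutes(10, 0.5)      # 20.0 minutes
# A card twice as fast renders at 1.0x real time: the "near real time" above.
new_card = render_minutes(10, 0.5 * 2)  # 10.0 minutes
```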

