
Rendering speed and CPU choices

paulanderegg
Last seen: 1 year 4 months ago
Joined: 02/23/2012 - 7:37am

In my job shooting news and uploading to my website, I must render my DV-AVI files to MPEG-4. At the bit rate I use, my cheap $400 Intel i3 laptop struggles, and rendering a 4-minute clip takes almost 20 minutes. If I get a more powerful i7 (or similar) laptop, can I expect my rendering times to change dramatically? I shoot news, so saving 10-15 minutes is a big deal to me.
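
For scale, here is the arithmetic on those numbers, with hypothetical speedup factors rather than anything measured:

    # Render-time arithmetic for the figures quoted above; the 2x and 3x
    # speedups are hypothetical upgrade gains, not benchmarks.
    clip_minutes = 4
    render_minutes = 20
    print(f"current ratio: {render_minutes / clip_minutes:.0f}x real time")
    for speedup in (2, 3):
        new_time = render_minutes / speedup
        print(f"{speedup}x faster CPU: {new_time:.0f} min render, "
              f"saving {render_minutes - new_time:.0f} min")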

Paul Anderegg

www.citibeatnews.com


Gregory Watts
Last seen: 1 year 6 months ago
Joined: 10/25/2010 - 2:31am

Yes, as an IT guy I can tell you right off: computing power is the most important part of rendering, but not the processor of the computer. (A little background.) Here in America, at some point people decided video processors were not important in computer specs and focused on CPUs. OEM makers caught onto this delusion and started to push CPU power over GPU power. It is interesting to note that it is only on computers sold in America that the GPU is not treated as a must-have feature and is not advertised.

In the case of rendering video, the GPU does the rendering, not the processor; after that, what matters is the amount of memory you have. And here is another point to consider: this is not a fanboy issue. Intel focuses its product line on the business end, not the video-production end. In fact, there are some Intel-based computers still sold today that cannot render video because of the design of the CPU and GPU.

AMD focused its systems on gaming and, in turn, video production. So even AMD systems with what appears to be less processing power can do better than Intel-based systems.

As for your laptop needs, focus your attention on what are sold as "gaming" laptops; the GPUs needed to process video games far exceed what is needed to render video.

Alienware is a gaming PC company.
Avoid ALL budget OEMs. Dell and HP/Compaq just took a nosedive this quarter, so you may want to avoid their systems at this time.

Another point to remember: video rendering depends on the GPU, not entirely on the CPU, and so far the makers of most NLEs are only able to take advantage of dual-core processors. So unless you plan on keeping the system for a few years, a quad core will just take money for cores you will never use, unless bragging rights matter.

Recommend a system? No, because everyone's tastes are different, but I can point you in the direction of specs:

1) The most GPU (video card) power you can afford. At this time ATi is the top-end GPU producer, so look for a system with ATi video and the most video power you can afford.

2) A dual-core AMD processor. Processors go by codenames (mine is a Kuma Black Edition), and the codename will tell you more than the 3.2GHz figure; there is a great deal more to CPUs than that GHz number.

3) Get a 64-bit OS: Windows 7 64-bit. A 64-bit OS can handle more memory and move more data at a time than a 32-bit one.

4) Memory: this is an important part as well. With Windows 7 64-bit you can grab as much as you can get. ALL 32-bit versions of Windows, even Windows 7 32-bit, can only access about 3.5GB of memory and will never be able to access more; Windows 7 64-bit can, so get as much memory as you can, in the 8-16GB range (the arithmetic behind that 3.5GB figure is sketched just after this list). When you work on something on a computer, it stays in RAM until you save it to the hard drive.

5) Hard drive: if solid-state flash (very costly), get as much as you can afford. If old platter drives, stay below 1.5TB. Why? The exFAT file structure of Windows 7 has some issues with the way data is written to the current file storage systems of platter drives.
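
The 3.5GB figure in point 4 falls straight out of 32-bit address arithmetic; a quick sketch (the amount reserved for devices varies by machine, so the 0.5GiB here is an assumption):

    # Why 32-bit Windows tops out near 3.5GB: addresses are 32-bit integers.
    addressable = 2**32            # 4,294,967,296 bytes = 4 GiB of address space
    reserved = 512 * 2**20         # ~0.5 GiB mapped to devices, not RAM (varies)
    usable = addressable - reserved
    print(f"{usable / 2**30:.1f} GiB usable")   # ~3.5 GiB, the figure above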

Whichever platform you opt for, Intel or AMD, keep within the matched triangle of hardware:

Intel processor, Nvidia video card, Nvidia or Intel northbridge (chipset)

AMD processor, ATi video card, any chipset except Intel and Nvidia

Hope this was not too much.

 "A Photo Captures but a Moment in Time: Video Captures a Lifetime in a Moment"


adigraphics
Last seen: 1 year 6 months ago
Joined: 04/24/2013 - 12:52am

Hi Gregory Watts, how are you? We need some guidance from you. I run a small studio in India by the name of Aartsmen; we do architectural walkthroughs and TV commercial ads. My question: we have two i7 3.4GHz systems and one AMD 3.1GHz (8-core) system, and when we render from 3ds Max in V-Ray it takes a hell of a long time, around 1 hour per frame.

So we were planning to buy Dell blades, but we are confused about which ones to buy for our requirements.

Please help me out.

Thank you.

Regards,

 

aditya tandon

+91 8601958969

personal mail: sonugraphics@gmail.com


paulanderegg
Last seen: 1 year 4 months ago
Joined: 02/23/2012 - 7:37am

Thanks, not too long at all. I would never have thought about a gaming system or GPUs. I can afford around $1,600 for a new laptop, and I want to buy into the next generation of chipsets and processors, probably when Windows 8 comes out. Do you think $1,600 could buy me what I need, or should I start saving more in advance?

Again, thank you for your tech info :-)

Paul Anderegg


Gregory Watts
Last seen: 1 year 6 months ago
Joined: 10/25/2010 - 2:31am

$1,600 would set you up nicely. Windows 8 is Windows 7 on laptops and desktops; Microsoft is calling it Windows 8 because it can also run on ARM processors, cell phones, tablets, etc. The only difference in W8 from W7 is the icons, and if you have a touch-screen laptop or desktop you can use the touch display. MS will be supporting W7 along with W8; they are one and the same, W8 is just more money for MS.

Here is the best place to get the most reliable information on the direction of laptops:

http://www.maximumpc.com/search/laptop%20reviews

This is the best computer tech magazine there is, and they stay ahead of the rest of the tech magazines. They also have some nice reviews of Windows 8.

 "A Photo Captures but a Moment in Time: Video Captures a Lifetime in a Moment"


Gregory Watts
Last seen: 1 year 6 months ago
Joined: 10/25/2010 - 2:31am

Also, one more thing, Paul: we are at the end of Moore's law, and there will not really be any major strides in chipsets. What we have is what we have until we leave silicon. Silicon has reached its limits, and the number of transistors that can be put on a chip is only a few years from the end of Moore's law. Basically, what is on the market now is what we will have; developers are focusing on removing heat to speed up processing, since heat is the major concern at the power levels we have reached. So I would not be overly concerned about your system being out of date for some time. Until we leave silicon for another building material, all higher-end systems will be current for years to come.
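
For a sense of the doubling cadence Moore's law describes, a toy illustration (the starting count and the two-year period are assumptions, not chip data):

    # Toy Moore's-law projection: transistor count doubling every ~24 months.
    count = 1_400_000_000                  # roughly a 2012 high-end desktop CPU
    for year in range(2012, 2023, 2):
        print(year, f"{count:,} transistors")
        count *= 2                         # one doubling per two-year step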

 "A Photo Captures but a Moment in Time: Video Captures a Lifetime in a Moment"


paulanderegg
Last seen: 1 year 4 months ago
Joined: 02/23/2012 - 7:37am

So it sounds like the best bet is to find a configuration that best suits my application rather than searching out big leaps in tech. Cool :-)

As far as RAM goes, I've never exceeded 1,400MB of RAM usage in Task Manager. Would there be a better app that would show my system's actual RAM needs? (And is RAM speed a big deal in rendering?)

Paul


Gregory Watts
Last seen: 1 year 6 months ago
Joined: 10/25/2010 - 2:31am

Paul, RAM is extremely important in rendering. As for "I've never exceeded 1,400MB of RAM": Task Manager is not a good judge, and most programs will not give you true readings. Measuring RAM usage is like trying to measure the direction of a neutrino. I am aware they are all left-handed, but the interesting thing about neutrinos is that mathematically you can plot them and know their values; yet the second you measure them, all bets are off, some even flip, so what you measure is not what was there before you measured.

The same is true of RAM: it changes by the nanosecond, so what you see may not be what was there when you measured, or what will be there next.
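
One practical workaround for Paul's question is to poll continuously and record the peak, rather than trusting any single snapshot. A minimal sketch, assuming the third-party psutil package (pip install psutil):

    # Peak-RAM sampler: run this while a render is in progress, then stop
    # it with Ctrl+C to see the highest system-wide usage observed.
    import time
    import psutil

    peak = 0
    try:
        while True:
            used = psutil.virtual_memory().used   # bytes in use right now
            peak = max(peak, used)
            time.sleep(0.5)                       # sample twice a second
    except KeyboardInterrupt:
        print(f"peak RAM observed: {peak / 2**30:.2f} GiB")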

Also, the swap (virtual memory) file on the hard drive acts as RAM. Roughly 40% of RAM needs to be accessible to the OS at all times, so what Task Manager shows you is what the OS allows; your swap file is picking up the rest and thereby slowing down your computer. Here is a most fitting illustration.

You have an office, and in this office you have a desk. Cluttered on the desk are the documents you are currently working on (desk = RAM). When you run out of room, you have a table behind you under the window, and next to it some file cabinets. Soon your desk (RAM) fills up, except for 40% (your work area), so you have to take extra time to turn around and start laying out papers on the table behind you (the swap file); this turning back and forth takes extra time, and you lose productivity. Sometimes you even have to get out of your chair and go to the file cabinet to grab a new document; you guessed it, the file cabinet is the hard drive. That takes extra time still, more so when you have to glance at the table to see what is waiting.

You are the CPU, and the lines between the desk, the table, and the cabinet are the traces (data buses) on the motherboard. In a computer this happens so fast that the desk (RAM) has to wait on the table, and everything has to wait on the file cabinet. But suppose you had a bigger desk and longer arms (more RAM).

ALWAYS max out the RAM in every computer; the performance boost, especially in rendering, will be noticeable.

And one more thing: I suppose it is true what my family says, I can't give a short answer.

 "A Photo Captures but a Moment in Time: Video Captures a Lifetime in a Moment"


waxart
Last seen: 1 year 11 months ago
Joined: 10/19/2011 - 5:00am
Plus Member

All this information is so useful that I've marked it as a favorite. Thanks so much!


Gregory Watts
Last seen: 1 year 6 months ago
Joined: 10/25/2010 - 2:31am

You are very welcome, Ann. That is what this forum is here for: to help everyone get the best video they can. It has helped me greatly.

 "A Photo Captures but a Moment in Time: Video Captures a Lifetime in a Moment"


toddboyle
Last seen: 1 year 11 months ago
Joined: 05/14/2009 - 8:25pm

Yeah, but does the graphics processor matter with Sony Vegas Studio, Platinum, etc.? Their comparison chart says "comprehensive GPU support" is only in the $600 Professional version: http://www.sonycreativesoftware.com/moviestudiope/compare But the release notes http://dspcdn.sonycreativesoftware.com/releasenotes/moviestudiope11.0.322_readme_enu.htm suggest there is *some* rendering support, i.e. rendering AVC/MP4s? I don't want to throw money at a GPU, since I am not going to spend $600 on the Pro version.


Gregory Watts
Last seen: 1 year 6 months ago
Joined: 10/25/2010 - 2:31am

Todd;

You can get a good-quality GPU for $100 or less. The GPUs of last year, or even two years ago, that cost $500 when released now fall under the $100 mark and are still more than able to perform. Even if the lower-end NLEs do not rely as much on the GPU to render, the codec is still processed by the GPU, and ANY help the CPU can get will result in a better render. NLEs only started making use of GPUs about two years ago, and some versions aimed at the average customer will use the CPU more. BUT as more people in the US become aware of the GPU, and with the current trend in technology relying more and more on the GPU, a last-year-model GPU under $100 is an investment that is worth it. A GPU will also give you better playback of video; ANYTHING to do with video codecs handles better with an added video card.

 "A Photo Captures but a Moment in Time: Video Captures a Lifetime in a Moment"


toddboyle
Last seen: 1 year 11 months ago
Joined: 05/14/2009 - 8:25pm

Well, I shoot long material like lectures. I'm only going to do, like, three things:

-Edit my MTS footage and burn Blu-ray discs,
-Down-convert my footage to regular 4.7GB DVDs, and
-Down-convert to SD QuickTime or MP4s to put on the Internet.

I don't want 10- or 20-hour renders. I need to be sure the Vegas Platinum version will use the GPU; if not, then I would want the fastest quad-core CPU I could afford. But if it will use the GPU, there are lots of used "yesterday's gaming computers" that the kids are always selling.
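
For what it's worth, the two down-conversions can be scripted. A minimal sketch, assuming ffmpeg is installed and on the PATH; "lecture.mts" is a placeholder filename:

    # Batch down-conversion of one clip to DVD-compliant MPEG-2 and an SD MP4.
    import subprocess

    src = "lecture.mts"

    # ffmpeg's -target preset sets DVD-legal frame size, bitrate, and muxing
    subprocess.run(["ffmpeg", "-i", src, "-target", "ntsc-dvd",
                    "lecture_dvd.mpg"], check=True)

    # SD MP4 for the web: scale to 480 lines, H.264 via the CPU-based x264
    subprocess.run(["ffmpeg", "-i", src, "-vf", "scale=-2:480",
                    "-c:v", "libx264", "-crf", "23", "-c:a", "aac",
                    "lecture_sd.mp4"], check=True)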


Gregory Watts
Last seen: 1 year 6 months ago
Joined: 10/25/2010 - 2:31am

And there it is... There is not a single average commercial application sold yet that can take advantage of a quad-core processor. Nearly all NLEs take advantage of dual core, but NOTHING you or I would buy will make use of quad cores. Here are the basics of multi-core processing: the workload is divided among however many cores you are running, and the application and the OS decide which cores to hand the load to. Hardware CANNOT think; it just does the work. XP and Vista can't take advantage of multiple cores. W7 can make use of several cores; however, if the NLE you are using does not have code written to split the load, guess what: those cores are just sitting there doing NOTHING, but you can say you've got them. I have posted a few times about the things one should look for in a computer for video rendering. If you have the funds to buy a quad core, cut back to a dual core and buy a good GPU. Also, if you are looking at Intel, the only processor to get for video rendering is the i7, Ivy Bridge.

The OEM marketers get to me when they push cores on people and not GPUs. Think of this for that quad-core processor: it will sit there and suck the electricity right out of your power lines for those other two cores, it will give you heat, and it will give you bragging rights; a nice big white elephant will do most of the same.

Please, please do not waste money on a quad core when considering a system for video rendering. Go with dual core and spend the money where it counts: max out the memory and get a good GPU.

NO NLE on the market yet, priced in the average range, has code written that can use quad cores.

I have an upset tummy, not at you, but at the OEM market and Intel and the rest of the commercial world for the lies they tell. Lies, they all tell lies.

 "A Photo Captures but a Moment in Time: Video Captures a Lifetime in a Moment"


Anonymous (not verified)

I wanted to weigh in on this topic and clarify CPU vs. GPU rendering.

Initially, we need to define "rendering." When rendering to your monitor, yes, the GPU does most of the work; but when rendering to a video or image-sequence file, it's ALL CPU.

I have a 10-node mini render farm consisting of AMD FX-8150 3.6GHz 8-core processors. I create my scenes in 3ds Max, After Effects, Photoshop, and Premiere Pro CS5.5 on my workstation, then send them to my farm to render the final sequences.

None of those 10 render nodes even has a video card installed, which lowered the cost and the heat production of the nodes. I connect to them via Remote Desktop to set everything up, then simply send my projects to the farm. The motherboards do not have on-board video either, there are no GPUs installed, and there are no video/monitor connections of any type on these machines.

It's purely CPU processing power that "renders" my images. I periodically check in on each machine to ensure everything is going smoothly, and I typically check CPU and RAM usage as well as CPU temperature. As the nodes process my images, I can clearly see that all 8 cores on each machine are maxed out near 100% for the duration of the render. The only time all cores are not being utilized is on the final frame each node renders: as each core finishes its distributed "bucket," it drops to zero use while the other cores finish theirs. Other than the final frame, they are all used 100%.
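
The "bucket" behavior described above is easy to sketch; render_frame below is a stand-in that just burns CPU the way real per-frame work would, not any actual renderer's API:

    # Toy bucket-style frame distribution across all CPU cores.
    from multiprocessing import Pool, cpu_count

    def render_frame(frame_number: int) -> int:
        total = 0
        for i in range(2_000_000):        # placeholder per-frame workload
            total += (i * frame_number) % 97
        return frame_number

    if __name__ == "__main__":
        frames = range(240)               # e.g. a 10-second shot at 24 fps
        with Pool(cpu_count()) as pool:   # one worker per core, all near 100%
            for _ in pool.imap_unordered(render_frame, frames, chunksize=4):
                pass                      # chunksize=4 is each worker's "bucket"
        print("all frames rendered")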

The GPU only "renders" to your monitor, not to the final video file or image sequence stored on the hard drive. If rendering depended only on the GPU, how could I possibly render for weeks on a render farm that doesn't have any video cards installed?

Take a look at the online render farms available. They are mostly tightly packed 16-core Xeon processors with lots of RAM in a small footprint, with no mention of graphics cards, because they are not needed at all to "render" an image.

Peter Jackson's Weta Digital render farm in New Zealand has over 65,000 processors with multiple cores. That's over 100,000 cores running simultaneously; they certainly do not have over 100,000 graphics cards installed. They are IBM blade servers with multiple Xeons running on each board.

So, for the original poster: you mentioned "rendering" to a format other than the one you originally shot. When you transcode a file into another format, it's all CPU-intensive calculation. When you are viewing it on your monitor, that's the GPU at work.

To sum it up...

GPU - Definitely needed to display (render) quickly on my monitors while I work creating 3D animation scenes, editing video, and creating motion graphics. NOT needed to render final videos or image sequences. A more powerful GPU will let you scrub through your timeline more quickly and efficiently, but it will not transcode your file any faster; that is the CPU's job.

CPU - The workhorse when "rendering" video/image sequences or transcoding files.

Graphics cards are only worked heavily when displaying on your monitor. When you transcode to another format you don't even display the file on your monitor, and therefore don't work the GPU heavily. I watch a progress bar until the CPU finishes the work, then use the GPU to view the end result.

You can go out and pay way too much for an Nvidia Quadro graphics card, which will help you scrub through your timeline much faster, provided you have enough RAM, but it won't help when transcoding to another format. A single high-end Nvidia Quadro card costs twice as much as my entire render farm, but at the end of the day I'll still be waiting on the CPU of the machine with the high-end graphics card while the inexpensive render farm has completed the job days (yes, days!) ahead of the aforementioned "Pro" video card system.

Perhaps the Americans were correct when discussing CPU vs. GPU rendering power. Case in point: I have no GPU at all on 10 render nodes...


Ian Kirkpatrick
Last seen: 2 months 1 week ago
Joined: 07/06/2008 - 1:38am

Hate to get into such a well-developed thread, but I agree with Robert.

Desktop renders and final output renders are separate issues, unless you tick the "use preview files" box when doing a final output render. (I very rarely create preview files.)

The biggest advance for me was going 64-bit with Premiere Pro CS5.5: an i7 (8 cores) on Windows 7 and a couple of fairly old Nvidia cards.

I never need to render in the timeline unless I have more than about four active video tracks.
When rendering output to MPEG-2 Blu-ray or H.264 Blu-ray, rendering time is a little less than real time.

Cheers Ian



paulanderegg
Last seen: 1 year 4 months ago
Joined: 02/23/2012 - 7:37am

Well, I hadn't checked this post in about a month, and I ordered a custom Lenovo laptop, complete with their upgrade to a K2000M GPU. I really screwed up.

I bought the laptop with the basic single 4GB stick of RAM and paid about $250 extra for the upgraded GPU. I installed Vegas and rendered an AVI to MP4. My old first-gen i3 laptop took 1:31 to transcode 1:00 of video; my new third-gen i7 took 0:35 to do the same task in CPU-only mode. I then selected "use GPU," and the transcode took... 0:31.

Wow, all the cash I spent on a good GPU was wasted; 0:04 per minute of rendered video is nothing. And how did things work when I installed my speedy $120 16GB dual-channel RAM kit? 0:31, exactly the same as with the stock 4GB of RAM. So that's around $375 wasted on stuff I did not need. It would have been nice if Sony support had actually responded to my ticket about what specs work with Vegas; I could have spent that $375 on a higher-end CPU. I feel like a tool for listening to the countless people on the web proclaiming the necessity of RAM and GPU. Maybe this is just how Vegas is, and other programs actually use the GPU and RAM?
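
For anyone skimming, the speedups implied by those timings (seconds to transcode one minute of video):

    # Ratios computed directly from the figures quoted in this post.
    i3_old      = 91   # 1:31 on the first-gen i3
    i7_cpu_only = 35   # 0:35 on the third-gen i7, CPU only
    i7_with_gpu = 31   # 0:31 with GPU rendering enabled

    print(f"CPU upgrade: {i3_old / i7_cpu_only:.1f}x faster")      # ~2.6x
    print(f"GPU assist:  {i7_cpu_only / i7_with_gpu:.2f}x on top") # ~1.13x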

Paul Anderegg


Mr FPSDude
Last seen: 5 months 2 weeks ago
Joined: 06/23/2013 - 4:48pm

It's always disheartening to read volumes of advice from so-called experts only to find out it was completely wrong. "As an IT guy" blah blah: all that info was incorrect, and what's worse, the OP followed the advice and spent a good sum of money that did not help his cause :(

 

My CPU usage is confirmed via Windows Task Manager (I don't know why it was downplayed as inaccurate; it works perfectly for me), Corsair Link Commander, and a CPU usage logger. All three reported the exact same figures: 97%-100% CPU usage on all 8 cores during video rendering. I tested with Handbrake, Sony Vegas, and Windows Live Movie Maker, and all yielded the same result. The CPU is absolutely what does most of the work in rendering captured video, not the GPU; the GPU is what does most of the work rendering video live, i.e. video games, etc. Once the video is captured and you want to put it all together (which is what I believe the OP is talking about), it is your CPU that goes to work, with the GPU assisting. More cores = faster render times. Sorry to be coming into this so late; had I found this earlier, I would absolutely have suggested a nice quad core for fast video rendering. I process 17-minute captured videos at 1920x1080 and a 25Mbps bit rate in about 11 minutes. I use an AMD FX-8350 CPU, and all 8 cores are used during the rendering process.
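
To reproduce that per-core observation yourself, a short sampler (again assuming the third-party psutil package) prints one column per core while a render runs:

    # Ten one-second samples of per-core CPU usage; during a render on an
    # 8-core FX-8350 you would expect every column near 100%.
    import psutil

    for _ in range(10):
        per_core = psutil.cpu_percent(interval=1, percpu=True)
        print(" ".join(f"{p:5.1f}%" for p in per_core))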

 

Does this rag smell like chloroform?

MrFPSDude Gameplay Videos


David Harry
Last seen: 5 months 2 weeks ago
Joined: 10/08/2013 - 7:56am

If anyone is looking to deliver the absolute best quality H.264, you should use x264. It does not use the GPU; it is totally reliant on the CPU, and the faster the better. Even at faster render settings (lower quality) it is still better than most.

 

Examples of GUIs that use x264 are Handbrake and MeGUI.

 

Some encoders do use the GPU for encoding, such as those that use CUDA, etc.

 

There is no hard-and-fast choice between GPU and CPU with regard to rendering codecs; check first what your encoding software needs.
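
One way to check on your own machine is to time a CPU encoder against a GPU one. A rough sketch, assuming ffmpeg is on the PATH and built with NVENC support; "clip.avi" is a placeholder, and if either command errors, that encoder simply isn't available in your build:

    # Time libx264 (CPU) against h264_nvenc (Nvidia GPU) on the same clip.
    import subprocess
    import time

    def time_encode(codec: str, out: str) -> float:
        start = time.time()
        subprocess.run(["ffmpeg", "-y", "-i", "clip.avi", "-c:v", codec, out],
                       check=True)
        return time.time() - start

    print(f"libx264:    {time_encode('libx264', 'cpu.mp4'):.1f} s")
    print(f"h264_nvenc: {time_encode('h264_nvenc', 'gpu.mp4'):.1f} s")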

 

The main advantage of GPU power, of the right sort, is that it can double as an accelerator for the likes of Adobe software, or any post software that takes advantage of GPU power, CUDA being one example.