I never thought something as esoteric as frame rates would spur so much discussion. I guess it is because the different protocols provide a level of artistic control that was previously unobtainable.
To either clarify or mystify further – I’m not sure which – I’ll try to address some comments above. First off, thank you Mrkinyo for your kind words; I can sympathize with anyone who is confused when it comes to this sort of thing. As far as what I shoot, I shoot anything that moves. I mostly shoot events and performances, including weddings, sports, meetings, etc. I use several Sony HVR-Z1/Z5/Z7U series cameras.
It is unfortunate that the words “frame” and “field” both begin with the letter “f,” so the abbreviation FPS could mean either frames or fields per second. But here is the fact: 60i almost always means 60 interlaced fields per second. That is the same as 30 (actually 29.97, but the difference does not matter here) frames per second. Likewise, 30p means 30 progressive frames per second. A frame is a whole picture, whereas a field is half a picture – that’s why it takes two fields to make a frame. Interestingly, the raw data rate for the two standards is the same: every 1/30 of a second, all the pixels are transmitted, regardless of whether they are scanned progressively or interlaced. Of course, the compression method will change the actual data rate, but that is a different consideration.
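To make the arithmetic concrete, here is a quick back-of-the-envelope sketch in Python. The 720×480 frame size is just an illustrative assumption (standard-definition NTSC); the point is only that 60 half-pictures per second move the same number of raw pixels as 30 whole pictures per second.

```python
# Illustrative sketch: compare the raw (uncompressed) pixel rate of
# 60i (interlaced) versus 30p (progressive) at an assumed 720x480 frame size.
WIDTH, HEIGHT = 720, 480

# 60i: 60 fields per second, each field carrying half the scan lines.
fields_per_second = 60
pixels_per_field = WIDTH * (HEIGHT // 2)
rate_60i = fields_per_second * pixels_per_field

# 30p: 30 whole frames per second.
frames_per_second = 30
pixels_per_frame = WIDTH * HEIGHT
rate_30p = frames_per_second * pixels_per_frame

# Two fields make one frame, so 60 fields/s is 30 frames/s of picture,
# and the raw pixel rates come out identical.
print(rate_60i == rate_30p)            # True
print(fields_per_second // 2)          # 30
```

Compression (HDV, AVCHD, etc.) will of course make the delivered bit rates differ, but the uncompressed pixel counts match.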
I have read blogs where the writer seemed to attach some special or almost magical importance to 24 frames per second. There isn’t any. Originally, when movies were silent, the frame rate was 16 fps. When sound came along, it was found that a faster film speed was needed to record the optical sound track. The speed was raised to 24 fps because it was the lowest film speed that allowed reasonable optical sound recording.
For TV, 30 frames per second was selected back in 1945 because it reduced the effect of 60 Hz hum bars caused by 60 Hz ripple in the power supplies of early TVs. Interlacing was selected because early TV screens (cathode ray tubes, or CRTs) showed too much flicker when progressively scanned.
Anyway, my point is: shoot whatever standard appeals to you and suits your type of work. But remember that all analog TVs and the majority of digital TVs default to the 60i standard. That means no matter what you shoot at, the viewer is seeing 60i. This is not true for video dubbed to film or for video displayed on a computer or special editing devices.