Discussion in 'Archived Threads 2001-2004' started by Mike Bledsoe, Jun 7, 2002.
Does anyone know why film and video have different frame rates?
The answer usually given is that 30fps (60 interlaced fields) is more compatible with our 60Hz power. However, I believe there is more to it than that.
TVs are clearly influenced by AC frequency. In the US, with 60Hz electricity, you get 60 fields per second; with 50Hz, as in Europe, you get 50 fields. A better question would be why 60 and 50, and why not the same? At the time, they probably didn't even consider putting movies on TV -- who'd want to see movies on those then-tiny black-and-white screens? (I wonder when 2:3 pulldown was invented.)
Film evolved along a totally different path to arrive at 24. And with film, increasing the frame rate means using more film stock and changing reels more often, so there's plenty of reason not to make it faster than necessary. This has sort of come full circle with the advent of "24P" HD video: 1080-line HD at 24 progressive frames per second. The data can be automatically converted to 30fps by the display. Using 24 frames for video means 20% fewer frames, which means less storage space, less bandwidth, fewer frames to process, etc. -- and really easy conversion to film.
24 fps is also just above the perceived flicker rate. Higher film frame rates make the image even better: Showscan ran at 60 fps, and the five Cinerama travelogues and early Todd-AO at 27 fps.
One question Mike might be asking is why film *stays* at 24fps. Cinerama was 26fps, Todd-AO was 30fps, and Showscan a whopping 60fps. These formats, Showscan in particular, greatly benefited from the increased frame rate. One reason 24fps persists is compatibility: almost all theater projectors are designed to run at only 24fps. Then there's the additional cost of more film, from the initial shooting through to projection.
Minor cost increases and slight projector modifications aside, there's another reason: it's been common practice for a few generations now to shoot film at 24fps (even for TV shows, from the original 1960s Star Trek to today's X-Files episodes). People in North America and Japan, using the 60-field NTSC system, are accustomed to seeing 3/2 pulldown applied to 24fps film; many consider it part of the "film" look. Is that right? No -- it's still a distortion applied to fit 24 frames into NTSC's 60 fields, but people are used to it. Ditto for 50-field PAL in Europe, where 24fps film is slightly sped up to 25fps to fit the PAL format. Is that right? Nope, but again, that's what Europe is used to. A 30fps film, say, would look more like video to many North Americans (not all, remembering the true Home Theater Buffs), and Europeans would see new artifacts never seen before, because the 30fps film would need to be broken down into 50 fields for PAL.
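To make the 3/2 pulldown idea concrete, here's a minimal sketch of the frame-to-field mapping (the function name and structure are illustrative, not from any real video library): film frames alternately contribute 2 and 3 fields, so every 4 film frames become 10 fields, stretching 24 frames/s to exactly 60 fields/s.

```python
def three_two_pulldown(frames):
    """Map film frames to interlaced fields using 3/2 pulldown.

    Frames alternately contribute 2 and 3 field copies, so 4 film
    frames become 10 fields -- i.e. 24 frames/s -> 60 fields/s.
    """
    fields = []
    for i, frame in enumerate(frames):
        copies = 2 if i % 2 == 0 else 3  # alternate 2, 3, 2, 3, ...
        fields.extend([frame] * copies)
    return fields

# Four film frames A B C D become ten fields: A A B B B C C D D D
print(three_two_pulldown(["A", "B", "C", "D"]))
```

The uneven repetition is exactly the "distortion" described above: some frames linger on screen for 3 field times and others for only 2, producing the familiar pulldown judder. PAL avoids it by simply running the 24fps film 4% fast at 25fps, one frame per two fields.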
Summary: More frames per second: GOOD.
Chances of a major 30 or 60fps film: SLIM.