Ran across this post on 1080P and it has some validity

ralphPerez

Stunt Coordinator
Joined
Dec 24, 2001
Messages
88
Ran across this posting, and it has some validity.

NO DIFFERENCE between 1080p or 1080i for movies, because movies are only shot in 24 FRAMES PER SECOND. Think of it this way: movies have 24 images to show each second; 1080p uses these 24 frames to refresh the image 60 times every second (at 60hz, refreshing all the lines each time)--1080i uses these same 24 frames to refresh the image 30 times every second (at 60hz, refreshing 1/2 the lines each time). So, the rate of 30 times per second that 1080i uses is more than fast enough to show all 24 frames--all 1080p does is show the exact same frames multiple times per second--there is nothing additional, new, or different for 1080p to display that is not already being shown on 1080i. Also, keep in mind that LCD displays cannot display an interlaced signal--THEY AUTOMATICALLY DE-INTERLACE THE SIGNAL AND "PAINT" IT ON THE SCREEN IN PROGRESSIVE FASHION, LINE BY LINE!
So, by the time you see a 1080i signal displayed on your LCD flatscreen, it is a 24 frame per second movie source that has been taken off the disc, converted to a 60 hz interlaced signal by the disc player, de-interlaced by the display, and then displayed progressively at 60hz. In this case, the display automatically duplicates the original 24 frames, sent as 60 "half pictures" into 60 whole pictures. The difference for a 1080p disc player is that it is the disc player, not the display, that duplicates the original 24 frames into 60 whole pictures, which it sends to the display, and which the display then directly displays without any further processing.
So, for movies, which only have 24 frames per second to be shown, it is just a matter of which component is going to do the duplication work to make the 60 whole pictures which are refreshed on the display at 60 hz--the disc player only (1080p), or the disc player + the display (1080i). In either case, there is nothing to be seen on 1080p that is not being seen on 1080i. The only time it would make a difference would be if the original source had more than 30 frames per second to show, in which case an interlaced signal would begin to lag behind a progressive one--but Hollywood only shoots in 24 frames per second--this is what you are seeing in a theater with a projector--and so interlaced signals, which max out at 30 whole pictures per second, are more than fast enough to display all the frames each second from the original film source.
You don't see any choppiness or image artifacts on a screen in a theater, right? That's because the 24 frames per second that are running by the projector bulb are plenty fast enough to fool the human eye into seeing fluid motion. Well, 1080i is even faster than that--the whole picture is refreshed 30 times per second--it's impossible for the original 24 frames per second to be negatively affected by an interlaced signal, especially when it is automatically de-interlaced by the LCD display--what you see on 1080i movies is exactly the same 24 frames that were originally in celluloid on the film reels--progressive can do nothing to improve the appearance of these 24 frames.
Think of it like 2 slide projectors: say you have 24 slides that you want to show in one second (like the 24 frames per second that are sitting on your HD DVD or Blu Ray). However, one slide projector forces you to switch slides 30 times each second (equivalent to a 1080i signal refreshing half an image each time at 60hz = 30 refreshings of the whole image each second). In this case you would want to make duplicates of 6 of the slides and put them in the carousel so that you didn't have blank images showing up six times after you showed the original 24--thus giving you a smooth progression of pictures with the 6 duplicates put in. Now, say your second slide projector forces you to switch slides not 30 but 60 times per second (equivalent to a progressive signal which refreshes the whole image 60 times per second). In this case you would have to make not just 6 but 36 duplicate slides from the original slides and put them in the carousel so that you would not have blank images showing up 36 times each second. The result that you get with either projector is that the same 24 images are shown either on a total of 30 or 60 slides--when you then watch the two slide shows, your eye sees nothing different--all it sees are 24 different images in one second. In either case, there are only 24 original images, and in both cases all 24 images are shown in one second in a smooth display--thus there is no difference whatsoever in what you are looking at on your display whether the disc player is 1080i or 1080p--all 24 original frames are being shown in their entirety each second, with their original resolution and clarity.
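The duplicate-count arithmetic in the slide-projector analogy can be checked in a few lines of Python. This is a toy sketch, nothing player-specific, and it front-loads the extra repeats for simplicity (real telecine alternates them in a 2:3:2:3 cadence):

```python
# Map 24 film frames onto 30 or 60 display refreshes per second.
# repeats[i] = how many times film frame i is shown.

def repeat_pattern(film_fps, display_fps):
    """Distribute display_fps refreshes across film_fps source frames.
    Simplified: the first `extra` frames get one extra repeat, whereas
    a real telecine cadence alternates them (2:3:2:3...)."""
    base, extra = divmod(display_fps, film_fps)
    return [base + 1 if i < extra else base for i in range(film_fps)]

p30 = repeat_pattern(24, 30)   # 1080i: 30 whole pictures/sec
p60 = repeat_pattern(24, 60)   # 1080p: 60 whole pictures/sec

print(sum(p30) - 24)  # 6 duplicate "slides" needed
print(sum(p60) - 24)  # 36 duplicates needed
```

Either way the same 24 source images appear each second; only the number of duplicates differs.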
Hope this helps--I agonized over whether to spend $50 more to get a 1080p until I finally figured out that movies are only 24 frames per second and aren't affected by an interlaced signal displayed on LCDs. I bought a 1080i player, and it shows a perfect picture on my 42" 1920 x 1080 LCD. I've never noticed even the slightest jitter or imperfection on any picture played from my HD DVDs, and I would challenge anyone to show that a 1080p player could play any of these discs with any aspect of picture quality better than my 1080i player on my LCD.
__________________
Ralph
 

Jeff Gatie

Senior HTF Member
Joined
Aug 19, 2002
Messages
6,531
It is really irrelevant because the number of true 1080i displays is very small and getting smaller. Almost all displays (CRTs being the exception) are now progressive fixed pixel 720p/1080p displays, so the question comes down to the need for higher resolution and/or display quality (including the quality of the scaling/deinterlacing). The fact these fixed pixel TVs will accept a 1080i source means nothing; the 1080i source is either scaled (720p) or deinterlaced (1080p) to the display's native resolution.

Basically, the days of arguing interlaced vs. progressive displays at any resolution are just about over. As far as arguing 1080i vs 720p sources goes, a 1080i source will translate better to a 1080p TV; the same holds true for a 720p source on a 720p display. 720 on 1080 and vice versa depends on the quality of the scaler.
 

chuckg

Supporting Actor
Joined
Apr 27, 2004
Messages
921
While I don't disagree on the subject (that there is essentially no difference between interlaced and progressive image quality) I would say that the explanation is wrong, confusing, and long winded.

A true interlaced signal shows two half-images, separated in time by 1/60th of a second. On your good aulde analog TV, if you took the two half-frames and displayed them at the same time, you would see that an object moving from left to right is distinctly odd looking: broken into stripes that are misaligned L-to-R.
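Those misaligned stripes (the "combing" artifact) are easy to picture with a toy example: weave together two fields of a block that moved right between the two field captures. This is purely illustrative; the raster size and positions are made up.

```python
# Weave two temporally offset fields of a moving block and the edges
# break into misaligned stripes (combing). Hypothetical raster.

def field_row(x, width=12):
    """A row with a 4-character block starting at column x."""
    return "." * x + "XXXX" + "." * (width - x - 4)

frame = []
for row in range(6):
    # Even rows come from field 1 (block at column 2); odd rows from
    # field 2, captured 1/60 s later, after the block moved to column 5.
    x = 2 if row % 2 == 0 else 5
    frame.append(field_row(x))

print("\n".join(frame))
# ..XXXX......
# .....XXXX...
# (alternating: the block's edges are broken into misaligned stripes)
```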

Also, not all TV sets refresh at 60 Hz. Many do 30 Hz progressive scan, and must do a 3:2 pulldown to display movies shot at 24 fps. A 60 Hz set can just display the same frame of film multiple times, but a 30 Hz set shows one frame 3 times, the next 2 times, etc. Or something like that.
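The classic 3:2 (strictly 2:3) pulldown that turns 24 film frames into 60 interlaced fields can be sketched like this. A toy model, not any player's actual pipeline:

```python
# 2:3 pulldown: 24 film frames/sec -> 60 interlaced fields/sec.
# Film frames alternately contribute 2 fields and 3 fields.

def telecine_32(frames):
    fields = []
    for i, frame in enumerate(frames):
        n = 2 if i % 2 == 0 else 3      # 2,3,2,3,... cadence
        for _ in range(n):
            parity = "top" if len(fields) % 2 == 0 else "bottom"
            fields.append((frame, parity))
    return fields

fields = telecine_32(["A", "B", "C", "D"])
print(len(fields))             # 4 frames -> 10 fields (one full cadence cycle)
print([f for f, _ in fields])  # ['A','A','B','B','B','C','C','D','D','D']
```

Scaled up, 24 frames yield exactly 60 fields per second, which is why the cadence must be undone (inverse telecine) to recover clean film frames.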

The final answer is that you probably can't see the difference in interlaced versus non-interlaced, as long as all the conversions and resampling are done properly.
 

Stephen Tu

Screenwriter
Joined
Apr 26, 1999
Messages
1,572
That article doesn't really paint the complete picture. In theory, if a 1080p TV does proper film-mode cadence detection + deinterlacing of a 1080i source, then indeed it doesn't make any difference whether you feed it 1080p or 1080i from a film-sourced disc. But there have been various tests at places like CNET & Home Theater Magazine using the HQV test disc that show that many/most older 1080p sets aren't doing this (using less accurate video-mode deinterlacing instead), so in practice sending 1080i loses something on these sets. Now, it only makes a significant difference in scenes with fine vertical detail combined with motion (usually a slow pan; with fast motion it's hard to notice loss of resolution), so probably most people shouldn't care too much. But a video purist would want to use a player with 1080p out in case the processing in their TV isn't up to snuff. Also, 1080p24 can be better for TVs with 120 Hz refresh, avoiding the judder caused by the 3:2 frame-repetition pattern.
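The film-mode cadence detection mentioned above boils down to spotting the repeated-field signature that 3:2 pulldown leaves in the stream. Here is a toy sketch of the idea, in no way any real TV's algorithm:

```python
# Toy film-mode (3:2 cadence) detector. Hypothetical illustration only.

def telecined(frames):
    # 2:3 pulldown: frames alternately yield 2 then 3 fields. Field
    # parity alternates top/bottom, so the 3rd field of a 3-field
    # frame duplicates its 1st field (same content, same parity).
    fields, n = [], 2
    for frame in frames:
        for _ in range(n):
            fields.append((frame, len(fields) % 2))  # (content, parity)
        n = 5 - n                                    # 2,3,2,3,...
    return fields

def looks_like_film(fields):
    # In a clean 3:2 stream, a field matches the field two positions
    # back exactly once per 5-field group; true video sources don't.
    dupes = [i for i in range(2, len(fields)) if fields[i] == fields[i - 2]]
    return len(dupes) > 1 and all((j - i) % 5 == 0
                                  for i, j in zip(dupes, dupes[1:]))

print(looks_like_film(telecined(list("ABCDEFGH"))))  # True: film cadence found
```

A set that detects this pattern can weave the right field pairs back into full 1080-line film frames; one using video-mode deinterlacing instead throws away vertical resolution, which is exactly what the HQV-style tests measure.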
 

Allan Jayne

Senior HTF Member
Joined
Nov 1, 1998
Messages
2,405
I don't think there are any 30 Hz progressive TV sets although there are 30 Hz progressive video signal transmission formats.

60 Hz TV's also require 3-2 pulldown of 24 Hz (fps) movie source material. There are a few 72 Hz TV's that use 3-3 pulldown for film material and eliminate the judder seen in 3-2 pulldown material. If the film material came in as 60 Hz with 3-2 pulldown already in it, the 72 Hz TV must pro-actively dissect the material to create an accurate 3-3 pulldown from it.

35mm movie projectors actually give two short flashes instead of one long flash of each of the 24 frames per second, effectively delivering 48 Hz.

Not all 120 Hz TV sets eliminate the judder of 24 Hz 3-2 pulldown material. Simply using each incoming frame of 60 Hz material twice will yield 6-4 pulldown if the material was originally from 24 fps film.
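The 6-4 vs. 5-5 distinction is just arithmetic on the repeat counts. A quick sketch (illustrative only):

```python
# 120 Hz sets and 24 fps film: doubling each incoming 60 Hz frame
# preserves the uneven 3-2 cadence as 6-4, while a set that finds the
# original film frames can show each one an even 5 times.

pulldown_60 = [3, 2] * 2             # four film frames A,B,C,D at 60 Hz

doubled_120 = [n * 2 for n in pulldown_60]
print(doubled_120)                   # [6, 4, 6, 4] -> judder remains

smooth_120 = [120 // 24] * 4
print(smooth_120)                    # [5, 5, 5, 5] -> even 5-5 pulldown
```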

Video hints: Line Doublers and De-Interlacers
 

Leo Kerr

Screenwriter
Joined
May 10, 1999
Messages
1,698
Which actually leads to the question: since LCD and Plasma displays aren't tied to a scanning electron beam, why can't they say, "oh, a 24Hz progressive film source! Let's run at 24P?"

In the dark ages of CRTs, that would have been horrible; most phosphors would have gone completely dark by the time the refresh came through. But now? Not a problem with modern display tech. Or is it only the whole "we must show higher numbers!" for the marketing perspective?

Leo
 

Allan Jayne

Senior HTF Member
Joined
Nov 1, 1998
Messages
2,405
It takes extra processing, a lot of it, to say "oh, it's a 24 fps film source," unless the incoming video format is one of the newer ones, namely with exactly 24 frames per second. (Such processing is known as inverse telecine or 3-2 pulldown recognition.)

It takes extra processing to be able to run at 24 fps in addition to the 60 fps needed for live video.
 
