
Difference between 1080i and 1080p?

Joseph DeMartino

Senior HTF Member
Joined
Jun 30, 1997
Messages
8,311
Location
Florida
Real Name
Joseph DeMartino
About 7 letters.


But seriously folks...

The letters represent "interlaced" and "progressive". With an interlaced signal the TV displays 30 video frames per second. Each frame is made up of two fields, each of which is displayed for 1/60th of a second. The first field is made up of half the horizontal scan lines that make up the image (I forget whether the odd ones or the even ones are displayed first). The lines are displayed in order from the top of the screen to the bottom, something like this:

11111111111
xxxxxxxxxxxxx
33333333333
xxxxxxxxxxxxx
55555555555

As soon as the last line of field one is displayed the first line of field 2 starts near the top of the screen and the process repeats:

xxxxxxxxxxxxxx
222222222222
xxxxxxxxxxxxxx
444444444444
xxxxxxxxxxxxxx
666666666666

Until the entire 1080-line frame is drawn. Because all of this happens so fast, we see a single image 30 times every second. Many (but not all) people detect a slight "flicker" in the image, which can be annoying.
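
If it's easier to see in code, here's a rough Python sketch of the same bookkeeping (purely illustrative, nothing like what the hardware actually does): one 1080-line frame splits into two 540-line fields, and each field gets 1/60th of a second on screen.

Code:
# Rough sketch of interlaced scanning (illustrative only, not real TV signal code).
# A "frame" here is just a list of 1080 scan lines, numbered 1..1080.

FRAME_RATE = 30                       # NTSC-style interlaced video: 30 frames per second
FIELD_TIME = 1 / (FRAME_RATE * 2)     # each field is on screen for 1/60th of a second

frame = [f"line {n}" for n in range(1, 1081)]

# Field 1 holds every other line (here the odd-numbered ones),
# field 2 holds the lines in between (the even-numbered ones).
field_1 = frame[0::2]                 # lines 1, 3, 5, ... 1079
field_2 = frame[1::2]                 # lines 2, 4, 6, ... 1080

print(len(field_1), len(field_2))                    # 540 540 -- half the frame each
print(f"each field is shown for {FIELD_TIME:.4f} s") # 0.0167 s, i.e. 1/60th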

Until fairly recently all televisions and most computer monitors used interlaced displays. The whole NTSC television standard was based on interlaced images. It was a way to squeeze more apparent resolution out of the available technology.

Non-interlaced or progressive scan systems were first introduced in the computer world, where the relatively static nature of most images made the flicker more noticeable and where it caused fatigue for people staring at the screen close-up for hours at a time. Monitors capable of both interlaced and progressive display were developed (called multiscan or multisync), as were video cards that could support various resolutions and interlacing schemes.

CRTs use a scanning beam of electrons to produce an image on the screen, which is one reason why interlacing was both possible and desirable. Direct-view, rear-projection and front-projection CRT systems can display interlaced images. Older SD sets can only display interlaced. New EDTV and HDTV sets can display both progressive and interlaced images, although some older HD sets top out at 1080i. Some early plasmas also display interlaced images. But most new plasmas and all LCD, DLP and LCoS screens are "fixed-pixel" designs. TVs and projectors based on these technologies are inherently progressive and incapable of displaying an interlaced signal. If fed one, they de-interlace the image by merging the two fields and display the entire 1080p frame at once.
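
Conceptually, that de-interlacing step is just "weaving" the two fields back together in line order. Another rough, purely illustrative Python sketch:

Code:
# Rough sketch of "weave" de-interlacing: put the two fields back into line order
# so a fixed-pixel display can light up the whole 1080-line frame at once.

field_1 = [f"line {n}" for n in range(1, 1081, 2)]   # 540 odd-numbered lines
field_2 = [f"line {n}" for n in range(2, 1081, 2)]   # 540 even-numbered lines

def weave(first_field, second_field):
    """Interleave two 540-line fields into one 1080-line progressive frame."""
    frame = []
    for line_a, line_b in zip(first_field, second_field):
        frame.append(line_a)
        frame.append(line_b)
    return frame

progressive_frame = weave(field_1, field_2)
print(len(progressive_frame))        # 1080
print(progressive_frame[:4])         # ['line 1', 'line 2', 'line 3', 'line 4']

Real de-interlacers have a harder job than this, because the two fields of a moving scene were captured 1/60th of a second apart and don't line up perfectly.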

Progressive-scan images avoid the flicker of interlaced ones, and are generally perceived as smoother. 1080p content is only available on Blu-ray discs and via computer files at this time. The maximum resolution supported by the DTV standard is 1080i - although, again, if you have a 1080p fixed-pixel display, the set is going to de-interlace it to 1080p anyway.

Does this help?

Regards,

Joe
 

Mike Williams

Screenwriter
Joined
Mar 3, 2003
Messages
1,019
With interlaced, each field is displayed for half a FRAME, not half a SECOND.

We see a single image every frame or 1/30th of a second, not a single image every 30 seconds. Wow!

Hope this helps a little more than the information above.

Again, wow!
 

Joseph DeMartino

Senior HTF Member
Joined
Jun 30, 1997
Messages
8,311
Location
Florida
Real Name
Joseph DeMartino
Wow! A typo! Wow! Imagine that!


As I correctly said at the outset, "With an interlaced signal the TV displays 30 video frames per second." Yes, I had a brain fart when typing out a couple of later references to the rate, but they were both pretty obvious typos.


Um, NO, because a "frame" is not a unit of time. Each field contains half the information in a frame, and is displayed for half the amount of time that a frame is displayed (1/60th of a second for 30 fps NTSC, 1/50th of a second for 25 fps PAL, etc.).
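
The arithmetic, spelled out (illustrative only):

Code:
# Field duration is always half the frame duration, whatever the frame rate.
for name, fps in [("NTSC", 30), ("PAL", 25)]:
    frame_time = 1 / fps
    field_time = frame_time / 2
    print(f"{name}: frame = 1/{fps} s, field = 1/{fps * 2} s = {field_time:.4f} s")
# NTSC: frame = 1/30 s, field = 1/60 s = 0.0167 s
# PAL:  frame = 1/25 s, field = 1/50 s = 0.0200 s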

So do you really think that your post, which is a) inaccurate and b) doesn't actually explain anything, is more helpful than mine, which describes the difference in detail despite a couple of minor typos (since corrected)? Seriously?

Again, wow! :D

Joe
 

Bob_L

Supporting Actor
Joined
May 19, 2001
Messages
895
Real Name
Bob Lindstrom
Joe's explanation is excellent. To my mind, the most significant point he makes is that digital display devices--like plasma, LCD, and LED HDTVs--are inherently progressive, so any 1920x1080 resolution device is going to display in 1080p, whether fed native 1080p or interlaced 1080i content. (It will de-interlace the 1080i content internally.) The marketeers have made 1080p a big buzzword, but it really isn't all that significant when dealing with digital displays.
 

Mr. Pacino

Second Unit
Joined
Jun 29, 2008
Messages
344
Real Name
Nico
I always watch Blu-rays in 1080i because my TV can't play 1080p.

Can we see the difference between 1080i and 1080p with the naked eye?
 

Stephen_J_H

All Things Film Junkie
Senior HTF Member
Joined
Jul 30, 2003
Messages
7,893
Location
North of the 49th
Real Name
Stephen J. Hill
Unless your display is 1080i (i.e. a CRT HD display), you will be viewing a progressive signal anyway, just scaled down to the resolution of your screen, e.g. 1366 x 768, 1280 x 720 or whatever. LCDs, plasmas and DLPs will not display an interlaced signal.
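
Back-of-the-envelope, the set's scaler is doing something like this for the two common panel sizes mentioned above (an illustrative sketch, not the actual scaling algorithm):

Code:
# Back-of-the-envelope scaling: map a 1920x1080 source onto a smaller panel.
SOURCE = (1920, 1080)

for panel in [(1366, 768), (1280, 720)]:
    sx = panel[0] / SOURCE[0]
    sy = panel[1] / SOURCE[1]
    print(f"{SOURCE} -> {panel}: scale x = {sx:.3f}, y = {sy:.3f}")
# (1920, 1080) -> (1366, 768): scale x = 0.711, y = 0.711
# (1920, 1080) -> (1280, 720): scale x = 0.667, y = 0.667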
 

Michael Allred

Screenwriter
Joined
Aug 13, 2000
Messages
1,720
Location
MI
Real Name
Michael
I have an LCD HDTV and a Blu-ray player, connected with an HDMI cable. When I hit the display button for the BD player it says "1080/60i", so there's an "i" after the 60 but no letter after the 1080. What does this mean?
 

Matt DeVillier

Supporting Actor
Joined
Sep 3, 1999
Messages
773

It means your Blu-ray player is feeding the LCD a 1080i signal at 60Hz. The LCD is then converting that to 720p/768p @ 60Hz. You would be much better off setting your Blu-ray player to output 720p, to avoid unnecessary format conversions.
 

Mike Williams

Screenwriter
Joined
Mar 3, 2003
Messages
1,019
Whether a frame is a unit of time or not, a field is still half a frame, as it always takes two fields to make a frame. With interlace, each field is displayed for half of the frame. With progressive, both fields are read and displayed at the same time for the entire length of the frame. To make your point, you quoted my line, which stated:

With interlaced, each field is displayed for half a FRAME, not half a SECOND.
 

Steve Schaffer

Senior HTF Member
Joined
Apr 15, 1999
Messages
3,756
Real Name
Steve Schaffer

Since the native resolution of "720p" sets is almost always 1366x768, the set must do a conversion of incoming 720p or 1080i. Since BD is encoded at 1080p, the player is doing a conversion to output 1080i to sets that won't accept 1080p, just as it's doing a conversion to 720p if manually set to do so.

Due to variations in the quality of conversion done by both the player and the TV, it's not really safe to say that picture quality would always be better with the player set to output 720p, even though that's closest to the set's native resolution. In the past, owners of 720p sets have sometimes reported better picture quality with some players outputting 1080i rather than 720p, so it's perhaps best to experiment with both output settings to see what looks better with one's own player/TV combination.
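
Just to lay the two chains out side by side (the step labels are mine and purely illustrative; the actual picture quality depends on how well the player and the TV do each step, not on the count):

Code:
# Two possible signal chains for a 1080p24 Blu-ray on a 1366x768 panel
# (illustrative only; quality depends on how well each box does each step).

chains = {
    "player outputs 1080i": [
        "disc: 1080p24",
        "player: re-interlace to 1080i60",
        "TV: de-interlace + scale to 1366x768",
    ],
    "player outputs 720p": [
        "disc: 1080p24",
        "player: scale + convert to 720p60",
        "TV: scale 720p up to 1366x768",
    ],
}

for setting, steps in chains.items():
    print(setting)
    for step in steps:
        print("  ", step)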
 

Matt DeVillier

Supporting Actor
Joined
Sep 3, 1999
Messages
773

Unless the player has an absolutely horrendous scaler, allowing the player to convert from 1080p24 to 720p60 will be better. I realize there is anecdotal evidence to the contrary, but in my experience that is almost always due to excessive artifacting from the multiple conversions giving a false impression. Maintaining a progressive signal throughout the processing chain is going to yield the most accurate picture 99% of the time. But I agree, in the end most users should go with whatever method their eyes favor.
 

Hartwig Hanser

Second Unit
Joined
Oct 9, 1998
Messages
301

De-interlacing is not so easy, and de-interlacers make errors. It is the same as with DVDs, which are 480i: if the de-interlacer makes an error, you will see jaggies or other artifacts. My Sharp Blu-ray player, for example, is not very good at de-interlacing 1080i material, and ditto my Panasonic AE2000 projector. So, in many cases, 1080p material will give you better picture quality, although the difference may be small.
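
A toy illustration of why a naive de-interlace goes wrong on motion: the two fields are captured 1/60th of a second apart, so anything moving has shifted between them, and weaving them straight back together puts a jagged "comb" on every moving edge.

Code:
# Toy example of why naive "weave" de-interlacing produces combing/jaggies:
# the two fields are captured 1/60 s apart, so a moving object shifts between them.

WIDTH = 8

def field(dot_col):
    """One 4-line field with a bright 'dot' at dot_col in every line."""
    return ["".join("#" if c == dot_col else "." for c in range(WIDTH)) for _ in range(4)]

field_1 = field(2)   # object at column 2 when field 1 was captured
field_2 = field(3)   # it has moved to column 3 by the time field 2 was captured

# Weave the fields back together: the dot zig-zags between columns 2 and 3,
# which is exactly the jagged "combing" artifact you see on moving edges.
for odd_line, even_line in zip(field_1, field_2):
    print(odd_line)
    print(even_line)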
 

Michael Allred

Screenwriter
Joined
Aug 13, 2000
Messages
1,720
Location
MI
Real Name
Michael
As I recall, when I first got my Blu-ray player, I selected 1080p in the player's settings and my TV screen went black/blank, and I had to reset the settings to get a picture back. I'm assuming that's because my HDTV cannot display 1080p?
 

Matt DeVillier

Supporting Actor
Joined
Sep 3, 1999
Messages
773

Not only can it not display 1080p, it doesn't accept 1080p as an input either (some 720p sets will accept 1080p and scale it down to 720p). I'd set the player output to 720p.
 

Cees Alons

Senior HTF Member
Joined
Jul 31, 1997
Messages
19,789
Real Name
Cees Alons
The Sony LCD HDTV KDL-32M3000 has a native screen resolution of 1366x768 pixels. The HDMI connection will tell the player which signal to choose.

In principle, 1080i carries the same resolution as 1080p, and the circuitry just has to make sure that each pixel ends up in its proper place (which is a rather simple task).

Problem is: some de-interlacers in players or video processors do a more complicated job (e.g. softening transitions to avoid jaggies in TV-recorded material), which may slightly degrade the image. The TV sets will generally not do the latter.

In your case, the TV set will have to convert the 1080-line image to the screen's resolution, and will choose a vertical resolution of 720 lines. The fact that it doesn't accept 1080p stems from the lack of an input circuit that processes 1080p, that's all. A few years ago there simply wasn't a need for 1080p input circuitry yet.
(Until well into 2006, many HDTV sets were manufactured with an additional circuit that converted 1080p to 1080i first, so the "normal" rest of the circuitry would function and the set could accept 1080p and be advertised loudly as such.)

Cees
 

ManW_TheUncool

His Own Fool
Premium
Senior HTF Member
Joined
Aug 18, 2001
Messages
11,960
Location
The BK
Real Name
ManW
Just try both and see for yourself which works better.

In theory, if the various processing algorithms (on both TV and player) perform well, I'd expect you to get marginally better results most of the time by sending 1080i to the TV and letting the TV downconvert to its native resolution, which probably is *not* 720p (as noted by a couple of others). This is because each scaling of the image will generally introduce some scaling artifacts (though you may or may not notice them), and having the player downconvert to 720p and then *still* having the TV scale to its native res will probably mean one extra such conversion (most of the time).

However, in actual practice, maybe none of the slight PQ difference will matter more to you than the extra inconvenience of forcing some setting other than "auto". ;) :D After all, what are we talking about here, a 32" LCD viewed from how far away? I'd think you should worry more about getting the (LCD, of all things :P) display properly calibrated than about this scaling issue. ;) :D

_Man_
 
