
Difference between 1080i and 1080p?


20 replies to this topic

#1 of 21 OFFLINE   Michael Allred

Michael Allred

    Screenwriter



  • 1,718 posts
  • Join Date: Aug 13 2000
  • Real Name: Michael
  • Location: MI

Posted April 19 2009 - 07:09 AM

Can anyone tell me?

#2 of 21 OFFLINE   Joseph DeMartino

Joseph DeMartino

    Lead Actor



  • 8,308 posts
  • Join Date: Dec 31 1969
  • Real Name: Joseph DeMartino
  • Location: Florida

Posted April 19 2009 - 07:32 AM

About 7 letters.

But seriously folks...

The letters stand for "interlaced" and "progressive". With an interlaced signal the TV displays 30 video frames per second. Each frame is made up of two fields, each of which is displayed for 1/60th of a second. The first field is made up of half of the horizontal scan lines that make up the image (I forget whether the odd or the even lines are displayed first). The lines are displayed in order from the top of the screen to the bottom, something like this:

11111111111
xxxxxxxxxxxxx
33333333333
xxxxxxxxxxxxx
55555555555

As soon as the last line of field one is displayed the first line of field 2 starts near the top of the screen and the process repeats:

xxxxxxxxxxxxxx
222222222222
xxxxxxxxxxxxxx
444444444444
xxxxxxxxxxxxxx
666666666666

Until the entire 1080 line frame is drawn. Because all of this happens so fast, we see a single image 30 times every second. Many (but not all) people detect a slight "flicker" in the image, which can be annoying.
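Joe's diagram can be turned into a tiny sketch. This is a toy model only (short lists of labeled lines standing in for real 1080-line video), but it shows how the odd and even lines split into two fields and weave back into one progressive frame:

```python
# Toy model of interlacing: a "frame" here is just a list of scan lines.

def split_into_fields(frame):
    """Split a progressive frame into its two fields: field 1 gets lines
    1, 3, 5, ... and field 2 gets lines 2, 4, 6, ..."""
    return frame[0::2], frame[1::2]

def weave(field1, field2):
    """Re-interleave the two fields back into one full frame (this merge is
    also what the simplest deinterlacer in a fixed-pixel set does)."""
    frame = []
    for odd_line, even_line in zip(field1, field2):
        frame.extend([odd_line, even_line])
    return frame

frame = ["line1", "line2", "line3", "line4", "line5", "line6"]
f1, f2 = split_into_fields(frame)
assert f1 == ["line1", "line3", "line5"]   # drawn first, for 1/60 s
assert f2 == ["line2", "line4", "line6"]   # drawn second, for 1/60 s
assert weave(f1, f2) == frame              # two fields = one frame
```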

Until fairly recently all televisions and most computer monitors used interlaced displays. The whole NTSC television standard was based on interlaced images. It was a way to squeeze more apparent resolution out of the available technology.

Non-interlaced or progressive scan systems were first introduced in the computer world, where the relatively static nature of most images made the flicker more noticeable and where it caused fatigue for people staring at the screen close-up for hours at a time. Monitors capable of both interlaced and progressive display were developed (called multiscan or multisync), as were video cards that could support various resolutions and interlacing schemes.

CRTs use a scanning beam of electrons to produce an image on the screen, which is one reason why interlacing was both possible and desirable. Direct-view, rear-projection and front-projection CRT systems can display interlaced images. Older SD sets can only display interlaced. Newer EDTV and HDTV sets can display both progressive and interlaced images, although some older HD sets top out at 1080i. Some early plasmas also display interlaced images. But most new plasmas and all LCD, DLP and LCoS screens are "fixed-pixel" designs. TVs and projectors based on these technologies are inherently progressive and incapable of displaying an interlaced signal. If fed one, they de-interlace the image by merging the two fields and display the entire 1080p frame at once.

Progressive-scan images avoid the flicker of interlaced ones and are generally perceived as smoother. 1080p content is only available on Blu-ray discs and via computer files at this time. The maximum resolution supported by the DTV standard is 1080i - although, again, if you have a 1080p fixed-pixel display, the set is going to de-interlace the signal to 1080p anyway.

Does this help?

Regards,

Joe

#3 of 21 OFFLINE   Mike Williams

Mike Williams

    Screenwriter



  • 1,020 posts
  • Join Date: Mar 03 2003

Posted April 19 2009 - 12:23 PM

With interlaced, each field is displayed for half a FRAME, not half a SECOND. We see a single image every frame or 1/30th of a second, not a single image every 30 seconds. Wow! Hope this helps a little more than the information above. Again, wow!

#4 of 21 OFFLINE   Joseph DeMartino

Joseph DeMartino

    Lead Actor



  • 8,308 posts
  • Join Date: Dec 31 1969
  • Real Name: Joseph DeMartino
  • Location: Florida

Posted April 19 2009 - 04:22 PM

Wow! A typo! Wow! Imagine that!

As I correctly said at the outset, "With an interlaced signal the TV displays 30 video frames per second." Yes, I had a brain fart when typing out a couple of later references to the rate, but they were both pretty obviously what they were.


Um, NO, because a "frame" is not a unit of time. Each field displays half the information contained in a frame, and displays for half the amount of time that a frame displays. (1/60th of a second for 30 fps NTSC, 1/50th of a second for 25 fps PAL, etc.)
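The field timing is easy to check with a little arithmetic. A minimal sketch, using only the numbers already in the thread:

```python
# Each interlaced frame consists of two fields, so a field lasts half
# of one frame period.

def field_duration(frames_per_second):
    frame_period = 1.0 / frames_per_second  # seconds per frame
    return frame_period / 2.0               # two fields per frame

assert abs(field_duration(30) - 1 / 60) < 1e-12  # 30 fps NTSC: 1/60 s per field
assert abs(field_duration(25) - 1 / 50) < 1e-12  # 25 fps PAL:  1/50 s per field
```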

So do you really think that your post, which is a) inaccurate and b) doesn't actually explain anything, is more helpful than mine, which describes the difference in detail, despite a couple of minor typos (since corrected)? Seriously?

Again, wow!

Joe

#5 of 21 OFFLINE   Bob_L

Bob_L

    Supporting Actor



  • 893 posts
  • Join Date: May 19 2001

Posted April 19 2009 - 05:37 PM

Joe's explanation is excellent. To my mind, the most significant point he makes is that digital display devices--like plasma, LCD, and LED HDTVs--are inherently progressive, so any 1920x1080 resolution device is going to display in 1080p, whether fed native 1080p or interlaced 1080i content. (It will deinterlace the 1080i content internally.) The marketeers have made 1080p a big buzzword, but it really isn't all that significant when dealing with digital displays.

#6 of 21 OFFLINE   Mr. Pacino

Mr. Pacino

    Second Unit



  • 346 posts
  • Join Date: Jun 29 2008

Posted April 20 2009 - 12:08 AM

I always watch Blu-rays in 1080i because my TV can't display 1080p. Can we see the difference between 1080i and 1080p with the naked eye?

#7 of 21 OFFLINE   Stephen_J_H

Stephen_J_H

    All Things Film Junkie



  • 4,212 posts
  • Join Date: Jul 30 2003
  • Real Name: Stephen J. Hill
  • Location: North of the 49th

Posted April 20 2009 - 01:20 AM

Unless your display is 1080i (i.e. a CRT HD display), you will be viewing a progressive signal anyway, just scaled down to the resolution of your screen, e.g. 1366 x 768, 1280 x 720 or whatever. LCDs, plasmas and DLPs will not display an interlaced signal.
"My opinion is that (a) anyone who actually works in a video store and does not understand letterboxing has given up on life, and (b) any customer who prefers to have the sides of a movie hacked off should not be licensed to operate a video player."-- Roger Ebert

#8 of 21 OFFLINE   Michael Allred

Michael Allred

    Screenwriter



  • 1,718 posts
  • Join Date: Aug 13 2000
  • Real Name: Michael
  • Location: MI

Posted April 20 2009 - 07:30 AM

I have an LCD HDTV, a Blu-ray player and use an HDMI cable. When I hit the display button on the BD player it says "1080/60i", so it has an "i" after the 60 but no letter after 1080. What does this mean?

#9 of 21 OFFLINE   Matt DeVillier

Matt DeVillier

    Supporting Actor



  • 781 posts
  • Join Date: Sep 03 1999

Posted April 20 2009 - 07:38 AM

It means your Blu-ray player is feeding the LCD a 1080i signal at 60Hz. The LCD is then converting that to 720p/768p @ 60Hz. You would be much better off setting your Blu-ray player to output 720p, to avoid unnecessary format conversions.

#10 of 21 OFFLINE   Mike Williams

Mike Williams

    Screenwriter



  • 1,020 posts
  • Join Date: Mar 03 2003

Posted April 20 2009 - 08:06 AM

Whether a frame is a unit of time or not, a field is still half a frame, as it always takes two fields to make a frame. With interlaced, each field is displayed for half of the frame. With progressive, both fields are read and displayed at the same time for the entire length of the frame.

To make your point, you quoted my line which stated: "With interlaced, each field is displayed for half a FRAME, not half a SECOND." <---- That IS accurate. A frame CAN be a unit of time, though the time it represents changes depending on the format. In television, it's either 1/30th of a second, since 30 frames make up a second, or 1/60th of a second if the frame rate is 60. With PAL it would be 1/25th of a second, since there are 25fps.

And Joseph, if you're going to go through the time and trouble of such a detailed explanation, and pepper it with lines such as "we see a complete image every 30 seconds", then it's going to elicit a WOW! Because your incredibly detailed explanation becomes EXTREMELY confusing, and it's only "very clear" what you meant to those who already know what you're talking about. Rather than getting offended, one could say, "Oh, yeah, that's what I meant," change it and move on. Perhaps that's not your way.

#11 of 21 OFFLINE   Steve Schaffer

Steve Schaffer

    Producer



  • 3,759 posts
  • Join Date: Apr 15 1999

Posted April 20 2009 - 08:11 AM

Since the native resolution of "720p" sets is almost always 1366x768, the set must do a conversion of incoming 720p or 1080i.

Since BD is encoded at 1080p, the player is doing a conversion to output 1080i to sets that won't accept 1080p, just as it's doing a conversion to 720p if manually set to do so. Due to variations in the quality of conversion done by both the player and the TV, it's not really safe to say that PQ will always be better with the player set to output 720p, even though that's closest to the set's native resolution.

In the past, owners of 720p sets have sometimes reported better PQ with some players outputting 1080i rather than 720p, so it's perhaps best to experiment with both output settings to see what looks better with one's own player/TV combination.
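The two candidate chains can be written down explicitly. A hypothetical sketch; the step names are mine, not any player's actual menu options:

```python
# Compare the conversion chains for a 1080p disc shown on a 1366x768 panel,
# depending on what the player is set to output.

def chain(player_output):
    """Return the list of processing steps for a given player output mode."""
    steps = ["disc: 1080p"]
    if player_output == "1080i":
        steps += ["player: interlace to 1080i",
                  "tv: deinterlace to 1080p",
                  "tv: scale to 1366x768"]
    elif player_output == "720p":
        steps += ["player: scale to 720p",
                  "tv: scale 720p to 1366x768"]
    return steps

# Both routes still end with a scale inside the TV, so neither avoids
# conversion entirely; hence the advice to try both settings and compare.
assert chain("1080i")[-1] == "tv: scale to 1366x768"
assert chain("720p")[-1] == "tv: scale 720p to 1366x768"
```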
Steve S.
I prefer not to push the subwoofers until they're properly run in.

#12 of 21 OFFLINE   Matt DeVillier

Matt DeVillier

    Supporting Actor



  • 781 posts
  • Join Date: Sep 03 1999

Posted April 20 2009 - 08:57 AM

Unless the player has an absolutely horrendous scaler, allowing the player to convert from 1080p24 to 720p60 will be better. I realize there is anecdotal evidence to the contrary, but in my experience this is almost always due to excessive artifacting produced by the multiple conversions giving a false picture. Maintaining a progressive signal throughout the processing chain is going to yield the most accurate picture 99% of the time. But I agree: in the end, most users should go with whichever method their eyes favor.

#13 of 21 OFFLINE   Michael Allred

Michael Allred

    Screenwriter



  • 1,718 posts
  • Join Date: Aug 13 2000
  • Real Name: Michael
  • Location: MI

Posted April 20 2009 - 11:12 AM

Oh, the equipment I'm using: a Sony LCD HDTV KDL-32M3000 and a Sony Blu-ray player BDP-S350.

#14 of 21 OFFLINE   Hartwig Hanser

Hartwig Hanser

    Second Unit



  • 297 posts
  • Join Date: Oct 09 1998

Posted April 20 2009 - 08:42 PM

Deinterlacing is not so easy, and deinterlacers make errors. It is the same as with DVDs, which are 480i: if the deinterlacer makes an error, you will see jaggies or other artefacts. My Sharp Blu-ray player, for example, is not very good at deinterlacing 1080i material; ditto my Panasonic AE2000 projector. So, in many cases, 1080p material will give you a better picture quality, although the difference may be small.
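The kind of deinterlacing error described here can be illustrated with a toy example: if the scene moves between the two field captures, a naive "weave" deinterlacer interleaves mismatched lines and produces the jagged combing artefact. Strings stand in for scan lines; purely illustrative:

```python
# Show why naive weave deinterlacing produces "combing" on motion.

def weave(field1, field2):
    """Interleave two fields line by line into one frame."""
    frame = []
    for odd_line, even_line in zip(field1, field2):
        frame.extend([odd_line, even_line])
    return frame

# A vertical edge at column 2 in field 1 has moved to column 3 by the
# time field 2 is captured (1/60 s later), so the woven frame zig-zags
# between adjacent lines instead of showing a straight edge:
field1 = ["..X..", "..X.."]   # odd lines, time t
field2 = ["...X.", "...X."]   # even lines, time t + 1/60 s
assert weave(field1, field2) == ["..X..", "...X.", "..X..", "...X."]
```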

#15 of 21 OFFLINE   Michael Allred

Michael Allred

    Screenwriter



  • 1,718 posts
  • Join Date: Aug 13 2000
  • Real Name: Michael
  • Location: MI

Posted April 21 2009 - 05:42 AM

As I recall, when I first got my Blu-ray player, I selected 1080p in the player's settings and my TV screen went blank, and I had to reset the settings to get a picture back. I'm assuming that's because my HDTV cannot display 1080p?

#16 of 21 OFFLINE   Matt DeVillier

Matt DeVillier

    Supporting Actor



  • 781 posts
  • Join Date: Sep 03 1999

Posted April 21 2009 - 05:55 AM

Not only can it not display 1080p, but it doesn't accept 1080p as an input either (some 720p sets will accept 1080p and scale it down to 720p). I'd set the player output to 720p.

#17 of 21 OFFLINE   Peter Overduin

Peter Overduin

    Supporting Actor



  • 781 posts
  • Join Date: Dec 31 1969

Posted April 21 2009 - 07:09 AM

No.
Peter

My Collection

#18 of 21 OFFLINE   Michael Allred

Michael Allred

    Screenwriter



  • 1,718 posts
  • Join Date: Aug 13 2000
  • Real Name: Michael
  • Location: MI

Posted April 21 2009 - 12:20 PM

Well, I had the player set on auto...

#19 of 21 OFFLINE   Cees Alons

Cees Alons

    Executive Producer



  • 18,857 posts
  • Join Date: Jul 31 1997
  • Real Name: Cees Alons

Posted April 21 2009 - 12:43 PM

The Sony LCD HDTV KDL-32M3000 has a native screen of 1366x768 pixels. The HDMI connection will tell the player which signal to choose.

In principle, 1080i is the same resolution as 1080p, and the circuitry has to make sure that each pixel gets to its proper place (which is a rather simple task). Problem is: some de-interlacers in players or video processors do a more complicated job (e.g. soften the transitions to avoid jaggies in TV-recorded material), which may slightly degrade the image. The TV sets will generally not do the latter.

In your case, the TV set will have to convert the 1080-line image to the screen's resolution, and will choose a vertical resolution of 768 lines. The fact that it doesn't accept 1080p stems from the lack of an input circuit that processes 1080p, that's all. A few years ago the need for 1080p input circuitry wasn't present yet. (Until far into 2006, many HDTV sets were manufactured with an additional circuit converting 1080p to 1080i first, so the "normal" rest of the circuitry would function and the set would be able to accept 1080p and be advertised loudly as such.)

Cees

#20 of 21 OFFLINE   ManW_TheUncool

ManW_TheUncool

    Producer



  • 5,886 posts
  • Join Date: Aug 18 2001
  • Real Name: ManW

Posted April 21 2009 - 01:00 PM

Just try both and see for yourself which works better.

In theory, if the various processing algorithms (on both TV and player) perform well, I'd expect you to get marginally better results most of the time by sending 1080i to the TV and letting the TV downconvert to its native resolution, which probably is *not* 720p (as noted by a couple of others). This is because each scaling of the image will generally introduce some scaling artifacts (though you may or may not notice them), and having the player downconvert to 720p and then *still* having the TV scale to its native res will probably mean one extra such conversion (most of the time).

However, in actual practice, maybe none of the slight PQ difference will matter more to you than the extra inconvenience of forcing some setting other than "auto". After all, what are we talking about here: a 32" LCD viewed from how far away? I'd think you should worry more about getting the (LCD, of all things) display properly calibrated than about this scaling issue.

_Man_

Just another amateur learning to paint w/ "the light of the world".

"Whatever is true, whatever is honorable, whatever is right, whatever is pure, whatever is lovely, whatever is of good repute, if there is any excellence and if anything worthy of praise, dwell on these things..." (Apostle Paul)




