
1080i vs 720p


This topic has been archived. This means that you cannot reply to this topic.
14 replies to this topic

#1 of 15 OFFLINE   eshoukry93

eshoukry93

    Auditioning

  • 10 posts
  • Join Date: Nov 19 2006

Posted November 22 2006 - 01:04 AM

Hey guys,

I understand the difference between 1080i and 720p but I was wondering if there was a consensus on this board as to which is better? I'm down to 2 tv's now, one that is 1080i and the other that is 720p and was wondering what you guys thought about these two resolutions?

Thanks.

#2 of 15 OFFLINE   Michael TLV

Michael TLV

    Screenwriter

  • 2,909 posts
  • Join Date: Mar 16 2000
  • Real Name:Michael Chen
  • Location: Calgary, Alberta

Posted November 22 2006 - 02:10 AM

Greetings

There are not too many 1080i native TV sets out there unless you are talking about CRT technology. Anything digital is either using 720p or 1080p or 768p panels. (or 1024p)

720p sets are best for 720p signals ... (ABC, FOX, ESPN)

A 1080p set is best for 1080i signals ... (PBS, CBS, NBC, CW)

For instance
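A toy sketch of the matchups above (the channel table just restates the lists in this post; the helper function is my own illustration, not anything a TV actually runs):

```python
# Native broadcast format of each US network, as listed above (circa 2006).
BROADCAST_FORMAT = {
    "ABC": "720p", "FOX": "720p", "ESPN": "720p",
    "PBS": "1080i", "CBS": "1080i", "NBC": "1080i", "CW": "1080i",
}

def needs_scaling(panel_lines: int, channel: str) -> bool:
    """True if the set must rescale the channel's signal vertically."""
    signal_lines = int(BROADCAST_FORMAT[channel].rstrip("ip"))
    return signal_lines != panel_lines

# A 720p panel shows ABC dot-for-dot but must rescale CBS's 1080i feed.
print(needs_scaling(720, "ABC"))  # False
print(needs_scaling(720, "CBS"))  # True
```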

Regards
Michael @ The Laser Video Experience
THX Video Systems Instructor/ISF Instructor
Lion A/V Consultants Network - TLVEXP.com


#3 of 15 OFFLINE   eshoukry93

eshoukry93

    Auditioning

  • 10 posts
  • Join Date: Nov 19 2006

Posted November 22 2006 - 02:29 AM

Thanks. I've seen the Hitachi 42" plasma that has 1080i native resolution. Would you take that over a 720p TV?

#4 of 15 OFFLINE   CoreyAC

CoreyAC

    Stunt Coordinator

  • 68 posts
  • Join Date: May 18 2006

Posted November 22 2006 - 03:05 AM

wouldn't it be better to just get a 1080p and run 720p and 1080i thru it?

#5 of 15 OFFLINE   Rolando

Rolando

    Screenwriter

  • 1,318 posts
  • Join Date: Feb 19 2001

Posted November 22 2006 - 03:10 AM

Unless something came in under the radar that I have never heard of... there is no such thing.
As my man Michael said, unless it's a CRT (RPTV or DVTV) it is NOT 1080i.
LCD, DLP, Plasma, LCOS, SED are all natively progressive: 480p, 720p or 1080p. Actual panels could be, as Michael said, anything like 768p or 1024p, etc., but always with a "p" for progressive.
Rolando Avendano

My Collection

#6 of 15 OFFLINE   eshoukry93

eshoukry93

    Auditioning

  • 10 posts
  • Join Date: Nov 19 2006

Posted November 22 2006 - 04:30 AM

Here's a link to the Hitachi I am considering. According to the sources, its native display is 1080i.

http://www.bestbuy.c....=1142293092238

I think this is one of the few 42" plasmas that does this.

#7 of 15 OFFLINE   Seth=L

Seth=L

    Screenwriter

  • 1,313 posts
  • Join Date: Jul 17 2006

Posted November 22 2006 - 05:42 AM

It says the resolution is 1024x1080.

#8 of 15 OFFLINE   Ken Chan

Ken Chan

    Producer

  • 3,302 posts
  • Join Date: Apr 11 1999

Posted November 22 2006 - 10:26 AM

From the ad:
Quote:
1080i display provides the highest quality interlaced picture possible from a high-definition source
I'm guessing they're trying to have it both (or three) ways: (1) It is a 42" plasma that has 1080 lines. You usually have to go to 50" for 1080. That's good. (2) It does not support 1080p input. That's bad. (3) HDTV only does 1080i, not 1080p. That's just the way it is (it's bad).

So they're being truthful about not supporting 1080p input by selling the fact that it supports 1080i. But the display itself is not interlaced.

Going back to your original question: having watched HD on a CRT that is in fact interlaced, I would definitely take the higher resolution of 1080i. 720p content that is upscaled and then interlaced still looks good. Then again, the CRT could also do all 1920 across, so I was "getting all" of the pixels that CBS and NBC were giving. If I had to choose between 1024x1080 and 1280x720....

#9 of 15 OFFLINE   Rolando

Rolando

    Screenwriter

  • 1,318 posts
  • Join Date: Feb 19 2001

Posted November 23 2006 - 05:06 AM

What the heck is that thing?

Even the Hitachi site itself says it is 1080i! Is that possible? Can a plasma be interlaced? Is it 1080p but cannot call itself that because it cannot accept a 1080p input?

And how the heck can a 16:9 panel have more vertical resolution than horizontal? Even a 4:3 panel should have more horizontal than vertical.

Something is very wrong...
Rolando Avendano

My Collection

#10 of 15 OFFLINE   Nick:G

Nick:G

    Stunt Coordinator

  • 200 posts
  • Join Date: Jun 17 2006

Posted November 23 2006 - 07:30 PM

Quote:
Here's a link to the Hitachi I am considering. According to the sources, its native display is 1080i.

http://www.bestbuy.c....=1142293092238

I think this is one of the few 42" plasmas that does this.

Yep, that's Hitachi's new 42" panel. Their website claims they have the highest-resolution 42" plasma display on the market, but how much higher is it, really? Full 1080 HD is 1920x1080 pixels (about 2.1 million pixels). Now, let's take the new Hitachi panel used on all of their new 42" sets: 1024x1080 = 1.1 million pixels. Last year's 42" models employed the Hitachi/Fujitsu ALIS panel with 1024x1024 resolution (1.05 million pixels). And guess what? A 1366x768 panel is right around 1.05 million pixels as well.

I'm not sure if a whopping 5% more resolution is groundbreaking. What we have here is marketing at its finest...
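The arithmetic above checks out; a quick sketch of the same comparison (the panel labels are just mine):

```python
# Pixel counts for the panels discussed above.
panels = {
    "Full HD (1920x1080)":      1920 * 1080,  # ~2.07 million
    "Hitachi 42in (1024x1080)": 1024 * 1080,  # ~1.11 million
    "ALIS (1024x1024)":         1024 * 1024,  # ~1.05 million
    "1366x768":                 1366 * 768,   # ~1.05 million
}

# Relative gain of the new panel over last year's ALIS panel.
gain = panels["Hitachi 42in (1024x1080)"] / panels["ALIS (1024x1024)"] - 1
print(f"New panel has {gain:.1%} more pixels")  # ~5.5%
```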

Now don't get me wrong. My store displays the top-line Hitachi 42HDX99 (Director's Series) model and it actually looks pretty darned good. But frankly, Panasonics (cheaper!) and Fujitsus (a lot more expensive) look better...

Quote:
And how the heck can a 16:9 pannel have more vertical resolution than horizontal? even a 4:3 panel should have more vertical than horizontal.

This is nothing new. A lot of the first consumer 42" HD panels had a native resolution of 1024x1024 while still being 16:9 displays. These aren't the square pixels you're accustomed to, but rather rectangular ones.
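To put numbers on the pixel shape: the pixel aspect ratio follows from the display shape and the pixel grid (a back-of-envelope sketch; the helper is my own, not from any spec):

```python
def pixel_aspect_ratio(width_px: int, height_px: int,
                       display_aspect: float = 16 / 9) -> float:
    """Width/height of each physical pixel on a display of the given shape."""
    return display_aspect / (width_px / height_px)

# On a 16:9 screen, 1024x1024 pixels must each be ~1.78x wider than tall;
# 1024x1080 pixels come out even wider, ~1.88x.
print(pixel_aspect_ratio(1024, 1024))  # ~1.78
print(pixel_aspect_ratio(1024, 1080))  # 1.875
```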

#11 of 15 OFFLINE   Rolando

Rolando

    Screenwriter

  • 1,318 posts
  • Join Date: Feb 19 2001

Posted November 24 2006 - 05:15 AM

Ah, thanks Nick.

Still, 1024x1024 rectangular pixels makes a lot more sense than 1024x1080. But who knows! It might be the trick for a denser picture, no SDE and richer colors.

It sounds bad on paper, but as they say, the best test is to actually see it in action.
Rolando Avendano

My Collection

#12 of 15 OFFLINE   ChrisWiggles

ChrisWiggles

    Producer

  • 4,791 posts
  • Join Date: Aug 19 2002

Posted November 24 2006 - 10:12 AM

Quote:
Originally Posted by Rolando
What the heck is that thing?

Even the Hitachi site itself says it is 1080i! Is that possible? Can a plasma be interlaced? Is it 1080p but cannot call itself that because it cannot accept a 1080p input?

And how the heck can a 16:9 panel have more vertical resolution than horizontal? Even a 4:3 panel should have more horizontal than vertical.

Something is very wrong...

There are many displays with native 1080p panels whose electronics do not handle 1080p source signals. If you feed one a 1080p signal, it will not recognize or display it. You must feed it 1080i, which the set then scales up to 1080p. This may not be a problem if the internal scaling is top notch, but it is a problem if that scaling is inadequate, or if you have native 1080p sources that you can't convert to 1080i.
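For the easy case, a static scene, recombining two 1080i fields back into one 1080p frame ("weave" deinterlacing) is lossless. A toy sketch (the field/frame representation here is my own illustration):

```python
def weave(top_field: list, bottom_field: list) -> list:
    """Interleave two half-height fields into one progressive frame.

    Perfect only when both fields come from the same instant; with
    motion, real deinterlacers must interpolate instead of weaving.
    """
    frame = []
    for top, bottom in zip(top_field, bottom_field):
        frame.append(top)     # even-numbered scan line
        frame.append(bottom)  # odd-numbered scan line
    return frame

# Two 2-line fields stand in for the 540-line fields of real 1080i.
frame = weave(["even0", "even2"], ["odd1", "odd3"])
print(frame)  # ['even0', 'odd1', 'even2', 'odd3']
```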

#13 of 15 OFFLINE   Ken Chan

Ken Chan

    Producer

  • 3,302 posts
  • Join Date: Apr 11 1999

Posted November 25 2006 - 08:13 AM

Quote:
1024x1024 of rectangular pixels makes a lot more sense than 1024x1080
Not as far as picture quality is concerned. All else being equal, with a digital picture, the best quality would be dot-for-dot. You display a 1920x1080 square-pixel image (like HDTV) on 1920x1080 square pixels, no scaling. But if you can't manage all the pixels across, at least you can be the same line-for-line, like 1024x1080. You lose detail on each line, but at least there's no interpolation between those lines. This is even more important if the source was interlaced, when every other line comes from a different point in time.
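The line-for-line point can be seen with a toy nearest-line scaler (my own illustration, not how any particular set resamples):

```python
def scale_lines(lines: list, target_count: int) -> list:
    """Nearest-line vertical rescale; a stand-in for real interpolation."""
    n = len(lines)
    return [lines[min(n - 1, round(i * n / target_count))]
            for i in range(target_count)]

# Line-for-line (1080 -> 1080): every output line is an original line.
print(scale_lines([10, 20, 30], 3))  # [10, 20, 30]

# 3 -> 4 lines: some source lines must be repeated (or, on a real
# scaler, blended) -- the interpolation Ken is talking about.
print(scale_lines([10, 20, 30], 4))  # [10, 20, 30, 30]
```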

#14 of 15 OFFLINE   ChrisWiggles

ChrisWiggles

    Producer

  • 4,791 posts
  • Join Date: Aug 19 2002

Posted November 25 2006 - 09:53 AM

Quote:
Originally Posted by Ken Chan
Not as far as picture quality is concerned. All else being equal, with a digital picture, the best quality would be dot-for-dot. You display a 1920x1080 square-pixel image (like HDTV) on 1920x1080 square pixels, no scaling. But if you can't manage all the pixels across, at least you can be the same line-for-line, like 1024x1080. You lose detail on each line, but at least there's no interpolation between those lines. This is even more important if the source was interlaced, when every other line comes from a different point in time.

Yes, although even better than this is scaling up to a display whose resolution is an even multiple of the source in each direction.
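That condition is easy to state in code (a hypothetical helper, just to illustrate the point):

```python
def clean_multiple(src_w: int, src_h: int, dst_w: int, dst_h: int) -> bool:
    """True if each source pixel maps to an exact NxM block of display pixels."""
    return dst_w % src_w == 0 and dst_h % src_h == 0

# 1080p doubled to a 3840x2160 panel: every pixel becomes a clean 2x2 block.
print(clean_multiple(1920, 1080, 3840, 2160))  # True
# 720p onto a 1080p panel is a 1.5x stretch, so interpolation is unavoidable.
print(clean_multiple(1280, 720, 1920, 1080))   # False
```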

#15 of 15 OFFLINE   captaincrash

captaincrash

    Stunt Coordinator

  • 50 posts
  • Join Date: Nov 26 2006

Posted November 26 2006 - 11:02 PM

Interesting dialog here...

So what about 1080p... if most HDTV broadcasting is 720p (is it? Or am I mistaken? I thought the broadcast standard was ALL 720p on HDTV?)... then wouldn't everything have to be scaled up to 1080p? So might it be better to NOT scale the broadcast 720p signal at all and show it AT 720p?

Or am I just mixed up? Earlier it was mentioned that PBS and some other channels HAD 1080p (or "i"?) broadcast signals... that can't be so, unless it is 1080i... "i".

I'm trying to figure out if there is a single spec configuration I should aim for to achieve optimal viewing at the maximum resolution available GIVEN the limits of the media sources available.

