1080i vs 720p

eshoukry93

Auditioning
Joined
Nov 19, 2006
Messages
11
Real Name
ehab
Hey guys,

I understand the difference between 1080i and 720p, but I was wondering if there is a consensus on this board as to which is better. I'm down to two TVs now, one that is 1080i and the other 720p, and was wondering what you all think of these two resolutions?

Thanks.
 

Michael TLV

THX Video Instructor/Calibrator
Senior HTF Member
Joined
Mar 16, 2000
Messages
2,909
Location
Calgary, Alberta
Real Name
Michael Chen
Greetings

There are not too many 1080i-native TV sets out there unless you are talking about CRT technology. Anything digital uses 720p, 768p, 1024p, or 1080p panels.

720p sets are best for 720p signals ... (ABC, FOX, ESPN)

A 1080p set is best for 1080i signals ... (PBS, CBS, NBC, CW)
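
To make the matching concrete, here's a rough sketch (just an illustration; the format table comes from the networks above, and the function itself is made up, not anything a real set runs):

```python
# Rough sketch of what a fixed-pixel display's processor must do
# with each broadcast format. Purely illustrative.

BROADCASTS = {
    "ABC/FOX/ESPN":   (1280, 720, "p"),   # 720p networks
    "PBS/CBS/NBC/CW": (1920, 1080, "i"),  # 1080i networks
}

def processing_steps(src_w, src_h, scan, panel_w, panel_h):
    steps = []
    if scan == "i":
        # two half-height fields must be combined into full frames
        steps.append("deinterlace to %dp" % src_h)
    if (src_w, src_h) != (panel_w, panel_h):
        steps.append("scale %dx%d -> %dx%d" % (src_w, src_h, panel_w, panel_h))
    return steps or ["none: native match"]

for channels, (w, h, scan) in BROADCASTS.items():
    print(channels, "on a 720p panel: ", processing_steps(w, h, scan, 1280, 720))
    print(channels, "on a 1080p panel:", processing_steps(w, h, scan, 1920, 1080))
```

The fewer steps the processor has to take, the less chance it has to soften or mangle the picture, which is why matching the panel to the channels you watch most makes sense.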


Regards
 

eshoukry93

Auditioning
Joined
Nov 19, 2006
Messages
11
Real Name
ehab
Thanks. I've seen the Hitachi 42" plasma that has 1080i native resolution. Would you take that over a 720p TV?
 

Rolando

Screenwriter
Joined
Feb 19, 2001
Messages
1,338
Unless something came in under the radar that I have never heard of... there is no such thing.
As my man Michael said, unless it is a CRT (RPTV or DVTV) it is NOT 1080i.
LCD, DLP, plasma, LCoS, and SED are all natively progressive: 480p, 720p, or 1080p. Actual panels could be, as Michael said, anything like 768p or 1024p, etc., but always with a "p" for progressive.
 

Ken Chan

Senior HTF Member
Joined
Apr 11, 1999
Messages
3,302
Real Name
Ken
From the ad, I'm guessing they're trying to have it both (or three) ways: (1) It is a 42" plasma that has 1080 lines; you usually have to go to 50" for 1080. That's good. (2) It does not support 1080p input. That's bad. (3) HDTV broadcasts only do 1080i, not 1080p. That's just the way it is (and it's bad).

So they're being truthful about not supporting 1080p input by selling the fact that it supports 1080i. But the display itself is not interlaced.

Going back to your original question: having watched HD on a CRT that is in fact interlaced, I would definitely take the higher resolution of 1080i. 720p content that is upscaled and then interlaced still looks good. Then again, the CRT could also do all 1920 across, so I was "getting all" of the pixels that CBS and NBC were giving. If I had to choose between 1024x1080 and 1280x720....
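
For what it's worth, here's the raw arithmetic on that last choice (just pixel counts, nothing subtle):

```python
# Total pixels for the two panel choices discussed above.
print("1024x1080:", 1024 * 1080)  # 1,105,920 pixels
print("1280x720: ", 1280 * 720)   #   921,600 pixels
# The 1024x1080 panel also matches 1080i line-for-line (1080 rows),
# while the 1280x720 panel matches 720p line-for-line (720 rows).
```

So the odd 1024x1080 panel actually has about 20% more pixels, and it keeps all 1080 lines of a 1080i broadcast.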
 

Rolando

Screenwriter
Joined
Feb 19, 2001
Messages
1,338
What the heck is that thing?

Even the Hitachi site itself says it is 1080i! Is that possible? Can a plasma be interlaced? Is it 1080p but cannot call itself that because it cannot accept a 1080p input?

And how the heck can a 16:9 panel have more vertical resolution than horizontal? Even a 4:3 panel should have more horizontal resolution than vertical.

Something is very wrong...
 

Nick:G

Stunt Coordinator
Joined
Jun 17, 2006
Messages
200
Real Name
Nick Gallegos

This is nothing new. A lot of the first consumer 42" HD panels had a native resolution of 1024x1024 while still being 16:9 displays. These aren't the square pixels you're accustomed to; the pixels are rectangular, wider than they are tall.
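
To put a number on how stretched those pixels are (simple geometry, assuming a 16:9 screen):

```python
# Pixel aspect ratio (PAR) of a 1024x1024 grid on a 16:9 screen:
# PAR = screen aspect ratio / pixel-grid aspect ratio.
screen_ar = 16 / 9
grid_ar = 1024 / 1024
par = screen_ar / grid_ar
print(round(par, 2))  # 1.78: each pixel is ~1.78x wider than it is tall
```

Square pixels have a PAR of 1.0, so these are noticeably wide.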
 

Rolando

Screenwriter
Joined
Feb 19, 2001
Messages
1,338
Ah, thanks Nick.

Still, 1024x1024 with rectangular pixels makes a lot more sense than 1024x1080. But who knows! It might be the trick for a denser picture, no SDE (screen-door effect), and richer colors.

It sounds bad on paper, but as they say, the best test is to actually see it in action.
 

ChrisWiggles

Senior HTF Member
Joined
Aug 19, 2002
Messages
4,791

There are many displays which are native 1080p panels but whose electronics do not handle 1080p source signals. If you feed one a 1080p signal, it will not recognize or display it. You must feed it 1080i, which it then deinterlaces internally to 1080p. This may not be a problem if the internal processing is top notch; it is a problem, however, if that processing is inadequate, or if you have native 1080p sources that you can't get into 1080i form.
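
In the simplest case (a static image), that internal 1080i-to-1080p step is just a "weave" of two fields; here's a minimal sketch (real processors do motion-adaptive deinterlacing, which this ignores entirely):

```python
import numpy as np

def weave(top_field, bottom_field):
    """Weave two 540-line 1080i fields into one 1080-line frame."""
    h, w = top_field.shape
    frame = np.empty((2 * h, w), dtype=top_field.dtype)
    frame[0::2] = top_field     # even lines come from the top field
    frame[1::2] = bottom_field  # odd lines come from the bottom field
    return frame

top = np.zeros((540, 1920), dtype=np.uint8)
bottom = np.ones((540, 1920), dtype=np.uint8)
print(weave(top, bottom).shape)  # (1080, 1920)
```

With motion between the fields, a straight weave produces combing artifacts, which is exactly where good and bad internal processing part ways.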
 

Ken Chan

Senior HTF Member
Joined
Apr 11, 1999
Messages
3,302
Real Name
Ken
Not as far as picture quality is concerned. All else being equal, with a digital picture, the best quality is dot-for-dot: you display a 1920x1080 square-pixel image (like HDTV) on 1920x1080 square pixels, with no scaling. But if you can't manage all the pixels across, at least you can stay the same line-for-line, like 1024x1080. You lose detail on each line, but at least there's no interpolation between lines. This matters even more when the source is interlaced, since every other line comes from a different point in time.
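
Here's a sketch of that line-for-line idea (the numbers come from the 1024x1080 example; plain linear interpolation stands in for whatever a real set actually does):

```python
import numpy as np

def scale_rows_only(frame, new_width):
    """Resample each line horizontally; never blend across lines."""
    h, old_w = frame.shape
    x = np.linspace(0, old_w - 1, new_width)
    out = np.empty((h, new_width))
    for row in range(h):
        # interpolate within this row only: the 1080 lines stay 1:1
        out[row] = np.interp(x, np.arange(old_w), frame[row])
    return out

hdtv_frame = np.random.rand(1080, 1920)    # a 1920x1080 source
panel = scale_rows_only(hdtv_frame, 1024)  # fit a 1024x1080 panel
print(panel.shape)                         # (1080, 1024)
```

Vertical detail, including the field structure of interlaced material, passes through untouched; only horizontal detail is resampled.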
 

ChrisWiggles

Senior HTF Member
Joined
Aug 19, 2002
Messages
4,791

Yes, although even better than this is scaling up to a display whose resolution is an even multiple of the source in each direction.
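
A quick illustration of why an even multiple is clean (the numbers are just for illustration): every source pixel maps to an exact block of panel pixels, so no interpolation is needed at all.

```python
import numpy as np

def scale_2x(frame):
    """Replicate each pixel into a 2x2 block: exact, no interpolation."""
    return np.repeat(np.repeat(frame, 2, axis=0), 2, axis=1)

src = np.random.rand(540, 960)  # e.g., a 960x540 source
print(scale_2x(src).shape)      # (1080, 1920): a perfect fit on a 1080p panel
```

Compare that with 720p to 1080p, a 1.5x ratio, where every output pixel has to be interpolated from fractional source positions.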
 

captaincrash

Stunt Coordinator
Joined
Nov 26, 2006
Messages
50
Real Name
Thomas
Interesting dialog here...

So what about 1080p... if most HD TV broadcasting is 720p (is it? Or am I mistaken? I thought the broadcast standard was ALL 720p for HDTV?), then wouldn't everything have to be scaled up to 1080p? So might it be better to NOT scale the broadcast 720p signal at all and show it AT 720p?

Or am I just mixed up? Earlier it was mentioned that PBS and some other channels had 1080p (or "i"?) broadcast signals... that can't be so unless it is 1080i... "i".

I'm trying to figure out if there is a single spec configuration I should aim for to achieve optimal viewing at the maximum resolution available, GIVEN the limits of the media sources available.

Does that make any sense?
 
