How Much Would It Cost for Them to Add 720P?

Arthur S

Senior HTF Member
Joined
Jul 2, 1999
Messages
2,571
How much would it cost the manufacturers to add native 720P to a CRT RPTV? Also, is it true that you would have to pick up the 720P signal over the air, or maybe over satellite? That is, is it not available over cable?

Thanks

Artie
 

Steve Schaffer

Senior HTF Member
Joined
Apr 15, 1999
Messages
3,756
Real Name
Steve Schaffer
The last CRT-based RPTV that could do a 720p display was a 56" Panny that cost over $5000.

Cable and satellite boxes can be set to output any scan rate you want: you can convert everything to 1080i or 720p, or you can have the box pass through whatever the native scan rate of the broadcast is.

I still don't understand what the big whoop is about not being able to do native 720p on a CRT-based set. The STB will convert 720p to 1080i and it will look just as good as native 1080i, and almost all sets will do the same with excellent results.

I get ABC and ESPN-HD at 720p and about 8 other HD channels whose native scan rate is 1080i. My box converts all of it to 1080i and I can't see any difference whatsoever between a native 720p broadcast and a native 1080i broadcast. It's just not an issue with HD broadcasts.

Even with an Xbox, a CRT set will upconvert it to 1080i and the picture quality will be outstanding.
 

Arthur S

Senior HTF Member
Joined
Jul 2, 1999
Messages
2,571
Steve

As you said, the very first Panny HD set, the 56WXF90, did 720P. Yes, the list price was $5,500, but then again the Sony 34HD-1, the first HD tube set, cost $8,000. Seems to me it's impossible to figure out how much of that cost was due to it being first generation.

As far as why people are interested in 720P, it is because 720P would be the equivalent of 1440I, if I am not mistaken. That would be a clear advantage over 1080I. The fact that just about everything is now converted to 1080I and looks great doesn't change the POTENTIAL advantage of 720P.
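Just to show where that "1440I" framing comes from, here is a minimal back-of-envelope sketch (Python) that only counts how many new lines of picture each format delivers every 1/60th of a second, using the nominal active line counts; the 1440i entry is purely hypothetical, and whether counting lines this way is a fair comparison is exactly what gets argued below.

```python
# Minimal sketch of the arithmetic behind "720p ~ 1440i".
# It counts only new lines delivered per update; it says nothing
# about deinterlacing, filtering, or how the picture actually looks.

UPDATES_PER_SECOND = 60  # fields (interlaced) or frames (progressive)

def new_lines_per_update(active_lines, progressive):
    """New lines of picture delivered every 1/60th of a second."""
    return active_lines if progressive else active_lines // 2

formats = {
    "720p":  (720,  True),
    "1080i": (1080, False),
    "1440i": (1440, False),  # hypothetical, used only for comparison
}

for name, (lines, progressive) in formats.items():
    per_update = new_lines_per_update(lines, progressive)
    print(f"{name:>5}: {per_update:4d} lines per 1/60 s, "
          f"{per_update * UPDATES_PER_SECOND:6d} lines per second")

# 720p and a hypothetical 1440i both deliver 720 new lines each 1/60 s;
# 1080i delivers 540. That is the entire basis of the "equivalent" claim.
```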

I guess that is why people are interested in 720P, and since ABC is broadcasting in 720P, I, for one, would like to 1) see a demo of 720P displayed natively, and 2) see RPTVs with 720P capability brought back.

Artie
 

matt-f

Second Unit
Joined
Aug 8, 2003
Messages
267
Arthur,

The info regarding 720P being equivalent to 1440I is sort of right, but I would have compared it the other way: 1080i would be more like 540p. It's just theory, though; in practice it depends on how the conversion is done, and the algorithm used will determine which is really better overall.

There is a lot of technical stuff regarding this and I'm not sure about the full details, but the ISF pros have the knowledge to explain it correctly.

matt
 

Scott L

Senior HTF Member
Joined
Feb 29, 2000
Messages
4,457
720p is definitely not the equivalent of 1440i.

Am I the only one who sees interlacing artifacts at 1080i? I don't see why they didn't just settle on 1080p for everything. Bandwidth/hardware concerns?
 

John S

Senior HTF Member
Joined
Nov 4, 2003
Messages
5,460
Hmm well you said natively... Ain't going to happen.

All of them will be upconverted. This has to do with how CRT technology works. The scan rates needed for 720p are almost impossible; if/when it is done, it will be expensive, real expensive.
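For a rough sense of why those scan rates are the sticking point, the sketch below multiplies total lines per frame (active plus blanking, the usual figures) by frames per second to get the horizontal deflection rate a CRT has to sustain; frame rates are rounded to whole numbers for simplicity.

```python
# Rough horizontal scan-rate arithmetic for a CRT.
# Total line counts include blanking; frame rates rounded to integers.

formats = {
    # name: (total lines per frame, frames per second)
    "480i":  (525,  30),
    "480p":  (525,  60),
    "1080i": (1125, 30),
    "720p":  (750,  60),
}

for name, (total_lines, frames_per_second) in formats.items():
    h_scan_khz = total_lines * frames_per_second / 1000.0
    print(f"{name:>5}: ~{h_scan_khz:5.2f} kHz horizontal scan rate")

# ~15.75 kHz (480i), ~31.5 kHz (480p), ~33.75 kHz (1080i), but ~45 kHz
# for 720p -- a deflection circuit built for 33.75 kHz can't simply be
# asked to sweep a third faster for free.
```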


I have to admit, I am not one who sees interlacing artifacts at 1080i...

And in recent experiments, 480p, 720p, and 1080i are a ton closer to each other than most want to admit, even myself before the experiments.

To the eye at viewing distance, in a blind test, it will take most everybody to task to tell the difference, given equal source, equal up/down conversion, and equal output.

I actually lost a bet on this one recently. Before, I really thought I could nail the differences; not so.
 

matt-f

Second Unit
Joined
Aug 8, 2003
Messages
267
Scott,

Yup, I second that. I originally said it was not true, then changed it. There was an article in one of my posts which said 720p is equivalent to 1440i (2 × 1/60th at 720p), which is not correct either, since it's just theory.

They should have made 540p instead of 1080i?
 

John S

Senior HTF Member
Joined
Nov 4, 2003
Messages
5,460
Hmm, well matt, at 1080i you really do have 1080 lines of real data/resolution. So it should win on that note alone.

But to the eye, even good 480p at viewing distance is really quite excellent.
 

matt-f

Second Unit
Joined
Aug 8, 2003
Messages
267
John,

The thing I see is that half the lines are shown every 1/60th of a second because of the way interlacing works. In terms of a full frame, it would be 1/30th. Compared to that, progressive draws a full frame every 1/60th.

matt
 

John S

Senior HTF Member
Joined
Nov 4, 2003
Messages
5,460
Compared to 1080p, yes.... That res is just now starting to hit FP. I have yet to see it anywhere else.

I can't imagine how long it will be before that sort of source may actually be available. I guess with an HTPC, maybe it is available now? Some scalers perhaps?
 

matt-f

Second Unit
Joined
Aug 8, 2003
Messages
267
It's true in that sense, since you're comparing the same resolution. What I'm saying in general is that interlaced implies it's really half the number it says.

I think of it this way: 720p vs. 1080i at an instant in time. Every 1/60th of a second, 720 lines are drawn for 720p, while 540 are drawn for 1080i. In that sense 720p should have the advantage.

Technically, progressive is around 1.5 (not 2) times better than interlaced at the same resolution, according to the ISF folks. I don't know the details, but they know the inner workings behind that number.
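If it helps, that ~1.5x figure is often explained with an "interlace factor" somewhere around 0.7; that exact number is an assumption used here purely for illustration, not something established in this thread. The sketch just shows how such a factor turns 1080 interlaced lines into roughly 720p-class effective resolution for moving material.

```python
# Where a "progressive is ~1.5x interlaced" figure can come from.
# The 0.7 interlace factor is a commonly quoted rule of thumb and is
# assumed here purely for illustration.

INTERLACE_FACTOR = 0.7  # assumed loss from field splitting / twitter filtering

def effective_vertical_resolution(active_lines, progressive):
    """Very rough 'perceived' vertical resolution for moving material."""
    return active_lines if progressive else active_lines * INTERLACE_FACTOR

print("1080i ->", effective_vertical_resolution(1080, progressive=False))  # ~756
print(" 720p ->", effective_vertical_resolution(720, progressive=True))    # 720
print("p/i ratio at equal line count:", round(1 / INTERLACE_FACTOR, 2))    # ~1.43
```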
 

Leo Kerr

Screenwriter
Joined
May 10, 1999
Messages
1,698
Don't forget that typically, interlaced sources are vertically filtered (read: blurred) so as to mask interlace twitter... the so-called 'Kell factor.' If I remember rightly, this reduces the vertical resolution of 1080i to roughly 900 lines of 'real' information.

I do not believe that 1080sF formats (segmented frame) are Kell-filtered... they're supposed to be a way of delivering 1080 progressive content through 1080i hardware (switchers and the like.)

It'd be nice, though, if the 'i' just went away forever, and the whole 29.97/30, 59.94/60, 23.976/24 frame-clock mess went with it...

(Quick summary for those who don't know: early black and white television ran at a true 30 fps, 60 fields/second. With the addition of color, they added the color burst to the signal. To avoid having the color burst introduce a nasty harmonic into the audio stream (among some other issues), they slowed the frame/field rate down to 29.97 or 59.94. The other option would have been to modify the existing television sets with a single capacitor, I believe; in any case, the existing sets could have been retrofitted by any competent person with a soldering iron in less than half an hour. Transitioning to 29.97 screwed up a lot of things, though, and resulted in 'drop frame' and 'non-drop frame' time codes; one will run with/like a clock, the other 'drops' a frame every so often, making it a challenge when you're editing video of mixed heritage. For some reason, these odd time rates made it into the infamous Table 6 of the FCC's adopted HDTV proposal, along with the other dozen or so formats describing 18 different flavors of digital television.)
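The 29.97 business in that summary can be checked with a few lines of arithmetic; the sketch below uses the standard 1000/1001 slowdown factor and the usual drop-frame rule (skip frame numbers 00 and 01 at the start of every minute except each tenth minute).

```python
# Working the numbers behind 29.97 and drop-frame timecode.
from fractions import Fraction

frame_rate = 30 * Fraction(1000, 1001)   # exactly 30000/1001 fps
print(float(frame_rate))                 # 29.97002997...

# In one hour of wall-clock time:
actual_frames = frame_rate * 3600        # ~107892.1 frames actually shown
ndf_labels = 30 * 3600                   # 108000 non-drop-frame labels
print(float(actual_frames), ndf_labels)  # NDF timecode drifts ~3.6 s/hour

# Drop-frame timecode skips frame *numbers* 00 and 01 at the start of every
# minute except each tenth minute: 2 * (60 - 6) = 108 labels dropped per hour.
df_labels = ndf_labels - 2 * (60 - 6)
print(df_labels)                         # 107892, back in step with the clock
```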

(now what I need is a feature that I can mark that block and make it so the type starts out at normal size and degenerates to about the size of '....' at the end ;) )

Leo Kerr
 

John S

Senior HTF Member
Joined
Nov 4, 2003
Messages
5,460
Well, the difference is the fact that with 1080i there are really 1080 lines of real resolution, not just 540 being doubled.

I mean, to me that probably offsets any difference and advantage 720p might give you. I couldn't tell one from the other whatsoever. I'd love to do some double-blind testing to see who actually could.
 

Leo Kerr

Screenwriter
Joined
May 10, 1999
Messages
1,698
There are a lot of things you can do that will 'break' any interlaced system.

Okay, 'any' may be a bit strong, but it holds until you go to, say, 72+ interlaced frames/second.

There have been various agencies conducting detailed tests on all the I vs. P issues they could imagine... and I'm using 'agencies' loosely here; they're not all government. Some are industry and standards organizations.

If one wants to be just a little over-the-top about the whole affair, there are two reasons why there's any interlace in HD at all:

1. Sony Broadcast (not Sony Consumer)
2. NHK

Leo
 

matt-f

Second Unit
Joined
Aug 8, 2003
Messages
267
John, yup, that's right.
I'm still trying to think about the 540 lines: even though they're not doubled, there would be gaps where the other 540 lines go, because it's alternating. I would assume that 540p would be better than 1080i?

A blind test would be a neat idea! Hopefully the assumption is that the feed is consistent.
 

John S

Senior HTF Member
Joined
Nov 4, 2003
Messages
5,460
A friend and I recently did a major double-blind test on ourselves. From 480p on up, we really could not tell.

We did this on my display and his display, using 1080i HDTV broadcasts as our source. We used first half vs. second half of CBS NFL games for the test on three consecutive Sundays.

He was of the opinion that we could not tell which was which; I was of the opinion that of course you can tell.

I lost hands down. This was ED (480p), 720p, and 1080i used in the tests on the output side. I even lost money to him on the bet. He is much more of a videophile than I am; I am much more of an audiophile than he is.

My viewing distance is 10' to 12', his is 8' to 10'... I now admit I sure can't tell an "i" from a "p" on the output side. That the source seems to be more important than the output is what I took away from the test.
 

Allan Jayne

Senior HTF Member
Joined
Nov 1, 1998
Messages
2,405
We might note that CRT computer monitors operate at 480p, 600p, 768p and often 1024p, and don't seem to be particularly expensive.

In terms of subject matter, 1080i and 540p are different: 1080 unique scan lines for stationary subject matter vs. 540, every 1/30th of a second. In terms of transmission format, 1080i and 540p are essentially the same: 1125 total scan lines including the retrace interval for 1080i versus either 1124 or 1126 (I don't know which) for 540p. Either can be displayed on the picture tube as 1080i (occupying 1080 unique positions on the tube) or as 540p (occupying 540 unique positions on the tube).

Depending on the fatness of the scan lines, you may or may not see the difference between a 1080i display and a 540p display with 1080i source material.

The same argument applies for 720p vs. 1440i, although no 1440-scan-line program material exists.
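As a tiny counting sketch of that point (Python, assuming a 60 Hz field/frame rate, with 1440i hypothetical): over one full frame period, an interlaced display addresses twice as many distinct line positions as its half-rate progressive counterpart, which is the stationary-subject-matter distinction described above.

```python
# Counting distinct scan-line positions addressed in one frame period (1/30 s),
# assuming a 60 Hz field/frame rate.

FIELDS_PER_FRAME = 2

def unique_positions(lines_per_pass, interlaced):
    """Distinct line positions lit during one full frame period."""
    # Interlaced: the two fields land on different positions and interleave.
    # Progressive at double rate: the second pass repaints the same positions.
    return lines_per_pass * FIELDS_PER_FRAME if interlaced else lines_per_pass

print("1080i:", unique_positions(540, interlaced=True))    # 1080
print(" 540p:", unique_positions(540, interlaced=False))   #  540
print("1440i:", unique_positions(720, interlaced=True))    # 1440 (hypothetical)
print(" 720p:", unique_positions(720, interlaced=False))   #  720

# For stationary material the interlaced display resolves more positions;
# for material that changes every field, each 1/60 s update is the same
# number of lines either way.
```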

If only they did not do vertical filtering (done voluntarily at production time) on DVDs (and regular broadcasts, VCR tapes, and HDTV broadcasts), de-interlacers would give a better picture.

Video hints:
http://members.aol.com/ajaynejr/video.htm
 

John S

Senior HTF Member
Joined
Nov 4, 2003
Messages
5,460
Allan, at any sort of size, computer monitors are still quite expensive, in my experience with them anyway.
 

Leo Kerr

Screenwriter
Joined
May 10, 1999
Messages
1,698
John,

What sort of display was it?

Second, you were checking I versus P display of I sourced images.

The people I know doing I vs P testing have been doing it with I and P cameras. That is where the fundamental difference comes from.

Unfortunately for the average public, it is very difficult to find I vs. P source material... ABC was doing a lot of demonstrations a few years back, as were Microsoft and a few others. Agency-wide, all HD work and materials for the US Department of Defense are to be progressive, because there are significant differences.

Leo Kerr
 
