I just read an article *somewhere* (maybe in one of the Sony Qualia set reviews) where they said that most LCoS RP displays use 3 chips.
I think the more expensive DLP sets use 3 chips. I know that the JVC D-ILA sets only use 3 chip engines. And due to the slow response times for LCD, I think all LCD RP sets have to use 3 chips. I think it's only the cheaper DLP sets that don't.
I just got a free issue of E-Gear (or something). It confirmed what I had heard elsewhere:
All LCD RP displays are 3 chip.
Most LCoS RP's are 3 chip.
It's really only DLP where there's a significant portion of 1-chip systems out there. And, you might not want to know this, but here's how to look for the rainbow effect in these displays: while watching any content (low amounts of action might be best), let your gaze drift across the screen. If it's there, you will see it. Be careful though; for people who know how to look for it, it becomes very irritating over time. Kind of like the chroma bug: once you know what to look for, that's all you see, and you quickly become disgusted with it.
Am I the only one not fazed by this excitement over 1080P?
When 720P is fully implemented, then let's talk about moving forward. As it is, I have a decent lineup of HD stations from my cable, and sometimes they actually play HD content. I have the "demo" channel that endlessly plays the same content over and over to demonstrate how GOOD HDTV can look, and boy, it looks incredible. But HD episodes of 24 simply don't look nearly as good as the HD "demo" station. I think segments of the industry are too interested in pushing the limits of the technology, which is hardly an admirable feat. What would really impress me is an industry that stands behind existing standards (i.e. 720P, 1080i) and really makes them the best they can be.
How long will it be before we get stations broadcasting in 1080P? The bandwidth requirements are nearly double. When/if cable/Sat TV ever adopts a single 1080P station, will they still use the same old long-in-the-tooth MPEG2? MPEG2 compression at 19.4 Mbps throughput does a flawed job on 720P; often it's downright horrible looking. Ever watch the HD-PPV presentation of Jet Li's Hero? The quick movements and bright colours should look incredible in HD, but it suffers from such severe macroblocking that I wanted my money back. I couldn't even watch it.
Now, I'm not looking to start a debate as to whether or not MPEG2 is up to the task. I understand they can get a lot of mileage out of it by implementing it correctly, and they might even be able to overcome the macroblocking problems in 720P. But I hope they have a better plan for 1080P broadcasting. I am positive that such a dialogue has not even opened up at any of the corporations that provide video images for any reason: TV networks, movie studios, cable and satellite providers, etc. I don't see how 1080P is even on the radar with the folks who really count.
Are we sure that Blu-Ray and HDDVD will be providing a 1080P signal to TVs with a 1080P native resolution? Can you simply de-interlace a 1080i video signal and presto… 1080P? I don't think it's that simple; otherwise the bandwidth required for 1080i would be so much more than for 720P. I sincerely don't know how this is going to work. Are networks going to invest in and start shooting shows with new HD cameras that can do 1080P? How long will it take for studios to start buying these cameras? Don't hold your breath for studios to spend money on yet another new video standard when it took, how long(?), for them to start shooting in 720P. Will studios cheat and just up-sample 720P and film to 1080P for some HD-DVD releases? Somehow, even if the "industry" claims to stand behind the whole 1080P standard, I'm not going to hold my breath. Here it is 2005, the eve of the FCC's shutoff of SDTV, and DTV (720P/1080i) still has a long road before it. I'd be more interested in what cable providers, film studios and TV networks have to say about adopting 1080P than in what Sony, TI or Samsung have to say about it. Hell, I'm still waiting for them to adopt HD.
^^ You raise a hell of a lot of good points, Wayde.
I'm one of the most notorious fence sitters you're ever going to run across, so that's just the kind of material that runs through my mind when it comes to something like this.
Would it help you to know that the manufacturing and distribution costs of a front projector are less than 1/3 those of an equivalent RPTV? Think about it. An RPTV is a front projector, except it also needs a very expensive array of short-throw lenses and mirrors. It also needs more electronics, including speakers, a tuner, and a more elaborate power supply. An RPTV also needs a great deal more hardware, including a large cabinet.
There are two significant reasons why front projectors have traditionally sold for more than their equivalent rear-projection sets: the smaller number of units sold, and marketing that implies a front projector is a superior product and thus must cost more.
If sales of front projectors continue to grow at their current rate, then we will definitely see 1080p FPs selling for under $5,000 well before the end of the decade. After all, there are already 1080p RPTVs priced below $5,000 today! This means that if they could sell as many FPs, a manufacturer could make just as much profit selling a 1080p FP for under $2,000. It does make you think about what is possible, and perhaps even feel bitter about the relatively higher cost of FPs.
The more popular front projectors get, the cheaper they will become so start spreading the word.
Actually, it is quite easy to say why, and I believe I already mentioned that one of the primary reasons is that RPTVs sell in greater quantity, so the margin can be significantly smaller; but perception is currently playing a larger role in the elevated retail pricing of front projectors.
My work allows me access to a great deal of cost analysis data on both front and rear projection units, and on average RPTVs using similar projection technology cost over three times as much to the manufacturer as a similar front projector – for reasons I have already discussed.
To me, the question isn't so much when we will get 1080p sources, but simply when we will get 1080p (with proper upscaling) at the display.
For example, we all know and love (to some degree) DVD. DVD is 480p, but it's encoded on the disc as 480i. So to me, the importance of a 1080p *display* is to be able to deinterlace 1080i sources up to 1080p.
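For what it's worth, the simplest form of this is "weave" deinterlacing: for film-sourced material, two consecutive fields came from the same instant in time, so interleaving their scanlines reconstructs the full progressive frame. Here's a toy sketch (not a real video pipeline; the `weave` helper and the string "scanlines" are just illustrative):

```python
# Toy sketch of "weave" deinterlacing for film-sourced content:
# interleave the lines of a top field and a bottom field to rebuild
# one progressive frame. Sizes here model 480i -> 480p.

def weave(top_field, bottom_field):
    """Interleave two fields (lists of scanlines) into a progressive frame."""
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        frame.append(top_line)     # even scanline from the top field
        frame.append(bottom_line)  # odd scanline from the bottom field
    return frame

# 480i: each field carries 240 of the frame's 480 lines
top = [f"even line {2 * i}" for i in range(240)]
bottom = [f"odd line {2 * i + 1}" for i in range(240)]

frame = weave(top, bottom)
print(len(frame))  # 480 progressive scanlines
```

Of course this only works cleanly when both fields describe the same moment; video-sourced 1080i with motion between fields needs smarter (motion-adaptive) deinterlacing.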
I'm going to quote some information directly from a forumer off of AVS out of the monster AVS Samsung 2005 thread because I thought it was really fascinating and I wouldn't dare to try to re-write it myself.
This is about the various TI chips being used in these DLP displays.
Interesting:
"Samsung Electronics and Microsoft Announce Revolutionary HDTV Alliance on Next-Generation Xbox Video Game Console"
I guess I need to understand more about the resolution vs data encoded into the images. I sincerely seek knowledge and am not trying to bad mouth anyone's 1080P aspirations. I took this position a week ago on my blog when TI announced ramping up production and I retain an unimpressed attitude.
There is "upsampling"; HD channels do this every time you see an old broadcast carried on a 1080i channel. It's just 480i upsampled, and it still looks like crap.
Any 480i source, including DVD, can be deinterlaced and look more beautiful for it. But there is no more information present; it's just deinterlacing the existing 480i information. That much makes sense. Now, is there such a thing as a native 480P source that presents twice the data of 480i? That's what my Xbox games running at 480P give me, from what I understand. Is there an appreciable difference between that and deinterlaced 480i?
Why are 720P and 1080i considered comparable in throughput (i.e., the quantity of data being displayed at a time)? It's roughly the same quantity of data being viewed at one time. So is a 1080i signal "de-interlaced" to 1080P really going to look any better, even if the data requirements haven't doubled?
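The back-of-the-envelope arithmetic behind that "comparable throughput" claim is easy to sketch. These are raw, uncompressed active-pixel rates; actual broadcast bandwidth depends on MPEG-2 compression, which isn't modeled here:

```python
# Raw pixel-throughput arithmetic for the formats discussed above.
# An interlaced signal only delivers half the frame's lines per pass.

def pixel_rate(width, height, rate, interlaced):
    lines_per_pass = height // 2 if interlaced else height
    return width * lines_per_pass * rate

p720 = pixel_rate(1280, 720, 60, interlaced=False)    # 720p60
i1080 = pixel_rate(1920, 1080, 60, interlaced=True)   # 1080i (60 fields/s)
p1080 = pixel_rate(1920, 1080, 60, interlaced=False)  # 1080p60

print(f"720p60:  {p720:>12,} px/s")   # 55,296,000
print(f"1080i60: {i1080:>12,} px/s")  # 62,208,000 -- in 720p's ballpark
print(f"1080p60: {p1080:>12,} px/s")  # 124,416,000 -- exactly double 1080i
```

So 720p60 and 1080i land within about 12% of each other, while true 1080p60 really does double the 1080i pixel rate, which is where the "nearly double the bandwidth" worry comes from.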
At the source/camera level it would be more data, because it is not just de-interlaced 1080i.
But at the display level, I'll bank on 1080p from 1080i looking better. I have not seen one personally, though. And I would also assume this is relatively easy to do, unlike with a DVD, where things like 3:2 pulldown and bad-edit detection and so on have to be handled as well. It would seem to be just a simple line double.
Lots of pixels on those displays; even the SDE (screen-door effect) would have to be nonexistent, since the pixel structure would be so fine compared to even a 720p display panel.
But I digress... 480p panels seem good enough to me even, I think I am easily pleased.
I just can't help my skepticism. Where you may see an exciting new video format, I see potential fragmentation of the market, confusion over the true definition of HD, and more fickleness about adopting the additional expenses of HD recording, broadcast and transmission equipment.
At best, 1080P is simply ignored by the industry. At worst it delays the DTVization of North America.
My biggest fear is that buzz around 1080P might cause some studio or network to delay buying that one digital HD camera in anticipation of yet another standard. Does 1080P today mean 2160i tomorrow?
I want to see the breakthroughs in market adoption and consumer acceptance. I want to hear that digital TVs outnumber SDTVs in America's living rooms. I want to hear about HDTV commercials (a barometer of the health of the DTV revolution). I'm bored with technological breakthroughs already. :frowning:
I bet within 3 years, we won't be seeing a lot of displays that *won't* do 1080p.
Here's my reference for i vs p:
I'm watching a football game in 480i. The QB throws a bomb down the field. As the camera pans across the field to follow the ball, you can see the jaggedness of the yard lines as they move laterally across the screen. *That* is what interlaced video gives you. With 480p? No jaggedness whatsoever. Smooth scrolling. The same difference exists for 1080i vs 1080p.
Shoot, during the "early" days of HDTV, I came across a couple of very well written arguments for 720p over 1080i. And the bottom line of all those views was that interlaced *anything* is a poor alternative to good progressive.
This was just updated in the first post. Most notably, the 88 series is now listed with completely equal hardware across the board, like the 68 and 78 series.
Amazingly, they're still sticking to touting an ungodly 10,000:1 contrast ratio! Can you imagine if they really pull that off?! 5,000:1 isn't anything to sniff at, let alone twice that!