1080i question

Curtis T

Grip
Joined
Jan 24, 2002
Messages
23
I'm kind of new to the whole home theater game and I'm having a difficult time figuring out exactly what is happening.

The main thing I don't really understand is why 1080i is being so widely used as a form of high definition. The interlacing idea was based on the old analog NTSC system because of limitations in bandwidth and the electronics of that day. But we are now in a period where film and video are both captured progressively, both can be transmitted or output progressively, and both can be displayed progressively.

I can understand that 1080i could be a temporary option because many people still have analog televisions and TV stations are still broadcasting analog signals (not for much longer, though), so interlacing is still a part of our lives for a little while. However, as soon as the full conversion to digital, and in particular to high definition, is over, wouldn't it make sense to have every source output 480p, 720p, and eventually 1080p?

The DVD format was 480i, and I always thought that was because the majority of the population was still using interlaced technology. Then came progressive scan (480p), which was for the part of the population that could display a progressive image. Since the DVD format is about to make the jump to the HD-DVD format, my thinking would be that Blu-ray and HD-DVD should be either 1080p or 720p. My understanding is that the studio masters are 1080p, so I could see why Hollywood may not want us to have 1080p due to piracy concerns. But then it would make sense to give us 720p. 1080i for HD-DVD makes no sense to me because interlacing is a thing of the past. I understand that a 1080i image might be a bit better than a 720p image, but 1080p is on the way anyway (for displays, at least), which would be better than both 720p AND 1080i. In this age, why capture something progressively, then interlace it, only to have it displayed progressively again? It makes no sense to me.
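
Just to show what I mean by that round trip, here is a rough sketch I put together (a toy model, not any real encoder, where a frame is just a list of rows): split a progressive frame into two fields, then weave them back together. When both fields come from the same instant, which is exactly the case with a progressive source, the detour through interlacing buys nothing.

```python
def interlace(frame):
    """Split a progressive frame (a list of rows) into two fields."""
    top_field = frame[0::2]       # even-numbered rows
    bottom_field = frame[1::2]    # odd-numbered rows
    return top_field, bottom_field

def weave(top_field, bottom_field):
    """Recombine two fields into a progressive frame."""
    frame = []
    for top_row, bottom_row in zip(top_field, bottom_field):
        frame.append(top_row)
        frame.append(bottom_row)
    return frame

# 1080 rows captured progressively, interlaced for transmission,
# then deinterlaced again at the display. The round trip is only
# lossless because nothing moved between the two fields.
frame = [[y] * 1920 for y in range(1080)]
top, bottom = interlace(frame)
assert weave(top, bottom) == frame
```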

With my way of thinking (which might be completely insane, by the way), television broadcasts / digital cable / digital satellite would be in 720p. (I understand that 1080p is not feasible in this area due to bandwidth limitations.)

HD-DVD/Blu-ray would be either 1080p or 720p, depending on what Hollywood wants to give us and how concerned they are about piracy.

Basically, 1080p would be the best option, with 720p being a more-than-suitable alternative. 1080i, and interlacing in general, would not be necessary.

Sorry for the long post, but I would appreciate it if someone could steer me in the right direction about where we are heading in the future with all this stuff.

Bottom line is: 1080i might be a high-def standard, but it makes no sense to me why it would be considered the future.
 

John S

Senior HTF Member
Joined
Nov 4, 2003
Messages
5,460
There are significant price limitations right now on supporting 1080p on the display side. But aside from that, 1080i is 1080 real lines of resolution. 480p is not even close, and I don't even agree that 720p is better, myself....

But actually displaying 1080p is still tough to do on the display side, period.

1080p is one of the designated HD formats, though. It would not surprise me at all if, not too long from now, you could get an HD progressive-scan DVD player that outputs 1080p from the 1080i / 720p that is on the disc itself. It will eventually be all the rage, I am sure of it.

So, are you buying some of the first 1080p displays as they come out, or what???
 

Curtis T

Grip
Joined
Jan 24, 2002
Messages
23
No, not at $17,000. But within a couple of years, maybe, when they are in the $5k-$10k range.

My basic question is why there will still be a need for interlacing in the future, when everything will be progressive scan in nature.
 

Leo Kerr

Screenwriter
Joined
May 10, 1999
Messages
1,698
Okay, if you want the cynical/conspiracy theory version of all the 'i' formats, here it is:

Corporate Japan spent a couple hundred billion Yen on Hi-Vision, a 1035i format.

Along comes the process known as the US "Grand Alliance" to figure out what HD will be.

Along comes Corporate Japan, looking for a market for in excess of US$5,000,000,000 in hardware sitting in warehouses.

Sony proceeds to buy the corporate souls of CBS and the David Sarnoff Corporation, and makes a stab at NBC and PBS.

Other equipment manufacturers, however, look to different markets, and when the US DoD issues a policy statement indicating that all future (non-NTSC) equipment or production done on behalf of the DoD must be in a 'p' format, Japan sees two near-immediate defectors: Panasonic and JVC. In Europe, Thompson/Philips jumps up and waves the 'p' flag.

It was kind of interesting watching the i vs. p wars at trade shows... and it led to a very interesting discovery:

One of the very best i/p comparison/test charts costs US$20, in a world where good camera test charts often go for hundreds of dollars each and need to be replaced on a regular schedule.

A US $20 bill will 'break' an interlaced system. John, you will discover that in a high resolution scene, your 1080i is not actually delivering 1080 lines of useful resolution.

Of course, when Sony started seeing that the 'i' format was a lost cause with the US Government (many other agencies have since signed on), and that Hollywood wanted 'p', they came out with their first Cine-Alta camera.

I handled it, the first day of NAB... 2000? I don't remember the year.

It was a piece of trash, and if you panned the camera, the image would break. By the end of Day 1 of a four-day trade show, Sony had the camera surrounded by their staffers: even though the camera was 'on display,' no visitors were really encouraged to touch it. Several friends of mine were forcibly evicted from the Sony booth (about 15,000 square feet) for having panned the new flagship camera!

In a perverse way, it was rather funny. It was a revealing trip... especially since Thompson/Philips/Polaroid was showing off their LDK-6000 720p camera a few hundred feet away; a very nice camera, too, even if it couldn't see in the dark.

Anyway, as you might have noticed, I got distracted from the original question.

Anyway, another thing of interest to note is that Sony is not a monolithic corporation.

Sony Broadcast was dragged kicking and screaming into the Progressive era. Sony Consumer, on the other hand, went progressive just to spite Sony Broadcast. I'm not sure where Sony Industrial went; I think they went 'buyer's choice,' and'd sell anything to anyone.

(aside: the "and'd" above is a non-standard contraction that I'm trying out, combining "and" and "would". Comments?)

Anyway, I'd better cut this off before I keep rambling on for the rest of the night.

Leo
 

Brad E

Second Unit
Joined
Jan 11, 2004
Messages
304
I read over the and'd and didn't even notice it until I read the aside note.
You're on to something there.
 

Vince Maskeeper

Senior HTF Member
Joined
Jan 18, 1999
Messages
6,500
My thoughts:

These standards aren't anywhere near as new as the technology that is bringing them. Heck, most of the basic outline for the technical standards for "high definition" television started largely in the 1980s and was only really adapted into its current form when much of the technology we're using now was still theory or in an infant stage.

I think the current standards are really just an optimistic projection that took root more than a decade ago. In a time when CRT-based displays were the only game in town, and the only affordable game on the horizon-- I don't think it's too surprising to see "realistic" concepts in terms of sync rates-- especially considering the cost and availability of the hardware needed to do this stuff, heck, even 5 years ago.


So, I guess it's like this: if you try to draw up a standard for any technology-- right now, today-- that won't be finalized for 6-8 years and won't be implemented for another 5... I would imagine there would be some elements to it that might seem antiquated by the time it becomes invoked. Even as the HD standards were being implemented, the concept of a 1080p display device in the CRT realm was on par with a Ferrari-- you dreamed of owning one, you knew that some existed out there somewhere, but God knows you'd probably never be able to afford one.

Even 720p was really pushing it at a 45 kHz horizontal scan rate for the equipment available at the time! 1080i only uses a 33.75 kHz horizontal scan rate on analog devices-- and that was still a serious pickle. IIRC the lower-end display projector from Sony in the mid-90s was the 1252 (I owned one years later)-- it could take sync rates this high, but it cost $20,000 new!!
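
(For reference, those scan rates fall straight out of the nominal total line counts and frame rates-- a quick back-of-the-envelope:)

```python
# Nominal total line counts (active lines plus blanking) and frame rates
lines_720p, frames_720p = 750, 60     # 720p: 60 complete frames per second
lines_1080i, frames_1080i = 1125, 30  # 1080i: 60 fields/s, i.e. 30 complete frames per second

h_scan_720p = lines_720p * frames_720p      # 45,000 lines/s = 45 kHz
h_scan_1080i = lines_1080i * frames_1080i   # 33,750 lines/s = 33.75 kHz
```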

And these were "4:3" tubes, so when you measured in the V-sync as well, you actually still had trouble fully resolving an HD feed. Again, these things cost $20 grand!


So, anyway, I just think the reality is that things have changed a lot faster than you think they have. 5-6 years back we were debating whether any of the consumer CRT projection sets could really resolve 720p correctly; now we're talking about 1080p LCD devices. What a short, strange trip it's been.
 

John S

Senior HTF Member
Joined
Nov 4, 2003
Messages
5,460
Just keep in mind that the camera side and the display side of the resolutions are still pretty unrelated to each other.
 

Allan Jayne

Senior HTF Member
Joined
Nov 1, 1998
Messages
2,405
The existing transmission standards for U.S. HDTV will carry 1080p film source (24 complete frames per second).

Probably not too long from now, film source transmitted as 1080i (with 3-2 pulldown) and 1080p 30 fps video source transmitted as 1080i could be perfectly converted back to 1080p at reasonable cost.
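
(Roughly how that 3-2 cadence works, in a simplified sketch that ignores the top/bottom field interleaving: 24 film frames per second become 60 fields per second by alternating 3 fields and 2 fields per frame, and as long as the cadence is tracked, the original frames come back out untouched.)

```python
def telecine_32(frames):
    """Spread film frames across fields using a 3-2 cadence (24 fps -> 60 fields/s)."""
    fields = []
    for i, frame in enumerate(frames):
        fields.extend([frame] * (3 if i % 2 == 0 else 2))  # A gets 3 fields, B gets 2, ...
    return fields

def inverse_telecine_32(fields):
    """Recover the original film frames by walking the known 3-2 cadence."""
    frames, i = [], 0
    while i < len(fields):
        frames.append(fields[i])
        i += 3 if len(frames) % 2 == 1 else 2
    return frames

film = ["A", "B", "C", "D"]    # 4 film frames -> 10 fields (the 24-to-60 ratio)
fields = telecine_32(film)     # ['A','A','A','B','B','C','C','C','D','D']
assert inverse_telecine_32(fields) == film
```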

Both the existing 1080i and the existing 720p have the same bandwidth requirement (37 MHz) when transmitted as analog baseband video with retrace intervals. It is easier to skimp on the manufacture of a 1080i TV as opposed to a 720p TV. For the 1080i TV the scan rate is only slightly more than for 480p. Degradation of horizontal resolution as a result of skimping on the bandwidth is less obtrusive for 1080i than 720p.
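
(A rough pixel-rate comparison, ignoring blanking intervals, shows why the two formats land in the same ballpark:)

```python
rate_1080i = 1920 * 1080 * 30   # 30 complete frames/s -> 62,208,000 pixels/s
rate_720p  = 1280 * 720 * 60    # 60 complete frames/s -> 55,296,000 pixels/s
```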

Video hints:
http://members.aol.com/ajaynejr/video.htm
 

ChrisWiggles

Senior HTF Member
Joined
Aug 19, 2002
Messages
4,791
Cost.

1080p takes a heck of a lot of bandwidth and data, and also a lot of quality electronics in a CRT display to support those kinds of scan rates.

Even the current pinnacle CRT displays like the G90, 9500LC, etc. will soften at such high scan rates because the bandwidth necessary is pretty obscene. I don't even run 1080p on my CRT, an 8500.
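
(A rough sense of why: 1080p at 60 frames per second means scanning the same nominal 1125 total lines twice as often as 1080i does, so the horizontal scan rate, and roughly the video bandwidth, doubles.)

```python
total_lines = 1125               # nominal 1080-line raster including blanking
h_scan_1080i = total_lines * 30  # 33.75 kHz (60 fields = 30 complete frames per second)
h_scan_1080p = total_lines * 60  # 67.5 kHz -- double the scan rate, roughly double the bandwidth
```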
 
