
In layman's terms.. Can someone explain 1080p & 1080i??


This topic has been archived. This means that you cannot reply to this topic.
3 replies to this topic

#1 of 4   todd s

    Lead Actor



  • 6,935 posts
  • Join Date: Jul 08 1999

Posted February 12 2006 - 06:38 AM

I know they are the resolutions. But can someone give me a quick and simple rundown of what DVDs, TVs, and HD use?

Thanks!

ps- I have a Toshiba 65hdx82. I remember waiting an extra few months till it came out because it had a special connection that was going to be needed...DVI/HDCP input. I also have the Infocus 4805 projector. How will either Blu-ray or HD-DVD work on these sets...if they can?
Bring back John Doe! Or at least resolve the cliff-hanger with a 2hr movie or as an extra on a dvd release.

#2 of 4   ChristopherDAC

    Producer



  • 3,729 posts
  • Join Date: Feb 18 2004

Posted February 12 2006 - 09:30 AM

OK. Every television format around is transmitted by breaking up the image into horizontal lines, which are then broken down into individual elements, or "pixels", transmitted from left to right. This arrangement was settled upon in the 1930s when the television camera consisted of a photosensitive surface swept by an electron beam, periodically and in a definite arrangement, to produce an electrical signal. At the time, television could be viewed using a monochrome cathode-ray tube, or what we would now call an LCD projector consisting of a spotlight with a collimator lens, one Kerr cell the size of a pint jar, and two mirrored drums to sweep the resulting point of light across a movie screen.

There are only a few scanning arrangements used in the world today. In the United States, Canada, Japan, and some other countries, the picture consists of 525 lines; of these, several are not visible because they fall into the time occupied in sweeping the electron beam from the bottom of the screen back to the top to start a new scan pattern; they often contain signals directed to the TV receiver itself. The result is that about 483 are visible at any one time, and when the signal is transmitted digitally, this number is rounded off to 480. This type of scanning is usually done 30 times per second, for a total of 15 750 lines per second.

In many other countries, the picture is scanned using 625 lines at 25 scans per second, for a total of 15 625 lines per second; 576 lines are visible in the picture. A new standard, developed principally by the Japanese, proposed in the late 1970s, and now winning widespread acceptance, scans the picture with 1125 lines 30 times per second, which makes 33 750 lines per second. This format originally had 1035 visible lines, but due to complaints from the computer industry it was altered to have 1080 [which results in performance problems with some equipment]; in some countries, the same picture format is used but scanning takes place at 25 frames, 28 125 lines, per second.
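If it helps to see the arithmetic spelled out, here is a little Python sketch of the line-rate figures above; there is nothing in it beyond the multiplications already described.

# Line-rate arithmetic for the scanning standards described above.
systems = {
    "525 lines x 30 scans (US/Canada/Japan)": (525, 30),
    "625 lines x 25 scans (most of Europe)":  (625, 25),
    "1125 lines x 30 scans (HD)":             (1125, 30),
    "1125 lines x 25 scans (25-frame areas)": (1125, 25),
}
for name, (lines, scans) in systems.items():
    print(f"{name}: {lines * scans:,} lines per second")
# 525 x 30 = 15,750   625 x 25 = 15,625   1125 x 30 = 33,750   1125 x 25 = 28,125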

All of these systems are what is known as "interlaced scanning" systems. The human eye cannot detect jerkiness in motion which occurs faster than about 20 motions per second, so the "frame rates" — the frequencies at which complete pictures are drawn — for film and TV are set a little above that number. The problem is that the human eye can detect flicker at higher frequencies than this. In theatres, the film projectors use a special shutter which flashes the pictures on the screen at twice the rate at which they are changed by the film's motion.
In the 1930s and 1940s, when digital memory circuits were only a concept at Bell Labs, and a "calculator" was a professional with a slide rule, it was recognised that pictures drawn from top to bottom, continuously, at a reasonable frame rate would flicker unpleasantly, and that if they were drawn at a high enough frequency not to flicker, they would use up a great deal of bandwidth. The amount of bandwidth used by a TV picture is roughly the number of elements in a horizontal line, multiplied by the number of lines per second, divided by two. Since the frequency bands available to the technology of the day were very restricted, relatively narrow channels were allotted for TV service in order to have more stations.
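As a rough illustration of that rule of thumb, here is a small sketch; the pixels-per-line counts are only my own illustrative numbers, not official figures.

# Rule-of-thumb bandwidth estimate from the paragraph above:
#   bandwidth ~ (elements per line) x (lines per second) / 2
def video_bandwidth_hz(pixels_per_line, lines_per_frame, frames_per_second):
    lines_per_second = lines_per_frame * frames_per_second
    return pixels_per_line * lines_per_second / 2

print(video_bandwidth_hz(640, 525, 30) / 1e6)     # ~5.0 MHz for a 525-line picture
print(video_bandwidth_hz(1920, 1125, 30) / 1e6)   # ~32.4 MHz for the 1125-line HD format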
The solution to the problem of providing good resolution — plenty of horizontal lines, and pixels across the width of the screen spaced about as closely as the lines, to permit reproducing fine detail — within a narrow channel, while avoiding flicker, was what is called interlaced scanning. Bluntly put, instead of the whole 525, or 625, or 1125 lines being sent one after the other, in the order they appear down the screen from top to bottom, every other line is sent in 1/60 or 1/50 second, half the frame time, and then the remaining lines are filled in during the other half. There is some reduction in picture quality due to visual effects representing "interference" between the lines, but the picture is still better than what can be sent in the same bandwidth using non-interlaced, that is "progressive", scanning. All television standards up to this time have used this system, including the now-defunct British 405-line System A, the French 819-line system, German 455, Dutch 505, and so forth; all of them used odd numbers of lines, in order to get the interlace timing right with the timing circuits of the day.
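To picture the interleave concretely, here is a toy sketch with a made-up, tiny line count; real systems number and time their lines differently, this only shows the alternation.

# Toy illustration of interlaced scanning: one frame's lines split into two fields.
lines_per_frame = 9                                 # tiny stand-in for 525/625/1125 (odd, like the real systems)
field_1 = list(range(1, lines_per_frame + 1, 2))    # every other line, sent in the first half of the frame time
field_2 = list(range(2, lines_per_frame + 1, 2))    # the remaining lines, filled in during the second half
print(field_1)   # [1, 3, 5, 7, 9]
print(field_2)   # [2, 4, 6, 8]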

Now, however, we live in the computer age: much of the video we watch is not transmitted live, but rather made up on the spot. Because people sit closer to computer screens than TVs, and use them in higher-light environments, thus reducing their tolerance for flicker, the computer industry typically uses progressive scanning at rates of 60, 72, 75, or more frames per second. They can get away with this because they are just reading out information from a digital memory buffer, and the rate at which that information changes is not tied to the display's "refresh rate".
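A crude way to picture that decoupling, with rates that are only an example:

# Crude sketch: the display redraws from a frame buffer at its own refresh rate,
# regardless of how often new pictures are actually written into the buffer.
refresh_hz = 60    # how often the screen is redrawn (example figure)
content_hz = 24    # how often the content in the buffer changes (example figure)
for tick in range(10):                               # the first ten refreshes of a second
    content_frame = tick * content_hz // refresh_hz  # which buffered picture is on screen
    print(f"refresh {tick}: showing content frame {content_frame}")
# Several consecutive refreshes show the same picture; the display never waits for new content.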
When it became clear in the middle 1990s that American broadcasting was going to adopt a digital compression system, based on computer technology, to transmit "High Definition" television based on the Japanese 1125-line system, the computer companies began to complain that the new digital television was not friendly to their technologies, despite being closely related to them. They managed to force the creation of a new TV format, with 750 lines scanned 60 times per second, and not interlaced; this has 720 visible lines and is known as "720p" for progressive; they also managed to get the old 525 and 625 standards modified to allow progressive digital transmission, as "480p" and "576p", after which the older forms became known as "480i" and "576i" — i for interlace.

In order to obtain the highest possible picture quality, and recognising that video products today are really data-storage-and-interpretation machines, and that more and more television viewers are using "digital" or "flat-panel" displays which do not have the same physical properties as a CRT, and are most effective when scanned progressively, it has been decided that movie films can be stored on the new videodisc formats in digital files created from 1080-line digital images, one per film frame. This is sometimes known as "1080p" because the images, not having been broken into halves composed of alternate lines, can be described as progressive. Such storage eliminates some of the complications in converting film to video images at one of the frame rates TV systems use.
At the same time, these new displays can be refreshed progressively at 60 frames per second, and this is also known as "1080p". The new formats may be capable of outputting, through certain classes of connection, video data at a rate of 60 complete 1080-line pictures per second, and this is also known as "1080p" — although we may note that the 24-to-30 problem is not solved, only moved, when one of these systems is used with 1080p video encoding as explained above.
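For what it's worth, the usual way that 24-to-30 (or 24-to-60) mismatch gets handled is so-called 3:2 pulldown, where alternate film frames are held for three and then two video fields. A toy sketch, with frame labels that are only mine:

# "3:2 pulldown": alternate 24 fps film frames are repeated for 3, then 2, video fields,
# so 24 film frames fill the 60 fields (or refreshes) of one second of 30-frame video.
film_frames = ["A", "B", "C", "D"] * 6      # 24 film frames = one second of film
fields = []
for i, frame in enumerate(film_frames):
    fields.extend([frame] * (3 if i % 2 == 0 else 2))   # 3 fields, then 2, alternating
print(len(fields))    # 60
print(fields[:10])    # ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']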

To be specific, one really has to include the frame rates — "1080i30", or "1080p24", or "1080p60", or what have you.

Does that clear anything up?


#3 of 4   todd s

    Lead Actor



  • 6,935 posts
  • Join Date: Jul 08 1999

Posted February 12 2006 - 11:25 AM

Chris, Thanks for the info. It was very informative...Still a bit confusing. But, I have been shoveling snow all afternoon and my brain is a bit fried. Is there a preference as to what most videophiles prefer? Also, can you help with my TV set question?

Thanks again!
Bring back John Doe! Or at least resolve the cliff-hanger with a 2hr movie or as an extra on a dvd release.

#4 of 4   ChristopherDAC

    Producer



  • 3,729 posts
  • Join Date: Feb 18 2004

Posted February 12 2006 - 03:15 PM

If you have DVI with HDCP, you should be able to receive whatever resolution your unit is capable of accepting. It may only allow 720p or 1080i inputs, and not 1080p, which would in principle be considered the best.