
Why 720x480?

Ken Burkstrum

Stunt Coordinator
Joined
Dec 19, 2003
Messages
149
Why 720x480, which is a 1.5 aspect ratio? I thought it was always 1.33 or 1.78, so why this middle aspect ratio?

I've been getting confused lately. I thought for a while that all DVDs are released as 720x480 and scaled from there, but now that I really think about it, it doesn't make too much sense to me. I know 640x480 is a normal 4:3 resolution, but I never really hear that number unless it has to do with computers. Ugh, I'm confused; the DVD wiki isn't helping me out.
 

Ken Chan

Senior HTF Member
Joined
Apr 11, 1999
Messages
3,302
Real Name
Ken
Yes, almost all DVDs are 720x480. But those pixels can represent either a 4:3 or 16:9 frame -- and in neither case are those pixels square. They're always squeezed/stretched to fit.

As for why that size specifically, 480 is close to the actual number of visible scan lines on a TV, and 720 is close to the number of (non-square) pixels you get using the sampling rate that the TV-standards engineers decided on. The actual values were then rounded so that they are divisible by 16 -- MPEG encoding works on 16x16 blocks -- and (perhaps coincidentally) neatly 3:2 to each other. Perhaps more than you want to know here.

That's how they came up with the pixel dimensions for 4:3. Then they figured out a simple trick for using the same frame for 16:9 that would also work with 4:3 TVs. So the stored frame is the same size for both, and with square pixels it wouldn't match either aspect ratio exactly.
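That derivation can be sketched numerically. The timing and line-count values below are approximate assumptions for illustration, not exact spec figures:

```python
# Rough sketch of where 720x480 comes from.
# The analog timing values are approximations, used here only to show
# how rounding to multiples of 16 (MPEG's macroblock size) lands on 720x480.
SAMPLE_RATE_HZ = 13_500_000   # BT.601-style luma sampling rate
ACTIVE_LINE_US = 53.33        # approx. visible portion of one NTSC scan line
VISIBLE_LINES = 486           # approx. visible scan lines on an NTSC set

def round_to_multiple(n, m=16):
    # MPEG encodes in 16x16 macroblocks, so dimensions round to multiples of 16
    return round(n / m) * m

width = round_to_multiple(SAMPLE_RATE_HZ * ACTIVE_LINE_US / 1_000_000)
height = round_to_multiple(VISIBLE_LINES)
print(width, height, width / height)  # 720 480 1.5
```

The neat 3:2 ratio between the two rounded numbers falls out of the arithmetic, as Ken notes, perhaps coincidentally.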
 

Ken Burkstrum

Stunt Coordinator
Joined
Dec 19, 2003
Messages
149
So 720x480 DVDs are the ones that give me a choice between 4:3 and 16:9? Because why would they use 720x480 if they just want it 4:3, or if they just want it widescreen?

Let's say an original film has a million pixels. When they put it into widescreen, are they scrunching all 1 million pixels into that viewable area, or do 30-40% of the pixels lie in waste? Why do people call it 480p, 720p, and 1080p when really there are like 100 scanlines on top and bottom doing nothing?

On the computer, when I change the size of a video, or stretch it any way I want (the DivX player lets you turn anything into widescreen), is the computer guessing and filling in the extra pixel areas, or is it just stretching the existing pixels so that my screen ends up having more than one pixel displaying a pixel?
 

ChrisWiggles

Senior HTF Member
Joined
Aug 19, 2002
Messages
4,791
That's a lot of questions, and it's slightly confusing.

I don't understand your first two questions.

As for the second paragraph: it's downscaled to a lower resolution. It's called those resolutions because that's the source encode; that is the exact resolution encoded in the content in these digital formats. Scanlines are kind of moot here, since you're dealing with digital image formats and transmission, not really tied to analog at all.

Last paragraph: that's scaling up from the native resolution. The processing is basically interpolating new pixels, not just enlarging the existing ones, which accomplishes nothing except making the image bigger. Upscaling does provide image improvements, but it is no replacement for a higher native resolution, which is why DVD upscaled to HD is a lot better than DVD at its native resolution, but is not as good as a native HD format.
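The difference between merely enlarging pixels and interpolating new ones can be sketched on a single made-up row of pixel values (purely illustrative, not any real scaler's algorithm):

```python
# Sketch: "enlarging pixels" (nearest-neighbor) vs. interpolating new ones (linear).
row = [10, 40, 90, 20]  # one hypothetical row of pixel brightness values

def nearest(row, factor):
    # just repeats each pixel: the image gets bigger but no new values appear
    return [v for v in row for _ in range(factor)]

def linear(row, factor):
    # interpolates new in-between values from neighboring samples
    out = []
    for i in range(len(row) - 1):
        a, b = row[i], row[i + 1]
        for k in range(factor):
            out.append(a + (b - a) * k / factor)
    out.append(row[-1])
    return out

print(nearest(row, 2))  # [10, 10, 40, 40, 90, 90, 20, 20]
print(linear(row, 2))   # [10.0, 25.0, 40.0, 65.0, 90.0, 55.0, 20]
```

Real video scalers use more sophisticated filters than this, but the principle is the same: the upscaled row contains estimated values, not recovered detail.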
 

Ken Chan

Senior HTF Member
Joined
Apr 11, 1999
Messages
3,302
Real Name
Ken
If you follow that link I posted, you can do the math, and round, and you get 720.
 

MarkHastings

Senior HTF Member
Joined
Jan 27, 2003
Messages
12,013
The reason you can't just divide 720 by 480 to get 4:3 is that 720x480 is not a square-pixel ratio.

The pixels are rectangular: if the pixel height is 1, then the pixel width is 0.9.

Thus, you first must convert the 720 by multiplying it by 0.9:

720 x 0.9 = 648

So, 648 divided by 480 is 1.35.



Now, not to confuse things, but a true digital signal is really 720x486.

So now, when you divide 648 by 486, you get 1.33333...
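The arithmetic above can be captured in one small helper. The 0.9 pixel aspect ratio is taken from the post; it's the common approximation for NTSC 4:3 material:

```python
# Sketch: display aspect ratio from stored pixel dimensions and a
# pixel aspect ratio (PAR), using the 0.9 figure from the post above.
def display_aspect(width_px, height_px, par):
    # convert the stored width to its square-pixel equivalent, then divide
    return (width_px * par) / height_px

print(round(display_aspect(720, 480, 0.9), 2))   # 1.35
print(round(display_aspect(720, 486, 0.9), 3))   # 1.333
```

With the 486-line figure the result lands exactly on 4:3.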
 

MarkHastings

Senior HTF Member
Joined
Jan 27, 2003
Messages
12,013
Second lesson:

Anamorphic video has a pixel width of 1.2.

So 720 x 1.2 = 864


864 / 486 = 1.777777..., which rounds to 1.78 (i.e. 16x9)
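The anamorphic case is the same calculation with the wider pixel, as a quick sketch:

```python
# Sketch of the anamorphic (16:9) case: the same 720 stored samples,
# but each pixel is displayed 1.2x wider than it is tall.
ANAMORPHIC_PAR = 1.2

square_width = 720 * ANAMORPHIC_PAR   # 864 square-pixel-equivalent columns
aspect = square_width / 486
print(square_width, round(aspect, 2))  # 864.0 1.78
```

Same stored frame, different pixel shape, different display aspect ratio.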
 

MarkHastings

Senior HTF Member
Joined
Jan 27, 2003
Messages
12,013
Final Lesson:

640 x 480 is an analog signal. Since analog signals can't handle rectangular pixels (i.e. it can only deal with square pixels), what it does is, it projects a 640x480 (square pixel) map over the 720x480 (rectangular pixel) image and does the necessary conversions to make it fit.

That's why analog isn't perfect. Because it has to blend pixels together to fit the square map.
 

MarkHastings

Senior HTF Member
Joined
Jan 27, 2003
Messages
12,013
Take that 720x486 example again.


Multiply the 720 by 0.9 and you get 648


So, you're left with 648 x 486. Now, I'm not 100% positive, but the reason computers like 640 x 480 is that they like to deal with numbers divisible by 8 or 16.

640 x 480 is just easier for the computer to deal with (as far as memory and processing go): 640 and 480 can both be divided by 8 or 16 with a whole-number result.
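That divisibility point is easy to check directly:

```python
# Sketch: 640 and 480 divide evenly by both 8 and 16,
# while the video-derived numbers 648 and 486 do not.
def divisible(n):
    # returns (divisible by 8, divisible by 16)
    return (n % 8 == 0, n % 16 == 0)

print(divisible(640), divisible(480))  # (True, True) (True, True)
print(divisible(648), divisible(486))  # (True, False) (False, False)
```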
 

RAF

Senior HTF Member
Deceased Member
Joined
Jul 3, 1997
Messages
7,061


Actually, you would get 800.

I think you meant to say "Multiply the 720 by 0.9 and you get 648"

Mark, I'm just messing with you.

;)

Thank you (and the other contributors) for the very clear answers to the questions posed in this thread. I've taken the liberty of changing your text in the "Divide/Multiply" response. Hope you don't take it as Big Brotherism.
 

Ken Burkstrum

Stunt Coordinator
Joined
Dec 19, 2003
Messages
149
Ken, does that mean that there are 720x480 pixels in the visible picture of 16:9, and only a non-16:9 TV is wasting pixels with black bars?

Computers read everything progressive, right? So does it take 480i and convert it up to 480p, or... how does that work? I hear people with projectors talk a lot about upconverting; I've been pretty skeptical toward people saying it made the picture better.



You kinda lost me on this part. I thought you said it could be 4:3?
 

ChrisWiggles

Senior HTF Member
Joined
Aug 19, 2002
Messages
4,791
Guys, the basic point is that computer and HD content specifies square sampling, which you can interpret as square pixels. NTSC video, which includes DVD, does NOT use square sampling, so in essence you can think of the pixels as being non-square.


Computers generally output progressive signals, but they can also work with interlaced. This was never an issue in the past, but as the computing and home A/V realms merge, many video cards nowadays support standard interlaced scan rates such as 1080i for output to televisions, etc.

Much film DVD content is encoded progressively; at least, the fields are intended to be reconstructed into progressive 24p frames. That is different from upscaling, which yields a higher resolution than the native 720x480 encode. Upscaling, if done properly, enhances the image by improving the final apparent resolution. It is no replacement for a higher-resolution native source; still, in all cases, the best way to view an image is to use quality scaling up to the display's resolution rather than showing the native pixels unprocessed.
 

Ken Chan

Senior HTF Member
Joined
Apr 11, 1999
Messages
3,302
Real Name
Ken
There's no yes-or-no answer. Here's what happens. A plain old CRT TV can display approximately 480 visible lines. The DVD player is told that the TV is 4:3. With a 4:3 DVD, the player takes each of the 480 lines of 720 pixels, and converts that to a continuous analog signal, and the TV then displays each line as well as it can. (On many sets, you probably could not see 720 distinct dots.)

A 16:9 frame takes up 3/4 of the height of a 4:3 screen. (For example, 400x300 square pixels is 4:3. 3/4 of 300 is 225. 400x225 is 16:9.) So the DVD player in effect converts 720x480 to 720x360. In effect, because there are different ways of doing it. One is to simply throw out 1 out of every 4 rows. Another is to interpolate each set of 4 rows into 3.

Now that there are 360 rows of picture data, to center that, the DVD player sends 60 blank rows, then the 360 rows of the picture, and 60 more blank rows, for each frame. (Actually, because TVs are interlaced, it's 30/180/30 for each field.)

So does the TV "waste" lines? Yes; to preserve the aspect ratio, there's nothing to display there. Are pixels "lost"? That depends on how the conversion works, but some detail is lost.
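The letterbox arithmetic in the paragraphs above, as a quick sketch:

```python
# Sketch of the letterbox math: a 16:9 frame on a 4:3, 480-line display
# uses 3/4 of the lines; the remainder becomes blank bars top and bottom.
TOTAL_LINES = 480
picture = TOTAL_LINES * 3 // 4           # 360 lines of actual picture
bar = (TOTAL_LINES - picture) // 2       # 60 blank lines each, top and bottom
print(bar, picture, bar)                 # 60 360 60
```

Interlaced output halves everything per field, giving the 30/180/30 split Ken mentions.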

Now, consider a high-resolution 4:3 display, like a CRT computer monitor set at 1600x1200. To display a 4:3 DVD, the DVD player software makes each pixel

1600/720 = 2.222 times wider and
1200/480 = 2.5 times taller

(Of course, you can't have a pixel that's "2.5 pixels" tall, just like you can't have "2.3 children" so there's some fudging.) For 16:9, each pixel is

1600/720 = 2.222 times wider again and only
900/480 = 1.875 times taller (3/4 of 2.5)

The rest of the screen is filled with black pixels. Nothing is "lost".
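Those scale factors can be computed directly, using the 1600x1200 monitor from the example:

```python
# Sketch of the scale factors for displaying a DVD on a 1600x1200 monitor,
# following the arithmetic in the post above.
MON_W, MON_H = 1600, 1200

x_scale = MON_W / 720                 # ~2.222, same for 4:3 and 16:9
y_scale_43 = MON_H / 480              # 2.5 for a 4:3 DVD
y_scale_169 = (MON_H * 3 / 4) / 480   # 1.875: 16:9 uses only 900 of the 1200 lines

print(round(x_scale, 3), y_scale_43, y_scale_169)  # 2.222 2.5 1.875
```

The fractional factors are why real players have to "fudge", distributing the extra fraction of a pixel across the image.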
 

ChrisWiggles

Senior HTF Member
Joined
Aug 19, 2002
Messages
4,791
Also, analog signals can handle square sampling without any problem, not sure where that assertion came from.
 

MarkHastings

Senior HTF Member
Joined
Jan 27, 2003
Messages
12,013
The 4x3 ratio is based on square units. 4 square units wide by 3 square units tall.

The tricky part here is the fact that video (i.e. NTSC monitors) and computer monitors have different pixel aspect ratios.

When a 720x480 signal is being played on an NTSC monitor, it is 4x3. That's because on an NTSC monitor, the width of each pixel is 90% of its height.

That's why you have to convert the 720 (i.e. 720 x 0.9 = 648). This 648 now gives you square units to compare with the height. Again, since aspect ratios rely on square units, that is why you can't simply divide 720 by 480.

Once you convert the 720 into 648, then you can use math to get the aspect.


---------

Not to confuse you more, but you can also do it the other way around.

When you have 720 x 486, you can do:
486 / 0.9 = 540

and
720 / 540 = 1.33
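Both routes give the same answer, as a quick check:

```python
# Sketch: two equivalent ways to get 4:3 from 720x486 with a 0.9 PAR,
# matching the arithmetic in the post above.
par = 0.9
via_width = (720 * par) / 486     # shrink the width to square-pixel units
via_height = 720 / (486 / par)    # or grow the height instead

print(round(via_width, 2), round(via_height, 2))  # 1.33 1.33
```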
 

ChrisWiggles

Senior HTF Member
Joined
Aug 19, 2002
Messages
4,791
While I think the basics are clear while referring to this in terms of pixel aspect ratio, this terminology is misleading.

It is best and most accurate to talk about sampling ratios, because pixels are a kind of layman's shortcut, and not really the best way to describe the image, because a sample is used to represent "a pixel" but the sample is not necessarily uniformly captured over the total "pixel" area.

When dealing with display aspect ratios, it's probably simplest to point out that this has to do mainly with display dimensions, that is, the aspect ratio of the total display area, and not at all with sampling ratios.

It is when people confuse resolution ratios or sampling ratios with display ratios that this apparent (but misleading) mis-match occurs.

As we move beyond non-square image sampling, this confusion will probably fade, and the distinction between sampling ratio (or resolution AR) and display aspect ratio will largely stop mattering.
 

MarkHastings

Senior HTF Member
Joined
Jan 27, 2003
Messages
12,013
Very true. The aspect ratio is the physical screen itself (i.e. the physical dimensions) and not the pixel ratio.

For example:

[image: a 2x2 grid of giant pixels, 4 inches wide by 2 inches high]

The above is comprised of 4 (giant) pixels: 2 pixels wide by 2 pixels high.

The pixel ratio is 2x2, but that doesn't mean the aspect ratio is 2x2.

The physical dimensions are 4" wide by 2" high, so the aspect ratio is 2x1. This is why you really can't use the pixel dimensions to figure out aspect ratios, you must use physical dimensions.


Again, if you want to work out 720x480: basically, if a screen were 480 inches high, the width would be 640 inches wide (not 720).
 
