720p vs 1080i from Hank's files

Discussion in 'Displays' started by Len Cheong, Sep 16, 2003.

  1. Len Cheong

    Len Cheong Second Unit

    Joined:
    Mar 18, 2000
    Messages:
    372
    Likes Received:
    0
    Just wanted to post this info from a website I found:

    On a PC with non-interlaced graphics, 720p would be 1280x720 and 1080i would be 1920x540.

    1280x720 is a wide-screen (16:9) resolution. 1920x540 may seem like an odd resolution, but it takes up a screen space of 1920x1080 when interlaced. That's also a wide-screen (16:9) resolution.

    1280x720 = 921,600 pixels
    1920x540 = 1,036,800 pixels
    1,036,800 - 921,600 = 115,200 pixels

    Thus, 1080i mode has 115,200 more valid pixels than 720p. If the number of pixels alone determines the better resolution, then 1080i resolution is better.

    However, if progressive-scan (non-interlaced) graphics are preferred over interlaced graphics, then 720p may be better, since having 115,200 fewer pixels could be considered an honorable sacrifice for a progressive resolution of 1280x720.
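
    (If you want to check the arithmetic yourself, here's a tiny Python sketch; the numbers are just the resolutions quoted above.)

        # Pixel-count comparison of one 720p frame vs. one 1080i field.
        frame_720p = 1280 * 720        # 921,600 pixels per progressive frame
        field_1080i = 1920 * 540       # 1,036,800 pixels per interlaced field

        print("720p frame: ", frame_720p)
        print("1080i field:", field_1080i)
        print("difference: ", field_1080i - frame_720p)   # 115,200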
     
  2. Rick Guynn

    Rick Guynn Second Unit

    Joined:
    Mar 23, 1999
    Messages:
    473
    Likes Received:
    0
    I believe that 1920x1080i = 1920x540p only when talking about bandwidth issues. This has been discussed many times before, and I seem to remember that the resolution of 1080i is truly 1920x1080.
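
    To put rough numbers on the bandwidth point (assuming 60 Hz operation, which is my assumption here purely for illustration): per second, the two formats push a similar number of pixels, even though a complete 1080i frame still carries 1920x1080 samples.

        # Rough per-second pixel throughput, assuming 60 frames/s for 720p
        # and 60 fields/s for 1080i (illustrative only).
        rate_720p = 1280 * 720 * 60     # 55,296,000 pixels/s
        rate_1080i = 1920 * 540 * 60    # 62,208,000 pixels/s
        frame_1080i = 1920 * 1080       # 2,073,600 pixels in one full interlaced frame

        print("720p  pixels/s:", rate_720p)
        print("1080i pixels/s:", rate_1080i)
        print("1080i full frame:", frame_1080i)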

    RG
     
  3. John Royster

    John Royster Screenwriter

    Joined:
    Oct 14, 2001
    Messages:
    1,088
    Likes Received:
    0
    That's all well and good, but I don't like interlaced video and prefer 720p.

     
  4. Leo_P

    Leo_P Second Unit

    Joined:
    May 13, 2002
    Messages:
    272
    Likes Received:
    0
    Just wanted to jump in here with a couple of things.
    First, I've read that most 1080i sources use about 1440 pixels instead of the full 1920. So 1440 x 1080 = 1,555,200. Still "better" than 720p's 921,600 pixels. Or is it...
    The "problem" is that ALL fixed-pixel displays (DLP, LCD, LCoS, plasma, etc.) are progressive in nature and prefer a progressive signal. Anything fed to these sets interlaced relies on their internal deinterlacer/scaler, which may not do a very good job.
    Let's look at how some progressive sets (like my Samsung DLP) handle 1080i. They either: a) de-interlace AND line-double the signal to get 1080p, then down-scale it to the set's native resolution, or b) de-interlace the signal to get 540p, then up-scale it to native. The first way is preferable, but it still involves manipulating the signal in all kinds of ways.
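    Here's a rough sketch of those two paths (a 1280x720 native panel is just an assumption for the example):

        # Sketch of the two 1080i handling paths described above, assuming a
        # hypothetical 1280x720 native panel.
        native = (1280, 720)
        field = (1920, 540)            # one 1080i field

        # Path a: de-interlace/line-double into a full 1080p frame, then down-scale.
        path_a = [(field[0], field[1] * 2), native]    # 1920x1080 -> 1280x720

        # Path b: treat each field as a 540p frame, then up-scale straight to native.
        path_b = [(field[0], field[1]), native]        # 1920x540 -> 1280x720

        print("path a:", " -> ".join(f"{w}x{h}" for w, h in path_a))   # keeps all 1080 source lines
        print("path b:", " -> ".join(f"{w}x{h}" for w, h in path_b))   # keeps only 540 source lines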
    With CRTs eventually going the way of the dinosaurs and more people buying fixed-pixel displays, I feel 720p will become more popular and more widely adopted. That is, until 1080p signals and sets become more common.
     
