Question about HDTV resolutions.

Discussion in 'Archived Threads 2001-2004' started by Mike Purfield, May 16, 2002.

  1. Mike Purfield

    Mike Purfield Auditioning

    Joined:
    Oct 8, 2001
    Messages:
    8
    Likes Received:
    0
    Hello everyone. I hope I'm not in the wrong thread. I'm not an active member of this board, but I just jumped into the cheap way to get HDTV: I bought an open-box RCA DTC-100 from Best Buy for 250 bucks. Being the cheap schmoe that I am, I can't afford an HDTV set, so I'm using a computer monitor. The picture from my one local digital station (PBS) looks amazing.

    I'm trying to figure out which format the monitor is displaying, because I doubt an old computer monitor is capable of displaying 1080i. Anyway, I've hooked it up to both a 17" monitor and a really old 14" monitor. The picture is great on both (I didn't believe my junky old 14" monitor could display such a wonderful picture), and they both seem to be displaying the same resolution. According to the OSD on the 17" monitor, the display is running at a 33.6kHz horizontal refresh rate and a 60Hz vertical refresh. According to http://www.digitalconnection.com/FAQ/HDTV_10.htm, this would seem to be the frequency for 1080i. If so, how could a 10-year-old, 14" monitor with a max resolution of 1024x768 be capable of displaying a 1920x1080 picture? I'm quite certain that whatever resolution is being displayed, it's interlaced, because I can see some interlace flicker.
    Sorry for the long winded post, I'm just trying to figure out if my monitors are displaying the full potential of HDTV... Either way, the picture is amazing.
     
  2. RyanDinan

    RyanDinan Stunt Coordinator

    Joined:
    Oct 25, 2000
    Messages:
    249
    Likes Received:
    0
    Hi Mike,

    1080i's scanrate is 33.75 kHz (close enough to your 33.6 reading). Keep in mind that this is an interlaced resolution, so each field is only 540 lines apiece. In other words, 1080i uses the same scanrate as 540p (progressive).

    Computer monitors are normally driven with progressive signals, such as 1024x768p. If a monitor can do 768p, then it can most likely sync to a lower scanrate such as 540p/1080i (assuming it's a multisync monitor, which most are).
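
    For anyone checking the arithmetic, the 33.75 kHz figure follows from the total line count and the field rate. A minimal sketch, assuming the standard 1080i timing of 1125 total lines per frame (1080 active plus vertical blanking) and a 60 Hz field rate:

```python
# Horizontal scan rate for 1080i, assuming the standard timing of
# 1125 total lines per frame (1080 active + vertical blanking).
TOTAL_LINES_PER_FRAME = 1125
FIELD_RATE_HZ = 60.0           # two interlaced fields per frame
FRAME_RATE_HZ = FIELD_RATE_HZ / 2

# Each frame scans all 1125 lines, 30 frames per second:
h_scan_hz = TOTAL_LINES_PER_FRAME * FRAME_RATE_HZ
print(h_scan_hz / 1000)  # 33.75 kHz -- close to the OSD's ~33.6 reading

# Equivalently, each 60 Hz field scans half the lines (562.5):
print(TOTAL_LINES_PER_FRAME / 2 * FIELD_RATE_HZ / 1000)  # also 33.75
```

    This is the same rate a 540p signal would need, which is why a monitor that syncs to 540p can lock onto 1080i.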

    Now, the horizontal resolution is simply a limit of the monitor's bandwidth; the monitor will display whatever it can. A good 17" monitor can probably do 1280 "pixels" or samples comfortably, so you won't be seeing the full 1920 samples of horizontal resolution that HD is capable of. But then again, most consumer HDTVs fall well short of that mark too, typically around 1100-1200 samples if they use 7" CRTs. HDTVs that use 9" guns can do more, but only the most expensive sets can actually resolve all 1920 samples.
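
    To put a number on that bandwidth limit: the pixel clock needed for full horizontal resolution is the samples per line times the scan rate. A rough sketch, assuming the standard 1080i timing of 2200 total samples per line (1920 active plus horizontal blanking); the 50 MHz monitor bandwidth below is a hypothetical figure for illustration, not a spec:

```python
# Pixel clock for full 1080i horizontal resolution, assuming the
# standard timing of 2200 total samples per line (1920 active + blanking).
TOTAL_SAMPLES_PER_LINE = 2200
H_SCAN_HZ = 33_750             # 1080i horizontal scan rate

pixel_clock_hz = TOTAL_SAMPLES_PER_LINE * H_SCAN_HZ
print(pixel_clock_hz / 1e6)    # 74.25 MHz

# A monitor whose video amplifier rolls off well below ~74 MHz can't
# resolve all 1920 samples; it blurs the finest detail instead. Scaling
# the active samples by the bandwidth ratio gives a crude estimate:
monitor_bandwidth_hz = 50e6    # hypothetical 17" monitor figure
resolvable = 1920 * min(1.0, monitor_bandwidth_hz / pixel_clock_hz)
print(round(resolvable))       # ~1293 samples, near the 1280 estimate
```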

    Hope this helps!

    -Ryan Dinan
     
  3. Mike Purfield

    Mike Purfield Auditioning

    Joined:
    Oct 8, 2001
    Messages:
    8
    Likes Received:
    0
    Ryan, thanks for the info! If I understand this correctly, the monitors are displaying the full number of scan lines, but the scan lines aren't necessarily showing the full detail? So would this mean that for, say, my 17" monitor capable of 1280x1024 resolution (1024p), it's displaying the full 540p/1080i scan lines, with each scan line having a resolution of about 1280 pixels? Sorry if I'm sounding a little stupid; I'm just trying to educate myself on this stuff. :)
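
    Put in numbers, the trade-off described here looks like this. A rough sketch under the assumption above that the 17" monitor resolves about 1280 horizontal samples:

```python
# Rough comparison of what the monitor shows vs. the full HD signal,
# assuming the monitor resolves about 1280 horizontal samples.
monitor_samples = 1280
hd_samples = 1920
scan_lines_shown = 540       # lines per 1080i field -- all displayed
scan_lines_broadcast = 540

print(f"Scan lines: {scan_lines_shown}/{scan_lines_broadcast} (100%)")
print(f"Horizontal: {monitor_samples}/{hd_samples} "
      f"({monitor_samples / hd_samples:.0%})")  # 67%
```

    In other words, the vertical structure of the signal survives intact; only the finest horizontal detail is lost.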
     
  4. RyanDinan

    RyanDinan Stunt Coordinator

    Joined:
    Oct 25, 2000
    Messages:
    249
    Likes Received:
    0
    Mike,

    You got it.

    To clarify one thing on a CRT monitor:

    Just because a monitor is bandwidth-limited to, say, 1280 "pixels" or samples ("samples" is the more accurate term for CRTs, since they don't really display discrete pixels the way LCDs do), does NOT mean 1-pixel details (such as a white dot on a black background) will go undisplayed. It's just that if there are several of these small, 1-pixel details right next to each other, they will tend to blur and average together.
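
    This behavior can be illustrated with a toy model. The snippet below is not real CRT physics, just a crude 3-sample moving average standing in for a bandwidth limit; it shows that an isolated dot survives (dimmed and spread), while tightly packed 1-pixel details lose most of their contrast:

```python
# Toy model: a bandwidth-limited display acts roughly like a low-pass
# filter. Here we fake that with a 3-sample moving average.
def lowpass(samples):
    """Average each sample with its neighbors (crude bandwidth limit)."""
    padded = [samples[0]] + samples + [samples[-1]]
    return [sum(padded[i:i + 3]) / 3 for i in range(len(samples))]

# One isolated white dot (255) on black: still visible, just spread out.
dot = [0, 0, 255, 0, 0]
print([round(v) for v in lowpass(dot)])      # [0, 85, 85, 85, 0]

# Alternating 1-pixel details: neighbor contrast collapses from 255
# to 85 or less -- the fine pattern averages toward gray.
stripes = [255, 0, 255, 0, 255, 0]
print([round(v) for v in lowpass(stripes)])  # [170, 170, 85, 170, 85, 85]
```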

    -Ryan
     