
TV as Monitor for Media Center PC

SeanA

Second Unit
Joined
Feb 16, 2003
Messages
329

The analog (VGA) output on the HP z558 isn't too bad, but it falls short of HD quality, and the real problem is that the VGA output does not fill the entire screen. This is normal per the Samsung literature, but I am not sure of the reason. DVI will fill the entire screen and may even result in the opposite problem, which I think is called "overscan". Another issue I have is occasional screen flicker or stutter when viewing or recording TV content through VGA. I can only suspect this is due to the extra analog-to-digital conversion step.
 

Parker Clack

Schizophrenic Man
Moderator
Senior HTF Member
Joined
Jun 30, 1997
Messages
12,228
Location
Kansas City, MO
Real Name
Parker
Hmmm... very interesting, Sean. You would think that you would be able to output your DVI to the HDMI input on the Samsung with no problem. I will definitely check this out before I pick one up.

Parker
 

Kieran Coghlan

Second Unit
Joined
Oct 26, 1998
Messages
262
This conversation brings up an interesting question I've had for a while...

There are many 1080p RPTVs that will accept a PC signal via their HDMI port; however, it seems that virtually ALL of them then overscan the resulting image.

If the PC is set to 1920x1080 @ 60Hz, and that's the native resolution and timing of the display, why would the picture be overscanned? Does this mean that any other standard image signal (say HDTV, HD-DVD, etc.) is being down-converted in some way to fit the screen?

Anyone know why this happens with PC signals? It seems almost universal on 1080p RPTVs, from the reviews I've read. Many user reviews report it too, only to say that they solved it by using something like PowerStrip to send a slightly lower-resolution image to the TV, which then fits the screen properly. That's hardly a "solution"!
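For a rough sense of what overscan costs at native resolution, here's a small arithmetic sketch in Python. The 2.5% per-edge crop is an assumed, typical figure, not a spec for any particular set; real TVs commonly fall somewhere in the 2-5% range.

```python
# Hypothetical sketch: how much of a native 1920x1080 desktop survives
# when a TV overscans. The 2.5% per-edge figure is an assumption.

def visible_area(width, height, overscan_pct):
    """Width and height of the portion of the frame that remains on
    screen when overscan_pct percent is cropped from each edge."""
    scale = 1 - 2 * overscan_pct / 100  # both edges are cropped
    return round(width * scale), round(height * scale)

w, h = visible_area(1920, 1080, 2.5)
print(f"Visible: {w}x{h}")                     # 1824x1026
print(f"Pixels cropped: {1920 * 1080 - w * h:,}")  # 202,176
```

That's roughly 200,000 desktop pixels pushed off screen, which is exactly where the taskbar and window edges live.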
 

SeanA

Second Unit
Joined
Feb 16, 2003
Messages
329

I've stumbled onto a lot of overscan discussion at the AVS forum, but I have not dug too deeply. At this point, I would be happy just to be able to deal with overscan from my video card. It is perplexing, though: why should the video card's output be any different from any other source device?
 

SeanA

Second Unit
Joined
Feb 16, 2003
Messages
329
One other question: what is needed for a video card to be HDCP compliant? Is it hardware, software, both, or either? I guess I am just wondering if it is possible to make a card HDCP compliant via a simple driver download. It looks like nVidia's latest drivers support HDCP, but I am assuming that the video card must have some piece of hardware to be HDCP compliant in the first place.
 

Parker Clack

Schizophrenic Man
Moderator
Senior HTF Member
Joined
Jun 30, 1997
Messages
12,228
Location
Kansas City, MO
Real Name
Parker
My Nvidia card lets you adjust for underscan and overscan. I think it would be cool to have something like desktop LCD monitors, which automatically adjust the input signal to the native resolution of the monitor.
 

Kieran Coghlan

Second Unit
Joined
Oct 26, 1998
Messages
262
Parker: but again, that's just scaling the image (a monitor that adjusts the signal to the native resolution). TVs do that too: send a 1080i signal to a 1366x768 plasma, and it automatically scales the image (and de-interlaces it).

But when we're talking about a PC, there's no reason you can't send the proper native resolution to the monitor in the first place. That would be best. But many RPTVs overscan even though the signal is the proper resolution and scan rate.

The Nvidia function to adjust for overscan just scales the image, too. So either way you're losing resolution.
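A back-of-the-envelope way to see why both outcomes cost the same (again assuming a hypothetical 2.5% crop per edge, not any specific model's behavior):

```python
# Sketch of the tradeoff: whether the TV crops the edges or the driver
# pre-shrinks the desktop to fit, the image reaches the eye through the
# same reduced pixel grid. (2.5% per edge is an assumed figure.)

def effective_resolution(width, height, overscan_pct):
    """Pixel grid actually carrying the image after overscan."""
    visible = 1 - 2 * overscan_pct / 100
    return round(width * visible), round(height * visible)

w, h = effective_resolution(1920, 1080, 2.5)
print(f"Effective grid: {w}x{h}")
# No compensation: you see a w x h crop of the desktop; the edges are lost.
# Compensation:    you see the whole desktop, resampled down to w x h,
#                  so no pixel maps 1:1 and text goes soft.
```

Either way, only about 90% of the native pixel count ends up doing useful work; the compensation setting just trades lost edges for lost sharpness.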

I think this is really weird, not to mention lame/annoying! :)
 

Type A

HW Reviewer
Joined
Apr 7, 2007
Messages
898
Location
Aurora Oregon
Real Name
Ty

Our television system has a flaw. From the dawn of television, the outside edge of a TV signal was unstable, and waves would develop around the outside edge of the video signal. Rather than fix the problem, manufacturers have always built TVs that intentionally overscan. This is something you never noticed, because you didn't know what to expect. But with a computer, you know exactly what to expect; you know where the borders are, the taskbar, the windows, and the desktop image. Suddenly, when you look at your TV displaying a computer image, you notice something that every TV you have ever owned has done to every image it has ever displayed... you just never knew it.

For me, my HC3000 projector has an underscan feature that allows me to send it 1280x720 from my HTPC and have it fit perfectly on the screen. But I think that type of feature is still the exception rather than the rule. Here's a link to my projector throwing a 106" maximized Firefox browser (via a 50' DVI-to-HDMI cable) with the HTPC set to 1280x720:

http://www.highdefforum.com/gallery/...ze/big/cat/502
 

ChrisWiggles

Senior HTF Member
Joined
Aug 19, 2002
Messages
4,791

That's not really an accurate portrayal of overscan. The top and bottom extremes of a lot of content are likely to have noise, that's certainly true. However, one of the major concerns is CRTs and their unstable geometry. Especially on lower-quality consumer CRTs, the size of the image and raster varies noticeably with APL, so overscan helps ensure that, regardless of APL, the image never pulls back from the edge of the display, which consumers would find objectionable.
 

Type A

HW Reviewer
Joined
Apr 7, 2007
Messages
898
Location
Aurora Oregon
Real Name
Ty
In my search, all I found were references to what you are talking about... but I remember reading something about overscan originally being invented to compensate for shortcomings in broadcast technology, not so much display technology. But OK, I stand corrected, sir.
 
