
Color balance problem with HDMI connections from HTPC to TV


Apr 17, 2015
Real Name
Michael Sanders
The problem exists on modern televisions that are not designed to work with PC connections over HDMI. The true solution to this problem is not on the Internet, and users report it in different ways: some will say the picture is too dark, while others will say it looks muddy.

I defined a test you can perform to determine whether you are experiencing the problem. Use the Windows 7 desktop background located at C:\Windows\Globalization\MCT\MCT-US\Wallpaper\US-wp4.jpg. This is the image of a plowed wheat field with rolls of hay. When correctly displayed, the wheat will be golden, light brown, or even pale in different areas. When you are experiencing the problem, it will look pumpkin or carrot orange. Text also isn't clear and appears harsh, and reflective surfaces, such as the Windows 7 Start orb, appear slightly grainy. There are many poor-quality TVs out there, so you may not realize you have the problem. Even after you correct the problem, your display will still look pretty awful on a poor TV.
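If you'd rather not eyeball a wallpaper, a synthetic grayscale ramp makes black crush and white clipping much easier to spot. Here's a minimal sketch (my own addition, not part of the original test) that writes a horizontal ramp as a PPM file using only the Python standard library; the filename `range_test.ppm` is arbitrary. If the display is treating full-range RGB (0-255) as limited range (16-235), the darkest steps will merge into solid black and the brightest into solid white.

```python
# Write a 1024x64 horizontal grayscale ramp as a binary PPM (P6) image.
# Levels run from 0 (left edge) to 255 (right edge). On a display that
# mishandles the RGB range, steps below ~16 crush to black and steps
# above ~235 clip to white instead of remaining distinct.
WIDTH, HEIGHT = 1024, 64

with open("range_test.ppm", "wb") as f:
    f.write(b"P6 %d %d 255\n" % (WIDTH, HEIGHT))  # PPM header: magic, size, maxval
    row = bytes(
        v
        for x in range(WIDTH)
        for v in [round(x * 255 / (WIDTH - 1))] * 3  # equal R, G, B per pixel
    )
    f.write(row * HEIGHT)  # every scanline is identical
```

Open the file full-screen in any image viewer on the TV: a healthy connection shows a smooth, continuous ramp from pure black to pure white, with distinct steps visible at both extremes.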

The problem is that many TV manufacturers are choosing to be more compatible with the output strategies of Roku, Chromecast, sub-par Blu-ray players, and game consoles. Even worse, the HDMI connectors on a TV are not all the same, even within the same set. If you want this to work, you need a TV that is designed to work with HTPC connections. It isn't a guarantee, but the best way to check is to see whether your TV has a VGA connector. On one TV I tested, you can plug the PC into the PC-specific HDMI connector and it will work; plug your PC into any other HDMI connector on the same TV and it will be pumpkin orange!

I spent three weeks doing exhaustive, empirical testing to arrive at this conclusion, and I have a few fixes with varying degrees of acceptability depending on your pickiness. To eliminate software problems, I tried many different driver versions and clean installs of Windows 7. To eliminate hardware bias, I used eight different video cards (half ATI, half NVIDIA) and three different PCs. Four of the video cards were new ones purchased from Microcenter, one of which was an ASUS model specific to HTPC use. I did my testing on around fifteen different televisions: some I owned, some I purchased for the testing, and for the rest I actually brought my HTPC to Sears and plugged it into as many sets as I could. I was very, very thorough.

Solutions to the problem

1) Getting a TV that supports PC input is the best solution overall, but it isn't always the best solution for you. I picked one that does not support PC HDMI connections because it looked amazing and was cheap.

2) The first thing I noticed was that DVI output always worked on every TV set, and this is the method I used. For ATI you need the special dongle that comes with the retail card; otherwise it will not pass audio over a DVI-to-HDMI connection. NVIDIA cards support audio over DVI-to-HDMI by running PCM from your sound card to pins on the video card, and some just work natively without any special hardware or connections. This is an odd feature and does not work on all cards. You can also run analog audio to the TV, but in a case like that you'll find the TV already supports HTPC-to-TV connections, which makes DVI unnecessary.

3) Most people who have this problem adjust the color settings on the video card and the TV to compensate. This can work, but without careful testing you may not notice that it imparts grain or pixel noise to any video you watch. Some may not realize this: the Ghostbusters Blu-ray, for example, is a really poor transfer, and you may just think that's how it is. This is the worst solution.

4) I also noticed that using PC resolutions fixed this problem, though you won't get 1080p this way. You may have noticed that there are two different sections for resolutions: one lists the square or PC resolutions, and the other lists wide-screen or home-theater modes/resolutions. You'll have to forgive me for not remembering whether this applies to ATI or NVIDIA, but some have suggested creating your own custom resolution for 1080p. The control panel won't let you create a PC resolution for 1080p, complaining that a resolution with those specifications already exists, so you must choose a slightly off refresh rate of 59.99 Hz (or 119.99 Hz, depending on your TV). This is fine and won't harm a thing; it simply tricks the control panel into thinking it's a new PC resolution.
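For what it's worth, the 59.99 Hz trick really is harmless, because the pixel clock barely changes. A quick back-of-envelope check in Python, using the standard CEA-861 1080p timing totals (2200 pixels per line, 1125 lines per frame; assuming the driver builds the custom mode from the same totals, which I can't verify from the control panel alone):

```python
# Standard CEA-861 1080p timing: 2200 total pixels/line, 1125 total lines/frame.
H_TOTAL, V_TOTAL = 2200, 1125

def pixel_clock_mhz(refresh_hz):
    """Pixel clock in MHz for a given refresh rate at these timing totals."""
    return H_TOTAL * V_TOTAL * refresh_hz / 1e6

print(pixel_clock_mhz(60.00))  # 148.5 -- the standard 1080p60 pixel clock
print(pixel_clock_mhz(59.99))  # about 148.475 MHz, roughly 0.017% slower
```

A 0.017% slower clock is far inside the tolerance of any display; the only thing the odd refresh rate changes is the control panel's duplicate-resolution check.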

Here are some suggestions that do not work: improving the quality of the cable, changing pixel formats (YCbCr, full RGB, etc.), getting an HTPC-specific card, eliminating home-theater-receiver pass-through connections and HDMI switch boxes, matching HDMI port specifications, disabling/enabling ARC support on the TV, dynamic/power-saving modes on the TV, naked prostration before the PC gods, and self-flagellation with a Cat-5 cable. Though I can't say I actually tried that last one.
