OK, say you go out and get yourself a new TV with an HDCP-capable DVI input, and then you're the first on the block to get the new HD DirecTivo when it comes out. You hook it up to your HDCP DVI input so you can take advantage of time-shifting all that wonderful HD programming. A bit later you spot the new Samsung DVD player that outputs 720p/1080i and snatch it off the shelf, only to come home and find you have no place to plug its HDCP-encoded DVI output into your monitor. I don't know of any mainstream monitors that include more than one HDCP-capable DVI input.

What about in a couple of years, if HD DVD becomes a reality? If the players aren't backward compatible, you'll want to keep that Samsung player so you can watch your older DVDs at the higher resolution. That leaves you with three sources outputting an HDCP-encoded DVI signal and still only the one input.

Didn't the manufacturers learn anything from component inputs? We went from one set of "standard" component inputs, to one standard plus one "wide-band", to two wide-band inputs in just a couple of model years. Now we're going from one DVI input to one DVI/HDCP input in a model year or so. Couldn't they have learned from the past and started putting two DVI/HDCP inputs on new displays?

Oh, and another problem. Say in the midst of all of the above you decide to add digital front projection to your room while keeping your regular HD display for casual viewing. You made sure to get an HDCP-capable DVI input on your new projector, only to get it home and think, hmmm... how in the world am I going to split that HDCP-encoded DVI signal from those three components to feed both my TV and my projector, each of which has only one DVI/HDCP input?

No real point to all this, it's just the crap that goes on in my brain when it's idle :b Just food for thought or discussion.