I know there are a million discussions on analog cables and quality. Someone I work with just bought an HD931 to go with a Samsung 42" DLP. He went out and bought a Monster DVI cable, and he swears there is a huge quality difference between the Monster DVI and the stock DVI cable that originally came with his Samsung HiDef DTV receiver.

This has been an ongoing 'discussion' between us as to whether cable quality matters with DVI. I say no, he says yes. My opinion is that it's a digital signal consisting of 1's and 0's, and a 1 over a $15 cable is the same as a 1 over a $99 cable. His counter-argument is that interference and a cable's lack of quality can turn a 1 into a 0 in some cases, so not all of the digital information gets transmitted as intended.

I say if that were true, then HDCP would not work: if the video information can get corrupted over a cheap DVI cable, then the digital security keys would also fail to make it across the cable correctly, leading to a loss of video. I've read the DVI spec and the HDCP spec. The 56-bit encryption key must make it across the line intact, otherwise no video, and the cipher signal is no different from the video signal in that respect.

I understand the bandwidth issue too, single versus dual link, but any cable that meets the DVI spec itself should perform the same as any other, and single-link bandwidth is enough to support 1080i at either the current DLP or LCD resolutions. These are, IMO, still fairly hefty cables for a relatively low-bandwidth medium; we run 10x the data across cheaper, smaller wires with no difference.

This reminds me of Best Buy and Circuit City selling gold USB and printer cables. Those applications are so low-bandwidth it's not even funny, yet they jack you for 40-50 bucks a cable when a $3 cable does the same job. Or better yet, the gold modem cable for better performance; those screaming 53 kbps really need that great cable.

BTW, I also just bought an HD931 to use on my Sony 60" LCD GW. I'm currently using a 15' $14 DVI cable from Newegg because it's what I had; I have a 6' on order.
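
For what it's worth, here's the back-of-the-envelope math behind the bandwidth point above. It's just a rough sketch assuming the standard CEA-861 timing numbers for 1080i and the 165 MHz single-link TMDS clock ceiling, so treat the exact figures as my assumptions rather than something quoted from the spec text:

```python
# Rough sketch: is single-link DVI enough for 1080i?
# Assumes the standard CEA-861 1080i timing (2200 x 1125 total, 30 frames/s).

H_TOTAL = 2200            # pixels per line, including blanking
V_TOTAL = 1125            # lines per frame, including blanking
FRAMES_PER_SEC = 30       # 1080i = 60 fields/s = 30 full frames/s

SINGLE_LINK_MAX_MHZ = 165.0   # single-link DVI TMDS clock ceiling

pixel_clock_mhz = H_TOTAL * V_TOTAL * FRAMES_PER_SEC / 1e6
print(f"1080i pixel clock: {pixel_clock_mhz:.2f} MHz")                  # ~74.25 MHz
print(f"Single-link headroom: {SINGLE_LINK_MAX_MHZ / pixel_clock_mhz:.1f}x")
```

So a cable that merely meets the single-link spec already has roughly 2x headroom over what 1080i actually asks of it, which is all I'm saying when I argue that any in-spec cable should do the job.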