One thing that's ticked me off for a while now is the standard digital connections on audio equipment: coaxial, optical, even CD players with digital outputs. The original idea was that once components had a digital connection, the signal sent between them would be perfect, and the quality of the cable wouldn't really matter. That turned out not to be the case, though. Because of things like jitter on optical links and interference on coaxial ones, cable quality did end up affecting the sound.

Now that we know these standard cable types have these problems, why don't they just come up with a new way of sending a digital signal so the exact data is moved from one component to another? Computers do it all the time. Even the cheapest $15 CD-ROM drive can move large amounts of data quickly and guarantee that what it outputs is exactly what was on the original CD. If it isn't, the corrupted data will cause a program you're trying to install to simply not work.

Since data CDs use error correction, and do it very cheaply, why couldn't an audio CD player use the same method to get every piece of audio data to the receiver correctly? The most you'd need is a memory buffer on the receiving end, so that if corrupt data were detected the player could read it again and make sure it was right. In that case a $200 CD player would do a perfect job sending the audio data, and you wouldn't need a $20,000 CD player to get the "best" possible sound. Why couldn't they just pursue this idea, especially now that we have FireWire and other fast, accurate ways of moving data?
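The "buffer and read it again" scheme described here could be sketched like this. This is just a toy model in Python, with made-up block and checksum helpers; it isn't any real S/PDIF or FireWire protocol, just the general checksum-verify-and-retransmit idea that data links use:

```python
import zlib
import random

random.seed(42)  # make the simulated noise reproducible

def make_frame(block_id, data):
    # Sender attaches a CRC32 checksum so the receiver can detect corruption.
    return {"id": block_id, "data": data, "crc": zlib.crc32(data)}

def corrupt_sometimes(frame, error_rate=0.3):
    # Simulate a noisy cable by occasionally flipping a byte in transit.
    if random.random() < error_rate:
        data = bytearray(frame["data"])
        data[0] ^= 0xFF
        return {"id": frame["id"], "data": bytes(data), "crc": frame["crc"]}
    return frame

def receive_block(source, block_id, max_retries=16):
    # Receiver buffers the block and asks the source to re-send until the
    # checksum matches -- the "read it again" step.
    for _ in range(max_retries):
        frame = corrupt_sometimes(make_frame(block_id, source[block_id]))
        if zlib.crc32(frame["data"]) == frame["crc"]:
            return frame["data"]
    raise IOError(f"block {block_id} failed after {max_retries} retries")

# A "disc" of three audio blocks; the receiver reconstructs it bit-perfectly
# even though roughly a third of the individual transfers are corrupted.
disc = [b"block-aaaa", b"block-bbbb", b"block-cccc"]
received = [receive_block(disc, i) for i in range(len(disc))]
assert received == disc  # exact data, regardless of link noise
```

The buffer matters because audio is real-time: the receiver has to hold a little data in reserve so it can absorb the delay of a re-read without the playback glitching.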