When digital connections really aren't

Discussion in 'Archived Threads 2001-2004' started by MikeKaz, Apr 12, 2002.

  1. MikeKaz

    MikeKaz Stunt Coordinator

    Joined:
    Aug 6, 2001
    Messages:
    152
    Likes Received:
    0
    Trophy Points:
    0
    One thing that's ticked me off for a while now is standard digital connections on audio equipment, coaxial or optical, and even CD players with digital outputs. The original idea was that once you had a digital connection between components, the signal sent would be perfect, and it wouldn't really matter what quality of cable you used. That turned out not to be the case, however, because of things like jitter in optical cables and interference in coaxial, so the quality of the cable did end up affecting the sound quality. Now that we know these standard cable types have these problems, why don't they just come up with a new method of sending a digital signal so the exact data is moved from one component to another? Computers do it all the time.

    Even the cheapest $15 CD-ROM drive can move large amounts of data quickly and make sure that what it outputs is exactly what was on the original CD to begin with. If it isn't, the corrupted data will keep a program you're trying to install from working at all. Since data CDs use error correction, and do it very cheaply, why couldn't an audio CD player use the same method to get every piece of audio data to the receiver correctly? The most you would need is a memory buffer on the receiving end so that if corrupt data were detected, the block could be read again and verified. In that case, a $200 CD player would do a perfect job of sending the audio data, and you wouldn't need a $20,000 CD player to get the "best" possible sound. Why couldn't they just pursue this idea, especially now that we have FireWire connections and other fast, accurate ways of moving data?
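
    Just to illustrate what I mean, here's a toy sketch in Python (totally hypothetical -- real S/PDIF has no return channel for requesting a re-send, and the frame format here is made up):

        import zlib
        import random

        def noisy_link(payload):
            """Simulate a cable that occasionally flips a bit."""
            data = bytearray(payload)
            if random.random() < 0.1:
                data[random.randrange(len(data))] ^= 0x01
            return bytes(data)

        def send_with_crc(block):
            """Sender attaches a CRC32 so the receiver can verify the block."""
            return zlib.crc32(block).to_bytes(4, "big") + block

        def receive(blocks):
            """Receiver verifies each block and asks for a re-read on a mismatch."""
            good = []
            for block in blocks:
                while True:
                    frame = noisy_link(send_with_crc(block))
                    crc, payload = frame[:4], frame[4:]
                    if zlib.crc32(payload) == int.from_bytes(crc, "big"):
                        good.append(payload)  # bit-perfect copy, keep it
                        break                 # otherwise ask the source again
            return good

        sectors = [bytes([i]) * 2352 for i in range(10)]  # fake CD sectors
        assert receive(sectors) == sectors  # every bit arrives intact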
     
  2. Saurav

    Saurav Cinematographer

    Joined:
    Feb 15, 2001
    Messages:
    2,174
    Likes Received:
    0
    Trophy Points:
    0
    Digital audio connections, even with jitter, do not result in corrupt data. Computers do not care about jitter; in fact, a computer may well see higher jitter than a digital audio system. If you're talking about data downloaded over the internet, that will almost certainly be the case.

    Anyway... I agree with you that taking care of jitter is pretty simple, but jitter as it exists today does not cause data errors, so none of the error checking/correcting techniques used in data communications would make the slightest bit of difference.
     
  3. John Sully

    John Sully Stunt Coordinator

    Joined:
    Feb 25, 1999
    Messages:
    199
    Likes Received:
    0
    Trophy Points:
    0
    Funny you should mention jitter.
    I just took a look at patent 5,889,820, assigned to Analog Devices. Basically, it is a simple and very clever way of decoupling the input clock (the jittery one) from the output clock in a DAC by using a shift register.
    I know Denon uses Analog Devices DACs, but I'm sure that other manufacturers have similar circuitry. Anyway, an input circuit such as this is immune to jitter on the incoming signal, so let's not talk about it anymore.
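
    If you want the general flavor of it, here's a rough software caricature in Python (my own sketch of the decoupling idea, not the actual patented circuit -- the patent does this in a few gates with a small shift register):

        from collections import deque
        import random

        def jittery_writes(samples, period=1.0, jitter=0.1):
            """Samples arrive on the recovered (jittery) S/PDIF clock."""
            t = 0.0
            for s in samples:
                t += period + random.uniform(-jitter, jitter)
                yield t, s

        def reclock(samples, period=1.0):
            """Write on the jittery clock, read out on a clean local clock."""
            fifo = deque()
            out = []
            t_read = 4 * period                # start reading once half full
            for t_write, s in jittery_writes(samples, period):
                fifo.append(s)                 # write side: recovered clock
                while fifo and t_read <= t_write:
                    out.append((t_read, fifo.popleft()))
                    t_read += period           # read side: exact multiples of T
            return out

        for t, s in reclock(range(20))[:5]:
            print(f"t={t:.1f}  sample={s}")    # values intact, timing cleaned up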
     
  4. Saurav

    Saurav Cinematographer

    Joined:
    Feb 15, 2001
    Messages:
    2,174
    Likes Received:
    0
    Trophy Points:
    0
    That's interesting. Do you know for certain that Analog Devices uses such a technique in all of their DACs?
     
  5. John Sully

    John Sully Stunt Coordinator

    Joined:
    Feb 25, 1999
    Messages:
    199
    Likes Received:
    0
    Trophy Points:
    0
    Analog uses this circuit in *most* of their DACs, and they definitely use it in the DACs which Denon uses. Only in their least expensive chips do they use a more traditional input circuit.
     
  6. MikeKaz

    MikeKaz Stunt Coordinator

    Joined:
    Aug 6, 2001
    Messages:
    152
    Likes Received:
    0
    Trophy Points:
    0
    Alright, my real point is: why can't the CD player get the data RIGHT at least up to the point where it enters the receiver/DAC to be decoded? My best analogy is moving data from a computer device across the system bus to the computer's main processor. It IS physically possible to get data from, say, the hard drive to the processor with every single bit TOTALLY accurate at a very high transfer rate. If it isn't, the processor can just mess up and the system will freeze. So the audio electronics world is totally incapable of sending every single one and zero off a disc to another device at the correct time? (This is referring to the transfer of data between devices, not the actual conversion from digital to analog audio -- sure, paying more at that point, to actually make sound come from your speakers, is fine.)

    I think this whole situation of still finding differences in digital connections is ridiculous. People say, for example, that coaxial connections are better than optical because of a higher transfer rate. IF IT'S REALLY DIGITAL, IT SHOULD SEND THE SAME EXACT INFORMATION. This is a bunch of BS. There has to be a way of transferring every one and zero (and that's all it is, folks) to the input of another device correctly and at the correct time. Jitter correction etc. still doesn't account for people saying that coaxial has better sound quality than optical, and I still haven't seen any actual proof that this is the case. The opinions are SUBJECTIVE, and that is ridiculous, since the digital medium by nature is not subjective at all.

    Regardless of what anyone says currently, however many years from now we will have music players (not CD by then) where every price class of player transfers every digital bit to a device perfectly, and we will think people who had audio equipment of this era were real morons for paying $20,000 to get the "top of the line" digital transports for their high-end systems. If the transport is digital, it really is digital, and every single bit is being transferred correctly.
     
  7. StanleyK

    StanleyK Extra

    Joined:
    Mar 22, 2002
    Messages:
    19
    Likes Received:
    0
    Trophy Points:
    0
    You should read this....
    http://shop.barnesandnoble.com/booksearch/isbnInquiry.asp?isbn=0071348190
     
  8. MikeKaz

    MikeKaz Stunt Coordinator

    Joined:
    Aug 6, 2001
    Messages:
    152
    Likes Received:
    0
    Trophy Points:
    0
    I guess that would give me the reasons why they can't right now, but I'm too poor... and it's not digital AUDIO (there's no such thing, really) that's the issue, it's DATA. Why can't they send that right...
     
  9. John Sully

    John Sully Stunt Coordinator

    Joined:
    Feb 25, 1999
    Messages:
    199
    Likes Received:
    0
    Trophy Points:
    0
    Mike,

    It is not so much that the digital data is corrupted; you would need a jitter factor equal to or greater than 0.25T (T being the bit clock period) for that to happen. Rather, the analog waveform generation circuitry is clocked from a jittery source, which leads to samples being converted at a time slightly skewed from the time at which they were recorded. Whether or not this results in audible degradation of the signal is disputable -- I'm on the fence here -- but it is a real problem in the design of DACs. The clock PLL circuitry in traditional DACs will try to keep the DAC clock in sync with the recovered clock signal from the SPDIF interface, and this results in the above-mentioned incorrect conversion.
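
    To put a rough number on the effect (a quick simulation in Python; the 1 ns RMS jitter figure is just an example I picked, not a measurement of any real gear):

        import math
        import random

        fs = 48_000          # sample rate (Hz)
        f = 10_000           # test tone (Hz)
        T = 1 / fs
        jitter_rms = 1e-9    # example figure: 1 ns RMS clock jitter

        err_sq = sig_sq = 0.0
        for n in range(10_000):
            ideal = math.sin(2 * math.pi * f * n * T)
            # correct sample value, but converted a jittery instant early/late:
            skewed_t = n * T + random.gauss(0, jitter_rms)
            actual = math.sin(2 * math.pi * f * skewed_t)
            err_sq += (actual - ideal) ** 2
            sig_sq += ideal ** 2

        print(f"jitter noise is {10 * math.log10(err_sq / sig_sq):.0f} dB "
              f"below the signal")   # about -84 dB for these numbers

    Whether an artifact 84 dB down is audible is exactly the part that's disputable.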

    Most of the high-end jitter traps I have seen described use a brute-force approach: buffer the incoming data, sometimes in a ridiculously oversized buffer, and reclock it to the output with a supposedly more accurate clock. This is an obvious and workable approach, but it does run up the silicon costs, which matter a great deal in ASICs, especially when as much functionality is incorporated into one chip as is found on modern DAC chips. The beauty of the circuit described in the Analog Devices patent is that it can be implemented with a 3-bit shift register clocked from a master clock locked to the initial clock frequency detected by the PLL circuit, which controls the clocking of the change detector. As long as jitter from the source remains below the 0.25T level, the level at which data corruption could occur, the input clock to the DAC is completely decoupled from the recovered clock on the SPDIF connection. The result is immunity from non-data-corrupting jitter.

    As far as clocking within a computer, it ain't as simple as you might think. Clock phase shifts induced by the length of a clock distribution trace on a motherboard are a very real problem, and much work goes into designing a clock distribution system that limits these phase shifts. The idea is that, for any reasonable definition of a time interval T, things which are supposed to happen at time T do happen during that interval in every place they are supposed to happen. This is why the front side bus of a modern processor runs at a fraction of the processor clock speed. It is why external buses such as PCI run at a fraction of the speed of the front side bus. And it is why your IDE cable is limited in length to (I forget the exact length) 12 inches or so.
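
    Back-of-the-envelope (assuming the common rule of thumb of roughly 170 ps of propagation delay per inch of FR4 trace, which is my number, not anything from a datasheet):

        # Rough clock-skew arithmetic, assuming ~170 ps/inch on FR4.
        PS_PER_INCH = 170
        TRACE_INCHES = 6

        for clock_mhz in (33, 133, 1000):
            period_ps = 1e6 / clock_mhz            # one clock period, in ps
            skew_ps = PS_PER_INCH * TRACE_INCHES   # delay down the trace
            print(f"{clock_mhz:5} MHz: a {TRACE_INCHES}-inch trace skews the "
                  f"clock by {skew_ps / period_ps:.0%} of a period")

    At 33 MHz (PCI speeds) a 6-inch trace costs a few percent of a period; at processor speeds the edge arrives more than a full period late, which is why the fastest clocks never leave the chip.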

    I have no doubt that a poorly designed coax cable can induce jitter, although I am less sure about the ability of TOSLink cables to do the same. The longer the coax interconnect, the greater the chance of inducing jitter because of internal reflections. Personally, I think the greatest source of jitter in this system is poor output clocking circuitry, not the cable. Any properly designed cable (basically, this means 75-ohm termination) should have a nil or minimal effect on the sound of a digital connection.
     
  10. Saurav

    Saurav Cinematographer

    Joined:
    Feb 15, 2001
    Messages:
    2,174
    Likes Received:
    0
    Trophy Points:
    0
     
  11. John Sully

    John Sully Stunt Coordinator

    Joined:
    Feb 25, 1999
    Messages:
    199
    Likes Received:
    0
    Trophy Points:
    0
    I have to disagree with the idea that digital audio data is somehow different from more prosaic types of digital data. It is, after all, just zeros and ones. The central problem is that, because of the economics of ASIC production, DACs have long used the -- possibly jittery -- input clock to drive the conversion process. The reason that DD or DTS bitstreams do not suffer from jitter is the extensive decoding which must take place: it is well-nigh impossible to use the input clock to drive the output, so jitter is not a problem.

    Similarly, if the input clocking of standard SPDIF data is decoupled from the output clocking used to re-form the analog waveform, jitter is not a problem. This is not an issue of data transmission; rather, it is an issue of the economics of ASIC production, which generally forces a simple PLL lock on the input clock and its subsequent use to clock the output. Think of it this way: if you have a clock period of T and a jitter factor of 0.1T, it means that a sample meant for time T can modulate the output signal of the DAC anywhere in the range 0.9T to 1.1T. This inevitably produces some distortion in the output signal -- distortion which is due to jitter and not to incorrect transmission of the data. There is no magic here, just a question of how the output clock for the DAC is derived.
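
    For a feel for the worst case (simple slope arithmetic on a full-scale sine; the frequencies and the 2 ns timing error are just example numbers I chose):

        import math

        # x(t) = A*sin(2*pi*f*t), so a timing error dt causes an amplitude
        # error of at most |x'(t)|*dt = 2*pi*f*A*dt (worst at zero crossings).
        A = 1.0                       # full-scale amplitude
        dt = 2e-9                     # example timing error: 2 ns
        for f in (1_000, 20_000):     # low and high audio frequencies
            err = 2 * math.pi * f * A * dt
            print(f"{f / 1000:4.0f} kHz tone: error up to "
                  f"{20 * math.log10(err / A):.0f} dB below full scale")

    The same timing error hurts much more at high frequencies, which is why jitter figures only mean something in the context of the signal being converted.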
     
  12. Mike Stutzman

    Mike Stutzman Auditioning

    Joined:
    Jan 2, 2000
    Messages:
    10
    Likes Received:
    0
    Trophy Points:
    0
    I contend that jitter should be considered a problem in the decoder, not the source. Using the unreliable timing of the incoming data stream to run the DAC seems like an obvious mistake to me. If all DACs had independent clocks (perhaps by using the shift register mentioned above), jitter would never have emerged as an issue.

    The data is (generally) intact when it arrives; it's just not being converted back to audio accurately. That is because the DACs were designed with an output technique I would never have accepted had I been the engineer.
     
  13. Saurav

    Saurav Cinematographer

    Joined:
    Feb 15, 2001
    Messages:
    2,174
    Likes Received:
    0
    Trophy Points:
    0
    Agreed with both of your statements. I guess what I was trying to say was that the way PCM audio transmission and clocking works in most implementations makes it a very different ballgame from digital data transmission. And yes, there are simple ways to decouple the data stream from the timing, which would practically eliminate the effects of upstream jitter.
     
  14. Chris_Freeman

    Joined:
    Mar 15, 2002
    Messages:
    25
    Likes Received:
    0
    Trophy Points:
    0
    I have to agree with JohnS on this issue: when it comes down to the 1s and 0s, it doesn't matter whether it's data or audio, it's still 1s and 0s. Where the problem comes in, like John said, is when the digital signal is converted to analog. New and better ways of doing this have been developed, but it's still not perfect.

    I can understand your frustration, Mike, but comparing a PC to audio gear in this sense is comparing apples to apples while ignoring the fact that some of the apples are turned into apple juice. In a PC the digital signal stays digital, while in an audio system it has to be converted to analog, which, like John said, is the proverbial bad apple. Keep in mind also that just because something is technically correct doesn't mean it's going to sound better.

    Look at the digital audio systems used in car audio now. It's digital from start to finish (theoretically). I'm not familiar with the technicalities of it, but IMO it doesn't sound any better.

    Also, love the quote John!
     