Thanks, Amir. But don't the "better DACs" reclock that signal from S/PDIF and AES/EBU?
Actually, almost all DACs do that, in the sense that the clock they need for the DAC runs at a different frequency than what comes over S/PDIF and AES/EBU. The issue is that this doesn't solve anything. See this article I wrote on the topic:
http://www.audiosciencereview.com/f...performance-pc-server-interfaces-async-usb.8/
The architecture you imagine is one where the DAC detects the sample rate of incoming data over S/PDIF and plays at that rate on its own. The problem is that the standard does not mandate that the number of samples actually match the stated sample rate. Instead of 44,100 samples/second, the sender may deliver 44,101 samples/second on average. Or 44,099. The rate can vary by up to 5%. Let's take the case of the sender/server being one sample per second too slow. You fill a buffer (memory) with some audio samples and then start playing. Because the DAC clock is running faster, it eventually consumes all the data in the buffer and starves for audio samples. The moment this happens, there will be a nasty glitch as the DAC has nothing to play.
The reverse happens when the DAC clock runs a bit slow: the buffer keeps growing until it runs out of space, and samples have to be discarded.
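To put a number on this, here is a back-of-the-envelope sketch (my own illustration, with hypothetical rates and buffer size, not figures from any particular DAC) of how fast a one-sample-per-second mismatch drains a buffer:

```python
# Illustrative sketch: how a tiny clock mismatch between sender and DAC
# drains a playback buffer. All numbers here are hypothetical.

SENDER_RATE = 44_099      # sender delivers one sample/second too few
DAC_RATE    = 44_100      # DAC consumes at its own clock rate
BUFFER_FILL = 4_410       # start with 100 ms of audio buffered

def seconds_until_underrun(fill, sender_rate, dac_rate):
    """With the DAC consuming faster than the sender delivers,
    the buffer drains at (dac_rate - sender_rate) samples/second."""
    drain_per_second = dac_rate - sender_rate
    return fill / drain_per_second

t = seconds_until_underrun(BUFFER_FILL, SENDER_RATE, DAC_RATE)
print(f"Buffer starves after {t:.0f} seconds ({t / 60:.1f} minutes)")
```

With these numbers the glitch arrives after a bit over an hour; a larger mismatch, or a smaller buffer, gets you there proportionally sooner. The overrun case is the same arithmetic with the sign flipped.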
There is another problem with video. There, audio is slaved to video in that every frame of video carries X number of audio samples. Here, you are guaranteed to have a different number of samples than the stated sample rate implies. The assumption in this architecture is that the DAC will play exactly that number of samples, not what the sample rate says. If you do otherwise per the above, over time your soundtrack runs too slow or too fast and drifts out of sync. If you do the math, you will see that in just a few minutes the audio gets so far out of sync that you notice it not matching the video.
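Doing that math with made-up but plausible numbers (a stream delivering 5 samples/second more than the DAC's nominal rate; both figures are my assumptions for illustration):

```python
# Back-of-the-envelope sketch: how fast audio drifts out of sync with
# video when the DAC free-runs at the nominal rate and ignores the
# actual sample count. Rates below are hypothetical.

NOMINAL_RATE = 48_000     # what the DAC plays at (samples/s)
ACTUAL_RATE  = 48_005     # what the video stream actually delivers

def drift_ms(minutes, nominal, actual):
    """Accumulated audio/video offset in milliseconds after `minutes`."""
    surplus_samples = (actual - nominal) * minutes * 60
    return 1000 * surplus_samples / nominal

for m in (1, 5, 10):
    print(f"after {m:2d} min: {drift_ms(m, NOMINAL_RATE, ACTUAL_RATE):.1f} ms off")
```

After ten minutes the offset is over 60 ms, which is well into lip-sync-error territory for most viewers.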
The issue is that the DAC cannot tell the server/sender to slow down or speed up. It must act as a real-time slave and do what it is told at the rate the data arrives.
The solution to all of this is to have a local clock for the DAC but adjust its timing constantly up and down to track the rate of audio samples coming over S/PDIF and AES/EBU. This means that all the noise and vagaries of these interfaces also bleed into your clock accuracy. Well-implemented devices are able to clean this up and offer exceptional performance, but it is still a kludge.
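A toy sketch of that adjustable-clock idea (my own simplified model, a bare proportional control loop rather than the PLL a real receiver chip uses): the DAC clock is nudged so the buffer level stays near a target, which is exactly the path through which incoming jitter leaks into the local clock.

```python
# Toy model: a control loop trims the DAC clock to track the incoming
# S/PDIF rate by holding the buffer level near a target. Hypothetical
# numbers; real receivers use a PLL, but the principle is the same.

NOMINAL  = 44_100.0       # the DAC's own crystal (samples/s)
INCOMING = 44_102.0       # what actually arrives over S/PDIF
TARGET   = 4_410          # desired buffer fill (~100 ms)
GAIN     = 0.001          # loop gain: source jitter leaks in here

def trimmed_rate(buffer_fill):
    """Proportional control: run the DAC a bit fast when the buffer is
    above target, a bit slow when below, so it tracks the source."""
    return NOMINAL + GAIN * (buffer_fill - TARGET)

fill, rate = float(TARGET), NOMINAL
for _ in range(10_000):           # one control step per second, simulated
    fill += INCOMING - rate       # samples in minus samples out
    rate = trimmed_rate(fill)

print(f"local clock settles near {rate:.1f} samples/s")
```

The loop converges on the incoming rate, but notice that the clock is now a function of the buffer level, so any irregularity in arrival timing modulates it. Cleaning that up well is what separates good S/PDIF implementations from poor ones.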
With asynchronous USB, you get to pick your DAC clock and keep it constant. Then a microprocessor fetches chunks of data at a rate that keeps it ahead of the DAC speed. This way there is no incoming data to synchronize to.
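A minimal sketch of that asynchronous idea (my own abstraction, with hypothetical buffer sizes; a real DAC signals the host through USB feedback endpoints rather than a method call): the DAC consumes at its own fixed clock and simply requests more data whenever its buffer runs low, so nothing has to be recovered from the wire.

```python
# Minimal model of asynchronous transfer: the DAC clock is fixed, and
# the host is pulled for chunks whenever the buffer runs low. Sizes
# are hypothetical; the buffer is abstracted as a sample count.

DAC_RATE  = 44_100        # fixed, chosen by the DAC
CHUNK     = 4_410         # host delivers 100 ms per request
LOW_WATER = 8_820         # ask for more below 200 ms buffered

class AsyncSink:
    def __init__(self):
        self.buffer = 0       # samples queued
        self.requests = 0     # how many chunks we pulled from the host

    def tick_one_second(self):
        """Consume one second of audio at the DAC's own clock, then
        top the buffer back up by pulling chunks from the host."""
        self.buffer -= DAC_RATE
        while self.buffer < LOW_WATER:
            self.buffer += CHUNK      # host obeys the DAC's request
            self.requests += 1

sink = AsyncSink()
for _ in range(5):
    sink.tick_one_second()
print(f"buffered: {sink.buffer} samples after {sink.requests} requests")
```

The key point is the direction of control: the DAC's clock never moves, and the host's delivery rate adapts to it instead of the other way around.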
And don't the biggest proponents of USB claim that the USB signal contains a large amount of "noise"? Of course, they can solve the problem via a $2K converter, like the one from Berkeley...
All else being equal, that is true. USB, being a computer interface, usually requires its own processor to interpret its protocol, and hence is more complicated and capable of creating more interference. And as you mention, the interface is designed for data use, so it can easily bleed tons of noise from the computer into your DAC. These are very much solvable problems though, and it is what we pay the design engineers to do.
To be sure though, you do want to see the measurements of the interfaces. If USB doesn't measure up, then sure, S/PDIF would be the preferred interface. Then again, where are you getting your S/PDIF interface on your computer? In this day and age it would have to be an add-on interface, and you would need to measure that to be sure.
I am not sure if both of these engineering solutions are a wash sonically, (and I would rather kill myself than do the A/B comparisons myself!), but the thought of a computer and a USB cable in the listening room is a huge buzzkill also...
That is what I have and it is not a buzzkill at all. There are networked solutions like the Regen if you want to put the computer/NAS elsewhere. You would then need to deal with Ethernet as your interface, but it does eliminate the computer in the room.