Bit-perfect USB audio is an oxymoron.

Tam Lin

Well-Known Member
Mar 20, 2011
North Texas
The USB Audio standard specifies the isochronous transfer protocol. Isochronous transfer guarantees timely delivery of the data but does nothing to ensure its correct reception. With USB 1.1 that was a reasonable trade-off because data transfer was slow (12Mb/s) and latencies were large (1ms). An occasional 1-millisecond dropout due to a corrupted packet was preferable to multi-millisecond delays due to retries and interference from other USB traffic. During the initialization of an isochronous device, the USB host reserves enough bus bandwidth to meet the needs of the device. Thereafter, every millisecond, the device gets one chance to send or receive data.
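For a rough sense of scale, here is a back-of-envelope sketch of how much Red Book audio travels in each 1ms full-speed frame versus the frame's raw capacity. The numbers are illustrative only; the actual reserved bandwidth depends on the device's descriptors, and protocol overhead eats into the raw capacity.

```python
# Back-of-envelope numbers only; actual reserved bandwidth depends on the
# device's descriptors, and protocol overhead reduces the raw frame capacity.
SAMPLE_RATE = 44_100        # Hz, Red Book stereo
CHANNELS = 2
BYTES_PER_SAMPLE = 2        # 16-bit PCM

bytes_per_second = SAMPLE_RATE * CHANNELS * BYTES_PER_SAMPLE   # 176,400 B/s
bytes_per_frame = bytes_per_second / 1000                      # per 1 ms frame
raw_frame_capacity = 12_000_000 / 8 / 1000                     # full speed, 12 Mb/s

print(f"audio payload per 1 ms frame : {bytes_per_frame:.1f} bytes")
print(f"raw full-speed frame capacity: {raw_frame_capacity:.0f} bytes")
print(f"one corrupted frame          : ~{SAMPLE_RATE // 1000} samples/channel lost")
```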

In the early days of USB Audio, dropouts were a common occurrence, whether due to lost packets or to player software on slow CPUs that could not keep up with the sample rate. Now USB 2.0 is the norm, with 40 times the data rate and one-eighth the latency. Still, I occasionally hear little glitches when listening to USB Audio. Is it in the source or in the delivery? Invariably, when I back up and replay the section where I heard the glitch, it is not there.

Could it be an isochronous USB dropout? To find out, I needed to know what a 125µs dropout sounded like. I wrote a foobar2000 plugin that introduced a periodic 125µs silence accompanied by a visual cue to mark the event. The results were unexpected.
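The plugin itself is not shown here, but a hypothetical offline equivalent is easy to sketch. The script below assumes the third-party soundfile package is installed; the file names and the 2-second spacing between dropouts are made up for illustration.

```python
# Hypothetical offline stand-in for the plugin described above: mute 125 µs of
# samples at a fixed interval so the effect can be auditioned.
import soundfile as sf

DROPOUT_S = 125e-6          # one USB 2.0 microframe
PERIOD_S = 2.0              # one simulated dropout every 2 seconds (arbitrary)

audio, rate = sf.read("input.wav", always_2d=True)
gap = max(1, int(round(DROPOUT_S * rate)))       # ~6 samples at 44.1 kHz
step = int(round(PERIOD_S * rate))

for start in range(step, len(audio), step):
    audio[start:start + gap] = 0.0               # hard mute, like a lost packet

sf.write("with_dropouts.wav", audio, rate)
```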

Years earlier I performed a similar experiment simulating 1ms dropouts. Those sounded like ‘negative’ LP ticks. LP ticks are about 1ms in duration and create a large, transient output. The ticks are audible during the loudest and the softest sections of the recorded music and even the lead-in and lead-out grooves. On the other hand, USB dropouts were 1ms of silence; inaudible during inter-track gaps, barely noticeable during soft passages, but most obvious during loud passages.

As expected, the 125µs dropouts were inaudible during inter-track gaps and very quiet passages but during loud passages, the audibility depended on the dominant frequencies and texture of the music. In addition, the audibility depended on the DAC. Each of the three DACs I tried, including old-school multi-bit and new-school delta-sigma, responded differently.

The solution is to use a USB transfer method that guarantees correct reception. That is easy to do with USB 2.0 using either bulk or interrupt transfers. I am working on a DAC with a non-isochronous USB interface that can sustain 40MB/s with zero errors. Now, that’s bit-perfect.
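As a rough illustration of why bulk transfers qualify, here is a minimal pyusb sketch. The vendor/product IDs and endpoint address are placeholders for a hypothetical non-isochronous USB DAC, not a real product, and pyusb plus a libusb backend are assumed to be installed.

```python
# Sketch only: IDs and endpoint below are placeholders for a hypothetical device.
import usb.core

dev = usb.core.find(idVendor=0x1234, idProduct=0x5678)   # placeholder IDs
if dev is None:
    raise RuntimeError("device not found")
dev.set_configuration()

BULK_OUT_EP = 0x01               # placeholder bulk OUT endpoint address
pcm_chunk = bytes(4096)          # one buffer of PCM samples (silence here)

# Bulk data packets are CRC-checked and retried by the host controller; if
# write() returns without raising, every byte arrived intact.
written = dev.write(BULK_OUT_EP, pcm_chunk, timeout=1000)
assert written == len(pcm_chunk)
```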
 

Tam Lin

Well-Known Member
Mar 20, 2011
North Texas
Might it be that the dropouts you hear are not a USB problem but a case of a system a bit too high on DPC latency?
I never said what I heard was a USB problem. I just wondered, if it was a USB dropout problem, what it would sound like.
 

Tam Lin

Well-Known Member
Mar 20, 2011
North Texas
This made me think you consider isochronous USB a problem

I wasn't sure the glitches I heard were async dropouts; that's why I conducted the test. I know async USB drops packets. When running the USB throughput tests I tried each of the transfer protocols. The PC sent packets as fast as it could and the receiver immediately cleared its buffer after each packet received. After each run, the bulk and interrupt versions each reported a small but non-zero number of retried transmissions. Of course, the async version reported no errors because the receiver doesn't acknowledge transactions. I assume the electrical environment was similar for each test and that the rate of transmission errors experienced by bulk and interrupt was also experienced by async.

Async USB has other features that make it unsuitable for my use. The protocol requires tremendous CPU resources to administer at either end of the pipe. For example, the popular XMOS receiver uses an 8-core, 32-bit, 500MIPS microprocessor. And, as I pointed out in another thread, most async receivers send the high-precision master oscillator into the depths of the DSP to synchronize with the data and create the primary control signals for the DAC chip. With or without galvanic isolation, that is very bad practice.
 

Joe Whip

Well-Known Member
Feb 8, 2014
Wayne, PA
I had intermittent dropouts and then used a device that strips out the computer's USB power and substitutes clean power. The dropouts are now totally gone. Maybe you should try that. ifi has one, as does Schiit. Try the Wyrd.
 

DonH50

Member Sponsor & WBF Technical Expert
Jun 22, 2010
Monument, CO
Interesting test, Tam. I would not have expected 125 us to be audible but don't really know our threshold for such things.

I suppose if it's bit-perfect at the actual DAC it doesn't matter if asynch or not, how big the elasticity buffer, etc. If I were to build the perfect DAC subsystem I'd probably buffer the entire CD into a well-isolated SSD and forget the USB link. However, that won't do for streaming music.
 

alexandrov

New Member
May 28, 2012
Sofia
I had intermittent dropouts and then used a device that strips out the computer's USB power and substitutes clean power. The dropouts are now totally gone. Maybe you should try that. ifi has one, as does Schiit. Try the Wyrd.

Wow... I just replaced my power supply with a new fanless one and now there are pops and dropouts. I didn't want to blame it, but...
 

Fitzcaraldo215

New Member
Nov 3, 2014
No evidence of dropouts here, unless they are, unbelievably, somehow euphonic and make the sound "better" in direct comparison to the silver disc from an Oppo player via HDMI or coax. Also, my plain-vanilla i7 PC with no exotic parts, cables, etc. will play 6-channel DSF rips from JRiver, converting to 88k PCM, apply Dirac Live EQ and feed either my prepro via HDMI or my Exasound e28 via asynch USB without any sonic degradation. Actually, it does so with better sound than the silver disc directly into my prepro, with or without Dirac on. Task Manager says CPU consumption is under 20-25%. Running other unrelated tasks on the PC simultaneously during playback and boosting CPU utilization significantly has no audible effects I can hear, certainly not dropouts.
 

barrows

Well-Known Member
Jun 28, 2012
Boulder, CO
As long as asynchronous USB is well implemented (in the server and DAC) it is bit perfect, and there are no dropouts. Of course, there are plenty of ways which async USB might not be well implemented.
Note: if you have any interruption of the bit-perfect data stream, there will be an audible "tic" or silence, as there is no "error correction" present to fill the gap.
I use a custom server, with an SOtM USB output card, which is powered from a separate power supply rail with its own ultra low noise regulation, and a Sonore isolated USB interface in my DAC, and I never have a dropout or any "tic" related to USB performance. Async USB is essentially perfect when implemented correctly for 2 channel music playback up to 384 kHz.
 

amirm

Banned
Apr 2, 2010
Seattle, WA
While it is typical to do nothing on buffer underruns, i.e. to let the ticks and pops through, there is no reason why error concealment cannot be implemented in the DAC. The simplest form of this is a soft mute. As soon as the DAC runs out of data, it can simply ramp down the volume, and when new data arrives, ramp it back up. If it is a short period, it may not even be noticed.
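A minimal sketch of that soft mute, assuming mono PCM and an arbitrary 2ms ramp (this is not any particular DAC's firmware, just the idea in code):

```python
# Fade out into the gap, fade back in when data resumes. Mono for simplicity;
# the 2 ms ramp length is an arbitrary assumption.
import numpy as np

def conceal_gap(before, after, rate, ramp_s=0.002):
    """Join two mono PCM segments separated by an underrun with a fade-out/fade-in."""
    ramp = int(ramp_s * rate)
    before = before.astype(float).copy()
    after = after.astype(float).copy()
    if ramp and len(before) >= ramp:
        before[-ramp:] *= np.linspace(1.0, 0.0, ramp)   # ramp volume down
    if ramp and len(after) >= ramp:
        after[:ramp] *= np.linspace(0.0, 1.0, ramp)     # ramp volume back up
    return np.concatenate([before, after])
```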

Far more sophisticated error concealment technology exists and is routinely used for telephony/video conferencing. Think of the odd modulations you sometimes hear as people speak in weak cell-phone reception areas. It is a science unto itself, although very much focused toward speech. For music, the above soft muting should be standard.
 

Vincent Kars

WBF Technical Expert: Computer Audio
Jul 1, 2010
Often the problems with USB are not USB problems.
Underruns are in general a matter of a PC with high latency, so it cannot feed the internal USB hub in time.

I have heard of implementations using async USB that simply request too much data (more than required to maintain a steady sample rate) from the PC. The data is stored in a buffer in the DAC, so if the PC cannot maintain the stream, one literally has a buffer.
 

amirm

Banned
Apr 2, 2010
Seattle, WA
Buffer underrun is always a PC problem. The audio stream needs to be real time but the PC (or just about any audio server you use) is not. Buffering helps but, at the extreme, if the PC becomes too busy it can fall behind and glitching occurs. This happens with USB, internal audio, pro sound cards, etc. The only mitigating factor is what you say, Vincent, which is more buffering in the DAC.
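A toy model of that elastic buffer shows the mechanism; the buffer size, burst size and stall pattern below are all made-up numbers, not measurements of any real system:

```python
# The DAC drains at a fixed rate; a hypothetical PC refills in bursts but
# occasionally stalls. A deeper buffer rides out longer stalls before glitching.
import random

RATE_PER_MS = 44            # ~44.1 kHz: samples the DAC consumes every millisecond
BUFFER_MAX = 8_192          # DAC-side buffer depth, in samples (assumed)
BURST = 4_096               # samples delivered per host transfer (assumed)

level = BUFFER_MAX
dropout_ms = 0
stall_until = -1
random.seed(0)

for ms in range(10_000):                    # simulate 10 seconds in 1 ms steps
    level = max(0, level - RATE_PER_MS)     # the DAC keeps draining regardless
    if level == 0:
        dropout_ms += 1                     # buffer ran dry: one ms of glitch
    if ms > stall_until:
        if random.random() < 0.002:         # rare: the PC gets busy for ~300 ms
            stall_until = ms + 300
        elif level <= BUFFER_MAX - BURST:
            level += BURST                  # normal case: host refills promptly

print(f"milliseconds of dropout in 10 s: {dropout_ms}")
```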
 

Orb

New Member
Sep 8, 2010
Just curious,
what current tools are used to test bit-perfect delivery at the USB receiving end device?
The context being isochronous audio-streaming transfers rather than, say, testing with data storage devices.

Cheers
Orb
 

Orb

New Member
Sep 8, 2010
There are tools like this: http://www.passmark.com/products/usb2loopback.htm

As far as I know there is no difference between isochronous and bulk mode transfer as far as error detection (CRC errors) is concerned, except that in the case of isochronous there won't be a retry when an error occurs.
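For what it's worth, data packets for both transfer types carry the same CRC16 (polynomial x^16 + x^15 + x^2 + 1, i.e. 0x8005, per the USB spec). A purely illustrative software model of that check, just to show what the hardware verifies; in real life the controller does this, and the only difference between the modes is whether a failed check triggers a retry:

```python
# Illustrative software model of the USB data-packet CRC16 (LSB-first bits).
def usb_crc16(data: bytes) -> int:
    crc = 0xFFFF
    for byte in data:
        for bit in range(8):
            if (crc ^ (byte >> bit)) & 1:
                crc = (crc >> 1) ^ 0xA001   # 0x8005 reflected for LSB-first bits
            else:
                crc >>= 1
    return crc ^ 0xFFFF

assert usb_crc16(b"123456789") == 0xB4C8    # standard check value for CRC-16/USB
```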

If there is no difference from bulk mode transfer in error detection, then this raises a concern when you consider audio streaming; remember Jim Lesurf and the massive jitter issue for DACs in that situation?
Pretty sure error detection/correction is handled differently for audio streaming compared to data transfer; different technology, but the same way we have UDP (audio/video streaming) and TCP (guaranteed data) that behave differently for detection/correction.

Thanks for the link on the loopback tester. Yeah, like you, it was something I did think of, but I feel it probably has limited scope in what it can do; after all, I am pretty sure it will show there is no need for an asynchronous USB DAC setup - a la the issue Jim Lesurf eloquently shows.
And that is just one variable; I remember Paul Miller showing just how random some USB cables are in terms of jitter performance, and again that will show no issues in the loopback test - I think so, anyway :)

Coming from a telecom background: quite often loopback BERT testing would not necessarily show a problem with the low-level frame/network layer.

I do wonder if there needs to be a very specific test that focuses on time-sensitive streaming end-to-end; that would mean going from the application down to framing, then decoding and analysing in the context of the receiver and DAC.
But then the issue is probably not big enough to warrant going to that expense in time and resources to develop.
It does not help that something like the J-test is not ideal for USB solutions and considerations either.

Just mulling this over.
Cheers
Orb
 

Vincent Kars

WBF Technical Expert: Computer Audio
Jul 1, 2010
Can't escape the feeling you mix up error detection, error correction and jitter performance.

If we talk S/PDIF, Red Book or isochronous USB, there is a method to detect errors.
At design time they decided not to implement bit-perfect data transmission.
Hence there is no error correction.
Of course they deploy all kinds of tricks to make data transmission as stable and reliable as possible, but if you push things to the limit, bit-perfect transmission is not guaranteed.

IMHO, this has nothing to do with jitter performance. E.g. a S/PDIF sender might send the right bits but do so with high intrinsic jitter.
Likewise, a high-end transport might send the data to a DAC with extremely low jitter, but if the CD is scratched the data is interpolated by the player, so it is not bit-perfect at all in relation to the source.
 

Orb

New Member
Sep 8, 2010
Can't escape the feeling you mix up error detection, error correction and jitter performance.

If we talk S/PDIF, Red Book or isochronous USB, there is a method to detect errors.
At design time they decided not to implement bit-perfect data transmission.
Hence there is no error correction.
Of course they deploy all kinds of tricks to make data transmission as stable and reliable as possible, but if you push things to the limit, bit-perfect transmission is not guaranteed.

IMHO, this has nothing to do with jitter performance. E.g. a S/PDIF sender might send the right bits but do so with high intrinsic jitter.
Likewise, a high-end transport might send the data to a DAC with extremely low jitter, but if the CD is scratched the data is interpolated by the player, so it is not bit-perfect at all in relation to the source.

Not at all mixed up on it, Vincent, in general (though maybe I am on the use of the bulk mode transfer used by DACs, which Halide etc. managed to improve); I just also added jitter performance to this discussion because it rarely shows up as errors in the context of this topic. Just to add, it was not me who started the discussion in this thread regarding detection and correction.

To put it into context: are you saying the Jim Lesurf jitter issue (when correlated as jitter it is massive, as it is not really wow or flutter) with a bulk-mode-transfer DAC would not show errors/would not need error correction, or that there is no need for them?
All three are a consideration, but we are digressing a bit from my previous post, which was more about how exactly you analyse bit-perfect asynchronously streamed audio (the loopback test ties more into traditional bulk mode transfer and loses the focus on async DAC audio streaming).

It sort of reminds me how many say wireless and Ethernet mean streamed data is bit perfect, but it is not in this instance, due to limited detection and correction when implementing time-sensitive streaming.
Detection and correction, to me, covers functionality from the physical network/framing level up to and including the session/application layers; depending upon implementation and scope this can mean guaranteed data or not - please appreciate this is just a very simple summation of my opinion and experience.
And as Amir says, there is also error concealment as part of this, which most of us take for granted and assume works well (although, as he points out, this is probably a bad assumption).

Cheers
Orb
 

Orb

New Member
Sep 8, 2010
Just to add.
The reason the J-test is a poor way to measure these days is that it is not a simulator/modeller of jitter but actually stimulates jitter, using a worst-case scenario that is specific to the traditional interfaces plus DAC (specifically AES/EBU and S/PDIF).

Cheers
Orb
 

Vincent Kars

WBF Technical Expert: Computer Audio
Jul 1, 2010
Assuming you are referring to this article: http://www.audiomisc.co.uk/Linux/Sound3/TimeForChange.html

Jim Lesurf compares the performance of the DacMagic (isochronous USB with adaptive mode synchronization) with the Halide Bridge (isochronous USB with asynchronous mode synchronization).
It is beyond me what this has to do with error detection (CRC, implemented in hardware), error correction (not applicable to isochronous USB as it is a quasi-real-time stream) or bulk mode (not used by either product).
 
