
Thread: Digital Audio Jitter explained in simple terms

  1. #1
    Banned
    Join Date
    Apr 2010
    Location
    Seattle, WA
    Posts
    16,046

    Digital Audio Jitter explained in simple terms

    I got tired of explaining jitter for the 100th time, so last year I decided to write an article on it for Widescreen Review Magazine. Now that the issue has been out for a while, I converted it to HTML and posted it on the web site: Digital Audio: The Possible and Impossible.

    As usual, let me know if you see any typos and of course, any questions you might have.

    Here is the intro:

    "Pop quiz: Which of these statements would you say is true?
    1. If you change the HDMI cable between your Blu-ray player and AVR/Processor, the analog audio output can change.
    2. If you turn off the display and video circuits in your Blu-ray player or AVR/Processor, the analog audio output can change.
    3. If you have a DAC with USB and S/PDIF inputs, changing from one to the other can change the analog output.
    4. If you change from HDMI input to S/PDIF on your AVR or Processor, the analog output can change."

  2. #2
    Member Sponsor [WBF Founding Member] Johnny Vinyl's Avatar
    Join Date
    May 2010
    Location
    Calgary, AB
    Posts
    8,561
    Thanks so much for that article Amir! This is the type of writing I appreciate as a non-technical person. So if I read correctly, I would be better off using an S/PDIF cable going from my BDP to my AVR, as opposed to HDMI. But what of an analog connection (if available, of course)? Is this preferred over S/PDIF?

    Pop Quiz answers:

    1. NO
    2. YES
    3. YES
    4. YES
    I love the smell of vinyl in the morning!
    John Adrian Spijkers - "Live Life! Leave A Legacy!"
    Dynavector DV20x2L - Genesis G7.1f - Marantz Reference PM KI Pearl - Nitty Gritty 2.5Fi - Oracle Paris Mk.V

  3. #3
    Member Sponsor Addicted to Best! Ronm1's Avatar
    Join Date
    Feb 2011
    Location
    wtOMitMutb NH
    Posts
    1,700
    Quote Originally Posted by John72953 View Post
    But what of an analog connection (if available, of course)? Is this preferred over S/PDIF?
    I'll bite. The simple answer is: if the DACs are better at the source, yep. If not, nope. Of course we could make it more complicated, but then...
    A Bug!! Naa...that's a feature!!
    Cables at DAC end impedance matched. Synergistic

  4. #4
    Addicted to Best! JonFo's Avatar
    Join Date
    Jun 2010
    Location
    Big Canoe, GA
    Posts
    318
    Amir, good article.

    However, we need to clarify that all this applies to LPCM data, such as the typical CD data stream over SPDIF.

    For BluRay movie and audio soundtracks encoded with packetized protocols such as Dolby TrueHD or DTS-MA (or heck even Dolby AC3 and legacy DTS), the transfer of these error-corrected high-level protocols to the AVR will result in complete immunity to cable or interface clocking variations.

    So the answer to #1 and #2 above would be: there is no difference in analog output when using non-LPCM sources.

    Bitstreaming packetized protocols from the BluRay player to an AVR results in the AVR doing all the protocol unwrapping and decoding, then generating the LPCM streams for its internal DACs using the same master clock the DACs will use, reducing jitter possibilities to their absolute minimum. Ideally, there is none, as source and DAC are slaved to the same clock.

    So for all movie and most music content available on BluRay, HDMI is just fine from an audio standpoint.

    For CD players and other LPCM sources with no clock sync, I totally agree: jitter is a challenge.
    - Jonathan
    My System

  5. #5
    Addicted to Best! JonFo's Avatar
    Join Date
    Jun 2010
    Location
    Big Canoe, GA
    Posts
    318
    The benefits of having the AVR/Pre-Pro do the packet decoding and internally generating the LPCM signals can also apply to packetized audio formats, such as FLAC.

    If your AVR can pull FLAC files from a DLNA server and do internal decoding, this will give you total jitter immunity from cables, sources, etc.

    Most modern AVRs are DLNA renderers with at least stereo 16/44 FLAC support (a few get up to 24/96), so now that many people are using music servers with purchased tracks or rips of their CD collection, they can minimize jitter effects by using the AVR as the renderer.

    So on top of increasing the audio fidelity, this approach lowers complexity and cost regarding external rendering boxes and associated cabling.
    - Jonathan
    My System

  6. #6
    Banned
    Join Date
    Apr 2010
    Location
    Seattle, WA
    Posts
    16,046
    Quote Originally Posted by JonFo View Post
    Amir, good article.
    Thanks.

    However, we need to clarify that all this applies to LPCM data, such as the typical CD data stream over SPDIF.

    For BluRay movie and audio soundtracks encoded with packetized protocols such as Dolby TrueHD or DTS-MA (or heck even Dolby AC3 and legacy DTS), the transfer of these error-corrected high-level protocols to the AVR will result in complete immunity to cable or interface clocking variations.
    Actually, this makes no difference at all! I should have added it to the list of items on the pop quiz.

    Everything you describe is true as far as getting the bits to the receiver. Once there, a DSP decodes that stream into PCM samples. Now what? What clock do you use to output them? The answer is that the HDMI video clock will be used just the same as if you had used PCM input! Otherwise, all the sync issues and such that I mentioned in the article apply.

    Bitstreaming packetized protocols from the BluRay player to an AVR results in the AVR doing all the protocol unwrapping and decoding, then generating the LPCM streams for its internal DACs using the same master clock the DACs will use, reducing jitter possibilities to their absolute minimum. Ideally, there is none, as source and DAC are slaved to the same clock.
    As I mentioned, that is not the way it works, or your audio will drift relative to video. The only way to make sure that doesn't happen is to lock to the video clock used on HDMI. And locking means deriving the clock for the decoded bitstream from HDMI.
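A rough numeric sketch of that derivation, based on HDMI's Audio Clock Regeneration scheme, in which the sink reconstructs 128 × fs from the video (TMDS) clock using the N and CTS parameters the source sends. The N/CTS values below are typical published values for 48 kHz audio with a 148.5 MHz TMDS clock, used here purely as an assumption for illustration:

```python
def regenerated_audio_clock(f_tmds_hz, n, cts):
    """HDMI Audio Clock Regeneration: the sink reconstructs 128*fs
    from the video (TMDS) clock as f_tmds * N / CTS."""
    return f_tmds_hz * n / cts / 128.0

# Typical published values for 48 kHz audio with a 148.5 MHz TMDS clock
f_tmds = 148.5e6
N, CTS = 6144, 148500

fs = regenerated_audio_clock(f_tmds, N, CTS)
print(fs)  # 48000.0

# Because fs is derived from the video clock, any error on the video
# clock scales straight into the audio clock:
f_tmds_off = f_tmds * (1 + 50e-6)  # video clock off by 50 ppm
fs_off = regenerated_audio_clock(f_tmds_off, N, CTS)
print(fs_off - fs)  # about a 2.4 Hz shift in the audio clock
```

The point of the sketch: the audio clock has no independent existence here; it is a scaled copy of the video clock, so whatever the video clock does, the audio clock follows.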

    So for all movie and most music content available on BluRay, HDMI is just fine from an audio standpoint.

    For CD players and other LPCM sources with no clock sync, I totally agree: jitter is a challenge.
    Due to the fact that lossless audio streams are not supported over HDMI, you are forced to use that interface regardless for Blu-ray. In that case, I would try to find a low-jitter system by reading reviews if you can. Otherwise, for movies I would not sweat it and would use HDMI regardless.

  7. #7
    Banned
    Join Date
    Apr 2010
    Location
    Seattle, WA
    Posts
    16,046
    Quote Originally Posted by JonFo View Post
    The benefits of having the AVR/Pre-Pro do the packet decoding and internally generating the LPCM signals can also apply to packetized audio formats, such as FLAC.

    If your AVR can pull FLAC files from a DLNA server and do internal decoding, this will give you total jitter immunity from cables, sources, etc.

    Most modern AVRs are DLNA renderers with at least stereo 16/44 FLAC support (a few get up to 24/96), so now that many people are using music servers with purchased tracks or rips of their CD collection, they can minimize jitter effects by using the AVR as the renderer.
    This does work and does put the AVR/DAC in the position of being the master. However, use of networking on your device will likely increase sources of jitter. Networking usually implies a full-blown operating system, and that means a ton of CPU traffic, which can create a noisier environment for the decoder clock. You may actually be better off using an async USB source feeding the DAC/AVR over S/PDIF than this method.

    So on top of increasing the audio fidelity, this approach lowers complexity and cost regarding external rendering boxes and associated cabling.
    I think anything that includes networking increases complexity, not reduces it. But I get your point.

    Thanks for the thoughtful responses, by the way. I am halfway through part II of that article, and while I had covered the BD scenario, I had not included the networked scenarios. I will add that.

  8. #8
    Addicted to Best! JonFo's Avatar
    Join Date
    Jun 2010
    Location
    Big Canoe, GA
    Posts
    318
    Quote Originally Posted by amirm View Post
    Thanks.


    Re: packetized protocols - Actually, this makes no difference at all! I should have added it to the list of items on the pop quiz.

    Everything you describe is true as far as getting the bits to the receiver. Once there, a DSP decodes that stream into PCM samples. Now what? What clock do you use to output them? The answer is that the HDMI video clock will be used just the same as if you had used PCM input! Otherwise, all the sync issues and such that I mentioned in the article apply.
    OK, now I’m confused, as most pre-pros/AVRs whose schematics I’ve looked at (mostly Denons) have separate clocks for audio.

    If what you describe is actually true, then TrueHD and DTS-MA could NEVER be accurate due to varying clocks. Not something I’ve ever heard about or experienced in practice. Steve Wilson’s Grace for Drowning BluRay (5.1 Dolby TrueHD 24/96) is the most accurate recording I have (of 100+ SACD/DVD-As auditioned using clock-synched DenonLink players). So my experience, as well as my review of the Denon AVP-A1HD schematics, tells me that decoded packetized audio has its own clock in this pre-pro.


    Quote Originally Posted by amirm View Post
    As I mentioned, that is not the way it works, or your audio will drift relative to video. The only way to make sure that doesn't happen is to lock to the video clock used on HDMI. And locking means deriving the clock for the decoded bitstream from HDMI.


    Due to the fact that lossless audio streams are not supported over HDMI, you are forced to use that interface regardless for Blu-ray. In that case, I would try to find a low-jitter system by reading reviews if you can. Otherwise, for movies I would not sweat it and would use HDMI regardless.
    The video is reclocked and retimed in the scaler in many pre-pros, since the pre-pro can (and, because of audio DSP processing latency, needs to) apply lip-sync delays to video or audio. So video and audio are synchronized with variable delays depending on the latencies of audio or video processing (upscaling or not, Audyssey or not, etc.).

    I’m sure it’s a typo, but “Due to the fact that lossless audio streams are not supported over HDMI” is incorrect. DTS-MA and TrueHD are indeed lossless packetized protocols that bitstream over HDMI.

    For movies, and modern BluRay music discs such as the Steve Wilson recording mentioned above, bitstreaming over HDMI is the only way to get the best audio results, IMO.


    I could be all wet about the clocking, so pointers to examples of measured issues with internally decoded audio streams would be welcome.
    - Jonathan
    My System

  9. #9
    Banned
    Join Date
    Apr 2010
    Location
    Seattle, WA
    Posts
    16,046
    Quote Originally Posted by JonFo View Post
    OK, now I’m confused, as most pre-pros/AVRs whose schematics I’ve looked at (mostly Denons) have separate clocks for audio.
    They ALL have separate "clocks." The question is the frequency of said clock. We can agree that the clock rate changes to match the sampling rate at least. Yes? Then we agree that the fact that there is a clock does not mean that it is fixed and master of all things. Look at this diagram from the nice article Don posted a while back in EDN: http://www.edn.com/article/520241-Au...nic+News+Today



    You see that there is a crystal connected to the PLL. The PLL's job is then to vary said clock to match the input rate.
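That vary-to-match behavior can be illustrated with a toy control loop (not any real chip's PLL, just the principle that the local clock gets nudged toward the incoming rate):

```python
def track_clock(local_hz, incoming_hz, gain=0.1, steps=100):
    """Toy first-order loop: nudge the local clock toward the incoming rate.

    Real PLLs act on phase error through a loop filter; this only shows the
    'slave the local clock to the input' principle in the simplest form.
    """
    history = []
    for _ in range(steps):
        error = incoming_hz - local_hz
        local_hz += gain * error  # proportional correction toward the input
        history.append(local_hz)
    return history

# Local crystal starts 100 Hz fast relative to the incoming S/PDIF rate
h = track_clock(local_hz=44_100 + 100, incoming_hz=44_100)
print(round(h[0]))   # 44190 (first correction applied)
print(round(h[-1]))  # 44100 (locked to the incoming rate)
```

The takeaway matches the diagram: the crystal only sets the starting point; the loop continuously pulls the output clock toward whatever rate the input delivers, noise and all.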


    If what you describe is actually true, then TrueHD and DTS-MA could NEVER be accurate due to varying clocks.
    Accurate in what respect? That the samples change? No, the samples are always recovered. But the fact that the timing changes does indeed mean that the system can never recreate the original analog signal exactly. This is true of all practical digital systems: as soon as we go in and out of analog, we lose precision. Jitter is only one contribution to that. One hopes that the contributions here are below our hearing ability, but from a pure measurement point of view it is there, and as such there is no such thing as "lossless" audio. It is only lossless while you stay in the digital domain.
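To put a rough number on the jitter contribution: a standard back-of-the-envelope formula for the SNR ceiling that RMS sampling jitter t_j imposes on a full-scale sine at frequency f is SNR = -20·log10(2π·f·t_j). A quick sketch:

```python
import math

def jitter_limited_snr_db(signal_hz, rms_jitter_s):
    """SNR ceiling imposed by RMS sampling jitter on a full-scale sine:
    SNR = -20*log10(2*pi*f*tj)."""
    return -20.0 * math.log10(2.0 * math.pi * signal_hz * rms_jitter_s)

# 1 ns of RMS jitter caps a 20 kHz tone at roughly 78 dB of SNR,
# well short of 16-bit CD resolution (about 96 dB)
print(round(jitter_limited_snr_db(20_000, 1e-9)))     # 78

# Reaching 16-bit performance at 20 kHz needs jitter near 100 ps
print(round(jitter_limited_snr_db(20_000, 100e-12)))  # 98
```

Note the frequency dependence: the same jitter that is harmless on a low-frequency tone eats into the noise floor fastest at the top of the audio band.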

    Not something I’ve ever heard about or experienced in practice. Steve Wilson’s Grace for Drowning BluRay (5.1 Dolby TrueHD 24/96) is the most accurate recording I have (of 100+ SACD/DVD-As auditioned using clock-synched DenonLink players). So my experience, as well as my review of the Denon AVP-A1HD schematics, tells me that decoded packetized audio has its own clock in this pre-pro.
    Now you are talking about a proprietary exception. DenonLink does indeed attempt to solve this problem by having the target be the master. It feeds the DAC clock upstream to their player, forcing it to adjust the video clock such that jitter is reduced. Pioneer has a similar solution.

    HDMI actually has a spec for this that allows the target to be in control. The problem is, few players support it, and even if they did, there is no telling what they would do when asked to slow down or speed up.

    The video is reclocked and retimed in the scaler in many pre-pros, since the pre-pro can (and, because of audio DSP processing latency, needs to) apply lip-sync delays to video or audio. So video and audio are synchronized with variable delays depending on the latencies of audio or video processing (upscaling or not, Audyssey or not, etc.).
    That's true, but I wrote an article to cover how the technology works in general, not the exceptional case of one system. The fact that DenonLink exists and does what it does is ample proof that without it the system has the problem I mentioned, and that merely pushing a bitstream into an AVR does nothing to solve it.

    I’m sure it’s a typo, but “Due to the fact that lossless audio streams are not supported over HDMI”, is incorrect. DTS-MA and TrueHD are indeed lossless packetized protocols that bitsream over HDMI.
    Yes, it was a typo. I will fix.

    For movies, and modern BluRay music discs such as the Steve Wilson recording mentioned above, bitstreaming over HDMI is the only way to get the best audio results, IMO.
    For just about everyone who doesn't have one of these proprietary, matched player-and-AVR combinations, that is not true, as I have explained.

  10. #10
    Moderator Moderator treitz3's Avatar
    Join Date
    Dec 2011
    Location
    The tube lair in beautiful Rock Hill, SC
    Posts
    4,748
    As a new member looking to learn more about the digital realm that touches the very heart of topics such as this, I thank you for your contribution. A very nice, well-written, and educational article. I love picking the brains of those who have mastered the one thing I would like to learn about. You just so happen to be one of those folks. Thank you.
    In search of accurate reproduction of music. Real sound is my reference, and while perfection may not be attainable, if I chase it I might just catch excellence.

    The best way to enjoy digital music reproduction is to never listen to a good analogue reproduction.

    I post my own opinions except when posting as a moderator in green.


