Objectivist or Subjectivist? Give Me a Break

Then we go along trying to find ways to measure it, and if we fail but it is consistently heard, we work with that... Eventually we will find a way to measure it, since it exists...



Of course that suggests ears prevail over measurements.
 
My comments about 80 dB (or 100 dB) are more theoretical than practical. When people argue about stuff like jitter being audible, even though it's typically 120+ dB down, I know they're imagining things.
120 dB down would mean a jitter spec of only 50 picoseconds relative to a 20 kHz audio bandwidth (it would be even lower for higher bandwidths). Consumer equipment routinely has jitter that is orders of magnitude higher. I have some samples from my WSR article: http://www.madronadigital.com/Library/DigitalAudioJitter.html

"Onkyo TX-NR5007 AV Receiver:

S/PDIF: 790 picoseconds
HDMI: 4,870 picoseconds"


The S/PDIF value translates to 92 dB down and the HDMI value to -76 dB. Further, please note that this is the amplitude of a single distortion spike. It does NOT represent the total energy of all the distortion products. When we measure THD for amps, we sum the power of all the distortion products. These jitter values therefore need to be compounded, since each tone in the signal creates a pair of sidebands of that amplitude. You may not hear one spike, but you might sense the music getting harsher when a spray of thousands overlays the usually weak music content at higher frequencies, for example.
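As a sanity check on the picosecond-to-dB conversions in these posts, here is a small sketch of my own (not from the article): treating the quoted figures as peak-to-peak sinusoidal jitter, a single sideband sits at roughly 20·log10(π·f·Δt/2) relative to a full-scale tone at frequency f.

```python
import math

def jitter_sideband_db(jitter_pp, tone_hz=20_000):
    """Approximate level of one jitter sideband relative to the tone,
    treating jitter_pp (seconds, peak-to-peak) as sinusoidal jitter."""
    # Sinusoidal phase modulation puts a sideband (omega * J_peak) / 2
    # below the carrier; with J_peak = jitter_pp / 2 this becomes
    # pi * f * jitter_pp / 2.
    return 20 * math.log10(math.pi * tone_hz * jitter_pp / 2)

print(round(jitter_sideband_db(790e-12)))    # Onkyo S/PDIF: about -92 dB
print(round(jitter_sideband_db(4870e-12)))   # Onkyo HDMI: about -76 dB
```

This reproduces the -92 dB and -76 dB figures quoted for the Onkyo, and the -72 dB figure for the Yamaha's HDMI input.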

Here are the Yamaha stats:

"Yamaha RX-V3900:

SPDIF: 183 picoseconds
HDMI: 7,700 picoseconds"


The first one is -104 dB, but HDMI is at -72 dB. Cassette tapes had this kind of noise floor!

I don't see any reason not to advocate better hygiene here. By making excuses for the equipment vendors and not demanding to see such measurements, we give them a license to make their equipment a hair better than bad enough for everyone to hear the distortion...
 
"Yamaha RX-V3900:

SPDIF: 183 picoseconds
HDMI: 7,700 picoseconds"

The first one is -104 dB, but HDMI is at -72 dB. Cassette tapes had this kind of noise floor!

I don't see any reason not to advocate better hygiene here. By making excuses for the equipment vendors and not demanding to see such measurements, we give them a license to make their equipment a hair better than bad enough for everyone to hear the distortion...

I see two things in that statement, Amir. One is that jitter levels and the resulting noise floor are unacceptable over HDMI. Surely the consumer audio industry can do better.

But the other thing I see, if I understand this correctly, is a very low noise floor as a result of jitter from S/PDIF, out of a midfi AV receiver! Surely the high end can turn its attention to things that are actually audible.

Tim
 
...You may not hear one spike, but you might sense the music getting harsher when a spray of thousands overlays the usually weak music content at higher frequencies, for example.

Amir
 
......and the world keeps on turnin'

Once again we're just "imagining things". Yeah, whatever.
 
Curiously, most people refer to a -80 dB null, but as far as I have read no one has defined what signal source will be used for this experiment, even if it is a "theoretical" experiment.

Unless the experimental conditions are clearly stated, all debate is useless - no one knows exactly what he is addressing and the usual straw-man accusations will soon proliferate.

We could formulate the problem in another way. If we have two similar amplifiers playing exactly the same signal, is it possible to add a signal at the input of one of them, created by intentional manipulation of the input signal, that will not affect the output by more than -80 dB of the peak output yet makes the amplifiers easily distinguishable?
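To make the -80 dB criterion concrete, here is one hypothetical way of stating it numerically (a sketch with a made-up signal and artefact, not anyone's actual test rig): capture both outputs, subtract, and express the residual's peak relative to the signal's peak.

```python
import numpy as np

def null_depth_db(a, b):
    """Depth of the null between two captured signals, in dB relative
    to the peak of the first signal."""
    residual = a - b
    return 20 * np.log10(np.max(np.abs(residual)) / np.max(np.abs(a)))

fs = 48_000
t = np.arange(fs) / fs
signal = np.sin(2 * np.pi * 1000 * t)
# Hypothetical second output: identical except for an artefact 80 dB down.
artefact = 1e-4 * np.sin(2 * np.pi * 3000 * t)
print(round(null_depth_db(signal, signal + artefact)))  # -80
```

In a real test the two captures would of course also have to be time- and level-aligned before subtraction, which is where most of the practical difficulty lies.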
 
That's not really the question. It supposes we passed the "blind test." Then we heard it but still can't measure it. What then?
Greg

Well let's hope, in that case, that it's a good thing, because if you can't identify it, it's going to be mighty hard to get rid of.

Tim
 
So, let's not lose sight of the fact that Ethan has clearly stated that the best he was able to achieve in null testing the output from a loudspeaker is a 50dB null. The speaker output is what we listen to & is the only valid output to test when we want to evaluate a playback system's transparency/equivalence.

This null is 30dB worse than his theoretical 80dB threshold for equivalence of two systems.

Please tell me what this 50dB null proves/shows, if anything?
 
Greg quoted...
You may not hear one spike, but you might sense the music getting harsher when a spray of thousands overlays the usually weak music content at higher frequencies, for example.

So many ifs, so little time. We're talking about enough simultaneous spikes over program material, at such a low level, that we can detect distortion more than 100 dB down? Is it possible? If Amir says so, it probably is. But let's not forget we're talking about DACs - 7 of them - built into a midfi AV receiver, fed via SPDIF. Wouldn't we expect much better performance from a dedicated DAC at the same price as the whole receiver? Or twice the price of the whole receiver? How about 5 or 10 times the price of the whole receiver? Now we're talking about "high end" DAC prices. At what point along the curve does jitter eradication become practically achievable, when we're that close with the DACs thrown into midfi? $500? $1000? $2000?

Tim
 
When we measure THD at fantastically low levels, am I correct in thinking that there is significant averaging over many cycles? Is the key to improving the null to do it in software with many repeated correlations? Maybe that's cheating, but if it's a noise issue, the noise can be characterised separately. Hopefully the genuine amplifier distortion artefacts would be repeatable and still show up after the averaging.
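The averaging idea in the post above can be illustrated numerically (a toy model, not a real measurement; all levels are made up): averaging N synchronized captures lowers uncorrelated noise by 10·log10(N) dB while leaving a repeatable artefact untouched.

```python
import numpy as np

rng = np.random.default_rng(0)
fs, n = 48_000, 4800
t = np.arange(n) / fs
# Repeatable "distortion": a tone 20 dB below the noise floor of one capture.
distortion = 0.01 * np.sin(2 * np.pi * 1000 * t)

def capture():
    # Each trial repeats the distortion but draws fresh, uncorrelated noise.
    return distortion + 0.1 * rng.standard_normal(n)

trials = 10_000
avg = sum(capture() for _ in range(trials)) / trials

tone_rms = np.sqrt(np.mean(distortion ** 2))
residual_rms = np.sqrt(np.mean((avg - distortion) ** 2))
# Averaging cut the noise by sqrt(10000) = 100x while the tone is untouched,
# so the tone now stands roughly 7x (about 17 dB) above the residual noise.
print(tone_rms / residual_rms)
```

The catch is the word "synchronized": the trials must be sample-aligned, which is trivial in software playback but hard with free-running analog gear.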
 
Well, if you're measuring at the amplifier's output terminals, and not exceeding what the amp can drive, the speaker's varying impedance shouldn't have an effect.

Please show reasoning for why not. In my estimation the amp's finite output impedance forms a potential divider with the speaker and cable impedance. What am I missing?

My comments about 80 dB (or 100 dB) are more theoretical than practical. When people argue about stuff like jitter being audible, even though it's typically 120+ dB down, I know they're imagining things.

In practice the jitter may well be higher than -120dB because most DACs these days are of the sigma-delta (S-D) type, which are far more jitter-sensitive than multibit types. Jitter's effect on these is to cause noise modulation, which doesn't show up in the typical FFT plots demonstrating jitter sidebands. -120dB may well apply to individual tones in the J-test, I agree, but our ears are not only sensitive to what shows up clearly in FFTs.
 
When we measure THD at fantastically low levels, am I correct in thinking that there is significant averaging over many cycles?

Yes - the THD might be measured in the old-fashioned way with a notch filter for the fundamental, in which case the residual is averaged in an analog circuit. If it's an FFT, then that forms an average over the acquisition window (typically many thousands of sample periods long). Some analysers average several sample windows too - I know AP has this facility.

Is the key to improving the null to do it in software with many repeated correlations? Maybe that's cheating, but if it's a noise issue, the noise can be characterised separately. Hopefully the genuine amplifier distortion artefacts would be repeatable and still show up after the averaging.

I'm unclear how the noise may be characterised separately in practice - do you have any suggestions?
 
So, let's not lose sight of the fact that Ethan has clearly stated that the best he was able to achieve in null testing the output from a loudspeaker is a 50dB null. The speaker output is what we listen to & is the only valid output to test when we want to evaluate a playback system's transparency/equivalence.

This null is 30dB worse than his theoretical 80dB threshold for equivalence of two systems.

Please tell me what this 50dB null proves/shows, if anything?

If Ethan has measured -50 dB nulls between different (models of) amplifiers I am somewhat amazed. How is it that Bob Carver couldn't even get to -40 dB nulls between two amplifiers that were specifically designed to sound the same??
 
Yes - the THD might be measured in the old-fashioned way with a notch filter for the fundamental, in which case the residual is averaged in an analog circuit. If it's an FFT, then that forms an average over the acquisition window (typically many thousands of sample periods long). Some analysers average several sample windows too - I know AP has this facility.



I'm unclear how the noise may be characterised separately in practice - do you have any suggestions?

Noise is the residual output with no input signal at all. Often this measurement is made with the input terminals short-circuited to assure the voltage across them is zero. It is characterized by amplitude as a function of frequency using a notch filter. Usually it's either entirely random (white noise) or skewed to lower frequencies (pink noise). However, external noise that is induced through capacitive coupling or mutual inductive (transformer) coupling may cluster around one frequency. If it's a radio broadcast it may even be detected and appear at the output as a low-level version of the signal that was broadcast. If you've ever lived on the infield of a high-powered radio transmitter as I did, you may have experienced this.

All distortions introduced in the digital domain will show up as one form or another of "classical" distortion in the analog domain. It's easy to characterize these distortions completely based on the mathematics of classical waveform analysis. Usually we consider three kinds: linear distortion (FR), harmonic distortion (harmonics appearing in the output that were not in the input), and a catchall for everything else: noise and intermodulation distortion. A later invention, transient intermodulation distortion, also called slew-rate distortion, is actually a form of FR distortion that doesn't show up at low output levels. What it really amounts to is a falloff in high-end FR as output level increases. If FR were measured at full output into an actual load instead of at a very low output, TIM would be redundant, superfluous. Some random noise called dither is deliberately introduced into the output of circuits that convert from digital to analog to mask minute quantization artifacts at very low signal levels. In practical terms you probably wouldn't notice the difference if it were absent.
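The harmonic-distortion part of that breakdown can be made concrete: THD is the square root of the summed harmonic power, relative to the fundamental. A sketch with a hypothetical output (made-up harmonic levels, purely illustrative):

```python
import numpy as np

fs = n = 48_000               # 1 Hz per FFT bin, so bin index == frequency
t = np.arange(n) / fs
f0 = 1000
# Hypothetical amplifier output: 1 kHz fundamental plus a -40 dB second
# harmonic and a roughly -46 dB third harmonic.
y = (np.sin(2 * np.pi * f0 * t)
     + 0.010 * np.sin(2 * np.pi * 2 * f0 * t)
     + 0.005 * np.sin(2 * np.pi * 3 * f0 * t))

spec = np.abs(np.fft.rfft(y)) / (n / 2)   # per-tone amplitudes
fundamental = spec[f0]
# THD sums the *power* of the harmonics, then takes the square root.
harmonics = np.sqrt(sum(spec[k * f0] ** 2 for k in range(2, 6)))
thd_percent = 100 * harmonics / fundamental
print(round(thd_percent, 2))  # about 1.12 %
```

Note how the summed figure (1.12%) is higher than the largest single harmonic alone (1%), which is exactly the "compounding" point made earlier about jitter spikes.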

In a book published about 5 years ago, a professor of Electrical Engineering at UCLA asserted that at extremely low noise levels, human perception actually improves when a slight degree of noise is introduced. This is true for analog signals as well as digital, and for video as well as sound. It's entirely counterintuitive and flies in the face of what was previously assumed, but it must be taken seriously given the source. However, this fact probably has no practical application in consumer electronics. The levels involved are much too low.

OK, you've had your course at my 1 minute university. Send in your tuition money. :)
 
Noise is residual output with no input signal at all.

Not in this context - as I understand the discussion here its the 'N' component in the 'THD+N' measurement.

OK, you've had your course at my 1 minute university. Send in your tuition money. :)

You might wish to check you've turned up at the correct lecture theatre next time before delivering your homily :D

Incidentally what you spoke of in your second-to-last paragraph is called 'stochastic resonance'. Perhaps the prof you're referring to is Bart Kosko?
 
  1. Quantization noise is related to signal; shot, thermal and other types of noise are present whether signal is present or not.
  2. ADCs and DACs introduce sampling artifacts that classical analysis may miss, the obvious one being aliasing, the less obvious ones perhaps clock modulation and jitter.
  3. I am not sure of the present usage of TIM; when I first learned the term it applied to small and large signals. Slew-rate errors contribute at many places in the chain.
  4. The effect of low-level noise has been known for decades and was part of the reason noise decorrelation (dither) is used in most digital converters today. Our ears (and many RF systems) can integrate to pull correlated signals from below the noise floor.
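Point 4 can be demonstrated numerically (a toy sketch; the 16-bit step size and TPDF dither are my own assumptions): a tone smaller than half an LSB vanishes entirely when quantized without dither, but survives, buried in noise, once TPDF dither is added before quantization.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1 << 16
t = np.arange(n) / 48_000
q = 2.0 / (1 << 16)                          # one LSB at 16 bits
x = 0.4 * q * np.sin(2 * np.pi * 1000 * t)   # tone below half an LSB

plain = np.round(x / q) * q                  # undithered: rounds to silence
tpdf = (rng.random(n) - rng.random(n)) * q   # TPDF dither, +/- 1 LSB
dithered = np.round((x + tpdf) / q) * q

# Without dither the output is literally all zeros -- the tone is erased.
# With dither the tone survives as a statistical tendency under the noise,
# which correlation against the original tone can pull back out.
corr = np.dot(dithered, x) / (np.linalg.norm(dithered) * np.linalg.norm(x))
print(np.all(plain == 0), corr)
```

That residual correlation is the decorrelation benefit mentioned above: the quantization error stops depending on the signal, so the ear (or any integrator) can average it away.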
 
If Ethan has measured -50 dB nulls between different (models of) amplifiers I am somewhat amazed. How is it that Bob Carver couldn't even get to -40 dB nulls between two amplifiers that were specifically designed to sound the same??

Don't forget Bob was not comparing two identical amplifiers; he was comparing one to his version after he did his best to emulate a tube amp in a SS design. I suspect he could have gotten a deeper null with more time but it was not worth it. I would hope in-band nulls would be pretty good (50 - 60 dB) for a couple of amplifiers loafing along.
 
If Ethan has measured -50 dB nulls between different (models of) amplifiers I am somewhat amazed. How is it that Bob Carver couldn't even get to -40 dB nulls between two amplifiers that were specifically designed to sound the same??
As far as I can ascertain, he measured a -50dB null between two music tracks which were fed through the exact same playback system. The two tracks are the same original track, recorded at different levels & then normalised to the same level for comparison & null testing, I believe.
So one would expect either a better correlation (a deeper null), that the test is inappropriate, or that it actually proves his contention wrong. His contention being that he is busting another audio myth, namely that recording at different levels has an effect on the final sound.
 
Don't forget Bob was not comparing two identical amplifiers; he was comparing one to his version after he did his best to emulate a tube amp in a SS design. I suspect he could have gotten a deeper null with more time but it was not worth it. I would hope in-band nulls would be pretty good (50 - 60 dB) for a couple of amplifiers loafing along.
No, Don, I don't believe you are correct - Bob states that the best he can get in matching his production amplifiers (for a certain series model) is about a -36dB null, i.e. two identical amplifiers.
 
So if my amp has a SNR of 90 dB, that refers to maximum output, probably. Reasonable to assume that even if my amp were perfect, the best null I could obtain would be of the order of -90 dB?

If I'm attempting to get a null at much lower output levels then my noise is going to be that much higher in proportion. The only way to eliminate this would be repeated, overlaid trials, processed in software I would suggest. The effect of random noise would be reduced, but repeatable amplifier anomalies would show up.
 
