Those low-bandwidth cables must appeal to somebody if they sold at all, so some people must like the roll-off. Or perhaps they reduce EMI/RFI or other issues in the chain.
Point: Those cheap cables I modeled have 500 MHz to 1 GHz of bandwidth in RF systems; the roll-off is due to the source (preamp) and load (amp) impedance, not the cable per se. Of course, in an audio system they are not properly matched at the source or load end...
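To make that concrete, here is a minimal sketch of the point in Python. It models the interconnect as a simple shunt capacitance driven by the preamp's output impedance into the power amp's input impedance, which forms a first-order low-pass. All component values are hypothetical round numbers, not measurements of any cable or gear mentioned in this thread:

```python
import math

# Hypothetical values -- not from any specific cable or gear in the thread.
R_src = 600.0        # preamp output impedance, ohms
R_load = 47e3        # power-amp input impedance, ohms
C_per_m = 100e-12    # cable capacitance, farads per meter
length_m = 2.0       # cable length, meters

C_cable = C_per_m * length_m                     # total shunt capacitance
R_par = (R_src * R_load) / (R_src + R_load)      # source || load resistance
f_c = 1.0 / (2.0 * math.pi * R_par * C_cable)    # -3 dB corner frequency

print(f"Cable capacitance: {C_cable * 1e12:.0f} pF")
print(f"-3 dB corner:      {f_c / 1e6:.1f} MHz")
```

With these numbers the corner lands around 1.3 MHz, orders of magnitude above the audio band; note that it is set by the source/load impedances together with the capacitance, so the "roll-off" belongs to the whole chain, not the cable per se.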
So Transparent is not transparent. Close enough, though. We middle-aged guys have a better chance of dating Scarlett Johansson than of hearing 0.1 dB down at 20 kHz.
Speak for yourself! If it is any consolation, even teenagers are unlikely to hear 0.1 dB off ... As usual with cables, we try so hard to find differences that vanish once knowledge of which cable is playing is removed ... Oh, sorry ... let's not derail the thread ...
Well, for the cables I cited, those effects are inconsequential at audio frequencies, and RF/microwave is a whole 'nother thread (and forum).
Phase shift happens, of course, any time you have a length of cable between two things, but it is usually linear with frequency and therefore does not degrade signal integrity, especially at audio frequencies. Linear phase means constant group delay, so all frequencies are delayed equally. That is, a cable acts like a true time delay. I am not sure what "phase delay" means.
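A quick numerical sketch of the "true time delay" point: model the cable as an ideal delay tau, whose phase is the linear function phi(f) = -2*pi*f*tau, and the group delay comes out flat across the band. The 5 ns value below is hypothetical, on the order of a meter of coax:

```python
import numpy as np

# Ideal delay line: phase is linear in frequency, phi(f) = -2*pi*f*tau.
tau = 5e-9                           # hypothetical 5 ns delay (~1 m of coax)
f = np.linspace(20.0, 20e3, 1000)    # audio band, Hz
phi = -2 * np.pi * f * tau           # linear phase, radians

# Group delay = -d(phi)/d(omega); for linear phase it is constant and equal to tau.
group_delay = -np.gradient(phi, 2 * np.pi * f)

print(f"group delay min/max: {group_delay.min() * 1e9:.3f} / "
      f"{group_delay.max() * 1e9:.3f} ns")
# Both print 5.000 ns: every frequency is delayed equally, so the waveform shape
# is preserved and signal integrity is unaffected.
```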
At RF, cables can become quite the nuisance. They are also problematic when dealing with very low-level (µV and below) signals, which is where a lot of the terms used by audio-cable marketeers arise (hysteresis, self-noise, nonlinear dispersion, etc.). I just do not see those being significant at audio frequencies in commonly used cables. At the other extreme, driving high voltage and high current through cables may cause all the previous issues plus signal loss, risk of breakdown, etc.
YMMV, FWIWFM, my 0.000001 cents (microcent), etc. - Don
Thanks, as always, for your erudite contributions ... I am always appreciative! I have a clearer understanding of group and phase delay from your post and from perusing:
Personally, I would find it immensely satisfying to develop a modicum of competence in the technical arena ... thanks again for your salient contributions.
Thanks for the link! Good old R&S, I have used a lot of their gear. I see how they derived "phase delay"; it is just not something I have routinely used, but then again the VNAs I use most often are Agilent. For all I know, Agilent defines phase delay as well, but I rarely bother to read the manual... I just stick with phase shift and group delay on this side of the pond!
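For anyone else following along, here is a minimal sketch of how the two quantities differ, using the usual definitions (phase delay tau_p = -phi(w)/w, group delay tau_g = -d(phi)/dw). The first-order RC low-pass with a 1 kHz corner is just a hypothetical stand-in for any network with non-linear phase:

```python
import numpy as np

# Phase delay vs. group delay for a first-order RC low-pass (hypothetical
# 1 kHz corner), chosen only as a convenient example of non-linear phase.
f_c = 1e3
f = np.logspace(1, 5, 400)           # 10 Hz .. 100 kHz
w = 2 * np.pi * f
phi = -np.arctan(f / f_c)            # phase of H = 1/(1 + j*f/f_c), radians

tau_phase = -phi / w                 # "phase delay":  -phi(w)/w
tau_group = -np.gradient(phi, w)     # group delay:    -d(phi)/dw

for freq in (100.0, 1e3, 10e3):
    i = np.argmin(np.abs(f - freq))
    print(f"{f[i]:8.0f} Hz  phase delay {tau_phase[i] * 1e6:7.2f} us  "
          f"group delay {tau_group[i] * 1e6:7.2f} us")
# Well below the corner, where phase is nearly linear in w, the two agree;
# near and above the corner they diverge (e.g. ~125 us vs ~80 us at 1 kHz).
```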