You can't argue that these cable lengths are so short that the losses are irrelevant to return loss, and then argue that 3dB of attenuation will add jitter. At such short lengths, 3dB of attenuation should not materially affect the eye pattern unless a lot of noise is being introduced into the system. Resistive attenuation by itself will not increase jitter - that only happens if the SNR at the receiver is impacted (there's a quick back-of-the-envelope sketch below).
And if jitter is dominated by reflections in the system, then adding 3dB of attenuation could actually improve jitter performance, since the reflected energy passes through the attenuation twice and is knocked down by an extra 6dB relative to the direct signal.
Lastly, if any such implementation doesn't have at least 3dB of margin, then it is grossly under-designed, imo.
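Here's that sketch - just a rough illustration of the SNR point, with every number an assumed illustrative value rather than a spec limit. Additive noise converts to timing jitter at a threshold crossing roughly as sigma_t ~ sigma_v / slew rate, so if a resistive pad knocks the signal and the noise riding on it down by the same 3dB, the jitter contribution doesn't move:

```python
# Back-of-the-envelope only - every number below is an assumption, not a spec value.
# Additive-noise jitter at a threshold crossing: sigma_t ~ sigma_v / slew_rate.

def jitter_rms_ps(noise_rms_v, slew_v_per_ns):
    """Approximate RMS timing jitter in ps from RMS noise and edge slew rate."""
    return 1e3 * noise_rms_v / slew_v_per_ns      # ns -> ps

swing_v     = 0.5                        # nominal P-P swing quoted in the thread
rise_ns     = 5.0                        # assumed 20-80% rise time
noise_rms_v = 0.005                      # assumed noise riding on the signal
slew        = 0.6 * swing_v / rise_ns    # approximate mid-edge slew rate, V/ns

pad = 10 ** (-3 / 20)                    # 3dB voltage ratio, about 0.71

# The pad attenuates the signal AND the noise arriving with it, so SNR is unchanged:
print(jitter_rms_ps(noise_rms_v,       slew))        # before the pad
print(jitter_rms_ps(noise_rms_v * pad, slew * pad))  # identical - no added jitter
```

The picture only changes if the dominant noise is injected after the pad, at the receiver itself - which is exactly the SNR caveat above.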
Cheers, Joe
If the signal is attenuated by 3dB, then it is out of spec. This is not RF we are talking about; it is a digital signal with a specified 0.5V P-P amplitude and a timing spec. There is a maximum risetime specified, but no minimum risetime, BTW.
3dB of attenuation will likely cause more jitter, because the reduced amplitude lowers the slew rate of the edges. Also, the detection of the edge may not be exactly at the zero-volt crossing (I'm certain it isn't, actually), so reducing the amplitude places the detection point on a part of the edge where the slew rate is even lower (rough numbers below).
When you are talking about picoseconds of jitter, these effects add up quickly. Even passing the signal through one more buffer gate can add a few psec of jitter.
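To put rough numbers on it (purely illustrative, assumed values - not measurements or spec limits): model the edge as a tanh-shaped ramp, put the detection threshold slightly off zero, and assume a fixed receiver-referred noise floor that does not shrink with the signal. The crossing then sits on a slower part of a slower edge, and the same noise converts into noticeably more jitter:

```python
# Purely illustrative numbers (assumed, not measured or taken from the spec):
# fixed receiver-referred noise and a detection threshold slightly off zero.

tau_ns    = 2.0     # assumed edge time constant for a tanh-shaped edge
v_thresh  = 0.03    # assumed fixed comparator threshold offset from zero, volts
noise_rms = 0.005   # assumed receiver-referred noise, volts RMS (does not shrink)

def jitter_ps(swing_pp_v):
    a = swing_pp_v / 2.0                              # edge swings from -a to +a
    # Local slew rate of v(t) = a*tanh(t/tau) where it crosses the threshold:
    slew = (a / tau_ns) * (1 - (v_thresh / a) ** 2)   # V/ns
    return 1e3 * noise_rms / slew                     # sigma_t ~ sigma_v / slew, ps

print(jitter_ps(0.5))                     # nominal 0.5V P-P
print(jitter_ps(0.5 * 10 ** (-3 / 20)))   # after 3dB: lower slew, worse crossing point
```

With these assumed values the 3dB pad costs roughly 40% more noise-induced jitter; the exact figure depends entirely on the assumed noise level and edge shape.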
Steve N.
Empirical Audio