Hi alfa100
This question has irked me for a while too. My current DAC has the option of 5 or 10V RMS output, quoted at 30V peak-to-peak on the 10V setting.
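As a quick sanity check on those figures: for a pure sine wave, peak-to-peak is 2√2 times RMS, so 10V RMS works out to about 28.3V p-p; the quoted 30V presumably leaves a little headroom. A throwaway Python sketch, using the values from my DAC's spec:

```python
import math

# Sanity check on the DAC spec, assuming a pure sine wave:
# peak-to-peak voltage = 2 * sqrt(2) * RMS voltage.
v_rms = 10.0                        # V RMS, the DAC's high output setting
v_pp = 2 * math.sqrt(2) * v_rms     # ~28.28 V peak-to-peak
print(f"{v_rms} V RMS sine = {v_pp:.1f} V peak-to-peak")
```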
I run directly into my power amp at the 5V setting; that amp has an input sensitivity of 1.5V RMS for full output of 200W into 8 ohms.
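For anyone who wants to check the arithmetic, here's a rough Python sketch with the figures from my setup (assuming sine-wave drive): 200W into 8 ohms is 40V RMS at the output, so 1.5V sensitivity implies a voltage gain of about 26.7x, or roughly 28.5dB.

```python
import math

# Back-of-envelope gain check for my existing amp:
# rated 200 W into 8 ohms, input sensitivity 1.5 V RMS for full output.
p_rated = 200.0   # W, rated output power
r_load = 8.0      # ohms, load impedance
v_in = 1.5        # V RMS at the input for full output

v_out = math.sqrt(p_rated * r_load)   # sqrt(200 * 8) = 40 V RMS at full output
gain = v_out / v_in                   # voltage gain, ~26.7x
gain_db = 20 * math.log10(gain)       # ~28.5 dB

print(f"Full output: {v_out:.1f} V RMS, gain: {gain:.1f}x ({gain_db:.1f} dB)")
```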
I've purchased some MusicLab M300 monoblock amplifiers (stuck in customs) that have switchable input sensitivity. The designer's recommendation is to set the input sensitivity to 1.5V for the best sound, and to keep the RMS input voltage on music peaks below that value. That way the signal isn't clipped by the input stage, which makes sense to me: I'm assuming any distortion from clipping in the input stage gets amplified by the output stage.
The thing is, I currently run the 5V setting from the DAC into the existing power amp and it sounds good. One issue is that some music can become strident when pushed; I'm assuming this could be the input stage being driven into clipping.
I have some 10dB attenuators on order for use with the new amps, which will let the DAC run at its full 5V output while presenting roughly 1.58V at the amplifier input, close to the 1.5V sensitivity setting.
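For reference, a 10dB pad divides voltage by 10^(10/20) ≈ 3.16, which is where the ~1.58V comes from. A quick Python sketch with my numbers:

```python
# Effect of a 10 dB pad on the DAC's 5 V RMS output.
v_dac = 5.0     # V RMS from the DAC
pad_db = 10.0   # attenuator value in dB

ratio = 10 ** (pad_db / 20)   # voltage ratio for 10 dB: ~3.162
v_amp = v_dac / ratio         # ~1.58 V RMS at the amp input

print(f"{v_dac} V through a {pad_db} dB pad -> {v_amp:.2f} V RMS")
```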
These attenuators are XLR, for the balanced inputs on the new amps; my existing amp is RCA only, so I can't compare the two directly.
I'll let you know how it goes.
Cheers