Right.
Yeah, no....
The Wikipedia link that you cite says:
In a loudspeaker, power compression or thermal compression is a loss of efficiency observed as the voice coil heats up under operation, increasing the DC resistance of the voice coil and decreasing the effective available power of the audio amplifier. A loudspeaker that becomes hot from use may not produce as much sound pressure level as when it is cold.[1][2] The problem is much greater for hard-driven professional concert systems than it is for loudspeakers in the home, where it is rarely seen.[3].
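Before getting to [3], it's worth putting rough numbers on the mechanism that paragraph describes. The sketch below is my own illustration, not something from either source: it assumes a copper voice coil with the usual temperature coefficient of about 0.39%/°C and an amplifier that behaves as a voltage source, so that delivered power falls roughly in proportion to the rise in coil resistance (the coil's reactance is ignored).

```python
import math

ALPHA_CU = 0.00393  # temperature coefficient of copper resistance, per deg C (assumed)

def compression_db(r_cold_ohms: float, temp_rise_c: float) -> float:
    """Rough SPL loss in dB when the voice coil warms by temp_rise_c.

    Assumes a voltage-source amplifier and neglects the coil's reactance,
    so delivered power scales as 1/R and the loss is 10*log10(R_hot/R_cold).
    """
    r_hot = r_cold_ohms * (1.0 + ALPHA_CU * temp_rise_c)
    return 10.0 * math.log10(r_hot / r_cold_ohms)

# Example: a nominal 6-ohm coil at various temperature rises.
for rise in (10, 50, 100, 200):
    print(f"{rise:4d} deg C rise -> ~{compression_db(6.0, rise):.2f} dB of compression")
```

On those assumptions, a domestic-scale rise of 10-20 °C costs only a few tenths of a dB, while the 100-200 °C rises seen in hard-driven PA drivers cost roughly 1.5-2.5 dB, which lines up with the Wikipedia paragraph's distinction between concert systems and home use.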
Citation [3] is the Stereophile article that you linked to. On page 2, that article says:
The results, shown in figs. 3 and 4, came as something of a surprise. Despite what rates as a high playback level for me and, I imagine, most Stereophile readers, I had anticipated there being only modest increases in voice-coil resistance. But the increases were even less than I'd expected.
[...]
More realistically, on the wider-dynamic-range material I more usually listen to, and at my habitual replay levels, the rise in voice-coil temperature and the concomitant thermal compression will be lower still. So I strongly suspect that, for most hi-fi users—those who don't habitually wind the volume control to its highest position and indulge in PA listening levels—thermal compression is a paper tiger.
[...]
Right now, the prospect of thermal compression in my listening-room loudspeakers causes me no lost sleep whatsoever.
***
So there you have it. The citations you offered in support of your argument turn out to be fuel for the opposite one: that thermal compression in domestic systems is mostly a non-issue.
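If you want to sanity-check Atkinson's conclusion for yourself, the same relationship can be run backwards: a measured rise in voice-coil resistance implies a coil temperature rise and a compression figure. The resistances below are made-up illustrative values, not the ones published in the article.

```python
import math

ALPHA_CU = 0.00393  # copper tempco, per deg C (assumed)

def from_measured_resistance(r_cold: float, r_hot: float) -> tuple[float, float]:
    """Return (estimated coil temperature rise in deg C, compression in dB)."""
    temp_rise_c = (r_hot / r_cold - 1.0) / ALPHA_CU
    loss_db = 10.0 * math.log10(r_hot / r_cold)
    return temp_rise_c, loss_db

# Hypothetical example: a 6.0-ohm coil measured at 6.2 ohms after loud playback.
rise_c, loss_db = from_measured_resistance(6.0, 6.2)
print(f"~{rise_c:.0f} deg C rise, ~{loss_db:.2f} dB of thermal compression")
```

A rise of a fraction of an ohm on a typical hi-fi coil works out to well under half a decibel of compression, which is the scale of effect Atkinson calls a paper tiger.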