Fred, I can't agree with your assertion that "inter-channel phase matching" is not important -
Nowhere did I assert that.
Please re-read my post and see that the point under discussion there was that optimum azimuth and inter-channel phase matching are two distinct matters.
Of course, my protests about using only correct terminology often fall on deaf ears. It doesn't help that many mastering engineers have long conflated the terms "phase shift" and "azimuth error". Cedar's digital "azimuth corrector" product is a good example of how this misunderstanding perpetuates itself.
This is part of why so many analog-master-tape-to-digital-file transfers have been quality-compromised.
We try to show mastering engineers who are working with analog tape that correcting for phase shift is indeed a trivial task when accomplished in the digital domain.
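To make that concrete, here is a minimal sketch of the kind of digital-domain fix I mean, assuming the error is a constant broadband time offset between the channels. The function name and the NumPy approach are mine, for illustration only; this is not any particular product's algorithm:

```python
import numpy as np

def align_channels(left, right, sr):
    """Estimate a constant inter-channel time offset via cross-correlation,
    then remove it with a linear-phase rotation of the lagging channel.
    A sketch: assumes matched polarity and a pure broadband delay."""
    n = len(left)
    nfft = 2 * n  # zero-pad so the circular correlation does not wrap
    # c[k] peaks at the number of samples by which `right` lags `left`
    c = np.fft.irfft(np.fft.rfft(right, nfft) * np.conj(np.fft.rfft(left, nfft)), nfft)
    lag = int(np.argmax(c))
    if lag >= n:
        lag -= nfft  # upper-half bins correspond to negative lags
    # Advance `right` by `lag` samples with a frequency-domain phase term
    f = np.fft.rfftfreq(n, d=1.0 / sr)
    right_fixed = np.fft.irfft(np.fft.rfft(right) * np.exp(2j * np.pi * f * lag / sr), n)
    return right_fixed, lag / sr  # corrected channel and the offset in seconds
```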
Yet optimizing the playback azimuth of an analog master tape is always best done in the analog domain.
Complex algorithms in DSP plug-ins that purport to restore high-frequency losses resulting from azimuth misalignment in the analog domain are only guesses at restoring unknowable information. The initial domain transfer is the step that requires the greatest of care.
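To see why that information can be unknowable: the textbook azimuth-loss response is a sinc function of the error angle, and wherever the sinc crosses zero the recording holds nothing to restore. A toy calculation with hypothetical figures (15 ips, 2 mm track width, 0.5 degree error):

```python
import numpy as np

# Classic azimuth loss across a track of width w at error angle alpha:
# gain = sinc(w * tan(alpha) / wavelength). At the sinc nulls the signal
# is destroyed outright, so no post-transfer DSP can truly recover it.
def azimuth_loss_db(freq_hz, speed_ips=15.0, track_mm=2.0, alpha_deg=0.5):
    wavelength_mm = (speed_ips * 25.4) / freq_hz   # recorded wavelength on tape
    x = track_mm * np.tan(np.radians(alpha_deg)) / wavelength_mm
    return 20 * np.log10(np.abs(np.sinc(x)) + 1e-12)  # np.sinc(x) = sin(pi*x)/(pi*x)

for f in (1000, 5000, 10000, 15000, 20000):
    print(f"{f:5d} Hz  {azimuth_loss_db(f):6.1f} dB")
```

With those (hypothetical) numbers the loss runs from a negligible fraction of a dB at 1 kHz to roughly -21 dB at 20 kHz, which is exactly the kind of top-end damage these plug-ins claim to undo.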
As for the importance of inter-channel phase matching to stereo imaging, keep in mind that when a stereo program carries identical amplitude-versus-time content in both channels (in other words, when inter-channel correlation approaches +1), the stereo sound field collapses into a mono point source.
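A quick toy illustration of that collapse, using a plain normalized correlation (my own sketch, not anyone's metering algorithm):

```python
import numpy as np

def interchannel_correlation(left, right):
    """Normalized correlation: +1 reads as dual mono (a center point source),
    near 0 reads as wide/diffuse, -1 as out of polarity."""
    denom = np.linalg.norm(left) * np.linalg.norm(right) + 1e-12
    return float(np.dot(left, right) / denom)

rng = np.random.default_rng(0)
mono = rng.standard_normal(48000)
print(interchannel_correlation(mono, mono))                        # ~ +1.0: collapses to center
print(interchannel_correlation(mono, rng.standard_normal(48000)))  # ~  0.0: stays wide
```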
Most mastering engineers use inter-channel phase-adjusting tools mainly to avoid undesirable comb filtering. (See Bob Katz's Mastering Audio for a discussion of this.) Other tools are employed to establish a solid image location.
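To put a number on the comb filtering itself: a fixed inter-channel offset tau notches the mono sum at odd multiples of 1/(2*tau). With a hypothetical 0.2 ms offset:

```python
import numpy as np

# Mono-summing x(t) + x(t - tau) has normalized gain |cos(pi * f * tau)|,
# with nulls at odd multiples of 1/(2*tau); here the first null is 2.5 kHz.
tau = 0.0002  # 0.2 ms inter-channel offset (hypothetical)
for f in (100, 500, 1000, 2000, 2500, 5000):
    gain_db = 20 * np.log10(abs(np.cos(np.pi * f * tau)) + 1e-12)
    print(f"{f:5d} Hz  mono-sum gain {gain_db:7.1f} dB")
# The -240 dB reading at 2500 Hz is just the guard epsilon; it is a true null.
```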