@bonzo75
From a British fellow:
UK mains sockets are polarised and have separate live, neutral and protective earth conductors. The Neutral line is defined to be at the same potential as the protective earth (it's actually fed from the star point of the local substation where it is also connected to an earth stake). The Live is at nominally 230V AC relative to Neutral (and Earth).
If an appliance develops a fault it requires a fuse in the live conductor to cut the supply. This may occur due to the device drawing too much current, or excessive current returning via the protective earth due to an insulation fault. For this reason, appliances designed to operate in the UK usually have a fuse fitted in the live supply conductor (and only the live conductor) so that, if a fault causes it to fail, the whole device remains at a safe (Neutral) potential.
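The fuse sizing itself is just Ohm's law on the rated load. As a rough illustration (my sketch, not from the post; the standard ratings are the common BS 1362 plug-fuse values):

```python
# Rough illustration: pick a BS 1362 plug fuse for a UK appliance.
# The fuse goes in the live conductor only, as described above.
STANDARD_FUSES_A = [3, 5, 13]   # common BS 1362 plug-fuse ratings
MAINS_V = 230                   # nominal UK live-to-neutral voltage

def plug_fuse(power_w: float) -> int:
    """Smallest standard fuse at or above the appliance's rated current."""
    current = power_w / MAINS_V
    for rating in STANDARD_FUSES_A:
        if rating >= current:
            return rating
    raise ValueError("load too large for a plug fuse; needs a dedicated circuit")

print(plug_fuse(500))    # e.g. a lamp: ~2.2 A -> 3 A fuse
print(plug_fuse(2000))   # e.g. a heater: ~8.7 A -> 13 A fuse
```

In practice manufacturers fit the fuse for you; the point is only that it is sized to the load and sits in the live side.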
If you're going to feed a device from a balanced supply, you need a fuse in the neutral conductor as well, since a fault to earth could draw current through both live and neutral conductors. So, the first problem is that the device could draw as much fault current through the neutral line as it likes, possibly enough to start a fire, without causing the fuse to disconnect it.
Let's say you fuse both the live and neutral lines inside the device. You then have the problem that, if the device simply takes too much current without an earth fault, the fuse in either the live or neutral line could fail, but not both. This leaves the device at a dangerous potential internally despite the fact that the current flow has stopped, whereas, with the usual unbalanced supply in the UK, the device sits at neutral potential internally.
Some of this could be mitigated by the use of an RCD on the "balanced" side of the supply, admittedly, because considerable ground current would flow in these scenarios, but RCDs do fail. The primary protection needs to be a fuse, IMHO, and they simply aren't fitted to most appliances in the UK.
Balanced supplies do work in other countries and installations without leaving a trail of destruction behind them, admittedly, and often with equipment that's pretty much identical to that marketed in the UK. To use one domestically in the UK is to deviate from convention and, in my mind, for no good reason. If such a setup caused a fire and, heaven forbid, a death in a home in the UK, I wouldn't fancy the job of arguing to an expert witness that the increase in "dynamics" or reduction in the "noise floor" was worth deviating from the electrical standards mandated by law in the UK.
Regarding the second question: all components should be connected with the same polarity to avoid/minimise leakage current, which can also introduce noise. If a component works better with its polarity inverted, I personally consider it poorly designed and would not use it. Looking at the UK example, there you have only one way to connect, since P + N + G are in the same positions on all wall outlets. Of course this can be done with any plug; the Neutrik one is the one I like, due to its locking system, and I have never found anything with a better connection.
NOTICE: It is very important to have all components connected to the same reference ground.
IMO, before buying anything related to power conditioning, a person should first take care to have a really properly wired room (again, the ground1 page is very useful), then measure what is going on with the mains to know what needs to be addressed, and only then buy what is needed.
Solving problems you do not have can make things even worse.
Another good read:
http://www.audiosystemsgroup.com/SurgeXPowerGround.pdf
Another point people do not know about concerns the mains requirements of amplifiers, which should be carefully considered for peaks. I do not remember how to do the calculation now, but as a reference, a 1200W Class AB amplifier needs about 1800W from the mains to work at full power. Most amplifiers do not even have a power supply that can handle it. You can find out by running your amplifier at full power for about an hour: if it has a properly made power supply it will keep running, otherwise it will die.
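One common way to estimate this (my assumption, since the calculation isn't given in the post) is to divide the rated output power by the amplifier's efficiency; a practical Class AB figure of roughly 65% reproduces the 1200W → ~1800W ratio quoted above:

```python
# Hedged sketch: estimate mains draw from rated output power and efficiency.
# The ~65% Class AB efficiency is an assumed figure, not from the post;
# it roughly matches the 1200 W -> ~1800 W example quoted above.
def mains_draw_w(output_w: float, efficiency: float = 0.65) -> float:
    """Approximate mains power needed to sustain full rated output."""
    return output_w / efficiency

print(round(mains_draw_w(1200)))  # ~1846 W, close to the 1800 W figure
```

Real amplifiers vary; Class D runs far more efficiently, and Class A far less, so the divisor would change accordingly.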
The "power requirements" figure in the component literature reflects what the manufacturer thinks will be needed in average use. In the past (first half of the 20th century) the papers gave it at full power, but that changed when amplifiers became too powerful:
imagine a live concert with 500,000W of total amplification; it would need a substation just to start, and would never use it.
That in itself is fine, because the average figure is far more useful for everyone, but the problem appeared when the industry began to size the power supplies on those specifications as well, rather than on the full-power requirements, even in high-end gear.
That is what I call a scam. Basically, most 1200W Class AB amplifiers (for example) have the power supply of a 900W one.
An amplifier (or anything else) with an undersized power supply will never sound as good as it could with a proper one, especially on peaks.
PS: balanced power is real and it works, but it can only remove up to about 10dB of noise. Most of the benefits people see when they start using BP are usually more related to the properly wired room and the isolated ground it forces you to have; otherwise it will not work as it should and can make things worse,
not to mention unsafe.
I personally would consider BP as the next thing to do, but I do not like how BP sounds; I feel the bass becomes fake-sounding.