Hello. Yes, this is another thread about home theater receiver/amp output, but hopefully with a twist. Can someone who knows how these things work please answer the questions below? I'm aware that there are some slightly incorrect assumptions in my reasoning, but bear with me.

1. If the total power consumption of a receiver/amp is lower than its total rated output power, the rating must be false, right? Example: receiver X has a total power consumption of 290W, and the manufacturer claims 6x75W output. That means the claimed total output of 450W (6x75=450) can NEVER be reached with all channels driven, since the unit only consumes 290W. True?

2. The output figures given are (almost) always into an 8 ohm load. If I used 4 ohm speakers with receiver X, which puts a heavier load on the amplifier, would the 6x75W rating be closer to 6x150W (not exactly, but for the sake of argument), i.e. a total output of 900W with all channels driven? If so, how on earth could that be possible with a total power consumption of 290W?

3. If I were to buy receiver X and use its pre-outs to let my old stereo receiver drive the front channels, would the total power be "redistributed" to the remaining four channels on receiver X? That is, would the previous 290W/6 = 48W per channel become 290W/4 = 73W per channel?

4. I live in Europe, and this debate doesn't really exist over here. Is that in any way related to the fact that the US has a 110V standard while we (at least in Sweden) have 230V? Does that affect a receiver's ability to perform?

To make my reasoning concrete, I've sketched the arithmetic in a small script at the end of this post. If someone could answer these questions I would really appreciate it.
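Here's a quick Python sketch of the numbers I'm talking about. To be clear about what's assumed: the 60% efficiency figure is just my own rough guess for a typical class-AB amplifier (not a spec from any manufacturer), and "receiver X" with its 290W / 6x75W numbers is the hypothetical example from above, not a real model.

```python
# Quick sanity check of the arithmetic in my questions above.
# NOTE: the 0.6 efficiency is my own rough assumption for a class-AB
# amplifier; "receiver X" and all its numbers are hypothetical.

consumption_w = 290          # rated total power consumption of receiver X
channels = 6
claimed_per_channel_w = 75   # claimed output per channel into 8 ohms

# Question 1: claimed total output vs. what the mains draw allows
claimed_total_w = channels * claimed_per_channel_w      # 6 x 75 = 450 W
assumed_efficiency = 0.6                                # assumption, not a spec
max_continuous_w = consumption_w * assumed_efficiency   # ~174 W at best
print(f"claimed {claimed_total_w} W vs. ~{max_continuous_w:.0f} W possible")

# Question 2: an ideal amplifier (fixed voltage swing) would double its
# power into half the impedance, since P = V**2 / R
ideal_4ohm_total_w = 2 * claimed_total_w                # 900 W on paper
print(f"ideal 4-ohm total: {ideal_4ohm_total_w} W -- even further from 290 W")

# Question 3: naively splitting the consumption budget over fewer channels
# (this ignores efficiency losses, exactly as in my question)
for n in (6, 4):
    print(f"{n} channels driven: ~{consumption_w / n:.1f} W per channel")
```

The point of the sketch is just that, even with a generous efficiency assumption, a continuous 450W (let alone 900W) with all channels driven doesn't seem to fit through a 290W mains draw, unless I'm missing something.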