I am not understanding the following. Both amps are class A/B, both are rated at ~125 W/channel, and both are monoblock-per-channel designs. Amp X has a 225 VA transformer for each channel; Amp Y has a 300 VA transformer for each channel. Yet in comparable real-world power tests (at 1.0% THD), Amp X comes in closer to 150 W/ch while Amp Y is more like 135 W/ch.

On top of that, Amp Y runs a lot hotter than Amp X. Now, I haven't measured heat sink area, and it's obviously possible to design a more efficient amp with less heat sink area that runs hotter, but Amp Y runs so much hotter than Amp X that I can't believe it's just down to less heat sink area. (Its heat sinks get too hot to touch even at idle.)

Would I be correct in assuming that Amp Y is biased deeper into class A operation? And, the next time I leave Amp Y on long enough to reach thermal equilibrium, should I compare heat sink temperature at idle against temperature after stressing it, since, if I remember right, class A amps are actually hottest at idle and run cooler when they're playing something?

Thanks!
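Edit: to put rough numbers on my hunch, here's a quick back-of-envelope sketch in Python. Everything in it is an illustrative assumption, not a measurement of either amp: the rail voltage, the 8-ohm load, and the two quiescent-current guesses (V_RAIL, R_LOAD, i_q) are all hypothetical.

```python
# Back-of-envelope output-stage numbers for a push-pull class A/B amp.
# All values below are illustrative assumptions, not measurements of
# either amp: rails, load, and both quiescent-current guesses.

R_LOAD = 8.0   # ohms, nominal speaker load (assumed)
V_RAIL = 50.0  # volts, plausible +/- rail for ~125 W/ch into 8 ohms (assumed)

def idle_dissipation(i_q, v_rail=V_RAIL):
    """Idle heat in the output stage: the quiescent current flows
    across the full rail-to-rail voltage, i.e. P = 2 * Vrail * Iq."""
    return 2.0 * v_rail * i_q

def class_a_limit(i_q, r_load=R_LOAD):
    """Sine output power up to which a push-pull stage stays in class A.
    One device cuts off when load current exceeds 2 * Iq, so
    P = (2*Iq)^2 * R / 2 = 2 * Iq^2 * R."""
    return 2.0 * i_q ** 2 * r_load

# Hypothetical bias points: a lean class AB design vs. a "rich" one.
for label, i_q in [("lean bias (Amp X?)", 0.10),
                   ("rich bias (Amp Y?)", 0.60)]:
    print(f"{label}: Iq = {i_q:.2f} A -> idle heat ~{idle_dissipation(i_q):.0f} W,"
          f" class A up to ~{class_a_limit(i_q):.1f} W into 8 ohms")
```

With these made-up numbers, the lean bias idles at ~10 W of heat, while the rich bias idles at ~60 W yet only stays in class A for the first ~6 W of output. If something like the "rich" case applies to Amp Y, that alone could explain heat sinks that are too hot to touch at idle.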