I think I have *my* answer, but I was wondering what y'all thought:

1) Let's say I generically compare a 100W/ch amp vs. a 200W/ch amp. Now let's also say I know I will never, ever use more than 100W (room size, speaker impedances and efficiencies, how loud I like to listen, etc.). Say that, sure, I hit 80W/ch on transients watching a movie or listening to a CD. In the 100W/ch case, I'm using up to 80% of a channel's output range on transients; in the 200W/ch case, I'm down to 40%. I never go above 40%, and typically it's probably more like 10-20%. Isn't it better to use a greater percentage of an amp's operating range? (A kludgy analogy: I work in an industry where the rule of thumb is that you want to operate components within 15-85% of their range for maximum accuracy and precision of whatever you're doing. RF generators, mass flow controllers, etc.)

2) Because of the higher gain a higher-power amp needs, it can actually have a higher noise floor than the smaller amp at similar power outputs. (Not getting close to the limits of the smaller amp's output, of course.)

I already know 2) is true, because I started with an Acurus 100x3, then went to the 200x3, and the noise floor did increase incrementally. I also had the same experience with a Nakamichi PA-5AII and a PA-7AII (150W/ch and 225W/ch): more speaker hiss evident from the bigger amp. You could say that *maybe* the Acurus isn't that evolved a design and that, shoot, hiss *shouldn't* increase with more power, but the Nak amps were based on Threshold designs by Nelson Pass.

Anyways, just curious what the HTF community thought...

Oh yeah, and just to clarify: I'm not really thinking about comparing between two different manufacturers. More within one manufacturer's line, where the basic design is exactly the same and one amp is simply more powerful than the other.
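For what it's worth, the "percentage of range" numbers in 1) can also be framed as headroom in dB, which is how amp makers usually talk about it. Here's a quick sketch using the 80W-peak scenario above (the wattage figures are just the hypothetical ones from my example, not measurements):

```python
import math

def headroom_db(rated_w, peak_w):
    """Headroom between the peak program level and the amp's rated output, in dB.
    Power ratio in dB is 10*log10(P1/P2)."""
    return 10 * math.log10(rated_w / peak_w)

# Hypothetical 80 W/ch transient peaks on 100 W/ch and 200 W/ch amps
for rated in (100, 200):
    peak = 80
    pct = peak / rated * 100
    print(f"{rated}W/ch amp: peaks use {pct:.0f}% of rated power, "
          f"{headroom_db(rated, peak):.1f} dB of headroom")
```

So doubling the rated power only buys about 3 dB of extra headroom (roughly 1.0 dB vs. 4.0 dB here), which is part of why I wonder whether the bigger amp is worth the trade-off in point 2).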