Speaker Sensitivity vs Wattage vs SPL?

Discussion in 'Archived Threads 2001-2004' started by Martin Fontaine, Jul 8, 2002.

  1. Martin Fontaine (Supporting Actor)
    I know that a speaker's sensitivity rating is the SPL measured at 1 meter with 1 watt of power, and that whenever the volume level (dB on the receiver's volume dial) changes by 10 dB, the wattage is multiplied or divided by 10.
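
    In other words, the way I understand the formula (my own sketch, nothing from the manuals):

        import math

        def spl_at_1m(sensitivity_db, watts):
            # Sensitivity is the SPL at 1 W and 1 m; every 10x in power
            # adds 10 dB, every 2x adds about 3 dB.
            return sensitivity_db + 10 * math.log10(watts)

        print(spl_at_1m(90, 2))   # ~93 dB: double the power, +3 dB
        print(spl_at_1m(90, 10))  # 100 dB: 10x the power, +10 dB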

    My Paradigm Titans are rated at 90 dB @ 1 W/1 m, and my Kenwood VR-509 can do 100 watts per channel. (OK, maybe it can't get that high with all channels driven, but that's beside the point here.)

    So in theory, if I set the volume level on the receiver to -20 dB, I should be outputting 1 watt (assuming the signal is at peak).
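
    (Checking that: if 0 dB on the dial really means 100 W, then -20 dB is 100 W x 10^(-20/10) = 100 x 0.01 = 1 W.)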

    Calibrating with the S&V disc (which is basically Avia Jr., so 85 dB = reference should be valid, right?), reference level is -18 for AC3 and -22 for DTS. (I usually listen to my movies at 10 under reference, so -28 or -32.)

    If at reference (-18) I'm just over 1 watt (probably a little under 2 watts), then my speakers should put out about 92 dB, since doubling the wattage adds 3 dB of SPL. So why would I get 105 dB at that volume level?
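
    (By the same math: -18 dB would be 100 W x 10^(-18/10) = ~1.6 W, good for 90 + 10 x log10(1.6) = ~92 dB at 1 m, while 105 dB would take 10^((105-90)/10) = ~32 W.)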

    The formula just doesn't seem right here. I'm sitting maybe a little over 1 m from the speakers, but still, this doesn't make sense.

    And when I play MP3s from the PC (S/PDIF output), anything above -40 is too loud (probably near reference), so there again the formula makes no sense.

    Anyone know why?
  2. Mark Tranchant (Stunt Coordinator)
    Read my post in this thread.
    You're assuming that a 0 dB volume setting on the receiver equates to 100 W of output. That's not necessarily the case: you need to factor in both the input signal level and the volume setting.
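
    Here's a rough sketch of what I mean. The numbers are illustrative, and the -20 dBFS calibration-tone level is an assumption on my part, not a Kenwood spec:

        import math

        def output_watts(full_scale_watts, input_dbfs, volume_db):
            # The output level follows the input signal level *plus* the
            # volume setting, relative to a full-scale (0 dBFS) signal.
            total_db = input_dbfs + volume_db
            return full_scale_watts * 10 ** (total_db / 10)

        # A calibration tone recorded 20 dB below full scale, dial at -18:
        print(output_watts(100, -20, -18))  # ~0.016 W, nowhere near 1.6 W

        # A full-scale peak at the same dial position:
        print(output_watts(100, 0, -18))    # ~1.6 W

    If the calibration noise sits 20 dB below full scale, then full-scale peaks land 20 dB above your 85 dB calibration point, which would explain the 105 dB figure.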
