Amps vs Volts vs Watts

FrankD

Stunt Coordinator
Joined
Nov 15, 1999
Messages
50
Just trying to get the above straight as I would like to determine the amps used by my home theatre system:

I know that amps x volts = watts.

Now, I do own a Radio Shack meter and I also purchased an amp probe to go with it. I made my own single-wire adapter as I could not find one at Radio Shack. I put a 100 watt bulb in a lamp and measured the current, which came out to 0.7 amps. I thought that a 100 watt bulb on a standard 120 volt circuit should draw 0.83 amps (i.e. 100 watts / 120 volts = 0.83 amps), so why the 20% difference? Is this correct? (I did check my voltage and it read 118.6 volts, which actually makes the difference a little larger.) Could the difference be due to too much plastic around the measured wire? I basically used a three-prong electrical cord and separated out one of the wires (non-ground). Is there a difference if I were to measure the neutral vs. the hot?
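(A quick Python sketch of the arithmetic in question, using only the figures quoted in the post above; nothing here is measured beyond what the post states.)

# Sanity check of the numbers quoted above (figures taken from the post).
rated_watts = 100.0      # nominal bulb rating
line_volts = 118.6       # wall voltage measured in the post
measured_amps = 0.7      # clamp-probe reading

expected_amps = rated_watts / line_volts          # I = P / V for a resistive load
error_pct = (expected_amps - measured_amps) / expected_amps * 100

print(f"Expected current: {expected_amps:.2f} A")   # ~0.84 A
print(f"Measured current: {measured_amps:.2f} A")
print(f"Reading is low by about {error_pct:.0f}%")  # ~17%, roughly the 20% described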
 

Chris Tsutsui

Screenwriter
Joined
Feb 1, 2002
Messages
1,865
My guess is that the meter has a margin of error, as does the light bulb's actual consumption, and it may also matter where you are measuring.
 

Seth_L

Screenwriter
Joined
Apr 5, 2002
Messages
1,553
There are a few potential sources of error. First, the light bulb may not be exactly 100 W. Second, the meter isn't a true-RMS meter, which could also cause problems.

Stereodude
 

FrankD

Stunt Coordinator
Joined
Nov 15, 1999
Messages
50
The meter I am using is a $150 US multimeter, and I have also tested it on 135 watt and 150 watt bulbs with the same difference, about 20%.

Does anyone have any other ideas on why there is this difference, or other ways to test my meter? I would really like to get an accurate reading of my home theater system's amp consumption.
 

Seth_L

Screenwriter
Joined
Apr 5, 2002
Messages
1,553
Not to be mean, Frank, but $150 is chump change for a clamp-on meter. If you were using a clamp-on Fluke power meter (much more money), then I would be more puzzled by your results. I would guess that the meter is inaccurate.

Stereodude
 

FrankD

Stunt Coordinator
Joined
Nov 15, 1999
Messages
50
OK Eric, I understand the losses issue, but I am actually gaining 20% here?
 

FrankD

Stunt Coordinator
Joined
Nov 15, 1999
Messages
50
So Seth, should I then trust the light bulb and my voltage reading, and add 20% to whatever amp reading I get?
 

FrankD

Stunt Coordinator
Joined
Nov 15, 1999
Messages
50
Tim,

I believe that my meter was calibrated, but I cannot be 100% sure.

I add about 20% to the amp reading I get, in the hope of getting a more accurate number, because when I tested the meter against supposedly known wattages (bulbs of 100, 135 and 150 watts) it gave readings that were about 20% lower than they should have been. Alternatively, if the wattages printed on the bulbs are wrong, that would be another explanation.
 

FrankD

Stunt Coordinator
Joined
Nov 15, 1999
Messages
50
The owner's manual of the amp probe reads:

Accuracy: +/- 0.6 A at 50 Hz/60 Hz for 1 A - 2 A

+/- (4% + 1 A) at 50 Hz/60 Hz for 2 A - 300 A

So in the 1 amp to 2 amp region my meter can apparently be off by 0.6 amps, just over half an amp, as stated above. Wow, what an error factor! Could this be right?

Furthermore, for 2 amps to 300 amps the error is 4% plus 1 amp. That is not bad at the high end, but at, say, 6 amps it is still about a 20% error!

I guess $150.00 does not buy a whole lot of current-measurement accuracy!

Any comments?
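(A small Python sketch of what those spec lines imply, using only the accuracy figures quoted from the manual; the 0.83 A, 6 A and 100 A test points are just illustrative values.)

# Worst-case error implied by the probe spec quoted above.
def worst_case_error(amps):
    if amps < 2.0:
        return 0.6                 # +/- 0.6 A flat for the 1 A - 2 A range
    return 0.04 * amps + 1.0       # +/- (4% of reading + 1 A) for 2 A - 300 A

for reading in (0.83, 6.0, 100.0):
    err = worst_case_error(reading)
    print(f"{reading:6.2f} A reading -> +/- {err:.2f} A ({err / reading * 100:.0f}% of reading)")
# 0.83 A -> +/- 0.60 A (72%), 6 A -> +/- 1.24 A (21%), 100 A -> +/- 5 A (5%)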
 

Tim Hood

Agent
Joined
Mar 11, 2002
Messages
34
Frank...

I'm not sure which bulbs you are using, but just surfing the GE site (a popular bulb brand), their A-Series bulbs (common household bulbs) have a feature called Watt-Miser which lets them list a bulb at the higher wattage rating while it actually draws less power.

That was my interpretation of the site, anyway.

Perhaps this could play a role?

And that is quite a bit of error...

Tim
 

FrankD

Stunt Coordinator
Joined
Nov 15, 1999
Messages
50
I guess the error is high because I am using a probe that measures the magnetic field around the wire rather than the current itself. I am going to try breaking the neutral wire and measuring the amps there directly, with my multimeter in series. This is a little more dangerous, but from what I have determined it should be much more accurate (within 2%).
 

Seth_L

Screenwriter
Joined
Apr 5, 2002
Messages
1,553
I guess the error is high because I am using a probe that measures the magnetic field around the wire rather than the current itself. I am going to try breaking the neutral wire and measuring the amps there directly, with my multimeter in series. This is a little more dangerous, but from what I have determined it should be much more accurate (within 2%).

Be careful. All meters have a maximum current rating. Also, some cannot measure AC current.

Seth
 

Jeff Rosz

Second Unit
Joined
Sep 24, 2000
Messages
335
Ack! Don't do that. Measure the resistance of the bulb (just the bulb alone, sitting safely on a table) and use Ohm's law to determine the current through the bulb. That will be very accurate, simple, and safe.
 

Robert Mee

Agent
Joined
Apr 30, 2002
Messages
42
Also (now I'm digging back to my college days):
Watts = volts * amps * power factor.
Volts * amps gives VA (apparent power), which includes both reactive power (VARs) and real power (watts).
Most utilities run their customers at about 0.75 to 0.85 power factor.
Of course, in your example, it makes your error that much worse. :(
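(To illustrate the relationship Robert is describing, here is a short Python sketch; the 0.8 power factor is an assumed value for illustration only, not a measurement from this thread.)

# Real vs. apparent power (the power factor here is assumed, not measured).
volts = 120.0
amps = 1.0
power_factor = 0.8            # assumed value, purely for illustration

apparent_power_va = volts * amps                   # VA
real_power_watts = volts * amps * power_factor     # watts actually consumed
print(f"Apparent power: {apparent_power_va:.0f} VA")
print(f"Real power:     {real_power_watts:.0f} W")
# An incandescent bulb is almost purely resistive (power factor close to 1),
# so power factor alone would not explain a 20% gap in Frank's bulb test.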
 

FrankD

Stunt Coordinator
Joined
Nov 15, 1999
Messages
50
Well, it worked!

I hooked up my multimeter directly in series with the load, and it measured exactly what a 100 watt bulb should draw in amps. The meter is specified to measure amps within a 2% error margin.

Also, the maximum current rating on my multimeter is 20 amps.

I just tested my ATI 1506 amp (6 channel amp, 150 watts per channel into 8 ohms) and it registered 10.6 amps at turn-on and then settled to 0.85 amps at idle. I will now test it with some demanding DVD movies at close to reference levels to see what the amp will consume. But so far, wow, 10.6 amps at turn-on is quite a bit!
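(A rough Python sketch converting those readings to watts, reusing the 118.6 V line voltage measured earlier in the thread and assuming a power factor near 1, which may not hold for the amplifier.)

# Rough power figures from the readings in the post above.
line_volts = 118.6   # wall voltage measured earlier in the thread

for label, amps in (("turn-on surge", 10.6), ("idle", 0.85)):
    watts = line_volts * amps
    print(f"{label:>13}: {amps:5.2f} A  ~ {watts:4.0f} W")
# turn-on surge: ~1257 W (a brief inrush)
# idle:          ~ 101 W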
 

Jeff Rosz

Second Unit
Joined
Sep 24, 2000
Messages
335
Well, we are all glad you are still with us. :) Also, digging back to high school days, if I remember right, 100 mA can be lethal. Frank, you do know the "keep your left hand in your back pocket" rule, right?
 

Will Pomeroy

Stunt Coordinator
Joined
Feb 9, 2002
Messages
144
Actually Jeff,

not that this matters or anything, but the measured resistance of an unlit light bulb is around 20 ohms, which would yield about 6 amps, which is completely wrong of course. When you turn on a light bulb in a socket, the filament gets extremely hot, and as I'm sure you know, when things get hot their resistance increases, so you won't get the correct reading from a cold bulb.

Just so ya know...

(That's also why light bulbs flash so brightly when they burn out: because of the high current surge through the weakened filament, they burn out the second you turn them on, not while they're on.)
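(Will's point in numbers, as a small Python sketch: the 20 ohm cold resistance is his figure; the hot resistance is simply back-calculated from the 100 W / 120 V rating.)

# Cold vs. hot filament resistance (cold value from the post above,
# hot value back-calculated from the 100 W / 120 V rating).
volts = 120.0
cold_ohms = 20.0
hot_ohms = volts ** 2 / 100.0       # R = V^2 / P, about 144 ohms when lit

print(f"Current predicted from cold resistance: {volts / cold_ohms:.1f} A")   # ~6 A
print(f"Actual operating current:               {volts / hot_ohms:.2f} A")    # ~0.83 A
# The filament's resistance rises roughly 7x as it heats, so Ohm's law on a
# cold bulb badly overestimates the running current.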
 

Jeff Rosz

Second Unit
Joined
Sep 24, 2000
Messages
335
That makes sense. I never actually measured a light bulb; heh, never felt the surge, um, I mean urge.
 
