Discussion in 'Archived Threads 2001-2004' started by Allan, Mar 18, 2002.
Can anyone explain the difference between AC power and DC power?
Alternating Current has the voltage varying as a sine wave, where the quoted value (110 V, say) is actually the RMS value rather than the peak. Direct Current is a constant voltage, however. Both have their uses, which I'm sure someone can remind me of :b
AC power - Alternating Current. The voltage swings from positive to negative in a sine wave fashion. In the US the frequency of the sine wave is 60 cycles per second (60 Hz). In other words, your home outlets (a 110 V, 60 Hz AC supply) deliver a voltage that swings between roughly +155 V and -155 V peaks, 60 times per second; the quoted 110 V is the RMS (effective) value, not the peak. (Again, sine wave, not square wave.)
DC power - Direct Current. The voltage is constant. So a 12V power supply will supply an unchanging +12 volts.
Is that what you wanted to know?
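The sine-wave description above can be sketched in a few lines of Python. This is just an illustration of the peak-vs-RMS relationship (the 110 V figure from the post is assumed as the RMS value):

```python
import math

V_RMS = 110.0                    # nominal outlet voltage from the post (RMS)
FREQ = 60.0                      # North American mains frequency, Hz
V_PEAK = V_RMS * math.sqrt(2)    # instantaneous peak, about 155.6 V

def v(t):
    """Instantaneous outlet voltage at time t (seconds)."""
    return V_PEAK * math.sin(2 * math.pi * FREQ * t)

# Sample one full cycle and confirm the RMS of the samples matches V_RMS.
n = 10000
samples = [v(i / (n * FREQ)) for i in range(n)]
rms = math.sqrt(sum(s * s for s in samples) / n)
print(round(V_PEAK, 1), round(rms, 1))   # → 155.6 110.0
```

So a "110 V" outlet actually peaks well above 110 V; RMS is the DC-equivalent heating value of the waveform.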
The outlets we plug devices into are AC or DC power? Also, standard batteries (e.g. Duracell) supply which form of power?
Lastly, any advantages/disadvantages to either one?
Anything in your house will be AC power (although some equipment will turn AC into DC within itself). Your car runs off of DC power.
Batteries will be DC power. Typically anytime you need portability with power it will need a DC power source.
DC power at low voltage has the problem of big voltage drops over long runs of wire. To wire your house that way, for instance, you would have to change the 14 or 16 gauge wire in your walls to something closer to 2-4 gauge (a low-voltage circuit is much more sensitive to the length of the run), even for a simple 15 amp outlet. In car audio, where amplifiers draw massive amounts of current, it's not uncommon to see guys running 00 gauge wire even for a 15 ft run. So you can imagine the problems of trying to get power from the power station to your house.
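A quick back-of-the-envelope check of the point above: the same current through the same wire drops the same number of volts, but that drop is a far bigger fraction of a 12 V automotive supply than of a 120 V household one. The 14 AWG resistance figure below is an approximate handbook value, and the 50 ft run length is just an assumed example:

```python
# Approximate resistance of 14 AWG copper: about 2.525 ohms per 1000 ft.
OHMS_PER_1000FT_14AWG = 2.525

def percent_drop(volts, amps, run_ft, ohms_per_1000ft=OHMS_PER_1000FT_14AWG):
    """Percent of the supply voltage lost in the wire (out-and-back run)."""
    resistance = ohms_per_1000ft * (2 * run_ft) / 1000.0
    return 100.0 * amps * resistance / volts

print(round(percent_drop(12, 15, 50), 1))    # 12 V system, 15 A, 50 ft run
print(round(percent_drop(120, 15, 50), 1))   # same wire and current at 120 V
```

The 12 V run loses over 30% of its voltage in the wire, while the 120 V run loses about 3% — which is why low-voltage systems need such thick cable.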
Some history behind it all: I believe it was Edison who was a big supporter of DC power, even in homes many, many years ago. One of his biggest arguments against AC power was its inherent danger and how it reacts with your body. Basically, a strong enough AC shock will cause your heart to stop, and a carefully applied one can shock it back into beating (a very simplified version of how a defibrillator works). DC power is in some ways a bit safer, in that it doesn't interfere with the heart's rhythm the same way, but instead it burns. So Edison actually backed the invention of the electric chair in order to show how dangerous AC power is.
Outlets are AC. Batteries are DC.
I don't know what the advantages/disadvantages are exactly. I think AC can be more easily stepped up to higher voltages and transmitted long distances with less loss in power. Or something like that.
Edit: I see Ajay answered your advantages/disadvantages question at the same time as me.
Thanks guys! Really helpful.
One little side note that I found interesting when I first started understanding electrical concepts: you'd be surprised at how many devices are actually DC internally. Many, many electrical items that are wired to work with household AC convert it to DC internally.
AC stings a bit more than DC
The only real difference between AC and DC, as far as power transmission is concerned, is that transformers don't work on DC. You have to have a changing magnetic field if you want to couple energy between two windings on a step-up or step-down transformer, and DC doesn't cut it.
Consequently, if we used DC for power transmission like Edison wanted, we would have to transmit DC at the voltage that your household appliances would actually need (110-220 V). Because power is the product of voltage and current, and because power lost in a conductor is the square of the current times the conductor's resistance, a few moments' thought will convince you that we're much better off using a higher voltage (7-14 kV or higher) at a lower current for AC power distribution and then stepping it down once it gets to where it's needed.
Say we need to supply power to a substation that supplies 1000 houses, each of which needs 50 amps at 100 volts (5000 watts * 1000 houses = 5 megawatts). The substation is 100 miles from the generating plant. Using AC, we can send the substation 5 million volts at 1 amp via 100 miles of relatively thin, cheap cable and step the voltage down (and the current up) at the local substation with transformers. But if we had to use DC for distribution, the distribution would have to take place at the final 100-volt level. That means we have to send 50,000 amps (5 MW / 100 V) through our long-distance transmission line. With that much current flowing through it, the wire would probably have to be as big around as a city bus to keep from losing all the power as heat over 100 miles!
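Putting rough numbers on that example: line loss is I²R, so for a given wire the loss scales with the square of the current. The 10-ohm end-to-end line resistance below is purely an assumed figure for illustration, not a real transmission-line value:

```python
LINE_R = 10.0          # ohms end to end, assumed for illustration
POWER = 5_000_000.0    # 5 MW to be delivered, from the example above

def line_loss(volts):
    """Watts dissipated in the line when delivering POWER at this voltage."""
    current = POWER / volts          # I = P / V
    return current ** 2 * LINE_R     # loss = I^2 * R

print(line_loss(5_000_000))   # stepped-up AC: 1 A through the line
print(line_loss(100))         # DC at household voltage: 50,000 A
```

At 5 MV the assumed line wastes 10 watts; at 100 V it would "waste" 25 gigawatts — thousands of times more than the power being delivered, which is the post's point about bus-sized wire.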
So even if AC versus DC isn't that big a deal at the household level, it would be a big problem if Edison had won his little AC/DC pissing contest with Nikola Tesla and Westinghouse. Unquestionably the right choice was made, even though it's true that there's some added electrocution risk with AC.
AC travels over long distances better, is easier to generate (all generators/alternators produce AC initially) and the voltage can be stepped up or down easily with a transformer. Also, most AC motors are simpler in design than DC motors.
Fluorescent lights are easier to power with AC than DC.
DC is needed for electronic devices, and all electronic devices that plug into the wall use a device called a rectifier to convert the AC to pulsating DC, and then a filter (a large capacitor) to smooth the rectified waveform into a nearly constant voltage. More sensitive devices also need a voltage regulator, which produces a constant voltage and compensates for load and source power variations (computer power supplies do this, as do most consumer electronic devices like TVs and stereo/HT gear).
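The rectify-then-filter idea can be sketched numerically. This is an idealized toy model, not a simulation of any real supply: full-wave rectification turns the sine into its absolute value, and the capacitor is modeled as charging instantly through the diodes and discharging through the load with an assumed RC time constant:

```python
import math

V_PEAK = 17.0     # peak after a step-down transformer (assumed value)
FREQ = 60.0       # mains frequency, Hz
RC = 0.05         # filter time constant in seconds (assumed)
DT = 1e-5         # simulation time step

v_out = 0.0
trace = []
t = 0.0
while t < 0.5:    # half a second, well past start-up
    v_in = abs(V_PEAK * math.sin(2 * math.pi * FREQ * t))  # full-wave rectified
    if v_in > v_out:
        v_out = v_in                 # capacitor charges to the input peak
    else:
        v_out -= v_out * DT / RC     # then decays through the load
    trace.append(v_out)
    t += DT

# Ripple: peak-to-peak variation over the last 10 ms (more than one
# ripple period, since full-wave ripple repeats every 1/120 s).
ripple = max(trace[-1000:]) - min(trace[-1000:])
print(round(ripple, 2))
```

The output stays near the 17 V peak with only a couple of volts of ripple riding on it; a bigger capacitor (larger RC) would shrink the ripple further, and a regulator would remove what's left.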
The wall warts you see with some devices contain a transformer to drop the 120 (or 240) volts AC down to a lower voltage (say, 12), and may also contain a rectifier/filter/regulator to produce DC.
I bet you didn't know this: most AC powered digital clocks use the AC as a timing source. In North America, the current alternates 60 times per second; the clock chip uses this to keep accurate time. If you were to take a North American digital clock and use a transformer to run it in Europe (240 V/50 Hz), the clock would run slow and lose 10 minutes every hour!
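The mains-clock arithmetic above checks out: a 60 Hz-referenced clock counts 60 cycles as one "second", so on 50 Hz mains it runs at 50/60 of real speed:

```python
CLOCK_HZ = 60.0   # cycles the North American clock counts per second
MAINS_HZ = 50.0   # cycles it actually receives on European mains

# In one real hour the clock sees 60 min * 50 Hz worth of cycles,
# which it interprets as only 50 minutes of elapsed time.
minutes_shown_per_real_hour = 60 * MAINS_HZ / CLOCK_HZ
minutes_lost_per_hour = 60 - minutes_shown_per_real_hour
print(minutes_lost_per_hour)   # → 10.0
```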
The clock thing is almost a non-issue anymore. Most new wall clocks use a cheap quartz movement, so they don't care what the AC cycle rate is. I haven't seen one with the little induction motor in a while.
DC would work fine at the house level, and the wiring would actually be identical to AC wiring. I have been in old warehouses that were actually DC-powered for lighting and cranes through a motor-generator. The DC was really there for the cranes, because it is so much easier to do variable speed on a DC motor than an AC motor, and they discovered that standard incandescent light bulbs last a lot longer when run on DC, so they did the lighting with DC as well.
Until the '80s and really good solid-state controls, all railroad locomotives used DC drive motors for this reason, and elevator drive units on cable-type elevators are still mostly DC, because the speed control is a few cheap wire-wound resistors and a couple of relays instead of a box full of really expensive SCR devices and a computer board.
The main problem with DC comes when you try to transmit a large amount of power more than a few thousand feet: the wires become huge, and resistance losses make it uneconomical.
AC induction motors are also self-regulating due to the induction effect, whereas a large DC motor will destroy itself if left to run at full power with no load.
Another interesting tidbit: the reason NTSC and PAL have different refresh rates is related to the power sources. NTSC is 60 fields per second because the power supply in North America is 60 Hz, and PAL is 50 because the power supply in Europe and other PAL areas is 50 Hz. Back when the standards were made, the refresh rate had to match the power-line frequency or the TV would show hum bars. Not sure if that's still an issue now, but it was back then.
I always wondered why there were different rates for European power and North American power. Is there any advantage to having a lower frequency (50 Hz) and higher voltage (220 V)?
Thanks Holadem - your explanation of RMS values was much better than my attempt. I think I even confused myself there.