120V 220V - Why?

jerrod6
15 years ago

Why does the USA use a standard of 120V while the rest of the world is using 220V?

Comments (27)

  • bigbird_1
    15 years ago

    Canada uses 120V/240V just like the US.

  • hexus
    15 years ago

    we seem to think it's safer and we like being less efficient than the rest of the world... just look at our cars, government, economy, etc.....

  • jerrod6
    Original Author
    15 years ago

    I kind of thought Canada might use 120V but wasn't sure.
    Is 220V really more efficient than 120V?

  • terribletom
    15 years ago

    Is 220V really more efficient than 120V?

    Yes.

    Given a certain amount of energy needed in watts, the higher the voltage, the lower the percentage voltage drop and the less the energy loss over the same size wire.

    That's why the power company can supply a whole neighborhood with two relatively small wires at 7200 volts (or more). The high voltage is potentially more dangerous, though, so it is transformed to a lower voltage for consumer use.

    It's amperage that determines the wire sizes needed and a 120V system has to transmit almost twice the current (amperage) as a 220v system delivering the same amount of energy (watts). Because ampacity-to-wiresize is basically a square function, the wire has to be roughly four times as large. Otherwise, severe voltage drop will occur over any appreciable distance and more energy will be lost to heat generation due to resistance.
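
    For anyone who wants to see the arithmetic, here is a rough sketch in Python (the 1,500 W load and the 0.16-ohm feeder resistance are made-up numbers, just to show the proportions):

        # Same 1,500 W load fed at 120 V vs. 240 V over the same feeder wire.
        # The load and the wire resistance are made-up numbers for illustration.

        LOAD_W = 1500        # watts the load needs
        WIRE_OHMS = 0.16     # round-trip resistance of the feeder (assumed)

        for volts in (120, 240):
            amps = LOAD_W / volts            # current for the same wattage
            drop = amps * WIRE_OHMS          # volts lost in the wire (Ohm's law)
            loss = amps ** 2 * WIRE_OHMS     # watts turned to heat in the wire (I^2 R)
            print(f"{volts} V: {amps:5.2f} A, {drop:4.2f} V drop "
                  f"({100 * drop / volts:4.2f}%), {loss:5.2f} W lost in the wire")

        # Doubling the voltage halves the current, halves the voltage drop,
        # and cuts the heat loss in the wire to a quarter.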

  • jcthorne
    15 years ago

    Why is Europe using 50Hz AC when it makes lights and televisions flicker?

  • llaatt22
    15 years ago

    One reason for choosing the higher ac voltage or not was the availability of copper. Was it mined domestically or did it have to be imported and paid for with scarce foreign exchange? In North America copper was abundant and easily obtained.

    Another factor was probably the scare campaign against ac power funded by Edison.

    No doubt there were many reasons for the way the different systems developed.

  • terribletom
    15 years ago

    If all you're looking at is flickering lights, there's no doubt that the U.S. and Canada got this right. Still, I think the selection of frequency is a compromise involving many factors.

    I believe that power transmission over long distances works better at lower frequencies--having to do with capacitance and inductance of the lines.

    On the other hand, the size of transformers used to step down transmission voltages is inversely related to frequency and they're more economical at 60 Hz than 50 Hz.

    I've also read that back in the "olden days", it was easier to synchronize the speeds of generators turning at lower speeds. Dunno whether that's still an important factor today or not, but it probably influenced the initial selection of standards.

    It's also said that commutator motors don't generally operate as well on higher-frequency AC because rapid current changes are opposed by the inductance of the motor field. Induction motors, though, work well in the 50- to 60-Hz range, although at different "standard" speeds.

    I'll be interested to see if any of the double-Es here weigh in on this question. I, too, would like to know more about the trade-offs in modern usage and whether one standard or the other is clearly superior, all things considered.

  • fa_f3_20
    15 years ago

    The bulk of transmission loss occurs in the grid, where much higher voltages are used. Transmission losses that occur within house wiring are not significant.

    Power transmission at higher frequencies is more efficient, up to a point, provided that the hot and return conductors are kept close together so that they cancel each other's magnetic effects. And, as tom pointed out above, transformers for higher frequencies can be built much smaller (which is why they use 400 Hz on aircraft).

    There were at one time several different frequency standards used in the USA. One that hung on up to the 1970s was 25 Hz. This was used in industry a lot because, back in the day, a lot of industrial machinery couldn't tolerate being run at a high RPM. A four-pole 25 Hz synchronous motor runs at 750 RPM, which was a good speed for machinery in the early days -- it roughly matched the speed of piston-driven steam engines. There were also 15, 30, 40, and 60 Hz standards in various places. How N. America settled on 60 Hz I don't know, but it was picked out fairly early -- it had to be, because when grids started forming in the Northeast U.S., they of course had to be synchronized.

    The 120/220V thing goes back to when DC distribution was the norm, and a substation actually had motor-generator sets to produce the residential voltages. And yes, a lot of it had to do with Edison. The European systems did go with 220V for more efficient distribution. Edison used a lot of scare tactics, but he did have one good point -- insulation back in the day wasn't very good, and 120V was safer in that regard. (These days, it hardly matters, because pretty much all wire and cable manufactured is required by code to be insulated to 600 V.)

  • brickeyee
    15 years ago

    "Power transmission at higher frequencies is more efficient, up to a point, provided that the hot and return conductors are kept close together so that they cancel each other's magnetic effects."

    Not correct.
    The losses are even lower on DC transmission lines since inductive loss is essentially zero.
    The dielectric (capacitive) losses are also reduced since the voltage is constant.

    Losses occur when the DC is chopped back to AC, but for very long distances the phase shifting effects (wavelength) are also eliminated.

    While the wavelength of a 60 Hz power cycle appears large (~3107 mi) strange things start to happen at about 1/10 of a wavelength (310 miles) and become very odd at certain fractions (like 1/4, 90 degrees, or only 777 miles).
    The power engineers had a crash course as the grid developed and became interconnected over large distances.
    Those odd radio frequency things started to occur.
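
    For reference, the mileages above fall straight out of wavelength = speed of light / frequency; a quick Python check:

        # Where the ~3,100-mile figure comes from: wavelength = speed of light / frequency.
        C_MILES_PER_SECOND = 186_282   # speed of light in miles per second
        FREQ_HZ = 60

        wavelength = C_MILES_PER_SECOND / FREQ_HZ
        print(f"full wavelength at 60 Hz: {wavelength:,.0f} miles")       # ~3,105 miles
        print(f"1/10 wavelength:          {wavelength / 10:,.0f} miles")  # ~310 miles
        print(f"1/4 wavelength (90 deg):  {wavelength / 4:,.0f} miles")   # ~776 miles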

  • brickeyee
    15 years ago

    The 120/240 system allows the use of 240 volts for larger loads while maintaining the safety of a 120 V to ground system.
    The 50/60 Hz difference was caused solely by perceptions of flicker.

  • aidan_m
    15 years ago

    I believe you can trace the roots back to the DC vs. AC, Edison vs. Westinghouse campaigns.

  • jerrod6
    Original Author
    15 years ago

    Thanks for all the replies. So is there any reason to believe that a motor or other components powered by 220V would last longer than one that runs on 120V?

  • dkenny
    15 years ago

    yes. but the motor running on 120 must draw enough current to reduce the voltage at the motor. in this case switching the motor to 240v will reduce the voltage drop at the motor.

  • bigbird_1
    15 years ago

    "yes. but the motor running on 120 must draw enough current to reduce the voltage at the motor. in this case switching the motor to 240v will reduce the voltage drop at the motor."

    Not sure what you're talking about regarding voltage drop. Doubling the voltage from 120 to 240V halves the current and the motor will run cooler and last longer on 240V.

  • jerrod6
    Original Author
    15 years ago

    So you mean I can take a motor running 120V and run it 240V without modification?

  • bigbird_1
    15 years ago

    "So you mean I can take a motor running 120V and run it 240V without modification?"

    Absolutely not! It has to be factory wired for 120V/240V. If you apply 240V to a 120V only motor, it will likely explode. So if you do try it, wear safety glasses, a face shield, a lead apron, welder's goggles, a motorcycle helmet, army boots, and maybe even some space shuttle thermal tile about your private parts. Good luck!

  • pjb999
    15 years ago

    Jerrod, the more technical aspects have been answered here already, but I believe the other factor in the 220/240 vs 120 debate was the pain of 'bleeding edge/early adopters', and much of the debate in North America was about getting control of what the developers (Edison, Westinghouse, Tesla et al.) knew was going to be huge....electricity was to that generation what the internet was to mine, an enormous paradigm shift.

    I believe the rest of the world observed the US example of 120v and learned from it, and chose what was considered (and probably is) a better system. I must say I miss 240v (Australia) when it comes to power tools and insulation colour coding; the tools run better and the coding is less confusing (brown=active, blue=neutral, green/yellow stripe=ground), but I guess it depends on what you're used to.

    As for the flicker, to me (and I'm what I'd consider a trained observer from my time in the film/tv industry) it's relative; you don't reaaaaaly notice the 50hz flicker unless you have something to contrast it to, like a 60hz computer monitor (being mostly a US standard/invention, 60hz is/was the norm worldwide). Some argue PAL tv flickers more at 50hz, but that is more than offset by the superior resolution (100 more lines) and colour system, and the PAL flicker these days is moot because most better standard-def tvs use 100hz, repeating fields twice as often, to get past it. TV standards again were part of the penalty of being on the bleeding edge; the advantage of getting something first tends to fade when others develop improved, cheaper versions of the originals. Of course, without the groundbreakers, where would we be?

  • lee676
    14 years ago

    It's worth noting that when used with modern electronic ballasts, fluorescent lights (including T8 and T5 tubes and CFLs) don't noticeably flicker even at 50Hz input (output flicker is at a way higher frequency, beyond human perception). Some American LED lamps - many of them actually - flicker horribly. But some don't seem to flicker at all. I found the white LED Christmas lights sold by Sylvania and Philips to flicker so badly I couldn't stand to look at them; instead I bought an off-brand advertised as non-flickering and found they have no noticeable flicker.

  • brickeyee
    14 years ago

    "It's worth noting that when used with modern electronic ballasts, fluorescent lights (including T8 and T5 tubes and CFLs) don't noticeably flicker even at 50Hz input (output flicker is at a way higher frequency, beyond human perception)."

    Electronic ballasts rectify the input to DC, so the only change from 50 Hz to 60 Hz is the filter capacitors on the input side (slightly larger for 50 Hz).

    The light is operated at many tens of kilohertz eliminating any difference between the source frequencies.

    While 240 V does save copper by allowing smaller wire sizes based on reduced current, the hazards are greatly increased.

    Modern British plugs contain fuses to try and add some protection back, but 240 V shocks are significantly more dangerous than 120 V shocks.

    The doubling of the voltage doubles the current delivered into the same human body that gets across the lines.

    Under most circumstances the chief danger from a 120 V shock is the secondary injury, the falling-off-the-ladder type of thing.

    At 240 V you have a better chance of lethality from the shock itself.

    Common commercial voltages like 480 V are very lethal.

    Distribution voltages (they start at 7,200 volts) are real killers, and only a few survive to tell the tale.
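
    A crude Ohm's-law sketch of why (the 1,000-ohm body resistance is just an assumed, commonly cited wet-skin ballpark; real values vary enormously):

        # Crude Ohm's-law look at shock current. The 1,000-ohm body resistance is an
        # assumed, commonly cited wet-skin ballpark; real values vary enormously.
        BODY_OHMS = 1000

        for volts in (120, 240, 480, 7200):
            milliamps = volts / BODY_OHMS * 1000
            print(f"{volts:>5} V across {BODY_OHMS} ohms -> roughly {milliamps:,.0f} mA")

        # A few tens of mA across the chest can already be lethal, so every doubling
        # of voltage pushes the same body that much further past that range.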

  • mikemr
    14 years ago

    There's a good summary of electrical service around the world at this web site and several others:

    http://www.kropla.com/electric2.htm

    Interesting reading!

  • jemdandy
    14 years ago

    A 120 V circuit is safer than a 240V one. The main advantage of 240v over 120v is that it requires only half as much current to deliver the same watts into a resistive load, therefore, the wire can be smaller.

    In the US, single phase residential voltage comes from a center tapped transformer. The center of the winding is the neutral while the two outer ends are the two voltages supplied to the distribution box (circuit breaker or fuse box). Both 'hot' lines measure 120v with respect to the neutral, but since these two hot lines are opposed by 180 deg, they measure 240v with respect to each other. With this system, 240v power can be supplied with lines that are at only 120v with respect to the neutral.
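
    A quick numeric sketch of those two legs (Python; the instants sampled are arbitrary):

        # Split-phase 120/240 V: two hot legs from a center-tapped winding,
        # 180 degrees apart, sampled at a few instants in one 60 Hz cycle.
        import math

        V_PEAK = 120 * math.sqrt(2)      # peak of a 120 V RMS sine wave
        FREQ_HZ = 60

        for ms in (0, 2, 4, 6, 8):
            wt = 2 * math.pi * FREQ_HZ * ms / 1000           # angle at this instant
            leg_a = V_PEAK * math.sin(wt)                    # hot leg A to neutral
            leg_b = V_PEAK * math.sin(wt + math.pi)          # hot leg B, opposed 180 deg
            print(f"t={ms} ms: A={leg_a:7.1f} V  B={leg_b:7.1f} V  A-B={leg_a - leg_b:7.1f} V")

        # Each leg is only 120 V to neutral, but leg-to-leg the swing doubles,
        # which is where the 240 V comes from.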

    Now for the big surprise about ac motors. For motors larger than a few horsepower, the current does not vary linearly with the load! As the shaft load increases, the current does increase some; however, the big change is the phase angle between the voltage and current. Power delivered is voltage x current x power-factor. When the current is 90 deg to the voltage, power is zero; when the current is in phase with the voltage (0 deg), power factor is 100%. So it is possible to have considerable motor current in the wires with little power output.
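
    To put numbers on that last point (the 240 V / 10 A figures are arbitrary):

        # Same voltage and current, very different real power, depending on the
        # phase angle between them (power = volts x amps x power factor).
        import math

        VOLTS, AMPS = 240, 10   # arbitrary figures for the example

        for phase_deg in (0, 45, 90):
            pf = math.cos(math.radians(phase_deg))   # power factor
            watts = VOLTS * AMPS * pf                # real power delivered
            print(f"{phase_deg:>2} deg out of phase: power factor {pf:4.2f}, "
                  f"{watts:6.0f} W from the same {VOLTS * AMPS} VA in the wires")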

    Fractional horsepower motors have a goodly amount of resistance compared to their reactive component and because of this have greater resistive losses.

    The debate between 50 and 60 Hz can go on forever and, from a practical sense, has existed since the dawn of electric power. 60 Hz has an advantage over 50 Hz in that the cross section of the iron in a transformer can be a little smaller for delivering the same power. Hysteresis losses in magnetic materials increase with frequency, but at the same time, the peak magnetic flux required to transmit the same power increases with decreasing frequency. Aircraft systems use 400 Hz to reduce weight.
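
    The iron-vs-frequency trade falls out of the standard transformer EMF relation, V_rms = 4.44 x f x N x peak flux; here is a sketch with arbitrary winding numbers:

        # Why lower frequency means more iron: rearranging the standard transformer
        # EMF equation V_rms = 4.44 * f * N * flux_peak for the peak core flux.
        # The 240 V / 200-turn winding is arbitrary, just to compare frequencies.
        V_RMS = 240
        TURNS = 200

        for f_hz in (50, 60, 400):
            flux_peak_mwb = V_RMS / (4.44 * f_hz * TURNS) * 1000   # milliwebers
            print(f"{f_hz:>3} Hz: peak core flux {flux_peak_mwb:5.2f} mWb")

        # 50 Hz needs about 20% more flux (and so more core cross-section) than
        # 60 Hz for the same voltage; 400 Hz needs far less, hence aircraft use it.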

    Europe tends to use 50 Hz while the US and Canada use 60 Hz. Sometimes it seems that this difference is maintained to make it more difficult to apply products from one side of the ocean to the other - e.g., trade protectionism.

  • brickeyee
    14 years ago

    " Sometimes it seems that this difference is maintained to make it more difficult to apply products from one side of the ocean to the other - e.g., trade protectionism."

    Not really.

    Many 60 Hz motors are actually 50/60 Hz.

    Some things like hermetic compressors for refrigeration are only made at a few plants in the whole world.
    Rather than maintain separate designs they just add the 'extra iron' needed to work at 50 Hz.
    The speed also changes, but that is easily accounted for in how the compressor is applied.

    Even the voltage ratings can be accommodated.

    The Europeans 'harmonized' their voltage specs by smearing the tolerances together, and most equipment had enough margin that it did not require very many real changes.

  • ionized_gw
    14 years ago

    I have some comments about motors and frequency.

    The "A.B. Wood", 12-foot diameter, screw pumps that drain much of New Orleans had their own 25 Hz power plants. They were installed ca. 1915. The design was improved and enlarged to 14 ft (+40% capacity) by the mid 20s and still use 25 Hz AC.

    Higher voltage power does not make an easier-starting motor. Three-phase power makes an easier-starting motor. Larger (industrial) motors are usually three-phase. They are also less expensive to manufacture without so much extra equipment to make them start.

  • davidro1
    14 years ago

    Mexico also has 110-120 Volt house current.

    It's just a standard that developed, for "no" reason worth knowing.
    Also, 50 or 60 Hertz. Same process: a standard developed over time and there is "no" reason to dislike or prefer one over the other.

    The differences have been well explained. One is that higher voltage = higher risk, but people aren't dying in Europe, Asia, and Australia because of their household voltage, so workarounds have been found. Another difference is efficiency and thickness of wire. Not that big a deal.

  • lee676
    13 years ago

    Good article on Gizmodo about this recently, and why attempts to standardize on a single voltage, frequency, and outlet have failed....

    I think the UK has it right meself (despite the bulky plugs).

    In the meantime, let's just globally standardize on these wallplates.....

    Here is a link that might be useful: Why Every Country Has a Different &%$@# Plug

  • DavidR
    13 years ago

    And Japan uses 100 volts. Go figure.

    Electronic gadgets increasingly are supplied with wall warts or internal power supplies capable of operating from 100 to 250 volts. All you need is the proper plug adapter for your country.

    BTW, most European standard plugs have shielded pins and recessed receptacles, so it's essentially impossible to contact a live pin when plugging or unplugging them. I think you could argue that this makes them safer than the usual 2-pin US 120v plug.

    If the standard 2-pin US plug were introduced today it would spark (sorry) lawsuits over its hazardous design, which makes contact with live pins much too likely when making or breaking connections.

    I also often see old houses with worn-out receptacles and plugs hanging out of them, the pins partially exposed. Just plain lousy design.

    Three-pin grounded plugs are less likely to fall out of the wall receptacle, but it's still possible to contact live pins when making and breaking connections.

  • brickeyee
    13 years ago

    "Higher voltage power does not make an easier-starting motor. "

    Higher voltage allows for less drop in the lines feeding the motor and faster starts with less heat buildup.

    The conductors to feed a multi-horsepower motor from 120 V would be very large.

    3-phase is used in industrial motors because it makes for a far simpler motor.
    A 3-phase motor has a non-zero starting torque.
    A single phase motor has zero starting torque and requires start windings, capacitors, and switches to activate and deactivate the start winding.

    A 3-phase motor has none of the start circuitry a single-phase motor requires.

    2-phase was experimented with, but it requires four wires while 3-phase can be delivered with just three wires in a delta configuration.
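
    For the curious, a small sketch of why the 3-phase field rotates on its own (idealized windings, unit currents):

        # Why a 3-phase motor starts on its own: three windings 120 degrees apart in
        # space, carrying currents 120 degrees apart in time, add up to a field of
        # constant strength that rotates, so there is torque even at standstill.
        import math

        def net_field(t_frac):
            """Net stator field (x, y) at a fraction t_frac of one AC cycle."""
            fx = fy = 0.0
            for k in range(3):                                    # the three phases
                axis = 2 * math.pi * k / 3                        # winding orientation
                current = math.cos(2 * math.pi * t_frac - axis)   # that phase's current
                fx += current * math.cos(axis)
                fy += current * math.sin(axis)
            return fx, fy

        for t in (0.0, 0.25, 0.5, 0.75):
            fx, fy = net_field(t)
            angle = math.degrees(math.atan2(fy, fx)) % 360
            print(f"t={t:4.2f} cycle: strength {math.hypot(fx, fy):.2f}, pointing {angle:5.1f} deg")

        # The strength stays constant (1.5) while the direction sweeps around;
        # a single winding just pulsates back and forth along one axis instead.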