What is Amperage?
Amperage is a term often used by electricians to mean electrical current, measured in amperes, or amps. The ampere is the SI unit of electrical current, the amount of electrical charge that flows through a conductor in a given time. One ampere is a charge of one coulomb (about 6.241 × 10¹⁸ electrons) per second flowing past a given point. Electrical devices are rated according to their amperage, the amount of current they typically draw from a mains supply when operating normally. When electricians speak of the electricity flowing in and out of a home, they may be referring to voltage, amperage or wattage, depending on the circumstances, but when considering the effects of electric shock, it is the amperage, rather than the voltage, that is important.
Amps and Volts
Electricity is to home electrical circuits as water is to home plumbing systems. The voltage is roughly equivalent to the water pressure, and the amperage, or current, to the quantity of water that flows past a given point per second. At a given pressure, less water can get through a small pipe than a large one in a given time, so the size of the pipe can be regarded as equivalent to a measure of electrical resistance — a smaller pipe has higher resistance. The higher the electrical resistance of an appliance, the lower its current will be, and resistance is often dependent on the diameter of the wires.
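The relationship described here is Ohm's law: current equals voltage divided by resistance. A minimal sketch in Python, using made-up appliance values purely for illustration:

```python
# Ohm's law: current (amps) = voltage (volts) / resistance (ohms)
def current_amps(volts: float, ohms: float) -> float:
    return volts / ohms

# A hypothetical 110 V heating element with 55 ohms of resistance:
print(current_amps(110, 55))   # 2.0 amps

# Doubling the resistance halves the current at the same voltage
# (the "smaller pipe" in the plumbing analogy):
print(current_amps(110, 110))  # 1.0 amp
```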
Electricity is brought into the home through power lines ultimately connected to a generator. To minimize energy loss through the resistance of the power lines, transformers are used to transmit the power at very high voltages. Before it reaches homes, however, additional transformers reduce the voltage to a value suitable for domestic use: 110 volts in the USA and 230 volts in Europe, for example. Voltage is a measurement of the “potential” energy available, not necessarily how much is actually used.
This is where amperage comes in: an electrical appliance needs a certain amount of electrical energy to perform its job, and draws that amount of current from the supply line. A small device, such as a toaster, usually needs less power than a larger appliance, such as a refrigerator or power saw. In electrical terms, these appliances work at different current ratings. A large electric motor may draw 100 amps of current, while a small heating element may draw only ten amps. Both tap into the same 110-volt line, but their current needs are noticeably different.
Power Consumption
Watts are the units used to measure power consumption. A current of one amp at one volt uses one watt of power. The power used by a device is simply amps multiplied by volts, so an appliance rated at ten amps plugged into a 110-volt supply will use 1,100 watts. Since watts are used by power companies to measure electricity consumed, and to charge customers, amperage is important in calculating the cost of running an electrical device. Typically, consumers are charged by kilowatt-hours of power use; running a ten-amp device on a 110-volt supply for one hour gives a consumption of 1,100 watt-hours, or 1.1 kilowatt-hours.
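The arithmetic above can be sketched in a few lines of Python; the electricity rate used here is a made-up figure for illustration, not a real tariff:

```python
# Power (watts) = current (amps) x voltage (volts)
def power_watts(amps: float, volts: float) -> float:
    return amps * volts

# Energy (kilowatt-hours) = power (watts) x time (hours) / 1000
def energy_kwh(watts: float, hours: float) -> float:
    return watts * hours / 1000

watts = power_watts(10, 110)   # 1100 W for a 10-amp device on 110 V
kwh = energy_kwh(watts, 1)     # 1.1 kWh after one hour
cost = kwh * 0.15              # at a hypothetical $0.15/kWh rate
print(watts, kwh, round(cost, 3))
```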
The general rule of thumb for homeowners is the higher the current rating, the more an appliance will cost to run. There is always a trade-off between power and economy when it comes to electrical devices. If economizing on the monthly utility bills is a priority, then products with a lower amperage should be selected. If power and speed are more important, higher current rating products are generally best.
Protecting Appliances
Amperage must be controlled in order to protect the electrical wires and circuits from overheating or short-circuiting. This is why electricians use fuses and breakers. A 30-amp fuse, for example, will allow smaller appliances to run on the line it protects, but if an electric clothes dryer pulls 60 amps, a metal filament in the fuse will melt and break the circuit immediately. Breaker switches also control current through circuit breaking. Larger electrical devices often have their own circuits with higher capacity fuses or breaker switches to avoid such overloads.
Electric Shock
In the event of a person receiving an electric shock through carelessness or an electrical fault, it is the amount of current that flows through the body, and not the voltage, that determines the severity of the injuries and the likelihood of a fatality. Many high school students will have experienced a shock of perhaps 50,000 volts from a Van de Graaff generator in the physics lab, but this produces an extremely small current and is harmless. On the other hand, a 110-volt shock, with a current of just a small fraction of an amp, could well be fatal. A current of 0.1-0.2 amps flowing through a human body is usually lethal, due to its effects on the heart. Surprisingly, with prompt treatment, victims exposed to more than 0.2 amps may survive, as the severe muscle contractions induced can protect the heart from electrical interference.
Discussion Comments
My company has just received a machine that we were told could be plugged into a 13 amp 240 v ring main. The machine has a rating of 13.9 amps on its serial plate.
Do we need to install a 16 amp circuit and socket, or is our supplier correct?
What happens if my adapter with a rating of 12 volts 2amps is connected to a device with an input rating of 12v and 1.8 amps? Will this damage the device?
I am swapping out a 1/2 hp general dewatering pump with a 1 hp general dewatering pump. They are both 115v, but the smaller one is 6 full load amps while the new one is 14 full load amps. Can the new pump be wired the same as the old pump, or do I need to do something different?
Does it damage my Refrigerator/Air Conditioner/Water Heater (all rated 15amps) to run on 10 amps? The combined wattage use is fine.
What is the difference between 10(60A) and 20(60A) energy meter?
I have a screen printing machine that requires 13.9 Amps and 115Volts
Can I use this machine at home? How would that affect my electric wiring at home? Will it increase my electric bill too much? Please explain it to me as if I were 5 years old. I am just a housewife with NO knowledge of electricity at all.
What happens if I give 5-10amps to an appliance that needs 15 amps? (DC-12V)
Can I replace a 7 amp battery with a 32 amp battery to a UPS of 12 volts?
My computer (aspire 5732Z) takes a 19v-3.42a adapter but I only have a 19v-4.72A adapter. Can I use 19v-4.72A on my Aspire 5732Z?
I'm currently working on an LED DC circuit. I basically have four LEDs (each 11.8v, 5W), each with their own slide dimmer.
How do I make sure that if three dimmers are cutting power to three LEDs, the fourth doesn't have too much current going through it? Any help at all would be greatly appreciated!
I have a tiny computer fan that I want to use to increase air flow in a terrarium. It says DC 12V and 0.07A. Is there any way to plug that into an outlet? I'd like to plug it into an outlet timer, then plug that into the wall. The fan is super tiny. It's a 1 in by 1 in box that is .5 in tall.
This is a big pet peeve of mine. Sorry for being pedantic, but amperage is a crude invented word. An ampere is a unit of current; the word amperage is a spin-off of voltage. In the case of voltage, the volt is the unit of voltage, so when the number of volts increases, the voltage increases. When speaking (or writing) about the number of amperes, the number of amperes increases when the current increases. So what you meant to say is current. The word amperage was introduced by electricians and writers who don't understand the distinction between units and the quantities they measure, hence this "slang" word.
If a magnet increased from one Tesla to two, it didn't double in Teslage (had to fight spell check); it doubled its magnetic flux density.
@Post No. 37: I'd have to say no, don't use the 18-volt supply for the 6-volt baby rocker, but if you do, please post it to youtube and supply a link.
@Post no. 38: Not if you ever want them to run at the same time. Usually they each require dedicated circuits in electrical code.
@Post No. 39: It should be no problem. LED circuits should always limit current by using a resistor in series with the LED, not by the amount of current available in the power supply. Is this how you designed your circuit?
@Post no. 40: Electrically there is no problem with this, and she will save money. Plumbing-wise, I can't see any problem either, but you might want to ask the same question on a plumbing-geek forum.
@Post no. 41: It will probably be fine, but check the voltage if you can. Those little supplies usually aren't regulated and it might drop it too low.
@Post no. 42: It's a 6-amp pump, but on startup it will draw more current momentarily and you have to accommodate for this with conductor and connector sizing.
I have a 220 volt pump that reads 6 amp on each wire. Is this considered a 12 amp circuit or 6?
@Mark: I have a universal AC adapter with a rating of 12v and 1A. I am currently using it for my modem which requires the said rating.
Suppose I integrate a 12v, .12A cpu fan into the circuit. Would there be any problem with that?
My mother has a condo that she visits maybe 15 times a year, mostly on weekends. Is it the safe or the right thing to do, to turn the hot water heater off each time she leaves by throwing the circuit breaker, then turning it back on when she comes back?
I have several LEDs and switch motors that require only 12 to 16 volts to operate, and very low amperage, like 30 milliamps, will run 30 of these devices. They recommend using two dc adapters, 12 volt, in bipolar mode: a +, a -, and a common.
I have two dc adapters 12 volt 2.5 amp that will be like 10,000 amps to use for small devices. Does it matter that the amperage is so high and the devices don't need much amperage? Can I damage the devices or does the amperage mean nothing unless I go over what the rated amperage is on the power supplies?
Can I plug a washing machine and a refrigerator into the same outlet?
I have a 18 volt output ac/dc wall charger. Can I safely use it as a power source for a infant rocking device that only accepts 6 volts? Thanks for the help.
If I used a 9 volt ac adapter for a 15v dc mixer, what did I fry? The mixer or ac adapter?
I have a nebuliser purchased from the US and now we are in India. On the nebuliser, it's written that we have to connect it to a 115 v outlet (2 A max). We have searched in the market for a voltage converter from 220 v to 110 v, but the salesmen ask about the 'watts'. What kind of converter, with an output of how many watts, will I have to use?
I have a question. If a device requires 5V and 2.6 Amp, can I use a power source which delivers 5V 2.5 Amps?
I have an elliptical machine that did not come with a power supply. It calls for a 6 volt 2A power supply. Can I use a 5 volt 2.5 amp power supply if everything works fine?
Post #6 answer:
Yes, 5000mA is the same as 5amps.
Post #7 answer:
AC circuits should be dedicated to only that unit. You shouldn't plug them into a regular circuit as they all draw more current on startup (it's a motor thing) than when they are running and could overload a regular circuit if something else was also plugged into it. So, you can't or shouldn't do what you are planning.
Post #8 answer:
You seem not to have a complete neutral circuit. The voltage is getting to the receptacle, but it has no return path to neutral. Check your white wires.
Post #9 answer:
Use the higher current rated power supply; it will be fine. The circuit will only draw as much current as it requires, as the supply is still 5 volts (never change the voltage supplied; that's where things blow up). Having more current capacity is good, as the power supply will probably run cooler and steadier.
Post #11 answer:
Amperage is a part of Wattage. Volts multiplied by Amperes equals Watts.
Think of volts as pressure and current as flow, you need both to transmit power.
Post #12 answer:
This is normal; the unbalanced part of the load is going through the neutral conductor. It's good practice to wire a house to try to balance the phase load as much as possible, but it will vary depending upon what is running at the time. It shouldn't affect your bill.
Oh yeah, an answer to the question in post 28:
The answer depends on the circuit. I'm assuming the LEDs are in parallel, in which case you would need a resistor in series with each LED to limit the current to a value below the Imax (maximum rated current) of each LED.
Most LEDs should work with 15 to 30 mA, so a standard resistor with a value from 360 ohms to 680 ohms in series with each LED should limit current enough to not burn out the LED.
Use 1/4 watt resistors if using 15 mA; use 1/2 watt if you're going up to 30 mA.
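The resistor values quoted above follow from Ohm's law across the resistor. A small sketch, assuming a 12 V supply and a typical 2 V LED forward drop (both assumptions for illustration, not values from the original question):

```python
# Series resistor for an LED:
# R = (supply voltage - LED forward voltage) / desired current
def series_resistor_ohms(v_supply: float, v_led: float, amps: float) -> float:
    return (v_supply - v_led) / amps

# Power dissipated in the resistor: P = I^2 * R
def resistor_watts(amps: float, ohms: float) -> float:
    return amps ** 2 * ohms

# 15 mA needs about 667 ohms, so a standard 680-ohm part works
print(round(series_resistor_ohms(12, 2, 0.015)))  # 667
# 30 mA needs about 333 ohms, so a standard 360-ohm part works
print(round(series_resistor_ohms(12, 2, 0.030)))  # 333
# 30 mA through 360 ohms dissipates about 0.32 W, hence the 1/2 W rating
print(round(resistor_watts(0.030, 360), 2))       # 0.32
```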
My Question: If I need 12vdc 40ma to power an LED circuit, what is the max current my ac/dc power adapter can put out without damaging the LEDs (or the adapter)?
Answers to other posts:
To post 26: That depends on how much power all the devices in your computer CAN drain. Look up power supply calculators.
To Post 25: The amount of amps that can be drawn depends on the conductor (wire size) feeding the plug to the fuse box, the fuse (or breaker) size (limit), and the source amperage (amount being sent to the fusebox). The amount you can draw is always the lowest number out of all of those. 220v ranges from 30a all the way up into the hundreds (depending on what it will be used for).
To Post 24: This will most likely damage the device as the power supply isn't sending enough power to the device.
To Post 23: You have to upgrade the conductor (wire) and fuse and plug to 30a rated. No way around that.
To Post 22: While the device might be rated at those numbers, what is the actual amount it's drawing? Usually on the tags you will find something like "110v 2.6A"; that is the max it will draw. Since a washer and a fridge are motor driven, if both use more than 15A combined, the most likely thing that will happen is the breaker will trip. Always try to run a dedicated 20A line to a fridge to prevent anything tripping that breaker and spoiling your food.
To post 17: PSI is pounds per square inch. It's usually used when talking about compression (liquids, air, etc.).
To post 15: You would trip the breaker, or blow a fuse (depending what your place uses).
To Post 13: Set it to 115v. AC power is never perfectly 110 or 120; as long as it's not a major difference, you should be fine.
Why do we use kVA on transformers and kW for energy measurement?
I want to know what the power supply for a CPU should be. Should it be 6A or 10A? Please let me know soon. I want to connect my CPU.
How many amperes can be drawn by a single 240 volt power outlet?
What happens if my adaptor with a rating of 15 volts 4 amps is connected to a device with an input rating of 15v and 5 amps? Will this damage the adaptor?
I have a machine which requires 30 amps, and my socket output is only 16 amps. What can I do to run my machine? Please advise.
I have a freezer rated at 15 amps and a washer rated at 15-20 amps. What happens when I connect both appliances to the same outlet rated at 15 amps at some point both will run at the same time. Will I cause a fire? Is my only choice to call an electrician to install a bigger breaker? Thanks.
I have a power line of 220V 50Hz Ac and i need an output of 20 volts and as much current that it can deliver. Can you please suggest the circuit diagram for that?
I have a dryer hooked to 220 volts and its pulling 14 amps, but the dryer doesn't heat up. What can I do?
The amps and breakers you use have to do not only with the amperage of the electric unit, but also with the gauge you used in the line (wire).
In other words, if you used a 10 gauge 30 amp wire, then the breaker also should be 30 amps to avoid overheating the line; so, if you plan to have a unit that inputs 60 amps, then add a wire that matches those amps with the gauge itself, probably 6 gauge in that case.
Need your advice. I am replacing an on-off-on toggle switch than runs a hydraulic trailer tilt. Old switch says 10 AMP on it, is it OK to replace it with a sturdier one that has 35 AMP max?
I'm doing a school project and I need to know if psi = amps.
I need to produce a burst of 1,000 amps. How can I do this?
If you run two appliances that are 2700 watts and you have 120 voltage and 15 amps, what would happen? Would you break the circuit since you are trying to draw more power than is possible from 120 v and 15 amps?
@#11
P = Power = Wattage
I = Amps = Amperage
V = Volts = Voltage
So, Power is the Amperage multiplied by the voltage.
So, if you have 2A being used and 240V of Voltage, then the Power (Wattage) = 480Watts
I live in Japan. I'm English, so I don't understand much of what's going on out here, to be honest. However, my new problem is I've recently bought a cd mixer. It says I must change the voltage either to 115 v or 230 v, but the voltage in my apartment is 125 v. Which voltage should I change my cd mixer to?
I have a main breaker panel that has different amp readings with a clamp meter on the two feeds. The difference is ~18 amps on one side and ~28 amps on the other; is this normal? I also have the same difference in readings on the AC blower unit. The unit is fed by a 30 amp double pole breaker, with ~7 amps on one feed and ~12 amps on the other. Is there a reason for this, and could this cause unusually high power bills?
What is the difference between Amperage and Wattage?
lbealessio,
Since the output of the adapter is higher than what the electronic device is designed for, you may damage the device. It is never a good idea to supply more voltage or amperage than the device is designed to handle.
anon15670,
Most devices have two amounts of power used, one is when it is turning on, where it will tend to use the maximum. During normal operation, the power demand will be lower.
With an air conditioner, most of them cycle on or off depending on the temperatures, so you will see more power used as it cycles up, and then less as it works to maintain the temperature. In general, the amp draw will be 15 to 20 amps peak use by most air conditioners, but the voltages will differ based on how efficient the unit is, as well as how powerful the unit is.
Vince,
You need to be really careful when you are dealing with a 220 line. I am not sure about the exact regulations, but in general, any 220 line should be direct into the breaker box with a dedicated breaker. *Never* connect two lines into the same breaker because it is against the building code in many/most places.
It is always better to pay a little more than to risk starting a fire. If your breaker box is full, then replace it with a bigger box, or you can have another smaller box set up with a feed into the main box. Again, check the building code, but you really don't want to fill a breaker box to the limit (risk of fire).
When in doubt, ask a licensed electrician in your area for advice to make sure you are not doing anything stupid (we all make stupid mistakes when learning, but some mistakes will be really obvious to those who know what they are doing).
onthego4,
If you pull more amps at once than the circuit is designed for, either the circuit breaker (or fuse) will trip, or there is the potential to start a fire. This is what fuses and circuit breakers were originally designed to do: prevent fires due to too much electricity flowing through a line that is not designed for it. What can happen is a circuit breaker will go bad and will allow more current through a line than it is designed for, and you end up with an electrical fire. The type of wire used, insulation, etc., is part of how much it can handle without overheating and causing a fire.
If you have an old electrical system in your house, it might be worth the money to have an electrician check things out and replace the circuit breakers. You also should not neglect to check the master breaker to make sure that it can handle the full load for the house and is working properly.
The house I just bought had 100 amp service with an electric range which by itself was rated for 50 amps max load. I knew right away that that was not good, so I paid to have the service upgraded to 200 amps with a new larger breaker box (so more circuits could be added). It is worth paying $2000 to avoid a fire, and even if you have a clue about doing the work yourself, if you are not certain about what you are doing, save yourself the pain of a fire by having a real professional (not someone who takes money but doesn't know what he/she is doing) do the work.
If I have a delicate electronic device that requires a power adapter with an output of 5 volts, 1.6 amps, what happens if I plug it into a power adapter that has an output of 5 volts and 2.8 amps?
I have an underground line running to an outside fixture. There is a hot, a neutral, and a bare copper ground that I can identify inside the house and at the exterior receptacle. When I use a tester, I read continuity on the wires and I read a voltage of 125 at the receptacle. However, when I plug anything into the receptacle, there is not enough amperage? wattage? to light a bulb or start a tool. I have switched the lines around but get no change. Is it possible that one line is so corroded or nearly severed that voltage reads but amperage is not sufficient?
I have a 15 amp circuit in my bedroom. My current air conditioner has a posting of 12 amps, and I want to replace it so I can use more appliances while it is on. Unfortunately, all the air conditioners I see (even those with lower BTUs than what I currently have) have this "15 amperes" (drawn amperes: 4.6, 6.8, etc.). What does that mean? All I really want is to find a lower ampere requirement for an A/C. What is the difference between amperes and drawn amperes?
I have a power supply that has 5000mA 60 Watt max.
Is 5000mA the same as 5 Amps?
rms: "I have an LCD monitor that requires 4amps. what happens if the power supply that I use is only 2.5amps?"
I am not an electrician, but I have enough knowledge to be dangerous.
If it is the power supply that was provided by the manufacturer, you are probably all right.
But I needed to get a small computer peripheral working for a conference call, and could only find a unit that was rated for fewer amps (but the same voltage) as the peripheral.
The conference call went OK, but the power supply just about melted before I noticed it was too hot to touch.
If you threw that power supply in there, I'd get one that is properly sized - It's likely a fire hazard.
I have an LCD monitor that requires 4 amps. What happens if the power supply that I use is only 2.5 amps?
Suppose I have a 30 amp breaker with a 10-3 wire run. At a junction box, it splits to an electric dryer with 10-3 wire. Would it be appropriate to run a 12-2 wire from the 10-3 wire at the junction box to a fuse box with a 20 amp fuse to run a washing machine? I'm trying to understand the relationship of the current flowing from the 30 amp breaker through the 10 gauge wire and the 12 gauge wire. I'm thinking the only current flowing will be that which is pulled by the device. If so, the 12 gauge wire is adequate, since the device, the washing machine, will only pull up to 20 amps. Therefore, the 12 gauge wire would handle the current flowing to the washing machine.
What will happen if you plug a machine that requires 32.0 amps to an outlet that only produces 31.2 amps?
When you buy a product and it says 15 amps, does that mean that it requires that many amps to operate, or does it need to be plugged into an outlet with that much amperage but use less than the outlet puts out?
In the first circuits course of my electrical engineering program my instructor said
"If you are going to be a professional you must never use the terms amperage and wattage, the correct terms are current and power"