While a computer isn’t as power-hungry as appliances like air conditioners or washing machines, it’s still worth knowing how much electricity it uses when you’re looking at your whole home’s energy usage.
Key takeaways about powering a computer
- On average, laptops use about 30 to 70 watts of electricity.
- Large desktop and gaming computers use between 200 and 500 watts of electricity, on average.
- Using a computer for 8 hours per day will use about 12.2 kilowatt-hours of electricity per month and 146 kilowatt-hours of electricity per year.
- A computer costs an average of $1.73 to use for a month and $20.72 to use for a year.
- The best way to save money on electricity is to install solar panels. Start comparing your options on the EnergySage Marketplace today.
In this article
- How much electricity does a computer use?
- Definitions: watts, volts, amps, and more
- How much does it cost to power a computer?
How much electricity does a computer use?
Generally, computers use between 30 and 70 watts (W) of electricity, depending on the model, and plug into a standard 120-volt outlet. At those wattages, the actual current draw is well under 1 amp. Larger desktop and gaming computers can use up to 500 W, which works out to roughly 4 amps at 120 volts.
How much you use your computer has the biggest impact on how much electricity it uses over time. Assuming an average computer needs 50 W to run:
- Using your computer for 6 hours per day results in 2.1 kilowatt-hours (kWh) of electricity per week, 9.1 kWh per month, and 109.5 kWh per year.
- 8 hours per day of computer usage comes to 2.8 kWh per week, 12.2 kWh per month, and 146 kWh per year.
- On the upper end, running a computer for 10 hours per day uses 3.5 kWh of electricity per week, 15.2 kWh per month, and 182.5 kWh per year.
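The kilowatt-hour figures above come from a simple calculation: watts times hours, divided by 1,000. Here’s a short sketch using the article’s assumed 50 W draw (the 30.42 days-per-month figure is just 365/12):

```python
# Reproduce the article's kWh figures for a 50 W computer.
# 50 W is the article's assumed average draw, not a measurement.
WATTS = 50

def kwh(hours_per_day: float, days: float) -> float:
    """Energy used in kilowatt-hours over a given number of days."""
    return WATTS * hours_per_day * days / 1000

for hours in (6, 8, 10):
    print(f"{hours} h/day: {kwh(hours, 7):.1f} kWh/week, "
          f"{kwh(hours, 365 / 12):.1f} kWh/month, "
          f"{kwh(hours, 365):.1f} kWh/year")
```

Running this prints the same weekly, monthly, and yearly figures listed above (2.1 / 9.1 / 109.5 kWh for 6 hours per day, and so on).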
Different wattage computers use different amounts of electricity over the course of a year. Assuming you run your computer an average amount (8 hours per day), here’s how much electricity you’ll use over the course of a year for different wattage computers:
How many kilowatt-hours do different computers use in a month and a year?
| Computer wattage | Hours per year run | Monthly kWh of electricity | Yearly kWh of electricity |
|---|---|---|---|
| 30 W | 2,920 hours | 7.3 kWh | 87.6 kWh |
| 40 W | 2,920 hours | 9.7 kWh | 116.8 kWh |
| 50 W | 2,920 hours | 12.2 kWh | 146.0 kWh |
| 60 W | 2,920 hours | 14.6 kWh | 175.2 kWh |
| 70 W | 2,920 hours | 17.0 kWh | 204.4 kWh |
We’ll mostly be referring to the electricity used by computers in terms of kWh in this article. The reason is simple: your electric bill is measured in kWh, and you get charged based on the kWh of electricity you use per month!
Watts, amps, voltage, and more: what do they mean?
There are a lot of terms you can use to describe how electricity flows and is used by appliances. We’ve already mentioned most of them – here are a few definitions to keep things straight:
- Volts (V): volts are the unit of voltage, a measure of electrical pressure difference. Put simply, voltage is the push that drives electricity through a circuit.
- Amps (A): amps (short for amperes) are a measure of electrical current. Put simply, amps describe how much electric charge (the electrons that make up electricity) flows through a circuit each second.
- Watts (W) and kilowatts (kW): multiplying volts x amps gets you watts (or wattage). Put simply, watts are the rate of electricity consumption. A kilowatt is just 1,000 watts.
- Kilowatt-hours (kWh): lastly, kilowatt-hours are how your electric bill measures your energy usage. Simply put, kilowatt-hours are electricity consumption over time.
You can think of these terms like water flowing through a pipe. Voltage is the water pressure, amps are the amount of water flowing past any point, and wattage is the overall rate of water flow through the pipe.
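To make the relationships concrete, here’s a minimal sketch using the article’s 50 W computer on a standard 120-volt outlet (the current value here is derived from those two figures, not measured):

```python
# How volts, amps, watts, and kilowatt-hours relate for a 50 W computer.
volts = 120.0        # standard US outlet voltage
amps = 50 / 120.0    # current a 50 W computer draws at 120 V (~0.42 A)

watts = volts * amps          # power: the rate of electricity consumption
kilowatts = watts / 1000      # 1 kW = 1,000 W
kwh_per_day = kilowatts * 8   # energy: power sustained over 8 hours

print(f"{watts:.0f} W, {kwh_per_day:.1f} kWh per 8-hour day")
```

Note that the derived current (about 0.42 amps) is far below the several-amp ratings you’ll see on power supplies, which describe maximum capacity rather than typical draw.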
How much does it cost to power a computer?
When you get your monthly electric bill, you only see the total amount you’re charged, not how much each appliance contributes to your final bill. Based on an average wattage of 50 W for computers (amounting to 146 kWh/year) and using state average electricity rates, here’s how the cost to run a computer pans out over the course of a month and a year:
Monthly and yearly costs to run a computer by state
| State | Average electricity rate | Cost per month | Cost per year |
|---|---|---|---|
| California | 22.00 ¢/kWh | $2.68 | $32.12 |
| New York | 20.59 ¢/kWh | $2.51 | $30.06 |
| Texas | 12.56 ¢/kWh | $1.53 | $18.34 |
| Massachusetts | 22.59 ¢/kWh | $2.75 | $32.98 |
| Florida | 12.21 ¢/kWh | $1.49 | $17.83 |
| Virginia | 12.58 ¢/kWh | $1.53 | $18.37 |
| New Jersey | 16.20 ¢/kWh | $1.97 | $23.65 |
| Maryland | 14.48 ¢/kWh | $1.76 | $21.14 |
| Washington | 10.38 ¢/kWh | $1.26 | $15.15 |
| US Average | 14.19 ¢/kWh | $1.73 | $20.72 |
Note: average electricity rates are based on October 2021 data from the U.S. Energy Information Administration (EIA).
Looking to offset your electric bills (and the energy these appliances use) with solar? When you sign up (for free!) on the EnergySage Marketplace, you can compare solar quotes from high-quality, local solar installers. Make sure to keep in mind your current and future electricity usage, and talk about how that could change with your installer for the most accurate quotes.
Calculate how much energy your own computer uses
If you want to know how much electricity your computer uses (or at least is rated to use), estimate its yearly electricity use in kWh: multiply its wattage by the hours you run it per day and by 365 days, then divide by 1,000. Multiply that figure by the average electricity rate in your area to estimate how much you spend to power your computer each year. For an estimated monthly cost, divide the yearly cost by 12.
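That estimate can be sketched in a few lines; the 50 W draw, 8 hours per day, and the 14.19 ¢/kWh US average rate are the article’s example figures, so swap in your own computer’s wattage, usage, and local rate:

```python
# Estimate the yearly and monthly cost of running a computer.
watts = 50            # article's assumed average computer draw
hours_per_day = 8     # article's assumed daily usage
rate_per_kwh = 0.1419 # US average electricity rate, in dollars

yearly_kwh = watts * hours_per_day * 365 / 1000  # 146.0 kWh
yearly_cost = yearly_kwh * rate_per_kwh          # about $20.72
monthly_cost = yearly_cost / 12                  # about $1.73

print(f"${yearly_cost:.2f}/year, ${monthly_cost:.2f}/month")
```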
Frequently asked questions about powering a computer
Is it cheaper to run a computer at certain times of day?
If you’re on a time-of-use (TOU) rate plan, you’re charged different amounts for electricity throughout the day. In general, it’s cheaper to use appliances during “off-peak” hours, which usually fall overnight.
Can a home battery power a computer?
All popular home batteries can power a computer: most lithium-ion batteries, like the Tesla Powerwall or Generac PWRcell, have a power rating of 4 to 5 kW or higher and 10 kWh or more of usable capacity. Since a computer draws only about 50 W (0.05 kW) at any one time, a battery can easily back up and power your computer for long periods, even for more powerful machines.
How many solar panels do you need to power a computer?
On average, computers use about 50 W of electricity to stay powered. With solar panels typically rated around 350 W, you’ll be able to power a computer with one solar panel easily.
What is ENERGY STAR?
ENERGY STAR is a U.S. government-backed program that certifies how energy efficient appliances are. If an appliance beats the average appliance in its category by a certain margin, it’s labeled “ENERGY STAR certified.” ENERGY STAR appliances cost less money to run because they use electricity more efficiently.
How much money can solar panels save you?
Solar savings vary widely, and your unique savings depend on factors like electricity usage, location, and electric rates and plans. In general, most homeowners can expect to save somewhere between $10,000 and $30,000 over the lifetime of a solar panel system. On average, it takes between 7 and 8 years for most homeowners who shop for solar on EnergySage to get their solar panels to pay for themselves.
Going solar is one of the most effective ways to reduce or eliminate your electric bill, and you should make sure you are getting several quotes from reputable installers before you decide to move forward. Visit the EnergySage Marketplace to get solar quotes from installers in your area and begin comparing options.