Energy Study

Taming the Energy Use of Gaming Computers

Nathaniel Mills
GreeningTheBeast.org

Evan Mills
Lawrence Berkeley National Laboratory
evanmills.lbl.gov
emills@lbl.gov


Just published in the journal Energy Efficiency, this new study presents a novel analysis of the energy use of gaming PCs.

Download the full report here.

NEWS RELEASE

News Coverage


One billion people around the world today engage in digital gaming. Gaming is the most energy-intensive use of desktop computers, and the high-performance “racecar” machines built expressly for this purpose comprise the fastest growing type of gaming platform.

We found enormous performance-normalized variations in power ratings among the gaming computer components available on today’s market.  For example, central processing units vary by 4.3-fold, graphics processing units 5.8-fold, power supply units 1.3-fold, motherboards 5.0-fold, RAM 139.2-fold, and displays 11.5-fold. Similarly performing complete systems with low, typical, and high efficiencies correspond to approximately 900, 600, and 300 watts of nameplate power, respectively.
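To illustrate how the component spread compounds at the system level, here is a minimal sketch. The per-component wattages are hypothetical placeholders chosen only so that the totals land near the ~900/600/300 W system figures above; they are not measurements from the study:

```python
# Hypothetical nameplate wattages (placeholders, not the study's data) for
# three similarly performing builds; totals approximate the ~900/600/300 W
# system figures cited above.
builds = {
    "low-efficiency":  {"CPU": 220, "GPU": 420, "motherboard": 80,
                        "RAM": 30, "drives": 50, "fans": 40, "display": 60},
    "typical":         {"CPU": 140, "GPU": 280, "motherboard": 50,
                        "RAM": 15, "drives": 35, "fans": 25, "display": 55},
    "high-efficiency": {"CPU": 65, "GPU": 145, "motherboard": 25,
                        "RAM": 5, "drives": 20, "fans": 10, "display": 30},
}

for name, parts in builds.items():
    print(f"{name}: {sum(parts.values())} W nameplate")
```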

While measured power requirements are considerably lower than nameplate for most components we tested--by about 50% for complete systems--the bottom-line energy use is massive compared to that of standard personal computers.

Based on our actual measurements of gaming PCs with progressively more efficient component configurations, together with market data on typical patterns of use, we estimate that the typical gaming PC (including display) uses about 1400 kilowatt-hours of electricity per year. The energy use of a single typical gaming PC is equivalent to the energy use of 10 game consoles, 6 conventional desktop computers, or 3 refrigerators. Depending on local energy prices, it can cost many hundreds of dollars per year to run a gaming PC.

While gaming PCs represent only 2.5% of the global installed personal computing equipment base, our initial scoping estimate suggests that gaming PCs consumed roughly 75 billion kilowatt-hours per year of electricity globally in 2012, or approximately 20% of all personal desktop computer, notebook, and console energy usage combined.  For context, this corresponds to about $10 billion per year in energy expenditures, or the equivalent electrical output of 25 typical electric power plants.
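As a back-of-envelope consistency check on these figures (not part of the study's methodology), the quoted numbers imply roughly 3 billion kWh per power plant and an average electricity price around 13 cents/kWh:

```python
# Back-of-envelope check of the global scoping figures quoted above.
GLOBAL_KWH = 75e9    # ~75 billion kWh/yr (2012 estimate)
SPENDING_USD = 10e9  # ~$10 billion/yr in energy expenditures
PLANTS = 25          # "25 typical electric power plants"

kwh_per_plant = GLOBAL_KWH / PLANTS        # implied annual output per plant
implied_price = SPENDING_USD / GLOBAL_KWH  # implied average $/kWh
print(f"{kwh_per_plant / 1e9:.0f} billion kWh/yr per plant, ~${implied_price:.2f}/kWh")
```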

Given market trends and projected changes in the installed base, we estimate that this energy consumption will more than double by the year 2020 if the current rate of equipment sales is unabated and efficiencies are not improved. Although they will represent only 10% of the installed base of all types of gaming platforms globally in 2020, relatively high unit energy consumption and high hours of use will result in gaming computers being responsible for 40% of overall gaming energy use.

This significant energy footprint can be reduced by more than 75% with premium-efficiency components and operations, while improving reliability and performance. This corresponds to a potential savings of approximately 120 billion kilowatt-hours, or $18 billion per year globally, by 2020.

There is a significant lack of current policies to achieve such improvements, and very little guidance is available to help consumers make energy efficient choices when they purchase, upgrade, and operate their gaming PCs.  Key opportunities include product labeling, utility rebates, and minimum efficiency standards.

Following is a cross-section of comments and questions we've received, along with our effort to offer constructive responses:

Q: What single piece of advice would you give to someone building a new gaming PC?


A: Do the math to determine your total cost of ownership (purchase + operations).  A $2000 machine could consume that much again in energy costs over just a few years.  If you're in a place with high electricity prices, the operating cost can be particularly prohibitive.  Treat improving efficiency as a cool challenge and geek out on it!
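That math can be sketched in a few lines. The $2000 purchase price echoes the example above and the 1400 kWh/yr figure is the study's typical-machine estimate, but the $0.25/kWh tariff and 4-year life are illustrative assumptions:

```python
def total_cost_of_ownership(purchase_usd, kwh_per_year, usd_per_kwh, years):
    """Simple (undiscounted) total cost: purchase plus lifetime energy."""
    return purchase_usd + kwh_per_year * usd_per_kwh * years

# $2000 machine, ~1400 kWh/yr, assumed $0.25/kWh marginal tariff, 4-year life.
print(f"TCO: ${total_cost_of_ownership(2000, 1400, 0.25, 4):.0f}")  # energy adds $1400
```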
 
Q: How has the gaming community and industry responded to the research?

A: We've had the whole gamut of reactions. Many people react to brief news summaries without reading the actual report.  People tend to compare their particular machine and personal utilization to our "average" one and balk at any differences.  Of course, no one person is average but one needs to use averages in order to extrapolate to large scales.  Lots of people assert that it's not cost-effective to save energy, but this is a red herring insofar as some things can be done that cost nothing, other things will have a reasonable payback time, and others will yield non-energy benefits such as noise and heat reduction that are certainly valued by gamers.  Some people are in disbelief that energy use can be reduced without taking a performance hit, but the data show clearly that this is quite possible.

Q: You recognize and address some of the limitations of the study, do you have plans for further research and if so, what form does that take? 

A: There are seven further research ideas at the bottom of this page.

Q: Are there free/cheap things that gamers can do to monitor their energy use, and if so, is there a place for crowdsourced data?

A: See the nine items atop this page. Crowdsourcing would be a great idea.  I don't know of anywhere that does it at any scale.  However, before this could be done meaningfully there would need to be standard, accepted measurement methodologies ... otherwise we will just have a lot of inconclusive apples-and-oranges data.
 
Q: It’s clear that some gamers don’t much care about environmental concerns - what’s one thing you would say to those people?

A: Many people say that they simply don't care about energy or the environment and that gaming is a necessity rather than an option.  We are actually not advocating that people stop gaming or even reduce it, only that they look at ways of specifying more efficient rigs.  Even if the environment isn't considered important, inefficient gaming PCs also release tons of heat and are noisier than efficient ones, so it's a win-win proposition to improve efficiency ... not to mention saving money. The younger set may count on their parents to pay the energy bills, but awareness will come, and with it a shock.

Others care, but think our attention is misplaced.  It's easy to dismiss any particular use of energy as being insignificant, and easy to point to "big polluters" elsewhere.  The reality is that energy use is highly diffuse and there is no magic bullet or single type of use that will solve the problem.  Gaming computers use more energy than previously thought, with each one consuming as much as three new US refrigerators. And gaming PCs are about the only piece of equipment left that has had a free ride in terms of no energy labeling, standards, or incentives for improvement ... this double standard will no doubt soon be resolved.

Q: TLDR [Too Long, Didn't Read].

A: (Still working on an answer to this one)

Q: Did you just make these crazy numbers up? You must be assuming that people are gaming 24x7 full-out at 1000 watts of load or something like that.

A: Well, that would certainly be unrealistic, and would result in energy use of 8,760 kWh/year (over six times what we estimate). Based on the most thorough market research we could find, the average gamer plays 4.4 hours per day, and this is what we assume for the time spent gaming. We found that nameplate power is often twice as high as the actual (peak) power requirements that we measured, so we did the calculation using the lower measured value. Keep in mind that the typical user also uses their machine for reading email, web browsing, and other tasks, plus idle time, so we incorporate each of those modes as well. The power demand during those periods is much lower, but still adds up to about half of the total annual use. We assume the machine is off 8 hours of the day.
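A minimal sketch of this mode-weighted calculation: the hours follow the text (4.4 h/day gaming, 8 h/day off, the remainder split between desktop tasks and idle), while the per-mode power draws are illustrative assumptions tuned to land near the study's ~1400 kWh/yr typical figure:

```python
# Mode-weighted annual energy.  Hours/day follow the text; the watt figures
# are illustrative assumptions, not the study's measured values.
modes = {              # mode: (hours/day, average watts incl. display)
    "gaming":  (4.4, 400),
    "desktop": (6.0, 200),   # email, browsing, other tasks
    "idle":    (5.6, 150),
    "off":     (8.0, 0),
}
assert abs(sum(h for h, _ in modes.values()) - 24) < 1e-9

annual_kwh = sum(h * w for h, w in modes.values()) * 365 / 1000
non_gaming = sum(h * w for m, (h, w) in modes.items() if m != "gaming") * 365 / 1000
print(f"~{annual_kwh:.0f} kWh/yr; non-gaming modes ~{non_gaming / annual_kwh:.0%} of total")
```

Note how the low-power modes still contribute about half of the annual total, consistent with the answer above.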

Q: Why don't you pick on cars or something that uses enough energy to worry about?

A: This is one whopper of a red herring. The question is not usefully framed as a matter of which energy use to scrutinize.  For those who think that energy is at all a problem (cost, pollution, whatever), then all uses can and should be looked at for efficiency opportunities.  And virtually all of them (lighting, heating/cooling, fridges, etc.) have been addressed extensively over the past 30 years, with huge improvements in efficiency and gains for consumers.  Gaming PCs are one of the only loads that has not had the benefit of a close look from the energy community, so far. Anyway, the typical gaming PC in the study uses about 1/3 as much electricity as the average California home -- i.e., not trivial. The typical gaming PC costs from $140 to $560/year to run (depending on energy prices) - https://sites.google...ast/cost-carbon - also not trivial, and this does not include what can be a comparable bump in air-conditioning use if you're in a hot climate (in a cold climate this waste heat is sometimes useful, but mostly it is not well distributed through the house).


Q: The GPU is never enough, you buy as much as you can and it will be outdated pretty fast anyway.

A: Another red herring.  You're conflating performance with energy use.  Our study shows (in spades) that there is not a 1:1 correlation between the two.  For whatever performance level is desired, energy can be saved by choosing wisely from what's on the market.


Q: The synthetic benchmarking software used to test performance of the PCs in the study is not an accurate reflection of real life.

A: We didn’t see different power draw when running actual games, but the use of synthetic benchmarks is the way this should be done — it’s like MPG ratings for cars. If everyone just compared their own mileage you’d have no way to untangle which differences were due to fuel economy and which were due to driving style, roof racks, weight, etc.


Q: I just wanted to let you know that a lot of your data is wrong or self contradicting on your website
"The energy use of a single typical gaming PC is equivalent to the energy use of 10 game consoles"
Well in your own diagrams you reference a 450 watt PC, yet a PS4 and XboxOne both pull around 110 to 140 watts. A gaming pc does not pull ten times either of these systems, especially not a "typical" one. You've either purposefully altered the data or not taken the time to do the research. Luckily extreme-tech did - Ref: http://bit.ly/1jfQ4sr

A: You're mixing up some very important units here. We are discussing energy (Wh), not power (W). Average utilization of gaming PCs and consoles differs (not to mention the different levels of load for each mode of operation -- see article), so energy use does not at all scale with nameplate power. Also, our methodology was to characterize typical equipment in the stock rather than random non-representative examples. The peer-reviewed literature indicates significantly lower average power for consoles than your example. The energy (kWh) gap is even greater because national-average gaming PC use is 4.4 h/day while average console use is 2.2 h/day (see cited literature in our article). Yes, there will certainly always be large variations around the means (cars, fridges, gaming PCs, you name it), but means need to be used for macro-level analyses.

By the way, we actually reviewed the NRDC report that you referenced, and it is flawed on the high side - see footnotes 1 and 15 in our paper. NRDC received pre-publication comments pointing out the problems and disregarded them.  There are much more careful studies in the literature.

If you want to look in a balanced way at reasons our analysis could yield lower (or higher) numbers, you need to also consider HVAC interactions (which we do not even treat in our analysis). For the US overall, waste heat in homes costs far more to remove with AC than it "saves" with heating. Yes, a user in Bangor Maine may not experience it this way, but a user in Phoenix very much will. Also, claims of 'beneficial' waste heat are ill-founded given the vastly higher efficacy of heat pumps than electric resistance (e.g., a gaming PC or strip heater), not to mention better distribution within the home.

Anyway, we're very clear in the piece that this is a scoping study, the first to look at this in any detail.  No doubt our numbers can be improved.  Let's talk about how to do that!

Q: That's fine, but at what cost?  You talk about building a PC using more efficient components. Is it even beneficial for a person using a 2 generation old graphics card to spend 200-300 dollars to save 10-50 dollars a year? I would say no.  Just buy the latest tech and you'll save energy.

A: This is not properly framed. No one recommends scrapping equipment that has remaining service life; at issue is the incremental cost of more and less efficient gear at the time of normal replacement (as well as no-cost operational fixes).

In any case, one can't have an intelligent conversation about cost-benefit analysis until one knows the operating cost side of the equation.  That's what this study was about.

Cost-benefit analysis is the next step and is not methodologically trivial. Randomly picking prices off the internet could yield all kinds of results; a systematic effort is needed. There are also co-benefits that need to be considered (heat, noise, size); a simplified engineering-economic calculation looking only at energy cost will not fully represent all that's important to users' decision-making. That said, the trends in DDR (vastly less power despite vastly more capacity) are teachable -- people opt for the better tech for all kinds of reasons, not just simple energy payback time. Consideration also needs to be given to measures that entail no cost, and to others that save cost outright (e.g., not over-sizing a PSU, not buying a fancy graphics card only to have it bottlenecked by an inadequate CPU, not leaving unused slots in your motherboard hot). Also, proper cost-benefit analysis would have to take into account the above-mentioned HVAC interactions. We are interested in what does ... and can ... happen, as distinct from an idealized baseline where everything is already optimized (which it certainly is not).

Q: CPU the bottleneck in gaming? On what planet? You can get a dual-core unlocked Pentium that can sometimes be found for as low as $45 and overclock it for minimal loss in GPU perf despite being just dual core with less cache. Anyone claiming the CPU as a bottleneck clearly hasn't even spent 15 minutes researching the topic. You need to cripple the CPU perf hard and game at low res for the CPU to be the bottleneck. If anything, most of the time the CPU is overdimensioned for gaming but that's, at least in part, because that CPU perf is needed for other tasks. And again price is always a factor.

A: Earth, third stone from the Sun.  You seem to be misreading the work and overgeneralizing a bit.  In a real-world game, a config like this could produce a hit of 10-20 FPS, which is not trivial to most gamers. This will be game-specific.  Anyway, we don't make any assumptions about bottlenecking in our base machine or subsequent analysis; it is just a side comment describing one of many potential issues...

Presume you mean "hi-res"....  Anyway, we are saying only that without proper specification the GPU can form a bottleneck. We're sure this happens in the real world, as opposed to an idealized one. We're not saying that it's the norm.  If you're video editing on the rig, too, then sure, maybe the CPU will be less likely to become a bottleneck.  Keep in mind that PartPicker won't warn you if you have a mismatch, and lots of people use PartPicker.

Q: What about cost? Of course the latest tech is going to be more power efficient, but new also means expensive. The Nvidia 900 series and the AMD 300 series aren't cheap and the cost of the card doesn't outweigh the cost of the electricity.

A: The latest tech is by no means automatically more power-efficient. If you look at the field you'll see little correlation between price and energy efficiency. In any case, we're not sure we agree with your characterization of the 970, as it is often referred to as the "Performance King", indicating market approval of the price-performance relationship (even before thought is given to operating cost). The AMD 300 doesn't seem like a good way to make your point either, as it is a mere refresh: no meaningful performance bump over the 200 series, just pricier, and it actually draws more power. This is not an isolated example from the world of energy-using technology where a more expensive but otherwise 'equivalent' product actually uses more energy. One example is the Nvidia 980Ti vs. the AMD 295X2: per gpuboss.com, the former actually performs better by Passmark, has half the nameplate power (~250 W vs. ~500 W), and a substantially lower price (~$650 vs. ~$1000).
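In performance-per-watt terms that comparison looks something like this. The wattages and prices are the approximate figures quoted above, but the Passmark-style scores are hypothetical placeholders (the text says only that the 980Ti scores higher):

```python
# Nameplate watts and prices are the approximate figures quoted above;
# the scores are hypothetical placeholders reflecting "performs better".
cards = {
    "Nvidia 980Ti": {"score": 11000, "watts": 250, "price": 650},
    "AMD 295X2":    {"score": 10000, "watts": 500, "price": 1000},
}

for name, c in cards.items():
    print(f"{name}: {c['score'] / c['watts']:.1f} pts/W, "
          f"{c['score'] / c['price']:.1f} pts/$")
```

Even with the scores set nearly equal, the halved power draw roughly doubles the efficiency metric.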

Q: PSUs are a bit oversized but efficiency is a high priority for buyers and there is pretty good data available to the DIY scene. 


A: Agreed.  We make this point clearly in the study, and cite PSUs as one of the (few) places where success has been attained.  Note that 80 Plus did not emerge from within the PC community; it was introduced by colleagues of mine who came from the energy-analysis side of things.  It would not have happened (or at least not nearly as quickly) without the same kind of curiosity we brought to this paper.

Q: G-Sync and FreeSync have a rather high premium, not to mention that nobody, absolutely nobody buys such a screen to be able to use a lesser GPU. 

A: See cost FAQ, above. Moreover, we are talking about what is possible if energy is a goal, rather than describing current behavior.  That said, our base machine certainly did not assume G-Sync, arguably for the reasons you give, i.e., it is not standard practice.  This is also a bit of a red herring; I'm not going to second-guess NVIDIA or AMD on the market research that compelled them to invest in the R&D, tooling, and marketing to bring these to the market. At any rate, any new tech is more "expensive" than what will ultimately be its stabilized price.  Time will tell.

Q: Plus GPU makers do make efforts to be more efficient and most of the share Nvidia gained in the last 1 year+ was because Nvidia was more power efficient than AMD.  

A: We agree and think that comes out clearly in our paper.  But, you'll see from Fig 5 that there remains a HUGE (factor-of-2) variance in energy use per given level of performance.

Q: This whole argument doesn't even matter unless people can maintain a decent system, for very cheap, to save maybe 50 dollars a year. You cite "hundreds" of dollars could be saved each year with more efficient hardware. This is wrong, let me explain. Average cost per KW is 12 cents per KW. Ref. http://n.pr/1JKAbM6 Assuming I'm gaming 22 hours a per week Ref: http://bit.ly/1gvHsTU That means I game 1144 hours per year. 1144 x 0.12 = $137.28. This is not hundreds. Also spending more money to reduce this cost, does not eliminate the cost. The end result is that people have less money, power still gets consumed, and it's bad all around.

A: Again, we're talking energy here, not power. You've mixed up your calculation, i.e., energy use isn't the product of hrs/year and price (your equation assumes 1kW sustained load for the machine which is clearly high). You've also assumed zero load for all non-gaming hours, which may mean that you unplug your machine when not gaming and don't use it for any other purpose ... but that is not representative of the typical use case that we are modeling. You need to add average load (in kW) to the equation, which is the time-weighted average across all the modes of operation (see article). I think that was somewhere between 0.2 and 0.3 kW for our test machines.

Also, you can't get the economics right unless you look at the electricity tariff structure. The average price is not meaningful here because in most markets tariff structures are steeply inverted, such that the marginal kWh is far more expensive than the first or average kWh. Marginal tariffs (where the gaming PC's energy use -- and potential savings -- occur) are closer to 40 cents/kWh, depending on where you are. We provide a table showing the sensitivity of the cost to this and to operational variables.
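The tariff sensitivity is easy to see with the study's ~1400 kWh/yr typical-machine figure; the two prices below are illustrative (a ~12 cent/kWh US average vs. a ~40 cent/kWh marginal rate):

```python
# Annual operating cost at average vs. marginal tariff, using the study's
# ~1400 kWh/yr typical gaming PC.  Prices are illustrative.
ANNUAL_KWH = 1400

for label, usd_per_kwh in [("average tariff (12 c/kWh)", 0.12),
                           ("marginal tariff (40 c/kWh)", 0.40)]:
    print(f"{label}: ${ANNUAL_KWH * usd_per_kwh:.0f}/yr")
```

The resulting range spans roughly $170 to $560 per year, in line with the cost figures cited elsewhere on this page.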

Even with the right math, your off-hand characterization of cost-benefit analysis isn't how the energy market actually works. People invest to gain profit. And sophisticated decision-makers think about TCO (total cost of ownership, i.e., the sum of first cost and operating cost). There is an enormous literature showing that energy efficiency pays for itself many times over. That is why energy efficiency is well documented to be saving hundreds of billions of dollars annually in the US economy compared to the efficiencies we had before the energy crises began. Closer to home, what has happened with 80 Plus and Energy Star displays gives an indication that there is a market and demand for improved efficiency, even for gaming PCs.

Anyway, our job here was not to tell people how aggressively to invest in efficiency, but, rather, to show the envelope of possibilities. Each will decide for themselves, and many, including presumably yourself, may not do anything. Where that has been the case in the past, a whole host of things have been done, ranging from product labeling to financial incentives to mandatory standards. The gaming PC space has thus far been largely overlooked in this regard.

Q: I think the consumer information out there is perfectly adequate.

A: Hard to agree here. In any case, it's way less than is available (and much harder to get) for most consumer products.  Many components' nominal power requirements are not labeled on the box, and can be hard to find even online. More importantly, the study shows that actual power draw is on average half of nameplate for complete systems (varying by component, but almost always substantially less than nameplate).

That's "good" when it comes to energy use, but still, in order to make informed decisions (and not oversize PSUs) we need better bench testing and labeling. To do this meaningfully, there need to be industry standards and test procedures for this gear -- the study notes examples of how product reviews report significant variations in measured power demand for identical components and benchmarks. Annual operating costs depend not only on the integral of (actual) energy use over all the modes of operation (see chart in study) but also on electricity tariff structures. Electricity prices vary by a factor of 10 or more across the US today. And there are second-order effects, like increased air-conditioning energy use, that few people know how to estimate. The information environment could be better.

Fridges actually provide a great example of what we see as the difference between gaming PCs and most other consumer products. Law requires all refrigerators to be prominently labeled with the FTC's EnergyGuide label at point of sale.  The labels are bright yellow and roughly 5x8 inches in size.  The USEPA also maintains an exhaustive online list of fridges (sortable by brand, model, size, features, etc.), ranked by energy use per a standardized test procedure.

Q: It's not gonna happen. Gaming computers will continue to consume insane amounts of power for as long as they exist. Compared to what we had only 5 years ago, the power to performance ratio has more than doubled. But at the same time we've used that increase in efficiency to get double the performance at the same power ratings rather than the same performance at half the power consumption. It's not like we finally hit a point where 'computers are good enough and don't need to be improved', if anything, its the complete opposite. We NEED more computational power. Not want, NEED. We are still well within the territory of hardware holding back what can be achieved with software. Until I can run F@H simulations in real time instead of in nanoseconds per day, graphics cards will continue to consume upwards of 300W and there's nothing that branding or labels can do about it. Instead of trying to do the impossible, how about you put some effort into affordable solar panels to cancel out the power used by my computer instead? I would love to power my home with solar, but I don't have several thousand dollars lying around. Don't blame computers for using power, blame local power companies for continuing to rely on fossil fuels and charge asinine rates for something as basic as electricity. The technology already exists and we could easily generate more than enough power for the whole world using renewable resources, don't blame us for the failures of those who actually have the ability to bring about change.

A: Hmmmm. I didn't hear a question there. Because it's cheaper to save a unit of energy than to generate it with solar, it's wasteful to run inefficient devices with solar.  

The rest of this reminds us of the other guy who said:

I overclock the shit out of my computer most the time which increases power draw to about double what it is at stock. This nets an additional 20% performance (average). And then I suffer because it turns my room into a toaster. But I do it for the love of the PC MASTER RACE, Gabe N., and max fps.
TLDR [Too Long, Didn't Read]: People above me already said it all, we have capitalism and I can waste my money on dank pc gaming framerates if I want to.

And the guy who said:

Interesting read. Still don't give a shit though because I like to harness thy gigahertz.

Anyway, this point of view is why the ice caps are melting and why policymakers often feel compelled to employ mandatory equipment efficiency standards. It's called market failure. We love solar, too, but as long as money is a finite resource, powering inefficient loads with costly solar means that there are other loads that won't be reached by the clean energy. The people with the most ability to bring about change here are actually the gamers.