Taming the Energy Use of Gaming Computers
Just published in the journal Energy Efficiency, this new study presents a novel analysis of the energy use of gaming PCs.
Download the full report here.
One billion people around the world today engage in digital gaming. Gaming is the most energy-intensive use of desktop computers, and the high-performance “racecar” machines built expressly for this purpose are the fastest-growing type of gaming platform.
We found enormous performance-normalized variations in power ratings among the gaming computer components available on today’s market. For example, central processing units vary by 4.3-fold, graphics processing units by 5.8-fold, power supply units by 1.3-fold, motherboards by 5.0-fold, RAM by 139.2-fold, and displays by 11.5-fold. Complete systems of similar performance but low, typical, and high efficiency correspond to approximately 900, 600, and 300 watts of nameplate power, respectively.
While measured power requirements are considerably lower than nameplate for most components we tested (by about 50% for complete systems), the bottom-line energy use is massive compared to that of standard personal computers.
Based on our actual measurements of gaming PCs with progressively
more efficient component configurations, together with market data on typical patterns of
use, we estimate that the typical gaming PC (including display) uses about 1400 kilowatt-hours of electricity per year. The energy use of a single typical gaming PC is equivalent to the energy use of 10 game consoles, 6
conventional desktop computers, or 3 refrigerators. Depending on local energy prices, it can cost many hundreds of dollars per year to run a gaming PC.
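As a rough sketch, the annual estimate above can be reproduced from per-mode power draws and daily hours of use. The mode powers and hours below are illustrative assumptions chosen to land near the published figure, not the study's exact measured values:

```python
# Annual-energy model: watts and hours per mode are assumed for illustration.
MODES = {
    # mode: (average power in watts, hours per day) -- assumed values
    "gaming":     (500, 4.4),   # 4.4 h/day gaming (market-research average)
    "non-gaming": (140, 11.6),  # email, browsing, idle (assumed load)
    "off":        (3, 8.0),     # small standby draw while "off" (assumed)
}

def annual_kwh(modes):
    """Sum watt-hours per day over all modes, then scale to a year."""
    daily_wh = sum(watts * hours for watts, hours in modes.values())
    return daily_wh * 365 / 1000  # Wh -> kWh

print(round(annual_kwh(MODES)))  # -> 1405, in line with the ~1400 kWh/year estimate
```

Note that roughly half the annual total comes from the non-gaming hours, which is why assumptions about idle and light-use loads matter as much as the gaming load itself.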
While gaming PCs represent only 2.5% of the global installed
personal computing equipment base, our initial scoping estimate suggests that
gaming PCs consumed roughly 75 billion kilowatt-hours per year of electricity globally in 2012,
or approximately 20% of
all personal desktop computer, notebook, and console energy usage combined. For context, this corresponds to about $10 billion per year in energy expenditures, or the equivalent electrical output of 25 typical electric power plants.
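The dollar and power-plant equivalences follow from simple arithmetic. The electricity price and per-plant output below are assumptions chosen to match the report's round numbers, not figures taken from it:

```python
# Consistency check on the global figures quoted above.
GLOBAL_KWH = 75e9             # ~75 billion kWh/year worldwide (2012 estimate)
PRICE_PER_KWH = 0.13          # assumed average electricity price, $/kWh
PLANT_KWH_PER_YEAR = 3e9      # assumed annual output of one typical plant

spend = GLOBAL_KWH * PRICE_PER_KWH        # ~9.75e9, i.e. about $10 billion/year
plants = GLOBAL_KWH / PLANT_KWH_PER_YEAR  # 25 typical power plants
print(f"~${spend / 1e9:.2f}B across {plants:.0f} plants")
```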
Given market trends and projected changes in the installed base, we estimate that this energy consumption will more than double by the year 2020 if the current rate of equipment sales is unabated and efficiencies are not improved. Although they will represent only 10% of the installed base of all types of gaming platforms globally in 2020, relatively high unit energy consumption and high hours of use will result in gaming computers being responsible for 40% of overall gaming energy use.
This significant energy footprint can be reduced by more than 75% with premium-efficiency components and operations, while improving reliability and performance. This corresponds to a potential savings of approximately 120 billion kilowatt-hours, or $18 billion, per year globally by 2020.
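The savings figure can be sanity-checked as follows. The 2020 baseline is inferred from "more than double" the 2012 estimate, and the electricity price is an assumption, not a figure from the report:

```python
# Arithmetic behind the 2020 savings estimate (assumed baseline and price).
baseline_2020_kwh = 160e9                # assumed ~160 billion kWh/year by 2020
savings_kwh = baseline_2020_kwh * 0.75   # 120 billion kWh/year saved
savings_usd = savings_kwh * 0.15         # ~$18 billion/year at an assumed $0.15/kWh
```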
Few current policies target such improvements, and very little guidance is available to help consumers make energy-efficient choices when they purchase, upgrade, and operate their gaming PCs. Key opportunities include product labeling, utility rebates, and minimum efficiency standards.
Following is a cross-section of the comments and questions we've received, along with our effort to offer constructive responses:
Q: What single piece of advice would you give to someone building a new gaming PC?
Q: How has the gaming community and industry responded to the research?
A: We've seen the whole gamut of reactions. Many people react to brief news summaries without reading the actual report. People tend to compare their particular machine and personal utilization to our "average" one and balk at any differences. Of course, no one person is average, but averages are needed in order to extrapolate to large scales. Lots of people assert that it's not cost-effective to save energy, but this is a red herring: some measures cost nothing, others have a reasonable payback time, and still others yield non-energy benefits, such as noise and heat reduction, that are certainly valued by gamers. Some people are in disbelief that energy use can be reduced without taking a performance hit, but the data show clearly that this is quite possible.
Q: You recognize and address some of the limitations of the study, do you have plans for further research and if so, what form does that take?
Q: Are there free/cheap things that gamers can do to monitor their energy use, and if so, is there a place for crowdsourced data?
Q: It’s clear that some gamers don’t much care about environmental concerns - what’s one thing you would say to those people?
Q: TLDR [Too Long, Didn't Read].
Q: Did you just make these crazy numbers up? You must be assuming that people are gaming 24x7 full-out at 1000 watts of load or something like that.
A: Well, that would certainly be unrealistic, and would result in energy use of 8,760 kWh/year (six times what we estimate). Based on the most thorough market research we could find, the average gamer plays 4.4 hours per day, and this is what we assume for the time spent gaming. We found that nameplate power is often twice as high as the actual (peak) power requirements we measured, so we did the calculation using the lower measured value. Keep in mind that the typical user also uses their machine for reading email, web browsing, and other tasks, plus idle time, so we incorporate each of those modes as well. The power demand during those periods is much lower, but still adds up to about half of the total annual use. We assume the machine is off 8 hours of the day.

Q: Why don't you pick on cars or something that uses enough energy to worry about?
A: This is one whopper of a red herring. The question is not usefully framed as a matter of which energy use to scrutinize. For those who think that energy is at all a problem (cost, pollution, whatever), all uses can and should be examined for efficiency opportunities. Virtually all of them (lighting, heating/cooling, fridges, etc.) have been addressed extensively over the past 30 years, with huge improvements in efficiency and gains for consumers. Gaming PCs are one of the few loads that have not yet had the benefit of a close look from the energy community. In any case, the typical gaming PC in the study uses about 1/3 as much electricity as the average California home, which is not trivial. The typical gaming PC costs from $140 to $560/year to run, depending on energy prices (https://sites.google...ast/cost-carbon), also not trivial, and this does not include what can be a comparable bump in air-conditioning use if you're in a hot climate (in a cold climate the waste heat is sometimes useful, but it is usually not well distributed through the house).
Q: The GPU is never enough, you buy as much as you can and it will be outdated pretty fast anyway.
Q: The synthetic benchmarking software used to test performance of the PCs in the studies is not an accurate reflection of real life.
Q: I just wanted to let you know that a lot of your data is wrong or self contradicting on your website
"The energy use of a single typical gaming PC is equivalent to the energy use of 10 game consoles"
Well, in your own diagrams you reference a 450-watt PC, yet a PS4 and Xbox One both pull around 110 to 140 watts. A gaming PC does not pull ten times either of these systems, especially not a "typical" one. You've either purposefully altered the data or not taken the time to do the research. Luckily ExtremeTech did - Ref: http://bit.ly/1jfQ4sr
A: You're mixing up some very important units here. We are discussing energy (Wh), not power (W). Average utilization of gaming PCs and consoles is different (not to mention the different levels of load in each mode of operation -- see article), so energy use does not at all scale with nameplate power. Also, our methodology was to characterize typical equipment in the stock rather than random, non-representative examples. The peer-reviewed literature indicates significantly lower average power for consoles than your example. The energy (kWh) gap is even greater because national-average gaming PC use is 4.4 h/day while average console use is 2.2 h/day (see the literature cited in our article). Yes, there will always be large variations around the means (cars, fridges, gaming PCs, you name it), but means must be used for macro-level analyses.
By the way, we actually reviewed the NRDC report that you referenced, and it is flawed on the high side - see footnotes 1 and 15 in our paper. NRDC received pre-publication comments pointing out the problems and disregarded them. There are much more careful studies in the literature.
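The energy-versus-power distinction can be made concrete with a back-of-the-envelope comparison of daily energy use. All loads and the console standby figure below are illustrative assumptions, not the study's measured values:

```python
# Why energy (Wh) doesn't scale with nameplate power (W): usage hours and
# per-mode loads differ between platforms. All figures assumed for illustration.
pc_wh_per_day = 400 * 4.4 + 140 * 11.6     # gaming load + lighter non-gaming use
console_wh_per_day = 110 * 2.2 + 1 * 21.8  # active play + low standby draw
ratio = pc_wh_per_day / console_wh_per_day
print(f"{ratio:.1f}x")  # roughly an order of magnitude, despite ~4x nameplate power
```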
Q: That's fine, but at what cost? You talk about building a PC using more efficient components. Is it even beneficial for a person using a 2 generation old graphics card to spend 200-300 dollars to save 10-50 dollars a year? I would say no. Just buy the latest tech and you'll save energy.
A: This is not properly framed. No one recommends scrapping equipment that has remaining service life; at issue is the incremental cost of more versus less efficient gear at the time of normal replacement (as well as no-cost operational fixes).
Q: CPU the bottleneck in gaming? On what planet? You can get a dual-core unlocked Pentium that can sometimes be found for as low as $45 and overclock it for minimal loss in GPU perf, despite it being just dual core with less cache. Anyone claiming the CPU as a bottleneck clearly hasn't even spent 15 minutes researching the topic. You need to cripple the CPU perf hard and game at low res for the CPU to be the bottleneck. If anything, most of the time the CPU is overdimensioned for gaming, but that's, at least in part, because that CPU perf is needed for other tasks. And again, price is always a factor.
A: Earth, third stone from the Sun. You seem to be misreading the work and overgeneralizing a bit. In a real-world game, a config like this could produce a hit of 10-20 FPS, which is not trivial to most gamers, though this will be game-specific. Anyway, we don't make any assumptions about bottlenecking in our base machine or subsequent analysis; it is just a side comment describing one of many potential issues.
Q: What about cost? Of course the latest tech is going to be more power efficient, but new also means expensive. The Nvidia 900 series and the AMD 300 series aren't cheap and the cost of the card doesn't outweigh the cost of the electricity.
A: The latest tech is by no means automatically more power-efficient. If you look at the field you'll see little correlation between price and energy efficiency. In any case, we're not sure we agree with your characterization of the 970, as it is often referred to as the "Performance King", indicating market approval of the price-performance relationship (even before operating cost is considered). The AMD 300 doesn't seem like a good way to make your point either, as it is a mere refresh: no meaningful performance bump over the 200 series, yet it draws more power at a higher price. This is not an isolated example in the world of energy-using technology, where a more expensive but otherwise 'equivalent' product can actually use more energy. Consider the Nvidia 980 Ti vs. the AMD 295X2: per gpuboss.com, the former actually performs better by Passmark, has half the nameplate power (~250 W vs. ~500 W), and carries a substantially lower price (~$650 vs. ~$1,000).
Q: PSUs are a bit oversized but efficiency is a high priority for buyers and there is pretty good data available to the DIY scene.
Q: G-Sync and FreeSync have a rather high premium, not to mention that nobody, absolutely nobody buys such a screen to be able to use a lesser GPU.
Q: Plus GPU makers do make efforts to be more efficient and most of the share Nvidia gained in the last 1 year+ was because Nvidia was more power efficient than AMD.
Q: This whole argument doesn't even matter unless people can maintain a decent system, for very cheap, to save maybe 50 dollars a year. You cite "hundreds" of dollars could be saved each year with more efficient hardware. This is wrong, let me explain. Average cost per KW is 12 cents per KW. Ref. http://n.pr/1JKAbM6 Assuming I'm gaming 22 hours per week Ref: http://bit.ly/1gvHsTU That means I game 1144 hours per year. 1144 x 0.12 = $137.28. This is not hundreds. Also spending more money to reduce this cost, does not eliminate the cost. The end result is that people have less money, power still gets consumed, and it's bad all around.
A: Again, we're talking energy here, not power. You've mixed up your calculation: cost isn't simply the product of hours per year and price; your equation implicitly assumes a sustained 1 kW load for the machine, which is clearly high. You've also assumed zero load for all non-gaming hours, which would mean you unplug your machine when not gaming and don't use it for any other purpose ... but that is not representative of the typical use case we are modeling. You need to include average load (in kW) in the equation, which is the time-weighted average across all modes of operation (see article); that was somewhere between 0.2 and 0.3 kW for our test machines.
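The correction above can be shown side by side in code. The 0.2 kW average load below is a hypothetical value from the 0.2-0.3 kW range mentioned, and the corrected figure is only as good as that assumption:

```python
PRICE = 0.12            # $/kWh, per the comment
gaming_hours = 22 * 52  # 1144 h/year, per the comment

# Commenter's equation: hours x price. This implicitly assumes a constant
# 1 kW draw while gaming and zero load for the rest of the year.
commenter_cost = gaming_hours * PRICE     # $137.28

# Corrected: time-weighted average load (kW) x hours in a year x price.
avg_load_kw = 0.2                         # assumed, within the 0.2-0.3 kW range
kwh_per_year = avg_load_kw * 8760         # 1752 kWh/year
corrected_cost = kwh_per_year * PRICE     # ~$210/year
```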
Q: I think the consumer information out there is perfectly adequate.
Q: It's not gonna happen. Gaming computers will continue to consume insane amounts of power for as long as they exist. Compared to what we had only 5 years ago, the power-to-performance ratio has more than doubled. But at the same time we've used that increase in efficiency to get double the performance at the same power ratings rather than the same performance at half the power consumption. It's not like we finally hit a point where 'computers are good enough and don't need to be improved'; if anything, it's the complete opposite. We NEED more computational power. Not want, NEED. We are still well within the territory of hardware holding back what can be achieved with software. Until I can run F@H simulations in real time instead of in nanoseconds per day, graphics cards will continue to consume upwards of 300W and there's nothing that branding or labels can do about it. Instead of trying to do the impossible, how about you put some effort into affordable solar panels to cancel out the power used by my computer instead? I would love to power my home with solar, but I don't have several thousand dollars lying around. Don't blame computers for using power, blame local power companies for continuing to rely on fossil fuels and charge asinine rates for something as basic as electricity. The technology already exists and we could easily generate more than enough power for the whole world using renewable resources, so don't blame us for the failures of those who actually have the ability to bring about change.