FAQs
Answers to questions raised in response to the 2018 Mother Jones article
Reader comments here.
Thanks for reading and sharing your comments. It's good to see the overall consensus that energy use matters and that being energy efficient is a good thing. Most of the confusion and questions I'm seeing are cleared up in the report itself. I realize it's a long slog to read, but the Executive Summary is pretty comprehensive for those with less time. All the nuances can't be captured in a brief article like Bryan's, so we encourage you to look at the report and the other reports in the series (test procedures, etc.) referenced in the notes.
We can offer a few thoughts on some of the comments:
Some readers asked for more context. See Figs 54 and 55 for comparisons to other familiar energy uses in the home. Bottom line: gaming accounts for about 5% of household electricity use and about 20% of the "miscellaneous" category. The average "power plant" used for reference is 500 megawatts.
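For readers who want to reproduce this kind of context calculation, here is a minimal sketch of the "power plant equivalents" arithmetic. The 500 MW plant size is from the report; the annual energy figure and the capacity factor below are illustrative assumptions, not the report's estimates.

```python
# Back-of-envelope: how many 500 MW "average" power plants a given
# annual energy use corresponds to. Capacity factor (average
# utilization) is an assumed value for illustration.
PLANT_CAPACITY_MW = 500
CAPACITY_FACTOR = 0.5   # assumption: plant runs at half its rated output on average
HOURS_PER_YEAR = 8760

def plant_equivalents(annual_twh: float) -> float:
    """Number of 500 MW plants needed to supply annual_twh per year."""
    # One plant's annual output, converted MWh -> TWh
    plant_output_twh = PLANT_CAPACITY_MW * CAPACITY_FACTOR * HOURS_PER_YEAR / 1e6
    return annual_twh / plant_output_twh

# Example: 30 TWh/yr of demand (a hypothetical figure)
print(round(plant_equivalents(30), 1))
```

Under these assumptions each plant delivers about 2.2 TWh per year, so tens of TWh of demand translates into a double-digit number of plants.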
Some commenters are concerned that all the burden of improvement is being placed on the consumer. Certainly there are things consumers can do (shopping with energy in mind, paying more attention to settings, upgrading their systems with more efficient PSUs, etc.), but there are plenty of other parties with key roles to play, ranging from manufacturers to game developers to utilities and governments. See the 7-page discussion of all this beginning on page 13.
Some commenters seemed to assume that the original PC was a very high-end system (price no object). Actually, it ran about $1200. In the subsequent study we looked at the full gamut, but most of the systems were in the $500-1500 range. Our highest-end system actually used as much or less energy than many of the lesser ones. All those details are in the report.
The lack of correlation between frame rates and power is quite real if you look across the broad population of systems and games that we evaluated. Figs 41a-b show the enormous spread across all 1000 parametric tests. This reflects the widely varying energy efficiency of systems and, of course, the quality of the gaming experience as measured by metrics other than FPS. This isn't a surprise; it's true for virtually any product you look at (cars, dishwashers, ...). Fig 31 shows that the power used to play Skyrim varies 21-fold (!) depending on which system it's played on. Of course, if you constrain measurements to a specific system-and-game combo, you'll generally get higher power during gameplay if you bump up the frame rate. But FPS isn't everything; e.g., we saw big reductions in FPS accompanied by big increases in power use when switching to 4K displays (Fig 23).
Some comments embody an assumption that better performance requires more power (F1 versus Volkswagen, etc.). This has been conventional wisdom for ages, but the engineering shows it isn't so cut and dried. Energy efficiency can be improved without reducing performance; in fact, it sometimes improves things. In the case of gaming systems, an efficient rig is quieter and releases less heat. The graphics pipeline can also flow better; we found that many metrics of user experience improved as we made the systems more efficient (Fig 51).
The term "simple" game has definitely thrown people. What we wrote in the actual report (page 5) was "However, we discovered that game genre is not a predictor of energy use; games that appear relatively simple can use comparable amounts of energy as high-intensity games due to the quality of imagery and visual effects used." In the report you can see average power during gameplay for 37 specific games played across 26 different gaming systems (Figs 27-30). Energy use does vary, a lot, depending on which game is being run. A simple-looking game like The Sims actually has a lot of simulation and computation going on behind the scenes. A game like Skyrim has a lot of action, but its graphics are less computationally intensive. In the report we also show how throwing shaders onto Minecraft creates a very big bump (30%) in energy use. Energy use is driven by more subtle factors than the superficial look of the game.
It's tempting to say, "why worry?"; this is a small percentage of total energy. The catch is that virtually everything is a small percentage of the total (refrigerators, dishwashers, industrial pumps, fans in office buildings). There are no silver bullets. To put a dent in greenhouse-gas emissions, society has to simultaneously address a myriad of energy uses. And the good news is that this has been happening; all kinds of programs and policies have been successfully mounted to address fridges and the rest. Energy ratings are ubiquitous now, with gaming gear being a glaring exception. Indeed, when we started, there were no standardized methods of testing and reporting energy use for gaming systems, which was another big reason for undertaking this study. It is important. Someone has to fund and build those 25 billion-dollar power plants, and then pay for the fuel they burn each and every day.
There seems to be some confusion about which modes of gaming-equipment use are included in our numbers. The answer is all of them: gaming, video streaming, web browsing, idle, sleep, etc. We look at the entire duty cycle because the goal is to understand the equipment's overall energy use. The report clearly breaks all this out, so you can see how much is used in gaming mode versus other modes. The proportion of total use in gaming mode varies widely, depending on game and equipment choices.
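The duty-cycle accounting above can be sketched with simple arithmetic: annual energy is the sum, over all modes, of that mode's power draw times its hours of use. The mode powers and daily hours below are purely illustrative assumptions; the report measures these per system and per game.

```python
# Duty-cycle-weighted annual energy use from per-mode power draw (watts)
# and per-mode hours of use per day. All figures here are hypothetical.
def annual_kwh(mode_watts: dict, mode_hours_per_day: dict) -> float:
    """Annual kWh given each mode's power and daily hours."""
    daily_wh = sum(mode_watts[m] * mode_hours_per_day[m] for m in mode_watts)
    return daily_wh * 365 / 1000  # Wh/day -> kWh/yr

# Assumed example profile (not from the report):
modes_w = {"gaming": 300, "browsing": 80, "idle": 60, "sleep": 3}
hours   = {"gaming": 2,   "browsing": 2,  "idle": 4,  "sleep": 16}
print(round(annual_kwh(modes_w, hours)))
```

Note how the non-gaming modes can contribute a substantial share of the total even at much lower power, which is why the whole duty cycle matters.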
One commenter is concerned that we're lumping in non-gaming uses of gaming-like equipment and use cases, and/or ignoring energy use in the network and data centers. Fair questions. As described in the report, we count energy use in the user's equipment as well as in Internet infrastructure and data-center servers (IT and cooling). For the macro numbers, we include only systems used by consumers who game 15 minutes a day or more. We deliberately exclude unrelated uses of similar equipment (cryptocurrency mining, commercial animation or game development, scientific visualization, and GPUs used in supercomputers for other purposes). These are all relevant questions, just not in the scope of our study.
We did our testing with factory and browser power-management settings, and looked at variations in our sensitivity studies. Note that many systems have poor power management in the sense of throttling back power when in non-active modes: the ratio of active to idle power varies from ~1:1 to ~5:1 across the systems (Fig 22). This is something that's baked in and that users can't control.
The energy use by PCs (and displays) not used for gaming was estimated earlier this year by the Fraunhofer Institute for the Consumer Electronics Association. We did net those numbers out of our gaming numbers. Details in the report. We find that gaming equipment uses somewhat more energy than all computers not used for gaming (Fig 55).
There were questions about how cloud gaming could increase overall energy use by 60 to 300% when the computations are shifted out of the home. Indeed, that's exactly the point: we use the word "overall" to mean the energy used in the home, in the network, and in the data center. As one commenter correctly speculated, the incremental energy use at the gaming device is about that of viewing HD video. Remember: there's a lot of data flowing, and the network energy intensity is 0.027 kWh/GB, which really adds up. There's a detailed discussion of cloud gaming energy analysis in the report.
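To see why the network term "really adds up," here is a minimal sketch of the calculation using the 0.027 kWh/GB intensity cited above. The streaming bitrate and session length are illustrative assumptions, not measurements from the report.

```python
# Network energy for a cloud-gaming session at a given stream bitrate.
# Only the 0.027 kWh/GB intensity comes from the discussion above;
# bitrate and hours are assumed example values.
NETWORK_KWH_PER_GB = 0.027

def network_energy_kwh(bitrate_mbps: float, hours: float) -> float:
    """kWh of network energy to stream at bitrate_mbps for `hours`."""
    gigabytes = bitrate_mbps * 3600 * hours / 8 / 1000  # Mbit/s -> GB
    return gigabytes * NETWORK_KWH_PER_GB

# Example: a 2-hour session at an assumed 15 Mbps stream
print(round(network_energy_kwh(15, 2), 2))
```

Even a fraction of a kWh per session, multiplied across daily play and millions of players, shifts the overall total noticeably upward compared with local rendering.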
Can you put your estimate of global energy use for gaming in some kind of context?
Why is this so much energy?
How can gamers benefit from your research? I see a few products named, but is there enough info for them to build an energy-efficient gaming machine?
Are gamers interested in saving energy, or are mandates necessary?
What do you plan to do next in this research area?