FAQs

Answers to questions raised in response to the 2018 Mother Jones article 
    Reader comments here.

    Thanks for reading and sharing your comments.  It's good to see the overall consensus that energy use matters and that being energy efficient is a good thing.  Most of the confusion and questions I'm seeing are cleared up in the report itself.  I realize it's a long slog to read, but the Executive Summary is pretty comprehensive for those without more time.  All the nuances can't be captured in a brief article like Bryan's, so we encourage you to look at the report and the other reports in the series (test procedures, etc.) referenced in the notes.

    We can offer a few thoughts on some of the comments:

    Some readers asked for more context.  See Figs 54 and 55 for comparison to other familiar energy uses in the home.  Bottom line: gaming is about 5% of household electricity use and about 20% of the "miscellaneous" category.  The average "power plant" used for reference is 500 megawatts (see https://drive.google.com/file/d/0B1s8219SGDIjMG5IamxRckM2QVk/view).

    Some expressed concern that all the burden of improvement is being placed on the consumer. Certainly there are things consumers can do (shopping with energy in mind, paying more attention to settings, upgrading their systems with, e.g., more efficient PSUs), but there are plenty of other parties with key roles to play, ranging from manufacturers to game developers to utilities and governments.  See the 7-page discussion of all this beginning on page 13.

    Some commenters seemed to assume that the original PC was a very high-end system (price no object).  Actually, it ran about $1200.  In the subsequent study we looked at a full gamut, but most of the systems were in the $500-1500 range.  Our highest-end system actually used as much or less energy than many of the lesser ones.  All those details are in the report.

    The lack of correlation between frame rates and power is quite real if you look across the broad population of systems and games that we evaluated.  Fig 41a-b shows the enormous spread across all 1000 parametric tests.  This reflects the widely varying energy efficiency of systems and, of course, the quality of the gaming experience by metrics other than FPS. This isn't a surprise; it's true for virtually any product you look at (cars, dishwashers, ...).  Fig 31 shows that the power use of Skyrim varies by 21-fold (!) depending on which system it's played on.  But, of course, if you constrain measurements to a specific system-and-game combo, you'll generally get higher power during gameplay if you bump up the frame rate. But FPS isn't everything; e.g., we saw big reductions in FPS accompanied by big increases in power use when switching to 4K displays (Fig 23).

    Some comments embody an assumption that better performance requires more power (F1 versus Volkswagen, etc.).  This has been conventional wisdom for ages, but engineering shows it's not so cut and dried.  Energy efficiency can be improved without reducing performance.  In fact, sometimes it improves things.  In the case of gaming systems, an efficient rig is quieter and releases less heat.  The graphics pipeline can also flow better; we found that many metrics of user experience improved as we made the systems more efficient (Fig 51).

    The term "simple" game has definitely thrown people.  What we wrote in the actual report (page 5) was "However, we discovered that game genre is not a predictor of energy use; games that appear relatively simple can use comparable amounts of energy as high-intensity games due to the quality of imagery and visual effects used."   In the report you can see average power during gameplay by game for 37 specific games played across 26 different gaming systems (Figs 27-30).  Energy use does vary, a lot, depending on which game is being run. A simple-looking game like The Sims actually has a lot of simulation and computation going on behind the scenes.  A game like Skyrim has a lot of action, but less computationally intensive graphics happening.  In the report we also show how throwing shaders onto Minecraft creates a very big bump (30%) in energy use.  Energy use is driven by more subtle factors than the superficial look of the game.

    It's tempting to say, "why worry?" -- this is a small percentage of total energy.  The catch is that virtually everything is a small percentage of the total (refrigerators, dishwashers, industrial pumps, fans in office buildings).  There are no silver bullets.  To put a dent in greenhouse-gas emissions, society has to simultaneously address a myriad of energy uses.  And the good news is that this has been happening; all kinds of programs and policies have been successfully mounted to address fridges and the rest.  Energy ratings are ubiquitous now, with gaming gear being a glaring exception.  Indeed, when we started, there were no standardized methods of testing and reporting energy use for gaming systems, which was another big reason for undertaking this study.  It is important.  Someone has to fund and build those 25 billion-dollar power plants, and then pay for the fuel they burn each and every day.

    There seems to be some confusion about which modes of gaming-equipment use are in our numbers. The answer is all of them: gaming, video streaming, web browsing, idle, sleep, etc.  We look at the entire duty cycle because the goal is to understand the equipment's overall energy use. The report clearly breaks all this out, so you can see how much is used in gaming mode versus other modes. The proportion of total use in gaming mode varies widely, depending on game and equipment choices.

    One commenter is concerned that we're lumping in non-gaming uses of gaming-like equipment and use cases, and/or ignoring energy use in the network and data centers.   Fair questions.  As described in the report, we count energy use in the user's equipment as well as in Internet infrastructure and servers in data centers (IT and cooling).  For the macro numbers, we include only systems used by consumers who game 15 minutes a day or more.  We deliberately exclude unrelated uses of similar equipment (cryptocurrency mining, commercial animation or game development, scientific visualization, or GPUs used in supercomputers for other purposes). These are all relevant questions, just not in the scope of our study.
     
    We did our testing with factory and browser power-management settings, and looked at variations in our sensitivity studies.  Note that many systems have poor power management, in the sense of failing to throttle back power when in non-active modes; the ratio of active to idle power varies from ~1:1 to ~5:1 across the systems (Fig 22). This is something that's baked in and that users can't control.
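    To see why that active-to-idle power ratio matters so much over a full duty cycle, here is a minimal sketch.  The power draws and hours below are illustrative assumptions, not measurements from the study:

```python
# How the active:idle power ratio drives annual energy (illustrative numbers).
# Both hypothetical systems draw 300 W while gaming 4.4 h/day and sit idle
# otherwise; only the idle draw differs.
def annual_kwh(active_w, idle_w, active_h=4.4):
    daily_wh = active_w * active_h + idle_w * (24 - active_h)
    return daily_wh * 365 / 1000

poor_pm = annual_kwh(300, idle_w=300)  # ~1:1 ratio: barely throttles back
good_pm = annual_kwh(300, idle_w=60)   # ~5:1 ratio: effective power management

print(f"poor: {poor_pm:.0f} kWh/yr, good: {good_pm:.0f} kWh/yr")
```

    With these assumed numbers, the system that fails to throttle back uses roughly three times the annual energy of the one with an effective 5:1 ratio, even though they are identical during gameplay.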

    The energy use by PCs (and displays) not used for gaming was estimated earlier this year by the Fraunhofer Institute for the Consumer Electronics Association.  We netted those numbers out of our gaming numbers.  Details in the report.  We find that gaming equipment uses somewhat more energy than all computers not used for gaming (Fig 55).

    There were questions about how cloud gaming could increase overall energy use by 60 to 300% when the computations are shifted out of the home. Indeed, that's exactly the point: we use the word "overall" to mean the energy use in the home, in the network, and in the data center.  As one commenter correctly speculated, the incremental energy use at the gaming device is about that of viewing HD video. Remember: there's a lot of data flowing, and the network energy intensity is 0.027 kWh/GB, which really adds up.  There's a detailed discussion of the cloud gaming energy analysis in the report.
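    As a rough check on the network term, one can multiply an assumed stream bitrate by the 0.027 kWh/GB intensity from the report.  The 15 Mbps bitrate here is an illustrative assumption, not a figure from the study:

```python
# Network energy for an hour of cloud gaming, using the report's intensity
# figure (0.027 kWh/GB). The stream bitrate is an illustrative assumption.
bitrate_mbps = 15      # assumed HD-quality stream
kwh_per_gb = 0.027     # network energy intensity, from the report
hours = 1

gb_transferred = bitrate_mbps * 1e6 / 8 * 3600 * hours / 1e9
network_kwh = gb_transferred * kwh_per_gb

print(f"~{gb_transferred:.1f} GB/h -> ~{network_kwh:.2f} kWh of network energy per hour")
```

    Under these assumptions the network alone accounts for on the order of 0.2 kWh per hour of play, which is comparable to the entire wall draw of a modest gaming PC -- illustrating why shifting computation out of the home doesn't make the energy go away.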

    Can you put your estimate of global energy use for gaming in some kind of context? 
    • 75 billion kilowatt hours per year is like 25 standard electric power plants.  It's also like 160 million refrigerators, globally.  Or, 7 billion LED light bulbs running 3 hours per day -- that's one LED light bulb for every man, woman, and child on the planet.
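    A back-of-envelope check of those equivalences can be sketched as follows.  The plant capacity factor, refrigerator consumption, and bulb wattage are our own illustrative assumptions, not figures from the report:

```python
# Back-of-envelope check of the equivalences above (assumed figures noted).
gaming_kwh = 75e9  # kWh/year, from the study

# "Standard" plant: 500 MW nameplate; the capacity factor is an assumption.
plant_mw = 500
capacity_factor = 0.68  # assumed fleet-average utilization
plant_kwh_per_year = plant_mw * 1000 * 8760 * capacity_factor
plants = gaming_kwh / plant_kwh_per_year

# Refrigerator: ~470 kWh/year is an assumed typical figure.
fridges = gaming_kwh / 470

# LED bulb: ~10 W (assumed), running 3 hours/day.
bulb_kwh_per_year = 10 / 1000 * 3 * 365
bulbs = gaming_kwh / bulb_kwh_per_year

print(f"{plants:.0f} plants, {fridges/1e6:.0f} M fridges, {bulbs/1e9:.1f} B bulbs")
```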
    Why is this so much energy?
    • Two main factors.  First is the fact that the internal components all draw far more power when working hard, particularly the graphics card, followed by the CPU and then the motherboard.  The power supply is also particularly important: it has its own inefficiencies, and every bit of electricity used in the PC passes through it, so the losses are compounded.  That's overlain by the fact that the gaming PC spends much more time in peak-power mode (CPU and GPU firing) because of the nature of gaming -- the national average is 4.4 hours/day (nearly 20% of the day).  For a standard PC it's far, far less, because web browsing and things like that are not compute-intensive (about 1% of the time at peak).
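    The time-at-peak point can be made concrete with a small calculation.  Only the 4.4 hours/day and ~1% figures come from the study; the power draws are illustrative assumptions:

```python
# Rough duty-cycle comparison. Peak and idle power draws are illustrative
# assumptions; the hours at peak (4.4 h/day gaming, ~1% of the day for a
# standard PC) come from the discussion above.
def annual_kwh(peak_w, idle_w, peak_h_per_day):
    daily_wh = peak_w * peak_h_per_day + idle_w * (24 - peak_h_per_day)
    return daily_wh * 365 / 1000

gaming_pc = annual_kwh(peak_w=400, idle_w=60, peak_h_per_day=4.4)
standard_pc = annual_kwh(peak_w=150, idle_w=40, peak_h_per_day=0.24)

print(f"gaming ~{gaming_pc:.0f} kWh/yr vs standard ~{standard_pc:.0f} kWh/yr")
```

    Even with both machines assumed to be on around the clock, the combination of higher peak draw and far more hours at peak gives the gaming PC roughly triple the annual energy in this sketch.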
    How can gamers benefit from your research? I see a few products named, but is there enough info for them to build an energy-efficient gaming machine?
    • Gamers, particularly DIY folks who already build their own machines from scratch, can scrutinize the nameplate power ratings of products that provide the performance levels to which they aspire, and then choose wisely from among those sets.  Measurement is key, and very affordable now given low-cost power meters.  This is important because nameplate ratings usually run well above actual energy use, even at peak load.  It helps people not fall into the trap of oversizing their power supply unit: oversizing has a two-fold downside, namely higher first cost than necessary and potentially lower operating efficiency, leading to (even) higher energy use.  More specific recommendations here.
    • In tandem with including energy performance in the equipment-specification process, gamers can do quite a bit to save power by optimizing settings and shutting down power to specific points that don't need to be "hot" (e.g., empty slots on the motherboard).  Choosing equipment like displays with software correction (e.g., G-Sync) for visual flaws like tearing enables use of less souped-up graphics processing units without any compromise in visual performance.  It's also important not to saddle oneself with bottlenecks that impede the ability of one component to achieve the performance promised by a neighboring one.  A classic example is a CPU that cannot fully feed a GPU; this leads to significant energy waste, in that a lower-power GPU could be paired with the given CPU without any performance hit.  Sometimes equipment and operation considerations dovetail, e.g., with the fanless PSUs that are becoming available, or with intelligent fan controls that only push air when it's needed rather than every minute the machine is running. In-game settings are also very impactful, as is over-clocking.
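    The PSU-oversizing trap mentioned above can be illustrated with a sketch.  The efficiency curve and component draw below are hypothetical (shaped like typical 80 PLUS test data, with weaker efficiency at light load), not measurements from the study:

```python
# Illustrative sketch of the PSU-oversizing downside: an oversized unit runs
# at a lighter load fraction, where efficiency is typically lower, so the
# wall draw is higher for the same components. The curve here is hypothetical.
def psu_efficiency(load_fraction):
    points = {0.1: 0.80, 0.2: 0.87, 0.5: 0.92, 1.0: 0.88}  # assumed curve
    keys = sorted(points)
    # Linear interpolation between the tabulated points.
    for lo, hi in zip(keys, keys[1:]):
        if lo <= load_fraction <= hi:
            t = (load_fraction - lo) / (hi - lo)
            return points[lo] + t * (points[hi] - points[lo])
    return points[keys[0]] if load_fraction < keys[0] else points[keys[-1]]

dc_load_w = 250  # assumed DC draw of the components during gameplay

eff_500 = psu_efficiency(dc_load_w / 500)    # right-sized PSU at 50% load
eff_1000 = psu_efficiency(dc_load_w / 1000)  # oversized PSU at 25% load
wall_500 = dc_load_w / eff_500
wall_1000 = dc_load_w / eff_1000

print(f"500 W PSU: {wall_500:.0f} W at the wall; 1000 W PSU: {wall_1000:.0f} W")
```

    Under these assumptions the oversized unit draws noticeably more power from the wall for the identical load, on top of its higher purchase price.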
    Are gamers interested in saving energy, or are mandates necessary?
    • Like anywhere you look today, there is a mix of sentiments.  This segment is arguably a bit more challenging for a couple of reasons.  First, for the younger gamer set, the user and specifier isn't usually the one actually paying the utility bills (that would be mom and dad, ahem).  Also, the information environment is terrible (much of the data required to make good choices doesn't exist or is a pain to gather).  There is also a mythology that higher performance automatically requires higher energy input.  The opposite has been borne out again and again; think of the Tesla.  The data in our paper show that this is clearly not the case for gaming computers either.  Like many in society, some gamers' gut reaction is to point the finger at other users of energy (Hummers, cement factories, ... take your pick).  The reality is that there is no silver bullet, and we need to pursue savings anywhere and everywhere we can find them. Gaming computers are certainly playing catch-up with the rest of the technology space, where standards, incentives, and a more vibrant energy-information environment are more the status quo. Short of standards is transparency in product labeling.  Currently, only PSUs and displays have labels.  Voluntary labeling could be brought into the picture elsewhere.  Regulators will observe customers and markets and step in with standards where they find that the market is not doing the economically optimal thing on its own, but standard-setting would be extremely challenging given the difficulty of quantifying how user experience might change across various energy factors.  Time will tell in this case.
    What do you plan to do next in this research area?
    • There are many frontiers and questions.  What's the range of energy use across different classes of gaming equipment?  How much effect does time in gameplay have on consumption?  Does over-clocking impact energy use? What about VR?  What about cloud gaming?  Does choice of game make a difference?  Certainly more measurement of individual components is needed. Equally important is more granular segmentation of the market so that baseline energy demand and scenarios can be developed for specific geographies and user categories.  For example, we didn't even look at workplace use of these kinds of machines.  It's important to study the "load shape" of gaming to better understand how the hourly patterns of use coincide (or not) with the utility's most challenging periods of peak demand. Another need is for standardized benchmarking and energy test procedures.  The industry of course does that well on the performance side (Unigine, etc.), but not on the energy-metering side.  As a result, one runs into all sorts of conflicting "results" for identical configs. Given the rapid pace of technology change, the space needs to be tracked closely.  Last but not least, a critical next step would be looking into the cost-performance relationships among components and how much investment is needed to gain energy savings.  There is a popular misconception that any and all energy savings beyond business-as-usual are too costly; this needs to be debunked.  We were able to address most of these questions in our follow-on study.