Can you put your estimate of global energy use for gaming in some kind of context? 
  • 75 billion kilowatt-hours per year is roughly the output of 25 standard electric power plants.  It's also equivalent to the consumption of 160 million refrigerators, globally.  Or, 7 billion LED light bulbs running 3 hours per day -- that's one LED light bulb for every man, woman, and child on the planet.
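A back-of-envelope check of these equivalences, using assumed reference values (a 500 MW plant at a 70% capacity factor, a refrigerator drawing ~470 kWh/yr, a 10 W LED bulb -- these reference figures are illustrative assumptions, not values from the study):

```python
# Sanity-check the equivalences for 75 billion kWh/yr (= 75 TWh/yr).
# All reference values below are assumptions for illustration.

GAMING_TWH_PER_YEAR = 75.0

# One "standard" 500 MW plant at a 70% capacity factor:
plant_twh = 500e6 * 8760 * 0.70 / 1e12        # ~3.07 TWh/yr per plant
plants = GAMING_TWH_PER_YEAR / plant_twh      # ~24 plants

fridge_kwh = 470.0                            # assumed annual use per fridge
fridges = 75e9 / fridge_kwh                   # ~160 million refrigerators

led_kwh = 0.010 * 3 * 365                     # 10 W bulb, 3 h/day ~ 11 kWh/yr
leds = 75e9 / led_kwh                         # ~6.8 billion bulbs

print(round(plants), round(fridges / 1e6), round(leds / 1e9, 1))
```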
Why is this so much energy?
  • Two main factors.  First, the internal components all draw far more power when working, particularly the graphics card, followed by the CPU and then the motherboard.  The power supply is also particularly important: it has its own inefficiencies, and because every bit of electricity used in the PC passes through it, those losses are compounded.  On top of that, the gaming PC spends much more time in peak power mode (CPU and GPU firing) because of the nature of gaming -- the national average is 4.4 hours/day (nearly 20% of the day).  For a standard PC it's far, far less, because web browsing and the like are not compute-intensive (about 1% of the time at peak).
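The duty-cycle effect can be sketched with a toy annual-energy model. Only the hours at peak (4.4 h/day versus ~1% of the day) come from the text above; the wattages are illustrative assumptions, not measured values:

```python
# Toy model: annual energy for an always-on PC with a peak and an idle mode.
# Wattages are assumed for illustration; hours at peak come from the text.

def annual_kwh(peak_w, idle_w, peak_hours_per_day):
    idle_hours = 24 - peak_hours_per_day
    return (peak_w * peak_hours_per_day + idle_w * idle_hours) * 365 / 1000

gaming   = annual_kwh(peak_w=400, idle_w=60, peak_hours_per_day=4.4)
standard = annual_kwh(peak_w=110, idle_w=40, peak_hours_per_day=0.24)  # ~1% of the day

print(round(gaming), round(standard))  # gaming rig uses roughly 3x as much here
```

Even this crude model, which ignores sleep states, shows how higher component power combined with far more hours at peak multiplies annual consumption.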
How can gamers benefit from your research. I see a few products named, but is there enough info for them to build an energy efficient gaming machine?
  • Gamers, particularly DIY folks who already build their own machines from scratch, can scrutinize the nameplate power ratings of products that provide the performance levels to which they aspire, and then choose wisely from among those sets.  Measurement is key, and very affordable now given low-cost power meters.  This is important because nameplate ratings usually run well above actual energy use rates, even at peak load.  It also helps people avoid the trap of oversizing their power supply unit (PSU); oversizing has a two-fold downside: higher first cost than necessary, and potentially lower operating efficiency, leading to (even) higher energy use.  More specific recommendations here.
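The oversizing penalty can be sketched numerically. PSU efficiency typically peaks near 50% load and falls off at low load; the curve points below loosely follow the 80 Plus Gold certification levels (87%/90%/87% at 20%/50%/100% load), with the steep drop below 20% load an assumption for illustration:

```python
# Sketch: why an oversized PSU can waste energy at the wall.
# Efficiency curve points loosely follow 80 Plus Gold; the low-load
# falloff below 20% is an assumed shape, not a measured curve.

def psu_efficiency(load_fraction):
    points = [(0.05, 0.70), (0.20, 0.87), (0.50, 0.90), (1.00, 0.87)]
    lf = max(points[0][0], min(load_fraction, 1.0))
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        if lf <= x1:  # piecewise-linear interpolation
            return y0 + (y1 - y0) * (lf - x0) / (x1 - x0)
    return points[-1][1]

def wall_watts(dc_watts, psu_rating_watts):
    return dc_watts / psu_efficiency(dc_watts / psu_rating_watts)

# A rig whose components draw 200 W DC:
right_sized = wall_watts(200, 400)   # 50% load -> ~90% efficient
oversized   = wall_watts(200, 1200)  # ~17% load -> lower efficiency

print(round(right_sized), round(oversized))
```

Same components, same performance, but the oversized unit pulls more power from the wall at every operating point on the low end of its range.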
  • In tandem with including energy performance in the equipment specification process, gamers can do quite a bit to save power by optimizing settings and shutting down power to specific points that don't need to be "hot" (e.g., empty slots on the motherboard).  Choosing equipment like displays with software correction for visual flaws like tearing (e.g., G-Sync) enables use of less souped-up graphics processing units without experiencing any compromise in visual performance.  It's also important not to saddle oneself with bottlenecks that impede the ability of one component to achieve the performance promised by a neighboring one.  The classic example is a CPU that cannot fully feed a GPU.  This leads to significant energy waste, in that a lower-power GPU could be paired with the given CPU without any performance hit.  Sometimes equipment and operation considerations dovetail, e.g., with the fanless PSUs now becoming available, or with intelligent fan controls that only push air when it's needed rather than every minute the machine is running.  In-game settings are also very impactful, as is over-clocking.
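The CPU/GPU bottleneck example can be made concrete with a toy model: frame rate is limited by the slower partner, so GPU headroom beyond what the CPU can feed it adds power draw without adding frames. All numbers here are illustrative assumptions:

```python
# Toy bottleneck model: delivered fps = min(CPU cap, GPU cap), and GPU
# power is assumed to scale linearly between an idle floor and peak
# with utilization. All figures are illustrative, not measurements.

def fps_and_watts(cpu_fps_cap, gpu_fps_cap, gpu_idle_w, gpu_peak_w):
    fps = min(cpu_fps_cap, gpu_fps_cap)       # slower partner limits frame rate
    utilization = fps / gpu_fps_cap           # GPU can't be fed past this point
    gpu_w = gpu_idle_w + (gpu_peak_w - gpu_idle_w) * utilization
    return fps, round(gpu_w)

# CPU can feed 90 fps either way:
big_gpu   = fps_and_watts(90, gpu_fps_cap=160, gpu_idle_w=60, gpu_peak_w=320)
small_gpu = fps_and_watts(90, gpu_fps_cap=100, gpu_idle_w=40, gpu_peak_w=200)

print(big_gpu, small_gpu)  # same fps, but the bigger GPU burns more watts
```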
Are gamers interested in saving energy, or are mandates necessary?
  • Like anywhere you look today, there is a mix of sentiments.  This segment is arguably a bit more challenging for a couple of reasons.  First, for the younger gamer set, the user and specifier usually isn't the one actually paying the utility bills (that would be mom and dad, ahem).  Also, the information environment is terrible: much of the data required to make good choices doesn't exist or is a pain to gather.  There is also a mythology that higher performance automatically requires higher energy input.  The opposite has been borne out again and again; think of the Tesla.  The data in our paper show that this is clearly not the case for gaming computers either.  Like many in society, some gamers' gut reaction is to point the finger at other users of energy (Hummers, cement factories, ... take your pick).  The reality is that there is no silver bullet, and we need to pursue savings anywhere and everywhere we can find them.  Gaming computers are certainly playing catch-up with the rest of the technology space, where standards, incentives, and a more vibrant energy-information environment are closer to the status quo.  Short of standards is transparency in product labeling.  Currently, only PSUs and displays carry labels; voluntary labeling could be brought into the picture elsewhere.  Regulators will observe customers and markets and step in with standards where they find that the market is not doing the economically optimal thing on its own, but standard-setting would be extremely challenging given the difficulty of quantifying how user experience might change across various energy factors.  Time will tell in this case.
What do you plan to do next in this research area?
  • There are many frontiers and questions.  What's the range of energy use across different classes of gaming equipment?  How much effect does time in gameplay have on consumption?  Does over-clocking impact energy use?  What about VR?  What about cloud gaming?  Does choice of game make a difference?  Certainly more measurement of individual components is needed.  Equally important is more granular segmentation of the market so that baseline energy demand and scenarios can be developed for specific geographies and user categories.  For example, we didn't even look at workplace use of these kinds of machines.  It's important to study the "load shape" of gaming to better understand how the hourly patterns of use coincide (or not) with the utility's most challenging periods of peak demand.  Another need is for standardized benchmarking and energy test procedures.  The industry of course does that well on the performance side (Unigine, etc.), but not on the energy-metering side.  As a result, one runs into all sorts of conflicting "results" for identical configs.  Given the rapid pace of technology change, the space needs to be tracked closely.  Last but not least, a critical next step would be looking into the cost-performance relationships among components and how much investment is needed to gain energy savings.  There is a popular misconception that any and all energy savings beyond business-as-usual are too costly; this needs to be debunked.  We were able to address most of these questions in our follow-on study.