Example and average electricity use patterns associated with Bedtime and Getting home
Average consumption associated with activities
At the end of the working day, we get home, turn on the oven, and perhaps set the washing machine. These daily routines are remarkably synchronised across households: most people tend to do them at roughly the same time. The result is the so-called “Evening Peak”—a period, in the UK typically from 5 to 7 pm, marked by particularly high electricity demand. Meeting this surge requires switching on the most polluting power plants—something we would clearly prefer to avoid as a society.
Reducing carbon emissions in the residential sector, while fully integrating intermittent renewables such as wind and solar, therefore requires either a major advance in energy storage or greater flexibility in household demand. To understand the potential of demand side response (DSR), however, we first need to understand how households actually use electricity. A common approach is non-intrusive load monitoring, which disaggregates the electricity signal into a list of likely appliances in use at any given moment. What this technique cannot tell us is why those appliances are used — how important each activity is to the household at that time. And yet, that is the crucial step in assessing flexibility. To reach that level of understanding, we must shift our focus from appliances to activities, allowing us to build on the extensive body of knowledge developed by social scientists through national time-use studies.
METER was the first study of its kind to collect simultaneous electricity and activity data at the household level. It was developed by Phil Grunewald within Oxford’s Environmental Change Institute, a research group long recognised for applying social-science insight to energy policy. This approach — effectively mining social science for empirical understanding of human behaviour — is essential if we are to design policies grounded in evidence rather than anecdote.
The project’s first aim was to understand how daily activity patterns translate into electricity use. Among our findings was that hot meals are the focal point of a household’s evening — implying that changing energy demand requires expanding the accepted range of food-related practices. We also found that activities are strongly interrelated, and that the electricity signature of each activity varies depending on what else is happening in the household at the same time. These insights carry important implications for bottom-up demand modelling, which attempts to recreate household profiles from “typical” time-use data.
The second stage of METER used these insights to design intervention strategies aimed at shifting demand. Two localised field studies have demonstrated the potential for a 10% reduction in electricity use during peak periods. Because METER combined electricity and activity data, we could decompose these reductions to understand how they occur. For example, we observed that the reduction was achieved primarily by replacing hot meals with snacks and tea: very British, and a small but measurable step towards a more flexible, low-carbon energy system.
METER produced multiple research findings and has gone on to form an integral part of the DSR work at Oxford; see https://energy-use.org/.
Our work on the cover of Nature
The idea that society can be studied with the same analytical precision as matter dates back to Auguste Comte and his notion of social physics. This idea is particularly relevant today, when our preferences and opinions are readily available for analysis through social media data, and when understanding how opinions form — and how they might be influenced — has become a central question for both science and policy.
My main body of work on opinion dynamics was carried out at IFISC (Palma de Mallorca). There, we examined how opinions evolve when individuals interact through complex networks. Our models explored several realistic conditions: the coexistence of multiple networks, the influence of homophily and social imitation, and the tendency of people to connect preferentially with friends of friends rather than strangers. One of our papers, selected as an Editor’s Choice in Physical Review E, illustrates this work. We showed that while a noiseless system tends to fragment into echo chambers — groups of individuals sharing identical opinions — even a small amount of external randomness or influence is enough to prevent this fragmentation, sustaining open-mindedness and a constant level of dialogue across the network.
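To give a flavour of this kind of model, here is a deliberately simplified sketch in Python: a noisy voter model on a fixed Erdős–Rényi graph, in which a randomly chosen individual either copies a random neighbour or, with small probability, picks an opinion at random. The static network and all parameter values are illustrative assumptions, not the coevolving-network models from our papers.

import random
from collections import defaultdict

def noisy_voter(n=100, p=0.1, noise=0.01, steps=100_000, seed=1):
    """Minimal noisy voter model on a random graph (illustrative sketch)."""
    rng = random.Random(seed)
    # Build an Erdos-Renyi graph as an adjacency list.
    neigh = defaultdict(list)
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                neigh[i].append(j)
                neigh[j].append(i)
    state = [rng.choice([0, 1]) for _ in range(n)]
    for _ in range(steps):
        i = rng.randrange(n)
        if rng.random() < noise or not neigh[i]:
            state[i] = rng.choice([0, 1])           # external randomness
        else:
            state[i] = state[rng.choice(neigh[i])]  # social imitation
    return sum(state) / n                           # fraction holding opinion 1

for a in (0.0, 0.01):
    print(f"noise = {a}: final fraction of opinion 1 = {noisy_voter(noise=a):.2f}")

Run as written, the noiseless case typically freezes into full consensus (the printed fraction is 0 or 1), while even 1% noise keeps both opinions in circulation. In the coevolving-network models the frozen state is fragmentation into disconnected like-minded groups rather than global consensus, but the role played by noise is analogous.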
In a separate line of inquiry, we recently published a paper in Nature demonstrating the importance of network connectivity in shaping collective outcomes. We introduced the phenomenon of information gerrymandering, showing how the strategic placement of voters within a communication network can profoundly alter electoral results, particularly in systems requiring a supermajority. In such settings, much as in first-past-the-post elections, splitting votes across multiple candidates on the same side can make strategic voting, choosing a second-best candidate, a rational behaviour. However, because individuals infer their understanding of the broader electorate from their immediate social connections, the network's structure can distort those perceptions, leading people to act on incomplete or misleading information. Recognising and understanding these network-based biases is essential for interpreting how information flows shape democratic outcomes.
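The perception side of this effect can be shown with a toy example: a hand-built influence network, entirely hypothetical and far smaller than the experimental networks in the paper, in which twelve voters are split evenly between two parties yet most of them see one party in the majority among the accounts they follow.

# Twelve voters, an even 6-6 split between parties A and B (hypothetical data).
A, B = list(range(6)), list(range(6, 12))
party = {i: 'A' for i in A} | {i: 'B' for i in B}

follows = {}
for idx, i in enumerate(A):
    # A voters are spread thinly: each listens to three A voters and one B voter.
    follows[i] = [A[(idx + 1) % 6], A[(idx + 2) % 6], A[(idx + 3) % 6], B[idx]]
for i in B[:2]:
    # Two B voters are "cracked" into A-dominated neighbourhoods.
    follows[i] = [A[0], A[1], A[2], B[5] if i == B[0] else B[4]]
for i in B[2:]:
    # The remaining B voters are "packed" together and only hear other B voters.
    follows[i] = [j for j in B if j != i][:4]

true_share = sum(p == 'A' for p in party.values()) / len(party)
print(f"true vote share of A: {true_share:.0%}")
for i in sorted(follows):
    seen = sum(party[j] == 'A' for j in follows[i]) / len(follows[i])
    print(f"voter {i:2d} ({party[i]}) sees A at {seen:.0%} among those they follow")

Despite the genuine 50/50 split, eight of the twelve voters in this toy network perceive party A as holding a clear majority: exactly the kind of locally induced misreading of the electorate that information gerrymandering exploits.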
A Graphical User Interface created in Processing (Java-based) to simulate Probabilistic Cellular Automata, e.g. Toom's PCA
Simple rules can give rise to surprisingly complex and unforeseen behaviour. That is the premise behind Complexity Science — the study of systems that can be described in this way, from the coordinated patterns of flocking birds to the spread of diseases or opinions.
Detecting structure in such systems often requires analysing large amounts of data. This need, in turn, partly drove the development of algorithms now familiar from Machine Learning, as well as the visual and computational demonstrations used to explore system dynamics. In many ways, Complexity Science was one of the main precursors of modern Data Science (for an alternative vision, see the fascinating argument on statistical vs. algorithmic learning in Breiman's Two Cultures paper).
At its core, however, the notion of complexity is rooted in philosophy: how can global behaviours arise that are described differently from the behaviour of the constituent parts? Throughout history, various intellectual movements have revisited this question of emergence. Two periods stand out: first, in the wake of Darwin’s theory of natural selection, when anti-reductionists used the term “emergence” to capture the apparent mystery of life; and second, more recently, with the return to the problem of consciousness that has accompanied the rise of robotics and artificial intelligence.
Complexity Science as we know it truly came into its own with the advent of computation. Its intellectual forefathers, Claude Shannon and Andrey Kolmogorov, proposed that what matters most in understanding such systems is how they transmit and encode information. To this day, information-theoretic quantities such as entropy remain central to analysing large, interconnected systems with many components.
In my PhD, I used an information-theoretic framework to quantify emergence. To measure the extent to which emergent properties were present in a system, we introduced an entropy-based measure called Persistent Mutual Information (PMI). PMI quantifies how much a system’s future can be predicted from its past. We tested it on low-dimensional chaotic dynamical systems and found that it relates closely to the system’s degree of disorder. For the Logistic and Tent Maps, PMI is directly linked to the Lyapunov exponent — a number that measures how rapidly similar initial conditions lead to different outcomes and is one of the hallmarks of chaos.
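As a rough illustration (not the estimator used in the thesis), the Python sketch below computes a finite-block proxy for PMI on the logistic map: the orbit is turned into a binary symbol sequence, and the mutual information between a past block and a future block separated by a short gap is estimated with plug-in entropies. The block length, gap, partition and parameter values are all illustrative choices.

import math
from collections import Counter

def logistic_symbols(n, r, x0=0.3):
    """Iterate the logistic map x -> r*x*(1-x), recording one binary symbol per step."""
    out, x = [], x0
    for _ in range(n):
        x = r * x * (1.0 - x)
        out.append(0 if x < 0.5 else 1)
    return out

def block_entropy(counts, total):
    """Plug-in Shannon entropy (in bits) of an empirical distribution."""
    return -sum(c / total * math.log2(c / total) for c in counts.values())

def past_future_mi(symbols, block=6, gap=1):
    """Mutual information between a length-`block` past and a length-`block` future
    separated by `gap` steps: a finite-length proxy for PMI."""
    past, future, joint = Counter(), Counter(), Counter()
    n = len(symbols) - 2 * block - gap
    for t in range(n):
        p = tuple(symbols[t:t + block])
        f = tuple(symbols[t + block + gap:t + 2 * block + gap])
        past[p] += 1
        future[f] += 1
        joint[p, f] += 1
    return block_entropy(past, n) + block_entropy(future, n) - block_entropy(joint, n)

for r in (3.5, 4.0):  # a periodic regime versus fully developed chaos
    mi = past_future_mi(logistic_symbols(200_000, r))
    print(f"r = {r}: estimated I(past; future) = {mi:.3f} bits")

At r = 3.5 the orbit is periodic, the future is fully determined by the recent past and the estimate is large; at r = 4 the binary symbols behave like independent coin flips, so the estimate stays close to zero. That contrast is the link between predictability, disorder and the Lyapunov exponent described above.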
We also showed that emergence can, in principle, be described as non-uniqueness of space–time phases, and demonstrated how to measure it in a system of Probabilistic Cellular Automata (Toom’s PCA).
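For readers unfamiliar with it, Toom's rule updates every cell of a square lattice to the majority vote of itself and its northern and eastern neighbours, after which each cell is flipped with a small probability. The Python sketch below, a minimal synchronous implementation on a periodic lattice with illustrative lattice size and noise level (and far simpler than the Processing GUI pictured above), hints at the two-phase behaviour: for sufficiently small noise, a run started from all zeros and one started from all ones each remain close to their initial phase.

import random

def toom_step(grid, noise, rng):
    """One synchronous update of Toom's NEC rule on a periodic square lattice:
    each cell becomes the majority of itself, its northern and its eastern
    neighbour, and is then flipped with probability `noise`."""
    n = len(grid)
    new = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            votes = grid[i][j] + grid[(i - 1) % n][j] + grid[i][(j + 1) % n]
            cell = 1 if votes >= 2 else 0
            if rng.random() < noise:
                cell = 1 - cell
            new[i][j] = cell
    return new

rng = random.Random(0)
size, noise = 64, 0.02
for start in (0, 1):
    grid = [[start] * size for _ in range(size)]
    for _ in range(200):
        grid = toom_step(grid, noise, rng)
    frac = sum(map(sum, grid)) / size ** 2
    print(f"started from all-{start}: fraction of 1s after 200 sweeps = {frac:.2f}")

The coexistence of these two stationary behaviours under the same noisy dynamics is the non-uniqueness of space–time phases referred to above.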