Next Generation Economy and Policy Modelling
A Conversation
Abstract:
The current generation of models used to analyze the economy by policymakers, central banks, Wall Street firms, and businesses consists largely of either structural models, which view the economy at a highly aggregate level and rest on numerous unrealistic assumptions, or econometric models, which rely on the future resembling the past. In contrast, agent-based models enable analysis of economic and social systems from the “bottom up”, capturing heterogeneity, dynamics, system evolution, and a much richer view of behavior and institutions. Furthermore, these models can take advantage of the vast quantities of micro data that are now available (e.g. agent behavioral data, supply chain data, transaction data). There are also exciting opportunities to integrate developments in AI and ML. We will discuss how such next-generation models are being developed and their potential to yield new insights for policymakers on issues such as central bank policy, climate change, and inequality, and for businesses on issues ranging from supply chain robustness to demand forecasting, insurance risk, and investment strategy.
Bio:
Eric Beinhocker is Professor of Public Policy at the Blavatnik School of Government and Executive Director of the Institute for New Economic Thinking at the University of Oxford. Prior to Oxford he had a 25-year career in technology, venture capital, business, and public policy.
Summary
Agent-based modeling:
Represent micro-behavior by heterogeneous interacting agents
Complex macro behavior emerges
Equilibrium models are not accurate
They miss heterogeneity and are biased towards previously observed equilibria
Reality is almost never in equilibrium; something always pushes the system onto some dynamic cycle
Most theoretical games of any real complexity are rarely in equilibrium
Same for economy
Has the potential to be far more realistic than traditional aggregate equilibrium models of the economy
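The core idea above — heterogeneous agents following simple micro-rules, with macro patterns emerging from their interactions — can be illustrated with a minimal sketch. This toy exchange economy is entirely my own illustration (agents, parameters, and the "propensity to spend" rule are invented for the example, not from the talk): agents start with equal wealth, yet unequal aggregate distributions emerge from repeated random transactions.

```python
import random

class Agent:
    """A consumer with its own (heterogeneous) propensity to spend."""
    def __init__(self, wealth, propensity):
        self.wealth = wealth
        self.propensity = propensity  # fraction of wealth spent per step

def step(agents, rng):
    """One round: each agent spends a fraction of its wealth on a random peer."""
    for a in agents:
        spend = a.wealth * a.propensity
        partner = rng.choice(agents)
        a.wealth -= spend
        partner.wealth += spend

rng = random.Random(42)
# heterogeneity: every agent draws a different spending propensity
agents = [Agent(wealth=100.0, propensity=rng.uniform(0.05, 0.5))
          for _ in range(1000)]

for _ in range(200):
    step(agents, rng)

# macro outcome emerges: wealth inequality despite identical starting wealth
wealths = sorted(a.wealth for a in agents)
total = sum(wealths)
top10_share = sum(wealths[-100:]) / total  # wealth share of the top decile
```

Even this crude model shows the point made above: the aggregate distribution is not assumed, it emerges from micro-level interaction, which is exactly what representative-agent equilibrium models average away.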
Challenges for using ABMs
Very complex
Computationally expensive
Hard to validate and tune parameters
Inertia in economics literature
Most current ABMs are “toy models” to demonstrate what they might be able to do. They are not predictive
Traditional structural models are used by most world institutions (central banks, IPCC)
State of the art in ABMs
Real estate model of the UK housing market (INET), adopted by the Bank of England
US firms model (Axtell)
Ways to exit COVID lockdown in a way that balances economic damage and virus transmission (INET)
New ways of validating/tuning model parameters against real data
Goal: a way of testing out various policies
Directions forward:
Models of agents
Fine-grained models
Models of institutions, social networks, physical geography
Describe real-world rules (loans, jobs)
Put everything together to create models of macro phenomena
Create a LEGO set of ABM modules that can be recombined
Combine ABMs and AI:
Sample complex space and learn from data
Use game playing to explore complex space
Train agents by letting them play games
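One way to read "train agents by letting them play games" is to give each agent a simple learning rule and let strategies emerge from repeated play. The sketch below is hypothetical (the minority game, epsilon-greedy rule, and all parameters are my illustration, not a method described in the talk): agents learn action values from payoffs, rewarded for choosing the less crowded side.

```python
import random

rng = random.Random(0)
N = 101              # number of agents
ACTIONS = (0, 1)     # two "sides" to choose between
EPS, LR = 0.1, 0.2   # exploration rate and learning rate

# each agent keeps an estimated payoff for each action, learned from play
values = [[0.0, 0.0] for _ in range(N)]

def choose(v):
    """Epsilon-greedy: mostly exploit the better-looking action, sometimes explore."""
    if rng.random() < EPS:
        return rng.choice(ACTIONS)
    return 0 if v[0] >= v[1] else 1

for _ in range(2000):
    acts = [choose(v) for v in values]
    count1 = sum(acts)
    # minority-game payoff: you win if you picked the less crowded side
    for v, a in zip(values, acts):
        reward = 1.0 if (a == 1) == (count1 < N / 2) else 0.0
        v[a] += LR * (reward - v[a])  # running average of observed payoffs

final = sum(choose(v) for v in values)  # attendance on side 1 after learning
```

The same pattern — simulate, score, update — is how game play can be used to explore strategy spaces far too large to enumerate, which is the connection to AI drawn above.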
Applications:
Economy
Energy zero carbon transition
Insurance
Supply chains
Validation:
Structure of markets can be pulled directly from market data
Aggregate data is too coarse for fine-tuning model parameters
Axtell’s model is a good example of tuning some parameters from aggregate data
Models are best used for short-term forecasting with short-term updating (e.g. like weather models)
Models may be able to give warning signs of likely events
Predict major outcomes of policy choices
Models are most useful when the dynamics are changing
Equilibrium models will tell you very little about the post-transition world
ABMs have more of a chance to reflect the changing environment
What’s a good level of model granularity?
Best level “depends”
Axtell’s FIRM model is roughly 1:1
Need to capture the heterogeneity inherent in the system (bank size, attributes, connectivity to other banks)
Model interpretability
One can connect agents in models to real-world entities
They do work on understanding the state spaces of the model to confirm that the model reaches states that the real world does
They do parameter sweeps to ensure that model outputs move in sensible ways as parameters change
Debugging is still a large challenge
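A parameter sweep of the kind mentioned above can be sketched very simply: run the model across a grid of one parameter and check that aggregate outputs respond in a sensible direction. Everything here is a hypothetical toy (the model, the "GDP proxy", and the sweep grid are invented for illustration): higher spending propensity should produce higher transaction volume.

```python
import random

def simulate(spend_propensity, n_agents=200, steps=100, seed=0):
    """Toy exchange economy; returns average spending volume per step (a GDP proxy)."""
    rng = random.Random(seed)
    wealth = [100.0] * n_agents
    volume = 0.0
    for _ in range(steps):
        for i in range(n_agents):
            spend = wealth[i] * spend_propensity
            j = rng.randrange(n_agents)   # spend on a random peer
            wealth[i] -= spend
            wealth[j] += spend
            volume += spend
    return volume / steps

# the sweep: aggregate activity should rise monotonically with the propensity
sweep = {p: simulate(p) for p in (0.1, 0.2, 0.3, 0.4)}
```

If a sweep like this ever moves the wrong way (e.g. more spending propensity lowering volume), that is a debugging signal — which is one concrete form the "debugging is still a large challenge" point takes in practice.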
Building ABMs from causal inference studies?
Modeling structural breaks
Detection of such breaks from data
Econometric tradition
Role of behavioral economics?
Economists worry that ABMs don’t force people to be rational
But in reality humans behave heuristically
Learn these from lab experiments
Cars Hommes (University of Amsterdam): lab experiments on switching heuristics
People switch heuristics based on how well they work
Model a population of people switching heuristics, calibrated from experiments
Should we build models of human thinking and then work our way up or infer from data and lab experiments?
Relatively few heuristics are good enough to capture major dynamics
Challenge: real systems have an ecology of shifting strategies by multiple agents that interact and change over time
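The heuristic-switching idea above can be sketched as a discrete-choice model: agents pick among a few forecasting rules, with the share using each rule growing when that rule has been forecasting well. The two heuristics, the logit switching rule, and all parameters below are my illustrative assumptions, loosely in the spirit of Hommes-style models rather than a reproduction of his experiments.

```python
import math
import random

# two simple price-forecast heuristics (hypothetical forms)
heuristics = [
    lambda p_last, p_prev: p_last,                      # naive: expect no change
    lambda p_last, p_prev: p_last + (p_last - p_prev),  # trend-following
]

def logit_shares(fitness, beta=2.0):
    """Discrete-choice rule: better-performing heuristics attract more users."""
    m = max(fitness)
    w = [math.exp(beta * (f - m)) for f in fitness]  # subtract max for stability
    s = sum(w)
    return [x / s for x in w]

rng = random.Random(1)
p_prev, p_last = 10.0, 10.0
fitness = [0.0, 0.0]      # discounted (negative squared) forecast errors
shares_history = []

for _ in range(100):
    forecasts = [h(p_last, p_prev) for h in heuristics]
    shares = logit_shares(fitness)
    shares_history.append(shares)
    # market price: share-weighted average expectation plus a small shock
    p_new = sum(s * f for s, f in zip(shares, forecasts)) + rng.gauss(0, 0.1)
    # update each heuristic's fitness from its own forecast error
    for k, f in enumerate(forecasts):
        fitness[k] = 0.7 * fitness[k] - (f - p_new) ** 2
    p_prev, p_last = p_last, p_new
```

Because every agent's switching feeds back into the price that scores the heuristics, the population of strategies co-evolves with the market — a small-scale version of the "ecology of shifting strategies" challenge noted above.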
Collaboration opportunities
Building the LEGO set of ABM components
E.g. great model of US households, model of firms, banks, etc.
Applications:
Make predictions in settings where traditional models don't give sensible answers
Climate change, transition to zero-carbon economy
Supply chain robustness
This takes a multi-disciplinary approach
Software engineering
Data science
Economics
Psychology
The broader academic world