The idea of creating a digital twin of human society is both audacious and fascinating. The concept revolves around building a highly detailed, dynamic simulation of the socio-economic system, taking into account economic, financial, sociological and environmental factors. This “digital twin” can then be used as a forecasting tool and as a sandbox: a place to test hypotheses, policies and financial plans and to predict the potential outcomes of those changes in the real world.
This is not just an academic curiosity; you can hear pundits on business TV or journalists speculating daily on the state of the markets, trade, and economic indicators such as inflation, demand and investment. Individuals, corporations and governments engage in $32 trillion of global trade, $2 trillion of real estate transactions, $42 trillion of equities trading and $133 trillion of bond trading annually. Each of these players relies not only on their understanding of the markets at the moment but also on their expectation of where the economy is heading.
Even thinking of this as an individual: the reason you buy a house, or invest your excess cash in an asset, is a function of what you expect its future value to be. Right now we are at the mercy of economic bubbles and crises.
The idea of simulating society has been around since the beginning of the computing age. Like AI, computational science has improved significantly since then. A digital twin of the global economy is fundamentally a multi-physics, agent-based model. In this model, every economic player (individuals, corporations and firms, institutions and governments) is represented. Fundamentally, all the information we need to prescribe the economic machinery is summarized in these players' income statements and balance sheets. As Ray Dalio puts it, your spending is someone else's income and your liability is someone else's asset. Additionally, it is essential to understand how these agents act within their environment. For this, we need models with parameters calibrated to human behavior. Lastly, the agents are subject to their physical environment, such as the production of energy and commodities and the distribution of assets such as land and natural resources.
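The accounting identity above (your spending is someone else's income) is the backbone of such a model. A minimal, purely illustrative sketch in Python shows the idea: agents carry balance sheets, and every transaction is double-entry, so the books stay consistent by construction. The agent names, cash levels and spending rule here are hypothetical, not drawn from any real model.

```python
import random

class Agent:
    """A toy economic agent with a minimal balance sheet and income statement."""
    def __init__(self, name, cash):
        self.name = name
        self.cash = cash      # asset side of the balance sheet
        self.income = 0.0     # income statement, accumulated over the period
        self.spending = 0.0

def transact(buyer, seller, amount):
    """Move cash from buyer to seller; one side's spending is the other's income."""
    amount = min(amount, buyer.cash)  # an agent can't spend more than it holds
    buyer.cash -= amount
    buyer.spending += amount
    seller.cash += amount
    seller.income += amount

random.seed(0)
households = [Agent(f"household_{i}", cash=100.0) for i in range(5)]
firm = Agent("firm", cash=0.0)

# One simulated period: each household spends a random amount at the firm.
for h in households:
    transact(h, firm, amount=random.uniform(10, 30))

# Accounting identity holds by construction: total spending equals total income.
total_spending = sum(h.spending for h in households)
print(total_spending == firm.income)  # True
```

A real digital twin would scale this to millions of agents with full balance sheets (loans, deposits, inventories), but the double-entry discipline shown here is what keeps the aggregate accounts of the simulated economy consistent.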
In the past few years, there has been rapid advancement in the creation and calibration of agent-based models (ABMs) of the economy. See a review here. The recent breakthrough, compared with the historical development of such models, is that real-world, country-scale models are now available that outperform previous economic models in their forecasting ability. Seminal works by Sebastian Poledna and colleagues are examples of models built for Austria, Canada and the entire Euro Area.
These models are especially powerful for forecasting scenarios outside equilibrium (which, in this turbulent world, is most scenarios). For example, these models predicted the rampant inflation that followed the COVID recession, whereas other tools failed. In addition, given the bottom-up nature of these models, accurate forecasting at the scale of economic sectors, regions or even firms is possible. For example, one can predict how the damage caused by natural disasters or geopolitical tensions propagates across various firms and sectors of the economy, even at an individual level.
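To make the shock-propagation claim concrete, here is a deliberately tiny sketch of how a localized disruption might cascade through a firm-to-firm supply network. The firms, input shares and Leontief-style pass-through rule are all illustrative assumptions, not a real calibration.

```python
# supply_share[buyer][supplier] = fraction of buyer's inputs sourced from supplier
supply_share = {
    "steel_mill":  {},
    "parts_maker": {"steel_mill": 0.6},
    "carmaker":    {"parts_maker": 0.5, "steel_mill": 0.2},
}

def propagate(initial_shock, rounds=10):
    """Iteratively pass output losses downstream through the supply network."""
    loss = dict(initial_shock)
    for _ in range(rounds):
        updated = dict(initial_shock)
        for firm, suppliers in supply_share.items():
            # A firm's induced loss is the input-weighted loss of its suppliers.
            passed = sum(w * loss.get(s, 0.0) for s, w in suppliers.items())
            updated[firm] = min(1.0, updated.get(firm, 0.0) + passed)
        loss = updated
    return loss

# A flood knocks out 50% of the steel mill's output:
impact = propagate({"steel_mill": 0.5})
for firm, x in impact.items():
    print(f"{firm}: {x:.0%} output loss")
# Prints:
# steel_mill: 50% output loss
# parts_maker: 30% output loss
# carmaker: 25% output loss
```

A country-scale ABM does essentially this over millions of firm-level supply links, which is why it can trace a natural disaster or a sanctions regime down to individual sectors and firms.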
First of all, this will not come out of academia or a side project of a central bank alone. I am betting on a tech startup instead.
When there is an acute pain point for a particular industry, the business strategy follows the lean startup approach. However, when there is a revolutionary technology with a wide area of adoption, one needs a different strategy. A great example is OpenAI's development and commercialization of large language models. The thesis was that once a sufficiently powerful AI was created, many benefits would ensue. The company had earlier versions of GPT available as an API, with far less traction. But when ChatGPT came out, the adoption rate of the product surprised even OpenAI. Commercializing a digital twin of the economy would take a similar approach: once the model surpasses a forecasting threshold and is simply accessible to anyone, adoption would grow exponentially.
We are likely still a few years away from building a full-scale model of the global economy. A few minor research questions remain, such as investigating how adding big data (e.g., the credit card spending of millions of people) would improve the forecasting capability of the model. The main challenges are technological: developing production-ready software to handle the calibration and subsequent execution of the model at scale.
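The calibration step mentioned above can be sketched in miniature: search for the behavioral parameter whose simulated aggregate best matches an observed target. Everything here is hypothetical (a single propensity-to-consume parameter, synthetic incomes, a made-up target); real pipelines calibrate many parameters against rich data with Bayesian or surrogate-based methods, so this grid search is only a sketch of the idea.

```python
import random

def simulate_consumption(propensity, n_agents=1000, seed=42):
    """Toy model: aggregate consumption from synthetic household incomes."""
    rng = random.Random(seed)
    incomes = [rng.uniform(20_000, 80_000) for _ in range(n_agents)]
    return sum(propensity * y for y in incomes)

# Hypothetical observed aggregate consumption from national accounts.
observed = 40_000_000.0

# Grid search: pick the propensity that minimizes the gap to the observation.
best = min(
    (p / 100 for p in range(50, 100)),
    key=lambda p: abs(simulate_consumption(p) - observed),
)
print(f"calibrated propensity to consume: {best:.2f}")
```

Scaling this from one parameter to thousands, and from a toy simulator to a full country model, is exactly the production engineering challenge: each calibration step requires running the whole model many times.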
Various data sources also need to be purchased and curated and appropriate data pipelines must be built in production.
We then need to determine the software interface on top of this model. Who will be the early adopters: large financial institutions, governments or retail investors? What are their usage requirements? Do they want a web interface, an API, or something else?
Currently, a few academic groups, along with teams in big consulting firms and central banks, are working on ABMs of the economy. I suggest that our startup adopt an open-source approach to developing the core ABM. This allows participation from all of these groups and eases adoption later on. It also allows open linking of the ABM with companion physical models of supply chains, energy, climate and the environment.