Everyone’s talking about AI.
And rightly so. Its impact is already here, and the implications are massive.
How to use it? How to invest in it? How far will AI's disruptive capacity extend?
These are important questions and there is no single answer.
To help me put the pieces together, I think of AI as a new economic agent. One with a massive comparative advantage over humans in terms of “providing accurate, comprehensive responses across a broad domain of topics faster than any human can possibly hope.”
Ultimately, its reach will be governed by the fact that humans will always be the market. Just what that market consists of is anyone’s guess.
Despite such an uncertain trajectory, I can make one statement with confidence:
AI will move everyone’s cheese. Some will end up with more cheese. Some will end up with less.
And today I want to see if it’s possible for AI to help us all make more cheese…
Wheelbarrow to Assembly Line
When China opened up to the world in the late 70s and early 80s, 818 million Chinese were rural. That was roughly 80% of the population. And, when I say rural, I’m talking banjos and donkeys rural.
They were peasants by any measure, with most working with nothing more than a shovel and wheelbarrow – lest anyone forget where Marxism ultimately leads.
But take that shovel out of their hands and replace it with a button on an assembly line? Now that peasant can produce goods for possibly thousands of people, whereas before he was lucky to produce enough for himself on the farm.
You can see the impact of China’s urbanization-to-fuel-globalization productivity boost in a metric called GDP per capita – or the value of a country’s economic output per person.
From 1980 to 2010, China's GDP per capita soared from $194 to $4,428.
That near 23-fold increase is what I would call a productivity miracle.
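If you want to check my math, here's a quick back-of-the-envelope sketch (the dollar figures are the ones cited above; the implied annual growth rate is my own arithmetic, not an official statistic):

```python
# Back-of-the-envelope check on China's GDP-per-capita boom, 1980-2010.
gdp_1980 = 194      # USD per person, 1980
gdp_2010 = 4_428    # USD per person, 2010
years = 2010 - 1980

fold_increase = gdp_2010 / gdp_1980
annual_growth = fold_increase ** (1 / years) - 1

print(f"Fold increase: {fold_increase:.1f}x")        # ~22.8x -- the "near 23-fold"
print(f"Implied annual growth: {annual_growth:.1%}") # ~11% a year, for 30 years straight
```

Eleven percent a year, compounded for three decades. That's the scale of the miracle.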
Alongside the wild growth in American debt to pay for it, that miracle did more than anything else to make “Crazy Rich Asians” a thing.
But what happens when you’ve already moved beyond wheelbarrows?
The Computer for the Rest of Us
When PCs became commonplace in the early 80s, economists expected a productivity miracle here in the U.S. Now, no one expected a leap of similar magnitude to China's. But surely, putting such a powerful tool on everyone's desk would turn one person into two. Or at least into 1.2 people.
Instead, what we got was something like an extra 4/100ths of a person. And that lackluster boost is at the heart of what economists call the productivity paradox.
You Can Only Get So Fat and AI Can’t Change That
“Another paradox you say, Don?”
Yes. And since we’re talking about AI, I asked ChatGPT to summarize the productivity paradox for you.
The productivity paradox refers to the period of time during which the widespread adoption of information technology, including personal computers, was expected to lead to significant increases in productivity, but in reality, this increase was limited. The concept was first introduced in the late 1980s and continued to be a topic of discussion throughout the 1990s and early 2000s.
I know… informative but dry. But let's give him a break. He's just a few months old.
Anywho, as support for Chatty’s claims, give the chart below a look and you’ll see just how little PCs have done to boost each individual’s contribution to economic growth.
Over the 20-year period following the Macintosh computer's release in early 1984, real GDP per capita in the U.S. (the "real" meaning adjusted for inflation) increased by a measly 4%.
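To see just how measly, here's the same back-of-the-envelope arithmetic applied to that 4% (my own illustration using the figures above, not the chart's underlying data):

```python
# What a 4% total gain in real GDP per capita over 20 years works out to per year.
total_gain = 0.04   # the "measly 4%" from the chart
years = 20

annualized = (1 + total_gain) ** (1 / years) - 1
print(f"Implied annual growth: {annualized:.2%}")  # ~0.20% per year
# That's our extra 4/100ths of a person, spread across two decades.
```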
While the period under examination was slightly above trend (the green, dotted line), that had more to do with the U.S. economy coming out of the Volcker-led recession of the early 80s than with productivity gains.
Were productivity the driver, the gap between the blue and green lines would have widened. But, instead, the gap ended roughly where it began.
Moreover, moving those computers from the desktop to the palm of your hand during the subsequent iPhone years coincided with a clear drop in productivity (i.e., the blue line fell below the trend line, and the gap keeps widening).
It hit an air pocket during the Global Financial Crisis (GFC) and has been losing ground ever since.
During this whole period (say, 1980 to 2020), computer chips went from holding 100,000 transistors each to 50 billion.
And even with that 500,000-fold leap in humanity’s ability to compute, Americans generate scarcely more value today than they did in 1980.
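For the curious, that 500,000-fold figure checks out. A quick sketch (the doubling-time math is my own gloss on the transistor counts above):

```python
import math

# Transistor counts per chip, per the figures above.
transistors_1980 = 100_000
transistors_2020 = 50_000_000_000
years = 2020 - 1980

fold = transistors_2020 / transistors_1980
doublings = math.log2(fold)

print(f"Fold increase: {fold:,.0f}x")                   # 500,000x
print(f"Doublings: {doublings:.1f}")                    # ~18.9
print(f"Doubling time: {years / doublings:.1f} years")  # ~2.1 years -- Moore's law pace
```

Computing power doubled roughly every two years for forty years. GDP per capita barely moved off trend.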
The computer revolution did almost nothing to help humans make more cheese.
To be sure, going from pickaxes to computers produced astounding results (basically the story of Asian emerging markets). But for developed economies, it's more like running to stand still.
In other words, once a market is developed, technology doesn’t make more cheese than the market can consume.
That’s the logic from the demand side.
From the supply side, I’m sure there’s a “resource required versus output desired” graph I could draw to show how much an economy can produce and demonstrate that resources are now the main constraint.
But I won’t. It’s easier to resort to reason and, frankly, common sense.
Whatever work AI produces, it can't produce those goods with less energy and fewer commodities than we use now.
The industrial revolution helped us reach the limits of physics in that regard. Goods may be produced differently. But there won't be meaningfully more goods, services, or content to produce, assuming we could even consume more anyway.
So, I guess there are two statements I can confidently make about AI…
It will move everyone's cheese, but it won't help everyone make more cheddar.