Euphoria over artificial intelligence has seized the technology industry, the stock market, and the world. According to one AI company founder, AI is comparable to the discovery of fire or the wheel. It will soon enable any of us to plan a military invasion, design a hospital, or discover new cancer drugs. AI is already being used by leading retailers like Walmart to better understand consumer trends, write ad copy, and adjust prices.
Last quarter, Google doubled its capital spending to $13 billion, and Microsoft spent $19 billion, most of those funds going toward building AI-focused data centers around the world. Google’s CEO said he worried he might be under-investing.
IMF managing director Kristalina Georgieva recently predicted that AI will hit the global labor force “like a tsunami,” impacting 60% of workers in advanced economies and leading, she said, to a “tremendous increase in productivity.”
AI will be a great benefit to business and to millions of users. If I were still an active investor, Nvidia at a $2.5 trillion market cap would be too rich for my blood (its headquarters, pictured above, is the most glamorous in Silicon Valley today), but I would be buying shares in the makers of the networking gear, memory chips, and other components that enable AI. But for the U.S. economy as a whole, the AI boom is very likely to be a huge productivity disappointment.
That’s right. A disappointment. I base that prediction on our experience with the Internet, which has been a monumental disappointment for the U.S. economy. Millions of individuals found that the Internet enhanced their productivity, but for the economy as a whole the gains never showed up in the statistics. This is not widely appreciated within the technology industry, but it certainly is by the government’s Bureau of Labor Statistics, which has dedicated a team and a section of its website to researching U.S. productivity growth, or lack thereof.
Figures 1 and 2 below come from that website. Figure 1 shows that from 2007 to 2019, and again from 2019 to 2024, U.S. productivity (output per hour worked) grew just 1.5% a year. That’s far below the 2.7% rate of the post-World War II economic boom of 1947-1973, and also below the short-lived 2.8% annual productivity surge from 2000 to 2007. Figure 2 shows that the collapse in productivity growth was even worse in the manufacturing sector, where the growth rate was zero between 2007 and 2019.
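Compounding is what makes those growth rates matter. A short Python sketch makes the gap concrete; the rates are the BLS figures just cited, and the 26-year horizon is simply chosen to match the length of the 1947-1973 boom.

```python
# Back-of-the-envelope compounding: the cumulative effect of the
# productivity slowdown over a 26-year span (the length of 1947-1973).
def cumulative_growth(annual_rate: float, years: int) -> float:
    """Total proportional growth from compounding an annual rate."""
    return (1 + annual_rate) ** years - 1

postwar_pace = cumulative_growth(0.027, 26)  # post-WWII boom rate
recent_pace = cumulative_growth(0.015, 26)   # post-2007 rate

print(f"26 years at 2.7%/yr: +{postwar_pace:.0%}")  # +100%
print(f"26 years at 1.5%/yr: +{recent_pace:.0%}")   # +47%
```

At the postwar pace, output per hour doubles within a working generation; at the post-2007 pace, it rises by less than half.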
Remember, it was 2007 when Steve Jobs first unveiled the iPhone. Long lines appeared instantly outside Apple stores worldwide whenever a new model was launched. In 2007, Apple sold 1.4 million iPhones. By 2012 it was selling 125 million a year. Samsung and others jumped into the business, and in every year since 2014, phone makers have sold over 1.2 billion (that’s a B) smartphones a year. There are roughly 8 billion people on the planet, and in well under a decade, phone makers have sold more phones than there are people.
The Internet did become truly ubiquitous, traveling with you everywhere you went. I was then in the tech industry, working for a maker of long-distance fiber-optic Internet data systems. I remember a lunch in Dallas, where one of our customers whipped out his iPhone to show me he could monitor his nationwide network from his phone. He could check for outages and see how many petabytes of data traffic were traveling between any two cities on his network. The wireless-available-everywhere network seemed like a huge boon to business, leisure, education, and everything else.
Figure 1. U.S. productivity growth has been weak from 2007 to today.
Figure 2. Manufacturing productivity growth was zero from 2007 to 2019 and remains disappointing today.
And yet, the year of the iPhone was the year that the big U.S. productivity slump started. Why?
According to economist Robert Gordon, American prosperity in what he calls the “American Century” (1870-1970) was due to the revolutionary nature of the automobile industry and home electrification, both of which date back to the late 19th century. What he calls Industrial Revolution #3, including personal computing and the Internet, lacks the capability to revolutionize an entire economy. It’s worth quoting his view directly:
“The inventions of the second industrial revolution (IR #2) gathered momentum between 1870 and 1920 and then between 1920 and 1970 created the most rapid period of growth in labor productivity experienced in American history, bringing an utter change from 1870 in most dimensions of human life. The inventions of the third industrial revolution (IR #3), though revolutionary within their sphere of influence—entertainment, communication, and information technology—did not have the same effects on living standards as had electricity, the internal combustion engine, running water, improving life expectancy, and the other Great Inventions of the special century, not to mention the improvement in the human condition as work hours declined from 60 to 40 per week." (1)
“…The most recent decade, 2004–14, has been characterized by the slowest growth in productivity of any decade in American history.” (2)
As the BLS data in Figures 1 and 2 show, the slow growth continued after Gordon published his book and is still with us today. Some economists talk about productivity as if it were solely about how much an individual worker can produce in an hour or a week or a year. But at the national level, there is another, more important consideration: how a new technology redistributes labor into new and different industries, and how it affects the demand for labor. To raise the productivity of a nation with 160 million workers and a $27 trillion GDP is a different proposition from simply enabling one worker to double or triple her productivity.
In most stories of outstanding national economic growth, the growth revolution involves pulling large numbers of workers into new industries where their productivity is far greater than before. That’s different from increasing the productivity of the users or consumers of the new technology. Think of the Mississippi sharecroppers who moved to Detroit in the 1920s to go to work at Ford or GM or their suppliers. Their productivity (as measured by their wages) was a multiple of their previous earnings. Ford and GM could afford to pay high wages because millions of Americans were buying their products, in spite of the high profit margins and the high prices for automobiles, equivalent to roughly one year of earnings for the average American. This was also made possible by an elastic financial system which made finance available for auto purchasers.
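The reallocation mechanism above can be made concrete with a deliberately simple two-sector sketch. The workforce sizes and output figures below are hypothetical, chosen only to illustrate how moving workers into a higher-productivity industry lifts the national average, in the spirit of the sharecropper-to-Detroit story.

```python
# Illustrative only: a hypothetical two-sector economy. Moving workers
# from a low-productivity sector to a high-productivity one raises
# average output per worker even if neither sector changes internally.

def avg_output_per_worker(sectors):
    """sectors: list of (workers, annual output per worker) tuples."""
    total_output = sum(w * p for w, p in sectors)
    total_workers = sum(w for w, _ in sectors)
    return total_output / total_workers

before = [(90, 10_000), (10, 50_000)]   # 90 farm workers, 10 factory workers
after  = [(70, 10_000), (30, 50_000)]   # 20 workers move to the factory

print(avg_output_per_worker(before))    # 14000.0
print(avg_output_per_worker(after))     # 22000.0
```

No individual worker became more productive at her old job; the national average rose because labor shifted to where output per hour is higher.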
The growth model I am describing is more or less the opposite of the neoliberal economic model preached by free-market Friedmanite conservatives like Phil Gramm or neo-Brandeisian liberals like FTC chair Lina Khan. In the neoliberal vision, perfect competition (i.e. dozens or hundreds of firms competing in every industry) is widespread, profit margins are slim, every worker’s wage matches her productivity, and the economy is in equilibrium.
I call that the equilibrium of the graveyard. That’s not how economies grow. In a real-world successful economy, oligopolies, excess profit, fast-growing companies, and “overpaid” workers are the driving forces of growth (3).
MIT economist Daron Acemoglu expressed a related view recently when he argued that some technological innovations can improve workers’ standard of living, but others can hurt it.
He cited the early British Industrial Revolution, in which the automation of cotton spinning (turning raw cotton into yarn) in the late 18th century made yarn much cheaper, creating thousands of jobs for the weavers, who were paid to weave the yarn into fabric and textiles. But subsequently, in the early 19th century, the automation of weaving eliminated thousands of jobs without creating many new jobs elsewhere in the supply chain. The result was recession, wage cuts, widespread poverty, hunger, and violence, such as the Peterloo Massacre of 1819 in Manchester, England, when government troops shot into a crowd of protesting workers, killing 18. Acemoglu says that in those years weaver wages fell by half, while another source says they fell by two thirds. (4) (5)
Technological change disrupts economies, and while in the long term, it increases national income, in the shorter term it can reduce it, or at least generate enough disruption to lead to stagnant productivity growth. Technological change can also lead to reduced demand for labor in a national economy if the new industries created are in another country. This is a big part of the story of the Internet productivity disappointment. The Internet relies on the smartphones, physical network, and millions of computer servers that make it work. Millions of jobs were created manufacturing those devices, but those jobs were in Asia.
In the early 2000s, Apple and every other major technology manufacturer offshored jobs to Asia, especially China. The usual process of sharing above-normal profits with subcontractors and workers did take place (economists sometimes call this “rent-sharing”) but the revenue and benefits went to companies and workers in Asia, not in the U.S.
That process is likely to continue with AI. The technology chain from microprocessor and software to personal computing, the Internet, the cloud, and now AI has a consistent structural effect on the U.S. economy. The increased productivity of the user of these technologies is offset by the stagnant productivity of U.S. workers as they are pushed increasingly into lower-revenue occupations (retail, food service, education).
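The same toy arithmetic illustrates the offsetting composition effect described above: if displaced manufacturing workers shift into lower-revenue occupations, aggregate output per worker can stagnate or fall even while technology users become more productive. Every number below is hypothetical.

```python
# Illustrative only: hypothetical numbers showing the composition effect.
# Offshoring pushes workers into lower-revenue work, offsetting the
# productivity gains of the technology's users.

def avg_output_per_worker(sectors):
    """sectors: list of (workers, annual output per worker) tuples."""
    total_output = sum(w * p for w, p in sectors)
    return total_output / sum(w for w, _ in sectors)

# 20 manufacturing workers plus 80 office workers
before = [(20, 40_000), (80, 50_000)]
# Manufacturing offshored: those 20 move into retail and food service,
# while technology lifts office workers' output by 5%.
after = [(20, 25_000), (80, 52_500)]

print(avg_output_per_worker(before))  # 48000.0
print(avg_output_per_worker(after))   # 47000.0
```

The users' gain (a 5% lift for 80 workers) is swamped by the reallocation of 20 workers into lower-output jobs, so the national average declines.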
The Hothouse Economy
Under this producer-centered concept of economic growth, it is better to contain the impact of growth within a single nation, to maximize its wealth-creation and broad-based income-distribution effects. I call this the “hothouse economy” approach to growth. U.S. economic growth from 1947 to 1973, the most impressive period in our economic history, was the story of a hothouse economy, as Robert Gordon explains in his description of that period:
“The lack of competition for jobs from recent immigrants made it easier for unions to organize and push up wages in the 1930s. The high tariff wall allowed American manufacturing to introduce all available innovations into U.S.-based factories without the outsourcing that has become common in the last several decades. The lack of competition from immigrants and imports boosted the wages of workers at the bottom and contributed to the remarkable “great compression” of the income distribution during the 1940s, 1950s, and 1960s.” (6)
Back to the present and the outlook for AI. Computer servers are the heart of the data center and also the heart of the AI boom. Most of the technology inside the servers was developed in America. According to analyst firm IDC, last year server makers sold $119 billion worth of servers. With the unstoppable pressure of the AI boom, that figure is expected to rise 22% this year to $144 billion. Virtually all those servers are made in Asia, even though at least one third of them will be deployed in U.S. data centers. If you add in the $100 billion a year Americans spend on smartphones, and all the other systems and components involved in the Internet, cloud, and AI, you can begin to see that if manufactured here, the hardware industry would rival any other U.S. industry in terms of size, scale, and jobs.
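A quick sanity check on those market-size figures (the IDC estimate and the smartphone figure cited above; the 22% growth rate is rounded, which is why the product lands near $145 billion rather than exactly $144 billion):

```python
# Scale of the hardware spending cited above (all figures in $ billions).
servers_last_year = 119    # worldwide server sales last year (IDC)
expected_growth = 0.22     # projected rise this year (rounded)
us_smartphones = 100       # annual U.S. smartphone spending

servers_this_year = servers_last_year * (1 + expected_growth)
print(f"Projected server sales: ${servers_this_year:.0f}B")       # ~$145B
print(f"Servers plus U.S. smartphone spending: "
      f"${servers_this_year + us_smartphones:.0f}B")              # ~$245B
```

Even before counting routers, optics, storage, and the rest of the Internet and cloud supply chain, the hardware total is in the same league as America's largest manufacturing industries.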
“Big Tech,” as the technology industry is now called, has grown from a small community of nerds in Silicon Valley and Boston into a central feature of American life. Big Tech is now intensely disliked by politicians on the right, left, and center. It spends millions on lobbying to hold back the proponents of regulation and meddling, most of whom have only the slightest understanding of how the technology works.
If, instead of manufacturing its products in Asia, the tech industry had them made here in the U.S., it would have millions of employees with a vested interest in the health of the industry. Sure, prices would be somewhat higher, but they would still be coming down every year, and the direction of travel is more important than the level. The prosperity of the industry would be shared with thousands of subcontractors and millions of workers. The virtuous circle of increased producer productivity, increased consumer spending, and greater income equality would be restored.
[1] Gordon, R. J. (2017). The Rise and Fall of American Growth: The U.S. Standard of Living since the Civil War, Princeton University Press, pg. 522
[2] Ibid, pg. 528.
[3] For more on this, see my two-part essay, The American Way of Growth, available here.
[4] Daron Acemoglu and Simon Johnson, History Already Tells Us the Future of AI, April 23, 2024, Project Syndicate. Available here.
[5] See Ian Hernon, Riot! Civil Insurrection from Peterloo to the Present Day, Pluto Press, 2006.
[6] Op. cit., pg. 554.
MADE IN AMERICA.
CPA is the leading national, bipartisan organization exclusively representing domestic producers and workers across many industries and sectors of the U.S. economy.
Economic View: The Coming AI Productivity Disappointment