The epic mistake about manufacturing that’s cost Americans millions of jobs

Editor's note: This story is important because it further debunks the claim that automation, rather than trade, caused America's manufacturing job losses.

Looking back, there were two kinds of people who lived in America in 2016: people who believed Donald Trump, and people who believed data.

[Gwynn Guilford | November 8, 2018 | Quartz]

Trump claimed on the campaign trail that globalization had destroyed US manufacturing—and in the process, the American economy—by letting China and other countries steal American factory jobs. From the turnout at Trump’s rallies and the “Make America Great Again” stickers slapped on bumpers across America, it was clear the message was resonating.

The data camp didn’t get it. Yes, the US had hemorrhaged manufacturing jobs, losing close to 5 million of them since 2000. Trade may have been a factor—but it clearly wasn’t the main culprit. Automation was. Robots and fancy machines had supplanted workers, turning the US into a manufacturing dynamo at the cutting edge of innovation. An article in Vox, published a month before the 2016 presidential election, spelled out the situation.

“Declining manufacturing employment over the past 30 years has given a lot of people the impression that America’s manufacturing sector is in decline. But that’s actually wrong,” the Vox article explained. “American factories are about twice as efficient today as they were three decades ago. So we’re producing more and more stuff, even as we use fewer and fewer people to do it.”

This was hardly a groundbreaking insight. For a decade or so, this phenomenon had been put forth by Ivy League economists, former US secretaries of the treasury, transportation, and labor, the Congressional Research Service, Vice President Joe Biden, and President Barack Obama—and by Quartz too, for that matter. In a 2016 New York Times article titled “The Long-Term Jobs Killer Is Not China. It’s Automation,” Harvard economist Lawrence Katz laid out the general consensus: “Over the long haul, clearly automation’s been much more important—it’s not even close.”

Manufacturers’ embrace of automation was supposedly a good thing. Sure, some factory workers lost their jobs. But increased productivity boosted living standards, and as manufacturing work vanished, new jobs in construction and other services took its place. This was more of a shift than a loss, explained Bradford DeLong, economics professor at the University of California, Berkeley.

So when Trump won the presidential election, the true-blue data believers dismissed his victory as the triumph of rhetoric over fact. His supporters had succumbed to a nativist tale with cartoon villains like “cheating China” and a shadowy cabal of Rust Belt-razing “globalists.”

But it turns out that Trump’s story of US manufacturing decline was much closer to being right than the story of technological progress being spun in Washington, New York, and Cambridge.

Thanks to a painstaking analysis by a handful of economists, it’s become clear that the data that underpin the dominant narrative—or more precisely, the way most economists interpreted the data—were way off-base. Foreign competition, not automation, was behind the stunning loss in factory jobs. And that means America’s manufacturing sector is in far worse shape than the media, politicians, and even most academics realize.

In the four decades between 1960 and 2000, US manufacturing employment was basically stable, averaging around 17.5 million jobs. Even during the 1980s and 1990s, as Korea and other smaller Asian nations joined the ranks of Germany and Japan to threaten the dominance of US factories, the absolute number of manufacturing workers stayed mostly flat. That’s why what happened next is so alarming.

Between 2000 and 2010, manufacturing employment plummeted by more than a third. Nearly 6 million American factory workers lost their jobs. The drop was unprecedented—worse than in any decade in US manufacturing history. Even during the Great Depression, factory jobs shrank by only 31%, according to an Information Technology & Innovation Foundation report. Though the sector has recovered slightly since then, America’s manufacturing workforce is still more than 26% smaller than it was in 2000.

What’s odd is that, even as US factories laid off an historically unprecedented share of workers, the amount of stuff they made rose steadily—or at least, it appeared to. The sector’s growth in output, adjusted for inflation, had been chugging away at roughly the same pace as US GDP since the late 1940s. That makes sense given that productivity—that is, advances in technology, skill, or management that allow workers to make more stuff in less time—has also been growing at a zippy clip.

How, then, do you reconcile the epic employment slump of the 2000s with the steady rise in output? The obvious conclusion is that factories needed fewer people than they did in the past because robots are now doing more and more of the producing. That’s tough for factory workers, but US manufacturing is doing fine.
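That reasoning rests on a simple identity: output equals the number of workers times output per worker. A minimal sketch, using made-up numbers, shows how measured output can keep rising even as millions of jobs disappear, so long as measured productivity grows fast enough:

```python
# Toy identity: output = workers x output per worker.
# All numbers are made up for illustration; they are not official statistics.

workers_2000 = 17.5e6        # roughly the pre-2000 manufacturing workforce
workers_2010 = 11.5e6        # after shedding ~6 million jobs
output_index_2000 = 100.0    # index year-2000 output at 100

productivity_gain = 1.60     # suppose output per worker grew 60% over the decade

output_index_2010 = output_index_2000 * (workers_2010 / workers_2000) * productivity_gain
print(f"Output index in 2010: {output_index_2010:.0f}")  # ~105: higher, despite far fewer workers
```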

Susan Houseman, an economist at the Upjohn Institute, an independent organization that researches employment, specializes in measuring globalization. She had been working with a team of Federal Reserve economists who had access to more granular data than was publicly available, which allowed them to strip the computer industry’s output out of the rest of the data. That revealed just how the rest of manufacturing was doing—and it was much worse than Houseman and her colleagues had expected.

“It was staggering—it was actually staggering—how much that was contributing to growth in real [meaning, inflation-adjusted] manufacturing productivity and output,” says Houseman.

This was especially striking given that the two measures lay at the heart of the prevailing narrative that US manufacturing is growing healthily.

In 2011, Houseman and her colleagues mentioned their discovery in a paper they published in the Journal of Economic Perspectives. But the point went largely unnoticed.

Undeterred, Houseman spent the next few years digging further into why this relatively small industry was driving so much growth—and what was really going on with American manufacturing.

In order to understand how the manufacturing sector is doing, economists look at how much stuff factories are making compared with previous years. The key measure of this output is “value added”: manufacturing sales, minus the cost of things like electricity and parts used in the manufacturing process. They look at this across a dozen or so manufacturing subsectors, such as paper, apparel, furniture, and chemicals.
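As a rough sketch of that definition (the subsector figures below are invented, and this is a simplification of how government statisticians actually work), value added can be computed for each subsector and summed:

```python
# Illustrative "value added" calculation for manufacturing subsectors.
# All dollar figures are hypothetical, not actual government data.

def value_added(sales: float, input_costs: float) -> float:
    """Value added = sales minus the cost of inputs (parts, electricity, etc.)."""
    return sales - input_costs

# Hypothetical subsector figures, in billions of dollars
subsectors = {
    "paper":     {"sales": 180.0, "inputs": 110.0},
    "apparel":   {"sales":  40.0, "inputs":  25.0},
    "furniture": {"sales":  65.0, "inputs":  38.0},
    "chemicals": {"sales": 770.0, "inputs": 480.0},
}

total = sum(value_added(s["sales"], s["inputs"]) for s in subsectors.values())
print(f"Total manufacturing value added: ${total:.0f}B")  # $402B in this toy example
```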

But that figure alone isn’t enough. To make the output volume comparable from one year to the next, the statisticians aggregating the data adjust for price changes, as well as improvements in product quality. For example, let’s say statisticians want to figure out how much the sales of Intel processors grew in 2017 versus 2016.

The problem is, the processor released in 2017 is superior to that sold in 2016 in many tangible ways. But how do you account for the fact that a 2017 processor provides users with more value? In general, statisticians assume the difference in value between the two models is just the difference in their prices. If, say, the 2017 processor costs twice as much as the 2016 one does, then selling one 2017 processor counts as selling two of the 2016 versions in the statisticians’ books.

In this hypothetical, the real output data might show rising sales of processors. But the increase could also simply reflect the statisticians’ assumption that people value the new processor more than they did earlier models, because of the new version’s superior performance.
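To make the arithmetic concrete, here is a minimal sketch of that price-ratio quality adjustment, using the article’s hypothetical two-to-one price gap (the prices and unit counts are invented for illustration):

```python
# Sketch of the price-ratio quality adjustment described above.
# Prices and unit counts are hypothetical.

price_2016 = 300.0   # price of the 2016 processor, the base-year model
price_2017 = 600.0   # price of the superior 2017 processor

# Statisticians treat the price ratio as the quality ratio:
quality_factor = price_2017 / price_2016   # 2.0

units_sold_2017 = 1_000_000

# Each 2017 unit counts as `quality_factor` base-year units of "real" output,
# so one 2017 processor is booked as two 2016-equivalent processors.
real_output_2016_equivalents = units_sold_2017 * quality_factor
print(f"{real_output_2016_equivalents:,.0f} 2016-equivalent processors")  # 2,000,000
```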

Government data wizards do this sort of quality adjustment for all sorts of products, including automobiles. However, the biggest adjustments show up in processors and the other goods made by the computers subsector, in which the blazing pace of technological change makes for dramatic and ultra-fast leaps in quality.

In other words, the method statisticians use to account for these advances can make it seem like US firms are producing and selling more computers than they actually are. And when the computers data are aggregated with the other subsectors, the adjustment makes it seem like the whole of American manufacturing is churning out more goods than it actually is.
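A toy aggregation illustrates the mechanism, assuming invented output shares and growth rates: even if every other subsector is flat or shrinking, a large quality-adjusted gain in the small computers subsector can make total manufacturing look healthy:

```python
# Toy aggregation: how quality-adjusted computer output can lift the headline number.
# Shares and growth rates are invented for illustration.

growth = {                 # real output growth, year over year
    "computers":  0.30,    # inflated by aggressive quality adjustment
    "paper":     -0.01,
    "apparel":   -0.03,
    "furniture": -0.02,
    "chemicals":  0.00,
}
share = {                  # share of total manufacturing value added (sums to 1.0)
    "computers":  0.12,
    "paper":      0.10,
    "apparel":    0.05,
    "furniture":  0.08,
    "chemicals":  0.65,
}

headline = sum(growth[s] * share[s] for s in growth)
ex_computers = sum(growth[s] * share[s] for s in growth if s != "computers") / (1 - share["computers"])

print(f"Headline manufacturing growth: {headline:+.1%}")      # +3.2%: looks healthy
print(f"Growth excluding computers:    {ex_computers:+.1%}")  # -0.5%: flat to shrinking
```

This mirrors what Houseman and the Fed economists found when they stripped computers out of the data: the rest of manufacturing looked far weaker than the aggregate suggested.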

Read more at Quartz
