Discussions of how advances in technology, trade, and other factors disrupt jobs often seem to begin with an implicit claim that things were better in the past, when most workers supposedly had well-paid, secure, lifelong jobs. Of course, we all know that this story isn’t quite right. After all, about one-half of US workers were in agriculture in 1870, a share that fell to one-third by early in the 20th century and has been less than 3% since the mid-1980s. About one-third of all US nonagricultural workers were in manufacturing in 1950; that share has now dropped to about 10%. Shifts like these suggest that job disruption and occupational change have been a major force in the US economy throughout its history.
Indeed, in “False Alarmism: Technological Disruption and the U.S. Labor Market, 1850–2015,” written for the Information Technology & Innovation Foundation (May 2017), Robert D. Atkinson and John Wu argue that job disruption was actually higher in the US economy in the past. They write:
“It has recently become an article of faith that workers in advanced industrial nations face almost unprecedented levels of labor-market disruption and insecurity. … When we actually examine the last 165 years of American history, statistics show that the U.S. labor market is not experiencing particularly high levels of job churn (defined as the sum of the absolute values of jobs added in growing occupations and jobs lost in declining occupations). In fact, it’s the exact opposite: Levels of occupational churn in the United States are now at historic lows. The levels of churn in the last 20 years—a period of the dot-com crash, the financial crisis of 2007 to 2008, the subsequent Great Recession, and the emergence of new technologies that are purported to be more powerfully disruptive than anything in the past—have been just 38 percent of the levels from 1950 to 2000, and 42 percent of the levels from 1850 to 2000. …
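To make the quoted definition of occupational churn concrete, here is a minimal sketch of the arithmetic it describes: summing the absolute values of jobs added in growing occupations and jobs lost in declining occupations. The occupation names and employment counts are hypothetical, purely for illustration, and the sketch follows only the definition as stated in the quote, not the report’s full methodology.

```python
# Hypothetical occupation-level employment counts (in thousands) at the start
# and end of a period. These numbers are illustrative, not from the report.
employment_by_occupation = {
    "farm laborers": (5000, 3500),      # declining occupation: 1,500 jobs lost
    "machine operators": (2000, 2400),  # growing occupation: 400 jobs added
    "clerical workers": (1500, 1500),   # unchanged: contributes nothing
}

# Churn, per the quoted definition: sum of absolute changes across occupations.
churn = sum(abs(end - start) for start, end in employment_by_occupation.values())
print(f"Occupational churn: {churn} thousand jobs")  # 1500 + 400 + 0 = 1900
```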