April 04, 2013

Productivity in a Ditch? Some Scenario Planning Reactions

Peter Kennedy
Partner

On the heels of David Stockman’s cranky New York Times op-ed about how successive Democratic and Republican administrations have irreversibly destroyed the U.S. economy comes a notably calmer and more measured piece by John Cassidy in the current New Yorker about future prospects for productivity growth (and therefore rising living standards) in the post-internet era.  Do not read this if you’re still in a Stockman funk.  But do read it if you’re open to a skeptical perspective on future productivity gains.  The big boom may be all behind us.  Or not.  Either way, the discussion is important scenario-planning fodder.

Cassidy’s rundown of recent productivity statistics is jarring:
  • The recent high-water mark for productivity growth (as measured by average annual growth in non-farm output per hour) was the five years leading up to 2000, when non-farm output per hour grew 2.75% annually.
  • Even before the Great Recession hit, productivity had slowed to 1.4% (2005 to 2007), less than half of what it had been during the previous nine years.
  • From 2005 to 2012, output per hour rose at just 1.5%, which Cassidy points out is the same growth rate registered between 1973 and 1996.
  • Productivity growth in both 2011 and 2012 was less than 1%.
  • In the last quarter of 2012, per hour output may have actually fallen, by as much as 1.9%, according to revised Labor Department data.
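The figures above are annualized growth rates of an output-per-hour index. As a minimal illustration of the arithmetic involved (the index levels here are made up for the example, not BLS data):

```python
def annualized_growth(start_level, end_level, years):
    """Compound annual growth rate of an index, in percent."""
    return ((end_level / start_level) ** (1 / years) - 1) * 100

# Illustrative: an output-per-hour index rising from 100.0 to 107.7
# over five years works out to roughly 1.5% annual growth
rate = annualized_growth(100.0, 107.7, 5)
print(round(rate, 2))  # → 1.49
```

The same calculation over any pair of endpoint years is how multi-year averages like the 1973–1996 and 2005–2012 figures are compared.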

Cassidy acknowledges all the possible quirkiness in the recent productivity data but keeps coming back to the view held by a number of technology skeptics (e.g., economist Robert Gordon) that the big tech-driven productivity boom may have already happened, starting with the large-scale introduction of computers into back offices in the 1970s and 1980s.  Then came the Internet and e-commerce revolutions, which had indisputable impacts on productivity.  But Gordon and others argue that developments since the early 2000s, notably the advent of mobile computing, smartphones and social networking, do not fundamentally improve labor productivity.  In fact, their potentially distracting nature could make them a net negative for worker output.

If Gordon is right, then in the absence of the next big thing, we’re stuck with low productivity growth and therefore, at best, stagnant incomes.  But people like Wired founding editor Kevin Kelly say not so fast. A host of new technological innovations – from big data, to cheap artificial intelligence and personal robotics – have yet to register their impact on the economy.  Furthermore, FSG’s ongoing scenario-planning work for the US Coast Guard and others points to potentially dramatic upsides from the quantum computing revolution, which is still very much in an embryonic state.  And the genomics revolution, as we have learned from work with clients in the health care sector, may well cause the cost and quality of health care to improve massively over the next two decades.  The cross-fertilization of these and other technologies may set off a feeding frenzy that pushes us into an entirely new era of human history. The best may be yet to come. (This optimism jibes well with a book that just came out, Internet entrepreneur Byron Reese’s Infinite Progress: How the Internet and Technology Will End Ignorance, Disease, Poverty, Hunger and War, which we will review on this site very shortly.)

Even so, a productivity rebound would not necessarily signal a return to the supposedly happy economic days of the 1990s and early 2000s.  Some recent research cited by Cassidy reveals a break in the once-solid link between productivity and wages.  So even if productivity gathers steam once more, there’s no guarantee that the benefits will be widely shared.  Income inequality could, in fact, worsen.

At the end of the day, we in the scenario-planning world need to maintain a degree of agnosticism on the subject.  All plausible paths to the future need to be taken seriously and carefully examined (except, of course, any suggestion that scenario planners may one day be replaced by more productive machines).

Thoughts?