Is There Really a Great Stagnation? The Problem of Measuring Economic Growth in America’s Digital Economy

 

Last month, Goldman Sachs economists Jan Hatzius and Kris Dawsey put out a research report arguing that US government statistics understate GDP growth because they understate productivity growth. Over the past five years, productivity growth has averaged 0.6% annually vs. 2.6% over the prior 15 years. Here’s the gist of Goldman’s argument in “Productivity Paradox v2.0”:

— Measured productivity growth has slowed sharply in recent years, and we have reduced our working assumption for the underlying trend to 1½%. This is the same sluggish rate that prevailed from 1973 to 1995 and stands well below the long-term US average of 2¼%. The proximate cause of the slowdown is a slump in the measured contribution from information technology.

— But is the weakness for real? We have our doubts. Profit margins have risen to record levels, inflation has mostly surprised on the downside, overall equity prices have surged, and technology stocks have performed even better than the broader market. None of this feels like a major IT-led productivity slowdown.

— One potential explanation that reconciles these observations is that structural changes in the US economy may have resulted in a statistical understatement of real GDP growth. There are several possible areas of concern, but the rapid growth of software and digital content—where quality-adjusted prices and real output are much harder to measure than in most other sectors—seems particularly important.

— Specifically, we see reasons to believe that the well-known upward biases in the inflation statistics related to quality changes and the introduction of new products are particularly severe for software and digital content. Quantifying the effects is difficult, but it is not unreasonable to think that they could offset a substantial portion of the measured productivity slowdown.

— Our analysis has three practical implications. First, confident pronouncements that the standard of living is growing much more slowly than in the past should be taken with a grain of salt. Second, given the uncertainty around GDP, it is better to focus on other indicators—especially employment—to gauge the cumulative progress of the recovery and the remaining amount of slack. And third, true inflation is probably even lower than measured inflation, reinforcing the case for continued accommodative monetary policy.
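The difference between the two growth rates quoted above compounds quickly. A back-of-the-envelope sketch (the 0.6% and 2.6% rates are from the article; the compounding arithmetic is the only thing shown here):

```python
# Cumulative output gap implied by 0.6% vs. 2.6% annual productivity
# growth over five years, using the rates cited in the article.
slow, fast, years = 0.006, 0.026, 5

level_slow = (1 + slow) ** years   # output index on the slow path
level_fast = (1 + fast) ** years   # output index on the fast path

gap = level_fast / level_slow - 1  # output forgone relative to slow path
print(f"After {years} years the fast path is {gap:.1%} higher")  # ~10.3%
```

In other words, if the measured slowdown is real, the economy is already roughly ten percent smaller than it would have been on the pre-2010 trend.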

Yesterday, JPMorgan economists Michael Feroli and Jesse Edgerton offered a direct counter in the cleverly titled “Do androids dream of electric growth?”:

— Slowing economic growth has prompted speculation that the data aren’t capturing the digital economy

— For this explanation to work one needs to demonstrate that measurement issues are getting worse over time

— There is evidence that there are measurement issues, but little evidence thus far that they are getting worse

— The conjecture that the recent growth slowdown is due to mismeasurement has little empirical support

So two big issues for Goldman. First, calculating the overall amount of “consumer surplus” for new products is hard. This is an especially troublesome measurement when dealing with “free” digital content. Hatzius and Dawsey: “How important is this issue for GDP measurement? Hard evidence is again hard to come by, but Erik Brynjolfsson and Joo Hee Oh estimated in 2012 that the growth in time spent on free internet sites between 2007 and 2011 created incremental consumer surplus worth ¾% of GDP each year. Although the uncertainty around these types of estimates is obviously very large, as the authors acknowledge, we do think it is plausible that increased new products bias could be holding down the measured real GDP growth rate by a meaningful amount.”
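The Brynjolfsson–Oh figure is easy to put in perspective with a toy calculation. Only the ¾%-of-GDP estimate comes from the article; the measured-growth figure below is purely hypothetical:

```python
# Toy illustration of the Brynjolfsson-Oh estimate: free digital content
# adds consumer surplus worth ~0.75% of GDP per year that measured GDP
# misses. The 2.1% measured-growth number is a hypothetical placeholder.
uncounted_share = 0.0075      # annual uncounted surplus, share of GDP
measured_growth = 0.021       # hypothetical measured real GDP growth

adjusted_growth = measured_growth + uncounted_share
cumulative_miss = uncounted_share * 4   # 2007-2011: four years of misses
print(f"adjusted growth: {adjusted_growth:.2%}")               # 2.85%
print(f"surplus missed by 2011: {cumulative_miss:.1%} of GDP") # 3.0%
```

If the estimate is anywhere near right, the uncounted surplus alone would offset a large share of the measured slowdown.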

Second, inflation statisticians are better at doing quality adjustments for computer hardware than for software. Measured prices of the latter have declined only a fraction as much as the former. [See chart below.] As Hatzius and Dawsey memorably put it, “How much better is Grand Theft Auto V than Grand Theft Auto IV?” Tough to know.

[Goldman Sachs chart: measured price declines, computer hardware vs. software]

JPMorgan sees things differently. First, while Feroli and Edgerton concede we may be chronically undermeasuring productivity, they are skeptical that bias is getting worse. Cellphones, email, and Internet existed before the Great Recession. The digital economy is not new, but the productivity slowdown is. And do today’s technological advances present “thornier challenges for price measurement than did the introduction of televisions, air conditioning, washer/dryers, electric guitars, dishwashers, microwaves, VCRs, game consoles, and all the rest of the decades-long parade of once-amazing new products that have continually improved living standards”?

Second, why focus on how, say, Uber is an undercounted quality improvement but ignore opposite examples, like how airline seats are smaller and delayed departures have increased? Feroli and Edgerton: “Quality-adjusted airfares have likely been increasing much faster than the reported data.” Other examples abound. We should be cautious about cherry-picked arguments.

Third, the productivity slowdown is also seen in industries that are easier to measure, such as manufacturing.

[JPMorgan chart: productivity slowdown across industries]

So who is right? I think the honest answer is that we aren’t really certain, though I do know that Goldman’s conclusion syncs with the innovation and productivity research of AEI’s Stephen Oliner. More research is needed and, thankfully, is on the way. Policymakers should hope that Goldman is right but act as if JPMorgan is right and pursue needed reforms in areas such as education, immigration, entrepreneurship, and basic research.

 

Published in Economics, Science & Technology

There are 4 comments.

  1. Bryan G. Stephens Thatcher
    Bryan G. Stephens
    @BryanGStephens

    My staff have not received a cost-of-living increase in six years. Yes, there is stagnation.

    • #1
  2. Ricochet Member
    Ricochet
    @IWalton

    A difficult subject. Even disaggregated data are just slightly less aggregated and thus have lost the most important information. Then there are new ventures that aren’t happening, tiny, small, and large, some of which would fail, but the risk takers learn more from their failures than their successes. These are the hounds that aren’t barking. New ventures also drive innovation and technology. It’s hard enough to measure such things when they occur, but we eventually see flourishing at all levels. Same with stagnation; we see growth of big companies and relatively new sectors that haven’t felt the stifling hand of the regulatory apparatus, but with time we see fraying at the edges, weakness in the middle, decay at the bottom, symbiotic relationships between governments and private activities at the top. It shows up among the weakest sectors, the weakest groups, and the most vulnerable cities. Averages and aggregates don’t capture this, but are we flourishing?

    • #2
  3. Belt Inactive
    Belt
    @Belt

    Hmm.  I don’t know enough about economics to even guess if you’re right.  But two things you might want to comment on sometime:

    First, let’s say that the standard model is mis-measuring some segment of the economy.  You’ve found something that indicates that things are better than they appear.  But how can we know if there’s not some other indicator that really shows that things are even worse?

    Second, and this is the real point, as someone who has no training in economics and only casually looks at the issue, I have to rely on experts to explain it to me.  But I really don’t trust any expert any longer.  It’s not merely that some are blinded by ideology, or twisting data to reach certain results, or obsessing over a pet theory.  It’s that so often it smacks of the central-planner fallacy, presuming that they really know what’s going on, and they should be the ones to tell everyone what needs to be done.  It almost feels like some form of magical thinking.

    I see the same problems in climate studies, and even in the angst over the 2012 presidential race polling results. I should work this up into a post some day.

    • #3
  4. user_199279 Coolidge
    user_199279
    @ChrisCampion

    I (vaguely) remember some analysis done in the 1990s that was showing the finally-realized benefit of computers in the workplace, which seems a little dated now, but the idea was that the desktop versions of Windows that were becoming ubiquitous were finally leveraging the long-promised productivity enhancements of the computer age. Basically, even though PCs were available in the 1980s and networked, the applications were much more specific, harder to use (think of applications built on AS/400s or mainframe applications, but available on terminals), inflexible, and had high maintenance costs.

    These sorts of “soft” productivity gains might be empirically unmeasurable. Say a new application is developed that obviates the need to have an FTE perform the work manually: it’s done by software, and it takes 15 minutes/week to run the job that previously took the FTE 40 hours. That’s a huge productivity gain, right?

    But there’s also now the ongoing maintenance, documentation, migration, upgrade/enhancement, and testing of the app, on a regular and ad hoc basis. So even though some gains are made, it’s not all gravy, and you incur ongoing labor and systems/hardware costs for as long as the app is in a production environment (and its development/test/QA environments, where enhancements/upgrades are tested and vetted before moving to production).

    In short, there’s no easy answer. Might have to be tied to a pure FTE/headcount basis to measure what’s been gained or lost.
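[Editor’s note: the commenter’s arithmetic can be sketched directly. The 40-hour and 15-minute figures are from the comment; the 8 hours/week of maintenance overhead is an assumed, illustrative number.]

```python
# Toy version of the commenter's example: an app replaces 40 hours/week
# of manual work with a 15-minute weekly job, but carries ongoing upkeep.
# The maintenance figure is an assumption for illustration only.
manual_hours = 40.0          # hours/week of manual work before automation
run_hours = 15 / 60          # 15 minutes/week to run the automated job
maintenance_hours = 8.0      # assumed: upkeep, testing, upgrades

net_saved = manual_hours - (run_hours + maintenance_hours)
print(f"net hours saved per week: {net_saved:.2f}")  # 31.75
```

The gain is still large, but as the comment notes, it is well short of the full 40 hours once ongoing costs are counted.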

    • #4