How AI Is Like That Other General Purpose Technology: Electricity

 

Do we live in a time of rapid, sweeping technological change or one of persistent, maddening stagnation? Even as politicians and pundits warn about robots stealing all the jobs, economic statistics show weak productivity growth. So perhaps we face a paradox similar to that of the 1980s, when economist Robert Solow famously said, “You can see the computer age everywhere but in the productivity statistics.”

Then the 1990s happened, and with them, finally, came an information technology revolution and a productivity boom. One takeaway from that experience is that it can take considerable time to fully understand and harness new technologies before measured productivity rises. And that’s not just true of advanced tech such as the artificial intelligence now being incorporated into businesses. For example: the first barcode scan took place in the mid-1970s, but it took 30 years for organizations throughout the manufacturing-retail supply chain to make the needed investments in “complementary technological, organisational, and process change,” as Emek Basker and Timothy Simcoe explain in “Upstream, Downstream: Diffusion and Impacts of the Universal Product Code.”

Maybe the best-known example is research from economic historian Paul David, who found that it took decades for American factories to electrify and reorganize production after the shift to polyphase alternating current. And here’s a complementary finding from the new working paper “Does Electricity Drive Structural Transformation? Evidence from the United States”:

The effects of electrification take a very long time to unfold, illustrating the long time scales involved when trying to understand the impact of a general purpose technology on the labor market. First, even though electricity was developed commercially in the late 19th century, most residents and commercial activities in the U.S. did not have electricity by 1910, and only large urban areas were electrified. The development of higher-voltage power lines was very slow until 1920, and only accelerated significantly between 1920 and 1940. We show that this acceleration of electrification affected the structure of employment, especially favoring the growth of operative jobs and negatively affecting farmer jobs. The long timescale for the impact of electrification on the labor market suggests that we may see a similarly protracted development of the full impacts of ICT—the latest general purpose technology—on the labor market.

Indeed, many technologists also consider AI to have the potential to be an important general-purpose technology; the lesson, again, is patience. In “Unpacking the AI-Productivity Paradox,” Erik Brynjolfsson, Daniel Rock, and Chad Syverson conclude:

The recent productivity slowdown says nothing about future productivity growth and is no reason to downgrade prospects. In fact, history teaches the opposite lesson. Past surges in productivity were driven by general-purpose technologies (GPTs) like electricity and the internal combustion engine. In turn, these technologies required numerous complementary co-inventions like factory redesigns, interstate highways, new business processes, and changing workforce skills before they truly fulfilled their potential. Importantly, these co-inventions took years or even decades to materialize, and only then did productivity improve significantly. We believe that AI has the potential to be the GPT of our era. And like earlier technologies, it requires numerous complementary innovations — including new products, services, workflow processes, and even business models — that are often costly and time-consuming to develop. The low productivity growth of recent years may partially reflect these costs and may also be a harbinger of significantly higher growth once necessary co-inventions are put in place. Accordingly, we see no inherent inconsistency between forward-looking technological optimism and backward-looking disappointment.



There are 3 comments.

  1. Steve C. (Member), @user_531302

    Change is hard. The way we’ve always done it has an at times overpowering inertia.

    Change is expensive. Can you fund it from cash flow or do you need to borrow money?

    Change is risky. You buy software that doesn’t meet your needs. You install new tech and skimp on training. You buy one copy of a critical manual that sits in the office, not on the shop floor. Or, my favorite, you buy software and insist that the software mimic your old work processes.

    AI is coming, but it’s still a long way away.

  2. Matthew Singer (Inactive), @MatthewSinger

    Steve C. (View Comment):

    AI is coming, but it’s still a long way away.

    It’s “been coming” since the late ’80s. Recalling the rise and fall of Symbolics computers and Prolog expert systems. Still waiting on anything that is really the “I” part of “AI.”

  3. Steve C. (Member), @user_531302

    Matthew Singer (View Comment):

    It’s “been coming” since the late ’80s. Recalling the rise and fall of Symbolics computers and Prolog expert systems. Still waiting on anything that is really the “I” part of “AI.”

    It depends on your expectations. Everyone who evangelizes for AI tells a tale of omniscience. In practice, applying AI is very useful if you keep your expectations modest. An expert system which ingests lots of data over a long enough time and can crunch many iterations will produce better solutions.
