Ricochet is the best place on the internet to discuss the issues of the day, either through commenting on posts or writing your own for our active and dynamic community in a fully moderated environment. In addition, the Ricochet Audio Network offers over 50 original podcasts with new episodes released every day.
Do we live in a time of rapid, sweeping technological change or one of persistent, maddening stagnation? Even as politicians and pundits warn about robots stealing all the jobs, economic statistics show weak productivity growth. So perhaps we face a paradox similar to that of the 1980s, when economist Robert Solow famously quipped, “You can see the computer age everywhere but in the productivity statistics.”
Then the 1990s happened, bringing an information technology revolution and, finally, a productivity boom. One takeaway from that experience is that it can take considerable time to fully understand and harness new technologies before measured productivity increases. And that’s not just the case with advanced tech such as incorporating artificial intelligence into a business. For example: The first barcode scan took place in the mid-1970s, but it took 30 years for organizations throughout the manufacturing-retail supply chain to make the needed investments in “complementary technological, organisational, and process change,” as explained in “Upstream, Downstream: Diffusion and Impacts of the Universal Product Code” by Emek Basker and Timothy Simcoe.
The American economy accelerated nicely in the middle of last year. A Two Percent Economy no more! Well, at least for a bit. Economic growth now seems to be reverting to the humdrum pace seen over most of the post-Financial Crisis recovery. (The Trump White House, it should be noted, sees things more optimistically.) The combination of slower labor force growth and weaker productivity growth means the economy’s growth potential isn’t what it once was.
But maybe artificial intelligence can accelerate economic growth on a sustained basis by boosting productivity growth. In their 2018 paper, “AI and the Economy,” economists Jason Furman and Robert Seamans point out that many experts think “AI and other forms of advanced automation, including robots and sensors, can be thought of as a general purpose technology that enable lots of follow-on innovation that ultimately leads to productivity growth.”
My phone buzzed while my watch thumped my wrist. I was in a meeting, so I glanced surreptitiously at the watch. My wife was calling; I declined the call, knowing that if it was urgent she would either leave a message or send me a text. The text came through a few minutes later, asking if I wanted to join her for lunch. I waited until the meeting had ended, and until I had taken care of other business that had piled up, before finally messaging her back about when I would be free. We had our lunch date, but as we were leaving I pulled out my phone to check my work email, and there on the lock screen was a “Siri Suggestion” that I return my wife’s call from an hour and a half before. Siri is a brilliant idiot: brilliant enough to guess that I should probably call my wife back and to put that suggestion right on the lock screen, but idiotic enough not to know that the suggestion was unwelcome and unnecessary.
Over the last couple of iterations of Apple’s iOS (the operating system used in its mobile devices), Apple has layered assorted habit-gathering machine-learning routines into Siri, its smooth-voiced “digital personal assistant.” The latest iteration of iOS, version 12, has extended these habit-watching routines to the point where, by default, they constantly monitor what you do and where you do it, then attempt to build macros of commands to automate and guide those habits. The suggestion that I return my wife’s call was based on the phone having observed that I usually do return her calls, but had not yet done so in this case.
Arthur C. Clarke famously said, “Any sufficiently advanced technology is indistinguishable from magic.” Likewise, any sufficiently advanced automation will be mistaken for life. Recently, I watched Netflix’s Altered Carbon series, based on the science fiction novel by Richard K. Morgan. The core premise is that the essence of humanity is intelligence or consciousness alone. Therefore, […]
So much of the conversation about artificial intelligence is negative, with much of that negativity concerning the potential for job loss. (I see things differently.) So a new analysis from former Obama White House economist Austan Goolsbee is a welcome change of pace.
In many ways, it is unfortunate that labor market policy has dominated our thinking about the AI economy. The main economic impact of AI is not about jobs or, at least, is about much more than just jobs. The main economic impact of these technologies will be how good they are. If the recent advances continue, AI has the potential to improve the quality of our products and our standard of living. If AI helps us diagnose medical problems better, improves our highway safety, gives us back hours of our day that were spent driving in traffic, or even just improves the quality of our selfies, these are direct consumer benefits. These raise our real incomes and the economic studies valuing the improvements from quality and from new products tend to show their value is often extremely high.
Being interested in artificial intelligence, when I ran across this article in The Atlantic I was hoping to find something interesting. The article focuses on Judea Pearl, an AI researcher who pioneered Bayesian networks for machine learning. Pearl is disappointed that most AI research nowadays is centered on his previous bailiwick of machine learning (what he calls fancy curve fitting) rather than on his new interest, causal reasoning models.
This is all well and good, and somewhat interesting. However, near the end of the article, he and the interviewer talk about free will and have the following exchange about evil.
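Pearl’s distinction between curve fitting and causal reasoning can be made concrete with his classic sprinkler example. The sketch below is purely illustrative, with made-up probabilities: *observing* the sprinkler on is evidence about whether it rained, but *intervening* to switch it on (Pearl’s do-operator) severs the rain-to-sprinkler link, so rain keeps its prior probability and we learn what watering actually causes.

```python
# Toy Bayesian network (Pearl's sprinkler example, probabilities invented
# for illustration): Rain -> Sprinkler, Rain -> Wet, Sprinkler -> Wet.
P_rain = 0.2
P_sprinkler_given_rain = {True: 0.01, False: 0.40}   # sprinkler rarely runs in rain
P_wet = {(True, True): 0.99, (True, False): 0.90,    # P(wet grass), keyed by
         (False, True): 0.90, (False, False): 0.01}  # (rain, sprinkler)

def joint(rain, sprinkler, wet):
    """Joint probability, factored along the network's edges."""
    p = P_rain if rain else 1 - P_rain
    ps = P_sprinkler_given_rain[rain]
    p *= ps if sprinkler else 1 - ps
    pw = P_wet[(rain, sprinkler)]
    return p * (pw if wet else 1 - pw)

def p_rain_given_sprinkler(sprinkler):
    """Observation: P(rain | sprinkler) by enumerating the joint."""
    num = sum(joint(True, sprinkler, w) for w in (True, False))
    den = num + sum(joint(False, sprinkler, w) for w in (True, False))
    return num / den

def p_wet_do_sprinkler_on():
    """Intervention do(sprinkler=on): the Rain -> Sprinkler edge is cut,
    so rain retains its prior probability."""
    return sum((P_rain if r else 1 - P_rain) * P_wet[(r, True)]
               for r in (True, False))

print(p_rain_given_sprinkler(True))  # observing the sprinkler on lowers P(rain)
print(p_wet_do_sprinkler_on())
```

Both quantities come from the same three probability tables; what differs is the question asked of them, which is roughly Pearl’s complaint about “fancy curve fitting”: association alone cannot distinguish the two.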
As I’ve blogged about at length in this space, the US economy won’t see sustained growth unless we can boost productivity. And there are a few different theories out there for why productivity growth has been so sluggish since the mid-2000s. Maybe ideas are becoming harder to find, maybe productivity has increased and we aren’t measuring it correctly, or maybe productivity growth is here but it’s just not evenly distributed yet.
If that last theory is correct, and there’s some reason to think it is (per a Commerce Department study, the digital sector has grown at an average annual rate of 5.6% over the last decade, compared to 1.5% overall), then the relevant question for policymakers is how to get these innovations to spread throughout the rest of the economy. That’s where the new McKinsey report “Notes from the AI Frontier” comes in. “Artificial intelligence (AI) stands out as a transformational technology of our digital age,” they write, and after studying 400 use cases across 19 industries, they estimate AI can “potentially enable the creation of between $3.5 trillion and $5.8 trillion in value annually” — if its use is broadly adopted.
It’s been a bad stretch for techno-optimists, with new doubts being raised about our social media giants and the progress of autonomous driving. And of course there isn’t much sign yet in the broad economic data that we are on the verge of a new productivity and growth boom.
But researchers Erik Brynjolfsson, Daniel Rock, and Chad Syverson offer some good news in “Unpacking the AI-Productivity Paradox,” an article in the MIT Sloan Management Review. They think there’s good reason to believe artificial intelligence could be an important general-purpose technology like electricity and the internal combustion engine. Moreover, they find no “inherent inconsistency” between that optimism and the current statistical sluggishness, which some have labeled the “Great Stagnation.” In fact, BRS think they have a pretty good explanation for it:
The Luddites and technophobes have a point. Machines do displace workers, and always have: from the cotton gin, machine tools, and punch cards to combine harvesters, industrial robots, and business software. And it is this “displacement effect” that leads to scary forecasts about AI and robots causing mass technological unemployment and underemployment.
But MIT’s Daron Acemoglu and Boston University’s Pascual Restrepo argue in a rich new paper, “Artificial Intelligence, Automation and Work,” that there is far more to the story. For starters, automation may allow tasks to be performed more cheaply, increasing demand for them. The introduction of ATMs was followed by more jobs for tellers because it reduced the costs of banking, and banks opened more branches. Or the productivity effect could be broader: Agricultural mechanization lowered food prices and created more demand for non-agricultural goods and the workers producing them.
Most of medicine is about information, not tools. Consider, for example, that getting the diagnosis right is the single most critical element in the vast majority of medical situations, which means that expertise, not toys, is the critical piece. And enormous broadband capacity is making it possible to eliminate the gap between patient and expert […]
Elite: Dangerous is a space simulation game in which players pilot ships to mine and trade, explore the galaxy, police smugglers and pirates, or become smugglers and pirates. It includes plenty of NPCs (non-player characters) operated by artificial intelligence to compete with players. From Julian Benson at Kotaku UK:
Unless you’re a gamer, you probably have not heard of Horizon Zero Dawn. Guerrilla Games’ upcoming title for the PlayStation 4 console, the game is set a thousand years in the future and proceeds from a fascinating premise. Typically, science fiction involving “the rise of the machines” — domination of humanity by independent AI (use whatever […]
In our recent sit-down for Uncommon Knowledge, I asked Peter Thiel — with, you’ll note, an assist from Ricochet’s own John Walker — about the prospects for artificial intelligence. What say you, Ricochet? Is Thiel right about the (perhaps inherent) limitations?