Product Launch Failure
Let’s be honest: it’s only human to get a kick out of the spectacle of powerful, infinitely rich companies making a disastrous entry into a splashy new field, like Google’s recent epic fail with Gemini AI. These are rare, special cases of plunging right over the edge, involving an unusually choice degree of hubris and corporate humiliation.
Thirty years ago, it was Apple’s turn in the barrel when it introduced Newton, the predecessor of today’s tablets and smartphones. At the time, Newton seemed almost miraculous. To be fair, new products often don’t succeed, even when they aren’t paradigm-shaking novelties. Most of the time there’s nothing unusual, let alone disgraceful, in that normal winnowing process. It doesn’t usually involve becoming an instant, media-wide, national laughingstock that spoils the launch of a Whole New Thing.
Back then, business people often carried Day Runners, a trendy brand of small, portable loose-leaf organizer with calendars, notes, and contact info. Newton let you carry all that, reduced to the size of a paperback book and easily kept in sync with your desk computer. You could scribble notes to yourself, and—here comes the hangup—it could even read your handwriting. It was that one feature that led to ridicule, because Apple released it too soon, after insufficient testing. Product designer Jony Ive joined Apple to revise the early Newton. A year later, handwriting recognition worked much better, but by then the PR damage had been done. After Steve Jobs returned, he kept Ive on for every major product launch thereafter.
Newton’s main problem was that it wasn’t networked. WiFi didn’t exist yet. When your 2024 smartphone does handwriting and voice recognition, it’s acting largely as a thin client; the work is done in the cloud and instantly returned to your screen. Newton, by contrast, had to do everything all by itself.
In retrospect, Apple’s big mistake was overpromising, with a pretentious ad campaign that implied a lot more AI than the little MessagePad could deliver. Newton’s basic idea wasn’t dumb. The proof was the late Nineties success of the Palm Pilot, a smaller, lighter, and cheaper Personal Digital Assistant.
My Newton, ready for an overseas trip in 1995.
Four years before the first Betamax arrived, more than a decade before Blockbuster stores would spread through our towns and cities, a U.S. company named Cartrivision released a domestically designed and built home video player. Unlike the handful of earlier attempts to sell home video recorders, this one had a specific marketing target and function: playing pre-recorded movies. It was sold through Sears, Roebuck & Company, with the actual manufacturing done by Packard Bell, a respected if second-tier electronics company. Every Sears store that sold it also offered a fifty-film rental library of legal, licensed Hollywood feature films. No previous home video machine offered any.
This pioneering effort did things a little differently. Since most people didn’t live within daily driving range of a Sears, those rentals weren’t charged by the day, but by the number of viewings before a customer brought the cartridge back. (Each video cartridge had a simple mechanical counter, like an odometer. There were also a few cartridges for outright sale, mostly of the same kind of drearily cheap content that would later fill direct-to-VHS tapes.) The need to return rental cartridges would bring customers back into the store frequently, which Sears liked.
It wasn’t a crazy scheme, but it didn’t quite work. Aimed at the top end of the market, for most of its existence Cartrivision was available only built into “wideboy”-styled mahogany console TV sets for the living room. But the target buyers already had expensive color TVs and didn’t need another console set. It also meant the huge, heavy sets wouldn’t be brought in for servicing, and few of the field technicians in the Sears service trucks had any training in fixing the moving parts of a videotape machine.
In contrast, when Sony came to America with Betamax, it too targeted a specific purpose: time shifting of broadcast programs. Just about any of the previous video recorders could have done that, but Sony was the first to make it the key to sales.
Often, the development period has been so long and expensive that the sunk cost fallacy takes over. That’s what happened to RCA’s incompatible, non-laser videodisc system; by the time it made it to market in 1981 they already knew it would flop, but after seven years of effort and $100 million, they couldn’t go back to their stockholders without making a valiant try. That four-year “try” wasted an additional $60 million.
In other cases, the product is stillborn, barely on the market at all. In 1951, it was CBS’s early form of color TV; a much later example is HD-DVD, the 2007 competitor of Blu-Ray. In 2011, Hewlett-Packard’s TouchPad tablet lasted only 49 days before it was pulled. They all made it to the sales counter, but were almost immediately abandoned by companies that belatedly realized they weren’t going to win.
Most technological products become cheaper over time, but some have an intrinsic wall of high cost and limited demand that never fully goes away. Supersonic flight, Concorde style, never did enter a virtuous circle of becoming affordable. AT&T’s 1970 Picturephone, at a monthly rent of $100 ($775 in today’s money), couldn’t attract enough customers to make calling each other worthwhile. Color TVs, which started out three times as expensive as black and white, spent ten years as a niche product for the rich and didn’t hit their 1955 sales goals until 1965, by which point they were only twice as expensive. But they doggedly hung around long enough to succeed, mostly because RCA, color’s chief developer, never gave up.
The failure of the Alto computer, invented a half century ago at Xerox’s Palo Alto Research Center and a decade ahead of its time, was more than a simple case of lazy or overcautious corporate timing. What made Alto so unique, so advanced, was its windows-icons-and-mouse interface and its (relatively) high-definition bitmapped screen. At the time, that required so many expensive silicon storage chips that the product would have been impractically costly. Then a key engineering breakthrough allowed the computer to save most of that money by continuously, imperceptibly swapping parts of that screen image in and out of a smaller memory.
With the Alto, Xerox had a five-year head start on a workplace-capable, networked, windowing computer with a laser printer. But this is critical: they knew, or had to know, that it would be a rapidly depreciating asset. It was literally a case of use it or lose it. They didn’t use it in time, so, in one of the great what-ifs of corporate history, they lost it.
Apple’s nearly forgotten 1983 Lisa computer series preceded the Macintosh by a year, with much of the same system architecture—windows, icons, a mouse. The problem was a familiar one for groundbreaking new technology: it was way more expensive than planned. A well-equipped Lisa workstation cost about half as much as a car. From that point forward, the brand that once called itself “The computer for the rest of us” was twice as expensive as the computer for the rest of “them”—i.e., MS-DOS, later in the form of Microsoft Windows (with a capital “W”).
Sometimes it’s a problem of timing. The project is rushed to the market before it’s ready, like first-generation fission power stations, the space shuttle, Newton, or VR headsets in the Nineties. Or it’s too late to reach the market, like Polaroid’s Polavision, instant (silent) home movies that would have rocked the Super 8 market in 1966. But when they finally appeared ten years later, they were overshadowed by truly “instant home movies”—video, with sound, on reusable tape. Sony’s Walkman cassette players were world-beating hits, but their much-ballyhooed 1992 follow-up, the MiniDisc, had a unique Sony recording format that didn’t have time to catch on before direct online digital delivery and storage (MP3 and Apple iTunes) took over from physical media.
At the other end of a technology’s lifetime, past its birth, widespread adoption, and economic heyday, there are often a few once-novel, sometimes quirky ideas that show unpredictable staying power beyond the grave of obsolescence. Super 8 movies, vinyl records, vacuum-tube music amplifiers, instant-film cameras: hobbyists gave them ghostly afterlives, decades after their abandonment by the mainstream market.
Some costly corporate misfortunes are outright blunders, combined in many cases with infighting and ego. In the case of Google Gemini, it was wokeness, combined with willful blindness. To be fair, plenty of other bad product launches were normal misjudgments, combined with unexpected technical snags and plain bad luck.
Apple’s very recent cancellation of its decade-long, multi-billion-dollar pursuit of a practical self-driving car is a rare, apparently laudable case of a megacorporation that prudently pulled back from the brink before taking that last, bet-the-entire-company risk of manufacturing and selling a vastly expensive new product. Even the world’s most successful firms are wary of the tricky transition between a promising idea that works in a lab and a commercially viable product for sale to the public.
One suggestion, though: if you’ve got a risky new product, don’t give it a too-easily-mockable name, like Edsel, Ishtar, or Gigli.
You don’t always know what, at the outset, will fail; on the other hand, you don’t always know when an established institution that seems to have been humming along practically forever is about to enter a failure spiral.
I recently opened a used book (Man Without a Face, 1997). After a lifetime of disillusion, Markus Wolf, Communist East Germany’s coldly brilliant spymaster, spells out fifty different ways that he helped the DDR state cynically betray the cause of socialism. Nonetheless, he couldn’t resist ending his postwar tale with a defiant dedication to Marx:
À demain, Karl! Until tomorrow.
Now, if that’s not a chilly ending, I don’t know what is.
The Newton was a big hit with certain vertical markets, especially physicians, who could easily carry one in their lab coats. Maybe it was the lab-coat-wearing set as a whole who loved it.
As for its death, my understanding, both from talking to someone who worked on the Newton project and from reading the Walter Isaacson biography of Steve Jobs, is that it wasn’t really either the tech or the public reaction to the Newton that killed it. As Gary points out, the last iteration of the hardware (the 2000?) and NewtonOS before it was killed was pretty good, with the handwriting recognition having made great strides. What killed it was a who: Steve Jobs. A) It was created and developed when he was not with Apple, and therefore was Not His Baby, and B) it used a stylus, which at the time he hated the idea of. (He famously said “you have five styluses on each hand” when asked why no stylus for the iPhone/iPad.) So no matter how good the Newton was, it was doomed.
And now we have Apple Pencil and Scribble. Almost all my comments are written in cursive and translated into print (to the best of its ability to translate my cursive). Sometimes I use the pencil to tap on the keyboard because its fine point is more accurate than my finger on the small keyboard keys.
I take it as another sign of Jobs’ antipathy to the Newton that it took Apple forever to adapt/port NewtonOS’s handwriting recognition to MacOS/iOS/iPadOS. It didn’t show up until Inkwell, many years after the fact.
Since I write cursive* like a doctor with a head injury, I’ve never really been able to get the hang of doing a lot of handwriting on my iPads.
*And, to be honest, block letters too.
Thanks for your comments, Archie! I agree that not-invented-here syndrome is largely responsible for the end of Newton. IIRC, in the Isaacson book Jobs comes across as, if anything, a trifle less anti-Newton than in the comments he made back in the Nineties. By ten years later, his attitude was less hostile than dismissive, as in, “It was too minor a product and I had far more pressing problems with Apple that had to be dealt with first.”
It’s forgotten now, but there were a handful of company-owned Newton stores before there were Apple stores. Since Macintosh ads and packaging had a distinctive set of colors and typefaces, Newton’s were different.
One issue that I had with it was the relatively high price of software, due to the small size of the Newton market. This was something of a vicious circle.
In some ways, the way Jobs acted when he came back to Apple reminds me of Lorne Michaels’ return to SNL after a gap of five years when the show was run by Dick Ebersol, the NBC executive who was Michaels’ partner in creating the show. For many years, clip shows and compilations ignored the Dick Years, although cynical observers noted that Lorne made a partial exception for Eddie Murphy, because he was just too big an asset to exploit.
Do you think it was “not invented here” or more “I personally didn’t have anything to do with its invention?” I always wanted a Newton, but at the time I couldn’t afford one. By the time I could, they were gone.
“I personally didn’t have anything to do with its invention?” seems like a subset of “not invented here.” It was “not invented here” according to Jobs because it wasn’t invented by Apple when Jobs was in charge.
It is, but there is a distinction between the two to be drawn. Usually “not invented here” is due to a more general corporate culture. Jobs’ objection was more specific because he personally disliked the Newton, regardless of what the rest of the company thought, and had the power to kill it. That’s what I’m talking about here.
K’s formulation, that it’s a subset of not-invented-here, is probably right. Though it must be said that Jobs didn’t kill off a number of other Apple products that were designed during the John Sculley-Gil Amelio years, like the PowerBook or QuickTime. The PowerBook was a big success, and not even Jobs was going to kill off a winner. Newton had a near-disastrous launch but managed to settle into a niche. The niche was small, though, and smartphones were already in the works. It’s unknown when, exactly, Jobs decided to go into the phone business.
Apple wasn’t Apple when Jobs wasn’t there. It’s a superset of “not invented here.”
Naw. Not a superset. The cases where “not invented here” involves a particular person must necessarily be smaller than – and “surrounded by” – those involving the whole business.
Nothing was invented by Apple when Steve Jobs wasn’t there. Because it wasn’t Apple.
No true Scotsman.
He wasn’t working there either?
One True Scotsman.
Well, until the day he became a True American.
And it makes me laugh to this day, 90 years later: “Race: Scotch”.
Race?? Hey!
In any case: Nowadays, “Scots” or “Scottish” are preferred. But in 1934, what the hell…I’m sure my grandfather’s generation took it with a grin!
Yeah, that really should be blood type.