Product Launch Failure
Let’s be honest: it’s only human to get a kick out of the spectacle of powerful, infinitely rich companies making a disastrous entry into a splashy new field, like Google’s recent epic fail with Gemini AI. These are rare, special cases of plunging right over the edge, involving an unusually choice degree of hubris and corporate humiliation.
Thirty years ago, it was Apple’s turn in the barrel when it introduced Newton, predecessor of today’s tablets and smartphones. At the time, Newton seemed almost miraculous. To be fair, new products often don’t succeed, even when they aren’t paradigm-shaking novelties. Most of the time there’s nothing unusual, let alone disgraceful, in that normal winnowing process. It doesn’t usually involve becoming an instant, media-wide, national laughingstock that spoils the launch of a Whole New Thing.
Back then, business people often carried Day Runners, a trendy brand of totable small looseleaf notebook with calendars, notes, and contact info. Newton let you carry all that, reduced to the size of a paperback book and easily kept in sync with your desktop computer. You could scribble notes to yourself, and—here comes the hangup—it could even read your handwriting. It was that one feature that led to ridicule, because Apple released it too soon, after insufficient testing. Product designer Jony Ive joined Apple to revise the early Newton. A year later, handwriting recognition worked much better, but by then the PR damage had been done. After Steve Jobs’ return, he would rely on Ive for every major product launch thereafter.
Newton’s main problem was that it wasn’t networked. WiFi didn’t exist yet. When your 2024 smartphone does handwriting and voice recognition, it’s acting solely as a thin client; the work is done in the cloud and instantly returned to your screen. Newton, by contrast, had to do everything all by itself.
In retrospect, Apple’s big mistake was overpromising, with a pretentious ad campaign that implied a lot more AI than the little MessagePad could deliver. Newton’s basic idea wasn’t dumb. The proof was the late Nineties success of the Palm Pilot, a smaller, lighter, and cheaper Personal Digital Assistant.
My Newton, ready for an overseas trip in 1995.
Four years before the first Betamax arrived, more than a decade before Blockbuster stores would spread through our towns and cities, a U.S. company named Cartrivision released a domestically designed and built home video player. Unlike the handful of earlier attempts to sell home video recorders, this one had a specific marketing target and function: playing pre-recorded movies. It was sold through Sears, Roebuck & Company, with the actual manufacturing done by Packard Bell, a respected if second-tier electronics company. Every Sears store that sold it also offered a fifty-film rental library of legal, licensed Hollywood feature films. No previous home video machine offered any.
This pioneering effort did things a little differently. Since most people didn’t live within daily driving range of a Sears, those rentals weren’t charged by the day, but by the number of viewings before a customer brought the cartridge back. (Each video cartridge had a simple mechanical counter, like an odometer. There were also a few cartridges for outright sale, mostly of the same kind of drearily cheap content that would later fill direct-to-VHS tapes.) The need to bring rental tapes back and swap them for new ones meant customers returned to the store frequently, which Sears liked.
It wasn’t a crazy scheme, but it didn’t quite work. Aimed at the top end of the market, for most of its existence Cartrivision was available only as a built-in to “wideboy”-styled mahogany console TV sets for the living room. But the target buyers already had expensive color TVs and didn’t need another console set. It also meant the huge, heavy sets couldn’t easily be brought in for servicing, and few of the field technicians in the Sears service trucks had any training in fixing the moving parts of a videotape machine.
In contrast, when Sony came to America with Betamax, it too targeted a specific purpose: time shifting of broadcast programs. Just about any of the previous video recorders could have done that, but Sony was the first to make it the key to sales.
Often, the development period has been so long and expensive that the sunk cost fallacy takes over. That’s what happened to RCA’s incompatible, non-laser videodisc system; by the time it made it to market in 1981, RCA already knew it would flop, but after seven years of effort and $100 million, they couldn’t go back to their stockholders without making a valiant try. That four-year “try” wasted an additional $60 million.
In other cases, the product is stillborn, barely on the market at all. In 1951, it was CBS’s early form of color TV; a much later example is HD-DVD, the 2007 competitor of Blu-Ray. In 2011, Hewlett-Packard’s tablet computer lasted only 49 days before it was pulled. They all made it to the sales counter, but were almost immediately abandoned by companies that belatedly realized they weren’t going to win.
Most technological products become cheaper over time, but some have an intrinsic wall of high cost and limited demand that never fully goes away. Supersonic flight, Concorde style, never did enter a virtuous circle of becoming affordable. AT&T’s 1970 Picturephone, at a monthly rent of $100 ($775 in today’s money), couldn’t attract enough customers to make calling each other worthwhile. Color TVs, which started out three times as expensive as black and white, spent ten years as a niche for the rich; color didn’t hit its 1955 sales goals until 1965, when sets were only twice as expensive. But they doggedly hung around long enough to succeed, mostly because RCA, color’s chief developer, never gave up.
The failure of the Alto computer, invented a half century ago at Xerox’s Palo Alto Research Center and a decade ahead of its time, was more than a simple case of lazy or overcautious corporate timing. What made Alto so unique, so advanced was its windows-icons-and-mouse interface and its (relatively) high-definition bitmapped screen. At the time, that required so many expensive silicon storage chips that the product would have been impractically costly. Then, a key engineering breakthrough allowed the computer to save most of that money by continuously, imperceptibly swapping parts of that screen image in and out of a smaller memory.
With the Alto, Xerox had a five-year head start on a workplace-capable, networked, windowing computer with a laser printer. But this is critical: they knew, they had to know it would be a rapidly depreciating asset. It was literally a case of use it or lose it. They didn’t use it in time, so, in one of the great what-ifs of corporate history, they lost it.
Apple’s nearly forgotten 1983 Lisa computer series preceded the Macintosh by a year, with much of the same system architecture—windows, icons, a mouse. The problem was a familiar one for groundbreaking new technology: it was way more expensive than planned. A well-equipped Lisa workstation cost about half as much as a car. From that point forward, the brand that once called itself “The computer for the rest of us” was twice as expensive as the computer for the rest of “them”—i.e., MS-DOS, later in the form of Microsoft Windows (with a capital “W”).
Sometimes it’s a problem of timing. The project is rushed to the market before it’s ready, like first-generation fission power stations, the space shuttle, Newton, or VR headsets in the Nineties. Or it’s too late to reach the market, like Polaroid’s Polavision, instant (silent) home movies that would have rocked the Super 8 market in 1966. But when they finally appeared ten years later, they were overshadowed by truly “instant home movies”—video, with sound, on reusable tape. Sony’s Walkman cassette players were world-beating hits, but their much-ballyhooed 1992 follow-up, the MiniDisc, had a unique Sony recording format that didn’t have time to catch on before direct online digital delivery and storage (MP3 and Apple iTunes) took over from physical media.
At the other end of a particular technology’s birth, widespread adoption, and economic lifetime, there are often a few ideas, once novel, some quirky, that have had unpredictable staying power beyond the grave of obsolescence. Super 8 movies, vinyl records, vacuum tube music amplifiers, instant film cameras; hobbyists gave them ghostly afterlives, decades after their abandonment by the mainstream market.
Some costly corporate misfortunes are outright blunders, combined with, in many cases, infighting and ego. In the case of Google Gemini, it was wokeness, combined with willful blindness. To be fair, plenty of other bad product launches were normal misjudgments, combined with unexpected technical snags and plain bad luck.
Apple’s very recent cancellation of its decade-long, multi-billion-dollar pursuit of a practical self-driving car is a rare, apparently laudable case of a megacorporation that prudently pulled back from the brink before taking that last, bet-the-entire-company risk of manufacturing and selling a vastly expensive new product. Even the world’s most successful firms are wary of the tricky transition between a promising idea that works in a lab and a commercially viable product for sale to the public.
One suggestion, though: if you’ve got a risky new product, don’t give it a too-easily-mockable name, like Edsel, Ishtar, or Gigli.
You don’t always know what, at the outset, will fail; on the other hand, you don’t always know when an established institution that seems to have been humming along practically forever is about to enter a failure spiral.
I recently opened a used book (Man Without a Face, 1997). After a lifetime of disillusion, Markus Wolf, Communist East Germany’s coldly brilliant spymaster, spells out fifty different ways that he helped the DDR state cynically betray the cause of socialism. Nonetheless, he couldn’t resist ending his postwar tale with a defiant dedication to Marx:
A demain, Karl! Until tomorrow.
Now, if that’s not a chilly ending, I don’t know what is.
I believe that since the days of Lady Ada Lovelace and Admiral Grace Hopper, we have rarely seen a woman write about computers so compellingly as She.
From your mouth to God’s ear, Stad, thanks!
BTW…wouldn’t “God’s Ear” be a good nickname for an NSA intercept station?
I started my engineer/programmer career in the mid-’60s and lived through two (at least) “That will never happen!” moments.
The first was that even when PC-based word processors were pretty useful, the general opinion was that executives would never type their own documents. A secretary was too much of a power symbol.
The second was when our company was asked to develop a machine to count and record the paperback book returns by retailers to the wholesalers. Since it was too hard to get the paperback itself back into circulation, the front covers were removed, and the job was to read and account for these. The covers were to be identified by image recognition, a pretty complicated task at that point.
Our Engineers in the meeting (marketing hated us in meetings) said – “the solution is easier than that, just add a barcode to the cover”
We were told in no uncertain terms that the American public would never accept a barcode on a product!
A seldom-mentioned part of the reason for this was that as women became eligible for better jobs, the quality of the secretarial applicant pool dropped precipitously. The women who were executive secretaries when you started are now executives; with what was left of the applicant pool, and with word processors meaning you could screw up and then fix it along the way, it became easier to do it yourself.
I don’t think they would have, until it was law and it was every single product. Remember, people hated that when it started. Especially for books and magazines.
I was party to a discussion where several of the participants predicted that pilots would reject fly-by-wire in favor of control cables running the length of the airplane, the way it had always been done. They insisted that having pilots’ inputs transmitted electronically to the actuators just wouldn’t be accepted.
Fly-by-wire is pretty much standard now.
This is an absolutely wonderful post. I love the history of technological innovation, and why/when it works and does not. It is very close to home for my day job/obsession.
Interestingly, when I started, engineers “engineered” and the actual programmers were women. I was an “Engineering Aide” (the person who the technicians bossed around) at the start, and my career was given a huge boost when one of the female programmers helped me work on a program that my boss said would never be used (until I finished it, anyway).
Emo Philips (a comedian from the mid-’80s to ’90s) had a joke where he claimed the CD ruined his love life… He’d put on a 45 record, and he could go for as long as the record… taking breathers to change the record each time the song ended… The CD, however, would just go on and on…
There were both. VHS HQ was a “cleaner” standard VHS player that used the same RCA connectors.
S-VHS (IIRC) used a three-plug cable with a special connector on the player side. (It’s been 20-plus years since I had mine, hard to remember exactly.)
I had an S-VHS player for a couple of years; I don’t recall special tapes or a compatibility problem.
Edit: in this and the previous comment I believe I confused “S-VHS” and “S-video on a standard VHS player”.
I had the original Palm Pilot for a few years; I got it at a Gartner Conference in October 1996. They handed them out at check-in with the program schedule for the week pre-loaded, then at the end of the week you could turn it back in, or buy it for a reduced price. My boss let me buy it and expense it to the company.
A couple of years later I replaced it with a Palm III, which was a nice upgrade.
Hi. My name is Rodin, and I had a Palm Pilot. (I think you can start seeing a pattern, here.)
The Palm Treo was a smartphone before smartphones were cool.
In the Nineties, Pen Computing magazine was (IMHO) the best of the publications that covered the field. It had to divide its attention between Newton, Palm, Pocket PC, Symbian, Casio, you name it.
Newton was licensed from the start; Sharp had their own, and Motorola made one with limited cell phone abilities for the industrial market.
IIRC, Magic Cap and General Magic were a sideways offshoot of Apple’s Newton, which originally was going to be a thick tablet at a premium price. A subsequent project director reoriented it towards something like a transistor radio form factor.
For the true S-VHS experience, you used the S-Video connection, which had the chroma and luma signals separated. The connectors were similar to PS/2 mouse/keyboard. Then you had the audio connections, RCA type, which would be stereo by that point. Color-coded red and white.
If you still used the single composite video connection – an RCA type, usually color-coded yellow – on an S-VHS machine, you lost at least some of the advantages of S-VHS, which also recorded the signals in a different format on the tape. Which is why regular VHS tapes and S-VHS tapes were not interchangeable. At least not completely. You could use regular VHS tapes in an S-VHS machine, but you didn’t get the true S-VHS quality. (The electronics etc. of an S-VHS machine might still work better than a standard VHS, but that’s not why you wanted S-VHS.)
The tapes were physically identical looking, which you’d expect since you could use VHS tapes in an S-VHS machine. They just weren’t recorded or played at S-VHS quality. I don’t remember if the S-VHS tapes had a special notch or anything so that a regular VHS machine wouldn’t accept them, but for sure if you got an S-VHS-recorded tape into a regular VHS machine, it wouldn’t play right.
Later in the VHS game there was some kind of near- or pseudo-S-VHS that used regular tapes; I don’t recall exactly what it was called. But I never fell for it.
I worked with an engineering guy years ago who designed and built video systems, among other things. So I picked up quite a bit from him too. One of the reasons S-VHS and the S-Video connection were superior was that a lot of the video processing circuitry – especially ICs – for composite video wound up separating the chroma and luma signals internally, processing them, then putting them back together again for the output. And then they went to the next stage, which separated them again… adding loss/distortion each time.
S-Video connections were an early favorite with large-screen and projection TVs, where the loss/distortion was more obvious. People who wanted the best possible result from their Laserdisc player also used S-Video.
3-part Component Video came next, which had 3 video connections plus stereo sound. (Technically, S-Video was 2-part Component Video but I never saw it referred to that way in practice.) That was pretty common on early DVD players and even allowed the first levels of HD. (I haven’t checked recently, but I think Component Video supported up to 1080i.) And that was all still analog. It was also pretty common in higher-end computer graphics to use display technology where the Red, Green, and Blue signals for RGB were on 3 separate cable connections. Although that was different from 3-part Component Video for TV etc.
Some combo Laserdisc/DVD players also had Component Video outputs, but from what I’ve picked up, those only worked when playing DVD, not LD. I’ve never had a combo player myself, only straight DVD or straight LD.
Then came DVI, and HDMI…
Back in the day when I recorded a lot of movies and stuff from satellite TV, I always got receivers that included an S-Video output. And the recording computers I built had S-Video inputs. Both because it allowed for higher-quality recording, and because the S-Video connector didn’t carry any kind of copy-protection signal that might block or mess up a recording.
It’s one of those awkward situations that doesn’t lend itself to rah-rah Sisterhood movies on the Lifetime Network.
I was just reading Another Life, Michael Korda’s memoirs of 40 years in the book biz. He said that the willingness to take and credit returns began as a Depression-era desperation ploy to keep the booksellers afloat, but hung on after the war to be an albatross around the publishers’ necks.
Regular ol’ video was “composite”. When it was delivered over Red, Green and Blue separate lines it was “component” video. S-video was a compromise: the colors were carried combined, but they were carried separately from the black and white.
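To make the chroma/luma split described in these comments concrete, here is a minimal Python sketch of the idea, using the standard BT.601-style weights. The function names are my own, and it deliberately ignores the bandwidth limiting and filtering where composite video actually loses quality; it only shows how the “black and white” part and the color-difference signals relate to RGB.

    # Minimal sketch of the luma/chroma separation discussed above (BT.601-style
    # weights). Illustrative only; real composite/S-Video signals are analog, and
    # the quality loss comes from filtering, not from this arithmetic.

    def rgb_to_luma_chroma(r, g, b):
        """Split an RGB pixel (each 0..1) into luma Y and two chroma signals U, V."""
        y = 0.299 * r + 0.587 * g + 0.114 * b   # the "black and white" part
        u = 0.492 * (b - y)                     # blue color-difference signal
        v = 0.877 * (r - y)                     # red color-difference signal
        return y, u, v

    def luma_chroma_to_rgb(y, u, v):
        """Recombine Y/U/V back into RGB, as a composite-video stage would."""
        r = y + v / 0.877
        b = y + u / 0.492
        g = (y - 0.299 * r - 0.114 * b) / 0.587
        return r, g, b

    # Round-trip a saturated orange pixel: the math itself is lossless.
    print(luma_chroma_to_rgb(*rgb_to_luma_chroma(0.9, 0.4, 0.1)))

The point of S-Video, as described above, was simply to keep the luma and the chroma on separate wires, so that no downstream stage had to separate and recombine them yet again.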
I got a Newton when the price dropped preceding the rollout of a newer version. I could draw and label pictures (such as for system architecture diagrams) on the fly and get them onto my computer. It replaced the notebooks that I was always carrying around.
My freshman year in college, the communication department bought a backpack Sony VTR and camera setup, which I remember as costing almost $30,000.
VTR meaning it was the open-reel type?
That would have been a very good setup. Probably very similar to what TV news stations would have been using at the time.
If it’s the Portapack I know (Sony AV series black and white), I think you’re talking more like $3,000. Then again, I don’t know what era you were in college.
An Ikegami ENG camera (electronic news gathering) and a portable 3/4 inch recorder, maybe.
I carried a Newton when I visited Hong Kong in ’95 and then Taipei in 1996. Every time I used it on the subway, I attracted a small crowd. I was new in Asia, and was amused and surprised that people who are notably shy and polite were utterly uninhibited about getting in your face (literally). When I told people in Taiwan that the chips were made there, their reaction was a mixture of pride and disbelief. “Really? Us??”
A sidenote or two about competition with Polaroid, a name that was synonymous with instant photos. There was a weak competitor in the Sixties, Chrislin. It used a different chemical process that didn’t infringe Polaroid’s patents, so it was ignored. The Long Island-based Camera Company of America kept it going from 1965 to ’69.
Then in the mid-to-late Seventies, Eastman Kodak released its own version of instant photography. It was slick, well designed, and nationally distributed, a much more serious competitor. And it was one of the all-time bad decisions in the history of American business. They infringed Polaroid’s patents. Kodak not only had to settle a ruinously expensive lawsuit, plus damages, but also had to effectively refund every single person who ever purchased their camera. It all but destroyed Kodak financially.
For those who love really obscure media technology (and let’s face it, this readership has self-selected to an extraordinary degree), there was something in radio pre-WWII called Apex, wideband AM on a shortwave frequency. It had some of the quality advantages of FM. Like radio facsimile, it faded quickly after the war. But at one time, circa 1940, Apex was going to be the future of radio.
Did they think it would be better than FM? Interesting.
No, VTR means Video Tape Recorder… It’s a commercial machine much larger than a consumer-grade VCR, but with much better recording capabilities. The one I was looking at belonged to a credit union, and they used it to produce their TV ads in-house… Something like this: