“Computer user” defines the limits of my expertise. I can’t describe them with the fluency of @hankrhody. I can’t build precision electronics like @SkipSul. I can’t program them the way @judgemental or @arahant can. But people like me had an important part to play in the microcomputer revolution: We’re the suckers who paid for it, usually cheerfully. I flipped through a few quarter-century-old computer magazines, noticing just how wildly expensive everything was in 1994–’97, for much less performance and far fewer capabilities than today’s computers. Still, to a non-computer specialist like me, the mid-Nineties is a world that’s almost two-thirds a modern one. There were slick magazines advertising laptops and desktop machines with color monitors. Accessories like printers and modems plugged right in. The software was by then largely standardized on MS-DOS/Windows 3.1. It was already assumed that you’d want a modem for online use, although it would be for contact with bulletin board systems over plain old telephone lines, not quite the World Wide Web yet. 1994 or so, in other words, is a primitive but recognizable world to a computer user of today.
Recently I acquired a copy of Byte Magazine from August 1982. This is a lucky find because it’s from a brief, in-between period in the history of personal computers. 1982 is most of the way back to the crudely printed newsletters and bulletins of the geeky computer clubs of the Seventies, like the one in northern California that spawned Apple. This issue of Byte runs to 512 pages (!), an amount of advertising that demanded filling in with a whole bunch of dry-as-sawdust technical articles about object-oriented programming and the defining characteristics of sprites on mapped x-y coordinates. That was Byte’s readership.
There’s very little here yet about what actual end users might do with these machines. Almost every one of those 512 pages of articles and ads would be incomprehensible to anyone who innocently stumbled in, hoping to find out something about using computers. In the early to mid-Eighties, the only true ease of use was found with toy computers, the Sinclair, Commodore, Atari and ColecoVision ones you could buy for $99–$199 and hook up to your living room TV. They didn’t do much that was useful. Even the games were lame.
There are applications for sale in Byte, plenty of them, selling at jaw-droppingly high prices by today’s standards, but they are either programming fragments that you have to stitch together yourself, or they’re simple turnkey packages dedicated to one purpose, like printing dry cleaning tags. Like the microcomputer newsletters of the Seventies, most of these voluminous ads are black and white, crudely hand-drawn, with a variety of cheap typefaces that would do justice to a 1950s church bulletin. Apple, IBM, and Microsoft are among the few advertisers who’d still be widely recognized today, and they have color ads (still an expensive rarity in computer mags in 1982). These ads are surprisingly ordinary-looking, not that different from nearby pages for Ashton-Tate’s dBASE.
I attended the second Applefest in Boston, May 14-16, 1982. Some friends of mine worked for a new magazine, Softalk, so I had a floor pass. The two Steves were still doing their buddy act at the conference, but almost everyone at the show skipped the Friday night opening in favor of the premiere of “Conan the Barbarian”. It was quite a weekend. Across the street from the Hynes Center, a giant Jolly Roger fluttered in the wind, marking the Pirate’s Convention.
At Applefest, voice I/O system cards and magnetic storage media were all the rage that spring. 1982 was a peculiar half-and-half era, feminism-wise. The term “sexism” had already been in use for a dozen or so years. Women were already writing software for micros and running start-up companies. Yet even in liberal Boston, an Eighties computer show was also full of “booth babes”, like the young women who pose at auto shows. One group of models wore tight t-shirts that proclaimed “We’ve Got the Best Twin Floppies in Town!”. Undeniably eye-catching, but rather crass. Another, more subtle approach worked better with this crowd: a booth of nice but normal-looking women giving away shirts that merely promised “No Bad Memories”–a romantic ideal that both sides can agree on.
There’s an amazing variety of vendors of products that few people in today’s world have ever had to buy. In ’82, regardless of who you bought from, you probably had a green-and-black or orange-and-black monitor and not much to do with it. You couldn’t just plug a computer into a printer. Usually, you needed an add-on circuit card, configured via tiny rocker switches to match your specific computer and your printer, and getting either end set right could be almost impossible. Speaking of printing, a far-from-exotic business necessity: if you didn’t want your expensive machine to come to a stuttering halt while it printed things out, you needed a print buffer, a costly block of outboard memory that accepted full files from the computer and doled them out to the printer a little bit at a time.
But then, the outlay didn’t seem like all that big a deal when your printer already cost you $1300, and your computer $3000. That $4300 starter system would be about $11,292 in today’s money. To add insult to financial injury, the computers you bought for that kind of money were no great shakes, and that wouldn’t even have included the main software you’d want to make the thing minimally useful. For example, the first really successful word-processing software for microcomputers, WordStar, sold at a hefty $400; say a thousand bucks in 2019.
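For anyone who wants to redo that inflation arithmetic, here’s a minimal sketch. The multiplier is an assumption backed out of the $4300-to-$11,292 conversion above, roughly the CPI change from 1982 to 2019, not an official figure:

```python
# Back-of-the-envelope 1982 -> 2019 price conversion.
# The multiplier is derived from the article's own $4300 -> $11,292
# example (approximately 2.63x), not from an official CPI table.
CPI_MULTIPLIER_1982_TO_2019 = 11292 / 4300  # ~2.63

def in_2019_dollars(price_1982: float) -> float:
    """Convert a 1982 sticker price to approximate 2019 dollars."""
    return price_1982 * CPI_MULTIPLIER_1982_TO_2019

print(round(in_2019_dollars(4300)))  # the $4300 starter system -> 11292
print(round(in_2019_dollars(400)))   # WordStar at $400 -> about 1050
```

By the same yardstick, the $1795 Kaypro or Osborne package mentioned below lands somewhere north of $4700 in 2019 dollars.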
Another approach to personal computing was briefly popular. The Kaypro and Osborne computers were similar packages–a (damn heavy!) “portable” computer with a built-in monochrome monitor, two floppy disk drives, and—the dealmaker!—a library of name-brand business software guaranteed to run. Both companies were too small and ill-managed to survive, but 35 years ago they had a great idea for making computing as non-threatening and worry-free as possible. For $1795, either company gave you a complete package that you didn’t have to be a computer hobbyist to use. My own office was first equipped with Kaypros, and the Kaypro became a great Hollywood favorite: Arthur C. Clarke and Peter Hyams used theirs to send each other overnight drafts of the script to “2010: The Year We Make Contact”. William F. Buckley liked his Kaypro so much he did all his writing on it almost to the end of his life.
The IBM AT series and Apple’s Macintosh would appear in 1984. That generation of the personal computer would grow over the years into being a powerful step up in usefulness, as well as ease of use. But it took time. Alfred Sloan, the longtime chairman of General Motors during its glory days, confessed in his memoirs that the unsung hero of early automobiling was the patient, long-suffering customer, who paid for the progress we all benefit from now. Personal computers were no different.
I owe you an explanation about baby Moses’s very own IBM computer and how it ended up in my hands. Here it is: Charlton Heston was one of the most influential trustees of the American Film Institute. Like a number of other industry big shots, such as Ray Stark and Jerry Weintraub, he donated filmmaking gear and then-current office equipment to AFI. One batch from the Hestons included a few family-owned personal computers, still very expensive at the time.
AFI had just received a massive grant from Apple, both in cash and in-kind contributions, and one requirement was AppleTalk wiring and an all-Apple AFI campus. That meant they couldn’t use donated IBM computers anymore, so they quietly asked a few people if they had use for them. I walked away with Fraser Heston’s 1983-vintage XT, a big heavy thing with two hard drives and two floppy disk drives.
Fraser had been pressed into service in 1956 to play his own father as a baby in Cecil B. DeMille’s “The Ten Commandments”. And that’s how come I have Moses’s computer in my storeroom.