The Infant Moses Owned an IBM Computer. Now It’s Mine


“Computer user” defines the limits of my expertise. I can’t describe computers with the fluency of @hankrhody. I can’t build precision electronics like @SkipSul. I can’t program the way @judgemental or @arahant can. But people like me had an important part to play in the microcomputer revolution: we’re the suckers who paid for it, usually cheerfully. I flipped through a few quarter-century-old computer magazines, noticing just how wildly expensive everything was in 1994-’97, for much less performance and far fewer capabilities than today’s computers offer. Still, to a non-specialist like me, the mid-Nineties is a world that’s almost two-thirds a modern one. There were slick magazines advertising laptops and desktop machines with color monitors. Accessories like printers and modems plugged right in. The software was by then largely standardized on MS-DOS/Windows 3.1. It was already assumed that you’d want a modem for online use, although it would be for dialing bulletin board systems over plain old telephone lines, not the World Wide Web quite yet. 1994 or so, in other words, is a primitive but recognizable world to a computer user of today.

Recently I acquired a copy of Byte Magazine from August 1982. This is a lucky find, because it’s from a brief in-between period in the history of personal computers. 1982 is most of the way back to the crudely printed newsletters and bulletins of the geeky computer clubs of the Seventies, like the one in northern California that spawned Apple. This issue of Byte runs to 512 pages (!), an amount of advertising that demanded filling in with a whole bunch of dry-as-sawdust technical articles about object-oriented programming and the defining characteristics of sprites on mapped x-y coordinates. That was Byte’s readership.

There’s very little here yet about what actual end users might do with these machines. Almost every article and ad page in those 512 would be incomprehensible to anyone who innocently stumbled in, hoping to find out something about using computers. In the early to mid-Eighties, the only true ease of use was found with toy computers, the Sinclair, Commodore, Atari, and ColecoVision ones you could buy for $99-$199 and hook up to your living room TV. They didn’t do much that was useful. Even the games were lame.

There are applications for sale in Byte, plenty of them, selling at jaw-droppingly high prices by today’s standards, but they are either programming fragments that you have to stitch together yourself, or simple turnkey packages dedicated to one purpose, like printing dry-cleaning tags. Like the microcomputer newsletters of the Seventies, most of these voluminous ads are black and white, crudely hand-drawn, with a variety of cheap typefaces that would do justice to a 1950s church bulletin. Apple, IBM, and Microsoft are among the few advertisers who’d still be widely recognized today, and they have color ads (still an expensive rarity in computer mags in 1982). These ads are surprisingly ordinary-looking, not that different from nearby pages for Ashton-Tate’s dBASE.

I attended the second Applefest in Boston, May 14-16, 1982. Some friends of mine worked for a new magazine, Softalk, so I had a floor pass. The two Steves were still doing their buddy act at the conference, but almost everyone at the show skipped the Friday night opening in favor of the premiere of “Conan the Barbarian”. It was quite a weekend. Across the street from the Hynes Center, a giant Jolly Roger fluttered in the wind, marking the Pirate’s Convention.

At Applefest, voice I/O system cards and magnetic storage media were all the rage that spring. 1982 was a peculiar half-and-half era, feminism-wise. The term “sexism” had already been in use for a dozen or so years. Women were already writing software for micros and running start-up companies. Yet even in liberal Boston, an Eighties computer show was also full of “booth babes”, like the young women who pose at auto shows. One group of models wore tight t-shirts that proclaimed “We’ve Got the Best Twin Floppies in Town!” Undeniably eye-catching, but rather crass. Another, more subtle approach worked better with this crowd: a booth of nice but normal-looking women giving away shirts that merely promised “No Bad Memories”–a romantic ideal that both sides could agree on.

There’s an amazing variety of vendors selling products that few people today have ever had to buy. In ’82, regardless of whom you bought from, you probably had a green-and-black or orange-and-black monitor and not much to do with it. You couldn’t just plug a computer into a printer. Usually you needed an add-on circuit card that had to be configured, via tiny rocker switches, to match your specific computer on one end and your specific printer on the other, and getting both ends right could be almost impossible. Speaking of printing, a far-from-exotic business necessity: if you didn’t want your expensive machine to come to a stuttering halt while it printed things out, you needed a print buffer, a costly block of outboard memory that accepted full files from the computer and doled them out to the printer a little bit at a time.
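That print-buffer arrangement, accept the whole file at full speed and then feed it out at the printer’s pace, is exactly what a modern producer/consumer queue does. A minimal sketch, with names, chunk size, and the test document all invented purely for illustration:

```python
import queue
import threading

def computer_sends(buf, document, chunk=64):
    """The computer dumps the whole file into the buffer at full speed."""
    for i in range(0, len(document), chunk):
        buf.put(document[i:i + chunk])
    buf.put(b"")  # an empty chunk signals end of document

def printer_drains(buf, printed):
    """The printer pulls one small chunk at a time, at its own pace."""
    while True:
        chunk = buf.get()
        if not chunk:
            break
        printed.extend(chunk)

doc = b"A" * 1000
buf = queue.Queue()
printed = bytearray()
t = threading.Thread(target=printer_drains, args=(buf, printed))
t.start()
computer_sends(buf, doc)  # returns almost immediately; the computer is free again
t.join()
assert bytes(printed) == doc  # everything arrived, in order
```

The 1982 version did the same thing in dedicated hardware, with the “queue” being that costly block of outboard RAM sitting between the parallel port and the printer.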

But then, the outlay didn’t seem like all that big a deal when your printer had already cost you $1,300 and your computer $3,000. That $4,300 starter system would be about $11,292 in today’s money. To add insult to financial injury, the computers you bought for that kind of money were no great shakes, and that wouldn’t even have included the main software you’d want to make the thing minimally useful. For example, the first really successful word-processing software for microcomputers was WordStar, at a hefty $400: call it a thousand bucks in 2019.
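The inflation arithmetic is easy to check. A quick sketch, using the multiplier implied by the article’s own $4,300-to-$11,292 conversion rather than an official CPI table:

```python
# Multiplier implied by the article's figures: $4,300 (1982) -> $11,292 (2019)
CPI_MULTIPLIER = 11292 / 4300  # roughly 2.63

def to_2019_dollars(price_1982: float) -> float:
    """Convert a 1982 price to approximate 2019 dollars."""
    return round(price_1982 * CPI_MULTIPLIER, 2)

print(to_2019_dollars(4300))  # 11292.0, the starter system
print(to_2019_dollars(400))   # about 1050, WordStar's "say a thousand bucks"
```

The same multiplier applied to WordStar’s $400 lands right at the “thousand bucks in 2019” figure.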

Another approach to personal computing was briefly popular. The Kaypro and Osborne computers were similar packages–a (damn heavy!) “portable” computer with a built-in monochrome monitor, two floppy disk drives, and—the dealmaker!—a library of name-brand business software guaranteed to run. Both companies were too small and ill-managed to survive, but they had a great idea for making computing as non-threatening and worry-free as possible, 35-odd years ago. For $1,795, either company gave you a complete package that you didn’t have to be a computer hobbyist to use. My own office was first equipped with Kaypros, a machine that became a great Hollywood favorite. Arthur C. Clarke and Peter Hyams used theirs to send each other overnight drafts of the script to “2010: The Year We Make Contact”. William F. Buckley liked his Kaypro so much he did all his writing on it almost to the end of his life.

The IBM AT series and Apple’s Macintosh would appear in 1984. That generation of the personal computer would grow over the years into being a powerful step up in usefulness, as well as ease of use. But it took time. Alfred Sloan, the longtime chairman of General Motors during its glory days, confessed in his memoirs that the unsung hero of early automobiling was the patient, long-suffering customer, who paid for the progress we all benefit from now. Personal computers were no different.

I owe you an explanation about baby Moses’s very own IBM computer and how it ended up in my hands. Here it is: Charlton Heston was one of the most influential trustees of the American Film Institute. Like a number of other industry big shots, such as Ray Stark and Jerry Weintraub, he donated filmmaking gear and then-current office equipment to AFI. One batch from the Hestons included a few family-owned personal computers, still very expensive at the time.

AFI had just received a massive grant from Apple, both in cash and in-kind contributions, and one requirement was AppleTalk wiring and an all-Apple AFI campus. That meant they couldn’t use the donated IBM computers anymore, so they quietly asked a few people if they had any use for them. I walked away with Fraser Heston’s 1983-vintage XT, a big heavy thing with two hard drives and two floppy disk drives.

Fraser had been pressed into service in 1956 to play his own father as a baby in Cecil B. DeMille’s “The Ten Commandments”. And that’s how come I have Moses’s computer in my storeroom.


There are 105 comments.

  1. Gary McVey Contributor
    Gary McVey
    @GaryMcVey

    She (View Comment):

    Gary McVey (View Comment):

    One of the selling points of the Atari ST was it had a Motorola processor like the Macintosh and the GEM graphics environment, but it read and stored data in an IBM/Microsoft compatible format. With the right set of applications, it was the best of both worlds, at least at that low price point.

    Backstory: Holocaust survivor Jack Tramiel founded Commodore and eventually bought its rival, Atari. Hence the Atari ST was nicknamed the “Jackintosh”.

    I have a couple of Atari STs lying around. We were an Atari family (started with the 400, for which I fiddled and soldered to create an external “real” keyboard). Had a Rana disk drive, and a bunch of other stuff. Mr. She wrote a couple of programs (Player/Missile graphics!) for games that could be played by kids with limited physical mobility. Loved “Star Raiders.” Loved the “Video Easel” fractal graphics cartridge most of all. Had a couple of “illicit for the time” programs given to us by the local games store, just before things went south, including Ball Blaster (not what you may think) and a game whose name I can’t remember, but it was astronauts landing on a planet and escaping from their spaceship.

    It’s been a few years since I played “Apple Panic” and “Pooyan,” after hooking the unit up to the only monitor I have that still has the right hookups, but you’ve inspired me! Will get them out in the next couple of weeks and report back.

    I remember Jack Tramiel. And we were members of the local Pittsburgh Atari Computer Enthusiasts (PACE) Group. Lots of (gentle and kind) whackos in it. I miss that.

     

    Good for you for creating a way around the membrane keyboard! Truly you are a woman of rare gifts.

    I waited until the 800XL went on sale for $199 to get a real keyboard. Bought a floppy drive for $99 and had myself a computer for $300, pretty good penny pinching for January 1985. I bought the ST after a trip to Europe, where Atari had more market share than in the USA; your home country of Britain had quite a few Atari magazines, and one enterprising Brit used an ST to digitize John Logie Baird’s Phonovision discs, giving us the first real look at what 1929 television looked like. 

     

    • #91
  2. SkipSul Inactive
    SkipSul
    @skipsul

    James Gawron (View Comment):
    You are paying 3 times as much for an illusion. Of course, people like their illusions and are happy to pay for them.

    You know, every time anyone mentions Apple around you, you start using language like this and seem to be making moral judgements about people who like and use Apple products.  What specifically is your grudge against them, and why the need to take down people who use them?  Why is it that their mere mention, like here in a column largely about the fun past of computers, requires you to find some reason to light into them and their customers now?  It’s rather curious.

    • #92
  3. SkipSul Inactive
    SkipSul
    @skipsul

    Hank Rhody, Drunk on Power (View Comment):

    Just speaking personally, though, I would never willingly buy an Android phone just because of the way Google monetizes and subsidizes them – they’re basically portable surveillance devices for Google advertising.

    What a curious game. The only winning move is not to play.

    When I consider the amount of time I spend on my phone, and the way I reflexively pull it out and start checking alerts, I have been wondering if it would be better to ditch the thing entirely.  I’ve already removed all games from my phone.  I’m mostly just down to essentials (depending on your definition of “essentials”), and yet still I fidget with the thing far too often.

    They’re both a blessing and a curse, and I wonder often now if tech has outpaced our ability to use it wisely or well.

    • #93
  4. Hank Rhody, Drunk on Power Contributor
    Hank Rhody, Drunk on Power
    @HankRhody

    SkipSul (View Comment):
    if tech has outpaced our ability to use it wisely or well.

    I’m fairly certain that ship sailed before Noah’s Ark.

    • #94
  5. She Member
    She
    @She

    Gary McVey (View Comment):

    I waited until the 800XL went on sale for $199 to get a real keyboard. Bought a floppy drive for $99 and had myself a computer for $300, pretty good penny pinching for January 1985. I bought the ST after a trip to Europe, where Atari had more market share than in the USA; your home country of Britain had quite a few Atari magazines, and one enterprising Brit used an ST to digitize John Logie Baird’s Phonovision discs, giving us the first real look at what 1929 television looked like.

    Thank you for mentioning John Logie Baird, who, in my mother’s estimation, never got the credit he deserved.

    • #95
  6. Gary McVey Contributor
    Gary McVey
    @GaryMcVey

    The Eighties were a time when CEO worship was in style, and every young entrepreneur with a nice wristwatch and a pair of suspenders figured on taking his start-up public. In those days there was a lot of high concept BS about how much business was like war. Jack Tramiel was, for all his other virtues, a big BS-er on that subject, always trying to come across like Patton on steroids. Fact is, Jack’s business secrets weren’t that secret: he made his distributors pay up in advance, then he stiffed his vendors and creditors as long as he could. But as long as Commodore (and then Atari) delivered more computer for the money than virtually anyone else, he got away with it. 

    The “tragedy” of the ST was it was effectively the 4th operating system option in a market that was deciding that even two competing systems were too much. It and the Amiga got squeezed out, and even Apple barely held on by its fingernails. 

    • #96
  7. Clavius Thatcher
    Clavius
    @Clavius

    Gary McVey (View Comment):

    I remember when a “Fat Mac” meant it had 512K. When the upper limit of MS-DOS was 640K, it once seemed tremendous.

    I had a Fat Mac.  The key was having an external drive (there was no hard drive in my machine) so you could actually run things.  I learned Excel on it and never looked back.

    • #97
  8. Randy Webster Inactive
    Randy Webster
    @RandyWebster

    Randy Webster (View Comment):
    Is removing the battery the only way to really turn them off?

    The reason I ask is that I saw a video once of a guy who moved around DC (I think) with two phones, one on and one off.  The phone that was off had just about as many location data points as the one that was on.

    • #98
  9. Gary McVey Contributor
    Gary McVey
    @GaryMcVey

    She (View Comment):

    Gary McVey (View Comment):

    I waited until the 800XL went on sale for $199 to get a real keyboard. Bought a floppy drive for $99 and had myself a computer for $300, pretty good penny pinching for January 1985. I bought the ST after a trip to Europe, where Atari had more market share than in the USA; your home country of Britain had quite a few Atari magazines, and one enterprising Brit used an ST to digitize John Logie Baird’s Phonovision discs, giving us the first real look at what 1929 television looked like.

    Thank you for mentioning John Logie Baird, who, in my mother’s estimation, never got the credit he deserved.

    My paternal grandparents were from Falkirk, like JLB.

    Jane Wall is a British actress, a friend of my wife’s, and she’s currently in “Grey’s Anatomy”. Where we live in Santa Monica has weak TV reception, so we went over to Jane’s a few nights ago to try to boost her antenna enough so her kids could see her on ABC. She was appreciative; I said it was payback for JLB’s invention of TV. It turned out she once lived in North London and was familiar with the Muswell Hill neighborhood, but had no idea that the Alexandra Palace was the original home of BBC television. 

     

    • #99
  10. She Member
    She
    @She

    OK, I’ve been losing my mind here for a couple of hours trying to think of the name of the Atari game I’d forgotten.  Rescue on Fractalus.  I knew it was special.  A very early effort from Lucasfilm Games (later LucasArts Entertainment).  Ours is the development version.

    I see that the Wikipedia page I’ve linked to refers to “Ballblazer” as the other pre-release game we have.  That was, in its pre-release days, called “Ball Blaster”, which is how I know it.

    Lord.  Theys were the doze.

    • #100
  11. James Gawron Inactive
    James Gawron
    @JamesGawron

    SkipSul (View Comment):

    James Gawron (View Comment):
    You are paying 3 times as much for an illusion. Of course, people like their illusions and are happy to pay for them.

    You know, every time anyone mentions Apple around you, you start using language like this and seem to be making moral judgements about people who like and use Apple products. What specifically is your grudge against them, and why the need to take down people who use them? Why is it that their mere mention, like here in a column largely about the fun past of computers, requires you to find some reason to light into them and their customers now? It’s rather curious.

    Skip,

    I really didn’t mean it like that. Sorry, I’m just an Apple curmudgeon I guess.

    Regards,

    Jim

    • #101
  12. SkipSul Inactive
    SkipSul
    @skipsul

    Randy Webster (View Comment):

    Randy Webster (View Comment):
    Is removing the battery the only way to really turn them off?

    The reason I ask is that I saw a video once of a guy who moved around DC (I think) with two phones, one on and one off. The phone that was off had just about as many location data points as the one that was on.

    A dead battery is the only way these things are ever truly off.  They don’t come with a hard-wired on/off switch; they just go into a sort of deep standby, but one that is still awake enough for certain activities beyond “listening” for the power button to signal.

    I once had a phone lock up so badly that it would not respond to the normal reset routines.  The only way to clear the problem was to just let it burn itself out.  On an old PC, the solution would be to hit the power button, but you can’t do that with a phone – the closest equivalent is if your phone has an accessible battery.

    • #102
  13. Clavius Thatcher
    Clavius
    @Clavius

    Gary McVey (View Comment):

    One thing we glossed over was how crummy 40 column text was. An ordinary TV couldn’t display anything finer, and it obviously made WYSIWYG text editing hard if not impossible. Plenty of us had switches between color and B+W monitors to toggle between writing and games. 

    Black and white? I should say “monochrome”, because few monitors were “paper white”. Funny how that worked; most of us had, not that long ago, gotten rid of monochrome TVs in favor of color. That’s why the greatly sharper graphics available on the 286s (and later, on the color Macs) were so remarkable: it was the first high-definition color picture most of us had ever seen. You could actually read the small print. 

    On my Apple IIe I used AppleWriter, which was a non-WYSIWYG editor.  It came naturally, as prior to getting the Apple II I did my word processing on the open-access DEC-10, using RUNOFF commands for formatting and doing the editing on a DECwriter with TECO, a character-based editor.  Going from editing on a single line to a full screen was a big improvement.

    • #103
  14. James Gawron Inactive
    James Gawron
    @JamesGawron

    Clavius (View Comment):
    Going from editing on a single line to a full screen was a big improvement.

    Clav,

    Glad you mentioned this. Nothing like modern software existed until the personal computer. It was just too expensive to create and use on the old minicomputer technology. If you showed your humble (aging) laptop with Office 2007 running on it to a guy in 1975 using a $250,000 DEC PDP-11 (in 1975 dollars), he’d probably pass out. He’d be so blown away by how great it was that he’d assume you were an alien from another planet.

    Regards,

    Jim

    • #104
  15. Django Member
    Django
    @Django

    Clifford A. Brown (View Comment):

    If it was baby Moses’s computer, it must have been an early tablet design. Tell us more!

    He was the first to download data from the cloud to a tablet. 

    • #105