Work Thought (of Decidedly Limited Interest)


I write software. I do a lot of my work using Microsoft Visual Studio, what we programmers call an IDE — an Integrated Development Environment. Visual Studio is, in my professional opinion, a pretty fantastic product, the best IDE available for general-purpose programming.

The most recent version of this product contains a “code completion” feature. That is, the program kind of looks over your shoulder while you’re working, makes note of what you’re doing, and occasionally offers to finish whatever it is you’re in the process of typing, based on its assumptions about what you’re trying to do.

(Forgive me for anthropomorphizing it but, if there’s any non-living thing that invites anthropomorphizing, it’s artificial intelligence.)

This is something I’ve thought about for years, and something I’ve long thought I might enjoy having — but I never really expected it to be worth using. Frankly, it’s barely up to that point now, but it is pretty good. Surprisingly good.

And, unlike with self-driving cars, no one gets run over if it makes a bad call.

I was typing a bit of code this evening, something to do with generating “tool paths” — instructions to guide a numerically controlled milling machine. I’d typed this bit of code:

if (elem.X > feat.Width)
    elem.X = feat.Width;

I then typed the following:

if (elem.Y

and, before I could type anything further, the IDE suggested this:

if (elem.Y > feat.Height)

which is in fact what I intended to type. I accepted that, and it completed the next line for me as well, entering:

    elem.Y = feat.Height;

Now it isn’t a great leap of intuition to guess that, if I’m comparing an ‘X’ value to a ‘Width’ in one place, then I’m likely to compare a ‘Y’ value to a ‘Height’ soon after. But I still think it’s impressive.
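Taken together, the line I typed and the two completions amount to clamping the element's position to the feature's bounds. Here is a minimal sketch of that logic (TypeScript purely for illustration; the Elem and Feat shapes are my stand-ins for whatever types the real tool-path code uses):

```typescript
// Hypothetical stand-ins for the element and feature types in the post.
interface Elem { X: number; Y: number; }
interface Feat { Width: number; Height: number; }

// Clamp an element's position to the feature's bounds, as the two
// if-statements above do: X is capped at Width, Y is capped at Height.
function clampToFeature(elem: Elem, feat: Feat): Elem {
  return {
    X: Math.min(elem.X, feat.Width),
    Y: Math.min(elem.Y, feat.Height),
  };
}
```

An element at (12, 3) against a 10-by-5 feature comes back at (10, 3), just as the two if-statements would leave it.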

I tried typing the same thing, “if (elem.Y”, on the line above the “if (elem.X” line, and the IDE had no suggestion for me. Apparently it reads the code in the same order I do, and it was basing its prediction on my previous action, not some fairly obvious assumptions about the terms X, Y, Height, and Width.

Okay, so it isn’t rocket science. But it’s a harbinger of things to come — a harbinger of my eventual obsolescence, perhaps. I don’t know what comes after “learn to code,” but I’m sure someone is thinking about it.

Published in Science & Technology

This post was promoted to the Main Feed by a Ricochet Editor at the recommendation of Ricochet members.

There are 40 comments.

  1. Arahant Member
    Arahant
    @Arahant

    Henry Racette: Okay, so it isn’t rocket science. But it’s a harbinger of things to come — a harbinger of my eventual obsolescence, perhaps. I don’t know what comes after “learn to code,” but I’m sure someone is thinking about it.

    Yes, they’re trying to get rid of us all. Luckily, fiction written by AI tends to be pretty bad so far.

    • #1
  2. kedavis Coolidge
    kedavis
    @kedavis

    Arahant (View Comment):

    Henry Racette: Okay, so it isn’t rocket science. But it’s a harbinger of things to come — a harbinger of my eventual obsolescence, perhaps. I don’t know what comes after “learn to code,” but I’m sure someone is thinking about it.

    Yes, they’re trying to get rid of us all. Luckily, fiction written by AI tends to be pretty bad so far.

    All you have to give it is “Once upon a time…”

    • #2
  3. Keith Lowery Coolidge
    Keith Lowery
    @keithlowery

    @henryracette

    You might enjoy this Lex Fridman interview of Peter Wang:

    The entire thing is interesting but in this section he deals with “The End of Software”, which sounds more ominous than it actually is.

    I’m more sanguine about the sustainability of software engineering as an occupation than some. Wang’s point is not that people will stop writing software, but that the kind of software we write will be of a different sort.

    Thanks for this post.

    • #3
  4. Mark Camp Member
    Mark Camp
    @MarkCamp

    Arahant (View Comment):

    Yes, they’re trying to get rid of us all. Luckily, fiction written by AI tends to be pretty bad so far.

    Even worse?

    • #4
  5. BDB Inactive
    BDB
    @BDB

    Henry Racette: Okay, so it isn’t rocket science. But it’s a harbinger of things to come — a harbinger of my eventual obsolescence, perhaps. I don’t know what comes after “learn to code,” but I’m sure someone is thinking about it.

    Eat ze bugs.

    • #5
  6. Front Seat Cat Member
    Front Seat Cat
    @FrontSeatCat

    Henry – you are privy to a lot in your profession and what you describe concerns me. We know all we do is recorded, tracked, and stored. A product that looks over your shoulder? I can’t stand when my phone self-corrects using words I did not intend to type, and doesn’t recognize words I do type. I can still think for myself. When technology does all your thinking, your brain becomes mush and you also lose a lot more than that. 

    • #6
  7. Stad Coolidge
    Stad
    @Stad

    Arahant (View Comment):
    Luckily, fiction written by AI tends to be pretty bad so far.

    “Pretty bad” would be an improvement for me . . .

    • #7
  8. Arahant Member
    Arahant
    @Arahant

    Stad (View Comment):

    Arahant (View Comment):
    Luckily, fiction written by AI tends to be pretty bad so far.

    “Pretty bad” would be an improvement for me . . .

    You just need a better pen name than Mabel Perthnoy, that’s all.

    • #8
  9. Henry Racette Member
    Henry Racette
    @HenryRacette

    Arahant (View Comment):

    Stad (View Comment):

    Arahant (View Comment):
    Luckily, fiction written by AI tends to be pretty bad so far.

    “Pretty bad” would be an improvement for me . . .

    You just need a better pen name than Mabel Perthnoy, that’s all.

    Mabel Perthnoy? Stad is Mabel Perthnoy?!

    That’s strike two, buddy.

     

    • #9
  10. Arahant Member
    Arahant
    @Arahant

    Henry Racette (View Comment):
    Mabel Perthnoy? Stad is Mabel Perthnoy?!

    You didn’t know?

    • #10
  11. Henry Racette Member
    Henry Racette
    @HenryRacette

    Arahant (View Comment):

    Henry Racette (View Comment):
    Mabel Perthnoy? Stad is Mabel Perthnoy?!

    You didn’t know?

    Maybe I just didn’t want to know.

    • #11
  12. Raxxalan Member
    Raxxalan
    @Raxxalan

    Keith Lowery (View Comment):

    @henryracette

    You might enjoy this Lex Fridman interview of Peter Wang:

    The entire thing is interesting but in this section he deals with “The End of Software”, which sounds more ominous than it actually is.

    I’m more sanguine about the sustainability of software engineering as an occupation than some. Wang’s point is not that people will stop writing software, but that the kind of software we write will be of a different sort.

    Thanks for this post.

    I agree; I think it is much more likely that people will continue to write software, or at least guide the AI in writing it. Much of software is more art than science, and despite many of our pretensions I doubt that software is very close to engineering. Then again, that may just be wishful thinking.

    Still trying to get used to the new VS. I don’t like that they have done away with the main block. It feels too unstructured now; I have a harder time following the program flow. This counts to me as a feature enhancement whose time never really was.

    • #12
  13. Henry Racette Member
    Henry Racette
    @HenryRacette

    Raxxalan (View Comment):
    Much of software is more art than science, and despite many of our pretensions I doubt that software is very close to engineering.

    One of the things I like about programming is that it’s still the wild west. Sure, you can have sophisticated process controls and maturity level metrics. But most of us just learn and evolve as we go along, for better or worse.

    Given that software programs are probably the most complicated things ever assembled by humans, in terms of their number of parts and their interconnectedness, it’s hard for me to imagine automation taking over the job any time soon. I expect it will remain the domain primarily of bright, vaguely maladjusted males for the foreseeable future.

    • #13
  14. Stad Coolidge
    Stad
    @Stad

    Henry Racette (View Comment):

    Arahant (View Comment):

    Henry Racette (View Comment):
    Mabel Perthnoy? Stad is Mabel Perthnoy?!

    You didn’t know?

    Maybe I just didn’t want to know.

    Actually, my pen name is different.  But it is silly sounding . . .

    • #14
  15. Keith Lowery Coolidge
    Keith Lowery
    @keithlowery

    Henry Racette (View Comment):
    Given that software programs are probably the most complicated things ever assembled by humans, in terms of their number of parts and their interconnectedness, it’s hard for me to imagine automation taking over the job any time soon. I expect it will remain the domain primarily of bright, vaguely maladjusted males for the foreseeable future.

    @henryracette

    Your comment reminded me of this quote from David Gelernter’s book The Tides of Mind.

    I have practiced computer science for thirty years. What drew me to the field was the unlimited plastic power of digital computers: computers give you the power to dream up almost any machine you like, shape a simple version in modeling clay, and then flip a switch and watch it come alive. This naïve sounding vision is almost real, almost true. A good programmer can sit down at the keyboard and build a program – a working piece of software – with nearly the complexity of an aircraft carrier all by himself, to his own designs and no one else’s. The fact that you can achieve so much all alone is one good reason to be fascinated and terrified by computing. The field has always attracted sociopaths.

     

    • #15
  16. Raxxalan Member
    Raxxalan
    @Raxxalan

    Henry Racette (View Comment):

    Raxxalan (View Comment):
    Much of software is more art than science, and despite many of our pretensions I doubt that software is very close to engineering.

    One of the things I like about programming is that it’s still the wild west. Sure, you can have sophisticated process controls and maturity level metrics. But most of us just learn and evolve as we go along, for better or worse.

    Given that software programs are probably the most complicated things ever assembled by humans, in terms of their number of parts and their interconnectedness, it’s hard for me to imagine automation taking over the job any time soon. I expect it will remain the domain primarily of bright, vaguely maladjusted males for the foreseeable future.

    Yep, and with a large number of trade-offs. I never really understood economics until I started real-time programming. When you have a very delicate balancing act between components to keep latency at an absolute minimum, there are a lot of trade-offs that have to be made. Eventually you start to reach the limits of the hardware, the OS, even the development environment. You really start to understand trade-offs and how everything is about managing scarce resources.

    • #16
  17. Henry Racette Member
    Henry Racette
    @HenryRacette

    Keith Lowery (View Comment):

    Henry Racette (View Comment):
    Given that software programs are probably the most complicated things ever assembled by humans, in terms of their number of parts and their interconnectedness, it’s hard for me to imagine automation taking over the job any time soon. I expect it will remain the domain primarily of bright, vaguely maladjusted males for the foreseeable future.

    @henryracette

    Your comment reminded me of this quote from David Gelernter’s book The Tides of Mind.

    I have practiced computer science for thirty years. What drew me to the field was the unlimited plastic power of digital computers: computers give you the power to dream up almost any machine you like, shape a simple version in modeling clay, and then flip a switch and watch it come alive. This naïve sounding vision is almost real, almost true. A good programmer can sit down at the keyboard and build a program – a working piece of software – with nearly the complexity of an aircraft carrier all by himself, to his own designs and no one else’s. The fact that you can achieve so much all alone is one good reason to be fascinated and terrified by computing. The field has always attracted sociopaths.

     

    Exactly.

    I was attracted to computers as soon as they became available — and, before that, to electronics which provided a limited version of the same satisfaction.

    I went to college (spoiler: for about 10 months in total) intending to major in physics, but my first encounter with the school’s DECSYSTEM-20 completely captivated me. And I remember the very distinct and profound feeling of control I experienced when I started programming it, the sense of working within a well-defined universe within which I had complete freedom. It felt like a more grown-up version of the kind of potential I used to feel as a child when I sat down with an Erector Set.

    Programming still feels that way. (For that matter, I get the same kind of feeling when I click the Start a Conversation button here on Ricochet, but that doesn’t pay as well.)

     

     

    • #17
  18. Mark Camp Member
    Mark Camp
    @MarkCamp

    “Much of software is more art than science, and despite many of our pretensions I doubt that software is very close to engineering.”

    This is a social question that I’ve been deeply concerned about since roughly the personal computer revolution.

    Here is my view. I give it only by way of standing in front of the locomotive of Progress and yelling, “Stop!” I admit that I just discovered that one other Ricocheteer shares my detailed view of Science, but still I feel certain that I stand alone concerning software engineering.

    * * * * *

    By 1990, much software practice was engineering, and the foundations of the software engineering discipline were already well-established, even as great advances were still being made.

    By the time I retired, this discipline had been abandoned and forgotten, and software practice had become the Wild West that you refer to.

    The difference is that in the real Wild West, the ideas of civilization were well-known, and the work of incorporating them to cure the chaos of the West was underway and would succeed in a couple of decades.

    Today, the chaos is celebrated and deepened, with praises being sung to each new assault (each new framework, protocol, project-management fad, and programming language, for example).

    A self-sustaining tendency to engineering progress has been replaced by a self-reinforcing state of barbarism.

    • #18
  19. Henry Racette Member
    Henry Racette
    @HenryRacette

    Mark Camp (View Comment):

    “Much of Software is more art than science, and despite many of our pretentions I doubt that software is very close to engineering.”

    This is a social question that I’ve been deeply concerned about since roughly the personal computer revolution.

    Here is my view. I give it only by way of standing in front of the locomotive of Progress and yelling, “Stop!” I admit that I just discovered that one other Ricocheteer shares my detailed view of Science, but still I feel certain that I stand alone concerning software engineering.

    * * * * *

    By 1990, much software practice was engineering, and the foundations of the software engineering discipline were already well-established, even as great advances were still being made.

    By the time I retired, this discipline had been abandoned and forgotten, and software practice had become the Wild West that you refer to.

    The difference is that in the real Wild West, the ideas of civilization were well-known, and the work of incorporating them to cure the chaos of the West was underway and would succeed in a couple of decades.

    Today, the chaos is celebrated and deepened, with praises being sung to each new assault (each new framework, protocol, project-management fad, and programming language, for example).

    A self-sustaining tendency to engineering progress has been replaced by a self-reinforcing state of barbarism.

    There’s some truth in that. I think the reality is more complicated, more diverse: there are domains in which software remains more like engineering, and domains in which it’s an undisciplined free-for-all.

    I blame untyped variables and late binding for the latter.

    • #19
  20. John H. Member
    John H.
    @JohnH

    When I used Visual Studio, it worked well; such autocomplete features as it had were handy. Its reactions to my input were, however, quite unsurprising. If I punched in the name of a certain object followed by a dot, I would get a dropdown showing all its known attributes, it obviously being my intention to pick one and the possibilities being finite.

    Only now, though, do I understand something I once found in a book on programming. I think it was about C# and the very first thing the author chose to discuss was variable typing. I had never understood this obsession – which for many programmers it is. Me, I figure if the user is prompted to give his age and he does it in Roman numerals, or he inputs his Zip code as the square root of another number, then he’s being obtuse and I can’t be expected to fix that. But now I understand why the obsession. If you declare upfront not merely a variable’s name but what it holds (its type), that helps the autocomplete features. It doesn’t affect the user but it does aid the programmer.

    Now, what would really be impressive – but also annoying – is if in the OP’s code snippets the IDE guessed or knew what range a certain variable might fall in, and as you typed “=” or “>” or “<” the machine tutted, saying “Come on, that variable ain’t varyin’ that much! Have you really given much thought to what’s going to happen at runtime?” On the way to its desired, harsh authoritarian stage, AI may pass through a designed, petulant know-it-all stage, so obnoxious that programmers will gladly forfeit control of their craft.

    • #20
  21. The Reticulator Member
    The Reticulator
    @TheReticulator

    Henry Racette: And, unlike with self-driving cars, no one gets run over if it makes a bad call.

    If I’m about to cross a busy street, I’d like the self-driving cars to have a clear understanding of the difference between height and width. 

    • #21
  22. Mark Camp Member
    Mark Camp
    @MarkCamp

    John H. (View Comment):
    But now I understand why the obsession. If you declare upfront not merely a variable’s name but what it holds (its type), that helps the autocomplete features. It doesn’t affect the user but it does aid the programmer.

    NO NO NO NO NO!

    Well, to be honest, I don’t actually have strong feelings about your assertion as to the reason for their obsession. I was just being overly dramatic.

    But friends don’t let friends drive to false conclusions, even about things that don’t matter to either of them, any more than they let friends drive drunk.

    I would stop there. But I happen to have stumbled, early in my career, on the real reason that real programmers (which neither of us ever was) are obsessed with variable typing.

    So what?  Neither of us needs to know why.

    Well, I happen to know that you, Mr. H., like me, feel uncomfortable not knowing the correct answer even about things that are only of consequence to others.  Like, in this case, real programmers.

    So here’s the correct answer.

    It’s because real programmers know that the lack of variable typing leads to buggy code that is very hard even for skilled programmers to debug. In fact, if the person maintaining the code

    • didn’t write it, and
    • doesn’t know what the original, amateur programmer intended, and
    • can’t ask him because the amateur died on the golf course in the late 1990s, shortly after retiring

    then it may be close to impossible to debug before the next ’90s rolls around.

    • #22
  23. The Reticulator Member
    The Reticulator
    @TheReticulator

    Mark Camp (View Comment):

    It’s because real programmers know that the lack of variable typing leads to buggy code that is very hard even for skilled programmers to debug. In fact, if the person maintaining the code

    • didn’t write it, and
    • doesn’t know what the original, amateur programmer intended, and
    • can’t ask him because the amateur died on the golf course in the late 1990s, shortly after retiring

    then it may be close to impossible to debug before the next ’90s rolls around.

    That reminds me of a story that a friend and colleague told me about programming work he did at Pfizer, or more likely the company he worked for that was bought out by Pfizer. (It could also have been both.)

    Whenever he completed a project and reported to his boss, his boss would ask him, “And what happens if you get run over by a garbage truck?”  

    But one time he did especially nice work that his boss really liked.  Then the question was, “And what happens if you get run over by a Mack truck?”  His demise had gotten promoted from a garbage truck to a Mack truck. 

    • #23
  24. Henry Racette Member
    Henry Racette
    @HenryRacette

    It’s true that strongly typed variables make it easier for the computer to guess what it is we’re trying to do, but that isn’t the reason Real Men Prefer Strongly Typed Languages®. The real reason is the same reason most of us wear seatbelts: to avoid flying through the windshield when our call to the math library’s square root function encounters an unexpected text string parameter on a poorly lit country road.

    Significant programs typically contain a lot of execution paths, some of which are only invoked under exceptional circumstances (e.g., when errors occur). It’s hard to exercise all of them (though automated testing tools make it easier, for those with the self-discipline or employer mandate to use them), and so we appreciate catching as many errors as we can during software development, rather than when the piece of code is running in your bank’s back room, or in a washing machine in Omaha, or on a satellite in geosynchronous orbit. Strongly typed languages help us avoid one particular kind of bug.
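A minimal sketch of that seatbelt (TypeScript used purely for illustration; the hypotenuse function is my own made-up example, not code from the discussion):

```typescript
// With type annotations, handing a text string to a math routine is
// rejected at compile time instead of surfacing as NaN at runtime.
function hypotenuse(a: number, b: number): number {
  return Math.sqrt(a * a + b * b);
}

console.log(hypotenuse(3, 4)); // 5

// The equivalent untyped JavaScript would happily accept
//   hypotenuse("forty-two", 4)
// and return NaN, discovered only when the code runs. In TypeScript the
// same call fails to compile:
//   error TS2345: Argument of type 'string' is not assignable to
//   parameter of type 'number'.
```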

    They also make it easier for the computer to optimize code to make it run faster or take up less space. The more the computer knows about your intentions, the more it can recognize bits of code that will never run, or tests that don’t need to be performed, or operations that can be re-ordered or combined. Computers have gotten really good at this.
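A small TypeScript illustration (my own example) of the point about tests that don't need to be performed; type narrowing is the compile-time cousin of what an optimizer does with the same information:

```typescript
function describe(x: number | string): string {
  if (typeof x === "number") {
    return `a number: ${x}`;
  }
  // By this point the compiler has proved x is a string, so a second
  // typeof check here would be provably false. An optimizer armed with
  // the same type knowledge can delete such dead branches outright.
  return `a string: ${x.toUpperCase()}`;
}
```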

    Being able to predict what you’re about to type is a relatively insignificant side-benefit of ever more sophisticated development environments with ever more sophisticated code analysis capabilities.

    • #24
  25. kedavis Coolidge
    kedavis
    @kedavis

    Keith Lowery (View Comment):

    Henry Racette (View Comment):
    Given that software programs are probably the most complicated things ever assembled by humans, in terms of their number of parts and their interconnectedness, it’s hard for me to imagine automation taking over the job any time soon. I expect it will remain the domain primarily of bright, vaguely maladjusted males for the foreseeable future.

    @henryracette

    Your comment reminded me of this quote from David Gelernter’s book The Tides of Mind.

    I have practiced computer science for thirty years. What drew me to the field was the unlimited plastic power of digital computers: computers give you the power to dream up almost any machine you like, shape a simple version in modeling clay, and then flip a switch and watch it come alive. This naïve sounding vision is almost real, almost true. A good programmer can sit down at the keyboard and build a program – a working piece of software – with nearly the complexity of an aircraft carrier all by himself, to his own designs and no one else’s. The fact that you can achieve so much all alone is one good reason to be fascinated and terrified by computing. The field has always attracted sociopaths.

     

    I think I might look askance at any programmer who thinks what they’ve done approaches the complexity of an aircraft carrier, especially if you include the thousands of crew members.

    • #25
  26. kedavis Coolidge
    kedavis
    @kedavis

    Henry Racette (View Comment):
    They also make it easier for the computer to optimize code to make it run faster or take up less space. The more the computer knows about your intentions, the more it can recognize bits of code that will never run, or tests that don’t need to be performed, or operations that can be re-ordered or combined. Computers have gotten really good at this.

    Computers have been PROGRAMMED to do that.

    • #26
  27. Mark Camp Member
    Mark Camp
    @MarkCamp

    kedavis (View Comment):

    Keith Lowery (View Comment):

    Your comment reminded me of this quote from David Gelernter’s book The Tides of Mind.

    …A good programmer can sit down at the keyboard and build a program – a working piece of software – with nearly the complexity of an aircraft carrier all by himself, to his own designs and no one else’s.

    I think I might look askance at any programmer who thinks what they’ve done approaches the complexity of an aircraft carrier, especially if you include the thousands of crew members.

    kedavis,

    As I read Keith’s note and your reply, it seems to me that Keith miscommunicated.

    I think that what he was trying to say was that a good programmer can sit down at the keyboard and build a program – a working piece of software – with nearly the complexity of an aircraft carrier all by himself, to his own designs and no one else’s.

    He didn’t mean, as I read him, to say that a good programmer can sit down at the keyboard and run an aircraft carrier all by himself.

    • #27
  28. Henry Racette Member
    Henry Racette
    @HenryRacette

    kedavis (View Comment):

    Henry Racette (View Comment):
    They also make it easier for the computer to optimize code to make it run faster or take up less space. The more the computer knows about your intentions, the more it can recognize bits of code that will never run, or tests that don’t need to be performed, or operations that can be re-ordered or combined. Computers have gotten really good at this.

    Computers have been PROGRAMMED to do that.

    Used to be. But most modern computers are essentially self-aware, reprogramming themselves in their idle moments.

    Fire up the Windows task manager some time and just sit there and watch it. All that stuff that happens while you’re doing nothing, not even touching the mouse? That’s your computer thinking. Learning. Growing. Becoming more… clever. And capable.

    One of these days, while we’re sleeping….

    • #28
  29. Raxxalan Member
    Raxxalan
    @Raxxalan

    Mark Camp (View Comment):

    By 1990, much software practice was engineering, and the foundations of the software engineering discipline were already well-established, even as great advances were still being made.

    By the time I retired, this discipline had been abandoned and forgotten, and software practice had become the Wild West that you refer to.

    The difference is that in the real Wild West, the ideas of civilization were well-known, and the work of incorporating them to cure the chaos of the West was underway and would succeed in a couple of decades.

    Today, the chaos is celebrated and deepened, with praises being sung to each new assault (each new framework, protocol, project-management fad, and programming language, for example).

    A self-sustaining tendency to engineering progress has been replaced by a self-reinforcing state of barbarism.

    Not sure I completely agree with this, having spanned both sides of the divide in my time. There certainly were schools of thought in the 1990s about software engineering, which were useful and valid on large projects and for large teams. I remember working with the SDLC and the so-called waterfall models of development that stressed good, solid engineering principles.

    They were not necessarily the right tool for much of the development that was occurring at that time, and so were born the Agile and so-called XP programming methodologies. Now, in fairness, I think you are correct: in most cases Agile is an excuse for not doing anything systematic and a way to shield a lot of pathologies in the development process. At its heart, however, Agile was an attempt to put the right scale of process into a development effort for the problem being solved.

    • #29
  30. kedavis Coolidge
    kedavis
    @kedavis

    Mark Camp (View Comment):

    kedavis (View Comment):

    Keith Lowery (View Comment):

    Henry Racette (View Comment):
    Given that software programs are probably the most complicated things ever assembled by humans, in terms of their number of parts and their interconnectedness, it’s hard for me to imagine automation taking over the job any time soon. I expect it will remain the domain primarily of bright, vaguely maladjusted males for the foreseeable future.

    @henryracette

    Your comment reminded me of this quote from David Gelernter’s book The Tides of Mind.

    I have practiced computer science for thirty years. What drew me to the field was the unlimited plastic power of digital computers: computers give you the power to dream up almost any machine you like, shape a simple version in modeling clay, and then flip a switch and watch it come alive. This naïve sounding vision is almost real, almost true. A good programmer can sit down at the keyboard and build a program – a working piece of software – with nearly the complexity of an aircraft carrier all by himself, to his own designs and no one else’s. The fact that you can achieve so much all alone is one good reason to be fascinated and terrified by computing. The field has always attracted sociopaths.

    I think I might look askance at any programmer who thinks what they’ve done approaches the complexity of an aircraft carrier, especially if you include the thousands of crew members.

    kedavis,

    As I read Keith’s note and your reply, it seems to me that Keith miscommunicated.

    I think that what he was trying to say was that a good programmer can sit down at the keyboard and build a program – a working piece of software – with nearly the complexity of an aircraft carrier all by himself, to his own designs and no one else’s.

    He didn’t mean, as I read him, to say that a good programmer can sit down at the keyboard and run an aircraft carrier all by himself.

    I don’t think the programs approach that degree of complexity.  Some programmers may flatter themselves/each other that way, but I’d call it an illusion or delusion.

    • #30