Tech’s Productivity Problem

I am a computer graphics software engineer. Computer graphics is unusual in tech for having an appetite for performance that far outstrips what computer hardware can deliver. In many ways, we are the ones who drive computing performance forward, and our hardware is often used by other fields for unrelated tasks (for example, AI makes heavy use of the GPUs, or graphics processing units, that power computer games).

Most of the software world does not face these constraints. For most programmers (and the managers who employ them), the increases in computing power over the past 30 years were truly awe-inspiring, far in excess of what they thought they needed. It didn’t matter how crappy programmers or their tools were when, in a few years’ time, computers could be counted on to be exponentially faster. Programmers didn’t need to learn to deal with hard things like memory management, performance optimization, or writing code that runs on multiple CPU cores simultaneously. Even worse, the people who write the tools programmers use (programming languages) felt they need not worry about these things either. A member of the standards committee for C++, a widely used programming language, admitted to me earlier this year to having once thought this way.
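
To make the memory-management point concrete, here is a minimal sketch in modern C++ (my own illustration, with a made-up Mesh type, not anyone’s production code): a smart pointer ties an object’s lifetime to its owning scope, so the memory is freed automatically and there is no delete to forget.

    #include <memory>
    #include <vector>

    // A hypothetical graphics resource, used purely for illustration.
    struct Mesh {
        std::vector<float> vertices;
    };

    int main() {
        // std::unique_ptr owns the Mesh; its destructor frees the
        // memory automatically when `mesh` goes out of scope, so
        // there is no new/delete pair to mismatch or forget.
        auto mesh = std::make_unique<Mesh>();
        mesh->vertices = {0.0f, 1.0f, 2.0f};
        return 0;  // Mesh destroyed here, with no manual cleanup.
    }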

But computers aren’t getting faster anymore. There is a physical limit to how small you can make transistors, and there is also a limit to how many transistors you can turn on at once and not melt the chip. We have probably reached both limits and we certainly will have reached them in a year or two’s time.

People are panicking. Industry leaders are wondering how they will manage, but their dependence on ever-faster CPUs will ultimately be their salvation. There is wide scope to make software faster simply by rewriting old code. Most managers (and many programmers) fear this the way most people fear math, but I think they will be pleasantly surprised. Parallel programming and memory management are simply not as hard as they think, not when the right tools are used. Programmers who have spent 20 years thinking they can’t manage memory or write parallel code are going to find that, actually, they can.
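
As one hedged example of what “the right tools” can look like: since C++17, many standard algorithms accept an execution policy, so a single extra argument spreads a loop across CPU cores. This is a minimal sketch, not a benchmark (with GCC it typically requires -std=c++17 and linking against TBB via -ltbb):

    #include <algorithm>
    #include <execution>
    #include <iostream>
    #include <numeric>
    #include <vector>

    int main() {
        std::vector<double> data(1'000'000);
        std::iota(data.begin(), data.end(), 0.0);  // fill with 0, 1, 2, ...

        // std::execution::par asks the standard library to run the
        // transform across multiple cores; the loop body is unchanged.
        std::transform(std::execution::par, data.begin(), data.end(),
                       data.begin(), [](double x) { return x * x; });

        std::cout << data.back() << '\n';  // 999999 squared
        return 0;
    }

The point is not this particular API; it is that parallelism no longer requires hand-rolled thread management.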

Moore’s Law may be ending, but software will continue to advance.

There are 64 comments.

  1. Henry Castaigne (@HenryCastaigne), Member

    namlliT noD (View Comment):
    I mean, once you can deliver high resolution digital porn, it’s like, y’know, mission accomplished, what else do you need?

    Sex robots.

    • #61
  2. Henry Castaigne (@HenryCastaigne), Member

    Arahant (View Comment):

    Joseph Eagar: But computers aren’t getting faster anymore. There is a physical limit to how small you can make transistors, and there is also a limit to how many transistors you can turn on at once and not melt the chip. We have probably reached both limits and we certainly will have reached them in a year or two’s time.

    We’re running out of oil. I’ve been told that for as long as I can remember, yet now we have more oil than we have ever had before. Moore’s Law is going to break soon. I’ve been told that for as long as I’ve been associated with computers, which started in 1980. Forgive my skepticism.

    Couldn’t there be a middle way? Couldn’t we get better and better technology, but at a slower rate? Instead of Moore’s Law occurring every two years, maybe we would double computer power every four years or every eight years. That’s still incredible progress.

    • #62
  3. Arahant (@Arahant), Member

    Henry Castaigne (View Comment):
    Couldn’t there be a middle way? Couldn’t we get better and better technology, but at a slower rate? Instead of Moore’s Law occurring every two years, maybe we would double computer power every four years or every eight years. That’s still incredible progress.

    Sure, Henry. I’m just saying that I have been told it’s about to break for forty years. Wake me when it really does.

    • #63
  4. namlliT noD (@DonTillman), Member

    Arahant (View Comment):

    Henry Castaigne (View Comment):
    Couldn’t there be a middle way? Couldn’t we get better and better technology, but at a slower rate? Instead of Moore’s Law occurring every two years, maybe we would double computer power every four years or every eight years. That’s still incredible progress.

    Sure, Henry. I’m just saying that I have been told it’s about to break for forty years. Wake me when it really does.

    Yeah. This. Exactly.

    Also, a number of tech companies have made what turned out to be poor decisions when they did not fully consider the power of Moore’s Law.

    • #64