17 February 2013

Computers Aren’t Fast Anymore

Hello, doctor. I am only 22 years old, but I think computers used to be faster. Why do I feel this way? Can you help me?

Well, it couldn’t be CPUs. Everyone knows that chips were once getting twofold faster every couple of years. That is slowing down a bit now. But it couldn’t be GPUs, either. They’ve seen impressive leaps in speed too.

Hmm? Are we developers to blame? No, I don’t think so. We’re getting better and better at operating our newfangled machines, to take advantage of their futuristic capabilities. At the very least, the smart folks are writing languages and frameworks to bestow that power on the rest of us.

Of course, for most developers the extent of this innovation is taking embarrassingly data-parallel problems and slapping them on a GPU. But that’s something, isn’t it, doctor? Memory’s faster, CPU caches are bigger, and hard disks are faster than they used to be. SSDs are even faster than that.

Tell you about my past? Okay…

At my high school, there were these run-of-the-mill Windows 2000 machines. We programmed on them in VB6. And let me tell you, for all its downsides, VB6 was screaming fast. These crappy amateur applications were downright speedy. Everything was.

When I use a computer now, I don’t feel that way. I don’t feel good about the speed or crisp responsiveness of applications. Not on a desktop, not on a high-end laptop, and especially not on a mobile device. And being that my job includes developing software for mobile devices, I have messed around with a great many of them.

I was deeply concerned by this. So I sat and I thought. Hmm. And it dawned on me: I don’t use real applications anymore.

I write this very psychiatric confessional blog post within a WYSIWYG editor running inside a web browser. (Sorry, doctor, but you’re just a figment of my overactive rhetorical imagination.) In the browser lies the problem. Almost all of the erstwhile proper applications that I use on a daily basis are now web applications.

Wait. Hold up. Give me a minute to breathe. We’re on to something.

All of the (user-facing–type) applications I use. They are running in a web browser. Which runs HTML and JavaScript. In case you haven’t heard, or you’re thinking of a different thing, we’re talking about HTML the document markup language. JavaScript the dynamically typed scripting language.

What. What. This is not okay. This is so beyond not okay that I have difficulty articulating the manner in which it is not okay because I feel like it should be obvious and why aren’t people freaking out? I’m freaking out.

Calm down? Okay, doc, I’ll try.

[Minutes pass, glacially.]

I guess the problem that I have with this is that programs aren’t…programs anymore. They’re not running on my hardware. They’re running umpteen layers removed from my hardware. They exist on this fundamentally different substrate, one that can’t be efficiently implemented on the real machines that we’ve spent so much effort making fast for real programs. It’s insane.

The things that are running natively, such as mobile games, are running on underpowered hardware. And still a great number of them rely on internet services to function properly. Like a parasite, no, like an organ. Everything is part of this enormous whole, of which I am somehow not a part, and I wanted the brain and the muscle but you handed me a kidney.

Developers? No, I don’t fault them at all. You’ve got to have fun, and you’ve got to make money, and you’ve got to make programs. That’s just how it is. Everybody but me seems to have accepted a long time ago that it’s okay for an application to take this many milliseconds to display a character on the screen, or that many milliseconds to scroll. Or very many milliseconds stoned on TCP, roundtripping to a server for my data.

And my poor data. It wasn’t safe on the ground, so we sent it to live in the cloud with foster parents. Now I don’t know where it is, but it isn’t here. It’s holed up in my web applications, and it never comes to visit.

So do you think you can help, doctor? Is it just me? Should I adapt to this brave new world and wait for the award-winning silicon implementation of JavaScript running alongside an HTML coprocessor? Can I have my fast back then?

In the meantime, I guess I’ll continue to sacrifice fractions of a second of my life here and there for the convenience of using a web program.

Rendering to a canvas.

Running in a browser.

Rendering to a screen.

10 comments:

  1. From my lofty age of 44: My poor, poor boy, you have so much history to cover.

    Start with

    http://www.loper-os.org/?p=388

    Then everything in this category:

    http://www.loper-os.org/?cat=23

    Replies
    1. Believe me, I know the history, and especially how it can be deceptive. The point is that this character shouldn’t feel this way. Computers are faster than ever, and it does no good to be outraged that nobody seems to be taking advantage of it—they are! It’s just that “most people” (whoever they are) use web applications these days, and perhaps they aren’t getting as many of the benefits.

    2. Similar observations: http://www.loper-os.org/?p=300

  2. Fucking stop using web apps, then. Local applications didn't magically disappear; you just decided you'd be a fucking hipster and upload your life to the web.

    Replies
    1. Haha, I know, this is fiction. Most of my day is spent in a terminal, not a browser. But applications are moving away from hardware, and I just think that’s a mistake.

  3. Dude, chill out, Anonymous. Web apps give you backups. Lots of advantages.

    But this has been the problem throughout the entire history of computers. Moore's law says that the number of transistors doubles roughly every two years; this helps RAM, which allows software to include more features and layers of abstraction, which makes the CPU take more time to scan over the code to implement them (and the hard drive and/or network take more time to load them).

    My 386 was faster than my 286 -- until I loaded Windows 3 on it. I could do the same basic things in PC/Geos on a 286 and a 386, and the 386 was way faster; but Windows didn't do anything pleasant or even mentionable in polite company on a 286. Actually, it was so bad I didn't bother switching from PC/Geos until years later.

    Anyhow, it's all about the software. As application space expands (and with web apps, space has expanded IMMENSELY) the time taken to run apps increases as well.

  4. Also, you kids and your Windows 2000 stay off my lawn. :-)

    Replies
    1. Haha, that’s the joke, though. The character in this article (who is a parody of me) is only 22 but already starting to go mini-Luddite. I grew up in equal measure on DOS 6, Windows 95, and Mac OS 8. Maybe not impressive, but at least not Windows 2000.

  5. I started programming on a Commodore PET, which had a 1 MHz processor and 32K of RAM, but my first machine was an Atari 8-bit. Programming in assembly language meant you could calculate exactly how long something would take, because LDA #3 takes 2 cycles and there are 1.79 million cycles per second.

    Now the time it takes depends on whether the value you are fetching is in a cache, which cache it is in (L1 or L2), or whether it is even in memory at all (perhaps it has been paged out). But hardly anyone programs in assembly anymore, so if you are using C/C++ the question becomes what assembly the compiler actually generates from your code. And if you are writing the code in Java, it will be interpreted from bytecode (or it might have been optimized by the HotSpot compiler).

    The memory for the screen took 1K (character-mapped) on the Atari, but on a PC you might have 1920×1080 pixels in 24-bit color, which takes around 8 MB. So although the computer is roughly 1,500 times faster, you are manipulating about 8,000 times as much memory (see the back-of-the-envelope sketch after these comments).

  6. Late to the party, but here's food for thought:

    I'm pretty sure you know this, but Moore's Law says nothing about speed - it describes the trend in transistor density. We stopped making processor cores much faster around 2005 or so, and instead started building out multicore processors. So the *density* (and complexity) of processors and other chips is still keeping pace with Moore.

    But the software for simulating modern ASICs (the term we use in the biz for what layfolk typically call "chips") is all single-threaded. So because our processor cores aren't getting faster, our *simulation capacity* has effectively plateaued.

    See a problem here?

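For the curious, here is a minimal back-of-the-envelope sketch (in Python) of the arithmetic in comment 5. The Atari figures (a 1.79 MHz clock and a ~1K character-mapped display) come straight from the comment; the modern figures are my assumptions, roughly a 2.7 GHz desktop core and a 1080p framebuffer stored at 4 bytes per pixel.

    # Rough check of the "1,500x faster CPU vs. 8,000x more screen memory" ratios.
    # Assumptions: a ~2.7 GHz modern core and 24-bit color padded to 4 bytes per pixel.

    atari_clock_hz = 1.79e6                 # 6502 in the Atari 8-bit, ~1.79 MHz
    modern_clock_hz = 2.7e9                 # assumed modern desktop core, ~2.7 GHz

    atari_screen_bytes = 1 * 1024           # character-mapped display, ~1K
    modern_screen_bytes = 1920 * 1080 * 4   # 1080p framebuffer, ~8.3 MB

    print("clock speedup:       ~%.0fx" % (modern_clock_hz / atari_clock_hz))          # ~1508x
    print("screen memory ratio: ~%.0fx" % (modern_screen_bytes / atari_screen_bytes))  # ~8100x

Which lines up with the commenter's point: under these assumptions the processor got about 1,500 times faster, but the screen it has to push got about 8,000 times bigger.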