Hello, doctor. I am only 22 years old, but I think computers used to be faster. Why do I feel this way? Can you help me?
Well, it couldn’t be CPUs. Everyone knows that chips once doubled in speed every couple of years. That is slowing down a bit now. But it couldn’t be GPUs, either. They’ve seen impressive leaps in speed too.
Hmm? Are we developers to blame? No, I don’t think so. We’re getting better and better at operating our newfangled machines to take advantage of their futuristic capabilities. At the very least, the smart folks are writing languages and frameworks to bestow that power on the rest of us.
Of course, the extent to which most developers see this innovation is in taking embarrassingly data-parallel problems and slapping them on a GPU. But that’s something, isn’t it, doctor? Memory’s faster, CPU caches are bigger, and hard disks are faster than they used to be. SSDs are even faster than that.
Tell you about my past? Okay…
At my high school, there were these run-of-the-mill Windows 2000 machines. We programmed on them in VB6. And let me tell you, for all its downsides, VB6 was screaming fast. These crappy amateur applications were downright speedy. Everything was.
When I use a computer now, I don’t feel that way. I don’t feel good about the speed or crisp responsiveness of applications. Not on a desktop, not on a high-end laptop, and especially not on a mobile device. And since my job includes developing software for mobile devices, I have messed around with a great many of them.
I was deeply concerned by this. So I sat and I thought. Hmm. And it dawned on me: I don’t use real applications anymore.
I write this very psychiatric confessional blog post within a WYSIWYG editor running inside a web browser. (Sorry, doctor, but you’re just a figment of my overactive rhetorical imagination.) In the browser lies the problem. Almost all of the erstwhile proper applications that I use on a daily basis are now web applications.
Wait. Hold up. Give me a minute to breathe. We’re on to something.
What. What. This is not okay. This is so beyond not okay that I have difficulty articulating the manner in which it is not okay because I feel like it should be obvious and why aren’t people freaking out? I’m freaking out.
Calm down? Okay, doc, I’ll try.
[Minutes pass, glacially.]
I guess the problem that I have with this is that programs aren’t…programs anymore. They’re not running on my hardware. They’re running umpteen layers removed from my hardware. They exist on this fundamentally different substrate, one that can’t be efficiently implemented on the real machines that we’ve spent so much effort making fast for real programs. It’s insane.
The things that are running natively, such as mobile games, are running on underpowered hardware. And still a great number of them rely on internet services to function properly. Like a parasite, no, like an organ. Everything is part of this enormous whole, of which I am somehow not a part, and I wanted the brain and the muscle but you handed me a kidney.
Developers? No, I don’t fault them at all. You’ve got to have fun, and you’ve got to make money, and you’ve got to make programs. That’s just how it is. Everybody but me seems to have accepted a long time ago that it’s okay for an application to take this many milliseconds to display a character on the screen, or that many milliseconds to scroll. Or very many milliseconds stoned on TCP, roundtripping to a server for my data.
And my poor data. It wasn’t safe on the ground, so we sent it to live in the cloud with foster parents. Now I don’t know where it is, but it isn’t here. It’s holed up in my web applications, and it never comes to visit.
In the meantime, I guess I’ll continue to sacrifice a fraction of a second off my life here and there for the convenience of using a web program.
Rendering to a canvas.
Running in a browser.
Rendering to a screen.