Computer technology and software starting to feel old?

Is anyone else starting to feel like everything in the world of personal computing is getting old? I mean, we’re getting processors that use less energy, and we’re upping hard drive capacities and RAM, but to me it seems like personal computing has reached a sort of plateau. Where is it headed (if anywhere), and when are we going to get there? To me, it seems like we are long overdue for the next big breakthrough. I think XP and OSX were big breakthroughs, but both of those happened a good while ago. Does anyone else feel like we are due for a big new and exciting change in the world of personal computing?

I don’t.

Computers are, essentially, doing what they’ve been doing for the last 15 years (ever since they really went mainstream with the 486, or the horrible old Mac LC a few years later). Sure, they’re doing it faster and better, but I don’t see a big change on the horizon; certainly nothing as significant as the adoption of the GUI over the console for day-to-day tasks (in the late 1980s), to say nothing of fundamental architectural changes (like the switch from tubes to transistors, back in the bad old days, long before my time). Maybe in a few decades they’ll have figured out basic quantum computing techniques. That would be a “big new and exciting change”, but it will take (much) longer still to filter down to the consumer.

I’d say that for the foreseeable future, on the hardware end of things, we’ll see a de-emphasis of clock speed (this is already occurring, of course, and has as much to do with marketing as it does with architecture) and a focus on parallel execution, as manufacturers try to build more and more cores onto a die instead of trying to make each one faster. To the average user, this won’t make much of a difference in “the computing experience”, but it does present some fundamental problems for the programmers trying to make that parallelism useful to the average user. The techniques they will eventually have to adopt have been in use for decades on the multiprocessor systems found in business and scientific computing, so it’s hardly new ground. Meanwhile, chip fabrication is moving to denser and denser dies, with smaller and smaller transistors. There appeared to be some difficulties at 90 nm, but those turned out to be more of a speed bump than a roadblock, and now many manufacturers are moving to 65 nm processes. Again, an evolutionary rather than revolutionary change. Other “new” technologies, like PCI Express and Serial ATA, are similarly derivative.
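
(If you want a concrete picture of what that restructuring asks of programmers, here’s a toy sketch of my own, in Python, not taken from any real product: the same CPU-bound job done serially and then spread across a few worker processes. The four-way split standing in for “one chunk per core” is just an assumption for the example.)

    # Toy example: a CPU-bound sum of squares, first run serially, then split
    # across worker processes the way a multi-core machine would want it.
    from multiprocessing import Pool

    def crunch(chunk):
        # Stand-in for some CPU-bound work on one slice of the data.
        return sum(x * x for x in chunk)

    if __name__ == "__main__":
        data = list(range(1000000))
        chunks = [data[i::4] for i in range(4)]     # four slices, one per (assumed) core

        serial_total = sum(crunch(c) for c in chunks)       # one core does everything

        with Pool(processes=4) as pool:                     # four worker processes
            parallel_total = sum(pool.map(crunch, chunks))  # same work, spread out

        assert serial_total == parallel_total               # same answer, different plumbing

Nothing about the work itself changes; all the effort goes into carving it up and stitching the results back together, which is exactly what the business and scientific crowd has been doing for years.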

Perhaps the only big hardware change will come in hard drive technology, when the current form factors reach their maximum capacities (due to the areal density reaching the limit of the current magnetic technology). But this is several years away, and not before 3.5 in drives reach several terabytes each. It’s up for debate whether any new medium will be ready when this day arrives, but if not, the quantity or physical size of the drives will increase—in other words, a very mundane occurrence.
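
(For what it’s worth, the “several terabytes” figure falls out of simple back-of-envelope arithmetic. The areal density, platter dimensions, and platter count below are round numbers I’ve assumed for illustration, not anything from a manufacturer’s roadmap.)

    # Back-of-envelope: capacity of a 3.5 in drive from an assumed areal density.
    from math import pi

    areal_density_gbit_per_in2 = 500    # assumed near-limit density, gigabits per square inch
    outer_radius_in = 1.75              # 3.5 in platter
    inner_radius_in = 0.75              # assumed unusable hub region
    platters = 5                        # assumed platter count
    surfaces = platters * 2             # data on both sides of each platter

    usable_area_in2 = pi * (outer_radius_in ** 2 - inner_radius_in ** 2)
    capacity_gbit = areal_density_gbit_per_in2 * usable_area_in2 * surfaces
    capacity_tb = capacity_gbit / 8 / 1000      # bits to bytes, then GB to TB

    print("roughly %.1f TB per drive under these assumptions" % capacity_tb)

Nudge the assumed density up or down and the answer moves, but it stays in “several terabytes” territory either way.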

Contrary to your assertion, XP and OSX aren’t anything special. XP is very much a derivative work (of Windows NT 4.0 and 2000), while OSX is somewhat more novel, though hardly all-new. This isn’t to say that they’re not solid operating systems, both eminently suitable for most everyday tasks; it’s just that they’re merely ordinary. Vista has recently drawn a lot of attention to itself, but from the initial descriptions, it doesn’t appear to move far beyond where XP leaves off. I’d be remiss not to mention the recent growth of the open-source “movement”, but right now I get the sense that it is largely devoted to improving current projects rather than searching for the next big thing. Perhaps that’s understandable: the big software firms spend billions on R&D, and it’s somewhat less clear whether any really significant ideas will emerge, completed, from the loosely knit groups of coders in the open-source communities. (N.B. Firefox is not a big idea.)

In a wider sense, I could see tablet- or PDA-like devices taking off, but only with the combination of smaller portable computers, better workflow (UI, text and speech recognition, and so on), and a pile of “killer apps” to make them worthwhile. It will probably take a substantially reorganized OS to make this practicable. It could happen, but judging by the price of early adoption, it may take years and great marketing for this to catch on.

In essence, I see steady, gradual change on the horizon. It doesn’t make for a great story, but I think it’s the most realistic assessment of the state of the industry today. Maybe you’re just itching for a new computer to replace that old Mac LC?