Saturday, July 11, 2009

One Reason Why Computers Suck

Fate's theory on computers: there is no consumer computing device which actually works.

Everything you see at Best Buy, the lines of laptops from HP and Apple - all of them - are broken in some fundamental way. I don't care how awesome your home PC is; give me five or ten minutes with it and I'll find something you "should" be able to do that causes it to spit up all over itself. Funnily enough, there are a good number of people out there actually paid to do that. That particular talent is the reason I'm involved with the computing industry, and also the reason technology and I don't get along.

Now, in my profession I see a lot of computer equipment, and generally deal with it at a far more detailed level than most people do. Floating in my head are a great number of random facts about how various pieces of computer hardware work. What amazes me, though, is that when it comes right down to it, computers don't.

If you're reading this blog, you likely know the difference between hardware and software. Driver software is what makes the hardware tick. An outsider might expect driver software to be developed alongside the hardware - when the hardware is done, so is the driver. Look at the problem in any detail, though, and it's obviously something of a chicken-and-egg problem: how do you develop software for hardware that doesn't exist yet? I'll spare most of the gritty details of how that works (the sketch below gives the flavor) and jump straight to the results.
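Just a toy sketch, mind you - every name in it (FakeWidgetRegisters, WidgetDriver) is invented for illustration - but it's roughly how driver code gets written before any real silicon exists: against a software fake of the chip.

    class FakeWidgetRegisters:
        """A pretend bank of memory-mapped registers, simulated in plain Python."""
        def __init__(self):
            self._regs = {0x00: 0x0, 0x04: 0x0}  # 0x00 = STATUS, 0x04 = CONTROL

        def read(self, offset):
            return self._regs[offset]

        def write(self, offset, value):
            self._regs[offset] = value
            # The fake chip "responds": setting the power-on bit flips STATUS to ready.
            if offset == 0x04 and value & 0x1:
                self._regs[0x00] = 0x1

    class WidgetDriver:
        """Driver logic written against a register interface, not against real silicon."""
        STATUS, CONTROL = 0x00, 0x04

        def __init__(self, regs):
            self.regs = regs

        def power_on(self):
            self.regs.write(self.CONTROL, 0x1)         # set the power-on bit
            return self.regs.read(self.STATUS) == 0x1  # did the "chip" report ready?

    # Until the real card exists, the driver only ever gets exercised like this:
    driver = WidgetDriver(FakeWidgetRegisters())
    print("Device came up:", driver.power_on())

The gap between "the fake says it works" and "the real chip actually works" is where the results come from.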

And that result generally lands in the lap of either you - the consumer - or the OEM (big-box PC makers like HP and Apple). On a modern Windows PC, you've probably noticed "Windows Update" asking on a regular basis if you'd like to ruin... er... update parts of the software on your computer. Depending on your level of technical sophistication / bravery / stupidity / intelligence, you may have chosen to install these updates, ignore them, or install them at random. For today, we'll focus on the driver updates.

Now, let's say I'm an up-and-coming competitor to ATI and nVidia, and I've developed a new whizz-bang 3D video card with ten spanking-new features, including the ability to render photorealistic boobies in real time. Gamers the world over drool (understandably) over this new piece of computing excellence. The announcement is made, and the new Wank-O-Matic 5000 hits shelves. Immediately, my competitors start talking up how their next generation of cards will render even better photorealistic boobies in real time. That, however, doesn't stop gamers from lining up overnight to buy the Wank-O-Matic 5000.

On buying it, however, they take it home and discover that only eight of the ten new features work well, and the other two barely work at all. Sad and dismayed, gamers announce that it's a good video card, but that the major selling point is overrated. Not to be stopped, I make every gamer a promise - but wait, it'll be fixed in an update! As time goes on, the major issues with my new creation get fixed, and at long last the Wank-O-Matic 5000 does everything as advertised. The only thing is, no one is using it anymore; it's been obsoleted by the Wank-O-Matic 6000 and the up-and-coming Wanktastic 8G.

Which brings me to the heart of the matter: none of the computer hardware in existence today is fully utilized. Indeed, any computer purchased within the last year or so has a shiny new 64-bit chip in it - being used to run a 32-bit operating system. And this isn't a recent invention. The first 32-bit x86 processor (Intel's 386) was introduced in 1985; it wasn't until roughly ten years later, with Windows 95, that consumers could take full advantage of it.
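If you want to see the mismatch on your own machine, here's a rough sketch - assuming a Linux box with /proc/cpuinfo, and a Python build that matches the OS's bitness - checking whether the CPU advertises 64-bit "long mode" while everything running on top of it is still 32-bit:

    import platform

    def cpu_is_64bit_capable():
        # True if the CPU advertises the 64-bit "long mode" ('lm') flag in /proc/cpuinfo.
        flags = set()
        try:
            with open("/proc/cpuinfo") as f:
                for line in f:
                    if line.startswith("flags"):
                        flags.update(line.split(":", 1)[1].split())
        except OSError:
            pass
        return "lm" in flags

    os_bits = platform.architecture()[0]   # "32bit" or "64bit" for this build

    print("OS / userland:", os_bits)
    print("CPU is 64-bit capable:", cpu_is_64bit_capable())

    if cpu_is_64bit_capable() and os_bits == "32bit":
        print("Shiny 64-bit chip, 32-bit software - exactly the waste I'm complaining about.")

The 'lm' flag comes straight from the CPU, so it generally shows up even when the operating system sitting on top of it has no idea what 64-bit means.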

I know it's difficult, but at some point I just have to wonder. The personal computing industry is now over 30 years old. At what point do we stop saying "computers are new" and start expecting everything to work as advertised when we buy it? When do we start expecting hardware vendors not to just say "we'll fix it later," when by the time the fix arrives the product is too out of date to matter?

Thinking about it, the answer is obvious to me: people are happy with mediocre. Computers crash; things occasionally just don't work. Part of me hopes that will change one day - that people will start expecting this sort of technology not to crash or be difficult to use. Another part of me, I must admit, is happy we techies can get away with half-assed solutions. Because really, as long as it's good enough, it works, right?
