What cost, software innovation?

April 4, 2007

Much fun has been had of late by the anti-Microsoft brigade as Vista, the new $9 billion Windows operating system release, has hit desktops. There have been concerns about the cost of the UK licence, worries about general uptake within the business community and reports of incompatible device drivers.

A particular point being raised a great deal is that Vista requires some serious ‘beef’ when it comes to hardware, and many users will need to upgrade. Indeed, environmental campaigners have raised concerns about the potentially unnecessary dumping of old computers (old as in last year’s models).

What’s the response from Microsoft? Well, Andrew Herbert, speaking at last week’s jubilee event (see previous blog entries), made an interesting point. He said that new software and operating systems are planned to remain in distribution for six or so years, which forces system designers to think carefully about what future hardware will be capable of. This is why new operating systems are often quite ‘clunky’ when first released: they push the technical limits of current PCs (processor, memory and so on) in the knowledge that Moore’s Law, with its steady compounding of computing power, will deliver the goods within a few months or years.

I’m sure this is scientifically and technically true, and it’s a view that fits with the history of the personal computer. But I think it raises a question: after thirty years, should the computer industry continue to prioritise software innovation over making good use of previous generations of hardware? The question seems particularly pertinent as more and more applications are delivered as services over the Web. It sets operating-system designers a challenge: can they design more backwards-compatible systems that work really well on new kit but are still adequate on older machines?