When details of the next version of Microsoft Windows first started appearing, there was much talk of it being far more modular in nature. Microsoft even demoed a bare-bones ‘MinWin’ that stripped away the many layers of Vista, leaving a fast and compact core that used 40MB of RAM and 25MB of disk space. For once it sounded like Microsoft’s engineers were leading the development of Windows 7 instead of the marketing teams.
Unfortunately things don’t seem to be going the way I had hoped. Ars Technica has an article titled ‘Why modular Windows will suck for Microsoft and suck for you’ that makes a good argument against the idea. It’s all about how Windows will become modular: through software or services that are bought individually or subscribed to. Users who want the full Windows package end up paying more for it, while those who strip out the bits they don’t use could get a cheaper deal.
It doesn’t sound good, and at first look it seems like a nightmare to support and produce software for. But isn’t this a missed opportunity? Windows has been accused for many years of being bloated by support for software and services going all the way back to DOS. The marketing department doesn’t want to rip that out, since some customers will still want it. So why not turn all the legacy support into independent modules that load when needed? There’s much talk of software going parallel as multiple processor cores become the norm, so why can’t Windows do the same?

Imagine a Windows that at its heart is a compact, fast and secure core of vital services. On top of this sits the ‘user experience’ module for the desktop, the ‘communications’ module for networking, and the ‘media’ module for sound and video. The standard ‘software’ module provides support for the latest APIs and services needed to run modern software from Vista and perhaps XP. That’s it: no support for anything older, just what you need for a fast computing experience. If you want to run older software, one or more ‘compatibility’ modules for Windows 98, DOS and so on load when you need them, or you could choose to load them at startup for convenience.
Just like Mac OS X, the core or kernel could be replaced with a new version for bug fixes and optimisations. When new APIs and modules become available, they slot into the existing framework and just work.
It’s unlikely to happen like this, since the software world seems to be moving towards online applications and subscription models. The days of the OS as a focus are numbered if you listen to the analysts, but weren’t we all predicted to be driving hover cars and living on the moon by now?