What happens when the operating system you use doesn’t really matter any more? It started with dual-booting Windows and Linux, and with tools like CrossOver Office for running Windows apps under Linux (balky at best). Then came things like Virtual PC for Mac, and now we have Macs with Intel chips that can dual-boot Windows and Mac OS X via Boot Camp. But dual-booting is a pain, because you have to close everything and restart your computer.
Virtualization is where it’s at — running two operating systems side by side, so you can flip back and forth. I’ve never used it, but Parallels looks like a truly amazing experience: Windows XP and Mac OS X running right next to each other, and the latest upgrade lets you move Windows apps outside the Parallels window and drag and copy things from one OS to the other. Very cool. Michael Verdi has a screencast here.
There has been talk that Apple would include some form of virtualization in Leopard, the next upgrade to the Mac OS, but Apple executives recently quashed that speculation, saying the company is happy with Boot Camp and that Parallels involves “performance degradation” — by which they mean it makes your system run a lot slower. Some Parallels users have said the same, but others report that it runs fine for most normal computing tasks (in other words, no video games or other graphics-hogging apps).
If you can run Mac OS and Windows on the same machine, use whichever program you want, and drag data back and forth at will between the two, what does an operating system mean? In a sense, it becomes a visual preference rather than a system or standards choice. And if you spend most of your time using Web apps, the operating system means even less. We’re not quite there yet, of course, but would such a world help Apple or Microsoft more?