I won't anger Vista's fans by saying that Vista should be scrapped and an updated version of XP released with all of Vista's good stuff in it, but let's face it, that's the right choice, isn't it?
Let me tell you an analogous story. It's about a product called the [url=http://en.wikipedia.org/wiki/GeForce_FX_Series]GeForce FX.[/url]
The time was late 2002. The Radeon 9xxx series was out and selling like hotcakes. It was a massive performance boost over the popular GeForce 4 series, it provided DirectX 9-level features for the first time, and it was quite fast at performing them. For the very first time, ATi had released a competing card that was flat-out superior in every way to anything nVidia had.
Of course, everyone waited. The GeForce FX was on the way, nVidia's answer to ATi's gauntlet.
Then the FX hit. Benchmarks came out. Every performance review said it was inferior even to ATi's weaker 9xxx cards. Even Gabe Newell of Valve said that HL2 would run better on Radeons than on GeForce FX cards. The FX's DX9 performance was nothing next to that of the Radeon 9xxx cards.
Everyone was screaming for nVidia to throw away the FX and bring out a completely new card. They called the FX hardware crap and suggested that nVidia should start from scratch.
They didn't.
The wildly popular GeForce 6xxx and 7xxx series cards share the same hardware structure as the FX. Oh, they tweaked the internals for performance and such. But the FX architecture was their foundation; they were products derived from that architecture rather than entirely new cards. Yet the 6xxx and 7xxx easily stomped all over ATi's comparable offerings in performance tests. Why?
Because the Radeon 9xxx architecture was an evolutionary dead end. Oh, for its day, the performance was great. But its architecture was firmly rooted in square one, in the old way of thinking about GPU hardware. The FX architecture was incredibly forward-looking.
In short, the drumming that the FX took was
necessary to provide nVidia with the successes of the 6xxx and 7xxx lines. We've seen similar things with the GeForce 8xxx line: very expensive cards that perform only slightly better than the 7xxx's. But the 8800 GT and the new GeForce 9xxx's are much cheaper and much faster. Why? Because the early GeForce 8xxx's were, again, a new architecture, one that needed some tweaking to get good performance and so forth.
nVidia pulled a bit of misdirection on the entire industry when they dropped the FX name from the 6xxx products. That made people think the FX hardware was long gone when, in fact, it was right there all along.
Let me tell you another story. This one is of an Operating System. It was called MacOS X. You may have heard of it.
MacOS X has no relation to MacOS 9. And by no relation, I mean
none. They don't think the same, they don't work the same, etc. MacOS 9 was basically the worst operating system you can imagine. Basic concepts like memory protection, which had been in competing OSes like Win95 and Linux, were absent. Apple knew that MacOS sucked. So they threw the whole thing out.
MacOS X 10.0 was crappy. Just about everyone called it a downgrade from MacOS 9. It was slow, buggy, annoying, etc. Exactly what people call Vista now. But it was a modern operating system, and it provided the features that a good OS should.
MacOS X 10.5, the modern version, is considered one of the best OSes ever. It's built on the foundation of 10.0, just like any good OS revision should be. It's slick, smooth, fast, etc. The growing pains of 10.0 were
necessary to create a more modern OS.
The point is this: Vista may seem crappy now. It may ultimately be crappy in the future. But Vista's internal technology
will be in future Windows OSes. Oh, Microsoft will probably focus on usability enhancements in the next Windows revision. But just because it's not named "Vista" doesn't mean that Vista's guts won't be there.
These kinds of growing pains are necessary, just as the growing pains from Win3.1 to Win95 were.