Conclusion

Now that we have seen what both ATI and NVIDIA could do with their respective drivers over the previous generation, and how this is apparently influenced by console development, let's first talk a bit about NVIDIA specifically on the subject of PC-native games.

Overall, the performance improvements with the 6800 Ultra are a bit disappointing. With a new architecture, we had hoped NVIDIA would be able to pull out some of their famous performance increases, and this did not happen. This is not to discount the 6800 Ultra, as the entire NV40-based family are quite strong cards. However, part of competing in this market is working out performance improvements over a card's lifetime: doing so wins the mindshare of buyers by showing them that they will get more out of their purchases over time, and thus creates better positioning for future products. With only a handful of significant (>10%) performance improvements, it's hard to say NVIDIA has done this well.

Accordingly, NVIDIA didn't end up following all of the trends we previously saw with ATI. In general, NVIDIA was able to provide a small general performance improvement with each of their drivers rather than relying solely on one-time performance boosts, but the most promising period for a performance boost is still shortly after a game comes out. In fact, the biggest performance gains NVIDIA achieved appear to have come from debugging and optimizing the drivers early in the life of the hardware. As with ATI's Catalyst drivers, there seems to be little reason to do a ForceWare driver upgrade unless there is a specific change in a driver you are interested in, and this likely isn't going to change with the NV40-derived G70 series.

On the positive side, we reiterate how impressed we are with NVIDIA's lack of performance improvements in 3DMark. Whether this is because they didn't devote resources to it, or because they simply couldn't pull it off, is something we may never know, but it's nice to see that their greatest improvements came in real games, not synthetic benchmarks. Similarly, we are glad to see that over the last year and a half they haven't made the same mistake as ATI by shoehorning users into a bloated toolset to manipulate their cards. The NVIDIA control panel is lean and mean, and that's the way we like it.

As for the burning question of performance improvements with drivers, ATI has clearly shown that they can work magic with their drivers when they need to. This in turn reiterates just how much of an impact drivers can have on overall performance, and ultimately on purchasing choices. Even so, these large performance changes would not likely have altered our initial purchasing recommendations, though some come close to that threshold.

It is clear that looking at a card only once at its introduction is not enough; the performance improvements and corrections offered by later drivers are just too significant to ignore. Yet short of re-reviewing a dozen cards every time a new driver is released - and even then we can't very well tell the future - we have no way of really measuring this impact except with hindsight, which is by its very nature too late. We can offer guesses at what kind of performance improvements might be in the pipeline from NVIDIA and ATI, and certainly they will make every effort to tell the world how proud they are whenever they do manage to come up with such an improvement. The reality is that many of the proclaimed improvements apply to uncommon settings - e.g. running a budget card at high resolutions with AA/AF enabled - and judging by the results of our regression testing, buyers should be wary of the marketing hype.

At the same time, the increase in games being ported from or developed alongside console versions paints a very different picture. With the current generation of consoles, NVIDIA has benefited from a nice performance lead from what we've seen today, as Xbox-only titles have been much harder on ATI than on NVIDIA. Since NVIDIA is effectively the de facto leader in console GPUs for the current generation (ATI only has a chip in the GameCube, and few of its titles are ported over to the PC), there appears to be little reason to expect major performance improvements out of NVIDIA's drivers on Xbox ports, while ATI appears to be able to work out (and need) performance improvements for Xbox-ported games.

That said, the number of games still being ported from current-generation consoles to the PC is dwindling, which brings us to the next generation. ATI powers the Xbox 360 with an R500-like chip and will power the Wii with what we believe to be an R300-like successor to the GameCube GPU. Meanwhile, NVIDIA will be powering the PlayStation 3 with a straight-up G70-generation chip. If our results from the Xbox ports are any indication, there will be a natural favoritism in games that are exclusive to one console or another. Ultimately this may mean that the home team for a game will not need to improve driver performance for those games due to an inherent lead, while the visiting team will have to make optimizations to catch up. This is in stark contrast to the way PC-native games work, and cross-platform titles are still entirely up in the air. This facet of performance will be an interesting subject to watch over the next few years as the next-generation consoles come out.

In the meantime, getting back to ATI, NVIDIA, and the PC, what have we learned today? We still can't predict the future of PC games, drivers, or hardware, but after looking at these results and the new track records, it certainly seems that with the introduction of the X1900 and 7900 series, ATI may be in a better position than NVIDIA to offer substantial performance improvements. As we've seen, NVIDIA hasn't been able to work a great deal of improvement out of the NV40/G70 architecture in our tests. With the R500 being quite different from the venerable R300, we're interested to see what sort of improvements ATI can get out of their new architecture. If ATI stays ahead in the driver game, they may very well have what it takes to beat out NVIDIA in the majority of games running on the current-generation architectures.

Comments

  • z3R0C00L - Thursday, May 11, 2006 - link

    Forgot to mention another fact..

    nVIDIA releases BETA drivers on a regular basis (usually not too stable and still plagued with issues; they may fix a few problems in certain games but break others.. of course they're BETA).

    ATi releases WHQL drivers each month. A new set is released on a monthly basis, usually with MANY bug fixes. ATi is quicker at fixing issues than nVIDIA. nVIDIA releases a WHQL'd driver only every so often (like 4 times a year.. maybe up to 6 if lucky).

    This means most nVIDIA users run BETA, untested drivers. You're the guinea pigs. ATi at least rigorously tests each release and even has a 3rd party corporation (Microsoft) test and certify them. This is a commitment to the utmost quality in drivers.
    So those who complain about ATi drivers are quite honestly liars (they probably own competitor cards and suffer from a disease known as fanboyism).

    These are FACTS. Call me a fanboy for posting FACTS.. it's ok.. Anandtech knows it's true, as do HardOCP, Tom's Hardware, Elitebastards and Beyond 3D.

    Before I leave I want to post another FACT. nVIDIA's OpenGL drivers remain top dog. This is NOT because they code the drivers better. It's because nVIDIA owns more OpenGL extension patents that are more widely used by OpenGL game devs. Most devs use nVIDIA OpenGL paths rather than ATi paths, forcing ATi to either use a path optimised for nVIDIA cards or a generic path. This is partly ATi's fault for not creating their own path and pushing devs to use it.

    There... the FACTS. ;)
  • gamara - Wednesday, May 17, 2006 - link

    There are always fewer bugs to fix in less buggy code. If ATI got it right the first time in more cases, maybe their bug fix total wouldn't be as high. I have to agree with another poster on having several issues with ATI drivers in some games and not having anywhere near the same number of issues with beta drivers from nVidia. I had more issues with drivers on a single ATI card than on my Riva TNT, GeForce2, GeForce4, FX5600 Ultra, 6600GT, and 7800GTX combined.
  • Ryan Smith - Thursday, May 11, 2006 - link

    quote:

    nVIDIA releases a WHQL'd driver once every soo often (like 4 times a year.. maybe up to 6 if lucky).


    You have the right idea, but the wrong terminology. In the past, Nvidia has only released around 4 official drivers a year, compared to ATI's 12 (though they have since been releasing more often). However, they submit many more drivers for WHQL certification than those 4; usually any "beta" drivers they officially release are already WHQL certified. Unlike ATI, there are non-certified drivers out there too, since Nvidia shares its drivers more freely with its OEM partners than ATI does, and hence you'll see a leak now and then, but for the most part Nvidia drivers are WHQL certified. In fact, for this article I reference the following:

    However, given the simply enormous number of such drivers, we used only Windows Hardware Quality Labs (WHQL) certified drivers, which means these are drivers NVIDIA was confident enough to release in a final form and submit to Microsoft for testing.
  • Wesleyrpg - Thursday, May 11, 2006 - link

    hey there,

    you guys mention FFXI tests on page 3, but there are no results on any of the pages? what's up with that?
  • Wesleyrpg - Thursday, May 11, 2006 - link

    whoops.... it's under console ports
  • Ryan Smith - Thursday, May 11, 2006 - link

    Sorry about that, it's been made clearer now. Karen is on vacation, and it's more or less the worst kept secret in the world that we're terrible bachelors when it comes to writing.
  • etriky - Thursday, May 11, 2006 - link

    The reason ATI has been able to get large increases out of their drivers is that their first ones are terrible. I'll be the first to admit they make very good hardware. But their control panel is annoyingly bloated and driver stability is terrible. My hat's off to people that will put up with their software.
  • Griswold - Thursday, May 11, 2006 - link

    The usual humbug. The only thing I can agree with is that CCC is unwanted bloatware. Besides that, ATI's drivers are excellent, and that is coming from a current nvidia user (though I still have an old box with my trusty 9700 Pro, and I've enjoyed the Catalysts ever since I bought this card a few months after its launch).
  • Spoonbender - Thursday, May 11, 2006 - link

    Currently they are, yes. But some years back, they sucked.

    I think the article neglects to mention the possibility that maybe NVidia's drivers were just better optimized to begin with. If that's the case, ATI has "merely" been catching up.

    I don't even find it unlikely. Today, both ATI and NVidia have great drivers (some find the ATI control panel a bit bloated, but that's hardly a *driver* issue).
    But, say, 5 years ago, ATI just couldn't make drivers, while NVidia was about as good as they are today. So ATI has obviously been catching up, and obviously, the scores in this article reflect that. ATI has just had more room for optimizing because they started at a disadvantage.

    So I'm not sure I agree with the article that "ATI is the victor for getting the most out of its drivers." That's only true if we assume they were even when they started out.

    However, one final thought. It's pretty clear that if you want an accurate picture of performance, you should wait at least two driver revisions from launch. Seems to more or less stabilize after that. So is there any chance, with future hardware releases, that you're going to revisit them after, say, two driver updates? Would be interesting to say the least.
  • Griswold - Friday, May 12, 2006 - link

    I was talking about a few years back. I didn't buy that 9700 last year; I bought it at the end of 2002, and it was launched in August 2002. I've never had any issues with ATI's drivers with that card to this very day. YMMV, but you won't hear me say "they suck" for an obvious reason.
