NVIDIA's Back with NV35 - GeForceFX 5900 Ultra
by Anand Lal Shimpi on May 12, 2003 8:53 AM EST - Posted in GPUs
Sneak Preview - Doom3 Performance
What's the most frustrating part of talking about all this high-power DX9 hardware? There are absolutely no games available that truly stress it; the market is waiting for the release of one very pivotal title, the game that will determine whether ATI or NVIDIA is crowned the true next-generation performance king: Doom3.
We have been anxiously awaiting the release of a pre-release benchmark version of Doom3 for our graphics tests, and last Thursday we were given a unique opportunity to benchmark Doom3 on a number of the latest GPUs. The opportunity was put together by id Software and NVIDIA, and was very limited in nature, but it was something we had to jump on.
We were given one evening with a system running Doom3, to test the latest GPUs from both ATI and NVIDIA using whatever drivers we pleased using whatever settings we would like. The only stipulation was that we could not comment on how the game looked, just on the performance of the GPUs. Doom3 is still a while away from shipping and thus we can understand id not wanting us to talk about the look of the game, and we gladly agreed to their terms in order to bring you a quick look at the performance of Doom3 on today's GPUs.
Here are some words of forewarning, though. For starters, neither ATI nor NVIDIA gave us Doom3-optimized drivers to test with; we tested using the same drivers we used in all of our other tests (both drivers will be publicly available next week from ATI and NVIDIA). We actually ended up using ATI's Catalyst 3.2 instead of 3.4 for our Doom3 tests simply because the Catalyst 3.4 drivers we had were significantly slower in Doom3 and much less stable than the 3.2 release.
NVIDIA obviously wouldn't agree to this opportunity unless they knew they would come out ahead in the benchmarks, and as you are soon to see, they did. John Carmack did have a few words of warning, which we received after our benchmarking time with the system was up:
"We have been planning to put together a proper pre-release of Doom for benchmarking purposes, but we have just been too busy with actual game completion. The executable and data that is being shown was effectively lifted at a random point in the development process, and shows some obvious issues with playback, but we believe it to be a fair and unbiased data point. We would prefer to show something that carefully highlights the best visual aspects of our work, but we recognize the importance of providing a benchmark for comparison purposes at this time, so we are allowing it to be used for this particular set of tests. We were not happy with the demo that Nvidia prepared, so we recorded a new one while they were here today. This is an important point -- while I'm sure Nvidia did extensive testing, and knew that their card was going to come out comfortably ahead with the demo they prepared, right now, they don't actually know if the demo that we recorded for them puts them in the best light. Rather nervy of them, actually.
The Nvidia card will be fastest with "r_renderer nv30", while the ATI will be a tiny bit faster in the "r_renderer R200" mode instead of the "r_renderer ARB2" mode that it defaults to (which gives some minor quality improvements). The "gfxinfo" command will dump relevant information about the functioning renderer modes and optimizations. At some point, after we have documented all of the options and provided multiple datasets, Doom is going to be an excellent benchmarking tool, but for now you can still make some rough assessments with it."
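Based on Carmack's notes above, the renderer path can be switched from the in-game console. A minimal sketch of the relevant commands follows; the comments and exact behavior in this pre-release build are our assumptions, not id's documentation:

```
r_renderer nv30   // NVIDIA-specific path; fastest on GeForceFX cards
r_renderer R200   // slightly faster path on ATI hardware
r_renderer ARB2   // default ARB2 path; minor quality improvements
gfxinfo           // dumps the active renderer mode and optimizations
```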
So although the game isn't finished and the drivers are not tuned, we should be able to get a decent idea of how performance is shaping up under Doom3. We were only able to test the following cards:
ATI Radeon 9800 Pro
ATI Radeon 9600 Pro
ATI Radeon 9200
NVIDIA GeForceFX 5900 Ultra
NVIDIA GeForceFX 5800 Ultra
NVIDIA GeForceFX 5600 Ultra
NVIDIA GeForceFX 5200 Ultra
The only options we could change in-game were the quality settings, which could be set to low, medium, or high. We benchmarked both medium and high, but on the fastest cards the performance drop in high quality mode was negligible, so due to time constraints we stuck to reporting the medium detail level scores.
id allowed us to publish a screenshot to give you an idea of the level of detail we're talking about with Doom3. The game is nothing short of impressive and makes the rest of our benchmark suite look extremely dated.
We ran at 1024x768, 1280x1024 and 1600x1200 with AA/Aniso off as well as with 4X AA and 8X Quality Aniso enabled.
Excited yet? Let's get to the results…
19 Comments
Anonymous User - Thursday, October 16, 2003 - link
After reading this article, how can I determine which GeForceFX 5600 card has the NV30 core or the NV35 core? I'm currently interested in purchasing one, but none of the retail boxes or the manuals on the manufacturers' web sites say anything about the type of core used. Did NVIDIA correct themselves and use the NV35 core before releasing their 5600 cards to the market? Or are there NV30-based 5600 cards on the retail shelves too? Help is appreciated. Thanks.
Anonymous User - Saturday, September 6, 2003 - link
You should be ashamed. The linking of words to completely unrelated MARKETING ADS is absolutely ridiculous...as if you don’t have ENOUGH ads already. -J
Shagga - Saturday, August 9, 2003 - link
I certainly found the article informative. I read the article with a view to making a decision on which card to purchase over the next week or so, and to be honest the article said enough to convince me to sit tight. I also felt there is more to come from both ATI and nVidia, and the results which are presented are perhaps not entirely complete. This is pointed out by Anand, and at $499 I need to be making the right choice; however, Anand did succeed in convincing me to wait a tad longer. Good article, I thought.
Anonymous User - Friday, August 1, 2003 - link
Please stop using Flash graphics!
Pete - Tuesday, July 22, 2003 - link
It's only fair that I praise the article, as well. As I said above, in the initial article comment thread, I congratulated Anand on what I thought was a well-written article. I appreciate his lengthy graphics pipeline summary, his extensive image quality investigation, and his usual even-handed commentary (though I had problems with the latter two).
Pete - Tuesday, July 22, 2003 - link
I think this is a great article with a few significant flaws in its benchmarking.
Firstly, the Doom 3 numbers. Anand acknowledged that he could not get the 9800P 256MB to run the tech demo properly, yet he includes the numbers anyway. This strikes me as not only incorrect but irresponsible. People will see 9800P 256MB numbers and note that its extra memory makes no difference over its 128MB sibling, yet only if they read the article carefully would they know that the driver Anand used limits the 9800P 256MB to only 128MB, essentially crippling the card.
Also, note the difference between Medium Quality and High Quality modes in Doom 3 is only anisotropic filtering (AF), which is enabled in HQ mode. Note that forcing AF in the video card's drivers, rather than via the application, will result in higher performance and potentially lower image quality! This was shown to be the case both in a TechReport article on 3DM03 ("3DMurk"), in forum discussions at B3D, and in an editorial at THG. Hopefully this will be explored fully once a Doom3 demo is released to the public, and we have more open benchmarking of this anticipated game.
Secondly, Anand's initial Quake 3 5900U numbers seemed way off compared to other sites that tested the same card in similar systems at the same settings. At 1600x1200 with 4xAA 8xAF, Anand was scoring over 200fps, well higher than any other review. And yet, after weeks of protest in the forum thread on this article, all that happened was the benchmark results for 12x10 and 16x12 were removed. The text, which notes:
"The GeForceFX 5900 Ultra does extremely well in Quake III Arena, to the point where it is CPU/platform bound at 1600x1200 with 4X AA/8X Anisotropic filtering enabled."
was left unchanged, even though it was based on what many assumed were erroneous benchmark data. I can only conclude that the data were indeed erroneous, as they have been removed from the article. Sadly, the accompanying text has not been edited to reflect that.
Thirdly, the article initially tested Splinter Cell with AA, though the game does not perform correctly with it. The problem was that NVIDIA's drivers automatically disable AA if it's selected, yielding non-AA scores for what an unsuspecting reviewer believes is an AA mode. ATi's drivers allow AA, warts and all, and thus produce appropriately diminished benchmark numbers, along with corresponding AA errors. The first step at correcting this mistake was to remove all Splinter Cell graphs and place a blurb in the driver section of the review blaming ATi for not disabling AA. Apparently a second step has been taken, expunging Splinter Cell from the article text altogether. Strangely, Splinter Cell is still listed in the article's drop-down menu as p. 25; clicking will bring you to the one last Quake 3 graph with the incorrect analysis, noted above.
Finally, a note on the conclusion:
"What stood out the most about NVIDIA was how some of their best people could look us in the eye and say "we made a mistake" (in reference to NV30)."
What stands out most to me is that NVIDIA still can't look people in the eye and say they made a mistake by cheating in 3DMark03. Recent articles have shown NVIDIA to be making questionable optimizations (that may be considered cheats in the context of a benchmark) in many games and benchmarks, yet I see only a handful of sites attempt to investigate these issues. ExtremeTech and B3D noted the 3DMark03 "optimizations." Digit-Life has noted CodeCreatures and UT2K3 benchmark "optimizations," and Beyond3D and AMDMB have presented pictorial evidence of what appears to be the reason for the benchmark gains. NVIDIA appears to currently foster a culture of cutting corners without the customer's (and, hopefully, reviewer's) knowledge, and they appear reticent to admit it at all.
I realize this post comes off as harsh against both Anand and NVIDIA. In the initial comment thread on this article, I was gentler in my (IMO, constructive) criticism. As the thread wore on for weeks without a single change to the multiple errors perceived in the original article, I gradually became more curt in my requests for corrections. Anand elicits possibly the greatest benefit of the doubt of any online hardware reviewer I know, as I've read his site and enjoyed the mature and thoughtful personality he imbued it with for years. I'm sorry to say his response--rather, his lack of response, as it was only Evan and Kristopher, not Anand, who replied to the original article thread--was wholly unsatisfactory, and the much belated editing of the article into what you read today was unsatisfactory as well. I would have much preferred Anand(tech) left the original article intact and appended a cautionary note or corrected benchmarks and commentary, rather than simply cutting out some of the questionable figures and text.
Consider this post a summation of the criticism posted in the original article thread. I thought it would be useful for putting this article in context, and I hope it is taken as constructive, not destructive, criticism. The 5900 is no doubt a superior card to its predecessor. I also believe this article, in its current form, presents an incomplete picture of both the 5900U and its direct competition, ATi's 9800P 256MB. Hopefully the long chain of revelations and commentary sparked by and after this article will result not in hard feelings, but in more educated, thorough, and informative reviews.
I look forward to Anandtech's next review, which I believe has been too long in coming. :)