The Test

Our test platform is the same as the one we used in our recent articles. Necessarily departing from our norm, this round of testing is performed under the 32-bit version of Windows Vista. We used the latest beta drivers we could get our hands on from both AMD and NVIDIA. Here's a breakdown of the platform:

Performance Test Configuration:
CPU: Intel Core 2 Extreme X6800 (2.93GHz/4MB)
Motherboard: ASUS P5W-DH
Chipset: Intel 975X
Chipset Drivers: Intel 8.2.0.1014
Hard Disk: Seagate 7200.7 160GB SATA
Memory: Corsair XMS2 DDR2-800 4-4-4-12 (1GB x 2)
Video Card: Various
Video Drivers: ATI Catalyst 8.38.9.1-rc2
NVIDIA ForceWare 162.18
Desktop Resolution: 1280 x 800 - 32-bit @ 60Hz
OS: Windows Vista x86


We were also able to obtain a beta version of FRAPS from Beepa to record average framerates in DirectX 10 applications. Without it, we were previously limited to testing only applications that generate statistics for us. Armed with a DX10-capable version of FRAPS, we can now also take a look at the performance of DX10 SDK samples and other demos that don't include built-in frame counters.
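The average framerate a tool like FRAPS reports boils down to simple arithmetic over per-frame timing data: frames rendered divided by elapsed time. The sketch below illustrates that calculation; the log format (one frame timestamp in milliseconds per line) is a simplified assumption for illustration, not FRAPS's actual file layout.

```python
# Sketch: computing an average framerate from per-frame timestamps,
# as a benchmarking tool might record them. The input format here
# (timestamps in milliseconds) is an assumption for illustration.

def average_fps(frame_times_ms):
    """Average framerate over a run, given per-frame timestamps in ms."""
    if len(frame_times_ms) < 2:
        raise ValueError("need at least two frames to measure a rate")
    elapsed_s = (frame_times_ms[-1] - frame_times_ms[0]) / 1000.0
    # Intervals between frames, not frame count, divided by elapsed time
    return (len(frame_times_ms) - 1) / elapsed_s

# Example: 61 frames spaced 16.7 ms apart, i.e. roughly 60 fps
timestamps = [i * 16.7 for i in range(61)]
print(f"{average_fps(timestamps):.1f} fps")
```

This is why a framerate counter needs a window of frames rather than a single frame: one frame gives a duration only in combination with its neighbor.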

For now, though, we are sticking with real world performance. We'll be looking at Call of Juarez, Company of Heroes, and Lost Planet: Extreme Condition. Except for Call of Juarez, we will be looking at both DirectX 9 and DirectX 10 path performance. The Call of Juarez benchmark explicitly highlights the enhanced features of its DX10 path, and the developers don't offer an equivalent benchmark for DX9. If there is demand for Call of Juarez benchmarking down the road, we may look at using FRAPS with both the DX9 and DX10 versions. Lost Planet testing required the use of our DX10 version of FRAPS, but Company of Heroes testing was performed using the same method previously available (the performance test in the graphics options section).

In addition to looking at each game on its own, we will take a look at how DX9 and DX10 performance compare overall. Performance scaling with and without AA under each API as well as relative performance of cards under each API will be analyzed.
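The cross-API comparison described above reduces to expressing each card's DX10 result as a percentage of its DX9 result. A minimal sketch of that calculation follows; the game names and framerate numbers are placeholders, not measured data from this review.

```python
# Sketch: DX10 performance expressed as a percentage of DX9 performance.
# All numbers below are hypothetical placeholders, not benchmark results.

def pct_of_dx9(dx9_fps, dx10_fps):
    """DX10 average framerate as a percentage of the DX9 result."""
    return 100.0 * dx10_fps / dx9_fps

# (DX9 fps, DX10 fps) pairs for one hypothetical card
results = {
    "Company of Heroes": (70.0, 49.0),
    "Lost Planet":       (55.0, 41.0),
}

for game, (dx9, dx10) in results.items():
    print(f"{game}: {pct_of_dx9(dx9, dx10):.0f}% of DX9 performance")
```

The same ratio can be computed with and without AA enabled to see whether the DX10 path's relative cost grows under antialiasing.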


  • slickr - Monday, July 9, 2007 - link

    Great review, that's what we all need to get Nvidia and ATI to stop messing around and taking our money with slow hardware that can't even outperform the last generation's hardware. If you ask me, the 8800 Ultra should be the mid-range $150 class here, and the top end should be some graphics card with 320 stream processors and 1GB of GDDR4 clocked at 2.4GHz with a 1000MHz core clock. Same from AMD: they need the X2900XT to be the mid-range $150 class, and top of the line should be some graphics card with 640 stream processors, 1GB of GDDR4 at 2.4GHz, and a 1000MHz core clock!

    More reviews of this kind please, so we can show ATI and Nvidia that we won't buy their hardware if it's not good!!!!!!!!

  • ielmox - Tuesday, July 24, 2007 - link

    I really enjoyed this review. I have been agonizing over selecting an affordable graphics card that will give me the kind of value I enjoyed for years from my trusty and cheap GF5900xt (which runs Prey, Oblivion, and EQ2 at decent quality and frame rates) and I am just not seeing it.

    I'm avoiding ATI until they bring their power use under control and generally get their act together. I'm avoiding nVidia because they're gouging the hell out of the market. And the previous generation nVidia hardware is still quite costly, because nVidia knows very well that they've not provided much of an upgrade with the 8xxx family unless you are willing to pay the high prices for the 8800 series (what possessed them to use a 128-bit bus on everything below the 8800? Did they WANT their hardware to be crippled?).

    As a gamer who doesn't want to be a victim of the "latest and greatest" trends, I want affordable performance and quality and I don't really see that many viable options. I believe we have this half-baked DX10 and Vista introduction to thank for it - system requirements keep rocketing upwards unreasonably but the hardware economics do not seem to be keeping pace.

  • AnnonymousCoward - Saturday, July 7, 2007 - link

    Thanks Derek for the great review. I appreciate the "%DX10 performance of DX9" charts, too.
  • Aberforth - Thursday, July 5, 2007 - link

    This article is ridiculous. Why would Nvidia and other DX10 developers want gamers to buy a G80 card for high DX10 performance? DX10 is all about optimization; the performance factor depends on how well it is implemented, not on blindly using APIs. Vista's driver model is different and DX10 is different. The present state of Nvidia's drivers is horrible; we can't even judge DX10 performance at this stage.

    The DX10 version of Lost Planet runs horribly even though it is not graphically different from the DX9 version. So this isn't DX10's or the GPUs' fault; it's all about the code and the drivers. Also, the CEO of Crytek has confirmed that an Nvidia 8800 (possibly the 8800 GTS) and an E6600 CPU can max Crysis in DX10 mode.

    Long ago, when DX9 came out, I remember reading an article about how it sucked badly. So I'm definitely not buying this one either.
  • titan7 - Thursday, July 12, 2007 - link

    No, it's not about sucky code or sucky drivers. It's about shaders. Look at how much faster cards with more shader power are in d3d9. Now in d3d10 longer, richer, prettier shaders are used that take more power to process.

    It's not about optimization this time as the IHVs have already figured out how to write optimized drivers, it's about raw FLOPS for shader performance.
  • DerekWilson - Thursday, July 5, 2007 - link

    DX9 performance did (and does) "suck badly" on early DX9 hardware.

    DX10 is a good thing, and pushing the limits of hardware is a good thing.

    Yes, drivers and game code can be rocky right now, but the 162 series drivers from NVIDIA are quite stable and NV is confident in their performance. Lost Planet shows that NV's DX10 drivers are at least getting close to parity with DX9.

    This isn't an article about DX10 not being good, it's an article about early DX10 hardware not being capable of delivering all that DX10 has to offer.

    Which is as true now as it was about early DX9 hardware.
  • piroroadkill - Friday, July 6, 2007 - link

    Wait, performance on the Radeon 9700 Pro sucked? I seem to remember games several years later that were DirectX 9 still being playable...
  • DerekWilson - Saturday, July 7, 2007 - link

    yeah, 9700 pro sucks ... when actually running real world DX9 code.

    Try running BF2 at any playable setting (100% view distance, high shadows and lighting). This is really where games started using DX9 (to my knowledge, BF2 was actually the first game to require DX9 support to run).

    But many other games still include the ability to run 1.x shaders rather than 2.0 ... Like Oblivion can turn the detail way down to the point where there aren't any DX9 heavy features running. But if you try to enable them on a 9700 Pro it will not run well at all. I actually haven't tested Oblivion at the lowest quality so I don't know if it can be playable on a 9700 Pro, but if it is, it wouldn't even be the same game (visually).
  • DerekWilson - Saturday, July 7, 2007 - link

    BTW, BF2 was released less than 3 years after the 9700 Pro ... (aug 02 to june 05) ...
  • Aberforth - Thursday, July 5, 2007 - link

    Fine...

    Just want to know why a DX10 game called Crysis was running at 2048x1536 with 60+ FPS on a GeForce 8800 GTX.

    crysis-online.com/?id=172
