It's almost ironic that the one industry we deal with that is directly related to entertainment has been the least exciting for the longest time. The graphics world has been littered with controversies surrounding fairly trivial things as of late; the majority of articles you'll see relating to graphics these days have nothing to do with how fast the latest $500 card will run. Instead, we're left to argue about the definition of the word "cheating". We pick at pixels in hopes of differentiating two of the fiercest competitors the GPU world has ever seen, and we debate over 3DMark.

What's interesting is that all of the things we have occupied ourselves with in recent times have been present throughout history. Graphics companies have always had questionable optimizations in their drivers, they have almost always differed in how they render a scene and yes, 3DMark has been around for quite some time now (only recently has it become "cool" to take issue with it).

So why, in the age of incredibly fast, absurdly powerful DirectX 9 hardware, do we find it necessary to bicker about everything but the hardware? Because, for the most part, we've had absolutely nothing better to do with it. Our last set of GPU reviews focused on two cards - ATI's Radeon 9800 Pro (256MB) and NVIDIA's GeForce FX 5900 Ultra - both of which carried a hefty $499 price tag. What were we able to do with this kind of hardware? Run Unreal Tournament 2003 at 1600x1200 with 4X AA enabled and still have power to spare, or run Quake III Arena at fairytale frame rates. Both ATI and NVIDIA have spent countless millions of transistors and expensive die space, and have even sacrificed current-generation game performance, in order to bring us some very powerful pixel shader units in their GPUs. Yet we have been using those GPUs while letting their pixel shading muscles atrophy.

Honestly, since the Radeon 9700 Pro, we haven't needed any more performance for today's games. Take the most popular game in recent history, the Frozen Throne expansion to Warcraft III: you could run it just fine on a GeForce4 MX - a $500 GeForce FX 5900 Ultra was in no way, shape or form necessary.

The argument we heard from both GPU camps was that you were buying for the future; a card you bought today could not only run all of your current games extremely well, but you'd be guaranteed good performance in the next generation of games. The problem with this argument was that there was no guarantee of when that next generation of games would arrive, and by the time it did, prices on these wonderfully expensive graphics cards might well have fallen significantly. Then there's the fact that how well a card performs in today's pixel-shader-free games says nothing about how it will perform in DirectX 9 games. And that brought us to the joyful issue of using 3DMark as a benchmark.

If you haven't noticed, we've never relied on 3DMark as a performance tool in our 3D graphics benchmark suites. The only times we've included it, we've used it either in the context of a CPU comparison or to make sure fill rates were in line with what we were expecting. With 3DMark 03, the fine folks at Futuremark had a very ambitious goal in mind - to predict the performance of future DirectX 9 titles using their own shader code, designed to mimic what various developers were working on. The goal was admirable; however, if we're going to make a recommendation to millions of readers, we're not going to base it solely on one synthetic benchmark that may or may not be indicative of the performance of future games. The difference between the next generation of games and what we've seen in the past is that the performance of any one game is much less indicative of the performance of the rest of the market. As you'll see, we're no longer memory bandwidth bound; performance will finally be determined by a game's pixel shader programs and how the GPU's execution units handle them.
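To make that last point concrete, below is a minimal sketch of the kind of workload we mean: a tiny DirectX 9 pixel shader compiled against the baseline ps_2_0 profile with the D3DX utility library. The shader itself is a hypothetical per-pixel lighting fragment of our own invention - not code from Half-Life 2 or any other shipping game - but it is representative of the short, ALU-heavy programs that get executed for every one of the million-plus pixels in a frame.

    // Minimal sketch: compiling a DirectX 9 (ps_2_0) pixel shader with D3DX.
    // The HLSL below is a hypothetical per-pixel lighting fragment; per-pixel
    // arithmetic like this, not raw memory bandwidth, is what next-generation
    // GPUs will spend most of their time executing.
    #include <d3dx9.h>   // link against d3dx9.lib
    #include <cstdio>

    static const char kShader[] =
        "sampler2D diffuseMap : register(s0);                        \n"
        "float3    lightDir   : register(c0);                        \n"
        "float4 main(float2 uv : TEXCOORD0,                          \n"
        "            float3 n  : TEXCOORD1) : COLOR                  \n"
        "{                                                           \n"
        "    float  ndotl = saturate(dot(normalize(n), -lightDir));  \n"
        "    float4 base  = tex2D(diffuseMap, uv);                   \n"
        "    return base * ndotl;  // a handful of ALU ops per pixel \n"
        "}                                                           \n";

    int main()
    {
        LPD3DXBUFFER code = NULL, errors = NULL;

        // ps_2_0 is the baseline DirectX 9 pixel shader profile that both the
        // Radeon 9x00 and GeForce FX families are expected to handle.
        HRESULT hr = D3DXCompileShader(kShader, sizeof(kShader) - 1,
                                       NULL, NULL, "main", "ps_2_0",
                                       0, &code, &errors, NULL);
        if (FAILED(hr)) {
            if (errors) std::printf("%s\n", (char*)errors->GetBufferPointer());
            return 1;
        }
        std::printf("compiled %lu bytes of ps_2_0 bytecode\n",
                    (unsigned long)code->GetBufferSize());
        code->Release();
        return 0;
    }

How quickly a GPU's execution units chew through bytecode like this - and how much a driver can legitimately reshuffle it - is precisely what separates cards in the benchmarks that follow.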

All of this discussion isn't for naught, as it brings us to why today is so very important. Not too long ago, we were able to benchmark Doom3 and show you a preview of its performance; but with that game delayed until next year, we must turn to another title to finally take advantage of this hardware - Half-Life 2. With the game almost done and a benchmarkable demo due out on September 30th, it comes as no surprise that we were given the opportunity to benchmark the demos Valve showed off at E3 this year.

Unfortunately, the story here isn't as simple as how fast your card will perform under Half-Life 2; of course, given the history of the 3D graphics industry, would you really expect something like this to be without controversy?

Comments

  • Anonymous User - Friday, September 12, 2003 - link

    Anand: When you re-test with the Det 50's, make sure you rename the HL2 exe!!!

    Gotta make the comparison as fair as possible...
  • Anonymous User - Friday, September 12, 2003 - link

    #69 How does the 9500 not fully support DX9? It's the same core EXACTLY as the 9700.
  • Anonymous User - Friday, September 12, 2003 - link

    #53 - So YOU'RE that bastard who's been lagging us out!!! Get out of the dark ages!
  • Anonymous User - Friday, September 12, 2003 - link

    What kind of conclusion was that?

    In terms of the performance of the cards you've seen here today, the standings shouldn't change by the time Half-Life 2 ships - although NVIDIA will undoubtedly have newer drivers to improve performance. Over the coming weeks we'll be digging even further into the NVIDIA performance mystery to see if our theories are correct; if they are, we may have to wait until NV4x before these issues get sorted out.

    For now, Half-Life 2 """ SEEMS """ to be best paired with ATI hardware and as you've seen through our benchmarks, whether you have a Radeon 9600 Pro or a Radeon 9800 Pro you'll be running just fine. Things are finally """heating up""" and it's a good feeling to have back...

    HL2 ""seems"" better on ATI??? , should be, HL2 looks way better and faster on ATI.

    Things are finally """heating up""" ??? shoul have been , ATI's performance is killing Nvidia's FX.

    The conclusion should have been :
    Nvidia lied and sucks , Valva had to lower standard ( actually optimize (cheat) in favor of Nvidia) and make HL2 game look bad , just so you could play on your overpriced Nvidia Fx cards.

    How about a word of apology from Anand to have induced readers in errors , and have told them to buy Nvidia Fx card's in is last Fx5900 review. ???

    From a future ATI card owner, (bundled with HL2 of course)

    Boy I'm pissed off!
  • Anonymous User - Friday, September 12, 2003 - link

    #82, those are 9600 regulars (!), click the links. Pricewatch has been fooled. A Pro isn't much more, though, just about $136.

    I'd go with a 9500 over a 9600 any day. The 9500 can be softmodded to 9700 performance levels (about 50-70% of the time, IIRC, and it's actually a little cheaper than the 9600 Pro!). If the softmod doesn't work out, then you return it for a new one. Of course, not everyone wants to do this, and a 9600 Pro is a respectable and highly overclockable card.. but..

    I'd still love to see 9500 Pros at lower prices, like they would have been if ATi had kept it around... but whatever. If you don't know, the 9500 Pro is/was considerably faster than the 9600 Pro. Valve said that HL2 isn't memory-limited, so the 128-bit memory interface on the 9500 Pro (which never made a big difference vs. the 9700 anyway) shouldn't even be noticeable, and the fact that the Sapphire-made ones were just as overclockable as the 9500 regulars and 9700s (think up to 340 core, 350 if you're lucky) is going to make it one HELL of an HL2 card for the $175 most people paid.
  • Anonymous User - Friday, September 12, 2003 - link

    Nvidia got schooled, but not on hardware or drivers. ATI locked this up long ago with their deal with Gabe and buddies.
    Why is everyone just trying to keep a straight face about it? ATI paid handsomely for exactly what has happened to NVidia.
    But as always happens, watch out when the tables turn, as they ALWAYS do, and Valve could be on the OUTSIDE of lots of other deals.
  • Anonymous User - Friday, September 12, 2003 - link

    I am just glad there is finally a damn game that can stress these video cards. I wonder if Bitboys Oy, or whatever their name is, will come out saying they have a new video card that will run Half-Life 2 at 100+ FPS :) What made me think of them I have no idea!
  • Anonymous User - Friday, September 12, 2003 - link

    Not to detract from the main issue here, but #19 raises a good point. Why does the 9600 Pro lose less than 1% performance going from 1024 to 1280? The 9800P and 9700P lose 10-15%. The 5900U loses 30%, sometimes more. I wonder if the gap between the 9800P and 9600P shrinks even more at higher resolutions.

    What aspect of the technology in the 9600 could possibly account for this? (See the arithmetic sketch below.)
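    A quick back-of-the-envelope check on that question - assuming the step being discussed is 1024x768 to 1280x1024, since the comment only says "1024 to 1280": that jump raises the pixel count per frame by roughly two thirds, so a card limited purely by per-pixel work (fill rate or shader throughput) should lose on the order of 40% of its frame rate, not less than 1%.

        // Back-of-the-envelope sketch: expected frame-rate loss for a card
        // that is limited purely by per-pixel work (fill rate / shaders).
        // Assumes the resolution step discussed is 1024x768 -> 1280x1024.
        #include <cstdio>

        int main()
        {
            const double lo = 1024.0 * 768.0;   //   786,432 pixels per frame
            const double hi = 1280.0 * 1024.0;  // 1,310,720 pixels per frame

            const double ratio   = hi / lo;                    // ~1.67x the pixels
            const double fpsLoss = (1.0 - 1.0 / ratio) * 100.0;

            // Prints: pixel ratio 1.67x -> 40% fps loss if pixel-bound.
            std::printf("pixel ratio %.2fx -> %.0f%% fps loss if pixel-bound\n",
                        ratio, fpsLoss);
            return 0;
        }

    Losing under 1% instead suggests the 9600 Pro isn't the limiting factor at either resolution in these tests - something per-frame (CPU, geometry, or driver overhead) is.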
  • Anonymous User - Friday, September 12, 2003 - link

    #81 You can find 9600 Pros for ~$160 from Newegg.

    A couple of small webstores have a "Smart PC 9600" non-pro 128MB for <$100. But the Smart PC card is a cheap OEM unit... I'm not sure if it's as good as the more expensive 9600s.
  • Anonymous User - Friday, September 12, 2003 - link

    Pricewatch:
    $123 - RADEON 9600 Pro 256MB
    $124 - RADEON 9600 Pro 128MB
