GPU Performance

The Source engine has been making its mark in computer gaming since it first hit the scene in 2004 (though not with Half-Life 2, but with Vampire: The Masquerade - Bloodlines). Valve has continually revamped Source, keeping it looking fresh and (while not groundbreaking) modern, and with the release of Episode Two we see another solid update.

It makes sense that Valve would build Source to last. The original Half-Life engine stuck around for quite some time due to the popularity of the game and mods like Counter-Strike. Rather than pour energy into developing new cutting-edge engines from the ground up as soon as the technology becomes available, Valve has made Source extensible and evolved it gradually to keep pace with the industry.

While the latest updates to the engine aren't as apparent as the addition of HDR, there are some subtle changes. Valve's focus lately has been immersion and character interaction: we get more detailed character animation and motion blur, and more subtle and varied facial expressions are now possible, lending a broader array of emotions that developers can use to draw the player into the story. As we said, this release isn't revolutionary, but all the evolutions we've seen in the past have found their way into Episode Two. Everything rolled into one package does look good, but we won't see 200+ fps at decent resolutions with high quality settings.

Valve has plastered an ATI logo on their video settings screen, claiming that ATI will provide the best experience for gamers playing the latest installment of Half-Life. We will certainly investigate this claim under two different conditions: one indoor firefight test and one outdoor driving test. Here we look at a couple of resolutions per card, targeting resolutions likely to be paired with the hardware. Each test is run at the highest settings (with the exception of Texture Quality, which is set to High). First up is our indoor test, which is less graphically stressful.
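
Before getting to the numbers, a note on how results like these are typically gathered: Source-based games can play back a recorded demo with the engine's timedemo facility, which reports framerate statistics to the console. Below is a minimal Python sketch of how one might average those figures across several runs; the log line format and file names are assumptions for illustration, not Valve's documented output.

    import re
    import statistics

    # Hypothetical sketch: pull the average framerate out of console logs
    # produced by timedemo runs. A result line might look something like
    #   "2112 frames 44.65 seconds 47.30 fps ( 21.14 ms/f) 4.51 fps variability"
    # but the exact wording varies by engine build, so the pattern below is
    # an assumption, not a documented format.
    FPS_LINE = re.compile(r"seconds\s+([\d.]+)\s+fps")

    def average_fps(log_path: str) -> float:
        """Return the mean of all timedemo fps figures found in one log."""
        results = []
        with open(log_path) as log:
            for line in log:
                match = FPS_LINE.search(line)
                if match:
                    results.append(float(match.group(1)))
        if not results:
            raise ValueError(f"no fps results found in {log_path}")
        return statistics.mean(results)

    if __name__ == "__main__":
        # Hypothetical file names: one console log per run of the same demo.
        runs = ["ep2_indoor_run1.log", "ep2_indoor_run2.log", "ep2_indoor_run3.log"]
        print(f"Average: {statistics.mean(average_fps(r) for r in runs):.1f} fps")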

Indoor Tests

[Graph: Episode Two indoor test - 2560x1600]


Our 2560x1600 high-res test pits all of our high end cards against each other. We tested both the 8800 GTS 640MB and 320MB cards, but the only difference in performance was at 2560x1600 with 4xAA enabled (the 320MB card was 20% slower at that one data point due to the high memory requirements of that test). For readability's sake, we chose to include only the 320MB card, which has been much more popular and generally offers terrific performance for the money.
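
To illustrate why that single 2560x1600 4xAA data point leans so hard on a 320MB card, here is a back-of-the-envelope estimate of the render-target footprint alone. It is a rough sketch assuming 32-bit color and depth per sample, and it ignores textures, HDR buffers, and driver overhead, so treat the figure as an approximation rather than a measurement.

    # Rough render-target memory estimate at 2560x1600 with 4xMSAA.
    # Assumes 32-bit color and 32-bit depth/stencil per sample; real usage
    # is higher once textures, HDR targets, and driver copies are added.
    width, height, samples, bytes_per_px = 2560, 1600, 4, 4

    msaa_color = width * height * samples * bytes_per_px  # multisampled color buffer
    msaa_depth = width * height * samples * bytes_per_px  # multisampled depth/stencil
    resolve = width * height * bytes_per_px               # resolved back buffer
    front = width * height * bytes_per_px                 # front buffer

    total_mb = (msaa_color + msaa_depth + resolve + front) / (1024 ** 2)
    print(f"Render targets alone: ~{total_mb:.0f} MB")
    # Prints roughly 156 MB, leaving far less room for textures on a 320MB card.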

Obviously the high end NVIDIA cards take the lead here, as AMD has not put forth any competition to the 8800 GTX or the 8800 Ultra. The 2900 XT does outperform the 8800 GTS (both 640 and 320MB), but the 320MB card has a significant price advantage, and the 640MB card can be found for not much more money in overclocked varieties. As we didn't test any overclocked GTS cards, the win for the $400 price point certainly goes to the 2900 XT. For those who want to spend closer to $300, the GeForce 8800 GTS 320MB handles Episode Two well enough to run at 2560x1600 with all the bells and whistles.

[Graph: Episode Two indoor test - 1920x1200]


This test includes our high end cards along with the next step down in hardware from NVIDIA and AMD. The 8600 GTS and the 2600 XT are fairly similar in performance here, and come in at nearly half the framerate of the 8800 GTS 320MB. While certainly not speed demons, these cards are still playable at this resolution.

The NVIDIA GeForce 8800 GTS 320MB falls way short of the AMD Radeon HD 2900 XT in this test, as we see the AMD card performing nearly on par with the 8800 GTX this time around. It seems that with our indoor test, high end NVIDIA cards scale better from 1920x1200 to 2560x1600 than the high end AMD part.

[Graph: Episode Two indoor test - 1600x1200]


Here we drop the high end parts and take a look at lower end hardware compared to our "mid range" solutions. At 1600x1200, the 2600 XT and 8600 GTS are just about neck and neck again, but the 8600 GT leads the 2600 Pro. The 8600 GT is also the more expensive of the lower end cards here, and even in this, our less graphically intense test, the 2600 Pro threatens to sink below the bounds of playability. While we tested at very high quality settings, it might be wise to ease up on quality to preserve performance at this resolution with a lower end card.

[Graph: Episode Two indoor test - 1280x1024]


At 1280x1024, a very widely used panel resolution, the 2400 XT can't quite muster what it needs to maintain good performance under Episode Two. While the 2400 XT won't really be sold as a gaming card, it is certainly good to know its limits, and this test starts to push them.

For an overview of how all the cards perform relative to each other, here is performance scaling for our indoor test.

[Graph: indoor test performance scaling]

Next we will take a look at a more graphically intensive scene to see how the cards respond.

Outdoor Tests

[Graph: Episode Two outdoor test - 2560x1600]


For both 2560x1600 and 1920x1200, we see very similar trends to our less stressful indoor test.

[Graph: Episode Two outdoor test - 1920x1200]


At 1920x1200, while the 2900 XT again outperforms the 8800 GTS, the 2900 XT doesn't snuggle up against the 8800 GTX this time around. Performance of the AMD part is still better than its direct competition from NVIDIA, but not by as much in this more graphically intense situation.

[Graph: Episode Two outdoor test - 1600x1200]


Our outdoor 1600x1200 test looks similar to our indoor test, but the lower end cards are really not playable at these framerates under the Source engine. We really need to maintain a framerate of about 35fps or higher for Episode Two to remain playable, and of course we'd prefer to see 45fps for maximum smoothness. These low end parts just won't cut it at 1600x1200.
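
As a quick frame-time conversion of those targets (simple arithmetic based on the thresholds above, not additional measurements):

    # Convert the playability targets quoted above into per-frame time budgets.
    for fps in (35, 45):
        print(f"{fps} fps -> {1000.0 / fps:.1f} ms per frame")
    # 35 fps -> 28.6 ms per frame (our playability floor for Episode Two)
    # 45 fps -> 22.2 ms per frame (preferred for maximum smoothness)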

[Graph: Episode Two outdoor test - 1280x1024]


At the lowest resolution we tested, it's abundantly clear that the 2400 XT is not capable of delivering a quality gaming experience with all the features enabled at the most popular LCD panel resolution around. The good news is that anyone who will be gaming at 1280x1024 will be satisfied with any card that costs at least $100 USD.

To take a look at scaling with our outdoor tests:

[Graph: outdoor test performance scaling]

Antialiasing

We did take a quick look at antialiasing for this review using our high end cards. The bottom line is that our 2900 XT and 8800 GTS class cards are capable of 4xAA up to but not including 2560x1600. Frame rate just drops too low under our outdoor test to remain playable in the $300 - $400 price range at the highest resolution with 4xAA enabled. There is a big performance impact that comes along with 4xAA with all of these cards, especially in our already graphically demanding outdoor benchmark.

[Graphs: outdoor tests with 4xAA enabled]

[Graphs: indoor tests with 4xAA enabled]


Comments

  • Trixanity - Saturday, October 13, 2007 - link

    I kinda agree with that. I would like to see some tests with more standard specs rather than just top-end hardware. You should use top-end parts when comparing new hardware against each other (AMD vs Intel and Nvidia vs AMD), but when testing a game, try also to include older hardware such as DX9 cards like the X1900XTX and 7800GTX or 7900GTX (and GT).
    That would be more helpful to the average gamer, as not all of us have the money to invest in the top end every month. About the resolutions: you mostly test at resolutions that are out of reach for most people, but this time I think you did include 1024x768 for the CPU testing, right? I think you should do 1024x768 and 1280x1024 (or widescreen resolutions) more often. I myself run my games at 1024x768, but would probably change when I get a new monitor (I have an old CRT). Those 1920x1200 resolutions are only for people with 24" screens or more.
  • Cali3350 - Saturday, October 13, 2007 - link

    Demo files aren't linked correctly, can't get them to download in either Firefox or IE.
  • tmx220 - Saturday, October 13, 2007 - link

    Why wasn't the HD 2900 Pro tested?
    The review touts the 8800 GTS 320MB as the best value, but it clearly didn't have its price competitor (the HD 2900 Pro) to go up against.
  • Proteusza - Saturday, October 13, 2007 - link

    This test is kinda screwed, I think. They didn't test the 8800 GTS 640, but then claim ATI is the winner at that price point. Hello? You didn't test it, how do you know? They also state that the 320MB version performs identically at lower resolutions and AA settings, which is true, but they didn't test the 640MB version so we won't know how it performs against the 2900 XT at high res/AA settings. Thanks guys.
  • tonjohn - Saturday, October 13, 2007 - link

    Probably b/c the GTS is already competitive with the XT as it is.
  • 8steve8 - Friday, October 12, 2007 - link

    it's misleading (not saying intentionally... but still...)
    to compare cpu costs without considering chipset costs.
    intel motherboards have higher costs partially due to their memory controller.

    what is more applicable to us is cpu+motherboard cost comparisons,
    or cpu+motherboard+ram (but here we assume it's all the same ddr2, so it doesn't really matter).

    just one example (i was recently looking at building a system with hdmi):

    cheapest core 2 duo board with hdmi is $114 shipped at newegg (an ati chipset)
    cheapest intel chipset board for core 2 duo with hdmi is $126 shipped at newegg

    cheapest AM2 board with hdmi is $72 shipped (ati chipset)
    cheapest AM2 nvidia chipset board with hdmi is $79 shipped.

    so maybe this particular type of motherboard isn't a great example, but here we see an avg price difference of over $40.

    for my purposes that means intel cpus have to be $40 cheaper with the same performance...

    before anyone spams me, i agree intel has better cpus right now, but comparing cost on cpu alone is not relevant to consumers, when it's useless without a motherboard.
  • mcnabney - Friday, October 12, 2007 - link

    That is a very valid point to make. However, motherboards and the chipsets on board can also impact performance, so once you start adding more variables this simple article will need a Rosetta Stone to decipher.

    Also, since most of these GPUs can be run in SLI/Crossfire, does either ATI or Nvidia scale better with a second card?
  • KeithP - Friday, October 12, 2007 - link

    It would have been far more useful with some older GPUs benchmarked.
  • johnsonx - Sunday, October 14, 2007 - link

    indeed... why spend hours testing with 5 different speeds of the same CPU, yet not even bother with any X1K series or GeForce 7 series GPUs? Do we really need tests to tell us an X2-5000 is a little faster than an X2-4800, which is a little faster than a 4600, which is a little faster than a 4400, etc.? At most we'd need tests of the fastest and slowest sample of each processor type. I guess for Intel this is mostly what was done, although even a couple of those could be dropped, but for AMD X2s only 2 or maybe 3 needed to be included. Also, how about a couple of common single cores to see how much dual-core benefits the new Source engine, say an A64-3500 and a 3.2GHz P4?
  • Vidmar - Saturday, October 13, 2007 - link

    No doubt. Here I was expecting to see how the game might run on nVidia 7800 or 7900 cards. It's a bit misleading of them to suggest that this would be a comprehensive review; far from it, with only 4 GPUs used.
