GPU Performance

The Source engine has been making its mark in computer gaming since it first hit the scene in 2004 (debuting not with Half-Life 2, but with Vampire: The Masquerade Bloodlines). Valve has continually revamped Source, keeping it looking fresh and, while not groundbreaking, modern, and the release of Episode Two brings another solid update.

It makes sense that Valve would build Source to last. The original Half-Life engine stuck around for quite some time thanks to the popularity of the game and mods like Counter-Strike. Rather than pour energy into developing new cutting edge engines from the ground up as soon as the technology becomes available, Valve has made Source extensible and evolved it incrementally to keep pace with the times.

While the latest updates to the engine aren't as apparent as the addition of HDR, there are some subtle changes. Valve's recent focus has been immersion and character interaction: we get more detailed character animation and motion blur, and more subtle and varied facial expressions are possible, giving developers a broader array of emotions with which to draw the player into the story. As we said, this release isn't revolutionary, but all the evolutions we've seen in the past have found their way into Episode Two. Everything rolled into one package does look good, but we won't see 200+ fps at decent resolutions with high quality settings.

Valve has plastered an ATI logo on its video settings screen, claiming that ATI will provide the best experience for gamers playing the latest installment of Half-Life. We will certainly investigate this claim under two different conditions: one indoor firefight test and one outdoor driving test. Here we look at a couple of resolutions per card, targeting resolutions likely to be paired with that hardware. Each test is run at the highest settings (with the exception of Texture Quality, which is set to High). First up is our indoor test, which is the less graphically stressful of the two.

Indoor Tests

[Chart: Episode Two indoor test, 2560x1600]


Our 2560x1600 high res test pits all of our high end cards against each other. We tested both the 640MB and 320MB versions of the 8800 GTS, but the only difference in performance was at 2560x1600 with 4xAA enabled (the 320MB card was 20% slower at that one data point due to the high memory requirements of that test). For readability's sake, we chose to include only the 320MB card, which has been much more popular and generally offers terrific performance for the money.
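As an aside, the "20% slower" figure above is just a relative framerate delta. For anyone reproducing these comparisons from their own numbers, a quick sketch of the math (the framerate values below are made up for illustration, not our measured results):

```python
def percent_slower(fps_fast: float, fps_slow: float) -> float:
    """Return how much slower fps_slow is than fps_fast, as a percentage."""
    return (fps_fast - fps_slow) / fps_fast * 100.0

# Illustrative numbers only -- not data from this review.
print(percent_slower(50.0, 40.0))  # 40 fps is 20% slower than 50 fps
```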

Obviously the high end NVIDIA cards take the lead here, as AMD has not put forth any competition to the 8800 GTX or the 8800 Ultra. The 2900 XT does outperform the 8800 GTS (both 640 and 320MB), but the 320MB card has a significant price advantage, and the 640MB card can be found for not much more money in overclocked varieties. As we didn't test any overclocked GTS cards, the win for the $400 price point certainly goes to the 2900 XT. For those who want to spend closer to $300, the GeForce 8800 GTS 320MB handles Episode Two well enough to run at 2560x1600 with all the bells and whistles.

[Chart: Episode Two indoor test, 1920x1200]


This test includes our high end cards and the next step down hardware offered by NVIDIA and AMD. The 8600 GTS and the 2600 XT are fairly similar in performance here, and come in at nearly half the framerate of the 8800 GTS 320MB. While certainly not speed demons, these cards are still playable at this resolution.

The NVIDIA GeForce 8800 GTS 320MB falls way short of the AMD Radeon HD 2900 XT in this test, as we see the AMD card performing nearly on par with the 8800 GTX this time around. It seems that with our indoor test, high end NVIDIA cards scale better from 1920x1200 to 2560x1600 than the high end AMD part.

[Chart: Episode Two indoor test, 1600x1200]


Here we drop the high end parts and take a look at lower end hardware compared to our "mid range" solutions. At 1600x1200, the 2600 XT and 8600 GTS are just about neck and neck again, but the 8600 GT leads the 2600 Pro. The 8600 GT is also the more expensive of the lower end cards here, and even in this, our less graphically intense test, the 2600 Pro threatens to sink beyond the bounds of playability. While we tested on very high quality settings, it might be wise to ease up on quality to preserve performance at this resolution with a lower end card.

[Chart: Episode Two indoor test, 1280x1024]


At 1280x1024, a very widely used panel resolution, the 2400 XT can't quite muster what it needs to maintain good performance under Episode Two. While the 2400 XT won't really be sold as a gaming card, it is certainly good to know its limits, and this test starts to push them.

For an overview of how all the cards perform relative to each other, here is performance scaling for our indoor test.

[Chart: indoor performance scaling]

Next we will take a look at a more graphically intensive scene to see how the cards respond.

Outdoor Tests

[Chart: Episode Two outdoor test, 2560x1600]


For both 2560x1600 and 1920x1200, we see very similar trends to our less stressful indoor test.

[Chart: Episode Two outdoor test, 1920x1200]


At 1920x1200, while the 2900 XT again outperforms the 8800 GTS, it doesn't snuggle up against the 8800 GTX this time around. Performance of the AMD part is still better than its direct competition from NVIDIA, but not by as much in this more graphically intense situation.

[Chart: Episode Two outdoor test, 1600x1200]


Our outdoor 1600x1200 test looks similar to our indoor test, but the lower end cards really aren't playable at these framerates under the Source engine. We need to maintain a framerate of about 35fps or higher for Episode Two to remain playable, and we'd prefer to see 45fps for maximum smoothness. These low end parts just won't cut it at 1600x1200.
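To put those thresholds in perspective, a playability cutoff translates directly into a per-frame rendering time budget. A quick sketch of the conversion (simple arithmetic, not anything from our test harness):

```python
def frame_budget_ms(target_fps: float) -> float:
    """Milliseconds available to render each frame at a given framerate."""
    return 1000.0 / target_fps

# Our playability floor and smoothness target for Episode Two.
for fps in (35, 45):
    print(f"{fps} fps -> {frame_budget_ms(fps):.1f} ms per frame")
```

In other words, a card that can't finish a frame in roughly 28.6 ms in the heavy outdoor scenes drops below our playability floor.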

[Chart: Episode Two outdoor test, 1280x1024]


At the lowest resolution we tested, it's abundantly clear that the 2400 XT is not capable of delivering a quality gaming experience with all the features enabled at the most popular LCD panel resolution around. The good news is that anyone gaming at 1280x1024 should be satisfied with any card that costs at least $100 USD.

To take a look at scaling with our outdoor tests:

[Chart: outdoor performance scaling]

Antialiasing

We did take a quick look at antialiasing for this review using our high end cards. The bottom line is that our 2900 XT and 8800 GTS class cards are capable of 4xAA at resolutions up to, but not including, 2560x1600. Framerate just drops too low in our outdoor test for the $300 - $400 cards to remain playable at the highest resolution with 4xAA enabled. There is a significant performance impact from 4xAA on all of these cards, especially in our already graphically demanding outdoor benchmark.

[Chart: outdoor test with 4xAA]


[Chart: outdoor test with 4xAA]


[Chart: indoor test with 4xAA]


[Chart: indoor test with 4xAA]


Comments

  • Spoelie - Sunday, October 14, 2007 - link

    any word on the msaa units in the render backend of the rv670, are they fixed?
    seeing the x1950xtx with the exact same framerate as the 2900xt when 4xAA is turned on really hurts.

    this just shows how good the r580 core really was, being able to keep up with the 8800 GTS on this workload
  • NullSubroutine - Thursday, October 18, 2007 - link

    I believe the RV670 did address AA as well as some other fixes and improvements (other sites reported that the core was not just a die shrink).
  • xinoxide - Sunday, October 14, 2007 - link

    why is this card not included? the increase in memory speed and capacity do help framerates, especially when sampling many parts of the image (shadow cache and AA resamples, for example). I've been finding Anand to be "somewhat" biased in this respect: they show the 8800 Ultra's performance over the 8800 GTX itself, and while ATI struggles in driver aspects, that doesn't mean there is no increase in performance from the HD 2900 XT 512MB to the HD 2900 XT 1GB
  • felang - Sunday, October 14, 2007 - link

    I can't seem to download the demo files...
  • DorkOff - Saturday, October 13, 2007 - link

    I for one really, really appreciate the attached *.dem files. I wish all benchmarking reviews, be it of video cards or cpus, would include these. Thank you Anandtech
  • MadBoris - Saturday, October 13, 2007 - link

    I'm starting to get disappointed in how console transitions are affecting technology enhancements. It's no surprise that UE3 and Valve's Source perform well; they have to run well on consoles after all. What I don't like is the lack of DX10 integration (although it can be argued it's premature and has been fake in other games), as well as things like AA 'notably absent' in UE3. Obviously the platform compatibility challenges are keeping them more than busy, where extra features like this are out of reach. I guess that is the tradeoff. I'm sure the games are fun though, so not too much to complain about. Technology is going to level off a bit for a while, apparently. I'm sure owners of older value HW can be glad about all this.

    Real soon, if not already, we will have more power on the PC than will be used for some time to come, with one exception... This year it will be Crytek pushing the envelope, and those GTS 320s (great for multiplatform games limited by consoles) are going to show that the 512MB of memory that started appearing on video cards many years ago shouldn't have been a trend so easily ignored by PC gamers going for a 320.

    Otherwise, looking forward to many more of these for UE3 and of course Crysis.
  • MadBoris - Saturday, October 13, 2007 - link

    I forgot to mention things like x64 adoption as another thing that won't be happening in games for years, due to limited memory constraints in consoles setting the bar. Crytek cannot move the industry along by itself; in fact, it may even get a black eye for doing what Epic, id, and Valve used to do, but don't anymore. Many despised the gaming industry's upward demands on HW, but the advances we have today are due to them; things like multicores and the GPUs we have were all helped along faster by the gaming industry.

    Memory is inexpensive, and imagine if we could move on up to 3 or 4 GB in games in the coming couple of years; game worlds could become quite large and full, and level loading not nearly as frequent or interrupting. But alas, x64 will be another thing that won't be utilized in the years to come. With the big boys like Epic, id, and Valve all changing their marketing strategies and focus, it appears things will never be the same, at least with the leaps every 5 years by the consoles now dictating terms. It's even doubtful that next gen consoles would use x64, because they won't spend money to add more memory and therefore have no need for it. 32 bit, how do we get out of it when MS won't draw the line.
    Sorry if this is deemed a bit off topic, but being a tech article about a game, it kind of got my brain thinking about these things.
  • munky - Saturday, October 13, 2007 - link

    What's the point of running your cpu benches at 1024 resolution? We already know Intel will have the performance lead, but these benches say nothing about how much difference the cpu makes at resolutions people actually use, like 1280, 1600, and 1920.
  • Final Hamlet - Saturday, October 13, 2007 - link

    ...your GPU-world starts with an ATI 2900 XT and ends with an NVIDIA 8800 Ultra. Mine does not. I really would like to see common GPUs (take a look @ the Valve Hardware Survey... how much of that ridiculous high-end hardware do you see there?). Please include the ATI 1xxx and NVIDIA 7xxx series @ some _realistic_ resolutions (sry... maybe you have a home theater with 10,000x5,000, but testing that has no relevance to 99%+ of your readers, so why do you keep on producing articles for less than 1% of PC players?)
  • archcommus - Saturday, October 13, 2007 - link

    I'm running an Athlon 64 3200+, 1 GB of memory, and an X800 XL, and the game plays beautifully smooth (not a single hiccup even during action) at 1280x1024 with MAX EVERYTHING - by max everything I mean very high texture detail, high everything else, 6x AA, 16x AF. So if you're running a regular LCD resolution (not widescreen) you basically don't need benchmarks at all - if you built your system anytime within the last 2-3 years (up to two generations ago), it's going to run fine even with max settings. Thus the benchmarks are tailored to people running much higher resolutions, and because of that need higher-end hardware.

    Considering the game still looks great with the exception of some more advanced lighting techniques that new games have, I'm very impressed with what Valve has done.
