GPU Performance

The Source engine has been making its mark on computer gaming since it first hit the scene in 2004 (debuting not with Half-Life 2, but with Vampire: The Masquerade - Bloodlines). Valve has continually revamped Source, keeping it looking fresh and (while not groundbreaking) modern, and with the release of Episode Two we see another solid update.

It makes sense that Valve would build Source to last. The original Half-Life engine stuck around for quite some time thanks to the popularity of the game and mods like Counter-Strike. Rather than pour energy into building new cutting edge engines from the ground up as soon as the technology becomes available, Valve has made Source extensible and evolved it incrementally to keep pace with the state of the art.

While the latest updates to the engine aren't as apparent as the addition of HDR, there are some subtle changes. Valve's focus lately has been immersion and character interaction: we get more detailed character animation and motion blur, and more subtle and varied facial expressions are possible, giving game developers a broader array of emotions with which to draw the player into the story. As we said, this release isn't revolutionary, but all the evolutions we've seen in the past have found their way into Episode Two. Everything rolled into one package does look good, but we won't see 200+ fps frame rates at decent resolutions with high quality settings.

Valve has plastered an ATI logo on its video settings screen, claiming that ATI will provide the best experience for gamers playing the latest installment of Half-Life. We will certainly investigate this claim under two different conditions: one indoor firefight test and one outdoor driving test. Here we look at a couple of resolutions per card, targeting resolutions likely to be paired with the hardware. Each test is run at the highest settings (with the exception of Texture Quality, which is set to High). First up is our indoor test, which is less graphically stressful.
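For readers who want to script this sort of testing themselves, below is a minimal sketch of how timedemo runs might be automated. Everything in it is an assumption on our part: the install path, the demo names, and the "ep2" game directory are hypothetical stand-ins, and it relies on the engine writing its timedemo summary to console.log when launched with -condebug.

```python
import re
import subprocess
from pathlib import Path

# Hypothetical paths and demo names -- adjust for your own install/recordings.
GAME_EXE = Path(r"C:\Program Files\Steam\steamapps\common\hl2.exe")
DEMOS = ["indoor_firefight", "outdoor_driving"]           # our two test scenes
RESOLUTIONS = [(2560, 1600), (1920, 1200), (1600, 1200), (1280, 1024)]
CONSOLE_LOG = GAME_EXE.parent / "ep2" / "console.log"     # created by -condebug

def run_timedemo(demo: str, width: int, height: int) -> float:
    """Play a recorded demo as fast as possible and return the average FPS
    reported by the engine's timedemo output."""
    if CONSOLE_LOG.exists():
        CONSOLE_LOG.unlink()                              # start with a clean log
    subprocess.run([str(GAME_EXE), "-game", "ep2", "-novid", "-condebug",
                    "-w", str(width), "-h", str(height),
                    "+timedemoquit", demo], check=True)
    # timedemo prints a summary line along the lines of:
    #   2134 frames 21.34 seconds 99.98 fps ( 10.00 ms/f) ...
    match = re.search(r"([\d.]+)\s+fps", CONSOLE_LOG.read_text(errors="ignore"))
    if match is None:
        raise RuntimeError(f"no timedemo result found for {demo}")
    return float(match.group(1))

if __name__ == "__main__":
    for demo in DEMOS:
        for w, h in RESOLUTIONS:
            print(f"{demo} @ {w}x{h}: {run_timedemo(demo, w, h):.1f} fps")
```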

Indoor Tests

[Graph: Episode Two indoor test, 2560x1600]


Our 2560x1600 high res test pits all of our high end cards against each other. We tested both the 640MB and 320MB versions of the 8800 GTS, but the only difference in performance was at 2560x1600 with 4xAA enabled (the 320MB card was 20% slower at that one data point due to the high memory requirements of that test). For readability's sake, we chose to include only the 320MB card, which has been much more popular and generally offers terrific performance for the money.

Obviously the high end NVIDIA cards take the lead here, as AMD has not put forth any competition to the 8800 GTX or the 8800 Ultra. The 2900 XT does outperform the 8800 GTS (both 640 and 320MB), but the 320MB card has a significant price advantage, and the 640MB card can be found for not much more money in overclocked varieties. As we didn't test any overclocked GTS cards, the win for the $400 price point certainly goes to the 2900 XT. For those who want to spend closer to $300, the GeForce 8800 GTS 320MB handles Episode Two well enough to run at 2560x1600 with all the bells and whistles.

[Graph: Episode Two indoor test, 1920x1200]


This test includes our high end cards alongside the next step down in hardware from NVIDIA and AMD. The 8600 GTS and the 2600 XT are fairly similar in performance here, coming in at nearly half the framerate of the 8800 GTS 320MB. While certainly not speed demons, these cards are still playable at this resolution.

The NVIDIA GeForce 8800 GTS 320MB falls way short of the AMD Radeon HD 2900 XT in this test, as we see the AMD card performing nearly on par with the 8800 GTX this time around. It seems that with our indoor test, high end NVIDIA cards scale better from 1920x1200 to 2560x1600 than the high end AMD part.

[Graph: Episode Two indoor test, 1600x1200]


Here we drop off the high end parts and take a look at lower end hardware compared to our "mid range" solutions. At 1600x1200, the 2600 XT and 8600 GTS are just about neck and neck again, but the 8600 GT leads the 2600 Pro. The 8600 GT is also the more expensive of the lower end hardware here, but even in this less graphically intense test the 2600 Pro threatens to sink below the bounds of playability. While we tested on very high quality settings, it might be wise to ease up on quality to preserve performance at this resolution with a lower end card.

[Graph: Episode Two indoor test, 1280x1024]


At 1280x1024, a very widely used panel resolution, the 2400 XT can't quite muster what it needs to maintain good performance under Episode Two. While the 2400 XT won't really be sold as a gaming card, it is certainly good to know its limits, and this test starts to push them.

For an overview of how all the cards perform relative to each other, here is performance scaling for our indoor test.

[Performance scaling graph: indoor test]

Next we will take a look at a more graphically intensive scene to see how the cards respond.

Outdoor Tests

[Graph: Episode Two outdoor test, 2560x1600]


For both 2560x1600 and 1920x1200, we see very similar trends to our less stressful indoor test.

[Graph: Episode Two outdoor test, 1920x1200]


At 1920x1200, while the 2900 XT again outperforms the 8800 GTS, the 2900 XT doesn't snuggle up against the 8800 GTX this time around. Performance of the AMD part is still better than its direct competition from NVIDIA, but not by as much in this more graphically intense situation.

[Graph: Episode Two outdoor test, 1600x1200]


Our outdoor 1600x1200 test looks similar to our indoor test, but the lower end cards are really not playable at these framerates in the Source engine. We really need to maintain a framerate of about 35fps or higher for Episode Two to remain playable; of course, we'd prefer to see 45fps for maximum smoothness. These low end parts just won't cut it at 1600x1200.

[Graph: Episode Two outdoor test, 1280x1024]


At the lowest resolution we tested, it's abundantly clear that the 2400 XT is not capable of delivering a quality gaming experience with all the features enabled at the most popular LCD panel resolution around. The good news is that anyone who will be gaming at 1280x1024 will be satisfied with any card that costs at least $100 USD.

To take a look at scaling with our outdoor tests:

[Performance scaling graph: outdoor test]

Antialiasing

We did take a quick look at antialiasing for this review using our high end cards. The bottom line is that our 2900 XT and 8800 GTS class cards are capable of 4xAA at resolutions up to, but not including, 2560x1600. The framerate under our outdoor test simply drops too low for the $300 - $400 cards to remain playable at the highest resolution with 4xAA enabled. All of these cards take a significant performance hit from 4xAA, especially in our already graphically demanding outdoor benchmark.

[Graph: Episode Two outdoor test, 4xAA]

[Graph: Episode Two outdoor test, 4xAA]

[Graph: Episode Two indoor test, 4xAA]

[Graph: Episode Two indoor test, 4xAA]


Comments (46)

  • tonjohn - Friday, October 12, 2007 - link

    I would also like to see tests on:
    * 1GB vs 2GB vs 4GB of RAM
    * WinXP vs Vista (maybe even 32-bit vs 64-bit OS comparisons)
    * EP1 performance vs EP2 performance & CSS (or DODS) performance vs TF2 performance.
  • tonjohn - Friday, October 12, 2007 - link

    This article claims that the new engine only takes advantage of two cores. However, Valve's comments all suggest that the engine can take advantage of four or more cores.

    As for proof, this is what one of my co-workers from Valve's forums reports:
    "How about this?!

    http://i24.tinypic.com/rvefs6.png

    I was in the level called "our mutual fiend" where you go down into the depths to find out what is going on. Multiple hunters and all kinds of activity going on. I'm going to run this same level with graphical settings all on low to eliminate GPU bottlenecks and set the core affinity for HL2 to use one, then two, then three and then four and record the fps in game."
  • ViRGE - Friday, October 12, 2007 - link

    I could be wrong here, but that doesn't indicate that HL2 is using all 4 cores. It could be bouncing processes between cores (which does happen with other applications), which makes it look like all 4 cores are in use.
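One rough way to check the distinction ViRGE raises is to sample per-core load over a window and look at the total: a single busy thread that the scheduler bounces between cores still sums to roughly one core's worth (about 100%), while genuine multicore use pushes the total well above that. A minimal sketch, assuming the third-party psutil library:

```python
import psutil  # third-party: pip install psutil

# Sample per-core utilization while the game runs. A single busy thread
# bounced between cores still totals ~100% (one core's worth); genuine
# multicore use pushes the total well above that.
SAMPLES, INTERVAL_S = 30, 1.0

totals = []
for _ in range(SAMPLES):
    per_core = psutil.cpu_percent(interval=INTERVAL_S, percpu=True)
    totals.append(sum(per_core))
    print(" ".join(f"{p:5.1f}" for p in per_core))

print(f"average total load: {sum(totals) / len(totals):.1f}% "
      f"(~100% means roughly one fully busy core)")
```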
  • tonjohn - Saturday, October 13, 2007 - link

    Here is the official word on this topic:
    quote:

    ----- Original Message -----
    From: "Mike Durand" <mdurand@valvesoftware.com>
    To: <hlcoders@list.valvesoftware.com>
    Sent: Thursday, October 11, 2007 12:45 PM
    Subject: RE: [hlcoders] questions

    We default to taking advantage of no more than three threads due to some problems that we believe to be due to cache issues with current quad-core processors. You can override this limitation by specifying '-threads 4' on the command line if you like. '-threads 8' should work with this build as well when eight core processors become available.

    -Mike

    So Anand needs to adjust the article.
  • tonjohn - Monday, October 15, 2007 - link

    The article still says that the new engine only supports two threads when that is actually wrong. Please fix this.
  • FrankThoughts - Saturday, October 13, 2007 - link

    I have several thoughts.

    First, the whole "30FPS is too slow" stuff is garbage. Maybe in multiplayer, but in single player HL2 Episode Two is perfectly good at 30-40 FPS. I say this because:

    Second, I played through the WHOLE GAME at 2560x1600 4xAA with an X1900 XT GPU. (2GB of RAM, Athlon X2 4800+ CPU, Windows XP). I checked frame rates with FRAPS, and typically got anywhere from 30-45 FPS. What's odd is that I actually thought I was getting much higher rates and it was only after completing the game that I checked the real values and discovered I was hitting low to mid 20s at times.

    Third, as usual CrossFire support is broken out of the box. Maybe the 7.10 drivers that just got released have fixed this, but those weren't available the day EP2 released and so I played and beat the game with a single GPU running. (CrossFire ran, but there was severe graphical corruption and major slowdowns. Same goes for Portal. Not sure about TF2 yet....)

    Fourth, CPU cores and performance scaling. Well, I may be doing something wrong (I've tried setting affinity in task manager as well as using "-threads [x]" to set CPU core use to one or two cores), but other than minor variations of around 0.4 FPS (using a timedemo I created, since the AT demos apparently aren't available for download), I get the same result with one or two CPU cores. So, while the engine might be threaded to the point where it can "use" four or even eight cores, the reality is that it doesn't impact performance at all that I can see. Perhaps I'm GPU limited, but I was testing at 1280x800 and averaged 125 FPS (plus or minus 0.3 FPS) at every CPU threading level I tried. CPU usage still got up to around 60% max regardless, so I'm thinking either sound drivers or the graphics drivers are utilizing the other core. Bottom line is that best case (and not even likely) the multithreading is giving about a 5-10% performance boost. Usually threading has an overhead of 5-10%, so most likely CPU utilization went up but performance remained virtually unchanged.

    If you have X1900 series hardware, this game runs perfectly well, turning in great performance even at insane 2560x1600 4xAA resolutions. (If I turn off AA, I get about 40% faster frame rates.) What's funny is that my performance seems to be within spitting distance of the 8800 GTS and HD 2900 XT, all on 22 month old hardware. I've been thinking I "need" to upgrade for a long time, but every time I actually play some new title I end up with perfectly reasonable performance. My next upgrade will be quad core and either SLI or CrossFire (probably SLI, since CrossFire has left me irritated on more than one occasion - basically every new game fails to run with CrossFire for anywhere from 1 to 4 months), but I'm not going to take the plunge until I actually feel the performance gain will be worthwhile. At the current rate, I might be sticking with DX9, XP, and X1900 until late 2008!
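The affinity experiment described in the comment above can also be scripted rather than clicked through in Task Manager. This is just a sketch: the process name and core counts are hypothetical placeholders, and it assumes the third-party psutil library.

```python
import psutil  # third-party: pip install psutil

def pin_to_cores(process_name: str, num_cores: int) -> None:
    """Pin the named process to its first num_cores logical CPUs."""
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] == process_name:
            proc.cpu_affinity(list(range(num_cores)))   # e.g. [0] or [0, 1]
            print(f"pinned pid {proc.pid} to {num_cores} core(s)")
            return
    raise RuntimeError(f"{process_name} is not running")

# Start the game, pin it to 1, 2, then 4 cores, and rerun the same timedemo
# at each setting to see whether the frame rate actually changes.
# pin_to_cores("hl2.exe", 2)
```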
  • tonjohn - Friday, October 12, 2007 - link

    Notice how, prior to the alt-tabbing, Core0 has high usage and the other cores are all being used, but the % varies with each core. This is a good indicator that the engine is actively taking advantage of each core and not simply the OS bouncing the supposed two threads around.

    More supporting evidence that the information in Anand's article is potentially incorrect:
    quote:

    Multicore Support - Team Fortress 2 only makes use of multiple CPU cores for its particle system. However, both Episode 2 and Portal make use of the Source engine’s new scalable multicore system.

    Their multicore solution will scale dynamically with however many cores you have on your system. The more cores you have, the more Source engine subsystems will be offloaded to these cores. What are these “subsystems” I speak of, you may be wondering. Areas such as the particle simulation, the materials system, and artificial intelligence are just a few of these subsystems that can be offloaded onto other cores for increased performance across the board.

    However, there are some drawbacks to this. There will obviously come a point where the performance gain from offloading these subsystems to additional cores is hampered by a weak GPU. As is the case now with single and dual-core solutions, making sure to strike a balance between a strong CPU and a GPU that can keep up.

    From CSNation: http://www.csnation.net/articles.php/article_234/
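The quoted description amounts to a fork/join job model: independent subsystems fan out across however many cores are present and rejoin before the next frame. Below is a generic illustration of that pattern only; it is not Valve's actual code, and the subsystem functions are empty stand-ins.

```python
import os
from concurrent.futures import ThreadPoolExecutor

# Stand-in subsystem updates; a real engine would do actual work here.
def simulate_particles(dt: float) -> None: pass
def update_materials(dt: float) -> None: pass
def run_ai(dt: float) -> None: pass

SUBSYSTEMS = [simulate_particles, update_materials, run_ai]
POOL = ThreadPoolExecutor(max_workers=os.cpu_count() or 1)  # persistent pool

def tick(dt: float) -> None:
    """Fan subsystem updates out across the pool, then join before the next
    frame; more cores simply means more subsystems run concurrently."""
    futures = [POOL.submit(fn, dt) for fn in SUBSYSTEMS]
    for future in futures:
        future.result()   # wait for completion and surface any exception

tick(1 / 60)
```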
  • steamsucks - Friday, October 12, 2007 - link

    Steam sucks. You can't even dl the piece of crap game because of server issues. Don't buy this game, and don't support Valve/Steam.
  • Zak - Sunday, November 11, 2007 - link

    I never had any issues with Steam since the HL2 release. I actually like the idea of not having to deal with copy-protected CDs and "please insert CD number 5" every time I want to play a game, and having my games always updated. I buy a game in the evening and the next day it's INSTALLED and ready on my hard drive. I can make fully re-installable backups of any Steam content on a hard drive so I don't have to re-download whole games when I need to reinstall. I only wish they discounted Steam games more than the actual physical products. Other than that, Steam is a great idea IMHO.

    Z.
  • sc3252 - Monday, October 15, 2007 - link

    Steam does suck. I wish I didn't have to use Steam to play the Orange Box, but that's how things work. I dislike it but acknowledge it's not as bad as BioShock's DRM.

    The first thing I did when I bought this game was put it on a separate account, so that my brother can play Counter-Strike while I play Team Fortress 2.

    As far as performance goes, I am gaming on a 3000+ Athlon 64 with a 7600 GT. Everything plays perfectly at 1680x1050. I have AA turned off but everything else is turned up. I can't say the same for UT3, or any other modern game that has graphics even close to Half-Life 2's level.
