46 Comments

  • tonrav - Monday, December 3, 2007 - link

    ERROR: demo network protocol 11 outdated, engine version is 14.

    Google reveals little about what this means, except that Valve possibly rev'd something in EP2 without putting backwards compatibility in the code. Anybody know a way to play these demos or convert them?
  • NullSubroutine - Thursday, October 18, 2007 - link

    Did you use Super Sampling or Multi-Sampling on ATI cards?
  • Powered by AMD - Tuesday, October 16, 2007 - link

    I downloaded the .dem files, but where do I have to put them, and what do I have to type in the console to get them working?
    Thanks in advance.
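
    For reference, a rough sketch of the usual Source engine way to do this (the folder name here is an assumption and may differ per install): drop the .dem files into the game's content folder, e.g. ...\Steam\steamapps\<account>\half-life 2 episode two\ep2\, then enable the developer console (keyboard options, or the -console launch option) and run:

        playdemo <demoname>    (plays the demo back at normal speed)
        timedemo <demoname>    (plays it back as fast as possible and reports the average fps)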
  • Zaitsev - Tuesday, October 16, 2007 - link

    Thanks for taking the time to bench some older cards. It's crazy that an X1900 XTX = HD 2900 XT with antialiasing on. o.0
  • zero2dash - Monday, October 15, 2007 - link

    FSB O/C that E2160 to 3 gig on air and then benchmark it.

    Who runs the E21x0s at stock? Seriously??

    I know you're trying to keep things equal across the board, but at least throw in an O/C number in there *somewhere*.

    -Just a thought.- =)
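
    (For reference, a rough back-of-the-envelope: the stock E2160 runs at 1.80GHz from a 9x multiplier on a 200MHz FSB, so reaching 3GHz means roughly a 333MHz FSB, since 9 x 333MHz ≈ 3.0GHz, assuming the multiplier stays locked at 9x.)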

    Radeon X1950 FTW
    Looks like the best card below the 8800 GTS line, especially at the $150 price point... nice.
  • NARC4457 - Monday, October 15, 2007 - link

    quote:

    What you'll see here today is that every single component we tested, down to the cheapest CPU and GPU, are more than enough to run Half Life 2: Episode Two.


    I'm just sad that my GPU/CPU are even older than this list. That said, Ep2 is running great at 1280x1024 with most settings at medium and 2xAA.
  • bojaka - Monday, October 15, 2007 - link

    First off, I have no idea how many shortcuts they took when trying to get motion blur and good shadows to work, since they are both awful!
    The motion blur is just strange, and the shadows look good but don't seem to be calculated correctly from the flashlight. Try standing in front of something with your flashlight turned on so the shadow falls on a wall in front of you, then turn from left to right, and you should see the shadow move in a very unrealistic manner.
    However... It's a beautiful and fun-to-play game that really should have been benchmarked indoors and outdoors with full shadows and the flashlight turned on. Then you would have seen some different framerates! =( So disappointed with the performance (1280x1024, 4xAA, 8x aniso and everything on highest on my C2D E6600 (800MHz FSB), 2GB RAM, 8800 GTS 640MB).
    New benchmarks wanted!!! Flashlight on and shadows on!!!
  • DerekWilson - Tuesday, October 16, 2007 - link

    All the graphics options were turned all the way up -- shadows were on. I played with the flashlight but didn't see any significant difference in framerate.
  • tonjohn - Tuesday, October 16, 2007 - link

    Derek,

    You need to fix the part of the article that says that the new build of the Source engine only supports two threads. Mike Durand of Valve has confirmed that the latest build of the Source engine defaults to three threads for EP2 and Portal and two threads for TF2.
  • jeffrey - Sunday, October 14, 2007 - link

    The article states that the GPU wasn't the limiting factor, and that's fine; however, I would still like to know what card/driver was used in the CPU test rig.
  • Spoelie - Sunday, October 14, 2007 - link

    Any word on the MSAA units in the render backend of the RV670 -- are they fixed?
    Seeing the X1950 XTX post the exact same framerate as the 2900 XT when 4xAA is turned on really hurts.

    This just shows how good the R580 core really was, being able to keep up with the 8800 GTS on this workload.
  • NullSubroutine - Thursday, October 18, 2007 - link

    I believe the RV670 did address AA, along with some other fixes and improvements (as reported on other sites, the core is not just a die shrink).
  • xinoxide - Sunday, October 14, 2007 - link

    Why is the HD 2900 XT 1GB not included? The increase in memory speed and capacity does help framerates, especially when sampling many parts of the image -- shadow cache and AA resampling, for example. I've been finding AnandTech to be "somewhat" biased in this respect: they show the 8800 Ultra's performance over the 8800 GTX itself, yet while ATI struggles in driver aspects, that doesn't mean there is no increase in performance from the HD 2900 XT 512MB to the HD 2900 XT 1GB.
  • felang - Sunday, October 14, 2007 - link

    I can't seem to download the demo files...
  • DorkOff - Saturday, October 13, 2007 - link

    I for one really, really appreciate the attached *.dem files. I wish all benchmarking reviews, whether of video cards or CPUs, would include them. Thank you, AnandTech.
  • MadBoris - Saturday, October 13, 2007 - link

    I'm starting to get disappointed by how console transitions are affecting technology enhancements. It's no surprise that UE3 and Valve's Source engine will perform well; they have to run well on consoles, after all. What I don't like is the lack of DX10 integration (although it can be argued it's premature and has been fake in other games), as well as things like AA being 'notably absent' in UE3. Obviously the platform compatibility challenges are keeping them more than busy, so extra features like this are out of reach. I guess that is the tradeoff; I'm sure the games are fun, though, so not too much to complain about. Technology is apparently going to level off a bit for a while. I'm sure owners of older value hardware can be glad about all this.

    Real soon, if not already, we will have more power on the PC than will be used for some time to come, with one exception... This year it will be Crytek pushing the envelope, and those GTS 320s (great for multiplatform games limited by consoles) are going to show that the 512MB of memory that started appearing on video cards years ago shouldn't have been a trend so easily ignored by PC gamers going for a 320.

    Otherwise, looking forward to many more of these for UE3 and of course Crysis.
  • MadBoris - Saturday, October 13, 2007 - link

    I forgot to mention x64 adoption as another thing that won't be happening in games for years, due to the limited memory in consoles setting the bar. Crytek cannot move the industry along by itself; in fact, it may even get a black eye for doing what Epic, id, and Valve used to do but don't anymore. Many despised the gaming industry's ever-increasing hardware demands, but the advances we have today are due to them; things like multicore CPUs and the GPUs we have now were all helped along faster by the gaming industry.

    Memory is inexpensive; imagine if we could move up to 3 or 4GB in games in the coming couple of years -- game worlds could become quite large and full, and level loading wouldn't be nearly as frequent or interrupting. But alas, x64 will be another thing that won't be utilized in the years to come. With the big boys like Epic, id, and Valve all changing their strategies and focus, it appears things will never be the same, at least with the consoles and their five-year leaps now dictating terms. It's even doubtful that next-gen consoles will use x64, because they won't spend money to add more memory and therefore have no need for it. 32-bit: how do we get out of it when MS won't draw the line?
    Sorry if this is deemed a bit off topic, but since this is a tech article about a game, it got my brain thinking about these things.
  • munky - Saturday, October 13, 2007 - link

    What's the point of running your CPU benches at 1024x768? We already know Intel will have the performance lead, but these benches say nothing about how much difference the CPU makes at resolutions people actually use, like 1280, 1600, and 1920.
  • Final Hamlet - Saturday, October 13, 2007 - link

    ...your GPU world starts with an ATI 2900 XT and ends with an NVIDIA 8800 Ultra. Mine does not. I really would like to see common GPUs (take a look @ the Valve Hardware Survey... how much of that ridiculously high-end hardware do you see there?). Please include the ATI 1xxx and the NVIDIA 7xxx @ some _realistic_ resolutions (sry... maybe you have a home theater with 10,000x5,000 - but testing that has no relevance to 99%+ of your readers, so why do you keep producing articles for less than 1% of PC players?)
  • archcommus - Saturday, October 13, 2007 - link

    I'm running an Athlon 64 3200+, 1 GB of memory, and an X800 XL, and the game plays beautifully smooth (not a single hiccup even during action) at 1280x1024 with MAX EVERYTHING - by max everything I mean very high texture detail, high everything else, 6x AA, 16x AF. So if you're running a regular LCD resolution (not widescreen) you basically don't need benchmarks at all - if you built your system anytime within the last 2-3 years (up to two generations ago), it's going to run fine even with max settings. Thus the benchmarks are tailored to people running much higher resolutions, and because of that need higher-end hardware.

    Considering the game still looks great with the exception of some more advanced lighting techniques that new games have, I'm very impressed with what Valve has done.
  • Trixanity - Saturday, October 13, 2007 - link

    I kinda agree with that. I would like to see some tests with more standard specs rather than just top-end. You should use top-end when comparing new hardware against each other (AMD vs Intel and NVIDIA vs AMD), but when testing a game you may still use those, but try also to include older hardware such as DX9 cards like the X1900 XTX and 7800 GTX or 7900 GTX (and GT).
    It would be more helpful to the average gamer, as not everyone has the money to invest in the top end each month. About resolution: you mostly test at resolutions that are out of reach for most, but this time I think you did include 1024x768 for CPU testing, right? I think you should do 1024x768 and 1280x1024 (or widescreen resolutions) more often. I myself run my games at 1024x768, but would probably change when I get a new monitor (I have an old CRT). Those 1920x1200 resolutions are only for people with 24" screens or more.
  • Cali3350 - Saturday, October 13, 2007 - link

    The demo files aren't linked correctly; I can't get them to download in either Firefox or IE.
  • tmx220 - Saturday, October 13, 2007 - link

    Why wasn't the HD 2900 Pro tested?
    The review boasts the 8800 GTS 320MB as the best value, but clearly it didn't have its price competitor (the HD 2900 Pro) to go up against.
  • Proteusza - Saturday, October 13, 2007 - link

    This test is kinda screwed, I think. They didn't test the 8800 GTS 640, but then claim ATI is the winner at that price point. Hello? You didn't test it, so how do you know? They also state that the 320MB version performs identically at lower resolutions and AA settings, which is true, but they didn't test the 640MB version, so we don't know how it performs against the 2900 XT at high res/AA settings. Thanks guys.
  • tonjohn - Saturday, October 13, 2007 - link

    Probably b/c the GTS is already competitive with the XT as it is.
  • 8steve8 - Friday, October 12, 2007 - link

    It's misleading (not saying intentionally... but still...)
    to compare CPU costs without considering chipset costs.
    Intel motherboards have higher costs partially because the memory controller is in the chipset.

    What is more applicable to us is CPU+motherboard cost comparisons,
    or CPU+motherboard+RAM (but here we assume it's all the same DDR2, so it doesn't really matter).

    Just one example (I was recently looking at building a system with HDMI):

    The cheapest Core 2 Duo board with HDMI is $114 shipped at Newegg (an ATI chipset).
    The cheapest Intel-chipset board for Core 2 Duo with HDMI is $126 shipped at Newegg.

    The cheapest AM2 board with HDMI is $72 shipped (ATI chipset).
    The cheapest AM2 NVIDIA-chipset board with HDMI is $79 shipped.

    So maybe this particular type of motherboard isn't a great example, but here we see an average price difference of over $40.
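
    (That works out to $114 - $72 = $42 between the cheapest boards and $126 - $79 = $47 between the Intel-chipset and NVIDIA-chipset boards, an average of roughly $44.)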

    For my purposes, that means Intel CPUs have to be $40 cheaper with the same performance...

    Before anyone spams me: I agree Intel has better CPUs right now, but comparing cost on the CPU alone is not relevant to consumers, since a CPU is useless without a motherboard.
  • mcnabney - Friday, October 12, 2007 - link

    That is a very valid point to make. However, motherboards and the chipsets on board can also impact performance, so once you start adding more variables this simple article will need a Rosetta stone to decipher.

    Also, since most of these GPUs can be run in SLI/CrossFire, does either ATI or NVIDIA scale better with a second card?
  • KeithP - Friday, October 12, 2007 - link

    It would have been far more useful with some older GPUs benchmarked.
  • johnsonx - Sunday, October 14, 2007 - link

    Indeed... why spend hours testing 5 different speeds of the same CPU, yet not even bother with any X1K series or GeForce 7 series GPUs? Do we really need tests to tell us an X2-5000 is a little faster than an X2-4800, which is a little faster than a 4600, which is a little faster than a 4400, etc.? At most we'd need tests of the fastest and slowest sample of each processor type. I guess for Intel this is mostly what was done, although even a couple of those could be dropped, but for AMD X2s only 2 or maybe 3 needed to be included. Also, how about a couple of common single cores to see how much dual-core benefits the new Source engine, say an A64 3500+ and a 3.2GHz P4?
  • Vidmar - Saturday, October 13, 2007 - link

    No doubt. Here I was expecting to see how the game might run on NVIDIA 7800 or 7900 cards. A bit misleading of them to suggest that this would be a comprehensive review. Far from it, with only 4 GPUs used.
  • tonjohn - Friday, October 12, 2007 - link

    I would also like to see tests on:
    * 1GB vs 2GB vs 4GB of RAM
    * WinXP vs Vista (maybe even 32-bit vs 64-bit OS comparisons)
    * EP1 performance vs EP2 performance, and CSS (or DODS) performance vs TF2 performance.
  • tonjohn - Friday, October 12, 2007 - link

    This article claims that the new engine only takes advantage of two cores. However, Valve's comments all suggest that the engine can take advantage of four or more cores.

    As for proof, this is what one of my co-workers from Valve's forums reports:
    "How about this?!

    http://i24.tinypic.com/rvefs6.png

    I was in the level called "our mutual fiend" where you go down into the depths to find out what is going on. Multiple hunters and all kinds of activity going on. I'm going to run this same level with graphical settings all on low to eliminate GPU bottlenecks and set the core affinity for HL2 to use one, then two, then three and then four and record the fps in game."
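
    For anyone wanting to repeat that affinity experiment without Task Manager, one way -- just a sketch, assuming a version of Windows whose 'start' command supports the /affinity switch (the hex mask selects which cores the process may use) -- is to launch the game from a command prompt in its install folder:

        start /affinity 1 hl2.exe -game ep2    (core 0 only)
        start /affinity 3 hl2.exe -game ep2    (cores 0 and 1)
        start /affinity F hl2.exe -game ep2    (cores 0 through 3)

    Otherwise, right-clicking the running hl2.exe process in Task Manager and using "Set Affinity" does the same thing by hand.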
  • ViRGE - Friday, October 12, 2007 - link

    I could be wrong here, but that doesn't indicate that HL2 is using all 4 cores. It could be bouncing processes between cores (which does happen with other applications), which makes it look like all 4 cores are in use.
  • tonjohn - Saturday, October 13, 2007 - link

    Here is the official word on this topic:
    quote:

    ----- Original Message -----
    From: "Mike Durand" <mdurand@valvesoftware.com>
    To: <hlcoders@list.valvesoftware.com>
    Sent: Thursday, October 11, 2007 12:45 PM
    Subject: RE: [hlcoders] questions

    We default to taking advantage of no more than three threads due to some problems that we believe to be due to cache issues with current quad-core processors. You can override this limitation by specifying '-threads 4' on the command line if you like. '-threads 8' should work with this build as well when eight core processors become available.

    -Mike

    So Anand needs to adjust the article.
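
    For reference, that override would normally be set either in Steam (the game's Properties > launch options) or appended to a shortcut/command line, something along these lines (the path and -game parameter are illustrative):

        hl2.exe -game ep2 -threads 4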
  • tonjohn - Monday, October 15, 2007 - link

    The article still says that the new engine only supports two threads when that is actually wrong. Please fix this.
  • FrankThoughts - Saturday, October 13, 2007 - link

    I have several thoughts.

    First, the whole "30FPS is too slow" stuff is garbage. Maybe in multiplayer, but in single player HL2 Episode Two is perfectly good at 30-40 FPS. I say this because:

    Second, I played through the WHOLE GAME at 2560x1600 4xAA with an X1900 XT GPU. (2GB of RAM, Athlon X2 4800+ CPU, Windows XP). I checked frame rates with FRAPS, and typically got anywhere from 30-45 FPS. What's odd is that I actually thought I was getting much higher rates and it was only after completing the game that I checked the real values and discovered I was hitting low to mid 20s at times.

    Third, as usual, CrossFire support is broken out of the box. Maybe the 7.10 drivers that just got released have fixed this, but those weren't available the day EP2 released, so I played and beat the game with a single GPU running. (CrossFire ran, but there was a lot of graphical corruption and major slowdowns. Same goes for Portal. Not sure about TF2 yet....)

    Fourth, CPU cores and performance scaling. Well, I may be doing something wrong (I've tried setting affinity in Task Manager as well as using "-threads [x]" to set CPU core use to one or two cores). Other than minor variations of around 0.4 FPS (using a timedemo I created, since the AT demos apparently aren't available for download), I get the same result with one or two CPU cores. So, while the engine might be threaded to the point where it can "use" four or even eight cores, the reality is that it doesn't impact performance at all that I can see. Perhaps I'm GPU limited, but I was testing at 1280x800 and averaged 125 FPS (plus or minus 0.3 FPS) at every CPU threading level I tried. CPU usage still got up to around 60% max regardless, so I'm thinking either the sound drivers or the graphics drivers are utilizing the other core. Bottom line is that, best case (and not even likely), the multithreading is giving about a 5-10% performance boost. Usually threading has an overhead of 5-10%, so most likely CPU utilization went up but performance remained virtually unchanged.

    If you have X1900 series hardware, this game runs perfectly well, turning in great performance even at insane 2560x1600 4xAA settings. (If I turn off AA, I get about 40% faster frame rates.) What's funny is that my performance seems to be within spitting distance of the 8800 GTS and HD 2900 XT, all on 22-month-old hardware. I've been thinking I "need" to upgrade for a long time, but every time I actually play some new title I end up with perfectly reasonable performance. My next upgrade will be quad core and either SLI or CrossFire (probably SLI, since CrossFire has left me irritated on more than one occasion - basically every new game fails to run with CrossFire for anywhere from 1 to 4 months), but I'm not going to take the plunge until I actually feel the performance gain will be worthwhile. At the current rate, I might be sticking with DX9, XP, and X1900 until late 2008!
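
    For anyone wanting to reproduce this kind of test, the usual Source console workflow for creating your own timedemo is roughly (a sketch; the developer console has to be enabled first):

        record mydemo    (start recording during play)
        stop             (stop recording; writes mydemo.dem to the game folder)

    and then benchmark it with the timedemo command, which reports the average fps at the end of playback.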
  • tonjohn - Friday, October 12, 2007 - link

    Notice how, prior to the alt-tabbing, Core 0 has high usage and the other cores are all being used, but the % varies with each core. This is a good indicator that the engine is actively taking advantage of each core and not simply the OS bouncing the supposed two threads around.

    More supporting evidence that the information in Anand's article is potentially incorrect:
    quote:

    Multicore Support - Team Fortress 2 only makes use of multiple CPU cores for its particle system. However, both Episode 2 and Portal make use of the Source engine’s new scalable multicore system.

    Their multicore solution will scale dynamically with however many cores you have on your system. The more cores you have, the more Source engine subsystems will be offloaded to these cores. What are these “subsystems” I speak of, you may be wondering. Areas such as the particle simulation, the materials system, and artificial intelligence are just a few of these subsystems that can be offloaded onto other cores for increased performance across the board.

    However, there are some drawbacks to this. There will obviously come a point where the performance gain from offloading these subsystems to additional cores is hampered by a weak GPU. As is the case now with single and dual-core solutions, making sure to strike a balance between a strong CPU and a GPU that can keep up.

    From CSNation, http://www.csnation.net/articles.php/article_234/
  • steamsucks - Friday, October 12, 2007 - link

    Steam sucks. You can't even dl the piece of crap game because of server issues. Don't buy this game, and don't support Valve/Steam.
  • Zak - Sunday, November 11, 2007 - link

    I have never had any issues with Steam since the HL2 release. I actually like the idea of not having to deal with copy-protected CDs and "please insert CD number 5" every time I want to play a game, and having my games always updated. I buy a game in the evening and the next day it's INSTALLED and ready on my hard drive. I can make fully re-installable backups of any Steam content on a hard drive, so I don't have to re-download whole games when I need to reinstall. I only wish they discounted Steam games more than the actual physical products. Other than that, Steam is a great idea IMHO.

    Z.
  • sc3252 - Monday, October 15, 2007 - link

    Steam does suck. I wish I didn't have to use Steam to play the Orange Box, but that's how things work. I dislike it, but acknowledge it's not as bad as BioShock's DRM.

    The first thing I did when I bought this game was put it on a separate account, so that my brother can play Counter-Strike while I play Team Fortress 2.

    As far as performance goes, I am gaming on a 3000+ Athlon 64 with a 7600 GT. Everything plays perfectly at 1680x1050. I have AA turned off, but everything else is turned up. I can't say the same for UT3, or any other modern game that has graphics even close to Half-Life 2's level.
  • retrospooty - Saturday, October 13, 2007 - link

    I have never had an issue with steam, using it since HL2 first came out. Not sure what your issues are, but you might want to look into them.
  • RamarC - Friday, October 12, 2007 - link

    what a twit. hundreds of thousands of satisfied users, but he can't get it to work so it must be a piece of crap. go play your ds... even six year olds can handle those.
  • cmdrdredd - Friday, October 12, 2007 - link

    You're just a moron then. Steam works fine, the game is fine, Valve did a good job.

    Everyone knows that the HD2900XT does pretty poorly at high resolutions with AA. Every game is like this. To be surprised means you weren't paying attention
  • redfirebird15 - Friday, October 12, 2007 - link

    Can you post screenshots comparing the AA-enabled to the non-AA tests? Just wondering how the increase in image quality compares with the impact on performance. Thanks!

    Oh, and could you post results for an X1900 XTX? It may be older, but the cost of upgrading hasn't been justified yet.
  • shabby - Friday, October 12, 2007 - link

    Wow that 2900xt just tanks when you enable aa, bummer.
  • Spartan Niner - Tuesday, October 16, 2007 - link

    Last time I checked, the 2900XT doesn't scale as nicely with AA as 8800 series cards do. Also, the resolutions tested and the cards used in this comparison are far from what mainstream gamers use... the CPU comparisons are also nice but shouldn't be the main focus. The results basically tell us more expensive CPUs give better stock performance...

    At the very least, a test using more common resolutions (1600x1200, 1280x1024, and 1024x768 for 4:3; 1680x1050 and 1440x900 for the widescreen crowd), combined with X1xxx-series ATI cards and 7xxx NVIDIA cards, would be more realistic.

    For reference, I use an E2140 @ 2.4GHz and an X850 XT video card and run CS/DoD: Source at max everything, 4xAA. At least for my needs, it seems like any video card short of an 8800 series or a 2900 series card is not a cost-effective upgrade, when mid-range cards don't offer that much of a performance boost over my X850...
