Crysis: Warhead

Kicking things off is Crysis: Warhead, which remains the single most demanding game in our arsenal; cards continue to struggle to put out a playable frame rate with everything turned up.

Update: As a few of you pointed out, there was something a bit off with our Crysis results; we had a Radeon 4850 beating the 5770. As it turns out, we wrote down the maximum frame rate for the 4850 instead of the average frame rate. None of the other results were affected, and this has been corrected. Sorry, folks.

There are a few different situations we’re going to be interested in. The first is the matchup between the 5770, the 4870, and the GTX 260. The second is the matchup between the 5750, the 4850, and the GTS 250. The third is the 5770 as compared to the 5800 series, in order to see what another $100 or $200 is buying you in the Evergreen family.

Unfortunately for the 5770, this is not a game that treats it well. In spite of its clock speed advantage over the 4870 and its architectural advantages (extra caches and what-not), it underperforms the 4870 by about 15% here. AMD once told us that it didn't believe the 4870/4890 were memory bandwidth constrained, but since memory bandwidth is the only significant difference between the 5770 and the 4870 that would explain this gap (certainly Juniper shouldn't be slower than RV770), we are beginning to doubt that. Meanwhile the GTX 260 outscores the 5770 here too.
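To put a number on that bandwidth gap, here is a minimal sketch of the math, assuming the reference memory clocks for both cards (4.8Gbps effective GDDR5 on the 5770's 128-bit bus, 3.6Gbps on the 4870's 256-bit bus):

    # Theoretical memory bandwidth = effective transfer rate (MT/s) * bus width (bits) / 8
    def bandwidth_gbs(effective_mtps, bus_width_bits):
        """Return peak memory bandwidth in GB/s."""
        return effective_mtps * bus_width_bits / 8 / 1000

    hd5770 = bandwidth_gbs(4800, 128)   # ~76.8 GB/s
    hd4870 = bandwidth_gbs(3600, 256)   # ~115.2 GB/s
    print(f"HD 5770: {hd5770:.1f} GB/s, HD 4870: {hd4870:.1f} GB/s "
          f"({hd4870 / hd5770 - 1:.0%} more for the 4870)")

On those numbers the 4870 has roughly 50% more memory bandwidth to work with, which lines up with bandwidth being the likeliest culprit for the deficit.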

Looking at the 5770 compared to the 5850, $100 buys you roughly 50% more performance.

The 5750 fares much better here. It beats the 4850 by 10%-20%, and beats the GTS 250 by a similar margin.

117 Comments

  • GrizzlyAdams - Tuesday, October 13, 2009 - link

    That may be due to some architectural improvements in the 5770's shaders. The drop in performance in other games may be due to the decreased memory bandwidth, which may not matter as much in Far Cry 2.
  • papapapapapapapababy - Tuesday, October 13, 2009 - link

    These cards are super lame... the 5750, now with +80 stream processors! XD That 5750 is basically a (lower clocked!) 4770... guess what, ATI? That card cost me $85 six months ago! But who cares, right? NVIDIA is dead, so why bother? Just slap on a DX11 sticker and raise the price, ATI?
  • The0ne - Tuesday, October 13, 2009 - link

    Just wanted to say I like the conclusion; it's dead on with its suggestions and advice.

    I'm very surprised almost no one is bringing up the subject of DirectX. DX11 has a better chance of succeeding, yet gets far less attention. It's amazing how badly DX10 did at getting consumers to do an about-face.
  • kmmatney - Tuesday, October 13, 2009 - link

    The problem with DX10 was that you had to buy Vista to get it...
  • MadMan007 - Tuesday, October 13, 2009 - link

    The DX10 rendering paths of games that also had DX9 paths (meaning all of them at the time, and even now) were also *slower* and provided little to no image quality improvement. So even if DX10 hadn't been Vista-only (and only morans keep on with the Vista FUD after SP1), there was no real benefit. DX11 looks to be different in all respects.
  • Lifted - Wednesday, October 14, 2009 - link

    Yeah, get a brain!

    http://24ahead.com/images/get-a-brain-morans.jpg
  • Zool - Tuesday, October 13, 2009 - link

    It's quite strange that, with a die size of 166mm2 against 260mm2 (RV770) and a 128-bit memory bus, it costs this much. And the 5750 has one SIMD disabled, which further increases the number of usable chips (though maybe it's disabled just for differentiation, or else the two cards would be exactly the same apart from clocks).
    Is the tessellation hardware, with its fixed-function units, exactly the same as in the 5800 series, or is it scaled down?
  • philosofool - Wednesday, October 14, 2009 - link

    I chalk it up to lowish 40nm yields at TSMC.
  • Spoelie - Wednesday, October 14, 2009 - link

    + higher cost per wafer than a 55nm one
    + GDDR5 prices
  • Mint - Tuesday, October 13, 2009 - link

    Unless you absolutely need to take advantage of the lower power requirements of the 40nm process (e.g. you pay a ton for power)...

    According to your tests, the 5770 consumes a whopping 48W less idle power than the 4870, and other reviews have comparable results. If your computer is out of standby a modest 10 hours a day, that works out to 175 kWh per year. That's easily $15/year even for people with cheap electricity.

    The funny thing is that I usually see people overstating the savings from power efficiency...
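For reference, the arithmetic in the comment above holds up; here is a minimal sketch of the savings calculation, with the electricity rates being illustrative assumptions rather than figures from the review:

    # Idle power delta between the 5770 and 4870, per the review's power testing.
    watts_saved = 48
    hours_per_day = 10   # assumed time the machine spends out of standby

    kwh_per_year = watts_saved * hours_per_day * 365 / 1000
    print(f"Energy saved: {kwh_per_year:.0f} kWh/year")   # ~175 kWh/year

    # Illustrative electricity rates in $/kWh (assumptions, not from the article).
    for rate in (0.09, 0.12, 0.20):
        print(f"At ${rate:.2f}/kWh: ${kwh_per_year * rate:.2f}/year saved")

Even at the low end of those rates the saving works out to roughly $15 per year, matching the figure in the comment.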
