BattleForge

BattleForge is a game we have always taken to be shader-heavy, which is what makes our results here so interesting. While Far Cry 2 was a dead heat between the 5830/4890/GTX275, BattleForge quickly separates the cards. The 5830 is anywhere between 15% and 20% slower than those cards, leaving it to hang with the 4870 most of the time, and even to lose to that card at 1680. Meanwhile, against the 5850 it's 25-30% slower.

Given that the 5830 has a higher ratio of shader power to rendering power than any of these other cards, our best conclusion is that BattleForge is in fact a ROP-heavy game when not completely starved of shader power. The result is that the 5830 ends up placing near a card that costs some 33% less. It's clear that the impact of cutting the ROPs in half will vary from game to game.
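
To put rough numbers on that, here's a minimal back-of-the-envelope sketch using the cards' published stream processor counts, ROP counts, and core clocks; the figures are theoretical peaks rather than measured throughput, and the GTX 275 is left out since its separate shader clock and scalar architecture make the ratio harder to compare directly.

    # Back-of-the-envelope shader-to-ROP comparison (theoretical peaks, not measured).
    # Format: name -> (stream processors, ROPs, core clock in GHz)
    cards = {
        "HD 5830": (1120, 16, 0.800),
        "HD 5850": (1440, 32, 0.725),
        "HD 4890": (800, 16, 0.850),
        "HD 4870": (800, 16, 0.750),
    }

    for name, (sps, rops, clock_ghz) in cards.items():
        gflops = sps * 2 * clock_ghz   # 2 FLOPs (multiply-add) per SP per clock
        gpix_s = rops * clock_ghz      # 1 pixel per ROP per clock
        print(f"{name}: {gflops:.0f} GFLOPS, {gpix_s:.1f} Gpix/s, "
              f"~{gflops / gpix_s:.0f} GFLOPS per Gpix/s")

    # The 5830 lands around 140 GFLOPS per Gpixel/s versus roughly 90-100 for the
    # other Radeons, which is the lopsided shader-to-rendering ratio described above.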

Comments

  • MadMan007 - Thursday, February 25, 2010 - link

    Since when is ATi taking marketing technique pointers from nVidia?

    "...the 5830 has some very useful advantages over the 4890 – DX/DirectCompute 11, Eyefinity, better OpenCL support, and bitstreaming audio..."

    Substitute PhysX, CUDA, and 3D display and that would be an NV marketing line.

    (btw why does using quote tags always throw an error in article comments?)
  • Ramon Zarat - Saturday, February 27, 2010 - link

    I beg to differ. There are very clear distinctions between the technologies you mentioned!!!


    CUDA: Proprietary API, a closed platform strictly regulated by Nvidia that will soon be obsolete due to OpenCL's broad adoption. Market penetration is still limited to vertical market niches.
    Stream: Based on OpenCL, an open platform supported by the whole community and representing the future of the industry, which will presumably enable most if not all properly coded and compiled applications and games to benefit from it.

    PhysX: Proprietary API supported by only a dozen games, of which 10 are very bad.
    Havok: Will transparently use OpenCL open standard to do in-game Physics, which will ensure a wide adoption.

    Nvidia: Bitstreaming *REGULAR* audio over HDMI
    ATI: Bitstreaming *TrueHD/DTS-HD Master Audio* audio over HDMI

    3D Vision: Proprietary API. Needs one of *ONLY* 4 Nvidia-approved 120Hz LCDs ( http://www.nvidia.com/object/3D_Vision_Requirement... ) and the games must be supported in the driver. Costly setup, low market penetration.
    Eyefinity: Actually works out of the box in a 2D environment. You only need any 2 LCDs/CRTs + 1 LCD with DisplayPort (any brand), or a DVI/HDMI panel with an active converter. A 6-port version is launching in a couple of weeks. For 3-panel gaming, game profiles are now outside the drivers and available almost as soon as new games come out. Drivers for games still need some polishing.

    I try very hard to be objective, but the facts speak for themselves. ATI is doing better technology right now and shouldn't be ashamed to publicize its superiority. By contrast, Nvidia's totalitarian TWIMTBP program, dictatorial proprietary stuff everywhere, and deceptive general attitude as of late ("late", as in the last 5 years...), are ethically highly questionable. The day ATI does the same, I will denounce them as well.
  • piroroadkill - Thursday, February 25, 2010 - link

    Exactly, nobody gives a shit.

    The 4890 is faster and cheaper, the end
  • ImSpartacus - Thursday, February 25, 2010 - link

    No kidding. I am so thankful that I got my 4890 when it came out. I only paid $225 for it too.

    It still hasn't been topped at its price point.
  • Makaveli - Thursday, February 25, 2010 - link

    I picked up my 4890 in Oct for $189 and still laughing about it.

    I won't bother upgrading until the successor to the 5xxx series comes out.
  • kmmatney - Thursday, February 25, 2010 - link

    I jumped on the MSI HD4890OC deal for $180 a year ago, and actually received the rebate after 4 months. Amazing that you can't spend the same amount of money and get something that performs better a year later.
  • strikeback03 - Thursday, February 25, 2010 - link

    Not really that amazing, it is what happens when there is no real competition. If Nvidia can shock the world and drop something new and good at the $200 price point it is a good bet you will see the whole market adjust quickly.
  • Deville - Thursday, February 25, 2010 - link

    Exactly. It's silly to offer another card that performs in the range of last gen's cards. What's the point of "upgrading" if there's no upgrade?
    If it can barely keep up with last year's models, how can we expect it to do the DX11 stuff? And isn't the DX11 stuff pretty much the only reason to upgrade anyway?

    Here's the problem when comparing new versions of 5000 series cards:
    The numbering system helps, but we have precious little data to show us how DX11 even performs under these new cards.

    I love reading your shootouts, but give us DX11 benchies, please.
  • san1s - Thursday, February 25, 2010 - link

    that's exactly what I was thinking
  • gumdrops - Thursday, February 25, 2010 - link

    Where are all the DX11 game tests like DIRT 2 or Alien vs Predator? BattleForge is the only one and it's unclear if the game was even run in DX11 mode for cards that support it.
