Crysis: Warhead

Kicking things off is Crysis: Warhead, still the single most demanding game in our arsenal; even high-end cards continue to struggle to put out a playable frame rate with everything turned up.

For testing these low-end cards we have deviated somewhat from our normal methodology: these tests were run at Mainstream graphics quality, at resolutions more appropriate for this class of card.

The last card we ended up testing for this article was the 4670, and the moment we ran this benchmark it killed any enthusiasm we had for the GT 220. The GT 220 simply is not competitive here with the 4670, its slightly cheaper competition from AMD.

If there is a bright side, it's that Crysis runs decently even on the GT 220 once the quality settings are turned down to Mainstream.

Comments

  • abs0lut3 - Tuesday, October 13, 2009 - link

    When is the GT 240 coming out, and when are you going to review it? I had expected the GT 220 to be as low as it comes (reaaallly low end); however, I saw some preliminary reviews on other forums of the GT 240, the supposedly new NVIDIA 40nm mainstream card with GDDR5, and I'm quite fascinated with the results.
  • MegaSteve - Tuesday, October 20, 2009 - link

    No one is going to buy one of these cards by choice; they are going to be thrown into HP, Dell and Acer PCs under a pretty sticker saying they have POWERFUL GRAPHICS or some other garbage. Much the same as when they provided 6600 graphics cards instead of 6600GTs. Then again, I would probably rather have a 6600GT, because if the first DirectX 10 cards were any indication, this thing will suck. I am sure this thing will play Blu-ray...
  • Deanjo - Tuesday, October 13, 2009 - link

    "NVIDIA has yet to enable MPEG-4 ASP acceleration in their drivers"

    Not quite: they have not enabled it in their Windows drivers. It has been enabled in the Linux drivers for a little while now.

    ftp://download.nvidia.com/XFree86/Linux-x86_64/190...

    VDP_DECODER_PROFILE_MPEG4_PART2_SP, VDP_DECODER_PROFILE_MPEG4_PART2_ASP, VDP_DECODER_PROFILE_DIVX4_QMOBILE, VDP_DECODER_PROFILE_DIVX4_MOBILE, VDP_DECODER_PROFILE_DIVX4_HOME_THEATER, VDP_DECODER_PROFILE_DIVX4_HD_1080P, VDP_DECODER_PROFILE_DIVX5_QMOBILE, VDP_DECODER_PROFILE_DIVX5_MOBILE, VDP_DECODER_PROFILE_DIVX5_HOME_THEATER, VDP_DECODER_PROFILE_DIVX5_HD_1080P

    * Complete acceleration.
    * Minimum width or height: 3 macroblocks (48 pixels).
    * Maximum width or height: 128 macroblocks (2048 pixels).
    * Maximum macroblocks: 8192

    (A minimal sketch showing how to query these profiles follows the comments.)

  • Deanjo - Tuesday, October 13, 2009 - link

    I should also mention that XBMC already supports this in Linux.
  • Transisto - Tuesday, October 13, 2009 - link

    zzzzzzzzzzzzzzzzz...............
  • Souleet - Monday, October 12, 2009 - link

    I guess the only place actually selling Palit right now is Newegg. http://www.newegg.com/Product/ProductList.aspx?Sub...
  • MODEL3 - Monday, October 12, 2009 - link

    Great prices, lol (either they have old 55nm stock or the 40nm yields are bad or they are crazy, possibly the first)

    Some minor corrections:

    G 210 ROPs should be 4, not 8 (8 should be the texture units; the GT 220 should have 8 ROPs and 16 texture units).

    http://www.tomshardware.co.uk/geforce-gt-220,revie...

    (Not because Tom's Hardware says so, but because otherwise it makes no sense for NVIDIA's architects to have designed such a bandwidth-limited GPU, and based on past architecture design logic. A worked bandwidth example follows the comments.)

    G 210 standard config core clock is 589MHz, shaders 1402MHz.

    (check Nvidia's partner sites)

    9600GSO (G94) memory bus width is 256-bit, not 128-bit.

    http://www.nvidia.com/object/product_geforce_9600_...

    58W should be the figure NV gives for the GT 220 paired with GDDR3; with DDR3 the power consumption should be a lot less.

    Example for GDDR3 vs DDR3 power consumption:

    http://www.techpowerup.com/reviews/Palit/GeForce_G...
    http://www.techpowerup.com/reviews/Zotac/GeForce_G...
  • Souleet - Monday, October 12, 2009 - link

    I'm sure there is a cooling solution, but it will probably hurt your wallet. I love ATI, but they need to fire their marketing team and hire some more creative people. NVIDIA needs to stop underestimating ATI and crush them; right now they are just giving ATI a chance to steal some market share back.
  • Zool - Monday, October 12, 2009 - link

    It's 40nm with only 48 SPs and 8 ROPs/16 TMUs, and still only a 1360MHz shader clock. Is TSMC's 40nm this bad or what? The 55nm, 128-SP GTS 250 has 1800MHz shaders.
    Could you please try out some overclocking?
  • Ryan Smith - Tuesday, October 13, 2009 - link

    We've seen vendor overclocked cards as high as 720MHz core, 1566MHz shader, so the manufacturing process isn't the problem. There are specific power and thermal limits NVIDIA wanted to hit, which is why it's clocked where it is.
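For readers who want to check Deanjo's claim on their own hardware, below is a minimal sketch of how a Linux application can ask the VDPAU driver whether the MPEG-4 Part 2 (ASP) profile is accelerated. The capability query is part of the public VDPAU C API; the build line and the error handling are assumptions for illustration, not something taken from the article or the comments.

    /*
     * Sketch: probe VDPAU for MPEG-4 Part 2 (ASP) decode acceleration.
     * Assumed build line: gcc vdpau_probe.c -o vdpau_probe -lvdpau -lX11
     */
    #include <stdio.h>
    #include <stdint.h>
    #include <X11/Xlib.h>
    #include <vdpau/vdpau.h>
    #include <vdpau/vdpau_x11.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy) { fprintf(stderr, "cannot open X display\n"); return 1; }

        /* Create a VDPAU device; it also hands back the entry-point loader. */
        VdpDevice dev;
        VdpGetProcAddress *get_proc;
        if (vdp_device_create_x11(dpy, DefaultScreen(dpy), &dev, &get_proc)
                != VDP_STATUS_OK) {
            fprintf(stderr, "vdp_device_create_x11 failed\n");
            return 1;
        }

        /* All other VDPAU functions are fetched through get_proc. */
        VdpDecoderQueryCapabilities *query;
        if (get_proc(dev, VDP_FUNC_ID_DECODER_QUERY_CAPABILITIES,
                     (void **)&query) != VDP_STATUS_OK) {
            fprintf(stderr, "no decoder query entry point\n");
            return 1;
        }

        VdpBool supported;
        uint32_t max_level, max_mbs, max_w, max_h;
        if (query(dev, VDP_DECODER_PROFILE_MPEG4_PART2_ASP, &supported,
                  &max_level, &max_mbs, &max_w, &max_h) == VDP_STATUS_OK
                && supported)
            printf("MPEG-4 ASP: accelerated, up to %ux%u (%u macroblocks)\n",
                   max_w, max_h, max_mbs);
        else
            printf("MPEG-4 ASP: no acceleration reported\n");

        XCloseDisplay(dpy);
        return 0;
    }

On a driver like the 190 series Deanjo links, the supported path should report limits in line with the ones he quotes (2048-pixel maximum dimension, 8192 macroblocks).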
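To make the bandwidth reasoning in MODEL3's comment concrete, here is a small worked example of peak memory bandwidth. The formula is standard (bus width in bytes times effective data rate); the data rates used below are illustrative round numbers, not the actual rated memory clocks of the GT 220 or 9600GSO.

    /*
     * Illustrative arithmetic only: peak memory bandwidth.
     * The data rates are hypothetical round numbers chosen to show the
     * DDR3-vs-GDDR3 and 128-bit-vs-256-bit gaps; they are not NVIDIA specs.
     */
    #include <stdio.h>

    /* bandwidth (GB/s) = (bus width in bits / 8) * effective data rate (GT/s) */
    static double peak_bw_gbs(int bus_bits, double rate_gts)
    {
        return (bus_bits / 8.0) * rate_gts;
    }

    int main(void)
    {
        printf("128-bit DDR3  @ 1.6 GT/s: %5.1f GB/s\n", peak_bw_gbs(128, 1.6));
        printf("128-bit GDDR3 @ 2.0 GT/s: %5.1f GB/s\n", peak_bw_gbs(128, 2.0));
        printf("256-bit GDDR3 @ 2.0 GT/s: %5.1f GB/s\n", peak_bw_gbs(256, 2.0));
        return 0;
    }

The 256-bit line is why the 9600GSO correction matters: at the same data rate, doubling the bus width doubles peak bandwidth.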
