Unreal Tournament 2004 Performance

UT2004 continues to be quite popular, so we take a look at how well the entry level cards play Epic's latest game.

At 800x600, many of the cards appear to be CPU limited, with the exception of the GeForce 6200, X300 and X300SE (and, of course, the integrated graphics solution).

Unreal Tournament 2004 - AT_Timedemo

The performance is very similar between all of them because of this CPU limitation, so let's step back and see what the whole picture tells us:



The X700 continues to dominate performance, but here its advantage mostly comes down to letting you play at higher resolutions. The 6600 and X600 Pro actually perform quite similarly, as do the 6200 and X300, which isn't a good showing for NVIDIA.

Notes from the Lab

ATI X300: The added memory bandwidth really helps. It's definitely a noticeable improvement in performance over the X300SE. Interestingly enough, the X300 is basically as fast as the 6200 here, even though it has a higher core clock and less memory bandwidth.

ATI X300SE: Not obscenely fast, but the card will play UT.

ATI X600 Pro: Visual quality, again, looks similar to NVIDIA's. Performance at lower resolutions is CPU limited and competitive with NVIDIA. At 800x600, the X600 manages to stay ahead of the game, while NVIDIA falls behind a bit with the 6200. The game locked up once while switching resolutions. It is interesting that average frame rates are actually higher in Doom 3 than they are in UT2004. It looks like Doom 3 is a much peakier game, with larger swings between highs and lows, while UT2004 offers a more stable frame rate. A quick check with FRAPS confirms what we had suspected: although both UT2004 and Doom 3 have a minimum frame rate of around 30 fps with the X600, Doom 3 peaks at about 150 fps while UT peaks at 113 fps. Doom 3 peaks a full 30% higher than UT, despite the fact that the minimum frame rates are the same. Performance of the X600 is very strong; it's better matched against the 6600 than the 6200, despite NVIDIA's marketing.
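For anyone curious how we arrive at those figures, here is a rough sketch of the kind of check you can run on a frame-time log. The file name, the log layout (a header row followed by frame number and cumulative time in milliseconds), and the function itself are assumptions for illustration, not FRAPS's exact output format.

    # Minimal sketch: derive min/avg/peak fps from a frame-time log.
    # Assumes each data row looks like "frame_number,cumulative_time_ms".
    def fps_stats(path):
        times_ms = []
        with open(path) as f:
            next(f)  # skip the header row
            for line in f:
                _, t = line.strip().split(",")
                times_ms.append(float(t))
        # Per-frame durations are the gaps between cumulative timestamps.
        frame_ms = [b - a for a, b in zip(times_ms, times_ms[1:])]
        fps = [1000.0 / d for d in frame_ms if d > 0]
        return min(fps), sum(fps) / len(fps), max(fps)

    low, avg, high = fps_stats("frametimes.csv")
    print("min %.0f fps, avg %.0f fps, peak %.0f fps" % (low, avg, high))

Plugging the X600 numbers into the last step, 150 / 113 works out to roughly 1.33, which is where the "a full 30% higher" figure above comes from.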

ATI X700: At lower resolutions, it's the same speed as the X600. Only when you get past 1024x768 does it really separate itself.

NVIDIA GeForce 6200: Anything below 10x7 is a bit too aliased, but 10x7 seems to play well and look great on the 6200, despite what the average framerate may indicate. Launching the timedemo while still in the video settings screen caused UT to GPF (General Protection Fault).

NVIDIA GeForce 6600: It's tough to tell the difference between the 6600 and the 6200 at lower resolutions. The 6600 gives you the ability to play at up to 10x7 with no real drop in frame rate, but the 6200 does work well for the beginning/casual gamer.

Intel Integrated Graphics: The game is playable at 800x600 with the integrated graphics solution. You have to turn down some detail settings to get better responsiveness, though. It can work as a platform to introduce someone to UT2004.

Comments (44)

  • MemberSince97 - Monday, October 11, 2004 - link

    OT, I wonder about the outcome for us 6800 owners and the VP... Nvidia screamed this new feature to us and I bought it. Will this end in a class action, or perhaps some kind of voucher for people that bought the 6800 specifically for this highly touted feature...
  • Lonyo - Monday, October 11, 2004 - link

    Why is there no X300 in the CS: Source stress test?
    It seems oddly missing, and with no comment as to why...
  • projecteda - Monday, October 11, 2004 - link

    x700 > 9800 Pro?
  • NesuD - Monday, October 11, 2004 - link

    There is some kind of error concerning your max power graph and this statement.

    "other than the integrated graphics solution, the 6200 is the lowest power card here - drawing even less power than the X300,"

    The graph clearly shows the 6200 drawing 117 watts while the X300 is shown drawing 110 watts. Just thought I would point that out.
