Introduction

We have been excited about the flood of new games being released, and we've had our hands full testing and playing as many as we can. Starting with games like Battlefield 2, we've seen some big advancements in game graphics even within the past few months. Black and White 2, in particular, impressed us recently with its stunning water and overall environments. We are always excited about a game that pairs beautiful graphics with rich gameplay, and it seems to be happening more often lately, much to our delight. The Call of Duty 2 demo also has us giddy; it looks and plays great, even if it is frustratingly short.

Some other games that have us waiting in anticipation are Quake 4 and Age of Empires 3. We wish that we had good demos of these games, but unfortunately, we have to wait for the release date like everyone else. The graphics bar is being raised so high by new games that video card manufacturers might have trouble keeping up, and this past Tuesday, with the release of FEAR, the bar was raised another significant notch. Yes, FEAR is out, and it is beautiful.

We recently sat down and tested FEAR with the 1.01 patch that came out the day the game was released. We also tested with the absolute latest drivers from ATI (press sample 8.183.1017, which should be available in Catalyst soon) and NVIDIA (81.85, available on nzone now), both of which offer increased performance in FEAR. Our results were interesting, to say the least, and we'll give you the details on how this game performs on a wide range of boards, including ATI's new X1000 line.

While the single-player and multiplayer demos of this game have been available for quite some time, we had the (quite correct) understanding that final performance would not look anything like what the demos showed. Today, readers can rest assured that the numbers we have collected are an accurate reflection of FEAR performance on modern hardware.

The Game/Test setup

117 Comments


  • LocutusX - Thursday, October 20, 2005 - link

    I have FEAR, and have been playing it for the past day or so ("sick day" from work).

    I can't believe AnandTech would consider it good-looking on non-cutting-edge hardware, where you have to turn the details down. Have you actually played the game for more than 5 minutes? Performance and graphics quality in the later levels are CRAP if you're using mostly medium settings, which is a NECESSITY if you're using a slow X800 part or anything worse (think X800XL).

    For the level of graphics you get, the performance of FEAR is unacceptable. Chronicles of Riddick looked much better, and performed slightly better, on my system. That's an OPENGL game on ATI hardware! Significant, no?

    BTW, I also just tried Quake 4... much, much better performance than FEAR, and the indoor sequences look better by comparison (since I can afford to increase details in Q4, because the D3 engine actually runs pretty decently on ATI hardware with the most up-to-date drivers and Catalyst AI enabled).
  • Jackyl - Thursday, October 20, 2005 - link

    LOL. He thinks X800XL is "slow"! A few months ago, everybody here was raving about the X800XL as the best price/performance card, one that actually beat a lot of higher-end Nvidia cards. Ugly on medium textures? Go play the first DOOM or Wolfenstein 3D, and then come back and say Medium textures are ugly on Fear.

    Some people just won't be satisfied. It's people like you that pay $400-600 for a graphics card, that is causing prices to inflate. I really can't wait until silicon hits the limit where they can't shrink the process anymore and Moore's Law goes obsolete. Then the engineers will have to actually OPTIMIZE the hardware and drivers, instead of just cranking out more raw GPU power. I'm really sick of upgrading, and a lot of my friends stopped upgrading their systems two generations ago. They just gave up.

    Tell me something... Everyone on here sure talks big about wanting to upgrade their cards. But why is it that when I go to a game store, there are barely any PC games on the shelves? I don't think many people are buying PC games today, even though ATI and Nvidia would like to say otherwise. The shelves are full of console games instead.
  • LocutusX - Thursday, October 20, 2005 - link

    Jackyl, thanks for your totally useless post:

    "LOL. He thinks X800XL is "slow"!"

    Yes, compared to the 7800GTX - which apparently is what you NEED for Fear to run at a decent rate AND look good - the X800XL is slow.

    "Ugly on medium textures? Go play the first DOOM or Wolfenstein 3D, and then come back and say Medium textures are ugly on Fear."

    No need to go back 10 years, chico. Anyone who has played Doom 3 (August 04), Half-Life 2 (November 04) or even Far Cry (March 04) will agree that Medium textures are "ugly" on Fear, although some may not use such strong language.


    "It's people like you that pay $400-600 for a graphics card, that is causing prices to inflate."

    Uh, no, genius, I paid less than $200 for my X800Pro at a fire sale. And then I overclocked the sh!t out of it, so now it's a little faster than a stock X800XL.


    "Then the engineers will have to actually OPTIMIZE the hardware and drivers, instead of just cranking out more raw GPU power."

    Uh, that's precisely the point of my post - sorta. FEAR is a horribly unoptimized, perhaps even poorly written, engine. In my opinion, it is unacceptable that an X800XL-class card should have so much trouble with it.

    So, what exactly was the point of your post anyways?
  • Pannenkoek - Thursday, October 20, 2005 - link

    On the aesthetics of FEAR: it would have been kind of the article's writers to include screenshots to support their judgment. As someone who has seen Unreal 2 rendered by a lousy GF2, I can understand the parent poster's point. Also, thanks for listening to my request to leave subjective opinions on the "playability" of a game out of benchmarks...
  • Kegh - Thursday, October 20, 2005 - link

    I played through the demo and thought the graphics were pretty good (considering my setup - 9700 Pro, AMD 2500). But more to your point... I have never been a big fan of Monolith's LithTech engine; every game or demo I have played that used it feels clunky, the controls always seem "off", and engine performance is generally not on par with the other 3D game engines available. To be fair, I haven't played enough of this game to bash the current engine that much, and once the game goes on sale, I will probably pick it up. But not until after I buy a new system. :)

  • michal1980 - Thursday, October 20, 2005 - link

    Come on now - high end with no SLI? Maybe the 7800GT/GTX x 2 could really shine for the first time?
  • Jackyl - Thursday, October 20, 2005 - link

    Tried the demo with my 9800 Pro 128MB, 2.2GHz 64-bit proc, and 1GB DDR. I ran the game at 1024x768 with no soft shadows, no AF/AA, and medium textures, and it ran great on my system - smooth, and it still looked great with medium textures.

    I'm not sure why the X800 GT got such a low framerate. Because of high textures? Maybe I'll try that on my card tonight.
  • Jedi2155 - Friday, October 21, 2005 - link

    At first, I was very skeptical that my friend's system could handle it, but it worked great and was perfectly smooth @ 1024x768 with medium details. And he's only running an AXP 1800+ with a Radeon 9800 Pro. So it can still work pretty well on older systems.
  • xsilver - Thursday, October 20, 2005 - link

    High-end textures absolutely kill the 9800 Pro - killed mine, anyways ;)
    If you let it autodetect the settings, it should run smooth. All the tests here were on max settings except for the aforementioned soft shadows and AA/aniso.

    With your settings in the demo (I had basically the same setup), the frames were good, but it still hitches when there are a lot of enemies/action going on.

    It would just be good for AT to test it, to compare apples with apples :)
  • Jackyl - Thursday, October 20, 2005 - link

    Actually, I had ATI's 2x AF turned on in the drivers.
