F.E.A.R. Performance

[Charts: F.E.A.R. frame rates, power consumption, and performance per watt]

With the latest 1.08 patch, F.E.A.R. has gained multi-core support, potentially using up to four CPU cores to deliver improved performance. We were able to confirm a performance increase with Core 2 Duo, and we will try to look at whether or not Core 2 Quad helps in the near future. Either way, this means that we should now be completely GPU limited in our F.E.A.R. testing.

The new GeForce 8800 GTX still manages to come out faster than the competition, but this time a single 8800 GTX is not able to surpass the performance of dual X1950 XTX cards (or 7900 GTX SLI, for that matter). Quad SLI also makes a decent showing in this particular benchmark, taking second place except at the highest resolution. Meanwhile, the GeForce 8800 GTS doesn't fare as well, only managing to tie the X1950 XTX, and it even loses that battle at 2560x1600. F.E.A.R. is a game that can use a lot of memory bandwidth, so the 2GHz GDDR4 memory on the X1950 XTX is likely helping out here.

If money isn't a concern, 8800 GTX SLI will finally allow you to play F.E.A.R. at 2560x1600 with 4xAA without dropping below 30 FPS. Is that really necessary? Probably not for most people, but if a similar situation exists in other games, the investment becomes a bit easier to justify.

Comments

  • haris - Thursday, November 9, 2006 - link

    You must have missed the article they published the very next day (http://www.theinquirer.net/default.aspx?article=35...) saying they goofed.
  • Araemo - Thursday, November 9, 2006 - link

    Yes I did - thanks.

    I wish they had updated the original post to note the mistake, as it is still easily accessible via Google. ;) (And the 'we goofed' post is only shown when you drill down for more results.)
  • Araemo - Thursday, November 9, 2006 - link

    In all the AA comparison photos of the power lines, with the dome in the background, why does the dome look washed out in the G80 images? I'm only on page 12, so if you explain it after that... well, I'll get there eventually. ;) But is it just a driver glitch, or an IQ problem with the G80 implementation of AA?
  • bobsmith1492 - Thursday, November 9, 2006 - link

    Gamma-correcting AA sucks.
  • Araemo - Thursday, November 9, 2006 - link

    That glitch exists whether gamma-correcting AA is enabled or disabled, so that isn't it.
  • iwodo - Thursday, November 9, 2006 - link

    I want to know if these power-hungry monsters have any power-saving features. What happens if I am only using Windows most of the time? After all, CPUs have much better power management when they are idle or doing little work. Will I have to pay a higher electricity bill simply because I am a casual gamer with a power-hungry GPU?

    Another question that popped into my mind: with CUDA, would it now be possible for a third party to program an H.264 decoder running on the GPU? Sounds good to me :D (see the sketch after the comments)
  • DerekWilson - Thursday, November 9, 2006 - link

    oh man ... I can't believe I didn't think about that ... a video decoder would be very cool.
  • Pirks - Friday, November 10, 2006 - link

    A decoder is not that interesting, but an MPEG-4 ASP/AVC ENCODER on the G80 GPU... man, I can't imagine AVC or ASP encoding IN REAL TIME... wow, just wooowww
    I'm holding my breath here
  • Igi - Thursday, November 9, 2006 - link

    Great article. The only thing I would like to see in a follow-up article is a performance comparison in CAD/CAM applications (Solidworks, ProEngineer, ...).

    BTW, how noisy are the new cards in comparison to the 7900 GTX and others (at idle and under load)?
  • JarredWalton - Thursday, November 9, 2006 - link

    I thought it was stated somewhere that they are as loud (or quiet if you prefer) as the 7900 GTX. So really not bad at all, considering the performance offered.
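
Picking up on the CUDA video-decoding idea from the comment thread above: below is a minimal sketch of how one stage of H.264 decoding could be mapped onto the GPU with CUDA. This is purely illustrative, not from the article or NVIDIA's SDK; it assumes entropy decoding and inverse quantization have already happened on the CPU, and the kernel name (idct4x4_kernel) is invented. The 4x4 inverse integer transform is a natural candidate because every residual block can be processed independently, one thread per block.

```cuda
// Hypothetical illustration of GPU-assisted H.264 decoding with CUDA.
// Assumes residual coefficients are already entropy-decoded and
// inverse-quantized on the CPU; the kernel name is invented.
#include <cstdio>
#include <cuda_runtime.h>

// Each thread applies the H.264 4x4 inverse integer transform
// to one block of residual coefficients, in place.
__global__ void idct4x4_kernel(short *coeffs, int numBlocks)
{
    int b = blockIdx.x * blockDim.x + threadIdx.x;
    if (b >= numBlocks) return;

    short *d = coeffs + b * 16;  // this thread's 4x4 block
    int tmp[16];

    // Horizontal pass (rows): the standard inverse-transform butterflies.
    for (int i = 0; i < 4; ++i) {
        int d0 = d[i*4+0], d1 = d[i*4+1], d2 = d[i*4+2], d3 = d[i*4+3];
        int e0 = d0 + d2;
        int e1 = d0 - d2;
        int e2 = (d1 >> 1) - d3;
        int e3 = d1 + (d3 >> 1);
        tmp[i*4+0] = e0 + e3;
        tmp[i*4+1] = e1 + e2;
        tmp[i*4+2] = e1 - e2;
        tmp[i*4+3] = e0 - e3;
    }
    // Vertical pass (columns), then round and scale down by 64.
    for (int j = 0; j < 4; ++j) {
        int d0 = tmp[0*4+j], d1 = tmp[1*4+j], d2 = tmp[2*4+j], d3 = tmp[3*4+j];
        int e0 = d0 + d2;
        int e1 = d0 - d2;
        int e2 = (d1 >> 1) - d3;
        int e3 = d1 + (d3 >> 1);
        d[0*4+j] = (short)((e0 + e3 + 32) >> 6);
        d[1*4+j] = (short)((e1 + e2 + 32) >> 6);
        d[2*4+j] = (short)((e1 - e2 + 32) >> 6);
        d[3*4+j] = (short)((e0 - e3 + 32) >> 6);
    }
}

int main()
{
    const int numBlocks = 1 << 16;  // e.g. residual blocks for part of a frame
    const size_t bytes = numBlocks * 16 * sizeof(short);

    short *h = (short *)malloc(bytes);
    for (int i = 0; i < numBlocks * 16; ++i) h[i] = (short)((i % 7) - 3);

    short *dCoeffs;
    cudaMalloc(&dCoeffs, bytes);
    cudaMemcpy(dCoeffs, h, bytes, cudaMemcpyHostToDevice);

    int threads = 256;
    int grid = (numBlocks + threads - 1) / threads;
    idct4x4_kernel<<<grid, threads>>>(dCoeffs, numBlocks);

    cudaMemcpy(h, dCoeffs, bytes, cudaMemcpyDeviceToHost);
    printf("first block sample: %d\n", h[0]);
    cudaFree(dCoeffs);
    free(h);
    return 0;
}
```

Sequential, branch-heavy stages such as CABAC entropy decoding are a much poorer fit for the GPU, so a practical third-party decoder would likely split the pipeline along roughly these lines, keeping the bitstream parsing on the CPU and the data-parallel transform and filtering stages on the GPU.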
