Fallout 3 Game Performance

Bethesda’s latest game uses an updated version of the Gamebryo engine that powered Oblivion. This benchmark takes place immediately outside Vault 101: the character walks away from the vault through the Springvale ruins, and the run is measured manually using FRAPS.

Fallout 3 - 1680 x 1050 - Medium Quality

Finally! We have a test where the Athlon II X3 435's clock speed advantage puts it ahead of the 620. If you're a gamer who also wants more cores, the 435 strikes a good balance: solid performance in existing games plus better-than-dual-core performance in well-threaded apps.

Left 4 Dead

Left 4 Dead - 1680 x 1050 - Max Settings (No AA/AF/Vsync)

I've got no complaints about the X3's performance in Left 4 Dead either; it's nearly as fast as the more expensive Core 2 Duo E7500 (and with a much tastier upgrade path).

FarCry 2 Multithreaded Game Performance

FarCry 2 ships with the most impressive benchmark tool we’ve ever seen in a PC game. Part of this is due to the fact that Ubisoft actually tapped a number of hardware sites (AnandTech included) from around the world to aid in the planning for the benchmark.

For our purposes we ran the CPU benchmark included in the latest patch:

Far Cry 2 - 1680 x 1050 - Playback (Action Scene) - Medium Quality

Even in our most heavily threaded game test, the X3 435 is a bit faster than the 620.

Crysis Warhead

Crysis Warhead - 1680 x 1050 - Mainstream Quality (Physics on Enthusiast) - assault bench

Comments
  • maddoctor - Tuesday, October 20, 2009 - link

Currently, Intel does not need any competitor. Intel will crush them as soon as possible with price cuts. This is their own fault; why are they not competitive in performance, and why can they not win the benchmarks?
  • fsdetained - Wednesday, October 21, 2009 - link

    Because they've done this so well with AMD for the last 20+ years... great logic little boy...
  • santiago321 - Tuesday, October 20, 2009 - link

    I am sure this joker has no idea about computers. If only designing GPU chips and killing competition was so simple.

    Tell you what, consider these two facts;

    1. Intel has been trying to kill AMD for last 40 years and still trying

2. It took Intel 5 years to move the northbridge onto the die after AMD did it in 2004.

    Tech forums are no paradise for jokers. Go home and sleep well
  • coldpower27 - Friday, October 30, 2009 - link

1. Intel isn't trying to kill AMD; they are trying to shackle AMD into the budget range and keep them there. AMD's sole purpose is to remain a second supplier and give the appearance of competition, so for Intel, AMD serves a purpose.

2. Intel didn't care about having a memory controller on their processors as they were already outperforming AMD's offerings without it. It isn't the be-all and end-all of performance, and it comes with disadvantages as well. Core 2 outperformed the Athlon 64s of the time and was even with equivalently clocked 45nm Phenoms; 65nm AMD had its own share of issues.
  • maddoctor - Tuesday, October 20, 2009 - link

What? Intel was never trying to kill AMD. This is only an AMDiot assumption. Ask Anand; he will never agree with you, AMDiot. This is the nature of capitalism. Money and product competitiveness are facts that you cannot deny.
  • fsdetained - Wednesday, October 21, 2009 - link

Shitty pun, knock it off. Intel has tried to crush AMD since 1986. Where have you been? Oh, right, you probably weren't even born when the 8086 and 8088 processors came out for the IBM PC, or even when the Am286 came out.
Money, product, and competitiveness are not facts; they are words you just threw out to sound like you know something. You have no clue about business; go back to your 6th grade classes.
  • yuhong - Tuesday, October 20, 2009 - link

    "2. It took intel 5 years to add Northbridge in the die after amd did that in 2004. "
Do you mean memory controller? BTW, Transmeta was able to put the northbridge on the die back in 1999.
  • AnnihilatorX - Tuesday, October 20, 2009 - link

My sarcasm meter's dial has turned through 3 full circles and landed on the negative spot.
  • StevoLincolnite - Tuesday, October 20, 2009 - link

    There is so much wrong with your post it's not funny, but I'll point some out.

    1) We have no idea how Larrabee will perform.

2) NVIDIA and ATI/AMD are not going anywhere soon.

3) Intel was beaten by ATI and NVIDIA in the graphics card market before, when Intel released the i740 AGP graphics card.

4) Given Intel's history of poor driver support for their graphics solutions, I don't have much faith in Larrabee being different. I hope I'm wrong though; the added competition would be awesome.

5) If Intel owned the chipset/CPU/graphics markets 100%, that would be -bad-: no competition to keep prices low. Remember when a decent computer would cost over 3 thousand bucks?

    6) Both ATI and nVidia have had YEARS of experience in the Graphics industry, they will not let a new player into that market without a fight.

7) I would -not- be happy if we lost any companies, especially AMD and NVIDIA; it would be all-round bad for everyone. I wouldn't have been able to build a stupidly cheap quad-core system (Athlon II X4 620) if it weren't for AMD.
  • pinguin - Tuesday, October 20, 2009 - link

    >>remember when a decent computer would cost over 3 thousand bucks?

I wouldn't mind paying $3000 for a decent computer, but please remember that even an indecent one would have set you back $1500 back then, while nowadays you can buy a $500 AMD rig.
