Final Words

Our first few weeks playing with PhysX have been a bit of a mixed bag. On one hand, the technology is really exciting, game developers are promising support for it, two games already benefit from it, and the effects in supported games and demos do look good. On the other hand, only two games really currently support the hardware, the extent of the added content isn't worth the price of the hardware, promises can be broken, we've observed performance issues, and hardware is only really as good as the software that runs on it.

Playing the CellFactor demo for a while, messing around in the Hangar of Doom, and blowing up things in GRAW and City of Villains is a start, but it is only a start. As we said before, we can't recommend buying a PPU unless money is no object and the games which do support it are your absolute favorites. Even then, the advantages of owning the hardware are limited and questionable (due to the performance issues we've observed).

Seeing City of Villains behave in the same manner as GRAW gives us pause about the capability of near term titles to properly support and implement hardware physics support. The situation is even worse if the issue is not in the software implementation. If spawning lots of effects on the PhysX card makes the system stutter, then it defeats the purpose of having such a card in the first place. If similar effects could be possible on the CPU or GPU with no less of a performance hit, then why spend $300?

Performance is a large issue, and without more tests to really get under the skin of what's going on, it is very hard for us to know if there is a way to fix it or not. The solution could be as simple as making better use of the hardware while idle, or as complex as redesigning an entire game/physics engine from the ground up to take advantage of the hardware features offered by AGEIA.

We are still excited about the potential of the PhysX processor, but the practicality issue is not one that can be ignored. The issues are twofold: can developers properly implement support for PhysX without impacting gameplay while still making the enhancements compelling, and will end users be able to wait out the problems with performance and the limited variety of titles until there are better implementations in more games?

From a developer standpoint, PhysX hardware would provide a fixed resource. Developers love fixed resources, as one of the most difficult aspects of PC game design is targeting a wide range of system requirements. While it will be difficult to decide how to best use the hardware, once the decision is made, there is no question about what type of physics processing resources will be afforded. Hopefully this fact, combined with the potential for expanded creativity, will keep game developers interested in using the hardware.

As an end user, we would like to say that the promise of upcoming titles is enough. Unfortunately, it is not by a long shot. We still need hard and fast ways to properly compare the same physics algorithm running on a CPU, a GPU, and a PPU -- or at the very least, on a (dual/multi-core) CPU and PPU. More titles must actually be released and fully support PhysX hardware in production code. Performance issues must not exist, as stuttering framerates have nothing to do with why people spend thousands of dollars on a gaming rig.

Here's to hoping everything magically falls into place, and games like CellFactor are much closer than we think. (Hey, even reviewers can dream... right?)

67 Comments

  • Tephlon - Wednesday, May 17, 2006 - link

    yeah, no. I know that Havok is doing generic physics. And light poles DO normally bend without the card. Cars do shake and explode. Cans can be kicked. All that stuff is normally there.
    I'm just saying the card seems to accentuate all of it. Not just more particles, but better explosions. Better ragdoll. Pots break a bit differently, etc.
    It was definitely there before, but I think it all looks better with the PhysX. My roommate said he noticed the difference as well. I let him borrow it for a while when I was at work.
    Again, I know I have no proof, at least not to show you atm... but to me it all seems better than before.
    If I get a chance I'll fraps a run through a level once with and once without, and throw the links up here. I personally have seen several sites' comparison vids, but I don't feel they show everything very well.
    Again, I'd heard it only adds particles to explosions, like you did, but I swear I can see the difference with everything.
    Anyone ever heard Ageia say EXACTLY what difference there is for GRAW with their card?
  • DerekWilson - Wednesday, May 17, 2006 - link

    Perception of an experience can be greatly affected by expectations. None of us are ever able to be 100% objective in all cases.

    That being said, in your first post you mention not "feeling" the performance impact. If you'll take a look at our first article on PhysX (and the comments) you will notice that I reported the same thing. There aren't really huge slowdowns in GRAW; only one or two frames suffer. If the errant frame(s) took a quarter of a second to render, we would definitely notice it. But while an AVERAGE of 15 frames per second can look choppy, having a frame or two take 0.066 seconds to render is not going to significantly impact the experience.

    Minimum framerates are important in analysing performance, but they are much more difficult to properly understand than averages. We want to see high minimum framerates because we see that as meaning less slow-down or stutter. But generally (in GPU-limited situations) minimum framerates aren't outliers to the data set -- they mark a low point where the framerate dips down for a good handful of frames. In the case of GRAW with PhysX, the minimum is really not contiguous with the performance of the rest of the frames.

    CoV is another story. The framerate drops several times and we see stuttering. It's definitely something easily "felt" during gameplay. But CoV Issue 7 is still beta, so we might see some performance improvements when the code goes live.
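The distinction Derek draws between one errant frame and a uniformly low average is easy to make concrete. A minimal sketch with hypothetical frame times (only the 0.066-second figure comes from the comment above; everything else is invented for illustration):

```python
# Hypothetical frame-time traces (only the 0.066 s figure comes from the
# discussion above; everything else is invented for illustration).

def average_fps(frame_times):
    """Average framerate over a run: frames rendered / seconds elapsed."""
    return len(frame_times) / sum(frame_times)

# A smooth 60 fps run with a single errant 0.066 s frame:
smooth_with_hiccup = [1 / 60] * 99 + [0.066]

# A run that is uniformly choppy at 15 fps:
uniformly_choppy = [1 / 15] * 100

print(round(average_fps(smooth_with_hiccup), 1))  # 58.3 -- barely dented
print(round(average_fps(uniformly_choppy), 1))    # 15.0 -- visibly choppy
print(round(1 / max(smooth_with_hiccup), 1))      # 15.2 -- the "minimum fps" outlier
```

Both runs would report a minimum of roughly 15 fps, but only the second one feels choppy in play, which is why a lone minimum-framerate number can mislead.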
  • Tephlon - Wednesday, May 17, 2006 - link

    Derek, I totally agree. I wasn't arguing about anything technical in the article, or the bearing minimum fps has on the 'feel' or 'playability'. It just doesn't seem like most readers (here and elsewhere) understand it. I also won't hide the fact that I DO WANT this tech to succeed, partly because I heard them speak at QuakeCon and I liked what I heard/saw, and partly because I've dropped $300 in good faith that my early adoption will help the cause and push us forward in the area of true physics in gaming. And even though my perception is undoubtedly affected by my expectations, it's not entirely misled either. Even with my bias I can be realistic and objective. If this thing did nothing for visuals/gameplay and killed my experience with crappy performance, I'd of course have a different opinion on the matter.

    I was simply saying that readers seem to lose sight of the big picture. Yeah, it's in the rough stages. Yeah, it only works with a few games. I'm not here to pitch slogans and rants to make you buy it, I just wanted people to understand that the device 'as it is now' isn't without its charm. It seems the only defense that's brought up for the card is that the future could be bright. It DOES have some value now, if you're objective about it and not out to flame it immediately. I like what it does for my game, even if it's not revolutionary. I just hope that there are enough people objective enough to give this company/card a chance to get off the ground. I DO think it's better for the industry if the idea of a separate physics card can get off the ground.
    I dunno, maybe I see too much of 3DFX in them, and it gets me nostalgic.
    Again, Derek, I wasn't knocking the report at all, and I hope it wasn't taken that way. I think it said just what it was supposed to, or even could, say. I was more trying to give the readers a balanced look at the card on the use side, since straight numbers send people into frenzies.

    Did all that get to what I was trying to convey? I dunno, I confuse myself sometimes. I wasn't meant to be an author of ANYTHING. In any case, good luck to you.
    Good luck to all.
  • DerekWilson - Wednesday, May 17, 2006 - link

    lol, I certainly didn't take it as a negative commentary on anything I said. I was trying to say that I appreciate what you were saying. :-)

    At a basic level, I very much agree with your perspective. The situation does resemble the 3dfx era with 3d graphics. Hardware physics is a good idea, and it would be cool if it ends up working out.

    But is this part the right part to get behind to push the industry in that direction?

    AnandTech's first and foremost responsibility is to the consumer, not the industry. If the AGEIA PhysX card is really capable of adding significant value to games, then its success is beneficial to the consumer. But if the AGEIA PhysX card falls short, we don't want to see anyone jump on a bandwagon that is headed over a cliff.

    AGEIA has the engine and developer support to have a good chance at success. If we can verify their capabilities, then we can have confidence in recommending purchasing the PhysX card to people who want to push the agenda of physics hardware. There is a large group of people out there who feel the same way you do about hardware and will buy parts in order to benefit a company or industry segment. If you've got the ability and inclination, that's cool.

    Honestly, most people that go out and spend $300 on a card right now will need to find value in something beyond what has been added in GRAW, CoV, and the near-term games. If we downplayed the impact of the added effects in GRAW and CoV, it's because the added effects are nowhere near worth the $300 they cost. It is certainly a valid perspective to look towards the future. You have the ability to enjoy the current benefits of the hardware, and you'll already have the part when future games that make more compelling use of the technology come out.

    We just want to make sure that there is a future with PhysX before we start jumping up and down screaming its praises.

    So ... I'm not trying to say that anything is wrong with what you are saying :-)

    I'm just saying that AnandTech has a heavy responsibility to its readers to be more cautious when approaching new markets like this. Even if we would like to see it work out.
  • Tephlon - Thursday, May 18, 2006 - link

    true. I do get your point.

    And again, you're right. With a more balanced perspective on the matter, I sure can't see you suggesting a 300 dollar piece of hardware on a hunch either. I do respect how your articles are based on what's best for the little guy. I think I'd honestly have to say, if you were to suggest this product now AS IF it were as good as sliced bread... I would be unhappy with my purchase based on your excitement for it.
    teheh. Yeah, you made the right call with your article.
    Touché, Derek. TOUCHÉ.

    thehe. I guess not everyone can gamble the $300, and that's understandable. :-(

    Like I said... here's hopin'. :-D
  • RogueSpear - Wednesday, May 17, 2006 - link

    I'm not an expert on CPUs, but all of this has me wondering - isn't physics type code the kind of mathematical code that MMX and/or SSE and their follow-ons were supposed to accelerate? I'm sure physics was never mentioned way back then, but I do remember things like encryption/decryption and media encoding/decoding as being targets for those technologies. Are game developers currently taking advantage of those technologies? I know that to a certain point there is parity between AMD and Intel CPUs as far as compatibility with those instruction sets.
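RogueSpear's hunch is right: physics code is exactly the kind of data-parallel arithmetic that MMX/SSE-style instruction sets target. As a toy illustration (all names and numbers here are invented for the example), here is a vectorized particle update in NumPy, whose array operations map this "same operation across many values" pattern onto SSE-style units where the hardware supports them:

```python
import numpy as np

# Toy explosion: integrate many debris particles under gravity in one
# vectorized step -- the "one operation across many values" pattern that
# MMX/SSE and their successors accelerate on the CPU.

def step(positions, velocities, dt, gravity=np.array([0.0, -9.81, 0.0])):
    """Semi-implicit Euler update applied to every particle at once."""
    velocities = velocities + gravity * dt   # one broadcast add per axis
    positions = positions + velocities * dt
    return positions, velocities

rng = np.random.default_rng(0)
pos = np.zeros((10_000, 3))                   # 10k particles at the origin
vel = rng.normal(scale=5.0, size=(10_000, 3)) # flung outward at random

for _ in range(60):                           # one second at 60 updates/s
    pos, vel = step(pos, vel, dt=1 / 60)
```

Whether mid-2000s game engines actually used these code paths for their physics varied by engine; the instruction sets make it possible, but someone still has to write (or compile to) the vectorized code.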
  • apesoccer - Wednesday, May 17, 2006 - link

    Seems like this was a pretty limited review... Were you guys working with a timetable? Like 4 hrs to use this card or something?

    I think I would have tried more than just single-core CPUs... since we're heading mostly towards multicore CPUs. I also would have run tests at the same level (if possible; it feels like we're intentionally being kept in the dark here) to compare software and hardware with the same number of effects, at different levels and resolutions... At low res, you're maxing the CPU out, right? Well, then if the PPU uses 15% of the CPU but outputs 30% more effects, you're being limited by the CPU even more... You should see greater returns the higher the resolution you go, since you're maxing your GPUs out more (rather than the CPUs) at higher res. All of this is moot if the overhead CPU usage of the PPU can be run on a second CPU core... since that's where the industry is headed anyway. And making software/hardware runs on a dual core should give us a better idea of whether or not this card is worth it.
  • peternelson - Wednesday, May 17, 2006 - link

    To the people who say it's a decelerator: it is a little slower, but it is NOT doing the same amount of work. The visual feast is better in the hardware-accelerated game than without the card. But we need a way to quantify that extra, as plain "fps" ignores it.

    Second, AnandTech, PLEASE get yourselves a PCI bus analyser; it need not be expensive. I want to know the % utilisation on the PCI bus. At 32-bit/33 MHz the potential max is 133 MByte/sec.

    How much of that is being used to talk to and from the PhysX card, and is it a bottleneck that would be solved by moving to PCI Express? Also, in your demo setups, considering what peripherals you are using, are you hogging some of the PCI bandwidth for (say) a PCI-based soundcard etc., which would be unfair on the PhysX card?

    ALSO one of the main purposes of THIS review I would say is to COMPARE the ASUS card with the BFG card. You don't seem to do that. So assuming I want a physx card, I still don't know which of the two to buy. Please compare/contrast Asus vs BFG.
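The 133 MByte/sec ceiling peternelson cites is just bus width times clock speed. A quick sanity check of that figure, with a hypothetical sound-card load thrown in for scale (these are the standard 32-bit/33 MHz PCI numbers, not measurements from the cards in this review):

```python
# Back-of-the-envelope PCI bandwidth check. Standard 32-bit/33 MHz PCI
# figures -- not measured numbers from this review.

bus_width_bytes = 32 // 8                     # 32-bit shared parallel bus
clock_hz = 33_333_333                         # ~33.33 MHz PCI clock
peak_bytes_per_sec = bus_width_bytes * clock_hz
print(round(peak_bytes_per_sec / 1e6, 1))     # 133.3 MB/s theoretical peak,
                                              # shared by EVERY device on the bus

# Hypothetical PCI sound card streaming 48 kHz, 16-bit stereo audio:
audio_bytes_per_sec = 48_000 * 2 * 2
share = 100 * audio_bytes_per_sec / peak_bytes_per_sec
print(round(share, 2))                        # ~0.14% of the bus
```

Steady audio streaming barely dents the bus on its own, so any unfairness to the PhysX card would more likely come from bursty transfers and arbitration overhead than raw audio bandwidth; only a bus analyser, as suggested, would settle it.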
  • DerekWilson - Wednesday, May 17, 2006 - link

    honestly, the ASUS and BFG cards perform identically, pull about the same amount of power, and produce similar levels of noise.

    If you are trying to decide, buy the cheaper one. There aren't enough differences to make one better than the other (unless blue LEDs behind fans really do it for you).

    We didn't do a more direct comparison because we have an engineering sample ASUS part, while our BFG is full retail. We generally don't like to make direct comparisons with preproduction hardware in anything other than stock performance. Heat, noise, power, pcb layout, and custom drivers can all change dramatically before a part hits retail.

    We will look into the pci bus utilization.
  • peternelson - Wednesday, May 17, 2006 - link


    Thanks, so I will have to choose based on features, like the nice triangle box on the BFG ;-)

    In gaming on older machines where both the sound and network and possibly other things are all on the same PCI bus, then either the physx or the other stuff could suffer from bus contention.

    I hope you can either ask or do some analysing to watch the amount of traffic there is.
