More on Mixed-Mode for NV3x

We briefly mentioned the Mixed Mode of operation for NV3x GPUs that Valve implemented in Half-Life 2, but there is much more to it than just a special NV3x code path. In fact, the mixed mode NV3x code path was really only intended for the GeForce FX 5900 Ultra (NV35). The mainstream FX chips (5200/5600) require a slightly different code path.


Here you can see the 40% performance boost NVIDIA gets from the special NV3x code path.

The GeForce FX 5600 (NV31) uses a code path that is internally referred to as dx82; this path is a combination of DX9 (pixel shader 2.0) and DX8.1 (pixel shader 1.4) code, and thus, doesn't look as good as what you'll see on the 5900 Ultra.

Although the 5900 Ultra performs reasonably well with the special NV3x mixed mode path, the 5600 and 5200 cards do not perform well at all. Valve's recommendation to owners of 5600/5200 cards is to run the DX8 (pixel shader 1.4) code path in order to receive playable performance under Half-Life 2. The performance improvement gained by dropping to the DX8 code path is seen most on the GeForce FX 5200, although there is a slight improvement on the 5600 as well, as you can see below:

The sacrifices that you encounter by running either the mixed mode path or the DX8 path are obviously visual. The 5900 Ultra, running in mixed mode, will exhibit some banding effects as a result of a loss in precision (FP16 vs. FP32), but still looks good - just not as good as the full DX9 code path. There is a noticeable difference between this mixed mode and the dx82 mode, as well as the straight DX8 path. For example, you'll notice that shader effects on the water aren't as impressive as they are in the native DX9 path.
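The banding from the FP16/FP32 precision drop can be illustrated outside of any graphics API. The sketch below is a minimal illustration in plain Python (using the standard library's IEEE 754 half-precision `'e'` format in `struct`), not anything from Valve's code: it round-trips a smooth 12-bit gradient through FP16 storage and counts how many distinct shades survive. The gradient size and helper name are illustrative assumptions.

```python
import struct

def to_fp16(x):
    """Round-trip a Python float through IEEE 754 half precision.

    struct's 'e' format packs to 16-bit binary16, so the value that
    comes back is exactly what FP16 storage would hold.
    """
    return struct.unpack('<e', struct.pack('<e', x))[0]

# A smooth 12-bit gradient: 4096 evenly spaced shades in [0, 1].
shades = [i / 4095 for i in range(4096)]

# Store every shade at FP16 and count the distinct values that survive.
surviving = sorted({to_fp16(s) for s in shades})
print(f"input shades:     {len(shades)}")
print(f"distinct in FP16: {len(surviving)}")

# Near 1.0 the FP16 step is 2**-11 (~0.00049), coarser than the
# gradient's own step of 1/4095 (~0.00024), so neighboring shades
# collapse into the same stored value -- visible as banding.
print(len(surviving) < len(shades))
```

The same spacing argument explains why the artifact shows up first in smooth gradients such as sky, fog, and water highlights: those are the regions where adjacent pixel values differ by less than one FP16 step.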

Are the visual tradeoffs perceptible? Yes. The native DX9 path clearly looks better than anything else, especially compared to the DX8.0/8.1 modes.

Comments

  • uturnsam - Friday, November 28, 2003 - link

    #110 continued
    Now I know why the guy behind the counter told me to steer clear of the ATI Radeon cards because of the known compatibility problems when running games.

    (Computer sales guy thinking-I just read the article in the AnandTech post)

    Translated: I have a shitload of Nvidia cards, and if I don't lie my ass off to my customers it will be game over for me!!!

    The only reason I started looking at ATI cards was that I decided to spend what I saved on the CRT monitor (over the $$LCD) on a higher-performing card. Mr $Sales$ had me convinced I would be buying an inferior card with ATI. Worth shopping around and scouring reviews :O)
  • uturnsam - Friday, November 28, 2003 - link

    I was going to buy a GeForce 5600, but I looked at a 9600 Pro today. The thing is, I was wondering if I should really blow the budget and lash out on a 9800 Pro.
    I am so glad I came across this article. I will stick with the 9600 Pro, save some cash, sleep better at night, and know that when Half-Life 2 is released I will be getting the best performance for the outlay.

  • Anonymous User - Thursday, October 16, 2003 - link

    You can count on your 9500 falling in between the 9800 and the 9600, with frame rates about 30% above the 9600's. The 4 pipelines will help.
  • Anonymous User - Tuesday, September 30, 2003 - link

    I would like to see a test of the dx8 paths on some of the really older cards for those of us who are too broke for these new ones!!

    For instance, I have a GeForce2 GTS that I love very much and that works just fine on everything else. I don't want to have to upgrade for one game.
  • Anonymous User - Sunday, September 21, 2003 - link

    I would like to see how they compare with a 5900 using the Detonator 44.03 driver. Yes, I know it's an older driver, but in my tests it provided higher benchmarks than the 45.23 driver.

    Has anybody else noticed this?
  • Anonymous User - Friday, September 19, 2003 - link

    So actually NVIDIA's shaders (FP16/FP32) are not comparable with ATI's shaders (FP24, the MS DX9 standard)! Too bad that one way or another they try to cheat again and again...
    Very bad idea!
  • Anonymous User - Tuesday, September 16, 2003 - link

    #104, the benchmarks and Anand's analysis show that HL2 is GPU power limited, not memory/fillrate limited... the 9600 will be limited more by that than by memory or fillrate.
  • Anonymous User - Monday, September 15, 2003 - link

    I think #84 mentioned this, but I didn't see a reply. In the benches, the 9600 pro pulled the exact same (to within .1 fps, which could just be roundoff error) frame rates at 1024 and 1280.

    I don't think I've ever seen a card bump up res without taking a measurable hit (unless it was cpu-limited). In every other game, the 9600 takes a hit going from 1024 to 1280. And the 9700 and 9800 slow down when the resolution goes up, even though they're basically the same architecture. Someone screwed up, either the benchmarks or the graphs.
  • Anonymous User - Monday, September 15, 2003 - link

    #61, did you take the time to see that Valve limited their testing time? AnandTech had no say in all the tests because they were very time limited. Also, try to make coherent sentences.
  • Anonymous User - Sunday, September 14, 2003 - link

    It's not as if GIFs gobble bandwidth, I (as CAPTAIN DIALUP) don't even notice them loading. They're tiny. Even though I don't have trouble receiving this Flash stuff, it pisses me off, because sometimes the same scores will load for all the pages. Why not have a poll or something on this?
