Introduction

Until now, we haven't had the pleasure of playing with a midrange part based on current-generation technology. Those who have wanted good performance at lower prices have gone with older cards that have fallen in price. This is all well and good, but buying a high-end card from a previous generation instead of a midrange card built on current technology means losing out on the new and improved features of the latest architectures. This is especially pertinent in light of NVIDIA's Shader Model 3.0 support. Generally, anything that can be done in SM3.0 can also be done in SM2.0, but SM3.0 offers reduced code complexity and (sometimes) performance improvements. We've already seen examples of this in our SM3.0 analysis of Far Cry.


The NV43 GPU behind the 6600 GT

With the new 6600 line of cards, NVIDIA is also bringing out its first natively PCIe GPUs. We are told that these will quickly be bridged back to AGP, and the sooner we see the AGP version, the better: even if PCI Express platforms had better market share right now, the niche the 6600 series proposes to fill is one that could appeal to just about everyone who uses a computer. The keys behind the 6600 series (aside from its feature set) are performance and price point. Every aspect of the 6600 series falls in line to offer a card that promises amazing value.

But we don't care about promises here. We will take a handful of the latest and greatest games from across the spectrum (with a heavy focus on PS2.0), and we'll see how well the newest member of the NVIDIA family performs. As far as competition goes, we'll stack it up against current and previous-generation ATI and NVIDIA cards, and we'll include ATI's current midrange PCIe card, the X600 XT. This isn't meant to be a direct comparison, as the X600 is still based on previous-generation ATI technology. We will make a bigger deal of the ATI/NVIDIA midrange comparison when we have a midrange R4xx desktop part in our hands.

Comments

  • SuperDuper28 - Monday, September 13, 2004 - link

    Anyone got any info on a release date? The only thing I have seen is "sometime in Sep," but I don't see any place to preorder or any manufacturers advertising it.

    As far as benchmarks go: I don't care how much memory it has, how many pipes, how many bits, what the core and memory are clocked at, etc. I've seen lots of different benches at different sites with different rigs, and in all of them it performs close to a 6800 and X800, and in some cases slightly outperforms them. I don't know how to explain it, but that's the way it is, and $200 for this card compared to $400+ for the others is a no-brainer.
  • Thatguy97 - Saturday, May 9, 2020 - link

    Corona will kill us all
  • JarredWalton - Thursday, September 9, 2004 - link

    Actually, mickyb, it's not so much that NVIDIA neglected DX9 performance as it is ATI neglecting OpenGL performance. If a game doesn't make any use of the special features of the GF6 cards (i.e. SM3.0 extensions), the X800 XT PE has a very large advantage in pixel rates and vertex rates. They're both 16x1 pixel and 6 vertex designs, but the ATI card runs at 520 MHz compared to 450 MHz on the 6800UE. That's a 15% performance advantage in clock speed, which correlates pretty well with the Source Stress Test results. So, I agree with you to a point, but really ATI just needs to get their OpenGL drivers up to par.
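
As a quick illustration of the clock-speed argument in the comment above, here is a back-of-the-envelope sketch in Python. It uses only the figures quoted there (16 pixel pipes on both chips, 520 MHz vs. 450 MHz); the outputs are theoretical peaks for illustration, not measured results.

```python
# Theoretical peak pixel fill rate from the figures quoted above.
# Illustrative only: real-world performance also depends on memory
# bandwidth, drivers, and the game engine.

cards = {
    "X800 XT PE": {"core_mhz": 520, "pixel_pipes": 16},
    "6800 Ultra Extreme": {"core_mhz": 450, "pixel_pipes": 16},
}

for name, c in cards.items():
    gpix = c["core_mhz"] * c["pixel_pipes"] / 1000.0  # gigapixels per second
    print(f"{name}: {gpix:.1f} GPix/s peak pixel fill")

# With identical pipeline counts, the gap reduces to the clock delta:
advantage = (520 / 450 - 1) * 100
print(f"X800 XT PE clock advantage: {advantage:.1f}%")  # ~15.6%
```
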
  • mickyb - Wednesday, September 8, 2004 - link

    **** WARNING: RANT ****
    I am spending way too much time on this article, but every time I read the benchies on the 6800 and X800 series, I keep asking what is going on with NV and DirectX, and ATI and OpenGL? I have a hard time believing that the two APIs are so different that the hardware has to be designed for them. I think NV spent way too much time in the closet with Quake III and ATI in their closet with HL-2. Both engines are going to have a presence in other games, and I don't want to have to pick a video card for the games that I play. If this continues, it will turn into the same war that the consoles have. I chose a PC so I can play all games. Maybe this will all correct itself after the next driver releases.
  • JarredWalton - Wednesday, September 8, 2004 - link

    I don't have any insider information, and I haven't signed any NDAs, so I can say that NF4 is probably due out within the next few months. That could be wrong, of course.

    There will be a 6600 vanilla part, Visual, but due to the use of different RAM and other factors, we do not have benches for that yet. It was mentioned on the second page, fourth paragraph. I'm sure they're coming soon.

    Finally, regarding the 6800 vanilla vs. 6600GT, there are two things to consider. First, how much faster is the 6800 when running on the same system? Generally, it ranges from slightly faster to as much as 20% faster when the system is essentially the same, so 50% more money for up to 20% more performance probably isn't worth it to most people. The bigger problem, however, is that the 6600GT is a PCIe part. There should be AGP bridged versions in the future, but how far off are they? After the "paper launch" of the X800 and 6800 cards, I wouldn't expect an AGP 6600GT for another six weeks at best.

    I wouldn't be surprised to see NF4 and SLI and 6600GT all reach widespread availability around the same time. Well, that's what I would try to shoot for if I were Nvidia, anyway. If NF4 gets delayed due to technical issues, though, who knows?
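
To put the "50% more money for up to 20% more performance" point from the comment above in perspective, here is a tiny, purely illustrative Python sketch. The 1.5x price and 1.2x performance ratios are the ones quoted in that comment, normalized to the 6600GT; they are not measurements.

```python
# Relative value comparison using the ratios quoted above, with the
# 6600GT normalized to 1.0 on both axes. Illustrative only.

gt_price, gt_perf = 1.0, 1.0            # 6600GT baseline
nv6800_price, nv6800_perf = 1.5, 1.2    # ~50% pricier, at most ~20% faster

print(f"6600GT performance per dollar: {gt_perf / gt_price:.2f}")       # 1.00
print(f"6800   performance per dollar: {nv6800_perf / nv6800_price:.2f}")  # 0.80
```
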
  • Visual - Wednesday, September 8, 2004 - link

    Oooh, reading mickyb's post, imagining two of these things, each with a TV tuner, with the SLI bridge between them... mmmm :)
    The 3D performance would be perfect for today, and you'd have 4 monitors (or 2 monitors/2 TVs) max... if it could work without disabling NF4's IGP, 5/6 monitors, just trashing ATI's HydraVision!
    Two tuners giving you picture-in-picture or recording independently of what you're watching... Or one of the cards could have a digital tuner and the other an analog one... there'd be something for everyone :)

    I'm drooling :)
    Visual

    P.S. Quite off-topic, but what's known about nVidia's plans for NF4, and is it expected soon?
  • Visual - Wednesday, September 8, 2004 - link

    Seeing this card named "GT" I wonder if there will be cheaper versions of it... any info on that from nVidia?

    It'd be good to see a comparison between this card, the cheaper X600 XT, and the slightly more expensive vanilla 6800, all on the same system. I'm pretty sure you guys won't bother to make another article for this, but if you see something on the web, I hope you'll post it in the news section :)

    Also, I wonder if the 6800 is worth the extra cash, in your opinion. I was almost decided on getting one, but this article makes me consider waiting again...

    Thanks for the article,
    Visual
  • mickyb - Wednesday, September 8, 2004 - link

    Not sure any of this is viable, but going along with my previous comment, this looks like it would be a good candidate for SFF systems. It provides good performance at the right price. Furthermore, it supports SLI. For non-SFF systems, it is a nice way to incrementally reach the performance your games require. In practice, it is always a challenge to approach any performance target incrementally, because by the time you are ready to step up, it is time to get a whole new system. Alright, back to SFF. It would be cool if this were the onboard graphics candidate for NF4. Couple this chip with video-in and it could finally compete with ATI in the AIW space. Currently, I would always go for ATI, due to having to weigh performance against all-around use. Now here is where I think NV could have something: since this chip supports SLI, it would be really something if the NF4 had this video onboard, and if you wanted a little more power, you could just add another card. I don't know about you all, but I don't like buying a nice MB just to have to disable an almost-passable video card because it couldn't run the next game.
  • JarredWalton - Wednesday, September 8, 2004 - link

    First, perhaps the disclaimer wasn't big enough, but running the tests on different platforms is never an optimal solution. This can and will affect the results in various instances. A 3400+ is a very fast chip, but the 3.4 GHz P4EE is still going to beat it in many situations. However, going back and running hundreds of benchmarks on different platforms really isn't an option. At some point in the future, I'm sure there will be a more equal roundup of the cards, but it's just an approximate measurement of performance anyway.

    If you look at the "performance estimates" in the GPU Cheatsheet article we recently published, you can see some additional support for why the 6600 GT would sometimes beat the 6800 vanilla. In pixel performance, the raw clock speed of the 6600GT puts it just a fraction faster than the 6800, while it's slower in bandwidth by a relatively large amount. They're also pretty close in vertex processing power. Change the CPU from a 2.2 GHz 1 MB L2 A64 to a 3.4 GHz 2 MB L2 P4EE and the closeness of the cards can end up pushing the GT ahead in cases where the P4EE is faster.

    (I may have the wrong number of vertex pipelines as well, which could mean that the 6600GT has substantially more vertex processing power. Any word on that Derek? I had 3 vertex pipelines as "rumor" on the 6600 chips, but the benchmarks seem to indicate that it might have more.)

    Without knowing the internal structure of the chips and cards, we can't say for sure, but there could be occasions where an 8x1 chip makes more efficient use of its resources than a 12x1 chip. Perhaps certain latencies are higher on the NV40 than on the NV43, or maybe a 12x1 pipeline setup just taxes the memory in a different fashion.

    Relative to the older generation hardware, it should come as no surprise that a 500 MHz 8x1 design is generally faster than a 412 MHz 8x1 (9800XT) or a handicapped 475 MHz 4x2 design (5950U). At lower resolutions, the memory bandwidth advantage isn't as much of a factor, and the 6600 has a superior architecture. In bandwidth-hungry games, the 6600GT would likely lose out, but in PS/VS-heavy operations, the raw clock speed will be very beneficial.
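
For reference, the same back-of-the-envelope math applied to the cards named in this comment. The clocks and pipeline configurations for the 6600GT, 9800XT, and 5950U are the ones quoted above; the vanilla 6800's 325 MHz, 12x1 configuration is an assumption based on commonly cited specs rather than something stated in this thread, and all of the outputs are theoretical peaks.

```python
# Peak pixel fill rate = core clock (MHz) x pixel pipelines.
# Figures for the 6600GT, 9800XT, and 5950U come from the comment above;
# the vanilla 6800's 325 MHz / 12 pipes is an assumed, commonly cited spec.

cards = {
    "6600GT (500 MHz, 8x1)": (500, 8),
    "6800 (325 MHz, 12x1, assumed)": (325, 12),
    "9800XT (412 MHz, 8x1)": (412, 8),
    "5950U (475 MHz, 4x2)": (475, 4),   # 4 pipes with 2 texture units each
}

for name, (mhz, pipes) in cards.items():
    print(f"{name}: {mhz * pipes / 1000.0:.2f} GPix/s peak pixel fill")

# Note: memory bandwidth cuts the other way, as the comment points out --
# in bandwidth-hungry games the 6600GT loses ground despite its clock advantage.
```
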
  • kmmatney - Tuesday, September 7, 2004 - link

    What! No Quake3 benchmarks?
    :)
