Intel did a very good job of drumming up support for PCI Express over the past two years.  Look around and you'll note that all of the motherboard manufacturers have quite a few PCI Express based motherboard designs.  Then look at the latest GPU launches from ATI and NVIDIA: all of the exciting products appear to be launched first (or primarily) as PCI Express designs.  While the industry as a whole has done a great job of supporting PCI Express, there's one little problem - no one seems to be interested in buying PCI Express solutions just yet. 

The OEM markets have no problems shipping PCI Express motherboards and graphics cards in their systems; after all, they want to sell the idea of buying an entirely new PC in order to get access to brand new technologies like PCI Express.  However, in the channel and upgrade markets, PCI Express solutions aren't selling well at all.  Most enthusiast users appear to be sticking with their AGP platforms, and while they would consider a GPU upgrade, they are not willing to upgrade their motherboard (and sometimes CPU and memory) just to get a faster graphics card. 

There was a huge debate early on about whose PCI Express design would prove to be the best for performance.  ATI chose to produce separate PCI Express and AGP enabled GPUs, offering a native solution for both interfaces; meanwhile, NVIDIA chose to keep manufacturing their AGP GPUs and use a bridge chip to interface with PCI Express.  While ATI argued that NVIDIA's solution offered less performance, NVIDIA said that ATI's approach was far too costly.  The problem with ATI's approach was that production was inherently split between AGP and PCI Express chips, and predicting market demand for an appropriate ratio between the two is quite difficult.  Overproduce PCI Express chips and there will be a shortage of AGP cards, and vice versa.  ATI's initial decision to produce only native PCI Express or AGP designs is part of the reason why their latest mainstream GPUs (e.g. the X700) are still only available as PCI Express designs.

Even though NVIDIA has moved to manufacturing native PCI Express GPUs (e.g. the GeForce 6600GT), they already have a working chip to bridge back down to an AGP interface, which is what makes today's launch possible.  Thanks to NVIDIA's PCI Express-to-AGP bridge chip, NVIDIA is able to not only launch but also begin selling an AGP version of the GeForce 6600GT today.  NVIDIA tells us that cards should be available for sale today, continuing a very recent trend of announcing availability alongside a product launch - a trend that we greatly applaud. 


NVIDIA's PCI Express-to-AGP bridge

ATI is working on a PCI Express-to-AGP bridge of their own, but it will not be ready until later this year - meaning that ATI will not have an AGP version of their PCI Express X700 until early next year.

The GeForce 6600GT AGP runs at the same core clock speed as the PCI Express version (500MHz) but has a slightly lower memory clock (900MHz vs. 1GHz on the PCI Express version).  By lowering the memory clock, NVIDIA helps offset the additional cost of the PCI Express-to-AGP bridge.  The combined performance impact of the reduced memory clock and the on-board bridge is between 0% and 5%.  For example, in Doom 3 at 1024 x 768 (High Quality), the PCI Express version of the GeForce 6600GT is 3.5% faster than the AGP version.  There is a performance difference, but it does not appear to be huge. The AGP version of the 6600GT obviously lacks SLI support, since a motherboard can only have a single AGP slot.
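
To put the memory clock reduction in perspective, here is a minimal back-of-the-envelope sketch in Python, assuming the 6600GT's 128-bit memory bus and treating the quoted memory clocks as effective GDDR3 data rates:

    # Rough peak memory bandwidth comparison for the two 6600GT variants.
    # Assumption: 128-bit memory bus; clocks are effective data rates.
    BUS_WIDTH_BYTES = 128 // 8  # 128-bit bus -> 16 bytes per transfer

    def bandwidth_gbs(effective_mhz: float) -> float:
        """Peak memory bandwidth in GB/s for a given effective memory clock."""
        return effective_mhz * 1e6 * BUS_WIDTH_BYTES / 1e9

    pcie = bandwidth_gbs(1000)  # PCI Express version: 16.0 GB/s
    agp = bandwidth_gbs(900)    # AGP version: 14.4 GB/s
    print(f"PCIe: {pcie:.1f} GB/s, AGP: {agp:.1f} GB/s, "
          f"{(1 - agp / pcie) * 100:.0f}% less raw bandwidth")

On these numbers the AGP card gives up roughly 10% of its raw memory bandwidth, which lines up with the observed 0-5% frame rate difference, since games are rarely purely bandwidth-bound.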

The latest AGP specification calls for a maximum of around 45W of power to be delivered via the AGP slot itself, while a PCI Express x16 slot can supply up to 75W.  Because of the reduction in power that can be delivered via the slot interface, the GeForce 6600GT AGP requires a 4-pin Molex connector on the board itself to deliver extra power to the GPU.  You may remember that the PCI Express version of the 6600GT does not require a separate power connector. 


This 4-pin Molex connector is only present on the AGP version of the 6600GT

As of now, the 6600GT is the only part NVIDIA is releasing in an AGP flavor; the regular non-GT 6600 will remain PCI Express only. The 6600GT AGP will retail for between $200 and $250. If you are interested in learning more about the architecture of the 6600GT, feel free to read our review of the PCI Express version for greater detail.


66 Comments


  • ShadowVlican - Thursday, November 18, 2004 - link

    seems like #28 got pwned... think AGP ^_-

    Very nice review, Anand. It's quite astonishing how fast technology can grow, isn't it? With the "top" cards of the last generation being eaten by this generation's top mid-range card... I'm looking forward to your next review when you have your vanilla 6800!
  • Speedo - Thursday, November 18, 2004 - link

    Hmm... I also agree with those of you who wonder whether an upgrade to a faster graphics card would help, or whether you are perhaps already CPU limited.

    One way of checking the "status" of your current system is to play around with resolutions for a given game. For example, let's say you normally play UT2004 at 1024x768. Try setting the resolution to 512x384 and see where your framerates go. The framerate you see there is roughly your CPU's ceiling; you probably will not go much above it, no matter how fast a video card you upgrade to.

    You can also try upping the resolution one step from what you are usually using. If the framerate drops a lot, you would probably benefit from an upgrade.

    I know this doesn't tell you *which* new card you should get. But if your low-res test shows that your CPU can deliver double the framerate, then a good balance could be to upgrade to a card that is at least twice as fast as your current one (see the sketch below).

    In my own system I seem to have a pretty good balance right now, with a 9800 Pro (XT mod) & a Barton @ 2.3GHz.
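
    A minimal sketch of this low-resolution test in Python - the 1.2x threshold and the framerate numbers are hypothetical stand-ins for your own measurements:

        # If dropping the resolution barely raises the framerate, the CPU
        # (not the GPU) is the limit, and a faster card will not help much.
        def diagnose(fps_normal: float, fps_low_res: float) -> str:
            """Compare framerates at your usual resolution and a much lower one."""
            headroom = fps_low_res / fps_normal
            if headroom < 1.2:
                return "CPU limited: a faster GPU will not help much"
            return f"GPU limited: roughly {headroom:.1f}x headroom for a faster card"

        # Hypothetical UT2004 numbers at 1024x768 vs. 512x384:
        print(diagnose(45, 50))   # barely moved -> CPU limited
        print(diagnose(45, 110))  # big jump -> GPU limited, ~2.4x headroom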
  • bigpow - Thursday, November 18, 2004 - link

    I agree with the previous commentators.

    Most of us are stuck with our older generation platforms - say, a P4 2.4C or an Athlon XP 1700+ or 2500+.

    Where are the results for these platforms, AT?

    Most of us (see above) will decide whether it is worth upgrading to a 6600GT if we see those numbers.

    AT, step up and beat the competition.
    Don't be lazy and just compare with the expensive and uncommon FX CPU.
  • Ender17 - Wednesday, November 17, 2004 - link

    Those charts with the percentages are awesome!! And the head-to-head stuff was great as well. Keep up the good work, and try to get us that head-to-head with the 6800 NU.
  • Niatross - Wednesday, November 17, 2004 - link

    Even when he's CPU limited, he's limited by an FX-55, not an XP Barton. Yeah, I wonder how many $1000-CPU systems have a $200 card?

    Yeah, he's showing off the card's abilities well by using an FX-55, but it TELLS me nothing about what my experience might be. I would just like to see what it would run like on the average machine. I said before that I've seen this hashed out many times on various sites and I see the value of the way it's usually done; just wishing I had my way (STOMP, STOMP, BOO HOO, LOL) I guess ;-) J
  • ciwell - Wednesday, November 17, 2004 - link

    Maybe an article/chart that lists the CPUs from the past couple of years and, for each, the theoretical GPU that would MAX it out before the CPU becomes the bottleneck, or what-not.
  • navsimpson - Wednesday, November 17, 2004 - link

    While I get why the fastest CPU must be used to prevent CPU bottlenecking, what I don't understand is why someone who can afford a $1000 processor would buy a $200 video card and not shell out the extra 100 or so bucks to move up a notch. These reviews end up being technically sound - we do get to see what the cards are actually capable of - but of much less consequence to those of us trying to figure out which cards to buy. Will it be worth it to get a 6600GT, or would a 9600XT max out the performance of my Athlon 2600? That's what I - and a heck of a lot of other people - want to know.
  • Pete - Wednesday, November 17, 2004 - link

    Derek emerged from his underground bunker! Now that you've recovered enough to type ;), can you verify and maybe explain those 9700P Far Cry numbers?
  • nserra - Wednesday, November 17, 2004 - link

    #57 ciwell

    What is really funny is that NVIDIA almost didn't beat a two-year-old card!
    And the 5900 - similar hardware that some people at the time even said was better - is now dead and buried.
    Where are the 5900's PS2.0+ and VS2.0+ now?

    This is the conclusion from AnandTech's 5900 review:

    “From the ATI camp the $499 Radeon 9800 Pro 256MB, just like the NV35, is a difficult purchase to justify; even more difficult in this case because the GeForceFX 5900 Ultra does outperform it in a number of tests.”

    Where is the 5900 in all the benches? Who bought an NVIDIA 5900 based on those comments?
  • ciwell - Wednesday, November 17, 2004 - link

    I find it funny how nVidia has beaten ATI to the punch and the fanbois are coming out of the woodwork. :D
