The First PCIe 2.0 Graphics Card

NVIDIA's 8800 GT is the "world's first consumer GPU to support PCI Express 2.0." Although AMD's Radeon HD 2400/2600 offer PCIe 2.0 signaling rates, they don't implement the full spec, leaving the 8800 GT as technically the first full PCIe 2.0 GPU. Currently, the only motherboard chipset out there that can take advantage of this is Intel's X38. We have yet to run benchmarks on PCIe 2.0, but we don't expect any significant impact on current games and consumer applications. We aren't bandwidth limited by PCIe 1.1 and its 4GB/sec in each direction, so it's unlikely that the speed boost would really help. Game developers and NVIDIA confirm this sentiment, but if any of our internal tests show anything different we'll certainly put together a follow-up.
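To see why, consider a quick back-of-envelope sketch (Python; the per-frame upload figure is an assumption for illustration, not a measurement): even at 60 fps, a x16 PCIe 1.1 link leaves a game tens of megabytes of transfer budget every frame.

```python
# Back-of-envelope: per-frame transfer budget on a x16 PCIe 1.1 link.
PCIE_1_1_X16_GBPS = 4.0   # GB/sec in each direction, per the article
TARGET_FPS = 60

budget_per_frame_mb = PCIE_1_1_X16_GBPS * 1024 / TARGET_FPS
print(f"Per-frame bus budget at {TARGET_FPS} fps: {budget_per_frame_mb:.0f} MB")

# Hypothetical workload: streamed textures, geometry, and constants.
# This figure is a guess for illustration only.
assumed_upload_per_frame_mb = 10.0
utilization = assumed_upload_per_frame_mb / budget_per_frame_mb
print(f"Estimated bus utilization: {utilization:.1%}")
```

With numbers like these, doubling the link speed does little for today's games.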

PCIe 2.0 itself offers double the speed of the original spec. This means pairing a x16 PCIe 2.0 GPU with a x16 electrical PCIe 2.0 slot on a motherboard will offer 8GB/sec of bandwidth upstream and downstream (16GB/sec total). This brings us to an inflection point in the industry: the CPU now has a faster connection to the GPU than to main system memory, at least with 800MHz DDR2. When we move to 1066MHz and 1333MHz DDR3, system memory will pull ahead again, but for now most people will still be using 800MHz memory even with PCIe 2.0. PCIe 3.0 promises to double the bandwidth again over version 2.0, which would likely put a graphics card ahead of memory in terms of potential CPU I/O speed once more. Actual transfers will still be limited by the read and write speed of the graphics card itself, which has traditionally left a lot to be desired. Hopefully GPU makers will catch up and offer faster GPU memory read speeds as well.
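For the curious, here is a minimal sketch (Python) deriving the per-direction figures above from the per-lane signaling rates. PCIe 1.x and 2.0 use 8b/10b encoding (10 bits on the wire per 8 bits of data); the PCIe 3.0 entry is simply an assumption matching the "double again" promise, since the final spec isn't out yet.

```python
# Derive per-direction bandwidth of a x16 slot from per-lane signaling rates.
LANES = 16

generations = {
    # name: (signaling rate in GT/sec per lane, encoding efficiency)
    "PCIe 1.1": (2.5, 8 / 10),
    "PCIe 2.0": (5.0, 8 / 10),
    "PCIe 3.0": (10.0, 8 / 10),  # assumption: "double again" over 2.0
}

for name, (gt_per_sec, efficiency) in generations.items():
    # GT/sec x efficiency = usable Gbit/sec per lane; divide by 8 for GB/sec.
    gb_per_sec = gt_per_sec * efficiency / 8 * LANES
    print(f"{name}: {gb_per_sec:.0f} GB/sec per direction on a x16 slot")
```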

For now, the key point is simply that the card supports PCIe 2.0, and moving forward in bandwidth before it's needed is a terrific step in enabling developers: it gives them the potential to make use of a feature before there is an immediate need for it. This is certainly a good thing, as massively parallel processing, multi-GPU rendering, physics on the graphics card, and other GPU computing techniques and technologies threaten to become mainstream. While we may not see applications that push PCIe 2.0 in the near term, moving over to the new spec is an important step, and we're glad to see it happening at this pace. But there are no real tangible benefits to the consumer right now either.

The transition to PCIe 2.0 won't be anything like the move from AGP to PCIe: the cards and motherboards are backward and forward compatible. PCIe 1.0 and 1.1 compliant cards can be plugged into a PCIe 2.0 motherboard, and PCIe 2.0 cards can be plugged into older motherboards. For the moment, then, PCIe 2.0 has zero impact on the consumer, in more ways than one.

Comments

  • Spacecomber - Monday, October 29, 2007 - link

    It's hard to tell what you are getting when you compare the results from one article to those of another article. Ideally, you would like to be able to assume that the testing was done in an identical manner, but this isn't typically the case. As was already pointed out, look at the drivers being used. The earlier tests used nvidia's 163.75 drivers while the tests in this article used nvidia's 169.10 drivers.

    Also, not enough was said about how Unreal 3 was being tested to know, but I wonder if they benchmarked the game in different manners for the different articles. For example, were they using the same map "demo"? Were they using the game's built-in fly-bys or were they using FRAPS? These kinds of differences could make direct comparisons between articles difficult.
  • spinportal - Monday, October 29, 2007 - link

    Have you checked the driver versions? Over time drivers do improve performance, perhaps?
  • Parafan - Monday, October 29, 2007 - link

    Well the 'new' drivers made the GF 8600GTS perform a lot worse, but the higher ranked cards better. I don't know how likely that is.
  • Regs - Monday, October 29, 2007 - link

    To blacken. I am a big AMD fan, but right now it's almost laughable how they're getting stepped and kicked on by the competition.

    AMD's ideas are great for the long run, and their 65nm process was just a mistake since 45nm is right around the corner. They simply do not know how to compete when the heat is on. AMD is still traveling in 1st gear.
  • yacoub - Monday, October 29, 2007 - link

    "NVIDIA Demolishes... NVIDIA? 8800 GT vs. 8600 GTS"

    Well the 8600GTS was a mistake that never should have seen the light of day: over-priced, under-featured from the start. The 8800 GT is the card we were expecting back in the Spring when NVidia launched that 8600 GTS turd instead.
  • yacoub - Monday, October 29, 2007 - link

    First vendor to put a quieter/larger cooling hsf on it gets my $250.
  • gamephile - Monday, October 29, 2007 - link

    Dih. Toh.
  • CrystalBay - Monday, October 29, 2007 - link

    Hi Derek, how are the temps under load? I've seen some results of the GPU pushing 88°C plus with that anemic stock cooler.
  • Spacecomber - Monday, October 29, 2007 - link

    I may be a bit misinformed on this, but I'm getting the impression that Crysis represents the first game that makes major use of DX10 features, and as a consequence, it takes a major bite out of the performance that existing PC hardware can provide. When the 8800GT is used in a heavy DX10 game, does the resulting performance fall into the hardware class we would typically expect from a $200 part? In other words, making use of the Ti-4200 comparison, is the playable performance only acceptable at moderate resolutions and medium settings?

    We've seen something like this before, when DX8 hardware was available and people were still playing DX7 games with this new hardware, the performance was very good. Once games started to show up that were true DX8 games, hardware (like the Ti-4200) that first supported DX8 features struggled to actually run these DX8 features.

    Basically, I'm wondering whether Crysis (and other DX10 games that presumably will follow) places the 8800GT's $200 price point into a larger context that makes sense.
  • Zak - Monday, November 5, 2007 - link

    I ran Vista for about a month before switching back to XP due to Quake Wars crashing a lot (no more crashes under XP). I ran a bunch of demos during that month, including Crysis and Bioshock, and I swear I didn't see much visual difference between DX10 on Vista and DX9 on XP. Same for Time Shift (does it use DX10?). And all games run faster on XP. I really see no compelling reason to go back to Vista just because of DX10.

    Zak
