Original Link: https://www.anandtech.com/show/1506
NVIDIA's GeForce 6200 & 6600 non-GT: Affordable Gaming
by Anand Lal Shimpi on October 11, 2004 9:00 AM EST - Posted in: GPUs
Although adoption among end users has been slow, PCI Express platforms are getting out there, and the two graphics giants are wasting no time in shifting the competition for king of the hill over to the PCI Express realm.
ATI and NVIDIA have both traded shots in the mid-range with the release of the Radeon X700 and GeForce 6600. Today, the battle continues in the entry-level space with NVIDIA's latest launch - the GeForce 6200.
The GeForce 6 series is now composed of three GPUs: the high-end 6800, the mid-range 6600 and now the entry-level 6200. True to NVIDIA's promise of one common feature set, all three of the aforementioned GPUs boast full DirectX 9 compliance, and thus, can all run the same games, just at different speeds.
What has NVIDIA done to make the 6200 slower than the 6600 and 6800?
For starters, the 6200 features half the pixel pipes of the 6600 and one quarter those of the 6800. Next, the 6200 will be available in two versions: one with a 128-bit memory bus like the 6600 and one with a 64-bit memory bus, effectively cutting memory bandwidth in half. Finally, NVIDIA cut the core clock on the 6200 down to 300MHz as a final guarantee that it would not cannibalize sales of their more expensive cards.
The 6200 is an NV43 derivative, meaning it is built on the same 0.11-micron (110nm) process as the 6600. In fact, the two chips are virtually identical, with the 6200 having only 4 active pixel pipelines on its die. There is one other architectural difference between the 6200 and the rest of the GeForce 6 family: the lack of any color or Z-compression support in the memory controller. Color and Z-compression are wonderful ways of reducing the memory bandwidth overhead of enabling technologies such as anti-aliasing. So, without support for that compression, we can expect the 6200 to take a bigger hit when turning on AA and anisotropic filtering. The saving grace here is that the 6200 doesn't have the fill rate or the memory bandwidth to run most games at higher resolutions anyway. Therefore, those who buy the 6200 won't be able to play at resolutions where the lack of color and Z-compression would really matter with AA enabled. We'll investigate this a bit more in our performance tests.
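To put some rough numbers behind that reasoning, here's a back-of-the-envelope sketch (our own simplification, not NVIDIA's figures) of how multisample AA multiplies framebuffer traffic when no color or Z-compression is available:

```python
# Back-of-the-envelope framebuffer traffic estimate (our own simplification,
# not NVIDIA's numbers): multisampling stores several color and Z samples per
# pixel, so without color/Z compression the memory traffic scales roughly with
# the sample count.

def framebuffer_mb(width, height, samples, bytes_color=4, bytes_z=4):
    """Approximate color + Z data touched per frame, in MB."""
    return width * height * samples * (bytes_color + bytes_z) / (1024 ** 2)

for w, h in [(800, 600), (1024, 768), (1280, 1024)]:
    print(f"{w}x{h}: {framebuffer_mb(w, h, 1):.1f} MB/frame without AA, "
          f"{framebuffer_mb(w, h, 4):.1f} MB/frame with uncompressed 4X AA")
```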
Here's a quick table summarizing what the 6200 is and how it compares to the rest of the GeForce 6 family:
GPU | Manufacturing Process | Vertex Engines | Pixel Pipelines | Memory Bus Width |
GeForce 6200 | 0.11-micron | 3 | 4 | 64/128-bit |
GeForce 6600 | 0.11-micron | 3 | 8 | 128-bit |
GeForce 6800 | 0.13-micron | 6 | 16 | 256-bit |
The first thing to notice here is that the 6200 supports either a 64-bit or 128-bit memory bus, and as far as NVIDIA is concerned, they are not going to distinguish between cards equipped with the two memory configurations. While NVIDIA insists that they cannot force their vendor partners to tell the two card configurations apart, we're more inclined to believe that NVIDIA simply would like all 6200 based cards to be known as a GeForce 6200, regardless of whether or not they have half the memory bandwidth. NVIDIA makes a "suggestion" to their card partners that they should add the 64-bit or 128-bit designation somewhere on their boxes, model numbers or website, but the suggestion goes no further than just being a suggestion.
The next source of variability is clock speed. NVIDIA has "put a stake in the ground" at 300MHz as the desired clock speed for the 6200 GPUs regardless of configuration, and it does seem that add-in board vendors would have no reason to clock their 6200s any differently, since they are all paying for a 300MHz part. The real variability comes when you start talking about memory speeds. The 6200 only supports DDR1 memory and is spec'd to run at 275MHz (effectively 550MHz). However, as we've seen in the past, this is only a suggestion - it is up to the manufacturers whether or not they will use cheaper memory.
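For reference, the raw arithmetic behind the two memory configurations (peak theoretical numbers only, assuming the spec'd 550MHz effective data rate) works out as follows:

```python
# Peak theoretical memory bandwidth for the 6200's spec'd 275MHz (550MHz
# effective) DDR on the two bus widths NVIDIA allows - simple arithmetic,
# not a measured figure.

def peak_bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
    bytes_per_transfer = bus_width_bits / 8
    return bytes_per_transfer * effective_clock_mhz * 1e6 / 1e9

print(f"128-bit 6200: {peak_bandwidth_gb_s(128, 550):.1f} GB/s")  # ~8.8 GB/s
print(f" 64-bit 6200: {peak_bandwidth_gb_s(64, 550):.1f} GB/s")   # ~4.4 GB/s
```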
NVIDIA is also only releasing the 6200 as a PCI Express product - there will be no AGP variant at this point in time. The problem is that the 6200 is a much improved architecture compared to the current entry-level NVIDIA card in the market (the FX 5200), yet the 5200 is still selling quite well as it is not really purchased as a hardcore gaming card. In order to avoid cannibalizing AGP FX 5200 sales, the 6200 is kept out of competition by being a strictly PCI Express product. While there is a PCI Express version of the FX 5200, its hold on the market is not nearly as strong as the AGP version, so losing some sales to the 6200 isn't as big of a deal.
In talking about AGP versions of recently released cards, NVIDIA has given us an update on the status of the AGP version of the highly anticipated GeForce 6600GT. We should have samples by the end of this month and NVIDIA is looking to have them available for purchase before the end of November. There are currently no plans for retail availability of the PCI Express GeForce 6800 Ultras - those are mostly going to tier 1 OEMs.
The 6200 will be shipping in November and what's interesting is that some of the very first 6200 cards to hit the street will most likely be bundled with PCI Express motherboards. It seems like ATI and NVIDIA are doing a better job of selling 925X motherboards than Intel these days.
The expected street price of the GeForce 6200 is between $129 and $149 for the 128-bit 128MB version. This price range is just under that of the vanilla ATI X700 and the regular GeForce 6600 (non-GT), both of which are included in our performance comparison - so in order for the 6200 to truly remain competitive, its street price will have to be closer to the $99 mark.
The direct competition to the 6200 from ATI comes from the PCI Express X300 and X300SE (128-bit and 64-bit versions, respectively). ATI has a bit of a disadvantage here because the X300 and X300SE are still based on the old Radeon 9600 architecture and not a derivative of the X800 and X700. ATI is undoubtedly working on a 4-pipe version of the X800, but for this review, the advantage is definitely in NVIDIA's court.
NV4x's Video Processor - What Happened?
When NVIDIA launched NV40, they were very quick to tout a huge hunk of transistors on the chip, which they called the NV40 Video Processor. This "Video Processor" was composed of more than 20 million transistors and NVIDIA was proud to announce that they put more transistors into NV40's Video Processor than they did in the entire original GeForce 256 chip itself. NVIDIA promised quite a bit with the Video Processor. They promised full hardware-accelerated MPEG1/2 and WMV9 encoding and decoding at 1080i resolutions. What it meant was that our CPU video encoding tests would be a thing of the past - a slow CPU paired with any graphics card featuring NVIDIA's Video Processor would be able to handle even the most taxing video encoding without a problem. NVIDIA originally told us that they would have a driver which could take advantage of the processor two weeks after the launch of the GeForce 6800 Ultra. We even pressured NVIDIA to work on getting support for the Video Processor in the DivX codec, since it's quite popular with our readers. The launch came and went, as did the two weeks, with nothing from NVIDIA.
I personally emailed NVIDIA every other week from May until August asking for an update, with no official or unofficial response as to why nothing had happened with the illustrious Video Processor. Finally, when 23 out of the 35 slides of the NVIDIA press presentation about the GeForce 6200 featured the GPU's "Video Processor", I had had enough. It was only then that NVIDIA came clean about the current state of the Video Processor.
The Video Processor (soon to receive a true marketing name) on the NV40 was somewhat broken; although it featured MPEG 2 decode acceleration, support for WMV9 decode acceleration was apparently not up to par with what NVIDIA had hoped for. As of the publication of this article, NVIDIA still has not answered our questions of whether or not there is any hardware encoding acceleration as was originally promised with NV40. So, the feature set of the Video Processor on NV40 (the GeForce 6800) was incomplete only in its support for WMV9 acceleration (arguably its most important feature).
NVIDIA quietly fixed the problem in the 6600GT, and since the 6200 is based on the 6600, the 6200 also features the "fixed" Video Processor with WMV9 decode acceleration support. After much explaining to NVIDIA that their credibility when it comes to the Video Processor is pretty much shot, they decided to pull the talk about the Video Processor from their launch of the 6200. As a result, you won't see any benchmarks of it here. NVIDIA is currently aiming to get us a functional driver and codec that will enable the Video Processor and take advantage of its capabilities in the next month or so; given that the feature has already been on cards (in one form or another) for 6 months now, we're just going to have to wait and see.
There are still some unresolved issues here - mainly clarification of what the Video Processor really can and can't do. NVIDIA is touting excellent deinterlacing and video scaling quality, which are both important to DVD playback as well as TV playback. They are also claiming hardware assisted WMV9 decode, although they have yet to provide us with information on how much of the decoding process is actually handled by the video processor and how much of it is still software (CPU) driven. Finally, we still don't know what this thing does when it comes to encoding, but we're inclined to believe that it's far less than full-fledged GPU based video encoding.
We'll keep you updated on this topic as we get more information and we will get more information.
The Contenders
When it comes to reviewing PCI Express graphics cards, our hands are a bit tied, since there are far fewer cards available in PCI Express versions than in AGP versions. So, our comparisons here are similarly constrained. That being said, we are able to develop some interesting comparisons, and here are the cards that we're featuring:
ATI's X300 and X300SE
These two cards are both 0.11-micron, 4-pipe versions of the RV360, making them perfect candidates for comparison to the GeForce 6200. The prices on these two cards are significantly lower than the MSRP of the upcoming 6200. Street prices on the 64-bit memory bus X300SE are around $75, while the 128-bit bus X300 (much like the 6200 that we're reviewing) is priced at around $100. Keep in mind that both of these cards are still old technology based on the same core as the Radeon 9600, and thus, will have a tough time competing against the 6200.
ATI's X600 Pro
Retailing for around $130, the X600 Pro was one of the first PCI Express cards to hit the market. It is basically a PCI Express version of the Radeon 9600 Pro, even down to using the same clock speeds.
ATI's X700
Recently, ATI released the X700 XT, a direct competitor to the GeForce 6600GT. Alongside the flagship announcement, ATI also introduced three other X700 parts: a 256MB X700 Pro, a 128MB X700 Pro and a regular X700, the latter carrying an MSRP of $149. While the X700 isn't available yet, its clock speeds promise to make it a heavy hitter in the mid-range market. The X700 features an 8-pixel-pipe design like the XT, but much lower clock speeds; with a 400MHz core clock and, more importantly, a 700MHz memory clock, the regular X700 allows board vendors to use much cheaper memory to drive the price down to $149.
NVIDIA's GeForce 6600
While the 6600GT received all the attention, the regular 6600 will find its way into more computers, thanks to lower prices. Specification-wise, the 6600 is identical to the 6600GT. It's still an 8-pipe, 128-bit design, but as you can guess, it runs at much lower clock speeds. The 6600 runs at a 300MHz core clock, but what really kills it is the 500MHz memory clock. Not only does the regular X700 have a 100MHz core clock advantage, but also an impressive 200MHz memory clock advantage - the only advantage the 6600 has now is that it's actually available, albeit at prices clearly higher than its $149 MSRP. The card that we used in our tests was purchased from Newegg for $168.
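On paper, those clock differences translate into a sizable theoretical gap. The sketch below (our own arithmetic, assuming one pixel per pipe per clock and the effective memory clocks quoted above) shows why the X700 looks so strong on the spec sheet:

```python
# Paper specs only - assuming one pixel per pipe per clock and the effective
# memory clocks quoted in the text. Real-world performance also depends on
# architecture, drivers and the game, as the benchmarks show.

cards = {
    # name: (pixel pipes, core MHz, bus width in bits, effective memory MHz)
    "GeForce 6200": (4, 300, 128, 550),
    "GeForce 6600": (8, 300, 128, 500),
    "Radeon X700":  (8, 400, 128, 700),
}

for name, (pipes, core, bus, mem) in cards.items():
    fill = pipes * core / 1000            # theoretical Gpixels/s
    bandwidth = (bus / 8) * mem / 1000    # theoretical GB/s
    print(f"{name}: {fill:.1f} Gpixels/s fill rate, {bandwidth:.1f} GB/s memory bandwidth")
```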
NVIDIA's GeForce 6200
This is obviously the card that's the focus of attention. NVIDIA sent us a reference card that, unfortunately, used a fan. We were hoping that the 300MHz 0.11-micron GPU would feature a passively cooled design much like ATI's X300, but we were left disappointed with the initial reference design. There is hope, however. NVIDIA claims that a passive design is in the works and it should be possible; we tend to believe NVIDIA here, as the heatsink on the sample they sent us was only about 3mm thick beneath the fan. There's clearly room for improvement there.
Intel's Graphics Media Accelerator 900
The new integrated graphics core from Intel found in the 915G chipset was a must-include for this review, simply because we are comparing it to the slowest PCI Express graphics options available today. As we've already seen in previous articles, the 915G is far from a contender when it comes to gaming performance, but we'll see if it's able to scrape by at all in our tests.
Power Consumption
A new and much-needed addition to our GPU reviews is power consumption tracking. Here, we're using a simple meter to track the power consumption at the power supply level, so what we're left with is total system power consumption. But with the rest of the components in the test system remaining the same and the only variable being the video card, we're able to get a reasonably accurate estimate of peak power usage.
At idle, all of the graphics cards draw pretty much the same amount of power, with the integrated graphics solution eating up the least at 106W for the entire system:
Then, we fired up Doom 3 and ran it at 800x600 (High Quality) in order to get a peak power reading for the system.
Interestingly enough, other than the integrated graphics solution, the 6200 is the lowest power card here - drawing even less power than the X300, but that is to be expected given that the X300 runs at a 25MHz higher clock speed.
It's also interesting to note that there's no difference in power consumption between the 128-bit and 64-bit X300 cards. The performance comparison is a completely different story, however.
In the end, none of these cards eat up too much power, with the X700 clearly leading the pack at 10% greater system power consumption than the 6600. It will be interesting to find out if there's any correlation here between power consumption and performance.
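Because these are wall-socket readings, the figure that matters when comparing two cards is the difference between their total-system numbers. Here's a quick sketch of that reasoning; the wattages and PSU efficiency below are illustrative placeholders, not our measured results:

```python
# With everything but the video card held constant, the difference between two
# total-system wall readings approximates the difference in card power draw,
# scaled by PSU efficiency. The numbers below are illustrative placeholders,
# not our measured results.

PSU_EFFICIENCY = 0.75  # assumed typical efficiency for a 2004-era power supply

def card_delta_watts(system_a_watts, system_b_watts, efficiency=PSU_EFFICIENCY):
    """Estimate the DC power difference between two cards from AC wall readings."""
    return (system_a_watts - system_b_watts) * efficiency

# Hypothetical example: 160W at the wall with card A vs. 145W with card B.
print(f"Card A draws roughly {card_delta_watts(160, 145):.0f}W more than card B")
```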
The Test
Performance Test Configuration | |
Processor(s): | Intel Pentium 4 560 (3.6GHz) |
Motherboard: | Intel 915GUX (Intel 915 Chipset) |
Video Card(s): | ATI Radeon X700, ATI Radeon X600 Pro, ATI Radeon X300, ATI Radeon X300SE, NVIDIA GeForce 6600, NVIDIA GeForce 6200, Intel Graphics Media Accelerator 900 |
Video Drivers: | ATI - CATALYST 8.07 Beta Drivers (CATALYST AI Enabled), NVIDIA - 66.81 Drivers |
Doom 3 Performance
While the Doom 3 frenzy isn't nearly as bad as it was a month ago, the performance seen under id's latest engine is quite important as there are a number of games in development right now using the Doom 3 engine. We have two sets of benchmarks here to look at - playable resolution benchmarks, as well as a chart of performance vs. resolution to see how the cards compare at sometimes not-so-playable resolutions.
Since we're dealing with relatively entry-level cards, we found that the perfect balance between frame rate and image quality lands at 800x600, and thus, that's where our first graph comes from.
Here, we see that the GeForce 6600, despite its lower fill rate and lower memory bandwidth, is still able to outperform the regular X700 by about 8%. It's not a huge margin, but impressive considering that the card is underpowered compared to the X700. The explanation as to "why" is more of an architectural discussion, as we've seen that NVIDIA's GeForce 6 series of GPUs are much better suited for the Doom 3 engine than ATI's.
The GeForce 6200 comes in a valiant third, clearly outperforming the 4-pipe competitors from ATI and nipping at the heels of the slightly more expensive X700. Here's the tricky part though. Depending on what prices the 6200 and X700 are actually available for when they hit the streets, the recommendation could go either way. At the same price, the X700 is the clear winner here, but if the X700 ends up costing more, the decision becomes more a question of budget than of outright performance.
Next, we have the resolution scaling chart to see how all of these cards fare in the grander scheme of things. Here, we see that none of the cards are particularly CPU limited under Doom 3 and all of them experience a serious drop in performance as you increase the resolution. Doom 3 is clearly taxing enough for even the fastest of contenders here.
What about playability? We took some notes during our testing of the cards and will share them here as to what our gaming experiences were with all of the cards in a section we like to call "Notes from the Lab".
ATI X300: The card is clearly slower than the 6200. The added memory bandwidth gives it a performance advantage over the 64-bit SE, but it's nowhere near in the same league as the 6200. ATI desperately needs to have an X800 derived part for their low-end, much like they have in the mid-range with the X700.
ATI X300SE: The game plays "OK" at 640x480, definitely sluggish in certain areas. The aliasing is particularly bad at 640, so the resolution only really works if you have a small monitor or if the person playing isn't much of a gamer at all and has never been introduced to the fact that you can get rid of aliasing. At 800x600, things just get too slow for comfort and beyond that is basically unplayable.
ATI X600 Pro: You can't notice any visual quality differences between ATI and NVIDIA when it comes to Doom 3, not to mention that the game is frankly too dark to notice any differences in texture filtering to begin with. 640x480 and 800x600 play quite well on the X600, despite the fact that the frame rate is clearly lower than on the two NVIDIA cards. Unfortunately, anything above 800x600 is a bit too slow on the X600 Pro. It's "playable", but honestly, just frustratingly slow compared to the other cards.
ATI X700: The X700 performs clearly better than the X600 Pro and close to the 6600, but the 6600 is clearly faster in actual gameplay.
NVIDIA GeForce 6200: 800x600 seems to be the sweet spot of image quality to performance for the 6200. The game played very smoothly with no noticeable image quality issues. 1024x768 looked better, but started to get a little slow for our tastes. 1280x1024 was far too slow, although it looked great. If you want to go up to 1280, you're going to want to go for a 6600 at least.
NVIDIA GeForce 6600: At 800x600, the 6600 completely blows away the 6200; it makes the 6200 feel like a slow card. 1024x768 is still sluggish in places, but overall, much better than the 6200. 1280x1024 is fine when just walking around, but once you get enemies on the screen and they start attacking you, things slow down. It may take the 6600GT to be truly smooth at this resolution. That being said, it continues to amaze us how good lower resolutions look in Doom 3.
Intel Integrated Graphics: Surprisingly enough, Intel's integrated graphics will actually run Doom 3, but it is basically unplayable at medium quality at 640x480 - not to mention that we couldn't get it to complete a single benchmark run (the driver kept on crashing).
Half Life 2 (Source) Visual Stress Test
Although it's been around a year since we thought that the game would be out, all of the preloads off Steam and magazine reviews seem to indicate that Half Life 2 is finally coming. The closest thing that we have to being able to benchmark the actual game is the Visual Stress Test supplied with Counter-Strike: Source.
First, let's start off by seeing how the performance stacks up at 800x600. Here, ATI clearly takes the lead, which is what we would expect, given the X700's clock speed advantages over the 6600. The 6600 puts forth a decent effort, securing a hold on the 2nd place position, but what's truly interesting is the X600 Pro in third. Outperforming the GeForce 6200 by almost 20%, the X600 Pro looks like it will be the faster card under Half Life 2, if these scores are indicative of anything.
The Visual Stress Test would not run at 640x480. Thus, our resolution scaling comparison begins at 800x600. We see that all of the cards, once again, exhibit steep slopes when it comes to resolution scaling in Half Life 2. We're once again not CPU bound here, just limited by the GPUs.
Notes from the Lab
ATI X300SE: The X300SE did an OK job at 800x600, but once the resolution started to go up, we saw some choppiness in the test. Again, since this isn't a gaming scenario, it's tough to tell what actual gameplay would be like with the X300SE.
ATI X600 Pro: If you don't restart the game between resolution changes, there appears to be a texture corruption issue, causing some textures to appear pink. The same issue occurs on NVIDIA cards, but it just seems to happen less frequently on ATI cards. The test is beta, so we're not too surprised and it doesn't seem to impact performance. The performance of the X600 is pretty solid, clearly faster than the 6200, but a bit slower than the 6600.
ATI X700: A clear performance leader here, no issues with running at even the highest resolutions. At 1280x1024, it did get a little sluggish in places during the test, but 1024x768 ran very smoothly.
GeForce 6200: Water reflections really look a lot better at 10x7; the aliasing is pretty horrible at 640x480. Performance was decent, but clearly not great.
GeForce 6600: It's good to note that both of the 6 series cards are fully DX9 compliant under HL2. The 6600 offered performance in the same ballpark as the X700, but it was slower by a noticeable margin.
Intel Integrated Graphics: When benchmarking the VST, there were two cards that didn't appear in Valve's database - the GeForce 6200, because it hadn't been released yet, and Intel's integrated graphics. I guess that it's no surprise why no one uses the integrated graphics for gaming. The integrated graphics only runs in DX8.1 mode under CS: Source. The display driver crashed running this benchmark as well. It's becoming quite easy to benchmark Intel graphics - we get to skip half the benchmarks.
Star Wars Battlefront Performance
Released alongside the latest Star Wars DVDs, Battlefront has become popular quickly, thanks to the large-scale feel of its battles as well as fairly decent graphics. There is no built-in benchmark in this game, so we developed an easily repeatable path to take during the second mission of the Clone Wars single player campaign that resulted in fairly consistent frame rates between runs. We used FRAPS to record the average frame rate during the sequence.
The Radeon X700 completely destroys the competition here with an average of 85 fps compared to the next highest, the Radeon X600 Pro, at 45 fps. We have no explanation for the poor performance of the GeForce 6 cards; there were no visual anomalies, so it's either a driver performance issue or an issue with the game and the GeForce 6 architecture.
We continue to see fairly linear scaling with resolution across all of the cards, but it is most pronounced on the X700, since it's producing much higher frame rates than the rest of the cards. Interestingly enough, the X700 at 1280x1024 is faster than any of the other cards at 800x600.
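A quick aside on methodology before the lab notes: for the FRAPS-based tests in this and the following games, the reported number is simply the average frame rate over the recorded run. A minimal sketch of that calculation, assuming a plain text log with one millisecond timestamp per frame (our own illustration, not FRAPS' exact output format):

```python
# Recomputes an average frame rate from a per-frame timestamp log - handy for
# sanity-checking manual benchmark runs against each other. Assumes a plain
# text file with one millisecond timestamp per line; FRAPS' own log format
# may differ, so treat this as an illustration of the math, not a parser.

def average_fps(log_path):
    with open(log_path) as f:
        stamps = [float(line) for line in f if line.strip()]
    frames = len(stamps) - 1                          # intervals between timestamps
    elapsed_seconds = (stamps[-1] - stamps[0]) / 1000.0
    return frames / elapsed_seconds

# Hypothetical usage:
# print(f"Average: {average_fps('frametimes_run1.txt'):.1f} fps")
```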
Notes from the Lab
ATI X300SE: The X300SE is basically too slow to play this game. There's nothing more to it. The X300 doesn't make it much better either.
ATI X600 Pro: The mouse is very laggy at 12x10, and performance is a bit better than the 6600 in average frame rates, but not really noticeable in actual gameplay. For all intents and purposes, the X600 Pro performs similarly to the 6600.
ATI X700: The X700 severely outperforms the competition here - wow. And it is quite noticeable in gameplay. 1280x1024 was a little slow in areas, but 1024x768 was perfect.
NVIDIA GeForce 6200: An OK performer; at resolutions above 800x600, the card isn't nearly as responsive as we would like it to be.
NVIDIA GeForce 6600: Interestingly enough, the 6600 doesn't feel that much faster than the 6200, but you can start to tell more of a difference at the higher resolutions.
Intel Integrated Graphics: Starting the game with just the 915G's integrated graphics gives us the following warning:
Remember that Intel's graphics core has no vertex shaders. All vertex operations are handled by the CPU; thus, it fails to meet the specifications of many games that require hardware vertex shaders. Luckily, most games allow you to continue playing despite the error, but in this case, the display driver would just crash while running Battlefront.
The Sims 2 Performance
A new addition to our test suite, made especially for this article, is the latest installment of the Sims series. Sims 2 isn't the type of game that requires a $600 GeForce 6800 Ultra, but it is the type of game that does require some minimum level of graphics performance and is sometimes found installed on computers on which you would otherwise not find a single game. So, what are the minimum graphics requirements for a playable Sims experience? To find out, we benchmarked the camera flyby that occurs when you select the Pleasantview neighborhood. We used FRAPS to measure the average frame rate throughout the sequence.
At 800x600, there's once again one clear winner here: the Radeon X700, by a huge margin (42%) over even the GeForce 6600. The GeForce 6600 is the distant 2nd place performer, and there's a huge clump of cards that perform similarly to the Radeon X600, with the 64-bit X300SE coming in last. Interestingly enough, even the slowest X300SE manages to play the game reasonably well at 800x600 with the highest detail settings possible.
For more, let's look at the resolution scaling graph:
Notes from the Lab
ATI X300: The X300 offers performance very similar to that of the X600 Pro and the GeForce 6200. The game is not totally smooth, but is definitely playable at 800x600. There is a significant amount of aliasing at 800x600, but without a faster card, there's little you can do about it.
ATI X300SE: There is a noticeable performance difference between the X300 and X300SE, yet even the X300SE can play the game reasonably well at 800x600. If you turn down the detail settings, the performance improves dramatically.
ATI X600 Pro: Although the X600 Pro performs similarly to the GeForce 6200 and 6600, the frame rate is much more stable than either of those two. There's far less stuttering when scrolling around the game world.
ATI X700: The X700 continues to be much, much faster than the rest of the contenders here.
NVIDIA GeForce 6200/6600: Both the 6200 and 6600 exhibit stuttering issues under Sims 2, although the game is definitely playable using either.
Intel Integrated Graphics: Here's where performance truly matters for Intel graphics - in a game like The Sims 2. This is the type of game that will be played by people who don't come within 100 yards of Doom 3 and who, honestly, shouldn't need to spend even $100 on a video card to play a game like this. How does the 915G fare? It actually plays the Sims pretty well. There is some loss in image quality it seems (just detail), but it's actually not bad at all. If you're building a computer for someone who only plays the Sims, Intel's integrated graphics is actually all you need. 800x600 looks pretty bad, but luckily, the game is playable at 10x7. You may have to turn down the detail settings as there is a bit of stuttering at the highest settings.
Unreal Tournament 2004 Performance
UT2004 continues to be quite popular, so we take a look at how well the entry-level cards play Epic's latest game.
At 800x600, many of the cards appear to be CPU limited, with the exception of the GeForce 6200, X300 and X300SE (and, of course, the integrated graphics solution).
The performance is very similar between all of them because of this CPU limitation, so let's step back and see what the whole picture tells us:
The X700 continues to dominate performance, but here, its advantage mostly means that you can play at higher resolutions more than anything else. The 6600 and X600 Pro actually perform quite similarly, as do the 6200 and X300, which isn't too good for NVIDIA.
Notes from the Lab
ATI X300: The added memory bandwidth really helps. It's definitely a noticeable improvement in performance over the X300SE. Interestingly enough, the X300 is basically as fast as the 6200 here, with a higher core clock and less memory bandwidth.
ATI X300SE: Not obscenely fast, but the card will play UT.
ATI X600 Pro: Visual quality, again, looks similar to NVIDIA. Performance at lower resolutions is CPU limited and competitive with NVIDIA. At 800x600, the X600 manages to stay ahead of the game, where NVIDIA falls behind a bit with the 6200. The game locked up switching resolutions once. It is interesting that average frame rates are actually higher in Doom 3 than they are in UT2004. It looks like Doom 3 is a much more peaky game, with more peaks and dips than UT2004, which offers a more stable frame rate. A quick check with FRAPS reveals what we had suspected. Although both UT2004 and Doom 3 have a minimum frame rate around 30 fps with the X600, Doom 3 peaks at about 150fps while UT does so at 113fps. Doom 3 peaks a full 30% higher than UT, despite the fact that the average frame rates are the same. Performance of the X600 is very strong; it's better matched for the 6600 than the 6200, despite NVIDIA's marketing.
ATI X700: At lower resolutions, it's the same speed as the X600. Only when you get past 1024 does it really separate itself.
NVIDIA GeForce 6200: Anything below 10x7 is a bit too aliased, but 10x7 seems to play well and look great on the 6200, despite what the average framerate may indicate. Launching the timedemo while still in the video settings screen caused UT to GPF (General Protection Fault).
NVIDIA GeForce 6600: It's tough to tell the difference between the 6600 and the 6200 at lower resolutions. The 6600 gives you the ability to play at up to 10x7 with no real drop in frame rate, but the 6200 does work well for the beginning/casual gamer.
Intel Integrated Graphics: The game is playable at 800x600 with the integrated graphics solution. You have to turn down some detail settings to get better responsiveness, though. It can work as a platform to introduce someone to UT2004.
Battlefield Vietnam Performance
The Battlefield series continues to be popular; with no built-in benchmark, we're forced to develop one by starting a multi-player game in the Quang Tri 1972 level and manually navigating through a path, using FRAPS to record the average frame rate.
We kept all of the settings at high except for the graphics detail slider, which we had to keep at low in order to provide a true "apples to apples" comparison between all of the cards here. The standings will remain the same at higher detail settings; the frame rates will simply be lower.
The X700 continues to dominate performance here, outperforming the GeForce 6600 by 32%. In the lower end of things, the X600 Pro and the GeForce 6200 are pretty much neck and neck, with the X300 clearly falling behind.
The rest of the resolution picture isn't much different from what we've already shown you:
Notes from the Lab
ATI X600 Pro: The mouse isn't as laggy, but still laggy enough, especially at 12x10.
ATI X700: The X700 continues to dominate performance here as well.
NVIDIA GeForce 6200: The mouse speed is erratic in the menus before you enter a game. The card isn't what we'd call "fast", but it is smooth at 800x600.
NVIDIA GeForce 6600: Performance is much better than the 6200, though the mouse still lags in the menu for some reason. This card plays the game nice and fast.
Intel Integrated Graphics: At 800x600, it actually plays and looks pretty decent. Graphics quality always defaults to lowest for some reason. The mouse is surprisingly not as laggy as on the NV cards.
Halo Performance
Although Halo 2 is due out next month on the Xbox, the only Halo action that PC gamers will be getting is from the currently available Halo PC title. It is still pretty fun and well-played, thanks to its multi-player modes, and thus, we still use it as a benchmark. We ran the built-in timedemo with the -use20 switch to force Pixel Shader 2.0 shaders when possible.
The standings are similar in Halo to what we've already seen, with the X700 in the lead and the 6200 basically performing as well as the X600 Pro.
We see that at 640x480, and even a little bit at 800x600, there are some elements of CPU limitation, but by the time you're at 1024x768, most of the CPU limitations have faded away. Once again, we see that regardless of resolution, the GeForce 6200 and Radeon X600 Pro perform identically to one another.
Notes from the Lab
Intel Integrated Graphics: We were greeted with another error when attempting to run the Intel benchmarks:
Luckily, Halo let us continue and the card actually worked reasonably well. The image quality was clearly inferior to everything else - especially in texture filtering quality; the amount of shimmering in the game was incredible. But if you ask, "does it work?" and "can I play the game?", then the answer is "Yes." It makes Halo look like a game from 1999, but it works. The reason for the warning is that the integrated graphics has no built-in vertex processing, which Halo requires in order to run. But with a fast enough CPU, you should be fine, which is what Intel was going for with their integrated graphics architecture to begin with.
FarCry Performance
For our final test in this suite, we have FarCry 1.2. Once again, we see the X700 clearly on top, but this time, the X600 Pro is actually noticeably faster than the GeForce 6200, which offers performance closer to that of the X300.
The resolution scaling graph tells a similar story:
Anti-Aliasing Performance
Although the GeForce 6200 lacks the color and z-compression support that the 6600 has, we've already seen that it is not a fast enough card to run at the high resolutions that would benefit from those features with AA enabled.
We also see from the following benchmarks that the loss of color and z-compression does not negatively impact anti-aliasing performance at playable resolutions with the GeForce 6200. For the sake of simplicity, we left off cards that were simply too slow.
Under Doom 3, the move to 2X AA is a huge hit on all cards. There is a slight difference in the performance penalty between the 6200 and 6600 here. With 2X AA enabled, the GeForce 6600 offers around 79% of its original performance, while the 6200 offers only 75%. It's not a huge difference, which leads us to believe that NVIDIA made the right tradeoff in removing color and z-compression from the 6200 series. It honestly won't matter to most users.
Under UT2004, the impact from AA is much less pronounced as even the GeForce 6200 is still fairly CPU bound even with 2X AA enabled. The 6200 does take a bigger hit from enabling AA (89% of its original performance in 2X mode) when compared to the 6600 (93% of its original performance in 2X mode), but again, it's nothing dramatic.
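For clarity, the "percent of original performance" figures above are simply the AA score divided by the no-AA score. A minimal sketch, with placeholder frame rates rather than the exact numbers from our charts:

```python
# How the "percent of original performance" figures are derived: the frame rate
# with AA enabled divided by the frame rate without AA. The numbers below are
# placeholders for illustration, not the exact results from our charts.

def retention_percent(no_aa_fps, aa_fps):
    return aa_fps / no_aa_fps * 100

# e.g. a card dropping from 60 fps to 45 fps with 2X AA retains 75% of its performance
print(f"{retention_percent(60, 45):.0f}% of original performance")
```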
Anti-Aliasing Image Quality
Although we noticed no differences in image quality, we wanted to provide some screenshots to compare the AA modes that both ATI and NVIDIA offer. With the GeForce 6 series, NVIDIA finally offers a rotated grid AA solution, which is also present in the GeForce 6200.
The default images below are the ATI images; mouse-over to see the NVIDIA images:
There are slight differences between the two AA methods, but the differences aren't too noticeable in actual gameplay. Both ATI and NVIDIA do a good job here. You can download high resolution screens of the scene that we tested here.
Anisotropic Filtering Performance
As ATI and NVIDIA are now much more similar in their approach to anisotropic filtering, it's no surprise to see that neither company takes a significant hit from enabling it. There is an initial hit when 2x is enabled, but above and beyond that, the cost is fairly minimal.
Anisotropic Filtering Image Quality
To look at anisotropic filtering image quality, we dropped the guns and looked at the floor for a while. The default images are taken from ATI's drivers; hold your mouse over them to view the images on an NVIDIA card.
16X Anisotropic Filtering
Hold your mouse over for the NVIDIA image.
This particular scene is perfect for looking at anisotropic filtering quality, as you can see the separation between the individual tiles towards the back of the scene when anisotropic filtering is enabled.
Again, there are no huge differences between ATI and NVIDIA here.
Final Words
Who knew that there would be so much to talk about surrounding the launch of a new entry-level part? Thankfully, the conclusion is a little more concise.
If all of the cards in this review actually stick to their MSRPs, then the clear recommendation would be the $149 ATI Radeon X700. In every single game outside of Doom 3, the X700 does extremely well, putting even the GeForce 6600 to shame; and in Doom 3, the card holds its own against the 6600. Unfortunately, with the X700 still not out on the streets, it's tough to say what sort of prices it will command. For example, the GeForce 6600 is supposed to have a street price of $149, but currently, it's selling for closer to $170. So, as the pricing changes, so does our recommendation.
In most cases, the GeForce 6200 does significantly outperform the X300 and X600 Pro, its target competitors from ATI. The X300 is priced significantly lower than the 6200's $129 - $149 range, so it should be outperformed by the 6200, and it is. The X600 Pro is a bit more price-competitive with the GeForce 6200, and it offers equal and even greater performance in certain cases.
However, we end up back at square one. In order for the 6200 to be truly successful, it needs to either hit well below its $129 - $149 price range, or ATI's X700 needs to be much more expensive than $149. In the latter case, if the X700 is out of your budget, then the 6200 is a reasonable option, but in the former case, you can't beat the X700. Given that neither of the cards we're debating is even out yet, anything said right now would be pure speculation. But keep an eye on the retailers. When these cards do hit the streets, you should be able to tell what the right decision is.