Final Words

We had no problems expressing our disappointment with NVIDIA over the lackluster performance of their 8600 series. After AMD's introduction of the 2900 XT, we held out some hope that they would capitalize on the huge gap NVIDIA left between their sub-$200 parts and their higher end hardware. Unfortunately, that has not happened.

In fact, AMD went the other way and released hardware that performs consistently worse than NVIDIA's competing offerings. The only game that shows AMD hardware leading NVIDIA is Rainbow Six: Vegas. Beyond that, our 4xAA tests show that the mainstream Radeon HD lineup, which already lags in performance, scales even worse than NVIDIA's hardware when antialiasing is enabled. Not that we really expect most people with this level of hardware to enable 4xAA, but it's still a disappointment.

Usually it's easier to review hardware that is clearly better or worse than its competitor, but under the tests we ran this case is difficult. We want to paint an accurate picture here, but it has become nearly impossible to speak negatively enough about the AMD Radeon HD 2000 series without sounding comically absurd.

Even with day-before-launch price adjustments, there is just no question that, in the applications the majority of people will be running, AMD has created a series of products that are even more unimpressive than NVIDIA's already less-than-stellar 8600 lineup.

While we will certainly concede that video decode capability may be a saving grace in some applications, the majority of end users are not saving their money for a DX10-class video card in order to play movies on their PC. For those who really are interested in this, stay tuned for an article comparing UVD and PureVideo coming next week.

We also won't have data on the performance of these cards under DX10 until next week. DX10 could make a difference, but even then we won't have the full picture: these first DX10 games are more like DX9 titles running on a different API. That is certainly a valid way to use DX10, but we will likely see more intense and demanding uses of the API once developers start targeting its new features as a baseline.

All we can do at this point is lament the sad state of affordable next-generation graphics cards and wait until someone at NVIDIA and AMD gets the memo that their customers would actually like to see better performance, or at the very least performance that consistently matches previous-generation hardware. For now, midrange DX10 remains MIA.

96 Comments

  • valnar - Friday, June 29, 2007 - link

    Maybe I'm the opposite of most people here, but I'm glad ATI/AMD and Nvidia both produced mid-range cards that suck. Maybe we will finally get the game developers to slow down and produce tighter code, or not waste GPU/CPU cycles on eye candy and actually produce better gameplay. While I understand that most game companies write games that play acceptably on the $400 flagship video cards, I for one am not one of those people. It's not that I can't afford to buy a $400 card once in a while - it's having to spend that every year that ticks me off. I'd much rather upgrade my card every year to keep up with the times if said card was $120.
  • titan7 - Saturday, June 30, 2007 - link

    Most game developers already do that. If you don't have the power to run the shaders and enable d3d10 features, you can run in d3d9 mode. If your card still doesn't have the power for that, you can run in pixel shader 1 mode.

    Take a game like Half-Life 2, for example. With everything turned up it was too much for most high end cards when it shipped, but you can turn it down so it looks like a typical d3d8 or d3d7 era game and play it just fine on your old hardware.

    If you're hoping that developers can somehow make things run just as well on a Pentium III as on a Core 2 Duo, you're hoping for the impossible. The 2600 only has about 1/3 the processing power of a 2900. The 2400 has about 10% of the power! Think about underclocking your CPU to 10% of its speed and seeing how your applications run ;)

    Thank goodness we can disable features.
  • DerekWilson - Friday, June 29, 2007 - link

    PLEASE READ THE UPDATE AT THE BOTTOM OF PAGE 1

    I would like to apologize for not catching this before publication, but the 8600 GTS we used had a slight overclock, resulting in about 5% higher performance across the board than we should have seen.

    We have re-run the tests on our stock-clocked 8600 GTS and edited all the graphs in our article to reflect the changes. The overall results were not significantly affected, but we are very interested in being as fair and accurate as possible.

    We have also added idle and load power numbers to The Test page.

    Again, I am very sorry for the error, and I will do my best to make sure this doesn't happen again.
  • coldpower27 - Friday, June 29, 2007 - link

    Meh, thanks Derek, but if you already have factory overclocked results it would be nice to leave them in, since they're fair game if Nvidia's partners are selling them in those configurations. This is of course in addition to the Nvidia reference clock rates.
  • DerekWilson - Friday, June 29, 2007 - link

    the issue is that overclocked 8600 GTS parts generally go for closer to $200, putting them well out of the price range the 2600 XT is expected to hit.

    it's not a fair comparison to make at this point (we expect the 2600 XT to come in at <= $150 anyway).
  • coldpower27 - Saturday, June 30, 2007 - link

    Yeah, but you have all kinds of GPUs on the chart anyway, from many different price points; the 7950 GT is not close to $150 either, and neither is the 8800 GTS 320.

    I think people would be quite aware that the factory OC cards, if included, are indeed priced higher, but if you already have the results, leave them in alongside the Nvidia reference clock designs.
  • dm0r - Friday, June 29, 2007 - link

    And please, keep us informed about performance with new drivers, because I'm really interested in midrange video cards :)
  • harpoon84 - Friday, June 29, 2007 - link

    For God's sake, isn't it obvious by now that running DX10 games on these cards will result in LOWER performance, not HIGHER? If you are averaging 30fps @ 1280x1024 in DX9 games, it's only gonna get worse in DX10!

    http://www.extremetech.com/article2/0,1697,2151677...

    Company of Heroes DX10 - SINGLE DIGIT FRAMERATES!!!

    Yes, the 2600 cards are twice as fast as the 8600 cards, but we are talking totally unplayable framerates of 5-9 FPS!

    Yeah, designed for DX10 alright! /SARCASM

    Where's TA152H now, huh?
  • frostyrox - Friday, June 29, 2007 - link

    To TA152H:
    Hi. It should've been painfully obvious about 10 comments ago that nobody here agrees with or, well, understands anything you're saying. Can you please stop commenting on hardware articles when you don't know what you're talking about? To say that DX9 benchmarks aren't important or, heck, not the most important aspect of these cards makes 0 sense. These cards might be DX10 capable, but they obviously haven't been given the hardware or raw horsepower to even handle DX9 (even at common resolutions). It also makes 0 sense whatsoever to suggest these cards will somehow magically perform drastically differently in pure DX10 games.

    Furthermore...
    How does it make any sense for AMD/ATI to have a card that's over 400 dollars (2900xt) and trades blows with the 320 and 640mb 8800gts (which are cheaper), but then have nothing between that card and the heavily hardware-castrated x2600xt at 150 dollars (a ~250 dollar difference)? Also consider that Nvidia's DX10 mid- and low-end cards have matured, so even if you were frugal and wanted the cheapest possible clownshoes videocard around, you should just go Nvidia. Don't even bother calling me a fanboy unless you feel like making me laugh. I currently own a Radeon x800xl. I'm just being honest. It's about time the rest of you do the same. Rant over.
  • GlassHouse69 - Thursday, June 28, 2007 - link

    Crossfire always gets better results than SLI.... two 2600XT cards in Crossfire would take MUCH less wattage than any other option around for their framerate. At least some Asian websites are noticing.
