More From CeBIT: New Mobile Parts

Both NVIDIA and AMD are announcing new mobile GPUs today, though unfortunately we weren't able to get any notebooks with these new GPUs in for testing before writing this up.

NVIDIA's move parallels what's happening on the desktop: the newest additions to the mobile lineup are 55nm G92-based parts with names in the new style NVIDIA has chosen. In fact, the entire lineup of 9xxxM series parts is being replaced by parts with new names. This is more expected on the mobile side, as we usually see much more lag in this space than on the desktop.

As for the specifics, the new parts are top-of-the-line models. The GTX 280M will be up to 50% faster than the 9800M GTX, which is nice in theory, but final performance will still be up to notebook makers, who set clocks on a per-notebook basis to fit their power budgets. The GTX 260M is one step down from the 280M: it has 112 SPs enabled (like the original G92 introduced as the 8800 GT) and lower maximum clock speeds.

These two high-end GTX parts replace the top-end 9800M parts, and subbing in for the 9800M GS is the GTS 160M, which should also offer improved performance, although we didn't get full specifications on this part. Rounding out the bottom of the lineup are the GT 130M and the G 110M.

On the AMD front, we see something a little more intriguing in the form of the first 40nm GPUs in the mobile space. Smaller die sizes, lower power draw, and better performance are promised, though AMD's general naming scheme will stay the same. The new 40nm 4800 series parts can be paired with DDR3, GDDR3, or GDDR5; the choice is up to the notebook maker. AMD touts the fact that the new process gives it about double the processing power in the same die area, which will only benefit the company going forward.
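As a rough sanity check on that density claim, ideal process scaling from 55nm to 40nm shrinks both die dimensions linearly, so transistor density grows with the square of the node ratio. Real designs rarely achieve the ideal, so treat this as a back-of-the-envelope sketch rather than a statement about AMD's actual implementation:

```python
# Rough die-area scaling from a 55nm to a 40nm process.
# Ideal scaling assumes both dimensions shrink linearly with the
# process node, so transistor density grows with the square of
# the node ratio. Real-world scaling is usually somewhat less.
old_node, new_node = 55.0, 40.0  # process nodes in nm
density_gain = (old_node / new_node) ** 2
print(f"Ideal density gain: {density_gain:.2f}x")  # prints ~1.89x
```

That ideal ~1.89x figure is close enough to "about double" to make AMD's claim plausible on the face of it, even before accounting for any architectural changes.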

NVIDIA paints the GDDR5 option as overkill, but we really won't know about performance of either the new NVIDIA or AMD parts until we have hardware to test.

The relative performance graphs NVIDIA and AMD supplied are nearly useless for sorting out how these parts should compare to each other, so we'll have to save the head-to-head for a time when we have hardware in our hands. 40nm could be a big plus for AMD, but remember that NVIDIA has made the first move in making mobile drivers available from its web site. The value of that is very high, as notebook OEMs tend not to update their drivers very often. Sure, it's possible to hack desktop drivers onto a mobile part, but it's a supreme headache, and we hope AMD will soon follow in NVIDIA's footsteps here.

Back to the Tests at Hand

Now that we've covered all the announcements and introductory material, let's get to testing the hardware we've got in our hot little hands.

We got our card just a couple of days ago, so we haven't had time to test everything, and since we only received one card we haven't been able to test SLI with the 1GB version. Given more time, we would also have included 1280x1024 in our benchmarks. It's a very important resolution for this class of hardware, but 1680x1050 should be a good enough indicator of relative performance in most cases that this won't matter too much.

Our comparisons will be a little lopsided, though. We've got two each (for single and dual configurations) of the 512MB 4850 and the 512MB GTS 250 (the 9800 GTX+). These comparisons we can do, and it's nice and neat as both parts are now set at $130 (cutting recent street prices by about $15). We do have a GTS 250 1GB, but we don't have a 1GB 4850 to compare it to. On the flip side, since we've only got one GTS 250 1GB, we can't compare GTS 250 1GB SLI to the 4850 X2 2GB we have.

The test setup hasn't changed for this article, except that we've had to use the 182.08 drivers for the GTS 250 1GB.

Test Setup
CPU: Intel Core i7-965 3.2GHz
Motherboard: ASUS Rampage II Extreme X58
Video Cards:
  Sapphire ATI Radeon HD 4850 X2 2GB
  ATI Radeon HD 4870 512MB CrossFire
  ATI Radeon HD 4850 CrossFire
  ATI Radeon HD 4870 512MB
  ATI Radeon HD 4850
  NVIDIA GeForce GTX 260 SLI
  NVIDIA GeForce 9800 GTX+ SLI
  NVIDIA GeForce GTX 260 Core 216
  NVIDIA GeForce GTS 250 1GB
  NVIDIA GeForce 9800 GTX+
Video Drivers:
  Catalyst 8.12 hotfix
  ForceWare 181.22
Hard Drive: Intel X25-M 80GB SSD
RAM: 6 x 1GB DDR3-1066 7-7-7-20
Operating System: Windows Vista Ultimate 64-bit SP1
PSU: PC Power & Cooling Turbo Cool 1200W
103 Comments

  • Hrel - Thursday, April 9, 2009 - link

    You should specify when you're being sarcastic and when you're being serious. Also, all that red rooster loon red camp green goblin crap simply doesn't make any sense and makes you sound like a tin-foil-hat-wearing crazy person. Just sayin', dude, lighten up. Do you work for Nvidia? Or do you just really hate AMD?

    Yes, they're both good cores, and yes, it'd be great if Nvidia used GDDR5, but they don't, so they don't get the performance boost from it; that's their fault too. And they DID make the GT200 core too big and expensive to produce; that's why the GTX 260 is now being sold at a loss, just to maintain market share.
  • Hrel - Wednesday, March 4, 2009 - link

    Oh, also... I almost forgot: you still didn't include 3D Mark scores:( PLEASE start including 3D Mark scores in your reviews.

    Also, I care WAY more about how these cards perform at 1440x900 and 1280x800 than I do about 2560x1600; I will NEVER have a monitor with a resolution that high. No point.

    It's just, I'm more interested in seeing what happens when a card that's on the border of playable at max settings gets the resolution turned down some, than what happens when the resolution gets turned up beyond what my monitor can even display.

    It's pretty simple really: more onboard RAM means the card won't insta-tank at resolutions above 1680x1050, but the percentage differences should be the same between the cards. Whereas, comparing a bunch of 512MB and 1GB cards at resolutions of 1680x1050 and lower, that extra RAM doesn't really matter, so all we're seeing is how powerful the cards are. That seems like a truer representation of the cards' performance to me.
  • Hrel - Wednesday, March 4, 2009 - link

    I really do mean to stop adding to this; just wanted to clarify.

    When I say that the extra RAM doesn't matter, I mean that the extra RAM isn't necessary just to run the game at your chosen resolution. Of course some newer games will take advantage of that extra RAM even at resolutions as low as 1280x800. I'd just rather see how the card performs in the game based on its capabilities, rather than seeing one card perform better than another simply because that "other" card doesn't have enough onboard RAM; which has NOTHING to do with how much rendering power the card has and only to do with onboard RAM.

    I think it would be good to just add a fourth resolution, 1280x800, to show what happens when the cards aren't being limited by their onboard RAM and are allowed to run the game to the best of their abilities, without superficial limitations. There, pretty sure I'm done. Please respond to at least some of this; it took me kind of a long time, relative to how long I normally spend writing comments.
  • SiliconDoc - Wednesday, March 18, 2009 - link

    Hmmm... you'd think you could bring yourself to apply that to the 4850 and the 4870, which have absolutely IDENTICAL CORES and only have a RAM DIFFERENCE.
    Yeah, one would think.
    I suppose the red fan feverish screeding "blocked" your immensely powerful mind from thinking of that.
  • Hrel - Saturday, March 21, 2009 - link

    What are you talking about?
  • Hrel - Wednesday, March 4, 2009 - link

    I'm excited about this; I was kind of wondering what Nvidia was going to do, considering GT200 costs too much to make and isn't significantly faster than the last generation; and I knew there couldn't be a whole new architecture yet, even Nvidia doesn't have that much money.
    However I'm excited because this is a 9800GTX+, still a very good performing part, made slightly smaller, more energy efficient and cooler running; not to mention offered at a lower price point! Yay, consumers win!(Why did Charlie at the Inquirer say it was MORE expensive but anandtech lists lower prices?) I really hope the 512MB version is shorter and only needs 1 PCI-E connector/lower power consumption; if not, that almost seems like intentional resistance to progress. However the extra RAM will be great now that the clocks are set right; and at $150, or less if rebates and bundles start being offered, that's a great deal.

    On the whole, Nvidia trying to essentially screw the reviewers... I guess I don't have much to say; I'm disappointed. But Nvidia has shown this type of behavior before; it's a shame, but it will only change with new company leadership.

    Anyway, from what I've read so far, it looks like the consumer is winning: prices are dropping, performance is increasing (before at an amazingly rapid rate, now at a crawl, but still increasing), power consumption is going down, and manufacturing processes are maturing... consumers win!
  • san1s - Wednesday, March 4, 2009 - link

    365? are you sure about that?
    "even when the 9800 was new... iirc the 4850 was already making it look bad"
    google radeon 4850 vs 9800 GTX+ and see the benchmarks... IMO the 9800 was making the brand new 4850 look bad
    "i'd doubt that anyone buying a 9800 today is planning to sli it later"
    what if they already have a 9800? much cheaper to get another one for sli than a new gtx 260
    "hahaha, less power useage relative to"
    read the article
    "name some mainstream cuda and physx uses"
    ever heard of badaboom? folding@home? mirror's edge?
    the gts 250 competes with the 4850, not 4870
    "continually confusing their most loyal customers "
    what's so confusing about reading a review and looking at the price?

    The GTS 250 makes perfect sense to me. Rather than spending $ on R&D for a downgraded GT200 (that would perform the same more or less), why not use an existing GPU whose performance falls between the designated 240 and 260?
    It's a no-win situation: option #1 would mean wasting money on something that won't perform better than an existing product that can probably be made cheaper (the G92b is much smaller), and option #2 will cause complaints from enthusiasts who are too lazy to read reviews.
    Which option looks better?
  • kx5500 - Thursday, March 5, 2009 - link


    Shut the *beep* up f aggot, before you get your face bashed in and cut
    to ribbons, and your throat slit.
  • SiliconDoc - Wednesday, March 18, 2009 - link

    I saw that more than once in Combat Arms - have you been playing too long on your computer?
  • rbfowler9lfc - Tuesday, March 3, 2009 - link

    Well, whatever it is, be it a rebadged this or that, it seems like it runs on par with the GTX260 in most of the tests. So if it's significantly cheaper than the GTX260, I'll take it, thanks.
