NVIDIA GeForce GTS 250: A Rebadged 9800 GTX+

by Derek Wilson on 3/3/2009 3:00 AM EST

103 Comments


  • mard - Wednesday, April 15, 2009 - link

    Just wondering if anybody knew if the Thermalright HR-03 GTX Rev.A
    would be compatible with the GTS250. If not, what would be another passive cooling option for this card?

    thanks
    Reply
  • Core Core - Monday, April 06, 2009 - link

    I'm glad this review was done, it really has given me more data on which card to buy. I hope it is updated with some more focus on people who have my set of concerns, see below...

    I want a newer HD Ready/DX10/Shader4 card, and it has to work in an SFF case. I have only one dual slot and one 2x6 video card power connector, so I want to choose one of the two 1GB cards from ATI/nVidia (250 vs. 4850).

    Low heat & power & noise are very important to me. I also think dual slot exhaust is needed in my case. Currently, I have a very hot, noisy power hog (ATI's X1900XTX) that I want to replace.

    An nVidia GTS250 or ATI 4850 is in my price range and roughly double the performance I have now. I am connecting to a very large HD Ready display, and I want to watch HD movies, game, and compute without problems.

    Your review did not cover the ATI 4850 1GB card or go into any detail on high-definition 1080p use; I would like a comparison and review of HDCP, 1080p playback, and clarity of displayed text on an HD Ready test system.

    I'm a total gamer; I watch heaps of HD anime, as well as compute & web browse.
    Reply
  • cactusdog - Sunday, March 22, 2009 - link


    SiliconDoc, you should see a doctor. Instead of blaming everybody else for Nvidia's poor standing in the eyes of the tech community, maybe you should look at why no one likes them....and your own bullying attitude should give you a clue.

    I've read a lot of "fanboy" comments but you take it to a new level. Psychofanboy would be more appropriate for you.

    Reply
  • SiliconDoc - Wednesday, April 08, 2009 - link

    LOL - At least this fella tells the truth, and yet you admit "your idea is the tech community hates them".
    Believe me, I know EXACTLY why, I've seen it all too often, no need for me to find out, mr cryptic with the EPIC FAIL.
    .
    Well, that leaves the SANE PERSON with the conclusion all the little red haters are LYING SACKS OF FUD AND CRAP, and they are near always blabbing out a lie for unfair red advantage, and THEREFORE - buying the nvidia card is the smart thing to do. The more they hate (with their endless stream of lies), the better the nvidia card really is.
    Now, if you don't have an actual counterpoint to the OTHER posts I've made, that destroyed and exposed the 6 months long plus red rooster fanboy fud parrot lines, why then you just go ahead and respond like you did above - because this one absolutely matches yours - PURE SPECULATION with nary a fact in it - just like you, you idiot.
    Reply
  • SiliconDoc - Wednesday, March 18, 2009 - link

    The problem is even on all the other computer hardware, a naming scheme NEVER tells the avid consumer which one is really better. Same with CARS. Same with kitchen blenders.
    The REALITY of the product only hits the interested public when word actually gets around...when people buy it, reviewers and Consumer Digest take a stab, Tv commenters blab they have one and it's great or it's in the shop - facebook or myspace spreads the news, someone tweets about it...
    THIS is the reality of our computer age !
    In other words, I'm sure everyone wants an easier way out, and wants it all perfectly suited to absolute fairness - but the FACT REMAINS, on EVERYTHING one purchases, without some information far more extensive than the pretty PR ad box and name gives you - YOU WILL EITHER GET SCREWED OR GET LUCKY. PERIOD.
    If you have a really keen eye and some awesome circus sense, you just might make the right call from sight, smell, cover and wording, and placement on the shelves - but then...
    you'd be a wondrous expert with a special gift that could be put to work for pay.
    Face REALITY.
    Reply
  • earthshaker87 - Monday, March 09, 2009 - link

    My 4850 runs better than what these benchmarks say. I recently tested my card in COD: WaW with FRAPS. I'm running XP SP3, C2D E8200 2.66, 2GB Kingston Value RAM, ASUS P5K-SE/EPU, MSI R4850 (ref clocks and cooler) and got an average FPS of 53.36 on the same settings as yours. Could it be that Windows XP is the difference? Reply
  • cbm - Friday, March 06, 2009 - link

    How about testing this on a system that people would actually own at this point in time? Reply
  • Hrel - Tuesday, March 10, 2009 - link

    They've said this before; they test on the highest-end system they can to try to remove all system limitations, so the only difference you're seeing in test results comes from the GPUs instead of the CPU, RAM or HDD. If they tested the GTS250 in SLI on a dual-core DDR2 system, the GPU would be limited by the system, so you wouldn't get accurate results comparing the cards. These articles aren't supposed to show you how the cards will perform in your system; they're just supposed to show you the difference between the cards. Reply
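The methodology described in this comment - using a top-end test bed so the GPU is the only variable - can be illustrated with a rough heuristic: if average FPS barely changes as resolution rises, the GPU isn't the limit. This is my own sketch with made-up numbers, not data from the review.

```python
# Rough heuristic for spotting a CPU bottleneck in GPU benchmarks:
# if average FPS barely changes as resolution (GPU load) goes up,
# something other than the GPU is the limit. All numbers are hypothetical.

def is_cpu_bound(fps_by_resolution, tolerance=0.10):
    """True if FPS varies less than `tolerance` (fraction) across resolutions."""
    rates = list(fps_by_resolution.values())
    spread = (max(rates) - min(rates)) / max(rates)
    return spread < tolerance

# Hypothetical results for one card on two different test beds
fast_testbed = {"1280x800": 140.0, "1680x1050": 101.0, "1920x1200": 78.0}
slow_testbed = {"1280x800": 62.0, "1680x1050": 61.0, "1920x1200": 58.0}

print(is_cpu_bound(fast_testbed))  # False: FPS scales with resolution, GPU-limited
print(is_cpu_bound(slow_testbed))  # True: FPS is flat, the rest of the system limits
```

On a high-end bed the FPS curve tracks resolution (GPU-limited, so card comparisons are meaningful); on a weaker bed the curve flattens and the numbers say more about the CPU than the card.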
  • SiliconDoc - Wednesday, March 18, 2009 - link

    In other words, with "our systems" with limited cpu, ram, hd's and motherboards, these results, especially at the enormous resolutions and excessive framerates, are really overkill and border on meaningless.
    They are meaningless to a large extent until games catch back up with the gpu's, or people catch up with the test beds and monitors.
    So when they, in these reviews, parse a few percentage framerate difference at the high rezz - on the high end rig, on the expensive 30" monitor, then screed out a winner, they are essentially DELUDED.
    It's a winner "for them" while they are at work, mind numbingly whacking away at the hundreds of runs... the few little frames that they have NO CLUE are any different even at high resolutions weren't it for fraps and the pretty yellow numbers on screen.
    Yes, it's a sad day, huh.
    Then, the raging wackos scream about the 1,2,3 maybe 10% difference on the supposedly "one to one" card comparisons - at resolutions and system powers they can only dream of.
    I think that makes it MORE THAN CLEAR that the added value is much more important - what comes with the card, a game, the adapters, the looks, cuda, physx, folding , video conversion, fan type - heat generation - and very important - drivers and stability.
    Well NVIDIA wins those, hands down (save the bundle in some cases). TWIMTBP - and plenty of reasons WHY.
    Reply
  • Hrel - Saturday, March 21, 2009 - link

    I don't necessarily disagree with anything you said, other than saying the tests are meaningless. But you seem to be block-headed and not want to listen, so here... repetition, yeah:
    These articles aren't supposed to show you how the cards will perform in your system, they're just supposed to show you the difference between the cards.
    These articles aren't supposed to show you how the cards will perform in your system, they're just supposed to show you the difference between the cards.
    These articles aren't supposed to show you how the cards will perform in your system, they're just supposed to show you the difference between the cards.
    These articles aren't supposed to show you how the cards will perform in your system, they're just supposed to show you the difference between the cards.

    These articles aren't supposed to show you how the cards will perform in your system, they're just supposed to show you the difference between the cards.
    These articles aren't supposed to show you how the cards will perform in your system, they're just supposed to show you the difference between the cards.
    These articles aren't supposed to show you how the cards will perform in your system, they're just supposed to show you the difference between the cards.
    These articles aren't supposed to show you how the cards will perform in your system, they're just supposed to show you the difference between the cards.
    These articles aren't supposed to show you how the cards will perform in your system, they're just supposed to show you the difference between the cards.
    These articles aren't supposed to show you how the cards will perform in your system, they're just supposed to show you the difference between the cards.
    These articles aren't supposed to show you how the cards will perform in your system, they're just supposed to show you the difference between the cards.
    Reply
  • SiliconDoc - Saturday, March 21, 2009 - link

    Thanks for going completely nutso (already knew you were anyway), and not having any real counterpoint to EVERYTHING I've said.
    Face the truth, and stop spamming.
    A two year old with a red diaper rash bottom can drool and scream.
    Epic fail.
    Reply
  • kx5500 - Thursday, March 05, 2009 - link


    Shut the *beep* up f aggot, before you get your face bashed in and cut
    to ribbons, and your throat slit.
    Reply
  • Baov - Thursday, March 05, 2009 - link

    Does this thing do HybridPower like the 9800GTX+? Will it completely power down? Reply
  • san1s - Wednesday, March 04, 2009 - link

    "365 wwwwwelll no but how old is the g92 regardless of die size.. g80?
    lol?"
    if you think all the changes that went from G80 to G92b were insignificant, then I guess you'll think that the difference between an Intel X6800 and an E0 stepping E8400 is meaningless too. I mean, they are both around 3 GHz, right? And they both say Core 2, so that means they're the same. /sarcasm off. I'm not going to continue with this any further - if you don't get it, then you never will. The GPU in the 9800 GTX+ was released last summer, over half a year ago, but not quite a year.

    "at all resoutions?"
    at all the resolutions that an educated person purchasing a midrange video card plays at. Midrange card = midrange monitor. You don't mix high-end with low-end or midrange components, as that will result in bottlenecking. Anyway, the difference between 8 and 12 FPS @ 2560x1600 is meaningless, as neither is playable anyway.

    "i wouldnt say $50 would stop me from getting a 260 it is at least a newer arch. or ahem a 1gb 4870.
    what if they do have a 9800/250... well if they look at the power #'s for sli in this article they'd definately reconsider"
    not everyone has the luxury to overshoot their budget on a single component by $50 and call it insignificant.

    "most people don't care enough to engage in this activity"
    lol. How would they ever get their custom-built PCs to work without knowing a bit of background info? Give a normal person a bunch of components and let's see how far they get without knowing anything about PCs. If you don't know your hardware you shouldn't be building computers anyway. I personally wouldn't go out and buy tires by myself, if I were to change them myself, without researching. I don't have a clue about tire sizes, and I sure as hell won't buy new tires without researching just because I don't care for that activity.

    "and what about option #3 buy ati?"
    That's not what I was talking about. Consumers should support all sides of the competition to drive prices down, not just ATI or just nvidia. What I meant was people blaming nvidia for their own mistakes. There is a gap in the current line of nvidia GPUs, and to fill it, what would be the best way while maintaining performance relative to the price and naming bracket?
    Reply
  • SiliconDoc - Wednesday, March 18, 2009 - link

    Good response, you aren't a fanboy, but the idiots can't tell. You put the slap on the little fanboys COMPLAINT.
    This is an article about the GTS250, and the whining little fanboy red wailers come on and whine and cry about it.
    To respond to their FUD and straighten out their kookball distorted lies IS NOT BEING A FANBOY.
    You did a good job trying to straighten out the poor rager's noggin.
    As for the other whiners agreeing "fan boys go away" - if they DON'T LIKE the comments, don't read 'em. They both added ZERO to the discussion, other than being the easy, lame, smart aleck smarmers that pretended to be above the fray, but dove into the gutter whining not about the review, but about fellow enthusiasts commenting on it - and I'm GLAD to join them.
    I hope "they go away" - and YOU, keep slapping the whiners about nvidia right where they need it - upside the yapping text lies and stupidity.
    Thank you for doing it, actually, I appreciate it.
    Reply
  • SiliconDoc - Wednesday, March 18, 2009 - link

    PS - as for the red fanboy that did the review, I guess he thought he was doing a "dual gpu review".
    I suppose "the point" of having all the massive dual gpu scores above the GTS250 - was to show "how lousy it is" - and to COVER UP the miserable failure of the 4850 against it.
    Keep the GTS250 at OR NEAR 'the bottom of every benchmark"...
    ( Well, now there's another hint as to why Derek gets DISSED when it comes to getting NVidia cards from Nvidia - his bias is the same and WORSE than the red fan boy commenters - AND NVIDIA KNOWS IT AS WELL AS I DO.)
    Thanks for the "dual gpu's review".
    Reply
  • Totally - Thursday, March 05, 2009 - link

    Dear fanboys,

    Go away.

    Love,

    Totally
    Reply
  • Hxx - Thursday, March 05, 2009 - link

    lol best post gj Totally

    Seriously, 3 main steps to buying the right card:

    1. look at benchmarks
    2. buy the cheapest card with playable fps in the games you play
    3. don't think it's future-proof - none of them are.
    Reply
  • Mikey - Wednesday, March 04, 2009 - link

    Is this even worth the money? In terms of value, would the 4870 be the one to get?

    Reply
  • Nfarce - Wednesday, March 04, 2009 - link

    Mikey, the 4870 is the way to go in just about all scenarios. Search AnandTech's report from last fall on the 4870 1GB under the Video/Graphics section. The GeForce 260/216 still costs more and performs lower. Normally I'm an Nvidia fanboy, but in this segment where I'm purchasing, it's ATI/AMD hands down, no questions asked. Reply
  • SiliconDoc - Wednesday, March 18, 2009 - link

    The cheapest 4870 1GB at the egg right now is $194.99 + shipping, and they go up well over $200 from there -

    The cheapest GTX260/216 at the egg right now is $179.99 + shipping.
    __________________________________-


    Now let's look further - in order ! (second # after rebate)
    4870 1g
    199.99
    199.99/169.99
    199.99/179.99
    214.99/194.99
    234.99/209.99
    239.99/214.99

    GTX260/216
    189.99/159.99
    208.99/189.99
    212.99/177.99
    229.99/199.99
    232.99/197.99
    234.99/214.99

    _______________________________

    Oh well, another red fantastical lie exploded all over the place, AGAIN.
    Reply
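The two newegg price lists quoted in the comment above can be compared mechanically. This is a sketch using the after-rebate figures as quoted (where a second, post-rebate number was given, that lower number is used); prices were obviously a snapshot of that moment.

```python
# After-rebate prices as quoted in the comment above (lower figure
# used where a rebate price was listed). Snapshot data, not current.
hd4870_1gb = [199.99, 169.99, 179.99, 194.99, 209.99, 214.99]
gtx260_216 = [159.99, 189.99, 177.99, 199.99, 197.99, 214.99]

print(f"cheapest 4870 1GB:   ${min(hd4870_1gb):.2f}")
print(f"cheapest GTX260/216: ${min(gtx260_216):.2f}")
print(f"difference:          ${min(hd4870_1gb) - min(gtx260_216):.2f}")  # $10.00
```

By these quoted numbers the cheapest after-rebate GTX260/216 undercuts the cheapest after-rebate 4870 1GB by about ten dollars, which is the gap the commenter is arguing over.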
  • Griswold - Wednesday, March 04, 2009 - link

    It goes like this:

    8800GTS 512 -> 9800GTX(+) -> GTS250

    Weak, nvidia...
    Reply
  • Nfarce - Wednesday, March 04, 2009 - link

    When Uncle Sam gives me some of my money back in a few weeks, it will be spent on a mid-range i7 build. For months I debated two things in my GPU build spec: the less headaches of going Nvidia but paying more for less performance vs. ATI's more driver/support headaches but paying less for more (or in a few cases generally equal) performance. To this day there are a lot of Catalyst issues, especially in Crossfire. Even so, articles like this have helped push me over to a first time ATI/AMD GPU buyer. :) Reply
  • earthshaker87 - Monday, March 09, 2009 - link

    Dude, just stick to a single-GPU setup. I've had 4 cards from ATi now: 9550, X800GT, HD3850, HD4850. None of them gave me headaches at all. I think the drivers are working just fine for me. No one needs 2 GPUs; it's a stupid buy really... you pay double for, most of the time, not double the performance, and get issues with it. Why do you need it when you can buy a perfectly capable single Radeon 4850 for dirt cheap, or if you've got more cash, get a GTX285, the top single-GPU card, with no problems, headaches or inconsistent FPS? Multi-GPU solutions are just not perfect yet... Reply
  • Frallan - Wednesday, March 04, 2009 - link

    Please include the 4830 in some tests in the future - I'm not personally interested, but 2 or 3 of my friends and family have asked and I honestly don't know what to say. A 4830 is about 1k SEK in Sweden and a 4850 is around 1.4k (+40%) (also a Gigabyte 4850 with the Zalman cooler is 1.6k SEK *sigh*).

    For me this segment is getting more important, as almost all people I know want dedicated graphics without splurging for the best.

    Reply
  • frozentundra123456 - Wednesday, March 04, 2009 - link

    In a way this just shows how strong the last generation of nvidia cards was, in that they can still compete with AMD. I definitely think the AMD naming scheme is much more straightforward (honest) than that of nvidia, though. I have more of a problem with nvidia renaming a weak card with the latest model numbers, such as the 8600GT, which became the 9500GT, which is now the GT120 or something. Someone who is not informed could easily think this is a high-performance part due to the new model number, which it is not.
    What we really need is a benchmark of some sort to give relative performance, like the Windows Experience Index. That benchmark is really not useful now because even a midrange card rates the max in the Windows Experience Index. Granted, the relative performance varies from game to game, but some sort of performance index would give a way to measure relative overall performance.
    Reply
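One standard way to build the overall index this comment asks for (my sketch, not something the site does) is a geometric mean of per-game FPS ratios against a baseline card, the same style of summary metric SPEC-like suites use. Game names and FPS numbers below are invented.

```python
from math import prod

def perf_index(card_fps, baseline_fps):
    """Geometric mean of per-game speedups vs. a baseline card.
    1.0 means 'performs like the baseline'; 1.2 means ~20% faster overall."""
    ratios = [card_fps[g] / baseline_fps[g] for g in baseline_fps]
    return prod(ratios) ** (1.0 / len(ratios))

# Hypothetical average-FPS results across three games
baseline = {"game_a": 50.0, "game_b": 40.0, "game_c": 80.0}
card     = {"game_a": 60.0, "game_b": 44.0, "game_c": 96.0}

print(round(perf_index(card, baseline), 3))  # prints 1.166
```

The geometric mean is the usual choice here because it treats a 20% gain in a slow game and a 20% gain in a fast game equally, instead of letting high-FPS titles dominate the average.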
  • Hrel - Thursday, March 05, 2009 - link

    The test you're looking for is called 3DMark, and I keep messaging them asking them to include that test in their articles. Come on, join me in messaging them every day till they start to include that test! Reply
  • Adjudicator - Wednesday, March 04, 2009 - link

    Although the 1 GB version of the GTS 250 looks "further refined" (shorter card length and requiring only one 6-pin connector instead of two), it is practically the same card as the 1 GB version of the 9800 GTX+ sold by eVGA.

    http://www.evga.com/products/moreInfo.asp?pn=01G-P...


    This shows that the "new" reference design was not really new after all; this design was already in existence before NVIDIA announced the release of the GTS250.

    To those who enquire if there will be a 512 MB version of the GTS 250 that needs only one 6 pin:

    eVGA had already released a 9800 GTX+ 512 MB that uses the refined short PCB and one 6-pin connector:

    http://www.evga.com/products/moreInfo.asp?pn=512-P...


    Even Gigabyte had released a 1 GB version of the 9800 GTX+ on a shortened PCB with one 6-pin connector, although it uses a non-reference cooling solution:

    http://www.gigabyte.com.tw/Products/VGA/Products_O...


    After all this rebadging of the G92b, I will not be surprised if NVIDIA's next move will be to release a 9800+ GX2 / GTS 250 GX2 rebranded as the GTS 255.


    Reply
  • SiliconDoc - Wednesday, March 18, 2009 - link

    I wonder if nvidia heard all the constant ragging women nagging endlessly about the names of their cards, and finally decided to line them up in the 100-200 etc. nomenclature....
    And now, the bleeding, edgy, old, wrinkled, crybaby know it alls that demanded a proper naming scheme are getting the new name lineup and the very first thing they do is forget they are the ones that demanded it be done, and they whip out a supergigantic tampon and fill it full up to overflowing.
    There's not much blood left, you're all white as ghosts, in fact, you've been zombies for quite some time now.
    I hope you're enjoying it.
    Reply
  • XiZeL - Wednesday, March 04, 2009 - link

    FAIL!!! by nVidia Reply
  • sbuckler - Wednesday, March 04, 2009 - link

    I don't understand the hate. They rebranded, but more importantly they dropped the price too. This forced ATI to drop the price of the 4850 and 4870. That's a straight win for the consumer - whether you want ATI or Nvidia in your machine. Reply
  • SiliconDoc - Wednesday, March 18, 2009 - link

    Oh, now stop that silliness ! Everyone worthy knows only ati drops prices and causes the evil green beast to careen from another fatal blow. ( the evil beast has more than one life, of course - the death blow has been delivered by the sainted ati many times, there's even a shrine erected as proof ).
    Besides, increasing memory, creating a better core rollout, redoing the pcb for better efficiency and pricing, THAT ALL SUCKS - because the evil green beast sucks, ok ?
    Now follow the pack over the edge of the cliff into total and permanent darkness, please. You know when it's dark, red looks black, yes, isn't that cool ? Ha ! ati wins again ! /sarc
    Reply
  • Hrel - Wednesday, March 04, 2009 - link

    I can't wait to read your articles on the new mobile GPUs, and I'm REALLY looking forward to a comparison between 1GB 4850 and GTS250 cards, as well as a comparison between the new design for the GTS250 512MB and the HD4850 512MB.

    It seems to me, if Nvidia wanted to do right by their customers, that they'd just scrap the 1GB GTS250 and offer the GTX260 Core216 at the $150 price point, it has a little less RAM so there's a little savings for them there. But then, that's if they wanted to do the right thing for their customers.

    It's about time they introduced some new mobile GPUs; I hope power consumption and price go down as performance goes up!

    I look forward to AMD releasing a new GPU architecture that uses significantly less power, like the GT200 series cards do. 40nm should help with that a bit though.

    Finally, a small rant: When you think about it, we really haven't seen a new GPU architecture from Nvidia since the G80. I mean, the G90 and G92 are just derivatives of that and they only offer marginally better performance on their own; if you disregard the smaller manufacturing process the prices should even be similar at release. Then even the GT200 series cards, while making great gains in power efficiency, are still based on G92 and STILL only offer marginally better performance than the G92 parts; and worse, they cost a lot to make so they're overpriced for what they offer in performance. I sincerely hope that by the end of this year there has been an official press release and at least review samples sent out of completely new architectures from both AMD and Nvidia. Of course it'd be even better if those parts were released to market some time around November. Those are my thoughts anyway; congrats to you if you actually read through all of this:)
    Reply
  • SiliconDoc - Wednesday, March 18, 2009 - link

    " It seems to me, if Nvidia wanted to do right by their customers, that they'd just scrap the 1GB GTS250 and offer the GTX260 Core216 at the $150 price point, it has a little less RAM so there's a little savings for them there. But then, that's if they wanted to do the right thing for their customers. "
    _________________

    So, they should just price their cards the way you want them to, with their stock in the tank, to satisfy your need to destroy them ?
    Have fun, it would be the LAST nvidia card you could ever purchase. "the right thing for you" - WHAT EVER YOU WANT.
    Man, it's just amazing.
    Get on the governing board and protect the shareholders with your scheme, would you fella ?
    Reply
  • Hrel - Saturday, March 21, 2009 - link

    Hey, I know they can't do that. But that's their fault too; they made the GT200 die TOO BIG. I'm just saying, in order for them to compete in the marketplace, that's what they'd have to do. I DO want them to still make a profit, 'cause I wanna keep buying their GPUs. It's just that compared to the next card down, that's what the GTX260 is worth, 'cause it's just BARELY faster; maybe 160. But that's their fault too. The GT200 die is probably the WORST Nvidia GPU die EVER made, from a business AND performance standpoint. Reply
  • SiliconDoc - Saturday, March 21, 2009 - link

    PS - you do know you're insane, don't you ? The " GT200 is the probably the worst die from a performance standpoint."
    Yes, you're a red loon rooster freak wacko.
    Reply
  • Hrel - Thursday, April 09, 2009 - link

    you left out Business standpoint, so I guess you at least concede that GT200 die is bad for business. Reply
  • SiliconDoc - Saturday, March 21, 2009 - link

    Now you claim you know, and now you ADMIT there is no place for it if they did, anyhow. Imagine that, but "you know" - even after all your BLABBERING to the contrary.
    Now, be aware - Derek has already stated - the 40nm is coming with the GT200 shrunk and INSERTED into the lower bracket.
    Maybe he was shooting off his mouth ? I'm sure "you know" -
    ( Like heck I am )
    Six months from now, or more, and 40nm, will be a different picture.
    Reply
  • Hrel - Wednesday, April 01, 2009 - link

    seriously, what are you talking about?
    pretty sure I'm gonna just ignore you from now on; pretty certain you are medically insane!

    I'd respond to what you said, but I honestly have no idea what you were TRYING to say.
    Reply
  • SiliconDoc - Wednesday, April 08, 2009 - link

    You don't need to respond, friend. You blabber out idiocies of your twisted opinion that no one in their right mind could agree with, so it's clear you wouldn't know what anyone else is talking about.
    You whine nvidia made the gt200 core too big, which is merely your stupid opinion.
    The g92 core (DDR3) with DDR5 would match the 4870 (DDR5), which is a 4850 (DDR3) core.
    So nvidia ALREADY HAS a 4850 killer, already has EVERYTHING the ati team has in that region - AND MORE BECAUSE OF THE ENDLESS "REBRANDING".
    But you're just too screwed up to notice it. You want a GT200 that is PATHETIC like the 4830 - a hacked-down top core. Well, only ATI can do that, because only their core SUCKS THAT BADLY without DDR5.
    NVidia ALREADY HAS DDR3 ON IT.
    SHOULD THEY GO TO DDR2 TO MOVE THEIR GT200 CORE DOWN TO YOUR DESIRED LEVEL ?
    Now, you probably cannot understand ALL of that either, and being stupid enough to miss it, or so emotionally petrified, isn't MY problem, it's YOURS, and by the way, it CERTAINLY is not NVidia's - they are way ahead of your tinny, sourpussed whine, with JUST SOME VERY BASIC ELEMENTARY FACTS THAT SHOULD BE CLEAR TO A SIXTH GRADER.
    Good lord.
    The GT200 chips already have just DDR3 on them, mr fuddy duddy; they CANNOT cut 'em down off DDR5 to make them as crappy as the 4850 or 4830, which BTW is matched by the two-year-old g80 revised core - right, mr rebrand ?
    Wow.
    Whine whine whine whine.
    I bet nvidia people look at that crap and wonder how STUPID you people are. How can you be so stupid ? How is it even possible ? Do the red roosters completely brainwash you ?
    I know, you don't understand a word, I have to spell it out explicitly, just the very simple base drooling idiot facts need to be spelled out. Amazing.
    Reply
  • Hrel - Thursday, April 09, 2009 - link

    You should specify when you're being sarcastic and when you're being serious. Also, all that red rooster loon red camp green goblin crap simply doesn't make any sense and makes you sound like a tin-foil-hat-wearing crazy person. Just sayin', dude, lighten up. Do you work for Nvidia? Or do you just really hate AMD?

    Yes, they're both good cores, and yes, it'd be great if Nvidia used DDR5, but they don't, so they don't get the performance boost from it; that's their fault too. And they DID make the GT200 core too big and expensive to produce; that's why the GTX260 is now being sold at a loss, just to maintain market share.
    Reply
  • Hrel - Wednesday, March 04, 2009 - link

    Oh, also... I almost forgot: you still didn't include 3D Mark scores:( PLEASE start including 3D Mark scores in your reviews.

    Also, I care WAY more about how these cards perform at 1440x900 and 1280x800 than I do about 2560x1600; I will NEVER have a monitor with a resolution that high. No point.

    It's just, I'm more interested in seeing what happens when a card that's on the border of playable with max settings gets the resolution turned down some, than what happens when the resolution gets turned up beyond what my monitor can even display.

    It's pretty simple really: more on-board RAM means the card won't insta-tank at resolutions above 1680x1050, but the percent differences should be the same between the cards. Whereas, comparing a bunch of 512MB and 1GB cards at resolutions of 1680x1050 and lower, that extra RAM doesn't really matter, so all we're seeing is how powerful the cards are. It seems like a truer representation of the cards' performance to me.
    Reply
  • Hrel - Wednesday, March 04, 2009 - link

    I really do mean to stop adding to this; just wanted to clarify.

    When I say that the extra RAM doesn't matter, I mean that the extra RAM isn't necessary just to run the game at your chosen resolution. Of course some newer games will take advantage of that extra RAM even at resolutions as low as 1280x800. I'd just rather see how the card performs in the game based on its capabilities, rather than seeing one card perform better than another simply because that "other" card doesn't have enough on-board RAM - which has NOTHING to do with how much rendering power the card has, and only to do with on-board RAM.

    I think it would be good to just add a fourth resolution, 1280x800, just to show what happens when the cards aren't being limited by their on-board RAM and are allowed to run the game to the best of their abilities, without superficial limitations. There, pretty sure I'm done. Please respond to at least some of this; it took me kind of a long time, relative to how long I normally spend writing comments.
    Reply
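The claim in the comment above - that resolution drives memory pressure - can be illustrated with a back-of-the-envelope estimate of render-target memory. This is my own sketch under simplifying assumptions (32-bit color, 32-bit depth/stencil, double buffering); real games use far more VRAM for textures and geometry, so only the relative scaling matters here.

```python
# Back-of-the-envelope render-target memory at various resolutions.
# Assumes a 32-bit color buffer (x2 for double buffering) plus a
# 32-bit depth/stencil buffer: 12 bytes per pixel total.
# Real VRAM use (textures, geometry, driver overhead) is much larger.
BYTES_PER_PIXEL = 4 * 3  # 2 color buffers + 1 depth/stencil, 4 bytes each

def render_target_mb(width, height):
    return width * height * BYTES_PER_PIXEL / (1024 ** 2)

for w, h in [(1280, 800), (1680, 1050), (2560, 1600)]:
    print(f"{w}x{h}: {render_target_mb(w, h):.1f} MB")
```

2560x1600 needs four times the render-target memory of 1280x800, which is why cards with less on-board RAM fall off a cliff at high resolutions while the gap stays small at the lower ones the commenter cares about.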
  • SiliconDoc - Wednesday, March 18, 2009 - link

    Hmmm... you'd think you could bring yourself to apply that to the 4850 and the 4870, which have absolutely IDENTICAL CORES and only a RAM DIFFERENCE.
    Yeah, one would think.
    I suppose the red fan feverish screeding "blocked" your immensely powerful mind from thinking of that.
    Reply
  • Hrel - Saturday, March 21, 2009 - link

    What are you talking about? Reply
  • Hrel - Wednesday, March 04, 2009 - link

    I'm excited about this; I was kind of wondering what Nvidia was going to do, considering GT200 costs too much to make and isn't significantly faster than the last generation; and I knew there couldn't be a whole new architecture yet, even Nvidia doesn't have that much money.
    However I'm excited because this is a 9800GTX+, still a very good performing part, made slightly smaller, more energy efficient and cooler running; not to mention offered at a lower price point! Yay, consumers win!(Why did Charlie at the Inquirer say it was MORE expensive but anandtech lists lower prices?) I really hope the 512MB version is shorter and only needs 1 PCI-E connector/lower power consumption; if not, that almost seems like intentional resistance to progress. However the extra RAM will be great now that the clocks are set right; and at $150, or less if rebates and bundles start being offered, that's a great deal.

    On the whole, Nvidia trying to essentially screw the reviewers... I guess I don't have much to say; I'm disappointed. But Nvidia has shown this type of behavior before; it's a shame, but it will only change with new company leadership.

    Anyway, from what I've read so far, it looks like the consumer is winning, prices are dropping, performance is increasing(before at an amazingly rapid rate, now at a crawl, but still increasing.) power consumption is going down and manufacturing processes are maturing... consumers win!
    Reply
  • san1s - Wednesday, March 04, 2009 - link

    365? are you sure about that?
    "even when the 9800 was new... iirc the 4850 was already making it look bad"
    google radeon 4850 vs 9800 GTX+ and see the benchmarks... IMO the 9800 was making the brand new 4850 look bad
    "i'd doubt that anyone buying a 9800 today is planning to sli it later"
    what if they already have a 9800? much cheaper to get another one for sli than a new gtx 260
    "hahaha, less power useage relative to"
    read the article
    "name some mainstream cuda and physx uses"
    ever heard of badaboom? folding@home? mirror's edge?
    the gts 250 competes with the 4850, not 4870
    "continually confusing their most loyal customers "
    what's so confusing about reading a review and looking at the price?

    The gts 250 makes perfect sense to me. Rather than spending $ on R&D for a downgraded GT200 (that will perform the same more or less), why not use an existing GPU that has the performance between the designated 240 and 260?
    It's a no-win situation: option #1 would mean wasting money on something that won't perform better than the existing product, which can probably be made cheaper (the G92b is much smaller), and #2 will cause complaints from enthusiasts who are too lazy to read reviews.
    Which option looks better?
    Reply
  • kx5500 - Thursday, March 05, 2009 - link


    Shut the *beep* up f aggot, before you get your face bashed in and cut
    to ribbons, and your throat slit.
    Reply
  • SiliconDoc - Wednesday, March 18, 2009 - link

    I saw that more than once in Combat Arms - have you been playing too long on your computer? Reply
  • rbfowler9lfc - Tuesday, March 03, 2009 - link

    Well, whatever it is, be it a rebadged this or that, it seems like it runs on par with the GTX260 in most of the tests. So if it's significantly cheaper than the GTX260, I'll take it, thanks. Reply
  • Leyawiin - Tuesday, March 03, 2009 - link

    Good refinement of an already good card. New more compact PCB, lower power consumption, lower heat, better performance, 1GB. If Nvidia feels that's worthy of a rename, why should anyone get their drawers in a bunch?

    But please, let the conspiracy theories fly if there was a rewrite of the conclusion. Could be it was just poorly done and wasn't edited, but that's not as fun as insinuating Nvidia must have put pressure on AT.
    Reply
  • Gannon - Tuesday, March 03, 2009 - link

    Because it's lying; the core should always match the original naming scheme. Nvidia is just doing this to get rid of inventory and cause market confusion, so that dimwits who don't do their research go for the 'newer' ones, when in fact they are the older ones.

    I hate this practice. Creative did the same thing with some of their SoundBlaster cards - the SoundBlaster PCI, I believe it was: it was some other chipset from a company they had bought out, and they merely renamed and rebadged the card "SoundBlaster".

    Needless to say I hate the practice of deceiving customers, imagine you're in a restaurant and you ordered something but then they switched it on you to something else, you'd rightly get pissed off.

    If people weren't so clueless about technology they wouldn't get away with this shit. This is where the market fails, when your customers are clueless, it's sheep to the slaughter.
    Reply
  • SiliconDoc - Wednesday, March 18, 2009 - link

    Yeah, imagine, you ordered coke one day, and the next week you ordered coca cola off the same menu, and they even had the nerve to bring it in a different shaped glass. What nerve, huh !
    They squirted a little more syrup in the latter mix, and a bit less ice, and you screamed you were deceived and they tricked you, and then you went off wailing away that it's the same thing anyway, but you want the coca cola not the coke because it tasted just a tiny bit better, and you had darn better see them coming up with some honest names.
    Then ten thousand morons agreed with you - then the cops hauled you out in a straight jacket.
    I see what you mean.
    Coke is coca cola, and it should not be renamed like that - or heck people might buy it.
    I guess that isn't fair ... because people might buy it. It might even be a different price at a different restaurant, or even be called something else and taste different out of a can vs a glass - and heck that ain't "fair".
    You do know I think you're all pretty much whining lunatics, now, right ? Just my silly opinion, huh.
    Coke, coca cola, soda, pop, golly - what will people do but listen to the endless whiners SCREAM it's all the same and stop fooling people....
    I guess it was a slow news YEAR.
    Reply
  • SunnyD - Tuesday, March 03, 2009 - link

    Since NVIDIA really wanted to push PhysX... I'm curious which if any of the tested titles have PhysX support and if it's enabled in those titles as tested. I'd be really interested to see what kind of performance hit the PhysX "holy grail" takes from this new/old card when trying to compare it to its competition. Reply
  • SiliconDoc - Wednesday, March 18, 2009 - link

    I wonder why they haven't done a Mirror's Edge PhysX extravaganza test - they can use secondary PhysX cards and then use the primary for enabling, turn it on and off and compare - etc.
    But not here - Derek would grind off all his tooth enamel, and Anand can't afford the insurance for him.
    Reply
  • SiliconDoc - Wednesday, March 18, 2009 - link

    Derek CAN'T include PhysX, and NEVER SAYS whether or not he has it disabled in the nvidia driver panel - although this site USED to say that.
    If they even dare bring up PhysX - ati looks BAD.
    Hence, to keep as absolutely MUM as possible, is the best red fan rager course.
    You see of course, Derek the red has to admit that yes, even NVIDIA ITSELF brought this SAD BIAS up to Derek...
    Oh well, once a raging red rooster, always a red rooster - and NOTHING is going to change that. (or so it appears)
    Is that 10 or 15 points of absolute glaring bias now ?
    ____________________________________________________________
    " We're just trying to save the billions losing ati so we have competition and lower prices - so shut up SiliconDoc ! Do you want to pay more, ALL OVER AGAIN FOR NVIDIA CARDS !!?!"
    ____________________________________________________________

    PS, thanks for lying so much red roosters, you've done a wonderful job of endless bs and fud, hopefully now Obama can bail out amd/ati, and my nvidia CUDA badaboom low power game profiles, forced sli, PhysX cards will remain the best buy and continue to utterly dominate with only DDR3 memory in them.

    PPS - Yes, I can hardly wait for nvidia DDR5 - oh will that ever be fun - be ready to rip your red badges off your puny chests fellers - I'm sure you'll suddenly find a way to reverse 180 degrees after a few weeks of "hating nvidia for stealing ati ddr5 intellectual property".
    LOL
    Oh it's gonna be a blast.
    Reply
  • C'DaleRider - Tuesday, March 03, 2009 - link

    Very early this morning, I stumbled upon this article when it was originally put up....and went directly to the conclusions page. Interesting read....and I should have saved that page.

    Subsequently, the entire review went down with this reasoning, "...Just we had some engine issues... missing images and such. I don't have the images or I'd put them on the server and set the article to "live" again. Anand and Derek have been notified; sorry for the delays."

    Well, it's back up and what do you know.....the conclusions have now become somewhat softer, or as a few others on another forum put it who also saw the "original" review...circumcised, censored, and bullied by nVidia.

    Shame that the original conclusion has been redone....would have liked others to actually see AT had some independence. Guess that's a lost ideal now...........
    Reply
  • strikeback03 - Tuesday, March 03, 2009 - link

    Interesting, you mentioned in the comments in the other article that you didn't get to see any of the review, as when you clicked it went to the i7 system review. Reply
  • JarredWalton - Wednesday, March 04, 2009 - link

    Thanks for the speculation, but I can 100% guarantee that the "pulling" of the article was me taking it down due to missing images. I did it, and I never even looked at the rest of the article, seeing that it was 3AM and I had just finished editing a different article.

    Was the conclusion edited before it was put back up? Yes, but not by me. That's not really unusual, though, since we typically have someone else read over things before an article goes live, and with a bit more discussion the wording can be changed around. It would have changed regardless, and not because of anything NVIDIA said.

    Is the 9800 GTX+ naming change stupid? I certainly think so. However, that doesn't make the current conclusion wrong. The card reworking does have benefits, and at the new price it's definitely worth a look as a midrange option.
    Reply
  • RamarC - Tuesday, March 03, 2009 - link

    please consider styling the resolution links so they stand out a bit or look button-ish. it took me a minute to realize they were clickable. Reply
  • Wurmer - Tuesday, March 03, 2009 - link

    Performance aside, Nvidia should get their naming scheme straight. All this renaming and name swapping only contributes to getting customers confused. No matter how it's explained, keep it simple: higher number, more powerful card ! In this regard, I find that ATI has made an effort of late.

    I'll also agree with one of the above posters that Nvidia was taken aback by the release of the 4870 and 4850. ATI hit the nail right on the head and the green team seems to have a bit of a hard time devising a proper response. Instead of getting their prey in their gun sights they use a shotgun and pepper the target all around......
    Reply
  • SiliconDoc - Wednesday, March 18, 2009 - link

    Shotguns usually KILL with just one shot - and ATI has caused another charge off, another BILLION dollar loss for AMD.
    I'm not sure NVidia hit them, (they obviously don't need to) so let's hope they don't, as well.
    I'm also not sure what it means when people keep speculating that ATI "caught them off guard" - it doesn't really mean anything - it's just a stupid way of saying "ATI did better than I expected" ( but it's "cool" to not say that and put down NVidia instead, huh... since so many around the geek spots "taught you to say it that way" ).
    Then after "being caught off guard" NVidia "drops a card" and it's because "they panicked" - right ?
    DEREK - explained it - didn't he.... " NVidia released the 9800GTX... "on the eve of the 4850 launch"....
    YES, THAT'S HOW OFF GUARD THEY WERE.... NVIDIA HUH...
    They released a DAY BEFORE ati did....
    And they're STILL USING the 9800 - to battle the 4850 that was released AFTER Nvidia....released their card.
    Oh well....
    DEREK tried to make nvidia sound evil, too - for releasing "on the eve of " the sainted red card 4850 release day - those nasty nvidia people spoiling the launch by releasing on ati's "eve"...
    By golly, it's no wonder Obama was elected my friend.

    Reply
  • Mr Perfect - Tuesday, March 03, 2009 - link

    Hey, uhm, in that link posted on page two, there is mention that the press review cards are specially picked by Nvidia. Any idea if this is true? Reply
  • SiliconDoc - Wednesday, March 18, 2009 - link

    Doesn't matter when they are stock clocked. The reviewers can do all sorts of things to "compensate" for how they want outcomes, anyway, like editing ini files and choosing the games and the order - heck even using a card that isn't the card they are supposedly reviewing. Reply
  • spinportal - Tuesday, March 03, 2009 - link

    The GTS 250 1GB is barely useful over its 512MB counterpart except for power usage and slot size. The est. street price of $149 is already countered by the AMD 4870 512MB, and tests show it's a hair better. Given the 250 uses less juice than the 4870, it's odd that the 250 SLI is using more juice than the 260 core 216 SLI, so there goes that benefit.
    NVidia cannot strip down the GT200 core to reduce the power load from two 6-pin connectors to just one for 150W. Perhaps there is something to be said for GDDR5 power reduction.
    Either way, the 250 is a win for nvidia in the mainstream budget for less power usage, CUDA, & PhysX at the same price as the 4870 512MB, which runs hotter, noisier (probably) and less feature rich. Does DX 10.1 matter at this point? PureVideo 2 is a wash vs. AMD's UVD.
    It's distasteful rebadging a G92b in the GT200 naming scheme. This helps NVidia's costs by EOLing the whole 8800GT architecture.
    But by April, who's going to care? New spins are coming. This stop gap is only to reduce bleeding. Hopefully next gen is executed better so performance grows as power demands decrease.
    Reply
  • kx5500 - Thursday, March 05, 2009 - link


    Shut the *beep* up f aggot, before you get your face bashed in and cut
    to ribbons, and your throat slit.
    Reply
  • SiliconDoc - Wednesday, March 18, 2009 - link

    Take your finger off the key, you're repeating, you dummy. Reply
  • Wolfpup - Tuesday, March 03, 2009 - link

    I'm none too crazy about the random GPU/CPU naming this industry has always seen, but I disagree that Nvidia somehow needs a budget version of the 260/280 GPUs.

    I mean the 8800 series is basically the same thing. Sure there are some improvements here and there, but basically the 260 is the same thing, but with massively more execution hardware. I don't really see this huge distinction between buying a 250/9800GTX+ and it not being a stripped down 260...I mean it would almost be the same thing anyway.

    And if nothing else, it's great to see how much less this 250 uses in terms of power. I mean this is still a really nice GPU that I'd be glad to have in a system. (I'm on a laptop with a 9650GT, which is yet another 32 processor part...and even this isn't half bad at all!)
    Reply
  • SiliconDoc - Wednesday, March 18, 2009 - link

    The red fanboys need Nvidia to cut the arms and legs off the GT200 and turn it into a puny 4830 (that has the top core ati has made, can make).
    The PROBLEM for red fanboy red freaks is just that... their great and sainted ATI has their "superwhomper super efficient super technology core! Oh Goddd! it's so greaaattt ! (*red boy orgasm)" - in the lowly 4830 - now compare that to the GTX260/192 and THAT'S where the red fanboys stands (or, really cries, rather).
    Now look at the 4830 top ATI core and compare it to the GTX285... oh that's PAINFUL.
    Now do the reverse...
    Compare the top core 4870 to the top Nvidia core GTX260/192 (lowest iteration) - and golly, it EVEN BEATS the 4870 sometimes...
    So THERE you have "the big ugly problem" for the red fanboys - who keep wailing that Nvidia MUST make a lower GT200 part for them...
    ( their tiny inner red child needs some comfort - it's just not fair with that big evil green GT200 SPANKING the rv770 so badly ! It's "abuse" ! )
    Can we have a GT200 core that is as LOUSY as the 4830 ?!?! please please pretty please!!! we have some really hurting little red fanboy crybaby whining fud propaganda doper diaper babies that need some satisfaction and their little red egos massaged...
    DON'T make them face the truth, EVER , nvidia, you big mean greedy green ...
    ( Yes, dude, they keep begging for it, and THAT'S WHY ! )
    There is no doubt about it.
    Reply
  • LuxZg - Tuesday, March 03, 2009 - link

    How about making an article where you'd test all those G92 renames, rebrands, overclocks and shrinks?

    So put a fight between 8800GT 256MB (137$), 8800GT 512MB (154$), 8800GTS (171$), 9800GT 512MB (148$), 9800GT 1GB (171$), 9800GTX (205$), 9800GTX+ (214$), 9800GTX 1GB (228$), GTX250 512MB (+230$ ??), GTX250 1GB (+240$ ??). And if I've missed some variation, please include that too :)

    Then test them all overclocked with their default cooling.

    I just want to see how far they've come from the first revision to these last ones. Prices in brackets are local prices in Croatia, where I live. And yes, you can buy them all.. I even found an 8800GTX 768MB card and 8800GTS 320MB as well; interesting, both at the same 171$ price that gets you a "new" 8800GTS (512MB) or 9800GT 1GB :D

    Now, since you can buy all these cards, and most of them are really close in price (some 100$ from all-new top to the rock-bottom 8800GT 256MB; if you exclude those than it's just some 60$ difference) - it would be an interesting article. Especially for those aiming at SLI for their old card which they've bought earlier.

    And while here, I support the above comment - try SLI these 8800/9800 cards with these new GTX 250. Should have no problem, but to check the performance (gains) anyway..
    Reply
  • VooDooAddict - Tuesday, March 03, 2009 - link

    If trying to decide for purchasing, I would cut that list down to the following:

    8800GTS 512MB - Good bang for the $ but hotter, power hungry GPU
    9800GTX+ 512MB - Die shrink gave more speed and lower temps
    GTX250 1GB - New board design gives better power usage
    Reply
  • SiliconDoc - Wednesday, March 18, 2009 - link

    VooDoo where the heck can you get an 8800gts anymore ? ebay ?
    The 9800GT ultimate by Asus ? Those are literally gone as well ...
    Which brings me to something... I hadn't thought of yet...
    A LOT of the core g80/g92/g92b cards are GONERS - they're sold out !
    So - nvidia makes a new "flavor".
    Golly Wally, I never thought of that before.
    " That's why you're the Beave. "
    ____________________

    Oh, THEY SOLD OUT - HOW ABOUT THAT.
    Reply
  • erple2 - Wednesday, March 04, 2009 - link

    No, I disagree - the OP has a good point. Compare all of the G92 parts together to see just how much real difference there is. Throwing in the G80 part (8800GTX, I suppose) is an interesting twist as well, to show how the G80 evolved over time.

    nVidia has a crazy number of cards that are all "the same". The evaluation proposed sure would help explain away what was going on.

    I'd definitely be interested in seeing what the results of that were!
    Reply
  • SiliconDoc - Wednesday, March 18, 2009 - link

    Go to techpowerup and see their reviews, for instance on the 4830 - it has LOTS of games and lots of g80/g92/g92b flavors - including the 8800 GTX 768MB (G80) which YES, pulls out some wins even against the 4870x2....
    Check it out at techpowerup.
    Reply
  • emboss - Tuesday, March 03, 2009 - link

    I'd also say for the purpose of comparison to throw a G80 in there (ie: a 8800 GTX or Ultra). It'd be interesting if the extra bandwidth and ROPs of the G80s make a difference in any cases. Reply
  • Casper42 - Tuesday, March 03, 2009 - link

    1) You should have included results for a 9800GTX+ so we could truly see if the results were identical to the "new" card.
    2) If you can, please stick a 9800GTX+ and a GTS 250 512MB into the same machine and see if you can still enable SLI.

    I own a 9800 GTX+ and item #2 is especially interesting to me as it means when I want to go SLI, I may have an easier time finding a GTS 250 rather than hunting on eBay for a 9800 GTX+

    Thanks,
    Casper42
    Reply
  • SiliconDoc - Wednesday, March 18, 2009 - link

    Casper as DEREK said in the article > " Anyway, we haven't previously tested a 1GB 9800 GTX+," (until now)
    THAT'S WHAT THEY USED.
    lol
    Yes, well you still don't have an answer to your question though...
    How about the lower power consumption and better memory and core creation translating into higher overclocks ?
    LOL
    No checking that, either...
    "A 9800gtx+" will do - "bahhhumbug ! I hate nvidia and it's the same ding dang thing ! Forget that I derek said it's better memory, a better made core iteration, and therefore lower power, a smaller pcb make, SCREW all that I can't overclock I don't have the DAM*! CARD I HATE NVIDIA ANYWAY SO WHO CARES! "
    ____________________________________

    Sorry for the psychological profile but it's all too obvious - and it's obvious nvidia knows it as well.
    Hope the endless red fan ragers save the multiple billion dollar charge off losers, ati. I really do. I really appreciate the constant slant for ati, I think it helps lower the prices on the cards I like to buy.
    It's great.
    Reply
  • Mr Perfect - Tuesday, March 03, 2009 - link

    Now that's a good question(number 2 that is). Maybe a 9800GTX+ can be BIOS flashed into a 250 to enable SLI? Reply
  • DerekWilson - Tuesday, March 03, 2009 - link

    GTS 250 can be SLI'd with 9800 GTX+ -- NVIDIA usually disables SLI with different device IDs, but this is an exception.

    If you've got a 9800 GTX+ 512MB you can SLI it with a GTS 250. If you have a 9800 GTX+ 1GB you can SLI it with a GTS 250 1GB. You can't mix memory sizes though.

    Also, the 9800 GTX+ and the GTS 250 are completely identical and there is no reason to put two in a system and test them because they are the same card with a different name. At least until NVIDIA's partners release GTS 250s based on the updated board, but even then we don't expect any performance difference whatsoever.

    These numbers were run with a 9800 GTX+ and named GTS 250 to help show the current line up.
    Reply
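    The pairing rule Derek describes above (memory sizes must match; different device IDs normally can't SLI, with the GTS 250 / 9800 GTX+ pair carved out as an exception because they're the same G92b silicon) can be sketched as a small check. This is a toy illustration of the rule as stated in the comment, not NVIDIA's actual driver logic; the alias table and function names are made up.

    ```python
    # Toy sketch of the SLI pairing rule described in Derek's comment.
    # Cards are (model_name, memory_mb) tuples. The GTS 250 and 9800 GTX+
    # are mapped to the same core so they pair despite different names;
    # mixed memory sizes never pair. Hypothetical names, not driver code.

    ALIASES = {"GTS 250": "G92b", "9800 GTX+": "G92b"}

    def can_sli(card_a, card_b):
        """Return True if the two cards can be paired in SLI per the rule above."""
        (model_a, mem_a), (model_b, mem_b) = card_a, card_b
        if mem_a != mem_b:                 # 512MB + 1GB never pairs
            return False
        if model_a == model_b:             # identical models always pair
            return True
        core_a, core_b = ALIASES.get(model_a), ALIASES.get(model_b)
        return core_a is not None and core_a == core_b

    print(can_sli(("9800 GTX+", 512), ("GTS 250", 512)))   # True
    print(can_sli(("9800 GTX+", 1024), ("GTS 250", 512)))  # False: memory mismatch
    ```

    Under this sketch, two differently named cards pair only when both map to the same underlying core, which is exactly the exception Derek notes.
    
    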
  • dgingeri - Tuesday, March 03, 2009 - link

    I noticed that the 512MB version of the 4870 beats the GTS250 1GB in everything, and yet costs the same. Even when the video memory makes a big difference, the 512MB 4870 wins out. Even better is that the 4870 512MB board costs the same, or at least will soon, as the GTS250 1GB board.

    Doesn't this make the 4870 512MB board a better deal?
    Reply
  • SiliconDoc - Wednesday, March 18, 2009 - link

    That's nice, the 4870 512 top core DDR5 ati best they have for core and memory speed can beat the 2 year old renamed a thousand times Nvidia g92b. Gosh what a GREAT CARD(err... MEMORY ONLY) ati has there... what "awesome technology".... what "efficient core!!! ( WRONG! IT'S ALL THE DDR5 MEMORY ! THAT'S WHY THE 4850 IS THE SAME CORE WITH DDR3 AND IS UNDER THE GTS250 (g92b / reworked stale ol tired core that ATI has just barely matched ! )
    Don't feed me any crap anymore - ANYONE about "the great ati core ! " --
    It is the same as the g92b (4850) unless you slap DDR5 on it.
    I can hardly wait - or rather I certainly HOPE nvidia slaps DDR5 on that "rebranded piece of crap g80/g92/g92b " !
    My golly, it would be GREAT to see the ten thousand jaws of the red rooster propaganda boys DROP TO THE FLOOR when it matches and spanks the 4870 ...
    PLEASE NVIDIA ! PLEASE DO IT ! DO IT NOW !
    Reply
  • Hrel - Wednesday, March 04, 2009 - link

    why isn't the HD48701Gb in this review? I just noticed that. Reply
  • Totally - Thursday, March 05, 2009 - link

    Why isn't the GTX 285 in this review? I just noticed that. Reply
  • Hrel - Wednesday, April 01, 2009 - link

    my guess would be cause it's an overpriced piece of crap that is matched in many cases by the HD4870 1GB. Reply
  • DerekWilson - Tuesday, March 03, 2009 - link

    4870 is ~$175 right now, which is much more than the $130 for the GTS 250. It would hold an advantage over the GTS 250 1GB if the end user is willing to spend the extra cash. Reply
  • Jansen - Tuesday, March 03, 2009 - link

    Ahem...


    http://www.dailytech.com/Radeon+4870+Gets+50+Price...
    Reply
  • SiliconDoc - Wednesday, March 18, 2009 - link

    Did you read the follow-on link that was out quite a while ago where the ati board partners told ati to "go shove it!" - because ati wanted the board partners to "take a lot of the loss" and only offered a partial "reimbursement" for the lower price scheme they cooked up ?
    BOY I HOPE YOU DO.....
    So, ati wanted its board partners to TAKE A LOSS - while the little red fans slopped up the savings and praised ati "for almost destroying nvidia, again!" - yes.... very, very underhanded and evil.
    Well, the board partners told them to go blow.
    And the price stayed the same, in fact it went up at our fave egg with the usually the very lowest prices around widely known.
    So, so much for that for now.
    The day will come eventually... unless inflation from the monetary crisis whips us all.
    Reply
  • Roland00 - Tuesday, March 03, 2009 - link

    several news sites that aren't Anandtech are saying there is a 4870 price drop coming this week to $149.99 for the 512mb version.

    It isn't confirmed yet, but if it happens, ATI is much faster per dollar spent.
    Reply
  • Matt Campbell - Tuesday, March 03, 2009 - link

    There are several models at Newegg now that are falling below $165, and one PowerColor that is $149.99 after mail-in rebate. http://www.newegg.com/Product/Product.aspx?Item=N8...

    Competition is making for a great market right now for gamers :)
    Reply
  • Roland00 - Tuesday, March 03, 2009 - link

    Currently Newegg has the Asus 4870 512mb model for
    184.99-50 dollar promo code=134.99 with a 30 dollar rebate on top of that (104.99 after rebate).

    The problem is that as soon as they posted this deal, it sold out. The deal is good till the 8th, but are they going to get any more in by then?
    Reply
  • KayKay - Tuesday, March 03, 2009 - link

    on the "Final Words" page

    Loses != Looses
    Reply
  • MamiyaOtaru - Thursday, March 05, 2009 - link

    "Loses" is correct. What are you saying? Reply
  • MamiyaOtaru - Wednesday, March 11, 2009 - link

    Maybe the author already edited the article and the OP was pointing out an actual error that was there. I've seen that happen often enough, why would I fall into that trap.. I guess so many people write "looser" and "loose" that it was perfectly believable to me that he really was assuming those forms were correct. oops. I hope. Reply
  • Proteusza - Tuesday, March 03, 2009 - link

    Brilliant segment, nice PR backfire for Nvidia.

    As I see it, the 1GB version of this card might be good enough to buy on its own merits - it stands up well to the 4850, while having lower power consumption and more video memory (which helps with some resolutions and AA combinations). That being said, a rehash is still a rehash, and it's... somewhat surprising to see that nvidia doesn't have a proper GT 200 series mainstream GPU out yet. I guess it could show just how off balance they are from the 4870 and 4850 - you can see a mile away that they did not expect such good performance to be available for such a low price.
    Reply
  • SiliconDoc - Wednesday, March 18, 2009 - link

    It's funny, I think just about the opposite. It's not strange there is no GT200 mainstream part, as there is no hole to fill, and it is strictly a high end GPU.
    What you have to finally realize, to understand, is that the best core of the red rooster crew IS the 4850.
    IT IS IDENTICAL TO THE 4870 IN EVERY WAY EXCEPT IN THE MEMORY AND CLOCKS.
    So the real issue is, the ATI CORE CANNOT COMPETE WITH THE GT200 CORE. IT CAN ONLY TRY BY DDR5 MEMORY, and even then it falls short.
    Look at it this way - a 4870 minus the ddr5 IS the 4850. Take the top ati core with ddr3, and the top nvidia core with ddr3, both at a gig ram and same clocks.
    YOU GOT THE 280 OR 285 (GT200) STOMPING THE ATI CORE INTO THE DIRT.
    In the "midrange" (depending on what crazy range that entails for the red rooster fanboi) - the 4850 VS the 9800 flavors - here we see the 4850 top ati core with 512 ram is struggling to keep up with the over 2 years old nvidia core 250 (92b/80 for the whiners).
    So, there is NO REASON for a "midrange" GT200 other than insatiable tech geek curiosity.
    How would nvidia position the GT200 midrange part ? If it's below the GTX260/192, it crunches into the 9800GTX flavors - if below that - the 8800 88gs and the like...
    SEE THE RED ROOSTERS HAVE THEIR INSANE, FRANKLY NUTBALL IDEA THAT THERE IS ROOM FOR A LOWER GT200.
    NO SIR, the GT200 is the 285 AND IT STOMPS EVERY SINGLE CORE ATI HAS, EVEN WHEN THAT 4850 CORE IS STUFFED FULL OF THE DDR5 THE NVIDIA GT200 CORE DOESN'T HAVE, IN THE FORM OF THE 1 GIG 4870 OVERCLOCKED !
    So, it's really clear to me, the fanboys are spreading so much CRAP, they have otherwise intelligent persons confused and babbling stupidities.
    Forget all their BS, and take a CLEAR VIEW.
    The BEST CORE ati has to offer is 100% there and enabled on the 4850, and only ddr5 and massive clocking gets it to the 4870, which still can't touch the DDR3 gt200 - it DOESN'T EVEN TOUCH IT.
    The GT200 with DDR5 would "ABSOLUTELY DESTROY UTTERLY THE 4870".
    Nvidia CANNOT put such a tremendous CORE into a mid range low product unless they DUPLICATE the 9800 series, and it would be IMMENSE DUNCERY.
    That must be why the idiot red roosters keep calling for it.
    Like Derek, a supposed "reviewer".
    No, he's a foolish red fanboy joke.
    Yeah, I'm sick of the STUPIDITY.
    The 4850 core goes in the 4830 because it's a lower end core, on par with the g92, NOT CAPABLE OF COMPETING WITH THE GT200.
    How many times do I have to REPEAT IT, before the zombie repeat bot FUD and the total bs CRAP by "intelligent commenters" and "reviewers" - STOPS !
    Gosh, the GT200 with DDR5 huh - no how about the G92/9800GTX with DDR5 - guess what ? THAT'S EQUIVALENT TO A 4870 !
    THINK ! THINK ! THINK ! TIME TO THINK !
    Go look at the ATI gpu core charts here - then THINK about what I'm RANTING on !
    Thanks if this helps you at all.
    If it doesn't actually help the red rooster crazed liars, well then there is no help for you.
    Reply
  • Galid - Wednesday, March 18, 2009 - link

    First of all, you sound like a nvidia fanboy, when you get mad, you've gotta be a fanboy. No offense, but there's no clear point of view from someone that already chose a side. I owned both nvidia and ATI gpus in my life having problems and good days with both. I remember the drivers problems from ati and I remember my 6800gt with nforce2 chipset incompatibility or my 7900gt that burnt twice.

    Ati never wanted to compete with the higher end parts of GT200; they wanted to take on the mainstream graphics processor market, which they did pretty well. Ati touches one of nvidia's gt200 cores (the gtx260) with its radeon 4870 and everyone knows it; that's what happened since it first got tested on anandtech. For a much lower price BTW. Yes, the 4850 is the best bang-for-the-buck part from ati, but when you gotta choose between gtx260 or 4870, the price speaks for itself.

    You forgot the 4850x2; OK, it's not being produced a lot, which is weird considering it's one of the most interesting parts from ati. It completely kills the gtx280 for a lower price?!?

    That move from nvidia was clearly the best; you're right about that. Getting the best bang-for-the-buck part is an achievement in itself. They did it, but quite poorly. Now they have the best of the best and quite the best bang for the buck, unless that radeon 4870 for $150 happens.

    So it's OK if you wanna have the best perf whatever the price; it's a no brainer. But competition is good, it keeps prices down; if the red rooster dies it's gonna be bad for us.
    Reply
  • SiliconDoc - Wednesday, March 18, 2009 - link

    " You forgot the 4850x2, ok it'S not being produced a lot which is weird considering it's some of the most interesting part from ati. kills completely gtx280 for a lower price?! "

    Is that a question or a statement ? You don't really know. You don't really want to fight x-fire either - with still NO GAME PROFILES - NO FORCING, STILL BELOW TWO GTS250s BECAUSE OF THAT - AND MORE EXPENSIVE - CHECK THE EGG, LIAR!
    Reply
  • SiliconDoc - Wednesday, March 18, 2009 - link

    Ahh, yeah, since you're obviously a liar, here's your big problem, and proof of it!

    RE: value?? by SiliconDoc, 14 hours ago
    The cheapest 4870 1G at the egg right now is 194.99 + shipping and they go up well over $200 from there -

    The cheapest GTX260/216 at the egg right now is 179.99 + shipping.
    __________________________________


    Now let's look further - in order ! (second # after rebate)
    4870 1g
    199.99
    199.99/169.99
    199.99/179.99
    214.99/194.99
    234.99/209.99
    239.99/214.99

    GTX260/216
    189.99/159.99
    208.99/189.99
    212.99/177.99
    229.99/199.99
    232.99/197.99
    234.99/214.99

    _______________________________

    Oh well, another red fantastical lie exploded all over the place, AGAIN.

    ________________________________

    Sorry red, YOU LIED.
    Reply
  • SiliconDoc - Wednesday, March 18, 2009 - link

    You're another red rooster foofer.
    " Ati never wanted to compete with the higher end parts of GT200 they wanted to take on the midstream graphic processer which they did pretty well. Ati touches one of nvidia gt200 core(the gtx260) with it's radeon 4870 and everyone knows it, that's what heppened since it got first tested on anandtech. For a much lower price BTW. Yes 4850 is the best part bang for the bucks from ati but when you gotta choose between gtx260 or 4870, the price speaks by itself. "
    I HAD to write up the price chart - go check page 9 or 10 - then get back to me with an APOLOGY for your pricing LIE.
    DO IT redrooster.
    Reply
  • SiliconDoc - Wednesday, March 18, 2009 - link

    Yes, there is a clear view, if you can do so HONESTLY.
    For instance, the 4830 is a tremendous value - that's the card I like from the red crew - it hit the egg at $74, and I'm still kicking myself for not buying a half dozen.
    Don't be so quick to judge.
    If I'm incorrect, I will take correction, and apologize, and thank you. Know that.
    Reply
  • SiliconDoc - Wednesday, March 18, 2009 - link

    Great post.
    "Thanks".
    Don't forget to ask why the red roosters keep begging for nvidia to downgrade their GT200 core... they hope and pray nvidia acts that stupid, so that the sloppy G80/G92/G92b-equivalent RV770 core (the 4830) LOOKS BETTER, and the cover-up about the GT200 core STOMPING AWAY WITHOUT GDDR5 is never known, never widely realized.
    Yes, very insightful.
    "Thank you."
    Reply
  • chrish89 - Tuesday, March 03, 2009 - link

    "power consumption is slower" Reply
  • ickibar1234 - Sunday, June 02, 2013 - link

    YAY! This card, according to the minimum system requirements, will play "Metro: Last Light"! Sweet. Reply
