It's actually really funny you mention multiple sites, since it's pretty hard to find a site these days that reviews or previews without sponsors. Leaning to one side or the other is simple: just pick the right games to benchmark and voila, either card can win. Lol, looking at the ATI videos, those are so carefully selected it's not even funny.
We are definitely getting back to the '80s, when games were made for a specific GPU rather than for all of them. The funny thing is that even our trusted benchmarks, like any Futuremark production, skew the results between GPUs. Their so-called ORB is pretty far from what the hardware is really capable of.
Most of those use either NV-biased games or, more likely, didn't update the 4890's drivers. All reviews show that the 4890 loses its initial advantage at higher resolutions, and also that it is now much cheaper. Take your pick; you'd get good value either way.
Well the 4890 isn't exactly kicking the 275's butt here.
Let me break it down:
Age of Conan: 0-3 fps difference. It's a wash.
CoD: WaW: 275 is at or above 60 fps on all resolutions, beats 4890 at 2560. 275 wins.
Crysis Warhead: 0-2 fps difference. It's a wash.
Fallout 3: 4890 wins.
Far Cry 2: 0-2 fps difference. It's a wash.
Left 4 Dead: Again, 275 is at or above 60 fps on all resolutions, beats 4890 at 2560. 275 wins.
Grid: 4890 wins.
That's 2 for nvidia, 2 for ATI. And on COD, Crysis, Far Cry, and L4D the 4890 wins at 1680 and 1920, then at 2560 the 275 suddenly pulls ahead? That's supposed to make sense? Not to mention both drivers used were beta. And the 185.65 drivers have been pulled from nvidia's archives.
Yes, you said the end users are the ones getting the benefit, but I think that's short-sighted, because when things get cheaper we have a better chance of getting a lower-quality product. For example, with the GTX 260 I bought several months ago, I could see that the image was worse than on the 8800GTS I'd had for two years. At the beginning I thought it was a defect, so I exchanged it for another one from another brand and got the same result. So I say: instead of fighting over price, why don't they just make a better product? Lowering the price just makes the product worse and worse, like how most products sold in the US are now made in China... and then everybody complains that the product is bad, that it's poisoned, etc. What a joke; what do you expect when the price goes down? The answer is easy to guess, right? So I would suggest you stop saying the users are the ones getting the benefit. What a brainless comment.
Those of us that buy the latest hardware to fly our flight sims have been pretty much left to using the outdated Tom's Hardware charts (which still show the 8800GTS being the fastest card around). I would love to know how the Q9650s and i7s are doing in FSX since the service packs, and it would be great to learn if the GTX 260/280s and now the refreshes are still slower than an 8800GTS in those sims. . .not to mention the abysmal performance of ATI cards! Has anyone found such a review anywhere?
I do not think you realize the problem. A year ago, FSX ate all the hardware you could throw at it.
FSX is a very difficult animal to feed.
It loves fast CPUs, but it also needs a fast GPU. Unfortunately, as was pointed out, there exist few recent comparisons. It is not easy figuring out the correct hardware balance for FSX, since few sites include it in a review.
Comparing dozens of FPS games is pointless. They perform similarly. There are some small differences, but to evaluate a given card you don't have to review that many games. FSX, however, poses some unique challenges and deserves some attention.
Oh... I'd also like to know which of these cards will play nicely with HD video.
Well, every now and then I like to have a little shooter fun, and the GTS is certainly lagging behind in the new titles.
I'm currently beta testing a new sim and it really utilizes the GPU which is nice to see, but my 8800GTS limits me quite a lot, and it's also nicely multi-threaded. I decided it's time to update my system, and really have nothing to guide me. Is ATI still really weak in sims? Have the GTX 280s gotten any better with the recent drivers? What about SP2 in FSX? I just don't have any source of this info, and I've looked everywhere for a legit source.
I've got a GTX 285 on the way and will just end up doing my own testing since that's apparently the only way to get the info.
There are hundreds of review sites out there posting these same four or five titles in their benchmarks, and not a single one includes any of the flight sims, even the new releases. I know sims are a niche market, but flight simmers are left to test for themselves, and they use what is perhaps one of the more demanding titles out there! My complaint isn't directed at Anandtech per se; I favor this site and have seen and appreciated the helpfulness of Gary Key time and again, especially over at the Abit forums. I just wish that Anandtech could apply their testing discipline to titles that really do need a legit place to evaluate them. It could really be a benefit to many people who really aren't catered to at all currently.
I agree but you'll never get that here since ati gets stomped in fs9 and fsx even more.
This is red rooster ragers central - at the reviewer level for now.
Put in the acceleration pack, and go for nvidia - the GT200 chips do well in FS9 - and dual GPU is out for FSX, so....
A teenage friend just got a 9800GTX (evga egg) and is running ddr800 2 gigs with a 3.0 P4 HT, on a G35 Asrock and gets 25-30 fps in FSX on a 22" flat Acer with everything cranked.
He oc'ed the cpu to 3.4 and pulled like 5 more frames per.
That's what he wanted, very playable on ultra - on his former 8600GTS he couldn't even give it a go for fsx.
However, moving up from 8800 I'm not certain what the gain is specifically. I've seen one or two reviews on HardOcp for fsx with a few cards. Try them.
Well, now that I've picked up a GTX 285SC, I've done some rudimentary benchmark comparisons between it and my 8800GTS in FSX and IL-2. I will add BlackShark results soon as well.
Very interesting, and not a great increase - Tom's lists FSX benchies in most of his card charts - the 9800GTX+ is way up there (3rd, I believe), as are some of the 8800 series.
It's weird.
The old HD2900 (even the Pro) does well with a good overclock - even the strange Sapphire version, which was 256-bit with 320 shaders - on a 25% OC it makes FSX quite playable (another friend, on an E4500 with 4 gigs at DDR2-800).
I saw the ATI X1950XTX at HardOCP does pretty well - while the 1950GT does NOT.
---
That 8800 chip is still - well, if not the best, still darn close.
My friend works for Anandtech and told me that ATI pays for articles that are good for them and disadvantageous to Nvidia... which we can clearly see in this article.
Ahh, yeah well people have to get paid.
It's nice to see the reaction there from the red rooster though, huh - cheering it on while he spews his brainwashed communist-like hatred of nvidia.
It's amazing.
Good for you noticing, though.
I don't hate Nvidia. I own 5 nvidia cards and 1 ati card. I'm buying what gives me the best value. To me, it's ATI for now. I think AnandTech did a good job reporting on what happens behind the scenes. They just report it, and it's up to each individual to form their own thoughts.
You obviously only buy Nvidia which is good... no fuss on deciding on what to get next!! hahaha....
Well incorrect entirely. They didn't do a good job reporting on behind the scenes, because they left out the ATI prodding and payment parts.
Furthermore, ati is still in a world of hurt losing billions in consecutive years.
If you were to be HONEST about things, if all the people here were to be, the truth would be: " WHERE THE HECK WAS ATI FOR SO LONG ? !! THEY'VE ALWAYS BEEN AROUND, BUT NVIDIA SPANKED THEM FOR SO LONG, WE HATE NVIDIA FOR BEING TOP DOG AND TOP PRICED - BUT IT'S REALLY ATI'S FAULT, WHO ENTIRELY BLEW IT FOR SO LONG.."
---
See, that's what really happened. ATI fell off the gaming fps wagon, and only recently got their act back together. They shouldn't be praised, they should be insulted for blowing competition for so long.
If you're going to praise them, praise them for losing 33% on every card they sell in order to have that 5-10 dollar price-point advantage, because if ati were to JUST BREAK EVEN, they'd have to raise all their gaming card prices by about $75 EACH.
So they're losing a billion a year... by destroying themselves.
Nvidia has made a profit all along, however. I think the last quarter they had a tiny downturn - while ati was still bleeding to death.
PRAY that Obama and crew have given, or will give, ati a couple billion of everyone else's tax money - dollars printed out of thin air, meaning inflation for everyone - to save them. You'd better, or else pray for a multi-billion-dollar sugar-daddy corporation.
Hahaha! An eye for an eye. Guess the tables have turned. AMD used to be in a needy position... taking it from left, right, center and back from players like Nvidia.
so far my overclocked 4850 crossfire setup has been keeping me happy, i'll come back into the market when the 5000 series rolls out and i upgrade my rig in general.
Are you on drugs, is that why you don't understand or have a single counterpoint ?
Come on, come up with at least one that refutes my endless stream of corrections to the lies you've lived with for months.
No ?
Ban the truth instead ?
Yeah, that wouldn't help you.
I had a 4850, a 4870 1GB, a 260-216 and an overclocked 280. Ran on a 24" at 1900*1200 - Crysis and Warhead, Far Cry 2, GTA4, Stalker... whatever else you can imagine...
My experience:
Radeons are hotter and noisier. You HAVE to increase the fan speed, and it is audible. Image quality in games is very good, though; Crysis in particular looked better with the Radeons. Bullet tracing and sunshine effects were spectacular... The GTX 280 with everything maxed in Crysis was also very beautiful. However, that card gets HOT, so you would be better off with a 285. I didn't like the image quality of the Radeons in movies, but maybe my settings were not good. The 4850 is definitely not worth the money; too hot for my taste.
So, a 4870 or 4890 1GB is definitely worth buying; performance is on par with the 285 at 1900*1200 - Crysis was 27-41 FPS with a standard Radeon 4870, and 31-45 with the 280 OC at 615 MHz.
IF the 285's price were $250, it would be the best buy. If it costs more, it is NOT worth the money, unless you really want a bigger and quieter card. Performance-wise it is the same as the Radeon 4890, which now costs $229 and can be overclocked. I did overclock the GTX 280 and 285, which didn't show any performance change; I guess they are constrained by memory bandwidth?
So, honestly, for the money the Radeon 4890 at $229 is the better choice. If you find a 4870 1GB for $169 it is worth considering also. The 896MB on the Nvidias is a constraint; I would not recommend anything but the 285, but that is expensive.
i don't get what's going on with silicon, but i enjoy my 4870. it works best at my resolution (1920x1200) and it cost less than the 275 with the AC-1. runs very chilly (45C idle, 57C load, oc'ed). i don't need physx or an application to do video encoding that costs extra, adding to the total cost of the video card. gaming is its sole purpose to me and it does that extremely well.
$180 + $80 for the video applications costs more than what my 4870 ran me, and at stock speeds it completely outclasses that, let alone a 275 (260) or 280 (270), which mine still cost less than. now you can get a 4870 for what the 260 runs. where's the logic in that? just so you can run a few games with physx that aren't even that good? to do some video encoding? i'll stick with my lower-cost 4870.
I see, now your 4870 completely outclasses even the 280. LOL
Your 4870 is matched with the 260, not the 275, and not the 280.
You don't have anything but another set of lies, so it's not something about you determining "my problem", or you "not knowing what it is"; it is rather the obvious lies required for you to "express your opinion". Maybe you should read my responses across the 20-some pages, and tell me why any of the 20-plus solid points that destroy the lies of the reds are incorrect? You think you might try it? I mean we have a lot more than just YOUR OPINION, false as you presented it, to determine what is correct. For instance:
http://www.fudzilla.com/index.php?option=com_conte...
Now, not even your 4870 overclocked XXX can beat the GTX260 GLH. In your MIND, though, it does, huh....? lol
Too bad, for you. I, unlike you, know what your problem is, and that is exactly what should bother you, about me.
Stick that fudzilla bench up in your.....
Fudzilla is the number one Nvidia troll site. Put aside the uglier Nvidia-sponsored sites, like HardOCP, Neoseeker, DriverHeaven, Bjorn3D... and you can go on.
ehehe.. he goes around name calling almost everyone a red rooster.. wonder if he sees himself as the green goblin? :D
jokes aside.. seriously, why limit yourselves to just one camp?? why be fanboys? just get whatever card which give the most bang for your budget and also your needs. i started out with a Matrox G200 (eheh.. you can basically guess how 'young' i am). This was followed by the NVidia GeForce2 MX. The next upgrade was the Radeon 9500.. plus.. managed to unlock the 4 pipelines to make it the legendary 9700.. what more can you ask for. next came the X1800GTO, again managing to unlock 4 disabled pipelines and it became X1800XL. and before silicondoc calls me a red rooster.. lolz.. my current card is the 8800GT. real great buy with it oc capabilities and longevity..
currently, with the flurry of releases, i'm targeting either the GTX 260+ or the 1GB 4870, as i only have a 22" flat screen. performance between the two cards is comparable; the difference of a few frames per second is barely noticeable.. so it's down to pricing. waiting now to see if there are any further reductions :p as for PhysX, it doesn't seem to have really taken off yet.. game developers are not stupid. if they wanna sell more copies, they will have to make sure it works on as many cards as possible.
Green Goblin, ahh... man, that is ugly...
Now, one problem with what you said - you definitely should note I don't call everyone a red rooster - just the liars - there are quite a few posts here that make fine points for ati that aren't steeped in bs up to their necks and over their heads.
You should also note I make the proper POINTS when I post, that prove what is and has been said by this brainwashed crew of red rooster just isn't the truth, period.
See that's the problem -
I certainly don't mind one bit if the truth is told, but if you have to lie so often to keep your favorite looking good, well, that's just stupidity besides being dishonest.
I've made quite a few points that the in-the-box cluckers cannot effectively, fairly, or honestly counter - and the months and months of brainwashing won't unwind right away - but at least it's going into their noggins, and it will eat away at the lies; little by little the small voice inside will spank their inner red stepchild into shape.
Hey, I'm a helpful person. Reality should win.
Oh wait lol no they aren't. If they were they wouldn't have a failing chipset side of their business and revenues down from 1.2bn to 480m in the last quarter.
They don't have an x86 licence and intel are suing them. How bad do you have to be for intel to sue lol. Oh yeah, on intel - that reminds me, who was it that took the playstation away from Nvidia again? Seeing as we're talking about consoles now, how about we talk about the 50 million wii's sold with ATI graphics inside?
Fusion is the party of the future and Nvidia aren't invited. To sum up, this desperate little company is going down the tubes and taking all of its clueless fanbois with it. It will be great in the near future when you are all forced to game on ATI graphics lol.
January 2008 - and perhaps as bad or worse since then.
" Without the charge, AMD came close to breaking even in the quarter, listing an operating loss of $9 million. Revenue for Q4 was $1.77 billion, up 8% sequentially and virtually flat compared to Q4 2006. The ATI write-down resulted in a total loss of $3.38 billion for the year, with total non-cash charges adding up to $2.0 billion. 2007 sales were $6.0 billion, up 6$ from 2006. "
http://www.tgdaily.com/content/view/35669/118/
So with the ATI write-down the total loss was $3.38 billion - gee, $2 billion of charges in one year.
Oh well, they can crush nvidia with their tiny core vs nvidia's big core by dropping prices, right ?
LOL
Thanks for nothing but lies, red roosters.
Oh and let's not forget the Dubai ownership portion of ati/amd.
Great.
Not trusted for US ports by the American people, but fanboys love it anyway.
Great.
Yes, whether you believe me or not, it's true. I had many more driver problems with my 8800GTX than I have had with my current 4870.
As for the Arabs, they don't own the company. AMD just sold part of its manufacturing business to the Arab company, Mubadala.
And to be more specific, AMD decided to share its manufacturing business with another "foreign" company because it can't afford the manufacturing costs.
I don't see anything wrong with that. But in case you don't know, I have to give you some information about who owns what.
Did you know that 2 russian "jewish" billionaires, "Naoom Birizofski and Jacob Trotski", own 79% of INTELs shares.
Did you know that JEWs own all the major technological companies in the WEST. From IBM, Ford, Boeing, Le Stabrolle, Shell..........to ADIDAS, Mercedes, OPEL, NPD Petrochemicals ......
Put aside the US Media, that is totally owned and controled by jewish businessmen..
The whole western civilization in it's all glory is getting purchased by jews.So if you think that those Arabs will/have any kind of effect on anything, then your wrong.
AMD was in big trouble and no other company offered any kind of help. AMD simply wasn't able to keep both its manufacturing and its design business and still turn a profit.
Now, with their manufacturing business getting fixed and grown, the company is definitely in a better state. As you know, or maybe don't, GlobalFoundries is now expanding by building two more fabs, in NY and Malta, and besides that they fixed up and did a lot to AMD's two fabs in Germany. So what AMD has done is establish a new manufacturing entity that will provide thousands of jobs for great minds in both Europe and the USA, and will compete efficiently with those Taiwanese manufacturing companies.
But you, with your stupid little brain, can't recognize all that. You think that your Nvidia, that small little company with its 5,000 workers, is more important than anything else.
What about competition?
What about innovation?
what about prices? Are you aware of what the state of the prices of gpu, cpus... or any other products would become without competition?
Here's the other thing, no matter how much you give your opinion on amd/ati, ati lost them billions, and nvidia, even though you pretend otherwise, actually "puts to work" more people than ati making and selling their gpu's, even if they don't have some holding corporation monster over the top of them, bleeding billions as well.
So, you went into your love for amd, and I guess your love for ati, which according to you "saved the loser" called amd. Like I said, you're more fanboy than one could initially have surmised, and your bias is still bleeding out like mad, not to mention the crazy conspiracy talk.
Hey, you're the one with the lies and the cover-ups for ATI, and now the anti-semitic conspiracy theories.
Even with all the spewing you've got going there, you couldn't just say " ati is really the one who lost money, not nvidia with the GT200".
Oh well, it's more important to spread FUD and now, conspiracy against "Jews".
Amazing. I had no idea the rabbit hole goes that deeply. rofl
Check for yourself. It's not a conspiracy; these are facts.
In fact, the 4800 series is the most successful generation of cards ATI has ever produced. The 4890, which measures about half the size of the GTX 285, beats the latter in most games at full HD resolutions.
Now tell me about employment or jobs ? Is that in the communist inflation reprint economy that costs us taxpayers trillions - the fantasy world where CONSTANT billion dollar losses on just a billion dollar company is "sustainable" ?
AMD/ATI IS IN DEADLY SERIOUS TROUBLE AND HAS BEEN
NVIDIA IS ALREADY RECOVERING AND HAS BEEN POSTING A PROFIT.
___________
But in your imaginary world filled with HATRED and LIES, it's just the opposite... isn't it.
How pathetic.
You cite "the last quarter", but of course only a fool would use that as a future indicator concerning quality and viability of the company. It's another pathetic attempt, fella. Global downturn means nothing to you, and you FAILED to cite the ati numbers, the two quarters in question, so you really have no point. You must have been afraid to tell the truth ?
If Nvidia has one low quarter in the midst of massive global downturn, while ati had at least 9 quarters where they suffered losses in a row, who is really in danger of playing on the "competitors" chip ?
You see, that's WHY the ati red roosters had to SCREAM endlessly about nvidia's GT200 die size - because THE WHOLE TIME, BEHIND THE SCENES OF THEIR FLAPPING RED ROOSTER BEAKS, THEIR BELOVED ATI WAS LOSING BILLIONS....
See bub, that's what has been going on for far too long.
It's really sad and sick, that people can't be HONEST.
All the red roosters had to do was say " hey buy ati, they're in financial trouble and have been, we all want competition to continue so let's pitch in, because the brands are about equivalent. "
See, that would have been honest and respectable and manly.
Instead the raging red roosters lied and covered up and FALSELY ACCUSED their competition of imaginary losses while their little toy was bleeding half to death - like little lying brats, they couldn't help but spew, in the midst of IMMENSE BILLION-DOLLAR losses for ati, about how the gt200 was "hurting nvidia" and how "ati could crush them" with PRICE DROPS. lol - man alive, I'm telling you - all those know-it-all red rooster jerks - it was and still is amazing.
That's fine, just be aware that it --- has been pathetic behavior.
Actually, you were the one throwing around $billion losses and FAILED to mention Nvidia's own horrible financial situation. Did you say anything about the global downturn while ranting like a fanatic on AMD's losses?
What was it you were saying about HONESTY again? Yes, in caps.
Nvidia hasn't had one low quarter - they've lost 2/3rds of their share value in a year. That doesn't happen in one quarter, same as it didn't happen to AMD in one quarter either.
Nvidia are a horrible little company who hold back progress, and more and more people are wising up to their methods. Articles like this on Anand show what they are like. Nvidia CANNOT COMPETE with ATI on performance so instead they bribe with more cash than ATI use on R&D, and those that don't accept the bribes get cajoled or threatened instead.
All the while sad sycophants like you are banging on about PhysX and cuda as if they make a difference to anyone. What does make a difference is their pathetic rebadging of ancient tech, catching out the people who don't know any better.
That just proves how far ahead the r700 is vs the g200b. ATI put money into research in order to improve the experience, while Nvidia put money into bribes in an attempt to hold onto whatever slender lead they have. It's only a financial lead; in tech terms ATI are a country mile ahead and only the worst Nvidia fanboi can't see that.
" That just proves how far ahead the r700 is vs the g200b."
That rv700 can't compete AT ALL with the GT200 UNLESS it has DDR5 on it.
That is a FACT. That is REALITY.
Without DDR5 it is the full core 4850 that competes with the "old technology" at the DDR3 level on both cards, the 4850 and the 9800 series and flavors.
That's the truth, YOU LIAR.
Case closed, no takebacks, no uncrossing your fingers, no removing your red raging horns - like - forever.
The r700 CANNOT COMPETE WITH THE GT200 - unless DDR5 is added as an advantage for the r700, which actually competes with the g80/g92/g92b.
NOW, if you screamed and screeched that DDR5 is awesome and ati rocks because they used it, I wouldn't disagree or call you the liar you are.
Got it son ?
Figure it out, or go take a college class in logic, and skip the communist training if you possibly can. Might get an estrogen emotion reduction as well while you're at it.
Check the five year stock charts before you keep lying, and then as far as your idiotic rant about nvidia, it just goes to show there is no such thing as a fair performance comparison from you people, you will lie your red rooster rooter butts off because you have a big twisted insane hatred for Nvidia, based upon some communist like rage that profit is a sin, and money in the industry is BAD, except PEOPLE get paid with all that money you claim NV throws around. lol
Dude, you're a red rooster rager, look in the mirror and face it, since you can't face the facts otherwise. Embrace it, and own it.
Don't be a liar, instead - or rather if all you're going to do is lie, at least admit it - you're body painted red, no matter what.
The really serious issue is ati has a really bad continuous loss, and might go under.
However, I can understand you communist like raging red roosters screaming for more price drops as you declare the much better off financially NVidia the one "to be destroyed", and demand more price drops, as you scream "profiteering".
Well, the basic fact is plain and apparent, ati had to lose 2 billion dollars to provide their competitive price, and ati purchasers are sitting on that loss, their gain, huh.
Like I said, if Obama and co. give ati/amd a couple billion in everyones taxes, it might work out ok, otherwise bankruptcy is looming - or some massive new investor relations are required.
Either way, you people don't tell the truth, and that of course is the point, over and over again.
Well having trawled through all 16 pages of comments I have to say that as much as power & temp benches, I really want NOISE benchmarks. Yes power usually comes at the expense of noise and although I'm primarily a gamer, I hate fan noise too.
I happen to have an 8800GT, which was great value when it first came out, but it becomes a whirlwind in most games and it drives me crazy, breaks the immersion, and only in-ear headphones help.
When the scores are this close, I err on the side of silence, and (from other sites) it sounds like the GTX275 is noticeably quieter than the 4890 under load.
Also, the GTX275 may suck up more juice under load but it is also the same amount more economical when idle and as I spend way less than 50% of my computer time gaming, that is much more useful to me...
Agree that PhysX is overhyped promises at the moment. So, for sound and power efficiency, I think the GTX275 just sways the vote *for me*. And it can overclock a bit even if the impression I'm getting is not quite as much as the 4890.
Then again, here in the UK the prices are different. The new parts are £200+ and that's 33% more than the GTX260 55nm core216 which can be had for only £150 now and is only a little less powerful than the GTX275 and will surely last fine till the DX11 parts come out... choices... choices...
You can edit your VGA BIOS using Radeon BIOS Editor v1.12, which is the one I'm using now on my 4870, and adjust the frequencies for the different power modes. By downclocking your Radeon card at idle, you can get it to operate properly in idle mode without sucking so much power. You can also use ATI Tray Tools for the same purpose.
As for the noise, I definitely recommend waiting a little bit until the non-reference cards get released.
According to some sites, AMD is going to release its DX11 cards in Q3 this year, so if you're planning to upgrade, you'd better consider waiting a little bit and getting a far better card than the currently available ones.
From my personal experience, a 4870 1GB is more than enough to play most current games at 24" resolutions with all the settings on their highest, including the eye candy... except for Crysis and Stalker: Clear Sky... If you have a smaller monitor than mine, then you might as well consider a 4850 or a GTS 250...
So he just told you why he's getting NVidia, and the little red fanboy in you couldn't stand it. You recommend taking steps that void the warranty, but why should you care - then you blabber about a 4850, but he already noted the 260 - and finally, the last word you could barely bring your red rooster self to say: GTS250 - as in, you might as well get it... if you have to - 'cept he was already looking above that.
You red ragers just have to spew about your crap card to people who already seemingly decided they don't want it. Why is that ?
You gonna offer him red cuda, red physx, red vreveal, red badaboom, red forced game profiles, red forced dual gpu ? ANY OF THAT ?
NO -
you tell him to HACK a red piece of crap to make it reasonable. LOL
What a shame.
Hey, maybe he can hack the buzzing fan on it , too ?
What the hell are you talking about?
Im not a fan boy of any of those big companies that don't give a shit about me.
I have used both nvidia and ATI, and both produce great graphics cards.
I had an 8800GTX before my current 4870, and I had an Athlon X2 6400+ before my current Core i7 920. I go with whoever has the product at the best price.
It's you who is the fanboy. You're trying hard to show the advantages of Nvidia cards over the ATI cards, and guess what, you fail.
And I don't give a rat's ass about CUDA. I'm a gamer, and what matters to me is the card's gaming performance and the features that enhance my gaming experience (which is what 99.9% of those who buy these cards care about), like PhysX, which is a genuine plus for Nvidia cards. But PhysX isn't supported in most games, and the PhysX effects Nvidia is trying to promote could easily be processed on a decent sub-$300 CPU by improving on-CPU physics performance via a driver or software layer that utilizes all CPU cores and uses the CPU's processing power in a better way.
Oh wait. ATI has DX10.1 and tessellation, which Nvidia doesn't have, and thanks to your Nvidia we didn't get many games that support DX10.1, and we didn't get any game that supports tessellation, a geometry-amplification technique that can accelerate geometry processing by up to 4 times for the same amount of floating-point (i.e. processing) power. If tessellation, which has been included in ATI cards since the HD 2000 series days, had been used in demanding games like Crysis and Stalker, we would've been able to play them on sub-$300 graphics solutions. That's besides the DX10.1 features like the AA enhancements... and the color detection that supposedly lets the GPU pick a more accurate color value for a texel, using a more advanced texturing algorithm than the tri/bilinear filtering technique used in Nvidia's non-compliant, incomplete DX10 implementation.
rofl - the long list of your imaginary hatreds against nvidia - you FREAK fanboy.
The problem being, that just like some lying fool, you blame ati's epic failure with tessellation on who? ROFL, NVIDIA.
Sorry bub, you're another one who is INSANE.
You won't face what IS - you want something that isn't - wasn't - or won't be - so keep on WHINING forever, looney tuner.
In the mean time, nvidia users outnumber you, enjoy the large amount of added benefits, and don't have a 2 billion dollar loss hanging over their heads - with a company that might collapse in bankruptcy - and lose support for the already problematic drivers.
You bought the wrong thing, now you have a hundred could be would be if and an's and garbage can complaints that have nothing to do with reality and how it actually is.
Fantasy red rooster fanboy.
LOL - it's amazing.
Freak fanboy......? Hatred....? Are you serious...or ....?
I don't hate any company, because I have no reason to. I blame Nvidia for not allowing those few game developers to include these great features in the few modern demanding games.
Accelerating PhysX processing on the GPU is a great idea, but is it worth it?
Is the CPU really unable to keep up with the GPU in games because of its slow physics processing ability?
Are those primitive PhysX effects really that heavy on a modern quad-core CPU?
These are questions that you should ask yourself before trolling for the idea.
And I have to remind you that ATI has implemented Havok physics effects in OpenCL, which, in case you don't know, is an open standard language, unlike Nvidia's CUDA, which is based on its own C-style code.
Talking about the drivers, I haven't had problems with my 4870 on MY Vista 64-bit OS, compared to my old 8800GTX, which almost gave me a heart attack.
As for your beloved Nvidia, we need Nvidia as much as we need AMD and Intel to keep the competition alive, which in turn will keep innovation going and keep prices in check.
Ohh... and thanks to the red camp, we can get a decent graphics card for less than $300. So have some respect for them.
As for you, I wonder how old you are, because you don't seem to be using mature logic.
Yeah, sure. Just like any red rooster, you never had a problem with ati drivers, but nvidia drove you nearly to a heart attack, but I have problems with logic or detecting a red rooster fanboy blabberer ! ROFL
Dude, you keep digging your own hole deeper.
Then you try the immature assault - another losing proposition.
YOU'RE WHINING about nvidia and yeah, you do blame them, and after going on like a lunatic about wanting offload to the cpu, you admit it might not work - yes I read your asinine quadcore offload blabberings, too, and your bloody ragging about nvidia "not letting" your insane fantasy occur - purportedly to advantage ati (not like you're banking on intel graphics).
So red rooster, keep crowing, and never face the reality that is, and carry that chip for all your imaginary grievances of what should be or what you say could have been.
In the mean time, know you are marked, and I know who and what you are, and I'm sure you'll have further whining and wailing about what nvidia did or didn't do for ati. LOL
roflmao
Logic ? ROFLMAO
" I, almost had a heart attack " said the sissy. lol " But I'm an objective person with logic ". roflmao
Another red rooster who cannot argue with the facts and the truth, and doesn't want them known.
Perhaps you'd notice, I didn't comment right away when the STORIED review came out, you FOOL.
I came days later, and made my comments after you had your bs fest of lies, so I don't expect a lot of responders, you DUMMY.
But you're here, and your response is calling for DEATH.
Now, if anyone needs to be banned, YOU DO.
Furthermore, I really don't care if you're here, and have enjoyed some of your posts, but the fact remains: where I have absolutely FACTUALLY refuted your BS in some of your posts, you have no response - other than your own personal rage.
I'll be glad to see how you can defend yourself, but you obviously cannot.
Go ahead, there's 22 pages, and I've pointed out your lies several times. Have at it. Good luck, just calling for DEATH, and spewing "ban him!" while carrying your torch of lies is just what I expect from someone who doesn't care what bs they spew.
You already claimed you can't understand - LOL - of course you can't, you'd have to straighten out yourself and your lies then.
Good luck doing that.
LOL - the folding was crap forever on ati, and now it's slower.
We know the release date for both cards, and the nvidia is already listed on the egg dude.
When you're a raging red rooster, nothing matters to you but lying for the 2 billion dollar loser - ati.
You cannot argue with facts and the fact of the matter is that you can't help the world find a cure for cancer or Alzheimer's by buying an ATI!
So those of you with an Internet connection, should buy an NVidia and fold@home all the time to help make the world a better place!
Take that ATI and your associated fanboys!
Folding at home is a total waste and is just an excuse to be smug and think you're special, so there to both of you.
"Oh I'm going to save the world by buying overpriced hardware and letting some university use it for studying the human genome. I'm such a humanitarian."
Please, you can justify your over indulgence any way you want but it still doesn't cover up the fact that you're trying to justify sitting on your asses instead of doing some real community work to help change the world.
Folding@home = Too fat and too lazy to really make an effort.
Uhh, dude, they're doing it at college, on like triple TESLA machines with the "supercomputer" motherboards - so you know, go get an education and start whining about unbelievable game framerates - that's what's really going on -
Professor cuda machine checker " What happened ? "
Gamer students " Oh, uhh, well it crashed again it was a Crysis, I mean uh, no crisis, last night and it took us about 5 hours to to reset the awesome TESLA cards. We'll come in tonight to keep an eye on it, and clean up the pizza boxes and lock up again professor."
" Very well."
WHO LOVES THE EDUCATION OF AMERICA? !!!
hahahahaha
Well, since you cannot argue with facts, it's a fact you are a stupid fanboy who doesn't know anything! Check your facts before you post something like that. It is a fact that you can do f@h with an ATI card, as I have been doing it for some time now. So STFU and go spill your hatred somewhere else!
You're not being honest there. A while back ati either couldn't do it at all (no port) - or it was so pathetic they had to make a new port - I know they did the latter, and as far as there having been a long stretch where it wasn't available, or just not used much since it was so pathetically slow in comparison, the fella has the right idea.
Furthermore, unless something has recently changed significantly, the ati port is still WAY slower than the Nvidia for folding.
So anyway, nice try, but telling the truth might actually be something the red rooster crew should start practicing... or perhaps not, considering that lying a whole heckuva lot might make those 2 billion dollars of ati losses into "sales" that make "overall a profit" a reality...
On the other hand, if people continuously notice the lying by the red fans, they might gravitate to the competition, for obvious reasons.
So, honesty, or more bs ? I think I know what you'll choose.
No mention of the death of the HD 4850X2, as the HD4890 trashes it on power consumption, price, availability, speed and OC-ability. No mention of the advantage of DX10.1 and the games available for it. Hey, even bad news is good news sometimes, by spotlighting. What is really missing is the bang-for-buck picture (bucks spent per performance increase), and talk of the price depression of the HD 4870 1GB model by $10 to $15, with $50 step increments.
4850 (125)[20.9] 4870 (185)[27.9] 4890 (235)[31.7]
4870X2 (400)[35.0]
Nvidia is cramping its own style:
250 (150)[21.8] 260-216-55 (180)[27] 275 (250?)[31.3]
280 (290)[30.9] 285 (340)[32.8]
The GTX 280 is dead now, overpriced for those trying to sneak into SLI. The GTX 260 is overlapped by the Core 216 55nm version you'd want to get, but Joe Consumer might mistakenly buy one of the two prior versions as old inventory gets cleaned out. The GTX 285's price is not justified, but more power to nVidia if they get the consumer's buck.
Gladly, judging by the low temps, the dual-slot exhaust is venting hot air properly, so the vendors are finally manufacturing cards with common sense.
Too bad we have gone the way with power hungry beastly cards needing two 6-pins.
Also, too bad the effects of AF and the 0x, 2xAA, 4xAA and 8xMSAA modes are not investigated. It would be interesting to see how saturated the units get as AF and AA get bumped, and what the best modes are for nVidia and AMD.
Oh, nice blurb for nVidia's shadow enhancement, but ATi/AMD's tessellation enhancement is just as much a hit-or-miss feature. Will AMD have a tech edge when DX11 tessellation cometh?
Hmm, that said, Derek might be crying, since he couldn't stop crowing about that 4850x2 last review - oh boy, you know - I guess he had the heads up and ati told him what card he needed to help push...
You know how things are.
Anyway, good observation.
LOL - another hidden red rooster bias uncovered...
Umm... look, when there's a new ati card, there's no talking about crunching down on former ati cards - OK ? That just is NOT allowed.
" No mention of the death of the HD 4850X2 as the HD4890 trashes the power consumption, price, availability, speed and OC-ability "
Dude, not allowed !
PS- Don't mention how this card is going to smash the "4870" "profit" "flagship" - gee now just don't talk about it - don't mention it - look, there's no rooster crying in fps gaming, ok ?
Props to ATi for delivering a very compelling product. I admit I've always been an Nvidia fan, and I'll generally forgive them a single generational performance loss to ATi, but I've recommended ATi products recently to friends due to their resurgent desirability.
That being said, am I the only one who detects a subtle but distinct underlying disdain for Nvidia? So they tried to market the hell out of you - so what? They are trying to sell cards here. Why the surprise that sales and marketing people are trying to do exactly what they're paid to do? Congrats for being smart enough to see it for what it is, but jeers for making an issue of it as if its some kind of new tactic. Has AMD/ATi never done the same?
CUDA and PhysX are compelling, but I agree not a good reason to overcome a significant gap between Nvidia and ATi at a comparable price point. You clearly agree, but it seems like what little praise you offer is begrudging in the extreme.
Nvidia has definitely acted in bad form in a number of ways throughout this very lengthy generation of hardware. However, you guys are journalists and in my opinion should make a more concerted effort to leave the vitriol and sensationalism at the door, regardless of who it is that is being reviewed. That kind of emotional reaction, personal opinion, irritation, etc is better served for your blog posts than a review article.
Love the site, keep up the good work. Nobody's perfect.
Yeah, thanks for noticing, too. It's been going on a long time. Notice how now, suddenly, when ati doesn't have 2560 sewn up - it doesn't matter anymore... LOL
Of course the "brilliantly unbiased" reviewers will claim they did a poll on monitor resolution usage, and therefore suddenly came to their conclusion about $2,000.00 monitor users, when they tittle-tattled for years about 10 bucks between nvidia and ati - and chose ati for the 10 bucks difference.
Yep, 10 bucks matters, but $1,700.00 difference for a monitor doesn't matter until they take a poll. Now they didn't say it, but they will - wait it's coming...
Just like I kept pointing out when they raved about ati taking the 30" resolution and not much if anything else, that declaring it the winner wasn't right. Now of course, when ati isn't winning the 30" rez - yes, well, they finally caught on. No bias here! Nothing to notice, pure professionalism, and hatred of cuda and physx for their inability to run on ati cards is fully justified, and should offer NO advantage to nvidia when making a purchase decision! LOL
OMG ! they're like GONERZ man.
Best review so far. And nice cards BTW, they are both worth it, but i like the 4890 better
Funny thing is that GTX 275 > GTX 280.
But my guess is that GTX 280 benefits more from overclocking.
Because of my PC's location I am concerned with idle power, and purchase based on that if other specs and price are even comparable. Peak power doesn't matter as long as it's within the capability of my 800W PSU.
I bought an ATI HD4850 last year because it idled significantly lower than the 4870, and it would run everything in sight. A great card. The Nvidia GTX 260 and 280 had even better performance vs idle power ratios but were way too expensive at the time.
So I think Nvidia takes the laurels now with the GTX 275. 30W less (!) than the HD 4890 at idle, with essentially the same performance. If I were shopping now it would be a VERY easy choice.
I really hope ATI can get their idle power down too. They need to pay more attention to throttling back or downpowering circuits that aren't needed in 2D modes.
The power consumption on the 4890 really interests me. While it uses more than 275 at idle, it uses less under load. Also, it is a significant drop from the 4870 which is a slower card.
So, on the charge of drivers: I recently went from an 8800GTX 512MB to an HD4870X2 2GB, and if anything I've seen stability improvements between the two. Or to put it another way, NV drivers were bluescreening my Vista install when I was doing nothing more than using my TV card, and it was crashing in a DirectDraw DLL. Nice.
Not to say AMD hasn't had issues; trying to use hardware acceleration with any bluray play back resulted in a bluescreen due to the gpu going into an infinite loop. Nice. Fortunately, unlike the DDraw error above, I could at least turn off hardware acceleration (and honestly, with an i7 it's not like I needed it).
So, stability wise it's a wash.
As for the memory usage complaints about CCC;
Unless it is running, it is NOT taking up physical memory. Like many things in the Windows world it might load something into the background, but this is quickly paged out and doesn't live in RAM. Even if it does live in RAM for a short period while inactive, it will be paged out as soon as memory pressure requires it. The simple fact is unused RAM is wasted RAM; this is why I'm glad Vista uses 10 gigs of my 12 for cache when it isn't needed for anything else - it speeds up the system.
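For anyone who'd rather verify that than argue about it, here's a rough little sketch I knocked together (Windows-only, my own toy code, nothing official from AMD or Microsoft) that prints a process's working set - the pages actually resident in RAM - next to its pagefile-backed commit. Point it at the CCC helper's PID from Task Manager and you can see for yourself how much of its commit actually stays resident once it sits idle. Build with a Windows compiler and link against psapi.lib.

/* toy working-set vs. commit checker - illustrative only */
#include <windows.h>
#include <psapi.h>
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char **argv)
{
    /* take a PID on the command line, or fall back to ourselves */
    DWORD pid = (argc > 1) ? (DWORD)atoi(argv[1]) : GetCurrentProcessId();
    HANDLE h = OpenProcess(PROCESS_QUERY_INFORMATION | PROCESS_VM_READ, FALSE, pid);
    if (!h) {
        fprintf(stderr, "OpenProcess failed: %lu\n", GetLastError());
        return 1;
    }

    PROCESS_MEMORY_COUNTERS pmc;
    if (GetProcessMemoryInfo(h, &pmc, sizeof pmc)) {
        printf("PID %lu\n", (unsigned long)pid);
        printf("  working set (resident in RAM): %lu KB\n",
               (unsigned long)(pmc.WorkingSetSize / 1024));
        printf("  pagefile-backed commit:        %lu KB\n",
               (unsigned long)(pmc.PagefileUsage / 1024));
    }
    CloseHandle(h);
    return 0;
}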
Cuda... well, the idea is nice and I like it, but as mentioned in the article, unless you have cross-vendor support it isn't as useful as it could be. OpenCL and, for games, DX11's compute shaders are going to make life interesting for both Cuda and AMD's option. I will say this much: I suspect you'll get better performance from NV, AMD and indeed Larrabee when it appears by going 'to the metal' with them, but as with many things in the software world you have to trade something for speed.
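Just to make the cross-vendor point concrete, here's a rough sketch (my own toy example, not taken from either vendor's SDK) of what the OpenCL route looks like: the same trivial vector-add kernel a CUDA program would run, except the source gets compiled at run time for whichever GPU the platform exposes - NV, AMD, or whatever comes later. The "vadd" kernel name and the buffer sizes are just illustrative.

/* toy OpenCL vector add - illustrative only */
#include <stdio.h>
#include <CL/cl.h>

static const char *src =
    "__kernel void vadd(__global const float *a,\n"
    "                   __global const float *b,\n"
    "                   __global float *c)\n"
    "{ int i = get_global_id(0); c[i] = a[i] + b[i]; }\n";

int main(void)
{
    enum { N = 1024 };
    float a[N], b[N], c[N];
    for (int i = 0; i < N; ++i) { a[i] = (float)i; b[i] = 2.0f * i; }

    /* grab the first platform/GPU we find - the vendor doesn't matter */
    cl_platform_id plat; cl_device_id dev;
    clGetPlatformIDs(1, &plat, NULL);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);
    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

    /* the kernel source is compiled here, for whatever card is present */
    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "vadd", NULL);

    cl_mem da = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR, sizeof a, a, NULL);
    cl_mem db = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR, sizeof b, b, NULL);
    cl_mem dc = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, sizeof c, NULL, NULL);
    clSetKernelArg(k, 0, sizeof da, &da);
    clSetKernelArg(k, 1, sizeof db, &db);
    clSetKernelArg(k, 2, sizeof dc, &dc);

    size_t global = N;
    clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(q, dc, CL_TRUE, 0, sizeof c, c, 0, NULL, NULL);

    printf("c[10] = %.1f (expect 30.0)\n", c[10]);
    return 0;
}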
Now, PhysX... well, this one is even more fun (and I guess it affects Cuda as well to a degree). Right now, with Vista, you can't run more than one vendor's gfx card in your system at once due to how WDDM 1.0 works; so it's AMD or NV and that's your choice. With Win7, however, the rules change slightly and you'll be able to run, with WDDM 1.1 drivers, cards from both vendors at once. Right away this paints an interesting landscape for those interested; if you want an AMD card but also want some PhysX hardware power, then you'll be able to slide in a 'cheap' NV card for that reason (or indeed, if you have an old series 8 card lying about, use that if the driver supports it).
Of course, with Havok going OpenCL and being free for games which retail for <$10 (iirc), this is probably going to be much of a muchness in the end, but it's an interesting idea at least.
Except you can run 2 nvidia cards, one for gaming, the other for physx.... so red fanboys are sol.
"Right now, with Vista, you can't run more than one vendor's gfx card in your system at once due to how WDDM1.0 works; so it's AMD or NV and that's your choice. "
WRONG, it's TWO nvidia or just ONE ati. Hello - you knew it - but you didn't say it that way - makes ati look bad, and we just cannot have that here....
You failed to read his post, and therefore the context of my response, you IDIOT.
Can you run a second ATI card for PhysX - NO.
Can you run an ati card and a second NV for PhysX - not without a driver hack - check techpowerup for the how to and files, as I've already mentioned.
So, THAT'S WHAT WE WERE TALKING ABOUT, DUMMY.
Now you can take your stupidity along with you; no one can stop it.
From an objective point of view there is not really a clear winner. At the lower resolutions do you really care if you are getting 80 FPS Vs 100 FPS?
IMO it is the higher resolutions that matter. I would think any real gamer is always looking to upgrade their monitor :).
I wonder how old you guys are that are posting? Who cares if something is "rebadged" or just an OC version of something? Bottom line is how does the card play the game?
IMO both cards are good. It comes down to price for me.
Ahh, you just have to pretend framerates you can't see or notice, and only the top rate or the average, never the bottom framerate...
Then you must discount ALL the OTHER NVIDIA advantages, from cuda, to badaboom, to better folding scores, to physx, to game release day evga drivers ready to go, to forced game profiles in nvidia panel none for ati - and on and on and on...
Now, after 6 months of these red roosters screaming ati wins it all because it had the top resolution of the 30" monitor sewed up and lost in lower resolutions, these red roosters have done a 180 degree about face.... now the top resolution just doesn't matter -
Dude, the red ragers are lying loons, it's that simple.
The 2 year old 9800X core is the 4870 without ddr5. Think about that, and how deranged they truly are.
I bet they have been fervently praying to their red god, hoping that change doesn't come in the form of ddr5 on that old g80/g92/g92b core - because then, instead of it competing with the 4850, it would be a 4870 - and THAT would be an embarrassment, a severe embarrassment. The crowing of the red roosters would diminish... and they'd be bent over sucking up barnyard dirt and chickseed for a long, long time. lol
Oh well, at least ati might get 2 billion from Obama to cover it's losses ... it's sad when a red rooster card could really use a bailout, isn't it ?
Well, you have a point there. But the card is still not operating on a WHQL driver, and the percentage of people who use 30" monitors is negligible compared to the owners of 22"/24" monitors.
I think this is probably due to the 256-bit memory interface, compared to the 448-bit one the GTX 275 has. Even at Xbit Labs the 4890 drops significantly in performance compared to the GTX 285.
From a subjective point of view you may feel that way, but from an objective point of view there is a clear winner, and it is the 4890. Left 4 Dead and Call of Duty are the only two 30" display tests where the 275 significantly defeated the 4890. In all of the other tests the 4890 either dominated (G.R.I.D., Fallout 3) or was within 4% of the 275, which I would call a wash. At all other resolutions the 4890 was the undisputed leader. So I find it difficult to say there is no clear winner.
What Nvidia should have done was not nerf their 22" and 24" resolutions with the latest drivers for the very few people that game at 30". To be honest, I wish the article had included all of the results from the 182 drivers (they show just G.R.I.D. but allude to other games also having similarly reduced results except at the highest res). It could very likely be a wash then, if the 275 is more competitive at the resolutions 99% of the people buying this level/price of card are going to be playing at.
Anand, any way you could post, even just in the comments, the numbers for the rest of the games with the 182 Nvidia drivers. I don't mind doing the comparison work to see how much closer the 275 would be to the 4890 if they had kept the earlier drivers.
Ah, I see now that the 185's are specifically to enable support for the 275 card. So you can't run the 275 with the 182 drivers. Still would be interesting to see all the data for what happened to the 285 using the newest drivers that decrease the performance at lower resolutions.
First, thanks for your review(s). I'm a silent reader and word-of-mouth spreader for years.
Second, don't you think reviewers should point their fingers a little more aggressively at power consumption? Not because it's trendy nowadays, but because it's just not sane to waste that much energy in idle (2D, anyone remember?) mode. I was thrilled by what you alone (don't take it as disrespect) were able to achieve on the SSD issue.
PSST! The ati cards use like 30 watts more power at idle - and like 3 watts less in 3D - so on the power thing - well, they just declare ati the winner... LOL
They said they were "really surprised" at the 30 watts less at idle for the nvidia - they just couldn't figure it out - and kept rechecking... but yeah... the 260 was kicking butt... but... that doesn't matter - ati takes the win using 1-3 watts less in 3D.
So, you know, the red roosters shall not be impugned !
capiche' ?
It appears that you may have had Vsync turned on which caps the game at 60fps in some of the CoD:W@W tests. It's pretty apparent something is up when the nVidia card has the same FPS at 1680x and 1920x. Either way it still seems like the 4890 wins at those resolutions which is different than most sites that pretty much say it's a wash across the board. I'll take nVidia's drivers over ATi's any day.
why didn't you post a page on how useless dx 10.1 is? I bet that there will be even less difference in gameplay with dx10.1 on compared to dx10 than physx
ati can't do physx at all - so uhh, no performance boost there, EVER.
Same with cuda.
Kinda likewise with this cool Vreveal clean up video thingie.
Same with the badaboom converter compared to ati's err(non mentioned terrible implementation)...lol hush hush!!! doesn't matter ! doesn't matter ! nothing going on there !
huh, gpu physx acceleration only helps when there are heavy physics calculations.
almost no game uses calculations that heavy nowadays.
besides, if you wanted physx to run, don't you need a second card to run PhysX while the other does the graphics?
i suspect there's a slowdown as well if the same graphics card does both jobs.
Since you suspect there's a slowdown with PhysX enabled it points out two things to me : 1. you have no clue if there is because you don't have an nvidia card, indicating your red rooster issue.
2. That's why you didn't get my point when the other poster linked to the other review and listed the various settings and I laughed while pointing out the NV said PhysX enabled.
_______
It's funny how your brain farts at just the wrong time, and you expose your massive experience weakness:
"you suspect" - you don't know.
Go whine at someone else, or don't at all. At least bring a peashooter to the gunfight.
Ever played War Monger or Mirror's Edge ? lol
No of course not! YOU CAN'T, until, you know.
Given that you label the price for the GTX260-216 as $205, and in reality it's closer to $175, can we expect the 275 will be closer to $215 in short order?
Of course ATi has a hard launch of this product - its hardware appears, from your description, to be identical to existing hardware, just with a slight clockspeed boost, whereas the GTX 275 is actually a break between the 285 and the 260.
Also the 275 is much more appealing given that it has actual hardware improvements over the 260 for just a bit more cash.
It's possible he saw the article before Anand/Gary corrected the table. Originally the table had identical numbers between the 4870 and 4890. And in the text they mentioned that it was basically an OC'd 4870. This could lead one to assume it was pretty much the same card. In the article (on the next page) they did mention quickly the high transistor count, but it was brushed over quickly and they didn't really go into detail about the differences (still waiting to get a response about the cooling solution changes).
As for the rest of his post, he IS clearly an Nvidia fanboy, because the 4890 is clearly the better product in just about every case (not even looking at the OC'ing potential which seems to be very nice).
Hmm, better in every case, without the overclocking potential ? lol He's a fanboy ?
Is it better with Cuda, curing cancer with Folding, Vreveal clean up video recoding, forced game profiles, dual GPU game forcing ? Any of those have an as good equivalent ? NO NOT ONE.
So you know dude, he's not the fanboy...
The thing is, ati did a good job with the ring of 3M transistors around the gpu to cut down electrical noise - and gain a good overclock.
That they did well. They also added capacitors and voltage management to the card - an expense that goes unmentioned in cost terms, as does the larger die.
So, on the one hand we have a rebranded overclock that merely used the same type of core reworking that goes into a shrink, but optimized for clocks with a transistor band around the outside.
Not a core rework, but a very good refinement.
I knew the intricacies would be wailed about by the red fans, but not a one is going to note that the G80 is NOT the same as the G92b - the refinements happened there as well in the die shrink, and in between, just like they do on cpu revisions.
Since ATI was making an overclocking upgrade, they needed to ring the core - and make whatever rearrangements were necessary to do that.
Purely a rebrand? Ah, not really, but downclocking it to the old numbers would likely reveal it's identical anyway.
At that point the rebrand label is tough to get away from, since NVIDIA's rebrands offered core revisions and memory/clock differences as well.
I'll give ATI a very slight edge because of the ring of capacitors, which is interesting, and may be needed for the DDR5 - the memory that made their core viable for competition to begin with, instead of just a 9800GTX+ equivalent, i.e. the 4850 - minus all the extra capabilities: CUDA, better Folding, PhysX, forced SLI, game profiles, vReveal, and on and on - EVGA game drivers on release day, the ambient occlusion in FEAR plus many other game mods for it...
Anyway, tell me none of that matters with a straight face - and that face will be so red you'll have to pay in wampum at the puter store.
It's the quality of the drivers, ATI. Call me when: A) you fix your broken drivers; B) you finally decide to ditch that bloated Microsoft Visual C++ dependency just so I can have the enormous privilege of using your - also terrible and also bloated - CCC panel; C) you stop polluting my PC with your useless extra services. Until then I'll carry on with NVIDIA. Thanks. > Happy NVIDIA user (and frustrated ex-ATI customer)
The quality of drivers can be argued either way and the negative connotations associated with "Drivers" and "ATi" are all but ancient concerns in the single GPU arena.
BS. I have a pretty beefy PC; that doesn't mean I'm going to stop demanding efficiency when it comes to memory usage, or stop wanting to reduce the sheer number of stupid services required to run a simple application. These are all FACTS about ATI. But hey, you are free to use Vista, buy ATI and end up with a system that is inferior and slower than mine (performance- and feature-wise).
BTW, to all the people claiming that CUDA and PhysX are gimmicks... Give me a freaking break! You hypocrites. These cards ARE HUGE GIMMICKS! BEHOLD the MEGA POWAAR! For what? Crysis? That's just ONE GAME. ONE. UNO. 1. Then what?... Console ports. At the end of the day 99.9% of games today are console ports. The fact is, you don't need these monstrosities in order to run that console crap. Trust me, you may get a boner comparing 600 fps vs 599, but the rest of the - sane - people here don't give a rat's ass, especially when the original console game barely runs at 30 fps to begin with.
The red roosters cannot face reality my friend. They are insane as you point out.
CUDA, PhysX, Badaboom, vReveal, the Mirror's Edge that addicted Anand - none of it matters to red roosters. The excessive heat at idle from ATI - also thrown out with the bathwater. Endless driver issues - forget it. No forced multi-GPU - never mind. No gamer profiles built into the driver for 100 popular games like NVIDIA has - forget it. Better Folding performance - forget it. NOT EVEN CURING CANCER MATTERS WHEN A RED ROOSTER FANBOI IS YAKKING ALONG ABOUT HOW THEY HAVE NO BIAS.
Buddy, be angry, as you definitely deserve to be - where we got so many full of it liars is beyond me, but I suspect the 60's had something to do with it. Perhaps it's their tiny nvidia assaulted wallets - and that chip on their shoulder is just never going away.
I see someone mentioned Nvidia "tortured" the reviewers. LOL
hahahaahahahahahahaaa
There is no comparison... here's one they just hate:
The top ati core, without ddr5, and neutered for the lower tier, is the 4830.
The top nvd core, without ddr5, and neutered for the lower tier, is the 260/192.
Compare the 260/192 to the 4830 - and you'll see the absolute STOMPING it takes.
In fact, go up to the 4850, still no DDR5 - and it's STILL a kicking to be proud of.
Now what would happen if NVIDIA put DDR5 on its HATED G92 "rebrand" core? LOL We both know - the 9800GTX+ and its flavors already compete evenly with the 4850 - if NVIDIA put DDR5 on it, it WOULD BE A 4870.
Now, that G80/G92/G92b "all the same" core - according to the raging red roosters who SCREAM rebranded - is SEVERAL-YEARS-OLD NVIDIA technology... that, well, is the same quality as the current top core ATI has produced.
So ATI is two years behind on the core - luckily they had the one company that was making DDR5 to turn that 4850 into a 4870 - same core, mind you!
So, the red roosters kept screaming for a GT200 "in the lower mid range" --- they kept whining nvidia has to do it - as if the 8800/9800 series wasn't there.
The real reason of course, would be - it could prove to them, deep down inside, that the GT200 "brute force" was really as bad or worse than the 770 core - bad enough that they could make something with it as lowly as the 4830...
Ahh, but it just hasn't happened - that's what the 2 year old plus rebrand of nvidia is for - competing with the ati core that doesn't have the ddr5 on it.
Well, this reality has been well covered up by the raging red rooster fanboys for quite some time. They are so enraged, so deranged, and so filled with endless lies and fudging that they simply missed it - or drove it deep into their subconscious, hoping they would never fully, consciously realize it.
Well, that's what I'm here for ! :)
To spread the good word, the word of TRUTH.
I just moved from the Cat 9's to the 182+'s when I upgraded to NVIDIA.
The "efficiency when it comes to memory usage" is a non-issue -- especially on a "beefy pc."
The Windows Task Manager is not a benchmark that leads to conclusive comparisons regarding quality. My NVIDIA GPU's software can (and has) occupied more memory, especially when I use nHancer to tap the super-sampling capabilities.
Also, it's worth noting that NVIDIA's latest driver download is 77.0 MB in size, yet ATI's latest is only 38.2 MB.
1) nHancer is just an optional utility - optional. If I want to check the GPU temps I just use GPU-Z or Everest; if I want to overclock I just use RivaTuner; for the rest I have the NVIDIA control panel.
The CCC is not only bloated crap, it also requires the MS .NET Framework and spawns like 45 extra services running non-stop ALL THE TIME, clogging my PC, and the thing doesn't even work! GREAT WORK ATI! CCC is stupidly slow and broken. See, I don't need a massive mega all-in-one solution that doesn't work and runs like ass.
2) YOUR NVIDIA GPU. YOUR. That's the key word here. Your fault. Just like that Windows Task Manager of yours, it seems to me you just didn't know how to use that NVIDIA GPU. And you need that knowledge in order to form conclusive comparisons regarding efficiency.
3) I made a GIF, just for you. Here. Try not to hurt yourself reading it.
Sorry to burst your bubble, but your screenshot is no proof; it's clear you removed a LOT OF PROCESSES just to take the screenshot. How about you take a FULL desktop screenshot that shows the NVIDIA panel loaded?
Because it doesn't seem to be in the process list.
Also, you're lying. I've got an ATI 3870, and I only have 3 ATI processes, one of them being the "A.I." tool for game optimizations (using the latest drivers).
And I agree with Anandtech for the first time ever: PhysX is just not ready to be the "big thing".
Most of the things they tout are just tech demos or very weak stuff; Mirror's Edge's PhysX, on the other hand, does indeed add a lot of graphical feel.
And funny you mention the framework, because a lot of new games and programs NOW NEED the .NET Framework foundation, or at least the C++ redistributable packages.
Also, lately I've been reading a lot of fuss about the 275's optimizations, which in many games force the game to use lower AA levels than the ones chosen, thus giving the "edge" to the 275 vs the 4890 (MSAA vs TRSS).
I suppose NVIDIA has been playing dirty again.
And it gets annoying how NVIDIA has been doing that for quite a while to keep up the dumb "opening a can of whoop-ass" routine.
"you removed a LOT OF PROCESSES", dumbass... if its not a necessary service, its turned off REGARDLESS OF THE videocard . maybe you should try to do the same? (lol just 3 ati services, run the ccc and see) btw, if i CHOOSE to use the nvidia control panel, THEN a new nvidia service starts, THEN as soon as I CLOSE THE control panel ( the process disappears from the task manager. THAT 3 ATI SERVICES OF YOURS ARE RUNNING ALL THE FRINGIN TIME, DUMMY, Remaining resident in YOUR memory REGARDLESS IF YOU USE OR NOT THE CCC. AND THEY GET BIGGER, AND BIGGER, AND BIGGER. ALSO YOU HAVE TO USE THE NET CRAP. (EXTRA SERVICES!) AND FINALLY, THE CCC RUNS LIKE ASS. so there, i WIN. YOU LOOSE. END OF STORY.
Hang on, since when do you need the CCC panel to be ON (i.e., loaded and not just in the tray) to play games?
Are you a bit dumb?
Second, why didn't you filter out the services, then?
Your screenshot is bull.
It's almost like you ran WinXP in safe mode just to take the screenshot and claim your "memory superiority".
Like I said, show us a full screenshot with the NVIDIA panel LOADED.
Your argument is stupid.
4 MB of RAM must be a LOT for you? (That's what my ATI driver currently uses on Vista x64...)
BTW, there's also an option on the ATI side to remove the panel from the tray.
The tray serves a similar function to ATI Tool (i.e., fast resolution, color depth and frequency changes).
Compare apples with apples if you want to have a smart conversation.
"Runs like ass" - makes me wonder how old you are. 14?
And my CCC runs just fine, thank you! Not a single error.
Also, I've got all the frameworks installed, and even with programs loaded I don't see any "framework" services or applications running, so please, get your head out of your ass.
you're just like this pseudo SiliconDr who spreads only FUD and insults.
Besides all your errors and the corrections issued, it comes down to you claiming "Don't load the software that came with the ATI card because it's a fat bloated pig that needs to be gotten rid of".
Yes, and most people who want the performance have to do that.
Sad, isn't it ?
You do know you got spanked badly, and used pathetic 3rd grader whines like "~ your screencap is fake" after he had to correct you on it all....
Just keep it shut until you have a valid point - stop removing all doubt.
Well thanks for stomping the red rooster into the ground, definitively, after proving, once again, that what an idiot blabbering pussbag red spews about without a clue should not be swallowed with lust like a loose girl.
I mean it's about time the reds just shut their stupid traps - 6 months of bs and lies will piss any decent human being off. Heck, it pissed off NVidia, and they're paid to not get angry. lol
"Mirrors Edge's PhysX in other hand does show indeed add a lot of graphical feel. " should have been : Mirrors Edge's physx in other hand, does indeed show a lot of new details.
Firstly, I agree with the article's basic premise that the lack of convincing titles for PhysX/CUDA means this is not a heavily weighted factor for most people.
I am not most people, however, and I enjoy running NVIDIA's PhysX and CUDA SDK samples and learning how they work, so I would sacrifice some performance/quality to have access to these features (even spend a little more for them).
The main point I would like to make, however, is that I like the fact that NVIDIA is out there pushing these capabilities. Yes, until we have cross-platform OpenCL, physics and GPGPU apps will not be ubiquitous; but NVIDIA is working with developers to push these capabilities (and 3D Stereo with 3D VISION) and this is pulling the broader market to head in this direction. I think that vision/leadership is a great thing and therefore I buy NVIDIA GPUs.
I realize that ATI was pushing physics with Havok and GPGPU programming early (I think before NVIDIA), but NVIDIA has done a better job of executing on these technologies (you don't get credit for thinking about it, you get credit for doing it).
The reality is that games will be way cooler when you extrapolate from Mirror's Edge to what will be around down the road. Without companies like NVIDIA out there making solid progress on delivering these capabilities, we will never get there. That has value to me, and I'm willing to pay a little for it. Having said that, performance has to be reasonably close for this to hold true.
Games will be better when we get better effects, and PhysX has some potential to do that. However, the past is a clear indication that developers aren't going to fully support PhysX until it works on every mainstream card out there. Pretty much it means NVIDIA pays people to add PhysX support (either in hardware or other ways), and OpenCL is what will really create an installed user base for that sort of functionality.
If you're a dev, what would you rather do: work on separate code paths for CUDA and PhysX and forget about all other GPUs, or wait for OpenCL and support all GPUs with one code path? Look at the number of "DX10.1" titles for a good indication.
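To put that two-code-paths choice in concrete terms, here's a minimal sketch (the names and structure are hypothetical, not real PhysX SDK or engine code): a CUDA kernel for the cards that support it, and a plain CPU fallback for everything else - exactly the duplicated maintenance a single OpenCL path would avoid.

// Hypothetical sketch of the "two code paths" problem -- not real PhysX SDK or
// engine code. The CUDA kernel only runs on NVIDIA hardware; everyone else gets
// the plain CPU loop, so the developer maintains the same physics step twice.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void stepParticlesGPU(float* posY, const float* velY, int n, float dt)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;   // one thread per particle
    if (i < n) posY[i] += velY[i] * dt;
}

static void stepParticlesCPU(float* posY, const float* velY, int n, float dt)
{
    for (int i = 0; i < n; ++i) posY[i] += velY[i] * dt;   // fallback for non-CUDA GPUs
}

static bool haveCudaDevice()
{
    int count = 0;
    return cudaGetDeviceCount(&count) == cudaSuccess && count > 0;
}

int main()
{
    // This branch is the maintenance burden being argued about: a single
    // OpenCL path could serve every vendor's hardware instead.
    std::printf(haveCudaDevice() ? "using CUDA physics path\n" : "using CPU fallback\n");
    return 0;
}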
NVIDIA has certainly received credit for getting accelerated physics moving, but its momentum stops at the point where they couple it to CUDA, keeping it off discrete graphics cards outside the GeForce family.
Still no 3D Mark scores, STILL no low-med resolutions.
Thanks for including the HD4850, where's the GTS250??? Or do you guys still not have one? Well, you could always use a 9800GTX+ instead, and actually label it correctly this time. Anyway, thanks for the review and all the info on CUDA and PhysX; pretty much just confirmed what I already knew; none of it matters until it's cross-platform.
3DMark can be found in just about every other review. I personally don't care, but I realize people compete on the ORB, and since it's a simple benchmark to run it probably could have been included without much work. The only problem I see (and agree with) is how heavily both NVIDIA and ATI optimize for the Vantage/3DMark benchmarks. They don't really tell you much about anything, IMO. I'd argue they don't tell you about future games (to my knowledge no game has ever used an engine from these benchmarks), nor do they tell you much when comparing cards from different brands, since both companies look for every opportunity to tweak for the highest score, regardless of whether it has any effect on real-world performance.
What low-to-mid resolution are you asking for? 1280x1024 is the only one I'd like to see (as that's what I and probably 25-50% of all gamers are still using), but I can see why in most cases they don't test it: you have to go to low-end cards before playable framerates become an issue at that resolution on anything 4850 and above. Xbitlabs' review did include 1280x1024, but as you'll see, unless you are playing Crysis: Warhead, or to a lesser extent Far Cry 2 with maxed graphics settings and high levels of AA, you are normally in the high double to triple digits in terms of framerate. At any resolution lower than that, you'd have to be on integrated video to care!
Yes, exactly why the added value of CUDA, PhysX, Badaboom, vReveal, the game profiles ready in the NV panel, the forced SLI, the ambient occlusion games and their MODS (see back a page or two in the comments) - all MATTER to a lot of gamers.
Let's not forget card size for htpc'ers - heat, dissipation, H.264 etc.
Only the frame rates matter here, and only for ATI - formerly at 2560x when ATI held that crown; now, of course, only at the lower resolutions, which have suddenly become the most important to the same reviewers, now that ATI is stuck down there.
Yeah, PATHETIC describes the dismissal of added values.
I have a CUDA-supporting GPU (8800GTS) and I have rarely used the capability, other than to run the CUDA version of Folding@home (there is also an ATI Stream version) or to look at the pretty effects in a few games. I don't really think these effects are particularly worthwhile, and unless the industry comes together and supports a standard like OpenCL, I don't see GPU-based processing becoming important to most users.
Here's a clue as to why you're already WRONG.
Most "gpu users" use NVidia. DUH.
So while you're whistling in the dark, it's already past that time when your line of crap has any basis in reality.
It takes a gigantic red fanboy brain fart to state otherwise.
Oh well, since when did facts matter when the red plague is rampant?
Anand doesn't do overclocked-part comparisons in the videocard wars - BUT DON'T worry - a red rooster exception with charts and babbling is no doubt coming down the pike.
Keep begging, then they can "respond to customer demands". lol
Oh man, this is going to be fun.
I suggest they start with the Gainward GTX 260 "Goes Like Hell" overclock, which whips every single 4870 1GB XXX ever made. Sound good?
Uhh ati is losing a billion a year.
If you want card specifics, that's probably difficult to calculate - and loss leaders are nothing new in business - in fact that's what successful businesses use as a sales tool. Seems ATI has taken it a bit too far and made every card they sell a loss leader, hence their billions in the hole.
Now, as far as the NVIDIA card in question: even if Obama takes over the mean greedy green machine, he and his cabal "won't release the information because it's just not fair and may cause those not really needing help at the money window to be exposed".
So no, you won't be finding out.
The problem is anyway, if a certain card is a loss leader, they calculate how much other business it brings in, and that makes it a WINNER - and that's the idea.
The physx/cuda section was interesting, although it sounded a bit... whiny.
I would LOVE it if someone would write an article about all the PR and marketing shenanigans that go on with reviewers behind the scenes. It'll never happen because it would kill any relationship the author has with the companies, but I bet it would be an eye opening read.
I'm of the opinion that PhysX is good and will only get better with time. Yet I more than acknowledge that CUDA is going to hinder its adoption so long as NVIDIA remains unwilling to decouple the two.
There was a big thread about this on the Video forum, and some people just can't get past the fact that CUDA is proprietary and OpenCL is not. As long as you have that factor, hardware vendors are going to refrain from supporting their competitor's proprietary parallel programming, and because of that, developers will continue to aim for the biggest market segment.
PhysX set the stage for non-CPU physics calculations, but that is no longer going to be an advantageous trait for them. They'll need to improve PhysX itself, and even then they will have to provide it to all consumers -- whether they have an ATI or an NVIDIA GPU in their system. They'll have to do this because Havok will be doing it, with OpenCL serving as the parallel programming layer instead of CUDA, thereby allowing Havok GPU-accelerated physics on all OpenCL-compliant GPUs.
The problem is, by the time PhysX becomes the norm, you will be on your NVIDIA GTX 480 :P
It happened to AMD and their x64 technology - it took quite a while to take off.
Well, if these cards reduce the price of earlier cards, it's just a good thing :-)
On ATI's part the changes are not big, but they make the product better. It's a better overclocker than its predecessor, and it has better power circuitry. It's just an okay upgrade, like the Phenom II was compared to the original Phenom (though the 4870 was and still is a better GPU than the Phenom was a CPU...).
NVIDIA's 275 offers a good upgrade over the 260, so not bad at all, provided those rumors about shady preview samples turn out to be false. If the preview parts really are beefed-up versions... well, NVIDIA would be in some trouble, and I really think they would not be that stupid, would they? All in all, the improvement from the 260 to the 275 seems bigger than from the 4870 to the 4890, so the competition is getting tighter. So far so good.
In real life both producers are keen on getting their DX11 cards ready for the DX11 launch, so this may be a quite boring year on the GPU front until the next generation comes out...
Both cards performed well and the performance differences are small. I can buy the 4890 today on Newegg but not the 275. I know the 4890 is a new chip; even if it is just a refined RV770, it's still a NEW part. It falls within an easily understood hierarchy in the 4800 range. Bottom line, I know what I'm getting. The 275 I can't buy today, and it appears to be another recycled part with unclear origins. NVIDIA's track record of musical labeling is bothersome to me. I want to know what I'm buying without having to spend days figuring out which version is the best bang for the buck. Come on NVIDIA, this is a problem and you can do better than this. CUDA and PhysX aren't enough to sway me on their own merits, since most of the benefits require me to spend more money; yes, they add value, but at what expense?
nutjob, you're not smart enough to own NVidia. Stick with the card for dummies, the ati.
Here's a clue "overclocked 4870 with 1 gig ram not 512, not a 4850 because it has ddr5 not ddr3 - so we call it 4870+ - no wait that would be fair, if we call it 4870 overclocked, uhh... umm.. no we need a better name to make it sound twice as good... let's see 4850, then 4870, so twice as good would be uhh. 4890 ! That's it !
There ya go... So the 4890 is that much better than the 4870, as it is above the 4850, right ? LOL
Maybe they should have called it the 4875, and been HONEST about it like NVidia was > 280 285 ...
No ATI likes to lie, and their dummy fans praise them for it.
Oh well, another red rooster FUD packet blown to pieces.
Dude, you missed the whole point; must be the green blurring your vision. NVIDIA takes an existing chip and reduces its capacity, or takes one that doesn't meet spec and puts it out as a new product, or they take the 8800, then the 9800, then the 250, then... that is rebadging. The 4850 and 4830, same-same. Grading chips is nothing new, but NVIDIA keeps rebadging OLD (but good) chips and releasing them as if they were NEW, which is where my primary complaint about NVIDIA graphics cards comes from.
4890 might not be an entirely new core but they ADDED to it, rearranged the layout, in the end improving it, they didn't SUBTRACT from it. It is more than a 4870+. It is a very simple concept that apparently you are unable to grasp due to your being such a fanboy. So you don't like ATI, I don't care, I buy whoever has the best bang for the buck that meets my needs not what you think.
ATI looked at the market and decided to hit the midrange and expand down and up from there. They went where most of the money is, in the midrange, not high end gaming. They are hurting and a silly money flag ship doesn't make sense right now. If Nvidia wasn't concerned with the 4890 they wouldn't have released another cut down chip. Put down the pipe and step away from the torch.... Seek help.
So your primary problem is that you think NVIDIA didn't rework their layout when they moved from G80 to G92 to G92b, and you don't like the fact that they can cover the entire midrange by doing that, because of the NAME they use when they change the bit width, the shaders, the memory speed, etc. - BUT
when ATI does it, it's okay, because they went for the midrange. You admit the 4850 and 4830 are the same core, but fail to mention the 4870, or to fairly include the 4890 as well - because it's OK when ATI does it.
Then you ignore all the other winning features of nvidia, and call me names - when I'M THE PERSON TELLING THE TRUTH, AND YOU ARE LYING.
Sorry bubba, doesn't work that way in the real world.
The real horror is ATI doesn't have a core better than the G80/G92/G92b - and the only thing that puts the 4870 and 4890 up to 260/280 levels is the DDR5, which I had to point out to all the little lying spewboys here already.
Now your argument that ATI went for the middle indicates you got that point, and YOU AGREE WITH IT, but just can't bring yourself to say it. Yes, that's the truth as well.
Look at the title of the continuing replies "RE: Another Nvidia knee jerk" - GET A CLUE SON.
lol
Man are you people pathetic. Wow.
If 3,000,000 more transistors is "basically a rebadge", you have no idea how much work goes into designing a chip as opposed to changing the stamper on the chip-printing machine. I would speculate that ATI/AMD made some interesting progress on their next-gen chip design and applied it to the RV770; it worked, so they're selling it now to fill a hole in the market.
It sounds like you are struggling to deal with NVIDIA's constant rebadging and have to point the finger and claim ATI/AMD is doing it too. Where did the 275 chip come from? Yes, it is a good product, but how many names do you want it called?
I have bought just as many Nvidia cards as I have ATI/AMD based on bang for the buck, just calling it like I see it...
Well, they reworked it for overclocking - and apparently did a fine job of that - but it is a rebadging nonetheless.
It seems the less-than-half-a-percent of "new core" transistors are used as a sort of multi-capacitor ring around the outside of the core, for overclocking headroom. Not bad, but not a new core. I do wonder - since they hinted they "did some rearranging" - whether they had to spend some of those transistors on the core itself, lengthening or widening or bridging this or that, or on connections to the BIOS for volt modding or what have you.
When either company moves to a smaller die, a similar effect is had on the core: some movements, fittings and optimizations always occur, although this site always jumped on the hate-and-lie bandwagon to screech about "rebranding" - as well as "confusing names", since the cards were not all the same... bit width, memory type, size, shaders, etc.
So I'm sure we would hear about the IMMENSE VERSATILITY of the awesome technology of the ati core (if they did the same thing with their core).
However, what they've done is a rebranding with a ring around it for the overclock. Nice, but same deal.
Can you tell us how much more expensive it's going to be to produce, since Derek and Anand decided to "not mention the cost", given they didn't have the green monster to bash about it?
Oh that's right, it's RUDE to mention the extra cost when the red rooster company is burning through a billion a year they don't have - ahh, the great sales numbers, huh ?
The other point is, when you've been whining about NVIDIA having a giant brute-force core that costs too much to make, and about how that gives ATI a huge price and profit advantage (even though ATI has been losing a billion a year), then when ATI makes a larger core, a more expensive board, and a standard cooler upgrade for their rebrand, you point out the greater expense - in order to at least appear fair, and not be a red raging rooster rooter.
Got it there bub ?
Sure hope so.
Next time I'll have to start charging you for tutoring and reading comprehension lessons.
Uh, for you, the mentally handicapped, the point is since ati made a rebrand, call it a rebrand, especially when you've been screeching like a 2 year old about nvidia rebrands, otherwise you're a lying sack of red rooster crap, which you apparently are.
Welcome to the club, dumb dumb.
I hope that helps with your mental problem, your absolute inability to comprehend the simplest of points. I would like to give you credit and just claim you're being a smart aleck, but it appears you are serious and haven't got clue one. I do feel sorry for you. Must be tough being that stupid.
Oh sorry, forgot forced SLI profiles, and I don't want to fail to mention something like EVGA's early release NVidia game drivers for games on DAY ONE. lol
Aww, red rover red rover send the crying red rooster right over.
Did I mention ati lost a billion bucks two years in a row for amd ?
No ?
I guess Derek and Anand forgot to mention that the larger die and the more expensive components on the RV790 boards will knock down "the profits" for ATI. LOL Yeah, aww... we just won't mention cost when ATI's goes up - another red rooster sin of omission.
I ought to face it, there are so many, I can't even keep up anymore.
They should get ready for NVidia stiffing them again, they certainly deserve it - although it is funny watching anand wince in text as he got addicted to Mirror's Edge - then declared "meh" for nvidia.
lol - it's so PATHETIC.
You proved you can't read and comprehend properly on the former page, where I had to correct you in your attempt to whine at me - so forget it - since you can't read properly ok nummy ?
Ahh, thank you very much. lol
NVIDIA wins again !
rofl
I'm sure the ATI card buyers will just hate it... but of course they are so happy with their pathetic "only the framerates matter - formerly at 2560 for the win, now at lesser resolutions for the win".
It just never ends - CUDA, PhysX, Ambient Occlusion, Badaboom, vReveal, the game presets INCLUDED in the driver, the ability to use your old 8- or 9-series NVIDIA card for PhysX or CUDA in a Crossfire board alongside another NVIDIA card for main gaming...
I know, NONE OF IT MATTERS !
The red rooster fanbois hate all of that! They prefer a few extra frames at well-above-playable framerates, at whatever resolutions suit their fanboy spin on the card release (formerly 2560, now just the lower resolutions) - LOL - frames they cannot even notice unless they are gawking at the yellow Fraps number while they get cut down in cold blood in the game.
Ahhh, the sweet taste of victory at every turn.
You can't trust every site you check, especially since most of those sites don't post their funders' names on their main page. You must have heard of HardOCP's Kyle, who was cut off by NVIDIA because he mentioned that the GTS 250 is a renamed 9800GTX.
I think this is due to NVIDIA shooting themselves in the foot with the 185 drivers. With the performance penalty at normal resolutions, anyone testing with the 185s is going to get lower results than someone testing with the previous drivers. And I'm sure you could find 10 games that all perform better on ATI, or 10 that all perform better on NVIDIA. That's the problem with game selection, and the only real answer is what types of games you play and which engines you think will be used heavily for the next two years.
Well, the REAL ANSWER is: if you play at 2560, or even if you don't, and you've been a red raging babbling lying idiot red rooster for six-plus months pretending along with Derek that 2560x is the only thing that matters, now there's a driver for NVIDIA that whips the top ATI core...
If you're ready to reverse six months of red ranting and raving that "at 2560x ATI wins it all", just keep the prior NV driver, so the red roosters screaming they now win - since they're suddenly stuck claiming a win at the LOWER-rez tier - can be blasted to pieces at that resolution anyway.
So NVIDIA now has a driver choice: the new one for the high-rez crown they took from the red fanboy ragers, and the prior one, which SPANKS THE RED CARD AGAIN at the lower rez.
Make sure to collude with all the raging red roosters to keep that as hush-hush as possible.
1. Spank the 790 at the lower rez with the older NVIDIA driver.
2. Spank the 790 at the highest rez with the new driver.
_______________________
Don't worry if you can't understand; just keep hopping around, flapping those little wings and clucking so that red gobbler bounces around - don't worry, software PhysX can display that flabby flapper!
Can someone ban this freaking idiot. The last few posts of his have been nothing but moronic, senseless rants. Jesus Christ, buy a gun and shoot yourself already.
Ahh, you don't like the points, so now you want death. Perhaps you should be banned, mr death wisher.
If you don't like the DOZENS of valid points I made, TOO BAD - because you have no response - now you sound like krz1000 and his endless list of names, the looney red rooster that screeches the same thing you just did, then posts a link to youtube with a freaky slaughter video.
If I weren't here, the endless LIES would go unopposed. Now GO BACK and respond to my points LIKE A MAN, if you have anything - which, no doubt, you do not.
According to Xbitlabs, the 4890 beats the GTX 285 at 1920x1200 with 4x AA in CoD5, Crysis Warhead, STALKER: Clear Sky and Fallout 3, and loses in Far Cry 2. Here, the 4890 merely matches the GTX 285 in Far Cry 2 and CoD5, and has slightly lower fps in Crysis Warhead.
That is crazy. There is no way variations should be that huge between the 2 tests, regardless of the area they chose to test in the game. Anandtech has it as essentially a wash, while Xbit has the 4890 20% faster!?! (COD:WaW)
Just looked closer at the Xbitlabs review. The card they used was an OC variant that had 900MHz core instead of the stock 850MHz. In certain games that are not super graphically intensive I'm willing to bet at 1920X1200 they may still be core starved and not memory starved so a 50MHz increase may explain the discrepancy.
I've got to admit you need to take the Xbitlabs article with a grain of salt if they are using the OC variant as the base 4890 in all of their charts....that's pretty shady...
And now you can disregard everything I typed (aside from the different driver versions). Xbit apparently downclocked the 4890 to stock speeds. So I have no clue how the heck their numbers are so significantly different, except that they have this posted in their system settings:
ATI Catalyst:
Smoothvision HD: Anti-Aliasing: Use application settings/Box Filter
Catalyst A.I.: Standard
Mipmap Detail Level: High Quality
Wait for vertical refresh: Always Off
Enable Adaptive Anti-Aliasing: On/Quality
Other settings: default
Nvidia GeForce:
Texture filtering – Quality: High quality
Texture filtering – Trilinear optimization: Off
Texture filtering – Anisotropic sample optimization: Off
Vertical sync: Force off
Antialiasing - Gamma correction: On
Antialiasing - Transparency: Multisampling
Multi-display mixed-GPU acceleration: Multiple display performance mode
Set PhysX GPU acceleration: Enabled
Other settings: default
If those are set differently in Anand's review I'm sure you could get some weird results.
Good to know you blame everyone else for "poor reading comprehension".
let's see
ATI Catalyst:
Smoothvision HD: Anti-Aliasing: Use application settings/Box Filter
Catalyst A.I.: Standard
Mipmap Detail Level: High Quality
Wait for vertical refresh: Always Off
Enable Adaptive Anti-Aliasing: On/Quality
Other settings: default
Nvidia GeForce:
Texture filtering – Quality: High quality
Texture filtering – Trilinear optimization: Off
you see the big "NVIDIA GEFORCE:" right below "other settings"?
that means the physX was ENABLED on the GEFORCE CARD.
More personal attacks, when YOU are the one who can't read, you IDIOT.
Here are my first two lines: LOL - set PhysX GPU acceleration enabled.
roflmao
_____
Then you tell me it says PhysX is enabled - which is exactly what I pointed out. You probably did not go see the linked test results at the other site and put two and two together.
Look in the mirror and see who can't read, YOU FOOL.
Better luck next time crowing barnyard animal.
"Cluckle red 'el doo ! Cluckle red 'ell doo !"
Let's see: I say PhysX is enabled, and you scream at me to point out that it says PhysX is enabled, and you call me an NVIDIA fan because of it - which would make you an NVIDIA fan as well, by your own logic, IF you knew what the heck you were doing, which YOU DON'T.
That makes you - likely - a red rooster... I may check on that - hopefully you're not a noob poster too, as that would reduce my probabilities in the discovery phase. Good luck; you'll likely need it after what I've seen so far.
I don't see the fun in shooting cloth and unrealistically non-impact-resistant windows in high-rise buildings. The video with the cloth was distracting; it made me wonder why it was there. What was its purpose? My senior eyes did not see much improvement in the videos from the CUDA application.
Maybe someday you'll lose your raging red fanboy bias, break down entirely, toss out your life's religion, and buy an NVIDIA card. At that point perhaps Mirror's Edge will come with it, and after digging it out of the trash can (you had second thoughts), you'll try it, and like Anand, really like it - turn it off, notice what you've been missing, turn it back on, and enjoy. Then after all that, you can crow "meh".
I suppose after that you can revert to being a raging red rooster fanboy - you'll have to have your best red bud rip you away from your Mirror's Edge addiction, but that's okay, he's a red and will probably smack you for trying it out - and he'll have a clean shot with how absorbed you'll be.
Well, that should wrap it up.
Are the driver issues for AMD really so significant that they need to be mentioned in a review article? I'm asking in all honesty, as I don't know. Also, about this close relationship NVIDIA has with developers: does it show up in any games as a significant performance edge for NVIDIA cards? Is there an example game out there for this? Thanks.
Look no further than this article. :) Here's the quote:
"The first thing about Warmonger is that it runs horribly slow on ATI hardware, even with GPU accelerated PhysX disabled. I’m guessing ATI’s developer relations team hasn’t done much to optimize the shaders for Radeon HD hardware. Go figure."
But ATI also has some relations with developers that show an unusually high advantage as well (Race Driver G.R.I.D. for example). All in all, as long as no one is cheating by disabling effects or screwing with draw distances, it only benefits the consumer for the games to be optimized. The more one side pushes for optimizations, the more the other side is forced, or risk losing the benchmark wars (which ultimately decides purchases for most people).
The conclusion mentions NVIDIA's partners releasing OC boards but says nothing about AMD's. There are already two versions of the XFX HD4890 on Newegg: one with an 850MHz core and the other with an 875MHz core.
The HD4890 is geared to open up that "OC" SKU segment for AMD. People with stock cooling and stock voltage can already push the card to 950+ MHz. On the ASUS card you can boost the GPU voltage, which has allowed people to get over 1GHz on the GPU. As the card matures, 1GHz cores on stock cooling and voltage will become a reality.
Don't worry, it is mentioned in the article their overclocking didn't have good results, so they're keying up a big fat red party for you soon.
They wouldn't dare waste the opportunity to crow and strut around.
This was about announcing the red card, slamming NVIDIA for being late to market, denouncing CUDA and PhysX, and making an embarrassingly numerous set of "corrections" to the article, including declaring the 2560 win not a win anymore, since the red card didn't take it.
That's OK - be ready for the change back to "2560 is THE BEST and wins" when the overclock review comes out.
:)
Don't worry be happy.
SD, you seriously have a mental problem, right?
I've noticed that you keep bashing and being sarcastically insulting (among other things) to anyone who supports ATI.
No, not true at all, there are quite a few posts where the person declaring their ATI fealty doesn't lie their buttinski off - and those posts I don't counter.
Sorry, you must be a raging goofball too who can't spot liars.
It's called LOGIC; that's what you use against the liars - you know, scientific accuracy.
Better luck next time - If you call me wrong I'll post a half dozen red rooster rooters in this thread that don't lie in what they say and you'll see I didn't respond.
Now, you can apologize any time, and I'll give you another chance, since you were wrong this time.
I just finished a mid-range C2D build, and decided to go with the HD 4870 512MB version for $164.99 (ASUS, no sale at NE, but back up to $190 now). This was my first ATI card and it was a no-brainer. While the 4890 is a better card, to me, it is not worth the nearly $100 more, especially considering I'm gaming at either 1920x1200 on a 40" LCD TV or a 22" LCD monitor at 1680x1050.
NVIDIA has lost me for the time being, after 12 years as a fanboy, I suppose. What I will do, when I have more time, is figure out whether buying another 4870 512MB for CrossFire is the better bang for the buck at my resolutions, or whether I should move up to the 4890 when the price drops this summer and then sell the 4870.
Thanks for the GREAT review AT, and now I have my homework cut out for me for comparisons with your earlier GPU reviews.
I've been sitting on my 8800GT for a while now and was thinking about going to a 4870 1GB model, but now I may hold off and see what prices do.
If the 4890/275 force the 4870 down in price then great I'll go with that, but on the other hand if prices slip from the new parts off of the $250 mark then I'll be tempted by that instead.
Either way I think I'm waiting to see how the market shakes out and in the end I, the consumer, will win.
I haven't checked the benchmark numbers yet, but you list Forceware 185's on the NV side, and Cat 8.12's on the AMD side.
Any reason why you are using brand-new drivers for one side and four-release-old drivers on the other?
Sure, NV might not support a card on older drivers, etc., but it's more useful to see newest driver vs. newest driver, since there are performance changes from the 8.12s to the 9.3s in various games. It just seems silly to use 3+-month-old drivers which have been superseded by three revisions already. Would anyone buy a brand-new card and then use drivers from 3-4 months ago?
The setup chart has been updated. We utilized previous test results from the HD4870 with the 8.12 HotFix drivers. The HD 4890 results were with the 9.4 beta drivers. I just went through a complete retest of the HD4870 with the 9.3 drivers and just started with the 9.4 betas on the AMD 790FX and Intel P45/X58 platforms. Except for a slight performance improvement in Crysis Warhead on the P45/X58 systems, the 9.3 drivers offered zero performance improvements over the 8.12 HotFix. The 9.3 drivers do offer a few updated CF profiles and improved video playback performance, but that is it.
"the 9.3 drivers offered zero performance improvements over the 8.12 HotFix."
Not only that, but if you read the ATI forums, a lot of people have been having MAJOR problems with 9.2 & 9.3. I downloaded 9.1 for my new 4870 build and have had no problems.
Now now, ati doesn't have any driver problems, so don't you DARE go spreading that lie. You hear me ?
Hundreds of red roosters here have NEVER had a driver problem with their ati cards.
Shame shame.
The Inq (or The Reg) is pretty adamant that the 275 is there just to spoil the 4890 launch but will never really be in stores, and that reviewers are getting special high-spec, high-overclock samples handed out by NVIDIA instead of by the manufacturers.
Did your card come from an anonymous purchase, or was it a special review sample ?
We mentioned in the article that we received a review sample from NVIDIA. The clock speeds are the same as the retail cards that should start arriving later next week with widespread availability around the week of 4/13. Our retail review samples will arrive early next week.
As we stated several times in the article, this is a hard launch for AMD and yet another paper launch for NVIDIA, although according to the board partners the GTX275 is coming quickly this time around.
I also feel confused and somewhat misled by the conclusion that the GTX275 is "the marginal leader at this new price point of $250".
That conclusion doesn't seem true outside of resolutions of 2560x (30" monitor). Plus, $250 cards aren't even targeted or seriously considered for gaming at that resolution.
I came up with a different conclusion after thoroughly reading this review:
At the $250 price point and 1680x or 1920x gaming resolutions (where these cards primarily matter), the 4890 holds the majority performance advantage. However, at 2560x the GTX275 performs a bit better than the 4890. Realistically though, at 2560x or on 30+" displays, you're best served by a dual GPU or SLI/Xfire solution.
Something's fishy about the reviewers' conclusion.
Update: I guess Anand was making his updates while I was making my post, so the "marginal leader at this new price point of $250" line is gone and the Final Words actually now reflect my own personal conclusion above.
You agree now that NVIDIA has moved their driver to win at the 2560 rez, after you WHINED for months on end about NVIDIA not winning at the highest rez even though it took everything lower.
So of COURSE, now is the time to claim 2560 doesn't matter much, and suddenly ROOT for RED at the lower resolutions.
If NVIDIA screws you out of cards again, I certainly won't be surprised, because you definitely deserve it.
Thanks anyway for changing Derek's six-month-plus mindset in which only the highest resolution mattered, as he had been ranting and red-raving about how wonderful those cards were.
That is EXACTLY WHY his brain FARTED and he declared NVIDIA the top dog - it's how he's been doing it for MONTHS.
So good job there, you BONEHEAD - you finally caught the bias, just when the red rooster cards FAILED at that resolution.
Look in the mirror - DUMMY - maybe you can figure it out.
As I predicted elsewhere, they probably should have named this new card the GTX 281. In almost every single benchmark and resolution it beats the 280. In one case it even beat the 285 somehow.
/Gripe
That said, Go AMD! I wanna check other sites and see if they benched with the card highly over-clocked. One site got 950 core and 1150 memory easily but they didn't include it on the graphs :(
Hey guys, I just wanted to chime in with a few fixes:
1) I believe Derek used the beta Catalyst driver that ATI gave us with the 4890, not the 8.12 hotfix. I updated the table to reflect this.
2) Power consumption data is now in the article as well, 2nd to last page.
3) I've also updated the conclusion to better reflect the data. What Derek was trying to say is that the GTX 275 vs. 4890 is more of a wash at 2560 x 1600, which it is. At lower than 2560 x 1600 resolutions, the 4890 is the clear winner, losing only a single test.
Thank you Anand for the update and the article changes. I think that will quell most of the comments so far (mine included).
Could you possibly comment on the temps posted earlier in the comments section? My question is whether there are significant changes with the fan/heatsink between the stock 4870 and the 4890. The idle and load temps of the 4890 are much lower, especially when the higher frequency is taken into consideration.
Also a request to describe the differences between the 4890 and the 4870 (several comments allude to a respin that would account for the higher clocks, lower temp, different die size).
Thank you again for all of your hard work (both of you).
Yeah, I would also second a closer comparison between RV790 and RV770, or at least mention it. It's got new power phases, different VRM (7-phase vs 5-phase respectively), slightly redesigned core (AT did mention this) and features a revised HS/F.
I was very happy to see the PhysX details. I'd started worrying I might be missing out with my 4870. It's clear now that I'm not missing out on PhysX, but I might be missing out on some great encoding performance with CUDA.
I'll be looking forward to your SLI/CrossFire follow-up, hoping to see some details about performance with the ultra-high anti-aliasing that's only available with SLI/CrossFire. I used to run two 4850s and enjoyed the high-end edge anti-aliasing. Unfortunately the pair of 4850s put out too much heat in a tiny Shuttle case, so I had to switch to a single 4870.
Your review reinforced something that I'd been feeling about the 4800s. There isn't much to complain about when running 1920x1200 or lower with modest AA. They seem well positioned for most gamers out there. For those out there with 30" screens (or lusting after them, like myself)... while the GTX280/285 has a solid edge, one really needs SLI/Crossfire to drive 30" well.
Must be tough trying to write a balanced review when you clearly favour one side of the equation. Seriously, you toe NV's line without hesitation, including soon-to-be-extinct PhysX, a card released only to reviewers, and drivers that were unreleased at the time of your review. And here's the kicker: you ignore the OC potential of AMD's new card, which, as you know, is one of its major selling points.
Could you possibly bend over any further for NV? Obviously you are perfectly willing to do so. F'n frauds
What?! Did you even read the article? They specifically say they cannot really endorse PhysX or CUDA, and they note the lack of support in games. I think you're the one toeing a line here.
The red fanboys have to chime in with insanities so the reviewers can claim they're fair because "both sides complain".
Yes, the red rooster whiner never read the article, because if he had he would remember the line saying that neither card overclocked well, and that overclocking would come in a future review (in other words, they were rushed again, or got a chum card and knew it - whatever).
So, they didn't ignore it , they failed on execution - and delayed it for later, so they say.
Yeah, red rooster boy didn't read.
Jesus dude, you have a strong persecution complex, right?
It's like "oh noes, they're going against my beloved NVIDIA, I MUST STOP THEM AT ALL COSTS".
I wonder how much NVIDIA pays you? (If they don't, that's just sad...)
That's interesting, not a single counterpoint, just two whining personal attacks.
Better luck next time - keep flapping those red rooster wings.
(You don't have any decent counterpoints to the truth, do you, flapper?)
Sometimes things are so out of hand someone has to say it - I'm still waiting for the logical rebuttals - but you don't have any, neither does anyone else.
All these guys talking about how irrelevant PhysX is and how few games use it don't get it. The power of PhysX and CUDA is bringing the full strength of those GPUs to bear on everyday apps like CS4 or Badaboom video encoding. I used to think it was kind of gimmicky myself until I bought the "very" inexpensive Badaboom encoder, and wow, how awesome was that! I forgot all about the games.
You forgot all about gaming because you can encode video faster? I guess we are just 2 different people. I don't think I've ever needed to encode a video for my ipod in 60 seconds or less, but I do play a lot of games.
There's a big difference between Havok Physics and HavokFX physics. With PhysX you can just turn on hardware acceleration and it works; with Havok this is not possible - unlike PhysX, it was never developed to run on the GPU. Hence Havok has had to develop a new physics engine to do that.
No game uses the HavokFX engine - it's not even available to developers yet, let alone in shipped games. The ATI demo is all we have seen of it for several years. It's not even clear HavokFX is a fully hardware-accelerated physics engine - i.e., the version shown in the past (before Intel took over Havok) was basically the Havok engine with some hardware acceleration for effects; hardware acceleration could only be used for prettier explosions and rippling cloth - it could not be used to do anything game-changing.
Hence Havok has a way to go before they can even claim to support what PhysX already does, let alone ship it to developers and then see them use it in games. Like I said, the moment that comes close to happening, NVIDIA will just release an OpenCL version of PhysX and that will be that.
Again... PhysX is dead. OpenCL is HavokFX; it's what the consortium has chosen, and it runs on any CPU or GPU, including Intel's upcoming Larrabee.
Like I said before (you seem to not understand logic), PhysX is dead... it's proprietary and not as flexible as Havok. Many studios are also already familiar with Havok's tools.
I think you're mistaken - OpenCL is analogous to CUDA, not to PhysX. HavokFX is analogous to PhysX. OpenCL is the GPGPU framework that runs on any GPU (and theoretically it should run on any CPU too, I think). It's what Apple is now pushing (curious, given that their laptop line is all NVIDIA now).
However, if NVidia ports PhysX to OpenCL, that's a win for everyone. Sort of. Except for NVIdia that paid a lot of money for the PhysX IP. I think that the conclusions given are accurate - NVidia is banking on "everyone" (ie Game Developers) coding for PhysX (and by extension, CUDA) rather than HavokFX (and by extension, OpenCL). However, if Developers are smart, they'll go with the actually open format (OpenCL, not CUDA). That means that any physics processing they do will work on ANY GPU, (NVidia and ATI). I personally think that NVidia banked badly this time.
While I do believe that doing physics calculations on unused GPU cycles is a great thing (and the Razor's Edge demo shows some of the interesting things that can be done), I think that NVidia's pushing of PhysX (and therefore CUDA) is like what 3dfx did with pushing GLide. Everyone supported Direct3D and OpenGL, but only 3dfx supported Glide. While Glide was more efficient (it was catering to a single hardware vendor that made Glide, afterall), the fact that Game Developers could instead program for OpenGL (or Direct3D) and get all 3D accelerators supported meant that the days of Glide were ultimately numbered.
I wonder if NVidia is trying to pull the industry to adopting its CUDA as a "standard". I think it's ultimately going to fail, however, given that the industry recognizes now that OpenCL is available.
Is OpenCL as mature as CUDA is? Or are they still kind of finalizing it? Maybe that's the issue - OpenCL isn't complete yet, so NVidia is trying to snatch up support in the Developer community early?
CUDA is in many ways a simplified version of OpenCL - CUDA knows what hardware it will run on, so it has set functions to access it, while OpenCL is obviously much more generic since it has to run on any hardware, so it's not quite as easy. That's part of the reason why CUDA is, initially at least, more popular than OpenCL - it's easier to work with. That said, they are very similar, so porting from one to the other won't be hard - hence, develop for CUDA now and just port to OpenCL when the market demands it.
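To show how mechanical that port really is, here's a minimal sketch (a made-up element-wise kernel, not taken from any real engine or SDK): the CUDA version, with the OpenCL equivalent in the comment below it - only the indexing built-ins and qualifiers really change.

// Minimal sketch: the same trivial "apply gravity to particles" kernel in CUDA.
__global__ void applyGravity(float* velY, int n, float dt)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;   // CUDA built-in thread index
    if (i < n)
        velY[i] -= 9.81f * dt;                       // per-particle update
}

/* The OpenCL version of the same kernel - structurally almost identical:
__kernel void applyGravity(__global float* velY, int n, float dt)
{
    int i = get_global_id(0);                        // OpenCL work-item index
    if (i < n)
        velY[i] -= 9.81f * dt;
}
*/

The host-side setup is where they differ more: CUDA launches against a known runtime with the <<<blocks, threads>>> syntax, while OpenCL typically builds the kernel from source at run time and needs explicit platform/device/queue plumbing - which is the "easier to work with" gap being described.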
In my opinion, all ATI wants is for their hardware to run with whatever physics standard is out there. Right now they are at a growing competitive disadvantage as hardware physics slowly takes off. Hence they demo HavokFX in the hope that either (a) it takes off or (b) NVIDIA is forced to port PhysX to OpenCL. I don't think they care which one wins - both products belong to a competitor.
Nvidia who have put a lot of money into PhysX want to maximise their investment so they will keep PhysX closed as long as possible to get people to buy their cards, but in the end I am sure they are fully aware they will have to open it up to everyone - it's just a matter of when. From our standpoint the sooner the better.
Sure, but my point was simply that HavokFX and PhysX are physics API's, whereas OpenCL and CUDA are "general" purpose computing languages designed to run on a GPU.
Is CUDA easier to work with? I don't really know, as I've never programmed for either. Is OpenGL harder to program for than Glide was? Again, I don't know, I'm not a developer.
ATI's "CUDA" was "Stream" (I think). I recall ATI abandoning that for (or folding that into) OpenCL. That's a sound strategic decision, I think.
If PhysX is ported to OpenCL, then that's a major win for ATI, and a lesser one for NVidia - the PhysX SDK is already free for any developer that wants it (support costs money, of course). NVidia's position in that market is that PhysX currently only works on NVidia cards. Once it works elsewhere (via OpenCL or Stream), NVidia loses that "edge". However, that's a good thing...
I guess you're forgetting that NVIDIA recently supported a rogue coder who was porting PhysX to ATI drivers. Those drivers hit the web and the downloads went through the roof - and ATI stepped in and slammed the door shut with a lawsuit and threats.
Oh well, ATI didn't want you to enjoy PhysX effects. You got screwed, even as NVidia tried to help you.
So now all you and Derek and anand are left to do is sourpuss and whine PhysX sucks and means nothing.
Then anand tries Mirror's Edge ( because he HAS TO - cause dingo is gone - unavailable ) and falls in love with PhysX and the game. LOL
His conclusion? He's an ATI fanboi, so he cannot recommend it.
1. The 9.4 beta was used for the HD 4890 and the chart has been updated to reflect it. The 9.3 drivers are not any faster than the 8.12 HotFix for the other AMD cards in every test I have run but Crysis Warhead with a Core 2 Quad. A few improvements have been made for CF compatibility and video playback though.
2. The conclusion has been updated to clarify our thoughts between the two cards.
An ATI GPU with an NVIDIA GPU doing PhysX? I'm curious to see results from this kind of arrangement: not a dedicated PCI PhysX card, but a full video card on a faster bus with a more powerful processor. I'm wondering about the pitfalls, the performance, and how the applications actually look.
Sorry bub, you're stuck with ati, and as far as curiosity for physx - uhh... don't worry, you're not missing much, anand only got addicted to it for a bit.
If you want the driver hack for it, there's a thread at techpowerup.
Some genius figured something out on it- not sure which os.
He could say it because he said it for ati for 6 months, when ati won the top resolution. His brain has been in "fart mode", lying for ati for so long, that he said it this time for nvidia - either that, or he realized that if he didn't he would look like an exposed raging red rooster fanboy.
Good thing the reds started screaming NOW, after loving it for 6 months when their card was on top using the false method - because anand came in and saved the day - and changed the conclusion - for ati.
LOL
When nvidia doesn't give them a card for review again, it will be "them toeing the line of honesty" that causes it, no doubt, right?
BWAHAHAHAAAAAAA
( you all just tell yourselves that along with our dear leaders )
"On the NVIDIA side, we received a reference version of the GTX 275."
You wish.
"since there is no 275 ASIC, NV is telling OEMs that they can make it from either a 280 or 260 board. One costs much more, and one performs better, so guess what everyone is going to use?
That isn't necessarily bad, but how NV is seeding reviewers is. They are only going to be giving out a very special run of ONLY 280 based parts.
Quite special 280 based parts at that. Reviewers beware, what you are getting is not what you can buy."
A lot of people seem to be crying about the lack of temp, power consumption, OC, and fan noise numbers... while I agree that in a stand-alone review these are glaring omissions, the fact is there are a dozen reviews around the web where you can get that info in triplicate. I would much rather have the insight on CUDA and PhysX!
I mean, people act like the internet isn't free and we aren't all a Google search/mouse click away from that type of info! Geez.
That said, I suppose reviews must be treated as "stand-alone", however artificial a construct that may be. But if there's anything that's easily forgivable to leave out, it's simple data numbers that can be found at a thousand other places - which is exactly what temps, OC, etc. are. I already know those numbers from a ton of other reviews. The people whining in the comments act like Anand's is the only hardware review site there is. I would think anybody truly interested in laying out $250 to purchase one of these would be looking at more than one review!
I second the question about why you used the Catalyst 8.12 Hotfix. Other sites are using what appears to be a beta Catalyst 9.4 driver, so is your listing of Catalyst 8.12 a misprint?
Also, why do you care if AMD sent you an overclocked version? The HD4890 is directly targeted at overclocking enthusiasts, a realm AMD has ignored up until now while NV embraced it.
The HD4890 has already been taken to 1+GHz on its GPU and up to 4.8GHz on its memory on other sites. That by far makes it the better buy.
Forgot to mention that by having this extremely overclockable card, AMD has opened up another entire SKU for themselves by selling "OC" versions of the 4890.
Oh great, a whole other sku to lose another billion a year with. Wonderful. Any word on the new costs of the bigger cpu and expensive capacitors and vrm upgrades ?
Ahh, nevermind, heck, this ain't a green greedy monster card, screw it if they lose their shirts making it - I mean there's no fantasy satisfaction there.
Get back to me on the nvidia costs - so I can really dream about them losing money.
I am not sure about you guys, but NVIDIA has problems with their drivers as well. I have a 9400GT and an 8800 GTS in my machine, and the new drivers can't make the two work well enough for my computer to come out of hibernation without Windows XP crashing every so often. This used to work just fine before I upgraded the drivers to the latest version.
There it is again, 30 watts less idle for nvidia, and only 3 watts more in 3d. NVIDIA WINS - that's why they left it out - they just couldn't HANDLE it....
So, if you're 3D gaming 91% of the time and only 2D surfing 9% of the time, the ati card comes in at equal power usage...
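For what it's worth, that 91/9 break-even split does check out arithmetically - a quick back-of-the-envelope using the 30 W idle / 3 W load deltas quoted above (my own arithmetic, not numbers from the review):

$$
3\,\mathrm{W}\cdot f_{\text{load}} = 30\,\mathrm{W}\cdot f_{\text{idle}},\qquad f_{\text{load}} + f_{\text{idle}} = 1
\;\Rightarrow\; f_{\text{load}} = \tfrac{10}{11} \approx 91\%,\quad f_{\text{idle}} \approx 9\%
$$

Game less than roughly 91% of the time and the 30 W idle saving dominates; only at about 91% gaming or more do the two cards average out to the same draw.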
Otherwise, it LOSES - again.
I doubt the red raging reviewers can even say it. Oh well, thanks for posting numbers.
Can anyone confirm whether or not the heatsink/fan has been altered between the 4870 and the 4890? I'm interested to know if the decreased temps of the higher clocked 4890 are due in part to a better cooling mechanism, or strictly from a respin/binning.
Yes, the cooler has been slightly revised. I believe it's a combination of both. I'll admit I'm a bit disappointed AT didn't explore the differences between the HD 4870 and the 4890 more in-depth.
"It looks like NVIDIA might be the marginal leader at this new price point of $250." you wrote
But looking at your own benches..
Since you run your benches at 3 resolutions, let's reasonably declare that the card that wins 2 or more of them "wins" that game. In that case the 4890 wins over the 275 in: COD WaW, Warhead, Fallout 3, Far Cry 2, GRID, and Left 4 Dead. The 275 wins over the 4890 in Age of Conan. With AA or without, the results stay the same.
The only way I think you can contend the 275 has an edge is if you place a premium on the 2560x1600 results, where it seems to edge out the 4890 more often. However, that's often at unplayable framerates. Further, I don't see a reason to place undue importance on the 2560x benches; the majority of people still game on 1680x1050 monitors, and as you yourself noted, Nvidia released a new driver that trades off performance at low res for high res, which I think is arguably neither here nor there - not a clear advantage at all.
Even at 2560 (using the AA bar graphs because it's often difficult to spot the winner at 2560 on the line graphs), where the 275 wins 5 and loses 2, the margins are often so ridiculously close it's essentially a tie. The 275 takes AoC, COD WaW, and L4D by a reasonable margin at the highest res, while the 4890 wins Fallout 3 and GRID comfortably. Warhead and Far Cry 2 are within 0.7 FPS, although nominally wins for the 275. That's a difference of all of 3-2 in materially relevant wins, or exactly 1 game. But keep in mind again that the 4890 is fairly clearly winning the lower resolutions more often, and to me it's wrong to state the 275 has the edge.
The funny thing is, if you're in those games constantly looking at your 5-10 fps difference at 50-60-100-200 fps - there's definitely something wrong with you.
I find reviews that show the LOWEST framerate during the game useful when it's a very high resolution and a demanding game - usually more useful when the playable rate is hovering around 30 or below 50 (and dips a ways below 30).
Otherwise, you'd have to be an IDIOT to base your decision on the very often, way over playable framerates in the near equally matched cards. WE HAVE A LOT OF IDIOTS HERE.
Then comes the next conclusion, or the follow on. Since framerates are at playable, and are within 10% at the top end, the things that really matter are : game drivers / stability , profiles , clarity, added features, added box contents (a free game one wants perhaps).
Almost ALWAYS, Nvidia wins that - with the very things this site continues to claim simply do not matter, and should not matter - to ANYONE they claim - in effect.
I think it's one big fat lie, and they most certainly SHOULD know it.
Note now that NVidia - having released their, according to this site, high-resolution driver tweak for 2560x1600 - wins at that resolution, and the review calmly states it doesn't matter much, most people don't play at that resolution - and recommends ati now instead.
Whereas just prior, for MONTHS on end, when ati won only the top resolution, and NVidia took the others, this same site could not stop ranting and raving that ATI won it all and was the only buy that made sense.
It's REALLY SICK.
I pointed out their 30" monitor for ATI bias months ago, and they continued with it - but now they agree with me - when ATI loses at that rezz... LOL
Yeah, they're shysters. Ain't no doubt about it.
Others notice as well - and are saying things now.
I see Jarred is their damage control agent.
How can the conclusion be that the 275 is the leader at the price point? The benchmarks are clearly in favour of the 4890 apart from the extreme end 2560x1600.
Deja vu again, and again, and again. I've posted in no less than 3 other articles how bad some of the conclusions have been. There is NO possible way you could conclude the 275 is the better card at anything other than the 30" display resolution. Not only that, but it appears with the latest Nvidia drivers they are making things worse.
Honestly, does anyone else see the parallel between the original OCZ SSD firmware and these new Nvidia drivers? Seems like they were willing to sacrifice 99% of their customers for the 1% that have 30" displays (which probably wouldn't even be looking at the $250 price point). Nvidia, take a note from OCZ's situation; lower performance at 30" to give better performance at 22-24" resolutions would do you much better in the $250 price segment. You shot yourselves in the foot on this one...
The conclusion has been clarified to reflect the resolution results. It falls right into line with your thoughts and others as well as our original thoughts that did not make it through the edits correctly.
Yup, I responded to Anand's post with a thank you. We readers just like to argue, and when something doesn't make sense, we're quick to go on the attack. But also quick to understand and appreciate a correction.
There is only 1 single benchmark out of 7 where the 275 has better frame rates at the 1680 and 1920 resolutions against the 4890, and yet your final words are that you favor the 275???? Only at 2560 is the 275 clearly the better choice. Are you already in the year 2012, where 2560 might be the standard resolution? It is only very recently that 1680 became standard, and even that resolution is high for global OEM market sales. Your 2560 is not even a few % of the market.
I think you have to clarify your final words a bit more with your choice.... Perhaps if we saw power consumption, fan noise, etc. that would add value to the choice, but for now, TWIMTBP is really not enough of a push to prefer the card. I am sure the red team will improve their drivers as usual, too.
Anything else I missed in your review that could counter my thoughts?
Derek has been caught in the "2560 wins it all no matter what" mindset, after months on end of ati taking that cake since the 4870 release. No lower resolution mattered for squat, since ati lost there - so you'll have to excuse his months-long brainwashing.
Thankfully anand checked in and smacked it out of his review just in time for the red fanboys to start enjoying lower-resolution wins while nvidia takes the high-resolution crown, which is - well... not a win here anymore.
Congratulations, red roosters.
Just as an add-on, I also checked some other reviews (yes, I always read AnandTech first as my main source of info) and I saw that it runs cooler than a 4870 and actually consumes about 10% less than a 4870, so this can't be the reason either, while the 275 stays at the same power consumption as the 280. Also, OC parts have already shown GPU clocks above 1000....
This is an extreme omission. The fact that the 4890 is essentially an overclocked 4870 means that with virtually nothing changed you HAVE to show the temps. I still stick by my earlier comment that the Vapochill model of the Sapphire 4870 is possibly a better card, since its temps are significantly lower than the stock 4870 while already being overclocked. I could easily imagine that for $50-60 less you could have the performance of the 4890 at cooler temps (by OC'ing the Vapochill further).
Umm, they - you know the AT bosses, don't like the implications of that. So many months, even years, spent on screeching like women about nvidia rebranding has them in a very difficult position.
Besides, they have to keep up the illusion of superior red power usage, so only after demand will they put up the power chart.
They tried to get away without it, but they couldn't.
Yeah, I don't know why they're playing this off as an RV770 overclock. RV790 is indeed a respin of RV770, but hey if nV can get by with 1000 different variants on the same GT200... Why not mention the benefits/differences between the RV770 and the RV790? Disappointed.
I guess they didn't mention the differences? Tell you what, when ati gets 999 more rebrands and catches up with their competitor, we'll call it even, ok?
In the meantime, the 4870 crossfires with the 4890, and soon enough we'll have the average-joe reviewers who downclock the 4890 and find it has identical results to a same-clocked 4870 - at that point the red roosters will tuck their flapping feathers and go home.
I know, it's hard to see it coming when all you can see is a tiny dot of red in a sea of 1000 choices of green. rofl
According to info at other sites, the 4890 has 3 million more transistors (959 million instead of 956 million - very little difference). It also has a somewhat larger die due to tweaks made to allow the higher clocks.
Go to the Firing Squad or Xbitlabs review; both have a certain ATI slide that explains the small changes in detail.
" Because they’re so similar, the Radeon 4870 and 4890 can be combined together for mix-and-match CrossFire, just like the 4850 and 4870. "
I guess it's not a rebrand.
roflmao
The piece on PhysX kinda mirrors my thoughts on it - it's not worth basing a GPU purchasing decision on it because it affects so few games. If you design your game around PhysX, you end up making a gimmicky game; if you design a good game and think of good ways to let PhysX enhance it, you can make something good like Mirror's Edge.
The way I think about PhysX is based on Amdahl's law, which says that the overall speedup from an enhancement that only accelerates a certain class of work is limited by the fraction of time actually spent on that work. In the case of PhysX, the amount of time spent using it is generally extremely low, and when it is used the effect isn't always noticeable or worth having.
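Putting that in formula form (standard Amdahl's law, with my own made-up numbers rather than figures from the article):

$$
S_{\text{overall}} = \frac{1}{(1 - f) + \frac{f}{s}}
$$

where f is the fraction of frame time spent on work the feature actually accelerates and s is how much faster that work gets. If GPU physics touches, say, f = 0.05 of the frame and speeds it up tenfold (s = 10), the overall gain is 1/(0.95 + 0.005), about 1.05 - under 5%, which is roughly why PhysX on its own rarely moves the needle.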
NVidia's marketing tactics leave a lot to be desired frankly, although I'm not naive enough to say AMD never tries a little marketing manipulation themselves.
Why on earth would you compare a newly released Nvidia driver to an ATI driver from December last year - and a hotfix at that? The latest ATI drivers have had substantial improvements in a few games, and surely they would have sent you an up-to-date driver with the 4890 review sample - something's not right there. Also, where were the overclocking comparisons? (Some reviews state a 1GHz core on the 4890 is no problem.) What about temps and stock cooling fan noise?
I'm a bit disappointed with the ATI card. That is pretty much the Sapphire Vapochill model with increased core (actually it's a slightly slower memory setting). At least the GTX 275 is something different.
Wow lol... both cards are just rehashes. Calling the Nvidia card "something different" is a hell of a stretch - it's just their same other cards with various clocks twiddled for the trillionth time.
If anything the ATI card brings more to the table, as it offers much more clock headroom (1GHz is said to be well within reach) due to its redesign, while the Nvidia card is nothing at all new intrinsically (aka it will overclock similarly to the 285). To be fair, Nvidia's better clock-capable model (the 285) just came out a couple of months earlier instead of now.
If the 4890 is in fact a respin then I retract my original comment. My point, if it were just a simple OC, was that they were basically rebranding (binning) parts that could clock higher than the stock 4870 and selling them as a new card. That seems not to be the case, and I can't be at fault if the Anand article didn't address this.
Regardless of whether the Nvidia card is in fact similar to the other offerings it does have disabled/enabled parts that do make it different than the 285 and 260.
I'd still really like to see one of the Vapochill units up against the 4890. I'm pretty confident you could get to the stock 4890 speeds, so it's just a matter of whether $70 is worth the potential to OC much higher than the 4870 (if these 1gig core clocks are the norm).
What we really need to see though is the temps for these cards under idle/load. That would be extremely helpful in deciding how good they are. For example if we see the 4890 at its stock speed is significantly cooler than the 4870 (and they haven't done much to the heatsink/fan), then the Vapochill 4870's just don't stand a chance. If we find the 4890's are similar or higher in temp than the stock 4870's, then it seems much more like a rebadge job.
If you look at it objectively the GTX 275 is something more different than the HD4890 unless there are undercover changes in the latter of which we haven't been made aware. HD4890 = clock bumped HD4870 exactly, GTX 275 = 240SP 448-bit memory interface GT200b which was not available as a single GPU card.
Meh... Madman, Nvidia are still just playing around with the exact same modular component sets they have been, not adding anything new. Besides, as even you alluded, it isn't even a new card; it's just half of the exact previously existing configuration in a GTX 295.
But as I said, the 285 is clocked higher than the 280; I'm assuming Nvidia did a die tweak to get there (at the least they switched to 55nm). They just did it 3 months ago or whenever; ATI is just now getting to it.
But for todays launches, imo ATI brings more new to the table than Nvidia, ever so slightly.
Right, they brought ambient occlusion to the table with their new driver.... LOL
Man , I'm telling you.
The new red rooster mantra " shadows in games do not matter " !
( "We don't care that nvidia does it on-die and in drivers, making it happen in games without developer implementation! WE HATE SHADERS/SHADOWS, who cares!" )
I mean you have to be a real nutball. The Camaro car shop doesn't have enough cherry red paint to go around.
I wonder if the red roosters body paint ATI all over before they start gaming ? They probably spraypaint their Christmas trees red - you know, just to show nvidia whose boss...
Unbelievable. Shadows - new shadows not there before - don't matter... LOL
roflmao
I'm surprised they didn't mention it, maybe they hadn't been properly briefed, but yes the HD 4890 IS a different core than the HD 4870.
It uses a respin on the RV770 called RV790 which has slight clock-for-clock performance increases and much better power efficiency than the RV770. Case in point: higher clocks yet lower idle power draw. It's supposed to clock to 1 GHz without too much hassle granted proper cooling also.
This review was kind of a letdown for me. It almost seems Nvidia's sales rep terrorized you so much over the last year that you felt compelled to write about CUDA and PhysX. But just as you said from the beginning, it's not a big deal.
As a trade-off, temperatures, noise, and power seem to have gone missing. You talk about Nvidia's new driver, but what about ATI's new driver? Did you really test the ATI cards with "Catalyst 8.12 hotfix" as is stated on the test page?!? Surely ATI sent you a new driver, and the performance figures seem to support that. It is my understanding that ATI has upped their driver performance in the last months just like Nvidia has. No mention of IQ except for Nvidia's new drivers. No overclocking, which I had heard would be one of the strong points of the ATI card, with a 1 GHz GPU a possibility. I know you mentioned you would look at it again, but just crank up the damn cards and let us know where they go.
Don't get me wrong, the article was good, but I guess I just want more ;)
ATI seems to win at "my" resolution of 1680x1050, but then again Nvidia has some advantages as well. Tough call, and I guess price will have to settle this one.
I agree noise and temps should be in all reviews. So should image quality comparisons. While we are at it 2d performance and image quality comparisons should really be part of any complete review. It seems frame rates are all review sites care to report.
You and others want more, yet keep bitching about mentions such as CUDA and PhysX. If Anandtech doesn't mention them, someone complains about why they weren't included in the test - the recent buyers guide, for example. And when they do mention them and say they don't do much and leave it at that, there's bitching going on. I really don't get you guys sometimes.
Well it's funny, isn't it - with the hatred of NVidia by these reviewers here. Anand says he "has never played Mirror's Edge" - but of course it has been released for quite some time. So Anand, by chance, with the red rooster gone, has to try it - of course he didn't want to, but they had to finally mention CUDA and PhysX - even though they don't want to.
Then Anand really does like the game he has been avoiding; it's great, he gets near addicted, shuts off PhysX, notices the difference, turns it back on and keeps happily playing.
Then he says it doesn't really matter.
Same for the video converter. Works great, doesn't matter.
CUDA - same thing, works, doesn't matter. And don't mention folding, because that works better on NVidia - has for a long time; ATI has some new port, not as good, so don't mention it.
Then Ambient Occlusion - works great, shadows - which used to be a very, very big deal are now with the NVidia implementation on 22 top games, well, it's a "meh".
There's only so many times so many added features can work well, be neat, be liked, and then the reviewer, even near addicted to one game because of the implementation, says "meh", and people cannot conclude the MASSIVE BIAS slapping them in the face.
We KNOW what it would be like if ATI had FOUR extra features Nvidia didn't - we would NEVER hear the end of the extra value.
Anand goes so far as to hope and pray openCL hits very soon, because then Havok via ATI "could soon be implemented in some games and catch up with PhysX fairly soon".
I mean you have to be a BLIND RED ROOSTER DROOLING IDIOT not to see it, and of course there is absolutely no excuse for it.
It's like cheering for democrats or republicans and lying as much as possible depending on which team you're on. It is no less than that, and if you don't see it glaring in your face, you've got the very same mental problem. It's called DISHONESTY. Guided by their emotions, they cannot help themselves, and will do everything to continue and remain in absolute denial - at least publicly.
The fact you have to pay extra on top of the card prices to use these features is a no go. You start to lose value, thus negating the effect these "features" have.
p.s ATI have similar features to nvidia, what they have is nothing new.
One? I count four or five. I never had to pay extra outside the card cost for PhysX, did you?
You see, you people will just lie your yappers off.
Yeah ati has PhysX - it's own. ROFLMAO
Look, just jump around and cluck and flap the rooster wings and eat some chickseed, you all can believe eachothers LIES. Have a happy lie fest, dude.
Personally, while you bring up good points, I'd much, much, MUCH rather have the thorough explanation of CUDA and PhysX and the relevance thereof that they gave us than the power, heat, and overclocking numbers you can get in dozens of other reviews. The former is insight, the latter just legwork.
I had a 4850 that I bought at launch. I was very excited when ATI released their Video Convertor app. I spent days trying to make that produce watchable video. Then I realized that every website that tested it had the same result. They released a broken POS and have yet to fix it. I did not appreciate them treating me like that so when I replaced the card I switched out to Nvidia. I have gone back and forth but this time I think I will stick with Nvidia for a while.
And by buying Nvidia you already knew that you didn't have a POS, so in the end you have the same result - except for the fact that the 48xx series really had a true performance advantage in that price range, so your rebranded replacement just gave you 1) additional cost and 2) really 0 added value. So your grass is a bit too green.....
"0 added value"? Really? He didn't have a GPU video converter that worked on his ATI card, and now he DOES have a working program with his Nvidia card. Sounds like added value to me. He gets the same performance, pretty much the same price, and working software. Not a bad deal...
Is the $30 price tag of Badaboom included in the "pretty much the same price"? If it isn't, then actually there is no added value. You have a converter (value, well, only if your goal is to put videos on your iPod and it's worth $30 to you to do it faster), but you have to pay for it extra. The only thing the nvidia card provides is the ability to accelerate that program; you don't actually get the program.
Congrats ATI, the 4890 is a strong performer! So much chatter about what constitutes rebadging; at the end of the day it's performance that matters. 4890 does a great job for the money.
The GTX 275 performs well but lacks excitement IMO. Nothing surprising or exciting; we've already seen a 240-shader-enabled GPU on a 260-style interface (the X2). If anything, the 285 receives strong competition from both the 4890 and the 275. It makes little sense for it to remain at its price point; its price should be $300.
I can't believe how biased anandtech has become.
I've checked all the other review sites and in all of them the GTX275 was winning by a pretty big margin; here it actually loses to the HD4890.
Now I'm not a fanboy for either, I've had 2 nvidia graphic cards and 2 ATI cards, the current one is ATI, but this bias thing can't go un-noticed.
Some investigators must be summoned to deal with anandtech, this has been going for quite a while now.
I see the difference.. those "other reviews" used the Catalyst 9.3 drivers.
Anandtech, HardOCP and Firingsquad used the new 9.4 Beta drivers.
No bias on Anandtech's part. Rather a bias from those other sites who used the new nVIDIA BETA driver but not the ATi one that has proper support for the 4890.
I don't normally take notice of comments like this on here, but it does seem a little like it. It's as if NV have pissed off Anandtech with their dirty tactics (understandable), and Anandtech are being a little biased because of this.
I've looked at three reviews (FiringSquad, THG, and HardOCP - also Xbitlabs, but they didn't have the GTX 275 in their results). I'm not quite sure what horribly biased and inaccurate results we're supposed to have, as most of the tests are quite similar to ours. Two sites - HardOCP and FiringSquad - essentially end up as a tie. THG favors the 275, at least at lower resolutions and without 4xAA, but then several of the games they test we didn't use, and vice versa. (The 4970 also beats the 275 there if you run 4xAA 2560x1600.)
Obviously, we had a lengthy rant on CUDA and PhysX and discussed the usefulness of those features (conclusion: meh), but with all the marketing in that area it was something that was begging to be done. Pricing, availability, and drivers are still areas you need to look at, but it's really a very close race.
If you have reviews that show very different results than what I'm seeing, post the name of the site rather than making vague claims like, "I've checked all other review sites and in all the GTX275 was winning by a pretty big margin, here it actually looses to the HD4890."
That's different than what I've seen. I dunno what sites you visit, but all of the ones I've been to show them just about neck and neck, or the 4890 just edging out the 275.
Personally I give the edge to the 4890 due to its high overclockability.
First you say you aren't concerned about the 4890 being a rebadge because at the end of the day it's performance that matters, and then you say the GTX 275 lacks excitement because "we've already seen a 240 shader enabled gpu on a 260 style interface (x2)," whatever the significance of already having seen that is.
Of course it's contradictory, it's a red rooster statement. Then the 3m they use to just crank a few more mhz is the rework.. LOL
Pay homage to the red and hate green, and spew accordingly with as many lies as possible, or you won't fit in here - it's been like that for quite some time. Be smug and arrogant about it too, and never admit your massive errors - that's how to do it.
Make sure you whine about nvidia and say you hate them in as many ways as possible, as well - be absolutely insane mostly, that's what works - like screaming they can take down nvidia when the red rooster shop has been losing a billion a year on a billion in sales.
Be an opposite man.
Of course it's contradictory. Duhh.. they're insane man - they are GONERS.
I am not sure what all the conversation here is about. I will tell you a bit about my graphics cards. First, I am a GeForce man through and through. I will tell you why: I have never purchased a GeForce card that was faulty. Luck? My current computer is running all Asus. Twin ATI Radeon 4890s... and so far I am on my third replacement graphics card. The first one had memory problems. The second was DOA... the third... well, this one overheats and crashes. The Radeon may be better than the GeForce when it works - I really don't notice a difference. So to me, it is quality of craftsmanship that makes the difference. Currently I am very unhappy with Radeon because I built my new system for this graphics setup. My Asus motherboard doesn't support dual GeForce, only Radeon. It seems I am stuck sending my graphics cards back and praying that eventually I will get one that is not a lemon.
johnjames - Monday, May 18, 2009 - link
I don't get it. I started reading this review and decided to get a 4890, then I read the following reviews:
[url]http://www.driverheaven.net/reviews.php?reviewid=7...[/url]
[url]http://www.bit-tech.net/hardware/graphics/2009/04/...[/url]
[url]http://www.bjorn3d.com/read.php?cID=1539&pageI...[/url]
[url]http://www.dailytech.com/422009+Daily+Hardware+Rev...[/url]
[url]http://www.guru3d.com/article/geforce-gtx-275-revi...[/url]
[url]http://www.legitreviews.com/article/944/15/[/url]
[url]http://www.overclockersclub.com/reviews/nvidia_3d_...[/url]
[url]http://www.hardwarecanucks.com/forum/hardware-canu...[/url]
[url]http://hothardware.com/Articles/NVIDIA-GeForce-GTX...[/url]
[url]http://www.engadget.com/2009/04/02/nvidia-gtx-275-...[/url]
[url]http://www.overclockersclub.com/reviews/nvidia_gtx...[/url]
[url]http://www.pcper.com/article.php?aid=684&type=...[/url]
And they all state the GTX 275 gives a lot more fps in all games bar Grid.
genetix - Wednesday, September 23, 2009 - link
This is actually really funny you mention multiple sites. Since it's pretty hard to find an site these days which actually doesn't review/preview without sponsors. Meaning lean to one side to other is pretty simplistic just need to review right games and voila either can win. Lol, looking ATI videos damn those are so well selected that damn.We are definedly getting back to 80s where games where made to GPU. Not to all. The funny thing is even our so trusted Benchmarks like any Futuremark production fakes the results of GPUs. Their so called ORB is pretty far from reality what the hardware is really capable.
Asianman - Tuesday, June 16, 2009 - link
most of those use either NV biased games or most likely didn't upgrade the 4890's drivers. All reviews show that 4890 loses its initial advantage at higher resolutions, and the fact that it is now much cheaper. Take your pick, you'd get good value either way.Patrick Wolf - Sunday, August 2, 2009 - link
Well the 4890 isn't exactly kicking the 275's butt here.Let me break it down:
Age of Conan: 0-3 fps difference. It's a wash
CoD: WaW: 275 is at or above 60 fps on all resolutions, beats 4890 at 2560. 275 wins.
Crysis Warhead: 0-2 fps difference. It's a wash.
Fallout 3: 4890 wins.
Far Cry 2: 0-2 fps difference. It's a wash.
Left 4 Dead: Again, 275 is at or above 60 fps on all resolutions, beats 4890 at 2560. 275 wins.
Grid: 4890 wins.
That's 2 for nvidia, 2 for ATI. And on COD, Crysis, Far Cry, and L4D the 4890 wins at 1680 and 1920, then at 2560 the 275 suddenly pulls ahead? That's supposed to make sense? Not to mention both drivers used were beta. And the 185.65 drivers have been pulled from nvidia's archives.
pinguw - Friday, April 17, 2009 - link
yes, you said the one that is getting the benefit are the end user, but I think you have a short vision, because when things getting cheapper means we have more chance to get lower quality product. for example, the GTX260 that I bought several month a go, I can see that the image was worse than the 8800GTS that I had 2 years. At beggining I thought it was a defect so I changed other one and other brand and had the same result. so I say, instead of fighting for price, why dont they just make a better product?? lowering the price would just get our product worse and worse, like most of the product sold in US are now made in China... and then everybody are complaining about the product is bad... that is poisoned etc, what a joke what do you expect when the price go down? the answer is easy to get right? So I would suggest you stopping saing the one is getting the benefit are the users, what a brainless commentjoeysfb - Tuesday, April 28, 2009 - link
Something is not right here - are you linking the lowering of product quality to fierce competition??? That's why people read reviews and comments on Newegg, Amazon... to find out the user experience before buying a desired product...
Almost everything is made in China now... like it or not.
8KCABrett - Thursday, April 16, 2009 - link
Those of us that buy the latest hardware to fly our flight sims have been pretty much left to using the outdated Tom's Hardware charts (which still show the 8800GTS being the fastest card around). I would love to know how the Q9650s and i7s are doing in FSX since the service packs, and it would be great to learn if the GTX 260/280s and now the refreshes are still slower than an 8800GTS in those sims. . .not to mention the abysmal performance of ATI cards! Has anyone found such a review anywhere?joeysfb - Friday, April 17, 2009 - link
Just stick with the 8800GTS then (money saved)... besides, there are not many sim titles these days.
BikeDude - Friday, April 17, 2009 - link
Stick with the 8800GTS?I do not think you realize the problem. A year ago, FSX ate all the hardware you could throw at it.
FSX is a very difficult animal to feed.
It loves fast CPUs, but it also needs a fast GPU. Unfortunately, as was pointed out, there exists few recent comparisons. It is not easy figuring out the correct hw balance for FSX, since few includes it in a review.
Comparing dozens of FPS games is pointless. They perform similar. There are some small differences, but to evaluate a given card, you don't have to review that many games. FSX however poses some unique challenges, and deserves some attention.
Oh... I'd also like to know which of these cards will play nicely with HD video.
8KCABrett - Tuesday, April 21, 2009 - link
Well, every now and then I like to have a little shooter fun, and the GTS is certainly lagging behind in the new titles. I'm currently beta testing a new sim that really utilizes the GPU, which is nice to see (it's also nicely multi-threaded), but my 8800GTS limits me quite a lot. I decided it's time to update my system, and really have nothing to guide me. Is ATI still really weak in sims? Have the GTX 280s gotten any better with the recent drivers? What about SP2 in FSX? I just don't have any source for this info, and I've looked everywhere for a legit source.
I've got a GTX 285 on the way and will just end up doing my own testing since that's apparently the only way to get the info.
There are hundreds of review sites out there posting these same four or five titles in their benchmarks and not a single one that includes any of the flight sims, even the new releases. I know sims are a niche market, but flight simmers are left to test for themselves, and they use what is perhaps one of the more demanding titles out there! My complaint isn't directed at Anandtech per se, I favor this site and have seen and appreciated the helpfulness of Gary Key time and again, especially over at the Abit forums, I just wish that Anandtech could employ their testing discipline in titles that really do need a legit place to evaluate them. It could really be a benefit to many people that really aren't catered to at all currently.
OK. . .back to lurking.
SiliconDoc - Friday, April 24, 2009 - link
I agree, but you'll never get that here, since ati gets stomped in FS9 and FSX even more. This is red rooster ragers central - at the reviewer level, for now.
Put in the acceleration pack, and go for nvidia - the GT200 chips do well in FS9 - and dual is out for FSX, so....
A teenage friend just got a 9800GTX (evga egg) and is running ddr800 2 gigs with a 3.0 P4 HT, on a G35 Asrock and gets 25-30 fps in FSX on a 22" flat Acer with everything cranked.
He oc'ed the cpu to 3.4 and pulled like 5 more frames per.
That's what he wanted, very playable on ultra - on his former 8600GTS he couldn't even give it a go for fsx.
However, moving up from 8800 I'm not certain what the gain is specifically. I've seen one or two reviews on HardOcp for fsx with a few cards. Try them.
8KCABrett - Friday, May 8, 2009 - link
Well, now that I've picked up a GTX 285SC, I've done some rudimentary benchmark comparisons between it and my 8800GTS in FSX and IL-2. I will add BlackShark results soon as well.
http://www.txsquadron.com/forum/index.php?topic=26...
SiliconDoc - Monday, June 22, 2009 - link
Very interesting, and not a great increase - Tom's lists FSX benchies in most of his card charts - the 9800GTX+ is way up there (3rd, I believe), as are some of the 8800 series. It's weird.
The old HD2900 (even the Pro) does well with a good overclock - even the strange Sapphire version which was 256-bit with 320 shaders - on a 25% OC it makes FSX quite playable (another friend, on an E4500 with 4 gigs/800).
I saw the ati 1950XTX at HardOCP does pretty well - but the 1950GT does NOT.
---
That 8800 chip is still - well, if not the best, still darn close.
lk7900 - Monday, April 27, 2009 - link
Can you please die? Prefearbly by getting crushed to death, or by getting your face cut to shreds with a
pocketknife.
I hope that you get curb-stomped, f ucking retard
Shut the *beep* up f aggot, before you get your face bashed in and cut
to ribbons, and your throat slit.
http://www.youtube.com/watch?v=QGt3lpxyo1U">http://www.youtube.com/watch?v=QGt3lpxyo1U
I wish you a truly painful, bloody, gory, and agonizing death, *beep*
lk7900 - Monday, April 27, 2009 - link
http://www.anandtech.com/video/showdoc.aspx?i=3539...
asq - Monday, April 13, 2009 - link
My friend works for Anandtech and told me that ATI pays for articles that are good for them and disadvantageous to Nvidia... which we can clearly see in that article..
lk7900 - Monday, April 27, 2009 - link
Die of aids moron.
SiliconDoc - Friday, April 24, 2009 - link
Ahh, yeah well people have to get paid. It's nice to see the reaction there from the red rooster though, huh - cheering it on while he spews his brainwashed communist-like hatred of nvidia.
It's amazing.
Good for you noticing, though.
joeysfb - Tuesday, April 28, 2009 - link
I don't hate Nvidia. I own 5 nvidia cards and 1 ati card. I'm buying what gives me the best value; to me, it's ATI for now. I think AnandTech did a good job reporting on the matters that happen behind the scenes. They just report it, and it's up to the individual to form their own thoughts. You obviously only buy Nvidia, which is good... no fuss in deciding what to get next!! hahaha....
SiliconDoc - Monday, June 22, 2009 - link
Well, incorrect entirely. They didn't do a good job reporting on behind the scenes, because they left out the ATI prodding and payment parts. Furthermore, ati is still in a world of hurt, losing billions in consecutive years.
If you were to be HONEST about things, if all the people here were to be, the truth would be: " WHERE THE HECK WAS ATI FOR SO LONG ? !! THEY'VE ALWAYS BEEN AROUND, BUT NVIDIA SPANKED THEM FOR SO LONG, WE HATE NVIDIA FOR BEING TOP DOG AND TOP PRICED - BUT IT'S REALLY ATI'S FAULT, WHO ENTIRELY BLEW IT FOR SO LONG.."
---
See, that's what really happened. ATI fell off the gaming fps wagon, and only recently got their act back together. They shouldn't be praised, they should be insulted for blowing competition for so long.
If you're going to praise them, praise them for losing 33% on every card they sell in order to have that 5-10 dollar pricepoint advantage, because if ati were to JUST BREAK EVEN, they'd have to raise all their gaming card prices about $75 EACH.
So they're losing a billion a year... by destroying themselves.
Nvidia has made a profit all along, however. I think the last quarter they had a tiny downturn - while ati was still bleeding to death.
PRAY that Obama and crew has given or will give ati a couple billion in everyone else's tax money and inflation for everyone printed out of thin air dollars, to save them. You better so, or for a multi-billion dollar sugar daddy corporateer.
joeysfb - Wednesday, April 15, 2009 - link
Hahaha! An eye for an eye. Guess the tables have turned. AMD used to be in a needy position... taking it from left, right, center and back from players like Nvidia.
joeysfb - Monday, April 13, 2009 - link
Good job AnandTech!! Really like your behind-the-scenes commentary.
araczynski - Saturday, April 11, 2009 - link
So far my overclocked 4850 crossfire setup has been keeping me happy; I'll come back into the market when the 5000 series rolls out and I upgrade my rig in general.
ChemicalAffinity - Thursday, April 9, 2009 - link
Can someone ban this guy? I mean seriously.
SiliconDoc - Friday, April 24, 2009 - link
Are you on drugs - is that why you don't understand or have a single counterpoint? Come on, come up with at least one that refutes my endless stream of corrections to the lies you've lived with for months.
No ?
Ban the truth instead ?
Yeah, that wouldn't help you.
Ananke - Thursday, April 9, 2009 - link
I had a 4850, a 4870 1GB, a 260-216 and a 280 Overclocked. Ran on a 24" at 1900x1200 - Crysis and Warhead, Far Cry 2, GTA4, Stalker... whatever else you can imagine... My experience:
Radeons are hotter and noisier. You HAVE to increase the fan speed and it is audible. Image quality in games is very good though. Especially Crysis was better looking with the Radeons. Bullet tracing and sunshine effects were spectacular... The GTX 280 on max everything in Crysis was also very beautiful. However that card gets HOT, so you would be better off with the 285. I didn't like the image quality of the Radeons in movies, but maybe my settings were not good. The 4850 is definitely not worth the money, too hot in my test.
So, a 4870 or 4890 1GB is definitely worth buying; performance is on par with the 285 at 1900x1200 - Crysis was 27-41 FPS with a standard Radeon 4870, and 31-45 with the 280 OC'd to 615 MHz.
IF the 285 price is $250, that would be the best buy. If it costs more it is NOT worth the money, unless you really want a bigger and quieter card. Performance-wise it is the same as the Radeon 4890, which now costs $229 and can be overclocked. I did overclock the GTX 280 and 285, which didn't show any performance change; I guess they are constrained by memory bandwidth?
So, honestly, for the money the Radeon 4890 at $229 is the better choice. IF you find a 4870 1GB for $169 it is worth considering also. The 896MB on the Nvidias is a constraint; I would not recommend anything but the 285, but that is expensive.
Truenofan - Tuesday, April 7, 2009 - link
woops. i meant arctic cooling S1 Rev2.
Truenofan - Tuesday, April 7, 2009 - link
i don't get what's going on with silicon, but i enjoy my 4870. it works best at my resolution (1920x1200) and it cost less than the 275 with the ac-1. runs very chilly (45C idle, 57C load OC'ed). i don't need PhysX or an application to do video encoding that costs extra, adding to the total cost of the video card. gaming is its sole purpose to me and it does that extremely well. $180 + $80 for the video applications costs more than what my 4870 ran me, and at stock speeds mine completely outclasses it, let alone a 275 (260) or 280 (270), which mine still cost less than. now you can get a 4870 for what the 260 runs. where's the logic in that? just so you can run a few games with PhysX that aren't even that good? to do some video encoding? i'll stick with my lower-cost 4870.
SiliconDoc - Tuesday, April 7, 2009 - link
I see, now your 4870 completely outclasses even the 280. LOL
Your 4870 is matched with the 260, not the 275, and not the 280.
You don't have anything but another set of lies, so it's not something about you determining "my problem", or you "not knowing what it is", but it is rather the obvious lies required for you to "express your opinion". Maybe you should read my responses for the 20 some pages, and tell me why any of the 20 plus solid points that destroy the lies of the reds, are incorrect ? You think you might try it ? I mean we have a lot more than just YOUR OPINION,, false as you presented it, to determine, what is correct. For instance:
http://www.fudzilla.com/index.php?option=com_conte...
.
Now, not even your 4870 overclocked XXX can beat the GTX260 GLH. In your MIND, though, it does, huh....? lol
Too bad, for you. I, unlike you, know what your problem is, and that is exactly what should bother you, about me.
helldrell666 - Wednesday, April 8, 2009 - link
Stick that fudzilla bench up in your..... Fudzilla is the number one nvidia troll site. Put aside the fuglier nvidia-sponsored sites, like HardOCP, Neoseeker, DriverHeaven, Bjorn3d................. and you go on.
You're one ugly fugly creature.
SiliconDoc - Wednesday, August 5, 2009 - link
helldrell666 - then here's another long set of wins for the GTX275 against the 4890 - you can go tell johnjames on pg. 29 to stick all of them up his... since you're so "red" with embarrassment...
http://www.driverheaven.net/reviews.php?reviewid=7...
http://www.bit-tech.net/hardware/graphics/2009/04/...
http://www.bjorn3d.com/read.php?cID=1539&pageI...
http://www.dailytech.com/422009+Daily+Hardware+Rev...
http://www.guru3d.com/article/geforce-gtx-275-revi...
http://www.legitreviews.com/article/944/15/
http://www.overclockersclub.com/reviews/nvidia_3d_...
http://www.hardwarecanucks.com/forum/hardware-canu...
http://hothardware.com/Articles/NVIDIA-GeForce-GTX...
http://www.engadget.com/2009/04/02/nvidia-gtx-275-...
http://www.overclockersclub.com/reviews/nvidia_gtx...
http://www.pcper.com/article.php?aid=684&type=...
----
There you go - now I'm sure every one of those sites is filled with green bias, huh, the worst kind...
ROFLMAO
mav3rick - Wednesday, April 8, 2009 - link
ehehe.. he goes around calling almost everyone a red rooster.. wonder if he sees himself as the green goblin? :D
Jokes aside.. seriously, why limit yourselves to just one camp?? why be fanboys? just get whatever card gives the most bang for your budget and also your needs. i started out with a Matrox G200 (eheh.. you can basically guess how 'young' i am). This was followed by the NVidia GeForce2 MX. The next upgrade was the Radeon 9500.. plus.. managed to unlock the 4 pipelines to make it the legendary 9700.. what more can you ask for. next came the X1800GTO, again managing to unlock 4 disabled pipelines, and it became an X1800XL. and before silicondoc calls me a red rooster.. lolz.. my current card is the 8800GT. a real great buy with its OC capabilities and longevity..
currently, with the flurry of releases, i'm targeting either the GTX 260+ or the 1GB 4870 as i only have a 22" flat screen. Performance between both cards is comparable; the difference of a few framerates is barely noticeable.. so it's down to pricing. Waiting now to see if there are any further reductions :p As for PhysX, it doesn't seem to have really taken off yet.. game developers are not stupid. if they wanna sell more copies, they will have to make sure it works on as many cards as possible.
SiliconDoc - Friday, April 24, 2009 - link
Green Goblin, ahh... many that is ugly...Now, one problem with what you said - you definitely should note I don't call everyone a red rooster - just the liars - there are quite a few posts here that make fine points for ati that aren't steeped in bs up to their necks and over their heads.
You should also note I make the proper POINTS when I post, that prove what is and has been said by this brainwashed crew of red rooster just isn't the truth, period.
See that's the problem -
I certainly don't mind one bit if the truth is told, but if you have to lie so often to keep your favorite looking good, well, that's just stupidity besides being dishonest.
I've made quite a few points the in the box cluckers cannot effectively, fairly, or honestly counter - and the months and months of brainwashing won't unwind right away - but at least it's going into their noggins, and will eat away at the lies, little by little the small voice inside will spank their inner red stepchild into shape.
Hey, I'm a helpful person. Reality should win.
tamalero - Thursday, April 9, 2009 - link
Don't forget the X800GTO2 that could unlock extra pipes to become an X850XT.
Jamahl - Tuesday, April 7, 2009 - link
Oh wait, lol, no they aren't. If they were, they wouldn't have a failing chipset side of their business and revenues down from 1.2bn to 480m in the last quarter. They don't have an x86 licence and intel are suing them. How bad do you have to be for intel to sue, lol. Oh yeah, on intel - that reminds me, who was it that took the PlayStation away from Nvidia again? Seeing as we're talking about consoles now, how about we talk about the 50 million Wiis sold with ATI graphics inside?
Fusion is the party of the future and Nvidia aren't invited. To sum up, this desperate little company is going down the tubes and taking all of its clueless fanbois with it. It will be great in the near future when you are all forced to game on ATI graphics lol.
SiliconDoc - Tuesday, April 7, 2009 - link
January 2008 - and perhaps as bad or worse since then." Without the charge, AMD came close to breaking even in the quarter, listing an operating loss of $9 million. Revenue for Q4 was $1.77 billion, up 8% sequentially and virtually flat compared to Q4 2006. The ATI write-down resulted in a total loss of $3.38 billion for the year, with total non-cash charges adding up to $2.0 billion. 2007 sales were $6.0 billion, up 6$ from 2006. "
http://www.tgdaily.com/content/view/35669/118/
.
That looks like ATI writedown was 3.38 billion - gee, 2 billion loss for one year.
Oh well, they can crush nvidia with their tiny core vs nvidia's big core by dropping prices, right ?
LOL
Thanks for nothing but lies, red roosters.
SiliconDoc - Tuesday, April 7, 2009 - link
Oh and let's not forget the Dubai ownership portion of ati/amd. Great.
Not trusted for US ports by the American people, but fanboys love it anyway.
Great.
helldrell666 - Tuesday, April 7, 2009 - link
Yes, whether you believe me or not, it's true. I had far more driver problems with my 8800gtx than I have had with my current 4870. As for the Arabs, they don't own the company. AMD just sold part of its manufacturing business to the Arabic company, ALMUBSDALA.
And to be more specific, AMD decided to share its manufacturing business with another "foreign" company because it can't afford the manufacturing costs.
I don't see anything wrong with that.But in case you don't know, i have to give you some informations about who owns what.
Did you know that 2 russian "jewish" billionaires, "Naoom Birizofski and Jacob Trotski", own 79% of INTELs shares.
Did you know that JEWs own all the major technological companies in the WEST. From IBM, Ford, Boeing, Le Stabrolle, Shell..........to ADIDAS, Mercedes, OPEL, NPD Petrochemicals ......
Put aside the US Media, that is totally owned and controled by jewish businessmen..
The whole western civilization in it's all glory is getting purchased by jews.So if you think that those Arabs will/have any kind of effect on anything, then your wrong.
AMD was in big trouble and no other company offered any kind of help. AMD simply wasn't able to keep both its manufacturing and design businesses and make a revenue.
Now, with their manufacturing business getting fixed and grown, the company is definitely in a better state. As you know, or maybe don't, GlobalFoundries is expanding now by building two other fabs in NY and Malta. Besides, they fixed up and did a lot to AMD's two fabs in Germany. So what AMD has done is establish a new manufacturing
entity that will provide thousands of jobs for those great minds in both Europe and the USA, and will compete efficiently with those Taiwanese manufacturing companies.
But you, with your stupid little brain, can't recognize all that. You think that your nvidia, that small little company with its 5000 workers, is more important than anything else.
What about competition?
What about innovation?
what about prices? Are you aware of what the state of the prices of gpu, cpus... or any other products would become without competition?
SiliconDoc - Wednesday, April 8, 2009 - link
Here's the other thing: no matter how much you give your opinion on amd/ati, ati lost them billions, and nvidia, even though you pretend otherwise, actually "puts to work" more people than ati making and selling their gpus, even if they don't have some holding-corporation monster over the top of them bleeding billions as well. So, you went into your love for amd, and I guess your love for ati, which according to you "saved the loser" called amd. Like I said, you're more of a fanboy than one could initially have surmised, and your bias is still bleeding out like mad, not to mention the crazy conspiracy talk.
SiliconDoc - Tuesday, April 7, 2009 - link
Hey, you're the one with the lies and the cover-ups for ATI, and now the anti-semitic conspiracy theories. Even with all the spewing you've got going there, you couldn't just say "ati is really the one who lost money, not nvidia with the GT200".
Oh well, it's more important to spread FUD and now, conspiracy against "Jews".
Amazing. I had no idea the rabbit hole goes that deeply. rofl
helldrell666 - Wednesday, April 8, 2009 - link
Check for yourself. It's not a conspiracy, these are facts. In fact, the 4800 series is the most successful generation of cards ATI has ever produced. The 4890, which measures about half the size of the gtx285, beats the latter in most games at full HD resolutions.
Btw, where are you from?
SiliconDoc - Friday, April 24, 2009 - link
I guess you forgot about the pci mach64, and dummy there in between doesn't have a clue what that is. Let's see, another lie you told - ati is huge, blah blah blah, nvidia only 5,000 jobs...
http://finance.yahoo.com/q/bc?s=NVDA&t=5y&...">http://finance.yahoo.com/q/bc?s=NVDA&t=5y&...
3 billion, 4 billion 3.5 billion SALES WITH PROFITS !
http://finance.yahoo.com/q/is?s=NVDA&annual
NVIDIA IS ALREADY RECOVERING AND STOCK IS UP NICELY. SORRY RED FANS...
http://finance.yahoo.com/q/bc?s=AMD&t=5y
VERY BAD NUMBERS FOR AMD (ati only being a portion far less than half)
http://finance.yahoo.com/q/is?s=AMD&annual
RED LOSSES 3 YEARS IN A ROW - OVER 2 BILLION EACH OF THE LAST TWO YEARS, A BILLION OF WHICH IS ATI.
________________________________
Now tell me about employment or jobs? Is that in the communist money-printing economy that costs us taxpayers trillions - the fantasy world where CONSTANT billion-dollar losses at a barely billion-dollar company are "sustainable"?
AMD/ATI IS IN DEADLY SERIOUS TROUBLE AND HAS BEEN
NVIDIA IS ALREADY RECOVERING AND HAS BEEN POSTING A PROFIT.
___________
But in your imaginary world filled with HATRED and LIES, it's just the opposite... isn't it.
How pathetic.
tamalero - Thursday, April 9, 2009 - link
I don't know, the 9700-9800 series from ATI were amazing as well.
SiliconDoc - Tuesday, April 7, 2009 - link
You cite "the last quarter", but of course only a fool would use that as a future indicator concerning quality and viability of the company. It's another pathetic attempt, fella. Global downturn means nothing to you, and you FAILED to cite the ati numbers, the two quarters in question, so you really have no point. You must have been afraid to tell the truth ?If Nvidia has one low quarter in the midst of massive global downturn, while ati had at least 9 quarters where they suffered losses in a row, who is really in danger of playing on the "competitors" chip ?
You see, that's WHY the ATI red roosters had to SCREAM endlessly about Nvidia's GT200 die size - because THE WHOLE TIME, BEHIND THE SCENES OF THEIR FLAPPING RED ROOSTER BEAKS, THEIR BELOVED ATI WAS LOSING BILLIONS....
See bub, that's what has been going on for far too long.
It's really sad and sick, that people can't be HONEST.
All the red roosters had to do was say " hey buy ati, they're in financial trouble and have been, we all want competition to continue so let's pitch in, because the brands are about equivalent. "
See, that would have been honest and respectable and manly.
Instead the raging red roosters lied and covered up and FALSELY ACCUSED their competition of imaginary losses while their little toy was bleeding half to death - like little lying brats, they couldn't help but spew, in the midst of IMMENSE BILLION-DOLLAR losses for ATI, about how the GT200 was "hurting Nvidia" and how "ATI could crush them" with PRICE DROPS. lol - man alive, I'm telling you - all those know-it-all red rooster jerks - it was and still is amazing.
That's fine, just be aware that it --- has been pathetic behavior.
Jamahl - Wednesday, April 8, 2009 - link
Actually, you were the one throwing around billion-dollar losses and FAILED to mention Nvidia's own horrible financial situation. Did you say anything about the global downturn while ranting like a fanatic about AMD's losses? What was it you were saying about HONESTY again? Yes, in caps.
Nvidia hasn't had one low quarter - they've lost 2/3rds of their share value in a year. That doesn't happen in one quarter, same as it didn't happen to AMD in one quarter either.
Nvidia are a horrible little company who hold back progress, and more and more people are wising up to their methods. Articles like this on Anand show what they are like. Nvidia CANNOT COMPETE with ATI on performance so instead they bribe with more cash than ATI use on R&D, and those that don't accept the bribes get cajoled or threatened instead.
All the while sad sycophants like you are banging on about PhysX and cuda as if they make a difference to anyone. What does make a difference is their pathetic rebadging of ancient tech, catching out the people who don't know any better.
http://www.youtube.com/watch?v=a7-_Uj0o2aI&fea...
That just proves how far ahead the R700 is vs the G200b. ATI put money into research in order to improve the experience, while Nvidia put money into bribes in an attempt to hold onto whatever slender lead they have. It's only a financial lead; in tech terms ATI are a country mile ahead, and only the worst Nvidia fanboi can't see that.
SiliconDoc - Friday, April 24, 2009 - link
" That just proves how far ahead the r700 is vs the g200b."That rv700 can't compete AT ALL with the GT200 UNLESS it has DDR5 on it.
That is a FACT. That is REALITY.
Without DDR5, the full core is the 4850, which competes with the "old technology" at the DDR3 level - both cards, the 4850 and the 9800 series and its flavors.
That's the truth, YOU LIAR.
Case closed, no takebacks, no uncrossing your fingers, no removing your red raging horns - like - forever.
The r700 CANNOT COMPETE WITH THE GT200 - unless DDR5 is added as an advantage for the r700, which actually competes with the G80/G92/G92b.
NOW, if you screamed and screeched that DDR5 is awesome and ATI rocks because they used it, I wouldn't disagree or call you the liar you are.
Got it son ?
Figure it out, or go take a college class in logic, and skip the communist training if you possibly can. Might get an estrogen emotion reduction as well while you're at it.
SiliconDoc - Friday, April 24, 2009 - link
Check the five-year stock charts before you keep lying, and then as far as your idiotic rant about Nvidia goes, it just shows there is no such thing as a fair performance comparison from you people; you will lie your red rooster rooter butts off because you have a big twisted insane hatred for Nvidia, based on some communist-like rage that profit is a sin and money in the industry is BAD - except PEOPLE get paid with all that money you claim NV throws around. lol Dude, you're a red rooster rager - look in the mirror and face it, since you can't face the facts otherwise. Embrace it, and own it.
Don't be a liar, instead - or rather if all you're going to do is lie, at least admit it - you're body painted red, no matter what.
The really serious issue is ati has a really bad continuous loss, and might go under.
However, I can understand you communist-like raging red roosters screaming for more price drops as you declare the much better-off-financially Nvidia the one "to be destroyed", while you scream "profiteering".
Well, the basic fact is plain and apparent, ati had to lose 2 billion dollars to provide their competitive price, and ati purchasers are sitting on that loss, their gain, huh.
Like I said, if Obama and co. give ATI/AMD a couple billion of everyone's taxes, it might work out OK; otherwise bankruptcy is looming - or some massive new investor relations are required.
Either way, you people don't tell the truth, and that of course is the point, over and over again.
JNo - Saturday, April 4, 2009 - link
Well, having trawled through all 16 pages of comments, I have to say that as much as power and temp benches, I really want NOISE benchmarks. Yes, power usually comes at the expense of noise, and although I'm primarily a gamer, I hate fan noise too. I happen to have an 8800GT which was great value when it first came out, but it becomes a whirlwind in most games and it drives me crazy, breaks the immersion, and only in-ear headphones help.
When the scores are this close, I err on the side of silence, and (from other sites) it sounds like the GTX275 is noticeably quieter than the 4890 under load.
Also, the GTX275 may suck up more juice under load, but it is more economical by about the same amount when idle, and as I spend way less than 50% of my computer time gaming, that is much more useful to me...
Agree that PhysX is overhyped promises at the moment. So, for sound and power efficiency, I think the GTX275 just sways the vote *for me*. And it can overclock a bit even if the impression I'm getting is not quite as much as the 4890.
Then again, here in the UK the prices are different. The new parts are £200+ and that's 33% more than the GTX260 55nm core216 which can be had for only £150 now and is only a little less powerful than the GTX275 and will surely last fine till the DX11 parts come out... choices... choices...
helldrell666 - Saturday, April 4, 2009 - link
You can edit your VGA BIOS using the Radeon BIOS Editor v1.12, which is the one I'm using now on my 4870, and adjust the frequencies in different modes. By downclocking your Radeon card in idle mode, you can get it to operate properly without sucking so much power. You can also use ATI Tray Tools for the same purpose. As for the noise, I definitely recommend you wait a little bit until the non-reference cards get released.
According to some sites, AMD is going to release its DX11 cards in Q3 this year, so if you're planning to upgrade, you'd better consider waiting a little bit and getting a far better card than the currently available ones.
From my personal experience, a 4870 1GB is more than enough to play most current games at 24" resolutions with all the settings at their highest, including the eye candy... except for Crysis and Stalker: Clear Sky. If you have a smaller monitor than mine, then you might as well consider a 4850 or a GTS 250.
SiliconDoc - Monday, April 6, 2009 - link
So he just told you why he's getting Nvidia, and the little red fanboy in you couldn't stand it. You recommend taking steps that void the warranty, but why should you care - then you blabber about a 4850, but he already noted the 260 - and finally, with the last word in you, you could barely bring your red rooster self to say GTS250 - as in you might as well get it... if you have to - except he was already looking above that. You red ragers just have to spew about your crap card to people who have already seemingly decided they don't want it. Why is that?
You gonna offer him red cuda, red physx, red vreveal, red badaboom, red forced game profiles, red forced dual gpu ? ANY OF THAT ?
NO -
you tell him to HACK a red piece of crap to make it reasonable. LOL
What a shame.
Hey, maybe he can hack the buzzing fan on it , too ?
helldrell666 - Monday, April 6, 2009 - link
What the hell are you talking about? I'm not a fanboy of any of those big companies that don't give a shit about me.
I have used both nvidia and ATI, and both produce great graphics cards.
I had an 8800GTX before my current 4870, and I had an Athlon X2 6400+ before my current Core i7 920. I go with whoever has the product with the best price.
SiliconDoc - Monday, April 6, 2009 - link
No, you are a fanboy and you are exposed. Deal with it.
helldrell666 - Monday, April 6, 2009 - link
It's you who's the fanboy. You're trying hard to show the advantages of Nvidia cards over ATI cards, and guess what, you fail. And I don't give a rat's ass about CUDA. I'm a gamer, and what matters to me is the card's gaming performance and the features that enhance my gaming experience (which is what 99.9% of the people who buy these cards care about), like PhysX, which is a genuine point in Nvidia's favor. But PhysX isn't supported in most games, and the PhysX effects Nvidia is trying to promote could easily be processed on a decent sub-$300 CPU by improving on-CPU physics performance with a driver/software layer that utilizes all CPU cores and uses the CPU's processing power more effectively (a rough sketch of that kind of multi-core offload follows below).
Ohh wait. ATI has DX10.1 and tessellation, which Nvidia doesn't have, and thanks to your Nvidia we didn't get many games that support
DX10.1, and we didn't get any game that supports tessellation - a geometry-accelerating technique that can speed up geometry processing by up to 4 times using the same amount of floating-point processing power. If tessellation, which has been included in ATI cards' APIs since the HD 2000 series days, had been used in demanding games like Crysis and Stalker, we would have been able to play them using sub-$300 graphics solutions. Put aside the DX10.1 features like the AA enhancement, and the GRS color detection that lets the GPU pick a more accurate color value for a texel using a more advanced texturing algorithm than the tri/bilinear filtering technique used in Nvidia's incomplete DX10 API.
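For what it's worth, the "use all the CPU cores" argument is easy to sketch. Below is a minimal, hypothetical illustration - C++11 threads, made-up particle counts, not anything from an actual physics middleware - that just splits a naive particle integration step across however many cores the machine reports:

```cpp
// Minimal sketch of multi-core "effects physics" on the CPU.
// Hypothetical workload and numbers; C++11 threads.
#include <algorithm>
#include <cstddef>
#include <functional>
#include <thread>
#include <vector>

struct Particle { float x, y, z, vx, vy, vz; };

// Integrate one slice of the particle array for a single time step.
void integrate(std::vector<Particle>& p, std::size_t begin, std::size_t end, float dt) {
    for (std::size_t i = begin; i < end; ++i) {
        p[i].vy -= 9.81f * dt;   // gravity
        p[i].x  += p[i].vx * dt;
        p[i].y  += p[i].vy * dt;
        p[i].z  += p[i].vz * dt;
    }
}

int main() {
    std::vector<Particle> particles(200000);   // debris/sparks-sized workload (made up)
    const float dt = 1.0f / 60.0f;             // one 60 Hz frame
    const unsigned cores = std::max(1u, std::thread::hardware_concurrency());

    std::vector<std::thread> workers;
    const std::size_t chunk = particles.size() / cores;
    for (unsigned c = 0; c < cores; ++c) {
        const std::size_t begin = c * chunk;
        const std::size_t end = (c + 1 == cores) ? particles.size() : begin + chunk;
        workers.emplace_back(integrate, std::ref(particles), begin, end, dt);
    }
    for (auto& w : workers) w.join();          // all cores finish before the frame continues
    return 0;
}
```

Whether that actually keeps up with GPU-accelerated PhysX in a real title is exactly the open question raised here; the sketch only shows that the CPU side of the argument is straightforward to parallelize.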
SiliconDoc - Monday, April 6, 2009 - link
rofl - the long list of your imaginary hatreds against nvidia - you FREAK fanboy.The problem being, that just like some lying retard, you blame ati's epic failure with tessalation on who ? ROFL NVIDIA.
Sorry bub, you're another one who is INSANE.
You won't face what IS - you want something that isn't - wasn't - or won't be - so keep on WHINING forever, looney tuner.
In the meantime, Nvidia users outnumber you, enjoy a large number of added benefits, and don't have a 2-billion-dollar loss hanging over their heads - with a company that might collapse into bankruptcy and drop support for the already problematic drivers.
You bought the wrong thing, now you have a hundred could be would be if and an's and garbage can complaints that have nothing to do with reality and how it actually is.
Fantasy red rooster fanboy.
LOL - it's amazing.
helldrell666 - Tuesday, April 7, 2009 - link
Freak fanboy...? Hatred...? Are you serious... or...? I don't hate any company, because I have no reason to. I blame Nvidia for not allowing those few game developers to include those great features in these very few modern demanding games.
Accelerating physics processing using the GPU is a great idea, but is it worth it?
Is the CPU really unable to keep up with the GPU in games due to its slow physics processing ability?
Are those primitive physics effects really that heavy on a modern quad-core CPU?
These are questions that you should ask yourself before trolling for the idea.
And I have to remind you that ATI has implemented Havok physics effects in
OpenCL, which, in case you don't know, is a standard language, unlike Nvidia's CUDA, which is based on its own kind of C programming code.
Talking about drivers, I haven't had problems with my 4870 on MY Vista 64-bit OS, compared to my old 8800GTX that almost gave me a heart attack.
As for your beloved Nvidia, we need Nvidia as much as we need AMD and Intel to keep the competition alive, which in turn will keep innovation going and keep prices in check.
Ohh... and thanks to the red camp, we can get a decent graphics card for less than $300, so have some respect for them.
As for you, I wonder how old you are, because you don't seem to have very mature logic.
tamalero - Monday, April 20, 2009 - link
Don't worry, this guy clearly has mental issues - an Nvidia-paid troll.
SiliconDoc - Tuesday, April 7, 2009 - link
Yeah, sure. Just like any red rooster, you never had a problem with ATI drivers, but Nvidia drove you nearly to a heart attack - and yet I'm the one with problems with logic or with detecting a red rooster fanboy blabberer! ROFL Dude, you keep digging your own hole deeper.
Then you try the immature assault - another losing proposition.
YOU'RE WHINING about Nvidia and yeah, you do blame them, and after going on like a lunatic about wanting offload to the CPU, you admit it might not work - yes, I read your asinine quad-core offload blabberings too, and your bloody ragging about Nvidia "not letting" your insane fantasy occur - purportedly to advantage ATI (it's not like you're banking on Intel graphics).
So, red rooster, keep crowing, and never face the reality that is, and carry that chip for all your imaginary grievances about what should be or what you say could have been.
In the mean time, know you are marked, and I know who and what you are, and I'm sure you'll have further whining and wailing about what nvidia did or didn't do for ati. LOL
roflmao
Logic ? ROFLMAO
" I, almost had a heart attack " said the sissy. lol " But I'm an objective person with logic ". roflmao
Jamahl - Monday, April 6, 2009 - link
Wow, what a stunning pile of crap I've just read. ATI cards can fold - guess what, it's downloaded via CCC.
ATI has open-source cloth physics, Stream and Avivo, which pisses all over that trash Nvidia calls 'PureVideo' or whatever.
But best of all, you can ACTUALLY BUY A 4890, whereas the 275 only exists in the Nvidia fanboys' tiny little green-with-envy minds.
The0ne - Tuesday, April 7, 2009 - link
If you haven't noticed, SiliconDoc is basically ignored in all his responses. Be wise and do that same. He'll eventually kill himself :)SiliconDoc - Tuesday, April 7, 2009 - link
Another red rooster who cannot argue with the facts and the truth, and doesn't want them known. Perhaps you'd notice, I didn't comment right away when the STORIED review came out, you FOOL.
I came days later, and made my comments after you had your bs fest of lies, so I don't expect a lot of responders, you DUMMY.
But you're here, and your response is calling for DEATH.
Now, if anyone needs to be banned, YOU DO.
Furthermore, I really don't care if you're here, and have enjoyed some of your posts, but the fact remains: where I have absolutely FACTUALLY refuted your BS in some of your posts, you have no response - other than your own personal rage.
I'll be glad to see how you can defend yourself, but you obviously cannot.
Go ahead, there's 22 pages, and I've pointed out your lies several times. Have at it. Good luck, just calling for DEATH, and spewing "ban him!" while carrying your torch of lies is just what I expect from someone who doesn't care what bs they spew.
You already claimed you can't understand - LOL - of course you can't, you'd have to straighten out yourself and your lies then.
Good luck doing that.
SiliconDoc - Monday, April 6, 2009 - link
LOL - the folding was crap forever on ATI, and now it's slower. We know the release date for both cards, and the Nvidia one is already listed on the egg, dude.
When you're a raging red rooster, nothing matters to you but lying for the 2 billion dollar loser - ati.
sidk47 - Friday, April 3, 2009 - link
You cannot argue with facts, and the fact of the matter is that you can't help the world find a cure for cancer or Alzheimer's by buying an ATI! So those of you with an Internet connection should buy an Nvidia and fold@home all the time to help make the world a better place!
Take that ATI and your associated fanboys!
x86 64 - Sunday, April 5, 2009 - link
Folding at home is a total waste and is just an excuse to be smug and think you're special, so there to both of you. "Oh, I'm going to save the world by buying overpriced hardware and letting some university use it for studying the human genome. I'm such a humanitarian."
Please, you can justify your over indulgence any way you want but it still doesn't cover up the fact that you're trying to justify sitting on your asses instead of doing some real community work to help change the world.
Folding@home = Too fat and too lazy to really make an effort.
SiliconDoc - Monday, April 6, 2009 - link
Uhh, dude, they're doing it at college, on triple-TESLA machines with the "supercomputer" motherboards - so, you know, go get an education and start whining about unbelievable game framerates - that's what's really going on.
Professor cuda machine checker: "What happened?"
Gamer students: "Oh, uhh, well, it crashed again, it was a Crysis, I mean uh, no, crisis, last night, and it took us about 5 hours to reset the awesome TESLA cards. We'll come in tonight to keep an eye on it, and clean up the pizza boxes and lock up again, professor."
" Very well."
WHO LOVES THE EDUCATION OF AMERICA? !!!
hahahahaha
LeonRa - Saturday, April 4, 2009 - link
Well, since you cannot argue with facts, it's a fact you are a stupid fanboy who doesn't know anything! Check your facts before you post something like that. It is a fact that you can do f@h with an ATI card, as I have been doing it for some time now. So STFU and go spill your hatred somewhere else!
SiliconDoc - Tuesday, April 7, 2009 - link
You're not being honest there. A while back ATI either couldn't do it at all (no port) - or it was so pathetic they had to make a new port - I know they did the latter, and as far as there having been a long stretch where it wasn't available, or just not used much since it was so pathetically slow in comparison, the fella has the right idea. Furthermore, unless something has recently changed significantly, the ATI port is still WAY slower than Nvidia's for folding.
So anyway, nice try, but telling the truth might actually be something the red rooster crew should start practicing... or perhaps not, considering that lying a whole heckuva lot might make those 2-billion-dollar ATI losses into "sales" that make "overall a profit" a reality...
On the other hand, if people continuously notice the lying by the red fans, they might gravitate to the competition, for obvious reasons.
So, honesty, or more bs ? I think I know what you'll choose.
marraco - Friday, April 3, 2009 - link
I hope to see benchmarks with ATI in charge of graphics and a GeForce in charge of PhysX... a kind of SLI/CrossFire between ATIs and GeForces :)
A value-add of the GeForces is that, once you buy a new card, the old one can offload physics from the new card. Nice. I hate wasting old hardware.
On the other hand, most of the games on Nvidia's PhysX list don't really work with GPGPU PhysX - only with the old AGEIA cards.
Sadly, Crysis and Far Cry don't use PhysX, only Havok. And AMD still doesn't support that in hardware.
spinportal - Friday, April 3, 2009 - link
No mention of the death of the HD 4850X2, even though the HD4890 trashes it on power consumption, price, availability, speed and OC-ability. No mention of the advantage of DX10.1 and the games available. Hey, even bad news is good news sometimes by spotlighting. What is really missing is the bang-for-buck quality (bucks spent for the performance increase), and talk of price depression for the HD 4870 1GB model by $10 to $15, with roughly $50 step increments. Listed as card (price)[score], with score-per-dollar worked out in the sketch below:
4850 (125)[20.9], 4870 (185)[27.9], 4890 (235)[31.7], 4870X2 (400)[35.0]
Nvidia is cramping its own style:
250 (150)[21.8], 260-216-55 (180)[27], 275 (250?)[31.3], 280 (290)[30.9], 285 (340)[32.8]
The GTX280 is dead now, overpriced for those trying to sneak into SLI. The GTX260 overlaps with the 55nm Core 216 you'd want to get, but Joe Consumer might mistakenly grab one of the two prior versions as vendors clean out old inventory. The GTX285's price is not justified, but more power to nVidia if they get the consumer's buck.
Gladly, judging by the low temps, the dual-slot exhaust is venting hot air properly, so the vendors are finally manufacturing cards with common sense.
Too bad we have gone the way of power-hungry beastly cards needing two 6-pins.
Also, too bad the effects of AF and the 0xAA, 2xAA, 4xAA and 8xMSAA modes are not investigated. It would be interesting to see how saturated the units get as AF and AA get bumped, and what the best modes are for nVidia and AMD.
Oh, nice blurb for nVidia's shadow enhancement, but ATi/AMD's tessellation enhancement is just as much of a hit-or-miss feature. Will AMD have a tech edge when DX11 tessellation cometh?
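To put quick numbers on that bang-for-buck request, here is a small throwaway sketch. It only re-ranks the (price)[score] pairs quoted in the comment above by score per dollar; the card names are spelled out for readability, and the bracketed score means whatever composite the original poster was using.

```cpp
// Score-per-dollar ranking of the (price)[score] pairs quoted above.
#include <algorithm>
#include <cstdio>
#include <vector>

struct Card { const char* name; double price; double score; };

int main() {
    std::vector<Card> cards = {
        {"HD 4850",        125, 20.9}, {"HD 4870",        185, 27.9},
        {"HD 4890",        235, 31.7}, {"HD 4870 X2",     400, 35.0},
        {"GTS 250",        150, 21.8}, {"GTX 260-216-55", 180, 27.0},
        {"GTX 275",        250, 31.3},  // price was listed with a '?' in the comment
        {"GTX 280",        290, 30.9}, {"GTX 285",        340, 32.8},
    };
    // Sort by score per dollar, best value first.
    std::sort(cards.begin(), cards.end(), [](const Card& a, const Card& b) {
        return a.score / a.price > b.score / b.price;
    });
    for (const Card& c : cards)
        std::printf("%-15s %.3f score per dollar\n", c.name, c.score / c.price);
    return 0;
}
```

On those numbers the HD 4850 leads, with the 4870 and the 55nm GTX 260 close behind, while the GTX 285 and the 4870 X2 bring up the rear - which is roughly the conclusion the comment was asking the review to spell out.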
SiliconDoc - Monday, April 6, 2009 - link
Hmm, that said, Derek might be crying, since he couldn't stop crowing about that 4850X2 last review - oh boy, you know - I guess he had the heads-up and ATI told him what card he needed to help push... You know how things are.
Anyway, good observation.
SiliconDoc - Monday, April 6, 2009 - link
LOL - another hidden red rooster bias uncovered... Umm... look, when there's a new ATI card, there's no talking about crunching down on former ATI cards - OK? That just is NOT allowed.
" No mention of the death of the HD 4850X2 as the HD4890 trashes the power consumption, price, availability, speed and OC-ability "
Dude, not allowed !
PS- Don't mention how this card is going to smash the "4870" "profit" "flagship" - gee now just don't talk about it - don't mention it - look, there's no rooster crying in fps gaming, ok ?
Torquer350 - Friday, April 3, 2009 - link
Props to ATi for delivering a very compelling product. I admit I've always been an Nvidia fan, and I'll generally forgive them a single generational performance loss to ATi, but I've recommended ATi products recently to friends due to their resurgent desirability. That being said, am I the only one who detects a subtle but distinct underlying disdain for Nvidia? So they tried to market the hell out of you - so what? They are trying to sell cards here. Why the surprise that sales and marketing people are trying to do exactly what they're paid to do? Congrats for being smart enough to see it for what it is, but jeers for making an issue of it as if it's some kind of new tactic. Has AMD/ATi never done the same?
CUDA and PhysX are compelling, but I agree not a good reason to overcome a significant gap between Nvidia and ATi at a comparable price point. You clearly agree, but it seems like what little praise you offer is begrudging in the extreme.
Nvidia has definitely acted in bad form in a number of ways throughout this very lengthy generation of hardware. However, you guys are journalists and in my opinion should make a more concerted effort to leave the vitriol and sensationalism at the door, regardless of who it is that is being reviewed. That kind of emotional reaction, personal opinion, irritation, etc is better served for your blog posts than a review article.
Love the site, keep up the good work. Nobody's perfect.
SiliconDoc - Monday, April 6, 2009 - link
Yeah, thanks for noticing, too. It's been going on a long time. Notice how now, suddenly, when ATI doesn't have 2560 sewn up, it doesn't matter anymore... LOL Of course the "brilliantly unbiased" reviewers will claim they did a poll on monitor resolution usage, and therefore suddenly came to their conclusion about $2,000.00-monitor users, when they tiddled and taddled for years about 10 bucks between Nvidia and ATI framerates - and chose ATI for the 10 bucks difference.
Yep, 10 bucks matters, but a $1,700.00 difference for a monitor doesn't matter until they take a poll. Now they didn't say it, but they will - wait, it's coming...
Just like I kept pointing out, when they raved about ATI taking the 30" resolution and not much if anything else, that declaring it the winner wasn't right. Now of course, when ATI isn't winning the 30" rez - yes, well, they finally caught on. No bias here! Nothing to notice, pure professionalism, and hatred of CUDA and PhysX for their inability to run on ATI cards is fully justified, and should offer NO advantage to Nvidia when making a purchase decision! LOL
OMG ! they're like GONERZ man.
Dried - Friday, April 3, 2009 - link
Best review so far. And nice cards, BTW; they are both worth it, but I like the 4890 better. Funny thing is that GTX 275 > GTX 280.
But my guess is that GTX 280 benefits more from overclocking.
Arbie - Friday, April 3, 2009 - link
Because of my PC's location I am concerned with idle power, and purchase based on that if other specs and price are even comparable. Peak power doesn't matter as long as it's within the capability of my 800W PSU. I bought an ATI HD4850 last year because it idled significantly lower than the 4870, and it would run everything in sight. A great card. The Nvidia GTX 260 and 280 had even better performance-versus-idle-power ratios but were way too expensive at the time.
So I think Nvidia takes the laurels now with the GTX 275. 30W less (!) than the HD 4890 at idle, with essentially the same performance. If I were shopping now it would be a VERY easy choice.
I really hope ATI can get their idle power down too. They need to pay more attention to throttling back or downpowering circuits that aren't needed in 2D modes.
helldrell666 - Friday, April 3, 2009 - link
Use the radeon bios editor to edit the 2d profile and then downclock your gpu frequencies.
OCedHrt - Friday, April 3, 2009 - link
The power consumption on the 4890 really interests me. While it uses more than the 275 at idle, it uses less under load. Also, it is a significant drop from the 4870, which is a slower card.
bobvodka - Friday, April 3, 2009 - link
So, on the charge of drivers: I've recently gone from an 8800GTX 512MB to an HD4870X2 2GB, and if anything I've seen stability improvements between the two. Or, to put it another way, NV drivers were bluescreening my Vista install when I was doing nothing more than using my TV card, and it was crashing in a DirectDraw DLL. Nice. Not to say AMD hasn't had issues; trying to use hardware acceleration with any Blu-ray playback resulted in a bluescreen due to the GPU going into an infinite loop. Nice. Fortunately, unlike the DDraw error above, I could at least turn off hardware acceleration (and honestly, with an i7 it's not like I needed it).
So, stability wise it's a wash.
As for the memory usage complaints about CCC;
Unless it is running, it is NOT taking up physical memory. Like many things in the Windows world, it might load something into the background, but this is quickly paged out and doesn't live in RAM. Even if it does live in RAM for a short period of time while inactive, it will be paged out as soon as memory pressure requires it. The simple fact is that unused RAM is wasted RAM; this is why I'm glad Vista uses 10GB of my 12 for cache when it isn't needed for anything else - it speeds up the system.
CUDA... well, the idea is nice and I like it, but as mentioned in the article, unless you have cross-vendor support it isn't as useful as it could be. OpenCL and, for games, DX11's compute shaders are going to make life interesting for both CUDA and AMD's option. I will say this much: I suspect you'll get better performance from NV, AMD and indeed Larrabee (when it appears) by going 'to the metal' with them, but as with many things in the software world, you have to trade something for speed.
Now, PhysX... well, this one is even more fun (and I guess it affects CUDA as well to a degree). Right now, with Vista, you can't run more than one vendor's gfx card in your system at once due to how WDDM 1.0 works, so it's AMD or NV and that's your choice. With Win7, however, the rules change slightly, and with WDDM 1.1 drivers you'll be able to run cards from both vendors at once. Right away this paints an interesting landscape for those interested: if you want an AMD card but also want some PhysX hardware power, you'll be able to slide in a 'cheap' NV card for that reason (or indeed, if you have an old series 8 laying about, use that if the driver supports it).
Of course, with Havok going OpenCL and being free for games which retail for <$10 (IIRC), this is probably going to be much of a muchness in the end, but it's an interesting idea at least.
SiliconDoc - Monday, April 6, 2009 - link
Except you can run 2 Nvidia cards, one for gaming, the other for PhysX... so red fanboys are SOL. "Right now, with Vista, you can't run more than one vendor's gfx card in your system at once due to how WDDM1.0 works; so it's AMD or NV and that's your choice."
WRONG, it's TWO nvidia or just ONE ati. Hello - you knew it - but you didn't say it that way - makes ati look bad, and we just cannot have that here....
Rhino2 - Monday, April 13, 2009 - link
The hell are you talking about? Crossfire works in Vista just fine.
SiliconDoc - Friday, April 24, 2009 - link
You failed to read his post, and therefore the context of my response, you IDIOT. Can you run a second ATI card for PhysX - NO.
Can you run an ATI card and a second NV card for PhysX - not without a driver hack - check techpowerup for the how-to and files, as I've already mentioned.
So, THAT'S WHAT WE WERE TALKING ABOUT, DUMMY.
Now you can take your stupidity along with you; no one can stop it.
pizzimp - Friday, April 3, 2009 - link
From an objective point of view there is not really a clear winner. At the lower resolutions, do you really care if you are getting 80 FPS vs 100 FPS? IMO it is the higher resolutions that matter. I would think any real gamer is always looking to upgrade their monitor :).
I wonder how old you guys are that are posting? Who cares if something is "rebadged" or just an OC version of something? Bottom line is how does the card play the game?
IMO both cards are good. It comes down to price for me.
SiliconDoc - Monday, April 6, 2009 - link
Ahh, you just have to pretend about framerates you can't see or notice, and look only at the top rate or the average, never the bottom framerate...
Now, after 6 months of these red roosters screaming ati wins it all because it had the top resolution of the 30" monitor sewed up and lost in lower resolutions, these red roosters have done a 180 degree about face.... now the top resolution just doesn't matter -
Dude, the red ragers are lying loons, it's that simple.
The 2 year old 9800X core is the 4870 without ddr5. Think about that, and how deranged they truly are.
I bet they have been fervently praying to their red god, hoping that change doesn't come in the form of DDR5 on that old G80/G92/G92b core - because then instead of competing with the 4850 it would be a 4870 - and THAT would be an embarrassment - a severe embarrassment. The crowing of the red roosters would diminish... and they'd be bent over sucking up barnyard dirt and chickseed for a long, long time. lol
Oh well, at least ati might get 2 billion from Obama to cover it's losses ... it's sad when a red rooster card could really use a bailout, isn't it ?
helldrell666 - Friday, April 3, 2009 - link
Well, you have a point there. But the card is still not operating on a WHQL driver, and the percentage of those who use 30" monitors is negligible compared to the owners of 22"/24" monitors. I think this is probably due to the 256-bit internal memory interface compared to the 448-bit one the GTX275 has. Even at Xbit Labs the 4890 drops significantly in performance compared to the GTX285.
7Enigma - Friday, April 3, 2009 - link
From a subjective point of view you may feel that way, but from an objective point of view there is a clear winner, and it is the 4890. Left 4 Dead and Call of Duty are the only two 30" display tests where the 275 significantly defeated the 4890. In all of the other tests the 4890 either dominated (G.R.I.D., Fallout 3) or was within 4% of the 275, which I would call a wash. At all other resolutions the 4890 was the undisputed leader. So I find it difficult to say there is no clear winner. What Nvidia should have done was not nerf their 22" and 24" resolutions for the very few people that game at 30" with the latest drivers. To be honest, I wish the article had included all of the results from the 182 drivers (they show just G.R.I.D. but allude to other games also having similarly reduced results except at the highest res). It could very likely be a wash then, if the 275 is more competitive at the resolutions 99% of the people buying this level/price of card are going to be playing at.
Anand, is there any way you could post, even just in the comments, the numbers for the rest of the games with the 182 Nvidia drivers? I don't mind doing the comparison work to see how much closer the 275 would be to the 4890 if they had kept the earlier drivers.
7Enigma - Friday, April 3, 2009 - link
Ah, I see now that the 185's are specifically there to enable support for the 275 card. So you can't run the 275 with the 182 drivers. It would still be interesting to see all the data for what happened to the 285 using the newest drivers that decrease the performance at lower resolutions.
minime - Friday, April 3, 2009 - link
First, thanks for your review(s). I've been a silent reader and word-of-mouth spreader for years. Second, don't you think reviewers should point their fingers a little more aggressively at power consumption? Not because it's trendy nowadays, but because it's just not sane to waste that much energy in idle (2D, anyone remember?) mode. I was thrilled by what you alone (don't take that as disrespect) were able to achieve on the SSD issue.
SiliconDoc - Monday, April 6, 2009 - link
PSST! The ATI cards use something like 30 watts more power in idle - and like 3 watts less in 3D - so on the power thing, well, they just declare ATI the winner... LOL They said they were "really surprised" at the 30 watts less in idle for the Nvidia - they just couldn't figure it out - and kept rechecking... but yeah... the 260 was kicking butt... but... that doesn't matter - ATI takes the win using 1-3 watts less in 3D.
So, you know, the red roosters shall not be impugned !
Capisce?
VulgarDisplay - Friday, April 3, 2009 - link
It appears that you may have had Vsync turned on, which caps the game at 60fps, in some of the CoD: WaW tests. It's pretty apparent something is up when the nVidia card has the same FPS at 1680x and 1920x. Either way, it still seems like the 4890 wins at those resolutions, which is different from most sites, which pretty much say it's a wash across the board. I'll take nVidia's drivers over ATi's any day.
SiliconDoc - Monday, April 6, 2009 - link
Hey any little trick that smacks nvidia down a notch is not to be pointed out.san1s - Thursday, April 2, 2009 - link
Why didn't you post a page on how useless DX10.1 is? I bet that there will be even less difference in gameplay with DX10.1 on compared to DX10 than there is with PhysX.
AnandThenMan - Friday, April 3, 2009 - link
10.1 can bring some meaningful performance boosts.
http://img7.imageshack.us/img7/1757/hd489064.jpg
SiliconDoc - Monday, April 6, 2009 - link
ATI can't do PhysX at all - so, uhh, no performance boost there, EVER. Same with CUDA.
Kinda likewise with this cool vReveal video clean-up thingie.
Same with the Badaboom converter compared to ATI's, err, (unmentioned, terrible implementation)... lol, hush hush!!! Doesn't matter! Doesn't matter! Nothing going on there!
tamalero - Thursday, April 9, 2009 - link
Huh, GPU PhysX acceleration only helps when there are heavy physics calculations. Almost no game uses calculations that heavy nowadays.
Besides, if you want PhysX to run, don't you need a second card to run PhysX while the other does the graphics?
I suspect there's a slowdown as well if the same graphics card does the work.
SiliconDoc - Friday, April 24, 2009 - link
Since you suspect there's a slowdown with PhysX enabled, it points out two things to me: 1. You have no clue if there is, because you don't have an Nvidia card, indicating your red rooster issue. 2. That's why you didn't get my point when the other poster linked to the other review and listed the various settings, and I laughed while pointing out that the NV card said PhysX enabled.
_______
It's funny how your brain farts at just the wrong time, and you expose your massive experience weakness:
"you suspect" - you don't know.
Go whine at someone else, or don't at all. At least bring a peashooter to the gunfight.
Ever played War Monger or Mirror's Edge ? lol
No of course not! YOU CAN'T, until, you know.
yacoub - Thursday, April 2, 2009 - link
Given that you label the price for the GTX260-216 as $205, and in reality it's closer to $175, can we expect the 275 will be closer to $215 in short order?
yacoub - Thursday, April 2, 2009 - link
Of course ATi has a hard launch of this product - from your description its hardware appears to be identical to existing hardware, just with a slight clockspeed boost, whereas the GTX 275 actually sits between the 285 and the 260.
chrnochime - Thursday, April 2, 2009 - link
Did you actually bother to read the other comments regarding the fact that the RV790 is a respin, not just a bump in GPU/memory clock speed?? And the "actual hardware improvements" are just cut down from the GTX295. Big deal. Appealing to the NV fanboys, sure.
7Enigma - Friday, April 3, 2009 - link
It's possible he saw the article before Anand/Gary corrected the table. Originally the table had identical numbers for the 4870 and 4890, and in the text they mentioned that it was basically an OC'd 4870. This could lead one to assume it was pretty much the same card. In the article (on the next page) they did briefly mention the higher transistor count, but it was brushed over quickly and they didn't really go into detail about the differences (still waiting to get a response about the cooling solution changes). As for the rest of his post, he IS clearly an Nvidia fanboy, because the 4890 is clearly the better product in just about every case (not even counting the OC'ing potential, which seems to be very nice).
SiliconDoc - Monday, April 6, 2009 - link
Hmm, better in every case, without even the overclocking potential? lol He's a fanboy? Is it better with CUDA, curing cancer with Folding, vReveal video clean-up and recoding, forced game profiles, dual-GPU game forcing? Do any of those have an equally good equivalent? NO, NOT ONE.
So you know dude, he's not the fanboy...
The thing is, ATI did a good job with the ring of 3M transistors around the GPU to cut down electrical noise - and gain a good overclock.
That they did well. They also added capacitors and voltage management to the card - an expense left unmentioned in cost terms, along with the larger die cost.
So, on the one hand we have a rebranded overclock that merely used the same type of core reworking that goes into a shrink, but optimized for clocks with a transistor band around the outside.
Not a core rework, but a very good refinement.
I knew the intricacies would be wailed about by the red fans, but not a one is going to note that the G80 is NOT the same as the G92b - the refinements happened there as well in the die shrink, and in between, just like they do on cpu revisions.
Since ATI was making an overclocking upgrade, they needed to ring the core - and make whatever rearrangements were necessary to do that.
Purely rebrand ? Ahh, not really, but downclocking it to the old numbers may (likely) reveal it's identical anyway.
At that point rebrand is tough to get away from, since the nvidia rebrands offered core revision and memory/clock differences, as well.
I'll give ati a very slight edge because of the ring capacitors, which is interesting, and may be due to the ddr5, that made their core viable for competition to begin with, instead of just a 9800X equivalent, the 4850 - minus the extra capabilities - cuda, better folding, physx, forced sli, game profiles - etc... vreveal... and on and on - evga game drivers on release day - etc. - oh the uhh.. ambient occlusion and fear + many other game mods for it...
Anyway, tell me none of that matters with a straight face - and that face will be so red you'll have to pay in wampum at the puter store.
papapapapapapapababy - Thursday, April 2, 2009 - link
Is the quality of the drivers. ATI: call me when A) you fix your broken drivers; B) you decide to finally ditch that bloated Microsoft Visual C++ just so I can have the enormous privilege of using your - also terrible and also bloated - CCC panel; C) you stop polluting my PC with your useless extra services. Until then I'll carry on with NVIDIA. Thanks. > Happy Nvidia user (and frustrated ex-ATI customer)
josh6079 - Thursday, April 2, 2009 - link
The quality of drivers can be argued either way, and the negative connotations associated with "Drivers" and "ATi" are all but ancient concerns in the single-GPU arena.
papapapapapapapababy - Thursday, April 2, 2009 - link
BS. I have a pretty beefy PC; that doesn't mean I'm going to stop demanding efficiency when it comes to memory usage, or stop wanting to reduce the sheer number of stupid services required to run a simple application. These are all FACTS about ATI. But hey, you are free to use Vista, buy ATI, and end up with a system that is inferior and slower than mine (performance- and feature-wise). BTW, to all the people claiming that CUDA and physics are gimmicks... give me a fn break! You hypocrites. These cards ARE HUGE GIMMICKS! BEHOLD the MEGA POWAAR! For what? Crysis? That's just ONE GAME. ONE. UNO. 1. Then what?... Console ports. At the end of the day, 99.9% of games today are console ports. The fact is, you don't need these monstrosities in order to run that console crap. Trust me, you may get a boner comparing 600 fps vs 599, but the rest of the - sane - people here don't give a rat's ass, especially when the original - console game - barely runs at 30fps to begin with.
SiliconDoc - Monday, April 6, 2009 - link
The red roosters cannot face reality, my friend. They are insane, as you point out. CUDA, PhysX, Badaboom, vReveal, the Mirror's Edge that addicted Anand - none of it matters to red roosters. The excessive heat at idle from ATI - also thrown out with the bathwater; endless driver issues - forget it; no forced multi-GPU - never mind; no gamer profiles built into the driver for 100 popular games like Nvidia has - forget it; better folding performance - forget it. NOT EVEN CURING CANCER MATTERS WHEN A RED ROOSTER FANBOI IS YAKKING ALONG THAT THEY HAVE NO BIAS.
Buddy, be angry, as you definitely deserve to be - where we got so many full of it liars is beyond me, but I suspect the 60's had something to do with it. Perhaps it's their tiny nvidia assaulted wallets - and that chip on their shoulder is just never going away.
I see someone mentioned Nvidia "tortured" the reviewers. LOL
hahahaahahahahahahaaa
There is no comparison... here's one they just hate:
The top ati core, without ddr5, and neutered for the lower tier, is the 4830.
The top nvd core, without ddr5, and neutered for the lower tier, is the 260/192.
Compare the 260/192 to the 4830 - and you'll see the absolute STOMPING it takes.
In fact go up to the 4850, no ddr5 - and it's STILL a kicking to proud of.
Now what would happen if Nvidia put DDR5 on its HATED G92 "rebrand" core? LOL. We both know - the 9800GTX+ and its flavors actually compete evenly with the 4850 - if Nvidia put DDR5 on it, it WOULD BE A 4870.
Now, that G80/G92/G92b "all the same" according to the raging red roosters who SCREAM rebranded - is SEVERAL YEARS OLD nvidia technology... that - well - is the same quality as the current top core ati has produced.
So ATI is 2 years behind on the core - luckily they had the one company that was making DDR5 to turn that 4850 into a 4870 - same core, mind you!
So, the red roosters kept screaming for a GT200 "in the lower mid range" --- they kept whining nvidia has to do it - as if the 8800/9800 series wasn't there.
The real reason of course, would be - it could prove to them, deep down inside, that the GT200 "brute force" was really as bad or worse than the 770 core - bad enough that they could make something with it as lowly as the 4830...
Ahh, but it just hasn't happened - that's what the 2 year old plus rebrand of nvidia is for - competing with the ati core that doesn't have the ddr5 on it.
Well, this reality has been well covered up by the raging red rooster fanboys for quite some time. They are so enraged, so deranged, and so filled with endless lies and fudging, that they just simply missed it - or drove it deep within their subconscious, hoping they would never fully, consciously realize it.
Well, that's what I'm here for ! :)
To spread the good word, the word of TRUTH.
josh6079 - Friday, April 3, 2009 - link
I just went from using the Cat 9's to the 182+'s when I upgraded to nVidia. The "efficiency when it comes to memory usage" is a non-issue - especially on a "beefy pc."
The Windows Task Manager is not a benchmark leading to conclusive comparisons regarding quality. My Nvidia GPU can occupy (and has occupied) more memory, especially when I utilize nHancer to tap the super-sampling capabilities.
Also, it's something to note that nVidia's latest driver download is 77.0 MB in size, yet ATi's latest is only 38.2 MB.
papapapapapapapababy - Saturday, April 4, 2009 - link
1) nHancer is just an optional utility - optional. If I want to check the GPU temps I just use GPU-Z or Everest; if I want to overclock I just use RivaTuner; for the rest I have the Nvidia control panel. The CCC is not only bloated crap, it also requires the MS .NET Framework, and spawns like 45 extra services running non-stop ALL THE TIME, clogging my PC, and the thing doesn't even work! GREAT WORK ATI! CCC is stupidly slow and broken. See, I don't need a massive mega all-in-one solution that doesn't work and runs like ass.
2) YOUR Nvidia GPU. YOUR. That's the key word here. Your fault. Just like that Task Manager of yours, it seems to me you just didn't know how to use that Nvidia GPU. And you need that knowledge in order to form conclusive comparisons regarding efficiency.
3) I made a gif, just for you. Here. Try not to hurt yourself reading it.
http://i39.tinypic.com/5ygu91.jpg
4) Upgrade your ADSL? BTW, the Nvidia driver includes extra functionality that ATI doesn't even have (and hint, it doesn't pollute your PC!).
tamalero - Sunday, April 5, 2009 - link
Sorry to burst your bubble, but your screenshot is no proof; it's clear you killed a LOT OF PROCESSES just to take the screenshot. How about you take a FULL desktop screenshot that shows the Nvidia panel loaded? Because it doesn't seem to be in the process list.
Also, you're lying. I have an ATI 3870, and I only have 3 ATI processes, one of them being the "AI" tool for game optimizations (using the latest drivers).
And I agree with AnandTech for the first time ever: PhysX is just not ready for the "big thing".
Most of the things they tout are just "tech demos" or very weak stuff; Mirrors Edge's PhysX in other hand does show indeed add a lot of graphical feel.
And funny you mention the framework, because a lot of new games and programs NOW NEED the .NET Framework foundation, or at least the C++ redistributable packages.
Also, lately I've been reading a lot of fuss about the 275's optimizations, which in many games force the game to use lower AA levels than the ones chosen, thus giving the "edge" to the 275 vs the 4890 (MSAA vs TrSS).
I suppose Nvidia has again been playing dirty.
And it gets annoying how Nvidia has been doing that for quite a while to keep up the dumb "opening a can of whoop-ass" thing.
papapapapapapapababy - Monday, April 6, 2009 - link
"you removed a LOT OF PROCESSES", dumbass... if its not a necessary service, its turned off REGARDLESS OF THE videocard . maybe you should try to do the same? (lol just 3 ati services, run the ccc and see) btw, if i CHOOSE to use the nvidia control panel, THEN a new nvidia service starts, THEN as soon as I CLOSE THE control panel ( the process disappears from the task manager. THAT 3 ATI SERVICES OF YOURS ARE RUNNING ALL THE FRINGIN TIME, DUMMY, Remaining resident in YOUR memory REGARDLESS IF YOU USE OR NOT THE CCC. AND THEY GET BIGGER, AND BIGGER, AND BIGGER. ALSO YOU HAVE TO USE THE NET CRAP. (EXTRA SERVICES!) AND FINALLY, THE CCC RUNS LIKE ASS. so there, i WIN. YOU LOOSE. END OF STORY.tamalero - Thursday, April 9, 2009 - link
Hang on, since when do you need the CCC panel to be ON (i.e., loaded and not in the tray) to play games? Are you a bit dumb?
Second, why didn't you filter out the services then?
Your screenshot is bull.
It's almost like you ran WinXP in safe mode just to take the screenshot and claim your "memory superiority".
Like I said, show us a full screenshot that shows the Nvidia panel LOADED.
Your argument is stupid.
4 MB of RAM must be a LOT for you? (That's what my ATI driver currently uses on Vista x64.)
BTW, there's also an option on the ATI side to remove the panel from the tray.
The tray serves a similar function to ATI Tool (i.e., fast resolution, color depth and frequency changes).
Compare apples to apples if you want to have a smart conversation.
"Runs like ass" makes me wonder how old you are - 14?
And my CCC runs just fine, thank you - not a single error.
Also, I have all the frameworks installed, and even with programs loaded I don't see any "framework" services or applications running, so please get your head out of your ass.
you're just like this pseudo SiliconDr who spreads only FUD and insults.
SiliconDoc - Friday, April 24, 2009 - link
Besides all your errors and the corrections issued, it comes down to you claiming "Don't load the software that came with the ATI card, because it's a fat bloated pig that needs to be gotten rid of".
Sad, isn't it ?
You do know you got spanked badly, and used pathetic 3rd grader whines like "~ your screencap is fake" after he had to correct you on it all....
Just keep it shut until you have a valid point - stop removing all doubt.
SiliconDoc - Monday, April 6, 2009 - link
Well thanks for stomping the red rooster into the ground, definitively, after proving, once again, that what an idiot blabbering pussbag red spews about without a clue should not be swallowed with lust like a loose girl.I mean it's about time the reds just shut their stupid traps - 6 months of bs and lies will piss any decent human being off. Heck, it pissed off NVidia, and they're paid to not get angry. lol
tamalero - Sunday, April 5, 2009 - link
arggh, lots of typoos."Mirrors Edge's PhysX in other hand does show indeed add a lot of graphical feel. " should have been : Mirrors Edge's physx in other hand, does indeed show a lot of new details.
lk7600 - Friday, April 3, 2009 - link
Can you please die? Prefearbly by getting crushed to death, or by getting your face cut to shreds with a
pocketknife.
I hope that you get curb-stomped, f ucking retard
Shut the *beep* up f aggot, before you get your face bashed in and cut
to ribbons, and your throat slit.
papapapapapapapababy - Saturday, April 4, 2009 - link
Yes, i love you too, silly girl.
lk7600 - Friday, April 3, 2009 - link
Can you please remove yourself from the gene pool? Preferably in the most painful and agonizing way possible? Retard
magnetar68 - Thursday, April 2, 2009 - link
Firstly, I agree with the article's basic premise that the lack of convincing titles for PhysX/CUDA means this is not a weighted factor for most people. I am not most people, however, and I enjoy running NVIDIA's PhysX and CUDA SDK samples and learning how they work, so I would sacrifice some performance/quality to have access to these features (even spend a little more for them).
The main point I would like to make, however, is that I like the fact that NVIDIA is out there pushing these capabilities. Yes, until we have cross-platform OpenCL, physics and GPGPU apps will not be ubiquitous; but NVIDIA is working with developers to push these capabilities (and 3D Stereo with 3D VISION) and this is pulling the broader market to head in this direction. I think that vision/leadership is a great thing and therefore I buy NVIDIA GPUs.
I realize that ATI was pushing physics with Havok and GPGPU programming early (I think before NVIDIA), but NVIDIA has done a better job of executing on these technologies (you don't get credit for thinking about it, you get credit for doing it).
The reality is that games will be way cooler when you extrapolate from Mirror's Edge to what will be around down the road. Without companies like NVIDIA out there making solid progress on executing and delivering these capabilities, we will never get there. That has value to me that I am willing to pay a little for. Having said that, performance has to be reasonably close for this to be true.
JarredWalton - Thursday, April 2, 2009 - link
Games will be better when we get better effects, and PhysX has some potential to do that. However, the past is a clear indication that developers aren't going to fully support PhysX until it works on every mainstream card out there. Pretty much it means NVIDIA pays people to add PhysX support (either in hardware or other ways), and OpenCL is what will really create an installed user base for that sort of functionality. If you're a dev, what would you rather do: work on separate code paths for CUDA and PhysX and forget about all other GPUs, or wait for OpenCL and support all GPUs with one code path? Look at the number of "DX10.1" titles for a good indication.
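As a footnote to the "one code path" point: the whole appeal of OpenCL is that a single host program can discover and target every vendor's GPU through the same API. A minimal sketch of what that looks like (plain OpenCL 1.0 host code; it assumes an OpenCL SDK and runtime are installed, which at the time of this thread was still a big assumption):

```cpp
// Enumerate every OpenCL platform and GPU device in the system, whatever the vendor.
// The same kernel source could then be built for each device - one code path.
#include <CL/cl.h>
#include <stdio.h>

int main(void) {
    cl_platform_id platforms[8];
    cl_uint num_platforms = 0;
    clGetPlatformIDs(8, platforms, &num_platforms);
    if (num_platforms > 8) num_platforms = 8;

    for (cl_uint p = 0; p < num_platforms; ++p) {
        char vendor[256] = {0};
        clGetPlatformInfo(platforms[p], CL_PLATFORM_VENDOR, sizeof(vendor), vendor, NULL);

        cl_device_id devices[8];
        cl_uint num_devices = 0;
        if (clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_GPU, 8, devices, &num_devices) != CL_SUCCESS)
            continue;   // this platform exposes no GPU devices
        if (num_devices > 8) num_devices = 8;

        for (cl_uint d = 0; d < num_devices; ++d) {
            char name[256] = {0};
            clGetDeviceInfo(devices[d], CL_DEVICE_NAME, sizeof(name), name, NULL);
            printf("%s : %s\n", vendor, name);
        }
    }
    return 0;
}
```

The catch, as the comment says, is adoption: production OpenCL drivers from either vendor were not broadly available yet when this thread was written, so in practice developers still had to choose between vendor-specific paths or nothing.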
josh6079 - Thursday, April 2, 2009 - link
Agreed. Nvidia has certainly received credit for getting accelerated physics moving, but its momentum stops when they couple it to CUDA, shutting it off from discrete graphics cards outside of the GeForce family.
Hrel - Thursday, April 2, 2009 - link
Still no 3DMark scores, STILL no low-to-mid resolutions. Thanks for including the HD4850, but where's the GTS250??? Or do you guys still not have one? Well, you could always use a 9800GTX+ instead, and actually label it correctly this time. Anyway, thanks for the review and all the info on CUDA and PhysX; it pretty much just confirmed what I already knew: none of it matters until it's cross-platform.
7Enigma - Friday, April 3, 2009 - link
3DMark can be found in just about every other review. I personally don't care, but I realize people compete on the ORB, and since it's just a simple benchmark to run, it probably could be included without much work. The only problem I see (and agree with) is the highly optimized treatment both Nvidia and ATI give the PCMark/3DMark Vantage benchmarks. They don't really tell you much about anything, IMO. I'd argue they don't tell you about future games (since to my knowledge no - or maybe one? - game has ever used an engine from the benchmarks), nor do they tell you much between cards from different brands, since both vendors look for every opportunity to tweak for the highest score, regardless of whether it has any effect on real-world performance. What low-to-mid resolution are you asking for? 1280x1024 is the only one I'd like to see (as that's what I and probably 25-50% of all gamers are still using), but I can see why in most cases they don't test it (you have to go to low-end cards to have an issue with playable framerates on anything 4850 and above at that resolution). Xbit Labs' review did include 1280x1024, but as you'll see, unless you are playing Crysis: Warhead, and to a lesser extent Far Cry 2, with max graphics settings and high levels of AA, you are normally in the high double to triple digits in terms of framerate. Any resolution lower than that, you've got to be on integrated video to care!
SiliconDoc - Monday, April 6, 2009 - link
Yes, exactly why the added value of CUDA, PhysX, Badaboom, vReveal, the game profiles ready in the nv panel, the forced SLI, the ambient occlusion games and their MODS (see back a page or two in the comments) - all MATTER to a lot of gamers.
Let's not forget card size for htpc'ers - heat, dissipation, H.264 etc.
Just the frames matter here, just for ati - formerly at 2560x when ati had that crown, now of course just for lower resolutions - suddenly the most important thing to the same reviewers, when ati is stuck down there.
Yeah, PATHETIC describes the dismissal of added values.
Flunk - Thursday, April 2, 2009 - link
I have a CUDA-supporting GPU (8800GTS) and I have rarely used it, other than to run the CUDA version of Folding@home (there is also an Ati Stream version) or to look at the pretty effects in a few games. I don't really think these effects are particularly worthwhile, and unless the industry comes together and supports a standard like OpenCL I don't see GPU-based processing becoming important to most users.
SiliconDoc - Monday, April 6, 2009 - link
Here's a clue as to why you're already WRONG. Most "gpu users" use NVidia. DUH.
So while you're whistling in the dark, it's already past that time when your line of crap has any basis in reality.
It takes a gigantic red fanboy brain fart to state otherwise.
Oh well, since when did facts matter when the red plague is rampant?
Hrel - Thursday, April 2, 2009 - link
You can get an Nvidia GPU that runs CUDA and Badaboom for $50; the 9600GT. End of page 13.
Hrel - Thursday, April 2, 2009 - link
You can get an Nvidia GPU that runs CUDA and Badaboom for $50; the 9600GT.
punjabiplaya - Thursday, April 2, 2009 - link
Just need some stable OC vs OC results!
SiliconDoc - Monday, April 6, 2009 - link
anand doesn't do the overclocked-part comparison of the videocard wars - BUT DON'T worry - a red rooster exception with charts and babbling is no doubt coming down the pike. Keep begging, then they can "respond to customer demands". lol
Oh man, this is going to be fun.
I suggest they start with the gainward gtx260 overclock goes like hell, that whips every single 4870 1g XXX ever made. Sound good ?
Griswold - Thursday, April 2, 2009 - link
What I'm really curious about, since neither of these cards is what I'm interested in buying (I just like to follow both companies' business strategies): does nvidia really lose money, or is it looking at a fat zero on the bottom line with this card?
SiliconDoc - Monday, April 6, 2009 - link
Uhh ati is losing a billion a year. If you want card specifics, that's probably difficult to calculate - and loss leaders are nothing new in business - in fact that's what successful businesses use as a sales tool. Seems ATI has taken it a bit too far and made every card they sell a loss leader, hence their billions in the hole.
Now as far as the NVidia card in question, even if Obama takes over the mean greedy green machine - he and his cabal "won't release the information because it's just not fair and may cause those not really needing help at the money window to be exposed".
So no, you won't be finding out.
The problem is anyway, if a certain card is a loss leader, they calculate how much other business it brings in, and that makes it a WINNER - and that's the idea.
flashbacck - Thursday, April 2, 2009 - link
The physx/cuda section was interesting, although it sounded a bit... whiny.
I would LOVE it if someone would write an article about all the PR and marketing shenanigans that go on with reviewers behind the scenes. It'll never happen because it would kill any relationship the author has with the companies, but I bet it would be an eye-opening read.
josh6079 - Thursday, April 2, 2009 - link
I'm of the opinion that PhysX is good and will only become better in time. Yet I more than acknowledge the fact that CUDA is going to hinder its adoption so long as nVidia remains unwilling to decouple the two.
There was a big thread concerning this on the Video forum, and some people just can't get past the fact that CUDA is proprietary and OpenCL is not. As long as you have that factor, hardware vendors are going to refrain from supporting their competitor's proprietary parallel programming, and because of that developers will continue to aim for the biggest market segment.
PhysX set the stage for non-CPU physics calculations, but that is no longer going to be an advantageous trait for them. They'll need to improve PhysX itself, and even then they will have to provide it to all consumers -- whether they have an ATi or nVidia GPU in their system. They'll have to do this because Havok will be doing exactly that, with OpenCL serving as the parallel programming layer instead of CUDA, thereby allowing Havok GPU-accelerated physics on all OpenCL-compliant GPUs.
tamalero - Sunday, April 5, 2009 - link
the problem is, by the time PhysX becomes the norm, you will be on your NVidia 480GTX :P It happened to AMD and their x64 technology; it took quite a while to take off.
haukionkannel - Thursday, April 2, 2009 - link
Well, if these cards reduce the price of earlier cards it's just a good thing :-) On ATI's part the changes are not big, but they make the product better. It's a better overclocker than the predecessor, and it has better power circuitry. It's just an okay upgrade, like Phenom II was compared to the original Phenom (though the 4870 was and still is a better GPU than the Phenom was a CPU...)
Nvidia's 275 offers a good upgrade over the 260, so not so bad if those rumors about shady preview samples turn out to be false. If the preview parts really are beefed-up versions... well, Nvidia would be in some trouble, and I really think that they would not be that stupid, would they? All in all the improvement from 260 to 275 seems to be bigger than 4870 to 4890, so the competition is getting tighter. So far so good.
In real life both producers are keen on getting their DX11 cards ready for the DX11 launch, so this may be a quite boring year on the GPU front until the next generation comes out...
knutjb - Thursday, April 2, 2009 - link
Both cards performed well and the performance differences are small. I can buy the 4890 today on newegg but not the 275. I know the 4890 is a new chip; even if it is just a refined RV770 it's still a NEW part. It falls within an easily understood hierarchy in the 4800 range. Bottom line, I know what I'm getting. The 275 I can't buy today, and it appears to be another recycled part with unclear origins. Nvidia's track record with musical labeling is bothersome to me. I want to know what I'm buying without having to spend days figuring out which version is the best bang for the buck. Come on Nvidia, this is a problem and you can do better than this. CUDA and PhysX aren't enough to sway me on their own merits since most of the benefits require me to spend more money; yes they add value, but at what expense?
SiliconDoc - Monday, April 6, 2009 - link
nutjob, you're not smart enough to own NVidia. Stick with the card for dummies, the ati. Here's a clue: "overclocked 4870 with 1 gig ram not 512, not a 4850 because it has ddr5 not ddr3 - so we call it 4870+ - no wait that would be fair, if we call it 4870 overclocked, uhh... umm.. no we need a better name to make it sound twice as good... let's see, 4850, then 4870, so twice as good would be uhh. 4890 ! That's it !
There ya go... So the 4890 is that much better than the 4870, as it is above the 4850, right ? LOL
Maybe they should have called it the 4875, and been HONEST about it like NVidia was > 280 285 ...
No ATI likes to lie, and their dummy fans praise them for it.
Oh well, another red rooster FUD packet blown to pieces.
knutjb - Saturday, April 11, 2009 - link
Dude you missed the whole point, must be the green blurring your vision. Nvidia takes an existing chip and reduces its capacity, or takes one that doesn't meet spec and puts it out as a new product, or they take the 8800, then 9800, then the 250, then... that is re-badging. The 4850 and 4830, same same. Grading chips is nothing new, but Nvidia keeps rebadging OLD, but good, chips and releasing them as if they are NEW, which is where my primary complaint about Nvidia gfx cards comes from.
The 4890 might not be an entirely new core but they ADDED to it, rearranged the layout, in the end improving it; they didn't SUBTRACT from it. It is more than a 4870+. It is a very simple concept that apparently you are unable to grasp due to your being such a fanboy. So you don't like ATI, I don't care; I buy whoever has the best bang for the buck that meets my needs, not what you think.
ATI looked at the market and decided to hit the midrange and expand down and up from there. They went where most of the money is, in the midrange, not high-end gaming. They are hurting and a silly-money flagship doesn't make sense right now. If Nvidia wasn't concerned with the 4890 they wouldn't have released another cut-down chip. Put down the pipe and step away from the torch.... Seek help.
SiliconDoc - Thursday, April 23, 2009 - link
So your primary problem is that you think nvidia didn't rework their layout when they changed from G80, to G92, to G92b, and you don't like the fact that they can cover the entire midrange by doing that, because of the NAME they use when they change the bit width, the shaders, the mem speed etc - BUT when ATI does it it's ok because they went for the mid range; you admit the 4850 and 4830 are the same core, but fail to mention the 4870 and fairly include the 4890 as well - because it's OK when ati does it.
Then you ignore all the other winning features of nvidia, and call me names - when I'M THE PERSON TELLING THE TRUTH, AND YOU ARE LYING.
Sorry bubba, doesn't work that way in the real world.
The real horror is ATI doesn't have a core better than the G80/G92/G92b - and the only thing that puts the 4870 and 4890 up to 260/280 levels is the DDR5, which I had to point out to all the little lying spewboys here already.
Now your argument that ATI went for the middle indicates you got that point, and YOU AGREE WITH IT, but just can't bring yourself to say it. Yes, that's the truth as well.
Look at the title of the continuing replies "RE: Another Nvidia knee jerk" - GET A CLUE SON.
lol
Man are you people pathetic. Wow.
Exar3342 - Thursday, April 2, 2009 - link
These are both basically rebadges; deal with it.
knutjb - Friday, April 3, 2009 - link
If 3,000,000 more transistors is "basically a rebadge" you are lost on how much work goes into designing a chip as opposed to changing the stamper on the chip printing machine. I would speculate ATI/AMD has made some interesting progress on their next-gen chip design and applied it to the RV770; it worked, so they're selling it now to fill a hole in the market.
It sounds like you are trying to deal with Nvidia's constant rebadging and have to point the finger and claim ATI/AMD is doing it too. Where did the 275 chip come from? Yes it is a good product, but how many names do you want it called?
I have bought just as many Nvidia cards as I have ATI/AMD based on bang for the buck, just calling it like I see it...
SiliconDoc - Monday, April 6, 2009 - link
Well, they worked it for overclocking - and apparently did a fine job of that - but it is a rebadging, nonetheless. It seems the less than one half of one percent of "new core" transistors are used as a sort of multi-capacitor ring around the outside of the core, for overclocking legs. Not bad, but not a new core. I do wonder, as they hinted they "did some rearranging" - if they had to waste some of those on the core works - lengthening or widening or bridging this or that - or connections to the BIOS for volt modding or what have you.
When either company moves to a smaller die, a similar effect is had for the cores; some movements and fittings and optimizations always occur, although this site always jumped on the hate and lie bandwagon to screech about "rebranding" - as well as "confusing names" since the cards were not all the same... bit width, memory type, size, shaders, etc.
So I'm sure we would hear about the IMMENSE VERSATILITY of the awesome technology of the ati core (if they did the same thing with their core).
However, they've done a rebranding with a ring around it for the overclock. Nice, but same deal.
Can you tell us how much more expensive it's going to be to produce, since Derek and anand decided to "not mention the cost" since they didn't have the green monster to bash about it ?
Oh that's right, it's RUDE to mention the extra cost when the red rooster company is burning through a billion a year they don't have - ahh, the great sales numbers, huh ?
tamalero - Thursday, April 9, 2009 - link
the 275 is a 285 nerf, so what? Your point is?
SiliconDoc - Thursday, April 23, 2009 - link
The other point is, when you've been whining about nvidia having a giant brute-force core that costs too much to make, and how that gives ati a huge price and profit advantage (even though ati has been losing a billion a year), that when ati makes a larger core and a more expensive board and cooler setup standard for their rebrand, you point out the greater expense, in order to at least appear fair, and not be a red raging rooster rooter. Got it there bub ?
Sure hope so.
Next time I'll have to start charging you for tutoring and reading comprehension lessons.
SiliconDoc - Thursday, April 23, 2009 - link
Uh, for you, the mentally handicapped, the point is since ati made a rebrand, call it a rebrand, especially when you've been screeching like a 2 year old about nvidia rebrands, otherwise you're a lying sack of red rooster crap, which you apparently are. Welcome to the club, dumb dumb.
I hope that helps with your mental problem, your absolute inability to comprehend the simplest of points. I would like to give you credit and just claim you're being a smart aleck, but it appears you are serious and haven't got clue one. I do feel sorry for you. Must be tough being that stupid.
Griswold - Thursday, April 2, 2009 - link
Just that one "rebadge" comes with 3 million extra transistors; deal with it.SiliconDoc - Monday, April 6, 2009 - link
" Because they’re so similar, the Radeon 4870 and 4890 can be combined together for mix-and-match CrossFire, just like the 4850 and 4870."Yep, that non rebadge. LOL
jtleon - Thursday, April 2, 2009 - link
Visit: http://chhosting.org/index.php?topic=24.0
To see AO applied to FEAR. Boris Vorontsov developed the directx mod long ago!
SiliconDoc - Monday, April 6, 2009 - link
Oh sorry, forgot forced SLI profiles, and I don't want to fail to mention something like EVGA's early-release NVidia game drivers for games on DAY ONE. lol
Aww, red rover red rover, send the crying red rooster right over.
Did I mention ati lost a billion bucks two years in a row for amd ?
No ?
I guess Dewreck and anand forgot to mention the larger die, and more expensive components on the 790 ati boards will knock down "the profits" for ati. LOL Yeah, awww... we just won't mention cost when ati's goes up - another red rooster sin by omission.
I ought to face it, there are so many, I can't even keep up anymore.
They should get ready for NVidia stiffing them again, they certainly deserve it - although it is funny watching anand wince in text as he got addicted to Mirror's Edge - then declared "meh" for nvidia.
lol - it's so PATHETIC.
tamalero - Thursday, April 9, 2009 - link
what the hell are you talking about?
SiliconDoc - Thursday, April 23, 2009 - link
You proved you can't read and comprehend properly on the former page, where I had to correct you in your attempt to whine at me - so forget it - since you can't read properly, ok nummy ?
SiliconDoc - Monday, April 6, 2009 - link
Ahh, thank you very much. lol
NVIDIA wins again !
rofl
I'm sure the ati card buyers will just hate it...but of course they are so happy with their pathetic "only does framerates, formerly in 2560 for wins, now in lesser resolutions for the win"
It just never ends - Cuda, PhysX, Ambient Occlusion, Badaboom, vReveal, the game presets INCLUDED in the driver, the ability to use your old 8- or 9-series Nvidia card for PhysX or Cuda in a xfire board with another NVidia card for main gaming ...
I know, NONE OF IT MATTERS !
The red rooster fanbois hate all of that ! They prefer a few extra frames at way above playable framerates in certain resolutions depending on their fanboy perspective of the card release (formerly 2560 now just lower resolutions)- LOL that they cannot even notice unless they are gawking at the yellow fraps number while they get buzzed down in cold blood in the game.
Ahhh, the sweet taste of victory at every turn.
Psyside - Thursday, April 2, 2009 - link
Can anyone tell me about the testing method, average or maximum fps? Thanks.
Jamahl - Thursday, April 2, 2009 - link
Some sites have the gtx275 clearly winning at all games, all resolutions.
helldrell666 - Thursday, April 2, 2009 - link
You can't trust every site you check, especially since most of those sites don't post their funders' names on their main page. You must've heard of HardOCP's Kyle, who got cut off by nvidia because he mentioned that the GTS 250 is a renamed 9800GTX.
7Enigma - Thursday, April 2, 2009 - link
I think this is due to Nvidia shooting themselves in the foot with the 185 drivers. With the performance penalty at the normal resolutions, anyone testing with the 185's is going to get lower results than someone testing with the previous drivers. And I'm sure you could find 10 games that all perform better on either ATI or NVIDIA. That's the problem with game selection, and the only real answer is what types of games you play and what engines you think will be used heavily for the next 2 years.
SiliconDoc - Monday, April 6, 2009 - link
Well the REAL ANSWER is - if you play at 2560, or even if you don't, and have been a red raging babbling lying idiot red rooster for 6 months plus pretending along with Derek that 2560x is the only thing that matters, now you have a driver for NVidia that whips the ati top dog core... If you're ready to reverse 6 months of red ranting and raving that at 2560x ati wins it all, just keep the prior NV driver, so the red roosters screaming they now win because they suddenly are stuck at the LOWER REZ tier to claim a win can be blasted to pieces anyway - at that resolution.
So - NVidia now has a driver choice - the new for the high rez crown they took from the red fanboy ragers, and the prior driver which SPANKS THE RED CARD AGAIN at the lower rez.
Make sure to collude with all the raging red roosters to keep that as hush hush as possible.
1. spank the 790 at lower rezz with the older Nvidia driver
2. spank the 790 at the highest rez with the new driver
_______________________
Don't worry if you can't understand, just keep hopping around flapping those little wings and clucking so that red gobbler bounces around - don't worry, soft PhysX can display that flabby flapper !
The0ne - Tuesday, April 7, 2009 - link
Can someone ban this freaking idiot. The last few posts of his have been nothing but moronic, senseless rants. Jesus Christ, buy a gun and shoot yourself already.
SiliconDoc - Tuesday, April 7, 2009 - link
Ahh, you don't like the points, so now you want death. Perhaps you should be banned, mr death wisher. If you don't like the DOZENS of valid points I made, TOO BAD - because you have no response - now you sound like krz1000 and his endless list of names, the looney red rooster that screeches the same thing you just did, then posts a link to youtube with a freaky slaughter video.
If I wasn't here, the endless LIES would go unopposed. Now GO BACK and respond to my points LIKE A MAN, if you have anything, which no doubt you do not.
helldrell666 - Thursday, April 2, 2009 - link
According to xbitlabs, the 4890 beats the gtx285 at 1920x1200 with 4x AA in CoD5, Crysis Warhead, Stalker CS and Fallout 3, and loses in Far Cry 2. Here, the 4890 matches it in Far Cry 2 and CoD5, with slightly lower fps than the gtx285 in Crysis Warhead. Strange....
7Enigma - Thursday, April 2, 2009 - link
That is crazy. There is no way variations should be that huge between the 2 tests, regardless of the area they chose to test in the game. Anandtech has it as essentially a wash, while Xbit has the 4890 20% faster!?! (COD:WaW)
7Enigma - Thursday, April 2, 2009 - link
Just looked closer at the Xbitlabs review. The card they used was an OC variant with a 900MHz core instead of the stock 850MHz. In certain games that are not super graphically intensive I'm willing to bet that at 1920x1200 they may still be core-starved and not memory-starved, so a 50MHz increase may explain the discrepancy.
I've got to admit you need to take the Xbitlabs article with a grain of salt if they are using the OC variant as the base 4890 in all of their charts.... that's pretty shady...
7Enigma - Thursday, April 2, 2009 - link
And just go and disregard everything I typed (minus the different driver versions). Xbit apparently underclocked the 4890 to stock speeds. So I have no clue how the heck their numbers are so significantly different, except they have this posted for system settings:
ATI Catalyst:
Smoothvision HD: Anti-Aliasing: Use application settings/Box Filter
Catalyst A.I.: Standard
Mipmap Detail Level: High Quality
Wait for vertical refresh: Always Off
Enable Adaptive Anti-Aliasing: On/Quality
Other settings: default
Nvidia GeForce:
Texture filtering – Quality: High quality
Texture filtering – Trilinear optimization: Off
Texture filtering – Anisotropic sample optimization: Off
Vertical sync: Force off
Antialiasing - Gamma correction: On
Antialiasing - Transparency: Multisampling
Multi-display mixed-GPU acceleration: Multiple display performance mode
Set PhysX GPU acceleration: Enabled
Other settings: default
If those are set differently in Anand's review I'm sure you could get some weird results.
SiliconDoc - Monday, April 6, 2009 - link
LOL - set PhysX gpu acceleration enabled.
roflmao
Yeah man, I'm gonna get me that red card... ( if you didn't detect sarcasm, forget it)
tamalero - Thursday, April 9, 2009 - link
Good to know you blame everyone for "bad reading understanding". Let's see:
ATI Catalyst:
Smoothvision HD: Anti-Aliasing: Use application settings/Box Filter
Catalyst A.I.: Standard
Mipmap Detail Level: High Quality
Wait for vertical refresh: Always Off
Enable Adaptive Anti-Aliasing: On/Quality
Other settings: default
Nvidia GeForce:
Texture filtering – Quality: High quality
Texture filtering – Trilinear optimization: Off
you see the big "NVIDIA GEFORCE:" right below "other settings"?
that means the physX was ENABLED on the GEFORCE CARD.
you, sir, are an nvidia fanboy and a big douche
SiliconDoc - Thursday, April 23, 2009 - link
More personal attacks, when YOU are the one who can't read, you IDIOT. Here are my first two lines: LOL - set PhysX gpu acceleration enabled.
roflmao
_____
Then you tell me it says PhysX is enabled - which is what I pointed out. You probably did not go see the linked test results at the other site and put two and two together.
Look in the mirror and see who can't read, YOU FOOL.
Better luck next time crowing barnyard animal.
"Cluckle red 'el doo ! Cluckle red 'ell doo !"
Let's see, I say PhysX is enabled, and you scream at me to point out it says PhysX is enabled, and call me an nvidia fan because of it - which would make you an nvidia fan as well - according to you, IF you knew what the heck you were doing, which YOU DON'T.
That makes you - likely a red rooster... I may check on that - hopefully you're not a noob poster, too, as that would reduce my probabilities in the discovery phase. Good luck, you'll likely need it after what I've seen so far.
7Enigma - Thursday, April 2, 2009 - link
Looked even closer and the drivers used were different.
ATI Drivers:
Anand-9.4 beta
Xbit-9.3
Nvidia:
Anand-185
Xbit-182.08
ancient46 - Thursday, April 2, 2009 - link
I don't see the fun in shooting cloth and unrealistic non-impact-resistant windows in high-rise buildings. The video with the cloth was distracting; it made me wonder why it was there. What was its purpose? My senior eyes did not see much of an improvement in the videos in the CUDA application.
SiliconDoc - Monday, April 6, 2009 - link
Maybe someday you'll lose your raging red fanboy bias, break down entirely, toss out your life religion, and buy an nvidia card. At that point perhaps Mirror's Edge will come with it, and after digging it out of the trash can (second thoughts you had), you'll try it, and like anand, really like it - turn it off, notice what you've been missing, turn it back on, and enjoy. Then after all that, you can crow "meh". I suppose after that you can revert to red rooster raging fanboy - you'll have to have your best red bud rip you from your Mirror's Edge addiction, but that's ok, he's a red and will probably smack you for trying it out - and have a clean shot with how absorbed you'll be.
Well, that should wrap it up.
poohbear - Thursday, April 2, 2009 - link
Are the driver issues for AMD so significant that they need to be mentioned in a review article? I'm asking in all honesty as I don't know. Also, this close developer relationship nvidia has w/ developers: does it show up in any games to give a significant performance edge to nvidia vid cards? Is there an example game out there for this? Thanks.
7Enigma - Thursday, April 2, 2009 - link
Look no further than this article. :) Here's the quote:
"The first thing about Warmonger is that it runs horribly slow on ATI hardware, even with GPU accelerated PhysX disabled. I’m guessing ATI’s developer relations team hasn’t done much to optimize the shaders for Radeon HD hardware. Go figure."
But ATI also has some relations with developers that show an unusually high advantage as well (Race Driver G.R.I.D. for example). All in all, as long as no one is cheating by disabling effects or screwing with draw distances, it only benefits the consumer for the games to be optimized. The more one side pushes for optimizations, the more the other side is forced, or risk losing the benchmark wars (which ultimately decides purchases for most people).
SkullOne - Thursday, April 2, 2009 - link
The conclusion mentions Nvidia's partners releasing OC boards but nothing about AMD's. There are already two versions of the XFX HD4890 on Newegg: one is 850 core and the other is 875 core.
The HD4890 is geared to open that SKU of "OC" cards for AMD. People with stock cooling and stock voltage can already push the card to 950+MHz. On the ASUS card you can boost voltage to the GPU, which has allowed people to get over 1GHz on their GPU. As the card matures, seeing 1GHz cores on stock cooling and voltage will become a reality.
It seems like these facts are being ignored.
SiliconDoc - Monday, April 6, 2009 - link
Don't worry, it is mentioned in the article that their overclocking didn't have good results, so they're keying up a big fat red party for you soon. They wouldn't dare waste the opportunity to crow and strut around.
This was about announcing the red card, slamming nvidia for being late to market, denouncing cuda and physx, and making an embarrassingly numerous amount of "corrections" to the article, including declaring the 2560 win not a win anymore, since the red card didn't do it.
That's ok, be ready for the change back to 2560 is THE BESt and wins, when the overclock review comes out.
:)
Don't worry be happy.
tamalero - Thursday, April 9, 2009 - link
SD, you seriously have a mental problem, right? I noticed that you keep bashing and being sarcastically insulting (among other things) to anyone who supports ati.
SiliconDoc - Thursday, April 23, 2009 - link
No, not true at all; there are quite a few posts where the person declaring their ATI fealty doesn't lie their buttinski off - and those posts I don't counter. Sorry, you must be a raging goofball too who can't spot liars.
It's called LOGIC, that's what you use against the liars - you know, scientific accuracy.
Better luck next time - If you call me wrong I'll post a half dozen red rooster rooters in this thread that don't lie in what they say and you'll see I didn't respond.
Now, you can apologize any time, and I'll give you another chance, since you were wrong this time.
Nfarce - Thursday, April 2, 2009 - link
I just finished a mid-range C2D build, and decided to go with the HD 4870 512MB version for $164.99 (ASUS, no sale at NE, but back up to $190 now). This was my first ATI card and it was a no-brainer. While the 4890 is a better card, to me it is not worth nearly $100 more, especially considering I'm gaming at either 1920x1200 on a 40" LCD TV or a 22" LCD monitor at 1680x1050.
Nvidia has lost me after 12 years as a fanboy, for the time being I suppose. What I will do here when I have more time is determine whether buying another 4870 512MB for CrossFire will be the better bang for my resolutions, or eventually moving up to the 4890 when the price drops this summer and then selling the 4870.
Thanks for the GREAT review AT, and now I have my homework cut out for me for comparisons with your earlier GPU reviews.
Jamahl - Thursday, April 2, 2009 - link
Good job with tidying up the conclusion Anand.
Russ2650 - Thursday, April 2, 2009 - link
I've read that the 4890 has 959M transistors, 3M more than the 4870.
Gary Key - Thursday, April 2, 2009 - link
That is correct and is discussed on page 3. The increase in die size is due to power delivery improvements to handle the increased clock speeds.
Warren21 - Thursday, April 2, 2009 - link
Maybe the tables should be updated to reflect this?
Gary Key - Thursday, April 2, 2009 - link
They are... :)
helpmespock - Thursday, April 2, 2009 - link
I've been sitting on my 8800GT for a while now and was thinking about going to a 4870 1GB model, but now I may hold off and see what prices do.
If the 4890/275 force the 4870 down in price then great, I'll go with that, but on the other hand if prices on the new parts slip off the $250 mark then I'll be tempted by those instead.
Either way I think I'm waiting to see how the market shakes out and in the end I, the consumer, will win.
Lonyo - Thursday, April 2, 2009 - link
I haven't checked the benchmark numbers yet, but you list Forceware 185's on the NV side and Cat 8.12's on the AMD side.
Any reason why you are using brand new drivers for one side and a four-revision-old set on the other?
Sure, NV might not support a card on older drivers etc, but it's more useful to see newest driver vs newest driver, since there are performance changes from 8.12's to 9.3's in various games, but it just seems silly to use 3+ month old drivers which have been superseded by 3 revisions already. Would anyone buy a brand new card and then use drivers from 3~4 months ago?
Gary Key - Thursday, April 2, 2009 - link
The setup chart has been updated. We utilized previous test results from the HD4870 with the 8.12 HotFix drivers. The HD 4890 results were with the 9.4 beta drivers. I just went through a complete retest of the HD4870 with the 9.3 drivers and just started with the 9.4 betas on the AMD 790FX and Intel P45/X58 platforms. Except for a slight performance improvement in Crysis Warhead on the P45/X58 systems, the 9.3 drivers offered zero performance improvements over the 8.12 HotFix. The 9.3 drivers do offer a few updated CF profiles and improved video playback performance, but that is it.
Nfarce - Thursday, April 2, 2009 - link
"the 9.3 drivers offered zero performance improvements over the 8.12 HotFix."Not only that, but if you read the ATI forums, a lot of people have been having MAJOR problems with 9.2 & 9.3. I downloaded 9.1 for my new 4870 build and have had no problems.
SiliconDoc - Monday, April 6, 2009 - link
Now now, ati doesn't have any driver problems, so don't you DARE go spreading that lie. You hear me ?Hundreds of red roosters here have NEVER had a driver problem with their ati cards.
Shame shame.
Rhino2 - Monday, April 13, 2009 - link
Stop posting, seriously, your stupidity is causing me pain.
SiliconDoc - Thursday, April 23, 2009 - link
Another personal attacker without a single counterpoint to the many lies I've exposed. Good job, brainwashed fool.
Frallan - Thursday, April 2, 2009 - link
Thanx - good to know.
StormyParis - Thursday, April 2, 2009 - link
The Inq or The Reg is pretty adamant that the 275 is there just to spoil the 4890 launch, but never really in stores, and that reviewers are getting special high-spec, high-overclock samples handed out by nVidia instead of by the manufacturers.
Did your card come from an anonymous purchase, or was it a special review sample ?
Gary Key - Thursday, April 2, 2009 - link
We mentioned in the article that we received a review sample from NVIDIA. The clock speeds are the same as the retail cards that should start arriving later next week with widespread availability around the week of 4/13. Our retail review samples will arrive early next week.
As we stated several times in the article, this is a hard launch for AMD and yet another paper launch for NVIDIA, although according to the board partners the GTX275 is coming quickly this time around.
evilsopure - Thursday, April 2, 2009 - link
I also feel confused and somewhat misled by the conclusion that the GTX275 is "the marginal leader at this new price point of $250".
That conclusion doesn't seem true outside of resolutions of 2560x (30" monitor). Plus, $250 cards aren't even targeted or seriously considered for gaming at that resolution.
I came up with a different conclusion after thoroughly reading this review:
At the $250 price point and 1680x or 1920x gaming resolutions (where these cards primarily matter), the 4890 holds the majority performance advantage. However, at 2560x the GTX275 performs a bit better than the 4890. Realistically though, at 2560x or on 30+" displays, you're best served by a dual GPU or SLI/Xfire solution.
Something's fishy about the reviewers' conclusion.
evilsopure - Thursday, April 2, 2009 - link
Update: I guess Anand was making his updates while I was making my post, so the "marginal leader at this new price point of $250" line is gone and the Final Words actually now reflect my own personal conclusion above.
Anand Lal Shimpi - Thursday, April 2, 2009 - link
I've updated the conclusion, we agree :)
-A
SiliconDoc - Monday, April 6, 2009 - link
You agree now that NVidia has moved their driver to the 2560 rez to win, since for months on end you WHINED about NVidia not winning at the highest rez, even though it took everything lower. So of COURSE, now is the time to claim 2560 doesn't matter much, and suddenly ROOT for RED at lower resolutions.
If Nvidia screws you out of cards again, I certainly won't be surprised, because you definitely deserve it.
Thanks anyway for changing Derek's 6 month plus long mindset where only the highest resolution mattered, as he had been ranting and red raving how wonderful they were.
That is EXACTLY WHY his brain FARTED, and he declared NVidia the top dog - it's how he's been doing it for MONTHS.
So good job there, you BONEHEAD - you finally caught the bias, just when the red rooster cards FAILED at that resolution.
Look in the mirror - DUMMY - maybe you can figure it out.
7Enigma - Thursday, April 2, 2009 - link
Check the article again. Anand edited it and it is now very clear and concise.
7Enigma - Thursday, April 2, 2009 - link
Bah, internet lag. Ya got there first.... :)
sublifer - Thursday, April 2, 2009 - link
As I predicted elsewhere, they probably should have named this new card the GTX 281. In almost every single benchmark and resolution it beats the 280. In one case it even beat the 285 somehow.
/Gripe
That said, Go AMD! I wanna check other sites and see if they benched with the card highly over-clocked. One site got 950 core and 1150 memory easily but they didn't include it on the graphs :(
Anand Lal Shimpi - Thursday, April 2, 2009 - link
Hey guys, I just wanted to chime in with a few fixes:
1) I believe Derek used the beta Catalyst driver that ATI gave us with the 4890, not the 8.12 hotfix. I updated the table to reflect this.
2) Power consumption data is now in the article as well, 2nd to last page.
3) I've also updated the conclusion to better reflect the data. What Derek was trying to say is that the GTX 275 vs. 4890 is more of a wash at 2560 x 1600, which it is. At lower than 2560 x 1600 resolutions, the 4890 is the clear winner, losing only a single test.
Thank you for all the responses :)
Take care,
Anand
7Enigma - Thursday, April 2, 2009 - link
Thank you Anand for the update and the article changes. I think that will quell most of the comments so far (mine included).
Could you possibly comment on the temps posted earlier in the comments section? My question is whether there are significant changes with the fan/heatsink between the stock 4870 and the 4890. The idle and load temps of the 4890 are much lower, especially when the higher frequency is taken into consideration.
Also a request to describe the differences between the 4890 and the 4870 (several comments allude to a respin that would account for the higher clocks, lower temp, different die size).
Thank you again for all of your hard work (both of you).
Warren21 - Thursday, April 2, 2009 - link
Yeah, I would also second a closer comparison between RV790 and RV770, or at least mention it. It's got new power phases, different VRM (7-phase vs 5-phase respectively), slightly redesigned core (AT did mention this) and features a revised HS/F.
VooDooAddict - Thursday, April 2, 2009 - link
I was very happy to see the PhysX details. I'd started worrying I might be missing out with my 4870. It's clear now that I'm not missing out on PhysX, but might be missing out on some great encoding performance with CUDA.
I'll be looking forward to your SLI / Crossfire followup. Hoping to see some details about performance with the ultra-high anti-aliasing that's only available with SLI/Crossfire. I used to run two 4850s and enjoyed the high-end Edge Antialiasing. Unfortunately the pair of 4850s put out too much heat in a tiny Shuttle case, so I had to switch out to a 4870.
Your review reinforced something that I'd been feeling about the 4800s. There isn't much to complain about when running 1920x1200 or lower with modest AA. They seem well positioned for most gamers out there. For those out there with 30" screens (or lusting after them, like myself)... while the GTX280/285 has a solid edge, one really needs SLI/Crossfire to drive 30" well.
piesquared - Thursday, April 2, 2009 - link
Must be tough trying to write a balanced review when you clearly favour one side of the equation. Seriously, you toe NV's line without hesitation, including soon-to-be-extinct physx, a reviewer-only card, and drivers unreleased at the time of your review. And here's the kicker: you ignore the OC potential of AMD's new card, which, as you know, is one of its major selling points.
Could you possibly bend over any further for NV? Obviously you are perfectly willing to do so. F'n frauds
Chlorus - Friday, April 3, 2009 - link
What?! Did you even read the article? They specifically say they cannot really endorse PhysX or CUDA and note the lack of support in any games. I think you're the one toeing a line here.
SiliconDoc - Monday, April 6, 2009 - link
The red fanboys have to chime in with insanities so the reviewers can claim they're fair because "both sides complain". Yes, red rooster whiner never read the article, because if he had he would remember the line that neither overclocked well, and that overclocking would come in a future review ( in other words, they were rushed again, or got a chum card and knew it - whatever ).
So, they didn't ignore it , they failed on execution - and delayed it for later, so they say.
Yeah, red rooster boy didn't read.
tamalero - Thursday, April 9, 2009 - link
jesus dude, you have a strong persecution complex, right? It's like "ohh noes, they're going against my beloved nvidia, I MUST STOP THEM AT ALL COSTS".
I wonder how much nvidia pays you? ( if not, you're sad.. )
SiliconDoc - Thursday, April 23, 2009 - link
That's interesting, not a single counterpoint, just two whining personal attacks.Better luck next time - keep flapping those red rooster wings.
(You don't have any decent counterpoints to the truth, do you flapper ? )
Sometimes things are so out of hand someone has to say it - I'm still waiting for the logical rebuttals - but you don't have any, neither does anyone else.
aguilpa1 - Thursday, April 2, 2009 - link
All these guys talking about how irrelevant physx is and how few games use it don't get it. The power of physx is bringing the full strength of those GPUs to bear on everyday apps like CS4 or Badaboom video encoding. I used to think it was kind of gimmicky myself until I bought the "very" inexpensive Badaboom encoder and wow, how awesome was that! I forgot all about the games.
Rhino2 - Monday, April 13, 2009 - link
You forgot all about gaming because you can encode video faster? I guess we are just 2 different people. I don't think I've ever needed to encode a video for my ipod in 60 seconds or less, but I do play a lot of games.z3R0C00L - Thursday, April 2, 2009 - link
You're talking about CUDA, not PhysX.
PhysX is useless, as HavokFX will replace it as a standard through OpenCL.
sbuckler - Thursday, April 2, 2009 - link
No, physx has the market; HavokFX is currently demoing what physx did 2 years ago.
What will happen is that the moment HavokFX becomes anything approaching a threat, nvidia will port PhysX to OpenCL and kill it.
As far as ATI users are concerned the end result is the same - you'll be able to use physics acceleration on your card.
z3R0C00L - Thursday, April 2, 2009 - link
You do realize that Havok physics is used in more games than PhysX, right (including all the Source engine based games)?
And that Diablo 3 makes use of Havok physics, right? Just thought I'd mention that to give you time to change your conclusion.
sbuckler - Thursday, April 2, 2009 - link
Big difference between Havok Physics and HavokFX physics. With physx you can just turn on hardware acceleration and it works; with havok this is not possible - unlike physx it was never developed to be run on the gpu. Hence havok have had to develop a new physics engine to do that.
No game uses the HavokFX engine - it's not even available to developers yet, let alone in shipped games. The ati demo was all we have seen of it for several years. It's not even clear HavokFX is a fully accelerated hardware physics engine - i.e. the version shown in the past (before intel took over havok) was basically the havok engine with some hw acceleration for effects, i.e. hardware accel could only be used to make prettier explosions and rippling cloth - it could not be used to do anything game-changing.
Hence havok have a way to go before they can even claim to support what physX already does, let alone shipping it to developers and then seeing them use it in games. Like I said the moment that comes close to happening nvidia will just release an OpenCL version of physX and that will be that.
z3R0C00L - Thursday, April 2, 2009 - link
It's integrated in the same way. Many game developers are already familiar with coding for Havok effects.
Not to mention that OpenCL has chosen HavokFX (which simply uses either a CPU or a GPU to render physics effects, as seen here: http://www.youtube.com/watch?v=MCaGb40Bz58 ).
Again... Physx is dead. OpenCL is HavokFX, it's what the consortium has chosen and it runs on any CPU or GPU including Intel's upcoming Larrabee.
Like I said before (you seem to not understand logic). Physx is dead.. it's proprietary and not as flexible as Havok. Many studios are also familiar with Havok's tools.
C'est Fini as they say in french.
erple2 - Friday, April 3, 2009 - link
I think you're mistaken - OpenCL is analogous to CUDA, not to PhysX. HavokFX is analogous to PhysX. OpenCL is the GPGPU framework that runs on any GPU (and theoretically, it should run on any CPU too, I think). It's what Apple is now trying to push (curious, given that their laptop base is all nVidia now).
However, if NVidia ports PhysX to OpenCL, that's a win for everyone. Sort of. Except for NVidia, which paid a lot of money for the PhysX IP. I think that the conclusions given are accurate - NVidia is banking on "everyone" (i.e. game developers) coding for PhysX (and by extension, CUDA) rather than HavokFX (and by extension, OpenCL). However, if developers are smart, they'll go with the actually open format (OpenCL, not CUDA). That means that any physics processing they do will work on ANY GPU (NVidia and ATI). I personally think that NVidia banked badly this time.
While I do believe that doing physics calculations on unused GPU cycles is a great thing (and the Razor's Edge demo shows some of the interesting things that can be done), I think that NVidia's pushing of PhysX (and therefore CUDA) is like what 3dfx did with pushing GLide. Everyone supported Direct3D and OpenGL, but only 3dfx supported Glide. While Glide was more efficient (it was catering to a single hardware vendor that made Glide, afterall), the fact that Game Developers could instead program for OpenGL (or Direct3D) and get all 3D accelerators supported meant that the days of Glide were ultimately numbered.
I wonder if NVidia is trying to pull the industry to adopting its CUDA as a "standard". I think it's ultimately going to fail, however, given that the industry recognizes now that OpenCL is available.
Is OpenCL as mature as CUDA is? Or are they still kind of finalizing it? Maybe that's the issue - OpenCL isn't complete yet, so NVidia is trying to snatch up support in the Developer community early?
sbuckler - Friday, April 3, 2009 - link
CUDA is in many ways a simplified version of OpenCL - in that CUDA knows what hardware it will run on, so it has set functions to access it; OpenCL is obviously much more generic as it has to run on any hardware, so it's not quite as easy. That's part of the reason why CUDA is, initially at least, more popular than OpenCL - it's easier to work with. That said, they are very similar, so porting from one to the other won't be hard - hence develop for CUDA now, then just port to OpenCL when the market demands it.
All ATI wants, in my opinion, is for their hardware to run with whatever physics standard is out there. Right now they are at a growing competitive disadvantage as hardware physics slowly takes off. Hence they demo HavokFX in the hope that either (a) it takes off or (b) nvidia are forced to port PhysX to OpenCL. I don't think they care which one wins - both products belong to a competitor.
Nvidia who have put a lot of money into PhysX want to maximise their investment so they will keep PhysX closed as long as possible to get people to buy their cards, but in the end I am sure they are fully aware they will have to open it up to everyone - it's just a matter of when. From our standpoint the sooner the better.
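To make the kernel-level similarity concrete, here's a rough hypothetical sketch (mine, not from the article or the posters above): the same trivial scale-a-buffer kernel written once for CUDA and once as OpenCL C source. The device code is nearly line-for-line identical; the real porting work is in the host-side setup, which OpenCL makes explicit where CUDA can assume NVIDIA hardware.

// CUDA version - compiled by nvcc, launched with the <<< >>> syntax:
__global__ void scale_cuda(float* data, float factor, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= factor;
}
// host launch: scale_cuda<<<(n + 255) / 256, 256>>>(d_data, 2.0f, n);

// OpenCL version - the same kernel as a source string, built at run time with
// clBuildProgram and queued with clEnqueueNDRangeKernel on any vendor's GPU:
const char* scale_cl_src =
    "__kernel void scale_cl(__global float* data, float factor, int n) {\n"
    "    int i = get_global_id(0);\n"
    "    if (i < n) data[i] *= factor;\n"
    "}\n";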
erple2 - Friday, April 3, 2009 - link
Sure, but my point was simply that HavokFX and PhysX are physics API's, whereas OpenCL and CUDA are "general" purpose computing languages designed to run on a GPU.Is CUDA easier to work with? I don't really know, as I've never programmed for either. Is OpenGL harder to program for than Glide was? Again, I don't know, I'm not a developer.
ATI's "CUDA" was "Stream" (I think). I recall ATI abandoning that for (or folding that into) OpenCL. That's a sound strategic decision, I think.
If PhysX is ported to OpenCL, then that's a major win for ATI, and a lesser one for NVidia - the PhysX SDK is already free for any developer that wants it (support costs money, of course). NVidia's position in that market is that PhysX currently only works on NVidia cards. Once it works elsewhere (via OpenCL or Stream), NVidia loses that "edge". However, that's a good thing...
SiliconDoc - Monday, April 6, 2009 - link
I guess you're forgetting that recently NVidia supported a rogue software coder that was porting PhysX to ATI drivers. Those drivers hit the web and the downloads went wild high - and ATI stepped in and slammed the door shut with a lawsuit and threats.
Oh well, ATI didn't want you to enjoy PhysX effects. You got screwed, even as NVidia tried to help you.
So now all you and Derek and anand are left to do is sourpuss and whine PhysX sucks and means nothing.
Then anand tries Mirror's Edge ( because he HAS TO - cause dingo is gone - unavailable ) and falls in love with PhysX and the game. LOL
His conclusion ? He's an ati fanboi so he cannot recommend it.
tamalero - Monday, April 20, 2009 - link
the amount of fud you spit is staggering
z3R0C00L - Thursday, April 2, 2009 - link
On one hand you have OpenCL, Havok, ATi, AMD and Intel; on the other you have nVIDIA.
Seriously.
z3R0C00L - Thursday, April 2, 2009 - link
I'm an nVIDIA fan.. I'll admit. I like that you added CUDA and Physx.. but are we reading the same results?
The Radeon HD 4890 is the clear winner here. I don't understand how it could be any different.
CrystalBay - Thursday, April 2, 2009 - link
I agree, if nV wants to sell more cards they need to include the video software at no charge...
jeffrey - Thursday, April 2, 2009 - link
1) ATI driver - Was 8.12 really used? Why? 9.3 was released last month.
2) Conclusion - The edge should have gone to the 4890 for being ahead of the 275 in most games at resolutions targeting the price point.
Gary Key - Thursday, April 2, 2009 - link
1. The 9.4 beta was used for the HD 4890 and the chart has been updated to reflect it. The 9.3 drivers are not any faster than the 8.12 HotFix for the other AMD cards in every test I have run but Crysis Warhead with a Core 2 Quad. A few improvements have been made for CF compatibility and video playback though.
2. The conclusion has been updated to clarify our thoughts between the two cards.
can - Thursday, April 2, 2009 - link
An ATI gpu with an nvidia gpu doing physx? I'm curious to see results of this kind of arrangement. Not a dedicated PCI PhysX card, but on a faster bus, with a more powerful processor, as a video card. I'm wondering about pitfalls and performance and the literal looks of the application.
SiliconDoc - Monday, April 6, 2009 - link
Sorry bub, you're stuck with ati, and as far as curiosity for physx - uhh... don't worry, you're not missing much, anand only got addicted to it for a bit. If you want the driver hack for it, there's a thread at techpowerup.
Some genius figured something out on it- not sure which os.
Jamahl - Thursday, April 2, 2009 - link
The conclusion in this review is awful beyond anything I have read before.
How can the reviewer say the 275 is winning this one when the benchmarks clearly show dominance for the 4890 at most resolutions?
Otherwise it was a good article but the conclusion leaves a sour taste in the mouth.
SiliconDoc - Monday, April 6, 2009 - link
He could say it because he said it for ati for 6 months when ati won the top resolution. So his brain is in a "fart mode" that lied for ati for so long, he said it this time for nvidia - either that or he realized if he didn't he would look like an exposed raging red rooster fanboy.
Good thing the reds started screaming NOW, after loving it for 6 months when their card was on top using the false method - because anand came in and saved the day - and changed the conclusion - for ati.
LOL
When nvidia doesn't give them a card for review again, it will be "them towing the line of honesty" that causes, no doubt, right ?
BWAHAHAHAAAAAAA
( you all just tell yourselves that along with our dear leaders )
erikejw - Thursday, April 2, 2009 - link
"On the NVIDIA side, we received a reference version of the GTX 275."You wish.
"since there is no 275 ASIC, NV is telling OEMs that they can make it from either a 280 or 260 board. One costs much more, and one performs better, so guess what everyone is going to use?
That isn't necessarily bad, but how NV is seeding reviewers is. They are only going to be giving out a very special run of ONLY 280 based parts.
Quite special 280 based parts at that. Reviewers beware, what you are getting is not what you can buy."
http://www.theinquirer.net/inquirer/news/599/10515...
bill3 - Thursday, April 2, 2009 - link
A lot of people seem to be crying about the lack of temp, power consumption, OC, and fan noise numbers.. while I agree that in a stand-alone review these are glaring omissions, the fact is there's a dozen reviews around the web where you can get that info in triplicate. I would much rather have the insight on CUDA and PhysX!
I mean, people act like the internet isn't free and we all aren't a google search/mouse click away from that type of info! Geez.
That said, I suppose reviews must be treated as "stand-alone", however artificial a construct that may be. However, if there's anything that's easily forgivable to leave out, it's simple data numbers that can be found at a thousand other places. Which is exactly what temps, OC, etc. are. I already know those numbers from a ton of other reviews. The people whining in the comments act like Anand is the only hardware review site there is. I would think if anybody was truly interested in laying out $250 to purchase one of these they'd be looking at more than one review!
SkullOne - Thursday, April 2, 2009 - link
I second the question about why you used the Catalyst 8.12 Hotfix. Other sites are using what appears to be a beta Catalyst 9.4 driver, so is your listing of Catalyst 8.12 a misprint?
Also, why do you care if AMD sent you an overclocked version? The HD4890 is directly targeted at overclocking enthusiasts, which is a realm that AMD has ignored up until now while NV embraced it.
The HD4890 has already been taken to 1+GHz on its GPU and up to 4.8GHz on its memory on other sites. That by far makes it the better buy.
SkullOne - Thursday, April 2, 2009 - link
Forgot to mention that by having this extremely overclockable card AMD has opened up another entire SKU for themselves by selling "OC" cards with the 4890.
SiliconDoc - Monday, April 6, 2009 - link
Oh great, a whole other sku to lose another billion a year with. Wonderful. Any word on the new costs of the bigger gpu and expensive capacitors and vrm upgrades ?
Ahh, nevermind, heck, this ain't a green greedy monster card, screw it if they lose their shirts making it - I mean there's no fantasy satisfaction there.
Get back to me on the nvidia costs - so I can really dream about them losing money.
itbj2 - Thursday, April 2, 2009 - link
I am not sure about you guys but NVIDIA has problems with their drivers as well. I have a 9400GT and a 8800 GTS in my machine and the new drivers can't make the two work well enough for my computer to come out of hibernation with out Windows XP crashing every so often. This use to work just fine before I upgraded the drivers to the latest version.FishTankX - Thursday, April 2, 2009 - link
For anyone who REALLY wants temp data..Firingsquad 4890/GTX275 review
http://www.firingsquad.com/hardware/ati_radeon_489...
Idle
GTX 260 216 (45C)
GTX 285 (46C)
GTX 275 (47C)
4890 1GB (51C)
4870 (60C)
Load
4890 1GB (64C)
GTX 260 216 (64C)
GTX 275 (68C)
GTX 285 (70C)
4870 1GB (80C)
Power consumption
(Total system power)
Idle
GTX 275 (143W)
4890 (172W)
Load
4890 (276W)
GTX 275 (279W)
There, now you can can it! :D
SiliconDoc - Monday, April 6, 2009 - link
There it is again, 30 watts less at idle for nvidia, and only 3 watts more in 3d. NVIDIA WINS - that's why they left it out - they just couldn't HANDLE it....
So, if you're 3d gaming 91% of the time, and only 2d surfing 9% of the time, the ati card comes in at equal power usage...
Otherwise, it LOSES - again.
I doubt the red raging reviewers can even say it. Oh well, thanks for posting numbers.
7Enigma - Thursday, April 2, 2009 - link
Can anyone confirm whether or not the heatsink/fan has been altered between the 4870 and the 4890? I'm interested to know if the decreased temps of the higher clocked 4890 are due in part to a better cooling mechanism, or strictly from a respin/binning.
Warren21 - Thursday, April 2, 2009 - link
Yes, the cooler has been slightly revised. I believe it's a combination of both. I'll admit I'm a bit disappointed AT didn't explore the differences between the HD 4870 and the 4890 more in-depth.
Comparison:
http://www.hardwarecanucks.com/forum/hardware-canu...
bill3 - Thursday, April 2, 2009 - link
"It looks like NVIDIA might be the marginal leader at this new price point of $250." you wroteBut looking at your own benches..
Since you run 3 resolutions in your benches, let's reasonably declare that the card that wins 2 or more of them "wins" that game. In that case the 4890 wins over the 275 in COD: WaW, Warhead, Fallout 3, Far Cry 2, GRID, and Left 4 Dead. The 275 wins over the 4890 in Age of Conan. Either with AA or without, the results stay the same.
The only way I think you can contend 275 has an edge is if you place a premium on the 2560X1600 results, where it seems to edge out the 4890 more often. However, it's often at unplayable framerates. Further I dont see a reason to place undue importance on the 2560X benches, the majority of people still game on 1680X1050 monitors, and as you yourself noted, Nvidia released a new driver that trades off performance at low res for high res, which I think is arguably neither here nor their, not a clear advantage at all.
Even at 2560 (using the AA bar graphs because its often difficult to spot the winner at 2560 on the line graphs), where the 275 wins 5 and loses 2, the margins are often so ridiculously close it essentially a tie. 275 takes AOC, COD WaW, and L4D by a reasonable margin at the highest res, while the 4890 wins Fallout3 and GRID comfortably. Warhead and Far Cry 2 are within .7 FPS although nominally wins for 275. Thats a difference of all of 3-2 in materially relevant wins, or exactly 1 game. But keep in mind again that 4890 is fairly clearly winning the lower reses more often, and to me it's wrong to state 275 has the edge.
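If it helps, the "best two of three resolutions" rule above is easy to express as a tiny tally; the per-game winners in this sketch are placeholders to show the shape of the rule, not the review's actual bench data.

    from collections import Counter

    # Hypothetical per-game winners at 1680x1050, 1920x1200 and 2560x1600.
    # Swap in the real results; the rule is simply "take 2 of 3 resolutions".
    results = {
        "Age of Conan": ["GTX 275", "GTX 275", "GTX 275"],
        "CoD: WaW":     ["HD 4890", "HD 4890", "GTX 275"],
        "Fallout 3":    ["HD 4890", "HD 4890", "HD 4890"],
    }

    overall = Counter()
    for game, winners in results.items():
        game_winner, wins = Counter(winners).most_common(1)[0]
        if wins >= 2:
            overall[game_winner] += 1
            print(f"{game}: {game_winner}")

    print(dict(overall))  # e.g. {'GTX 275': 1, 'HD 4890': 2}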
SiliconDoc - Monday, April 6, 2009 - link
The funny thing is, if you're in those games and constantly looking at your 5-10 fps difference at 50-60-100-200 fps, there's definitely something wrong with you. I find reviews that show the LOWEST framerate during a game useful when it's a very high resolution and a demanding title - usually more useful when the playable rate is hovering around 30 or below 50 (and dips a ways below 30).
Otherwise, you'd have to be an IDIOT to base your decision on framerates that are very often way above playable on these nearly equally matched cards. WE HAVE A LOT OF IDIOTS HERE.
Then comes the next conclusion, or the follow-on. Since framerates are playable and within 10% at the top end, the things that really matter are: game drivers/stability, profiles, clarity, added features, added box contents (a free game one wants, perhaps).
Almost ALWAYS, Nvidia wins that - with the very things this site continues to claim simply do not matter, and should not matter - to ANYONE they claim - in effect.
I think it's one big fat lie, and they most certainly SHOULD know it.
Note now that NVidia - having released what this site calls their high-resolution driver tweak for 2560x1600 - wins at that resolution, and the review calmly states it doesn't matter much, most people don't play at that resolution - and recommends ATI now instead.
Whereas just prior, for MONTHS on end, when ATI won only the top resolution and NVidia took the others, this same site could not stop ranting and raving that ATI won it all and was the only buy that made sense.
It's REALLY SICK.
I pointed out their 30" monitor bias toward ATI months ago, and they continued with it - but now they agree with me - when ATI loses at that res... LOL
Yeah, they're shysters. Ain't no doubt about it.
Others notice as well - and are saying things now.
I see Jarred is their damage control agent.
JonnyDough - Thursday, April 2, 2009 - link
Why not just use RivaTuner or ATI Tool to underclock OC'd cards?
Jamahl - Thursday, April 2, 2009 - link
How can the conclusion be that the 275 is the leader at this price point? The benchmarks are clearly in favour of the 4890 apart from the extreme 2560x1600 end.
7Enigma - Thursday, April 2, 2009 - link
Deja vu again, and again, and again. I've posted in no less than 3 other articles how bad some of the conclusions have been. There is NO possible way you could conclude the 275 is the better card at anything other than the 30" display resolution. Not only that, but it appears that with the latest Nvidia drivers they are making things worse.
Honestly, does anyone else see the parallel between the original OCZ SSD firmware and these new Nvidia drivers? It seems like they were willing to sacrifice 99% of their customers for the 1% that have 30" displays (who probably wouldn't even be looking at the $250 price point). Nvidia, take a note from OCZ's situation; lower performance at 30" in exchange for better performance at 22-24" resolutions would do you much better in the $250 price segment. You shot yourselves in the foot on this one...
Gary Key - Thursday, April 2, 2009 - link
The conclusion has been clarified to reflect the resolution results. It falls right in line with your thoughts and others', as well as our original thoughts that did not make it through the edits correctly.
7Enigma - Thursday, April 2, 2009 - link
Yup, I responded to Anand's post with a thank you. We readers just like to argue, and when something doesn't make sense, we're quick to go on the attack. But also quick to understand and appreciate a correction.
duploxxx - Thursday, April 2, 2009 - link
Just some thoughts: There is only 1 single benchmark out of 7 where the 275 has better frame rates at the 1680 and 1920 resolutions against the 4890, and yet your final words are that you favor the 275???? Only at 2560 is the 275 clearly the better choice. Are you already in the year 2012, where 2560 might be the standard sales resolution? It is only very recently that 1680 became standard, and even that resolution is high for the global OEM market. Your 2560 is not even a few % of the market.
I think you have to clarify your final words a bit more with your choice... Perhaps if we saw power consumption, fan noise, etc., that would add value to the choice, but for now TWIMTBP is really not enough of a push to prefer the card, and I am sure the red team will improve their drivers as usual too.
Anything else I missed in your review that could counter my thoughts?
SiliconDoc - Monday, April 6, 2009 - link
Derek has been caught in the "2560 wins it all no matter what" mindset after months on end of ATI taking that cake since the 4870 release. No lower resolutions mattered for squat since ATI lost there - so you'll have to excuse his months-long brainwashing.
Thankfully Anand checked in and smacked it out of his review just in time for the red fanboys to start enjoying lower-resolution wins while nvidia takes the high-resolution crown, which is - well... not a win here anymore.
Congratulations, red roosters.
duploxxx - Thursday, April 2, 2009 - link
Just as an add-on, I also checked some other reviews (yes, I always read anandtech first as my main source of info) and I saw that it is cooler than a 4870 and actually consumes 10% less than a 4870, so this can't be the reason either, while the 275 stays at the same power consumption as the 280. Also, OC parts have already been shown with GPU clocks above 1000...
cyriene - Thursday, April 2, 2009 - link
I would have liked to see some information on heat output and the temperatures of the cards while gaming. Otherwise, nice article.
7Enigma - Thursday, April 2, 2009 - link
This is an extreme omission. The fact that the 4890 is essentially an overclocked 4870 means that with virtually nothing changed you HAVE to show the temps. I still stick by my earlier comment that the Vapochill model of the Sapphire 4870 is possibly a better card, since its temps are significantly lower than the stock 4870 while already being overclocked. I could easily imagine that for $50-60 less you could have the performance of the 4890 at cooler temps (by OC'ing the Vapochill further). Come on guys, you have to give thought to this!
SiliconDoc - Monday, April 6, 2009 - link
Umm, they - you know, the AT bosses - don't like the implications of that. So many months, even years, spent screeching about nvidia rebranding has them in a very difficult position.
Besides, they have to keep up the illusion of superior red power usage, so only after demand will they put up the power chart.
They tried to get away without it, but they couldn't do it.
initialised - Thursday, April 2, 2009 - link
GPU-Z lists the RV790 as having a die area of 282 mm2 while the RV770 has 256 mm2, but both are listed as having the same transistor count.
Warren21 - Thursday, April 2, 2009 - link
Yeah, I don't know why they're playing this off as an RV770 overclock. RV790 is indeed a respin of RV770, but hey, if nV can get by with 1000 different variants on the same GT200... Why not mention the benefits/differences between the RV770 and the RV790? Disappointed.
SiliconDoc - Monday, April 6, 2009 - link
I guess they didn't mention the differences? Tell you what, when ati gets 999 more rebrands and catches up with their competitor, we'll call it even, ok?
In the meantime, the 4870 crossfires with the 4890, and soon enough we'll have the gamer joe reviewers who downclock the 4890 and find it has identical results to a 4870 at the same clocks - at that point the red roosters will tuck their flapping feathers and go home.
I know, it's hard to see it coming when all you can see is a tiny dot of red in a sea of 1000 choices of green. rofl
bill3 - Thursday, April 2, 2009 - link
According to info at other sites, the 4890 has 3 million more transistors (959 million instead of 956 million - very little difference). It also has a somewhat larger die due to tweaks made to allow the higher clocks.
Go to the FiringSquad or Xbitlabs review; both have a certain ATI slide that explains the small changes in detail.
SiliconDoc - Monday, April 6, 2009 - link
" Because they’re so similar, the Radeon 4870 and 4890 can be combined together for mix-and-match CrossFire, just like the 4850 and 4870. "I guess it's not a rebrand.
roflmao
bill3 - Thursday, April 2, 2009 - link
http://www.firingsquad.com/hardware/ati_radeon_489...
The slide is the first clickable pic on that page, actually. Didn't realize we could do links.
bill3 - Thursday, April 2, 2009 - link
Or even better:
http://www.firingsquad.com/hardware/ati_radeon_489...
heh
Proteusza - Thursday, April 2, 2009 - link
Thanks guys, good read. The piece on PhysX kinda mirrors my thoughts on it - it's not worth basing a GPU purchasing decision on it because it affects so few games. If you design your game around PhysX, you end up making a gimmicky game; if you design a good game and think of good ways to let PhysX enhance it, you can make something good like Mirror's Edge.
The way I think about PhysX is based on Amdahl's law, which says that the overall speedup from an enhancement that benefits only a certain class of work is limited by the fraction of time spent on that work. In the case of PhysX, the amount of time spent using it is generally extremely low, and when it is used the effect isn't always noticeable or worth having.
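As a rough illustration of that Amdahl's-law point, here is a minimal sketch in Python; the fractions and the 10x factor are made-up numbers purely to show the shape of the argument, not measured PhysX figures.

    # Amdahl's law: overall speedup when only a fraction f of the work
    # benefits from an enhancement that speeds that fraction up by factor s.
    def amdahl_speedup(f, s):
        return 1.0 / ((1.0 - f) + f / s)

    # If hardware PhysX made the physics portion 10x faster but that portion
    # is only ~5% of the total work, the overall gain is negligible:
    print(amdahl_speedup(0.05, 10))  # ~1.05x, i.e. under 5% faster overall
    # Only a physics-dominated (and likely gimmicky) workload changes that:
    print(amdahl_speedup(0.60, 10))  # ~2.17x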
NVidia's marketing tactics leave a lot to be desired frankly, although I'm not naive enough to say AMD never tries a little marketing manipulation themselves.
Sylvanas - Thursday, April 2, 2009 - link
Why on earth would you compare a newly released Nvidia driver to an ATI driver from December last year - and a hotfix at that? The latest ATI drivers have had substantial improvements in a few games, and surely they would have sent you an up-to-date driver with the 4890 review sample - something's not right there. Also, where were the overclocking comparisons? (Some reviews state a 1 GHz core on the 4890 is no problem.) What about temps and stock cooling fan noise?
7Enigma - Thursday, April 2, 2009 - link
I'm a bit disappointed with the ATI card. That is pretty much the Sapphire Vapochill model with an increased core clock (actually with a slightly slower memory setting). At least the GTX 275 is something different.
bill3 - Thursday, April 2, 2009 - link
Wow, lol... both cards are just rehashes. Calling the Nvidia card "something different" is a hell of a stretch - it's just their same other cards with various clocks twiddled for the trillionth time.
If anything the ATI card brings more to the table, as it offers much more clock headroom (1 GHz is said to be well within reach) due to its redesign, while the Nvidia card is nothing at all new intrinsically (aka it will overclock similarly to the 285). To be fair, Nvidia's better clock-capable models (285) just came out a couple of months earlier instead of now.
7Enigma - Thursday, April 2, 2009 - link
If the 4890 is in fact a respin, then I retract my original comment. My point, if it were just a simple OC, was that they were basically rebranding (binning) parts that could clock higher than the stock 4870 and selling them as a new card. That seems not to be the case, so I can't be at fault if the Anand article didn't address this.
Regardless of whether the Nvidia card is in fact similar to the other offerings, it does have disabled/enabled parts that make it different from the 285 and 260.
I'd still really like to see one of the Vapochill units up against the 4890. I'm pretty confident you could get to the stock 4890 speeds, so it's just a matter of whether $70 is worth the potential to OC much higher than the 4870 (if these 1 GHz core clocks are the norm).
What we really need to see, though, is the temps for these cards at idle/load. That would be extremely helpful in deciding how good they are. For example, if we see the 4890 at its stock speed is significantly cooler than the 4870 (and they haven't done much to the heatsink/fan), then the Vapochill 4870s just don't stand a chance. If we find the 4890s are similar or higher in temp than the stock 4870s, then it seems much more like a rebadge job.
MadMan007 - Thursday, April 2, 2009 - link
If you look at it objectively, the GTX 275 is more of a change than the HD4890, unless there are undercover changes in the latter of which we haven't been made aware. HD4890 = exactly a clock-bumped HD4870; GTX 275 = a 240-SP, 448-bit memory interface GT200b, which was not previously available as a single-GPU card.
bill3 - Thursday, April 2, 2009 - link
Meh... Madman, Nvidia are still just playing around with the exact same modular component sets they have been, not adding anything new. Besides, as even you alluded, it isn't even a new card - it's just half of the exact previously existing configuration in a GTX 295.
But as I said, the 285 is clocked higher than the 280, and I'm assuming Nvidia did a die tweak to get there (at the least they switched to 55nm). They just did it 3 months ago or whenever; ATI is just now getting to it.
But for today's launches, IMO ATI brings more new to the table than Nvidia, ever so slightly.
Snarks - Tuesday, April 7, 2009 - link
ATI was the first with a 55nm core... or did you mean something else?
The GTX 275 is simply a GTX 280 with a few things disabled, is it not?
SiliconDoc - Monday, April 6, 2009 - link
Right, they brought ambient occlusion to the table with their new driver... LOL. Man, I'm telling you.
The new red rooster mantra: "shadows in games do not matter"!
("We don't care nvidia does it on die and in drivers, making it happen in games without developer implementation! WE HATE SHADERS/SHADOWS, who cares!")
I mean you have to be a real nutball. The Camaro car shop doesn't have enough cherry red paint to go around.
I wonder if the red roosters body-paint ATI all over before they start gaming? They probably spray-paint their Christmas trees red - you know, just to show nvidia who's boss...
Unbelievable. Shadows - new shadows not there before - don't matter... LOL
roflmao
Warren21 - Thursday, April 2, 2009 - link
I'm surprised they didn't mention it - maybe they hadn't been properly briefed - but yes, the HD 4890 IS a different core than the HD 4870.
It uses a respin of the RV770 called RV790, which has slight clock-for-clock performance increases and much better power efficiency than the RV770. Case in point: higher clocks yet lower idle power draw. It's also supposed to clock to 1 GHz without too much hassle, given proper cooling.
Live - Thursday, April 2, 2009 - link
This review was kind of a letdown for me. It almost seems Nvidia's sales rep terrorized you so much over the last year that you felt compelled to write about CUDA and PhysX. But just as you said from the beginning, it's not a big deal.
As a trade-off, temperatures, noise and power seem to have gone missing. You talk about Nvidia's new driver, but what about ATI's new driver? Did you really test the ATI cards with "Catalyst 8.12 hotfix" as is stated on the test page?!? Surely ATI sent you a new driver, and the performance figures seem to support that. It is my understanding that ATI has upped their driver performance over the last months just like Nvidia has. No mention of IQ except for Nvidia's new drivers. No overclocking, which I had heard would be one of the strong points of the ATI card, with a 1 GHz GPU a possibility. I know you mentioned you would look at it again, but just crank up the damn cards and let us know where they go.
Don't get me wrong, the article was good, but I guess I just want more ;)
ATI seems to win at "my" resolution of 1680x1050, but then again Nvidia has some advantages as well. Tough call, and I guess price will have to settle this one.
dubyadubya - Friday, April 3, 2009 - link
I agree noise and temps should be in all reviews. So should image quality comparisons. While we are at it, 2D performance and image quality comparisons should really be part of any complete review. It seems frame rates are all review sites care to report.
The0ne - Thursday, April 2, 2009 - link
You and others want more, yet keep bitching about mentions such as CUDA and PhysX. If Anandtech doesn't mention them, someone has to complain about why they weren't included in the test - for example, the recent buyers' guide. And when they do mention them, say they don't do anything much, and leave it alone, there's bitching going on. I really don't get you guys sometimes.
SiliconDoc - Monday, April 6, 2009 - link
Well, it's funny, isn't it - with the hatred of NVidia by these reviewers here. Anand says he has never played Mirror's Edge - but of course it has been released for quite some time. So Anand, by chance, with the red rooster gone, has to try it - of course he didn't want to, but they had to finally mention CUDA and PhysX, even though they don't want to.
Then Anand really does like the game he has been avoiding - it's great, he gets near addicted, shuts off PhysX, notices the difference, turns it back on and keeps happily playing.
Then he says it doesn't really matter.
Same for the video converter. Works great, doesn't matter.
CUDA - same thing: works, doesn't matter. And don't mention folding, because that works better on Nvidia - has for a long time; ATI has some new port, not as good, so don't mention it.
Then ambient occlusion - works great. Shadows, which used to be a very, very big deal, are now - with the Nvidia implementation in 22 top games - well, a "meh".
There are only so many times so many added features can work well, be neat, be liked, and then the reviewer, even near addicted to one game because of the implementation, says "meh", while people still cannot conclude that the MASSIVE BIAS is slapping them in the face.
We KNOW what it would be like if ATI had FOUR extra features Nvidia didn't - we would NEVER hear the end of the extra value.
Anand goes so far as to hope and pray OpenCL hits very soon, because then Havok via ATI "could soon be implemented in some games and catch up with PhysX fairly soon".
I mean you have to be a BLIND RED ROOSTER DROOLING IDIOT not to see it, and of course there is absolutely no excuse for it.
It's like cheering for democrats or republicans and lying as much as possible depending on which team you're on. It is no less than that, and if you don't see it glaring in your face, you've got the very same mental problem. It's called DISHONESTY. Guided by their emotions, they cannot help themselves, and will do everything to continue and remain in absolute denial - at least publicly.
Snarks - Tuesday, April 7, 2009 - link
One is open, one is not. Jesus Christ.
The fact you have to pay extra on top of the card price to use these features is a no-go. You start to lose value, thus negating the effect these "features" have.
P.S. ATI has similar features to nvidia; what they have is nothing new.
SiliconDoc - Tuesday, April 7, 2009 - link
Did you see a charge for ambient occlusion? Here you are: "clucky clucky cluck cluck!"
Red rooster, the LIARS crew.
SiliconDoc - Tuesday, April 7, 2009 - link
One? I count four or five. I never had to pay extra beyond the card cost for PhysX, did you? You see, you people will just lie your yappers off.
Yeah, ATI has PhysX - its own. ROFLMAO
Look, just jump around and cluck and flap the rooster wings and eat some chicken feed; you all can believe each other's LIES. Have a happy lie fest, dude.
bill3 - Thursday, April 2, 2009 - link
Personally, while you bring up good points, I'd much, much, MUCH rather have the thorough explanation of CUDA and PhysX and their relevance that they gave us than the power, heat and overclocking numbers you can get at dozens of other reviews. The former is insight, the latter just legwork.
joos2000 - Thursday, April 2, 2009 - link
I really like the soft shadows you get in the corners with the new AO features in nVidia's drivers. Very neat.
dryloch - Thursday, April 2, 2009 - link
I had a 4850 that I bought at launch. I was very excited when ATI released their Video Converter app. I spent days trying to make it produce watchable video. Then I realized that every website that tested it had the same result. They released a broken POS and have yet to fix it. I did not appreciate them treating me like that, so when I replaced the card I switched to Nvidia. I have gone back and forth, but this time I think I will stick with Nvidia for a while.
duploxxx - Thursday, April 2, 2009 - link
And by buying Nvidia you already knew that you didn't have a POS, so in the end you have the same result - except for the fact that the 48xx series really had a true performance advantage in that price range, so your rebranded replacement just gave you 1) additional cost and 2) really 0 added value. Your grass is a bit too green...
Exar3342 - Thursday, April 2, 2009 - link
"0 added value"? Really? He didn't have a GPU video converter that worked on his ATI card, and now he DOES have a working program with his Nvidia card. Sounds like added value to me. He gets the same performance, pretty much the same price, and working software. Not a bad deal...z3R0C00L - Thursday, April 2, 2009 - link
The GPU converter that comes with nVIDIA is horrible (better than ATi's though).
I use CyberLink PowerDirector 7 Ultra, which supports both CUDA and Stream. Worth mentioning that Stream is faster.
Spoelie - Thursday, April 2, 2009 - link
Is the $30 price tag of Badaboom included in the "pretty much the same price"? If it isn't, then actually there is no added value. You have a converter (value, well, only if your goal is to put videos on your iPod and it's worth $30 to you to do it faster), but you have to pay extra for it. The only thing the nvidia card provides is the ability to accelerate that program; you don't actually get the program.
Hauk - Thursday, April 2, 2009 - link
Congrats ATI, the 4890 is a strong performer! So much chatter about what constitutes rebadging; at the end of the day it's performance that matters. The 4890 does a great job for the money.
The GTX 275 performs well but lacks excitement IMO. Nothing surprising or exciting; we've already seen a 240-shader-enabled GPU on a 260-style interface (x2). If anything, the 285 receives strong competition from both the 4890 and the 275. It makes little sense for it to remain at its price point. Its price should be $300.
Times are tight. Cheers to competition...
slickr - Thursday, April 2, 2009 - link
I can't believe how biased anandtech has become.
I've checked all the other review sites, and in all of them the GTX 275 was winning by a pretty big margin; here it actually loses to the HD4890.
Now I'm not a fanboy for either - I've had 2 nvidia graphics cards and 2 ATI cards, the current one is ATI - but this bias thing can't go unnoticed.
Some investigators must be summoned to deal with anandtech; this has been going on for quite a while now.
z3R0C00L - Friday, April 3, 2009 - link
I see the difference... those "other reviews" used the Catalyst 9.3 drivers.
Anandtech, HardOCP and Firingsquad used the new 9.4 beta drivers.
No bias on Anandtech's part. Rather a bias from those other sites that used the new nVIDIA BETA driver but not the ATi one that has proper support for the 4890.
B3an - Thursday, April 2, 2009 - link
I don't normally take notice of comments like this on here, but it does seem a little like it. It's as if NV have pissed off Anandtech with their dirty tactics (understandable), and Anandtech are being a little biased because of this.
JarredWalton - Thursday, April 2, 2009 - link
I've looked at three reviews (FiringSquad, THG, and HardOCP - also Xbitlabs, but they didn't have the GTX 275 in their results). I'm not quite sure what horribly biased and inaccurate results we're supposed to have, as most of the tests are quite similar to ours. Two sites - HardOCP and FiringSquad - essentially end up as a tie. THG favors the 275, at least at lower resolutions and without 4xAA, but then several of the games they test we didn't use, and vice versa. (The 4890 also beats the 275 there if you run 4xAA at 2560x1600.)
Obviously, we had a lengthy rant on CUDA and PhysX and discussed the usefulness of those features (conclusion: meh), but with all the marketing in that area it was something that was begging to be done. Pricing, availability, and drivers are still areas you need to look at, but it's really a very close race.
If you have reviews that show very different results than what I'm seeing, post the name of the site rather than making vague claims like "I've checked all the other review sites, and in all of them the GTX 275 was winning by a pretty big margin; here it actually loses to the HD4890."
SkullOne - Thursday, April 2, 2009 - link
That's different than what I've seen. I dunno what sites you visit, but all of the ones I've been to show them just about neck and neck, or the 4890 just edging out the 275. Personally I give the edge to the 4890 due to its high overclockability.
Spoelie - Thursday, April 2, 2009 - link
p=2: "As we can clearly see, in the cards we *r*ested"
p=11: "particles are one of the most difficult things to do on the CPU" *thanks*
Drivers? test table says 8.12 hotfix but we're at 9.3/9.4 now...
Yojimbo - Thursday, April 2, 2009 - link
First you say you aren't concerned about the 4890 being a rebadge because at the end of the day it's performance that matters, and then you say the GTX 275 lacks excitement because "we've already seen a 240-shader-enabled GPU on a 260-style interface (x2)" - whatever the significance of already seeing that is. Aren't these contradictory statements?
SiliconDoc - Thursday, April 23, 2009 - link
Of course it's contradictory, it's a red rooster statement. Then the 3M extra transistors they use just to crank a few more MHz is "the rework"... LOL
Pay homage to the red and hate green and spew accordingly with as many lies as possible or you won't fit in here - it's been like that for quite some time. Be smug and arrogant about it, too, and never admit your massive errors - that's how to do it.
Make sure you whine about nvidia and say you hate them in as many ways as possible, as well - be absolutely insane mostly, that's what works - like screaming they can take down nvidia when the red rooster shop has been losing a billion a year on a billion in sales.
Be an opposite man.
Of course it's contradictory. Duhh.. they're insane man - they are GONERS.
lk7900 - Monday, April 27, 2009 - link
Can you please die? Prefearbly by getting crushed to death, or by getting your face cut to shreds with a
pocketknife.
I hope that you get curb-stomped, f ucking retard
Shut the *beep* up f aggot, before you get your face bashed in and cut
to ribbons, and your throat slit.
http://www.youtube.com/watch?v=QGt3lpxyo1U">http://www.youtube.com/watch?v=QGt3lpxyo1U
I wish you a truly painful, bloody, gory, and agonizing death, *beep*
Veteran - Thursday, April 2, 2009 - link
BTW, the 4890 is not a rebadge or something; it is an improved core (check Xbitlabs) - it has 3M more pixels and about 22 sq mm more die size.
Veteran - Thursday, April 2, 2009 - link
Pixels should be transistors, of course...
GamerBad - Thursday, August 5, 2010 - link
I am not sure what all the conversation here is about. I will tell you a bit about my graphics cards.
First,
I am a GeForce man through and through. I will tell you why.
I have never purchased a GeForce card that was faulty. Luck?
My current computer is running all Asus. Twin ATI Radeon 4890s...
and so far... I am on my third replacement graphics card. The first one had memory problems. The second was DOA... the third... well... this one overheats and crashes.
The Radeon may be better than the GeForce when it works. I really don't notice a difference.
So to me.. it is quality of craftsmanship that makes the difference.
Currently I am very unhappy with Radeon because I built my new system for this graphics setup. My Asus motherboard doesn't support dual GeForce, only Radeon.
It seems I am stuck sending my graphics cards back and praying eventually I will get one that is not a lemon.