88 Comments
SiliconDoc - Sunday, June 7, 2009 - link
The usual red rooster line I see:
"It's all about cost benefit. Certainly it's a benefit to have smaller GPUs as they cost less to make."
---
YES, WE KNOW, YOU IDIOT RED ROOSTERS - ATI HAS LOST A BILLION A YEAR ON THEIR COST-EFFECTIVE TINY GPU DIE SALES.
THEY LOSE 33% OF THE SALE PRICE ON EVERY DING DANG VIDEO CARD THEY SELL, ON AVERAGE.
That means if you buy a $200 lovely driver-corrupted red rooster card, ATI loses $66 for AMD and themselves!
---
WHAT a great advantage they have in that die size!
---
MEANWHILE, DURING THOSE BILLION-DOLLAR ATI LOSS YEARS, NVIDIA HAS BEEN CRANKING OUT A PROFIT! $$$
---
Oh well, nice BS anyway, Derek! Hope it makes all your fantasies real for you, because that's the only place they are real: in your head.
srikar115 - Saturday, May 30, 2009 - link
At stock it's the same performer as the 4850, but cheaper, cooler and more overclockable - this will definitely hamper sales of the 4850. Clocks reached as high as 1000/1200 with voltage adjustments... I just love this card.
http://pcgamersera.com/forum/topic.php?id=16
PC Reviewer - Wednesday, May 13, 2009 - link
I believe that the newer 4890s are better performance for the cost... it's only marginally more expensive yet offers more RAM for larger resolutions and more power.
http://pcreviewer.org/new-radeon-hd-4890-video-car...
Sunsmasher - Saturday, May 2, 2009 - link
It's quite apparent that the manufacturers are TRYING to create confusion for less sophisticated buyers so that they cannot easily compare price/performance factors of cards.
This is especially true of Nvidia. (Can anybody explain why the 8800 magically became the 9800?)
Obviously an attempt to create a "new" card without really changing anything.
PrinceGaz - Wednesday, April 29, 2009 - link
[quote]The fact that the AMD hardware is leading here is not unexpected. But we do see that the NVIDIA GTS 250 looks a little bit CPU bound at lower resolutions.[/quote]
I wouldn't say it is CPU bound, as the AMD cards are not at all CPU bound at considerably higher framerates. Rather, the nVidia card is for some reason being capped at 60 fps, almost certainly by its driver (or possibly some setting in it). Most LCD displays can accept a refresh rate of 75 Hz (at certain resolutions) as well as 60 Hz, so you could always try that to see whether the driver is limiting the framerate to the refresh rate.
rickshaw - Wednesday, April 29, 2009 - link
This card looks amazing for the price. I am wondering now if this card is any better than the HD 3870 - should I get it to replace the HD 3870? Any comments from the experts?
Lummox - Thursday, May 7, 2009 - link
With two of these cards for $220 minus rebate you will smoke anything else at this price, including a 280. See Tom's for their review.
VooDooAddict - Wednesday, April 29, 2009 - link
I'm drooling over the power usage:
Top-end i7
6 DIMMs
4770
under 220 W
I think I found the new Crossfire setup for my Shuttle!
Wreckage - Tuesday, April 28, 2009 - link
How come the 4770 does so much better in the AnandTech review than in the Guru3D, HardOCP, Xbit Labs, Tom's Hardware and several other reviews?
mmpalmeira - Wednesday, April 29, 2009 - link
I guess it is because the games they tested with favor the ATI hardware, especially GRID and AoC. The cards' performance is very similar, so the games chosen for the test will determine which card comes out on top.
iwodo - Tuesday, April 28, 2009 - link
The GTS 250 is not doing too badly at all. It is a 55nm part and quite old by now.
I am sure that if Nvidia made some small tweaks, shrank it to 40nm and added GDDR5, it would be much more competitive, if not better than the 4770.
Veteran - Tuesday, April 28, 2009 - link
Then it would be even more expensive... plus power consumption wouldn't change much because of GDDR5.
quanta - Tuesday, April 28, 2009 - link
According to the PC Perspective review[1], it has only 8 ROPs, and would only have 16 texturing units instead of 32.
[1] http://pcper.com/article.php?aid=700&type=expe...
kevinkreiser - Tuesday, April 28, 2009 - link
With the die shrink I was really hoping for a single-slot card this time. Seriously, will we ever see decent single-slot cards again? I know other manufacturers like Gigabyte and HIS have slightly slimmer designs, but the coolers are still too big to fit in a single slot.
frowny - Tuesday, April 28, 2009 - link
I imagine it's because the number of people who hate dual-slot GPUs is so minuscule that it doesn't make sense to cripple your card's factory clocks just to cater to them. You can always get a 4670 or another low-end card if you want a single-slot one.
kmmatney - Tuesday, April 28, 2009 - link
Well, after reading the reviews, I don't find the card terribly exciting, except for the low power levels. At $110, this card is more of a triumph for ATi than it is for consumers. It is very cheap to make, with fewer pins for the 128-bit memory bus and cheaper power circuitry. It's a nice card, and overall the best card at its price point, but just barely. The main purpose of this card was to be more profitable for ATi, and it's done a great job at that.
A low-cost HD4830 is a better deal for many people. It can easily overclock to HD4850 speeds (you have to go higher than actual HD4850 clocks to get the same performance, due to the lower number of shaders). Also, there are 800SP HD4830 cards out there (or so I've heard).
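A rough sketch of the clock math behind that overclocking point, using the commonly cited shader counts and the HD4850's stock core clock (these figures are assumptions for illustration, not numbers from the article):
[code]
# Shader throughput scales roughly with shader_count * core_clock (same architecture assumed).
# Commonly cited specs, not taken from the review:
HD4850_SHADERS = 800
HD4850_CLOCK_MHZ = 625
HD4830_SHADERS = 640

# Core clock an HD4830 would need for shaders * clock to match a stock HD4850.
required_clock_mhz = HD4850_SHADERS * HD4850_CLOCK_MHZ / HD4830_SHADERS
print(f"HD4830 needs ~{required_clock_mhz:.0f} MHz to match HD4850 shader throughput")
# ~781 MHz, i.e. well above the HD4850's own 625 MHz, which is the point made above.
[/code]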
The mood I get from the AT article - ho hum, and that's my impression as well. The low power draw is very nice, though (see the XbitLabs article for the actual power draw of the card, instead of the whole system). It's very impressive in the power dept., but that's about it.
frozentundra123456 - Tuesday, April 28, 2009 - link
Good post. This is pretty much my feeling also, but you explained it better.
tomoyo - Tuesday, April 28, 2009 - link
I think there's too much focus on the name of the video card. It's honestly perfect for the situation. They didn't rename an old GPU, and it's obviously not the same type of chip as a 48xx. Also, it's most certainly targeted at the midrange, which a 47xx sounds like a perfect name for. I'd say AMD is doing a much, much better job at naming chips than Nvidia, which has done far too many renames of the same chip.
Liujia - Tuesday, April 28, 2009 - link
Hi everyone, I come from mainland China - nice to "meet" you all. Although I have known AnandTech for a long time, this is my first message here. ^_^
I'm glad to see ATI becoming stronger and stronger; it brings healthy market competition and gives us good performance-per-price low-end cards (actually not only cards). That's really good for most Chinese users, because most young guys' earnings aren't very high - just like me.....
Due to previously high costs, not everybody owns a PC in China, especially in the rural areas that hold 80% of China's population..... Oh god, can you imagine how many computers we still need? We must energetically promote IT and use it to accelerate industrialization and multimedia education in less developed areas. Mr. Obama said: you still have a lot of work to do.
So do we ^_^
Fortunately, right now:
$20 LE-1150 + $30 690G (or maybe $50 780G) = what a perfect entry level, for us.
$50 4600+ + $55 770 + $55 HD4650 = what a perfect middle level, for us.
$120 X3 710 + $90 790GX + $95-120 HD4830/4850 = what a perfect player level, for us.
Here, I'd like to give my full support to the HD4770.
Go, AMD-ATI~
BTW, go Houston Rockets~! haha
armandbr - Tuesday, April 28, 2009 - link
To me the biggest issue with this card is the memory size. I heard there are going to be cards with 1 GB of RAM.
I hope there will be.
Luminair - Tuesday, April 28, 2009 - link
Whose power readings are correct, AnandTech's or HardOCP's?
tomoyo - Tuesday, April 28, 2009 - link
Yeah, the power readings are all over the map here. Xbit Labs has the 4770 with MUCH lower idle power than the 4830, while AnandTech shows the 4770 consuming more at idle. Something is strange with the power measurements of this graphics card.
RagingDragon - Wednesday, May 13, 2009 - link
Everybody is measuring system power consumption at the wall socket instead of GPU power consumption; therefore, power consumption varies since each site uses different CPUs, motherboards, power supplies, etc.
kmmatney - Tuesday, April 28, 2009 - link
I haven't read the Xbit Labs article yet, but they normally use a technique where they directly measure the power draw of the card rather than the entire system. I believe they are more accurate. I'll read their article now, since it probably has OVERCLOCKING!!! How could AT miss that - at least OC and run one benchmark so we can see the percentage gain.
tomoyo - Tuesday, April 28, 2009 - link
Looking at HardOCP's power results, they don't make sense. Supposedly their system without a video card is 46 watts from the wall and the system with a 4770 is 53 watts? There's no way the idle draw is 7 watts for the video card including PSU inefficiencies. This contradicts both Xbit Labs and AnandTech... which already contradict each other regarding power usage. Basically, none of the power results can be trusted, which is really annoying.
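A back-of-the-envelope check of why that 7 W wall delta looks implausible; the PSU efficiency figure here is an assumption for illustration, not a measurement:
[code]
# HardOCP's reported idle figures, measured at the wall socket.
wall_without_card_w = 46
wall_with_card_w = 53
wall_delta_w = wall_with_card_w - wall_without_card_w  # 7 W difference at the wall

# Assume a typical PSU efficiency at light load (assumption only).
psu_efficiency = 0.80

# DC-side power the card itself would be drawing if the wall delta were the whole story.
implied_card_idle_w = wall_delta_w * psu_efficiency
print(f"Implied card idle draw: {implied_card_idle_w:.1f} W")
# ~5.6 W at idle for a whole graphics card, far below what other sites report.
[/code]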
Korr - Tuesday, April 28, 2009 - link
What a terrible piece. This is the exact reason why I rarely visit AT anymore. One resolution per game tested? No overclocking? One page of text to whine about the name?
Please never review computer hardware again.
aeternitas - Wednesday, June 10, 2009 - link
I guess simple graphs are still too complex for some people to understand. lol
JarredWalton - Tuesday, April 28, 2009 - link
Don't know about you, but I see resolution scaling charts showing 1280x1024, 1680x1050, and 1920x1200.
7Enigma - Wednesday, April 29, 2009 - link
B...B...But the other resolutions don't have pretty bar graphs! Seriously, don't feed these trolls. If they can't even READ the review before blasting it for things it includes, they don't deserve to get answered.
AtenRa - Tuesday, April 28, 2009 - link
Personally I don't mind about the naming of the card.
Although this card is the first 40nm part, and it's the first time GPUs are manufactured on a smaller process than CPUs, I don't see the reviewers being excited about this product. I mean, this review feels like it was made in 15 minutes just to have it on the site. No O/C and no CrossFire results. Have you ever wondered whether 2 x 4770 are faster than a 4890? How about 3 x 4770?
coldpower27 - Wednesday, April 29, 2009 - link
Yeah, it's kinda odd how GPUs simply skipped the 45nm base node this time around. I guess it's good in a way - quicker progress. It's also much needed considering how much MORE core logic GPUs have than CPUs, for which over 50% of the transistor budget is cache.
This is basically a test shuttle for the 40nm process - a small, simple part for high yields, to work out the kinks before deploying complex parts on this new process.
Sorta like G92b for Nvidia
RagingDragon - Wednesday, May 13, 2009 - link
AMD and Nvidia GPUs are fabbed by TSMC. I don't think TSMC has a 45nm process - they jumped to 40nm instead, which seems sensible to me: timewise, TSMC's 40nm process is entering production almost halfway between Intel's 45nm and 32nm processes.
armandbr - Tuesday, April 28, 2009 - link
Here are CrossFire numbers:
http://www.matbe.com/articles/lire/1421/radeon-hd-...
Exar3342 - Tuesday, April 28, 2009 - link
I see no reason why this couldn't be in a single-slot solution. That is what everyone really wants... I would grab 3 of these if they were available in such a way.
AmazighQ - Tuesday, April 28, 2009 - link
Really, don't post a review as bad as this. You make the 4770 look like any other card, while its performance-to-price ratio is even greater than the 4850's.
Final point: this review failed miserably.
frowny - Tuesday, April 28, 2009 - link
Why are you guys focusing on 4770 vs GTS250? The correct comparison is 4770 vs 9800GT, since those are at the same price points.
frozentundra123456 - Tuesday, April 28, 2009 - link
This card seems to be kind of in limbo to me. It isn't a performance leader, but it is still not particularly low in power consumption. It still requires a power connector and is dual slot (dual slot???). In performance it is also bracketed by the 4830 and 4850. The price is also not outstanding.
To showcase the 40nm process, I would have thought that AMD would have had either a higher-performance card, or an improved 4670-type card that required no power connector and was single slot.
At this point, I would choose a 4670 for low power and no connector required or go with a 4850 or 4870 for better performance.
FireSnake - Tuesday, April 28, 2009 - link
"or go with a 4850 for better performance"Read the article first, and stop writing nonsense!
frozentundra123456 - Tuesday, April 28, 2009 - link
Don't be rude. I did read the article. It states that the 4770 is faster than the 4830. I took this to mean that the 4850 was faster than the 4770. Looking closely at the graphs, it is faster in some games, but not in others. I don't mind people pointing out mistakes, but you can be nice about it.
RagingDragon - Wednesday, May 13, 2009 - link
The 4770 is going to replace the 4830, which will be (or has been?) phased out of production. The card is intended for gamers wanting more performance than a 4670 but who don't want to pay for a 4850. Looks to me like the target market is gamers with 1680x1050 panels. For lower resolutions less expensive cards would make more sense; for 1920x1080 and 1920x1200 the 4850, 4870 or 4890 makes more sense; and if you want to game at 2560x1600 you'll probably want a dual-GPU solution...
yacoub - Tuesday, April 28, 2009 - link
A 256-bit version (and thus able to run lower clockspeeds but get the same performance) would make a great passively-cooled GPU.
PrinceGaz - Wednesday, April 29, 2009 - link
By 256-bit version, do you mean the memory-bus width? All that would achieve is allow them to use slower memory (GDDR3 instead of GDDR5) and would have no effect on the speed the GPU itself needs to be clocked at, and therefore the temperature it would run at.
Speaking of which, I don't think temperatures were mentioned in the article (goes to check).
Veteran - Tuesday, April 28, 2009 - link
I read the review completely and I clearly saw the 4770 leading the GTS 250 in most games. After reading the conclusion I was a little bit disappointed with AnandTech. The 4770, which costs around $100, clearly outperforms a $130+ part while consuming less power. Still, the conclusion is not very positive... How can this happen?
This is one of the best cards at the moment price/performance-wise, so why doesn't AnandTech recommend this card? Scared to lose the good relationship with nVidia? Over the last couple of months you can clearly see where AnandTech is going... It's really sad, since this was one of the best sites around.
crimson117 - Tuesday, April 28, 2009 - link
You don't think this is an AMD-supportive conclusion?
"It isn't clear when NVIDIA will have a part in this generation of their architecture that competes in the near $100 market, but in the meantime the option is certainly clear: the Radeon HD 4770 is the way to go for now."
flipmode - Tuesday, April 28, 2009 - link
Maybe you're high? Read:
"As for the competition, the 4770 comes out on top in the games we tested."
"the option is certainly clear: the Radeon HD 4770 is the way to go for now."
What more do you want? An all-out denouncement of Nvidia?
strikeback03 - Tuesday, April 28, 2009 - link
Well, checking Newegg right now, there are 4 4770 options available (interestingly enough, all using the same non-reference cooler), all at $109.99. There are 12 GTS250 512MB cards, one of which is $120 shipped, and three others at $110, $115, and $120 with rebates. Given the relatively small performance difference and the relatively small price difference, a relatively mild recommendation seems warranted. I'd imagine this will end up becoming a case of "Pick whichever brand you like better or whoever has the better price at the moment." I personally would pick an nvidia card for an extra $10 just due to the driver issues I have experienced with AMD.
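A quick value comparison along those lines, using the prices quoted above; the relative performance figure is only an assumption for illustration, not a benchmark result:
[code]
# Price per unit of performance, normalized to the HD 4770 (rel_perf = 1.0 baseline).
# The GTS 250 figure assumes it is roughly 5% slower on average, which is a guess.
cards = {
    "HD 4770":       {"price": 109.99, "rel_perf": 1.00},
    "GTS 250 512MB": {"price": 110.00, "rel_perf": 0.95},  # cheapest after-rebate price above
}
for name, c in cards.items():
    print(f"{name}: ${c['price'] / c['rel_perf']:.2f} per HD 4770-equivalent performance")
# With prices this close, the winner flips with every rebate, hence the mild recommendation.
[/code]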
aapocketz - Wednesday, April 29, 2009 - link
[quote]interestingly enough, all using the same non-reference cooler[/quote]
Yeah, I can't find any that use the reference cooler reviewed in the articles on this site or others. If it's going to have a dual-slot solution, I would prefer that it vents the exhaust outside the case! The cooler that's on Newegg is ugly too... I would have picked one up today if it looked like the ones in the article!
balancedthinking - Tuesday, April 28, 2009 - link
Do not forget the $10 mail-in rebates at Newegg, so the 4770 IS cheaper.
The 4770 also uses way less power, and it is even stranger to see Derek bitching about the naming of the card instead of looking at its OC ability.
The German site pcgameshardware.de was able to reach a massive overclock, leading to a stunning 25% performance improvement.
RagingDragon - Wednesday, May 13, 2009 - link
I guess the reviewer didn't have time for OC testing, and thus chose to fill the space with ranting instead.
strikeback03 - Tuesday, April 28, 2009 - link
Those $10 MIRs were not there when I checked before writing that. I also wouldn't be surprised to see bigger rebates on the nvidia hardware within a day or two.
balancedthinking - Tuesday, April 28, 2009 - link
http://www.pcgameshardware.de/aid,682645/Test-Ati-...
Zstream - Tuesday, April 28, 2009 - link
It is because Derek wrote the article. As long as he is benchmarking and testing, the AMD bashing will continue. Just get used to it.
Jamahl - Tuesday, April 28, 2009 - link
The important thing here is that there is no attempt by ATI to deceive the public.
All the 48xx cards have 256-bit buses, and they also have higher bandwidth than the 4770. However, the 4770 has much more in common with a 4870 than it does with a 4830 or 4850, namely clock speeds and GDDR5. It just so happens that its performance level drops in at the 4850 mark due to the smaller 128-bit bus and slightly fewer shaders.
If you look at it logically, it's a cut-down 4870. 4770 is the best name that could have been chosen for it, taking everything into consideration.
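A quick bandwidth sketch illustrating that point, using commonly cited memory configurations (assumed specs, not figures from the review):
[code]
def bandwidth_gb_s(bus_width_bits, effective_rate_mt_s):
    """Peak memory bandwidth: bus width in bytes times effective transfer rate."""
    return bus_width_bits / 8 * effective_rate_mt_s / 1000

# Commonly cited memory setups (assumptions for illustration):
cards = {
    "HD 4770 (128-bit GDDR5, 3200 MT/s)": (128, 3200),
    "HD 4830 (256-bit GDDR3, 1800 MT/s)": (256, 1800),
    "HD 4850 (256-bit GDDR3, 1986 MT/s)": (256, 1986),
    "HD 4870 (256-bit GDDR5, 3600 MT/s)": (256, 3600),
}
for name, (bus, rate) in cards.items():
    print(f"{name}: {bandwidth_gb_s(bus, rate):.1f} GB/s")
# The 4770's fast GDDR5 roughly makes up for its narrow bus, landing it near the 4830/4850.
[/code]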
coldpower27 - Wednesday, April 29, 2009 - link
I think it kinda makes sense.
The first digit explains the generation of technology.
4 = RV7xx Series with DX10.1 Technology. (Current)
3 = RV6xx Series with DX10.0 Refresh Line
2 = R600 Series with DX10 Original Line
The second digit explains where the product Series slots in with regard to everything else...
8 = Performance
7 = Mainstream Refresh
6 = Mainstream
5-3 = Budget...(this should be consolidated)
The Third digit is an isolated variable and explains where it slots within this particular series..
7 = Top Card.
5 = Middle Card.
3 = Low Card.
The last digit currently serves no purpose...
It's a bit simpler, I will admit, than Nvidia's nomenclature.
ATI has 7, 5, 3 for defining the top, middle and low end of a particular range.
I will try to insert an analogous numbering concept for Nvidia.
Nvidia has GTX+ = 8, GTX = 7, GTS = 6, GT = 5, GS = 3, GSO = 2.
I think... very complicated compared to 3 SKUs for ATi. At least Nvidia is trying to make some headway into building a unified product line with the GTS 250, GTX 260, GTX 275, GTX 285...
GTX 260 Core 216 should have been GTX 265, for simplicity's sake...
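As a toy illustration of the scheme laid out above, here is a small decoder for the digit meanings described in this comment (the mapping is just the one given here, not an official AMD definition):
[code]
def decode_radeon_model(model: str):
    """Split a 4-digit Radeon HD model number into the fields described above."""
    generation = int(model[0])   # 4 = RV7xx/DX10.1, 3 = DX10 refresh, 2 = R600
    series = int(model[1])       # 8 = performance, 7 = mainstream refresh, 6 = mainstream
    tier = {"7": "top", "5": "middle", "3": "low"}.get(model[2], "unknown")
    return generation, series, tier

print(decode_radeon_model("4770"))  # (4, 7, 'top')    - top card of the mainstream refresh line
print(decode_radeon_model("4850"))  # (4, 8, 'middle') - middle card of the performance line
[/code]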
flipmode - Tuesday, April 28, 2009 - link
I agree with the author in his criticism of the name chosen for the product.
4845 if you want to end odd
4840 if you do not care
Naming these products should not be this hard. What AMD and Nvidia need to be passionate about is helping the buyer make sense of it all.
Also - like the author said - why the hell bother with the 4th digit if it is just going to be -0- all the time. Put that son of a bizzle to work.
flipmode - Tuesday, April 28, 2009 - link
Um yeah, the author seems to think the dual slot cooler is an inconvenience, but I'd LOVE to have a dual slot cooler that pumps the hotness outside of the case for me. Most mobo makers are careful enough about the slot below the PCIe16 slot because, believe it or not, they've thought about the fact that a dual slot graphics card might be put in there.
JimmiG - Tuesday, April 28, 2009 - link
Many who buy these lower-end cards have more compact systems, sometimes with m-ATX boards.
When I got my 4850 last year I had such a board, meaning a dual slot card would have blocked one of the two PCI slots, and made the last one nearly useless since the PCI card would sit right against the videocard, possibly blocking the fan.
Even though I've got a full ATX board now, the lack of hot air exhaust doesn't bother me. The two 80mm fan+PSU fan take care of getting the hot air out of the case. In fact they move much more hot air than the small GPU fan.
There will probably be variations just like with all other cards so those who want single slot will be able to find one. Prices will also go below $100 in a matter of weeks, IMO.
Griswold - Wednesday, April 29, 2009 - link
This isn't a lower-end card. It's a mid-range card. Low-end cards are often passively cooled. The single-slot cooler on the 3850 and 4850 is miserable for this type of GPU. But I'm sure there will be single-slot models of the 4770 as well...
flipmode - Tuesday, April 28, 2009 - link
I've got mATX too. AMD needs to focus on providing the best possible cooling and not make compromises for mATX. And when these cards arrive with bad ass coolers that keep the chips cool, the noise down, and the case temps down, it is too bad they get looked on with scorn. The consumer made his/her bed when choosing a motherboard and case. And even still, this card, dual slot and all, will fit in an mATX setup.
That's just my feeling. I am thrilled to see high quality coolers attached to a $100 card. That's wonderful!
PrinceGaz - Wednesday, April 29, 2009 - link
I agree. Far from not liking a dual-slot cooler, I love them. Not only do they keep the GPU itself cooler, but they also improve the overall system airflow. Given the choice, I'd always pick a dual-slot cooler over a same priced card with a single-slot cooler.
The only possible reason for choosing a single-slot cooler would be if you wanted to use the slot right next to the gfx-card, which means the already inferior single-slot cooler is made even worse because there is now another card sitting millimetres from the GPU fan.
Proteusza - Tuesday, April 28, 2009 - link
Looks like a good card. Hopefully the lessons learnt from this 40nm process will enable future AMD graphics cards (or even the 4870) to use it.
One thing though - in light of the fact that the 4830 will be dropped, the naming scheme makes sense.
Relative performance is now:
4830 -> 4770 -> 4850 -> 4870
It will become
4770 -> 4850 -> 4870
And everyone is happy again. At least it's not misleading - I mean, the 4770 really is a different card altogether from the 4800 series, so it's good that its name reflects that. Pity it requires an external power connector, but at least it isn't very power hungry.
AmazighQ - Tuesday, April 28, 2009 - link
And the fact that the 4830 was only a temporary solution to fill the $100 gap is and was very well known.
Here is a better review of the HD 4770: http://www.xbitlabs.com/articles/video/display/rad...
Amiga500 - Tuesday, April 28, 2009 - link
If AMD had named it the 4750 we'd all be happy.
4750 -> 4850 -> 4870
Griswold - Wednesday, April 29, 2009 - link
Why? There will be a 4750 with GDDR3 and lower core clock speed...
wit p - Tuesday, April 28, 2009 - link
It seems that power management in these 40nm parts is completely redesigned. Could we (the readers, of course) learn more about this? A TDP of about 80W versus a delta (full load minus idle) of just above 40W? I wonder if the PCIe power connector and two-slot cooling aren't just precautionary features... Maybe AMD/ATI couldn't properly estimate the new parts' power consumption?
IMO: a perfect first hit; I don't even regret those famous $10 ;) The 48xx must be sold ;)
Zoomer - Tuesday, April 28, 2009 - link
There were many issues regarding TSMC's 40nm node. Apparently, there were many issues with leakage. ATi probably wanted to err on the side of caution after the adequate but much maligned 4850 cooler.
FireSnake - Tuesday, April 28, 2009 - link
.... it is an excellent part.
And I don't know how you can talk about $110, when this part is available in Europe (which tends to be more expensive, I don't know why) for 89€ (listed for 87€)!
http://geizhals.at/eu/a426956.html
And instead of bitching about the name..... you would do better to test overclocking capabilities.
evilspoons - Tuesday, April 28, 2009 - link
Maybe because 89€ is $115 USD when you apply the exchange rate. Just a thought.
Griswold - Wednesday, April 29, 2009 - link
Just that you didn't include (or rather subtract) the 19% VAT they pay in Germany in your equation, which is included in the 89€.
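A quick sketch of that correction; the exchange rate is an assumed 2009-era figure, not an official one:
[code]
# Convert a German street price (VAT included) into a US-style pre-tax price.
price_eur_incl_vat = 89.0
vat_rate = 0.19      # German VAT mentioned above
eur_to_usd = 1.30    # assumed exchange rate around the time of the review

price_eur_net = price_eur_incl_vat / (1 + vat_rate)   # ~74.8 EUR before VAT
price_usd_net = price_eur_net * eur_to_usd             # ~97 USD
print(f"{price_eur_net:.2f} EUR net of VAT -> about {price_usd_net:.0f} USD before sales tax")
[/code]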
balancedthinking - Tuesday, April 28, 2009 - link
I agree; this seems like Derek trying very hard to criticize at least something.
It is even funnier to lash out at AMD for using a lower number for a better-performing card. Compare that to the competition, which renames with higher and higher numbers while the performance does not change. Talk about "understatement" versus "deceiving".
JPForums - Tuesday, April 28, 2009 - link
I second the naming bashing. You spent a lot of time bashing a naming system that makes things short and simple. Frankly, it makes a lot more sense than the previous standard of stringing along an ever-increasing number of words.
You would never expect a 43xx series card to beat a 38xx series card. Even though it has a higher number, it simply isn't going to happen. However, it is ludicrous to expect ATI to keep numbering their low-end cards in the 3xxxs until their performance can match the highest-end 3xxx cards. Even if they did, given the different architectures, you would just complain that what they deem better doesn't match up with what you do. Further, it would quickly get very confusing trying to pick out the new cards from the old.
Categorizing the cards by architecture, then performance, makes much more sense than categorizing them by performance alone. This automatically distinguishes newer cards from older ones. This also tells you a little about where the card might be strong/weak and what you stand to gain by overclocking. (A card with twice the hardware will gain much more per MHz overclock than the already higher clock one with half the hardware assuming similar architecture)
That all said, there are some things ATI could do to make the current scheme better, e.g. designate x2xx, x5xx, and x8xx as the low, medium, and high end (assuming 3 basic architectures are used), then use x3xx, x6xx, and x9xx for the same purpose on the refresh.
Hopefully ATI can refine their naming scheme to address both architecture and performance considerations equally as you wish. However, to say that the current naming scheme is in the same class as GeForce GTX260 Core 216 is going overboard. Until they get a scheme that can address both considerations equally, I'd prefer the current architecture based distinctions over a simple performance only scheme.
Oh, and did you really need to spend that much time complaining about such a minor issue? Address the issue, then move on to something more interesting, like overclocking.
Alternately, address other naming scheme issues in equal length. nVidia's GeForce GTX260 Core 216 comes to mind. How about the GeForce 9800GTX that was slower than the 8800GTX. Or maybe the 9800GT that was slower than the 8800GTS. I always hate having to explain to people what the difference between my GTX260 Core 216 and the regular GTX260 is (especially when they don't remember the next time around). Also, try explaining to an average Joe that the second highest end of the current generation (at the time) is slower than the second highest end of the previous generation.
Personally, I wouldn't even put it in a review, but rather, I'd send my concerns directly to the company and let them work it out. If that didn't work, I'd put it in a rant not a review.
Seramics - Thursday, April 30, 2009 - link
Very well said... I just don't know why this article spent so much time criticizing the 4770 name, which I think is completely fine. What's wrong with a 70-series card of the 700 family outperforming a 30-series card in the 800 family of the same 4000 generation? It is completely inadequate to base naming purely on performance, for the reasons mentioned above. If anything, they would be better off starting a flaming article about Nvidia's naming scheme, which is very, very deceiving... How about G92 being designated as GTX 280M in the mobile sector? Not to mention the GTS250 being a renamed 9800GTX+. The + is a weird symbol in the naming in the first place. For generations it's been just suffixes, and now there's a "+" sign. Why isn't there a "-" sign then for lower-end products? How about the 8800GS being an 8800GT-? Sounds good? And what about GTX260 Core 216? Why not GTX280 Core 240? GTS250 Core 128? So confusing and no standardization. Nvidia naming is all over the place, nonsensical. It doesn't even sound good.
RagingDragon - Wednesday, May 13, 2009 - link
I wish Nvidia had named the GTX260 respin GTX270 instead of "GTX260 Core 216". And I agree that the mobile names are downright deceptive.
But a new 4770 (slightly) outperforming an end-of-life 4830 doesn't really bother me. I could see it being a bit annoying if the 4830 weren't being phased out, but as it is, the naming complaint is trivial and petty - not worth wasting so much space in a review. Besides, if AMD had called it 4840 instead of 4770, many people (though perhaps not AT) would accuse AMD of deceptively passing off the cheaper RV740 as an RV770. Furthermore, it's probably possible to concoct benchmarks which "prove" the 4830 (with slightly more memory bandwidth) is faster than the 4770.
Besides, I always thought the second digit in the AMD naming scheme represented the GPU architecture - while increasing numbers would indicate faster architectures, the mid- and high-end RV770 cards (4850 and 4870) are faster than any 47xx card. In this case the top of the 47xx line overlapped the bottom of the 48xx - a problem AMD are sensibly correcting by dropping the more expensive to manufacture 4830...
Zoomer - Tuesday, April 28, 2009 - link
I think the name makes sense and is completely reasonable. IMO, ATi should not attempt to change something that works.
It's far better than the nVidia fiasco. Imagine a 4500Pro XT+ that is nothing more than a rebranded 2900XT.
Or a Radeon 4000 MX 400 that is based off the GF2 MX, erm, I mean Radeon 32MB DDR.
Seramics - Thursday, April 30, 2009 - link
True... Nvidia's naming schemes are despicable.
bogda - Tuesday, April 28, 2009 - link
I agree completely. "Naming issue" is one of the most pointless pieces of writing on AnandTech ever.
crimson117 - Tuesday, April 28, 2009 - link
I find it very useful.
Although the current state sure beats the terminally meaningless "GT, GS, GTS, GTX, GTX-S, GT+, GTX+" naming scheme.
It's a battle between the graphics vendors' priorities...
They have to choose between:
- Being accurate for informed users (the type who read anandtech)
- Marketing poorly performing, high-margin/high-volume cards to uninformed users
This is even more evident with their mobile parts. Why the hell would you (nVidia) call something a Mobile 9800M GT if it only performs like a desktop 9600GT? Why not call it a "mobile 9600 GT" so people know at a glance it's got 9600 GT performance with laptop-friendly power consumption? Is it really considered more important to communicate that it falls in the same relative position to other nVidia mobile parts as the 9800 falls into among other desktop parts?
RagingDragon - Wednesday, May 13, 2009 - link
++Nvidia's mobile GPU naming really annoys me.
flipmode - Tuesday, April 28, 2009 - link
I find the writing on this issue to be highly valuable. Why should the naming not make sense, be understandable, and do a good job of conveying relative performance? Why is a 47xx faster than a 48xx? Why would you ever complain about the author "going to bat" for the consumer for sensible product naming? The author is trying to help you and me!
ssj4Gogeta - Tuesday, April 28, 2009 - link
I too find it useful. Companies like Nvidia will always try to deceive the customers. (8800 -> 9800 -> 9800+ -> GTS250)
- Tuesday, April 28, 2009 - link
ATI will discontinue the 4830 cards in favour of the 4770.
http://www.dailytech.com/ATI+Launches+Radeon+HD+47...
Zingam - Tuesday, April 28, 2009 - link
60+ fps at resolutions above 3000x2000 for $150-200 - that I would call modern and good graphics; until then I'd rather wait. It's about time for that to happen!
strikeback03 - Tuesday, April 28, 2009 - link
Why? Is there even 0.01% of the possible market with a display of that kind of resolution?
SiliconDoc - Sunday, June 7, 2009 - link
ATI IS LOSING MONEY ON EVERY CARD THEY SELL - SO MUCH FOR PROFIT FROM TINY GPU CORES! GREAT LIE THO RED ROOSTERS AND BOOSTERS... GOSH YER SO DANG SMART!
" AMD's graphics division (ATI) reported net revenue of $248 million (PDF, page 3) and an operating loss of $38 million in Q2. Revenue fell $14 million from Q1, but was up 17.5 percent compared to the equivalent period in 2007. For the six months ended June 28, 2008, graphics revenue was $510 million with an operating loss of $25 million, compared to revenue of $422 million and an operating loss of $65 million through the same period in 2007. "
READ THAT AS IN THE RED, RED ROOSTERS !
http://arstechnica.com/hardware/news/2008/08/ati-p...
---
NOW FOR NVIDIA PROFITS! YOU KNOW, THAT'S $$$$ LEFT OVER AFTER MAKING AND SELLING, SOMETHING ATI DOESN'T KNOW ABOUT WITH THEIR PATHETICALLY CHEAP TINY CORE YOU YAPPERS LIKE DEREK KEEP PUSHING ON LOSERS !
" For the three months that ended October 26, profit sank 74 percent to $61.7 million, or 11 cents a share, from $235.7 million a year earlier. Excluding costs from stock compensation and other expenses, earnings were 20 cents a share. This exceeded the average estimate of 11 cents projected by First Call. "
THAT'S CALLED A PROFIT AND STOCK INCREASE THERE RED ROOSTERS, AND ONLY NVIDIA HAS IT ! YOUR RED ATI LOST MONEY - DUHH !
Better make that tiny overheating core MUCH MUCH SMALLER !
http://news.cnet.com/8301-13924_3-10084168-64.html
---
YES I HATE THE LYING RED ROOSTERS! OF COURSE, THEY ARE LIARS LIARS LIARS !
philosofool - Thursday, September 3, 2009 - link
I never read anything in all caps.
Fluxzx - Tuesday, December 22, 2009 - link
Amen.
SiliconDoc - Monday, June 8, 2009 - link
Don't you love it - the paper launch with just a tiny trickle of 4770s for sale - you still can't get any at Newegg, and the parts ran out initially with 2-5 reviews (meaning a tiny part supply).
--
Same at Tigerdirect - NOT AVAILABLE
--
Same at Amazon - NOT AVAILABLE
--
The greatest paper launch scandal ever - and NOT A WORD OF COMPLAINT FROM ANANDTECH BECAUSE IT'S ATI AND NOT NVIDIA !
---
Welcome to the ultimate BIAS.
aeternitas - Wednesday, June 10, 2009 - link
My friend got his and loves it. Best deal since Voodoo II.
Maybe you just suck?