Maybe for Tesla cards in supercomputers, which are closed platforms, CUDA is better, but for anything else commercial OpenCL will be the better choice.
This is a CUDA vs OpenCL test from SiSoftware: http://www.sisoftware.net/index.html?dir=qa&lo... The conclusion from that article: "We see little reason to use proprietary frameworks like CUDA or STREAM once public drivers supporting OpenCL are released - unless there are features your code depends on that are not included yet; even then, they will most likely be available as extensions (similar to OpenGL) pretty soon."
It wouldn't be bad to see that kind of test on AnandTech - something like GPU vs CPU tests running the same code.
I don't know about others, but the 8x increase in DP, which is one of the PR stunts, doesn't seem like much unless you compare it to the weak GT200 DP numbers. The 5870 has something over 500 GFLOPS DP and the GT200 had around 80 GFLOPS DP (though the Quadro and Tesla cards had higher shader clocks, I think). They will be happy if they reach 1.5 times the Radeon 5800's DP performance. In this PDF from NVIDIA's site, http://www.nvidia.com/content/PDF/fermi_white_pape... they write that ECC will have a performance penalty of 5% to 20% (on Tesla cards you will have the option to turn it on or off; on GeForce cards it will be turned off).
I also want to add: if DP has increased 8x from GT200, that's let's say around 650 GFLOPS; and if DP is half of SP performance in Fermi (as they state), then I get roughly 1300 GFLOPS SP (at the same clock speeds). For GT200 they stated 933 GFLOPS SP. Something is wrong here, maybe?
Actually, they state 30 DP FMA ops per clock for the 240 CUDA cores in GT200 and 256 DP FMA ops per clock for the 512 CUDA cores in Fermi. That means clock for clock and core for core they increased DP performance 4x.
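To make that arithmetic concrete, here's a quick back-of-the-envelope calculation. The 1.5 GHz Fermi shader clock below is purely my assumption (NVIDIA hasn't announced clocks); the GT200 figure uses the GTX 285's 1.476 GHz:

// Rough sketch: peak DP GFLOPS = DP FMA ops/clock * 2 flops per FMA * shader clock (GHz)
#include <cstdio>
int main() {
    double gt200_dp = 30.0  * 2 * 1.476;  // ~88 GFLOPS DP (GTX 285 shader clock)
    double fermi_dp = 256.0 * 2 * 1.5;    // ~768 GFLOPS DP (1.5 GHz is only an assumption)
    printf("GT200: %.0f GFLOPS DP, Fermi: %.0f GFLOPS DP\n", gt200_dp, fermi_dp);
    printf("per-clock ratio: %.1fx, per-core per-clock: %.1fx\n",
           256.0 / 30.0, (256.0 / 512.0) / (30.0 / 240.0));
    return 0;
}

So the "8x" figure is per clock across the whole chip, while per core per clock it is the 4x described above.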
Hi. I'm a long-time AnandTech reader (roughly 4 years already). I registered yesterday just because I wanted to give SiliconDoc a piece of my mind, but thankfully ended up being rational and not replying anymore.
Now that he's gone, I just want to know what you guys think of Fermi being another big chip. Is it safe to assume that Nvidia makes less money than ATI on each high-end card sold, simply because the GTX chips are much bigger than their ATI counterparts? More so now that the HD 58xx cards have been released, which are faster overall than any of Nvidia's single-GPU solutions, so Nvidia will be forced to further lower the price of their GTX cards. I'm still puzzled as to why Nvidia would cling to really big chips rather than go ATI's "efficiency" route. From what I'm reading, this card may focus more on professional applications than on raw performance in games. Is it possible that this is simply a technology demonstrator in the making, in addition to something that will "reassure" the market to keep it from going ATI? I don't know why they should differentiate this much if it's intended to compete with ATI's offerings, unless that isn't entirely their intention...
There was no benchmark, not even a demo, during the so-called demonstration! This is very pathetic, and it looks like Nvidia won't even meet the December timeframe. Debugging a chip that doesn't work properly might take many months; manufacturing a chip, another 12 weeks; developing the infrastructure, including drivers and card manufacturers, another few months. Therefore late Q1 2010 or even June 2010 might become realistic for a true launch rather than a paper launch. What we saw at this demonstration was no more than the paper launch of the paper launch.
Hi, I fully agree with you 100%
You seem to be one of very FEW people that actually see that or get it.
You know what I can't seem to understand?
How can a few hundred people, who supposedly know what they are about to see, or at least why they are attending the demonstration, just sit there and listen to one person stand up and make claims about his product with no proof?
I understand how things are supposed to work, but have we all just become so naive as to believe whatever is pushed onto us through the media (TV, radio, blogs, magazines, etc.) and just accept it all?
I am not saying that what Jen-Hsun showed was NOT a real demo of a working Fermi card; I am just saying that there was, and still is, no proof of any sort from anyone able to actually confirm or deny that it was.
Until Nvidia actually shows a working sample of Fermi (even a so-called rough or demo model, so long as it is actually real), I will not believe it.
There is a huge difference between someone making claims on the forums of sites like this or on blogs, and someone holding a news conference claiming what they have achieved.
Next thing you know, someone will stand up and say they have discovered how to time travel and then show a video of just that.
FWIW, I'm glad AT banned that fool. Too bad it took 37 pages of fanboi ranting for it to come to fruition. For those crying that there is no place to discuss this, AT does have a video forum that will not allow this kind of shenanigans. Does anyone wonder if this is Rollo back from the grave?
According to this very link, http://www.anandtech.com/video/showdoc.aspx?i=3573... AMD already presented working silicon at Computex roughly 4 months ago, on June 3rd. So it took roughly 4 and a half months to prepare drivers, infrastructure and mass production to have enough cards for the launch of Windows 7 and DX11. However, Nvidia wasn't even talking about Windows 7 and DX11, so late Q1 2010 or even later becomes more realistic than December. But there are many more questions ahead: what price point, clock rates and TDP? My impression is that Nvidia has no clue about these questions, and the more I watch this development, the more Fermi resembles the Voodoo5 chip and the V6000 card, which never made it to market because of its much too high TDP.
Nah, I expect nVidia to do everything they can to get this into retail channels because it's the culmination of a lot of hard work. I also expect it to be a monster, but I'm still curious as to how they're going to sort out mainstream options due to their top-down philosophy.
That's not to say ATI's idea of a mid-range card that scales up and down doesn't have its flaws, but with both the 4800 and 5800 series, there's been a card out at the start with a bona fide GPU with nothing disabled (4850, and now 5870), along with a cheaper counterpart with slower RAM and a slightly handicapped core (4830/5850). Higher-spec single-GPU versions will most likely just benefit from more and/or faster RAM and/or a higher core clock, but the architecture of the core itself will probably be unchanged. Can nVidia afford to release a competing version of Fermi without disabling parts of the core? If it's as powerful as we're led to believe, it will certainly warrant a higher price tag than the 5870.
Nvidia wants it to be the jack of all trades. However, they risk being an overpriced master of none. That's probably the reason they give their cards more and more gimmicks to play with each year: they are hoping that the card's value will be greater than the sum of its parts. And that might even be a successful strategy to some extent. In a consumerist world, reputation is everything.
They might start overdoing it at some point though.
It's like mobile phones nowadays. You really don't need a radio, an MP3 player, a camera or other such extras in one (in fact, my phone isn't able to do anything but call and send messages). But unless you have these features, you aren't considered competition. It gives you the opportunity to call your product "vastly superior" even though from a usability standpoint it isn't.
Ahh... I see where you're coming from. I've had many classmates ask me what laptop to buy, and they're always so giddy when they see laptops with the "GeForce" sticker and say they want one because they want some casual gaming - yes, even if the GPU is a GeForce 9100M. I recommended them laptops using AMD's Puma platform, and many of them asked if that's a good choice (unfortunately here, only the MacBook has a 9400M GPU and it's still outside many of my classmates' budgets). It seems brand awareness of Nvidia among many consumers is still much better than AMD/ATI's. So it's an issue of clever branding then?
A little late for any meaningful discussion over here as AT let the trolls go for 40 or so pages. I doubt many people can be arsed to sort through it now, so you'd be better off going to a forum for a real discussion of Fermi.
if you missed it then here you go ... happy day for all of us :
quote from comment posted on page 37 by Pastuch
" Below is an email I got from Anand. Thanks so much for this wonderful site.
-------------------------------------------------------------------
Thank you for your email. SiliconDoc has been banned and we're accelerating the rollout of our new comments rating/reporting system as a result of him and a few other bad apples lately.
Some may enjoy it, but this unusual freedom that blatant trolls using aggressive, rude language are getting lately is making a mockery of this site.
I don't mind it going on for a while, even 20 pages tbh, it is funny, but at some point I'd like to see a message from Gary saying, "K, SiliconDoc, we've laughed enough at your drivel, tchau, banned! :)"
That's what I want to see after reading through 380 bloody comments, not that he's pretty much gotten away with it. And if he has finally been banned, I'd actually love to know about it in the comments section.
By the looks of it, Nvidia doesn't have much going on for this year. If they miss the DX11 boat against ATI then I will pity their stockholders. About the only thing that makes those green cards attractive is their PhysX spiel. Now if ATI would hurry up and do something with Havok, then dark days await Nvidia. One way or the other, it's a win-win for the consumer. I just wish AMD's CPU division would fare just as well against Intel.
I don't want to be too pessimistic, but availability in Q1 2010 is painfully late. Windows 7 will come out soon, so people will surely want to upgrade to DX11 before Christmas. The same goes for the OEM market, which is actually the most profitable: Dell, HP and the others will have Windows 7 systems and will of course need DX11 cards before Christmas (AMD will hopefully have all models out by that time).
Then of course DX11 games coming out in the future can be optimized for the Radeon 5000 series now, while for GT300 we don't even know the graphics specs, and the only working silicon doesn't even resemble a card.
Very bad timing for Nvidia this time; it will give AMD a huge advantage.
Actually this could happen if you merge a super GPGPU Tesla card and a GPU and want to sell it as one ("because designing GPUs this big is 'fucking hard'"). Average people (maybe 95% of them) don't even know what a megabyte or a bit is, let alone GPGPU. They will want to buy a graphics card, not a CUDA card.
If AMD and Microsoft do heavy DX11 PR, even the rest of Nvidia's GPUs won't sell.
As with any hardware, you need killer software to make consumers want it. DX11 is out now, so we have Windows 7 (which most people are taking a liking to, even gamers) and a few upcoming games that people look to be interested in. For GPGPU and all that, well... what do we have as a seriously awesome application that consumers want and feel they need to go out and buy a GPU for? Some do it for F@H and the like, and a few for transcoding video, but what else is there? Until we see that, it's going to be hard to convince consumers to buy that GPU. As it is, most feel IGP is good enough for them...
Actually, thinking about this... maybe if they were able to put a small portion of this into an IGP, and include some good software with it, the average consumer could see the benefits more easily and quickly and be inclined to make that step up to a dedicated GPU?
DocSilicon, you are one funny-as-hell mental patient! I really hope you don't get banned. You just made reading the comments a whole lot more fun. Plus, it's win-win: you get to satisfy your need to go completely postal at everyone, and we get a funny sideshow.
Great words but nothing behind them! Is Fermi Nvidia's Prescott, or should I say much like the last Voodoo chip that never really appeared on the market? Too many transistors are not good...
You are all talking too much about technologies. Who cares about that? DX11 from ATI is already available in Japan and the cards are selling like sex dolls. And why didn't NVIDIA provide any benchmarks? Perhaps the drivers aren't ready, or Nvidia doesn't even know at what clock speed this monster can run without exhausting your PC's power supply. Fermi is not here yet; it is a concept, not a product. ATI will cash in and Nvidia can only watch. And when the Fermi monster finally arrives, ATI will roll out the 5890 and an X2 in the luxury class and some other products in the $100 class. Nvidia will always be a few months late and ATI will get the business. It is that easy. Who wants all this CUDA stuff? Some number crunching in the science field, OK. But if it were about PhysX, an add-on board would do, and in reality there was never any rush for PhysX. Why should this boom come now? I think Nvidia bet on the wrong card and they will suffer heavily for this wrong decision. They would have been better off buying VIA or its CPU division instead of PhysX. PhysX is not a standard architecture and never will be. In contrast, ATI is doing just what gamers want, and that is where the money is. Where are the gaming benchmarks for FERMI? Nvidia is over!
With all this CUDA and PhysX stuff, Nvidia will have 20-30% higher power consumption at any price point and up to 50% higher production costs because of their much bigger die. ATI will lower prices whenever necessary in order to beat Nvidia in the marketplace! And when will Nvidia arrive? Yesterday we didn't even see a paper launch; it was the announcement of a paper launch, maybe in late December, but the cards won't be available until late Q1 2010, I guess. They are that far behind, but most people do not realise it.
A little googling might (or might not) support the theory that he is a loony. Just type "site:forums.sohc4.net silicondoc" and you'll find he has quite a reputation there (different site, but it seems to be the same profile, "handwriting" and same bike).
And that MIGHT lead us to the conclusion that he MIGHT actually be (currently) 45 and not a young raging teenage nerd called Brian.
Of course... this is just some fun guesswork I did (it's all just oh so entertaining).
Below is an email I got from Anand. Thanks so much for this wonderful site.
-------------------------------------------------------------------
Thank you for your email. SiliconDoc has been banned and we're accelerating the rollout of our new comments rating/reporting system as a result of him and a few other bad apples lately.
So it's safe to post again now? Much thanks has to go to Anand for cleaning up the virus that had infected these comments. I mean, it's new tech. Aren't we free to postulate about what we think is going on, and discuss our thoughts and feelings, without fear of some person trolling us down till we can't breathe? It feels better in here now, so thanks again.
Just wanted to say thanks for the article. Love the quotes and behind-the-scenes views, and in general the ever-so-informative articles like this that just can't be found elsewhere. So, thank you!
Someone earlier asked if supporting doubles was going to waste silicon; I don't think it will.
If you look at the throughput numbers and the fact that FP64 is half that of FP32 with the SFU disabled, I suspect what is going on is that the FP64 calculations are being done by two cores at once, with the SFU involved in some way (given how it is decoupled from the cores, there is no apparent good reason why the SFU should otherwise be disabled during FP64 operation).
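A minimal sketch of that hypothesis, assuming the published 512-core count and one single-precision FMA per core per clock (the pairing idea is speculation on my part, not something NVIDIA has stated):

#include <cstdio>
int main() {
    int cores = 512;
    int fp32_fma_per_clock = cores;       // one SP FMA per core per clock
    int fp64_fma_per_clock = cores / 2;   // hypothesis: two cores team up for each DP FMA
    printf("FP64/FP32 throughput ratio: %.2f\n",
           (double)fp64_fma_per_clock / fp32_fma_per_clock);  // prints 0.50
    return 0;
}

That lands exactly on the half-rate figure quoted in the article.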
A comment was also made re:ECC memory.
I suspect this won't make it to the consumer board; there is no good reason to do so, and it would just cost silicon and power for a feature users don't need.
Maybe the consumer board won't have ECC, but it will still be in the silicon (disabled). I don't think they will produce two different dies just because of ECC.
Well, not only is the GT300 months away, but it looks like the card they showed off is a fake anyhoo; check it out at Charlie Demerjian's www.semiaccurate.com
I call BS. How many people have 2560x1600 30-inchers? Two? Three? Main point - resolutions are _VERY_ far from being stagnated, they have SOOOOOOOOO _MUCH_ room for growth until 2560x1600 which right now covers maybe 1% of the PC gaming market. 90% of PC gamers still use low-res 1680x1050 if not less (I for one have 1400x1050, yeah shame on me, I don't want to spend $800 on hi-end SLI setup just to play Crysis in all its hi-res beauty, for.get.it.)
Shame Anand, real shame.
Otherwise top-notch quality stuff, as always with Anand.
I believe what you describe is exactly what is meant by stagnation. From Merriam-Webster: "To become stagnant." Stagnant: "Not advancing or developing." So yeah, I'd say that pretty much sums up display resolutions: they're not advancing.
Is that bad? Not necessarily, especially when we have so many applications that do things based purely on the wonderful pixel instead of on points or DPI. I use a 30" LCD, and I love the extra resolution for working with images, but the text by default tends to be too small. I have to zoom to 150% in a lot of apps (including Firefox/IE) to get what I consider comfortably readable text. I would say that 2560x1600 on a 30" LCD is about as much as I see myself needing for a good, looooong time.
No, Jarred, they are advancing towards 2560x1600 on every PC gamer's desk. Since they are moving towards that (it used to be 800x600 everywhere, now it's more like 1280x1024 everywhere, in a couple of years it'll be 1680x1050 everywhere and so on) they cannot be described as stagnant, hence your statement is BS, Jarred.
This sweet spot used to be 19" 1280x1024 a while ago, with 17" 1024x768 before that. In a couple of years sweet spot will move to 24" 1920x1200, and so on. Hence monitor resolution does progress, it does NOT stagnate, and you do listen to Jarred's fairy tales too much :P
What we have here is a failure to communicate. My point, which you are ignoring, is that maximum resolutions are "stagnating" in the sense that they are remaining static. It's not "BS" or a "fairy tale", unless you can provide detail that shows otherwise. I purchased a 24" LCD with a native 1920x1200 resolution six years ago, and then got a 30" 2560x1600 LCD two years later. Outside of ultra-expensive solutions, nothing is higher than 2560x1600 right now, is it?
1280x1024 was mainstream from about 7-11 years ago, and 1024x768 hasn't been the norm since around 1995 (unless you bought a crappy 14/15" LCD). We have not had a serious move to anything higher than 1920x1080 in the mainstream for a while now, but even then 1080p (or 1200p really) has been available for well over 15 years if you count the non-widescreen 1600x1200. I was running 1600x1200 back in 1995 on my 75 pound 21" CRT, for instance, and I know people that were using it in the early 90s. 2048x1536 and 2560x1600 are basically the next bump up from 1600x1200 and 1920x1200, and that's where we've stopped.
Now, Anand seems to want even higher resolutions; personally, I'd be happier if we first found a way to make those resolutions work well for every application (i.e. DPI that extends to everything, not just text). Vista was supposed to make that a lot better than XP, but really it's still a crap shoot. Some apps work well if you set your DPI to 120 instead of the default 96; others don't change at all.
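For anyone curious about the numbers behind that: the pixel density of a 30-inch 2560x1600 panel, and the effect of bumping Windows from the default 96 DPI to 120 DPI, work out roughly like this (simple geometry, nothing vendor-specific, and the diagonal is assumed to be exactly 30 inches):

#include <cstdio>
#include <cmath>
int main() {
    double w = 2560, h = 1600, diagonal_inches = 30.0;
    double ppi = sqrt(w * w + h * h) / diagonal_inches;   // ~100.6 pixels per inch
    printf("30-inch 2560x1600: %.1f PPI\n", ppi);
    printf("120 vs 96 DPI setting: UI text drawn %.2fx larger\n", 120.0 / 96.0);
    return 0;
}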
I agree that maximum resolution has stagnated at 2560x1600, my point was that the average resolution of PC gamers is still moving from pretty low 1280x1024 towards this holy grail of 2560x1600 and who knows how many years will pass until every PC gamer will have such "stagnated" resolution on his/her desk.
So yeah, max resolution stagnated, but average resolution did not and will not.
First off, I am not mad as hell! I just registered this account after being an AnandTech reader for 10+ years. That's right. It's also my #1 tech website. I read several others, but this WAS always my favorite.
I don't know what happened here lately, but it's becoming more and more of a circus in here.
I am going to make a few points and suggestions:
1) In the old days reviews were reviews; these days there are a lot more PRE-views, picture shows and blog (chit-chatting) entries.
2) In the old days a bunch of hardware was rounded up and compared head to head (mobos, memory, PSUs, etc.). I don't see that here much anymore. It's kind of worthless to me to review just one PSU or one motherboard at a time. Round 'em all up and then let's have a good ole fashioned shootout.
3) I miss the monthly buyer's guides. What happened to them? I like to see CPU + mobo + memory + PSU combo recommendations in the MOST BANG FOR THE BUCK categories (something most people can afford to buy/build).
4) Time to moderate the comments section, but without censorship. My concern is not with f-words but with trolls and comment abusers. I can't stand it. I remember the old days when a famous site totally self-destructed, and at that time I think it had maybe more readers than Anand (hint: it had the English version of "Tiburon" as part of the domain name), when their forum went totally out of control because it wasn't moderated.
5) Also, time to upgrade the comments software here at AnandTech. It needs an up/down ratings feature that even many newspaper websites offer these days.
I agree with the idea of a comments rating system (thumbs up or down). It's a democratic way of moderating. It also saves the need for those short replies when all you want to convey is that you agree or not.
Maybe also an abuse button that people can click on should things get really out of control...?
I don't like commenter ratings, since they give unequal representation/visibility of comments, they affect your perception of a message before you read it, and it's one more thing to look at while skimming comments.
I'm sorry, but I'm going to ask for a ban of SiliconDoc as well. One person has single-handedly taken over the reply section of this article. I was actually interested in reading what other folks thought of what I feel is a rather remarkable new direction that nvidia is taking in terms of gpu design and marketing, and instead there are literally 30 pages of a single person waging a shouting match with the world.
Right now there isn't free speech in this discussion because an individual is shouting everyone down. If the admins don't act, people are likely to stop using the reply section all together.
Just created an account to say that I've never seen this kind of sycophantic, schizophrenic blathering as SiliconDoc's. I have been an Nvidia user for the past 7 years or so (simply out of habit) and before that a Voodoo user, and I cannot even begin to relate to this buffoon.
Oh, and to incense him even more, I'd like to add that I bought an HD 5870 from Newegg a couple of nights ago, since I needed to upgrade my old 8800 GTS card NOW... not in a few months or next year. Now. Seeing as the HD 5870 is the most speed for my buck, I went with it. Forget treating brands like religions. Get whatever's good and affordable, and forget the brand. The end.
LOL - another lifelong nobody who hasdn't a clue and was a green goblin by habit self reported, made another account, and came in just to "incense" "the debater".
Really, you couldn't have done a better job of calling yourself a piece of trash.
Congratulations for that.
You admitted your goal was trolling and fanning the flames.
LOL
That was your goal, and so bright you are, that "sandwiches" came to mind as your name, a complimentary imitation, indicating you hoped to be equal to "Silicon" and decided to take a "tech part" styled name, or, perhaps as likely, it was haphazard chance cause buckey was hungry when he got here, and that's all he could think of.
I think you're another pile of stupid with the C and below crowd, basically.
So, that sort of explains your purchase, doesn't it ?
LOL
Yes, you can't possibly relate, you need more MHZ a lot bigger powr supply "sandwiches".
LOL
sandwiches!
Reading your comments makes me think you are about 12 and in need of serious evaluation. I have never seen someone so out of control on Anand for 30+ pages. Seriously, get your brain checked out, there is some sort of imbalance going on when everyone in the whole world is a liar except you, the only beacon of truth.
Anyway, is there any reason why this has to be a "one size fits all" chip? I mean, according to the article there's a lot of stuff in it which only a minority of gamers would ever need (ECC memory? That's more expensive than normal memory and usually has a performance impact).
I mean, there's already a workstation chip; why not a third one for GPU computing?
You know, when someone says you are drunk/stupid/drugged/crazy/etc. then you might question him, but when 2 or 3 people say it then it's more than probably true, and when all the people at Anand say it then SiliconDoc should just stfu, go right now and buy an ATI 5870 and smash it on the ground, and maybe he will feel better and let us be.
I vote for a ban also :) and I'll donate $10 for the 5870 we Anand users will give him as a present for Christmas. Happy new red year, Silicon...
I'm excited about this but I wish it was ready sooner. It looks like we'll have to wait 2-3 months for benchmarks, right?
I hope it'll blow 5870 away because that's what is best for us, the consumers. We'll have an even faster GPU available to us which is all that really matters.
I've noticed that a person here has been criticizing this article for downplaying the fact that nVidia's upcoming GPU is likely going to have vastly superior memory bandwidth to ATI's current flagship. Anand gave us the very limited data that exists at the moment and left most of the speculation to us. He doesn't emphasize that Fermi (which won't even be available for months) has far more bandwidth than ATI's current flagship; I contend that most people already suspected as much.
The vastly superior memory bandwidth suggests that nVidia might just have a 5870 killer up its sleeve. See what I just did there? This is called engaging in speculation. Anand could have done more of that, I agree, but saying that this is proof of Anand's supposed bias towards ATI? That is totally unreasonable.
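To put one number on that speculation: NVIDIA has confirmed a 384-bit GDDR5 interface for Fermi but no memory clocks, so assuming (and this is only an assumption) the same 4.8 Gbps data rate the HD 5870 uses:

#include <cstdio>
int main() {
    // bandwidth (GB/s) = data rate per pin (Gbps) * bus width (bits) / 8
    double hd5870 = 4.8 * 256 / 8;   // 153.6 GB/s (published for the HD 5870)
    double fermi  = 4.8 * 384 / 8;   // 230.4 GB/s, assuming the same data rate
    printf("HD 5870: %.1f GB/s, Fermi (assumed clocks): %.1f GB/s\n", hd5870, fermi);
    return 0;
}

That would be a 50% bandwidth advantage on paper, and it only grows if NVIDIA clocks the memory higher.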
Hey, Doc, do you want to see a real life, batshit crazy, foaming at the mouth fanboy? All you need is a mirror.
"Jonah did step in to clarify. He believes that AMD's strategy simply boils down to targeting a different price point. He believes that the correct answer isn't to target a lower price point first, but rather build big chips efficiently. And build them so that you can scale to different sizes/configurations without having to redo a bunch of stuff. Putting on his marketing hat for a bit, Jonah said that NVIDIA is actively making investments in that direction. Perhaps Fermi will be different and it'll scale down to $199 and $299 price points with little effort? It seems doubtful, but we'll find out next year."
FOOL!
So nVidia is going to make this for Tesla. That's great that they're innovating but you mentioned that those sales are only a small percentage. AMD went from competing strongly in the CPU market to dominating the GPU market. Good move. But if there's no existing market for the GPGPU...do you really want to be switching gears and trying to create one? Hmm. Crazy!
Admin: You've overstayed your welcome, goodbye.
I'm not sure where you got your red rooster lies information. AMD/ati has NO DOMINATION in the GPU market.
---
One moment please to blow away your fantasy...
---
http://jonpeddie.com/press-releases/details/amd-so...
Just in case you're ready to scream that I cooked up a pure green-biased link, check back a few pages and you'll see the liar who claimed ati is now profitable and holding up AMD because of it provided the REAL empty rhetoric fanboy link http://arstechnica.com/hardware/news/2009/07/intel... which states
" The actual numbers that JPR gives are worth looking at,
which show INTEL dominates the GPU market !
Wow, what a surprise for you. I've just expanded your world, to something called "honest".
So:
INTEL 50.30%
NVIDIA 28.74%
AMD 18.13%
others below 1% each.
------------
Gee, now you know who dominates, and for our discussions here, who is IN LAST PLACE! AND THAT WOULD BE ATI THE LAST PLACE LOSER!
--
Now, I wouldn't mind something like ati is competitive, but that DOMINATES thing says it's #1, and ati is :
************* ATI IS IN LAST PLACE ! LAST PLACE! **************
Now please whine about the 3 listed at the link less than 1% each, so you can "pump up ati" by claiming "it's not last, which of course I would welcome, since it's much better than lying and claiming it's number one.
---
I suspect all you crying ban babies are ready to claim to have found absolutely zero information or contributioon in this post.
There is a huge difference between the INTEGRATED graphics segment, which is almost every laptop out there plus a lot of business computers,
and the DISCRETE market, which is the gaming segment, where AMD/ATI and NVIDIA make up 100%.
Get your facts straight and see a doctor; your delusional attitude is getting annoying.
Gee, you certaibnly are not a computer technician.
That you even POSIT that games aren't played on INTEL GPU's is an astounding whack.
Nvidia and ati have laptop graphics, as does intel, and although I don't have the numbers for you on slot cards vs integrated, your whole idea is another absolute FUD and hogwash.
INTEL slotted graphics are still around playing games, bubbba.
You sure don't know much, and your point is 100% invalid, and contains the problem of YOUR TINY MIND.
Let me remind you Tamalero, you shrieked and wailed against nvidia for INTEGRATED GRAPHICS of theirs on a laptop !
Wow, you're a BOZO, again.
Comparing the performance and awesomeness of one company to another is inherently biased. Sure, Nvidia has probably historically done better than ATI.
I hope that ATI as a company does much better this year and next, so that there is greater competition. Competition which will consequently mean smaller profit margins, but better deals for us consumers!
At the end of the day, who cares who's winning? Shouldn't we all be hoping that each does well? Shouldn't we all hope that there will always be several major graphics providers? Do we really want a monopoly on GPUs? How would that affect the price of a performance card?
I think you should be banned, SiliconDoc. You're adding no real value here. Leave.
I have a GTX 260, by the way, so I'm not speaking from bias. Wonder what kind of card you have? lol...
What card you have or don't have doesn't matter one whit, but what you claim DOES. What you SPEW does !
and when you lie expect any card to save you.
If liars were banned, you'd all be gone, and I'd be left. ( that does not include of course those who aren't trolling jerkbots running around to my every post wailing and whining and saying ABSOLUTELY NOTHING )
-
" Sure Nvidia probably has historically done better than ATI."
since you 1. obviously haven't got a clue wether what you said is true or not 2. Why would you even say it, with your stupidty retaining, lack of knowledge, or lying caveat "probably" ?
If you're so ignorant, please shut it on the matter! Do you prefer to open your big fat yap and prove how knowledgeless you are ? I guess you do.
If you don't know, why are you even opening your piehole about it ?
It certainly doesn't do anything for me if you aren't correct and you don't know it ! I don't WANT YOUR LIES, nor your pie open when you haven't a clue. I don't want your wishy washy CRAP.
Ok ?
Got it ?
If you open the flapper, make sure it gets it right.
-
If you actually are an enthusiast, why is it that the result is, you blather on in bland generalities, get large facts tossed in a fudging, sloppy, half baked inconclusive manner, and in the end, wind up being nothing less than the very red rooster you demand I believe you are not.
What a crappy outcome, really.---
--
Frankly, you cannot accept me even telling the facts as they actually are, that is too much for your mushy, weak, flakey head, and when I do, you attribute some far out motive to it !
There's no motive other than GET IT RIGHT, YOU IDIOTS !
--
What do you claim, though ?
Why is it, you have such an aversion to FACTS ? WHY IS THAT ?
If I point out ati is not in fact on top, but last, and NVIDIA is almost double ati, (to use the "authors" comparison techniques but not separate companies for "internal comparisons" and make CERTAIN I exagerrate) - why are you so GD bent out of shape ?
I'll tell you why...
YOU FORBID IT.
I certainly don't understand your mindset, you'd much prefer some puss filled bag of slop you can quack out so "we can come to some generalization on our desires and feelings" about "the industry".
Go suck down your estrogen pills with your girlfriends.
---
I don't care what your feelings are, what flakey desire you have for continuing competition, because, you prefer LIES over the truth.
Instead of course, after you whining in some sissy crybaby pathetic wail for the PC cleche of continuing competition, you'll turn around and screech the competition I provide to your censored mindset is the worst kind you could possibly imagine to encounter ! Then you wail aloud "destroy it! get rid of it ! ban it ! "
LOL
You're one piece of filthy work, that's for sure.
---
So, you want me to squeal like an idiot like you did, that you want lower prices and competition, and the way to get that is to LIE about ati in the good, and DISS nvidia to the bad with even bigger lies ?
I see.. I see exactly !
So when I point out the big fat lying fibs for ati and against nvidia - you percieve it as a great threat to "your bottom line" pocketbook.
LOL
Well you know what - TOO BAD ! If the card cannot survive on FACTS AND THE TRUTH, then it deserves to die.
Or is honesty banned so you can fan up ati numbers with your lies, and therefore get your cheaper nvidia card ?
--
This is WHAT YOU PEOPLE WANT - enough lies for ati and against nvidia to keep the piece of red crappin ?
LOL yeah man, just like you jerks...
---
" At the end of the day, who cares who's winning? "
Take a look at that you flaked out JERKOFF, and apply it to this site for the YEARS you didn't have your INSANE GOURD focused on me.
Come on you piece of filth, take a look in the mirror !
It's ALL ABOUT WHOSE WINNING HERE.
THE WHOLE SITE IS BASED UPON YOU LITTLE PIECE OF CRAP !
---
And of course worse than that, after claiming you don't care whose winning, you go on to spread your hope that ati market share climbs, so you can suck down a cheapo card with continuing competition.
So what that says, is all YOU care about is your money. MONEY, your money.
"Quick ban the truth! jerkoffs pocketbook is threatened by posts on anandtech because this poster won't claim he wants equal market share !"
--
Dude, you are disgusting. You take fear and personal greed to a whole new level.
Why that's great, let's see what you have for any rebuttal to this clone you claim is me:
" Even the gtx260 uses less power than the 4870.
Pretty simple math - 295 WINS.
Cuda, PhysX, large cool die size, better power management, game profiles out of the box, forced SLI, better overclocking, 65% of GPU-Z scores and marketshare, TWIMTBP, no CCC bloat, secondary PhysX card capable, SLI monitor control "
---
Anything there you can refute ? Even just one thing ?
I'm not sure what your complaint is if you can't.
That's the text that was there, so why didn't you read it or try to claim anything in it was wrong ?
Are you just a little sourpussed spasm boy red, or do you actually have a reason ANY of that text is incorrect ?
Anything at all there ? Are you an empty shell who copies the truth then whines like a punk idiot ? Come on, prove you're not that pathetic.
Lower margins indeed. From a business point of view, though, that's good anyway; not because lower margins are good for business (of course not) but for what they imply. It means the factories can keep running at maximum production, and that's very important. There's less profit on each sale, but the number of customers grows enormously. So not bad at all... even for them. Good for everyone.
A few points I have about this chip. First, it is massive, which leads me to believe it is going to be hot and use a lot of power (depending on frequencies). Second, it is a one-size-fits-all processor and not specifically a graphics processor. Third, it is going to be difficult to make with decent yields, i.e. expensive, and it will be hard to scale performance up. I do believe it will be fast thanks to the cache, but redesigning that cache will be hard for this monolith.
It should take the performance crown back from ATI, but I'm worried that it's going to be difficult to scale it down for lesser cards (which is where nVidia will make more of its money anyway).
When it's out and we can compare its performance as well as price with the 58x0 series, I'll be happier. Choice is never a bad thing. I also don't want nVidia to be too badly hurt by Larrabee, so it's in their best interests to get this thing out soon.
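On the yield worry, a simple Poisson defect model shows why die size hurts so much. The defect density and the Fermi die area below are illustrative guesses, not published figures; only the Cypress area (~334 mm^2) is known:

#include <cstdio>
#include <cmath>
int main() {
    double d0 = 0.5;          // assumed defects per cm^2 on 40nm (illustrative only)
    double cypress = 3.34;    // ~334 mm^2, published for the HD 5870
    double fermi = 5.0;       // ~500 mm^2, rumoured, not confirmed
    // Poisson yield model: yield = exp(-area * defect_density)
    printf("Cypress: %.0f%%  Fermi: %.0f%%\n",
           100 * exp(-cypress * d0), 100 * exp(-fermi * d0));  // ~19% vs ~8%
    return 0;
}

Whatever the real defect density is, the bigger die always comes out worse by roughly the exponential of the area difference, which is why being able to scale this design down matters so much for margins.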
The Atom is for mobile applications, and Intel is still designing faster desktop chips. The "Atom" of graphics is called "integrated", and it has been around forever. There's no reason to believe that PC games of 2010 won't require faster graphics.
The fact that nvidia wants to GROW doesn't mean their bread-and-butter business is going away. Every company wants to grow.
If Fermi's die size is significantly increased by adding stuff that doesn't benefit 3D games, that's a problem, and they should consider 2 different designs for Tesla and gaming. Intel has Xeon chips separate, don't they?
If digital displays overcome their 60Hz limitation, there will be more incentive for cards to render more than 60fps.
Lastly, Anand, you have a recurring grammar problem of separating two complete sentences with a comma. This is hard to read and annoying. Please either use a semicolon or start a new sentence. Two examples are on page 8: the sentences that begin with "Display resolutions" and "The architecture". Aside from that, excellent article as usual.
Actually, NVidia is a great company, as is AMD. However, NVidia cards have recently tended to be more expensive than their counterparts, so WHY would somebody pay more for the same result?
If and when they bring Fermi to market, and if that thing is $200 per card delivered to me, I may consider buying. Most people here don't care whether NVidia is capable of building supercomputers. They care whether they can buy a decent gaming card for less than $200. Very simple economics.
I'm not sure, other than there's another red raver ready on repeat, but if all that you and your "overwhelming number" of fps freaks care about is fps dollar bang, you still don't have your information correct.
Does ATI have a gaming presets panel, filled with a hundred popular games all configurable with one click of the mouse to get there?
Somehow, when Derek quickly put up the very, very disappointing new ati CCC shell, it was immediately complained about from all corners, and the worst part was lesser functionality in the same amount of clicks. A drop down mess, instead of a side spread nice bookmarks panel.
So really, even if you're only all about fps, at basically perhaps a few frames more at 2560x with 4xaa and 16aa on only a few specific games, less equal or below at lower rez, WHY would you settle for that CCC nightmare, or some other mushed up thing like ramming into atitool and manually clicking and typing in everything to get a gaming profile, or endless jacking with rivatuner ?
Not only that, but then you've got zero PhysX (certainly part of 3d gaming), no ambient occlusion, less GAME support with TWIMTBP dominating the field, and no UNIFIED 190.26 driver, but a speckling hack of various ati versions in order to get the right one to work with your particular ati card ?
---
I mean it's nice to make a big fat dream line that everything is equal, but that really is not the case at all. It's not even close.
-
I find ragin red roosters come back with "I don't run CCC !" To which of course one must ask "Why not ? Why can't you run your "equal card" panel, why is it - because it sucks ?
Well it most definitely DOES compared to the NVidia implementation.
--
Something usually costs more because, well, we all know why.
well that answers everything, when someone has to spam the "catholic", must be a bibblethumper who only spreads a single thing and doesnt believe nor accept any other information, even with confirmed facts.
What makes you think I'm "catholic" ?
And that's interesting you've thrown out another nutball cleche', anyway.
How is it that you've determined that a "catholic" doesn't accept "any other 'even confirmed' facts" ? ( I rather doubt you know what Confirmation is, so you don't get a pun point, and that certainly doesn't prove I'm anything but knowledgeable. )
Or even a "Bible thumper" ?
Have you ever met a bible thumper?
Be nice to meet one some day, guess you've been sinnin' yer little lying butt off - you must attract them ! Not sure what proives either, other than it is just as confirmed a fact as you've ever shared.
I suppose that puts 95% of the world's population in your idiot bucket, since that's low giving 5% to athiests, probably not that many.
So in your world, you, the athiest, and your less than 5%, are those who know the facts ? LOL
About what ? LOL
Now aren't you REALLY talking about, yourself, and all your little lying red ragers here ?
Let's count the PROOFS you and yours have failed, I'll be generous
1. Paper launch definition
2. Not really NVIDIA launch day
3. 5870 is NOT 10.5" but 11.1" IN FACT, and longer than 285 and 295
4. GT300 is already cooked and cards are being tested just not by your red rooster master, he's low down on the 2 month plus totem pole
5. GT300 cores have a good yield
6. ati cores did/do not have a good yield on 5870
7. Nvidia is and has been very profitable
8. ati amd have been losing lots of money, BILLIONS on billions sold BAD BAD losses
9. ati cores as a general rule and NEARLY ALWAYS have hotter running cores as released, because of their tiny chip that causes greater heat density with the same, and more and even less power useage, this is a physical law of science and cannot be changed by red fan wishes.
10. NVIDIA has a higher market share 28% than ati who is 3rd and at only 18% or so. Intel actually leads at 50%, but ati is LAST.
---
Shall we go on, close minded, ati card thumping red rooster ?
---
I mean it's just SPECTACULAR that you can be such a hypocrit.
1. Since when are the yields of the 5870 lower than the GT300's?
They use the same process, and since the 5870 is a less complex, smaller core, it will obviously have HIGHER yields. (Also, where are your sources? I want facts, not your imaginary friend who tells you stuff.)
2. Nvidia wasn't profitable last year, when they got caught shipping defective chipsets and were forced by ATI to lower the prices of the GT200 series.
3. Only Nvidia said "everything is OK" while demonstrating no working silicon; that's not the way to show that yields are "OK".
4. Only the AMD division is losing money; ATI is earning, and will for sure earn a lot more now that the 58xx series is selling like hot cakes.
5. 50% only if you count the INTEGRATED market, which is not ATI's focus; ATI and Nvidia are mostly focused on discrete graphics.
Intel has 0% of the discrete market.
And actually Nvidia would be the one to disappear first, as they don't have exclusivity for SLI and can't make Core i7/Core i5 chipsets,
while ATI can produce chipsets for their AMD motherboards with no problem.
And dude, I might be from another country, but at least I'm not trying to spit insults every second, unlike you, especially when proven wrong with facts.
Please, do the world a favor and take your medicine.
Since you pretend to be a Kingslayer, I have to warn you, you have failed.
I am King, as you said, and in this case, your dullard's, idiotic attempt at another red raging rooster "silence job", has utterly failed.
Now, did you see that awesome closeup of TESLA ? You think that's carbon fiber at the bracket end ? Sure looks like it.
I wonder why ati only has like "red" "red" "red" "red" and "red" cards, don't you ?
I mean I never really thought about it before, but it IS LAME. It's like another cheapo low rent cost savings QUACK from the dollar undenominated lesser queendom of corner cutting, ati.
--
Gee, thank you for the added inspiration, as you well have noticed, the awful realities never mentioned keep coming to light.
At least one of us is actually thinking about videocards.
Not like you'll ever change that trend, so the King's Declaration is: EPIC FAIL !
Well that's actually kinda cool, but was the rest of it covered in that sick ati red ? That card is a heat monster with it's tiny core, so we know it was COVERED in fannage and plastic, unless it was a supercheap single slot that just blasted it into the case.
BTW, in order to comment you've exposed another notch in your red fannage purchase history.
Yes, once again you made my point for me, and being so ignorant you failed to notice ! I mean do you guys do this on purpos, or are you just that stupid ?
If you have a red nvidia card on the desk at work, that shows nvidia is flexible in color schemes, unlike 'red 'red' red red red red red rooster cards !
You do realize you made an immense BLUNDER now, don't you ?
You thought I meant the color RED was awful.
lol
Man you people just don't have sense above a tomato.
Just because the reference cards are red, doesn't mean the manufacturers have to make them so.
In fact, ASUS released a 4890 with a black PCB.
You've now descended from arguing about the length of the card and the power of its GPU to the colour of the PCB. Considering it's under a damned cooling solution, how does this matter?
Anand hates Nvidia because they competed against his former lover AMD, and compete against his current lover Intel. Anand you are such a rotten spoiled brat. Since this website fell into your lap you could at least make an effort to act responsibly. Have you EVER held a job working for someone else? I doubt it. And you should ban the other spoiled brats who apparently work for Microsoft and spend about 8 hours a day dominating everything posted on Dailytech such as TheIdiotNickDanger and Evil666. Bunch of Cretians.
Die painfully okay? Prefearbly by getting crushed to death in a
garbage compactor, by getting your face cut to ribbons with a
pocketknife, your head cracked open with a baseball bat, your stomach
sliced open and your entrails spilled out, and your eyeballs ripped
out of their sockets. Fucking bitch
I would love to kick you hard in the face, breaking it. Then I'd cut
your stomach open with a chainsaw, exposing your intestines. Then I'd
cut your windpipe in two with a boxcutter.
Hopefully you'll get what's coming to you. Fucking bitch
I really hope that you get curb-stomped. It'd be hilarious to see you
begging for help, and then someone stomps on the back of your head,
leaving you to die in horrible, agonizing pain. Faggot
Shut the fuck up f aggot, before you get your face bashed in and cut
to ribbons, and your throat slit.
Look at every bright area of high contrast. All the spotlight reflections have a red ring around them. So the thumb, in front of the highly reflective gold connectors, also has the same halo effect. I think that it's as much evidence of a digital camera as it is Photoshop manipulation.
With that said, it could also be a non-functional mock-up. Holding a mock-up or prototype in your hand is not the same as benchmarking a production (ready for consumer release) product.
OK, so assuming it's a fake (and I'm not saying it isn't), I have three questions:
1) Where did you get the photo?
2) Why do it? (And "Who did it?", but that's closely related to Q1.)
3) Where did they get the photo of the hardware, which they then put into the person's hand?
Combining #2 and #3) If the card is from a real photo of real hardware, then what was the value of photoshopping it into someone's hand?
I'm not trying to argue, just trying to understand.
Seriously? I have a 1080p monitor and a Radeon 4670 with UVD2, but my PS3 outputting 1080p to the same monitor looks MUCH better at upscaling DVDs (night and day difference). PowerDVD does have better upscaling tech, but that's using software decoding. Can somebody port ffdshow/libmpeg2 to CUDA and ATI Stream (or DirectCompute?) kthxbye
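As far as I know nobody has done that port, but the scaling half of the job is the easy part on a GPU. A minimal bilinear luma-upscale kernel in CUDA might look like the sketch below; this is purely illustrative and not taken from ffdshow, PowerDVD or any real player:

// Hypothetical sketch: bilinear upscale of one 8-bit luma plane.
__global__ void upscale_bilinear(const unsigned char* src, int sw, int sh,
                                 unsigned char* dst, int dw, int dh)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= dw || y >= dh) return;

    // map the output pixel back into source coordinates
    float sx = x * (float)(sw - 1) / (dw - 1);
    float sy = y * (float)(sh - 1) / (dh - 1);
    int x0 = (int)sx, y0 = (int)sy;
    int x1 = min(x0 + 1, sw - 1), y1 = min(y0 + 1, sh - 1);
    float fx = sx - x0, fy = sy - y0;

    // blend the four neighbouring source pixels
    float top = src[y0 * sw + x0] * (1.0f - fx) + src[y0 * sw + x1] * fx;
    float bot = src[y1 * sw + x0] * (1.0f - fx) + src[y1 * sw + x1] * fx;
    dst[y * dw + x] = (unsigned char)(top * (1.0f - fy) + bot * fy + 0.5f);
}
// launch with e.g. dim3 block(16, 16); dim3 grid((dw + 15) / 16, (dh + 15) / 16);

The hard part of what you're asking for isn't the resize, it's the MPEG-2 decode itself (which the fixed-function UVD/PureVideo blocks already handle) and the fancier edge-adaptive filtering that the better software scalers use.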
I buy two video cards per year on average, and I've owned an almost equal number of ATI and Nvidia cards. I loved my GeForce 8800 GTX despite it costing a fortune, but since then it's been ALL downhill. I've had driver issues with home theater PCs and Nvidia drivers, and I've been totally disappointed with Nvidia's handling of high-def audio formats. The fact that the entire ATI 48xx line can do 7.1 audio pass-through while only a handful of Nvidia video cards can even do 5.1 pass-through is just sad. The world is moving to home-theater gaming PCs and Nvidia is dragging arse.
The fact that the 5850 can do bitstreaming audio for $250 RIGHT NOW and is the second-fastest single-GPU solution for gaming makes it one hell of a product in my eyes. You no longer need an Asus Xonar or Auzentech sound card, saving me $200. Hell, with the money I saved I could almost buy a SECOND 5850! Let's see if the new Nvidia cards can do bitstreaming... if they can't, then Nvidia won't be getting any more of my money.
P.S. Thanks Anand for inspiring me to build the hometheater of my dreams. Gaming on a 110 Inch screen is the future!
Well that's very nice, and since this has been declared the home of "only game fps and bang for that buck" matters, and therefore PhysX, ambient occlusion, CUDA, and other nvidia advantages, and your "outlier" htpc desires are WORTHLESS according to the home crowd, I guess they can't respond without contradiciting themselves, so I will considering I have always supported added value, and have been attacked for it.
--
Yes, throw out your $200 sound cards, or sell them, and plop that heat monster into the tiny unit, good luck. Better spend some on after market cooling, or the raging videocard fan sound will probably drive you crazy. So another $100 there.
Now the $100 you got for the used soundcard is gone.
I also wonder what sound chip you're going to use then when you aren't playing a movie or whatever, I suppose you'll use your motherboard sound chip, which might be a lousy one, and definitely is lousier than the Auzentech you just sold or tossed.
So how exactly does "passthrough" save you a dime ?
If you're going to try to copy Anand's basement theatre projection, I have to wonder why you wouldn't use the digital or optical output of the high end soundcard... or your motherboards, if indeed it has a decent soundchip on it, which isn't exactly likely.
-
Maybe we'll all get luckier,and with TESLA like massive computing power, we'll get an NVIDIA blueray dvd movie player converter that runs on the holy grail of the PhysX haters, openCL and or direct compute, and you'll have to do with the better sound of your add on sound cards, anyway, instead of using a videocard as a transit device.
I can't imagine "cable mamnagement" as an excuse either, with a 110" curved screen home built threate room...
---
Feel free to educate me.
People will buy nVidia hardware for their HTPCs regardless of it having PhysX, AO, CUDA or whatever. Price is a very attractive factor, but so is noise and temperature, so people will go for what suits them the best. If people think nVidia offers more for the price, they will buy it, some may go for another option if they want less heat, or less speed or whatever. It's their choice, and not one made out of malice.
This thread isn't full of nVidia-haters like you want to believe it is. Keep thinking that if you feel more comfortable doing so. In the end, we as consumers have a choice as to what we buy and nothing of what you are saying here has any bearing on that decision making process.
I think I'll just ignore you, since you seem to have acquired a Svengali mind read on your big "we" extension, and somehow think you represent every person here.
I don't put any stock in your idiotic lunatic demi-god musings.
--
If you ever say anything worth more than a piece of scat, I will however respond appropriately.
I'll remind you, you can't even prevent YOURSELF from being influenced by me, let alone "everyone here".
Now if you don't have any KNOWLEDGE on the HTPC issues and questions I brought up with this other poster and his HTPC dreams, please excuse your mind reading self, and keep yourself just as deluded as possible.
I find this a classic IDIOCY : " we as consumers have a choice as to what we buy (oh no problem there)
and nothing of what you are saying here has any bearing on that decision making process. "
You just keep telling yourself that, you unbelievably deranged goofball. LOL, and maybe it will become true for you, if you just keep repeating it.
The first sign of your own cracked shield in that area is you actually saying that. You've already been influenced, and you're so goofy, you just had to go in text and claim no one ever will be.
I mean, you are so much worse than anything I've done here it is just amnazing.
How often do you tell yourself fantasies that there is no chance to can possibly believe or prove, and in fact, have likely already failed yourself ?
Really, I mean absolutely.
If you had a mind left to form any sort of coherent thought pattern, we might take you seriously here. You have just admitted (in your own incoherent, babbling way) that you are trying to actively (and forcibly, I might add) influence people to buy nVidia cards over ATI. I'm telling you that you've failed and will continue to fail as long as you keep shimmying up and down the green flag pole in the name of progress. I wonder if anyone at nVidia reads these comments; what must they think of you? If they considered AT a biased publication then they wouldn't speak with Anand as cordially as they do.
I say "we" because, unless you've opened your eyes, "we" as a community are becoming ever more united against no-brained, deluded fanboys such as yourself. We DON'T hate nVidia: a lot of people here own nVidia cards, some only have nVidia cards, some own nVidia and ATI, and some own only ATI. This isn't about hatred, bias or misinformation; this is about one socially inept weasel who has been attempting to shove his knowledge down everyone else's throats on this (and other subjects) whether there's any factual basis to it or not.
You disagree with me using the term "we", fine. I personally want to see the GT300 launch. I personally want nVidia to bring out a mainstream flavour to compete with the 5850. I personally want prices to fall. I personally don't have anything against PhysX, CUDA or AO. I personally want to see 3D gaming gather momentum.
Now ask yourself - can you be as objective and impartial as that?
You just seem to read what you like and completely miss the point of any post you reply to. There's no way someone can be impartial on this site with you around because any word of praise about ATI equates to bias in your head.
There's only so far you can go before someone clicks the Ban button but I'm sure you'll come back with another account.
I don't think it's too fair of Nvidia to make everyone pay the extra design and manufacturing cost of the GPGPU bloat. They launched the Tesla cards because they cost insane money and can get away with current yields. For the majority of graphics work, plain SIMD with almost no branching is utterly sufficient. I mean, if they made standalone CUDA cards without the (for compute) useless graphics pipeline parts, they could be smaller or faster. And the same goes for a pure graphics part.
I mean, how hard would it be for AMD or Intel to put some similar low-transistor-budget SIMD units into the CPU pipeline, like the ones in a GPU? They could run at CPU clocks and would be an integral part of the CPU (latencies, cache, etc.).
I don't think this is the right strategy for Nvidia.
nVidia could charge a premium for the Tesla-badged cards due to their potential savings over the more traditional method of using masses of general-purpose servers, however they may want to really establish Tesla as a viable option so they can't very well charge too much for it.
I'm interested in seeing the peak performance figures for both Cypress and Fermi; will the AMD part still have an advantage in raw processing power due to having many many more, if weaker, SPs/cores? And will it matter in the working environment?
Nvidia's dreams of 500x performance in the coming years are actually only for GPGPU, not graphics.
The current gen cards are beginning to show some strange scaling. (I think nvidia won't be any different in this case either.)
They will need some more changes than just doubling everything if they want to utilize more shader processors. If you think about it, at 850 MHz feeding 1600 shaders (actually 320 is more realistic) is quite a transistor challenge. (CPUs look like baby toys compared to these, with their large caches and far fewer cores.)
Actually there are some physical limits to transistors too. Increasing to 4,000 million transistors and 3200 shaders in the next card would need even more internal speed. It would maybe be easier to place two rv870 dies in one gpu than to double everything.
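Since the raw-throughput question comes up a couple of posts above, here is the usual back-of-the-envelope peak-FLOPS arithmetic as a small sketch. The Cypress figures are the published specs; the Fermi shader clock used below is nothing more than a placeholder guess, since no clocks had been announced.

```cpp
#include <cstdio>

// Peak single-precision GFLOPS = ALUs * 2 (a multiply-add counts as 2 FLOPs) * clock in GHz.
static double peak_gflops(int alus, double clock_ghz) {
    return alus * 2.0 * clock_ghz;
}

int main() {
    // Cypress / HD 5870: 1600 stream processors at 850 MHz (published spec).
    printf("Cypress SP peak: %.0f GFLOPS\n", peak_gflops(1600, 0.85));   // 2720

    // Fermi: 512 CUDA cores; the 1.5 GHz shader clock is purely an assumption,
    // nothing official had been announced when this thread was written.
    printf("Fermi SP peak (assumed 1.5 GHz): %.0f GFLOPS\n", peak_gflops(512, 1.5));  // 1536
    return 0;
}
```

By that crude measure the AMD part keeps the raw single-precision lead; whether it matters depends on how well those 1600 ALUs can actually be kept fed, which is exactly the scaling worry above.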
We all like our freedom of opinion at anand, and this article was very interesting, and the comments as well, that is until SiliconDoc started trashing everything. As stated by a lot of other users, I ask anand to take some action against the user; he is ruining my experience, and others', of calmly reading the articles in the morning with a coffee :). All his arguments and the way he throws them around are so random and make no sense; he sounds like a man who needs his drug of praising nvidia and trashing the red rooster any way he can, even if it's with no real arguments. I read with pleasure the comments of the smart non-biased guys posting here, but this guy is just talking crap to fill the lines.
On topic ... considering what 5850 has : eyefinity , performance/price, directx 11, power cons, and most important availability i was smiling to myself and thinking that ATI will have killer sales this 3 months left of 2009. I personally will wait for nvidia to bring Fermi and with it the price war, because we all know that then all prices will go down; I estimate $150 for the 5850 and about $200 for the 5870 around June, and if nvidia has better price/perf I will definitely buy it.
And, here we have your contribution, after whining about me claiming no points, the usual bs from red lovers, here is the evidence of your bloodshot eyes, at least you've accepted my direct orders and forced yourself to talk topic.
-
" On topic ... considering what 5850 has : eyefinity , performance/price, directx 11, power cons, and most important availability i was smiling to myself and thinking that ATI will have killer sales this 3 months left of 2009. "
--
And after you realize what a red rooster you just were, whether you thought it was a good jab at me, since you know I'll read your attack and that's what the attack was about, or whether you couldn't help yourself, you went on to claim how fair and balanced you are after you hoped for 2 cheap ati cards. LOL The afterthought, barely surfacing from the lack of wattage, added at the end, "if nvidia has better I'll blah blah"..
FUNNY how you talk about THOSE CARDS in the TESLA THREAD, when you are ON TOPIC !
roflmao !
Wowzie!
Cocka ! Doodle ! Doooo !
Let me ask you, since you considered eyefinity so great, do you
The GPGPU market is very small... it doesn't make good revenue for a company. Moreover the sw development is HARD and expensive; this piece of silicon seems like an NVidia mistake: too big to manufacture for a graphics card, nice (on paper) for a niche market.
In comparison AMD is working far better, and Intel too with Larrabee.
Hard times on the horizon for Nvidia: it has a monster with very low manufacturing yields, but nothing feasible for the consumer arena.
A prediction? AMD will have the lead in graphics cards in the coming years.....
The yields are fine, apparently you caught wind of the ati pr crew who was caught out lying.
If not, you just made your standard idiot assumption, because the actual FACTS concerning these two latest 40nm chips are that ati yields have been very poor, and nvidia's have been good.
---
Nice try, but you're wrong.
" Scalable Informatics has been selling NVidia Tesla (C1060) units as part of our Pegasus-GPU and Pegasus-GPU+Cell units. Several issues have arisen with Tesla availability and pricing.
First issue: Tesla units are currently on a 4-8 week back order. We have no control over this, all other vendors have the exact same issues. NVidia is not shipping Tesla in any appreciable volumes.
Our workaround: Until NVidia is able to ramp its volume to appropriate levels, Scalable Informatics will provide loaner GTX260 cards in place of the Tesla units. Once the Tesla units ship, we will exchange the GTX260 units for the Tesla units.
Update: 1-September-2009
Tesla C1060 units are now readily available for Pegasus and JackRabbit systems.
---
NOW THE PRICING
" Scalable Informatics JackRabbit systems are available in deskside and rackmount configurations, starting at 8 TB (TeraByte) in size, with individual systems ranging from 8TB to 96TB, and storage clusters up to 62 PB (PetaByte), with most systems starting price under $1USD/GB."
Yes, not much there. LOLOLOL
--
POWER SAVINGS replacing massive cpu computers
--
The BNP Paribas (finance) study showed a $250,000 500-core cluster (37.5 kW) replaced with a 2x S1070 Tesla cluster at a cost of $24,000 and using only 2.8 kW. A study with oil and gas company Hess showed an $8M 2000-socket system (1.2 MW) being replaced by a 32x S1070 cluster for $400,000 and using only 45 kW in 31x less space. If you are running a CUDA-enabled application, or have access to the source code (you'll need that to take advantage of the GPUs), you can clearly get significant performance gains for certain applications.
-
about 4 TFLOPS of peak from four C1060 cards (or 3 C1060 and a Quadro) and plugs into a standard wall outlet. Word from some of those selling this system is that sales have been mostly in the academic space and a little slower than expected, possibly due to the initially high ($10k+) price point. Prices have started to come down, however, and that might help sales. You can buy these today from vendors like Dell, Colfax, AMAX, Microway, and Penguin (for a partial list see NVIDIA’s PS product page).
-
---
And, of course you predict amd will have the lead in videocards the next few years. LOL
bhwahahahahaaaaaaaaaaaaaaaaaa
Personally I think NVidia has made the best bet it can make with supporting more Tesla-style stuff, and in general just building a bigger, madder GPU.
The fact is that there aren't many good PC games around. I would say NVidia made some good sales out of Crysis by itself, with people building a new PC with that game in mind putting a very large weight on GPU choice.
But it is just not enough. L4D 2 is the next big title, but being on the Valve engine everyone knows you will get 100fps on a GTX 275.
The other twist is that Steam has probably been one of the best things for gaming on the PC it just makes things 10 times easier.
Manually patching games etc is a killer for all but those who are gaming enthusiasts.
GT300 looks like a revolutionary product as far as HPC and GPU Computing are concerned. Happy times ahead, for professionals and scientists at least...
Regarding the 3d gaming market though, things are not as optimistic. GT300 performance is rather irrelevant, due to the fact that nvidia currently does not have a speedy answer for the discrete, budget, mainstream and lower performance segments. Price projections aside, the GT300 will get the performance crown, and act as a marketing boost for the rest of the product line. Customers in the higher performance and enthusiast markets that have brand loyalty towards the greens are locked in anyway. And yes, that's still irrelevant.
Remember ppl, the profit and bulk of the market is in a price segment nvidia does not even try to address currently. We can only hope that the greens can get something more than GT200 rebranding/respins out for the lower market segments. Fast. Ideally, the new architecture should be able to be downscaled easily. Let's hope for that, or it's definitely rough times ahead for nvidia. Especially if you look closely at the 5850 performance per $ ratio, as well as the juniper projections. And add in the economic crisis, shifting consumer focus, the gap between the performance needed by software and the performance given by the hw, the stagnation of TFT resolutions, and heat/power consumption concerns.
With AMD getting the whole 5XXX family out of the warehouses in under 6 months (I think that's a first for the GPU industry, I might be wrong though), the greens are in a rather tight spot atm. GT200 respins won't save the round, GT300 at $500++ won't save the round, and tesla certainly won't save the round (just look at sales and profit in the last years concerning the HPC/GPU-compute segments).
Let's hope for the best, it's in our interest as consumers anyway..
I'm leaving all your posts here for evidence that you're a complete lunatic, but I'm glad you realise that you do need help. It's the first step.
I recommend checking out Nvidia forums and posting there - you'll feel more at home.
-----------
S/He is obviously a retarded person. I mean initially it was "fun" to read, but now I'm just so bored with this s*it, and it is actually interfering with trying to read comments from other readers. Maybe that is the whole point of the noise; to sidetrack any meaningful conversation.
I vote for silicondoc to be banned from this site (or give me the ability to filter all the posts from and related to this user)... anyone else?
It seems to me, you wish to remain absolutely blind with your fuming hatred and emotional issues, let's see WHAT was supposedly said at your link:
" Originally Posted by wuyanxu
nVidia is trying very hard to NOT loose this round, they've priced this too aggressively, surely there's some cooperate law on this? "
---
Here we see the brain-deranged red rooster, who has been deceived by the likes of you-know-who for so long that a low-priced Nvidia card that beats the ati card must be "illegally priced", according to the little red communist overseas.
I suppose pointing that out in a fashion you and your glorious roosters don't like, is plenty reason for you to shriek "contributes nothing" and "let's ban him!"
Well, fire up your glowing red torches, and I will gladly continue to show what fools red roosters can be, and often are.
I'm so happy you linked some silicondoc post on some other forum, and we had the chance to see the deranged red rooster screech that a low priced Nvidia GTX275 is illegal.
--
Good for you, you're such a big help here.
Would be nice. I was wondering when I saw this article how it could have 140 comments already, forgetting he was sure to come trolling. I've stopped reading each comment thread after he got involved, since any chance of reliable information coming out has ceased.
Ooops, I think I need to say something on topic at least. Could anyone tell me if the OpenCL SDK is out yet? Or DirectCompute? It has been over a year since GPU computing was announced and nothing useful for consumers (I don't count folding as being for consumers).
Yes, both OpenCL and DirectCompute are available for development. It will take time for developers to release applications that use these APIs.
There are already consumer applications that use CUDA, although these are mostly video encoding, Folding@Home/SETI@home, and PhysX-based games. Possibly not too exciting to you, but hopefully more will be coming as GPU computing gains traction.
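For anyone curious what that consumer-facing GPU compute code actually looks like, here is a minimal CUDA sketch of the kind of data-parallel kernel those video encoders and folding clients are built around. Everything in it (the kernel name, the scaling operation) is made up for illustration; it is not taken from any shipping application.

```cpp
#include <cstdio>
#include <cuda_runtime.h>

// Trivial data-parallel kernel: each thread scales one sample.
// Real encoder/folding kernels are far more involved, but follow this shape.
__global__ void scale(float* data, float gain, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= gain;
}

int main() {
    const int n = 1 << 20;
    float* d = nullptr;
    cudaMalloc(&d, n * sizeof(float));
    cudaMemset(d, 0, n * sizeof(float));

    scale<<<(n + 255) / 256, 256>>>(d, 0.5f, n);  // one thread per sample
    cudaDeviceSynchronize();

    cudaFree(d);
    printf("done\n");
    return 0;
}
```

An OpenCL or DirectCompute version would be structurally the same: the work is written per element, and the runtime spreads it across however many cores the GPU happens to have.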
Does anyone know if the 5000 series support hardware virtualisation? I think this will be the killer feature once AMD's 800 series chipsets debut here shortly. Being able to virtualise the GPU and other hardware with your virtual machines is the last stop to pure bliss.
I am also curious. Right now only nVidia's Quadro cards support this.
The thing is, though, that your CPU and chipset also have to support what Intel calls VT-d.
Being able to play 3D games in a virtual OS with little to no performance loss would be great and useful.
Not going to happen soon, though. It's also funny that virtually no one reviewing Lynnfield mentioned the lack of VT-d in the 750 in their "deep" reviews. Huge disappointment.
If there's any technology that seems to scratch that virtualization itch, I think this new gt300 is the one. When reading about nvidia making the card compute-oriented, it just drove my mind to that thought. Hope I'm right. To be fair to amd, I think their doubled stream processors could be a step forward in that direction too, coupled with dx11 direct compute. Virtual machines just need to acknowledge the cards and their capabilities.
According to the super troll who keeps screeching about bandwidth, then the GT300 must be a lesser card since it doesn't have 512 bit connection like the GT200.
Yes that is quite disappointing, but the 3 billion transistor count and GDDR5 somewhat make up for it, and the fact that we're told by the red roosters that even 153.6 GB/s of bandwidth is plenty or just a tiny bit shy; and with what looks like a 384-bit GDDR5 bus at a 4000 or 4800 data rate, the GT300 will come in at 192 GB/s of bandwidth minimum, or more likely 240, quite a lot higher than the 153.6 for ati's 5870.
---
So really, what should I believe now: that 153.6 is really plenty except in a few rare instances, or what you say LOL @ Trolls should believe, that 192 or 240 is worse than 153.6 ?
--
You might LOL @ TRolls, but from my view, you just made an awful fool of yourself.
HINT: The ati 5870 is only 256bit, not 384, and not 512.
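Name-calling aside, the bandwidth figures being argued over all come from one formula: bus width in bytes times the effective memory data rate. The 5870 number is the published spec; the GT300 bus width and data rates below are only the rumored values assumed in this thread, not anything nvidia has confirmed.

```cpp
#include <cstdio>

// GB/s = (bus width in bits / 8) * effective data rate in GT/s.
static double bandwidth_gbs(int bus_bits, double data_rate_gtps) {
    return (bus_bits / 8.0) * data_rate_gtps;
}

int main() {
    // HD 5870: 256-bit bus, 4.8 GT/s GDDR5 (published spec).
    printf("HD 5870:        %.1f GB/s\n", bandwidth_gbs(256, 4.8));  // 153.6

    // GT300: 384-bit bus, GDDR5 data rates only rumored at this point.
    printf("GT300 @ 4.0:    %.1f GB/s\n", bandwidth_gbs(384, 4.0));  // 192.0
    printf("GT300 @ 4.8:    %.1f GB/s\n", bandwidth_gbs(384, 4.8));  // 230.4
    printf("GT300 @ 5.0:    %.1f GB/s\n", bandwidth_gbs(384, 5.0));  // 240.0
    return 0;
}
```

Note that the oft-quoted 240 GB/s figure requires a 5 GT/s data rate; 4.8 GT/s on a 384-bit bus works out to about 230 GB/s.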
I know the ATI cards have 256 bit connections dumb ass. I'm just using your logic (or lack of it.) ATI has been able to outperform Nvidia cards with their 256 bit connections so your point about bandwidth is meaningless idiot.
Golly, that GDDR5 has nothing to do with bandwidth, right, you stupid idiot ?
--
Talk to me about the 4850, you lame, near brain-dead doofus. It's got GDDR3 on it.
---
See, that's the other problem for you idiotic true believers, NV is moving from 2248 data rate ram on up to 4800.
But you're so able to keep more than one tiny thought in your stupid gourd at once, you "knew that".
BTW, you're not doing what I'm doing, you're not capable of it.
Now about that sour-pussed last little whine by you: the 295 beats everything ati has, making your simpleton statement A BIG FAT JOKE.
My god silicondoc you aren't really succeeding here. At what purpose? To convince people not to buy ati cards? You are such a complete, massive ahole it makes me want to go out and buy ati cards in bulk just to spite you.
I'm guessing that if nvidia PR ever watched you rant your all-caps rants they would politely request that you stop associating yourself with their product.
Go ahead everybody google "silicondoc" if you have a strong stomach. Talk about spreading yourself all over the tubes! This guy's fingerprint is unmistakeable. Looks like he got banned on the HondaSwap forums after 14 posts. Guess he sucks on every forum. Maybe anandtech could ban him?
I think every one of you, that instead of actually leaving me alone, or responding with a counter argument to my points, every one of you that merely got logged in, and ripped away at me with an insult ought to be BANNED.
That's what really should happen. I make my complaints and arguments on article and cards and companies, and the lies I see about all those, and most, but not all of you, have no response other than a pure trolling, insulting put down.
Every single one of you that came in, and personally attacked me without posting a single comment about the article, YOU are the ones that need to be banned.
Your collective whining is pure personal attack, and instead of commenting on the article, your love or hate for it, you texted up and did one single thing, let loose a rant against me. Just because you could, just because you felt apparently, "it was taking the high road"... which is as ignorant as the lies I've pointed out.
Time for YOU PEOPLE to be banned.
(minus those of course that actually made counterpoints, whether or not they insulted me or complained when they did - because AT LEAST they actually were discussing points, and contributing to the knowledge and opinion groupings.)
Like for instance Monkeypaw, who made a reply that wasn't a pure trolling hate filled diatribe like you just posted, having nothing to do with the article at all.
Take a look in the mirror then consider yourself fella.
In which case I request that YOU are banned for calling ME a liar when I did nothing beyond reply telling you how, on launch day, I ordered a HD5870 and had it the next day.
Oh you're full of it again, you pretended your view is the world, and therefore lied your butt off, and in a smart aleck fashion, PERIOD, pretending everyone doesn't know only a few trickled out, which if you had clue one, you'd KNOW I was the one who posted that very information on this site.
Claiming anything else is plain stupid, smart alecky, and LYING.
Just because drunken bob claims he got a card on the morning it shipped, had it immediately, and has been enjoying it ever since, the whole world is satisfied with the paper launch, that "does not exist in bob's drunken vodka world" where who knows what day it is anyway.
You know, you people are trash, and expecting anyone else to pretend you're not is asking for way too much.
huh... you're the one insulting everyone who doesn't share your opinion with your "RED ROOSTERS" and other stuff...
you're really special Mr. Doc, but in the sad way.
What red roosters, there aren't any here I'm told. Just plain frank and honest people who tell the truth.
So if I say red rooster, it cannot possibly mean anyone here, posting, lurking or otherwise, as I'm certain you absolutely know.
( not like coming down to your level takes any effort, there you are special, just for you, so you don't feel so bad about yourself )
I have already countered your suggestion that ATI 5870 is just a paper launch, somewhere in this same discussion.
Plus, if nVidia really has working silicon as you showed in the fudzilla link, then where can I buy it? Even at IDF, Intel showed working silicon for Larrabee (although an older version), but not even the die-hard Intel fanboys will claim that Larrabee will be available soon.
Gee, we have several phrases. Hard launch, and paper launch. Would you prefer something in between like soft launch ?
Last time nvidia had a paper launch, that's what everyone called it and no one had a problem, even if cards trickled out.
So now, we need changed definitions and new wordings, for the raging little lying red roosters.
I won't be agreeing with you, nor have you done anything but lie, and attack, and act like a 3 year old.
It's a paper launch; cards were not in the channels on the day they claimed they were available. 99 out of 100 people were left bone dry, hanging.
Early today the 5850 was listed, but not available.
Now, since you people and this very site taught me your standards and the definition and how to use it, we're sticking to it when it's a godforsaken red rooster card, whether you like it or not.
You defined hard launch as having cards on retail shelf.
That's what happened here in the first couple of days at the place I lived in. So, according to your standard, 5870 is a hard launch, not paper launch or soft launch. I can easily get one if I want to (but my casing is just crappy TECOM mid-tower, the card will not fit).
As far as I am concerned, 5870 has a successful hard launch. You tried to tell people otherwise, that's why I called you a liar.
This CPU is for what? Oh, Tesla - the things that cost $2000 :) And consumers won't really get anything more than what ATI offers currently!
Seems like it is time for ATI to do a paper launch.
Just to inform the fanboys: ATI has already finalized the specs for generation 900 and 1000. The current is just 800.
So on paper, dudes, ATI has even more than what they are displaying now!
"Perhaps Fermi will be different and it'll scale down to $199 and $299 price points with little effort? It seems doubtful, but we'll find out next year."
Yeah okay, side with their marketing folks. God forbid they actually release reasonably-priced versions of Fermi that people will actually care to buy.
Derek did an article not long ago on the costs of a modern videocard, and broke it down part and piece by cost, each.
Was well done and appeared accurate, and the "margin" for the big two was quite small. Tiny really.
So people have to realize fancy new tech costs money, and they aren't getting raked over the coals. It's just plain expensive to have a shot at the moon, or to have the latest greatest gpu's.
Back in '96 when I sold a brand new computer I doubled my investment, and those days are long gone, unless you sell to schools or the government; then the upside can be even higher.
And yet all of that is irrelevant if the product cannot be delivered at a price point where most of the potential customers will buy it. You're forgetting that R&D costs are not just "whatever they will be" but are based off what the market will support via purchasing the end result. It all starts with the consumer. You can argue all you want that Joe Gamer should buy a $400 GPU, but if he's only capable of buying a $300 GPU and only willing to buy a $250 GPU, then you're not going to get a sale until you cross the $300 threshold with amazing marketing and performance or the $250 threshold with solid marketing and performance. Companies go bust because they overspend on R&D and never recoup the cost because they can't price the product properly for it to sell the quantities needed to pay back the initial investment, let alone turn a significant profit.
Arguing that gamers should just magically spend more is silly and shows a lack of understanding of economics.
Well I didn't argue that gamers should magically spend more.
--
I ARGUED THAT THE VIDEOCARDS ARE NOT SCALPING THE BUYERS.
---
Derek's article, if you had checked, outlined a card barely over $100.00
But you, instead of thinking, or even comprehending, made a giant leap of false assumption. So let me comment on your statements, and we'll see where we agree.
--
1. Ok, the irrelevance (yes, that word) here is that TESLA commands a high price, and certainly has been stated to be the profit margin center (GT200) for NVIDIA - so...whatever...
- Your basic points are purely obvious common sense, one doesn't even need to state them - BUT - since Nvidia has a profit driver where ATI does not, if you're a joe rouge, admitting that shouldn't be the crushing blow it apparently is.
2. Since Nvidia has been making the MONEY, the PROFIT, and reinvesting, I think they have a handle on what to spend, not you, and their sales to recoup costs are much higher up the market, not sales to you, your type.
----
Stating the very simpleton points of even having a lemonade stand work out doesn't impress me, nor should you have wasted your time doing it.
Now, let's recap my point: " So people have to realize fancy new tech costs money, and they aren't getting raked over the coals."
--
That's a direct quote.
I will also throw back your own stupidity: " You're forgetting that R&D costs are not just "whatever they will be" but are based off what the market will support via purchasing the end result."
PLEASE SEE TESLA PRICES.
--
Another jerkoff that is SOOOOOOOO stupid he finds it possible that someone would even argue that gamers should just spend more money on a card, and, after pointing out that's ridiculous, feels he has made a good point and claims an understanding of economics.
-
If you don't see the hilarity in that, I'm not sure you're alive.
-
Where do we get these brilliant analysts who state the obvious a high schooler has had under their belt for quite some time ?
--
I will say this much - YOU specifically (it seems), have encountered in the help forums, the arrogant know it all blowhards, who upon say, encountering a person with a P4 HT at 3.0GHZ, and a pci-e 1.0 x16 slot, scream in horror as the fella asks about placing a 4890 in it. The first thing out of their pieholes is UPGRADE THE CPU, the board, the ram, then get a card....
If that is the kind of thing you really meant to discuss, I'd have to say I'm likely much further down that bash road than you.
You might be the schmuck that shrieks "cpu limitation!" and recommends the full monty replacements.
--
Let's hope you're not after that simpleton quack you had at me.
"The motivation behind AMD's "sweet spot" strategy wasn't just die size, it was price."
LOL, no it wasn't. Not when everyone, even Anandtech staff, anticipated the pricing for the two Cypress chips to be closer to $199 and $259, not the $299 and $399 they MSRP'd at.
This return to high GPU prices is disheartening, particularly in this economy. We had better prices for cutting edge GPUs two years ago at the peak of the economic bubble. Today in the midst of the burst, they're coming out with high-priced chips again. But that's okay, they'll have to come down when they don't get enough sales.
It was fun for half a year as the red fans were strung along with the pricing fantasy here.
Now of course, well, the bitter disappointment, not as fast as expected and much more costly. "Low yields" - you know, that problem that makes ati's "smaller dies" priced like "big green monsters" (that have good yields on the GT300).
--
But, no "nothing is wrong, this is great!" Anyone not agreeing is "a problem". A paid agent, too, of that evil money bloated you know who.
Another lie, no worry, you're no physician, but I am SiliconDoc, so grab your gallon red water bottle reserve for your overheating ati card and bend over and self-administer your enema, as usual.
Some time ago I heard that the next gen of consoles would run DX11 (Playstation 2 and Xbox were DX7, PS3 and X360 DX9. So PS4 and X720 could perfectly well be DX11). If this is the case, we are about to see new consoles with really awesome graphics - and then the GPU race would need to start over toward more and more performance.
Do you guys have any news on those new consoles development? It could complete the figure in the new GPU articles this year.
I think you mean DX9-class hardware. The PS3 has zero DX9 support, and the XBOX360 has DX9c-class support, but a console-specific version. The PS3 was using OpenGL ES 1.0 with shaders and other features from 2.0, as it was released pre-OpenGL ES 2.0 spec. The game engines don't need the DX API. It doesn't matter to game content developers anyway.
Xbox was actually DirectX 8.1 equivalent. As said, next-gen consoles are years away. Larrabee and Fermi will have been out for a long time by then.
Perhaps next gen consoles would be C++ based and not API based (what a great terminology). So in that sense DirectX won't matter like it won't matter on Larrabee because it will be emulated in software. Larrabee would not have any silicon dedicated to OpenGL or DirectX I think.
Once GPUs get fast enough I guess they won't be called graphics processors anymore and will support APIs as software implementations.
Well, perhaps... I was searching the web yesterday for info on the new consoles; it was kinda sad. If we do not get a new minimum standard (a powerful console), then PC games will not be that hard to run... then my old Radeon 3850 is still capable of running almost ALL games with "good enough" performance (read: an average 30-50fps on most of the console ports to PC).
Personally I think Eyefinity will be remembered as the master stroke, as well as being first to implement DirectX 11. Nvidia may get 10 fps more in DirectX 11 in Q2 2010 but will still struggle until it has its own version of Eyefinity.
Current uber-cool for the cashed-up is Eyefinity and once you have 3 or 6 monitors you will only buy hardware that will support it. These 'cashed-up' PC gamers are usually Nvidia's favorite customers.
Nvidia needs flexible wrap-around OLED mega resolution monitors to come out yesterday, but I'm pretty sure that didn't happen... 5850 which supports 3 monitors came out yesterday. :P
Nvidia has supported FOUR monitors on say for instance, the 570i sli, for like YEARS dude.
Just put in 2 nv cards, plug em up - and off you go, it's right in the motherboard manuals...
Heck you can plug in two ati cards for that matter.
---
Anyway, on the triple monitor front with this 5870/50, the drivers are a mess, some have found the patch won't be out for a month, and then the extra $100 cable is needed too, as some have mentioned, which ati has not included.
They're pissed.
I'll be avoiding the extra $100 cable by getting DisplayPort monitors from the start. Also want to get ips monitor (i think) so that it will support the portrait mode.
If I already had 3 non-DisplayPort monitors, I wouldn't mind shelling out for the DisplayPort adapter if that was my only expense. But if the adapter was flakey I'd be upset as well. I know multi-monitors have been around for years, but they've never been this easy to set-up... Even I could do it :P And the drivers will get better in time, and no doubt future games will look to include Eyefinity as well.
Well I do hope you have good luck and that you and your son enjoy it (no doubt you will if you manage to get it), and it would be nice if you can eventually link a pic (likely in some future article text area) just because.
I think eyefinity has an inherent advantage, cheaper motherboard possible, 3 on one card, and with 3 from the same card, you can have the concave wrap view going with easy setup.
I agree however with the comment that it won't be a widely used feature, and realize most don't even use both monitor hookups on their videocards that are already available as standard, since long ago, say the 9600se and before.
(I use two though, and I have to say it is a huge difference, and much, much better than one)
Wake up: 99% of people don't give a crap about Eyefinity. Not only do the VAST, VAST majority of customers have just one display, but those who do have multiple ones (like myself) often have completely different displays, not multiple of the same model and size. And then, even when you find that 0.1% of the customer base that has two or more identical monitors side-by-side, you have to find the ones who game on them. Then of those people, find the ones who actually WANT to have their game screen split across two monitors with a thick line of two display borders right in the middle of their image.
Eyefinity is relevant to such an infinitesimally small number of people it is laughable every time someone mentions it like it's some sort of "killer app" feature.
I'm with the zorro - will be setting this up for my son pretty soon - he is an extreme gamer who has mentioned multiple monitors to me a few times over the last few months. Up until now I only had a vague idea on how I could accommodate his desire.... that has all changed since the introduction of Eyefinity.
Nvidia didn't mention anything about multi-monitor support, but today's presentation wasn't really focused on the 3D gaming market and GeForce. They did spend a LOT of time on 3D Vision though, even integrating it into their presentation. They also made mention of the movie industry's heavy interest in 3D, so if I had to bet, they would go in the direction of 3D support before multi-monitor gaming.
It wouldn't be hard for them to implement it though, if they wanted to or were compelled to. It's most likely just a simple driver block or code they need to port to their desktop products. They already have multi-monitor 3D on their Quadro parts and have supported it for years; it's nothing new really, just new in the desktop space with Eyefinity. It then becomes a question of whether they're willing to cannibalize their lucrative Quadro sales to compete with AMD in this relatively low-demand segment. My guess is no, but hopefully I'm wrong.
I think Nvidia are underestimating the desire and affordability for multi-monitor gaming. Have you seen monitor prices lately? Have you seen the Eyefinity reviews?
Not making any mention of it is a big mistake in my book. Sure they can do it, but it will reduce their margins even further since they obviously hadn't planned on spending the extra dollar$ this way.
I do like the sound of the whole 3D thing in the keynote though... and everyone wearing 3D glasses...(not so much). But it will be cool once the Sony vs Panasonic vs etc.? 3D format war is finished, (although it's barely started) so us mainstream general consumers know which 3D product to buy. Just hope that James Cameron Avatar film is good :)
Yeah I've seen the reviews and none seemed very compelling tbh, the 3-way portrait views seemed to be the best implementation. 6-way is a complete joke, unless you enjoy playing World of Bezelcraft? There's also quite a few problems with its implementation as you alluded to, the requirement of an active DP adapter was just a short-sighted half-assed implementation by AMD.
As Yacoub mentioned, the market segment for people interested or willing to invest in this technology is so ridiculously small, 0.1% is probably pretty close to accurate given multi-GPU technology is estimated to only be ~1% of the GPU market. Surely those interested in multi-monitor is below that by a significant degree.
Still, for a free feature it's definitely welcome, even in the 2D productivity sense, or perhaps for a day trader or broker....or anyone who wanted to play 20 flops simultaneously in online poker.
Lol @ 20 flops simultaneously in online poker. I struggle with 4 :)
Agree with 6 monitor bezelcraft - Cross hair is the bezel :)
I guess I'm lucky that my son is due for a screen upgrade anyhow so all 3 monitors will be new. Which one will be the problem - I hear Samsung are bringing out small-bezel monitors specifically for this, but I probably can't wait that long. (Samsung LED looks awesome though) I might end up opting for 3 of Dell's old (2008) 2408WFP's (my work monitor) as I know I can get a fair discount for these and I think they have DisplayPort. I'm not sure if my son will like Landscape or Portrait better but I want him to have the option... and yeah apparently the portrait drivers are limited (read crap) atm.
Appreciate your feedback as well as your comments on the 5850 article... I actually expected the GT prices to be $600+ not the $500-$550 you mentioned. Oops... rambling now. Cheers
Heheh I've heard of people playing more than 20 flops at a time....madness.
Anyways, I'm in a similar holding pattern on the LCD. While I'm not interested in multi-monitor as of now, I'm holding out for LED 120Hz panels at 24+" and 1920. Tbh, I'd probably check out 3D Vision before Eyefinity/multi-monitor at this point, but even without 3D Vision you'd get the additional FPS from a 120Hz panel along with better response times from an LED panel.
If you're looking to buy now for quality panels with native DP support, you should check out the Dell U2410. Ryan Smith's 5870 review used 3 of them I think in portrait and it looked pretty good. They're a bit pricey though, $600ish but they were on sale for $480 or so with a 20% coupon. If you called Dell Sm. Biz and said you wanted 3 you could probably get that price without coupon.
As for GTX 380 price, was just a guess, Anand's article also hints Nvidia doesn't want to get caught again with a similar pricing situation as with GT200 but at the same time, relative performance ultimately dictates price. Anyways, enjoyed the convo, hope the multi-mon set-up works out! Sounds like it'll be great (especially if you like sims or racing games)!
Bezel craft can be easily avoided. Just tear your monitors apart. A stand for 3 monitors is easily ordered/DIY made. Usually the bezel is way thicker than it needs to be.
Unfortunately I already have a 4850 CF setup that I will keep for a year or two more and let the technology mature for now.
Can we have a new feature in the comments please?
I just get tired of reading a few comments and get bugged by some SiliconDoc interference.
Can we have a noise filter so comments area gets normal again.
Every graphics related article gets this noise.
Just a button to switch the filter on. Thanks.
I doubt SiliconDoc is actually paid by nvidia, I've met people like this in real life who just for some reason feel a need to support one company fanatically.
Or he just enjoys ticking others off. One of my friends while playing Call of Duty sometimes just runs around trying to tick teammates off and get them to shoot back at him.
If facing the truth and the facts makes you mad, it's your problem, and your fault.
I certainly know of people like you describe, and let's face it, it is one of YOUR TEAMMATES---
--
Now, when you collective liars and deniers counter one of my pointed examples, you can claim something. Until then, you've got nothing.
And those last 3 posts, yours included, have nothing, except in your case, it shows what you hang with, and that pretty much describes the lies told by the ati fans, and how they work.
I have no doubt pointing them out "ticks them off".
The simple fix is, stop lying.
Honestly all I want to know is:
When will it launch? (as in be available for actual purchase)
How much will it cost?
Will this beast even fit into my case...and how much power will it use?
How will it perform? (particularly I'm wondering about DX11 games...as it seems to be very much a big deal for ATI)
but heh none of these questions will be answered for a while I guess....
I'm also kinda wondering about:
How does the GT300 handle tessellation?
Does it feature Angle-Independent Anisotropic Filtering?
I really couldn't give a crap about using my GPU for general computing purposes....I just want to play some good looking games without breaking the bank...
Well it's going to be a DX11 card, so it can handle tessellation. How well? That remains to be seen, but there is enough computing power to do it quite nicely.
But the big question is not whether the GT300 is faster than the 5870 (it most probably is), but by how much and how much it costs...
If you can buy two 5870s for the price of a GT300, it has to be really fast!
Interesting release and a good article revealing the architecture behind this chip. I am sure that we will see more news around the release of Win7, even if the card is not released until 2010, just to make sure that not too many "potential" customers buy an ATI-made card by that time.
Also, as someone said before, this seems to be quite modular, so it's possible we'll see some cheaper cut-down versions too. We need competition in the low and middle ranges as well. Whether the G300 design can do it remains to be seen.
Well, that brings to mind another anandtech LIE.
--
In the 5870 article text post area, the article writer and tester, responded to a query by one of the fans, and claimed the 5870 is "the standard 10.5 " .
Well, it is NOT. It is OVER 11", and it is longer than the 285, by a bit.
So, I just have to shake my head, and no one should have wonder why. Even lying about the length of the ati card. It is nothing short of amazing.
I'm sorry, I realize I left you up in the air, since you're so convinced I don't know what I'm talking about.
" The card that we will be showing you today is the reference Radeon HD 5870, which is a dual-slot graphics card that measures in at 11.1" in length. "
http://www.legitreviews.com/article/1080/2/
I mean really, you should have given up a long time ago.
Anand, could you or Ryan come back to us with the exact length of the reference 5870, please? I know Ryan put 10.5" in the review but I'd like to be sure, please.
It's best to check with someone who actually has a card to measure.
Jeezus, you're just that bright, aren't you.
The article is dated September 19th, and "they scored a picture" from another website, that "scored a picture".
Our friendly reviewer here at AT had the cards in his hands, on the bench, IRL.
--
I mean you have like no clue at all, don't you.
My first reaction after reading that the cost of a double multiply would be twice that of a single was "great. Half the transistors will be sitting there idle during games." Sure, this isn't meant to be a toy, but it looks like they have given up desktop graphics to AMD (and whenever Intel gets something working). Maybe they will get volume up enough to lower the price, but there are only so many chips TSMC can make at that size.
On second thought, those little green squares can't take up half the chip. Any guess what part of the squares are multipliers? Is the cost of fast double precision something like 10% of the transistors idle during single precision (games)? On the gripping hand, the article makes the claim that "All of the processing done at the core level is now to IEEE spec. That's IEEE-754 2008 for floating point math (same as RV870/5870)". If they seriously mean that they are prepared to include all rounding, all exceptions, and all the ugly, hairy corner cases that inhabit IEEE-754, wait for Juniper. I really mean it. If you are doing real numerical computing you need IEEE-754. If you don't (like you just want a real framerate from Crysis for once) avoid it like the plague.
Sorry about the rant. I came for the beef on doubles, but noticed that quote when checking the article. Looks like we'll need some real information about what "core level at IEEE-754" means on different processors. Who provides all the rounding modes, and what parts get emulated slowly? [Side note: is anybody with a 5870 able to test underflow in OpenCL? You might find out a huge amount about your chip with a single test.]
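On that side note about testing underflow: the question is about OpenCL on a 5870, which I can't answer, but the sketch below shows the shape of such a probe in CUDA terms (multiply the smallest normal float by 0.5 and see whether the hardware keeps the denormal or flushes it to zero). It's only an illustration of the idea, not a substitute for running the real test on the card in question.

```cpp
#include <cstdio>
#include <cuda_runtime.h>

// Multiply the smallest normal float by 0.5: a denormal-preserving path returns
// a tiny nonzero value, a flush-to-zero path returns exactly 0.0f.
__global__ void underflow_probe(float* out) {
    float smallest_normal = 1.17549435e-38f;  // FLT_MIN
    out[0] = smallest_normal * 0.5f;
}

int main() {
    float* d_out = nullptr;
    float h_out = -1.0f;
    cudaMalloc(&d_out, sizeof(float));
    underflow_probe<<<1, 1>>>(d_out);
    cudaMemcpy(&h_out, d_out, sizeof(float), cudaMemcpyDeviceToHost);
    cudaFree(d_out);

    if (h_out == 0.0f)
        printf("result flushed to zero (no denormal support on this path)\n");
    else
        printf("denormal preserved: %g\n", h_out);
    return 0;
}
```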
I think I'll stick with the giant profitable greens' proven track record, not your e-weened redspliferous dissing.
Did you watch the NV live webcast @ 1pm EST ?
---
Nvidia is the only gpu company with ONE BILLION DOLLARS PER YEAR IN R&D.
---
That's correct, nvidia put into research on the Geforce the whole BILLION ati loses selling their crappy cheap hot cores on weaker, thinner pcbs with near zero extra features, only good for very high rez, which DOESN'T MATCH the cheapo budget-pinching purchasers who buy red to save 5-10 bucks...--
--
Now about that marketing scheme ?
LOL
Ati plays to high rez wins, but has the cheapo card, expecting $2,000 monitor owners to pinch pennies.
"great marketing" ati...
LOL
Just so you know, ATI is a separate division in AMD (the graphics side obviously) and did post earnings this year. ATI is keeping the CPU side of AMD afloat for all intents and purposes. Is there a way to ban or block you? I was excited to read about the GF300 and expected some good comments and discussion about this, and then you wrecked the experience. Now I just don't care.
The truth is a good thing, even if you're so used to lies that you don't like it.
I guess it's good too, that so many people have tried so hard to think of a rebuttal to any or of all my points, and they don't have one, yet.
Isn't that wonderful ! You fit that category, too.
Do you think your LIES will pass with no backup ?
" A.M.D. has struggled for two years to return to profitability, losing billions of dollars in the process.
A.M.D., the No. 2 maker of computer microprocessors after Intel, lost $330 million, or 49 cents a share, in the second quarter. In the same period last year, it lost $1.2 billion, or $1.97 a share.
ATI card sales did increase a bit, but LOST MONEY anyway. More than expected.
--
PS I'm not sorry I've ruined your fantasy and exposed your lie. If you keep lying, should you be banned for it ?
LOL- YOU'VE SIMPLY LIED AGAIN, AND PROVIDED A LINK, THAT CONFIRMS YOU LIED.
It must be tough being such a slumbag.
--
" After the channel stopped ordering GPUs and depleted inventory in anticipation of a long drawn out worldwide recession in Q3 and Q4 of 2008, expectations were hopeful, if not high that Q1’09 would change for the better. In fact, Q1 showed improvement but it was less than expected, or hoped. Instead, Q2 was a very good quarter for vendors – counter to normal seasonality – but then these are hardly normal times.
Things probably aren't going to get back to the normal seasonality till Q3 or Q4 this year, and we won't hit the levels of 2008 until 2010."
As you should have a clue, noting that 2008 was bad, and they can't even reach that pathetic crash level until 2010.
An increase in sales from a recent full-on disaster of a decrease is still less than the past, is low in the present, and is "A LOSS", PERIOD.
You don't provide text because NOTHING at your link claims what you've said, you are simply a big fat LIAR.
Thanks for the link anyway, that links my link:
http://jonpeddie.com/press-releases/details/amd-so...
This is a great quote: " We still believe there will be an impact from the stimulus programs worldwide "
LOL
hahahhha - just as I kept supposing.
" -Jon Peddie Research (JPR), the industry's research and consulting firm for graphics and multimedia"
---
NOTHING, AT either link, describes a profit for ati graphics, PERIOD.
But maybe you should stop typing and go into hibernation to await the GT300's holy ascension from heaven! FYI: It's unhealthy to have shrines dedicated to silicon, dude. Get off the GPU cr@ck!!!
On a more serious note: Nvidia are good, ATI has gotten a lot better though.
I just bought a GTX260 recently, so I'm in no hurry to buy at the moment. I'll be eagerly awaiting to see what happens when Nvidia actually have the product launch and not just some lame paper/promo launch.
My argument is that I've heard the EXACT SAME geekfoot whine before, twice in fact. Once for G80, once for GT200, and NOW, again....
Here is what the guy said I responded to:
" Nvidia is painting itself into a corner in terms of engineering and direction. As a graphical engine, ATI's architecture is both smaller, cheaper to manufacture and scales better simply by combining chips or expanding # of units as mfg tech improves.. As a compute engine, Intel's Larabee will have unmatched parallel thread processing horsepower. What is Nvidia thinking trying to pass on this huge, monolithic albatross? It will lose on both fronts. "
---
MY ARGUMENT IS : A raging red rooster who just got their last two nvidia destruction calls WRONG for G80 and GT200 (the giant brute force non-profit expensive blah blah blah) is likely, to the tune of 100%, TO BE GETTING THIS CRYING SPASM WRONG AS WELL.
---
When there is clear evidence Nvidia has been a marketing genius (it's called REBRANDING by the bashing red rooster crybabies) and has a billion bucks to burn a year on R&D, the argument HAS ALREADY BEEN MADE FOR ME.
-----
The person you should be questioning is the opinionated raging nvidia disser, who by all standards jives out an arrogant WHACK JOB on nvidia, declaring DUAL defeat...
QUOTETH ! "What is Nvidia thinking trying to pass on this huge, monolithic albatross? It will lose on both fronts. "
---
LOL that huge monolithic albatross COMMANDS $475,000.00 for 4 of them in some TESLA server for the collegiate geeks and freaks all over the world- I don't suppose there is " loss on that front" do you ?
ROFLMAO
Who are you questioning and WHY ? Why aren't you seeing clearly ? Did the reds already brainwash you ? Have the last two gigantic expensive cores "destroyed nvidia" as they predicted?
--
In closing "GET A CLUE".
While I definitely prefer the $200-$300 space that ATI released 48xx at, It seems like $400 is the magic number for single GPUs. Anything much higher than that is in multi-GPU space where you can get away with a higher price to performance ratio.
If Nvidia can hit the market with well engineered $400 or so card that is easily pared down, then they can hit a market ATI would have trouble scaling to while being able to easily re-badge gimped silicon to meet whatever market segment they can best compete in with whatever quality yield they get.
Regarding Larrabee, I think Nvidia's strategy is to just get in the door first. To compete against Intel's first offering they don't need to do something special, they just need to get the right feature set out there. If they can get developers writing for their hardware asap, Tesla will have done its job.
Until that thing from NVIDIA comes out AMD has time to work on a response and if they are not lazy or stupid they'll have a match for it.
So in any way I believe that things are going to get more interesting than ever in the next 3 years!!!
:D Can't wait to hear what DirectX 12 will be like!!!
My guess is that in 5 years we will have a truly new CPUs - that would do what GPUs + CPUs are doing together today.
Perhaps will come to the point where we'll get blade like home PCs. If you want more power you just shove in another board. Perhaps PC architecture will change completely once software gets ready for SMP.
Nvidia is also launching Nexus at their GDC this week, a plug-in for Visual Studio that will basically integrate all of these various APIs under an industry standard IDE. That's the launching point imo for cGPU, Tesla and everything else Nvidia is hoping to accomplish outside of the 3D Gaming space with Fermi.
Making their hardware more accessible to create those next killer apps is what's been missing in the past with GPGPU and CUDA. Now it'll all be cGPU and transparent in your workflow within Visual Studio.
As for the news of Fermi as a gaming GPU, very excited on that front, but not all that surprised really. Nvidia was due for another home run and it looks like Fermi might just clear the ball park completely. Tough times ahead for AMD, but at least they'll be able to enjoy the 5850/5870 success for a few months.
"Architecturally, there aren't huge lessons to be learned from RV770"
SNIF SNIF BS!
"ATI's approach is much more cautious"
more like "ATI's approach is much more FOCUSED"
( eyes on the ball people)
"While Fermi will play games, it is designed to be a general purpose compute machine."
nvidia is starting to sound like Sony: "the ps3 is not a console, it's a supercomputer and HD movie player, it only does everything." Guess what? People wanted to play games. Nintendo (the focused company) did that > games, not movies, not hd graphics: games and motion control. Sony, like nvidia here, didn't have its eyes on the ball.
Well, you have to consider that nvidia is getting caught between a rock and a hard place. The PC gaming market is shrinking. There's not much point in making desktop chipsets anymore... they have to shift focus (and I'm sure they will focus) on new things like GPGPU. I won't be surprised if the GT300 isn't the super awesome gamer GPU of choice so many people expect it to be. And perhaps the one after GT300 will be even less impressive for gaming, regardless of what they just said about making humongous chips for the high-end segment.
Gee nvidia is between a rock and a hard place, since they have an OUT, and ATI DOES NOT.
lol
That was a GREAT JOB focusing on the wrong player who is between a rock and a hard place, and that player would be RED ROOSTER ATI !
--
no chipsets
no chance at TESLA sales in the billions to colleges and governments and schools and research centers all over the world....
--
buh bye ATI ! < what you should have actually "speculated"
...
But then, we know who you are and what you're about -
TELLING THE EXACT OPPOSITE OF THE TRUTH, ALL FOR YOUR RED GOD, ATI !
--
When nVidia actually sends out Fermi samples for previews/reviews, only then will you know how good it is. We all want to see it because we want competition and lower prices (and maybe some of us will buy one or more, as well!).
Until then, keep your fanboy comments to yourself.
No silverblue, that is in fact your problem, not mine, as you won't know anything, till you're shown a lie or otherwise, and it's shoved into your tiny processor for your personal acceptance.
The fact remains, red fanboy raver Griswold blew it, and I pointed out exactly WHY.
The fact that you cry about it, because your group of stupid dummies keeps blowing nearly every statement you make, sure isn't my fault.
Nvidia is simply hedging their bets and expanding their horizons. They've still managed to offer the fastest GPUs per product cycle/generation and they're clearly far more advanced than AMD when it comes to GPGPU in both theory and practice.
Jensen's keynote tipped his hat numerous times to Nvidia's roots as a GPU company that designed chips to run 3D video games, but the focus of his presentation was clearly to sell it as more than that, as a cGPU capable of incredible computational ability.
No no! This is just on paper! When will we see it for real!! Oh... Q2-3-4 next year! :)
So you cannot claim they have the better thing because they don't have it yet! And don't forget next year we might have the head-smashing Larrabee!
:)
Who knows!!! I think you are way too biased and not objective when you type!
Heheh if Q2 is what you want to believe when you cry yourself to sleep every night, so be it. ;)
Seriously though, its looking like late Q4 or early Q1 and its undoubtedly meant for one single purpose: to destroy the world of ATI GPUs.
As for Larrabee lol...check out some of the IDF news about it. Even Anand hints at Laughabee's failure in his article here. It may compete as a GPGPU extension of x86, but not as a traditional 3D rasterizer, not even close.
Wow, a video card! On top of that pcb could be a cat shit for all we know. The card does not exist, because I can't touch it, I can't buy it, and I can't play games on it.
Also, the fact that you seem to get all of your info from Fudzilla speaks volumes. All of your syphilis-induced mad ramblings are tiresome.
Really like these kind of leaps in computing power, I find it fascinating. A shame that it seems nVidia is pulling a bit away from the mainstream graphics segment, but I suppose that means that the new cards from ATI/AMD are the undisputed choice for a graphics card in the next few months. 5850 it is!
Let's crack it on page 4. A more efficient architecture, max threads in flight. Although the DOWNSIDE is sure to be mentioned FIRST, as in "not as many as GT200", and the differences mentioned later, the hidden conclusion with the dissing included is apparent.
Let's draw it OUT.
---
What should have been said 1st:
Nvidia's new core is 4 times more efficient with threads in flight, so it reduces the number of those from 30,720 to 24,576, maintaining an impressive INCREASE.
---
Yes, now the simple calculation:
GT200: 30,720 x 2 = 61,440; GT300: 24,576 x 4 = 98,304
at the bottom we find second to last line the TRUTH, before the SLAM on the gt200 ends the page:
" After two clocks, the dispatchers are free to send another pair of half-warps out again. As I mentioned before, in GT200/G80 the entire SM was tied up for a full 8 cycles after an SFU issue."
4 to 1, 4 times better, 1/4th the clock cycles needed
" The flexibility is nice, or rather, the inflexibility of GT200/G80 was horrible for efficiency and Fermi fixes that. "
LOL
With a 4x increase in this core design area, first we're told GT200 "had more", then we're told Fermi is faster, in terms that allow > the final tale, GT200 sucks.
--
I just LOVE IT, I bet nvidia does as well.
On paper everything looks amazing, just like the R600 did in its time, and the Nvidia FX series as well. So please, just shut up and stop spreading your FUD until there's real information, real benches, real useful stuff.
The R600 was great, you idiot.
Of course, when hating nvidia is your real gig, I don't expect you to do anything but parrot off someone else's text and get the idea wrong, get the repetition incorrect.
-
The R600 was and is great, and has held up a long time, like the G80. Of course if you actually had a clue, you'd know that, and be aware that you refuted your own attempt at a counterpoint, since the R600 was "great on paper" and also "in gaming machines".
It's a lot of fun when so many fools self-proof it trying to do anything other than scream lunatic.
Great job, you put down a really good ATI card, and slapped yourself and your point doing it. It's pathetic, but I can't claim it's not SOP, so you have plenty of company.
That means next consoles > minuscule speed bump, low price and (lame) motion control attached. All this tech is useless with no real killer app EXCLUSIVE FOR THE PC! But hey who cares, let's play PONG at 900 fps !
I think the point is - the last GT200 was ALSO TESLA -- and so of course...
It's the SECOND TIME the red roosters can cluck and cluck and cluck "it won't be any good" , and "it's not for gaming".
LOL
Wrong before, wrong again, but never able to learn from their mistakes, the barnyard animals.
Last time I bought the most expensive GPU available was Riva TNT!
Sorry but even if they offer this for gamers I won't be able to buy it. It is high above my budget.
I'd buy based on quality/price/features! And not based on who has the better card on paper in year 20xx.
Well, for that, I am sorry in a sense, but on the other hand find it hard to believe, depending upon your location in the world.
Better luck if you're stuck in a bad place, and good luck on keeping your internet connection in that case.
Yeah, that was cool.
Don't know about you guys, but my interest in GPUs is gaming @ 1920x1200. From that POV it looks like Nvidia's about to crack a coconut with a ten-ton press.
My 280 runs just about everything flat-out (except Crysis, naturally) and the 5850 beats it. So why spend more? Most everything's a console port these days and they aren't slated for an upgrade till 2012, at least last I heard.
Boo hoo.
Guess that's why multiple-screen gaming is starting to be pushed.
No way Jose.
Yes, now about that fantasy paper anand was spewing on - yes, he won't get one for two months, but AS I SAID, WE ALREADY KNOW IT BEATS the ati epic failure.
Now we're down to the launch and paper claims in this article, which were lies, as I've said. Bigger lies by the red texters. If I were Anand I'd be giggling at you fools.
Newegg link please. Or any other online retailer websites for the matter.
Did I already say to stop it with the paper launch already? That only exists in your dreams, you know. Or maybe in America. But no such thing is true here. Just because America doesn't have enough units, it doesn't mean it is true everywhere else.
Oh, so sorry m'lady, here is your newegg link, you'll see 2 greyed out 5850's and THAT'S IT.
http://www.newegg.com/Product/ProductList.aspx?Sub... Can't buy 'em. Paper, e-paper in this case, digital nothing.
--
Now if we only could send you some money, and you could trot over to your imaginary shops.... gee, if only someone gave you some money, you could HAVE A PICTURE, because everyone knows handy little cameras are BANNED there, huh.
Gee, all those walks to work... and not ten seconds to take a pic, and now it's really, too late LOL
ahhahahahahaaa
--
More of that wonderful "red rooster evidence".
I bought one, 5850 that is. They are popular, so they sell out. The same thing happened to me when the 8800GT's launched, bought 2 and they were sold out 15 minutes later.
If people can buy them, it isn't a paper launch. Give it up. There were cards for sale from many etailers on launch day for the 5870 as well.
You're saying the 8800GT and 8800GTS g92 were paper launches also? They were selling out in minutes.
Funny how you wait for the perfect post to claim your lie is true.
You're a pure troll, nothing more, in every single post you've made here.
Of course I know you're lying.
You are the one who is lying by claiming that the 5870 is a paper launch, when availability at my place is pretty good. Then you claim that I do not actually come from an SE Asian country, but admins on this site can easily verify my IP and see where I come from. Accusing people of lying will not make you look good.
Neither does Europe, nor Africa, nor SA, nor the ME; apparently the only spot is your walk to work. Congratulations, you're at ground zero. Just think how lucky you are.
I am not the one who claims GT300 is available; you do, with your fudzilla link. And that picture may only show mock-ups, because nVidia doesn't have any working demo.
At least Intel with its Larrabee did showcase their unimpressive raytracing demo at IDF.
The one I was going to buy on the 24th was from ncix.com, but shipping and handling was gonna be a bit too much. The fact that they had the cards on the launch date, the 23rd, goes to show how big a liar silicondoc is. Plus his link on fudzilla, just like I said before, betrays him, since it does not say that the model being shown is a working model. I can say I'm making a remote that can trump all other remotes on performance and will be the next big thing for gadget lovers, but if I come up with a supposed model and just flash it around, it doesn't mean that it's a working model unless I can prove that it is by using it. So until Nvidia shows the card at work and gives us numbers based on such a task, then we can talk about it. Hey, don't forget that we also go for bang for buck, the reason why I got the 5870. I have a feeling that when GT300 does launch, it will be big, but I'd also like to see how ATI responds to that.
I think we all get it. This guy is mentally ill. So I posted those links to refute his argument and he hasn't touched any. He knows he's lying and that his fudzilla link betrays him. So he'll keep on ranting. silicondoc has to remember this: all big companies that have a working model of an upcoming product do a demo and show the figures; they don't just give you paperwork (which is what Nvidia is doing right now). And it doesn't take a genius to see that GT300, or whatever they wanna call it, is not even close to being released, unless you want to call many sources such as tgdaily, the inquirer (http://www.theinquirer.net/inquirer/news/1052025/g...) and of course anand big liars; then go ahead, silicondoc. So until it comes out (be it Nov, mid Oct or Jan) then we can come back and talk performance. Have fun silicondoc, and when GT300 performance trumps the HD5870 then I'll sell the 5870 for a GT300 card, but don't forget we also need to see how much power that monster will be drawing. I call it a monster because so far the on-"paper" performance puts it that way.
By the way, sorry, I forgot to pass you the link where I got mine.. here you go: http://www.canadacomputers.com/index.php?do=ShowPr... and here is another site; they have had them in stock since the 23rd of Sept, just like predicted. Man, you are full of shit.. was gonna buy mine from there but decided to save the money on shipping, and boom, 3 days later it was in a store right next door. wwohoo.. I just said bye bye to my nvidia card, and if you wanna know, it was, yes you guessed it, a GTX 285, and glad I got almost full value for that!!
I agree, it shows sooooo much intelligence and class to curse! It's so creative! It's so wonderful! It's a real shame we don't have more foul mouths in positions of power, because it's just so entertaining!
Admin: You've overstayed your welcome, goodbye.
I am beginning to think this site is lost. One thing I always liked about Anand was his tone was never harsh, even if I didn't agree with the content.
Now he's cursing. Yes, and as you said, in bold. Ugggh.
This and their idiotic pictorials of motherboards, then the clear bias towards Lynnfield. Again, I'm not complaining that they liked it, so much as the way they lied about the numbers, and it took a lot of complaints to show that Bloomfield was faster. They were trying to hide that.
Their unscientific testing has also gotten to the point of absurdity.
I like their web page layout, but it's getting to the point where this site is becoming much less useful for information.
It's easy to get to the point where you do what people say they want. Yes, the jerks like to see curses, and think it's cool. I'm sure they got page reads from the idiotic pictorials of motherboards. Most of the people here did want to be lied to about Lynnfield, since it was something more people could afford compared to Bloomfield. It's tempting, but, ultimately, it's a mistake.
I'd like to see them do something that requires intelligence, and a bit more daring. Pit one writer (not Gary, he's too easy to beat) against another. Have point-and-counterpoint articles. Let's say Anand is pro-Lynnfield, or pro-ATI card, or whatever. Then they use Jarrod to argue the points against it, or for the other card. Now, maybe Anand isn't so pro this item, or Jarrod isn't against it. Nonetheless, each could argue (anyone can argue and make points, because nothing in this world is absolutely good or bad, except for maybe pizza), and in doing so bring up the complexity of the parts, rather than making people post about the mistakes they make and then have them show the complexity.
You think Anand would have put up those overblown remarks in his initial article on Lynnfield if he knew Jarrod would jump on him for it? I'd be more careful if I were writing it, so would he. I think the back and forth would be fun for them, and at the same time, would make them think and bring out things their articles never even approach.
It's better than us having to post about their inaccuracies and flaws in testing. It would be more entertaining too. And, people can argue, without disliking each other. Argument is healthy, and is a sign of active minds. Blind obedience is best relegated to dogs, or women :P. OK, I'm glad my other half doesn't read these things, or I'd get slapped.
I'm sure Anand brought it out of him with his bias.
Already on page one, we see the UNFAIR comparison to RV870, and after wailing that Fermi is "not double the bandwidth" - we get ZERO comparison, because of course, ATI loses BADLY.
Let me help:
NVIDIA: 240 GB/s bandwidth
ATI: 153 GB/s bandwidth
------------------------ nvidia
--------------- ati
There's the bandwidth comparison that the biased author couldn't bring himself to state. When ati LOSES, the red fans ALWAYS make NO CROSS-COMPANY comparison.
Instead it's "nvidia relates to its former core as ati relates to its former core" - so then the "amount of improvement" "within each company" can be said to "be similar" while the ACTUAL STAT is OMITTED!
---
Congratulations once again for the immediate massive bias. Just wonderful.
Omitted bandwidth chart below, the secret knowledge the article cannot state! LOL, a review and it cannot state the BANDWIDTH of NVIDIA's new card! roflmao!
Wow, another doofus. Overclock the 5870's memory only, and watch your framerates rise. Overclocking the memory increases the bandwidth, hence the use of it. If frames don't rise, it's not using it, doesn't need it, and extra is present.
THAT DOESN'T HAPPEN for the 5870.
-
Now, since FERMI has 40% more transistors in the core, and an enormous amount of astounding optimizations, you declare it won't use the bandwidth, but your excuse was your falsehood about ati not using its bandwidth, which is 100% incorrect.
Let's pretend you meant GT200, same deal there: higher mem OC = more bandwidth, and frames rise well.
Better luck next time, since you were 100% wrong.
You do realize the entire point of mentioning bandwidth was to show that both Nvidia and AMD feel that they are not currently bandwidth limited. They have each doubled their number of cores but only increased bandwidth by ~50%. There's no mention of overall bandwidth because that's not the point that was being made. Just an off-hand observation that says "hey, looks like everyone feels memory bandwidth wasn't the limitation last time around".
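A rough sketch of that observation, using only figures quoted elsewhere in this thread (the 240 GB/s Fermi number is a rumor, not a confirmed spec, and the GT200 figure is the top GTX 285 number mentioned later):

```python
# Bandwidth per core roughly halves for both vendors this generation, which is
# the comment's point: neither seems to think it was bandwidth-bound before.
cards = {
    "GTX 285 (GT200)":   (240,  159.0),   # (cores, GB/s)
    "Fermi (rumored)":   (512,  240.0),
    "HD 4890 (RV790)":   (800,  124.8),
    "HD 5870 (Cypress)": (1600, 153.6),
}

for name, (cores, bw) in cards.items():
    print(f"{name:18s} {bw / cores:.3f} GB/s per core")
```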
Do you see any captions on that site? I don't think so. Nowhere does it mention that it's a complete card. So please stop lying, because that goes to show how ignorant you are. Any person with a sound mind can and will tell you that it's not a finished product. So come up with something more valid to show and rant about. Sorry that your big daddy Huang hasn't given you your green slime, if you like it that way. Just wait on the corner, and when he says GT300 is a go and tests confirm that it trumps the 5870, then you can stop crying and suck on that.
Oh golly, another lie.
First you admit I'm correct, FINALLY, then you claim only mistakes from me.
You're a liar again.
However, I congratulate you, for FINALLY having the half baked dishonesty under enough control that you offer an excuse for Anand.
That certainly is progress.
I think that pat fancy can now fairly be declared the quacking idiot group collective's complete defense.
Congratulations, you're all such a pile of ignorant sheep, you'll slather together the same old feckless riddle for each other's emotional comfort, and so far, nearly only monkeypaw tried to address the launch lie pointed out.
I suppose, as a general rule, you love your mass hysterical delusionary appeasement, in lieu of an actual admission, understanding, or mere rebuttal to the author's false launch accusation in the article, the warped and biased comparisons pointed out, and the calculations required to reveal the various cover-ups I already commented on.
Good for you people, when the exposure of bias and lies is too great to even attempt to negate, it's great to be a swaddling jerkoff in union.
Is sanity now considered to be a disease? We're not the one's visiting a website in which we so aggressively scream "bias" on (apparently) every GPU article. If you think Anand's work is so offensive and wrong, then why do you keep coming back for more?
Anyway, I just don't see where you get this "bias" talk. For crying out loud, you can't make many assumptions about the product's performance when you don't even know the clock speeds. You can guess till you're blue in the face, but that still leaves you with no FACTS. Also keep in mind that GT300 will have ECC enabled (at least in Tesla), which has been known to affect latency and clock speeds in other realms. I'm not 100% sure how the ECC spec works in GDDR5, but usually ECC comes at a cost.
As for "paper launch," ultimately semantics don't matter. However, a paper launch is generally defined as a product announcement that you cannot buy yet. It's frequently used as a business tactic to keep people from buying your competitor's products. If the card is officially announced (and it hasn't), but no product is available, then by my definition, it is a paper launch. However, everyone has their own definition of the term. This article I see more as a technology preview, though nVidia's intent is still probably to keep people from buying an RV870 right now. That's where the line blurs.
A launch date is the date the company claims PRODUCT WILL BE AVAILABLE IN RETAIL CHANNELS.
No "semantics" you whine about or cocka doodle do up will change that.
A LAUNCH date officially as has been for YEARS sonny paw, is when the corp says "YOU CAN BUY THIS" as a private end consumer.
---
Anything ELSE is a showcase, an announcement, a preview of upcoming tech, a marketing plan, ETC.
---
YOU LYING ABOUT THE VERY ABSOLUTE FACTS THAT FOR YEARS HAVE APPLIED, PERIOD, IS JUST ANOTHER RED ROOSTER NOTCH ACQUIRED.
Here in SE Asia, the 5870 GPU is available in abundance in retail channels. If you PayPal me USD450, I can go straight to any of the computer shops I pass on my way to work, buy the card (and a casing that will fit the full-length card), then take pictures and show them to you.
Stop it with the claims that the 5870 launch is just a paper launch. That patently isn't true, and will only make you look stupid.
I'm sure your email box is overflowing with requests, and I'm sure your walk to work will serve all the customers around the world.
Thanks for that great bit of information for those walking to work with you in SE asia, I bet they're really happy.
---
Maybe you should get a Reseller ID, and make that millionaire dream of yours come true, and soon when rooster central flaps it up again, you can prove to the world dry as a bone ain't rice paper.
---
No, one cannot really fathom the insanity, and red rooster doesn't describe the thickness of skull properly at all, merely the size of its contents.
Actually the first person to offer any thought on the matter suggested green goblin, which was a decent attempt, since grizzly bears aren't green, and goblins have a much better chance of being so.
However, if you'd like the actual nvidia equivalent of what you ati red roosters are, I'd be happy to provide some examples for you, which I have not done as of yet, and of course you're all too stupidly rah-rah to even fathom that. That's pretty sad, and only confirms the problem. I'm certain you can't understand, so don't bother yourself.
If you even believed your own pile of fud, you'd go to page 2 I believe it is in the article and see where anand says " sorry that's all we know about the GT300 the game card, nvidia won't tell us anymore"
What he was told is IT'S FASTER THAN 5870, and the cores have already been cut, and the cards already under test.
So we already know, if we aren't a raging red doofus, and of course, that is very difficult for almost everyone here.
Also, this was not an official launch date for NVidia; they never declared it as such, just Anand declared it in his article.
The official launch date for GT300, already spoken about multiple times by the authors of this website, is !!! > THE RELEASE DATE OF WINDOWS 7...
Now, whether nvidia changes their official launch date before then or not, or where the authors got that former information, one can surmise, but changing their AT tune about nvidia in an article title, for a conference and a web video attendance, in order to appease the shamed and embarrassed three-times-in-a-row paper launching ati (4870, 4770, 5870), is not "unbiased" nor is it honest, no matter how much you want it to be.
If a person wants to claim it's a planned LEAK to showcase upcoming tech ( nvidia did this AFTER the GT300 gpu cores reported GOOD YIELD) - and combat fools purchasing the epic failure 5870 instead of waiting for the gold, ok.
Wow wow wow!! You sir must be the most ignorant, manipulative, underappreciating bastard.. sorry for tearing up your world, but you deserve such credentials and a lot more that can be given to people who display your kind of behaviour.
So the only source you can come up with is yourself and you said it here and I quote "If you even believed your own pile of fud, you'd go to page 2 I believe it is in the article and see where anand says " sorry that's all we know about the GT300 the game card, nvidia won't tell us anymore"
What he was told is IT'S FASTER THAN 5870, and the cores have already been cut, and the cards are already under test." Those are your words, not NVIDIA's, not Anand's, not Fudzilla's, not from any other reviewers, but yours.
Therefore, can you please STFU and stop trying to label everyone a red-nosed rooster or whatever the f*** u call them.
P.S. Not everyone appreciates your level of stupidity, and before you can go and say "geez, there goes another one", FYI I'm running my system on an Nvidia card and will buy ATI; and should NVIDIA "physically launch" GT300 and prove it to be better than the already launched and benchmarked 5870, then you can come back and start your ranting. Until then, plug that sh** hole of yours.
Nice consolation speech.
I guess you expected " you're right ", but somehow lying to make you feel good is not in my playbook.
Now, next time don't take it so seriously as to reply, and then, still, be pathetic enough to get it wrong. How's that for a fun deal?
What flame war? It's just a single nut barking at everyone for no reason. If he has a problem with the article he's sure making it difficult to figure out what it is with all his carrying on and red rooster nonsense.
Does anyone (besides the nut) actually care what is said in this article? It's simply something to pass the time with, and certainly not worth getting upset over. Is the nut part of the nvidia marketing machine or merely a troll? It almost seems as if he's writing in a manner as to cover up his true identity. Yes silicondoc, it IS that obvious.
Wow, a conspiracist.
Well, for your edification, you didn't score any points, since the readers here get all uppity about what's in the articles, so they have shown a propensity to care, even if you're just here to pass the time, or lie your yapper off for the convenient line it provides you for this moment.
Usually, the last stab of the sinking pirate goes something like: "It doesn't matter!"
Then Davey Jones proves to 'em it does.
-
Nice try, but the worst problem for you is, it matters so much to you, you think I'm not me. NOW THAT's FUNNY !
ahhahahahaaha
I can't help but note for the record that um, the card isn't out yet, so how can they win when no one knows when you can actually buy one yet? And for the record, I have a 5870 in my system, right now, that can play games....right now. I went to a retail store and bought it. That's how simple it was. I know you've been posting tons of FUD in the other review forums about how it's unavailable etc etc but the fact is, it IS available, and multiple people can own one.
Also, let me state for the record that I have owned nvidia GPUs in the past so that I'm vendor agnostic. I buy whatever solution is available and better. KEY POINTS: AVAILABLE. BETTER.
Am I hearing you right - you say GT300 isn't a paper launch despite there being no cards for sale for the next few months, yet you said the 5870 was AND THERE WERE CARDS FOR SALE WHEN YOU MADE THE COMMENT! I don't care that you couldn't locate one, the simple fact is people had already bought cards from the first trickle (emphasis on the trickle part) and as such made your statement completely invalid.
How much more rubbish are you going to spew from your hole?
(note: I needed to have caps above to make a salient point and not just because I felt like holding the shift key for no particular reason)
roflmao - If you're hearing anything right, you'd keep your text yap shut.
Please show me the LAUNCH information on GT300, there, brainless bubba, the liar. I really cannot imagine you are that stupid, but then again, it is possible.
Congratulations for being a COMPLETE IDIOT AND LIAR! Really, you must work very hard to maintain that level of ignorance. In fact, the requirement to be that stupid exceeds the likelihood that you actually are purely ignorant, and therefore, it is more likely you're a troll. My condolences in either case in all seriousness.
The proposed launch is late November but even Fudzilla concede that any problems will delay this. The earliest we'll see a GT300 on the shelves is just under 2 months. There, that's information for you. I want nVidia to launch GT300 this year but we don't always get what we wish for.
Where did I lie in any of my previous posts? Oh right... I didn't bow down to worship the Green God(dess). If you had any semblance of an open mind or any stability at all, your nose wouldn't need cleaning. Calling me a troll is pure comedy gold and offering me your pity is outstanding to say the least :)
Keep trying. Or don't. Either way, I doubt many people care for your viewpoints anymore.
Oh jeeze, one red rooster who finally gets it.
Congratulations, you're not the dumbest of your crowd.
--
No shirking here comes the QUOTE !
" The proposed launch is late November " !!! whoo hoo !
Now, a proposed launch is not an official launch - try to keep that straight in the gourd, haters, when the time comes.
--
Pass it along to all the screaming tards, won't you please, you talk their language, or perhapos we'll just say you already have, because, by golly, they can believe you.
ROFLMAO
PS THE SILICON IS ALREADY CUT AND IN PRODUCTION!
---
Yes, anand at the bottom of page 1 claims "it's paper" - DECEIVING YOU, since the WAFERS HAVE ALREADY BEEN BURNED AND YIELDS ARE REPORTED HIGH! (in spite of ati's marketing arm lying and claiming "only 9 good cores per wafer" - A BIG FAT LIE NVIDIA POINTED OUT!)
Where have you been with your head in the sand ?
--
So at the bottom of page 1 Anand leaves you dips with the impression "it's all paper" (but the TRUTH is DEVELOPER CARDS ARE ALREADY ASSEMBLED and being DEBUGGED and TESTED); it's just that anand won't get one for 2 months.
---
THEN BY PAGE 2 ANAND CALLS IT A PAPER LAUNCH !
roflmao
Yes, the red rooster himself has convinced himself "today's nvidia LAUNCH" (that LAUNCH word is what anand made up in his deranged mind) is a paper launch "JUST LIKE ATI'S!".
---
It is nothing short of absolutely AMAZING.
The red rooster fan has hornswoggled his own gourd, stated it in false terms, bashed it to be as bad as what ati just did with the 5870, and IT'S NOT EVEN A LAUNCH DAY FOR NVIDIA!
---
Congratulations, the massive bias is SCREAMING off the page. LOL
It's hilarious, to say the least, that the master can be that deluded with his own spew!
Die painfully okay? Prefearbly by getting crushed to death in a
garbage compactor, by getting your face cut to ribbons with a
pocketknife, your head cracked open with a baseball bat, your stomach
sliced open and your entrails spilled out, and your eyeballs ripped
out of their sockets. Fucking bitch
I would love to kick you hard in the face, breaking it. Then I'd cut
your stomach open with a chainsaw, exposing your intestines. Then I'd
cut your windpipe in two with a boxcutter.
Hopefully you'll get what's coming to you. Fucking bitch
I really hope that you get curb-stomped. It'd be hilarious to see you
begging for help, and then someone stomps on the back of your head,
leaving you to die in horrible, agonizing pain. Faggot
Shut the fuck up f aggot, before you get your face bashed in and cut
to ribbons, and your throat slit.
What a sad, lonely little life this Silicon Doc must lead. I struggle to see how this guy can have any friends, not to mention a significant other. Or even people that can stand being in a room with him for long. Prolly the stereotypical fat boy in his mom's basement.
Careful SD, the "red roosters" are out to get you! Its all a conspiracy to overthrow the universe, and you're the only one that knows!
Great article Anand, as always.
Regards,
Law
Vendor agnostic buyer of the best price / performance GPU at the time
They can't get me, they've gotten themselves, and I've just mashed their face in it.
And you're stupid enough to only be able to repeat the more than thousandth time repeated internet insult cleche's, and by your ignorant post, it appears you are an audiophile who sits bouncing around like a retard with headphones on, before after and during getting cooked on some weird dope, a HouseHead, right ? And that of course does not mean a family, doper.
So you giggle like a little girl and repeat what you read since that's all the stoned gourd can muster, then you kiss that rear nice and tight, brown nose.
Don't forget your personal claim to utter innocence, either, mr unbiased.
LOL
Yep there we have it, a househead music doused doped up butt kisser with a lame cleche'd brain and a giggly girl tude.
Golly, what were you saying about wifeless and friendless ?
Your spelling, grammar, and general lack of communication skill lead me to think that you are actually a double agent, it's an act if you will...an ATI guy posing as a really socially stunted Nvidia fan in an attempt to turn people off of Nvidia products solely by the ineptitude of your rhetoric.
I'd hate to have a political conversation with SiliconDoc, but I digress...
Some very interesting information came out in today's previews. Will Fermi be a bigger chip than Cypress? Certainly. Will it be more *powerful* than Cypress? Possibly. Will it be more expensive than Cypress? Probably. Will it have more memory bandwidth than Cypress? Yes.
Will it *play games* better than Cypress? Remains to be seen. Too many factors at play here. We don't know clock speeds. We have no idea if "midrange" Fermi cards will retain the 384-bit memory interface. We have
For all we know, all of Fermi's optimizations will mean great things for OpenCL and DirectCompute, but how many *games* make use of these APIs today? How can we compare DirectX 11 performance with few games and no Fermi silicon available for testing? Most of the people here will care about game performance, not Tesla or GPGPU. Hell, its been years since CUDA and Stream arrived and I'm still waiting for a decent video encoding/transcoding solution.
Even between current cards (NVIDIA and AMD/ATI) the performance crown moves from one game to another - one card could do very well in one game and much worse in another (compared to the competition). As for not yet released cards, performance numbers in games can only be divined, not predicted
I know it seems like SiliconDoc is going on a ranting rage, because he kinda is, but the fact remains that this was a fairly biased article on the part of Anandtech. I've been reading reviews and articles here for a long time, and recently there has been a certain level of prejudice against Nvidia and its products that I haven't noticed on other legitimate review sites. This seems to have been the result of Anandtech getting left out of the loop last year. Throughout the article there is a pretty obvious sarcastic undertone towards what the Nvidia representatives say, and their newly announced GPU. I can only hope that this stops, so that anandtech can return to its former days of relatively balanced and fair reporting, which is all anyone can ask of any legitimate review site. Articles of this manner and tone serve no purpose but to enrage people like SiliconDoc, and hurt Anandtech's image and reputation as a balanced and legitimate tech site.
I see a little bit of the tone, but it seems warranted for a company that has for the last few years over-promised and under-delivered. Very similar to how AMD/ATI was treated up to the release of the 4 series. Nvidia needs to prove (again) that it can deliver a real innovative product priced at an affordable level for the core audience of graphics cards.
Here we are, 7 days after the 5870 launch, and the Egg has 5870s for ~$375 next to GTX 295s at $500. Yet again, ATI/AMD has made it a puzzling choice to buy any Nvidia product over 200 dollars.... for months at a time.
What's puzzling is you are so out of touch, you don't realize the GTX 295s were $406 before ati launched its epic failure; then the GTX 295 rose to $469, and the 5870 review's author explained the pre-launch price in the text, and now you say the GTX 295 is at $500.
Clearly, the market has decided the 5870 is an epic failure, and instead of bringing down the GTX 295, it has increased its value!
ROFLMAO
Awwww, the poor ati failure card drove up the price of the GTX295.
Awww, poor little red roosters, sorry I had to explain it to you, it's better if you tell yourself some imaginary delusion and spew it everywhere.
You are also the person that went into a tirade about nvidia not replacing laptop GPUs with the faulty substrate and instead putting on a heftier fan.
You waxed on about how much you hate nvidia, and how they harmed the children (you claimed to be a teacher of some sort), then you screeched about nvidia reps, wished violence upon them, and claimed you'd love to show them how to do their jobs correctly.
---
That's YOU tamalero.
--
Now it's pretty amazing: I tell the simple plain truth, you deny it a week late, lying for ati, you have your public hate and rage for nvidia on this board, and yet you claim it is I who is a fanboy.
--
One Q: has your raging hatred for nvidia receded, or does lying about the 5870 release give you a sense of vengeful pleasure?
what truth?
you're just inventing random crap your brain somehow imagines in illusions.
and what the hell are you talking about?
I never claimed to be a "teacher"; wished violence? what the hell are you smoking?
harmed the children.. Jesus Christ... are you in some sort of scientologist brainwashing group?
No, they were not already on newegg - listed and greyed out - the first ones available in a trickle - and only today have those listed appeared available; before that it was up for a few seconds, card gone, all GREYED OUT again.
---
Sept. 23rd was launch, this is 7 days later.
They were a WEEK of paper. (no one can fairly count a sickly 1, 2 or half-dozen trickle)
they were grey because they sold out; note, they were on amazon and tigerdirect.com as well. I wouldn't be surprised if newwave and other sites had the 5870 as well.
you're just a person with mental problems who cant really accept anything outside your tiny world.
I woke up on HD5870 launch day.
I logged onto a website in the UK.
I ordered an HD5870.
It shipped the same day.
I had it the next day and have been enjoying it ever since.
Good for you, one of 7 billion, and then again one of perhaps 20, as reported for Europe.
But all you see is yourself, because you're just that selfish. And you're a big enough liar that you even posted your insane smart-aleck stupidity, like a little brat.
Ah, I see, you have no facts to refute me with thus you fall back to unfounded insults safe in the knowledge that you are nothing but a troll hiding behind a keyboard.
Sorry I wasted my time with you, clearly you aren't able to deal with the world in logical terms.
Uhmm... maybe because it is common knowledge that ATI can actually get the 5870 launched properly, with multiple manufacturers on board, and get the retail stores stocked up?
20 for the whole of Europe? What a joke. If I were a millionaire, I could get 20 of those 5870 GPUs easily.
Isn't the internet great. It allows shitheads like yourself to say shit that would, in real life
get your head cracked open.
Hopefully you'll suffer the same fate fucking cunt.
Please turn to the loaded gun in your drawer, put it in your mouth, and pull the trigger,
blowing your brains out. You'll be doing the whole world a favor. Shitbag.
What paper launch? Is Newegg the only place to get one? Here in SE Asia, getting one of these 5870 GPUs is as easy as going to a store, flashing your wad of cash at the cashier, and then returning home with a box with pre-rendered 3D objects/characters on it (and of course an ATI 5870 GPU in it). In fact, a week after the release date there is a glut of them here already, mainly from PowerColor and HIS.
LOL - roflmao - So announce it in the foreign tongue, and move to the next continent when ready, you dummy. They didn't do that. They LIED, again, and failed.
A week late is better than several, or a month or two for the 4870.
You can't buy quantity yet either, but for peons, who cares.
Uhmm... the second language in SE Asia is English. What, just because I can prove to you that the 5870 launch is real, you start to deny it? Are you the typical American who thinks the rest of the world doesn't exist?
You can't prove anything to me, since you won't be proving the GT300 LAUNCHED like the author claimed.
Instead, none of you quacking loons have anything but "foreign nation", no links, and it's too late, and strangely none of you type in the Asian fashion.
LOL
So who the heck knows what you liars are doing anyway.
The paper standard was set by this site and its authors, and the 4870 was paper, the 4770 was paper, and this 5870 was paper, PERIOD, and as of this morning the 5850 was also PAPER LAUNCHED.
What's funny is only you morons deny it.
All the other IT channels admit it.
--
Good for you red roosters here, you're the only ones correct in the world. ( no, you're not really, and I had to say that because you'll believe anything )
Go ask the administrator to check my IP and they can verify that my IP comes from a SE Asia country. Are you accusing me of lying for claiming that I come from a nirvana where 5870 GPU is plentiful?
Is that all you can do?
Fact - 5870 is not paper launch. You cannot even deny this.
Ah, BTW, English in SE Asia is the same as the ones used in America and Europe.
Seriously, what are you on? It has to be some good stuff. I want some.
I like how you go on and on spouting nonsense about how GT300 has 50% more theoretical bandwidth, but without clock speeds there is no way to gauge how much of it will be saturated. In plain speak: without hard numbers, BANDWIDTH ALONE MEANS NOTHING. Sure, nvidia has tons of road, but we have no idea what they are going to drive on it.
About the 5870 being a paper launch: my best friend has had his since the 30th. The day the 5850 launched, I took a look over at newegg at 7 in the evening and they were there, available to order. And you can still order, or go to the store and purchase, either right now!!! That's not a paper launch. Last time I checked, a paper launch is when a product goes live and it's unavailable for over a month.
When anand posts the GDDR bit width and transistor count, and mem, then CLAIMS bandwidth is NOT DOUBLE, it is CLEAR the very simple calculation you 3rd graders don't know is AVAILABLE.
---
IT'S 240 GB/s!
4800 x 384 / 8!
duhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhh
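For anyone following along, the formula being shouted here is peak bandwidth = effective data rate (MT/s) x bus width (bits) / 8. A minimal sketch below; note that 4800 MT/s on a 384-bit bus actually works out to 230.4 GB/s, so the oft-quoted 240 GB/s rumor implies a 5000 MT/s data rate (the Fermi numbers are thread rumors, not confirmed specs):

```python
def gddr5_bandwidth_gbs(data_rate_mts, bus_width_bits):
    """Peak memory bandwidth in GB/s: data rate (MT/s) x bus width (bits) / 8 bits-per-byte."""
    return data_rate_mts * bus_width_bits / 8 / 1000

# HD 5870: 4800 MT/s effective on a 256-bit bus (published spec)
print(gddr5_bandwidth_gbs(4800, 256))   # 153.6 GB/s
# Rumored Fermi: 384-bit bus at 4800 or 5000 MT/s (assumed, not confirmed)
print(gddr5_bandwidth_gbs(4800, 384))   # 230.4 GB/s
print(gddr5_bandwidth_gbs(5000, 384))   # 240.0 GB/s
```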
It's not FUD, it's just that you people are so ignorant it's EASY to have the wool pulled over your eyes.
Or 50% faster than 153 GB/s - still a big gap, but clearly not even nearly double.
It's not FUD, it's just that you trolls are so bad at maths you can't even use a calculator to do basic arithmetic, so it's EASY to have the wool pulled over your eyes.
The author claimed it's not double the former GT200, sir.
In the 5870 review just the other day, the 5870 had a disappointing 153+ GB/s bandwidth, vs the 115 of the 4870 or 124 of the 4890.
--
So you can see that with the 5870 it went up by not much.
--
In this review, the former GT200 referred to has a 112, 127, 141, or 159 GB/s bandwidth, as compared to the MYSTERY number of 240 for the GT300.
So the author claims, in back reference to the ati card, that the nvidia card "also fails" to double its predecessor.
--
I have a problem with that - since this new GT300 is going to be 240 GB/s of bandwidth, nearly 100 GB/s more than the card the author holds up higher and gives a massive break to, the one not being reviewed, the ati 5870.
--
It's bias, period. The author could fairly have mentioned how it will be far ahead of its competition, and be much higher, as its predecessor nvidia card was also much higher.
Instead, we get the cryptic BS that winds up praising ati instead of pointing out the massive LEAD this new GT300 will have in the bandwidth area.
I hope you can understand, but if you cannot, it's no wonder the author does such a thing, as it appears he can snowball plenty with it.
STFU you stupid moron. There's no "bias". The 5870 has a full, in-depth, separate review with full benchmarks. The author didn't do direct comparisons because THERE IS NO CARD TO COMPARE IT WITH TODAY. FERMI ONLY EXISTS ON PAPER--the mere existence of engineering samples doesn't help this review. The author even indicated he wished he had more info to share but that's all Nvidia allowed. How about we wait until a GT300 ships before we start making final judgements, ok?
Wow, you must be the biggest fanboy I've ever seen. It would be funny if it wasn't so sad that you're wasting so much energy on such an insignificant issue while everyone around here just thinks to themselves... "what a total failure".
But hey, on the bright-side, you made me jump off that fence and register, so you might not be as useless as you seem.
Wow, your proof that Fermi exists is a photo of Huang holding up a mock-up of what the new card is going to look like?
If that was a real card, and engineering samples existed, why isn't it actually in a PCI-e slot running something? Why were no functioning Fermi cards actually shown at the conference? Why was the ray-tracing demo performed on a GT200?
Finally, why did Huang say that cards will be ready in "a few short months", if they are actually ready now?
You need to calm down a little. You also need to work on your reading skills and to stop making up controversies where none exist.
Yes, Anand pointed out that the memory bandwidth did not double, but in the very same sentence, he mentions that it did not double for the 5870 either.
Not that I really want to support SD here, but there was working silicon there. It's kind of weird that many sites fail to mention this. Instead, they focus on the mockup.
Go read a few articles on how a card is developed, and you'll have the timeline, you red rooster retard.
I mean really, I'm talking to ignoramussed spitting cockled mooks.
Please, the articles are right here on your red fan website, so go have a read since it's so important to you how people act when your idiotic speculation is easily and absolutely 100% incorrect, and it's PROVEABLE, the facts are already IN.
What facts? What framerates can it manage in Crysis? What scores in 3DMark? How good it is at F@H?
Link us, so we can all be shown the errors of our ways. It's obvious that GT300 has been benchmarked, or at least, it's only obvious to you simply because the rest of us are on a different planet.
You call people idiots, and then when they reply in a sensible manner, you conveniently forget all that and call them biased (along with multiple variations on the red rooster theme). You're like a scratched vinyl record and it's about time you got off this site if you hate its oh-so-anti-nVidia stance that doesn't actually exist except in your head.
Prove us wrong! Please! I want to see those GT300 benchmarks! Evidence that Anandtech are so far up AMD's rear end that nothing else is worth reporting on fairly!
The GTX 285 had 32 ROPs and 80 TMUs for around the same bandwidth as the 5870 with its 32 ROPs and 80 TMUs. Don't be stupid. The GTX will surely need more ROPs and TMUs if they want to keep up in graphics, even with the GPGPU bloat.
You assume that they will use GDDR5 clocked at the same speed as ATI.
They could use higher clocked GDDR5 (meaning even more bandwidth), or lower clocked GDDR5 (meaning less bandwidth).
There's no bandwidth comparison because 1) it's meaningless and 2) it's impossible to make an absolute comparison.
NV will have 50% more bandwidth if the speed of the RAM is the same, but it doesn't have to be the same, it could be higher, or lower, so you can't say what absolute numbers NV will have.
I could make a graph showing equal bandwidth between the two cards even though NV has a bigger bus, or I could make one showing NV having two times the bandwidth despite only a 50% bigger bus.
Both could be valid, but both would be speculative.
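A quick sketch of the point this post is making: the bus widths are known (384-bit vs 256-bit), but the memory clocks are not, so the bandwidth ratio between the two parts can swing anywhere from parity to 2x. The NVIDIA data rates below are purely illustrative assumptions, not leaked specs:

```python
BUS_NV, BUS_ATI = 384, 256    # bits; the only figures actually known
ATI_DATA_RATE = 4800          # MT/s on the 5870 (published spec)

def bw(data_rate_mts, bus_bits):
    return data_rate_mts * bus_bits / 8 / 1000   # GB/s

# Three hypothetical GDDR5 data rates for the NVIDIA part
for nv_rate in (3200, 4800, 6400):
    nv, ati = bw(nv_rate, BUS_NV), bw(ATI_DATA_RATE, BUS_ATI)
    print(f"NV @ {nv_rate} MT/s: {nv:6.1f} GB/s vs ATI {ati:5.1f} GB/s -> ratio {nv / ati:.2f}x")
# 3200 MT/s gives equal bandwidth despite the bigger bus; 6400 MT/s gives double.
```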
Yeah, of course, a 3-billion-transistor core, and it doesn't need much bandwidth, and it won't be used, perhaps, or probably, because the GPU designers haven't a clue, and of course, you do.
---
Another amazing clown with a red nose. You people really should stop typing stupid stuff like that.
There was no performance improvement from increasing the bandwidth of the Athlon 64 processors with the move to DDR2 memory (a theoretical doubling of bandwidth, with about one and a half times the measured bandwidth in the first generation of the processor).
Very interesting of course, and so with your theory it could also be LOWER, with slower GDDR5, but the fact REMAINS, 240 has ALREADY BEEN LEAKED.
So we know what it is, and Anand KNOWS IT'S MORE, BUT NOT DOUBLE, AND SAYS SO!
---
SO WHAT WE HAVE IS A CONVENIENT COVER-UP. PERIOD!
--
JUST NOT AS STUPID AS YOU, THAT'S ALL.
I'm certain you attempted it, as you no doubt love to wallow in blissful ignorance and denial just by mere unchangeable habit.
But having read it, you have nowhere to go. Your mind is already irreparably altered. Congratulations.
Well you can't ignore the discussion, and as far as that goes, you want everyone else to do your will, as you beg for it, with your empty advice, which is by the way, all you provided in the last thread.
So you quack around telling others not to participate. That's your whole gig mary.
I find the stupidity level amazing, as most of you can only spew in and beg each other not to comment, and given the quality of the comments that actually try a counter, I certainly cannot blame you for begging others to surrender ahead of time.
You notice how many new names are here ? lol
It's clear who and what it is that responds, and what level of conduct they are all about.
Now let's see the Sybillic mapes sign off on his/her own glorious advice, which she/he failed to follow already.
Your product isn't on the market yet. There's no recommendation to be heeded. It doesn't matter how fast it is, you can't have one. No one can. If someone wants to get their next-gen performance on now, ATI is what you'll buy.
It's okay though. Not everyone is going to run out and grab an ATI card. There's plenty of people that will wait for hell to freeze over for nVidia to release a new card. Personally, I'll take the Q1 2010 release date as semi-fact, but hell freezing over is my fallback date.
So feel free to continue ranting like you've run out of meds. Insult anyone who has made a comment contrary to your own. It's not really doing you any favors, but that's okay. I'm sure it's having an effect on the opinions of people sitting on the fence. Ah, and just to dig the finger in the wound, you can consider me an ATI shill, just for good measure.
Gosh, you're a sweetheart too, and wrong! WRONG! WRONG!
The red bloviator > "If someone wants to get their next-gen performance on now, ATI is what you'll buy."
Gee, did you lose it that easily, was your heart beating so hard, were you sweaty, and upset, and out of control, and decided it was great to tell me to settle down, when your brain was on FART?
Please see the GTX 295 that BEATS the 5870. The 5870 is PRIOR-GEN performance - as ati's own 4870x2/CF is its equal.
Golly, what a deal, another red rooster massive ruse.
I have no clue what the red rooster thing implies, and I never understood why people called nVIDIA the green goblin. Until now. You, sir, have made it clear to me. They are called the green goblin, because that's where the trolls come from. Like wow. Your partisan and righteous thinking has no merit, no basis except conjecture and criticism. Save a keyboard, chill out and let's see if you can post anything in here without using the words, nVIDIA, ATI, red rooster, green goblin, and anything with ALL CAPS.
It's fine to be passionate about something. But to excessive extents that push everyone else away and leave people ashamed, discouraged and embarrassed; that's not how to win hearts and minds. I can already see you getting riled up over this post telling you to chill out....
Hmmmm, that's very interesting. First you go into a pretend place where you assume green goblin is something "they call" nVIDIA, but just earlier, you'd never seen it in print before in your life.
Along with that little fib problem, you make the rest of the paragraph a whining attack. One might think you need to settle down and take your own medicine.
And speaking of advice, your next paragraph does what you claimed in your first that no one should do, so I guess you're exempt in your own mind.
Y'all... seriously... leave the poor NVidia fanboy alone. His head is probably throbbing with the fact that he found his first website (other than HardOCP) that isn't extremely NVidia biased.
The 5870 is but one single GPU. The 295 is two and costs more. The 4870X2/CF is also a case of two GPUs. A 5870X2 would annihilate everything out there right now, and guess what? 5870 CF does just that. If money is no object, that would be the current option, or 5850s in CF to cut down on power usage and a fair amount of the cost without substantially decreasing performance.
By stating "if someone wants to get their next-gen performance now", of course he's going to point in the direction of ATI as they are the only people with a DX11 part, and they currently hold the single GPU speed crown. This will not be the case in a few months, but for now, they do.
I kinda doubt the 5870x2 blows away GTX 295 quad, don't you?
--
Now you want to whine about cost, too, but then excuse it for the 5870 CF. LOL.
Another big fat riotous red rooster.
Really, you people love lies, and what's bad when it's nvidia is good when it's ati - you just exactly said it!
ROFLMAO
--
Should I go get a 295 quad setup review and show you ?
--
How come you were wrong, whined I should settle down, then came back blowing lies again ?
There's no DX11 ready to speak of, so that's another pale and feckless attempt at the face save, after your excited, out of control, whipped up incorrect initial post, and this follow up fibber.
You need to settle down. "I want you banned"
Finally, you try to pretend you're not full of it, with your spewing caveat of prediction, "this will not be the case in a few months" - LOL
It's NOT the case NOW, but in a few months, it sure looks like it might BE THE CASE NO MATTER WHAT, unless of course ati launches the 5870x2 along with nvidia's SC GT300, which for all I know could happen.
So, even in that, you are NOT correct to any certainty, are you...
LOL
Calm down, and think FIRST, then start on your rampage without lying.
Why would I want to compare a dual GPU setup with an 8 GPU setup? What numpty would do that when it would logically be far faster? Even a quad 5870 setup wouldn't beat a quad 295 setup, and you know what? WE KNOW! 8 cores versus 4 is no contest. Core for core, RV870 is noticeably faster than the GT200 series, but you're the only person attempting to compare a single GPU card to a dual GPU card and saying the single GPU card sucks because it doesn't win.
And where did I say "I want you banned"? As someone once said, "lay off the crack".
Aren't you the one who claimed only ati for the next gen performance ?
Well, you really blew it, and no face save is possible. A single NVIDIA card beats the best single ati card. PERIOD.
It's true right now, and may or may not change within two months.
PERIOD.
No, I said that ATI currently has the single GPU crown. Not card - GPU. In a couple of months, ATI may have the 5870X2 out, and that WILL send the 295 the way of the dodo if it's priced correctly.
^^LOL. I don't see what all the bickering is about. If you're willing to wait a few more months, then you can buy a faster card. If you want to buy now, there are also some nice options available. Currently there are 5 brands of 5870's and 1 5850 at the egg.
I'm sure "it won't be used" because for the very first time "nvidia will make sure it "won't be used" becuase "they designed it that way ! " LOL
--
You people are absolutely PATHETIC.
Now the greater Nvidia bandwidth doesn't matter, because you don't care if it's 999, because... nvidia failed on design, and "it won't be used!"
ROFLMAO
Honestly, if you people heard yourselves...
I am really disappointed that the bias here is so much worse than even I had known, not to mention the utter lack of intellect so often displayed.
What a shame.
Exactly! R600 had huge bandwidth but couldn't effectively use it, for the most part. Is the huge bandwidth the GF300 has only able to be used in cGPU, or is it able to be used in games, too? We won't know till the card is actually reviewed a long while from now.
What a joke. The current GT200 responds in all flavors quite well to memory clock, and hence bandwidth, increases.
You know that, you have been around long enough.
It's great seeing the reds scream it doesn't matter when ati loses a category. (no, actually it isn't great, it's quite sickening)
Yes, of course bandwidth does not really matter when ati loses, got it, red rooster. When nvidia is SO FAR AHEAD in it, it's better to say "it's not double"... LOL
---
WHAT IS WRONG WITH YOU PEOPLE AND THE AUTHOR IS THE REAL QUESTION!
--
What is wrong with you? Why don't you want to know when it's nvidia? When it's nvidia, a direct comparison to ati's card is FORBIDDEN!
That's what the author did!
It was "a very adept DECEPTION"!
---
Just pointing out how you get snowballed and haven't a clue.
Rumors also speculated a 4,000 data rate for the GDDR5:
4000 x 384 / 8 = 192 GB/s of bandwidth, still plenty more than ati's 153.
CLEARLY though, double of 141 (nvidia's former number, also conveniently NOT MENTIONED, since being so close to the 5870's 153 is EMBARRASSING) would be 282...
--
So anand knows it's 240: more than 141, but not quite double, short of 282.
Blimey, I didn't know Ujesh could utter such things. :D When I knew him in 1998 he was much more official/polite-sounding (he was Product Manager for the O2 workstation at SGI; I was using a loaner O2 from SGI to hunt for OS/app bugs - Ujesh was my main contact for feedback).
The poster who talked about availability has a strong point. My brother has asked me to build him a new system next week. Looks like it'll be an Athlon II X4 620, 4GB RAM, 5850, better CPU cooler, with either an AM3 mbd and DDR3 RAM or AM2+ mbd and DDR2 RAM (not sure yet). By heck he's going to see one hell of a speed boost; his current system is a single-core Athlon64 2.64GHz, 2GB DDR400, X1950Pro AGP 8X. :D My own 6000+ 8800GT will seem slow by comparison... :|
ATI's availability will be sorted out soon; NVIDIA's weird design choices that are targeted at anything but graphics won't.
In fact, I have just realized: NVIDIA IS DOING A MATROX!
(forget about graphics, concentrate on a professional niche, subsequently get run over by competitors in its former main market... eventually disappear from the graphics market or become irrelevant? With some luck, RayTracing will be here sooner rather than later, ATI will switch to GPU computing at the right time - as opposed to very much too soon - and we will have a three-player market; until then, ATI domination all over)
What they may have done is take an existing PCB design for something else, and tacked down the parts and air-wired them. It is a faster way to debug a prototype, as well as just drilling a few holes and putting makeshift screws in to test a cooling design before going to the effort of the rest of the support parts before you know if the cooling subsystem is adequate.
IF that is the situation, I feel nVidia should have held off until they were further along with the prototypes, but when all is said and done if they can produce performance in line with the expectations, that would prove they had a working card.
First off, I don't know the truth about a fake or real Tesla being in existence; however, when an article shows a strong emotional bias, I do find it hard to accept the conclusions.
Here is a link to the current Tesla product for sale online:
This clearly shows the existing Tesla card with screws on the end plate. Also, if memory serves, having partial venting on a single slot for the new Tesla card would equal the cooling available on the ATI card. Also, six-pin connector is in roughly the same place.
As for the PCB, it is hidden on the older Tesla screen shots, so nothing can be derived.
The card may be fake, or not, but Charlie is not exactly unbiased either.
What's the deal with that, I keep trying to read Semi's articles, though his 'tude towards MS and Intel is pretty juvenile, but I've got to ask; did somebody at Nvidia gang rape his mom?
I simply assume he is either directly or indirectly on ATI's payroll.
Fudzilla wrote "The real engineering sample card is full of dangling wires." To display such a card to others they could simply epoxy down some connectors and solder the wires to them.
Here's an article from Fudo saying that the card was a mock-up. Nvidia claimed it was real at the conference, and are now saying its a fake, but that they really, truly, had a real one running the demos. Really! I completely believe them.
What makes you think it isn't the right time? You can only really tell in hindsight, but you don't give any reason in your post why you think now is not the right time and later, when AMD does it, will be the right time. I think the right time is whenever the architecture is available and the interest is there. Nvidia has, over the past 5 years, been steadily building the architecture for it. Whether the tools are all in place yet and whether the interest is really there remains to be seen.
It has nothing to do with matrox or any shift to a "professional niche." Nvidia believes that it has the ability to evolve and leverage its products from the niche sector of 3d graphics into a broader and more ubiquitous computing engine.
Do you see any sign of commercial software support? Anybody Nvidia can point to and say "they are porting $important_app to OpenCL"? I haven't heard a mention. That pretty much puts Nvidia's GPU computing schemes solely in the realm of academia (where you can use grad students as cheap, highly-skilled labor). If they could sell something like an FEA package for Pro/ENGINEER or SolidWorks, the things would fly off the shelves (at least I know companies who would buy them, but it might be more of a location bias). If you have to code it yourself, that leaves either academia (which mostly just needs to look at hardware costs) or existing supercomputer users. The existing commercial users have both hardware and software (otherwise they would be "potential users"), and are unlikely to want to rewrite the software unless it is really, really cheaper. Try to imagine all the salaries involved in running the big, big jobs Nvidia is going after and tell me that the hardware is a good place to save money (at the cost of changing *everything*).
I'd say Nvidia is not only killing the graphics (with all sorts of extra transistors that are in the way and are only for double precision), but they aren't giving anyone (outside academia) any reason to use OpenCL. Maybe they have enough customers who want systems much bigger than $400k, but they will need enough of them to justify designing a >400mm² chip (plus the academics, who are buying these because they don't have a lot of money).
"Do you see any sign of commercial software support? Anybody Nvidia can point to and say "they are porting $important_app to openCL"? I haven't heard a mention. That pretty much puts Nvidia's GPU computing schemes solely in the realm of academia"
Well, I do HPC for a living, and I think it's too early to push GPU computing so hard because I've tried to use it, and gave up because it required too much effort (and I didn't know exactly how much I would gain in my particular applications).
I've also tried to promote GPU computing among some peers who are even more hardcore HPC users, and they didn't pick it up either.
If even your typical physicist is scared by the complexity of the tool, it's too early.
(as I'm told, there was a time when similar efforts were needed in order to use the mathematical coprocessor...)
>> If even your typical physicist is scared by the complexity of the tool, it's too early.
This sounds good but it's not accurate. Physicists are interested in physics, and most are not too keen on learning some new programming technique unless it is obvious that it will make a big difference for them. Even then, adoption is likely to be slow due to inertia. Nvidia is trying to break that inertia by pushing GPU computing. First they need to put the hardware in place, and then they need to convince people to use it and put the software in place. They don't expect it to work like a switch. If they think the tools are in place to make it viable, then now is the time to push, because it will ALWAYS require a lot of effort when making the switch.
I do bioinformatics / HPC, and in our field too we have had several good GPU ports for a handful of algorithms, but nothing so great as to drive us to add massive amounts of GPU racks to our clusters. With OpenCL becoming available this year, the programming model is dramatically improved and we will see a lot more research and prototypes of code being ported to OpenCL.
I feel we are still in the research phase of GPU computing for HPC (workstations, a few GPU racks, lots of software development work). I am guessing it will be 2+ years till GPU/stream/OpenCL algorithms warrant widespread adoption of GPUs in clusters. I think a telling example is the RIKEN 12-petaflop supercomputer, which is switching to a completely scalar processor approach (100,000 Sparc64 VIIIfx chips with 800,000 cores):
http://www.fujitsu.com/global/news/pr/archives/mon...">http://www.fujitsu.com/global/news/pr/archives/mon...
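On the programming-model point above, here is a minimal sketch of what a trivial OpenCL port looks like, assuming the PyOpenCL bindings purely for illustration (they are not something from the original discussion). The kernel itself is plain C; the host side is a few buffer copies and one launch.

```python
import numpy as np
import pyopencl as cl

ctx = cl.create_some_context()        # picks any available OpenCL device
queue = cl.CommandQueue(ctx)

a = np.random.rand(1000000).astype(np.float32)
b = np.random.rand(1000000).astype(np.float32)

mf = cl.mem_flags
a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

# The kernel is ordinary OpenCL C: one work-item per array element.
program = cl.Program(ctx, """
__kernel void vadd(__global const float *a,
                   __global const float *b,
                   __global float *out)
{
    int gid = get_global_id(0);
    out[gid] = a[gid] + b[gid];
}
""").build()

program.vadd(queue, a.shape, None, a_buf, b_buf, out_buf)

out = np.empty_like(a)
cl.enqueue_copy(queue, out, out_buf)
assert np.allclose(out, a + b)
```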
Fortesting - Wednesday, October 7, 2009 - link
See this article:http://www.semiaccurate.com/2009/10/06/nvidia-kill...">http://www.semiaccurate.com/2009/10/06/...x260-aba...
Zool - Tuesday, October 6, 2009 - link
Maybe tesla cards in supercomputers which are closed platforms the cuda is better but for anything other commercial OpenCL will be better.This is a CUDA vs OpenCL test from Sisoftware http://www.sisoftware.net/index.html?dir=qa&lo...">http://www.sisoftware.net/index.html?di...ocation=...
The conclusion from that article : We see little reason to use proprietary frameworks like CUDA or STREAM once public drivers supporting OpenCL are released - unless there are features your code depends on that are not included yet; even then, they will most likely be available as extensions (similar to OpenGL) pretty soon.
It wouldnt be bad to see those kind of tests on anadtech. Something like GPUs vs CPUs tests with same code.
Zool - Monday, October 5, 2009 - link
I dont know how others, but the 8 time increase in DP which is one of the pr stunts doesnt seem too much if u dont compare it to the weak gt200 DP numbers. The 5870 has something over 500 GFlops DP and the gt200 had around 80 GFlops DP (but the quadro and tesla cards had higher shader clocks i think). They will be happy if they reach 1.5 times the radeon 5800 DP performance. In this pdf from nvidia site http://www.nvidia.com/content/PDF/fermi_white_pape...">http://www.nvidia.com/content/PDF/fermi...T.Halfhi... they write that the ECC will hawe a performance penalty from 5% to 20% (on tesla cards u will hawe the option to turn it off/on on GT cards it will be turned off).Zool - Monday, October 5, 2009 - link
I also want to add that if the DP has increased 8 times from gt200 than let we say around 650 Gflops, than if the DP is half of the SP (as they state) performance in Fermi than i get 1300 Gflops ???? (with same clock speeds). For GT200 they stated 933 Gflops. Something is wrong here maybe ?Zool - Monday, October 5, 2009 - link
Actualy they state 30 FMA ops per clock for 240 cuda cores in gt200 and 256 FMA ops per clock for 512 cuda cores in Fermi. Which means clock for clock and core for core they increased 4 times the DP performance.SymphonyX7 - Saturday, October 3, 2009 - link
Hi. I'm a long time Anandtech reader (roughly 4 years already). I registered yesterday just because I wanted to give SiliconDoc a piece of my mind but thankfully ended being up being rational and not replying anymore.Now that he's gone. I just want to know what you guys think of Fermi being another big chip. Is it safe to assume that Nvidia is losing more money than ATI on high-end models being sold simply because the GTX cards are much bigger than their ATI counterparts? Moreso now that the HD 58xx cards have been released which are faster overall than any of Nvidia's single-GPU solutions. Nvidia will be forced to further lower the price of their GTX cards. I'm still boggled as to why Nvidia would still cling to really big chips rather than go ATI's "efficiency" route. From what I'm reading, this card may focus more on professional applications rather than raw performance in games. Is it possible that this may simply be a technology demonstrator in the making in addition to something that will "reassure" the market to prevent them from going ATI? I don't know why they should differentiate this much if it's intended to compete with ATI's offerings, unless that isn't entirely their intention...
Nakomis - Saturday, October 3, 2009 - link
Boy can I tell you I really wish SilDoc was still here? Anyone have his email address? I wanted to send him this:http://rss.slashdot.org/~r/Slashdot/slashdot/~3/9J...">http://rss.slashdot.org/~r/Slashdot/sla...es-Fermi...
- Saturday, October 3, 2009 - link
There was no benchmark, not even a demo during the so-called demonstration! This is very pathetic and it looks that Nvidia wont even meet the december timeframe. To debug a chip that doesnt work properly might cost many months. To manufacture a chip another 12 weeks. To develop the infrastructure including drivers and card manufactures another few months. Therefore, late q12010 or even 6/2010 might become realistic for a true launch and not a paperlaunch. What we could see on this demonstration was no more than the paper launch of the paper launch.Nate0007 - Friday, October 9, 2009 - link
Hi, I fully agree with you 100%. You seem to be one of the very FEW people that actually see that or get it.
You know what I can not seem to understand??
How can a few hundred or so people, supposedly knowledgeable about what they are about to see and why they are attending the demonstration, just sit there and listen to one person stand up and make claims about his product with no proof?
I understand how things are supposed to work, but have we all just become so naive that we believe whatever is pushed onto us through the media (i.e. TV, radio, blogs, magazines, etc.) and just accept it all?
I am not saying that what Jen-Hsun showed was NOT a real demo of a working Fermi card; I am just saying that there was and still is NO proof of any sort from anyone who was able to actually confirm or deny that it actually was.
Until Nvidia actually shows a working sample of Fermi, even a so-called rough demo model of it, so long as it's actually real, I will not believe it.
There is a huge difference between someone making claims on the forums of sites like this or on blogs, and someone holding a news conference claiming what they have achieved.
Next thing you know someone will stand up and say they have discovered how to time travel and then show a video of just that.
There is a difference between facts and reality.
bigboxes - Saturday, October 3, 2009 - link
RED ROOSTER! jk :p
FWIW, I'm glad AT banned that fool. Too bad it took 37 pages of fanboi ranting for it to come to fruition. For those that cry that there is no place to discuss this, AT does have a video forum that will not allow this kind of shenanigans. Does anyone wonder if this is Rollo back from the grave?
palladium - Monday, October 5, 2009 - link
Not quite:http://www.dailytech.com/article.aspx?newsid=16410">http://www.dailytech.com/article.aspx?newsid=16410
Scroll down halfway thru the comments. He re-registered as SilicconDoc and barks about his hatred for red roosters (in an Apple-related article!)
johnsonx - Monday, October 5, 2009 - link
that looks more like someone mocking him
- Sunday, October 4, 2009 - link
According to this very link http://www.anandtech.com/video/showdoc.aspx?i=3573...">http://www.anandtech.com/video/showdoc.aspx?i=3573... AMD already presented WORKING SILICON at Computex roughly 4 months ago, on June 3rd. So it took roughly 4 and a half months to prepare drivers, infrastructure and mass production to have enough for the start of Windows 7 and DX11. However, Nvidia wasn't even talking about W7 and DX11, so late Q1 2010 or even later becomes more realistic than December. But there are many more questions ahead: what pricepoint, clockrates and TDP? My impression is that Nvidia has no clue about these questions, and the more I watch this development, the more Fermi resembles the Voodoo5 chip and the V6000 card, which never made it onto the market because of its much too high TDP.
silverblue - Sunday, October 4, 2009 - link
Nah, I expect nVidia to do everything they can to get this into retail channels because it's the culmination of a lot of hard work. I also expect it to be a monster, but I'm still curious as to how they're going to sort out mainstream options due to their top-down philosophy.
That's not to say ATI's idea of a mid-range card that scales up and down doesn't have its flaws, but with both the 4800 and 5800 series, there's been a card out at the start with a bona fide GPU with nothing disabled (4850, and now 5870), along with a cheaper counterpart with slower RAM and a slightly handicapped core (4830/5850). Higher spec single GPU versions will most likely just benefit from more and/or faster RAM and/or a higher core clock, but the architecture of the core itself will probably be unchanged - can nVidia afford to release a competing version of Fermi without disabling parts of the core? If it's as powerful as we're led to believe, it will certainly warrant a higher price tag than the 5870.
Ahmed0 - Saturday, October 3, 2009 - link
Nvidia wants it to be the jack of all trades. However, they are risking being an overpriced master of none. That's probably the reason they give their cards more and more gimmicks to play with each year. They are hoping that the card's value will be greater than the sum of its parts. And that might even be a successful strategy to some extent. In a consumerist world, reputation is everything. They might start overdoing it at some point though.
It's like mobile phones nowadays. You really don't need to have a radio, an mp3-player, a camera or other such extras in it (in fact, my phone isn't able to do anything but call and send messages). But unless you have these features, you aren't considered as competition. It gives you the opportunity to call your product "vastly superior" even though from a usability standpoint it isn't.
SymphonyX7 - Saturday, October 3, 2009 - link
Ahh... I see where you're coming from. I've had many classmates who've asked me what laptop to buy and they're always so giddy when they see laptops with the "Geforce" sticker and say they want it cause they want some casual gaming. Yes, even if the GPU is a Geforce 9100M. I recommended them laptops using AMD's Puma platform and many of them asked if that's a good choice (unfortunately here, only the Macbook has a 9400M GPU and it's still outside many of my classmates' budgets). Seems like brand awareness of Nvidia amongst many consumers is still much better than AMD/ATI's. So it's an issue of clever branding then?
Lifted - Saturday, October 3, 2009 - link
A little late for any meaningful discussion over here as AT let the trolls go for 40 or so pages. I doubt many people can be arsed to sort through it now, so you'd be better off going to a forum for a real discussion of Fermi.neomocos - Saturday, October 3, 2009 - link
if you missed it then here you go ... happy day for all of us. Quote from the comment posted on page 37 by Pastuch:
" Below is an email I got from Anand. Thanks so much for this wonderful site.
-------------------------------------------------------------------
Thank you for your email. SiliconDoc has been banned and we're accelerating the rollout of our new comments rating/reporting system as a result of him and a few other bad apples lately.
A- "
james jwb - Saturday, October 3, 2009 - link
Some may enjoy it, but this unusual freedom that blatant trolls using aggressive, rude language are getting lately is making a mockery of this site.
I don't mind it going on for a while, even 20 pages tbh, it is funny, but at some point I'd like to see a message from Gary saying, "K, SiliconDoc, we've laughed enough at your drivel, tchau, banned! :)"
That's what I want to see after reading through 380 bloody comments, not that he's pretty much gotten away with it. And if he has finally been banned, I'd actually love to know about it in the comments section.
/Rant over.
Gary Key - Monday, October 5, 2009 - link
He is gone, as are a couple of others. We have a new comments system in final development now that should take care of this problem in the future.
Voo - Saturday, October 3, 2009 - link
You may have overlooked it, but there was an edit by an administrator to one of his posts which did exactly what you want ;)
james jwb - Sunday, October 4, 2009 - link
that's good to hear :)
Hxx - Friday, October 2, 2009 - link
By the looks of it, Nvidia doesn't have much going on for this year. If they miss the DX11 boat against ATI then I will pity their stockholders. About the only thing that makes those green cards attractive is their PhysX spiel. Now if ATI would hurry up and do something with that Havok, then dark days will await Nvidia. One way or the other, it's a win-win for the consumer. I just wish their AMD division would fare just as well against Intel.
Zool - Friday, October 2, 2009 - link
I don't want to be too pessimistic, but availability in Q1 2010 is lame late. Windows 7 will come out soon, so people will surely want to upgrade to DX11 by Christmas. Also the OEM market, which is actually the most profitable: Dell, HP and others will have Windows 7 systems and they will of course need DX11 cards by Christmas (AMD will hopefully have all models out by that time). Then of course DX11 games that come out in the future can be optimized for the Radeon 5K now, while for GT300 we don't even know the graphics specs and the only working silicon doesn't even resemble a card.
Very bad timing for Nvidia this time; it will give AMD a huge advantage.
Zool - Friday, October 2, 2009 - link
Actually this could happen if you merge a super GPGPU Tesla card and a GPU and want to sell it as one ("because designing GPUs this big is "fucking hard""). Average people (maybe 95% of all) don't even know what a megabyte or a bit is, let alone GPGPU. They will want to buy a graphics card, not a CUDA card. If AMD and Microsoft make heavy DX11 PR, then even the rest of Nvidia's GPUs won't sell.
PorscheRacer - Friday, October 2, 2009 - link
As with anything hardware, you need the killer software to have consumers want it. DX11 is out now, so we have Windows 7 (which most people are taking a liking to, even gamers) and you have a few upcoming games that people look to be interested in. For GPGPU and all that, well... What do we have as a seriously awesome application that consumers want and feel they need to go out and buy a GPU for? Some do that for F@H and the like, and a few for transcoding video, but what else is there? Until we see that, it's going to be hard to convince consumers to buy that GPU. As it is, most feel IGP is good enough for them...
PorscheRacer - Friday, October 2, 2009 - link
Actually, thinking about this... Maybe if they were able to put a small portion of this into IGP, and include some good software with it, maybe the average consumer could see the benefits more easily and quickly and be inclined to go for that step up to a dedicated GPU?
RXR - Friday, October 2, 2009 - link
DocSilicon, you are one funny-as-hell mental patient! I really hope you don't get banned. You just made reading the comments a whole lot more fun. Plus, it's win-win. You get to satisfy your need to go completely postal at everyone, and we get a funny sideshow.
- Friday, October 2, 2009 - link
Great words but nothing behind them! Fermi is Nvidia's Prescott, or should I say much like the last Voodoo chip that never really appeared on the market? Too many transistors are not good ...
ioannis - Friday, October 2, 2009 - link
Although the Star Trek TNG reference is ok, 'Nexus' should have been accompanied by a Blade Runner reference instead, Nexus-6 :)
- Friday, October 2, 2009 - link
You are all talking too much about technologies. Who cares about this? DX11 from ATI is already available in Japan and they are selling like sex dolls. And why didn't NVIDIA provide any benchmarks? Perhaps the drivers aren't ready, or Nvidia doesn't even know at what clockspeed this monster can run without exhausting your PC's power supply. Fermi is not here yet; it is a concept, not a product. ATI will cash in and Nvidia can only look. And when the Fermi monster finally arrives, ATI will enroll the 5890 and X2 in the luxury class and some other products in the 100-dollar class. Nvidia will always be a few months late and ATI will get the business. It is that easy. Who wants all this Cuda stuff? Some number crunching in the science field, OK. But if it were for PhysX, an add-on board would do. But in reality there was never any run for PhysX. Why should this boom come now? I think Nvidia bet on the wrong card and they will suffer heavily for this wrong decision. They had better have bought VIA or its CPU division instead of PhysX. PhysX is no standard architecture and never will be. In contrast, ATI is doing just what gamers want, and this is where the money is. Where are the gaming benchmarks for FERMI? Nvidia is over!
- Friday, October 2, 2009 - link
With all this Cuda and PhysX stuff Nvidia will have 20-30% more power consumption at any pricepoint and up to 50% higher production costs because of their much bigger die size. ATI will lower the price whenever necessary in order to beat Nvidia in the market place! And when will Nvidia arrive? Yesterday we didn't even see a paperlaunch! It was the announcement of a paperlaunch, maybe in late December, but the cards won't be available until late Q1 2010 I guess. They are so much out of the business, but most people do not realise this.
Ahmed0 - Friday, October 2, 2009 - link
I know for sure SD is from Illinois (his online profiles, which are related to his rants [which in turn are related to each other], point to it).
So, I'm going to go out on a limb here and suggest that SiliconDoc was/is this guy:
http://www.automotiveforums.com/vbulletin/member.p...">http://www.automotiveforums.com/vbulletin/member.p...
A little googling might (or might not) support the fact that he is a loony. Just type "site:forums.sohc4.net silicondoc" and youll find he has quite a reputation there (different site but seems to be the same profile, "handwriting" and same bike)
And that MIGHT lead us to the fact that he MIGHT actually be (currently) 45 and not a young raging teenage nerd called Brian.
Of course... this is just some fun guesswork I did (its all just oh so entertaining).
Ahmed0 - Friday, October 2, 2009 - link
Well... either that or all users called SiliconDoc are arseholes.
k1ckass - Friday, October 2, 2009 - link
I guess silicondoc would eat **** if nvidia says that it tastes good, LOL.
btw, the fermi cards shown appear to be fake...
http://www.semiaccurate.com/2009/10/01/nvidia-fake...">http://www.semiaccurate.com/2009/10/01/nvidia-fake...
and btw, I use an nvidia gtx, but will probably get an hd5870 next week because of all this crap nvidia throws at its consumers.
Pastuch - Friday, October 2, 2009 - link
Below is an email I got from Anand. Thanks so much for this wonderful site.
-------------------------------------------------------------------
Thank you for your email. SiliconDoc has been banned and we're accelerating the rollout of our new comments rating/reporting system as a result of him and a few other bad apples lately.
A-
tamalero - Saturday, October 3, 2009 - link
about time, it was getting boring with the constant "bubba, red roosters, morons..etc.."
sigmatau - Friday, October 2, 2009 - link
.......
SiliconDoc getting banned.... PRICELESS.
PorscheRacer - Friday, October 2, 2009 - link
So it's safe now to post again? Much thanks has to go to Anand for cleaning up the virus that has infected these comments. I mean, it's new tech. Aren't we free to postulate about what we think is going on, discuss our thoughts and feelings without fear of some person trolling us down till we can't breathe? It feels better in here now, so thanks again.
Mr Perfect - Saturday, October 3, 2009 - link
It looks like it's safe... After about 37 pages.
Good job though, it's actually been worse in Anandtech comments than it usually is on Daily Tech! Now that's saying something...
Kougar - Friday, October 2, 2009 - link
Hey Anand:
Just wanted to say thanks for the article. Love the quotes and behind-the-scene views, and in general the ever so informative articles like this that just can't be found elsewhere. So, thank you!
bobvodka - Friday, October 2, 2009 - link
Someone earlier asked if supporting doubles was going to waste silicon; I don't think it will.
If you look at the throughput numbers and the fact that FP64 is half that of FP32 with the SFU disabled, I suspect what is going on is that the FP64 calculations are being done by 2 cores at once, with the SFU being involved in some way (given how it is decoupled from the cores there is no apparent good reason why the SFU should be disabled during FP64 operation).
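A quick back-of-envelope check of that half-rate idea (a sketch only; the shader clock here is an assumption, since Nvidia has not announced Fermi clocks):

```python
# Peak throughput if each of the 512 cores retires one FP32 FMA per clock,
# with FP64 running at half that rate, as the disclosed figures suggest.
# The 1.5 GHz shader clock is an assumed placeholder, not an announced spec.
cores = 512
shader_clock_ghz = 1.5

fp32_gflops = cores * 2 * shader_clock_ghz   # one FMA counts as 2 FLOPs
fp64_gflops = fp32_gflops / 2                # half-rate double precision

print(fp32_gflops, fp64_gflops)              # 1536.0 768.0
```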
A comment was also made re:ECC memory.
I suspect this won't make it to the consumer board; there is no good reason to do so and it would just cost silicon and power for a feature users don't need.
Zool - Friday, October 2, 2009 - link
Maybe the consumer board won't have ECC, but it will still be in the silicon (disabled). I don't think that they will produce two different silicons just because of ECC.
bobvodka - Friday, October 2, 2009 - link
hmmm, you are probably right on that score, and that might aid yield if they can turn it off, as any faults in the ECC areas could be safely ignored.
Chances of them using ECC RAM on the boards themselves I would have said were zero simply due to cost :)
halcyon - Friday, October 2, 2009 - link
Same foundry, same process, many more transistors...
Based on roughly extrapolating scaling from the RV870, how much bigger power draw would this baby have?
The dollar draw from my wallet is going to be really powerful, that's for sure, but how about power?
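A crude, hedged stab at that extrapolation, scaling the HD 5870's board power by transistor count on the same 40 nm process (the transistor counts and the 188 W figure are public numbers; the linear scaling itself is a naive assumption, not a prediction):

```python
# Naive power extrapolation by transistor count, same 40 nm TSMC process.
rv870_transistors = 2.15e9    # Cypress / HD 5870
fermi_transistors = 3.0e9     # as disclosed by Nvidia
hd5870_board_power_w = 188    # AMD's stated maximum board power

naive_fermi_power_w = hd5870_board_power_w * (fermi_transistors / rv870_transistors)
print(round(naive_fermi_power_w))   # ~262 W, before clocks, voltage or architecture differ
```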
deeper - Friday, October 2, 2009 - link
Well, not only is the GT300 months away, but it looks like the card they showed off is a fake anyhoo; check it out at Charlie Demerjian's www.semiaccurate.com
Zool - Friday, October 2, 2009 - link
Could you pls delete the majority of SiliconDoc's replies, and then this one after them? It's embarrassing to read them.
Pirks - Friday, October 2, 2009 - link
I call BS. How many people have 2560x1600 30-inchers? Two? Three? Main point - resolutions are _VERY_ far from being stagnated, they have SOOOOOOOOO _MUCH_ room for growth until 2560x1600, which right now covers maybe 1% of the PC gaming market. 90% of PC gamers still use low-res 1680x1050 if not less (I for one have 1400x1050, yeah shame on me, I don't want to spend $800 on a hi-end SLI setup just to play Crysis in all its hi-res beauty, for.get.it.)
Shame Anand, real shame.
Otherwise top notch quality stuff, as always with Anand.
bigboxes - Friday, October 2, 2009 - link
1680x1050 = low res??? Seriously? That's hi-def bro. I understand you can do better, but for my 20" widescreen it is definitely hi-def.
JarredWalton - Friday, October 2, 2009 - link
I believe what you describe is exactly what is meant by stagnation. From Merriam-Webster: "To become stagnant." Stagnant: "Not advancing or developing." So yeah, I'd say that pretty much sums up display resolutions: they're not advancing.
Is that bad? Not necessarily, especially when we have so many applications that do things based purely on the wonderful pixel instead of on points or DPI. I use a 30" LCD, and I love the extra resolution for working with images, but the text by default tends to be too small. I have to zoom to 150% in a lot of apps (including Firefox/IE) to get what I consider comfortably readable text. I would say that 2560x1600 on a 30" LCD is about as much as I see myself needing for a good, looooong time.
Pirks - Monday, October 5, 2009 - link
No, Jarred, they are advancing towards 2560x1600 on every PC gamer's desk. Since they move towards that (it used to be 800x600 everywhere, now it's more like 1280x1024 everywhere, in a couple of years it'll be 1680x1050 everywhere and so on) they cannot be described as stagnant, hence your statement is BS, Jarred.
mejobloggs - Tuesday, October 6, 2009 - link
I think I'll agree with Jarred on this one.
LCD tech isn't advancing enough to get decent high quality large screens at a decent price. 22" seems about the sweet spot, which is usually 1680x1050.
Pirks - Wednesday, October 7, 2009 - link
This sweet spot used to be 19" 1280x1024 a while ago, with 17" 1024x768 before that. In a couple of years the sweet spot will move to 24" 1920x1200, and so on. Hence monitor resolution does progress, it does NOT stagnate, and you listen to Jarred's fairy tales too much :P
JarredWalton - Friday, October 9, 2009 - link
What we have here is a failure to communicate. My point, which you are ignoring, is that maximum resolutions are "stagnating" in the sense that they are remaining static. It's not "BS" or a "fairy tale", unless you can provide detail that shows otherwise. I purchased a 24" LCD with a native 1920x1200 resolution six years ago, and then got a 30" 2560x1600 LCD two years later. Outside of ultra-expensive solutions, nothing is higher than 2560x1600 right now, is it?
1280x1024 was mainstream from about 7-11 years ago, and 1024x768 hasn't been the norm since around 1995 (unless you bought a crappy 14/15" LCD). We have not had a serious move to anything higher than 1920x1080 in the mainstream for a while now, but even then 1080p (or 1200p really) has been available for well over 15 years if you count the non-widescreen 1600x1200. I was running 1600x1200 back in 1995 on my 75 pound 21" CRT, for instance, and I know people that were using it in the early 90s. 2048x1536 and 2560x1600 are basically the next bump up from 1600x1200 and 1920x1200, and that's where we've stopped.
Now, Anand seems to want even higher resolutions; personally, I'd be happier if we first found a way to make those resolutions work well for every application (i.e. DPI that extends to everything, not just text). Vista was supposed to make that a lot better than XP, but really it's still a crap shoot. Some apps work well if you set your DPI to 120 instead of the default 96; others don't change at all.
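The arithmetic behind that complaint is simple enough to sketch (the diagonal sizes are just the nominal panel sizes mentioned in the thread):

```python
import math

def dpi(width_px, height_px, diagonal_inches):
    # physical pixel density: diagonal pixel count over diagonal size
    return math.hypot(width_px, height_px) / diagonal_inches

print(round(dpi(1680, 1050, 22)))   # ~90 DPI, close to Windows' default 96 DPI assumption
print(round(dpi(2560, 1600, 30)))   # ~101 DPI, so UI drawn for 96 DPI comes out smaller
print(120 / 96)                     # 1.25: the scale factor a 120 DPI setting requests
```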
Pirks - Friday, October 9, 2009 - link
I agree that maximum resolution has stagnated at 2560x1600; my point was that the average resolution of PC gamers is still moving from a pretty low 1280x1024 towards this holy grail of 2560x1600, and who knows how many years will pass until every PC gamer has such a "stagnated" resolution on his/her desk.
So yeah, max resolution stagnated, but average resolution did not and will not.
I am as mad as hell - Friday, October 2, 2009 - link
Hi everyone,
First off, I am not mad as hell! I just registered this acct after being an Anandtech reader for 10+ years. That's right. It's also my #1 tech website. I read several others, but this WAS always my favorite.
I don't know what happened here lately, but it's becoming more and more of a circus in here.
I am going to make a few points suggestions:
1) In the old days reviews were reviews; these days there are a lot more PRE-views and picture shows and blog (chit chatting) entries.
2) In the old days a bunch of hardware was rounded up and compared to each other (mobos, memory, PSUs, etc.). I don't see that here much anymore. It's kind of worthless to me just to review one PSU or one motherboard at a time. Round 'em all up and then let's have a good ole fashioned shootout.
3) I miss the monthly buyer guides. What happened to them? I like to see CPU + Mobo + Mem + PSU combo recommendations in the MOST BANG FOR THE BUCKS categories (Something most people can afford to buy/build)
4) Time to moderate the comments section, but not censorship. My concern is not with f-words, but with trolls and comment abusers. I can't stand it. I remember the old days when a famous site totally self-destructed, and at that time I think it had maybe more readers than Anand (hint: it had the English version of "Tiburon" as part of the domain name), when their forum went totally out of control because it wasn't moderated.
5) Also, time to upgrade the comments software here at Anandtech. It needs an up/down ratings feature that even many newspaper websites offer these days.
shotage - Saturday, October 3, 2009 - link
I agree with the idea of a comments rating system (thumbs up or down). It's a democratic way of moderating. It also saves the need for those short replies when all you want to convey is that you agree or not.
Maybe also an abuse button that people can click on should things get really out of control..?
Skiprudder - Friday, October 2, 2009 - link
The up/down idea is perfect for the site. Now why didn't I think of that! =)
AnnonymousCoward - Friday, October 2, 2009 - link
I don't like commenter ratings, since they give unequal representation/visibility of comments, they affect your perception of a message before you read it, and it's one more thing to look at while skimming comments.Skiprudder - Friday, October 2, 2009 - link
I'm sorry, but I'm going to ask for a ban of SiliconDoc as well. One person has single-handedly taken over the reply section to this article. I was actually interested in reading what other folks thought of what I feel is a rather remarkable new direction that nvidia is taking in terms of gpu design and marketing, and instead there are literally 30 pages of a single person waging a shouting match with the world.
Right now there isn't free speech in this discussion because an individual is shouting everyone down. If the admins don't act, people are likely to stop using the reply section all together.
sandwiches - Thursday, October 1, 2009 - link
Just created an account to say that I've never seen sycophantic, schizophrenic blathering like Silicondoc's. I have been an Nvidia user for the past 7 years or so (simply out of habit) and before that a Voodoo user, and I cannot even begin to relate to this buffoon.
Oh, and to incense him even more, I'd like to add that I bought an HD5870 from newegg a couple nights ago since I needed to upgrade my old 8800 GTS card, NOW... not in a few months or next year. Now. Seeing as how the HD 5870 is the fastest for my buck, I went with that. Forget treating brands like religions. Get whatever's good and you can afford and forget brands. The end.
SiliconDoc - Friday, October 2, 2009 - link
LOL - another lifelong nobody who hasdn't a clue and was a green goblin by habit self reported, made another account, and came in just to "incense" "the debater".Really, you couldn't have done a better job of calling yourself a piece of trash.
Congratulations for that.
You admitted your goal was trolling and fanning the flames.
LOL
That was your goal, and so bright you are, that "sandwiches" came to mind as your name, a complimentary imitation, indicating you hoped to be equal to "Silicon" and decided to take a "tech part" styled name, or, perhaps as likely, it was haphazard chance cause buckey was hungry when he got here, and that's all he could think of.
I think you're another pile of stupid with the C and below crowd, basically.
So, that sort of explains your purchase, doesn't it ?
LOL
Yes, you can't possibly relate, you need more MHZ a lot bigger powr supply "sandwiches".
LOL
sandwiches!
CptTripps - Friday, October 2, 2009 - link
Reading your comments makes me think you are about 12 and in need of serious evaluation. I have never seen someone so out of control on Anand for 30+ pages. Seriously, get your brain checked out, there is some sort of imbalance going on when everyone in the whole world is a liar except you, the only beacon of truth.sandwiches - Friday, October 2, 2009 - link
LMAO
This has to be an online persona for you, Brian. Are you seriously like this in real life?
Voo - Thursday, October 1, 2009 - link
We really need an "ignore this idiot" button :/
Anyways, is there any reason why this has to be a "one size fits all" chip? I mean, according to the article there's a lot of stuff in it which only a minority of gamers would ever need (ECC memory? That's more expensive than normal memory and usually has a performance impact).
I mean there's already a workstation chip, why not a third one for GPU computing?
neomocos - Thursday, October 1, 2009 - link
You know, when someone says you are drunk/stupid/drugged/crazy/etc... then you might question him, but when 2 or 3 people say it then it's most probably true, and when all the people at anand say it then SiliconDoc should just stfu, go right now and buy an ati 5870 and smash it on the ground, and maybe he will feel better and let us be.
I vote for a ban also :) and I donate 10$ for the 5870 we anand users will give him as a present for christmas. Happy new red year Silicon...
andrihb - Thursday, October 1, 2009 - link
I'm excited about this but I wish it was ready sooner. It looks like we'll have to wait 2-3 months for benchmarks, right?
I hope it'll blow the 5870 away because that's what is best for us, the consumers. We'll have an even faster GPU available to us, which is all that really matters.
I've noticed that a person here has been criticizing this article for belittling the fact that nVidia's upcoming GPU is likely going to have vastly superior memory bandwidth to ATI's current flagship. Anand gave us the very limited data that exists at the moment and left most of the speculation to us. He doesn't emphasize that Fermi (which won't even be available for months) has far more bandwidth than ATI's current flagship. I contend that most people already suspected as much.
The vastly superior memory bandwidth suggests that nVidia might just have a 5870 killer up its sleeve. See what I just did there? This is called engaging in speculation. Anand could have done more of that, I agree, but saying that this is proof of Anand's supposed bias towards ATI? That is totally unreasonable.
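To put a rough number on that speculation (the 384-bit interface comes from Nvidia's disclosure; the per-pin data rate is simply assumed to match the HD 5870's GDDR5, since Fermi's memory clocks are unannounced):

```python
# Peak memory bandwidth = bus width in bytes * per-pin data rate.
def peak_gb_per_s(bus_bits, gbps_per_pin):
    return bus_bits / 8 * gbps_per_pin

hd5870 = peak_gb_per_s(256, 4.8)   # 153.6 GB/s, the published HD 5870 figure
fermi  = peak_gb_per_s(384, 4.8)   # 230.4 GB/s if Fermi merely matches that data rate
print(hd5870, fermi)
```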
Hey, Doc, do you want to see a real life, batshit crazy, foaming at the mouth fanboy? All you need is a mirror.
JonnyDough - Thursday, October 1, 2009 - link
"Jonah did step in to clarify. He believes that AMD's strategy simply boils down to targeting a different price point. He believes that the correct answer isn't to target a lower price point first, but rather build big chips efficiently. And build them so that you can scale to different sizes/configurations without having to redo a bunch of stuff. Putting on his marketing hat for a bit, Jonah said that NVIDIA is actively making investments in that direction. Perhaps Fermi will be different and it'll scale down to $199 and $299 price points with little effort? It seems doubtful, but we'll find out next year."FOOL!
So nVidia is going to make this for Tesla. That's great that they're innovating but you mentioned that those sales are only a small percentage. AMD went from competing strongly in the CPU market to dominating the GPU market. Good move. But if there's no existing market for the GPGPU...do you really want to be switching gears and trying to create one? Hmm. Crazy!
SiliconDoc - Thursday, October 1, 2009 - link
Admin: You've overstayed your welcome, goodbye.
I'm not sure where you got your red rooster lies information. AMD/ati has NO DOMINATION in the GPU market.
---
One moment please to blow away your fantasy...
---
http://jonpeddie.com/press-releases/details/amd-so...">http://jonpeddie.com/press-releases/det...ntel-and...
---
Just in case you're ready to scream I cooked up a pure grteen biased link, check back a few pages and you'll see the liar who claimed ati is now profitable and holding up AMD because of it provided the REAL empty rhetoric fanboy link http://arstechnica.com/hardware/news/2009/07/intel...">http://arstechnica.com/hardware/news/20...-graphic...
which states
" The actual numbers that JPR gives are worth looking at,
which show INTEL dominates the GPU market !
Wow, what a surprise for you. I've just expanded your world, to something called "honest".
So:
INTEL 50.30%
NVIDIA 28.74%
AMD 18.13%
others below 1% each.
------------
Gee, now you know who dominates, and for our discussions here, who is IN LAST PLACE! AND THAT WOULD BE ATI THE LAST PLACE LOSER!
--
Now, I wouldn't mind something like ati is competitive, but that DOMINATES thing says it's #1, and ati is :
************* ATI IS IN LAST PLACE ! LAST PLACE! **************
Now please whine about the 3 listed at the link less than 1% each, so you can "pump up ati" by claiming "it's not last, which of course I would welcome, since it's much better than lying and claiming it's number one.
---
I suspect all you crying ban babies are ready to claim to have found absolutely zero information or contributioon in this post.
tamalero - Thursday, October 1, 2009 - link
huge difference between the INTEGRATED GRAPHICS SEGMENT, which is almost every laptop out there and a lot of business computers,
versus the DISCRETE MARKET, which is the GAMING section... where AMD-ATI and NVIDIA are 100%.
get your facts and see a doctor, your delusional attitude is getting annoying.
SiliconDoc - Friday, October 2, 2009 - link
Gee, you certaibnly are not a computer technician.That you even POSIT that games aren't played on INTEL GPU's is an astounding whack.
Nvidia and ati have laptop graphics, as does intel, and although I don't have the numbers for you on slot cards vs integrated, your whole idea is another absolute FUD and hogwash.
INTEL slotted graphics are still around playing games, bubbba.
You sure don't know much, and your point is 100% invalid, and contains the problem of YOUR TINY MIND.
Let me remind you Tamalero, you shrieked and wailed against nvidia for INTEGRATED GRAPHICS of theirs on a laptop !
Wow, you're a BOZO, again.
sandwiches - Friday, October 2, 2009 - link
You're a sad, pathetic troll, Silicondoc. I honestly do feel contempt for you. Your life is obviously devoid of any real substance.
shotage - Thursday, October 1, 2009 - link
The performance and awesomeness of a company compared to another is biased. Sure, Nvidia has probably historically done better than ATI.
I hope that ATI as a company does much better this year and next, so that there is greater competition. Competition which will consequently mean smaller profit margins, but better deals for us consumers!
At the end of the day, who cares who's winning? Shouldn't we all be hoping that each does well? Shouldn't we all hope that there will always be several major graphics providers? Do we really want a monopoly on GPU's? How would this effect the price on a performance card?
I think you should be banned SiliconDoc. You're adding no real value here. Leave.
I have a GTX260 btw. So I'm not speaking from bias. Wonder what kind of card you have? lol...
SiliconDoc - Friday, October 2, 2009 - link
What card you have or don't have doesn't matter one whit, but what you claim DOES. What you SPEW does !and when you lie expect any card to save you.
If liars were banned, you'd all be gone, and I'd be left. ( that does not include of course those who aren't trolling jerkbots running around to my every post wailing and whining and saying ABSOLUTELY NOTHING )
-
" Sure Nvidia probably has historically done better than ATI."
since you 1. obviously haven't got a clue wether what you said is true or not 2. Why would you even say it, with your stupidty retaining, lack of knowledge, or lying caveat "probably" ?
If you're so ignorant, please shut it on the matter! Do you prefer to open your big fat yap and prove how knowledgeless you are ? I guess you do.
If you don't know, why are you even opening your piehole about it ?
It certainly doesn't do anything for me if you aren't correct and you don't know it ! I don't WANT YOUR LIES, nor your pie open when you haven't a clue. I don't want your wishy washy CRAP.
Ok ?
Got it ?
If you open the flapper, make sure it gets it right.
-
If you actually are an enthusiast, why is it that the result is, you blather on in bland generalities, get large facts tossed in a fudging, sloppy, half baked inconclusive manner, and in the end, wind up being nothing less than the very red rooster you demand I believe you are not.
What a crappy outcome, really.---
--
Frankly, you cannot accept me even telling the facts as they actually are, that is too much for your mushy, weak, flakey head, and when I do, you attribute some far out motive to it !
There's no motive other than GET IT RIGHT, YOU IDIOTS !
--
What do you claim, though ?
Why is it, you have such an aversion to FACTS ? WHY IS THAT ?
If I point out ati is not in fact on top, but last, and NVIDIA is almost double ati, (to use the "authors" comparison techniques but not separate companies for "internal comparisons" and make CERTAIN I exagerrate) - why are you so GD bent out of shape ?
I'll tell you why...
YOU FORBID IT.
I certainly don't understand your mindset, you'd much prefer some puss filled bag of slop you can quack out so "we can come to some generalization on our desires and feelings" about "the industry".
Go suck down your estrogen pills with your girlfriends.
---
I don't care what your feelings are, what flakey desire you have for continuing competition, because, you prefer LIES over the truth.
Instead of course, after you whining in some sissy crybaby pathetic wail for the PC cleche of continuing competition, you'll turn around and screech the competition I provide to your censored mindset is the worst kind you could possibly imagine to encounter ! Then you wail aloud "destroy it! get rid of it ! ban it ! "
LOL
You're one piece of filthy work, that's for sure.
---
So, you want me to squeal like an idiot like you did, that you want lower prices and competition, and the way to get that is to LIE about ati in the good, and DISS nvidia to the bad with even bigger lies ?
I see.. I see exactly !
So when I point out the big fat lying fibs for ati and against nvidia - you percieve it as a great threat to "your bottom line" pocketbook.
LOL
Well you know what - TOO BAD ! If the card cannot survive on FACTS AND THE TRUTH, then it deserves to die.
Or is honesty banned so you can fan up ati numbers with your lies, and therefore get your cheaper nvidia card ?
--
This is WHAT YOU PEOPLE WANT - enough lies for ati and against nvidia to keep the piece of red crappin ?
LOL yeah man, just like you jerks...
---
" At the end of the day, who cares who's winning? "
Take a look at that you flaked out JERKOFF, and apply it to this site for the YEARS you didn't have your INSANE GOURD focused on me.
Come on you piece of filth, take a look in the mirror !
It's ALL ABOUT WHOSE WINNING HERE.
THE WHOLE SITE IS BASED UPON YOU LITTLE PIECE OF CRAP !
---
And of course worse than that, after claiming you don't care whose winning, you go on to spread your hope that ati market share climbs, so you can suck down a cheapo card with continuing competition.
So what that says, is all YOU care about is your money. MONEY, your money.
"Quick ban the truth! jerkoffs pocketbook is threatened by posts on anandtech because this poster won't claim he wants equal market share !"
--
Dude, you are disgusting. You take fear and personal greed to a whole new level.
sandwiches - Friday, October 2, 2009 - link
For some laughs at this poor excuse for a 29-year-old man, check out these links:
Here, he was banned from driverheaven.com for his rants against ATI. He really is a truly rabid ATI hater with horribly sycophantic traits:
http://www.driverheaven.net/members/silicondoc.htm...">http://www.driverheaven.net/members/silicondoc.htm...
More rants by him about ATI and how great NVIDIA is:
http://www.maximumpc.com/user/silicondoc">http://www.maximumpc.com/user/silicondoc
http://forums.bit-tech.net/search.php?searchid=853...">http://forums.bit-tech.net/search.php?searchid=853...
Here's some political rant by Silicondoc. Is anyone surprised to learn he's also a rabid wingnut?
http://www.danielpipes.org/comments/118289">http://www.danielpipes.org/comments/118289
SiliconDoc - Friday, October 2, 2009 - link
Why that's great, let's see what you have for any rebuttal to this clone you claim is me:" Even the gtx260 uses less power than the 4870.
Pretty simple math - 295 WINS.
Cuda, PhysX, large cool die size, better power management, game profiles out of the box, forced SLI, better overclocking, 65% of GPU-Z scores and marketshare, TWIMTBP, no CCC bloat, secondary PhysX card capable, SLI monitor control "
---
Anything there you can refute ? Even just one thing ?
I'm not sure what your complaint is if you can't.
That's the text that was there, so why didn't you read it or try to claim anything in it was wrong ?
Are you just a little sourpussed spasm boy red, or do you actually have a reason ANY of that text is incorrect ?
Anything at all there ? Are you an empty shell who copies the truth then whines like a punk idiot ? Come on, prove you're not that pathetic.
wifiwolf - Thursday, October 1, 2009 - link
Less margins indeed. From a business point of view, though, that's good anyway, not because lower margins are good for business (of course not) but for what they imply. It means factories can keep on maximum production - and that's very important. There's less profit on each sale, but the number of consumers grows in an exponential way. So not bad indeed... even for them - good for all.
Zingam - Thursday, October 1, 2009 - link
Fermi he has
silverblue - Thursday, October 1, 2009 - link
Doesn't matter mate, he'll still just accuse you of bias and brown-nosing ATI.
Jamahl - Thursday, October 1, 2009 - link
Silicondoc, go see a REAL doc please.
457R4LDR34DKN07 - Thursday, October 1, 2009 - link
A few points I have about this chip. First, it is massive, which leads me to believe it is going to be hot and use a lot of power (depending on frequencies). Second, it is a one-size-fits-all processor and not specifically a graphics processor. Third, it is going to be difficult to make with decent yields, i.e. expensive, and will be hard to scale performance up. I do believe it will be fast due to cache, but redesigning cache will be hard for this monolith.
silverblue - Thursday, October 1, 2009 - link
It should take the performance crown back from ATI but I'm worried that it's going to be difficult to scale it down for lesser cards (which is where nVidia will make more of its money anyway).
When it's out and we can compare its performance as well as price with the 58x0 series, I'll be happier. Choice is never a bad thing. I also don't want nVidia to be too badly hurt by Larrabee so it's in their best interests to get this thing out soon.
AnnonymousCoward - Thursday, October 1, 2009 - link
The Atom is for mobile applications, and Intel is still designing faster desktop chips. The "Atom" of graphics is called "integrated", and it has been around forever. There's no reason to believe that PC games of 2010 won't require faster graphics.
The fact that nvidia wants to GROW doesn't mean their bread-and-butter business is going away. Every company wants to grow.
If Fermi's die size is significantly increased by adding stuff that doesn't benefit 3D games, that's a problem, and they should consider 2 different designs for Tesla and gaming. Intel has Xeon chips separate, don't they?
If digital displays overcome their 60Hz limitation, there will be more incentive for cards to render more than 60fps.
Lastly, Anand, you have a recurring grammar problem of separating two complete sentences with a comma. This is hard to read and annoying. Please either use a semicolon or start a new sentence. Two examples are on page 8, the sentences that begin with "Display resolutions" and "The architecture". Aside from that, excellent article as usual.
Ananke - Thursday, October 1, 2009 - link
Actually, NVidia is a great company, as is AMD. However, NVidia cards recently tend to be more expensive compared to their counterparts, so WHY would somebody pay more for the same result?
If and when they bring that Fermi to the market, and if that thing is $200 per card delivered to me, I may consider buying. Most people here don't care if NVidia is capable of building supercomputers. They care whether they can buy a decent gaming card for less than $200. Very simple economics.
SiliconDoc - Thursday, October 1, 2009 - link
I'm not sure, other than there's another red raver ready on repeat, but if all that you and your "overwhelming number" of fps freaks care about is fps dollar bang, you still don't have your information correct.Does ATI have a gaming presets panel, filled with a hundred popular games all configurable with one click of the mouse to get there?
Somehow, when Derek quickly put up the very, very disappointing new ati CCC shell, it was immediately complained about from all corners, and the worst part was lesser functionality in the same amount of clicks. A drop down mess, instead of a side spread nice bookmarks panel.
So really, even if you're only all about fps, at basically perhaps a few frames more at 2560x with 4xaa and 16aa on only a few specific games, less equal or below at lower rez, WHY would you settle for that CCC nightmare, or some other mushed up thing like ramming into atitool and manually clicking and typing in everything to get a gaming profile, or endless jacking with rivatuner ?
Not only that, but then you've got zero PhysX (certainly part of 3d gaming), no ambient occlusion, less GAME support with TWIMTBP dominating the field, and no UNIFIED 190.26 driver, but a speckling hack of various ati versions in order to get the right one to work with your particular ati card ?
---
I mean it's nice to make a big fat dream line that everything is equal, but that really is not the case at all. It's not even close.
-
I find ragin red roosters come back with "I don't run CCC !" To which of course one must ask "Why not ? Why can't you run your "equal card" panel, why is it - because it sucks ?
Well it most definitely DOES compared to the NVidia implementation.
--
Something usually costs more because, well, we all know why.
Divide Overflow - Thursday, October 1, 2009 - link
Agreed. I'm a bit worried that this monster will cost an arm and a leg and won't scale well into consumer price points.
Kingslayer - Thursday, October 1, 2009 - link
Silicon duck is the greatest fanboy I've ever seen; maybe less anonymity would quiet his rhetoric.
http://www.automotiveforums.com/vbulletin/member.p...">http://www.automotiveforums.com/vbulletin/member.p...
tamalero - Thursday, October 1, 2009 - link
well that answers everything; when someone has to spam the "catholic" bit, he must be a bible-thumper who only spreads a single thing and doesn't believe nor accept any other information, even with confirmed facts.
SiliconDoc - Thursday, October 1, 2009 - link
What makes you think I'm "catholic" ?And that's interesting you've thrown out another nutball cleche', anyway.
How is it that you've determined that a "catholic" doesn't accept "any other 'even confirmed' facts" ? ( I rather doubt you know what Confirmation is, so you don't get a pun point, and that certainly doesn't prove I'm anything but knowledgeable. )
Or even a "Bible thumper" ?
Have you ever met a bible thumper?
Be nice to meet one some day, guess you've been sinnin' yer little lying butt off - you must attract them ! Not sure what proives either, other than it is just as confirmed a fact as you've ever shared.
I suppose that puts 95% of the world's population in your idiot bucket, since that's low giving 5% to athiests, probably not that many.
So in your world, you, the athiest, and your less than 5%, are those who know the facts ? LOL
About what ? LOL
Now aren't you REALLY talking about, yourself, and all your little lying red ragers here ?
Let's count the PROOFS you and yours have failed, I'll be generous
1. Paper launch definition
2. Not really NVIDIA launch day
3. 5870 is NOT 10.5" but 11.1" IN FACT, and longer than 285 and 295
4. GT300 is already cooked and cards are being tested just not by your red rooster master, he's low down on the 2 month plus totem pole
5. GT300 cores have a good yield
6. ati cores did/do not have a good yield on 5870
7. Nvidia is and has been very profitable
8. ati amd have been losing lots of money, BILLIONS on billions sold BAD BAD losses
9. ati cores as a general rule and NEARLY ALWAYS have hotter running cores as released, because of their tiny chip that causes greater heat density with the same, and more and even less power useage, this is a physical law of science and cannot be changed by red fan wishes.
10. NVIDIA has a higher market share 28% than ati who is 3rd and at only 18% or so. Intel actually leads at 50%, but ati is LAST.
---
Shall we go on, close minded, ati card thumping red rooster ?
---
I mean it's just SPECTACULAR that you can be such a hypocrit.
tamalero - Friday, October 2, 2009 - link
1.- Since when are the yields of the 5870 lower than the GT300's? They use the same tech, and since the 5870 is a less complex, smaller core, it will obviously have HIGHER YIELDS. (Also, where are your sources? I want facts, not your imaginary friend who tells you stuff.)
2.- Nvidia wasn't profitable last year, when they got caught shipping defective chipsets and were forced by ATI to lower the prices of the GT200 series.
3.- Only Nvidia said "everything is OK" while demonstrating no working silicon; that's not the way to show that the yields are "OK".
4.- Only the AMD division is losing money; ATI is earning and will for sure earn a lot more now that the 58XX series are selling like hot cakes.
5.- 50% only if you count the INTEGRATED market, which is not the focus of ATI; ATI and NVidia are mostly focused on discrete graphics.
Intel has 0% of the discrete market.
And actually Nvidia would be the one to disappear first, as they don't have exclusivity for their SLI and Core i7/Core i5 chipsets,
while ATI can with no problem produce stuff for their AMD mobos.
And dude, I might be from another country, but at least I'm not trying to spit insults every second unlike you, especially when proven wrong with facts.
please, do the world a favor and get your medicine.
SiliconDoc - Thursday, October 1, 2009 - link
Well that's quite a compliment, thank you.Since you pretend to be a Kingslayer, I have to warn you, you have failed.
I am King, as you said, and in this case, your dullard's, idiotic attempt at another red raging rooster "silence job", has utterly failed.
Now, did you see that awesome closeup of TESLA ? You think that's carbon fiber at the bracket end ? Sure looks like it.
I wonder why ati only has like "red" "red" "red" "red" and "red" cards, don't you ?
I mean I never really thought about it before, but it IS LAME. It's like another cheapo low rent cost savings QUACK from the dollar undenominated lesser queendom of corner cutting, ati.
--
Gee, thank you for the added inspiration, as you well have noticed, the awful realities never mentioned keep coming to light.
At least one of us is actually thinking about videocards.
Not like you'll ever change that trend, so the King's Declaration is: EPIC FAIL !
tamalero - Thursday, October 1, 2009 - link
hu.. my 3870 from visiontek had a black PCB :|
SiliconDoc - Thursday, October 1, 2009 - link
Well that's actually kinda cool, but was the rest of it covered in that sick ati red ? That card is a heat monster with its tiny core, so we know it was COVERED in fannage and plastic, unless it was a supercheap single slot that just blasted it into the case. BTW, in order to comment you've exposed another notch in your red fannage purchase history.
tamalero - Friday, October 2, 2009 - link
wait.. WHAT? o_O You don't make any sense.
silverblue - Thursday, October 1, 2009 - link
There's an nVidia card sat on a desk at work. It's got a red PCB.
SiliconDoc - Thursday, October 1, 2009 - link
Yes, once again you made my point for me, and being so ignorant you failed to notice ! I mean do you guys do this on purpose, or are you just that stupid ? If you have a red nvidia card on the desk at work, that shows nvidia is flexible in color schemes, unlike 'red' red red red red red red rooster cards !
You do realize you made an immense BLUNDER now, don't you ?
You thought I meant the color RED was awful.
lol
Man you people just don't have sense above a tomato.
silverblue - Friday, October 2, 2009 - link
Just because the reference cards are red, doesn't mean the manufacturers have to make them so. In fact, ASUS released a 4890 with a black PCB.
You've now descended from arguing about the length of the card and the power of its GPU to the colour of the PCB. Considering it's under a damned cooling solution, how does this matter?
engineer1 - Thursday, October 1, 2009 - link
Anand hates Nvidia because they competed against his former lover AMD, and compete against his current lover Intel. Anand you are such a rotten spoiled brat. Since this website fell into your lap you could at least make an effort to act responsibly. Have you EVER held a job working for someone else? I doubt it. And you should ban the other spoiled brats who apparently work for Microsoft and spend about 8 hours a day dominating everything posted on Dailytech such as TheIdiotNickDanger and Evil666. Bunch of cretins.
gx80050 - Friday, October 2, 2009 - link
Die painfully okay? Prefearbly by getting crushed to death in a
garbage compactor, by getting your face cut to ribbons with a
pocketknife, your head cracked open with a baseball bat, your stomach
sliced open and your entrails spilled out, and your eyeballs ripped
out of their sockets. Fucking bitch
I would love to kick you hard in the face, breaking it. Then I'd cut
your stomach open with a chainsaw, exposing your intestines. Then I'd
cut your windpipe in two with a boxcutter.
Hopefully you'll get what's coming to you. Fucking bitch
I really hope that you get curb-stomped. It'd be hilarious to see you
begging for help, and then someone stomps on the back of your head,
leaving you to die in horrible, agonizing pain. Faggot
Shut the fuck up f aggot, before you get your face bashed in and cut
to ribbons, and your throat slit.
papapapapapapapababy - Thursday, October 1, 2009 - link
lol, look at the finger, is this shopped or what! there is no card, FAKE!
http://img34.imageshack.us/img34/3594/fermi1.jpg
SiliconDoc - Thursday, October 1, 2009 - link
Sweet ! Nice pic, looks like carbon fiber at the bracket end. Wowzie, a real honker based on THOUSANDS OF DOLLARS of tech and core per part.
I feel SO PRIVILEGED to have a chance at the gaming segment version, all that massive power jammed into a gaming card !
Whoo! U P S C A L E !
justaviking - Thursday, October 1, 2009 - link
Look at every bright area of high contrast. All the spotlight reflections have a red ring around them. So the thumb, in front of the highly reflective gold connectors, also has the same halo effect. I think that it's as much evidence of a digital camera as it is Photoshop manipulation.
With that said, it could also be a non-functional mock-up. Holding a mock-up or prototype in your hand is not the same as benchmarking a production (ready for consumer release) product.
papapapapapapapababy - Thursday, October 1, 2009 - link
look at those irregular borders closely (above the watch). Also, the shadows (finger) are off. That's a (terrible) shop.
v1001 - Thursday, October 1, 2009 - link
All they did was blacken out the background more. Probably there was more noise and distraction going on that they didn't want in there.
justaviking - Thursday, October 1, 2009 - link
OK, so assuming it's a fake (and I'm not saying it isn't), I have three questions:
1) Where did you get the photo?
2) Why do it? (And "Who did it?", but that's closely related to Q1.)
3) Where did they get the photo of the hardware, which they then put into the person's hand?
Combining #2 and #3) If the card is from a real photo of real hardware, then what was the value of photoshopping it into someone's hand?
I'm not trying to argue, just trying to understand.
papapapapapapapababy - Thursday, October 1, 2009 - link
more fakes! source: bit-tech (this one is even "better") http://i34.tinypic.com/34inz9j.jpg
also, not mine ( from xnews)
http://img28.imageshack.us/img28/2883/tesafilm.png
papapapapapapapababy - Thursday, October 1, 2009 - link
also below the card... what's that sloppy white trim in the middle of a shadow? JAaAAA
UNCjigga - Thursday, October 1, 2009 - link
Seriously? I have a 1080p monitor and a Radeon 4670 with UVD2, but my PS3 outputting 1080p to the same monitor looks MUCH better at upscaling DVDs (night and day difference). PowerDVD does have better upscaling tech, but that's using software decoding. Can somebody port ffdshow/libmpeg2 to CUDA and ATI Stream (or DirectCompute?) kthxbye
Pastuch - Thursday, October 1, 2009 - link
I buy two videocards per year on average. I've owned an almost equal number of ATI/Nvidia cards. I loved my geforce 8800 GTX despite it costing a fortune, but since then it's been ALL downhill. I've had driver issues with home theater PCs and Nvidia drivers. I've been totally disappointed with Nvidia's performance with high-def audio formats. The fact that the entire ATI 48xx line can do 7.1 audio pass-through while only a handful of Nvidia videocards can even do 5.1 audio pass-through is just sad. The world is moving to home theater gaming PCs and Nvidia is dragging arse.
The fact that the 5850 can do bitstreaming audio for $250 RIGHT NOW and is the second fastest single-GPU solution for gaming makes it one hell of a product in my eyes. You no longer need an Asus Xonar or Auzentech soundcard, saving me $200. Hell, with the money I saved I could almost buy a SECOND 5850! Let's see if the new Nvidia cards can do bitstreaming... if they can't then Nvidia won't be getting any more of my money.
P.S. Thanks Anand for inspiring me to build the hometheater of my dreams. Gaming on a 110 Inch screen is the future!
SiliconDoc - Thursday, October 1, 2009 - link
Well that's very nice, and since this has been declared the home of "only game fps and bang for the buck matter", and therefore PhysX, ambient occlusion, CUDA, and other nvidia advantages, and your "outlier" htpc desires are WORTHLESS according to the home crowd, I guess they can't respond without contradicting themselves, so I will, considering I have always supported added value and have been attacked for it.
--
Yes, throw out your $200 sound cards, or sell them, and plop that heat monster into the tiny unit, good luck. Better spend some on after market cooling, or the raging videocard fan sound will probably drive you crazy. So another $100 there.
Now the $100 you got for the used soundcard is gone.
I also wonder what sound chip you're going to use then when you aren't playing a movie or whatever, I suppose you'll use your motherboard sound chip, which might be a lousy one, and definitely is lousier than the Auzentech you just sold or tossed.
So how exactly does "passthrough" save you a dime ?
If you're going to try to copy Anand's basement theatre projection, I have to wonder why you wouldn't use the digital or optical output of the high end soundcard... or your motherboards, if indeed it has a decent soundchip on it, which isn't exactly likely.
-
Maybe we'll all get luckier, and with TESLA-like massive computing power we'll get an NVIDIA Blu-ray movie player/converter that runs on the holy grail of the PhysX haters, OpenCL and/or DirectCompute, and you'll have to make do with the better sound of your add-on sound cards anyway, instead of using a videocard as a transit device.
I can't imagine "cable management" as an excuse either, with a 110" curved screen home-built theatre room...
---
Feel free to educate me.
silverblue - Thursday, October 1, 2009 - link
People will buy nVidia hardware for their HTPCs regardless of it having PhysX, AO, CUDA or whatever. Price is a very attractive factor, but so is noise and temperature, so people will go for what suits them the best. If people think nVidia offers more for the price, they will buy it, some may go for another option if they want less heat, or less speed or whatever. It's their choice, and not one made out of malice.
This thread isn't full of nVidia-haters like you want to believe it is. Keep thinking that if you feel more comfortable doing so. In the end, we as consumers have a choice as to what we buy and nothing of what you are saying here has any bearing on that decision making process.
SiliconDoc - Thursday, October 1, 2009 - link
I think I'll just ignore you, since you seem to have acquired a Svengali mind-read with your big "we" extension, and somehow think you represent every person here.
I don't put any stock in your idiotic lunatic demi-god musings.
--
If you ever say anything worth more than a piece of scat, I will however respond appropriately.
I'll remind you, you can't even prevent YOURSELF from being influenced by me, let alone "everyone here".
Now if you don't have any KNOWLEDGE on the HTPC issues and questions I brought up with this other poster and his HTPC dreams, please excuse your mind reading self, and keep yourself just as deluded as possible.
I find this a classic IDIOCY : " we as consumers have a choice as to what we buy (oh no problem there)
and nothing of what you are saying here has any bearing on that decision making process. "
You just keep telling yourself that, you unbelievably deranged goofball. LOL, and maybe it will become true for you, if you just keep repeating it.
The first sign of your own cracked shield in that area is you actually saying that. You've already been influenced, and you're so goofy, you just had to go in text and claim no one ever will be.
I mean, you are so much worse than anything I've done here it is just amazing.
How often do you tell yourself fantasies that there is no chance you can possibly believe or prove, and in fact, have likely already failed yourself ?
Really, I mean absolutely.
silverblue - Friday, October 2, 2009 - link
If you had a mind left to form any sort of coherent thought patterns, we might take you seriously here. You have just admitted (in your own incoherent, babbling way) that you are trying to actively (and forcibly, I might add) influence people to buy nVidia cards over ATI. I'm telling you that you've failed and will continue to fail as long as you keep shimmying up and down the green flag pole in the name of progress. I wonder if anyone at nVidia reads these comments; what must they think of you? If they considered AT a biased publication then they wouldn't speak with Anand as cordially as they do.
I say "we" because, unless you've opened your eyes, "we" as a community are becoming even more united against no-brained deluded fanboys such as yourself. We DON'T hate nVidia; a lot of people here own nVidia cards, some only have nVidia cards, some own nVidia and ATI, and some own ATI. This isn't about hatred, bias or misinformation; this is about one socially inept weasel who has been attempting to shove his knowledge down everyone else's throats on this (and other subjects) whether there's any factual basis to it or not.
You disagree with me using the term "we", fine. I personally want to see the GT300 launch. I personally want nVidia to bring out a mainstream flavour to compete with the 5850. I personally want prices to fall. I personally don't have anything against PhysX, CUDA or AO. I personally want to see 3D gaming gather momentum.
Now ask yourself - can you be as objective and impartial as that?
You just seem to read what you like and completely miss the point of any post you reply to. There's no way someone can be impartial on this site with you around because any word of praise about ATI equates to bias in your head.
There's only so far you can go before someone clicks the Ban button but I'm sure you'll come back with another account.
shotage - Thursday, October 1, 2009 - link
Still voting to get you banned SiliconDoc.
Zool - Thursday, October 1, 2009 - link
I don't think it's fair of nvidia to make everyone pay the extra design and manufacturing cost of the GPGPU bloat. They launched the Tesla cards because those cost insane money, so they can get away with current yields. For the majority of graphics work, plain SIMD with almost no branching is utterly enough. I mean, if they made standalone CUDA cards without the useless graphics pipeline parts, the chip could be smaller or faster. And that goes for graphics too. I also wonder how hard it would be for AMD or Intel to put some similar low-transistor-budget SIMD units into the CPU pipeline, like the ones in a GPU. They could run at CPU clocks and would be an integral part of the CPU (latencies, cache etc.).
I don't think that's the right strategy for nvidia.
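As a rough illustration of the CPU-side SIMD idea in the comment above, here is a minimal sketch using SSE intrinsics (a purely hypothetical example, not anything shipped by NVIDIA, AMD or Intel): four floats are processed per instruction on the CPU, the same data-parallel model a GPU applies across hundreds of lanes.

#include <xmmintrin.h>  // SSE intrinsics
#include <cstdio>

int main() {
    // Two small arrays of floats; a GPU would do this across thousands of elements.
    float a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    float b[8] = {10, 20, 30, 40, 50, 60, 70, 80};
    float c[8];

    // Process 4 floats per instruction: data-parallel SIMD on the CPU,
    // conceptually the same thing a GPU shader core does, just much narrower.
    for (int i = 0; i < 8; i += 4) {
        __m128 va = _mm_loadu_ps(&a[i]);
        __m128 vb = _mm_loadu_ps(&b[i]);
        __m128 vc = _mm_add_ps(va, vb);
        _mm_storeu_ps(&c[i], vc);
    }

    for (int i = 0; i < 8; ++i)
        printf("%.0f ", c[i]);
    printf("\n");
    return 0;
}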
silverblue - Thursday, October 1, 2009 - link
nVidia could charge a premium for the Tesla-badged cards due to their potential savings over the more traditional method of using masses of general-purpose servers; however, they may want to really establish Tesla as a viable option, so they can't very well charge too much for it.
I'm interested in seeing the peak performance figures for both Cypress and Fermi; will the AMD part still have an advantage in raw processing power due to having many more, if weaker, SPs/cores? And will it matter in the working environment?
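For rough back-of-the-envelope numbers on that raw-throughput question (the Fermi shader clock is not announced, so the 1.5 GHz figure below is a pure assumption, not an official spec): peak single-precision throughput is roughly shader count x 2 FLOPs (multiply-add) x clock.

#include <cstdio>

// Peak SP GFLOPS = shaders * 2 FLOPs per fused multiply-add * clock in GHz.
double peak_gflops(int shaders, double ghz) { return shaders * 2.0 * ghz; }

int main() {
    // Cypress clock is the announced 850 MHz; the Fermi clock is a guess.
    printf("Cypress (1600 SPs @ 0.85 GHz): %.0f GFLOPS SP\n", peak_gflops(1600, 0.85)); // ~2720
    printf("Fermi (512 cores @ 1.5 GHz, assumed): %.0f GFLOPS SP\n", peak_gflops(512, 1.5)); // ~1536
    return 0;
}

On these assumptions Cypress keeps the raw peak advantage; whether that matters depends on how well real workloads fill those wider but weaker units.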
Zool - Thursday, October 1, 2009 - link
Nvidia's dreams of 500x performance in the coming years are actually only for GPGPU, not graphics. The current-gen cards are beginning to show some strange scaling (I think nvidia won't be different in this case either).
They will need more changes than just doubling everything if they want to utilize more shader processors. If you think about it, feeding 1600 shaders (actually 320 five-wide units is more realistic) at 850 MHz is quite a transistor challenge. (CPUs look like baby toys next to these, with their large caches and far fewer cores.)
Actually there are physical limits to transistors too. Increasing to 4 billion transistors and 3200 shaders in the next card would need even more internal speed. It would maybe be easier to place two rv870 dies in one GPU than to double everything.
neomocos - Thursday, October 1, 2009 - link
We all like our freedom of opinion at anand, and this article was very interesting, and the comments as well... that is, until SiliconDoc started trashing everything. As stated by a lot of other users, I ask anand to take some action against the user; he is ruining my experience, and others', of calmly reading the articles in the morning with a coffee :). All his arguments, and the way he throws them, are so random and make no sense; he sounds like a man who needs his drug of praising nvidia and trashing the red rooster any way he can, even if it's with no real arguments. I read with pleasure the comments of the smart non-biased guys posting here, but this guy is just talking crap to fill the lines.
On topic... considering what the 5850 has: eyefinity, performance/price, directx 11, power consumption, and most importantly availability, I was smiling to myself and thinking that ATI will have killer sales in the 3 months left of 2009. I personally will wait for nvidia to bring out fermi, and with it the price war, because we all know that then all prices will go down; I estimate $150 for the 5850 and about $200 for the 5870 around june, and if nvidia has better price/perf I will definitely buy it.
SiliconDoc - Thursday, October 1, 2009 - link
And here we have your contribution: after whining about me, claiming I have no points, the usual bs from red lovers, here is the evidence of your bloodshot eyes; at least you've accepted my direct orders and forced yourself to talk topic.
-
"On topic... considering what the 5850 has: eyefinity, performance/price, directx 11, power consumption, and most importantly availability, I was smiling to myself and thinking that ATI will have killer sales in the 3 months left of 2009."
--
And after you realize what a red rooster you just were, whether you thought it was a good jab at me, since you know I'll read your attack and that's what the attack was about, or whether you couldn't help yourself, you went on to claim how fair and balanced you are after you hoped for 2 cheap ati cards. LOL The afterthought, barely surfacing from the lack of wattage, added at the end: "if nvidia has better I'll blah blah"..
FUNNY how you talk about THOSE CARDS in the TESLA THREAD, when you are ON TOPIC !
roflmao !
Wowzie!
Cocka ! Doodle ! Doooo !
Let me ask you, since you considered eyefinity so great, do you
shotage - Thursday, October 1, 2009 - link
1 vote to get you banned.
Moricon - Thursday, October 1, 2009 - link
Silicondoc you are a big GREEN C*OK lol it even rhymes!
Alberto - Thursday, October 1, 2009 - link
The GPGPU market is very small... it doesn't make good revenue for a company. Moreover the software development is HARD and expensive; this piece of silicon seems like an NVidia mistake: too big to manufacture for a graphics card, nice (on paper) for a niche market. In comparison AMD is working far better, and Intel too with Larrabee.
Hard times on the horizon for Nvidia: it has a monster with very low manufacturing yields, but nothing feasible for the consumer arena.
A prediction? AMD will have the lead in graphics cards in the next years.....
SiliconDoc - Thursday, October 1, 2009 - link
The yields are fine; apparently you caught wind of the ati PR crew who was caught out lying. If not, you just made your standard idiot assumption, because the actual FACTS concerning these two latest 40nm chips are that ati yields have been very poor, and nvidia's have been good.
---
Nice try, but you're wrong.
" Scalable Informatics has been selling NVidia Tesla (C1060) units as part of our Pegasus-GPU and Pegasus-GPU+Cell units. Several issues have arisen with Tesla availability and pricing.
First issue: Tesla units are currently on a 4-8 week back order. We have no control over this, all other vendors have the exact same issues. NVidia is not shipping Tesla in any appreciable volumes.
Our workaround: Until NVidia is able to ramp its volume to appropriate levels, Scalable Informatics will provide loaner GTX260 cards in place of the Tesla units. Once the Tesla units ship, we will exchange the GTX260 units for the Tesla units.
Update: 1-September-2009
Tesla C1060 units are now readily available for Pegasus and JackRabbit systems.
---
NOW THE PRICING
" Scalable Informatics JackRabbit systems are available in deskside and rackmount configurations, starting at 8 TB (TeraByte) in size, with individual systems ranging from 8TB to 96TB, and storage clusters up to 62 PB (PetaByte), with most systems starting price under $1USD/GB."
So, an 8TB system is 8 grand, 96TB is 96 grand, and a 62 PetaByte cluster runs into the tens of millions.
http://www.scalableinformatics.com/catalog
Yes, not much there. LOLOLOL
--
POWER SAVINGS replacing massive cpu computers
--
The BNP Paribas (finance) study showed a $250,000 500 core cluster (37.5 kW) replaced with a 2 S1070 Tesla cluster at a cost of $24,000 and using only 2.8 kW. A study with oil and gas company Hess showed an $8M 2000-socket system (1.2Mw) being replaced by a 32 S1070 cluster for $400,000 and using only 45 kW in 31x less space. If you are running a CUDA-enabled application, or have access to the source code (you’ll need that to take advantage of the GPUs), you can clearly get significant performance gains for certain applications.
-
about 4 TFLOPS of peak from four C1060 cards (or 3 C1060 and a Quadro) and plugs into a standard wall outlet. Word from some of those selling this system is that sales have been mostly in the academic space and a little slower than expected, possibly due to the initially high ($10k+) price point. Prices have started to come down, however, and that might help sales. You can buy these today from vendors like Dell, Colfax, AMAX, Microway, and Penguin (for a partial list see NVIDIA’s PS product page).
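To put rough numbers on those quoted figures, here is a small sketch (it assumes the commonly cited ~933 GFLOPS single-precision peak per C1060 and simply divides the cluster prices and power draws quoted above; actual savings obviously depend on the workload):

#include <cstdio>

int main() {
    // ~4 TFLOPS peak from four Tesla C1060 cards (~933 GFLOPS SP each, commonly cited figure).
    double c1060_gflops = 933.0;
    printf("4x C1060 peak: %.2f TFLOPS SP\n", 4 * c1060_gflops / 1000.0); // ~3.73

    // BNP Paribas example quoted above: $250,000 / 37.5 kW cluster vs. $24,000 / 2.8 kW Tesla setup.
    printf("Cost ratio:  %.1fx cheaper\n", 250000.0 / 24000.0);  // ~10x
    printf("Power ratio: %.1fx less power\n", 37.5 / 2.8);       // ~13x
    return 0;
}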
-
---
And, of course you predict amd will have the lead in videocards the next few years. LOL
bhwahahahahaaaaaaaaaaaaaaaaaa
thebeastie - Thursday, October 1, 2009 - link
Personally I think NVidia has made the best bet it can make by supporting more Tesla-style stuff, and in general just building a bigger, madder GPU. The fact is that there aren't many good PC games around; I would say NVidia made some good sales off Crysis by itself, with people building a new PC with that game in mind putting a very large weight on GPU choice.
But it is just not enough. L4D 2 is the next big title, but being on the Valve engine everyone knows you will get 100fps on a GTX 275.
The other twist is that Steam has probably been one of the best things for gaming on the PC it just makes things 10 times easier.
Manually patching games etc is a killer for all but those who are gaming enthusiasts.
Dante80 - Thursday, October 1, 2009 - link
GT300 looks like a revolutionary product as far as HPC and GPU computing are concerned. Happy times ahead, for professionals and scientists at least...
Regarding the 3D gaming market though, things are not as optimistic. GT300 performance is rather irrelevant, due to the fact that nvidia currently does not have a speedy answer for the discrete budget, mainstream and lower performance segments. Price projections aside, the GT300 will get the performance crown and act as a marketing boost for the rest of the product line. Customers in the higher performance and enthusiast markets that have brand loyalty towards the greens are locked in anyway. And yes, that's still irrelevant.
Remember ppl, the profit and bulk in the market is in a price segment nvidia does not even try to address currently. We can only hope that the greens can get something more than GT200 rebranding/respins out for the lower market segments. Fast. Ideally, the new architecture should be able to be scaled down easily. Let's hope for that, or it's definitely rough times ahead for nvidia. Especially if you look closely at the 5850 performance-per-$ ratio, as well as the Juniper projections. And add in the economic crisis, shifting consumer focus, the gap between the performance needed by software and the performance given by the hardware, the plateau in TFT resolutions, and heat/power consumption concerns.
With AMD getting the whole 5xxx family out of the warehouses in under 6 months (I think that's a first for the GPU industry, I might be wrong though), the greens are in a rather tight spot atm. GT200 respins won't save the round, GT300 @ $500++ won't save the round, and Tesla certainly won't save the round (just look at sales and profit in the last years concerning the HPC/GPU-compute segments).
Lets hope for the best, its in our interest as consumers anyway..
blindbox - Thursday, October 1, 2009 - link
I'm sorry, but I couldn't resist. The Adventures of SiliconDoc.
NVIDIA GeForce GTS 250: A Rebadged 9800 GTX+
http://www.anandtech.com/showdoc.aspx?i=3523
ATI Radeon HD 4890 vs. NVIDIA GeForce GTX 275
http://www.anandtech.com/video/showdoc.aspx?i=3539...
AMD's Radeon HD 5870: Bringing About the Next Generation Of GPUs
http://www.anandtech.com/video/showdoc.aspx?i=3643...
The Radeon HD 4870 1GB: The Card to Get
http://www.anandtech.com/showdoc.aspx?i=3415
Overclocking Extravaganza: GTX 275's Complex Characteristics
http://www.anandtech.com/video/showdoc.aspx?i=3575
NVIDIA GeForce GTX 295: Leading the Pack
http://www.anandtech.com/showdoc.aspx?i=3498&p...
Faster Graphics For Lower Prices: ATI Radeon HD 4770
http://www.anandtech.com/video/showdoc.aspx?i=3553...
Of course, check the comments.
I couldn't find his comments in the 4870x2 review, nor the pre-DX10 days.
tamalero - Friday, October 2, 2009 - link
this guy is such an epic trainwreck.... I actually wonder if this guy is the ANGRY GERMAN KID in disguise (check the video on youtube lol)
Docket - Thursday, October 1, 2009 - link
Yep, SiliconDoc has been making the same nonsense noise elsewhere as well and has been banned from at least one other site (google silicondocs):
http://forums.bit-tech.net/showthread.php?p=203896...
Here extract from bit-tech staff:
-----------
OK time for you to go, you contribute nothing to the community other than trolling, bye bye.
I'm leaving all your posts here for evidence that you're a complete lunatic, but I'm glad you realise that you do need help. It's the first step.
I recommend checking out Nvidia forums and posting there - you'll feel more at home.
-----------
S/he is obviously a retarded person. I mean, initially it was "fun" to read, but now I'm just so bored with this s*it, and it is actually interfering with trying to read comments from other readers. Maybe that is the whole point of the noise: to sidetrack any meaningful conversation.
I vote for silicondoc to be banned from this site (or give me the ability to filter all the posts from and related to this user)... anyone else?
SiliconDoc - Thursday, October 1, 2009 - link
It seems to me you wish to remain absolutely blind with your fuming hatred and emotional issues, so let's see WHAT was supposedly said at your link:
"Originally Posted by wuyanxu: nVidia is trying very hard to NOT lose this round, they've priced this too aggressively, surely there's some corporate law on this?"
---
Here we see the brain-deranged red rooster, who has been deceived by the likes of you-know-who for so long, that a low-priced Nvidia card that beats the ati card must be "illegally priced", according to the little red communist overseas.
I suppose pointing that out in a fashion you and your glorious roosters don't like, is plenty reason for you to shriek "contributes nothing" and "let's ban him!"
Well, fire up your glowing red torches, and I will gladly continue to show what fools red roosters can be, and often are.
I'm so happy you linked some silicondoc post on some other forum, and we had the chance to see the deranged red rooster screech that a low priced Nvidia GTX275 is illegal.
--
Good for you, you're such a big help here.
strikeback03 - Thursday, October 1, 2009 - link
would be nice, I was wondering when I saw this article how it could have 140 comments already, forgetting he was sure to come trolling. I've stopped reading each comment thread after he got involved, since any chance of reliable information coming out has ceased.
shotage - Thursday, October 1, 2009 - link
lol*shakes head*
palladium - Thursday, October 1, 2009 - link
Ahh, he said a 9800 GTX + GDDR5 = 4870 !
blindbox - Thursday, October 1, 2009 - link
Ooops, I think I need to say something on topic at least. Could anyone tell me if the OpenCL SDK is out yet? Or DirectCompute? It has been over a year since GPU computing was announced and there's nothing useful for consumers (I don't count folding as being for consumers).
Yes, both OpenCL and DirectCompute are available for development. It will take time for developers to release applications that use these APIs.
There are already consumer applications that use CUDA, although these are mostly video encoding, Folding@Home/SETI@home, and PhysX-based games. Possibly not too exciting to you, but hopefully more will be coming as GPU computing gains traction.
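For anyone curious what "available for development" looks like in practice, here is a minimal, hedged sketch (it assumes a vendor OpenCL SDK and driver are installed; the CL/cl.h header and the library you link against can vary by vendor) that simply enumerates the OpenCL platforms on a machine:

#include <CL/cl.h>
#include <cstdio>

int main() {
    // Ask the runtime how many OpenCL platforms (vendor drivers) are installed.
    cl_uint num_platforms = 0;
    if (clGetPlatformIDs(0, NULL, &num_platforms) != CL_SUCCESS || num_platforms == 0) {
        printf("No OpenCL platforms found - no vendor runtime installed.\n");
        return 1;
    }

    cl_platform_id platforms[16];
    cl_uint n = num_platforms > 16 ? 16 : num_platforms;
    clGetPlatformIDs(n, platforms, NULL);

    for (cl_uint i = 0; i < n; ++i) {
        char name[256] = {0};
        clGetPlatformInfo(platforms[i], CL_PLATFORM_NAME, sizeof(name), name, NULL);
        printf("Platform %u: %s\n", i, name);
    }
    return 0;
}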
PorscheRacer - Thursday, October 1, 2009 - link
Does anyone know if the 5000 series supports hardware virtualisation? I think this will be the killer feature once AMD's 800 series chipsets debut here shortly. Being able to virtualise the GPU and other hardware with your virtual machines is the last stop to pure bliss.
dgz - Thursday, October 1, 2009 - link
I am also curious. Right now only nVidia's Quadro cards support this.
The thing is, though, that your CPU and chipset also have to support what Intel calls VT-d.
Being able to play 3D games in virtual OS with little to no performance would be great and useful.
Not going to happen soon, though. It's also funny that virtually no Lynnfield review mentioned the lack of VT-d in the 750 in its "deep" coverage. Huge disappointment.
wifiwolf - Thursday, October 1, 2009 - link
If there's any technology that seams to scratch that virtualization, i think this new gt300 is the one. When reading about nvidia making the card compute oriented it just drove my mind to that thought. Hope i'm right. To be fair with amd, i think their doubled stream processors could be a step forward in that direction too, coupled with dx11 direct compute. Virtual machines just need to acknowledge the cards and capabilities.dgz - Friday, October 2, 2009 - link
They already do. vmware and vbox have such capabilities. Not everything is possible atm, though.
dgz - Thursday, October 1, 2009 - link
oops, I meant "little to no performance penalty" :)sigmatau - Thursday, October 1, 2009 - link
According to the super troll who keeps screeching about bandwidth, the GT300 must be a lesser card since it doesn't have a 512-bit connection like the GT200.
LOL @ Trolls.
SiliconDoc - Thursday, October 1, 2009 - link
Yes, that is quite disappointing, but the 3 billion transistor count and GDDR5 somewhat make up for it, and the fact that we're told by the red roosters that even 153.6 GB/s of bandwidth is plenty, or just a tiny bit shy. With what looks like 384-bit GDDR5 at a 4000 or 4800 data rate, GT300 will come in at 192 GB/s minimum, or more likely around 230 GB/s, quite a lot higher than 153.6 for ati's 5870.
---
So really, what should I believe now: that 153.6 is really plenty except in a few rare instances, or what you "LOL @ Trolls" types should believe, that 192 or 230 is worse than 153.6 ?
--
You might LOL @ TRolls, but from my view, you just made an awful fool of yourself.
HINT: The ati 5870 is only 256bit, not 384, and not 512.
Now, look in the mirror and LOL.
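For reference, peak memory bandwidth follows directly from bus width and effective data rate. A small sketch with the 5870's known figures and a couple of GT300 configurations; the GT300 memory clocks are not announced, so those rows are assumptions for illustration only:

#include <cstdio>

// Peak memory bandwidth in GB/s = (bus width in bits / 8) * effective data rate in GT/s.
double bandwidth_gbs(int bus_bits, double data_rate_gtps) {
    return bus_bits / 8.0 * data_rate_gtps;
}

int main() {
    printf("HD 5870, 256-bit GDDR5 @ 4.8 GT/s: %.1f GB/s\n", bandwidth_gbs(256, 4.8)); // 153.6
    // GT300 memory clocks are unannounced; these two lines are guesses.
    printf("GT300?,  384-bit GDDR5 @ 4.0 GT/s: %.1f GB/s\n", bandwidth_gbs(384, 4.0)); // 192.0
    printf("GT300?,  384-bit GDDR5 @ 4.8 GT/s: %.1f GB/s\n", bandwidth_gbs(384, 4.8)); // 230.4
    return 0;
}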
sigmatau - Thursday, October 1, 2009 - link
I know the ATI cards have 256-bit connections, dumb ass. I'm just using your logic (or lack of it). ATI has been able to outperform Nvidia cards with their 256-bit connections, so your point about bandwidth is meaningless, idiot.
Now go pull that G295 out your ass, ok?
SiliconDoc - Thursday, October 1, 2009 - link
Golly, that GDDR5 has nothing to do with bandwidth, right, you stupid idiot ?
--
Talk to me about the 4850, you lame, near-brain-dead doofus. It's got GDDR3 on it.
---
See, that's the other problem for you idiotic true believers, NV is moving from 2248 data rate ram on up to 4800.
But you're so able to keep more than one tiny thought in your stupid gourd at once, you "knew that".
BTW, you're not doing what I'm doing, you're not capable of it.
Now, as for that sourpussed last little whine of yours: the 295 beats everything ati has, making your simpleton statement A BIG FAT JOKE.
moltentofu - Thursday, October 1, 2009 - link
My god silicondoc, you aren't really succeeding here. To what purpose? To convince people not to buy ati cards? You are such a complete, massive ahole it makes me want to go out and buy ati cards in bulk just to spite you.
I'm guessing that if nvidia PR ever watched you rant your all-caps rants they would politely request that you stop associating yourself with their product.
Go ahead everybody, google "silicondoc" if you have a strong stomach. Talk about spreading yourself all over the tubes! This guy's fingerprint is unmistakable. Looks like he got banned on the HondaSwap forums after 14 posts. Guess he sucks on every forum. Maybe anandtech could ban him?
SiliconDoc - Thursday, October 1, 2009 - link
I think every one of you that, instead of actually leaving me alone or responding with a counter-argument to my points, got logged in and ripped away at me with an insult ought to be BANNED.
That's what really should happen. I make my complaints and arguments about the article and cards and companies, and the lies I see about all those, and most, but not all of you, have no response other than pure trolling, insulting put-downs.
Every single one of you that came in, and personally attacked me without posting a single comment about the article, YOU are the ones that need to be banned.
Your collective whining is pure personal attack, and instead of commenting on the article, your love or hate for it, you texted up and did one single thing, let loose a rant against me. Just because you could, just because you felt apparently, "it was taking the high road"... which is as ignorant as the lies I've pointed out.
Time for YOU PEOPLE to be banned.
(minus those of course that actually made counterpoints, whether or not they insulted me or complained when they did - because AT LEAST they were actually discussing points, and contributing to the knowledge and opinion groupings.
Like for instance Monkeypaw, who made a reply that wasn't a pure trolling, hate-filled diatribe like you just posted, having nothing to do with the article at all.)
Take a look in the mirror then consider yourself fella.
bobvodka - Thursday, October 1, 2009 - link
In which case I request that YOU are banned for calling ME a liar when I did nothing beyond reply telling you how, on launch day, I ordered a HD5870 and had it the next day.
SiliconDoc - Thursday, October 1, 2009 - link
Oh, you're full of it again; you pretended your view is the world, and therefore lied your butt off in a smart-aleck fashion, PERIOD, pretending everyone doesn't know only a few trickled out, which, if you had clue one, you'd KNOW I was the one who posted that very information on this site. Claiming anything else is plain stupid, smart-alecky, and LYING.
Just because drunken bob claims he got a card on the morning it shipped, had it immediately, and has been enjoying it ever since, the whole world is supposed to be satisfied with the paper launch that "does not exist in bob's drunken vodka world", where who knows what day it is anyway.
You know, you people are trash, and expecting anyone else to pretend you're not is asking for way too much.
shotage - Thursday, October 1, 2009 - link
SillyDuck - please tone it down. You're getting out of control again!
tamalero - Thursday, October 1, 2009 - link
hu.. you're the one insulting everyone who doesn't share your opinion with your "RED ROOSTERS" and other stuff... you're really special, Mr. Doc, but in the sad way.
SiliconDoc - Thursday, October 1, 2009 - link
What red roosters? There aren't any here, I'm told. Just plain frank and honest people who tell the truth. So if I say red rooster, it cannot possibly mean anyone here, posting, lurking or otherwise, as I'm certain you absolutely know.
(not like coming down to your level takes any effort; there, you are special, just for you, so you don't feel so bad about yourself)
rennya - Thursday, October 1, 2009 - link
I have already countered your suggestion that the ATI 5870 is just a paper launch, somewhere in this same discussion.
Plus, if nVidia really has working silicon as you showed in the fudzilla link, where can I buy it then? Even at IDF, Intel showed working silicon for Larrabee (although an older version), but not even the die-hard Intel fanboys will claim that Larrabee will be available soon.
SiliconDoc - Thursday, October 1, 2009 - link
Gee, we have several phrases: hard launch, and paper launch. Would you prefer something in between, like soft launch?
Last time nvidia had a paper launch, that's what everyone called it and no one had a problem, even if cards trickled out.
So now, we need changed definitions and new wordings, for the raging little lying red roosters.
I won't be agreeing with you, nor have you done anything but lie, and attack, and act like a 3 year old.
It's a paper launch; cards were not in the channels on the day they claimed they were available. 99 out of 100 people were left bone dry, hanging.
Early today the 5850 was listed, but not available.
Now, since you people and this very site taught me your standards and the definition and how to use it, we're sticking to it when it's a godforsaken red rooster card, whether you like it or not.
rennya - Friday, October 2, 2009 - link
You defined hard launch as having cards on retail shelves.
That's what happened here in the first couple of days in the place I live. So, according to your standard, the 5870 is a hard launch, not a paper launch or a soft launch. I can easily get one if I want to (but my casing is just a crappy TECOM mid-tower, the card will not fit).
As far as I am concerned, 5870 has a successful hard launch. You tried to tell people otherwise, that's why I called you a liar.
Want to know where I live? Open up the Lynnfield review at http://www.anandtech.com/cpuchipsets/showdoc.aspx?... and look at the first picture on the first page. It shows you the country I am posting this from. The same info can also be seen in the AMD Athlon X4 620 review at http://www.anandtech.com/cpuchipsets/showdoc.aspx?... . The markup over MSRP can be ridiculous sometimes, but availability is not a problem.
Zingam - Thursday, October 1, 2009 - link
This GPU is for what? Oh, Tesla - the things that cost $2000 :) And consumers won't really get anything more than what ATI offers currently!
Seems like it is time for ATI to do a paper launch.
Just to inform the fanboys: ATI has already finalized the specs for generations 900 and 1000. The current one is just 800.
So on paper, dudes, ATI has even more than what they are displaying now!
BTW who said: DX11 won't matter?? :)
cactusdog - Thursday, October 1, 2009 - link
It's unbelievable that Nvidia won't have a DX11 chip in 2009. Massive fail.
strikeback03 - Thursday, October 1, 2009 - link
Not if there are no worthwhile DX11 games in 2009.yacoub - Wednesday, September 30, 2009 - link
"Perhaps Fermi will be different and it'll scale down to $199 and $299 price points with little effort? It seems doubtful, but we'll find out next year."Yeah okay, side with their marketing folks. God forbid they actually release reasonably-priced versions of Fermi that people will actually care to buy.
SiliconDoc - Thursday, October 1, 2009 - link
Derek did an article not long ago on the costs of a modern videocard, and broke it down piece by piece and cost by cost.
It was well done and appeared accurate, and the "margin" for the big two was quite small. Tiny really.
So people have to realize fancy new tech costs money, and they aren't getting raked over the coals. It's just plain expensive to have a shot at the moon, or to have the latest, greatest gpus.
Back in '96 when I sold a brand new computer I doubled my investment, and those days are long gone, unless you sell to schools or the government; then the upside can be even higher.
yacoub - Thursday, October 1, 2009 - link
And yet all of that is irrelevant if the product cannot be delivered at a price point where most of the potential customers will buy it. You're forgetting that R&D costs are not just "whatever they will be" but are based on what the market will support via purchases of the end result. It all starts with the consumer. You can argue all you want that Joe Gamer should buy a $400 GPU, but if he's only capable of buying a $300 GPU and only willing to buy a $250 GPU, then you're not going to get a sale until you cross the $300 threshold with amazing marketing and performance, or the $250 threshold with solid marketing and performance. Companies go bust because they overspend on R&D and never recoup the cost, because they can't price the product properly for it to sell the quantities needed to pay back the initial investment, let alone turn a significant profit.
Arguing that gamers should just magically spend more is silly and shows a lack of understanding of economics.
SiliconDoc - Thursday, October 1, 2009 - link
Well I didn't argue that gamers should magically spend more.--
I ARGUED THAT THE VIDEOCARDS ARE NOT SCALPING THE BUYERS.
---
Derek's article, if you had checked, outlined a card barely over $100.00.
But, you instead of thinking, or even comprehending, made a giant leap of false assuming. So let me comment on your statements, and we'll see where we agree.
--
1. Ok, the irreverance(yes that word) here is that TESLA commands a high price, and certainly has been stated to be the profit margin center (GT200) for NVIDIA- so...whatever...
- Your basic points are purely obvious common sense, one doesn't even need state them - BUT - since Nvidia has a profit driver where ATI does not, if you're a joe rouge, admitting that shouldn't be the crushing blow it apparently is.
2. Since Nvidia has been making the MONEY, the PROFIT, and reinvesting, I think they have a handle on what to spend, not you, and their sales are much higher up to recoup costs, not sales, to you, your type.
----
Stating the very simpleton points of even having a lemonade stand work out doesn't impress me, nor should you have wasted your time doing it.
Now, let's recap my point: " So people have to realize fancy new tech costs money, and they aren't getting raked over the coals."
--
That's a direct quote.
I also will re-object to your own stupidity: "You're forgetting that R&D costs are not just "whatever they will be" but are based off what the market will support via purchasing the end result."
PLEASE SEE TESLA PRICES.
--
Another jerkoff that is SOOOOOOOO stupid finds it possible that someone would even argue that gamers should just spend more money on a card, and - after pointing out that's ridiculous - feels he has made a good point, and claims an understanding of economics.
-
If you don't see the hilarity in that, I'm not sure you're alive.
-
Where do we get these brilliant analysts who state the obvious a high schooler has had under their belt for quite some time ?
--
I will say this much - YOU specifically (it seems), have encountered in the help forums, the arrogant know it all blowhards, who upon say, encountering a person with a P4 HT at 3.0GHZ, and a pci-e 1.0 x16 slot, scream in horror as the fella asks about placing a 4890 in it. The first thing out of their pieholes is UPGRADE THE CPU, the board, the ram, then get a card....
If that is the kind of thing you really meant to discuss, I'd have to say I'm likely much further down that bash road than you.
You might be the schmuck that shrieks "cpu limitation!" and recommends the full monty replacements.
--
Let's hope you're not after that simpleton quack you had at me.
yacoub - Thursday, October 1, 2009 - link
uh-oh, boys, he's foaming at the mouth. time to put him down.
SiliconDoc - Thursday, October 1, 2009 - link
Ah, another coward defeated. No surprise.
yacoub - Wednesday, September 30, 2009 - link
"The motivation behind AMD's "sweet spot" strategy wasn't just die size, it was price."LOL, no it wasn't. Not when everyone, even Anandtech staff, anticipated the pricing for the two Cypress chips to be closer to $199 and $259, not the $299 and $399 they MSRP'd at.
This return to high GPU prices is disheartening, particularly in this economy. We had better prices for cutting edge GPUs two years ago at the peak of the economic bubble. Today in the midst of the burst, they're coming out with high-priced chips again. But that's okay, they'll have to come down when they don't get enough sales.
SiliconDoc - Thursday, October 1, 2009 - link
It was fun for half a year as the red fans were strung along with the pricing fantasy here.
Now of course, well, the bitter disappointment: not as fast as expected and much more costly. "Low yields" - you know, that problem that makes ati's "smaller dies" price like "big green monsters" (that have good yields on the GT300).
--
But, no "nothing is wrong, this is great!" Anyone not agreeing is "a problem". A paid agent, too, of that evil money bloated you know who.
the zorro - Thursday, October 1, 2009 - link
silicon duck, please take a valium i'm worried about you.
SiliconDoc - Thursday, October 1, 2009 - link
Another lie, no worry, you're no physician, but I am SiliconDoc, so grab your gallon red water bottle reserve for your overheating ati card and bend over and self-administer your enema, as usual.
araczynski - Wednesday, September 30, 2009 - link
sounds like ati will win the bang for the buck war this time as well. at least it makes the choice easier for me.marc1000 - Wednesday, September 30, 2009 - link
Some time ago I heard that the next gen of consoles would run DX11 (PlayStation 2 and Xbox were DX7-class, PS3 and X360 DX9-class, so PS4 and X720 could perfectly well be DX11). If this is the case, we are about to see new consoles with really awesome graphics - and then the GPU race would need to start over toward more and more performance.
Do you guys have any news on those new consoles' development? It could complete the picture in the new GPU articles this year.
Penti - Friday, October 2, 2009 - link
I think you mean DX9-class hardware; the PS3 has zero DX9 support and the XBOX 360 has DX9c-class support, but a console-specific version. The PS3 was using OpenGL ES 1.0 with shaders and other features from 2.0, as it was released pre-OpenGL ES 2.0 spec. Game engines don't need the DX API; it doesn't matter to game content developers anyway.
Xbox was actually DirectX 8.1 equivalent. As said, next-gen consoles are years away. Larrabee and Fermi will have been long out by then.
haukionkannel - Thursday, October 1, 2009 - link
Rumours say that next generation consoles will be released around 2013-2014... But who can say...
Zingam - Thursday, October 1, 2009 - link
Perhaps next gen consoles will be C++ based and not API based (what great terminology). So in that sense DirectX won't matter, like it won't matter on Larrabee, because it will be emulated in software. Larrabee will not have any silicon dedicated to OpenGL or DirectX, I think.
Once GPUs get fast enough I guess they won't be called graphics processors anymore and will support APIs as software implementations.
marc1000 - Thursday, October 1, 2009 - link
well, perhaps... I was searching the web yesterday for info on the new consoles, and it was kinda sad. If we do not get a new minimum standard (a powerful console), then PC games will not be that hard to run... my old Radeon 3850 is still capable of running almost ALL games with "good enough" performance (read: an average of 30-50fps on most of the console ports to PC).
and so: no reason to upgrade! :-(
Dobs - Wednesday, September 30, 2009 - link
Personally I think Eyefinity will be remembered as the master stroke, along with being first to implement DirectX 11. Nvidia may get 10 fps more in DirectX 11 in Q2 2010 but will still struggle until it has its own version of Eyefinity.
The current uber-cool feature for the cashed-up is Eyefinity, and once you have 3 or 6 monitors you will only buy hardware that supports it. These 'cashed-up' PC gamers are usually Nvidia's favorite customers.
Nvidia needs flexible wrap-around OLED mega-resolution monitors to come out yesterday, but I'm pretty sure that didn't happen... the 5850, which supports 3 monitors, came out yesterday. :P
SiliconDoc - Thursday, October 1, 2009 - link
Nvidia has supported FOUR monitors on say, for instance, the 570i sli, for like YEARS dude.
Just put in 2 nv cards, plug em up - and off you go, it's right in the motherboard manuals...
Heck you can plug in two ati cards for that matter.
---
Anyway, on the triple-monitor front with this 5870/50, the drivers are a mess; some have found the patch won't be out for a month, and then the extra $100 cable is needed too, as some have mentioned, which ati has not included.
They're pissed.
Dobs - Thursday, October 1, 2009 - link
I'll be avoiding the extra $100 cable by getting DisplayPort monitors from the start. Also want to get an IPS monitor (I think) so that it will support portrait mode.
If I already had 3 non-DisplayPort monitors, I wouldn't mind shelling out for the DisplayPort adapter if that was my only expense. But if the adapter was flakey I'd be upset as well. I know multi-monitor setups have been around for years, but they've never been this easy to set up... Even I could do it :P And the drivers will get better in time, and no doubt future games will look to include Eyefinity as well.
SiliconDoc - Thursday, October 1, 2009 - link
Well I do hope you have good luck and that you and your son enjoy it (no doubt you will if you manage to get it), and it would be nice if you could eventually link a pic (likely in some future article text area) just because.
I think eyefinity has an inherent advantage: a cheaper motherboard is possible, 3 displays on one card, and with 3 from the same card you can have the concave wrap view going with easy setup.
I agree however with the comment that it won't be a widely used feature, and realize most don't even use both monitor hookups on their videocards that are already available as standard, since long ago, say the 9600se and before.
(I use two though, and I have to say it is a huge difference, and much, much better than one)
yacoub - Wednesday, September 30, 2009 - link
Wake up: 99% of people don't give a crap about Eyefinity. Not only do the VAST, VAST majority of customers have just one display, but those who do have multiple ones (like myself) often have completely different displays, not multiple of the same model and size. And then, even when you find that 0.1% of the customer base that has two or more identical monitors side-by-side, you have to find the ones who game on them. Then of those people, find the ones who actually WANT to have their game screen split across two monitors with a thick line of two display borders right in the middle of their image.
Eyefinity is relevant to such an infinitesimally small number of people it is laughable every time someone mentions it like it's some sort of "killer app" feature.
Jamahl - Thursday, October 1, 2009 - link
thats why eyefinity has millions of youtube views already right.
yacoub - Thursday, October 1, 2009 - link
because purchases are measured in YouTube views. wow, just... wow.
ClownPuncher - Thursday, October 1, 2009 - link
It clearly means people are interested enough to look. Wow, just...wow.
You don't like it, other people do. Get over it.
Dobs - Thursday, October 1, 2009 - link
I'm with the zorro - will be setting this up for my son pretty soon - he is an extreme gamer who has mentioned multiple monitors to me a few times over the last few months. Up until now I only had a vague idea of how I could accommodate his desire.... that has all changed since the introduction of Eyefinity.
Finally - Thursday, October 1, 2009 - link
..pussy-whipped by your son?
the zorro - Thursday, October 1, 2009 - link
moron, i am going to buy two more monitors and then... eyefinity.chizow - Wednesday, September 30, 2009 - link
Nvidia didn't mention anything about multi-monitor support, but today's presentation wasn't really focused on the 3D gaming market and GeForce. They did spend a LOT of time on 3D Vision though, even integrating it into their presentation. They also made mention of the movie industry's heavy interest in 3D, so if I had to bet, they would go in the direction of 3D support before multi-monitor gaming.
It wouldn't be hard for them to implement it though if they wanted to or were compelled to. It's most likely just a simple driver block or code they need to port to their desktop products. They already have multi-monitor 3D on their Quadro parts and have supported it for years; it's nothing new really, just new in the desktop space with Eyefinity. It then becomes a question of whether they're willing to cannibalize their lucrative Quadro sales to compete with AMD in this relatively low-demand segment. My guess is no, but hopefully I'm wrong.
Dobs - Thursday, October 1, 2009 - link
I think Nvidia is underestimating the desire for, and affordability of, multi-monitor gaming. Have you seen monitor prices lately? Have you seen the Eyefinity reviews?
Not making any mention of it is a big mistake in my book. Sure they can do it, but it will reduce their margins even further since they obviously hadn't planned on spending the extra dollar$ this way.
I do like the sound of the whole 3D thing in the keynote though... and everyone wearing 3D glasses... (not so much). But it will be cool once the Sony vs Panasonic vs etc. 3D format war is finished (although it's barely started), so us mainstream general consumers know which 3D product to buy. Just hope that James Cameron Avatar film is good :)
chizow - Thursday, October 1, 2009 - link
Yeah, I've seen the reviews and none seemed very compelling tbh; the 3-way portrait views seemed to be the best implementation. 6-way is a complete joke, unless you enjoy playing World of Bezelcraft. There are also quite a few problems with its implementation, as you alluded to; the requirement of an active DP adapter was just a short-sighted, half-assed implementation by AMD.
As Yacoub mentioned, the market segment of people interested in or willing to invest in this technology is ridiculously small; 0.1% is probably pretty close to accurate, given multi-GPU technology is estimated to be only ~1% of the GPU market. Surely those interested in multi-monitor are below that by a significant degree.
Dobs - Thursday, October 1, 2009 - link
Lol @ 20 flops simultaneously in online poker. I struggle with 4 :)
Agree with 6-monitor bezelcraft - the crosshair is the bezel :)
I guess I'm lucky that my son is due for a screen upgrade anyhow so all 3 monitors will be new. Which one will be the problem - I hear Samsung are bringing out small-bezel monitors specifically for this, but I probably can't wait that long. (Samsung LED looks awesome though) I might end up opting for 3 of Dell's old (2008) 2408WFP's (my work monitor) as I know I can get a fair discount for these and I think they have DisplayPort. I'm not sure if my son will like Landscape or Portrait better but I want him to have the option... and yeah apparently the portrait drivers are limited (read crap) atm.
Appreciate your feedback as well as your comments on the 5850 article... I actually expected the GT prices to be $600+ not the $500-$550 you mentioned. Oops... rambling now. Cheers
chizow - Thursday, October 1, 2009 - link
Heheh, I've heard of people playing more than 20 flops at a time.... madness.
Anyways, I'm in a similar holding pattern on the LCD. While I'm not interested in multi-monitor as of now, I'm holding out for LED 120Hz panels at 24+" and 1920. Tbh, I'd probably check out 3D Vision before Eyefinity/multi-monitor at this point, but even without 3D Vision you'd get the additional FPS from a 120Hz panel along with increased response times from LED.
If you're looking to buy now for quality panels with native DP support, you should check out the Dell U2410. Ryan Smith's 5870 review used 3 of them I think in portrait and it looked pretty good. They're a bit pricey though, $600ish but they were on sale for $480 or so with a 20% coupon. If you called Dell Sm. Biz and said you wanted 3 you could probably get that price without coupon.
As for the GTX 380 price, it was just a guess; Anand's article also hints Nvidia doesn't want to get caught in a similar pricing situation as with GT200, but at the same time, relative performance ultimately dictates price. Anyways, enjoyed the convo, hope the multi-mon set-up works out! Sounds like it'll be great (especially if you like sims or racing games)!
RadnorHarkonnen - Thursday, October 1, 2009 - link
Eyefinity is screaming for DIY.
Bezel craft can be easily avoided. Just tear your monitor apart. A stand for 3 monitors is easily ordered/DIY made. Usually the bezel is way thicker than it needs to be.
Unfortunately I already have a 4850 CF that I will keep for another year or two and let the technology mature for now.
wifiwolf - Wednesday, September 30, 2009 - link
Can we have a new feature in the comments please?
I just get tired of reading a few comments and get bugged by some SiliconDoc interference.
Can we have a noise filter so comments area gets normal again.
Every graphics related article gets this noise.
Just a button to switch the filter on. Thanks.
AtwaterFS - Wednesday, September 30, 2009 - link
4 reals - this dude is clearly an Nvidia shill.
Question is, do you really want to support a company that routinely supports this propaganda blitz on the comments of every Fn GPU article?
It just feels dirty doesn't it?
strikeback03 - Thursday, October 1, 2009 - link
I doubt SiliconDoc is actually paid by nvidia, I've met people like this in real life who just for some reason feel a need to support one company fanatically. Or he just enjoys ticking others off. One of my friends while playing Call of Duty sometimes just runs around trying to tick teammates off and get them to shoot back at him.
SiliconDoc - Thursday, October 1, 2009 - link
If facing the truth and the facts makes you mad, it's your problem, and your fault.
I certainly know of people like you describe, and let's face it, it is one of YOUR TEAMMATES---
--
Now, when you collective liars and deniers counter one of my pointed examples, you can claim something. Until then, you've got nothing.
And those last 3 posts, yours included, have nothing, except in your case, it shows what you hang with, and that pretty much describes the lies told by the ati fans, and how they work.
I have no doubt pointing them out "ticks them off".
The simple fix is, stop lying.
Yangorang - Wednesday, September 30, 2009 - link
Honestly all I want to know is:
When will it launch? (as in be available for actual purchase)
How much will it cost?
Will this beast even fit into my case...and how much power will it use?
How will it perform? (particularly I'm wondering about DX11 games...as it seems to be very much a big deal for ATI)
but heh none of these questions will be answered for a while I guess....
I'm also kinda wondering about:
How does the GT300 handle tessellation?
Does it feature Angle-Independent Anisotropic Filtering?
I really couldn't give a crap about using my GPU for general computing purposes.... I just want to play some good looking games without breaking the bank...
haukionkannel - Thursday, October 1, 2009 - link
Well, it's going to be a DX11 card, so it can handle tessellation. How well? That remains to be seen, but there is enough computing power to do it quite nicely. But the big question is not whether the GT300 is faster than the 5870 - it most probably is - but by how much, and how much it will cost...
If you can buy two 5870s for the price of one GT300, it has to be really fast!
Interesting release and a good article revealing the architecture behind this chip. I am sure we will see more news around the release of Win7, even if the card is not released until 2010 - just to make sure that not too many "potential" customers buy an ATI card by then.
Also, as someone said before, this seems to be quite modular, so it's possible we'll see some cheaper cut-down versions too. We need competition in the low and middle range as well. Whether the G300 design can do it remains to be seen.
SiliconDoc - Thursday, October 1, 2009 - link
Well, that brings to mind another anandtech LIE.--
In the 5870 article text post area, the article writer and tester, responded to a query by one of the fans, and claimed the 5870 is "the standard 10.5 " .
Well, it is NOT. It is OVER 11", and it is longer than the 285, by a bit.
So, I just have to shake my head, and no one should have wonder why. Even lying about the length of the ati card. It is nothing short of amazing.
silverblue - Thursday, October 1, 2009 - link
http://vr-zone.com/articles/sapphire-ati-radeon-hd...
They say 10.5".
SiliconDoc - Thursday, October 1, 2009 - link
I'm sorry, I realize I left you up in the air, since you're so convinced I don't know what I'm talking about.
" The card that we will be showing you today is the reference Radeon HD 5870, which is a dual-slot graphics card that measures in at 11.1" in length. "
http://www.legitreviews.com/article/1080/2/
I mean really, you should have given up a long time ago.
silverblue - Friday, October 2, 2009 - link
Anand, could you or Ryan come back to us with the exact length of the reference 5870, please? I know Ryan put 10.5" in the review but I'd like to be sure. It's best to check with someone who actually has a card to measure.
silverblue - Friday, October 2, 2009 - link
You know something? I'm just going to back down and say you're right. You might just be, but I couldn't give a damn anymore.
SiliconDoc - Thursday, October 1, 2009 - link
Jeezus, you're just that bright, aren't you.
The article is dated September 19th, and "they scored a picture" from another website, that "scored a picture".
Our friendly reviewer herer at AT had the cards in his hands, on the bench, IRL.
--
I mean you have like no clue at all, don't you.
palladium - Thursday, October 1, 2009 - link
I agree. GPGPU has come a long way, but it's still in its infancy, at least in the consumer space (Badaboom and AVIVO both had bugs).
I just want a card that can play Crysis at all very high settings, 19x12, 4xAA @ 60fps. Maybe a dual-GPU GT300 can deliver that.
wumpus - Wednesday, September 30, 2009 - link
My first reaction after reading that the cost of a double multiply would be twice that of a single was "great. Half the transistors will be sitting there idle during games." Sure, this isn't meant to be a toy, but it looks like they have given up desktop graphics to AMD (and whenever Intel gets something working). Maybe they will get volume up enough to lower the price, but there are only so many chips TSMC can make that size.
On second thought, those little green squares can't take up half the chip. Any guess what part of the squares are multiplies? Is the cost of fast double precision something like 10% of the transistors idle during single (games)? On the gripping hand, the article makes the claim that "All of the processing done at the core level is now to IEEE spec. That's IEEE-754 2008 for floating point math (same as RV870/5870)". If they seriously mean that they are prepared to include all rounding, all exceptions, and all the ugly, hairy corner cases that inhabit IEEE-754, wait for Juniper. I really mean it. If you are doing real numerical computing you need IEEE-754. If you don't (like you just want a real framerate from Crysis for once), avoid it like the plague.
Sorry about the rant. Came for the beef on doubles, but noticed that quote when checking the article. Looks like we'll need some real information about what "core level at IEEE-754" means on different processors. Who provides all the rounding modes, and what parts get emulated slowly? [side note. Is anybody with a 5870 able to test underflow in OpenCL? You might find out a huge amount about your chip with a single test].
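For anyone who actually wants to run the underflow test suggested above, here is a minimal sketch in plain C against the OpenCL 1.0 host API. It is only an illustration of the idea, not anything from the article: the kernel name, the choice of FLT_MIN as the probe value, and the omission of error checking are all my own assumptions. A device that flushes denormals to zero will print 0 for the divided values, while a path that implements IEEE-754 gradual underflow will print a subnormal result.

```c
/* underflow_probe.c - hedged sketch: does the OpenCL device flush subnormals
 * to zero, or does it implement IEEE-754 gradual underflow?
 * Build (assuming an OpenCL SDK is installed): gcc underflow_probe.c -lOpenCL */
#include <stdio.h>
#include <float.h>
#include <CL/cl.h>

/* The probe value is passed as a kernel argument so the divisions cannot be
 * constant-folded away when the kernel is compiled. */
static const char *src =
"__kernel void underflow(__global float *out, float smallest_normal) {\n"
"    out[0] = smallest_normal / 2.0f;  /* subnormal on IEEE hardware, 0.0 if flushed */\n"
"    out[1] = smallest_normal / 16.0f;\n"
"}\n";

int main(void) {
    cl_platform_id plat; cl_device_id dev; cl_int err;
    clGetPlatformIDs(1, &plat, NULL);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);

    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, &err);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, &err);
    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, &err);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "underflow", &err);

    cl_mem buf = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, 2 * sizeof(float), NULL, &err);
    float smallest_normal = FLT_MIN;                 /* ~1.1754944e-38 */
    clSetKernelArg(k, 0, sizeof(cl_mem), &buf);
    clSetKernelArg(k, 1, sizeof(float), &smallest_normal);

    size_t global = 1;
    clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);

    float out[2];
    clEnqueueReadBuffer(q, buf, CL_TRUE, 0, sizeof(out), out, 0, NULL, NULL);
    printf("FLT_MIN / 2  = %g (%s)\n", out[0],
           out[0] == 0.0f ? "flushed to zero" : "gradual underflow");
    printf("FLT_MIN / 16 = %g\n", out[1]);

    /* The device capability bit reports the same property without running a kernel. */
    cl_device_fp_config fp;
    clGetDeviceInfo(dev, CL_DEVICE_SINGLE_FP_CONFIG, sizeof(fp), &fp, NULL);
    printf("CL_FP_DENORM advertised: %s\n", (fp & CL_FP_DENORM) ? "yes" : "no");
    return 0;
}
```

Passing the value in as an argument rather than hard-coding it is deliberate: the printed result then reflects what the shader ALUs do at run time rather than what the kernel compiler folds at build time.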
SiliconDoc - Wednesday, September 30, 2009 - link
I think I'll stick with the giant profitable greens' proven track record, not your e-weened redspliferous dissing.
Did you watch the NV live webcast @ 1pm EST ?
---
Nvidia is the only gpu company with ONE BILLION DOLLARS PER YEAR IN R&D.
---
That's correct, nvidia put into research on the Geforce the whole BILLION ati loses selling their crappy cheap hot cores on weaker thinner pcb with near zero extra features only good for very high rez, which DOESN'T MATCH the cheapo budget-pinching purchasers who buy red to save 5-10 bang for bucks...--
--
Now about that marketing scheme ?
LOL
Ati plays to high rez wins, but has the cheapo card, expecting $2,000 monitor owners to pinch pennies.
"great marketing" ati...
LOL
PorscheRacer - Wednesday, September 30, 2009 - link
Just so you know, ATI is a separate division in AMD (the graphics side obviously) and did post earnings this year. ATI is keeping the CPU side of AMD afloat for all intents and purposes. Is there a way to ban or block you? I was excited to read about the GF300 and expecting some good comments and discussion about this, and then you wrecked the experience. Now I just don't care.
Adul - Thursday, October 1, 2009 - link
silicon idiot is doing more harm than good. please ban him
SiliconDoc - Thursday, October 1, 2009 - link
The truth is a good thing, even if you're so used to lies that you don't like it. I guess it's good too, that so many people have tried so hard to think of a rebuttal to any or all of my points, and they don't have one, yet.
Isn't that wonderful ! You fit that category, too.
SiliconDoc - Wednesday, September 30, 2009 - link
Do you think your LIES will pass with no backup ?
" A.M.D. has struggled for two years to return to profitability, losing billions of dollars in the process.
A.M.D., the No. 2 maker of computer microprocessors after Intel, lost $330 million, or 49 cents a share, in the second quarter. In the same period last year, it lost $1.2 billion, or $1.97 a share.
Excluding one-time gains, A.M.D. says its loss was 62 cents a share. On that basis, analysts had predicted a loss of 47 cents a share, according to Thomson Reuters. Sales fell to $1.18 billion, down 13 percent. Analysts were expecting $1.13 billion."
---
http://www.nytimes.com/2009/07/22/technology/compa...
ATI card sales did increase a bit, but LOST MONEY anyway. More than expected.
--
PS I'm not sorry I've ruined your fantasy and exposed your lie. If you keep lying, should you be banned for it ?
PorscheRacer - Thursday, October 1, 2009 - link
http://arstechnica.com/hardware/news/2009/07/intel...
Again, the graphics group of AMD turned a profit (albeit a small one after R&D and costs) while the other divisions lost money.
SiliconDoc - Thursday, October 1, 2009 - link
LOL - YOU'VE SIMPLY LIED AGAIN, AND PROVIDED A LINK THAT CONFIRMS YOU LIED.
It must be tough being such a slumbag.
--
" After the channel stopped ordering GPUs and depleted inventory in anticipation of a long drawn out worldwide recession in Q3 and Q4 of 2008, expectations were hopeful, if not high that Q1’09 would change for the better. In fact, Q1 showed improvement but it was less than expected, or hoped. Instead, Q2 was a very good quarter for vendors – counter to normal seasonality – but then these are hardly normal times.
Things probably aren't going to get back to the normal seasonality till Q3 or Q4 this year, and we won't hit the levels of 2008 until 2010."
As you should have a clue, noting, 2008 was bad, and they can't even reach that pathetic crash until 2010.
An increase in sales from a recent prior full on disaster decrease, is still less than the past, is low in the present, and is " A LOSS " PERIOD.
You don't provide text because NOTHING at your link claims what you've said, you are simply a big fat LIAR.
Thanks for the link anyway, that links my link:
http://jonpeddie.com/press-releases/details/amd-so...
This is a great quote: " We still believe there will be an impact from the stimulus programs worldwide "
LOL
hahahhha - just as I kept supposing.
" -Jon Peddie Research (JPR), the industry's research and consulting firm for graphics and multimedia"
---
NOTHING, AT either link, describes a profit for ati graphics, PERIOD.
Try again mr liar.
SiliconDoc - Wednesday, September 30, 2009 - link
No they did not post earnings, other than in the sense IN THE RED LOSSES called sales.
shotage - Wednesday, September 30, 2009 - link
I'm not sure what your argument is SiliconDuck.. But maybe you should stop typing and go into hibernation to await the GT300's holy ascension from heaven! FYI: It's unhealthy to have shrines dedicated to silicon, dude. Get off the GPU cr@ck!!!
On a more serious note: Nvidia are good, ATI has gotten a lot better though.
I just bought a GTX260 recently, so I'm in no hurry to buy at the moment. I'll be eagerly awaiting to see what happens when Nvidia actually have the product launch and not just some lame paper/promo launch.
SiliconDoc - Wednesday, September 30, 2009 - link
My argument is that I've heard the EXACT SAME geekfoot whine before, twice in fact. Once for G80, once for GT200, and NOW, again....
Here is what the guy said that I responded to:
" Nvidia is painting itself into a corner in terms of engineering and direction. As a graphical engine, ATI's architecture is both smaller, cheaper to manufacture and scales better simply by combining chips or expanding # of units as mfg tech improves.. As a compute engine, Intel's Larabee will have unmatched parallel thread processing horsepower. What is Nvidia thinking trying to pass on this huge, monolithic albatross? It will lose on both fronts. "
---
MY ARGUMENT IS : A red raging rooster who just got their last two nvidia destruction calls WRONG for G80 and GT200 (the giant brute force non-profit expensive blah blah blah), are likely to the tune of 100% - TO BE GETTING THIS CRYING SPASM WRONG AS WELL.
---
When there is clear evidence Nvidia has been a marketing genius (it's called REBRANDING by the bashing red rooster crybabies) and has a billion bucks to burn a year on R&D, the argument HAS ALREADY BEEN MADE FOR ME.
-----
The person you should be questioning is the opinionated raging nvidia disser, who by all standards jives out an arrogant WHACK JOB on nvidia, declaring DUAL defeat...
QUOTETH ! "What is Nvidia thinking trying to pass on this huge, monolithic albatross? It will lose on both fronts. "
---
LOL that huge monolithic albatross COMMANDS $475,000.00 for 4 of them in some TESLA server for the collegiate geeks and freaks all over the world- I don't suppose there is " loss on that front" do you ?
ROFLMAO
Who are you questioning and WHY ? Why aren't you seeing clearly ? Did the reds already brainwash you ? Have the last two gigantic expensive cores "destroyed nvidia" as they predicted?
--
In closing "GET A CLUE".
shotage - Wednesday, September 30, 2009 - link
Found my clue.. I hope you get help in time: http://www.physorg.com/news171819640.html
SiliconDoc - Thursday, October 1, 2009 - link
You are your clue, and here is your buddy, your duplicate:" What is Nvidia thinking trying to pass on this huge, monolithic albatross? It will lose on both fronts."
Now, I quite understand denial is a favorite pasttime of losers, and you've effectively joined the red club. Let me convert for you.
" What is Ati thinking trying to pass on this over length, heat soaked, barely better afterthought? It will lose on it's only front."
-there you are schmucko, a fine example of real misbehavior you pass-
AaronJD - Wednesday, September 30, 2009 - link
While I definitely prefer the $200-$300 space that ATI released the 48xx at, it seems like $400 is the magic number for single GPUs. Anything much higher than that is in multi-GPU space, where you can get away with a higher price-to-performance ratio.
If Nvidia can hit the market with a well-engineered $400-or-so card that is easily pared down, then they can hit a market ATI would have trouble scaling to, while being able to easily re-badge gimped silicon to meet whatever market segment they can best compete in with whatever quality yield they get.
Regarding Larabee, I think Nvidia's strategy is to just get in the door first. To compete against Intel's first offering they don't need to do something special, they just need to get the right feature set out there. If they can get developers writing for their hardware asap Tesla will have done its job.
Zingam - Thursday, October 1, 2009 - link
Until that thing from NVIDIA comes out, AMD has time to work on a response, and if they are not lazy or stupid they'll have a match for it.
So in any way I believe that things are going to get more interesting than ever in the next 3 years!!!
:D Can't wait to hear what DirectX 12 will be like!!!
My guess is that in 5 years we will have truly new CPUs - ones that would do what GPUs + CPUs are doing together today.
Perhaps we'll come to the point where we get blade-like home PCs. If you want more power you just shove in another board. Perhaps PC architecture will change completely once software gets ready for SMP.
chizow - Wednesday, September 30, 2009 - link
Nvidia is also launching Nexus at their GDC this week, a plug-in for Visual Studio that will basically integrate all of these various APIs under an industry-standard IDE. That's the launching point imo for cGPU, Tesla and everything else Nvidia is hoping to accomplish outside of the 3D gaming space with Fermi.
Making their hardware more accessible to create those next killer apps is what's been missing in the past with GPGPU and CUDA. Now it'll all be cGPU and transparent in your workflow within Visual Studio.
As for the news of Fermi as a gaming GPU, very excited on that front, but not all that surprised really. Nvidia was due for another home run and it looks like Fermi might just clear the ball park completely. Tough times ahead for AMD, but at least they'll be able to enjoy the 5850/5870 success for a few months.
ilkhan - Wednesday, September 30, 2009 - link
If it plays games faster/prettier at the same or better price, who cares what the architecture looks like?
On a similar note, if the die looks like that first image (which is likely) chopping it to smaller price points looks incredibly easy.
papapapapapapapababy - Wednesday, September 30, 2009 - link
"Architecturally, there aren't huge lessons to be learned from RV770"SNIF SNIF BS!
"ATI's approach is much more cautious"
more like "ATI's approach is much more FOCUSED"
( eyes on the ball people)
"While Fermi will play games, it is designed to be a general purpose compute machine."
nvidia, is starting to sound like Sony " the ps3 is not a console its a supercomputer @ HD movie player, it only does everything" guess what? people wanted to play games, nintendo ( the focused company, did that > games, not movies, not hd graphics, games, motion control) Sony - like nvidia here- didn't have the eyes on the ball.
Griswold - Wednesday, September 30, 2009 - link
Well, you have to consider that nvidia is getting between a rock and a hard place. The PC gaming market is shrinking. There's not much point in making desktop chipsets anymore... they have to shift focus (and I'm sure they will focus) on new things like GPGPU. I won't be surprised if GT300 won't be the super awesome gamer GPU of choice so many people expect it to be. And perhaps the one after GT300 will be even less impressive for gaming, regardless of what they just said about making humongous chips for the high-end segment.
SiliconDoc - Wednesday, September 30, 2009 - link
Gee nvidia is between a rock and a hard place, since they have an OUT, and ATI DOES NOT.lol
That was a GREAT JOB focusing on the wrong player who is between a rock and a hard place, and that player would be RED ROOSTER ATI !
--
no chipsets
no chance at TESLA sales in the billions to colleges and government and schools and research centers all over the world....
--
buh bye ATI ! < what you should have actually "speculated"
...
But then, we know who you are and what you're about -
TELLING THE EXACT OPPOSITE OF THE TRUTH, ALL FOR YOUR RED GOD, ATI !
--
silverblue - Thursday, October 1, 2009 - link
When nVidia actually sends out Fermi samples for previews/reviews, only then will you know how good it is. We all want to see it because we want competition and lower prices (and maybe some of us will buy one or more, as well!).
Until then, keep your fanboy comments to yourself.
SiliconDoc - Thursday, October 1, 2009 - link
No silverblue, that is in fact your problem, not mine, as you won't know anything, till you're shown a lie or otherwise, and it's shoved into your tiny processor for your personal acceptance.
The fact remains, red fanboy raver Griswold blew it, and I pointed out exactly WHY.
The fact that you cry about it, because you group stupid dummies keep blowing nearly every statement you make, sure isn't my fault.
silverblue - Thursday, October 1, 2009 - link
I wonder if you do actually read posts before you reply to them.
SiliconDoc - Thursday, October 1, 2009 - link
Take your own advice, you pathetic hypocrit.
ClownPuncher - Thursday, October 1, 2009 - link
Its actually "hypocrite".
SiliconDoc - Friday, October 2, 2009 - link
It's "it's", you pathetic hypocrit.
silverblue - Friday, October 2, 2009 - link
It's "hypocrite", you pathetic hypocrite.
chizow - Wednesday, September 30, 2009 - link
Nvidia is simply hedging their bets and expanding their horizons. They've still managed to offer the fastest GPUs per product cycle/generation and they're clearly far more advanced than AMD when it comes to GPGPU in both theory and practice.
Jensen's keynote tipped his hat numerous times to Nvidia's roots as a GPU company that designed chips to run 3D video games, but the focus of his presentation was clearly to sell it as more than that, as a cGPU capable of incredible computational ability.
Zingam - Thursday, October 1, 2009 - link
No no! This is just on paper! When will we see it for real!! Oh... Q2-3-4 next year! :)
So you cannot claim they have the better thing because they don't have it yet! And don't forget next year we might have the head-smashing Larrabee!
:)
Who knows!!! I think you are way too biased and not objective when you type!
chizow - Thursday, October 1, 2009 - link
Heheh if Q2 is what you want to believe when you cry yourself to sleep every night, so be it. ;)
Seriously though, it's looking like late Q4 or early Q1 and it's undoubtedly meant for one single purpose: to destroy the world of ATI GPUs.
As for Larrabee lol...check out some of the IDF news about it. Even Anand hints at Laughabee's failure in his article here. It may compete as a GPGPU extension of x86, but not as a traditional 3D raster, not even close.
SiliconDoc - Thursday, October 1, 2009 - link
Gosh you'd be correct, except here is the FERMI:
http://www.fudzilla.com/content/view/15762/1/
There it is bubba. you blew your yap wide open in ignorance and LOST.
Good job, you've got plenty of company.
ClownPuncher - Thursday, October 1, 2009 - link
Wow, a video card! On top of that pcb could be a cat shit for all we know. The card does not exist, because I can't touch it, I can't buy it, and I can't play games on it.
Also, the fact that you seem to get all of your info from Fudzilla speaks volumes. All of your syphilis-induced mad ramblings are tiresome.
Lifted - Thursday, October 1, 2009 - link
I see what appears to be a PCB with some plastic attached, and possibly a fan in there as well. Yawn.
ksherman - Wednesday, September 30, 2009 - link
Really like these kinds of leaps in computing power, I find it fascinating. A shame that it seems nVidia is pulling a bit away from the mainstream graphics segment, but I suppose that means that the new cards from ATI/AMD are the undisputed choice for a graphics card in the next few months. 5850 it is!
fri2219 - Wednesday, September 30, 2009 - link
For the love of Strunk and White, stop murdering English in that manner - it detracts from the text buried between banner ads.
Sunday Ironfoot - Wednesday, September 30, 2009 - link
nVidia have invented a new way to fry eggs, just crack one open on top of their GPU and play some Crysis. :-)
SiliconDoc - Wednesday, September 30, 2009 - link
Let's crack it on page 4. A more efficient architecture: max threads in flight. Although the DOWNSIDE is sure to be mentioned FIRST, as in "not as many as GT200", and the differences mentioned later, the hidden conclusion with the dissing included is apparent.
Let's draw it OUT.
---
What should have been said 1st:
Nvidia's new core is 4 times more efficient with threads in flight, so it reduces the number of those from 30,720 to 24,576, maintaining an impressive INCREASE.
---
Yes, now the simple calculation:
GT200 30720x2 = 61,440 GT300 24576x4 = 98,304
at the bottom we find second to last line the TRUTH, before the SLAM on the gt200 ends the page:
" After two clocks, the dispatchers are free to send another pair of half-warps out again. As I mentioned before, in GT200/G80 the entire SM was tied up for a full 8 cycles after an SFU issue."
4 to 1, 4 times better, 1/4th the clock cycles needed
" The flexibility is nice, or rather, the inflexibility of GT200/G80 was horrible for efficiency and Fermi fixes that. "
LOL
With a 4x increase in this core design area, first we're told GT200 "had more", then we're told Fermi is faster in terms that allow > the final tale, GT200 sucks.
--
I just LOVE IT, I bet nvidia does as well.
tamalero - Thursday, October 1, 2009 - link
on paper everything looks amazing, just like the R600 did in its time, and the Nvidia FX series as well. so please, just shut up and stop spreading your FUD until there's real information, real benches, real useful stuff.
The R600 was great, you idiot.
Of course, when hating nvidia is your real gig, I don't expect you to do anything but parrot off someone else's text and get the idea wrong, get the repeating incorrect.
-
The R600 was and is great, and has held up a long time, like the G80. Of course if you actually had a clue, you'd know that, and be aware that you refuted your own attempt at a counterpoint, since the R600 was "great on paper" and also "in gaming machines".
It's a lot of fun when so many fools self-proof it trying to do anything other than scream lunatic.
Great job, you put down a really good ATI card, and slapped yourself and your point doing it. It's pathetic, but I can't claim it's not SOP, so you have plenty of company.
papapapapapapapababy - Wednesday, September 30, 2009 - link
because both ms and sony are copying nintendo... that means next consoles > minuscule speed bump, low price and (lame) motion control attached. All this tech is useless with no real killer app EXCLUSIVE FOR THE PC! But hey who cares, let's play PONG at 900 fps !
Lonyo - Wednesday, September 30, 2009 - link
Did you even read the article?
The point of this tech is to move away from games, so the killer app for it won't be games, but HPC programs.
SiliconDoc - Thursday, October 1, 2009 - link
I think the point is - the last GT200 was ALSO TESLA -- and so of course...It's the SECOND TIME the red roosters can cluck and cluck and cluck "it won't be any good" , and "it's not for gaming".
LOL
Wrong before, wrong again, but never able to learn from their mistakes, the barnyard animals.
Zingam - Thursday, October 1, 2009 - link
The last time I bought the most expensive GPU available was the Riva TNT!
Sorry, but even if they offer this for gamers I won't be able to buy it. It is high above my budget.
I'd buy based on quality/price/features! And not based on who has the better card on paper in year 20xx.
SiliconDoc - Thursday, October 1, 2009 - link
Well, for that, I am sorry in a sense, but on the other hand find it hard to believe, depending upon your location in the world.
Better luck if you're stuck in a bad place, and good luck on keeping your internet connection in that case.
ClownPuncher - Thursday, October 1, 2009 - link
Or maybe he has other priorities besides being an asshole.
SiliconDoc - Thursday, October 1, 2009 - link
Being unable, and choosing not to, are two different things.
And generally speaking ati users are unable, and therefore cannot choose to, because they sit on that thing you talk about being.
Now that's how you knockout a clown.
Lord 666 - Wednesday, September 30, 2009 - link
That actually just made my day; seeing a VP of Marketing speak their mind.
Cybersciver - Friday, October 2, 2009 - link
Yeah, that was cool.
Don't know about you guys, but my interest in GPU's is gaming @ 1920x1200. From that pov it looks like Nvidia's about to crack a coconut with a ten-ton press.
My 280 runs just about everything flat-out (except Crysis, naturally) and the 5850 beats it. So why spend more? Most everything's a console port these days and they aren't slated for an upgrade till 2012, at least last I heard.
Boo hoo.
Guess that's why multiple-screen gaming is starting to be pushed.
No way Jose.
SiliconDoc - Thursday, October 1, 2009 - link
Plenty hard, but they GOT HER DONE, and here is the pic of her:
http://www.fudzilla.com/content/view/15762/1/
Yes, now about that fantasy paper anand was spewing on - yes he won't get one for two months, but AS I SAID, WE ALREADY KNOW IT BEATS the ati epic failure.
rennya - Thursday, October 1, 2009 - link
Where can I get that GPU? At least at my place I can get a 5870 GPU if I want to, but not so for this GPU.
SiliconDoc - Thursday, October 1, 2009 - link
Well go get one.
Now we're down to the launch and paper lies in this article, which were lies, as I've said. Bigger lies by the red texters. If I were Anand I'd be giggling at you fools.
rennya - Thursday, October 1, 2009 - link
Newegg link please. Or any other online retailer website for that matter.
Didn't I already say to stop it with the paper launch already? That only exists in your dreams you know. Or maybe America. But such a thing is not true here. Just because America doesn't have enough units doesn't mean it is true everywhere else.
SiliconDoc - Thursday, October 1, 2009 - link
Oh, so sorry mi'lady, here is your newegg link, you'll see 2 greyed-out 5850's and THAT'S IT.
http://www.newegg.com/Product/ProductList.aspx?Sub...
Can't buy 'em. Paper, e-paper in this case, digital nothing.
--
Now if we only could send you some money, and you could trot over to your imaginary shops.... gee if only someone gave you some money, you could HAVE A PICTURE, because everyone knows handy little cameras are BANNED there, huh.
Gee, all those walks to work.. and not ten seconds to take a pic, and now it's really, too late LOL
ahhahahahahaaa
--
More of that wonderful "red rooster evidence".
ClownPuncher - Thursday, October 1, 2009 - link
I bought one, 5850 that is. They are popular, so they sell out. The same thing happened to me when the 8800GT's launched, bought 2 and they were sold out 15 minutes later.
If people can buy them, it isn't a paper launch. Give it up. There were cards for sale from many etailers on launch day for the 5870 as well.
You're saying the 8800GT and 8800GTS g92 were paper launches also? They were selling out in minutes.
SiliconDoc - Friday, October 2, 2009 - link
Funny how you wait for the perfect post to claim your lie is true.
You're a pure troll, nothing more, in every single post you've made here.
Of course I know you're lying.
rennya - Friday, October 2, 2009 - link
You are the one who is lying by claiming that the 5870 is a paper launch, when availability at my place is pretty good. Then you claim that I do not actually come from a SE Asia country, but the admins on this site can easily verify my IP and see where I come from. Accusing people of lying will not make you look good.
SiliconDoc - Thursday, October 1, 2009 - link
Neither does europe, nor africa, nor SA, nor the ME, apparently the only spot is your walk to work. Congratulations, you're at ground zero. Just think how lucky you are.
rennya - Friday, October 2, 2009 - link
I am not the one who claims GT300 is available, you do with your fudzilla link. And that picture may only show mock-ups, because nVidia doesn't have any working demo.
At least Intel with its Larrabee did showcase their unimpressive raytracing demo at IDF.
siyabongazulu - Friday, October 2, 2009 - link
rennya
The one I was going to buy on the 24th was from ncix.com but shipping and handling was gonna be a bit too much. The fact that they had the cards on the launch date, the 23rd, goes to show how big a liar silicondoc is. Plus his link on fudzilla, just like I said before, betrays him since it does not say that the model being shown is a working model. I can say I'm making a remote that can trump all other remotes on performance levels and will be the next big thing for gadget lovers. But if I come up with a supposed model and just flash it around, it doesn't mean that it's a working model unless I can prove that it is by using it. So until Nvidia shows the card at work and gives us numbers based on such a task, then we can talk about it. Hey, don't forget that we also go for bang for buck, reason why I got the 5870. I have a feeling that when GT300 does launch, it will be big, but I'd also like to see how ATI responds to that.
siyabongazulu - Friday, October 2, 2009 - link
rennya
I think we all get it. This guy is mentally ill. So I posted those links to refute his argument and he hasn't touched any. He knows he's lying and that his fudzilla link betrays him. So he'll keep on ranting. silicondoc has to remember this: all big companies that have a working model of an upcoming product do a demo and show the figures, and they don't just give you paperwork (which is what Nvidia is doing right now). And it doesn't take a genius to see that GT300, or whatever they wanna call it, is not even close to being released, unless you would want to call many sources such as tgdaily, the inquirer (http://www.theinquirer.net/inquirer/news/1052025/g...) and of course anand big liars - then go ahead silicondoc. So until it comes out (be it Nov, mid Oct or Jan) then we can come back and talk performance. Have fun silicondoc, and when GT300 performance trumps the HD5870 then I'll sell the 5870 for a GT300 card, but don't forget we also need to see how much power that monster will be drawing. I call it a monster because so far the on-"paper" performance puts it that way
siyabongazulu - Friday, October 2, 2009 - link
by the way sorry I forgot to pass you the link where I got mine.. here you go http://www.canadacomputers.com/index.php?do=ShowPr... and here is another site - they have had them in stock since the 23rd of sept just like predicted. Man you are full of shit.. was gonna buy mine from there but decided to save the money on shipping and boom, 3 days later it was in a store right next door. woohoo.. I just said bye bye to my nvidia card and if you wanna know, it was, yes you guessed it, a GTX 285, and glad I got almost full value for that!!
siyabongazulu - Friday, October 2, 2009 - link
Here SD.. And that's where I got mine you dumb prick.. now stop fussin and if you ask when I got it, well just so you know it was on the 29th you idiot, and that is simply because I'm in Canada. So if Nvidia had it launched, wouldn't it be somewhere on the web now? By the way, here is an article that refutes your claim that the Radeon HD 5870 was a fud (http://www.engadget.com/2009/09/24/4-000-alienware...) - Dell had it already. Here is another (http://www.engadget.com/2009/09/23/maingear-cyberp...). And all of those pc makers had the cards prior to launch date. So I don't know how you can argue that. So if Nvidia were to have its card as you claim by using fudzilla pictures, then please provide us with a link that shows any manufacturer that is already selling a pc with your GT300, and here is another link that shows how dumb you are http://www.engadget.com/2009/09/23/ati-radeon-hd-5... hope that helps
TA152H - Wednesday, September 30, 2009 - link
I agree, it shows sooooo much intelligence and class to curse! It's so creative! It's so wonderful! It's a real shame we don't have more foul mouths in positions of power, because it's just so entertaining!Yay!
yacoub - Wednesday, September 30, 2009 - link
I'm with you. Oh wait, don't forget the bold to make sure no one misses it!
TA152H - Thursday, October 1, 2009 - link
Admin: You've overstayed your welcome, goodbye.
I am beginning to think this site is lost. One thing I always liked about Anand was his tone was never harsh, even if I didn't agree with the content.
Now he's cursing. Yes, and as you said, in bold. Ugggh.
This and their idiotic pictorials of motherboards, then the clear bias towards Lynnfield. Again, I'm not complaining that they liked it, so much as the way they lied about the numbers, and it took a lot of complaints to show that Bloomfield was faster. They were trying to hide that.
Their unscientific testing has also gotten to the point of absurdity.
I like their web page layout, but it's getting to the point where this site is becoming much less useful for information.
It's easy to get to the point where you do what people say they want. Yes, the jerks like to see curses, and think it's cool. I'm sure they got page reads from the idiotic pictorials of motherboards. Most of the people here did want to be lied to about Lynnfield, since it was something more people could afford compared to Bloomfield. It's tempting, but, ultimately, it's a mistake.
I'd like to see them do something that requires intelligence, and a bit more daring. Pit one writer (not Gary, he's too easy to beat) against another. Have, point and counterpoint articles. Let's say Anand is pro-Lynnfield, or pro-ATI card, or whatever. Then they use Jarrod to argue the points against it, or for the other card. Now, maybe Anand isn't so pro this item, or Jarrod isn't against it. Nonetheless, each could argue (anyone can argue and make points, because nothing in this world is absolutely good or bad, except for maybe pizza), and in doing so bring up the complexity of parts, rather than making people post about the mistakes they make and then have them show the complexity.
You think Anand would have put up those overblown remarks in his initial article on Lynnfield if he knew Jarrod would jump on him for it? I'd be more careful if I were writing it, so would he. I think the back and forth would be fun for them, and at the same time, would make them think and bring out things their articles never even approach.
It's better than us having to post about their inaccuracies and flaws in testing. It would be more entertaining too. And, people can argue, without disliking each other. Argument is healthy, and is a sign of active minds. Blind obedience is best relegated to dogs, or women :P. OK, I'm glad my other half doesn't read these things, or I'd get slapped.
ClownPuncher - Thursday, October 1, 2009 - link
Don't let the door hit you in the ego on the way out.
the zorro - Wednesday, September 30, 2009 - link
this is a catastrophe.
really, nvidia is finished.
this means four or five months of amd ruling.
seriously nvidia has nothing, nothing, zero, nada,
kaput nvidia is over.
Lifted - Wednesday, September 30, 2009 - link
Huh? Is that really all the troll you could muster for this article? SiliconDoc has you beat by a mile.
SiliconDoc - Wednesday, September 30, 2009 - link
I'm sure Anand brought it out of him with his bias.
Already on page one, we see the UNFAIR comparison to RV870, and after wailing Fermi "not double the bandwidth" - we get ZERO comparison, because of course, ATI loses BADLY.
Let me help:
NVIDIA : 240 G bandwidth
ati : 153 G bandwidth
------------------------nvidia
---------------ati
There's the bandwidth comparison, that the biased author couldn't bring himself to state. When ati LOSES, the red fans ALWAYS make NO CROSS COMPANY comparison.
Instead it's "nvidia relates to it's former core as ati relates to it's former core - so then "amount of improvement" "within in each company" can be said to "be similar" while the ACTUAL STAT is "OMITTED !
---
Congratulations once again for the immediate massive bias. Just wonderful.
omitted bandwidth chart below, the secret knowledge the article cannot state ! LOL a review and it cannot state the BANDWIDTH of NVIDIA's new card! roflmao !
------------------------nvidia
---------------ati
NVIDIA WINS BY A VERY LARGE PERCENTAGE.
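For what it's worth, the arithmetic behind figures like these is easy to check: peak memory bandwidth is just the bus width in bytes multiplied by the effective data rate. Here is a small sketch using the publicly quoted 5870 and GTX 285 memory specs; Fermi's memory clock had not been disclosed at this point, so its line uses an assumed data rate purely as an illustration.

```c
/* bandwidth.c - back-of-the-envelope peak memory bandwidth.
 * peak GB/s = (bus width in bits / 8) * effective data rate in GT/s */
#include <stdio.h>

static double peak_bw_gbs(int bus_width_bits, double data_rate_gtps) {
    /* bytes moved per transfer, times billions of transfers per second */
    return (bus_width_bits / 8.0) * data_rate_gtps;
}

int main(void) {
    printf("HD 5870 (256-bit, 4.8 GT/s GDDR5):  %.1f GB/s\n", peak_bw_gbs(256, 4.8));
    printf("GTX 285 (512-bit, 2.48 GT/s GDDR3): %.1f GB/s\n", peak_bw_gbs(512, 2.48));
    /* Fermi is 384-bit GDDR5; the data rate below is an assumption, not a spec. */
    printf("Fermi   (384-bit, 4.0 GT/s assumed): %.1f GB/s\n", peak_bw_gbs(384, 4.0));
    return 0;
}
```

Note that a 240 GB/s figure for Fermi only follows if its GDDR5 ends up running at roughly 5 GT/s effective, which nobody outside Nvidia had confirmed at the time of this discussion.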
konjiki7 - Friday, October 2, 2009 - link
http://www.hardocp.com/news/2009/10/02/nvidia_fake...
Samus - Thursday, October 1, 2009 - link
That's great and all, nVidia has more available bandwidth, but.... they're not anywhere close to using it (much like ATi), so exactly what is your point?
SiliconDoc - Friday, October 2, 2009 - link
Wow, another doofus. Overclock the 5870's memory only, and watch your framerates rise. Overclocking the memory increases the bandwidth, hence the use of it. If frames don't rise, it's not using it, doesn't need it, and extra is present.
THAT DOESN'T HAPPEN for the 5870.
-
Now, since FERMI has 40% more T in core, and an enormous amount of astounding optimizations, you declare it won't use the bandwidth, but your excuse was your falsehood about ati not using its bandwidth, which is 100% incorrect.
Let's pretend you meant GT200, same deal there, higher mem oc = more bandwidth and frames rise well.
Better luck next time, since you were 100% wrong.
mm2587 - Thursday, October 1, 2009 - link
you do realize the entire point of mentioning bandwidth was to show that both Nvidia and AMD feel that they are not currently bandwidth limited. They have each doubled their number of cores but only increased bandwidth by ~50%. There's no mention of overall bandwidth because that's not the point that was being made. Just an offhand observation that says "hey, looks like everyone feels memory bandwidth wasn't the limitation last time around"
Zingam - Thursday, October 1, 2009 - link
ATI has it here and has it now! NVIDIA does not win because on paper I have a 50 billion transistor GPU on a 1 nm process! I win! ;)
You are a retarded fanboy! And I am not. I'd buy what's best for my money.
SiliconDoc - Thursday, October 1, 2009 - link
Behold the FERMI GPU, unbeliever !
http://www.fudzilla.com/content/view/15762/1/
That's called a COMPLETED CARD, RUNNING SILICON.
Better luck next time incorrect ignorant whining looner.
siyabongazulu - Friday, October 2, 2009 - link
Do you see any captions on that site? I don't think so. Nowhere does it mention that it's a complete card. So please stop lying, because that goes to show how ignorant you are. Any person with a sound mind can and will tell you that it's not a finished product. So come up with something more valid to show and rant about. Sorry that your big daddy Huang hasn't given you your green slime, if you like it that way. Just wait on the corner and when he says GT300 is a go and tests confirm that it trumps the 5870, then you can stop crying and suck on that.
silverblue - Thursday, October 1, 2009 - link
When's it coming out?
I mean, you have all the answers.
SiliconDoc - Thursday, October 1, 2009 - link
Well thanks for the vote of confidence, but yesterday, on the launch, according to the author, right ?
LOL
Ha, golly, what a pile.
silverblue - Thursday, October 1, 2009 - link
Anand's entitled to make mistakes. You do nothing else.
SiliconDoc - Thursday, October 1, 2009 - link
Oh golly, another lie.
First you admit I'm correct, FINALLY, then you claim only mistakes from me.
You're a liar again.
However, I congratulate you, for FINALLY having the half baked dishonesty under enough control that you offer an excuse for Anand.
That certainly is progress.
silverblue - Friday, October 2, 2009 - link
And you conveniently forget the title of this article which clearly states 2010.
johnsonx - Wednesday, September 30, 2009 - link
I think there might be something wrong with SiliconDoc. Something wrong in the head.SiliconDoc - Thursday, October 1, 2009 - link
I think that pat fancy can now fairly be declared the quacking idiot group collective's complete defense.Congratulations, you're all such a pile of ignorant sheep, you'll swather together the same old feckless riddle for eachothers emotional comfort, and so far to here, nearly only monkeypaw tried to address the launch lie pointed out.
I suppose a general rule, you love your mass hysterical delusionary appeasement, in leiu of an actual admittance, understanding, or mere rebuttal to the author's false launch accusation in the article, the warped and biased comparisons pointed out, and the calculations required to reveal the various cover-ups I already commented on.
Good for you people, when the exposure of bias and lies is too great to even attempt to negate, it's great to be a swaddling jerkoff in union.
I certainly don't have to wonder anymore.
Griswold - Wednesday, September 30, 2009 - link
So, you're the new village fool?
Finally - Thursday, October 1, 2009 - link
Make that "Global Village Fool 2.0"
He is an advanced version, y'know?
SiliconDoc - Wednesday, September 30, 2009 - link
Nvidia LAUNCHED TODAY... see page two by your insane master Anand.
--
YOU'VE all got the same disease.
MonkeyPaw - Wednesday, September 30, 2009 - link
Is sanity now considered to be a disease? We're not the ones visiting a website on which we so aggressively scream "bias" on (apparently) every GPU article. If you think Anand's work is so offensive and wrong, then why do you keep coming back for more?
Anyway, I just don't see where you get this "bias" talk. For crying out loud, you can't make many assumptions about the product's performance when you don't even know the clock speeds. You can guess till you're blue in the face, but that still leaves you with no FACTS. Also keep in mind that GT300 will have ECC enabled (at least in Tesla), which has been known to affect latency and clock speeds in other realms. I'm not 100% sure how the ECC spec works with GDDR5, but usually ECC comes at a cost.
As for "paper launch," ultimately semantics don't matter. However, a paper launch is generally defined as a product announcement for something you cannot buy yet. It's frequently used as a business tactic to keep people from buying your competitor's products. If the card is officially announced (and it hasn't been), but no product is available, then by my definition, it is a paper launch. However, everyone has their own definition of the term. This article I see more as a technology preview, though nVidia's intent is still probably to keep people from buying an RV870 right now. That's where the line blurs.
SiliconDoc - Thursday, October 1, 2009 - link
A launch date is the date the company claims PRODUCT WILL BE AVAILABLE IN RETAIL CHANNELS.
No "semantics" you whine about or cock-a-doodle-doo up will change that.
A LAUNCH date officially, as it has been for YEARS sonny paw, is when the corp says "YOU CAN BUY THIS" as a private end consumer.
---
Anything ELSE is a showcase, an announcement, a preview of upcoming tech, a marketing plan, ETC.
---
YOU LYING ABOUT THE VERY ABSOLUTE FACTS THAT FOR YEARS HAVE APPLIED PERIOD IS JUST ANOTHER RED ROOSTER NOTCH ACQUIRED.
rennya - Thursday, October 1, 2009 - link
Here in SE Asia, the 5870 GPU is available in abundance in retail channels. If you PayPal me USD450, I can go straight to any of the computer shops I pass when I go to work, so that I can buy the card (and a casing that will fit the full-length card), then I can take pictures and show it to you.
Stop it with the claims that the 5870 launch is just a paper launch. That patently isn't true, and will only make you look stupid.
SiliconDoc - Thursday, October 1, 2009 - link
I'm sure your email box is overflowing with requests, and I'm sure your walk to work will serve all the customers around the world.Thanks for that great bit of information for those walking to work with you in SE asia, I bet they're really happy.
---
Maybe you should get a Reseller ID, and make that millionaire dream of yours come true, and soon when rooster central flaps it up again, you can prove to the world dry as a bone ain't rice paper.
---
No, one cannot really fathom the insanity, and red rooster doesn't describe the thickness of skull properly at all, merely the size of it's contents.
rennya - Friday, October 2, 2009 - link
Nope, my inbox is not overflowing with requests, because after all, anyone who wants a 5870 GPU will be able to get it.
If you cannot prove that the 5870 is a paper launch, maybe you should shut up your shop?
Sozo - Thursday, October 1, 2009 - link
If we are "red roosters" what does that make you? The green grizzly?SiliconDoc - Thursday, October 1, 2009 - link
Actually the first person to offer any thought on the matter suggested green goblin, which was a decent attempt, since grizzly bears aren't green, and goblins have a much better chance of being so.
However, if you'd like the actual nvidia equivalence of what you ati red roosters are, I'd be happy to provide some examples for you, which I have not done as of yet, and of course you're all too stupid rah-rah to even fathom that. That's pretty sad, and only confirms the problem. I'm certain you can't understand, so don't bother yourself.
http://www.fudzilla.com/content/view/15762/1
silverblue - Thursday, October 1, 2009 - link
What sort of rooster are we talking? I mean, a Sussex rooster is almost exclusively not red. Can I be that one, please?
Now THAT's trolling.
Natfly - Thursday, October 1, 2009 - link
I'm thinking a green goober.
SiliconDoc - Thursday, October 1, 2009 - link
If you even believed your own pile of fud, you'd go to page 2, I believe it is, in the article and see where anand says "sorry, that's all we know about the GT300 the game card, nvidia won't tell us anymore".
What he was told is IT'S FASTER THAN 5870, and the cores have already been cut, and the cards are already under test.
So we already know, if we aren't a raging red doofus, and of course, that is very difficult for almost everyone here.
Also, this was not an official launch date for NVidia, they never declared it as such, just Anand declared it in his article.
The official launch date for GT300, already spoken about multiple times by the authors of this website, is !!! > THE RELEASE DATE OF WINDOWS 7...
Now, whether nvidia changes their official launch date before then or not, or where the authors got that former information, one can surmise, but changing their AT tune about nvidia in an article title, for a conference and a web video attendance, in order to appease the shamed and embarrassed 3rd time in a row paper launching ati, 4870, 4770, 5870, is not "unbiased" nor is it honest, no matter how much you want it to be.
If a person wants to claim it's a planned LEAK to showcase upcoming tech ( nvidia did this AFTER the GT300 gpu cores reported GOOD YIELD) - and combat fools purchasing the epic failure 5870 instead of waiting for the gold, ok.
siyabongazulu - Friday, October 2, 2009 - link
WOw wow wow!! You sir must be the most ignorant, manipulative, underappreciating bastard.. sorry for tearing your world, but you deserve such credentials and a lot more that can be given to people who display your kind of behaviour.
You have been crying bias for no reason at all. If Anand says it's a paper launch, and if tgdaily says it's a paper launch (http://www.tgdaily.com/content/view/44157/135/) and fudzilla (http://www.fudzilla.com/content/view/15762/1/), which seems to be your favourite source so far, doesn't even speak of anything but a display model, that only confirms that GT300 is under construction.
So the only source you can come up with is yourself and you said it here and I quote "If you even believed your own pile of fud, you'd go to page 2 I believe it is in the article and see where anand says " sorry that's all we know about the GT300 the game card, nvidia won't tell us anymore"
What he was told is IT'S FASTER THAN 5870, and the cores have already been cut, and the cards already under test. " Those are your words, not NVIDIA's, not Anand's, not Fudzilla's, not from any other reviewers - just yours.
Therefore, can you please STFU and stop trying to label everyone a red nosed rooster or whatever the f*** u call them.
P.S. Not everyone appreciates your level of stupidity, and before you can go and say "geez, there goes another one", FYI I'm running my system on an Nvidia card and will buy ATI; and should NVIDIA "Physically Launch" GT300 and prove it to be better than the already launched and benchmarked 5870, then you can come back and start your ranting. Until then plug that sh** hole of yours
MonkeyPaw - Thursday, October 1, 2009 - link
Dude, you take this way too personally. Do you have the same burning passion for real problems?
SiliconDoc - Thursday, October 1, 2009 - link
Nice consolation speech.
I guess you expected " you're right ", but somehow lying to make you feel good is not in my playbook.
Now, next time you don't take it so seriously as to reply, and then still, be pathetic enough to get it wrong. Hows that for a fun deal ?
Maian - Wednesday, September 30, 2009 - link
Where's snakeoil when you need him... I don't give a shit about vendor, but the flame wars here are hilarious :D
Lifted - Wednesday, September 30, 2009 - link
What flame war? It's just a single nut barking at everyone for no reason. If he has a problem with the article he's sure making it difficult to figure out what it is with all his carrying on and red rooster nonsense.Does anyone (besides the nut) actually care what is said in this article? It's simply something to pass the time with, and certainly not worth getting upset over. Is the nut part of the nvidia marketing machine or merely a troll? It almost seems as if he's writing in a manner as to cover up his true identity. Yes silicondoc, it IS that obvious.
SiliconDoc - Thursday, October 1, 2009 - link
Wow, a conspiracist.
Well, for your edification, you didn't score any points, since the readers here get all uppity about what's in the articles, so they have shown a propensity to care, even if you're just here to pass the time, or lie your yapper off for the convenient line it provides you for this moment.
Usually, the last stab of the sinking pirate goes something it like: " It doesn't matter !"
Then Davey Jones proves to 'em it does.
-
Nice try, but the worst problem for you is, it matters so much to you, you think I'm not me. NOW THAT's FUNNY !
ahhahahahaaha
ClownPuncher - Wednesday, September 30, 2009 - link
Aspbergers.
Kaleid - Wednesday, September 30, 2009 - link
No, most people with Asperger's are highly functional. This is something else.
Finally - Thursday, October 1, 2009 - link
It's Rain Man?
tamalero - Friday, October 2, 2009 - link
its assburgers, google it.
redpriest_ - Wednesday, September 30, 2009 - link
I can't help but note for the record that um, the card isn't out yet, so how can they win when no one knows when you can actually buy one yet? And for the record, I have a 5870 in my system, right now, that can play games....right now. I went to a retail store and bought it. That's how simple it was. I know you've been posting tons of FUD in the other review forums about how it's unavailable etc etc but the fact is, it IS available, and multiple people can own one.
Also, let me state for the record that I have owned nvidia GPUs in the past so that I'm vendor agnostic. I buy whatever solution is available and better. KEY POINTS: AVAILABLE. BETTER.
SiliconDoc - Thursday, October 1, 2009 - link
Here's how they can win - here the NVidia master holds FERMI up for all to see!
http://www.fudzilla.com/content/view/15762/1/
Aww, dat too bad for the wittle wed woosters. It really is real, little red lying hoods.
silverblue - Thursday, October 1, 2009 - link
Am I hearing you right - you say GT300 isn't a paper launch despite there being no cards for sale for the next few months, yet you said the 5870 was AND THERE WERE CARDS FOR SALE WHEN YOU MADE THE COMMENT! I don't care that you couldn't locate one, the simple fact is people had already bought cards from the first trickle (emphasis on the trickle part) and as such made your statement completely invalid.
How much more rubbish are you going to spew from your hole?
(note: I needed to have caps above to make a salient point and not just because I felt like holding the shift key for no particular reason)
SiliconDoc - Thursday, October 1, 2009 - link
roflmao - If you're hearing anything right, you'd keep your text yap shut.
Please show me the LAUNCH information on GT300, there, brainless bubba, the liar. I really cannot imagine you are that stupid, but then again, it is possible.
Congratulations for being a COMPLETE IDIOT AND LIAR! Really, you must work very hard to maintain that level of ignorance. In fact, the requirement to be that stupid exceeds the likelihood that you actually are purely ignorant, and therefore, it is more likely you're a troll. My condolences in either case in all seriousness.
silverblue - Thursday, October 1, 2009 - link
The proposed launch is late November, but even Fudzilla concedes that any problems will delay this. The earliest we'll see a GT300 on the shelves is just under 2 months. There, that's information for you. I want nVidia to launch GT300 this year but we don't always get what we wish for.
Where did I lie in any of my previous posts? Oh right... I didn't bow down to worship the Green God(dess). If you had any semblance of an open mind or any stability at all, your nose wouldn't need cleaning. Calling me a troll is pure comedy gold and offering me your pity is outstanding to say the least :)
Keep trying. Or don't. Either way, I doubt many people care for your viewpoints anymore.
SiliconDoc - Thursday, October 1, 2009 - link
Oh jeeze, one red rooster who finally gets it.Congratulations, you're not the dumbest of your crowd.
--
No shirking here comes the QUOTE !
" The proposed launch is late November " !!! whoo hoo !
Now a proposed launch is not an official launch- try to keep that straight in the gourd haters when the time comes.
--
Pass it along to all the screaming tards, won't you please, you talk their language, or perhapos we'll just say you already have, because, by golly, they can believe you.
ROFLMAO
rennya - Thursday, October 1, 2009 - link
'Nvidia LAUNCHED TODAY... se page two by your insane master Anand.'This what you have said yourself somewhere in this very discussion. So you must have known about this so-called launch yourself.
SiliconDoc - Thursday, October 1, 2009 - link
That's called sarcasm dear. Jiminy crickets.palladium - Thursday, October 1, 2009 - link
How can you tell if that's not a GTX285 with redesigned cover/cooler/PCB?samspqr - Monday, October 5, 2009 - link
it is SO funny that that thing silicondoc's master/god is holding in his hand ended up revealing itself as nothing more than a mock-up...SiliconDoc - Wednesday, September 30, 2009 - link
PS THE SILICON IS ALREADY CUT AND IN PRODUCTION!---
Yes anand at the bottom of page 1 claims "it's paper" - DECIEVING YOU, since the WAFERS HAVE ALREADY BEEN BURNED AND YIELDS ARE REPORTED HIGH ! (in spite of ati's marketing arm lying and claiming "only 9 cores per wafer yields" - A BIG FAT LIE NVIDIA POINTED OUT !
Where have you been with your head in the sand ?
--
So at the bottom of page 1 Anand leaves you dips with the impression "it's all paper" (but the TRUTH is DEVELOPER CARDS ARE ALREADY ASSEMBLED and being DEBUGGED and TESTED) just anand won't get one for 2 months.
---
THEN BY PAGE 2 ANAND CALLS IT A PAPER LAUNCH !
roflmao
Yes, the red rooster himself has convinced himself "today's nvidia LAUNCH" (that LAUNCH word is what anand made up in his deranged mind) is a paper launch "JUST LIKE ATI'S!".
---
It is nothing short of absolutely AMAZING.
The red rooster fan has boonswoggled his own gourd, stated in fasle terms, bashed it to be as bad as what ati just did with 5870, and IT'S NOT EVEN A LAUNCH DAY FOR NVIDIA !
---
Congratulations, the massive bias is SCREAMING off the page. LOL
It's hilarious, to say the least, that the master can be that deluded with his own spew!
gx80050 - Friday, October 2, 2009 - link
Die painfully okay? Prefearbly by getting crushed to death in a
garbage compactor, by getting your face cut to ribbons with a
pocketknife, your head cracked open with a baseball bat, your stomach
sliced open and your entrails spilled out, and your eyeballs ripped
out of their sockets. Fucking bitch
I would love to kick you hard in the face, breaking it. Then I'd cut
your stomach open with a chainsaw, exposing your intestines. Then I'd
cut your windpipe in two with a boxcutter.
Hopefully you'll get what's coming to you. Fucking bitch
I really hope that you get curb-stomped. It'd be hilarious to see you
begging for help, and then someone stomps on the back of your head,
leaving you to die in horrible, agonizing pain. Faggot
Shut the fuck up f aggot, before you get your face bashed in and cut
to ribbons, and your throat slit.
LawRecordings - Thursday, October 1, 2009 - link
Buwahahaha!!!What a sad, lonely little life this Silicon Doc must lead. I struggle to see how this guy can have any friends, not to mention a significant other. Or even people that can stand being in a room with him for long. Prolly the stereotypical fat boy in his mom's basement.
Careful SD, the "red roosters" are out to get you! Its all a conspiracy to overthrow the universe, and you're the only one that knows!
Great article Anand, as always.
Regards,
Law
Vendor agnostic buyer of the best price / performance GPU at the time
SiliconDoc - Thursday, October 1, 2009 - link
They can't get me, they've gotten themselves, and I've just mashed their face in it.And you're stupid enough to only be able to repeat the more than thousandth time repeated internet insult cleche's, and by your ignorant post, it appears you are an audiophile who sits bouncing around like a retard with headphones on, before after and during getting cooked on some weird dope, a HouseHead, right ? And that of course does not mean a family, doper.
So you giggle like a little girl and repeat what you read since that's all the stoned gourd can muster, then you kiss that rear nice and tight, brown nose.
Don't forget your personal claim to utter innocence, either, mr unbiased.
LOL
Yep there we have it, a househead music doused doped up butt kisser with a lame cleche'd brain and a giggly girl tude.
Golly, what were you saying about wifeless and friendless ?
ClownPuncher - Thursday, October 1, 2009 - link
What exactly is a cleche?Is it anything like a cliche?
Your spelling, grammar, and general lack of communication skill lead me to think that you are actually a double agent, it's an act if you will...an ATI guy posing as a really socially stunted Nvidia fan in an attempt to turn people off of Nvidia products solely by the ineptitude of your rhetoric.
UNCjigga - Wednesday, September 30, 2009 - link
I'd hate to have a political conversation with SiliconDoc, but I digress...Some very interesting information came out in today's previews. Will Fermi be a bigger chip than Cypress? Certainly. Will it be more *powerful* than Cypress? Possibly. Will it be more expensive than Cypress? Probably. Will it have more memory bandwidth than Cypress? Yes.
Will it *play games* better than Cypress? Remains to be seen. Too many factors at play here. We don't know clock speeds. We have no idea if "midrange" Fermi cards will retain the 384-bit memory interface. We have
For all we know, all of Fermi's optimizations will mean great things for OpenCL and DirectCompute, but how many *games* make use of these APIs today? How can we compare DirectX 11 performance with few games and no Fermi silicon available for testing? Most of the people here will care about game performance, not Tesla or GPGPU. Hell, its been years since CUDA and Stream arrived and I'm still waiting for a decent video encoding/transcoding solution.
Calin - Thursday, October 1, 2009 - link
Even between current cards (NVIDIA and AMD/ATI) the performance crown moves from one game to another - one card could do very well in one game and much worse in another (compared to the competition). As for not yet released cards, performance numbers in games can only be divined, not predictedBull Dog - Wednesday, September 30, 2009 - link
So how much in NVIDIA's focus group partner paying you to post this stuff?dzoni2k2 - Wednesday, September 30, 2009 - link
You seriously need to take your medicine. And call your shrink.dragonsqrrl - Thursday, October 1, 2009 - link
I know it seems like SiliconDoc is going on a ranting rage, because he kinda is, but the fact remains that this was a fairly biased article on the part of Anandtech. I've been reading reviews and articles here for a long time, and recently there has been a certain level of prejudice against Nvidia and its products that I haven't noticed on other legitimate review cites. This seems to have been the result of Anandtech getting left out of the loop last year. Throughout the article there is a pretty obvious sarcastic undertone towards what the Nvidia representatives say, and their newly announced GPU. I can only hope that this stops, so that anandtech can return to its former days of relatively balanced and fair reporting, which is all anyone can ask of any legitimate review cite. Articles of this manner and tone serve no purpose but to enrage people like SiliconDoc, and hurt Anandtech's image and reputation as a balanced a legitimate tech cite.Keeir - Thursday, October 1, 2009 - link
Curious in where you see the Bias.I see a little bit of the tone, but it seems warranted for a company that has for the last few years over-promised and under delivered. Very similar to how AMD/ATI was treated upto the release of the 4 series. Nvidia needs to prove (again) that it can deliever a real innovative product priced at an affordable level for the core audience of graphics cards.
Here we are, 7 days after 5870 launch and Egg has 5870s for ~375 to GTX 295s at 500. Yet again, ATI/AMD has made it a puzzling choice to buy any Nvida product more than 200 dollars.... for months at a time.
SiliconDoc - Friday, October 2, 2009 - link
What's puzzling is you are so out of touch, you don't realize the GTX295's were $406 before ati launched it's epic failure, then the gtx295 rose to $469 and the 5870 author edsxplained in text the pre launch price, and now you say the GTX295 is at $500.Clearly, the market has decided the 5870 is epic failure, and instead of bringing down the GTX295, it has increased it's value !
ROFLMAO
Awwww, the poor ati failure card drove up the price of the GTX295.
Awww, poor little red roosters, sorry I had to explain it to you, it's better if you tell yourself some imaginary delusion and spew it everywhere.
SiliconDoc - Wednesday, September 30, 2009 - link
Nice rebuttal to page 2: " Another kind of LAUNCH "--
write it down, nvidia launched today....(according to lunatic lying red roosters)
tamalero - Wednesday, September 30, 2009 - link
weird.. they still said its "coming soon", I dont see any GF300 firm chips.when ATI said "we present the 5870" they were already on newegg.com
Silicon, let's face it, you're the biggest pro-nvidia troll I've ever seen.
SiliconDoc - Wednesday, September 30, 2009 - link
You are also the person that went into a tirade about nvidia not replacing laptop gpu's with the faulty substrate and instead puttig on a heftier fan.You waxed on about how much you hate nvidia, and how they harmed the children (you claimed to be a teacher of some sort) then you screeched about nvidia reps, wished violence upon them, and claimed you'd love to show them how to do their jobs correctly.
---
That's YOU tamalero.
--
Now it's pretty amazing I tell the simple plain truth, you deny it a week late, lying for ati, have you public hate and rage on this board for nvidia, and yet claim it is I that is a fanboy.
--
One Q, has your raging hatred for nvidia receded, or does lying about the 5870 release give you a sense of vengeful pleasure ?
tamalero - Friday, October 2, 2009 - link
what truth?you're just inventing random crap your brain somehow imagines in illusions.
and what the hell are you talking about?
I never claimed to be a "teacher", wished violence? what the hell are you smoking?
harmed the children.. jesuchrist... are you on some sort of scientologist brainwashing group ?
SiliconDoc - Friday, October 2, 2009 - link
Since you have lied, I will get the link and your quotes.SiliconDoc - Wednesday, September 30, 2009 - link
no they wre not already on newegg - listed and greyed out- the first one available in a trickle -and only today have those listed appeared available, before that it was on for a few seconds, card gone - all GREYED OUT again.---
Sept. 23rd was launch, this is 7 days later.
They were a WEEK of paper. (no one can fairly count a sickly 1,2 or half dozen trickle)
tamalero - Friday, October 2, 2009 - link
they were grey, because they sold out, note..., there were on amazon and tigerdirect.com as well. I woudlnt be surprised if newwave and other sites had the 5870 as well.you're just a person with mental problems who cant really accept anything outside your tiny world.
SiliconDoc - Friday, October 2, 2009 - link
TigerDirect was pre-order, as well as Amazon was reserve - you just haven't got clue one.fikimiki - Friday, October 2, 2009 - link
In Poland, (it is Europe cause you don't know for sure)it is available in shops.
Also you can grab one from newegg.com
bobvodka - Thursday, October 1, 2009 - link
I woke up on HD5870 launch day.I logged onto a website in the UK.
I ordered an HD5870.
It shipped the same day.
I had it the next day and have been enjoying it ever since.
Looks like a non-paper launch to me.
SiliconDoc - Thursday, October 1, 2009 - link
Good for you, one of 7 billion, and then again one of perhaps 20, as reported for Europe.But, all you see is yourself, because you're just that selfish. And, you're a big enough liar, that you even posted your insane smart aleck stupidity, like a little brat.
That's what you're about. Case closed.
bobvodka - Thursday, October 1, 2009 - link
Ah, I see, you have no facts to refute me with thus you fall back to unfounded insults safe in the knowledge that you are nothing but a troll hiding behind a keyboard.Sorry I wasted my time with you, clearly you aren't able to deal with the world in logical terms.
rennya - Thursday, October 1, 2009 - link
Uhmm... maybe because it is common knowledge that ATI can actually get 5870 launched properly, with multiple manufacturers on board, and get the retail stores stocked up?20 for the whole Europe? What a joke. If I am a millionaire, I can get 20 of those 5870 GPU thing easily.
SiliconDoc - Thursday, October 1, 2009 - link
This is October 1st, not September 23rd, so for being a millionaire, you certainly are one ding dang dumb dumb.gx80050 - Friday, October 2, 2009 - link
Isn't the internet great. It allows shitheads like yourself to say shit that would, in real life
get your head cracked open.
Hopefully you'll suffer the same fate fucking cunt.
Please turn to the loaded gun in your drawer, put it in your mouth, and pull the trigger,
blowing your brains out. You'll be doing the whole world a favor. Shitbag.
rennya - Friday, October 2, 2009 - link
Hahahaha.... even that today is already 1 October, you are still claiming that 5870 GPU is paper launch, when it is definitely not.rennya - Thursday, October 1, 2009 - link
What paper launch? Is Newegg is the only place to get one? Here somewhere in SE Asia getting one of this 5870 GPU is as easy as going to a store, flash your wad of cash at the cashier and then returns home with a box with pre-rendered 3D objects/characters on it (and of course an ATI 5870 GPU in it). In fact, after a week from the release date, there is a glut of them here already, mainly from Powercolor and HIS.SiliconDoc - Thursday, October 1, 2009 - link
LOL - roflmao - So announce in the foreign tongue, and move to the next continent when ready, you dummy. They didn't do that. They LIED, again, and failed.A week late is better than several or a month or two for the 4870.
You can't buy quantity yet either, but for peons, who cares.
rennya - Thursday, October 1, 2009 - link
Uhmm... the second language in SE Asia is English. What, just because I can prove to you that 5870 launch is real, you started to deny it? Are you the typical American that thinks the rest of the world doesn't exists?SiliconDoc - Thursday, October 1, 2009 - link
Yuo can't prove anything to me, since you won't be proving the GT300 LAUNCHED like the author claimed.Instead, none of you quacking loons have anything but "foreign nation", no links and it's too late, and strangely none of you type in the Asain fashion.
LOL
So who the heck knows what you liars are doing anyway.
The paper standard was set by this site and it's authors, and the 4870 was paper, the 4770 was paper, and this 5870 was paper, PERIOD, and as of this morning the 5850 was also PAPER LAUNCHED.
What's funny is only you morons deny it.
All the other IT channels admit it.
--
Good for you red roosters here, you're the only ones correct in the world. ( no, you're not really, and I had to say that because you'll believe anything )
rennya - Friday, October 2, 2009 - link
Go ask the administrator to check my IP and they can verify that my IP comes from a SE Asia country. Are you accusing me of lying for claiming that I come from a nirvana where 5870 GPU is plentiful?Is that all you can do?
Fact - 5870 is not paper launch. You cannot even deny this.
Ah, BTW, English in SE Asia is the same as the ones used in America and Europe.
Totally - Friday, October 2, 2009 - link
Seriously, what are you on? It has to be some good stuff. I want some.I like how you go on and on spouting nonsense about how GT300 has 50% more theoretical bandwith, but without clock speeds there is no way to gauge how much of it will be saturated. In plain speak: Without hard numbers BANDWIDTH ALONE MEANS NOTHING. Sure nvidia has tons of road but we have no idea what they are going to drive on it.
About the 5870 being a paper launch, my best friend had his since the 30th. Day the 5850 launched, took a look over at newegg at 7 in the evening they where there available to order. And still you can order/go to the store and purchase either right now!!! That's not a paper launch. Last time I checked a paper launch is when a product goes live and it's unavailable for over a month.
lyeoh - Friday, October 2, 2009 - link
Doesn't look like good stuff to me. You'd probably get brain damage or worse.Should be banned in most countries.
SiliconDoc - Wednesday, September 30, 2009 - link
When anand posts the GD bit width and transistor count, and mem, then CLAIMS bandwith is NOT DOUBLE, it is CLEAR the very simple calculation you 3rd graders don't know is AVAILABLE.---
IT'S 240 GB !
4800x384/8 !
duhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhh
It's not FUD, it's just you people are so ignorant it's EASY to have the wool pulled over your eyes.
Lightnix - Wednesday, September 30, 2009 - link
4800mHz x 384 / 8 = 230400mB/s = 230.4GB/sOr 50% faster than 153GB/s - still a big gap but clearly not even nearly double.
It's not FUD, it's just you trolls are so bad at maths you can't even use a calculator to do basic arithmetic with it's EASY to have the wool pulled over your eyes.
SiliconDoc - Wednesday, September 30, 2009 - link
The author claimed not double the former GT200, sir.In the 5870 review just the other day, the 5870 had a disappointing 153+ bandwith, vs the 115 of the 4780 or 124 of the 4890.
--
So you can see with the 5870 it went up by not much.
--
In this review, the former GT200 referred to has a 112, 127, 141, or 159 bandwith, as compared to the MYSTERY # 240 for the GT300.
So the author claims in back reference to the ati card the nvidia card "also fails" to double it's predecesor.
--
I have a problem with that - since this new GT300 is gonig to be 240 bandwith, nearly 100 GB/sec more than the card the author holds up higher and gioves a massive break to, the one not being reviewed, the ati 5870.
--
It's bias, period. The author could fairly have mentioned how it will be far ahead of it's competition, and be much higher, as it's predecessor nvidia card was also much higher.
Instead, we get the cryptic BS that winds up praising ati instead of pointing out the massive LEAD this new GT300 will have in the bandiwth area.
I hope you can understand, but if you cannot, it's no wonder the author does such a thing, as it appears he can snowball plenty with it.
UNCjigga - Thursday, October 1, 2009 - link
STFU you stupid moron. There's no "bias". The 5870 has a full, in-depth, separate review with full benchmarks. The author didn't do direct comparisons because THERE IS NO CARD TO COMPARE IT WITH TODAY. FERMI ONLY EXISTS ON PAPER--the mere existence of engineering samples doesn't help this review. The author even indicated he wished he had more info to share but that's all Nvidia allowed. How about we wait until a GT300 ships before we start making final judgements, ok?SiliconDoc - Thursday, October 1, 2009 - link
Good job ignoramus.http://www.fudzilla.com/content/view/15762/1">http://www.fudzilla.com/content/view/15762/1
Oh, look at that, you're 100% INCORRECT.
Another loser idiot with insults and NOTHING ELSE but the sheepled parrot mind that was slammed into stupidity by the author of this piece.
Great job doofu.
ufon68 - Thursday, October 1, 2009 - link
Wow, you must be the biggest fanboy i've ever seen. It would be funny if it wasn't so sad you're vasting so much energy on such insignificant issue and everyone around here just thought to themselves..."what a total failure".But hey, on the bright-side, you made me jump off that fence and register, so you might not be as useless as you seem.
monomer - Thursday, October 1, 2009 - link
Wow, your proof that Fermi exists is a photo of Huang holding up a mock-up of what the new card is going to look like?If that was a real card, and engineering samples existed, why isn't it actually in a PCI-e slot running something? Why were no functioning Fermi cards actually shown at the conference? Why was the ray-tracing demo performed on a GT200?
Finally, why did Huang say that cards will be ready in "a few short months", if they are actually ready now?
You need to calm down a little. You also need to work on your reading skills and to stop making up controversies where none exist.
Yes, Anand pointed out that the memory bandwidth did not double, but in the very same sentence, he mentions that it did not double for the 5870 either.
Inkie - Saturday, October 3, 2009 - link
Not that I really want to support SD here, but there was working silicon there. It's kind of weird that many sites fail to mention this. Instead, they focus on the mockup.SiliconDoc - Thursday, October 1, 2009 - link
Go read a few articles on how a card is developed, and you'll have the timeline, you red rooster retard.I mean really, I'm talking to ignoramussed spitting cockled mooks.
Please, the articles are right here on your red fan website, so go have a read since it's so important to you how people act when your idiotic speculation is easily and absolutely 100% incorrect, and it's PROVEABLE, the facts are already IN.
gx80050 - Friday, October 2, 2009 - link
You're a fucking friendless loser who should have died on 9/11. Fucking cuntmonomer - Friday, October 2, 2009 - link
In reply to your original link, here's a retraction, of sorts:http://www.fudzilla.com/content/view/15798/1/">http://www.fudzilla.com/content/view/15798/1/
The card Nvidia showed everyone, and said was Fermi is in fact a mock-up. Oh well.
silverblue - Thursday, October 1, 2009 - link
What facts? What framerates can it manage in Crysis? What scores in 3DMark? How good it is at F@H?Link us, so we can all be shown the errors of our ways. It's obvious that GT300 has been benchmarked, or at least, it's only obvious to you simply because the rest of us are on a different planet.
You call people idiots, and then when they reply in a sensible manner, you conveniently forget all that and call them biased (along with multiple variations on the red rooster theme). You're like a scratched vinyl record and it's about time you got off this site if you hate its oh-so-anti-nVidia stance that doesn't actually exist except in your head.
Prove us wrong! Please! I want to see those GT300 benchmarks! Evidence that Anandtech are so far up AMD's rear end that nothing else is worth reporting on fairly!
Zool - Thursday, October 1, 2009 - link
GTX285 had 32 ROPs and 80 TMUs for aorund the same bandwith like 5870 with same 32 ROPs and 80 TMUs. Dont be stupid. GTX will surely need more ROPs and TMUs if they want to keep up with graphic even with the GPGPU bloat.Totally - Wednesday, September 30, 2009 - link
it's 225GB/s not 230.4/s230400/1024 = 225
I'm afraid your bad at math.
Lightnix - Thursday, October 1, 2009 - link
Nope, just really bad at remembering that those prefixes mean 1024 at like 1 in the morning.Lonyo - Wednesday, September 30, 2009 - link
You assume that they will use GDDR5 clocked at the same speed as ATI.They could use higher clocked GDDR5 (meaning even more bandwidth), or lower clocked GDDR5 (meaning less bandwidth).
There's no bandwidth comparison because 1) it's meaningless and 2) it's impossible to make an absolute comparison.
NV will have 50% more bandwidth if the speed of the RAM is the same, but it doesn't have to be the same, it could be higher, or lower, so you can't say what absolute numbers NV will have.
I could make a graph showing equal bandwidth between the two cards even though NV has a bigger bus, or I could make one showing NV having two times the bandwidth despite only a 50% bigger bus.
Both could be valid, but both would be speculative.
Calin - Thursday, October 1, 2009 - link
Also, there could be a chance that the Fermi chip doesn't need/use much more bandwidth than the GT200. Available bandwidth does not performance makeSiliconDoc - Friday, October 2, 2009 - link
Yeah, of course, 3 million T core, and it doesn't need much bandiwth, and it won't be used, perhaps, or probably, because the GPU designers haven't a clue, and of course, you do.---
Another amazing clown with a red nose. You people really should stop typing stupid stuff like that.
Calin - Sunday, October 4, 2009 - link
There was no performance improvement from increasing the bandwidth of the Athlon64 processors due to the move to DDR2 memory (a theoretical doubling of performance, with about one and a half more measured bandwidth in the first generation of the processor).I might not have a clue, but do you?
SiliconDoc - Wednesday, September 30, 2009 - link
Very interesting of course, and so with your theory, it could also be LOWER, with slower ddr5, but the fact REMAINS, 240 has ALREADY BEEN LEAKED.so we know what it is, and Anand KNOWS IT'S MORE, BUT NOT DOUBLE AND SAYS SO !
---
SO WHAT WE HAVE IS A CONVENIENT COVER UP. PERIOD!
--
JUST NOT AS STUPID AS YOU, THAT'S ALL.
dragunover - Thursday, October 1, 2009 - link
"SO WHAT WE HAVE IS A CONVENIENT COVER UP. PERIOD!--
JUST NOT AS STUPID AS YOU, THAT'S ALL. "
I disregarded anything you said before and after that.
SiliconDoc - Friday, October 2, 2009 - link
I'm certain you attempted it, as you no doubt love to wallow in blissful ignorance and denial just by mere unchangeable habit.But having read it, you have nowhere to go. Your mind is already irreparably altered. Congratulations.
jonGhast - Wednesday, September 30, 2009 - link
True, you seem to be a whole new level of stupid.mapesdhs - Wednesday, September 30, 2009 - link
I have a theory on that one: faulty keyboard. Every time he hits Shift
to get upper case, his keyboard is zapping out brain cells with EMF bursts. :D
Best to just ignore & not reply IMO.
Ian.
SiliconDoc - Thursday, October 1, 2009 - link
Well you can't ignore the discussion, and as far as that goes, you want everyone else to do your will, as you beg for it, with your empty advice, which is by the way, all you provided in the last thread.So you quack around telling others not to participate. That's your whole gig mary.
I find the stupidity level amazing, as most of you can only spew in and beg eachother not to comment, and by the quality of the comments that actually try a counter, I certainly cannot blame you, for begging others to surrender ahead of time.
You notice how many new names are here ? lol
It's clear who and what it is that responds, and what level of conduct they are all about.
Now let's see the Sybillic mapes sign off on his/her own glorious advice, which she/he failed to follow already.
Silverel - Thursday, October 1, 2009 - link
Hey buddy. Take a pill and relax.Your product isn't on the market yet. There's no recommendation to be heeded. It doesn't matter how fast it is, you can't have one. No one can. If someone wants to get their next-gen performance on now, ATI is what you'll buy.
It's okay though. Not everyone is going to run out and grab an ATI card. There's plenty of people that will wait for hell to freeze over for nVidia to release a new card. Personally, I'll take the Q1 2010 release date as semi-fact, but hell freezing over is my fallback date.
So feel free to continue ranting like you've run out of meds. Insult anyone who has made a comment contrary to your own. It's not really doing you any favors, but that's okay. I'm sure it's having an effect on the opinions of people sitting on the fence. Ah, and just to dig the finger in the wound, you can consider me an ATI shill, just for good measure.
SiliconDoc - Thursday, October 1, 2009 - link
Gosh you';re a sweetheart, too, and wrong ! WRONG ! WRONG !the red bloviator> " . If someone wants to get their next-gen performance on now, ATI is what you'll buy. "
Gee, did you lose it that easily, was your heart beating so hard, were you sweaty, and upset, and out of control, and decided it was great to tell me to settle down, when your brain was on FART ?
Please see the GTX295 that BEATS the 5870. The 5870 is PRIOR GEN performance - as ati's own 4870x2/or CF is equal.
Golly, what a deal, another red rooster massive ruse.
PorscheRacer - Thursday, October 1, 2009 - link
I have no clue what the red rooster thing implies, and I never understood why people called nVIDIA the green goblin. Until now. You, sir, have made it clear to me. They are called the green goblin, because that's where the trolls come from. Like wow. Your partisan and righteous thinking has no merit, no basis except conjecture and criticism. Save a keyboard, chill out and let's see if you can post anything in here without using the words, nVIDIA, ATI, red rooster, green goblin, and anything with ALL CAPS.It's fine to be passionate about something. But to exessive extents that push everyone else away and leave people ashamed, discouraged and embarrased; that's not how to win hearts and minds. I can already see you getting riled up over this post telling you to chill out....
SiliconDoc - Friday, October 2, 2009 - link
Hmmmm, that's very interesting. First you go into a pretend place where you assume green goblin is something "they call" nVIDIA, but just earlier, you'd never seen it in print before in your life.Along with that little fib problem, you make the rest of the paragraph a whining attack. One might think you need to settle down and take your own medicine.
And speaking of advice, your next paragraph talks about what you did in your first that you claim noone should, so I guess you're exempt in your own mind.
kirillian - Thursday, October 1, 2009 - link
Yall...seriously...leave the poor NVidia Fanboy alone. His head is probably throbbing with the fact that he found his first website (other than HardOCP) that isn't extremely NVidia biased.SiliconDoc - Friday, October 2, 2009 - link
Gee, I find that interesting that you know all about bias at other websites...So that says what again about here ?
silverblue - Thursday, October 1, 2009 - link
The 5870 is but one single GPU. The 295 is two and costs more. The 4870X2/CF is also a case of two GPUs. A 5870X2 would annihilate everything out there right now, and guess what? 5870 CF does just that. If money is no object, that would be the current option, or 5850s in CF to cut down on power usage and a fair amount of the cost without substantially decreasing performance.By stating "if someone wants to get their next-gen performance now", of course he's going to point in the direction of ATI as they are the only people with a DX11 part, and they currently hold the single GPU speed crown. This will not be the case in a few months, but for now, they do.
SiliconDoc - Friday, October 2, 2009 - link
I kinda doubt the 5870x2 blows away GTX295 quad, don't you ?--
Now you want to whine cost, too, but then excuse it for the 5870CF. LOL.
Another big fat riotous red rooster.
Really, you people love lies, and what's bad when it's nvidia, is good when it's ati, you just exactly said it !
ROFLMAO
--
Should I go get a 295 quad setup review and show you ?
--
How come you were wrong, whined I should settle down, then came back blowing lies again ?
There's no DX11 ready to speak of, so that's another pale and feckless attempt at the face save, after your excited, out of control, whipped up incorrect initial post, and this follow up fibber.
You need to settle down. "I want you banned"
Finally, you try to pretend you're not full of it, with your spewing caveat of prediction, "this will not be the case in a few months" - LOL
It's NOT the case NOW, but in a few months, it sure looks like it might BE THE CASE NO MATTER WHAT, unless of course ati launches the 5870x2 along with nvidia's SC GT300, which for all I know could happen.
So, even in that, you are NOT correct to any certainty, are you...
LOL
Calm down, and think FIRST, then start on your rampage without lying.
silverblue - Friday, October 2, 2009 - link
My GOD... you're a retard of the highest order.Why would I want to compare a dual GPU setup with an 8 GPU setup? What numpty would do that when it would logically be far faster? Even a quad 5870 setup wouldn't beat a quad 295 setup, and you know what? WE KNOW! 8 cores versus 4 is no contest. Core for core, RV870 is noticeably faster than the GT200 series, but you're the only person attempting to compare a single GPU card to a dual GPU card and saying the single GPU card sucks because it doesn't win.
And where did I say "I want you banned"? As someone once said, "lay off the crack".
SiliconDoc - Friday, October 2, 2009 - link
Aren't you the one who claimed only ati for the next gen performance ?Well, you really blew it, and no face save is possible. A single NVIDIA card beats the best single ati card. PERIOD.
It's true right now, and may or may not change within two months.
PERIOD.
silverblue - Friday, October 2, 2009 - link
No, I said that ATI currently has the single GPU crown. Not card - GPU. In a couple of months, ATI may have the 5870X2 out, and that WILL send the 295 the way of the dodo if it's priced correctly.No face saving necessary on my part.
Zaitsev - Wednesday, September 30, 2009 - link
^^LOL. I don't see what all the bickering is about. If you're willing to wait a few more months, then you can buy a faster card. If you want to buy now, there are also some nice options available. Currently there are 5 brands of 5870's and 1 5850 at the egg.AlexWade - Wednesday, September 30, 2009 - link
How long have you been working for NVidia?taltamir - Thursday, October 1, 2009 - link
don't insult nvidia by insinuating that this zealot is their employeedzoni2k2 - Wednesday, September 30, 2009 - link
What the heck is wrong with you SiliconDoc?Since when is memory bandwidth main indicator of performance?!
For all I care Fermis memory bandwidth can be 999GB/s but what good is that if it's not used?
SiliconDoc - Friday, October 2, 2009 - link
I'm sure "it won't be used" because for the very first time "nvidia will make sure it "won't be used" becuase "they designed it that way ! " LOL--
You people are absolutely PATHETIC.
Now the greater Nvidia bandwith doesn't matter, because you don't care if it's 999, because... nvidia failed on design, and "it won't be used!"
ROFLMAO
Honestly, if you people heard yourselves...
I am really disappointed that the bias here is so much worse than even I had known, not to mention the utter lack of intellect so often displayed.
What a shame.
PorscheRacer - Wednesday, September 30, 2009 - link
Exactly! R600 had huge bandwidth but couldn't effectively use it; for the msot part. Is this huge bandwdth the GF300 has only able to be used in cGPU, or is it able to be used in games, too? We won't know till the card is actually reviewed a long while from now.SiliconDoc - Wednesday, September 30, 2009 - link
What a joke. The current GT200 responds in all flavors quite well to memory clock / hence bandwith increases.You know that, you have been around long enough.
It's great seeing the reds scream it doesn't matter when ati loses a category. (no actually it isn't great, it's quite sickening)
SiliconDoc - Wednesday, September 30, 2009 - link
Yes of course bandwith does not really matter when ati loses, got it red rooster. When nvidia is SO FAR AHEAD in it, it's better to say "it's not double"...LOL---
WHAT IS WRONG WITH YOU PEOPLE AND THE AUTHOR IS THE REAL QUESTION!
--
What is wrong with you ? Why don't you want to know when it's nvidia, when it's nvidia a direct comparison to ati's card is FORBIDDEN !
That's what the author did !
It was " a very adept DECEPTION" !
---
Just pointing out how you get snowballed and haven't a clue.
Rumors also speculated 4,000 data rate ddr5
4000x384/8 - 192 bandwith, still planty more than 153 ati.
CLEARLY though "not double 141" (nvidia's former number also conveniently NOT MEWNTIONED being so close to 153/5870 is EMBARRASSING) - is 282...
--
So anand knows it's 240, not quite double 141, short of 282.
DigitalFreak - Wednesday, September 30, 2009 - link
Looks like SnakeOil has another alias!therealnickdanger - Wednesday, September 30, 2009 - link
Agreed. That was refreshing!mapesdhs - Wednesday, September 30, 2009 - link
Blimey, I didn't know Ujesh could utter such things. :D When I knew
him in 1998 he was much more offical/polite-sounding (he was Product
Manager for the O2 workstation at SGI; I was using a loaner O2 from
SGI to hunt for OS/app bugs - Ujesh was my main contact for feedback).
The poster who talked about availability has a strong point. My brother
has asked me to build him a new system next week. Looks like it'll be
an Athlon II X4 620, 4GB RAM, 5850, better CPU cooler, with either an
AM3 mbd and DDR3 RAM or AM2+ mbd and DDR2 RAM (not sure yet). By heck
he's going to see one hell of a speed boost; his current system is a
single-core Athlon64 2.64GHz, 2GB DDR400, X1950Pro AGP 8X. :D My own
6000+ 8800GT will seem slow by comparison... :|
Ian.
samspqr - Thursday, October 1, 2009 - link
ATI's availability will be sorted out soon, NVIDIA's weird design choices that are targeted at anything but graphics won'tin fact, I have just realized: NVIDIA IS DOING A MATROX!
(forget about graphics, concentrate in a proffessional niche, subsequently get run over by competitors in its former main market... eventually dissappear from the graphics market or become irrelevant? with some luck, RayTracing will be here sooner rather than later, ATI will switch to GPUcomputing at the right time -as opposed to very much too soon-, and we will have a 3 players market; until then, ATI domination all over)
andrihb - Thursday, October 1, 2009 - link
What a huge leap of the imagination :Psamspqr - Friday, October 2, 2009 - link
sorry, I was just trying to imagine how many weird things would have to happen so that we don't have a single GPU maker in the marketin any case, if you want some imaginative thinking, try here:
http://www.semiaccurate.com/2009/10/01/nvidia-fake...">http://www.semiaccurate.com/2009/10/01/nvidia-fake...
(I'm not sure yet who is the one making stuff up -charlie or nvidia-, but so far my bet would be on nvidia)
mindless1 - Saturday, October 3, 2009 - link
What they may have done is take an existing PCB design for something else, and tacked down the parts and air-wired them. It is a faster way to debug a prototype, as well as just drilling a few holes and putting makeshift screws in to test a cooling design before going to the effort of the rest of the support parts before you know if the cooling subsystem is adequate.IF that is the situation, I feel nVidia should have held off until they were further along with the prototypes, but when all is said and done if they can produce performance in line with the expectations, that would prove they had a working card.
IGoodwin - Friday, October 2, 2009 - link
First off, I don't know the truth about a fake or real Tesla being in existence; however, when an article shows a strong emotional bias, I do find it hard to accept the conclusions.Here is a link to the current Tesla product for sale online:
http://www.tigerdirect.com/applications/SearchTool...">http://www.tigerdirect.com/applications...tails.as...
This clearly shows the existing Tesla card with screws on the end plate. Also, if memory serves, having partial venting on a single slot for the new Tesla card would equal the cooling available on the ATI card. Also, six-pin connector is in roughly the same place.
As for the PCB, it is hidden on the older Tesla screen shots, so nothing can be derived.
The card may be fake, or not, but Charlie is not exactly unbiased either.
jonGhast - Saturday, October 3, 2009 - link
"but Charlie is not exactly unbiased either."What's the deal with that, I keep trying to read Semi's articles, though his 'tude towards MS and Intel is pretty juvenile, but I've got to ask; did somebody at Nvidia gang rape his mom?
mindless1 - Saturday, October 3, 2009 - link
I simply assume he is either directly or indirectly on ATI's payroll.Fudzilla wrote "The real engineering sample card is full of dangling wires." To display such a card to others they could simply epoxy down some connectors and solder the wires to them.
monomer - Friday, October 2, 2009 - link
Here's an article from Fudo saying that the card was a mock-up. Nvidia claimed it was real at the conference, and are now saying its a fake, but that they really, truly, had a real one running the demos. Really! I completely believe them.http://www.fudzilla.com/content/view/15798/1/">http://www.fudzilla.com/content/view/15798/1/
Yojimbo - Thursday, October 1, 2009 - link
What makes you think it isn't the right time? You can only really tell in hindsight, but you give in your post any reason that you think now is not the right time and later, when amd is gonna do it, is the right time. I think the right time is whenever the architecture is available and the interest is there. Nvidia has, over the past 5 years, been steadily building the architecture for it. Whether the tools are all in place yet and whether the interest is really there remains to be seen.It has nothing to do with matrox or any shift to a "professional niche." Nvidia believes that it has the ability to evolve and leverage its products from the niche sector of 3d graphics into a broader and more ubiquitous computing engine.
wumpus - Thursday, October 1, 2009 - link
Do you see any sign of commercial software support? Anybody Nvidia can point to and say "they are porting $important_app to openCL"? I haven't heard a mention. That pretty much puts Nvidia's GPU computing schemes solely in the realm of academia (where you can use grad students a cheap highly-skilled labor). If they could sell something like a FEA package for pro-engineer or solidworks, the things would fly off the shelves (at least I know companies who would buy them, but it might be more a location bias). If you have to code it yourself, that leaves either academia (which mostly just needs to look at hardware costs) and existing supercomputer users. The existing commercial users have both hardware and software (otherwise they would be "potential users"), and are unlikely to want to rewrite the software unless it is really, really, cheaper. Try to imagine all the salaries involved in running the big, big, jobs Nvidia is going after and tell me that the hardware is a good place to save money (at the cost of changing *everything*).I'd say Nvidia is not only killing the graphics (with all sorts of extra transistors that are in the way and are only for double point), but they aren't giving anyone (outside academia) any reason to use openCL. Maybe they have enough customers who want systems much bigger than $400k, but they will need enough of them to justify designing a >400mm chip (plus the academics, who are buying these because they don't have a lot of money).
hazarama - Saturday, October 3, 2009 - link
"Do you see any sign of commercial software support? Anybody Nvidia can point to and say "they are porting $important_app to openCL"? I haven't heard a mention. That pretty much puts Nvidia's GPU computing schemes solely in the realm of academia"Maybe you should check out Snow Leopard ..
samspqr - Friday, October 2, 2009 - link
Well, I do HPC for a living, and I think it's too early to push GPU computing so hard because I've tried to use it, and gave up because it required too much effort (and I didn't know exactly how much I would gain in my particular applications).I've also tried to promote GPU computing among some peers who are even more hardcore HPC users, and they didn't pick it up either.
If even your typical physicist is scared by the complexity of the tool, it's too early.
(as I'm told, there was a time when similar efforts were needed in order to use the mathematical coprocessor...)
Yojimbo - Sunday, October 4, 2009 - link
>>If even your typical physicist is scared by the complexity of the >>tool, it's too early.This sounds good but it's not accurate. Physicists are interested in physics and most are not too keen on learning some new programing technique unless it is obvious that it will make a big difference for them. Even then, adoption is likely to be slow due to inertia. Nvidia is trying to break that inertia by pushing gpu computing. First they need to put the hardware in place and then they need to convince people to use it and put the software in place. They don't expect it to work like a switch. If they think the tools are in place to make it viable, then how is the time to push, because it will ALWAYS require a lot of effort when making the switch.
jessicafae - Saturday, October 3, 2009 - link
Fantastic article.I do bioinformatics / HPC and in our field too we have had several good GPU ports for a handful for algorithms, but nothing so great to drive us to add massive amounts of GPU racks to our clusters. With OpenCL coming available this year, the programming model is dramatically improved and we will see a lot more research and prototypes of code being ported to OpenCL.
I feel we are still in the research phase of GPU computing for HPC (workstations, a few GPU racks, lots of software development work). I am guessing it will be 2+ years till GPU/stream/OpenCL algorithms warrant wide-spread adoption of GPUs in clusters. I think a telling example is the RIKEN 12petaflop supercomputer which is switching to a complete scalar processor approach (100,000 Sparc64 VIIIfx chips with 800,000 cores)
http://www.fujitsu.com/global/news/pr/archives/mon...">http://www.fujitsu.com/global/news/pr/archives/mon...
Thatguy97 - Thursday, May 28, 2015 - link
oh fermi how i miss ya hot underperforming ass