Never thought the 8800 GT was this good! I found some really low prices at this website (http://www.videocardemporium.com), but I'm still unsure if I want to try to get two of these in SLI or go for the 9600 GT. Or maybe just splurge for the GTX 280 >.<
Just ordered mine Friday, should be here Wednesday.
I was lucky, ordered it from Newegg at $239.99 and it seems totally worth it after reading these comments.
Oh and I checked again today and it seems Newegg is all sold out =]]
It'd be awfully nice to have the axes of the graphs labeled. For the first set, I can guess that they are screen resolution on the horizontal axis and frames per second on the vertical, but I could be wrong, since there are no labels.
I also couldn't follow the page on comparing the 8800 GT to the 8800 GTX. Your conclusion seems to state that, "the 8800 GT doesn't touch the GTX." However, I can't come up with that conclusion from the graphs. They look roughly comparable in most of the tests that you've shown, with only a slight advantage to the GTX at very high resolutions.
On the "Out with the old..." page, there is a typo in the second paragraph. In the last sentence, "fasted" should be "fastest".
From what I have read here and elsewhere, this seems to be THE card to get. Before I make a purchase, though, I would very much like to see more data comparing the cards offered by the different card manufacturers.
Is Anandtech going to be doing an 8800 GT roundup any time soon? Do I have to beg?
How much of an improvement would you get if you SLI'd an 8800GT with a GTX? I know mixing and matching is not optimal, but the price difference makes me wonder. Would it fall between two GTs and two GTXs? I don't have any experience with SLI. I've avoided it because it's never been a decent upgrade path.
Ordered mine from Scan.co.uk - Gainward Bliss - £170 GBP
O.M.G - it really is amazing.
I bought 2x 7800GTX's (just before the 7900GTX's came out) at £660 for the pair and this card just blows them away.
On my 24" Dell monitor at 1920x1200 with 4xAA, on an Opteron 175 at stock 2.2GHz, I get an average of 125fps in Team Fortress 2. Everything else I've tried has been very smooth - World in Conflict, BF2142. A *very* noticeable performance increase at a brilliant price!
If you're considering an upgrade, buy one of these NOW and play today's games at awesome speeds. Then get a nice new Intel 45nm quad-core + X38 + DDR3 in January, when the products are released and prices will be lower. Then if needed, sell the graphics card and buy whatever NVIDIA is offering in January - if you really need to (which I doubt you will).
Highly Recommended.
quote: First, our understanding is that the RV670 based AMD part will not be any faster than the 2900 XT (and will likely be at least a little bit slower).
Most online retailers have pulled these items off their websites entirely; I'm sure these cards have been picked up ravenously by gamers wanting the Holy Grail of video cards, which this seems to be.
My question is: why so cheap, and why now? NVIDIA could have raised the price at least $50 (which most retail shops have already done to capitalize on its popularity) and still have a product that sells like crazy. This makes me wonder what is next, and if a better product is in the works that makes them want to get rid of this inventory as quickly as possible before the next big thing comes out. It may be (and yes, I'm reaching) that this card is on the low side in NVIDIA's new product line, and they can clear inventory at a price premium now as opposed to when the full line is released. They have little reason to throw out their best until AMD has shown their hand, and they are playing the same game that Intel is with their 3.0GHz processor that can easily be clocked higher.
With this in mind, I plan on holding on to my money for now, partially because I can't even find one in stock yet, and partially because having this card at this price point doesn't seem to make much sense unless a full line refresh is coming and this card is the weakest link - which is an incredible thing to think about, considering how good this card appears to be.
Awesome, been waiting for something like this to come around. Right now at most places the cheapest I've found is $260 with $6 shipping. I'll wait for it to drop down to around the $199 mark & I'll be all over it.
How long before we start seeing something like this in a laptop? I think there was a brief mention that it might be possible to make one with passive cooling, so that makes me hopeful. The 8600 series in laptops doesn't really impress me.
This page has me very confused: http://www.anandtech.com/video/showdoc.aspx?i=3140... The text of the article goes on as if the GT doesn't really compare to the GTX, except on price/performance:
quote: We would be out of our minds to expect the 8800 GT to even remotely compete with the GTX, but the real question is - how much more performance do you get from the extra money you spent on the GTX over the GT?
quote: But back to the real story, in spite of the fact that the 8800 GT doesn't touch the GTX, two of them will certainly beat it for either equal or less money.
Yet all the graphs show the GT performing pretty much on par with the GTX, with at most a 5-10fps difference at the highest resolution.
I didn't understand that last sentence I quoted above at all.
This is obviously an amazing card and I hope it sets a new trend for getting good gaming performance in the latest titles for around $200 like it used to be, unlike the recent trend of having to spend $350+ for high end (not even ultra high end). However, I don't get why a GT part is higher performing than a GTS, isn't that going against their normal naming scheme a bit? I thought it was typically: Ultra -> GTX -> GTS -> GT -> GS, or something like that.
I've been hearing rumors about an NVIDIA 9800 card being released in the coming months... is that the same card with an outdated/incorrect naming convention, or a new architecture beyond G92?
I guess if NVIDIA had a next-gen architecture coming, it would explain why they don't mind wiping some of their old products off the board with the 8800 GT, which seems as though it will be a dominant part for the remaining lifetime of this generation of parts.
After lurking on Anandtech for two layout/design revisions, I have finally decided to post a comment. :D
First of all hi all!
Second of all, is it okay that nVidia decided not to introduce a proper next-gen part in favour of this mid-range offering? Okay, so it's good and whatnot, but what I'm wondering (something the article does not talk about) is what the future value of this card is. Can I expect this to play some upcoming games (Alan Wake?) at 1600 x 1200? I know it's hard to predict, but industry analysts like you guys should have some idea. Also, how long can I expect this card to continue playing games at acceptable framerates? Any idea, anyone?
Thanks.
That's a tough call... but really, it's up to the developers. UT3 looks great in DX9, and Bioshock looks great in DX10. Crysis looks amazing, but it's a demo, not final code, and it does run very slowly.
The bottom line is that developers need to balance the amazing effects they show off with playability -- it's up to them. They know what hardware you've got, and they choose whether to push the envelope or not.
I know that's not an answer, sorry :-( ... it is just nearly impossible to say what will happen.
How much RAM was on the 8800 GT used in testing? Was it 256 or 512?
From context, I'm thinking 512. Since 512MB cards are the only ones available in the channel, and Derek was hypothesizing about the pricing of a 256MB version, I think you can be confident this was a 512MB test card.
Correct. 256MB cards do not exist outside NVIDIA at this point.
I was just wondering about that too. I thought I missed it in the article, but I didn't see it on another run through. I see I'm not the only one who was curious.
Just activated the step-up on my current 8800GTS 320MB -- after shipping costs and discounting the MIR from back then, I actually get the 8800GT 512MB for -$12 :)
Lucky bastard! :)
Hehe... great timing too. I only had 5 days remaining before the 90-day limit on the step-up program expired :)
Generally, Anandtech does an excellent job with its reviews and uses robust benchmarking methodology. Any ideas why the Tech Report's results are so different? http://www.techreport.com/articles.x/13479
Simply put? TechReport is doing some funny stuff (like HardOCP often does) with their benchmarking on this one. I have a great idea: let's find the WORST CASE SCENARIO for the 8800 GT vs. the 8800 GTS 640 and then ONLY show those resolutions! 2560x1600 4xAA/16xAF? Even ignoring the fact that 16xAF isn't noticeably different from 8xAF, and that 4xAA is hardly necessary at 2560x1600, there are just too many questions left by the TR review. They generally come to the same conclusion that this is a great card, but it's almost like they're struggling to find ANY situation where the 8800 GT might not be as good as the 8800 GTS 640.
For a different, more comprehensive look at the 8800 GT, why not try the FiringSquad review (http://www.firingsquad.com/hardware/nvidia_geforce...)? They test at a variety of resolutions with a decent selection of GPUs and games. Out of all of their results, the only situation where the 8800 GTS 640 comes out ahead of the 8800 GT is in Crysis at 2xAA/8xAF at 1920x1200. Granted, they don't have 2560x1600 results, but how many midrange people use 30" LCDs? For that matter, how many high-end gamers use 30" LCDs? I'm sure they're nice, but for $1300+ I have a lot of other stuff I'd be interested in purchasing!
There are a lot of things that we don't know about testing methodology with all of the reviews. What exact detail settings are used, for example, and more importantly how realistic are those settings? Remember Doom 3's High Quality and Ultra Quality? Running everything with uncompressed textures to artificially help 512MB cards appear better than 256MB cards is stupid. Side by side screenshots showed virtually no difference. I don't know what the texture settings are in the Crysis demo, but I wouldn't be surprised if a bunch of people are maxing everything out and then crying about performance. Being a next gen title, I bet Crysis has the ability to stress the 1GB cards - whether or not it really results in an improved visual experience.
Maybe we can get some image quality comparisons when the game actually launches, though - because admittedly I could be totally wrong and the Crysis settings might be reasonable.
I've been following Anandtech's test results very carefully since the UT3 demo was released. Comparing these results to the others in UT3 just doesn't make any sense:
3rd: Looking at the new test again, 8800GT vs 8800GTS (http://www.anandtech.com/video/showdoc.aspx?i=3140...) shows the 8800GT beating the 8800GTS at 1280x1024: close to 120fps vs 105fps. The GTS is still over 100, when it was below 100 in the previous test.
But the huge difference is at 1600x1200: the 8800GT is right above 100fps, while the GTS is around 90? In the previous test the GTS showed results as low as 77fps. C'mon, something smells weird.
Seems like all the 8800GT, GTX, and Ultra cards just got a whole freaking lot better, making the 2900XT look worse. WHICH I FIND DOUBTFUL. Someone bring the facts to the table.
Don't tell me 2 extra GB of RAM made the NVIDIA cards play a lot better and the ATI card a lot worse!
Well, clearly this must be a graphics driver issue. I read that NVIDIA's 169.xx drivers were made to optimize performance at the cost of lowering graphics quality.
This was shown when the water in Crysis looked worse with 169.04 and 169.01 than with their previous 163.xx drivers.
It's hard to tell what you are getting when you compare the results from one article to those of another article. Ideally, you would like to be able to assume that the testing was done in an identical manner, but this isn't typically the case. As was already pointed out, look at the drivers being used. The earlier tests used nvidia's 163.75 drivers while the tests in this article used nvidia's 169.10 drivers.
Also, not enough was said about how Unreal 3 was being tested to know, but I wonder if they benchmarked the game in different manners for the different articles. For example, were they using the same map "demo"? Were they using the game's built-in fly-bys, or were they using FRAPS? These kinds of differences could make direct comparisons between articles difficult.
To blacken. I am a big AMD fan, but right now it's almost laughable how they're getting stepped and kicked on by the competition.
AMD's ideas are great for the long run, and their 65nm process was just a mistake since 45nm is right around the corner. They simply do not know how to compete when the heat is on. AMD is still traveling in 1st gear.
"NVIDIA Demolishes... NVIDIA? 8800 GT vs. 8600 GTS"
Well the 8600GTS was a mistake that never should have seen the light of day: over-priced, under-featured from the start. The 8800 GT is the card we were expecting back in the Spring when NVidia launched that 8600 GTS turd instead.
I may be a bit misinformed on this, but I'm getting the impression that Crysis represents the first game that makes major use of DX10 features, and as a consequence, it takes a major bite out of the performance that existing PC hardware can provide. When the 8800GT is used in a heavy DX10 game context does the performance that results fall into a hardware class that we typically would expect from a $200 part? In other words, making use of the Ti-4200 comparison, is the playable performance only acceptable at moderate resolutions and medium settings?
We've seen something like this before, when DX8 hardware was available and people were still playing DX7 games with this new hardware, the performance was very good. Once games started to show up that were true DX8 games, hardware (like the Ti-4200) that first supported DX8 features struggled to actually run these DX8 features.
Basically, I'm wondering whether Crysis (and other DX10 games that presumably will follow) places the 8800GT's $200 price point into a larger context that makes sense.
I ran Vista for about a month before switching back to XP due to Quake Wars crashing a lot (no more crashes under XP). I ran a bunch of demos during that month, including Crysis and Bioshock, and I swear I didn't see a lot of visual difference between DX10 on Vista and DX9 on XP. Same for TimeShift (does it use DX10?). And all games run faster on XP. I really see no compelling reason to go back to Vista just because of DX10.
I'm not sure why the first post lost my text unless it was the bracket I used around the H - but HardOCP is reporting that nVidia is changing the 8800GTS 640 MB to have 112 stream processors.
Great article Derek - I think you can tell you're mildly excited about this product :)
Is there a reason that you didn't do any tests with anti-aliasing? I would assume that this would show more deviation between the 8800GTX and the 8800GT?
Just wondering, though, if you were able to test the cards at the same clock speeds. The GT by default has a ~100MHz advantage on the core over the GTS, which is a common reason the GTS falls so far behind in head-to-head testing. I expect the GT to have more overclocking headroom than the GTS anyway, but it would be nice to see an apples-to-apples comparison to reveal the impact of some of the architecture changes from G80 to G92. Of note, the GT has fewer ROPs and a smaller memory bus but gains 1:1 address/filter units and 16 more stream processors.
Also, I saw an early review that showed massive performance gains when the shader clock was overclocked on the GT; much bigger gains than from significant increases to the core/memory clocks. Similar testing with the GTS/GTX doesn't yield anywhere near that much performance gain when the shader clock is bumped up.
Lastly, any idea when the G92 8800GTS refresh is going to be released? With a 640MB GTS this seems more of a lateral move to an 8800GT, although a refreshed GTS with 128SP and all the other enhancements of the G92 should undoubtedly be faster than the GTX...and maybe even the Ultra once overclocked.
Thanks for the reply.
This card looks to be pretty cool-running, and when not running 3D-intensive apps I'm sure power consumption and noise are really low.
So it might be nice to be able to play a little on a 52" LCD!
also, if you go with a less powerful card for HD HTPC you'll want at minimum the 8600 GTS -- which is not a good card. The 8800 GT does offer a lot more bang for the buck, and Sparkle is offering a silent version.
Nothing like cherry picking the games... I don't understand why games like Stalker and Prey weren't tested as the 2900XT has superior performance on those titles, as well as other titles. Seems like a biased test.
We tested Quake Wars, which is effectively an updated Prey (id's engine).
And Stalker runs better on NVIDIA hardware -- when tested properly (many people use demo flybys that point up at the sky way too much, rather than FRAPS run-throughs).
quote: The G92 is fabbed on a 65nm process, and even though it has fewer SPs, less texturing power
G92 has the same number of SPs and MORE texturing power (twice as many addressing units) than G80. However, the 8800GT card has some SPs and texture units disabled.
well, first, if G92 has those units disabled, then it can't claim them.
Second, NVIDIA would not confirm that the G92 as incarnated in the 8800 GT has units disabled, but it is fair to speculate that this configuration was chosen to improve yields on their first 65nm part.
Based on benchmarks and price this card is finally in the sweet spot for me which means I can finally ditch my ATI X300! I only have one question remaining and that concerns the noise level. How does it compare to the 8800GTS? Why was this omitted from your review?
Tom's Hardware did a noise comparison and found that the 8800GT was as quiet or quieter than any of the other 8800 series cards, the 8600 series, and the 2900XT.
This game seems really demanding. If it is getting 37 fps at 1280x1024, imagine what the frame rate will be with 4X FSAA enabled combined with 8X anisotropic filtering. I think I will wait till NVIDIA releases their 9800/9600 GT/GTS and combine that with Intel's 45nm Penryn CPU. I want to play this beautiful game in all its glory! :)
but there were issues ... not with the game, we just shot ourselves in the foot on this one and weren't able to do as much as we wanted. We had to retest a bunch of stuff, and we didn't get to crysis.
Yes, I am glad instead of purchasing a video card, I instead changed motherboard/CPU for Intel vs AMD. I still like my AM2 Opteron system a lot, but performance numbers, and the effortless 1Ghz OC on the ABIT IP35-E/(at $90usd !) was just too much to overlook.
I can definitely understand your 'praise', as it were, now that nVidia is lowering their prices, but this is where these prices should have always been. nVidia and ATI/AMD have been ripping us consumers off for the last 1.5 years or so, so you will excuse me if I do not show too much enthusiasm when they finally lower their prices to where they should be. I do not consider this much different than the memory industry overcharging and the consumer getting the shaft (as per your article).
mgambrell - Monday, November 12, 2007 - link
GeForce 8800 GT: the only card that crashes my PC in nv4_disp over and over and over, no matter which motherboard it is in, which OS is installed, or which other expansion cards are installed.
No matter whose 8800 I use.
NVIDIA, you are now on my shitlist.
rap - Friday, November 2, 2007 - link
Info on the RV670:
http://www.penstarsys.com/editor/so3d/q4_2007/so3d...
ksherman - Wednesday, October 31, 2007 - link
A great review, and a great part! Finally a video card that excites me. Too bad I ditched the desktop in favor of a laptop.
bob4432 - Wednesday, October 31, 2007 - link
I have been waiting for this card :) My old X1800XT will soon be retired once these guys get to ~$180 AR!!!! :) :)
R3MF - Tuesday, October 30, 2007 - link
I am deeply impressed with the card, but I have a severe aversion to cut-down products. A 128 SPU version clocked at 640MHz with 2000MHz GDDR memory would go down a treat.
How about it?
mpc7488 - Thursday, November 1, 2007 - link
About one month. http://www.dailytech.com/article.aspx?newsid=9474
"A G92-derivative will appear later this year with even more shader units. According to company guidance, the new G92 will launch in early December and feature 128 shader units as opposed to the 112 featured on GeForce 8800 GT. ... In addition to the extra shaders, the new G92 will also feature higher core frequencies and support for up to 1GB GDDR3."
varia - Monday, October 29, 2007 - link
RE: Wow by EODetroit on Oct 29, 2007 3:06 PM:
Now.
http://www.newegg.com/Product/ProductLi...18+10696...
When I was checking out around 1pm today at Newegg, they had 4 different cards, all $249-269.
Now they list 2, all on back order, priced $289-299.
Pffff, not gonna buy from them, for sure.
varia - Monday, October 29, 2007 - link
Forget Newegg, Fry's will have it this Friday: http://shop4.outpost.com/product/5434329?site=sr:S...
EVGA GeForce 8800GT Video Card (512MB DDR3, PCI-E 2.0, DX10, OpenGL 2.0)
EVGA:
FRYS.com #: 5434329
Price: $229.99
gplracer - Monday, October 29, 2007 - link
Are these results running the 8800GT as a single card or in SLI?
gplracer - Monday, October 29, 2007 - link
Never mind, it is single; I misread it.
AggressorPrime - Monday, October 29, 2007 - link
Page 3: "We aren't including any new tests here, as we can expect performance on the same level as the 8600 GTS."
Let us hope the GeForce 8800 GT is on the same level as the GeForce 8600 GTS.
AggressorPrime - Monday, October 29, 2007 - link
I made a typo. Let us hope they are not on the same level.
Frumious1 - Monday, October 29, 2007 - link
Simply put? TechReport is doing some funny stuff (like HardOCP often does) with their benchmarking on this one. I have a great idea: let's find the WORST CASE SCENARIO for the 8800 GT vs. the 8800 GTS 640 and then ONLY show those resolutions! 2560x1600 4xAA/16xAF? Ignoring the fact that 16xAF isn't noticeably different from 8xAF - and that 4xAA is hardly necessary at 2560x1600 there are just too many questions left by the TR review. They generally come to the same conclusion that this is a great card, but it's almost like they're struggling to find ANY situation where the 8800 GT might not be as good as the 8800 GTS 640.For a different, more comprehensive look at the 8800 GT, why not try http://www.firingsquad.com/hardware/nvidia_geforce...">the FiringSquad review? They test at a variety of resolutions with a decent selection of GPUs and games. Out of all of their results, the only situation where the 8800 GTS 640 comes out ahead of the 8800 GT is in Crysis at 2xAA/8xAF at 1920x1200. Granted, they don't have 2560x1600 resolutions in their results, but how many midrange people use 30" LCDs? For that matter, how many highend gamers use 30" LCDs? I'm sure they're nice, but for $1300+ I have a lot of other stuff I'd be interested in purchasing!
There are a lot of things that we don't know about testing methodology with all of the reviews. What exact detail settings are used, for example, and more importantly how realistic are those settings? Remember Doom 3's High Quality and Ultra Quality? Running everything with uncompressed textures to artificially help 512MB cards appear better than 256MB cards is stupid. Side by side screenshots showed virtually no difference. I don't know what the texture settings are in the Crysis demo, but I wouldn't be surprised if a bunch of people are maxing everything out and then crying about performance. Being a next gen title, I bet Crysis has the ability to stress the 1GB cards - whether or not it really results in an improved visual experience.
Maybe we can get some image quality comparisons when the game actually launches, though - because admittedly I could be totally wrong and the Crysis settings might be reasonable.
Parafan - Monday, October 29, 2007 - link
I just don't like being fed two totally different things by the same site when picking my new GPU.
Parafan - Monday, October 29, 2007 - link
I've been following AnandTech's test results very carefully since the UT3 demo was released. Comparing these results to the others in UT3 just doesn't make any sense:
First:
Looking at : http://www.anandtech.com/video/showdoc.aspx?i=3140...">http://www.anandtech.com/video/showdoc.aspx?i=3140...
shows the new 8800 GT card beating the 2900 XT, almost 120 fps vs. 105 fps or so, at 1280x1024 in UT3.
Second:
Looking at the first & second GPU test : http://www.anandtech.com/video/showdoc.aspx?i=3128...">http://www.anandtech.com/video/showdoc.aspx?i=3128...
shows the 2900 XT on top at about 108.5 fps, vs. the 8800 Ultra, GTX, and GTS at 104.2, 98.3, and 97.2 at 1280x1024.
Pretty close numbers, you see.
Third:
Looking at the new test again, 8800GT VS 8800GTS : http://www.anandtech.com/video/showdoc.aspx?i=3140...">http://www.anandtech.com/video/showdoc.aspx?i=3140...
shows the 8800 GT beating the 8800 GTS at 1280x1024: close to 120 fps vs. 105 fps. The GTS is still over 100, despite being below 100 in the previous test.
But the huge difference is at 1600x1200: the 8800 GT right above 100 fps, with the GTS around 90? In the previous test the GTS showed results as low as 77 fps. C'mon, something smells weird.
See where I'm going?
http://www.anandtech.com/video/showdoc.aspx?i=3140...">http://www.anandtech.com/video/showdoc.aspx?i=3140...
just showed the 8600 GTS performing a lot worse in this new test compared to the old one, at all resolutions.
and again
http://www.anandtech.com/video/showdoc.aspx?i=3140...">http://www.anandtech.com/video/showdoc.aspx?i=3140...
shows the 8800 GT and 8800 GTX performing about the same, at the highest almost 120 fps. Compared to the previous test, that's like 20 fps better than the GTX performed last time. Why don't these tests correspond at all to the one just made?
Seems like all the 8800 GT, GTX, and Ultra cards just got a whole freaking lot better, making the 2900 XT look worse, WHICH I FIND DOUBTFUL. Someone bring the facts to the table.
Don't tell me 2 extra GB of RAM made the NVIDIA cards play a lot better, and the ATI card a lot worse!
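For what it's worth, the gaps being described can be sanity-checked with a quick script. The fps values below are the rough readings quoted in this thread (eyeballed from the graphs), not exact AnandTech data:

```python
# Approximate UT3 fps at 1280x1024 as quoted in this thread -- treat
# these as rough graph readings, not official numbers.
old_test = {"8800 GTX": 98.3, "8800 GTS": 97.2, "2900 XT": 108.5}  # article i=3128
new_test = {"8800 GTX": 120.0, "8800 GTS": 105.0, "2900 XT": 105.0}  # article i=3140

for card in old_test:
    pct = (new_test[card] - old_test[card]) / old_test[card] * 100
    print(f"{card}: {pct:+.1f}% between the two articles")
```

By these rough numbers the GTX gains about 22% and the GTS about 8%, while the 2900 XT drops only about 3% - consistent with a driver-side improvement on the NVIDIA cards rather than the ATI card getting worse.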
DerekWilson - Monday, October 29, 2007 - link
We used a different driver version this time -- in fact, we've gone through two driver revisions from NVIDIA here. The AMD card didn't slip significantly in performance at all (differences were all within 3%).
We did rerun the numbers, and we really think it's a driver issue -- the new NV driver improved performance.
Parafan - Wednesday, November 7, 2007 - link
Well, clearly this must be a driver issue. But I read that NVIDIA's 169.xx drivers were made to optimize performance while lowering graphics quality. This was shown when the water in Crysis looked worse with 169.04 and 169.01 than with their previous 163.xx drivers.
Spacecomber - Monday, October 29, 2007 - link
It's hard to tell what you are getting when you compare the results from one article to those of another article. Ideally, you would like to be able to assume that the testing was done in an identical manner, but this isn't typically the case. As was already pointed out, look at the drivers being used. The earlier tests used NVIDIA's 163.75 drivers while the tests in this article used NVIDIA's 169.10 drivers. Also, not enough was said about how Unreal 3 was being tested to know, but I wonder if they benchmarked the game in different manners for the different articles. For example, were they using the same map "demo"? Were they using the game's built-in fly-bys, or were they using FRAPS? These kinds of differences between articles could make direct comparisons difficult.
spinportal - Monday, October 29, 2007 - link
Have you checked the driver versions? Over time drivers do improve performance, perhaps?
Parafan - Monday, October 29, 2007 - link
Well, the 'new' drivers made the GF 8600 GTS perform a lot worse, but the higher-end cards better. I don't know how likely that is.
Regs - Monday, October 29, 2007 - link
To blacken: I am a big AMD fan, but right now it's almost laughable how they're getting stepped on and kicked by the competition. AMD's ideas are great for the long run, and their 65nm process was just a mistake since 45nm is right around the corner. They simply do not know how to compete when the heat is on. AMD is still traveling in 1st gear.
yacoub - Monday, October 29, 2007 - link
"NVIDIA Demolishes... NVIDIA? 8800 GT vs. 8600 GTS"Well the 8600GTS was a mistake that never should have seen the light of day: over-priced, under-featured from the start. The 8800 GT is the card we were expecting back in the Spring when NVidia launched that 8600 GTS turd instead.
yacoub - Monday, October 29, 2007 - link
First vendor to put a quieter/larger cooling HSF on it gets my $250.
gamephile - Monday, October 29, 2007 - link
Dih. Toh.
CrystalBay - Monday, October 29, 2007 - link
Hi Derek, how are the temps under load? I've seen some results of the GPU pushing 88C plus with that anemic stock cooler.
Spacecomber - Monday, October 29, 2007 - link
I may be a bit misinformed on this, but I'm getting the impression that Crysis represents the first game that makes major use of DX10 features, and as a consequence, it takes a major bite out of the performance that existing PC hardware can provide. When the 8800GT is used in a heavy DX10 game context, does the resulting performance fall into a hardware class that we typically would expect from a $200 part? In other words, using the Ti-4200 comparison, is the playable performance only acceptable at moderate resolutions and medium settings? We've seen something like this before: when DX8 hardware was available and people were still playing DX7 games with it, the performance was very good. Once true DX8 games started to show up, hardware (like the Ti-4200) that first supported DX8 features struggled to actually run those features.
Basically, I'm wondering whether Crysis (and other DX10 games that presumably will follow) places the 8800GT's $200 price point into a larger context that makes sense.
Zak - Monday, November 5, 2007 - link
I ran Vista for about a month before switching back to XP due to Quake Wars crashing a lot (no more crashes under XP). I ran a bunch of demos during that month, including Crysis and Bioshock, and I swear I didn't see a lot of visual difference between DX10 on Vista and DX9 on XP. Same for TimeShift (does it use DX10?). And all games run faster on XP. I really see no compelling reason to go back to Vista just because of DX10.
Zak
Spacecomber - Monday, October 29, 2007 - link
Test
EateryOfPiza - Monday, October 29, 2007 - link
What kind of G92 variants can we expect by Christmas '07? Or Summer '08?
mpc7488 - Monday, October 29, 2007 - link
HardOCP is reporting that nVidia is increasing the 8800GTS stream processor count to 112.
Spacecomber - Monday, October 29, 2007 - link
Testing ;-)
Spacecomber - Monday, October 29, 2007 - link
It appears that it was the bracketed h that was hiding all subsequent text. It needed a bracketed /h to close that "feature".
mpc7488 - Monday, October 29, 2007 - link
Haha - thanks. I guess if anyone wants the explanation of the stream processors they can highlight the 'hidden message'.
mpc7488 - Monday, October 29, 2007 - link
I'm not sure why the first post lost my text unless it was the bracket I used around the H - but HardOCP is reporting that nVidia is changing the 8800GTS 640 MB to have 112 stream processors.
mpc7488 - Monday, October 29, 2007 - link
Great article Derek - I think you can tell you're mildly excited about this product :)
Is there a reason that you didn't do any tests with anti-aliasing? I would assume that this would show more deviation between the 8800GTX and the 8800GT?
chizow - Monday, October 29, 2007 - link
Nice job as usual Derek!
Just wondering, though, if you were able to test the cards at the same clock speeds. The GT by default has a ~100MHz advantage on the core over the GTS, which is a common reason the GTS falls so far behind in head-to-head testing. I expect the GT to have more OC'ing headroom than the GTS anyway, but it would be nice to see an apples-to-apples comparison to reveal the impact of some of the architecture changes from G80 to G92. Of note, the GT has fewer ROPs and a smaller memory bus but gains 1:1 address/filter units and 16 more stream processors.
Also, I saw an early review that showed massive performance gains when the shader processor was overclocked on the GT; much bigger gains than significant increases to the core/memory clocks. Similar testing with the GTS/GTX don't yield anywhere near that much performance gain when the shader core clock is bumped up.
Lastly, any idea when the G92 8800GTS refresh is going to be released? With a 640MB GTS this seems more of a lateral move to an 8800GT, although a refreshed GTS with 128SP and all the other enhancements of the G92 should undoubtedly be faster than the GTX...and maybe even the Ultra once overclocked.
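One rough way to see why shader-clock overclocks help the GT so much: its theoretical shader throughput already sits near the GTX's. A back-of-the-envelope sketch, assuming the commonly cited stock shader clocks and SP counts (counting a MAD as 2 flops and ignoring the co-issued MUL; vendor-overclocked boards will differ):

```python
# Theoretical shader throughput: SPs * shader clock (MHz) * 2 flops (MAD)
# Clocks/SP counts below are the commonly cited stock specs - an assumption,
# not measured data from this review.
cards = {
    "8800 GT":      (112, 1500),
    "8800 GTS 640": (96,  1188),
    "8800 GTX":     (128, 1350),
}

for name, (sps, mhz) in cards.items():
    gflops = sps * mhz * 2 / 1000
    print(f"{name}: ~{gflops:.0f} GFLOPS")
```

By this crude measure the GT (~336 GFLOPS) lands much closer to the GTX (~346) than to the GTS 640 (~228), which fits the observation that bumping the GT's shader clock yields outsized gains.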
Hulk - Monday, October 29, 2007 - link
I'm looking to build a HTPC and this would be a great card if it does video decoding?
defter - Monday, October 29, 2007 - link
Yes, it has the VP2 processor for video decoding. But why would you need a fast gaming card for HTPC? Wouldn't 8400/8600 be a cheaper/cooler solution?
Hulk - Monday, October 29, 2007 - link
Thanks for the reply.
This card looks to be pretty cool running, and when not running 3D-intensive apps I'm sure power consumption and noise are really low.
So it might be nice to be able to play a little on a 52"LCD!
DerekWilson - Monday, October 29, 2007 - link
Also, if you go with a less powerful card for HD HTPC you'll want at minimum the 8600 GTS -- which is not a good card. The 8800 GT does offer a lot more bang for the buck, and Sparkle is offering a silent version.
spittledip - Monday, October 29, 2007 - link
Nothing like cherry picking the games... I don't understand why games like Stalker and Prey weren't tested, as the 2900XT has superior performance on those titles, as well as other titles. Seems like a biased test.
AssBall - Monday, October 29, 2007 - link
They didn't test The Sims 2 or Deer Hunter either...
DerekWilson - Monday, October 29, 2007 - link
lol ... Stalker and Prey?
We tested Quake Wars, which is effectively updated Prey (id's engine).
And Stalker runs better on NVIDIA hardware -- when tested properly (many people use demo flybys that point up at the sky way too much, rather than FRAPS run-throughs).
abe88 - Monday, October 29, 2007 - link
Hmmm, I thought ATI's RV630 and RV610 chips both support PCI-E 2.0?
Wirmish - Monday, October 29, 2007 - link
Yeah, but it's not worth mentioning because these GPUs are not from nVidia.
defter - Monday, October 29, 2007 - link
G92 has the same amount of SPs and MORE texturing power (twice as many addressing units) than G80. However, 8800GT card has some SPs and texture units disabled.
DerekWilson - Monday, October 29, 2007 - link
Well, first, if G92 has those units disabled, then it can't claim them.
Second, NVIDIA would not confirm that the G92 as incarnated in the 8800 GT has units disabled, but it is fair to speculate that this configuration was chosen to work out yields on their first 65nm part.
gamephile - Monday, October 29, 2007 - link
Based on benchmarks and price this card is finally in the sweet spot for me, which means I can finally ditch my ATI X300! I only have one question remaining and that concerns the noise level. How does it compare to the 8800GTS? Why was this omitted from your review?
Vidmar - Monday, October 29, 2007 - link
Ditto! Noise please!!!
DerekWilson - Monday, October 29, 2007 - link
We didn't measure noise, as it's a reference board which doesn't necessarily reflect final boards available from OEMs.
Of course, since you guys want this, we'll try to add it to future GPU launch articles.
For now, it'll have to suffice to say that it isn't a loud card, and it doesn't seem any louder than the 8800 GTS.
Missing Ghost - Tuesday, October 30, 2007 - link
These days most retail cards are pretty much the same as the reference cards...
michal1980 - Monday, October 29, 2007 - link
I want to know too, if it's better than my 8800GTS 640. I'll eBay that card for the 8800 GT, especially with the smaller, quieter cooler.
Dantzig - Monday, October 29, 2007 - link
Tom's Hardware did a noise comparison and found that the 8800GT was as quiet or quieter than any of the other 8800 series cards, the 8600 series, and the 2900XT.gamephile - Monday, October 29, 2007 - link
Yeah, I saw that; I would just like confirmation from a source I trust.
mpc7488 - Monday, October 29, 2007 - link
Lol - nice.
The Tech Report did a good review; they have noise figures on page 7. http://techreport.com/articles.x/13479/7">Tech Report 8800GT Noise Levels
gamephile - Monday, October 29, 2007 - link
Also the power consumption image doesn't load for me either. I'm not behind any firewall or proxy.DerekWilson - Monday, October 29, 2007 - link
I'll look into the power graph thing.
DukeN - Monday, October 29, 2007 - link
This is unreal price-to-performance - knock on wood; playing Oblivion at 1920x1200 on a $250 GPU.
Could we have a benchmark based on the Crysis demo please, showing how one or two cards would do?
Also, the power page pics do not show up for some reason (may be the firewall cached it incorrectly here at work).
Thank you.
Xtasy26 - Monday, October 29, 2007 - link
Hey guys,
If you want to see Crysis benchmarks, check out this link:
http://www.theinquirer.net/gb/inquirer/news/2007/1...">http://www.theinquirer.net/gb/inquirer/.../2007/10...
The benches are:
1280 x 1024 : ~ 37 f.p.s.
1680 x 1050 : 25 f.p.s.
1920 x 1080 : ~ 21 f.p.s.
This is on a test bed:
Intel Core 2 Extreme QX6800 @2.93 GHz
Asetek VapoChill Micro cooler
EVGA 680i motherboard
2GB Corsair Dominator PC2-9136C5D
Nvidia GeForce 8800GT 512MB/Zotac 8800GTX AMP!/XFX 8800Ultra/ATI Radeon HD2900XT
250GB Seagate Barracuda 7200.10 16MB cache
Sony BWU-100A Blu-ray burner
Hiper 880W Type-R Power Supply
Toshiba's external HD-DVD box (Xbox 360 HD-DVD drive)
Dell 2407WFP-HC
Logitech G15 Keyboard, MX-518 rat
Xtasy26 - Monday, October 29, 2007 - link
This game seems really demanding. If it is getting 37 f.p.s. at 1280 x 1024, imagine what the frame rate will be with 4X FSAA enabled combined with 8X Anisotropic Filtering. I think I will wait till Nvidia releases their 9800/9600 GT/GTS and combine that with Intel's 45nm Penryn CPU. I want to play this beautiful game in all its glory! :)
Spuke - Monday, October 29, 2007 - link
Impressive!!!! I read the article but I saw no mention of a release date. When's this thing available?
Spuke - Monday, October 29, 2007 - link
Ummm.....When can I BUY it? That's what I mean.
EODetroit - Monday, October 29, 2007 - link
Now.
http://www.newegg.com/Product/ProductList.aspx?Sub...">http://www.newegg.com/Product/ProductLi...18+10696...
poohbear - Wednesday, October 31, 2007 - link
When do you guys think it's gonna be $250? Cheapest I see is $270, but I understand when it's first released the prices are jacked up a bit.
EateryOfPiza - Monday, October 29, 2007 - link
I second the request for Crysis benchmarks; that is the game that taxes everything at the moment.
DerekWilson - Monday, October 29, 2007 - link
We actually tested Crysis ... but there were issues ... not with the game; we just shot ourselves in the foot on this one and weren't able to do as much as we wanted. We had to retest a bunch of stuff, and we didn't get to Crysis.
yyrkoon - Monday, October 29, 2007 - link
Yes, I am glad that instead of purchasing a video card, I changed motherboard/CPU from AMD to Intel. I still like my AM2 Opteron system a lot, but the performance numbers, and the effortless 1GHz OC on the ABIT IP35-E (at $90 USD!), were just too much to overlook.
I can definitely understand your 'praise', as it were, now that nVidia is lowering their prices, but this is where these prices should have always been. nVidia and ATI/AMD have been ripping us, the consumer, off for the last 1.5 years or so, so you will excuse me if I do not show too much enthusiasm when they finally lower their prices to where they should be. I do not consider this to be much different than the memory industry overcharging, and the consumer getting the shaft (as per your article).
I am happy though . . .