111 Comments

  • Nightmare225 - Sunday, November 26, 2006 - link

    Are the FPS posted in this article, Minimum FPS, Average FPS, or Maximum? Thanks!
  • multiblitz - Monday, November 20, 2006 - link

    I've always enjoyed your reviews a lot, as they included the video capabilities for an HTPC on previous cards. Unfortunately, that wasn't the case this time. Hopefully there will be a second part covering this as well? If so, it would be nice to make a comparison of picture quality against the filters of ffdshow as well, since NVIDIA now also supports postprocessing filters...
  • DerekWilson - Tuesday, November 21, 2006 - link

    What we know right now is that the 8800 scores 128 out of 130 on the HQV test.

    We haven't quite put together an HTPC look at 8800, but this is a possibility for the future.
  • epsil0n - Sunday, November 19, 2006 - link

    I don't agree with this:

    "It isn't surprising to see that NVIDIA's implementation of a unified shader is based on taking a pixel shader quad pipeline, and breaking up the vector units into 4 scalar units. Now, rather than 4 pixel quads, we see 16 SPs per "quad" or block of stream processors. Each block of 16 SPs shares 4 texture address units, 8 texture filter units, and an L1 cache."

    If I understood correctly, this sentence says that given 4 pixels, the number of SPs involved in the computation is 16. That assumes each component of the pixel shader is computed horizontally across 16 SPs (4 pixels x 4 RGBA components = 16 SPs). But are you sure??

    I haven't found other articles on the web that speculate about this. Reading other articles, the main idea I came away with is that a shader is computed by one and only one SP. Each vector instruction (inside the shader) is "mapped" to a sequence of scalar operations (a dot product between two vectors is mapped to 4 MUL/ADD operations). As a consequence, in this scenario 4 pixels are computed by only 4 SPs.
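
    To illustrate that per-SP view (my assumption of how it would work, not anything NVIDIA has confirmed), a vec4 dot product could be serialized on a single scalar SP roughly like this:

        // Hypothetical sketch: a 4-component dot product executed as a
        // sequence of scalar multiply-accumulate (MAD) operations on one scalar SP.
        #include <cstdio>

        int main() {
            float a[4] = {1.0f, 2.0f, 3.0f, 4.0f};   // e.g. a surface normal
            float b[4] = {0.5f, 0.5f, 0.5f, 0.5f};   // e.g. a light direction

            float acc = 0.0f;
            for (int i = 0; i < 4; ++i)
                acc = acc + a[i] * b[i];             // one scalar MAD per cycle

            std::printf("dot = %f\n", acc);          // 4 pixels -> 4 SPs, ~4 cycles each
            return 0;
        }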
  • DerekWilson - Tuesday, November 21, 2006 - link

    Honestly, NVIDIA wouldn't give us this level of detail. We certainly pressed them about how vertices and pixels map to SPs, but the answer we got was always something about how the hardware is able to dynamically schedule the SPs optimally according to what needs to be done.

    They can get away with being obscure about how they actually process the data because it could happen either way and provide the same effect to the developer and gamer alike.

    Scheduling the simultaneous processing of one vec4 MAD operation on 4 quads (16 pixels) over 4 groups of 4 SPs will take 4 clock cycles (in terms of throughput). Processing the same 16 pixels on 16 SPs will also take 4 clock cycles.
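
    As a back-of-the-envelope check (my arithmetic, assuming one scalar MAD per SP per clock), both arrangements come out to the same throughput:

        // Sketch: cycles needed for a vec4 MAD over 16 pixels, assuming one
        // scalar MAD per SP per clock. Both organizations land on the same count.
        #include <cstdio>

        int main() {
            const int pixels        = 16;   // 4 quads
            const int components    = 4;    // a vec4 MAD is 4 scalar MADs per pixel
            const int sps_per_block = 16;   // one block of stream processors

            int scalar_ops = pixels * components;        // 64 scalar MADs
            int cycles     = scalar_ops / sps_per_block; // 64 / 16 = 4

            // Whether the block is treated as 4 groups of 4 SPs (one pixel per
            // group) or as 16 independent scalar units (one pixel per SP), it
            // still retires 16 scalar MADs per clock.
            std::printf("cycles: %d\n", cycles);         // prints 4
            return 0;
        }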

    But there are reasons to believe that things happen the way we described. Loading components of 16 different "threads" (verts, pixels or whatever) would likely be harder on the cache than loading all 4 components of 4 different threads. We could see them schedule multiple ops from 4 threads to fill up each block of shaders -- like computing 4 consecutive scalar operations for 4 threads on 16 SPs.

    At the same time, it might be easier to maximize SP utilization if 16 threads were processed on one block of SPs every clock.

    I think the answer to this question is that NVIDIA knows, they didn't tell us, and all we can do is give it our best guess.
  • xtknight - Thursday, November 16, 2006 - link

    This has been AT's best article in a while. Tons of great, concise info.

    I have a question about the gamma corrected AA. This would be detrimental if you've already calibrated your display, correct (assuming the game heeds the calibration)? Do you know what gamma correction factor the cards use for 'gamma corrected AA'?
  • DerekWilson - Monday, November 20, 2006 - link

    I don't know if they dynamically adjust gamma correction based on the monitor (that would be nice, though) ...

    If they don't, they likely adjusted for a gamma of either 2.2 or 2.5 (or somewhere in between).
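
    As a rough sketch of what a gamma-correct resolve does (assuming a fixed gamma of 2.2 here -- not NVIDIA's actual hardware path), the sub-pixel samples are blended in linear space rather than directly in gamma space:

        // Sketch of a gamma-correct AA resolve with an assumed fixed gamma of 2.2.
        #include <cmath>
        #include <cstdio>

        float resolve_gamma_correct(const float* samples, int n, float gamma = 2.2f) {
            // Convert each stored (gamma-space) sample to linear light,
            // average, then convert the result back to gamma space.
            float sum = 0.0f;
            for (int i = 0; i < n; ++i)
                sum += std::pow(samples[i], gamma);
            return std::pow(sum / n, 1.0f / gamma);
        }

        int main() {
            float samples[4] = {1.0f, 0.0f, 0.0f, 0.0f};  // one bright sub-sample on a dark edge
            float naive = (1.0f + 0.0f + 0.0f + 0.0f) / 4.0f;
            std::printf("naive average:   %.2f\n", naive);                             // 0.25
            std::printf("gamma-corrected: %.2f\n", resolve_gamma_correct(samples, 4)); // ~0.53
            return 0;
        }

    If the display is calibrated to a different gamma than the hard-wired value, the blend weights end up slightly off, which is exactly the concern raised above.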

    Also, thanks :-) There was a lot more we wanted to pack in, but I'm glad to see that we did a good job with what we were able to include.

    Thanks,
    Derek Wilson
  • bjacobson - Sunday, November 12, 2006 - link

    This comment is unrelated, but could you implement some system where, after rating a comment, on reload the page goes back to the comment I was just at? Otherwise I rate something halfway down and then have to spend several seconds finding where I just was. Just a little nuisance.

    Thanks for the great article, fun read.
  • neo229 - Friday, November 10, 2006 - link

    quote:

    Both cards are extremely quiet during operation...


    This is a very suspect quote. A card that requires two PCIe power connectors is going to dissipate a lot of heat. More heat means there must be a faster, louder fan or more substantial and costly heat sink. The extra costs associated with providing a truly quiet card mean that the bulk of manufacturers go with the loud fan option.
  • DerekWilson - Friday, November 10, 2006 - link

    If manufacturers go with the NVIDIA reference design, then we will see a nice large heatsink with a huge quiet fan.

    Really, it does move a lot of air without making a lot of noise ... Are there any devices we can get to measure the airflow of a cooling solution?

    We are also seeing some designs using water cooling, and there's even one with a thermo-electric (Peltier) cooler on it. Manufacturers are going to great lengths to keep this thing running cool without generating much noise.

    None of the 8 retail cards we are testing right now generate nearly the noise of the X1950 XTX ... We are working on a retail roundup right now, and we'll absolutely have noise numbers for all of these cards at load.
  • aweigh - Friday, November 10, 2006 - link

    You can just use the program DX Tweaker to enable triple buffering in any D3D game and use your VSYNC with negligible performance impact. So you can play with your VSYNC, a high res, and AA as well. :)
  • aweigh - Friday, November 10, 2006 - link

    I'm gonna buy an 88 specifically to use 4x4 SuperSampling in games. Why bother with MSAA with a card like that?
  • DerekWilson - Friday, November 10, 2006 - link

    Supersampling can make textures blurry -- especially very detailed textures.

    And the impact will be much greater with the use of longer more detailed pixel shaders (as the shaders must be evaluated at every sub-pixel in supersample).

    I think transparency / adaptive AA are enough.

    On your previous comment, I don't think we're to the point where we can hit triple buffering, vsync, high levels of AA AND high resolution (2560x1600) without some input lag (triple buffering plus vsync with framerates less than your refresh rate can cause problems).

    If you're talking about enabling all these options on a lower resolution lcd panel, then I can definitely see that as a good use of the hardware. And it might be interesting to look at more numbers with these type of options enabled.

    Thanks for the suggestion.
  • aweigh - Saturday, November 11, 2006 - link

    I never knew that about SuperSampling. Is it something similar to Quincunx blurring? And would using a negative LOD via RivaTuner/nHancer counteract the effect?

    How about NVIDIA's Digital Sharpness setting in Color Correction? I've found a smidge of sharpening can do wonders to improve overall clarity.

    By the way, when you said Adaptive AA, were you referring to ATI cards?
  • Unam - Friday, November 10, 2006 - link

    Derek,

    Saw your comment regarding the rationale for the test resolution. While I understand your reasoning now, it still raises the question: how many of your readers have 30" LCD flat panels?
  • DerekWilson - Friday, November 10, 2006 - link

    There might not be many out there right now, but it's still the right test platform for G80. We did test down to 1600x1200, so people do have information if they need it.

    But it speaks to who should own an 8800 GTX right now. It doesn't make sense to spend that much money on a part if you aren't going to get anything out of it with your 1280x1024 panel.

    Owners of a 2560x1600 panel will want an 8800 GTX. Owners of an 8800 GTX will want a 2560x1600 panel. Smooth framerates with the ability to enable 4xAA in every game that allowed it is reason enough. People without a 2560x1600 panel should probably wait to buy the card until prices come down on the 8800 GTX or until games arrive that can push it harder.
  • Unam - Tuesday, November 14, 2006 - link

    Derek,

    A follow up to testing resolutions, the FPS numbers we see in your articles, are they maximum, minimum or average?
  • Unam - Friday, November 10, 2006 - link

    Who the heck runs 2560x1600? At 4XAA? Come on guys, real world benchmarks please!
  • DerekWilson - Friday, November 10, 2006 - link

    we did:

    1600x1200, 1920x1440, and even 1280x1024 in Oblivion
  • dragonsqrrl - Thursday, August 25, 2011 - link

    ....lol, owned.
  • Sharky974 - Thursday, November 9, 2006 - link

    The DX10 feature coverage was captivating at first, but quickly grew tiresome and needlessly complex. The IQ comparisons were the same; some simplicity is needed here. Tell us in a nutshell what looks better and why. The mouse-over pictures are well nigh useless as well, and all look like crap. Whatever needs to be changed to get the IQ point across needs to be changed already; I'm guessing 200% zoom is a problem for starters.

    Then whose bright idea was it to only test one resolution through the whole article?

    Then whose bright idea was it to dedicate just as many graphs as performance, one per game, to not only power draw but the even more useless performance per watt? Meaning 66% of your data graphs, in an article about a paradigm-changing, long-anticipated, brand new GPU, are related to the power usage of the card. Are you electricworkersmonthly.com now?

    I am very surprised more of the comments weren't negative, this review was a total failure.

    And yeah, what's with all the non-standard resolution testing? All the big sites like H, Anand, and FS go round and round talking about the incredible depths they go to to get to the bottom of real world performance as it relates to the real world, average user, and then you guys use stupid resolutions like 1280x960 (FS uses that particular one) that nobody on earth uses regularly! It's really, really stupid. Hell, for that matter, nobody uses 1600x1200 or any non-LCD-native res anymore either, yet those are all staples of any review, and so these "real world" articles aren't very real world at all. But that's somewhat of a tangent issue, and I actually don't mind a lot of different resolutions tested, just as long as the big common ones are hit (which is not always the case).
  • DerekWilson - Friday, November 10, 2006 - link

    I'm always working on bringing down the complexity of my explanations. It's one of my weak points as a writer. It's difficult for me to take something and present it at a high level that doesn't reflect exactly what the thing is. Analogies are great -- I like them -- but I have a hard time using them because I can't ever think of analogies that are accurate enough.

    Any suggestions you have for helping me explain things completely, accurately, effectively, and (especially) in the most straightforward manner possible are very welcome.

    As for the IQ comparisons -- these were much more simplified than I had intended (because Anand told me we couldn't do rollovers with 40 images on one page -- it would load too slowly). This is our version of putting things in a nutshell. I could get to the point faster, though --

    IQ:

    Gamma-correct AA is great for edges, but it causes problems with thin lines, and with transparency/adaptive AA it can make textures look mushy. Transparency/adaptive AA is great but has a large performance hit -- except on the 8800, which keeps these features playable and offers higher IQ. CSAA is great at bringing higher AA levels to edges, but the loss of Z data at the sub-pixel level makes it less effective at solving the thin line problem than equivalent MSAA modes. The rollovers illustrate all of this.

    That's as simple as I can make it -- I hope it helps.

    We did not test at only one resolution -- in every game we tested at 1600x1200, 1920x1440, and 2560x1600. In Oblivion we tested at 1280x1024 as well.

    All our resolution data was in the last graph on each page -- resolution scaling. There are two graphs per page on performance. As you can see, at resolutions below 2560x1600, the 8800 GTX is almost overkill.

    1600x1200 is a standard LCD panel resolution and has been for quite some time. It's actually quite affordable now as well. 1280x1024 (while popular) is often too low to matter in a high end performance analysis piece (and where it did matter, we tested it). 1920x1440 is a 4:3 resolution that will give 1920x1200 panel owners a very good idea of performance (the difference is usually under 5% in many games). 2560x1600 is a standard resolution for 30" LCD panels.

    I can understand being upset if you missed the performance data at other resolutions, but it seems like the rest of your complaints are that we put too much data in the article. I doubt this will change in the future, but is there anything else we could have done to make this article better? We are very willing to listen to feedback, especially on articles as big as this.

    Thanks,
    Derek Wilson
  • flexy - Friday, November 10, 2006 - link

    >>>
    complaints are that we put too much data in the article. I doubt this will change in the future,
    >>>

    i doubt you can make it RIGHT for everyone...however i share the opinion w/ MOST that it is an excellent review. TOO much data is seldom bad, NOT on a site where you can expect geeks and nerds digging into every bit of information :)

    I remember times when reviews were FAR less detailed...and what can be better than going in-depth into AA/AF modes, showing their differences in detail? I think this was right on, and i value such in-depth coverage!

    The DX10 coverage MAYBE was "too much info" for some...but then legitimate IMHO. We're talking about a totally new h/w architecture, a totally new and revamped DX API, and the first hardware supporting it...so it was definitely a good place to cover this.

    Also...you always have the option to skip parts of a review...and the MORE detailed it is...the more it is a helpful resource (also later) to come back and read up. You don't need to comprehend every bit of information at once, but it's good to know it's there.

    my $0.02
  • jiulemoigt - Thursday, November 9, 2006 - link

    The first really big issue is that a poly can have more than one color on it, due to textures, subsurface scattering, displacements, bump maps, normal maps, occlusion passes, specular highlights, transparency, and a few others I can not think of off the top of my head; you could probably find out just by asking in any CG forum like CGTalk, or any dev who has worked with a professional 3D package. That being said, it may have confused people to try and explain how it really works.
    The other issue is with gamma correct AA. Maybe my monitor is showing a way different image, but I'm not really sure how you can even compare
    http://images.anandtech.com/reviews/video/NVIDIA/G...
    http://images.anandtech.com/reviews/video/NVIDIA/G...
    as the light is highlighting the buildings from two different directions in the images: the NVIDIA image is lit from the left and behind the buildings, and the ATI image is lit from the right and about midway down the image, in front of the little building.
    Though a question that should be asked is what time of day it is supposed to be: the NVIDIA shot looks like dusk, and the ATI shot looks blown out even for high noon, though the one above seems to be the same time of day and the NVIDIA is blown out and the ATI is shadowing correctly... really odd for the images, which suggests that some other filter is causing the issue on both cards, like HDR, or something else.
  • DerekWilson - Thursday, November 9, 2006 - link

    Yes, a poly can have more than one color on it, and I agree our explanation could have been better ... but it is a difficult topic to talk about.

    The whole basis of multisample AA relies on the assumption that the color of a poly *within one pixel* will not vary significantly. Of course, this is not always true. This is, in fact, the reason supersample AA does make a difference -- it takes into account the actual color at the position of each sub-pixel. This is also why it's so much more expensive.

    I didn't mean to imply that an entire poly must have only one color. But it's hard to talk about MSAA without pointing out the fact that the algorithm assumes one color per pixel per poly (calculated at the pixel center in most cases).
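
    A loose sketch of that difference (my own simplification, not the actual hardware path): multisampling runs the shader once per pixel per triangle and reuses that color for every covered sub-sample, while supersampling shades each sub-sample position separately:

        // Simplified contrast between MSAA and SSAA for one pixel partially
        // covered by one triangle. 'shade' stands in for an arbitrary pixel shader.
        #include <cstdio>

        struct Vec2 { float x, y; };

        float shade(Vec2 p) {                     // placeholder pixel shader
            return 0.5f + 0.25f * p.x + 0.25f * p.y;
        }

        int main() {
            Vec2 center = {0.5f, 0.5f};
            Vec2 sub[4] = {{0.25f, 0.25f}, {0.75f, 0.25f}, {0.25f, 0.75f}, {0.75f, 0.75f}};
            bool covered[4] = {true, true, true, false};   // coverage from the rasterizer

            // MSAA: one shader evaluation at the pixel center, replicated to covered samples.
            float once = shade(center);
            float msaa = 0.0f;
            for (int i = 0; i < 4; ++i) msaa += covered[i] ? once : 0.0f;

            // SSAA: the shader runs at every covered sub-sample position.
            float ssaa = 0.0f;
            for (int i = 0; i < 4; ++i) ssaa += covered[i] ? shade(sub[i]) : 0.0f;

            std::printf("MSAA resolve: %.3f (1 shader run)\n", msaa / 4.0f);
            std::printf("SSAA resolve: %.3f (1 shader run per covered sub-sample)\n", ssaa / 4.0f);
            return 0;
        }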

    We did enable HDR, but we tried our hardest to take the screenshots at exactly the same amount of time after loading the scene (Valve's HDR uses dynamic exposure, which does change saturation over time and with the light level coming into the camera).

    While this would impact general image comparison, it doesn't impact the effect of gamma correct AA on thin lines (which is what we were trying to show).

    Thanks for the feedback -- if there's anything you can add to help us be more specific in our description, we would certainly appreciate it. We would like to avoid simply leaving details out -- we'd like to learn how to better impart knowledge.
  • Nimbo - Thursday, November 9, 2006 - link

    This must be the first GPU article that does not devolve into a flame war between ATI and NVIDIA fanboys...
  • flexy - Thursday, November 9, 2006 - link

    i actually don't care. I look at performance and comparisons, and then choose what card to get :) Although I've been w/ ATI for years already.

    If one card, however, has some substantial advantage over another, i'll gladly point that out and also gladly debate with others why i'd prefer card X over Y.

    That's the difference between a fanboy and an enthusiast, i think. As long as i can back up statements w/ facts instead of just defending a "brand".

    The other "problem" is really that same-gen cards USUALLY are pretty much on par performance-wise...so debating/defending brand X over Y makes about as much sense as defending Ferrari over Lamborghini :)

    But then...if we wouldn't do that and even discuss the "littlest" details and have lengthy conversations on forums, eg. WHICH AA method is better and why...and why 5 FPS here or there is better...and/or why this AF method is better than the other...it would be pretty boring.

    I mean we're hardware enthusiasts, and gfx cards are (IMHO) the most interesting component in a PC :)
  • DigitalFreak - Thursday, November 9, 2006 - link

    I thought we were done with the days of >$499 single GPU cards after the 7900GTX launch. Guess not.
  • VooDooAddict - Thursday, November 9, 2006 - link

    Great article.

    Now I just need to figure out if a 8800GTX will fit in a mATX UltraFly Case.
  • Araemo - Thursday, November 9, 2006 - link

    Everyone is repeating Microsoft's claim that DX10 will be Vista-only.

    The Inq (I know, I know....) reported here (http://www.theinquirer.net/default.aspx?article=35...) that there will be a DirectX '9.0L' for XP that supports the new rendering features of DirectX 10, but without the new virtualization/driver model improvements.
  • haris - Thursday, November 9, 2006 - link

    You must have missed the article they published the very next day (http://www.theinquirer.net/default.aspx?article=35...), saying they goofed.
  • Araemo - Thursday, November 9, 2006 - link

    Yes I did - thanks.

    I wish they would have updated the original post to note the mistake, as it is still easily accessible via google. ;) (And the 'we goofed' post is only shown when you drill down for more results)
  • Araemo - Thursday, November 9, 2006 - link

    In all the AA comparison photos of the power lines, with the dome in the background - why does the dome look washed out in the G80 images? Is that a driver glitch? I'm only on page 12, so if you explain it after that.. well, I'll get it eventually.. ;) But is that just a driver glitch, or is it an IQ problem with the G80 implementation of AA?
  • bobsmith1492 - Thursday, November 9, 2006 - link

    Gamma-correcting AA sucks.
  • Araemo - Thursday, November 9, 2006 - link

    That glitch still exists whether or not gamma-correcting AA is enabled or disabled, so that isn't it.
  • iwodo - Thursday, November 9, 2006 - link

    I want to know if these power hungry monsters have any power saving features?
    I mean, what happens if I am using Windows only most of the time? After all, CPUs have much better power management when they are idle or doing little work. Will I have to pay an extra electricity bill simply because I am a casual gamer with a power-hungry GPU?

    Another question that popped into my mind: with CUDA, would it now be possible for a third party to program an H.264 decoder running on the GPU? Sounds good to me :D
  • DerekWilson - Thursday, November 9, 2006 - link

    oh man ... I can't believe I didn't think about that ... video decoder would be very cool.
  • Pirks - Friday, November 10, 2006 - link

    A decoder is not that interesting, but an MPEG-4 ASP/AVC ENCODER on the G80 GPU... man, I can't imagine AVC or ASP encoding IN REAL TIME... wow, just wooowww
    I'm holding my breath here
  • Igi - Thursday, November 9, 2006 - link

    Great article. The only thing I would like to see in a follow up article is performance comparison in CAD/CAM applications (Solidworks, ProEngineer,...).

    BTW, how noisy are the new cards in comparison to the 7900 GTX and others (at idle and under load)?
  • JarredWalton - Thursday, November 9, 2006 - link

    I thought it was stated somewhere that they are as loud (or quiet if you prefer) as the 7900 GTX. So really not bad at all, considering the performance offered.
  • DerekWilson - Thursday, November 9, 2006 - link

    i'm sure there was a lot buried in there ... sorry if it wasn't easy to find.

    The 8800 GTX and GTS are both no louder than the 7900 GTX. The X1950 XTX still takes the cake for loudest graphics card around by a long shot -- especially after it heats up in a game.
  • crystal clear - Thursday, November 9, 2006 - link

    My comments on DailyTech on this subject -

    "More G80 Derivatives in February"
    RE: More info would be nice
    By crystal clear on 11/8/2006 8:03:43 AM, Rating: 2

    If you link VISTA, the Santa Rosa platform, and the Core 2 Duo (Merom) CPU lineup (T7300, T7500, T7700 models), then a matching graphics card completes the link.

    So a G80 for laptops/notebooks?

    The pairing of Intel's Santa Rosa platform with Vista in Q2 '07 is the next big thing for the first-tier notebook manufacturers, and all they need is a matching G80 for this setup.

    Unquote -
    NVIDIA currently caters to desktop requirements/needs with the new G80 releases; wonder how the notebook/server versions will be -- with Vista, of course.
  • yyrkoon - Thursday, November 9, 2006 - link

    Virtual memory is probably a good thing in most cases, but in the graphics arena this *could* potentially make for sloppy/bad coding practices. Knowing a lot of game devs (some of whom actually work for well known companies), I've heard them from time to time complain about maxing out a 16x PCI-E pipe. What I'm trying to say here is that while it would be a good thing to never run out of texture memory, system memory, and definitely the swap disk, cannot hold a candle to the memory bandwidth that most video cards are capable of. The end result is that you definitely *will* get a performance hit. All this, and we already know the memory bandwidth capabilities of modern PCs; suffice it to say, the most we'll see from current systems is what, 12-13 GB/s? Even a 7800 GS can do roughly 35 GB/s on card. A 7600 GT? 22 GB/s?

    Still, I think DirectX 10 is a very good thing, and as I didn't read the whole article, perhaps I missed a little? Reason being, I've been reading about DirectX 10 since April, and a friend of mine was privy to some of this information after an interview with ATI.

    http://www.gamedev.net/reference/programming/featu...
  • saratoga - Thursday, November 9, 2006 - link

    I don't know how they threading really works, but it's quite possible VM support is required in order to allow multiple threads to run without stepping all over each other.
  • saratoga - Thursday, November 9, 2006 - link

    Sorry, should read "I don't know how THEIR threading works"
  • falc0ne - Thursday, November 9, 2006 - link

    I don't know what the problem is, but I'm really unable to see the images within the latest articles from Anand... Can anyone give me a suggestion? What might be the cause of that?
    The thing is, I'm really, really interested in these articles and I need to see those images. Thanks
  • yyrkoon - Thursday, November 9, 2006 - link

    Oh, er, then in the Options tab of Firefox (Tools->Options->Content), check the "load images" check box ;)
  • falc0ne - Thursday, November 9, 2006 - link

    well...it would've been that simple, but I'm afraid it's not that... It might be the AdBlock extension for Firefox; other than that I have nooo ideeea... Well, I will use the IE Tab option instead and load the pages using IE 7. Thanks anyway :)
  • yyrkoon - Thursday, November 9, 2006 - link

    Checked the exceptions list? I know that Firefox makes it really simple to block images from a site (to the point of being too easy).
  • JarredWalton - Thursday, November 9, 2006 - link

    If you've got AdBlock on Firefox, press Ctrl+Shift+A and you can see what it's blocking. If it blocks the images.anandtech.com stuff, you can then see which RegEx isn't working right and edit that.
  • yyrkoon - Thursday, November 9, 2006 - link

    If you're using Firefox, get and install the extension "Flashblock". Just did this myself today, tired of all the *animated* ads bothering me while reading articles.

    Sorry AT guys, but we've had this discussion before, and it's really annoying.
  • JarredWalton - Thursday, November 9, 2006 - link

    Do you want us to be able to continue as a site? Because ads support us. Anyway, his problem is related to not seeing images, so your comment about blocking ads via Flashblock is completely off topic.
  • yyrkoon - Thursday, November 9, 2006 - link

    Of course I want you guys to continue on as a site, I just wish it were possible without annoying flashing ads in a section where I'm trying to concentrate on the article.

    As for the off topic part, yeah, my bad, I misread the full post (bad habit). Feel free to edit or remove that post of mine :)
  • archcommus - Thursday, November 9, 2006 - link

    What browser are you using?
  • falc0ne - Thursday, November 9, 2006 - link

    firefox 2.0
  • JarredWalton - Thursday, November 9, 2006 - link

    If Firefox, I know there's an option to block images not on the originating website. In this case, images come from images.anandtech.com while the article is on www.anandtech.com, so that may be the cause of your problems. IE7 and other browsers might have something similar, though I haven't ever looked. Other than that, perhaps some firewall or ad blocking software is to blame - it might be getting false positives?
  • archcommus - Thursday, November 9, 2006 - link

    Wow to Anandtech - another amazing, incredibly in-depth article. It is so obvious this site is run by dedicated professionals who have degrees in these fields versus most other review sites where the authors just take pictures of the product and run some benches. Articles like this keep the AT reader base very very strong.

    Also wow to the G80, obviously an amazing card. My question, is 450W the PSU requirement for the GTX only or for both the GTX and GTS? I ask because I currently have a 400W PSU and am wondering if it will be sufficient for next-gen DX10 class hardware, and I know I would not be buying the highest model card. I also only have one HDD and one optical drive in my system.

    Yet another wow goes out to the R&D monetary investment - $475 million! It's amazing that that amount is even acceptable to nVidia, I can't believe the sales of such a high end, enthusiast-targeted card are great enough to warrant that.
  • JarredWalton - Thursday, November 9, 2006 - link

    Sales of the lower end parts which will be based off G80 are what make it worthwhile, I would guess. As for the PSU, I think that 450W is for the GTX, and more is probably a safe bet (550W would be in line with a high-end system these days, although 400W ought to suffice if it's a good quality 400W). You can see that the GTX tops out at just under 300W average system power draw with an X6800, so if you use an E6600 and don't overclock, a decent 400W ought to work. The GTS tops out around 260W average with the X6800, so theoretically even a decent 350W will work fine. Just remember to upgrade the PSU if you ever add other components.
  • photoguy99 - Thursday, November 9, 2006 - link

    I just wanted to second that thought -

    AT articles have incredible quality and depth at this point - you guys are doing great work.

    It's actually getting embarrassing for some of your competing sites; I browsed the Tom's article and it had so much fluff and retread I had to stop.

    Please don't forget the effort is noticed and appreciated.
  • shabby - Wednesday, November 8, 2006 - link

    It wasn't mentioned in the review, but what's the purpose of the 2nd SLI connector?
  • JarredWalton - Wednesday, November 8, 2006 - link

    Page 17:

    "The dual SLI connectors are for future applications, such as daisy chaining three G80 based GPUs, much like ATI's latest CrossFire offerings."

    Using a third GPU for physics processing is another possibility, once NVIDIA begins accelerating physics on their GPUs (something that has apparently been in the works for a year or so now).
  • Missing Ghost - Wednesday, November 8, 2006 - link

    So it seems like by subtracting the highest single 8800 GTX power usage result from the highest 8800 GTX SLI result, we can conclude that the card can use as much as 205W. Does anybody know if this number could increase when the card is used in DX10 mode?
  • JarredWalton - Wednesday, November 8, 2006 - link

    Without DX10 games and an OS, we can't test it yet. Sorry.
  • JarredWalton - Wednesday, November 8, 2006 - link

    Incidentally, I would expect the added power draw in SLI comes from more than just the GPU. The CPU, RAM, and other components are likely pushed to a higher demand with SLI/CF than when running a single card. Look at FEAR as an example; here are the power differences for the various cards. (Oblivion doesn't have X1950 CF numbers, unfortunately.)

    X1950 XTX: 91.3W
    7900 GTX: 102.7W
    7950 GX2: 121.0W
    8800 GTX: 164.8W

    Notice how in this case, X1950 XTX appears to use less power than the other cards, but that's clearly not the case in single GPU configurations, as it requires more than everything besides the 8800 GTX. Here's the Prey results as well:

    X1950 XTX: 111.4W
    7900 GTX: 115.6W
    7950 GX2: 70.9W
    8800 GTX: 192.4W

    So there, GX2 looks like it is more power efficient, mostly because QSLI isn't doing any good. Anyway, simple subtraction relative to dual GPUs isn't enough to determine the actual power draw of any card. That's why we presented the power data without a lot of commentary - we need to do further research before we come to any final conclusions.
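
    To make that concrete (the numbers below are invented for illustration, not measurements from our testing):

        // Illustration of why "SLI power minus single-card power" overstates the
        // second card's own draw: the delta also contains extra CPU/RAM/chipset load.
        // All figures here are made up for the example.
        #include <cstdio>

        int main() {
            float second_card_watts   = 150.0f;  // hypothetical real draw of card #2
            float extra_platform_load = 30.0f;   // hypothetical extra CPU/RAM/chipset draw under SLI

            float measured_delta = second_card_watts + extra_platform_load;

            std::printf("measured system delta: %.0f W\n", measured_delta);     // 180 W
            std::printf("actual card draw:      %.0f W\n", second_card_watts);  // 150 W
            return 0;
        }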
  • IntelUser2000 - Wednesday, November 8, 2006 - link

    It looks like adding SLI uses about 170W more power. You can see how significant the video card is in terms of power consumption. It blows the Pentium D away by a couple of times.
  • JoKeRr - Wednesday, November 8, 2006 - link

    Well, keep in mind the efficiency of the PSU, generally around 80%, so as overall power draw increases, the marginal loss of power increases a lot as well. If you actually multiply by 0.8, it gives about 136W. I suppose the power draw is measured at the wall.
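
    As a quick sanity check of that arithmetic (assuming a flat 80% efficiency, which real PSUs only approximate across the load range):

        // Rough sketch: converting a wall-power delta into DC power actually
        // delivered to the components, assuming a flat 80% PSU efficiency.
        #include <cstdio>

        int main() {
            float wall_delta_watts = 170.0f;   // extra draw at the wall when adding SLI
            float efficiency       = 0.80f;    // assumed PSU efficiency
            float dc_delta_watts   = wall_delta_watts * efficiency;
            std::printf("~%.0f W reaches the cards and other components\n",
                        dc_delta_watts);       // ~136 W
            return 0;
        }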
  • DerekWilson - Thursday, November 9, 2006 - link

    The max TDP of G80 is 185W at most -- NVIDIA revised this to something in the 170W range, but we know it won't get over 185W in any case.

    But games generally don't enable a card to draw max power ... 3DMark, on the other hand ...
  • photoguy99 - Wednesday, November 8, 2006 - link

    Isn't 1920x1440 a resolution that almost no one uses in real life?

    Wouldn't 1920x1200 apply to many more people?

    It seems almost all 23" and 24" monitors, and many high end laptops, have 1920x1200.

    Yes we could interpolate benchmarks, but why when no one uses 1440 vertical?

  • Frallan - Saturday, November 11, 2006 - link

    Well, I have one more suggestion for a resolution. Full HD is 1920x1080 - that is sure to be found in a lot of homes in the future (after X-mas, anyone? ;0) ) on large LCDs - I believe it would be a good idea to throw that in there as well. Especially right now, since loads of people will have to decide how to spend their money. The 37" Full HD is a given, but on what system will I be gaming: PS3/Xbox/PC... Please advise.
  • JarredWalton - Wednesday, November 8, 2006 - link

    This should be the last time we use that resolution. We're moving to LCD resolutions, but Derek still did a lot of testing (all the lower resolutions) on his trusty old CRT. LOL
  • Sunrise089 - Thursday, November 9, 2006 - link

    Then I suppose he's in the market to part with an ugly old high-end CRT. I'd love to buy it from him. Seriously.
  • JarredWalton - Thursday, November 9, 2006 - link

    You want an older 21" Cornerstone CRT? It's a beast, but you can have it for the cost of shipping (which unfortunately would probably be ~$50). I'd also sell my Samsung 997DF 19" CRT for about $50, and maybe an NEC FE991-SB for $50 (which unfortunately has a scratch from my daughter in the anti-glare coating). If anyone lives in the Olympia, WA area, you know how to contact me (I hope). I'd rather someone come by to pick up any of these CRTs rather than shipping, as I don't think I have the original boxes.
  • DerekWilson - Thursday, November 9, 2006 - link

    lol next thing you know links to ebay auctions are gonna start showing up in our articles :-)
  • yyrkoon - Thursday, November 9, 2006 - link

    lol, I've got a 21" techtronics I'll sell for $200 usd, plus shipping ;) Hasnt been used since I purchased my Viewsonic VA1912wb (well, been used very little ).
  • imaheadcase - Wednesday, November 8, 2006 - link

    can't stand AA benchmarks myself :)

    Question: Do you have any info on what kind of card NVIDIA is releasing this February? Is it something in between these 2 cards or something even lower?

    I'm looking for a $300ish G80! :D
  • flexy - Wednesday, November 8, 2006 - link

    if ANYTHING counts, it's how those high-end cards perform WITH their various AA settings.... the power of those cards (and the money spent on them :) translated RIGHT into ---> IMAGE QUALITY/PERFORMANCE.

    Please don't tell me you would get a G80 but do NOT care about AA, this does NOT make any sense...sorry...

    I am especially impressed reading that transparency AA has such a LITTLE performance impact. What game engine did you test this on?

    On the older ATI cards (and am i right that T.A. is the same as "adaptive antialiasing"?)...this feature (depending on game engine) is the FPS killer....eg. w/ games like Oblivion (WHERE ARE THE GOTHIC 3 BENCHIES BTW? :)...game engines with much vegetation etc.

    Enable transparency AA and see all those trees, grass etc. without jaggies.

  • imaheadcase - Thursday, November 9, 2006 - link

    Well, lots of people don't care for AA. Even if i had this card I would not use it. I visually see NO difference with it on or off. It's personal taste. I don't even see "jaggies" on my older 9700 PRO card.
  • flexy - Thursday, November 9, 2006 - link

    you sure are talking about ANTIALIASING ???

    What resolutions do you run? Not that my CRT can even handle more than 1600x1200...but even w/ 1600 i get VERY prominent jaggies if i don't run AA.

    I made it a habit to run at least 4xAA in ANY game, and some engines (HL2: Source engine etc.) run extremely well with 4xAA; even 6xAA is very playable, at least with HL2.

    The very recent games, namely NWN2 and G3, now don't support AA; playing at 1280x1024 it looks utterly horrible! If you say you don't see jaggies at, say, ANY resolution under 1600...very hard to believe.
  • imaheadcase - Thursday, November 9, 2006 - link

    Yes, I'm talking about antialiasing. I normally play BF2 and Oblivion at 1024x768 (9700 Pro, remember).

    Fact is most people won't see them unless someone points them out. The brain is still better at rendering stuff the way you want to see it vs hardware :)
  • flexy - Thursday, November 9, 2006 - link

    ok...but then it's also a performance problem. If it doesn't bother you, well, ok.
    I also have to settle w/ the fact that many RECENT games are even unable to do AA...however i wish they would.

    But once i get an 8800 i will do &&&& to get the most out of IQ, AA, AF, transparency/texture AF, you name it. ALONE also for the reason that i would need a super-high-end monitor first to even run resolutions like 2000xsomething...and as long as i have a lame 19" CRT and CANNOT even go over 1600 (99.99% of games even running everything at 1280x or 1360x) i will use all the power to get the best possible IQ out of those low resolutions.

    Also...looking at the benchmarks...it's NOT that you lose any real-time gaming experience, since THOSE monster cards are made for exactly this...eg. running Oblivion with all those settings at MAX AND AA on and HDR...and you are still in VERY reasonable FPS ranges.
  • dwalton - Thursday, November 9, 2006 - link

    When using older cards, sacrificing IQ for performance is typically acceptable. Who needs AA when running F.E.A.R. on a 9700 Pro?

    However, on a just-launched high-end card, why would anyone feel the need to sacrifice IQ for performance? Some may say resolution over AA, but I find it hard to believe that there are a lot of gaming enthusiasts with deep pockets who play at insane resolutions yet with no AA.
  • JarredWalton - Thursday, November 9, 2006 - link

    If I look for jaggies, I see them. On most games, however, they don't bother me much at all. Running at native resolution on LCDs or at a really high resolution on CRTs, I'd take that over a lower res with 4xAA. If you have the power to enable 4xAA, great, but I'm certainly not one to suggest it's required. I'd rather be able to enable vsync without a massive performance hit (i.e. stay above 60 FPS) than worry about jaggies. Personal preference.
  • munim - Wednesday, November 8, 2006 - link

    "With the latest 1.09 patch, F.E.A.R. has gained multi-core support,"

    Where is this?
  • JarredWalton - Wednesday, November 8, 2006 - link

    I wrote that, but it may be incorrect. I'm trying to get in contact with Gary to find out if I'm just being delusional about Quad Core support. Maybe it's NDA still? Hmmm.... nothing to see here!
  • JarredWalton - Wednesday, November 8, 2006 - link

    Okay, it's the 1.08 patch, and that is what was tested. Since we didn't use a quad core CPU I don't know if it will actually help or not -- something to look at in the future.
  • Nelsieus - Wednesday, November 8, 2006 - link

    I haven't even finished reading it yet, but so far, this is the most comprehensive, in-depth review I've seen on G80 and I just wanted to mention that beforehand.

    :)
  • GhandiInstinct - Wednesday, November 8, 2006 - link

    What upcoming games will be the first to be fully made on DX10 structure? And does the G80 have full support of DX10?
  • timmiser - Thursday, November 9, 2006 - link

    Microsoft Flight Simulator X will be DX10 compliant via a planned patch once Vista comes out.
  • JarredWalton - Wednesday, November 8, 2006 - link

    All DX10 hardware will be full DX10 (see pages 2-4). As for games that will be DX10 ready, Halo 2 for Vista will be for sure. Beyond that... I don't know for sure. As we've explained a bit, DX10 will require Vista, so anything launching before Vista will likely not be DX10 compliant.
  • shabby - Wednesday, November 8, 2006 - link

    They're re-doing a DX8 game in DX10? You gotta be kidding me, what's the point? You can't polish a turd.
  • JarredWalton - Wednesday, November 8, 2006 - link

    They did the same thing with the original Halo, porting it (and slowing it down) to DX9. MS seems to think making Halo 2 Vista-only will get people to upgrade to the new OS. [:rolls eyes:]
  • stmok - Wednesday, November 8, 2006 - link

    How else are they gonna get gamers to upgrade to Vista? :)
    (by cornering them into adopting Vista, using DirectX 10.0)

    It's sad and pathetic at the same time.

    DirectX 10.0 should be a "transitional" solution... That is, it should cover both XP and Vista. This would allow people to gradually upgrade their hardware and, if they wish, to Vista. What MS is doing now is throwing everyone (developers and consumers) into the deep end and expecting them to pay for the changes. (I suspect some would be put off by this, while the majority will continue to accept it... which is unfortunate.)


    Great article BTW. Interesting to see the high-end stuff...But I doubt I can afford it in this lifetime!

    I have two questions!

    (1) Any chance of looking at a triple video card setup?
    (I saw a presentation slide which had 2 video cards in SLI, while a third showed something else on screen).

    (2) Any idea when the GF8600-series comes?
    (mainstream market solution).
  • yyrkoon - Thursday, November 9, 2006 - link

    Great, links aren't working?

    http://www.gamedev.net/reference/programming/featu...
  • yyrkoon - Thursday, November 9, 2006 - link

    http://www.gamedev.net/reference/programming/featu...

    This article was written by a friend of mine back in April after an interview with ATI. Perhaps this will clear some things up.
  • yyrkoon - Thursday, November 9, 2006 - link

    When you break all hardware/software ties to something that has been around for 4-5 years? It's not that easy making it "transitional". From a software perspective, D3D10 is not compatible with XP in the least.

    I for one, think this is a step in the right direction.
  • JarredWalton - Thursday, November 9, 2006 - link

    Supposedly all of the changes to the WDDM make porting DX10 back to Windows XP "impossible", although I'm more inclined to think the correct term would be "difficult" and you also have to add in "it doesn't fit with MS marketing protocol". WDDM is quite different in Vista however, so maybe there's some substance to the claims.
  • cosmotic - Wednesday, November 8, 2006 - link

    On page 9:

    --Briefly explain what a sub-pixel is in the sentence before--
  • JarredWalton - Wednesday, November 8, 2006 - link

    Due to the size of this article and the amount of time it took to get ready, let me preempt any comments about the spelling and grammar. I am in the process of editing the final document as I read through it, and there are spelling/grammar errors. If they bother you too much, check back in an hour. If you read this an hour from now and you still find errors, then you can respond, though it would be useful to keep all responses in a single thread like this one.

    Thanks in advance,
    Jarred Walton
    Editor
    AnandTech.com
  • xtknight - Thursday, November 16, 2006 - link

    On p 12 (gamma corrected AA):

    "This causes problems for thing like thin lines."
  • acejj26 - Wednesday, November 8, 2006 - link

    "If DirectX 10 sounds like a great boon to software developers, the fact that DX10 will only be supported in Windows XP is certain to curb enthusiasm. "

    I believe this should say "DX10 will only be supported in Windows Vista..."

    Not to be rude, but shouldn't the article be edited BEFORE being published??
  • JarredWalton - Wednesday, November 8, 2006 - link

    The text is basically complete, and minor spelling issues aren't going to change the results. Obviously, proofing 29 pages of article content is going to take some time. We felt our readers would be a lot more interested in getting the content now rather than waiting even longer for me to proof everything. I know the vast majority of readers don't bother to comment on spelling and grammar issues, but my post was to avoid the comments section turning into a bunch of short posts complaining about errors that will be corrected shortly. :)
  • Iger - Wednesday, November 8, 2006 - link

    Pff, of course we would! If I wanted to read a novel, I would find a book! Results first - proofing later... if ever :) Thanks for the article!
  • JarredWalton - Wednesday, November 8, 2006 - link

    Did I say an hour? Okay, how about I just post here when I'm done reading/editing? :)
  • JarredWalton - Wednesday, November 8, 2006 - link

    Okay, I'm done proofing/editing. If you still see errors, feel free to complain. Like I said, though, try to keep them in this thread.

    --Jarred
  • LuxFestinus - Thursday, November 9, 2006 - link

    Pg. 3 under "Unified Shaders"

    quote:

    Until now, building a GPU with unified shaders would not have desirable, let alone practical, but Shader Model 4.0 lends itself well to this approach.

    Should read as follows:
    Until now, building a GPU with unified shaders would not have *been* desirable, let alone practical, but Shader Model 4.0 lends itself well to this approach.

    Good try though. ;)
  • shabby - Wednesday, November 8, 2006 - link

    $600 for the GTX and $450 for the GTS is pretty good seeing how much they crammed into the GPU; makes you wonder why the previous gen topped 650 bucks at times.
  • dcalfine - Wednesday, November 8, 2006 - link

    How does the 8800GTX compare to the 7950GX2? Not just in FPS, but also in performance/watt?
  • dcalfine - Wednesday, November 8, 2006 - link

    Ignore ^^^
    sorry


    Hot card by the way!
  • neogodless - Wednesday, November 8, 2006 - link

    I know you touched on this, but I assume that DirectX 10 is still not available for your testing platform, Windows XP Professional SP2, and additionally no games have been released for that platform. Is this correct? If so...

    Will DirectX 10 be made available for Windows XP?
    Will you publish a new review once Vista, DirectX 10 and the new games are available?

    Can we peek into the future at all now?
  • JarredWalton - Wednesday, November 8, 2006 - link

    DX10 will be Vista only according to Microsoft. What that means according to some game developers is that DX10 support is going to be somewhat slow, and it's also going to be a major headache because for the next 3-4 years they will pretty much be required to have a DX9 rendering solution along with DX10.
  • DerekWilson - Wednesday, November 8, 2006 - link

    No DX10 for winxp -- but you've got OGL with extensions.

    We will certainly take a look at DX10 performance once we have DX10 apps.

    Have fun glimpsing :-)
