Image Quality

The first issue that we will address is trilinear and anisotropic filtering quality. All three architectures support at least 8:1 anisotropic sampling, with ATI and NVIDIA including 16:1 support. We used the D3D AF Tester to examine the trilinear and anisotropic quality of each card, and found quite a few interesting behaviors. NVIDIA does the least amount of pure trilinear filtering, opting for a "brilinear" method that falls back to bilinear filtering over most of each mip level and applies full trilinear blending only near mip-level transitions. ATI's trilinear filtering seems a bit noisy, especially when anisotropic filtering is enabled. 3Dlabs does excellent trilinear filtering, but its anisotropic filtering algorithm is only really applied to surfaces oriented near the horizontal or vertical.
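To make the "brilinear" idea concrete, here is a minimal sketch of how such a blend weight could differ from a true trilinear one. This is our own simplification, not NVIDIA's actual hardware logic, and the `band` parameter is a hypothetical knob controlling how wide the blending zone around each mip transition is:

```python
def trilinear_weight(lod):
    """True trilinear: the blend factor between adjacent mip levels
    is simply the fractional part of the level-of-detail value."""
    return lod - int(lod)

def brilinear_weight(lod, band=0.25):
    """'Brilinear': the weight is pinned to 0 or 1 (pure bilinear, one
    mip lookup) over most of each mip level, with trilinear blending
    confined to a narrow band around the halfway transition point."""
    f = lod - int(lod)
    lo, hi = 0.5 - band, 0.5 + band
    if f <= lo:
        return 0.0               # sample only the nearer mip level
    if f >= hi:
        return 1.0               # sample only the next mip level
    return (f - lo) / (hi - lo)  # blend inside the transition band
```

With a narrow band, most pixels need only one mip lookup instead of two, which is where both the performance win and the visible banding come from.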

Of course, pictures are worth a thousand words:



This is the 3Dlabs card with 8xAF applied.



This is the ATI card with 16xAF applied.



This is the NVIDIA card with 16xAF applied.


Anisotropic filtering is employed less in professional applications than in games, but trilinear filtering is still very important. Since the key goal of trilinear filtering is to hide transitions between mip-map levels, and the NVIDIA card accomplishes this, we don't feel that this is a very large blow against the Quadro line. Of course, we would like to have the option of enabling or disabling this optimization in the Quadro driver as we do in the consumer level driver. In fact, the option seems even more important here, and we wonder why it is missing.

On the application side, we were able to use the SPECapc benchmarks to compare image quality between the cards, including under the custom drivers. First, we'll take a look at line AA quality. Looking at one of the images captured from the 3dsmax APC test, we can easily compare the quality of line antialiasing between all three cards. Looking at the diagonal lines framing the camera's view volume, we can see that ATI does a better job of smoothing lines in general than either of the other two GPUs. These same lines look very similar on the NVIDIA and 3Dlabs implementations. Upon closer examination, however, the Quadro FX 4000 presents an advantage: horizontal and vertical lines have slightly less weight than on the other two architectures, which helps keep complex wireframe images from getting muddy. Take a look at what we're talking about:



The Wildcat Realizm 200 with line antialiasing under 3dsmax 6.



The Quadro FX 4000 with line antialiasing under 3dsmax 6.



The FireGL X3-256 with line antialiasing under 3dsmax 6.


We only noticed one difference between the capabilities of the cards when looking at either standard OpenGL or the custom drivers. It seems that the 3Dlabs card is unable to support stipple patterns for lines (either that, or it ignores the hint from 3dsmax). Here's a screenshot of the resulting image, again from the 3dsmax APC test (the sub-object edges test).



The Quadro FX 4000 line stipple mask under 3dsmax 6.



The FireGL X3-256 line stipple mask under 3dsmax 6.



The Wildcat Realizm 200 line stipple mask under 3dsmax 6.


The Quadro FX 4000 gets big quality points for its line stippling. Line stipple is not a very widely used feature of OpenGL, but the fact that the 3Dlabs card doesn't even make an attempt (and the FireGL X3's support is quite pathetic) is not what we want to see at all. This is especially true in light of the fact that both of our consumer level cards were able to put out images with the same quality as the ATI workstation class card under the D3D driver.
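For reference, OpenGL's line stipple (set with `glLineStipple(factor, pattern)`) walks a 16-bit pattern along the line, least significant bit first, repeating each bit `factor` times; a fragment is drawn only when its bit is set. A small sketch of that rule:

```python
def stipple_mask(pattern, factor, num_fragments):
    """Per the OpenGL line stipple rule: fragment i along the line is
    drawn iff bit ((i // factor) mod 16) of the 16-bit pattern is set,
    counting from the least significant bit."""
    return [(pattern >> ((i // factor) % 16)) & 1
            for i in range(num_fragments)]

# Pattern 0x00FF with factor 1: 8 drawn fragments, then 8 skipped.
dashed = stipple_mask(0x00FF, 1, 16)
```

A driver that ignores the pattern behaves as if every bit were set, producing the solid lines seen in the Wildcat screenshot.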

Moving on to shader quality, we would like to mention again that GLSL shader quality on the 3Dlabs part is top notch. Since we don't have an equivalent to ShaderMark in the GLSL world, we'll only take a look at HLSL shader support.

For ATI, 3Dlabs, and NVIDIA, we ran in ps_2_b, ps_2_a, and ps_3_0 mode, respectively. Taking a look at shader 15 from ShaderMark v2.1, you'll notice that ATI and NVIDIA render the image slightly differently, but there is a bit of quantization evident in the 3Dlabs image. This type of error was apparent in multiple shaders (though there were plenty that looked clean).
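The banding we describe is characteristic of intermediate shader values being held at reduced precision. As a toy illustration (not a claim about the Realizm's actual internal formats), quantizing a smooth 0-to-1 ramp to a limited number of levels collapses it into visible steps:

```python
def quantize(x, bits):
    """Round x (in [0, 1]) to the nearest of 2**bits evenly spaced levels."""
    levels = (1 << bits) - 1
    return round(x * levels) / levels

# A smooth ramp sampled at 1001 points keeps only 16 distinct
# values at 4 bits of precision, versus 256 at 8 bits.
ramp = [i / 1000 for i in range(1001)]
coarse = sorted({quantize(x, 4) for x in ramp})
```

The fewer distinct output levels survive, the more clearly the steps read as bands in a smooth gradient.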



Quadro FX 4000



FireGL X3-256



Wildcat Realizm 200


We really do hope that through driver revisions and pushing further into the Microsoft and DirectX arena, 3Dlabs can bring their HLSL support up to the level of their GLSL rendering quality.


  • Jeanlou - Thursday, December 1, 2005 - link

    Hello,
    I just bumped into AnandTech Video Card Tests, and I'm really impressed !

    As a Belgian Vision Systems Integration Consultant (since 1979), I'm very interested in the ability to compare these 3 cards (Realizm 200 vs FireGL X3 256 vs NVIDIA Quadro FX 4000).

    I just had a bad experience with the Realizm 200 (!)

    On an ASUS NCCH-DL motherboard, Dual Xeon 2.8GHz, 2GB DDR 400, Seagate SCSI Ultra 320 HDD, 2 EIZO monitors (Monitor N°1 = L985EX at 1600x1200 px, Monitor N°2 = L565 at 1280x1024 px), Windows XP Pro SP2 x32bit partition C:\ 16GB, Windows XP Pro x64bit edition partition D:\ 16GB, plus extended partitions (2 logical, E:\ and F:\). All NTFS.

    Using the main monitor for image analysis (quality control) and the slave monitor for tools, I was unable to get a stable image at 1600 by 1200 pixels, while the Wildcat4 7110, or even the VP990 Pro, has a very stable screen at maximum resolution. But the 7110 and the VP990 Pro don't have drivers for Windows XP x64bit.

    Tried everything, latest BIOS, latest chipset drivers...
    Even 3Dlabs was unable to give me the necessary support, and they do not answer anymore!

    As soon as I reduced the resolution of the main monitor to 1280 by 1024, everything was stable, but that's not what I want; I need the maximum resolution on the main monitor.

    The specs from 3Dlabs' resolution table give 3840 by 2400 pixels maximum!

    I sent it back, and I'm looking for another card.

    I wonder if the FireGL X3 256 will do the job?
    We also use another monitor from EIZO (S2410W) with 1920 by 1200 pixels!
    What exactly are the resolutions possible with the FireGL X3 256 using 2 monitors? I cannot find it in the specs.

    Any comment will be appreciated,

    Best regards,
    Jean
  • kaissa - Sunday, February 20, 2005 - link

    Excellent article. I hope that you make workstation graphics card comparisons a regular article. How about an article on workstation notebooks? Thanks a lot.
  • laverdir - Thursday, December 30, 2004 - link

    dear derek wilson,

    could you tell us how big the performance
    difference is between NUMA and UMA in general
    in these tests..

    and it would be great if you could post maya
    related results for the quadro 4k with NUMA enabled..


    seasonal greetings
  • RedNight - Tuesday, December 28, 2004 - link

    This is the best workstation graphics card review I have read in ages. Not only does it present the positives and negatives of each of the principal cards in question, it presents them in relationship to high end mainstream cards and thereby helps many, including myself, understand the real differences in performance. Also, by innovatively including AutoCAD and gaming tests, one gets a clear indication of when the workstation cards are necessary and when they would be a waste of money. Thanks
  • DerekWilson - Monday, December 27, 2004 - link

    Dubb,

    Thanks for letting us know about that one :-) We'll have to have a nice long talk with NV's workstation team about what exactly is going on there. They very strongly gave us the impression that the featureset wasn't present on GeForce cards.

    #19, NUMA was disabled because most people running a workstation with 4 or fewer GB of RAM on a 32-bit machine will not be running with the PAE kernel installed. We wanted to test with a setup most people would be running under the circumstances. We will test NUMA capabilities in the future.

    #20,

    When we test workstation CPU performance or system performance, POVRay will be a possible inclusion. Thanks for the suggestion.

    Derek Wilson
  • mbhame - Sunday, December 26, 2004 - link

    Please include POVRay benchies in Workstation tests.
  • Myrandex - Saturday, December 25, 2004 - link

    I wonder why NUMA was fully supported but yet disabled. Maybe instabilities or something.
  • Dubb - Friday, December 24, 2004 - link

    http://newbietech.net/eng/qtoq/index.php

    http://forums.guru3d.com/showthread.php?s=2347485b...
  • Dubb - Friday, December 24, 2004 - link

    uhhh.. my softquadro'd 5900 ultra begs to differ. as would all the 6800 > qfx4000 mods being done by people on guru3d's rivatuner forum.

    I thought you guys knew that just because NVIDIA says something doesn't mean it's true?

    they must consider "physically different silicon" to be "we moved a resistor or two"...
  • DerekWilson - Friday, December 24, 2004 - link

    By high end features, I wasn't talking about texturing or programmatic vertex or fragment shading (which is high end in the consumer space).

    I was rather talking about hardware support for: AA lines and points, overlay plane support, two-sided lighting (fixed function path), logic operations, fast pixel read-back speeds, and dual 10-bit 400MHz RAMDACs and 2 dual-link DVI-I connectors supporting 3840x2400 on a single display (the IBM T221 comes to mind).

    There are other features, but these are key. In products like Maya and 3D Studio, not having overlay plane support creates an absolutely noticeable performance hit. It really does depend on how you push the cards. We do prefer the in-application benchmarks to SPECviewperf. Even the SPECapc tests can give a better feel for where things will fall -- because the entire system is a factor rather than just the gfx card and CPU.

    #14, Dubb -- I hate to be the one to tell you this -- GeForce and Quadro are physically different silicon now (NV40 and NV40GL). AFAIK, ever since the GF4/Quadro4, it has been impossible to SoftQuadro an NVIDIA card. The Quadro team uses the GeForce as its base core, but then adds on workstation class features.
