The Chip

R420 is a very powerful GPU in a tight little package. ATI opted not to include full DirectX 9.0 Shader Model 3.0 support in its latest GPU, but that doesn't mean this chip doesn't pack a punch. Here's a breakdown of how the current top-of-the-line parts stack up.

                     NV38      NV40      R360      R420
  Transistors        130M      222M      110M      160M
  Core Clock         475MHz    400MHz    412MHz    500MHz
  Memory Clock       950MHz    1.1GHz    900MHz    1.12GHz
  Memory Bus         256-bit   256-bit   256-bit   256-bit
  Vertex Pipelines   ~4        6         4         6
  Pixel Pipelines    4x2       16x1      8x1       16x1
  Shader Model       2.0+      3.0       2.0       2.0+
  Fab Process        130nm     130nm     150nm     130nm

  Card                  Price
  GeForce 6800 GT       $399
  GeForce 6800 Ultra    $499
  GeForce 6850 Ultra    $499+
  Radeon X800 Pro       $399
  Radeon X800 XT PE     $499

The first thing we see is that R420 has the highest clock speed (giving it the highest peak fillrate), and it just edges out NV40 for memory speed. Of course, these theoretical numbers don't really translate directly into performance. In order to understand where performance comes from, we'll need to take a much closer look at the architecture.
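The fillrate and bandwidth claims above are easy to sanity-check with the usual back-of-envelope formulas. The sketch below uses the figures from the spec table; the function names and the dictionary layout are our own, not anything ATI or NVIDIA publishes. Note that NV38's 4x2 configuration means 4 pixel pipes with 2 texture units each, so only the 4 counts toward pixel fillrate.

```python
def peak_fillrate_mpix(core_mhz, pixel_pipes):
    """Peak pixel fillrate in Mpixels/s: one pixel per pipe per clock."""
    return core_mhz * pixel_pipes

def mem_bandwidth_gb(mem_mhz_effective, bus_bits):
    """Peak memory bandwidth in GB/s: effective clock times bus width in bytes."""
    return mem_mhz_effective * (bus_bits / 8) / 1000

# (core MHz, pixel pipes, effective memory MHz, bus width in bits)
chips = {
    "NV38": (475, 4, 950, 256),
    "NV40": (400, 16, 1100, 256),
    "R360": (412, 8, 900, 256),
    "R420": (500, 16, 1120, 256),
}

for name, (core, pipes, mem, bus) in chips.items():
    print(f"{name}: {peak_fillrate_mpix(core, pipes) / 1000:.1f} Gpix/s, "
          f"{mem_bandwidth_gb(mem, bus):.1f} GB/s")
```

Running this shows exactly the gap the table implies: R420's 500MHz across 16 pipes yields 8.0 Gpixels/s peak against NV40's 6.4, with memory bandwidth nearly identical (35.8 vs. 35.2 GB/s).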

Before we get in over our heads on this, it is important to differentiate the hardware itself from how the hardware looks in terms of a graphics API. Both NVIDIA and ATI, in presenting their hardware to us, have relied heavily on using the constructs of DirectX 9 to explain what's going on at different stages in the pipeline. This is useful in that we can understand how the hardware looks to the software, but there are some caveats. We will be keeping this in mind as we look over the new offerings from ATI and NVIDIA.

95 Comments

  • l3ored - Tuesday, May 4, 2004 - link

    only the 800xt was winning, the pro usually came after the 6800's
  • Keeksy - Tuesday, May 4, 2004 - link

    Yeah, it is funny how ATi excels in DirectX, yet loses in the OpenGL benchmarks. Looks like I'm going to have both an NVIDIA and an ATi card. The first to play Doom3, the other to play HL2.
  • peroni - Tuesday, May 4, 2004 - link

    I wish there was some testing done with overclocking.

    There are quite a few spelling errors in there Derek.

    Did I miss something, or was there no mention of prices for these 2 cards?
  • Glitchny - Tuesday, May 4, 2004 - link

    #11 thats what everyone thought when Nvidia bought all the people from 3dFX and look what happened with that.
  • araczynski - Tuesday, May 4, 2004 - link

    i agree with 5 and 10, still the same old stalemate as before, one is good at one thing, the other is good at another. i guess i'll let price dictate my next purchase.

    but ati sure did take the wind out of nvidia's sails with these numbers.

    i wish one of the two would buy the other one out and combine the technologies, one would think they would have a nice product in the end.
  • eBauer - Tuesday, May 4, 2004 - link

    #8 - OpenGL still kicks butt on the nVidia boards. Think of all the Doom3 fans that will buy the 6800's....

    As for myself, I will wait and see how the prices pan out. For now leaning on the X800.
  • ViRGE - Tuesday, May 4, 2004 - link

    ...On the virge of ATI's R420 GPU launch...

    Derek, I'm so touched that you thought of me. ;)
  • Tallon - Tuesday, May 4, 2004 - link

    Ok, so let's review. With the x800XT having better image quality, better framerates, only taking up one slot for cooling and STILL being cooler, and only needing one molex connector (uses less power than the 9800 XT, actually), who in their right mind would choose a 6800u over this x800XT? I mean, seriously, NVIDIA is scrambling to release a 6850u now which is exactly identical to a 6800u, it's just overclocked (which means more power and higher temperatures). This is ridiculous. ATI is king.
  • noxipoo - Tuesday, May 4, 2004 - link

    ATi wins again.
  • Akaz1976 - Tuesday, May 4, 2004 - link

    Dang! On one hand, I am saddened by the review. My recently purchased (last month) Radeon9800PRO would be at the bottom of the chart in most of the tests carried out in this review :(

    On the other hand, this sure bodes well for my next vid card upgrade. Even if it is a few months off! :)

    Akaz
