Scaling Down the X800

The basic architecture of the X700 has already been covered, as it is based on the X800's R423 core. The X700 takes it down a notch and runs with only 8 pixel pipelines at a maximum core clock speed of 475MHz on the XT version. The memory interface on the X700 series of cards is 128-bit, giving it half the memory bandwidth (on a clock-for-clock basis) of its big brothers. Oddly, this time around there are two $200 versions of the card. The X700 XT (which we are testing today) has a 475MHz core clock, 1.05GHz memory, and 128MB of RAM. The X700 Pro will launch with a 420MHz core clock, 864MHz memory, and 256MB of RAM. We are definitely interested in testing the performance characteristics of the X700 Pro to determine whether the lower clocks are able to take advantage of the larger memory to provide performance on par with its price, but for now we'll just have to settle for X700 XT numbers. ATI will also be producing an X700 card with a 400MHz core and 128MB of 700MHz memory that will retail for $149.
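
To put the narrower memory interface in perspective, here is a quick back-of-the-envelope bandwidth calculation (a minimal sketch only, using the memory clocks quoted above and assuming they are effective DDR rates; real-world throughput will be lower):

```python
# Rough peak memory bandwidth: bus width (bits) / 8 * effective memory clock.
# Assumes the quoted memory clocks are already effective (DDR) rates.

def peak_bandwidth_gb_s(bus_width_bits: int, effective_clock_mhz: float) -> float:
    return bus_width_bits / 8 * effective_clock_mhz * 1e6 / 1e9

print(peak_bandwidth_gb_s(128, 1050))  # X700 XT (128-bit @ 1.05GHz): ~16.8 GB/s
print(peak_bandwidth_gb_s(256, 1050))  # a 256-bit card at the same memory clock: ~33.6 GB/s
```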



In a very interesting twist, one of the features of R423 that didn't get scaled down was the vertex engine. The RV410 GPU retains all 6 vertex pipelines. This should give it an advantage over NVIDIA's solution in geometry-heavy environments, especially where an underpowered CPU is involved. NVIDIA opted to keep only 3 of NV40's 6 vertex pipes. Of course, NVIDIA's solution is also capable of VS3.0 (Vertex Shader 3.0) functionality, which should help if developers take advantage of those features. Vertex performance between these midrange cards will have to be an ongoing battle as more features are supported and drivers mature. We will also have to come back and examine the cards on an underpowered CPU to see if the potential advantage ATI has in a wider vertex engine manifests itself in gameplay on a midrange-priced system.
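
For a rough sense of what six vertex pipelines buy on paper, here is a simple peak vertex rate sketch (the four-clocks-per-vertex cost is an illustrative assumption for a basic transform, not a figure from either company):

```python
# Back-of-the-envelope peak vertex throughput: pipes * core clock / clocks per vertex.
# clocks_per_vertex is an assumed cost for a simple transform, for illustration only.

def peak_vertex_rate_mverts_s(vertex_pipes: int, core_mhz: float,
                              clocks_per_vertex: float = 4.0) -> float:
    return vertex_pipes * core_mhz / clocks_per_vertex

print(peak_vertex_rate_mverts_s(6, 475))  # X700 XT (6 pipes @ 475MHz): ~712 Mverts/s
print(peak_vertex_rate_mverts_s(3, 500))  # GeForce 6600 GT (3 pipes @ 500MHz): ~375 Mverts/s
```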

In addition to scaling down a few key features of the X800, ATI physically scaled down the RV410 by fabbing it on TSMC's 110nm process. This is the same process that is used in the X300 series. This isn't a surprising move, as ATI has been introducing new fab processes on cheaper parts in order to sort out any wrinkles before a brand new architecture is pushed out. NVIDIA also went with the 110nm process on this round.

Interestingly, ATI is pushing its 110 million transistors at a 25MHz lower core clock than NV43. This does give NVIDIA a fillrate advantage, though pure fillrate means very little these days. At first, we were very surprised that ATI didn't push their GPU faster. After all, RV410 is basically a cut-down R423 with a die shrink. The X800 XT runs at the same core speed as the 6600 GT (500MHz). If NVIDIA could increase its clock speed from the 6800 series to the narrower and smaller 6600 series, why couldn't ATI do the same when moving from the X800 to the X700?
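
For what it's worth, the fillrate gap is easy to quantify: peak pixel fillrate is just pipelines times core clock (a theoretical figure, assuming 8 pixel pipelines on both midrange parts; actual game performance depends far more on shaders, bandwidth, and drivers):

```python
# Theoretical peak pixel fillrate: pixel pipelines * core clock.
# Assumes 8 pixel pipelines for both midrange parts.

def peak_fillrate_gpix_s(pixel_pipes: int, core_mhz: float) -> float:
    return pixel_pipes * core_mhz / 1000.0

print(peak_fillrate_gpix_s(8, 475))  # X700 XT @ 475MHz: 3.8 Gpixels/s
print(peak_fillrate_gpix_s(8, 500))  # GeForce 6600 GT @ 500MHz: 4.0 Gpixels/s
```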

In asking ATI about this, we got quite a few answers, the combination of which generally satisfied us. First, ATI's 130nm process is a low-k process. This means that there is less capacitance between metal layers on the die, which gives ATI's GPUs added stability at higher core speeds. Shrinking to a 110nm process and losing the low-k advantage makes this different from a simple die shrink. Also, ATI's X800 series of cards are supplied with external power, while the X700 relies only on the PCIe slot for juice. This could easily explain why the X700 XT is clocked lower than the X800 XT, and RV410's extra vertex processing power could help explain why NVIDIA was able to hit a higher clock speed than ATI. We could be wrong, but without knowing both companies' yield rates and power consumption we really can't say for sure why things ended up this way.

The X700 retains the video processing features of the X800 as well, and we are starting to see both ATI and NVIDIA gearing up for some exciting things in video. Additionally, ATI has stated that the X700 is capable of running stably with passive cooling in a BTX system (albeit with a massive passive heatsink). This, of course, is something the X800 cannot do. ATI has said that initial BTX versions of their cards will be OEM-only, as that is where BTX will make its first mark.

Now that we know a little more about the X700, let's take a look at the numbers.

40 Comments

  • ThePlagiarmaster - Friday, September 24, 2004 - link

    #39 ianwhthse

    Very funny stuff. OK, I don't want a Porsche now...LOL. I think I get it. If you own a Porsche, you spend so much time working to pay for it that you're too worn out to have sex? :)

    I get your point on the obvious. I guess I just like to show someone a printout to go along with the stuff I tell them. They seem to like it better when they can SEE it also. Years of dealing with mechanics and appliance repair people (and bad PC people, I guess) have taught most people who step into a PC store that the guy selling them anything must be lying or stupid. Which is true in a lot of cases. Can't really blame them for being skeptics. Then again, some people are just VISUAL and can't seem to get it when the same thing is SAID to them instead of shown to them.

    In any case, I hope they do another review soon with more benches.
  • ianwhthse - Thursday, September 23, 2004 - link

    #31, ThePlagiarmaster:

    Actually, check this out.

    http://www2.autospies.com/article/index.asp?articl...

    But maybe that's just because you have to be rich to afford one. Then you're old.

    #38, (also ThePlagiarmaster) they were given limited time with the card. With a limited number of cards to pass around and a deadline to meet, ATi didn't just give them the card and say, "send it back when you're done." In such a situation, Anandtech would need to try to do the most important tests. And which would we rather see? A comparison where every card has the exact same framerates ±0.1 frames? Or benches run in a fashion where we can actually TELL which card is better?

    I totally understand where you're coming from. I just moved from 10x7 up to 12x10 on my 19 incher only about 3 months ago myself, but you need to face facts too. You're a smart person who obviously has a lot of experience with computers, and like you said, we influence the people we talk to. So tell them the obvious point you brought up: that a weaker CPU is gonna limit how fast the GPU will be (to an extent). We all know that here. Most of us keep that actively in mind when we're reading the graphs. We can figure that out (though not to an exact amount) on our own. It's more important (especially for a limited-time initial review) that we find out what the graphics card is capable of.

    I’m sure once retail boards start hitting shelves at a store near you, there will be plenty of “real-world” tests that will check things out at the level you’re talking about, but you can’t expect them to do what you’re asking for in an initial review of a reference card.
  • ThePlagiarmaster - Thursday, September 23, 2004 - link

    #37 blckgrffn

    ROFL. Depends on your video card AND your monitor. Mine has GREAT text at 120Hz@1024x768. If you'd have read my post you'd already know I said it goes to 150 but looks like crap. Do you honestly think I wouldn't know to try other refresh rates? LOL. I'm an A+ certified PC Tech (among other certs, this is the relevant one here) and own a PC business. I've owned a computer of some sort since the Apple // (back in the green/amber monitor days). I'd say you don't know that many people if everyone you know runs above 1024x768. I, however, SELL these things every day.

    I didn't say anandtech shouldn't publish the numbers they did. I merely suggested they add one resolution to the picture. If the world was using higher resolutions as a whole, the web would be optimized for it. But it's not, IS IT? Apparently you don't play many games online. At 1600x1200 a LOT of games out there can't run without tanking. Even without being online, a top end machine can't cut it at 1600x1200 (with all the candy turned on) in EVERY game out there as you suggest. Your experience would definitely not be described as BUTTER SMOOTH. I'd further think you don't play that many games, or you're just full of crap. Unless everything in your machine is overclocked you don't have a prayer of doing what you say. Congrats, you own a machine that the rest of us haven't figured out how to create yet.

    If you don't like people talking back in a comments forum then leave. I merely responded to people who replied to my posts. Discussing the review is what this is all about, or did you not get that? What makes you think your favorite res is more important than mine (and the rest of the world's)? Jeez, do you actually think the world revolves around you? Do you actually think most of the world owns the top cpu/gpu in their machines? Get real. I'd venture to guess that less than 5% of anandtech readers own both the top cpu (amd or intel) and the top gpu (Nvidia or Ati). Apparently you have no idea that the "middle class" even exists in our society. Sales of Intel/AMD's top cpus say I'm right. They're so low they won't even tell us how many they sold in a quarterly report.
  • blckgrffn - Thursday, September 23, 2004 - link

    ThePlagiarmaster, ENOUGH. We get your argument. I have a 21" and I NEVER play a game under 1600*1200. When the rig can't handle it anymore, it is time to upgrade, end of story. This is why I have a PC and not a console. I think that everyone I know plays at higher than 1024*768; even my dad on his 17" plays @ 1152*864. Thank you Anandtech for publishing numbers that I can use. The 1024*768 numbers are useless for me, just as the higher res ones are useless for ThePlagiarmaster. By the way, running ultra high refresh rates tends to make text more blurry than it would be at 75 or 80 hertz. Try it once and you will see what I mean. Try setting your desktop to 1152*864; you will probably like that too.
  • Staples - Thursday, September 23, 2004 - link

    I am kind of worried about Xbox 2. ATI is doing a horrible job of putting a good price/performance card out there.
  • ThePlagiarmaster - Thursday, September 23, 2004 - link

    #32 AtaStrumf

    Actually, I have 20/15 vision with almost 20/10 in my left eye. Lasik surgery is great; for $3000 I can almost see like an eagle. If you'd read my post you'd already know I don't enjoy headaches. As such I run at 120Hz. My Viewsonic P225F will actually run at 150Hz@1024x768 but my video card doesn't handle that too well. This is another excuse to run at a lower res (no, not 800x600 or 640x480 mind you): you get a really high refresh rate. Most 19's can't handle 100Hz at 1600x1200. Like TrogdorJW said, there isn't much difference in the look above 1024x768. Can you really see the difference at 1280x1024? In most games I can't. I'd rather have a super high refresh rate and never see that dip in fps that happens in some games when the action really heats up.


    Anandtech readers encompass a small number of people. However, we advise many people (as is the case with my customers). If I sell someone a $500 video card and it runs no faster than their neighbor's $200 card (because the guy is CPU limited in his games), I look like a fool or worse, get bitched out. Sadly a lot of people do buy 17's, and sometimes with a $200 vid card or more. I'd like to have a better idea of where the cpu/gpu line is drawn in the sand.

    I'm not saying throw away the high end benchmarks. Just saying I'd like to see the res a HUGE portion of the population runs in tested. Change your res to 1600x1200 and look at anandtech's website (or any other for that matter). See those huge bars of wasted space on the sides? Why does anandtech (and everyone else) optimize for 1024x768? Because 90% of the world runs in this res! On top of this, most don't like switching to a different res for each game depending on the fps they can get in each. It's just a pain in the A$$.

    PrinceGaz

    I agree CRT's are superior. But I don't agree 1600x1200 is the best res on a 19 or a 21 (well, 21 maybe, but only if you're using graphics apps or CAD-type apps where higher res is VERY important and useful). You really like browsing the web while losing 1/3 of your screen? I take it you don't mind switching res all day (I highly doubt you browse that high). Most cards can't cut it at 1600x1200 without major frame hits (only the latest and greatest, and even then you'll switch to lower res often). The TI4200 (I have a 4400 in one machine) certainly is a card where you must be switching all day long on a game by game basis. That gets old to me, and I wouldn't even want to go there with my PC ILLITERATE customers (as #32 called them - perhaps rightly so).

    Perhaps you're happy switching, and I'm not trying to take that away from you here. I'd just like to see a res that benefits recommendations to the average user (the largest population of PC users, that is). Is a hardware review site supposed to cater to the top 5% of the population, or the other 95% of the world? Don't get me wrong, I love knowing what the advantage at the high cpu/high gpu end is, but I don't get the opportunity to recommend that stuff very often. Do game designers make their games for the top 5% of pc users or aim them at the masses? Witness the popularity (still! ouch) of GeForce4 MX cards and you'll see my point. I'm not asking them to throw out the high-end stuff, nor to add 640x480 or 800x600 (ugh!). But 1024x768 is NORMAL. Of all the people you know, how many have 19's or bigger? If it's more than 50%, you apparently have a small circle of affluent people. Again, not all that normal. While high res can be argued on large monitors, I'd argue right back that most monitors sold are 17in or smaller. The percentages just don't lie.
  • PrinceGaz - Thursday, September 23, 2004 - link

    #30- The Plagiarmaster

    Actually I *don't* have an LCD monitor myself; I was just saying that many people do. My main monitor is a 22" CRT and I would never consider exchanging it for even a top-of-the-range 20" LCD (same viewable area), as I feel CRTs are superior.

    As #32 said, anyone who buys a 19" or worse still a 21" and only uses it at 1024x768 is nuts. 1600x1200 is usually the best resolution for 21/22" CRTs, and 1280x960 or 1280x1024 for 19" CRTs.

    I generally play recent games at 1280x960, or 1280x1024 if that is all that is offered, but do sometimes need to drop that to 1024x768, and even 800x600 for Doom 3 as that is all my Ti4200 can manage. No point my upgrading it as I'm about to build a PCI-e system. In older games I play at 1600x1200 if it is available and it looks great. If not available I play at the highest resolution offered and crank up the AA. There is no point playing at a lower resolution if your card and monitor are capable of working well at a higher resolution.

    #33- TrogdorJW

    I assume you use an ATI rather than an nVidia card then? If you do use an nVidia card, then there's an option in the drivers (since 55.xx I believe) under nView Display Modes -> Device Settings button -> Device adjustments... -> Display Timing tab, where you can tick 'Enable doublescan for lower resolution modes'. For me that makes 800x600 scan just like 1600x1200, and 640x480 is like 1280x960. They look *far* better with doublescan enabled than without on my 22" CRT. It just extends what is done normally at 512x384 to higher resolutions. For me, 1024x768 is unaffected by it because I choose a high refresh rate (well above what my monitor or card could do at 2048x1536).

    If ATI don't have that option available, then they should add it, as it can't be very difficult to implement. Like I say, the drivers do it anyway at up to 512x384, so it's just a case of extending it.
  • TrogdorJW - Wednesday, September 22, 2004 - link

    32 - Hey, back off the 21" users! ;) I have a 21" monitor that I routinely use at 1024x768 in games. The difference between that and 1280x960/1024 is not that great, and 1600x1200 is really still too small for my liking. Performance is also an issue. If I can run 1280x1024 at good frame rates, I will, but I also have no problem running 1024x768 where required. 800x600 and lower, of course, are a different story. I start to see horizontal lines on my monitor at those resolutions.

    Anyway, the nice thing is that a $200 card is coming out that will have about the same performance as the 9800 Pro, and in some cases better performance. Hmmm... but my 9800 Pro cost $200 back in April. Heheh. The added features might be nice, but I'm not that concerned. If you want a 6600GT or X700XT in AGP flavor, the 9800 Pro is still a viable option if you can find it for $200 or less. JMHO.
  • AtaStrumf - Wednesday, September 22, 2004 - link

    ThePlagiarmaster, enough with the 1024x768 rant.

    You made your point, but you're forgetting that most of your computer illiterate customers are not reading this site.

    People who buy 21" monitors to run them at 1024x768 must have a few screws loose in their heads or are suffering from a serious vision impairment. I suppose you also run it at 50 Hz or something like that.

    Anyho' I bet most AT readers run at least 1280x1024 on a 19" monitor and that includes their games.

    And anyway, if a customer refuses to part with $50 in return for a much better monitor, what makes you think they will surrender $200 for a graphics card???

    They deserve Intel's extremely sh*tty integrated graphics engine and nothing else.
  • ThePlagiarmaster - Wednesday, September 22, 2004 - link

    #30 PrinceGaz

    So what you're saying is, everyone buys LCD's? NO? So everyone buys laptops to play games then? Neither of these is true. Most laptops are sold to business users. It's a rare person who specifically buys a laptop for games. LCD's are too expensive in large sizes (I know I don't sell many), and they suck for games anyway. Only a FEW can run fast enough to play games without giving you headaches (eye aches? whatever). I hate BLUR. 1280x960 is not common. Unless you think a ton of people have widescreen LCD's at home (or widescreen laptops?) and bought them to play games?

    Apparently you missed most of the point of the post. Is it worth upgrading from older cards at a normal resolution (meaning 1024x768; do a poll, you'll find most run here)? Most people buy these things then wonder why they don't run much faster. With which GPUs are we CPU limited at 1024x768? Throwing the 9700 Pro (and lower) into these super high resolutions might make someone (eh, a lot of people) think their card is junk and that an upgrade to one of these will solve all problems. NOT... If you tossed a few 1024x768 tests in, someone might find they're CPU limited with their current card already. Tossing in an even more powerful card is pointless for these people. Too bad they wouldn't figure that out in a review such as this.

    Why do you think people used to always run in 640x480 when testing cpus (which I hated, it isn't real-world)? Because some games are/were GPU limited above this. In order to eliminate the GPU they would test at 640x480. So yea, running in a lower resolution will sometimes let your cpu run wild (eh, produce high fps :) ). The point was, we're pretty much seeing the highest of both ends here. How does that help someone trying to figure out if a $200 card will help them get more fps? Look at #29's question.

    I have a 21in and a 19in; both run at 1024x768. My dad has a 21in he runs at the same res. Most web pages are designed for this res, and a lot of games are too. So most people run at this res. A Porsche is designed to do about 150+mph, but do you see anyone doing that on the highway? No, but that doesn't mean getting from 0-60 is any less fun now, does it? Even though you don't run it at 150mph it still gets the women, doesn't it? Not too many high performance cars advertise their top speed. Why? Because nobody uses it anyway.

    PC's weren't designed to play games. But some of them sure are fun today eh?

    #28, I know that, you know that, but most of the world still saves a buck or two on the monitor. As much as I push 19inchers, people are just CHEAP. I still sell a good number of 15's! Even when I tell them a decent 19 would only cost them $50 more and that they'll have to live with it at 15in for the next 5 yrs or so. Even on my 21 I don't see how 1024x768 is tunnel vision though. The web is great, pics are fine, I don't have to screw with font sizes all the time to get some things right, and game interfaces are ALL designed to make 1024x768 LOOK perfect. They may design for others also, but they make sure this res works, as it's the most used.
