ATI's New Leader in Graphics Performance: The Radeon X1900 Series
by Derek Wilson & Josh Venning on January 24, 2006 12:00 PM EST- Posted in
- GPUs
Hardware Features and Test Setup
We're talking about features and tests today because we are going to be trying something a bit different this time around. In addition to our standard noAA/4xAA tests (both of which always have 8xAF enabled), we are including a performance test at maximal image quality on each architecture. This won't give us directly comparable numbers in terms of performance, but it will give us an idea of playability at maximum quality.
These days, we are running out of ways to push our performance tests. Plenty of games out there are CPU limited, and for what purpose is a card as powerful as an X1900 XTX or 7800 GTX 512 purchased except to be pushed to its limit and beyond? Certainly, a very interesting route would be for us to purchase a few Apple Cinema Displays and possibly an old IBM T221 and go insane with resolution. And maybe we will at some point. But for now, most people don't have 30" displays (though the increasing power of today's graphics cards is certainly a compelling argument for such an investment). For now, people can push their high-end cards by enabling insane features and getting the absolute maximum eye candy possible out of all their games. Flight and space sim nuts now have angle-independent anisotropic filtering on ATI hardware, adaptive antialiasing helps in games with lots of fences, wires, and tiny detail work, and 6xAA combined with 16xAF means you'll almost never have to look at a blurry texture with jagged edges again. It all comes at a price, of course, but is it worth it?
In our max quality tests, we will compare ATI parts with 16xAF, 6xAA, adaptive AA, high quality AF, and as little Catalyst A.I. as possible enabled to NVIDIA parts with 16xAF, 4x or 8xS AA (depending on reasonable support in the application), transparency AA, and no optimizations (high quality) enabled. In all cases, ATI will have the image quality advantage with angle-independent AF and 6x MSAA. Some games with in-game AA settings didn't have an option for 8xAA and didn't play well when we forced it in the driver, so we opted to go with the highest in-game AA setting most of the time (which usually corresponds to the highest MSAA level supported in hardware). We tend to like NVIDIA's transparency SSAA a little better than ATI's adaptive AA, but that may just come down to opinion, and it still doesn't make up for the quality advantages the X1900 holds over the 7800 GTX lineup.
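The two max-quality configurations described above can be summarized side by side. The sketch below is only an illustrative tabulation of the settings named in this article; the dictionary keys are descriptive labels, not actual Catalyst or ForceWare setting names.

```python
# Illustrative summary of the max-quality test settings described above.
# Keys are descriptive labels, not real driver API names.
max_quality = {
    "ATI Radeon X1900": {
        "anisotropic_filtering": "16x (angle-independent, high quality)",
        "antialiasing": "6x MSAA + adaptive AA",
        "optimizations": "Catalyst A.I. minimized",
    },
    "NVIDIA 7800 GTX": {
        "anisotropic_filtering": "16x",
        "antialiasing": "4x or 8xS + transparency AA (per game)",
        "optimizations": "disabled (high quality mode)",
    },
}

for card, settings in max_quality.items():
    print(card)
    for name, value in settings.items():
        print(f"  {name}: {value}")
```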
Our standard tests should look pretty familiar, and here is all the test hardware we used. Multiple systems were required in order to test both CrossFire and SLI, but all single card tests were performed on the ATI reference RD480 board.
ATI Radeon Express 200 based system
NVIDIA nForce 4 based system
AMD Athlon 64 FX-57
2x 1GB DDR400 2:3:2:8
120 GB Seagate 7200.7 HD
600 W OCZ PowerStream PSU
First up is our apples-to-apples testing, with NVIDIA and ATI set up to produce comparable image quality with 8xAF and either no AA or 4xAA. The resolutions we will look at range from 1280x960 (or 1280x1024) through 2048x1536.
120 Comments
bob4432 - Thursday, January 26, 2006 - link
Good for ATI, after some issues in the not so distant past it looks like the pendulum has swung back in their direction. I really like this, it should drop the 7800GT prices down maybe to the ~$200-$220 (hoping, as nvidia want to keep the market hold...) which would actually give me a reason to switch to some flavor of pci-e based m/b, but being display limited @ 1280x1024 with a lcd, my x800xtpe is still chugging along nicely :)
Spoelie - Thursday, January 26, 2006 - link
it won't, they're in a different price range altogether, prices on those cards will not drop before ati brings out a capable competitor to it.
neweggster - Thursday, January 26, 2006 - link
How hard would it be for this new series of cards by ATI to be optimized for all benchmarking softwares? Well ask yourself that, I just got done talking to a buddy of mine who's working out at MSI. I swear I freaked out when he said that ATI is using an advantage they found by optimizing the new R580's to work better with the newest benchmarking programs like 3DMark 06 and such. I argued with him that's impossible, or is it? Please let me know, did ATI possibly use optimizations built into the new R580 cards to gain this advantage?
Spoelie - Thursday, January 26, 2006 - link
how would validating die-space on a gpu for cheats make any sense? If there is any cheat it's in the drivers. And no, the only thing is that 3dmark06 needs 24bit DSTs for its shadowing and that wasn't supported in the x1800xt (uses some hack instead) and it is supported now. Is that cheating? The x1600 and x1300 have support for this as well btw, and they came out at the same time as the x1800.
Architecturally optimizing for one kind of rendering being called a cheat would make nvidia a really bad company for what they did with the 6x00/Doom3 engine. But no one is complaining about higher framerates in those situations now, are they?
Regs - Thursday, January 26, 2006 - link
....Where in this article do you see a 3DMark score?
mi1stormilst - Thursday, January 26, 2006 - link
It is not impossible, but unless your friend works in some high level capacity I would say his comments at best are questionable. I don't think working in shipping will qualify him as an expert on the subject?
coldpower27 - Wednesday, January 25, 2006 - link
http://www.anandtech.com/video/showdoc.aspx?i=2679...
"Notoriously demanding on GPUs, F.E.A.R. has the ability to put a very high strain on graphics hardware, and is therefore another great benchmark for these ultra high-end cards. The graphical quality of this game is high, and it's highly enjoyable to watch these cards tackle the F.E.A.R demo."
Wasn't use of this considered a bad idea, as Nvidia cards have a huge performance penalty when used in this, and the final build was supposed to be much better?
photoguy99 - Wednesday, January 25, 2006 - link
I noticed 1900x1440 is commonly benchmarked - wouldn't the majority of people with displays in this range have 1920x1200, since that's what all the new LCDs are using? And it's the HD standard.
Aren't LCDs getting to be pretty capable game displays? My 24" Acer has a 6 ms (claimed) gray to gray response time, and can at least hold its own.
Resolution for this monitor and almost all others this large: 1920x1200 - not 1920x1440.
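The aspect-ratio point in the comment above is easy to check arithmetically; a minimal sketch reducing each resolution to its simplest ratio:

```python
from math import gcd

def aspect_ratio(width, height):
    """Reduce a resolution to its simplest width:height ratio."""
    d = gcd(width, height)
    return width // d, height // d

# 1920x1200 (typical 24" widescreen LCD) vs. 1920x1440 (a 4:3 ratio)
print(aspect_ratio(1920, 1200))  # (8, 5), i.e. 16:10 widescreen
print(aspect_ratio(1920, 1440))  # (4, 3), standard 4:3
print(aspect_ratio(2048, 1536))  # (4, 3) as well
```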
Per Hansson - Wednesday, January 25, 2006 - link
Doing the math: CrossFire = 459W, 1900XTX = 341W, difference = 118W. Efficiency of PSU used @ 400W = 78%, so 118 x 0.78 = 92.04W
Per Hansson - Friday, January 27, 2006 - link
No replies huh? Cause I've read on other sites that the card draws up to 175W... Seems like quite a stretch, so that was why I did the math to start with...
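Per Hansson's estimate above can be reproduced in a few lines. The wall-draw figures come from his comment; the 78% PSU efficiency is his assumed value, not a measured one.

```python
# Reproduce the power-draw estimate from the comment above.
# 459 W = system draw at the wall with CrossFire (two cards),
# 341 W = draw with a single X1900 XTX; 78% efficiency is assumed.
crossfire_wall_w = 459
single_card_wall_w = 341
psu_efficiency = 0.78  # assumed PSU efficiency at this load

wall_delta_w = crossfire_wall_w - single_card_wall_w  # extra AC draw at the wall
card_dc_w = wall_delta_w * psu_efficiency             # estimated DC draw of the second card

print(f"Extra wall draw: {wall_delta_w} W")
print(f"Estimated card draw: {card_dc_w:.2f} W")  # ~92 W, well short of 175 W
```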