43 Comments
CleanBoot - Tuesday, February 14, 2006 - link
As one who has been waiting for you to revisit this issue--

quote: We will be looking closely at how ATI's latest drivers handle digital video processing... [and] how it compares to NVIDIA's decoding. There have been some changes by ATI which address some decoding problems in the past, and we're interested to see how well they've done.

--I'm glad to see ATI back on its feet, after falling down so clearly, though unexpectedly, when you last compared its "entertainment video" capabilities against Nvidia's.
However, I'm unclear under what restrictions --if any-- these results can be extrapolated to other ATI offerings.
I understand you were testing "ATI's latest drivers," but is the improved digital video processing ONLY a function of the new drivers (as opposed to new drivers taking advantage of new or improved silicon on the AIW X1900 PCB...), and if so, what other recent-vintage, non-AIW ATI cards will yield the same improved performance using the same new drivers?
Or, if the improved results are a matter of BOTH new drivers AND new hardware, what other non-AIW cards in the ATI pipeline will provide both of these elements, and when and at what cost will they become available?
Thanks & Keep up the PFEGL work.
("performance_far_exceeding_grade_level")
ianken - Monday, February 13, 2006 - link
...I have the HQV test DVD and the results are somewhat different on an X1800.

Anyway...

Noise Reduction: Unless you're watching analog TV, you don't want this enabled at all. Is the ATI decoder actually doing noise reduction, or is it just mangling the video?

Wacky film cadences: I find it amazing that the card passed all of the funky cadences. Want a real test? Watch the re-runs of Star Trek: TNG on Spike. They are time-compressed. If they are nice and smooth on the ATI solution, they have a real winner there.

I'd also like to know if all of these tests pass when the DVD is run within Media Center.
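[Editor's note: for readers wondering what "locking a cadence" involves, a deinterlacer watches the incoming fields for a repeating new-frame/repeat-field pattern and only switches to film mode once it has matched one. The sketch below is purely illustrative, not ATI's implementation; the cadence names are real, but the flag-stream model and detector are simplified toys.]

```python
# Toy sketch of film-cadence detection. A 3:2 ("2:3") telecine spreads film
# frames over a repeating 5-field cycle; a decoder can reconstruct the
# original frames only after it locks onto that cycle. This detector matches
# a stream of per-field "new frame / held field" flags against known
# cadence signatures at every phase offset.

CADENCES = {
    "2:3 (NTSC film)": [2, 3],
    "2:2 (PAL film)":  [2, 2],
    "2:2:2:4":         [2, 2, 2, 4],
    "2:3:3:2":         [2, 3, 3, 2],
}

def field_pattern(cadence, cycles=3):
    """Expand a cadence like [2, 3] into a flag stream: 1 marks the first
    field of a new film frame, 0 marks a repeated/held field."""
    flags = []
    for _ in range(cycles):
        for run in cadence:
            flags.extend([1] + [0] * (run - 1))
    return flags

def detect_cadence(flags):
    """Return the first cadence whose repeating pattern matches the observed
    flag stream at some phase offset, or None if nothing locks."""
    for name, cadence in CADENCES.items():
        period = sum(cadence)
        template = field_pattern(cadence, cycles=1)
        for phase in range(period):
            if all(flags[i] == template[(i + phase) % period]
                   for i in range(len(flags))):
                return name
    return None

stream = field_pattern([2, 3, 3, 2], cycles=4)
print(detect_cadence(stream))  # prints: 2:3:3:2
```

A real decoder faces the harder version of this problem: the flags are noisy, edits break the cycle mid-stream, and time-compressed material (like the Spike re-runs mentioned above) drops fields irregularly, which is exactly why it makes a good stress test.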
PrinceGaz - Monday, February 13, 2006 - link
One of the main features of the new cards from S3 is their video decoding. Obviously they can't compete with the high-end nVidia and ATI cards for gaming performance, but they are supposed to be good for DVD playback and the like.

How about getting hold of an S3 Chrome S27 (their "high-end" card) and putting it through the tests, comparing video playback quality, and also how it compares performance-wise to equivalent nVidia and ATI cards (6600, 6600GT, X700, X1300, X1600 sort of price range)? There are more than two companies that manufacture graphics cards, after all.
ianken - Sunday, February 12, 2006 - link
...of course, applying noise reduction to a clean DVD just means you are removing detail. Unless the ATI decoder has the ability to disable this feature, I wouldn't call it a bonus. On the other hand, it looks like the NV decoder is actually preserving all detail in the MPEG2 video rather than smearing it around.

I also seriously doubt the ability of the ATI decoder to lock all of those cadences.
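[Editor's note: a crude way to see the point about noise reduction removing detail is to run even the simplest smoothing filter over a clean edge and watch the transition smear. The 1-D box blur below is far simpler than any real decoder's noise reduction and is only meant to illustrate the trade-off.]

```python
# A clean luma edge before and after a naive spatial smoother. On already
# clean DVD content there is no noise to remove, so the filter can only
# soften detail, which is the objection raised above.

def box_blur(samples, radius=1):
    """Average each sample with its neighbors (window clamped at the ends)."""
    n = len(samples)
    out = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        out.append(sum(samples[lo:hi]) / (hi - lo))
    return out

clean_edge = [0, 0, 0, 0, 100, 100, 100, 100]  # a sharp luma transition
print(box_blur(clean_edge))
# the one-pixel edge now ramps over several pixels: detail lost, nothing gained
```

Real noise reducers are adaptive (temporal, motion-compensated, edge-aware), but the underlying tension is the same: any smoothing strong enough to hide analog noise will also erode legitimate high-frequency detail unless it can be disabled.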
aggressor - Saturday, February 11, 2006 - link
I have an X1900XT, and I'm wondering how to tell if I'm actually using the AVIVO capabilities. I use (retail) PowerDVD6, which includes its own software decoder, and I'm using the Catalyst 6.2 drivers. Anyone have any ideas?

Innovato - Saturday, February 11, 2006 - link
Any idea if the new core will support crossfiring with the AIW? I've read beyond3d's explanation w.r.t. 1800's. Xfiring would probably help out with the slower gpu/memory benches?

multiblitz - Saturday, February 11, 2006 - link
I think this is a great test. Nevertheless, if you want to benchmark their DVD performance, compare them as well against this benchmark: zoomplayer with dscaler5 and ffdshow in VMR9, with scaling, denoise and sharpening activated.

And as many users use an HTPC because they have a high-resolution plasma or projector and want a DVD player with a built-in high-end scaler: please test the scaling ability as well.

Bottom line, at the current stage, is that the Lanczos 4 scaling of ffdshow, in combination with its great sharpening capabilities, is unbeaten.

So, if someone is not watching TV but DVD, and wants the best quality, they can forget the latest cards, as these still can't compete. Instead, you should invest in a nice dual-processor CPU and a not-so-fresh card like the X800 to get superior performance (for DVD) for the same or less money.
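[Editor's note: for readers unfamiliar with the "Lanczos 4" scaling mentioned above, ffdshow's resampler is built on the windowed-sinc Lanczos kernel with a = 4. Below is a minimal, illustrative 1-D version; a real scaler applies this separably to rows and columns, typically with precomputed fixed-point weights.]

```python
import math

def lanczos(x, a=4):
    """1-D Lanczos kernel: sinc(x) * sinc(x/a) for |x| < a, else 0.
    "Lanczos 4" corresponds to a = 4, i.e. an 8-tap filter per axis."""
    if x == 0:
        return 1.0
    if abs(x) >= a:
        return 0.0
    px = math.pi * x
    return a * math.sin(px) * math.sin(px / a) / (px * px)

def resample_1d(samples, factor, a=4):
    """Upscale a 1-D signal by `factor` using Lanczos interpolation,
    normalizing the weights so flat areas stay flat."""
    out = []
    n = len(samples)
    for j in range(int(n * factor)):
        x = j / factor                      # position in source coordinates
        left = math.floor(x) - a + 1
        acc = wsum = 0.0
        for i in range(left, left + 2 * a):
            if 0 <= i < n:
                w = lanczos(x - i, a)
                acc += w * samples[i]
                wsum += w
        out.append(acc / wsum if wsum else 0.0)
    return out
```

The wide negative lobes of the a = 4 kernel are what give Lanczos its sharp, slightly ringing look, and why it preserves edges that the simple bilinear scaling in most 2006-era decoders visibly softens.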
yacoub - Saturday, February 11, 2006 - link
Any update on the G80 release schedule? Still looking like May/June? Want to see what NVidia responds with, both to see how it performs and to see how ATI's X1900 series card prices come down.

Humble Magii - Saturday, February 11, 2006 - link
The Nvidia 7900 demolishes the X1900, which is why ATI got it out the door ASAP. ATI couldn't compete with an old solution from Nvidia, the 7800, so they put out the X1800, which was half a year late; then came a newer release, the X1900, and it still can't demolish every Nvidia product. ATI did get the hardware out en masse, which is good, but I hope they sort out their driver quality problems. Not to mention that ATI IQ, which a lot of people think is better, is showing to be worse in a lot of cases. ATI does a lot of things which make you think you are getting high quality when you aren't; not that Nvidia is innocent here either.

Nvidia really doesn't need to release anything, since the X1900 isn't much of a king-of-the-hill line. ATI still can't beat the 7800GT and 6800GS cards, and their top of the line doesn't really run that much faster, but the price? The only Nvidia card that is overpriced is the 512MB card, and it's still a better design overall than the ATI counterparts.

We have to wait and see what the DX10-capable cards are like, because when Nvidia decides to let the 7900 loose, the X1900 is history. Think of a 512MB GTX that gets double the rates, and you can conjure an image of how strong that card is. The X1900 isn't much of a step up from an X1800; honestly, the only reason to buy an ATI card is an AIW card.

Any ATI card under an X1800XL or X1900 is, ATI-wise, crap.
z3R0C00L - Saturday, February 11, 2006 - link
Humble Magii,

So you're one of those fanboys who gets nVidia hardware in exchange for comments like this, huh? :p

Stop with the fanboy crap... seriously... it's getting old.
ayersmj - Saturday, February 11, 2006 - link
I am looking to build a Media Center PC. I need a card that has a YPbPr input and can be paired with an SB Audigy to record the signal coming off my DirecTV HD receiver. TiVo doesn't have a PVR yet that can work with this, and I was kind of hoping to build one myself. Would this card allow me to do this? If not, what do I need? The MIT MyHD cards will not work, as they do not have either a DVI input or a YPbPr input.

Thanks.
PeteRoy - Friday, February 10, 2006 - link
Hey, put back the old way of showing benchmark figures; this is too confusing and hard to read!

Pastuch - Friday, February 10, 2006 - link
"If performance continues to increase at the rate that it has been, we aren't sure how game software will be able to keep up. We are always happy when we see advancements in technology, but the huge sizes of some of these high end cards make us think better efficiency might be good direction for graphics hardware to move toward."

I'm so tired of reading comments like these! Look at the benches of FEAR and try to tell me that graphics power is in abundance. You have to buy a $600 graphics card to play the game at a decent framerate on pretty much any LCD monitor sold today. Even if you buy a low-end 17-inch LCD, you are going to be running 1280x1024 because that is your monitor's native resolution, and anything less results in a much poorer picture.

With the number of features built into most motherboards, I can't see any problem with filling the extra space with graphics cards. You don't need 5 PCI slots. The charm of the ATX standard is, and always has been, adaptability.

P.S. The price of graphics cards is totally ridiculous these days. The cheapest 7800GTX 512MB you can buy in Canada is $899 on sale! The X1900XTX can be found for around $680. I hope Nvidia doesn't continue this crazy pricing in their spring release.
flashbacck - Friday, February 10, 2006 - link
Where does one get the ATI DVD decoder? Does that only come with the AIW cards?

LoneWolf15 - Friday, February 10, 2006 - link
It's taking ATI an awfully long time to figure out how to put the Theater 550 on All-In-Wonder cards in place of the Theater 200. Until they can do it, and thus give us hardware MPEG-2 encoding when recording video, I'd rather pay to have both a video card and a separate tuner card (like my Hauppauge WinTV PVR 150) in my system. The All-In-Wonder X1900 is supposed to be the Cadillac; why are the video recording features more akin to a Chevy?

Questar - Friday, February 10, 2006 - link
Is this your attempt at a troll, or are you just uninformed?

This card has MPEG-1/2/4 hardware encoding. ATI has had hardware encoding on the AIW cards for over a year.
http://www.ati.com/products/radeonx1900/aiwx1900/f...
LoneWolf15 - Monday, February 13, 2006 - link
Gee, thanks for your insulting reply. It is possible to be informative without being insulting, you know.

LoneWolf (the "troll" who owns a Radeon X800XL and an ATI TV Wonder PCI)
Questar - Monday, February 13, 2006 - link
How can you own an X800XL and not know it does hardware MPEG encoding?

Questar - Monday, February 13, 2006 - link
Oh, I get it, you didn't get the AIW version.

Hmmmmmm....
ianken - Monday, February 13, 2006 - link
None of the ATI cards do HW-based MPEG2 (or any other sort of) encoding. At least not yet. ATI has demoed their AVC codec, but the released version of the demo does not actually use the hardware, and it produces horrendous quality because they skip most of the AVC codec features to attain high speeds.

To see this, all you need to do is compare the CPU load of the AIW vs. the ATI 550 Elite and see that the AIW is not using hardware of any kind.
Sunrise089 - Friday, February 10, 2006 - link
You have identical numbers for with AA and without. Also, the text's comment on the X1900 AIW being playable at all resolutions with AA uses the incorrect numbers.

plonk420 - Friday, February 10, 2006 - link
Does the ATI decoder give you the option to ADD sharpening, or not at all? My whole reason for wanting a (hardware-accelerated) software decoder is so I can have picture quality rivaling a ("popular") $200 hardware player, for whatever extra it would cost for the software. Supposedly free for ATI, or $30 for nVidia (for my application: 2.0 sound).

hwhacker - Friday, February 10, 2006 - link
here we go: http://www.bytesector.com/data/bs-article.asp?ID=6...
590/684...I was close.
18+% improvement in 3dmark06, you know that has to translate to something good in gaming.
hwhacker - Friday, February 10, 2006 - link
It uses 2.0ns chips from what I recall, as do newer BBA X1800XLs (instead of 1.4ns).

There was one site that did an overclocking section on it, I forget which. The results were similar to X1800XLs, the end clock speed ending up 600+/almost 700 IIRC. You know how XLs clock, I'm sure.
So in essence, yes, it overclocks well, and I do remember the site being amazed by the performance improvement through overclocking. I still don't get how 2.0ns chips can hit 1.4ns speeds if there is a speed bin in-between for cards like nvidia's 7800gt/gtx that you would think use that supply...but i've seen quite a few cards with the newer, slower, chips hitting the same approx speeds as the old ones with 1.4ns, and i'm not complaining. ;)
I'm sure with the overclock or ATiTool soft-vmods this thing would be killer, especially with better cooling than the known-for-sucking XL stock cooler.
Shadowmage - Friday, February 10, 2006 - link
What I'm curious to know is whether the AIW can overclock to roughly XT/XTX speeds.

What type of RAM does the AIW use?
Zebo - Saturday, February 11, 2006 - link
Agreed. How can AT not include this? Lame.

DigitalFreak - Friday, February 10, 2006 - link
Although I appreciate the DVD decoder tests, how is this review related to the AIW features of the card?

highlandsun - Friday, February 10, 2006 - link
I would have been more interested in seeing how well it handled H.264 decoding at 1920x1080p.

oxid - Friday, February 10, 2006 - link
Does the 7800 GT use a different video processor than the GTX? Because the 7800 GTX scored better in the last review's HQV tests than the GT does in this review...

mpeavid - Friday, February 10, 2006 - link
You guys need to use the exact same frame as an example for all cadence tests. Not doing so can invalidate your test.

bldckstark - Friday, February 10, 2006 - link
The example shown does not change the overall score of the card. The example shown is for the reader's reference to the test, and is not what the test scored from. There may be other reasons someone might not give these tests merit, but this is not one of them. You could maybe rank on the author for this, but not the tests.

mpeavid - Friday, February 10, 2006 - link
"The example shown does not change the overall score of the card."

But how do we know that? Take example cadence 2224. According to the text, the same item is being compared, yet different frames are clearly shown. Their methodology may be consistent, but their text is not.

You have to be clear about this or it misleads your readers. It's like doing a 3D test using 2 different scenes to render. Anandtech uses all the same 3D scenes to render, right?
rjm55 - Friday, February 10, 2006 - link
Other sites did AIW 1900 reviews on January 31st. Why so long for AT? Did ATI pass you over on sending a sample?

fishbits - Friday, February 10, 2006 - link
quote: If performance continues to increase at the rate that it has been, we aren't sure how game software will be able to keep up.

By adding more polys, textures, particles, lights, shadows and shaders. You really didn't know this? Call any respected game dev house and ask them if they could possibly come up with a use for more GPU horsepower. The answer will be "Of course, genius; we've got code and models we're waiting for capable hardware to run on, and it's been that way for years. We'll take every bit of it we can get." Tell Anand I want you to spend this weekend benching EQ2 maxed out and tell us Monday if "we" still "aren't sure."
Anyhow, sounds like a nice card, but I'd rather have a more dedicated gaming card and a separate TV tuner solution.
Griswold - Monday, February 13, 2006 - link
Of course they want more power, so they don't have to write efficient and optimized code. Your EQ2 example especially comes to mind. There are far too few companies that come up with highly optimized code that will run top-notch on current hardware and provide extra eye candy on future generations.

Backslider - Friday, February 10, 2006 - link
The 7800GT used in the test must be stock. The one I purchased came overclocked and performs much better than what the benchmarks are showing.

ATI is still too pricey at the moment. I looked up and down for an X1800XL that could come within the price range of the 7800GT that I purchased, and I couldn't find one. I wasn't going to pay $60 more when they perform so identically. The prices were approximately:
X1800XL 256 Stock $330
7800GT 256 OC $270
ATI, get those prices down.
tuteja1986 - Friday, February 10, 2006 - link
I sold my 7800GTX, bought an X1900XT, and I couldn't be happier! If G71 fixes some issues like IQ and HDR with AA, then I will sell my X1900XT and buy a 7900GTX :) Or else wait for R6XX and G8X.

Backslider - Friday, February 10, 2006 - link
Having owned an X800XL and a 7800GT, I honestly didn't see an IQ difference. The whole HDR-with-AA thing... well, you must play a lot of Far Cry.

Good luck with keeping up with the latest and greatest, though; it's almost a game in itself. If you sell at the right times, you can upgrade for very little and still have the newest toys.
Happy gaming
MrKaz - Friday, February 10, 2006 - link
I have an ATI 9700 and a GeForce 6600GT, and the ATI rendering looks better.

There are some annoying layers/plates in the nVidia rendering that I don't like.
And just one note: the display is the same on both cards.
DeathByDuke - Friday, February 10, 2006 - link
I'd certainly buy one if it was around $299-349, considering it performs close to the much more expensive X1800XT.

DeathByDuke - Friday, February 10, 2006 - link
Oh yeah, you need to fix the images on the game benchmarks; they are all URL links lol

Wesley Fink - Friday, February 10, 2006 - link
The HTML code is now fixed and the game graphs are displaying.

bigboxes - Friday, February 10, 2006 - link
I am going to be in the market for such a card come this summer. This card looks to fit the bill, and the price should come down by then. Very nice!