Original Link: https://www.anandtech.com/show/2553



It's been one of those long nights, the type where you don't really sleep but rather nap here and there. Normally such nights are brought on by things like Nehalem, or NVIDIA's GT200 launch, but last night was its own unique flavor of all-nighter.

On Monday, AMD had a big press event to talk about its next-generation graphics architecture. We knew that a launch was impending, but we had no hardware and no embargo date for when reviews would lift; we were at AMD's mercy.

You may already know about one of AMD's new cards: the Radeon HD 4850. It briefly appeared for sale on Amazon, complete with specs, before eventually getting pulled from the site. It turns out that retailers in Europe not only listed the card early but started selling it early. In an effort to keep its performance embargoes meaningful, AMD moved some dates around.

Here's the deal: AMD is launching its new RV770 GPU next week, and just like the RV670 that came before it, it will be available in two versions. The first version we can talk about today: that's the Radeon HD 4850. The second version, well, just forget that I even mentioned it - you'll have to wait until the embargo lifts for more information there.

But we can't really talk about the Radeon HD 4850; we can only tell you how it performs, and only things you would know from actually having the card. The RV770 architectural details remain under NDA until next week as well. What we can tell you is how fast AMD's new $199 part is, but we can't tell you why it performs the way it does.

We've got no complaints, as we'd much rather stay up all night benchmarking than try to put together another GT200 piece in a handful of hours. It simply wouldn't be possible, and we wouldn't be able to do AMD's new chips justice.

What we've got here is the polar opposite of what NVIDIA just launched on Monday. While the GT200 is a 1.4 billion transistor chip found in $400 and $650 graphics cards, AMD's Radeon HD 4850 is...oh wait, I can't tell you the transistor count quite yet. Let's just say it's high, but not as high as GT200 :)

Again, we're not allowed to go into the architectural details of the RV770, the basis for the Radeon HD 4800 series including today's 4850, but we are allowed to share whatever data one could obtain from having access to the card itself, so let's get started.

Running GPU-Z, we see that the Radeon HD 4850 shows up as having 800 stream processors, up from 320 in the Radeon HD 3800 series. Remember that the Radeon HD 3800 was built on TSMC's 55nm process and there simply isn't a smaller process available for AMD to use, so the 4800 most likely uses the same manufacturing process. With 2.5x the stream processor count, the RV770 isn't going to be a small chip; while we can't reveal the transistor count quite yet, you can make a reasonable guess.

Clock speeds are also fair game as they are reported within GPU-Z and AMD's Catalyst control panel:

That's a 625MHz core clock and 993MHz GDDR3 memory clock (1986MHz data rate). We've got more stream processors than the Radeon HD 3870, but they are clocked a bit lower to make up for the fact that there are 2.5x as many on the same manufacturing process.

                        ATI Radeon HD 4850            ATI Radeon HD 3870
Stream Processors       800                           320
Texture Units           I can't tell you              16
ROPs                    16                            16
Core Clock              625MHz                        775MHz+
Memory Clock            993MHz (1986MHz data rate)    1125MHz (2250MHz data rate)
Memory Bus Width        256-bit                       256-bit
Frame Buffer            512MB                         512MB
Transistor Count        it's a secret                 666 million
Manufacturing Process   TSMC 55nm                     TSMC 55nm
Price Point             $199                          $199

 

The rest of the specs are pretty straightforward: it's got 512MB of GDDR3 connected to a 256-bit bus, and the whole card will set you back $199. The Radeon HD 4850 will be available next week, and given that we've already received cards from three different manufacturers, we'd say this thing is going to be available on time.
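Given the clocks and bus width above, the card's peak memory bandwidth is easy to work out. A quick sketch using the standard double-data-rate bandwidth arithmetic (the function name is ours, not anything from AMD's tools):

```python
# Peak memory bandwidth for a double-data-rate bus:
#   bandwidth = 2 x memory clock x (bus width / 8)
def bandwidth_gbps(bus_width_bits, mem_clock_mhz):
    """Peak bandwidth in GB/s (GDDR3 transfers twice per clock)."""
    data_rate_mhz = 2 * mem_clock_mhz        # 993MHz -> 1986MHz data rate
    bytes_per_transfer = bus_width_bits / 8  # 256-bit bus -> 32 bytes
    return data_rate_mhz * bytes_per_transfer / 1000  # MB/s -> GB/s

print(f"Radeon HD 4850: {bandwidth_gbps(256, 993):.1f} GB/s")   # ~63.6 GB/s
print(f"Radeon HD 3870: {bandwidth_gbps(256, 1125):.1f} GB/s")  # ~72.0 GB/s
```

Interestingly, by this math the 4850 actually has slightly less memory bandwidth than the 3870 it replaces, thanks to the lower memory clock.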
 



8-channel LPCM over HDMI

You may have heard that I've recently become somewhat infatuated with HTPCs. I've been hammering on all of the AnandTech staffers to start looking at the needs of HTPC enthusiasts, and I've personally been on a bit of a quest to find the perfect HTPC components.

Blu-ray and HD DVD both support Dolby TrueHD and DTS-HD audio encoding, which offer discrete 8-channel audio output. The problem is that there's currently no way to send a TrueHD or DTS-HD encoded stream from a PC over HDMI to a receiver; the stream must be decoded on the PC. Cyberlink's PowerDVD will decode these high definition audio formats into 8-channel LPCM just as well as any receiver, but you need support for sending 8-channel LPCM over HDMI.

Most graphics cards that implement HDMI simply pass SPDIF from the motherboard's audio codec over HDMI, which is unfortunately only enough for 2-channel LPCM or 6-channel encoded Dolby Digital/DTS audio. Chipsets with integrated graphics such as NVIDIA's GeForce 8200 and Intel's G35 will output 8-channel LPCM over HDMI, but AMD's 780G will not.

All of AMD's Radeon HD graphics cards have shipped with their own audio codec, but the Radeon HD 4800 series finally adds support for 8-channel LPCM output over HDMI. This is a huge deal for HTPC enthusiasts, because now you can output 8-channel audio over HDMI in a motherboard-agnostic solution. We still don't have support for bitstreaming TrueHD/DTS-HD MA and most likely won't anytime this year from a GPU alone, but there are some other solutions in the works for 2008.

To use the 8-channel LPCM output, simply configure your media player to decode all audio streams and output them as 8-channel audio. HDMI output is possible courtesy of a DVI-to-HDMI adapter bundled with the card; AMD sends audio data over the DVI interface, which is then carried over HDMI via the adapter.



NVIDIA's Unexpected Response

At 1:25AM EST we received an email from NVIDIA PR, announcing a product we had no idea was coming: the GeForce 9800 GTX+.

The 9800 GTX+ is a die-shrunk G92 based on TSMC's 55nm process, the same one used by AMD for the Radeon HD 4850. The GTX+ runs at 738MHz/1836MHz (core/SP), up from the stock clock speeds of 675MHz/1690MHz. The moderate increase in clock speed (8 - 9%) is one thing; the price point is another: $229.
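A quick check of that arithmetic:

```python
# 9800 GTX+ clock bump over the stock 9800 GTX (MHz figures from above)
core_old, core_new = 675, 738   # core clock
sp_old, sp_new = 1690, 1836     # shader (SP) clock

core_pct = (core_new / core_old - 1) * 100
sp_pct = (sp_new / sp_old - 1) * 100

print(f"core: +{core_pct:.1f}%, shader: +{sp_pct:.1f}%")  # core: +9.3%, shader: +8.6%
```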

To make matters worse for AMD, the vanilla GeForce 9800 GTX is going to be priced at $199. Had AMD not introduced the Radeon HD 4850, the 9800 GTX would never have had to drop in price; thus we enjoy the benefits of an AMD that is once again competitive in the marketplace. The price drops on the 9800 GTX will begin to take effect next week (conveniently enough), and the GTX+ will be widely available starting July 16th.

NVIDIA's timing is suspicious: it had a full reviewer's guide ready, so it clearly anticipated AMD being very competitive with the Radeon HD 4850, but the email came at an odd time of night.

It's a sneaky move by NVIDIA; had that email never been sent, AMD would have had its day of glory - its own 8800 GT, if you will, a $199 part that reset all expectations and raised the bar. Instead, NVIDIA preempted any such move by pre-announcing a 9800 GTX price drop as well as a new, higher-end 9800 GTX+ SKU. That's competition, folks.



AMD: The People's GPU Maker

This week AMD came out and codified its new GPU strategy, but in reality it's the same strategy that has been in place since the release of the R600 GPU (Radeon HD 2900 XT). On paper (or LCD), it's the best idea ever, take a look:

Obviously this mythical GPU that can let you play at any resolution with any detail settings doesn't exist, but the idea is that AMD will continue to target the $200 - $300 market segment with its GPU designs.

The buck doesn't stop there, though: AMD will continue to build both more and less expensive GPUs, but they will simply be derived from this one mainstream design. Again, this is nothing new; it's exactly what AMD did with R600 and RV670.

NVIDIA's approach is markedly different as this week's GT200 launch clearly illustrates. NVIDIA continues the approach of building a very large, monolithic GPU, eventually scaling the architecture down to lower power and price points. The GT200 is the latest example of the large monolithic die and subsequent mainstream parts will be based on some version of that GPU.

AMD argues that NVIDIA's approach means too long a time to market for high speed mainstream GPUs, and that it keeps power and costs high. There is truth in what AMD is saying, but it's not the whole story.

NVIDIA could just as easily introduce a brand new architecture with a mainstream part, it simply chooses not to as it's far easier to recoup R&D costs by selling ultra high end, high margin GPUs.

The power/cost argument is a valid one but AMD's approach isn't actually any better from that standpoint:

 

A pair of RV770s, AMD's new GPU, ends up consuming more power than a single GT200 - despite being built on a smaller 55nm process.

A pair of these RV770s only costs $400 compared to $650 for a single GT200, but I suspect that part of that is due to differences in manufacturing process. If NVIDIA hadn't been so risk-averse with the GT200 and had built it on 55nm (not that I'm advocating it, simply posing a hypothetical), the cost difference would be smaller - if not in NVIDIA's favor, since the GT200 solution is a single card.

When the smoke clears, AMD's strategy is to simply build a GPU for the masses and attempt to scale it up and down, while NVIDIA is still building its GPUs the same way it always has: starting very large and scaling down.

AMD isn't taking a radically different approach to building and designing GPUs than NVIDIA, it's simply building one market segment lower.



Power, Thermals, Noise and Die Size

The Radeon HD 4850 is a single-slot design, but the card itself gets very hot. At idle the card is mostly silent, but like the GeForce GTX 280, you can hear this thing once the fan spins up. It's definitely not as loud as the GTX 280, but it's not silent under full load.

 

The Radeon HD 4850 draws a bit less power than its closest competitor, the GeForce 9800 GTX.

 

With two 4850s paired up in CrossFire, we once again ran into issues with our power supply. Our 1000W OCZ EliteXStream wasn't always enough for the dual-GPU setup and in Call of Duty 4 our system rebooted in the middle of our test at 2560 x 1600. Thankfully OCZ sent us a PC Power & Cooling Turbo Cool 1200W unit that is certified for use with GeForce GTX 280 SLI, and if it works on that beast, it had better work with a pair of 4850s in CrossFire.

The PCP&C unit is quite loud, as we mentioned in our review, but it got the job done; we were able to run all of our benchmarks without a hiccup after swapping power supplies. Despite AMD's small-GPU strategy, power consumption on multi-GPU configurations is still just as much of a problem as it is for NVIDIA.

The Test

Test Setup
CPU                Intel Core 2 Extreme QX9770 @ 3.20GHz
Motherboard        EVGA nForce 790i SLI
                   Intel DX48BT2
Video Cards        ATI Radeon HD 4850
                   ATI Radeon HD 3870 X2
                   ATI Radeon HD 3870
                   NVIDIA GeForce 9800 GTX
                   NVIDIA GeForce 9800 GX2
                   NVIDIA GeForce 8800 GTX
                   NVIDIA GeForce 8800 GT
                   NVIDIA GeForce GTX 280
                   NVIDIA GeForce GTX 260
Video Drivers      Catalyst 8.5
                   ForceWare 177.34 (for GT200)
                   ForceWare 175.16 (everything else)
Hard Drive         Seagate 7200.9 120GB 8MB 7200RPM
RAM                4 x 1GB Corsair DDR3-1333 7-7-7-20
Operating System   Windows Vista Ultimate 64-bit SP1
PSU                PC Power & Cooling Turbo Cool 1200W


Crysis

For Crysis, we used the Crysis Benchmark Tool available from Crymod. Rather than testing our own recorded demo, this time we went with the built-in GPU benchmark that does a flyby of the island level. Our test was run with all settings on high except for shader quality, which was set to very high.

 



With less than 5% difference between the 9800 GTX and the 4850, the new card from AMD certainly holds its own. It still isn't really playable at 1080p or higher resolutions, and even 1680x1050 is a tall order, but the 9800 GTX is in the same boat. Playing at 1280x1024 or lower, or dropping down the quality settings, is recommended until you start to spend much more money.



Call of Duty 4

For Call of Duty 4, we max out all the settings we are able to. Our benchmark is the cutscene directly before the first mission: flying over the sea in a helicopter on a rainy night. We used FRAPS to record the data for this test.

 



While NVIDIA has been the reigning Call of Duty champion for quite some time, the 4850 really changes things up. The gap between the 9800 GTX and the 4850 gets smaller as resolution increases, but even at 2560x1600 the previously more expensive 9800 GTX loses out.



Enemy Territory: Quake Wars

This benchmark is performed using the 1.5 version of ET:QW, and we recorded our own demo on the island level. The demo isn't very long but tries to stress the GPU a fair amount. We tested with all the settings on their highest level except for AA which was set to 4x in game.

 



Even more so than in Call of Duty, the 4850 clobbers the 9800 GTX and performs dangerously close to NVIDIA's upper echelon of hardware. Performance at 2560x1600 tapers off quite a bit, but the 4850 still does quite well comparatively. With ET:QW being our only OpenGL benchmark, this is quite a surprising result, as NVIDIA has always been the go-to solution for OpenGL. This time around it loses out quite handily.



Assassin's Creed

Another FRAPS benchmark: in Assassin's Creed we found a repeatable area of the game to benchmark at the opening of a memory. In this test, we start walking and benchmarking as soon as the memory finishes fading in, and we stop as soon as we trigger the next cutscene. We test with AA disabled at all resolutions, as AA isn't available above 1680x1050. Because the patch's negative impact on the performance of AMD cards only occurs with AA enabled, we do test with the latest patch applied.

 



The margin isn't as large as in some of the other games we've tested, but the 4850 does maintain a performance advantage over the 9800 GTX. Again, the margin narrows as resolution increases, but the 4850 still remains on top.



Oblivion

Our Oblivion test hasn't changed in quite some time. This is a FRAPS benchmark with a straight line run through part of a level. We take the first run through and exit the game between testing resolutions, as this gives us more consistent results. We are using the 1.2.0416 version of the game with the test taking place in a forest in the south of the Shivering Isles.



In this benchmark, the 9800 GTX and the 4850 perform comparably until 2560x1600, where the 9800 GTX takes the lead. This trend of the 9800 GTX growing more competitive at 2560x1600 suggests that its advantages in fill rate and memory bandwidth may make the difference here.



The Witcher

Near the beginning of the game, our main character must lift a gate for his friends to run through before a giant praying mantis eats them (or something). Anyway, this is a benchmark of that cutscene using FRAPS. We test with 2xAA at 1680x1050 and 1920x1200, while our tests at 2560x1600 are performed without any antialiasing. This is because the game removes the option to enable AA at certain resolutions on some cards. While we understand the need to keep gamers who don't know any better from cranking everything up on crappy hardware and complaining, it would be great if people who knew what they were doing had the option of forcing specific settings.

 



The 9800 GTX holds its own against the onslaught of the 4850 under The Witcher. These cards basically perform the same, with a slightly larger difference at lower resolutions in favor of the 4850. The differences aren't large here, and for all intents and purposes these cards offer the same value in this test.



Bioshock

Using the same benchmark we've used in other graphics hardware reviews, this is a FRAPS runthrough of a relatively open part of the Medical Pavilion. While it is a bit on the short side, it does a good job of representing gameplay.

 



AMD's new 4850 just destroys its competition, and just about everything else, in this test. Not even the new GTX 260, which costs twice as much, can touch this card in Bioshock. Performance here is simply incredible.



Challenging NVIDIA's Strategy: Are Two RV770s Faster than One GT200?

NVIDIA insists on building these massive GPUs while AMD is heading in the direction of multiple, smaller GPUs in order to keep development time and costs manageable. Does NVIDIA's strategy make sense? In order to find out we paired two Radeon HD 4850s in CrossFireX and ran through our benchmark suite, this time focusing on a comparison to the recently announced GeForce GTX 280 as well as the 9800 GX2. The results were surprising:

                              AMD Radeon HD 4850 CF   NVIDIA GeForce GTX 280   NVIDIA GeForce 9800 GX2
Crysis                        36.4                    34.3                     39.9
Call of Duty 4                88.2                    67.4                     73.2
Enemy Territory: Quake Wars   53.7                    70.2                     62.2
Assassin's Creed              51.9                    45.0                     52.6
Oblivion                      39.5                    36.8                     35.6
The Witcher                   20.9                    37.7                     37.6
Bioshock                      68.6                    63.9                     75.4

So does AMD's approach invalidate NVIDIA's big-monolithic-GPU strategy? Not exactly. While it is true that two RV770s can outperform a single GT200 in many cases, you could also argue that two GT200s could outperform anything AMD could possibly concoct (3- and 4-way CF scaling isn't nearly as good as 2-way). AMD's strategy makes sense, for AMD, but it's fundamentally no different from what NVIDIA is doing - AMD is simply targeting a different initial market and scaling up/down from there.
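To put the table above in ratio terms, here's a small sketch (the fps numbers are copied straight from our results; nothing new is being measured):

```python
# Ratio of 4850 CrossFire fps to a single GTX 280, per game (from the table above)
results = {
    "Crysis":                      (36.4, 34.3),
    "Call of Duty 4":              (88.2, 67.4),
    "Enemy Territory: Quake Wars": (53.7, 70.2),
    "Assassin's Creed":            (51.9, 45.0),
    "Oblivion":                    (39.5, 36.8),
    "The Witcher":                 (20.9, 37.7),
    "Bioshock":                    (68.6, 63.9),
}

wins = 0
for game, (cf, gtx280) in results.items():
    ratio = cf / gtx280
    wins += ratio > 1
    print(f"{game:28s} {ratio:.2f}x")

print(f"4850 CF wins {wins} of {len(results)} games")
```

The CrossFire pair comes out ahead in five of the seven games, with the two losses (ET:QW and The Witcher) being the titles where CrossFire scaling falls apart, as we discuss below in the individual game sections? No - as we discuss in the per-game results that follow.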

The scaling, or lack thereof, in games like Enemy Territory: Quake Wars highlights an important caveat with AMD's strategy: there are still software issues with SLI and CrossFireX. What is needed is a truly seamless multi-GPU implementation, with a shared frame buffer, where both GPUs operate as an extension of each other with direct GPU-to-GPU communication over a high speed (not PCIe) bus, similar to how AMD's Opteron or Intel's Nehalem work in multi-socket systems.



Crysis

With CrossFire, the 4850 does quite well under Crysis. There's a bit of an odd scaling anomaly, but no matter how many times we retested, it came out the same way. We've discussed the fact that the built-in test we used moves so fast that it has to shuffle data on and off the graphics cards far more than normal. This might help explain why the relative performance of the 4850 CrossFire setup goes up at 2560x1600: it may have plenty of power to process the scene but not the capability to handle the swapping in and out of so many complex objects (which wouldn't happen under normal conditions in the game).



Call of Duty 4

4850 CrossFire simply ownz Call of Duty 4. Nuff said.

 



Enemy Territory: Quake Wars

Our Quake Wars benchmark shows a much different CrossFire scaling story: it really just doesn't do much. The impressive highs we see with CrossFire are undercut by these inconsistencies in support.





Assassin's Creed

Two GeForce 8800 GTs in SLI outperform a single GeForce GTX 280, and two Radeon HD 4850s in CrossFire outperform the 8800 GT SLI, so AMD manages to outperform NVIDIA's brand new GT200 with a pair of cheaper, slower cards. The two actually end up performing like a GeForce 9800 GX2 here as well.

It's not so much that the Radeon HD 4850 is ultra competitive, but rather that the GTX 280 isn't terribly competitive with NVIDIA's own $400-$500 multi-GPU solutions.

 

Assassin's Creed has a ~60 fps frame rate cap, so the flat performance of the 4850 in CrossFire simply indicates that it kept bumping into the frame rate limiter, resulting in static performance across all three resolutions.



Oblivion

While CrossFire tends not to scale as consistently as SLI, when it does, it scales very well. The performance of two 4850s is nearly double that of a single card, and it puts AMD at the absolute top of the performance charts here.



The Witcher

The Witcher is a good example of an area where CrossFire fails to scale: despite the Radeon HD 3870 X2 scaling, we could not get the 4850 to show any performance benefit with two GPUs. It could be an issue with the 4850 drivers or a special trait of the 3870 X2's driver; at this point it's tough to tell.

 



Bioshock

While we see scaling at 1920 x 1200, at 2560 x 1600 there's no benefit to two 4850s over one. We could be bumping into a memory bandwidth limitation or some continued strangeness in AMD's CrossFire drivers.





Final Words

There's an unexpected amount of concluding we can do already based on these early results.

For starters, the Radeon HD 4850 looks to be the best buy at $199, even better than NVIDIA's price-dropped GeForce 9800 GTX. What's also unbelievable is that, compared to the 4850, our beloved GeForce 8800 GT seems downright slow in a number of benchmarks - and the 8800 GT is only eight months old. It's also very refreshing to see this sort of competitive pressure at such a reasonable price point; while it's fun to write about 1.4 billion transistor GPUs, it's a dream come true to be able to write about this type of performance at under $200.

Take two 4850s, put them together and now you've got something even faster than NVIDIA's GeForce GTX 280 in most cases. It shouldn't be too surprising since 8800 GT SLI and 9800 GX2 both outperform the GTX 280 as well.

Our CrossFire investigation illustrated a very good point: AMD's multi-GPU solutions still don't behave as well as its single-GPU products. There are still cases where performance doesn't improve at all, and that's where these large monolithic GPU designs hold their value. Hopefully, with continued effort in the multi-GPU space, AMD can get us to a point where there is no perceivable difference between single and multi-GPU solutions. Until then, NVIDIA's strategy will continue to have a great deal of merit - although the GTX 280 isn't the best example of that, at least from a gamer's perspective. On the CUDA side, however...

We'll have much more information on the Radeon HD 4850 and its faster brother next week, when we can completely unveil AMD's RV770. Until then, sit tight and be content with the knowledge that the days of the 8800 GT vs. 3870 weren't a fluke; the new mainstream wars are upon us thanks to AMD's Radeon HD 4850.
