nForce2: More than meets the eye

In Part I we ran the vast majority of our benchmarks with a single stick of memory, using only one of the two 64-bit memory controllers the nForce2 IGP/SPP is equipped with. Our reasoning was that the added bandwidth of a 128-bit memory interface goes unused unless integrated graphics is enabled; as it turns out, that is both true and false.

If you'll remember back to our review of the original nForce chipset, the nForce-420 (128-bit dual-channel DDR) was no faster than the nForce-220 (64-bit single-channel DDR); a single 64-bit DDR channel already provided the Athlon XP with enough memory bandwidth, rendering the second channel relatively useless. The exception was when integrated graphics was enabled since, as we all know, graphics cores are very bandwidth dependent and will happily consume any additional bandwidth they share with the CPU.
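
To put rough numbers behind that reasoning, here is a quick back-of-the-envelope sketch (in Python, not taken from our testing) of peak theoretical bandwidth. A single 64-bit DDR333 channel already matches the peak bandwidth of the Athlon XP's 333MHz front-side bus, so a second channel has little left to feed unless an integrated GPU is also drawing from memory.

# Back-of-the-envelope peak bandwidth figures (GB/s, where 1 GB = 10^9 bytes).
# Illustrative only; real-world throughput is lower due to protocol overhead.

def peak_bandwidth_gbs(transfers_per_sec_millions, bus_width_bits):
    """Peak bandwidth = transfer rate x bus width."""
    return transfers_per_sec_millions * 1e6 * (bus_width_bits / 8) / 1e9

ddr333_single = peak_bandwidth_gbs(333, 64)    # ~2.7 GB/s (PC2700)
ddr333_dual   = peak_bandwidth_gbs(333, 128)   # ~5.3 GB/s
ddr400_single = peak_bandwidth_gbs(400, 64)    # ~3.2 GB/s (PC3200)
ddr400_dual   = peak_bandwidth_gbs(400, 128)   # ~6.4 GB/s

# Athlon XP "333MHz" FSB: 166MHz clock, double-pumped, 64-bit EV6 bus.
athlon_xp_fsb = peak_bandwidth_gbs(333, 64)    # ~2.7 GB/s

print(f"DDR333 single/dual: {ddr333_single:.1f} / {ddr333_dual:.1f} GB/s")
print(f"DDR400 single/dual: {ddr400_single:.1f} / {ddr400_dual:.1f} GB/s")
print(f"Athlon XP 333FSB:   {athlon_xp_fsb:.1f} GB/s")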

Corsair DIMMs were our memory of choice for this comparison - 2 x 256MB modules (left) for DualDDR and 1 x 512MB module (right) run at the same timings

The nForce2 chipset behaves quite similarly; in most applications, the benefit of moving to 128-bit DDR mode (DualDDR) is under 3% - less than the normal run-to-run variance in these benchmarks. Unlike the original nForce, however, there are some exceptions to the rule with nForce2.

Let's start out by taking a look at the performance gains resulting from running in DualDDR mode:

DDR vs. DualDDR

Benchmark | DualDDR333 vs. DDR333 (% Gain) | DualDDR400 vs. DDR400 (% Gain)
Content Creation Winstone 2002 | 1.3% | 1.1%
Business Winstone 2002 | 0.7% | 0.6%
SYSMark 2002 - Internet Content Creation | 3.2% | -0.4%
SYSMark 2002 - Office Productivity | 0.0% | 0.0%
3DSMAX5 - SinglePipe2.max | 0.5% | 0.0%
3DSMAX5 - Underwater_Environment_Finished.max | 0.0% | 0.0%
Maya 4.0.1 - Rendertest | 1.4% | 0.0%
Lightwave 7.5 - Raytrace | 0.2% | 0.0%
Lightwave 7.5 - Radiosity Reflective Things | 0.2% | 0.1%
XMpeg DiVX/MPEG-4 Encoding | 0.5% | 1.9%
LAME MP3 Encoding | 0.0% | 0.0%
UnrealTournament 2003 Demo Flyby | 0.5% | 0.3%
Jedi Knight 2 | 1.4% | 1.3%
Serious Sam 2 | 0.3% | 0.2%
Comanche 4 | 1.2% | 0.5%
SPECviewperf 7 - 3dsmax-01 | 8.3% | 5.1%
SPECviewperf 7 - ugs-01 | 23.0% | 15.6%
SPECviewperf 7 - proe-01 | 20.5% | 7.7%
SPECviewperf 7 - drv-08 | 2.1% | 0.8%
SPECviewperf 7 - dx-07 | 16.9% | 11.3%
SPECviewperf 7 - light-05 | 0.3% | 0.1%
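
For reference, the percentages above are simply the dual-channel score expressed relative to the single-channel baseline. A minimal sketch in Python, using made-up example scores rather than our actual results:

def percent_gain(dual_channel_score, single_channel_score):
    """Relative improvement of the dual-channel result over the single-channel baseline."""
    return (dual_channel_score / single_channel_score - 1) * 100

# Hypothetical scores for illustration only (higher = better):
print(f"{percent_gain(36.9, 30.0):.1f}%")   # 23.0% - the kind of jump ugs-01 shows
print(f"{percent_gain(101.3, 100.0):.1f}%") # 1.3% - typical of the Winstone-class gains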

Everything here pretty much follows our hypothesis, with the exception of SPECviewperf, where we see some incredible performance gains when moving to DualDDR. Boosts of over 20% in some cases are larger than what you'd get from upgrading an Athlon XP 2000+ to a new 2800+, so why are they isolated to SPECviewperf and not the rest of the benchmark suite?

In order to figure out exactly what was going on, we went back to NVIDIA with our data hoping for an explanation; we got that and much more in return.

First of all, it turns out that as the nForce2 chipset, BIOSes and motherboards matured through numerous revisions, the performance difference between single-channel and dual-channel DDR configurations narrowed. NVIDIA's original performance numbers for DDR333 vs. DualDDR333 showed 5 - 8% gains in business/content creation tests, while now we're seeing numbers well under 3%.

Secondly, quite a bit has been improved under the hood of NVIDIA's nForce2.
