DivX Encoding Performance

We have been using a DivX encoding test as part of our CPU benchmarking suite for quite some time now; however, the test has never been truly realistic, as it wasn't geared toward producing a high-quality DivX rip - rather, it was designed to stress CPU performance.

We have since revised our benchmark and now follow the DivX 5 encoding guide published at Doom9.net. For our test title we use Chapter 9 from The Sum of All Fears DVD. We run a 2-pass encoding process and report the encoded FPS from both passes averaged together. The results are lower than in our previous Xmpeg tests, but they are much more applicable to real-world usage.
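For clarity, the reported figure is just the two passes' encoding rates averaged together. A minimal Python sketch (the function name and the sample FPS values are illustrative, not taken from our results):

```python
def average_encode_fps(pass_fps):
    """Arithmetic mean of the encoded FPS reported by each encoding pass."""
    return sum(pass_fps) / len(pass_fps)

# Hypothetical example: pass 1 encodes at 42.0 FPS, pass 2 at 38.0 FPS
print(average_encode_fps([42.0, 38.0]))  # → 40.0
```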

Hyper-Threading, along with other features of the NetBurst architecture, gives Intel the performance advantage in video encoding, as you can see from our DivX results above. The Athlon 64 3400+ performs respectably, but it isn't the best CPU for these sorts of tasks.

38 Comments

  • Pumpkinierre - Tuesday, January 06, 2004 - link

#15 I haven't heard AMD call the 3000+ NewCastle, and other sites don't refer to it as such. Many sites say that the die size and transistor count are the same (193mm2, 105 million) as the 3200+. Hardtecs4u has CPU-Z shots of the 3000+ and 3400+:

    http://www.hardtecs4u.com/reviews/2003/amd_athlon6...

    http://www.hardtecs4u.com/reviews/2004/amd_athlon6...

Same family, same stepping, same revision and code name: ClawHammer. I haven't found out whether the cache associativity has been cut down from 16, but that is a minor point.

If it looks the same and smells the same, don't be prudish - call it the same.
    Reply
  • JohnrrDrake - Tuesday, January 06, 2004 - link

    I appreciate the VisualStudio compiler test.

As a developer, I am much more interested in how much compile time is saved than in how many more FPS I get.

You may also consider throwing in a GCC compile test (the Linux kernel is fairly typical).

    Reply
  • KF - Tuesday, January 06, 2004 - link

    >reljam is right, if it isn't CPU limited,
    >why include it? Or why not lower the res?
    Because it tells the truth about what people should expect? Gamers might like to know that they can use a slower (and cheaper) CPU. Very few people even have a graphics card as good as the testers used, so they don't even need CPUs like these. The real problem with benchmarks is how (or whether) they apply to real use. That makes null results like this important, IMO. Boring maybe. But useful.

    Older games were included, and they show a CPU difference. I suppose that is because the newer games use the GPU for things that older games did with the CPU.

    >But I think AMD may have shot themselves in the foot...

    Companies that are not in a monopoly position need to put forth the best product they can at the time, or else get crushed by the competition. If that means some products are short-lived...well it's better than losing. A few months at the top is actually not bad the way things go. If you won't beat yourself, then the competition will.

I remember Intel putting out pin-incompatible PPGA, FCPGA, and FCPGA2 PII/IIIs in quick succession, and then the totally incompatible P4. That was Intel scrambling under AMD's close competition.
    Reply
  • KristopherKubicki - Tuesday, January 06, 2004 - link

By AMD's definition, NewCastle is the same core as ClawHammer with half the cache. Regardless of whether it is a completely different core or not, the performance is going to be the same between a half-cache ClawHammer and a NewCastle.

    Kristopher
    Reply
  • dvinnen - Tuesday, January 06, 2004 - link

    reljam is right, if it isn't CPU limited, why include it? Or why not lower the res?

And are y'all sure this is NewCastle? I always figured it was a ClawHammer with half the cache turned off. If you take the heat spreader off, it should be easy to tell. Seeing the bang-for-buck comparisons in a few weeks, after the price drops start to take effect, would be nice.

atlr: they did some 64-bit to 32-bit comparisons in one of the Opteron reviews.
    Reply
  • atlr - Tuesday, January 06, 2004 - link

    Has anyone seen any performance comparisons of 32-bit versus 64-bit compiled programs?
    Reply
  • reljam - Tuesday, January 06, 2004 - link

    The AquaMark DX9, Halo (both benchmarks), and GunMetal are graphics-limited benchmarks and are not adding any value to the review.
    Reply
  • Lonyo - Tuesday, January 06, 2004 - link

    Why run the games at 1024x768? I know it gives more real-world performance numbers, but if a game is limited by the graphics card, what use does the benchmark have in a CPU article (like the AquaMark benchmark numbers, though not the CPU part)?
    Things such as UT2K3 are valid, and Comanche 4 would be very good for looking at differences (since, unless I am mistaken, it used to be CPU-limited most of the time).
    Of course, time constraints may be the issue, but it seems in a way wasteful to do tests which are more GPU tests than CPU tests.
    But otherwise, a very good article in terms of the non-gaming stuff, and it shows that the 3000+ is probably the best buy of the lot: good performance at a nice price.
    Reply
  • EddNog - Tuesday, January 06, 2004 - link

    Well, on NewEgg when you go to buy your CPU, check the memory (single-channel DDR vs. dual-channel DDR), the number of pins/socket type (Socket 754 vs. Socket 939) and, most importantly, the cache size (512KB vs. 1MB).

    -Ed
    Reply
  • Icewind - Tuesday, January 06, 2004 - link

    I wish they would give the freaking NewCastle Athlons a different name, because how the hell are you supposed to tell the difference when they have the same damn name?

    Something tells me the 939-pin Athlon FX next year will change things.
    Reply
