DivX 6.8.5 with Xmpeg 5.0.3

Our DivX test is the same DivX / Xmpeg 5.0.3 test we've run for the past few years now. The 1080p source file is encoded using the unconstrained DivX profile, the quality/performance slider is set to a balanced 5, and enhanced multithreading is enabled:

DivX 6.8.5 w/ Xmpeg 5.0.3 - MPEG-2 to DivX Transcode

DivX encoding performance improved by around 9%: definitely tangible, and definitely worth waiting for if you're about to buy a new Atom-based system.
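The DivX settings above live in Xmpeg's GUI, so there's no exact command-line equivalent, but if you want to approximate the workload yourself, the sketch below uses ffmpeg's mpeg4 encoder (MPEG-4 ASP, the same family as DivX) as a stand-in. The file names, quantizer value, and thread count are illustrative assumptions, not the benchmark's actual configuration:

    import subprocess

    SOURCE = "source_1080p.mpg"  # hypothetical MPEG-2 input
    OUTPUT = "divx_out.avi"

    # Approximate the DivX transcode: decode the 1080p MPEG-2 source and
    # re-encode it as MPEG-4 ASP. ffmpeg's mpeg4 encoder stands in for the
    # DivX codec itself, which is only configurable through Xmpeg's GUI.
    subprocess.run([
        "ffmpeg", "-i", SOURCE,
        "-c:v", "mpeg4",   # MPEG-4 ASP, the family DivX belongs to
        "-q:v", "5",       # fixed quantizer, a rough stand-in for the quality/performance slider
        "-threads", "4",   # crude analogue of DivX's enhanced multithreading
        OUTPUT,
    ], check=True)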

x264 HD Video Encoding Performance

Graysky's x264 HD test uses the publicly available x264 codec (an open source H.264 encoder) to encode a 4Mbps 720p MPEG-2 source. The focus here is on quality rather than speed, so the benchmark uses a 2-pass encode and reports the average frame rate for each pass.
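The benchmark itself is a packaged script, but conceptually each run boils down to a two-pass x264 invocation like the minimal sketch below. The source file name, target bitrate, and output handling are assumptions for illustration; x264's --pass, --bitrate, and --stats switches are its real rate-control interface:

    import subprocess

    SOURCE = "source_720p.y4m"  # hypothetical pre-decoded 720p source
    STATS = "x264_stats.log"

    # Pass 1 analyzes the source and writes rate-control statistics;
    # pass 2 reads them back to spend bits where they matter most.
    for pass_num in (1, 2):
        subprocess.run([
            "x264",
            "--pass", str(pass_num),
            "--bitrate", "4000",  # illustrative target bitrate in kbps
            "--stats", STATS,
            "-o", "out.264" if pass_num == 2 else "NUL",  # discard pass-1 output (Windows null device)
            SOURCE,
        ], check=True)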

x264 HD Encode Benchmark - 720p MPEG-2 to x264 Transcode (First Pass)

x264 HD Encode Benchmark - 720p MPEG-2 to x264 Transcode (Second Pass)

Windows Media Encoder 9 x64 Advanced Profile

In order to be codec agnostic, we also have a Windows Media Encoder benchmark that performs the same sort of transcode as the DivX and x264 tests, but using WME instead.
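WME is normally driven through its GUI, but Windows Media Encoder 9 also ships with a scripting front-end (WMCmd.vbs) around its COM encoder engine. As a hedged sketch of what a scripted transcode could look like, assuming the default install path and the basic -input/-output switches; the benchmark's actual Advanced Profile settings are not reproduced here:

    import subprocess

    # Assumed default install path for WME 9's command-line helper script;
    # adjust for your system.
    WMCMD = r"C:\Program Files\Windows Media Components\Encoder\WMCmd.vbs"

    subprocess.run([
        "cscript", WMCMD,
        "-input", "source_1080p.mpg",  # hypothetical MPEG-2 source
        "-output", "transcode.wmv",
    ], check=True)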

Windows Media Encoder 9 x64 - Advanced Profile Transcode

Pine Trail holds a 9% advantage over the older Atom 330 in our Windows Media Encoder test.

SYSMark & Photoshop Performance 3D Rendering Performance
Comments Locked

41 Comments

View All Comments

  • Shadowmaster625 - Monday, December 21, 2009 - link

    Why doesn't AMD just take one of their upcoming mobile 880 series northbridges and add a memory controller and a single Athlon core? It would be faster than Atom, more efficient than Ion, and could be binned for low power. Instead they just stand there with their thumbs up their butts while Intel shovels this garbage onto millions of unsuspecting consumers at even higher profit margins.
  • JarredWalton - Monday, December 21, 2009 - link

    The problem is that even single core Athlons are not particularly power friendly. I'm sure they could get 5-6 hours of battery life if they tried hard, but Atom can get twice that.
  • Hector1 - Monday, December 21, 2009 - link

    Do you want some whine with that? Where were you when chipsets were created by taking a bunch of smaller ICs on the motherboard and putting them all together into one IC? PCs became cheaper and faster. We thought it was great. Do you know anything about L2 cache? It used to be separate on the motherboard as well until it was integrated into the CPU. PCs became cheaper and faster and we thought it was great. Remember when CPUs were solo? They became Duo & Quad, making PCs faster and dropping price/performance. AMD & Intel integrated the memory controller and, whoa, guess what? Faster & lower price/performance and, yes, we thought it was great. It's called Moore's Law and it's all been part of the semiconductor revolution that's been going on since the '60s. GPUs are no different. They're still logic gates made out of transistors, and with new 32nm technology, then 22nm and 16nm, the graphics logic will be integrated as well. Seriously, what did you think would happen?
  • TETRONG - Monday, December 21, 2009 - link

    Do you understand that Moore's Law is not a force of nature?

    Intel has artificially handicapped the low-voltage sector in order to force consumers to purchase Pentiums. Right where they wanted you all along.

    Since when is it ok for Intel to dictate what type of systems are created with processors?

    First it was the 1GB of RAM limitation, now you can't have a dual-core. When does it end?

    "We have a mediocre CPU, combined with a below average GPU-according to our amortization schedule you could very well have it in the year 2013(after the holidays of course), by which time we should have our paws all over the video encoding and browsing standards, which we'll be sure to make as taxing as possible. Official release of USB 3.0 will be right up in a jiff!
  • Voldenuit - Monday, December 21, 2009 - link

    The historical examples you cite are not analogous, because Intel bundling their anemic GPUs onto the package makes performance *worse*, and bundling the two dies onto a single package (they're not on the same chip, either, so there is no hard physical limitation) makes competing IGPs more expensive, since you now have to pay for a useless Intel IGP as well as a 3rd party one if you were going to buy an IGP system.

    And just because a past course of action was embraced by the market does not mean it was not anti-competitive.
  • bnolsen - Saturday, December 26, 2009 - link

    Performance is worse?? As far as I can see the bridge requires no heat sink and the CPU can be cooled passively. Power use went way down. For this platform, that is improved performance.

    My current Atom netbooks do fine playing Flash on their screens and play 720p H.264 MKV files just fine.

    If you want a bunch of power use and heat, just skip the Ion platform and go with a Core 2-based system.
  • Hector1 - Monday, December 21, 2009 - link

    You need to re-read the tech articles. Pineview does integrate both the graphics and memory controller into the CPU. It's the ICH that remains separate. Even if it didn't, what do you think will happen when this goes to 32nm, 22nm and 16nm? As for performance, Anand says in the title "Pine Trail Boosts Performance, Cuts Power" so that's good enough for me.

    Intel obviously created the Atom for a low cost, low power platform and they're delivering. It'll continue to be fine-tuned with more integration to lower costs. The market obviously wants it. SoC (System on a Chip) designs are coming too, for even lower costs. Not the place for high performance graphics, I think.

    This is really about Moore's Law marching on. It's driven down prices, increased performance and lowered power more than anything else on the planet. Without it, we'd still be paying the $5000 I paid for my 1st PC in 1980 -- an Apple II Plus. What you're saying, whether you know it or not, is that we should stop advancing processes and stop Moore's Law. Personally, I'd like to see us not stop at 45nm and keep going.
  • kaoken - Monday, December 21, 2009 - link

    I agree that progress should be made, but bundling an intel cpu and IGP into a chip is anti-competitive. I wouldn't mind though if there were an intel cpu and ati/nvidia on a chip.
  • JonnyDough - Tuesday, December 22, 2009 - link

    Hector is right in one respect, and that is that if Intel is going to be dumb, we don't have to purchase their products. I especially like the sarcastic cynicism in the article when mentioning all the things that Intel's chip CAN'T do. They just don't know how to make a GPU without patent infringement. If they can't compete, they'll try using their big market share to hurt competition. Classic Intel move. They never did care about innovation, only about market share and money. But I guess that's what happens when you're a mega corp with lots of stockholder expectations and pressure. I'll give my three cheers to the underdogs!
  • overvolting - Monday, December 21, 2009 - link

    Hear Hear
