Crysis: Warhead

Our first graphics test is Crysis: Warhead, which in spite of its relatively high system requirements is the oldest game in our test suite. Crysis was the first game to really make use of DX10, and it set a very high bar for modern games that still hasn't been completely cleared. And while its age means it's not heavily played these days, it's a great reference for how far GPU performance has come since 2008. For an iGPU to run Crysis at a playable framerate at all is a significant accomplishment, and even more so if it can do so at better than the Performance (low) quality preset.

[Benchmark charts: Crysis: Warhead - Frost Bench]

While Crysis on the HD 4000 was downright impressive, the HD 2500 is significantly slower.

Metro 2033

Our next graphics test is Metro 2033, another graphically challenging game. Since Ivy Bridge's GPU is the first from Intel to feature DX11 capabilities, this is also the first time an Intel GPU has been able to run Metro in DX11 mode. Like Crysis, this is a game that is traditionally unplayable on Intel iGPUs, even in DX9 mode.
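
To make the "DX11 mode" distinction concrete: a renderer typically chooses between its DX9/DX10/DX11 paths by asking the Direct3D runtime which feature level the hardware supports. The sketch below is a minimal, hypothetical illustration of that probe (it is not Metro 2033's actual renderer code):

```cpp
// Minimal, hypothetical sketch: probing for a Direct3D 11 device before
// choosing a render path. Illustrative of the general pattern only.
#include <d3d11.h>
#include <cstdio>

int main()
{
    // Feature levels in order of preference; the runtime grants the first
    // one the hardware supports.
    const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_11_0,  // needed for the DX11 path (what IVB adds)
        D3D_FEATURE_LEVEL_10_0,  // DX10-class fallback
        D3D_FEATURE_LEVEL_9_3,   // DX9-class fallback
    };
    D3D_FEATURE_LEVEL granted = {};
    ID3D11Device* device = nullptr;

    HRESULT hr = D3D11CreateDevice(
        nullptr,                     // default adapter (the iGPU here)
        D3D_DRIVER_TYPE_HARDWARE,
        nullptr, 0,
        requested,
        sizeof(requested) / sizeof(requested[0]),
        D3D11_SDK_VERSION,
        &device, &granted, nullptr);

    if (SUCCEEDED(hr) && granted >= D3D_FEATURE_LEVEL_11_0)
        std::printf("DX11 render path available\n");
    else
        std::printf("falling back to the DX9/DX10 path\n");

    if (device) device->Release();
    return 0;
}
```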

[Benchmark charts: Metro 2033]

DiRT 3

DiRT 3 is our next DX11 game. Developer Codemasters Southam added DX11 functionality to their EGO 2.0 engine back in 2009 with DiRT 2, and while it doesn't make extensive use of DX11 it does use it to good effect, applying tessellation to certain environmental models and utilizing a better ambient occlusion lighting model. As a result DX11 functionality is very cheap from a performance standpoint, meaning it doesn't require a GPU that excels at DX11 feature performance.
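
Part of why optional tessellation is cheap to offer is that Direct3D 11 exposes it as two extra pipeline stages (hull and domain shaders) that can be bound or left null on a per-draw basis. Below is a hypothetical sketch of such a toggle; the function and shader objects are our own illustration, not anything from the EGO engine:

```cpp
// Hypothetical sketch of toggling DX11 tessellation per draw. The hull and
// domain shader stages are simply bound, or left null. Not EGO engine code.
#include <d3d11.h>

void DrawEnvironmentModel(ID3D11DeviceContext* ctx, bool useTessellation,
                          ID3D11VertexShader* vs, ID3D11PixelShader* ps,
                          ID3D11HullShader* hs, ID3D11DomainShader* ds,
                          UINT indexCount)
{
    ctx->VSSetShader(vs, nullptr, 0);
    ctx->PSSetShader(ps, nullptr, 0);

    if (useTessellation) {
        // DX11 path: feed patches to the hull/domain stages, which
        // subdivide the model on the GPU.
        ctx->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_3_CONTROL_POINT_PATCHLIST);
        ctx->HSSetShader(hs, nullptr, 0);
        ctx->DSSetShader(ds, nullptr, 0);
    } else {
        // DX9/DX10-style path: plain triangle lists, tessellation off.
        ctx->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_TRIANGLELIST);
        ctx->HSSetShader(nullptr, nullptr, 0);
        ctx->DSSetShader(nullptr, nullptr, 0);
    }
    ctx->DrawIndexed(indexCount, 0, 0);
}
```

The expensive part is the shader work itself, not the plumbing, which fits the observation that DiRT 3's DX11 features cost little to enable.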

[Benchmark charts: DiRT 3]

Portal 2

Portal 2 continues to be the latest and greatest Source engine game to come out of Valve's offices. While Source remains a DX9 engine, and hence is designed to allow games to be playable on a wide range of hardware, Valve has continued to upgrade it over the years to improve its quality, and combined with their choice of art style you'd have a hard time telling the engine is over seven years old at this point. From a rendering standpoint Portal 2 isn't particularly geometry heavy, but it does make plenty of use of shaders.

It's worth noting however that this is the one game where we encountered something that may be a rendering error with Ivy Bridge. Based on our image quality screenshots Ivy Bridge renders a distinctly "busier" image than Llano or NVIDIA's GPUs. It's not clear whether this is causing an increased workload on Ivy Bridge, but it's worth considering.

[Benchmark charts: Portal 2]

Ivy Bridge's processor graphics struggles with Portal 2, and the HD 2500's move to fewer EUs doesn't help things at all.

Battlefield 3

Its popularity aside, Battlefield 3 may be the most interesting game in our benchmark suite for a single reason: it was the first AAA title to require a DX10 or later GPU. Consequently it makes no attempt to shy away from pushing the graphics envelope, and it pushes GPUs to their limits in the process. Even at low settings Battlefield 3 is a handful, and being able to run it on an iGPU would no doubt make quite a few traveling gamers happy.

[Benchmark chart: Battlefield 3]

The HD 4000 delivered a nearly acceptable experience in single player Battlefield 3, but the HD 2500 falls well below that mark. At just under 20 fps, this isn't very good performance. It's clear the HD 2500 is not made for modern-day gaming, never mind multiplayer Battlefield 3.

Starcraft 2

Our next game is Starcraft II, Blizzard’s 2010 RTS megahit. Starcraft II is a DX9 game that is designed to run on a wide range of hardware, and given the growth in GPU performance over the years it's often CPU limited before it's GPU limited on higher-end cards.

[Benchmark charts: Starcraft 2 - GPU Bench]

Starcraft 2 performance is borderline at best on the HD 2500. At low enough settings it can deliver a tolerable experience, but at anything beyond that it's simply not fast enough.

Skyrim

Bethesda's epic sword & magic game The Elder Scrolls V: Skyrim is our RPG of choice for benchmarking. It's altogether a good CPU benchmark thanks to its complex scripting and AI, but it also can end up pushing a large number of fairly complex models and effects at once. This is a DX9 game so it isn't utilizing any of IVB's new DX11 functionality, but it can still be a demanding game.

[Benchmark charts: The Elder Scrolls V: Skyrim]

At lower quality settings, Intel's HD 4000 passed the threshold for playable in Skyrim, at least on average. The HD 2500 is not in the same league, however: at 21.5 fps performance is marginal at best, and when you crank the resolution up to 1680 x 1050 the HD 2500 simply falls apart.

Minecraft

Switching gears for the moment we have Minecraft, our OpenGL title. It's no secret that OpenGL usage on the PC has fallen by the wayside in recent years, and as far as major games go, Minecraft is one of but a few recently released major titles using OpenGL. Minecraft is incredibly simple—not even utilizing pixel shaders, let alone more advanced hardware—but this doesn't mean it's easy to render. Its use of massive numbers of blocks (and the overdraw that creates) means you need solid hardware and an efficient OpenGL implementation if you want to hit playable framerates with a far render distance. Consequently, as the most successful OpenGL game in quite some number of years (at over 5.5 million copies sold), it's a good reminder for GPU manufacturers that OpenGL is not to be ignored.
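
Some back-of-the-envelope arithmetic shows why a far render distance is so demanding. In the sketch below the chunk dimensions (16 x 16 x 256 blocks) match the game, while the per-setting view radii are our approximations for Minecraft builds of this era:

```cpp
// Back-of-the-envelope estimate of how much world Minecraft keeps loaded
// at each render distance. Chunk dimensions match the game; the chunk
// radii per setting are approximations for builds of this era.
#include <cstdio>

int main()
{
    const long long blocksPerChunk = 16LL * 16LL * 256LL;  // 65,536 blocks

    struct Setting { const char* name; long long radiusChunks; };
    const Setting settings[] = { {"Short", 4}, {"Normal", 8}, {"Far", 16} };

    for (const Setting& s : settings) {
        long long side   = 2 * s.radiusChunks + 1;   // chunks per axis
        long long chunks = side * side;
        long long blocks = chunks * blocksPerChunk;
        std::printf("%-6s %5lld chunks  ~%lld blocks loaded\n",
                    s.name, chunks, blocks);
    }
    // "Far" works out to roughly 1,089 chunks (~71 million blocks); even
    // after hidden-face culling, the surviving geometry and overdraw are a
    // heavy load for an iGPU and its OpenGL driver.
    return 0;
}
```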

[Benchmark chart: Minecraft]

Our test here is pretty simple: we're looking at a lush forest after the world finishes loading. Ivy Bridge's processor graphics maintains a significant performance advantage over the Sandy Bridge generation, making this one of the only situations where the HD 2500 is able to significantly outperform Intel's HD 3000. Minecraft is very much the exception, however; whatever advantage we see here is purely architectural.

Civilization V

Our final game, Civilization V, gives us an interesting look at things other RTSes cannot match, with a much weaker focus on shading in the game world and a much greater focus on creating the geometry needed to bring such a world to life. In doing so it uses a slew of DirectX 11 technologies, including tessellation for said geometry, driver command lists for reducing CPU overhead, and compute shaders for on-the-fly texture decompression. There are other games that are more stressful overall, but this is likely the game that stresses DX11 performance the most.
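
The texture decompression piece follows the standard DirectCompute pattern: bind the packed data as a shader resource, bind a writable texture as an unordered access view, and dispatch one thread group per tile. Below is a minimal, hypothetical sketch of that pattern; the shader, resource names, and dimensions are our own, not Firaxis' actual code:

```cpp
// Hypothetical sketch of the DirectCompute pattern Civilization V leans on:
// compressed texture in, uncompressed texture out, one thread group per
// tile. Not Firaxis' actual code.
#include <d3d11.h>

void DecompressTexture(ID3D11DeviceContext* ctx,
                       ID3D11ComputeShader* decompressCS,       // hypothetical shader
                       ID3D11ShaderResourceView* compressedSrv, // packed source data
                       ID3D11UnorderedAccessView* outputUav,    // writable target texture
                       UINT widthInTiles, UINT heightInTiles)
{
    ctx->CSSetShader(decompressCS, nullptr, 0);
    ctx->CSSetShaderResources(0, 1, &compressedSrv);
    ctx->CSSetUnorderedAccessViews(0, 1, &outputUav, nullptr);

    // One thread group per tile; the group size itself is declared in the
    // shader with [numthreads(...)].
    ctx->Dispatch(widthInTiles, heightInTiles, 1);

    // Unbind the UAV so the texture can be sampled by later draw calls.
    ID3D11UnorderedAccessView* nullUav = nullptr;
    ctx->CSSetUnorderedAccessViews(0, 1, &nullUav, nullptr);
}
```

Decompressing on the fly like this trades compute throughput for memory footprint and bandwidth, which is exactly the kind of DX11 feature performance this benchmark leans on.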

[Benchmark charts: Civilization V]

Civilization V turned in an extremely weak showing on the HD 4000 when we looked at it last month, and it's even worse on the HD 2500. Civ players need not bother with Intel's processor graphics; go AMD or discrete.

Comments

  • etamin - Thursday, May 31, 2012 - link

    I just glossed through the charts (will read article tomorrow), but I noticed there are no Nehalems in the comparisons. It would be nice if both a Bloomfield and a Gulftown were thrown in. If Phenom IIs are still there, Nehalem shouldn't be THAT outdated right? Anyways, I'm sure the article is great. Thanks for your hard work and I look forward to reading this at work tomorrow :)
  • SK8TRBOI - Thursday, May 31, 2012 - link

    I agree with etamin - if Phenom is in there, a great Intel benchmark cpu would be the Nehalem i7-920 D0 OC'd to 3.6Ghz - I'd wager a significant percentage of Anand's readers (myself included!) still have this technological wonder in our everyday rigs. The i7-920 would be a good 'reference' for us all when evaluating/comparing performance.
    Thanks, and awesome article, as always!
  • CeriseCogburn - Thursday, May 31, 2012 - link

    It's always "best" here to forget about other Intel and nVidia - as if they suddenly don't exist anymore - it makes amd appear to shine.
    Happens all the time. Every time.
    I suppose it's amd's evil control - or the yearly two week island vacation (research for reviewers of course)
  • LancerVI - Thursday, May 31, 2012 - link

    Throw me into the list that agrees.

    Still running an i7 920 C0. The Nehalems being in the chart would've been nice.
  • jordanclock - Thursday, May 31, 2012 - link

    That's what Bench is for.
  • HanzNFranzen - Thursday, May 31, 2012 - link

    I have to agree as well. I have an i7 920 C0 and often wonder how it stacks up today against Ivy Bridge. I'm thinking that holding off for Haswell is a safe bet even though I have the upgrade itch! It's been 3 years, which is great to have gotten this much mileage out of my current system, but I wanna BUILD SOMETHING!!
  • CeriseCogburn - Monday, June 11, 2012 - link

    It's a whole video card tier of frame rate difference in games, plus you have SATA 6Gb/s and USB 3.0 to think about, not to mention PCIe 3.0 to be ready for when needed.

    Buy your stuff and sell your old parts to keep it worth it.
  • jwcalla - Thursday, May 31, 2012 - link

    I'm guessing the stock idle power consumption number is with EIST disabled?

    I've been waiting for some of these lower-powered IVB chips to come out to build a NAS. Was thinking a Core i3 (if they ever get around to releasing it), or maybe the lowest Xeon. Though at this point I might just bite the bullet and wait for 64-bit ARM... or go with a Cortex-A15 maybe.
  • ShieTar - Thursday, May 31, 2012 - link

    If a Cortex-A15 gives you enough computing power, you should also be happy with a Pentium or even a Celeron. The i3 is already rather overkill for a simple NAS.

    I have a fileserver with Windows Home Server running on a Pentium G620, and it has absolutely no problem pushing 120 MB/s over a GBit Ethernet switch from a RAID-0 pack of HDDs while running µtorrent, Thunderbird, Miranda and Firefox on the side. Power consumption of the complete system is around 40-50W in idle, and I haven't even shopped for specifically low-power components but used a lot of leftovers.
  • BSMonitor - Thursday, May 31, 2012 - link

    Yeah, the CPU is just 1 part of the power consumption puzzle.. And since in "file sharing" mode, it will almost always be in a low power/idle state... An ARM CPU would show little improvement..

    But if you ever offloaded any kind of work to that box, you'd have wasted your money with an ARM box, as no ARM processor will ever match real task performance of any of the x86 processors.
