Crysis: Warhead

Our first graphics test is Crysis: Warhead, which in spite of its relatively high system requirements is the oldest game in our test suite. Crysis was the first game to really make use of DX10, and set a very high bar for modern games that still hasn't been completely cleared. And while its age means it's not heavily played these days, it's a great reference for how far GPU performance has come since 2008. For an iGPU to even run Crysis at a playable framerate is a significant accomplishment, and even more so if it can do so at better than performance (low) quality settings.

[Charts: Crysis: Warhead - Frost Bench]

While Crysis on the HD 4000 was downright impressive, the HD 2500 is significantly slower.

Metro 2033

Our next graphics test is Metro 2033, another graphically challenging game. Since IVB is the first Intel GPU to feature DX11 capabilities, this is the first time an Intel GPU has been able to run Metro in DX11 mode. Like Crysis this is a game that is traditionally unplayable on Intel iGPUs, even in DX9 mode.
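As an aside, detecting that capability from software is straightforward. The sketch below is a minimal C++ probe using the Direct3D 11 runtime's documented device-creation path; the feature-level list and output formatting are our own illustration, not anything a particular game does.

    #include <d3d11.h>   // link against d3d11.lib
    #include <cstdio>

    int main() {
        // Request levels highest-first; the runtime reports the best one
        // the adapter/driver pair actually supports.
        const D3D_FEATURE_LEVEL requested[] = {
            D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_10_1,
            D3D_FEATURE_LEVEL_10_0, D3D_FEATURE_LEVEL_9_3,
        };
        D3D_FEATURE_LEVEL achieved = D3D_FEATURE_LEVEL_9_1;
        // Passing null for the device and context pointers is the documented
        // way to query the feature level without keeping a live device.
        HRESULT hr = D3D11CreateDevice(
            nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
            requested, sizeof(requested) / sizeof(requested[0]),
            D3D11_SDK_VERSION, nullptr, &achieved, nullptr);
        if (SUCCEEDED(hr))
            std::printf("Max feature level: 0x%04x\n",
                        static_cast<unsigned>(achieved));
        return SUCCEEDED(hr) ? 0 : 1;
    }

On the HD 4000 and HD 2500 this should report 0xb000 (feature level 11_0), whereas Sandy Bridge's processor graphics tops out at 10_1.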

[Charts: Metro 2033]

DiRT 3

DiRT 3 is our next DX11 game. Developer Codemasters Southam added DX11 functionality to their EGO 2.0 engine back in 2009 with DiRT 2, and while it doesn't make extensive use of DX11 it does use it to good effect in order to apply tessellation to certain environmental models along with utilizing a better ambient occlusion lighting model. As a result DX11 functionality is very cheap from a performance standpoint, meaning it doesn't require a GPU that excels at DX11 feature performance.

[Charts: DiRT 3]

Portal 2

Portal 2 continues to be the latest and greatest Source engine game to come out of Valve's offices. While Source continues to be a DX9 engine, and hence is designed to allow games to be playable on a wide range of hardware, Valve has continued to upgrade it over the years to improve its quality, and combined with their choice of style you’d have a hard time telling it’s over 7 years old at this point. From a rendering standpoint Portal 2 isn't particularly geometry heavy, but it does make plenty of use of shaders.

It's worth noting however that this is the one game where we encountered something that may be a rendering error with Ivy Bridge. Based on our image quality screenshots Ivy Bridge renders a distinctly "busier" image than Llano or NVIDIA's GPUs. It's not clear whether this is causing an increased workload on Ivy Bridge, but it's worth considering.

[Charts: Portal 2]

Ivy Bridge's processor graphics struggles with Portal 2. A move to fewer EUs doesn't help things at all.

Battlefield 3

Its popularity aside, Battlefield 3 may be the most interesting game in our benchmark suite for a single reason: it was the first AAA DX10+ exclusive game, shipping with no DX9 fallback. Consequently it makes no attempt to shy away from pushing the graphics envelope, pushing GPUs to their limits in the process. Even at low settings Battlefield 3 is a handful, and being able to run it on an iGPU would no doubt make quite a few traveling gamers happy.

[Chart: Battlefield 3]

The HD 4000 delivered a nearly acceptable experience in single player Battlefield 3, but the HD 2500 falls well below that. At just under 20 fps, this isn't very good performance. It's clear the HD 2500 is not made for modern day gaming, never mind multiplayer Battlefield 3.

Starcraft 2

Our next game is Starcraft II, Blizzard’s 2010 RTS megahit. Starcraft II is a DX9 game that is designed to run on a wide range of hardware, and given the growth in GPU performance over the years it's often CPU limited before it's GPU limited on higher-end cards.

[Charts: Starcraft 2 - GPU Bench]

Starcraft 2 performance is borderline at best on the HD 2500. At low enough settings it can deliver an okay experience, but beyond that the HD 2500 simply isn't fast enough.

Skyrim

Bethesda's epic sword & magic game The Elder Scrolls V: Skyrim is our RPG of choice for benchmarking. It's altogether a good CPU benchmark thanks to its complex scripting and AI, but it also can end up pushing a large number of fairly complex models and effects at once. This is a DX9 game so it isn't utilizing any of IVB's new DX11 functionality, but it can still be a demanding game.

[Charts: The Elder Scrolls V: Skyrim]

At lower quality settings, Intel's HD 4000 definitely passed the threshold for playable in Skyrim on average. The HD 2500, however, is not in the same league. At 21.5 fps performance is marginal at best, and when you crank up the resolution to 1680 x 1050 the HD 2500 simply falls apart.

Minecraft

Switching gears for the moment we have Minecraft, our OpenGL title. It's no secret that OpenGL usage on the PC has fallen by the wayside in recent years, and as far as major games go Minecraft is one of but a few recently released major titles using OpenGL. Minecraft is incredibly simple—not even utilizing pixel shaders, let alone more advanced hardware—but this doesn't mean it's easy to render. Its use of massive amounts of blocks (and the overdraw that creates) means you need solid hardware and an efficient OpenGL implementation if you want to hit playable framerates with a far render distance. Consequently, as the most successful OpenGL game in quite some number of years (at over 5.5 million copies sold), it's a good reminder for GPU manufacturers that OpenGL is not to be ignored.
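To put rough numbers on the overdraw claim, here is a back-of-the-envelope C++ sketch showing how the visible world grows quadratically with render distance. The per-setting chunk radii and the 128-block world height are illustrative assumptions, not Mojang's exact figures.

    #include <cstdio>

    int main() {
        // Assumed chunk radii per render-distance setting (for illustration).
        const struct { const char* name; long radius; } settings[] = {
            {"Short", 4}, {"Normal", 8}, {"Far", 16},
        };
        for (const auto& s : settings) {
            // A radius of r chunks means a (2r+1) x (2r+1) square of columns.
            long columns = (2 * s.radius + 1) * (2 * s.radius + 1);
            // Each column is 16x16 blocks and up to 128 blocks tall here.
            long blocks = columns * 16 * 16 * 128;
            std::printf("%-6s -> %ld chunk columns, up to ~%ld blocks\n",
                        s.name, columns, blocks);
        }
        return 0;
    }

Even before culling, doubling the radius quadruples the geometry the renderer has to consider, which is why an efficient OpenGL driver matters so much here.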

[Chart: Minecraft]

Our test here is pretty simple: we're looking at a lush forest after the world finishes loading. Ivy Bridge's processor graphics maintains a significant performance advantage over the Sandy Bridge generation, making this one of the only situations where the HD 2500 is able to significantly outperform Intel's HD 3000. Minecraft is definitely the exception rather than the rule, however, as whatever advantage we see here is purely architectural.

Civilization V

Our final game, Civilization V, gives us an interesting look at things that other RTSes cannot match, with a much weaker focus on shading in the game world, and a much greater focus on creating the geometry needed to bring such a world to life. In doing so it uses a slew of DirectX 11 technologies, including tessellation for said geometry, driver command lists for reducing CPU overhead, and compute shaders for on-the-fly texture decompression. There are other games that are more stressful overall, but this is likely the game that stresses DX11 performance the most.
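Of those features, native driver command list support is the one that varies most from vendor to vendor; when it's absent the D3D11 runtime emulates command lists in software, blunting the CPU-overhead savings. Below is a minimal C++ sketch of the documented capability query; the program scaffolding is our own example, not Firaxis' code.

    #include <d3d11.h>   // link against d3d11.lib
    #include <cstdio>

    int main() {
        ID3D11Device* device = nullptr;
        // Create a device on the default adapter at whatever level it offers.
        if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE,
                                     nullptr, 0, nullptr, 0, D3D11_SDK_VERSION,
                                     &device, nullptr, nullptr)))
            return 1;

        // FALSE here means the runtime, not the driver, handles the feature.
        D3D11_FEATURE_DATA_THREADING threading = {};
        device->CheckFeatureSupport(D3D11_FEATURE_THREADING,
                                    &threading, sizeof(threading));
        std::printf("concurrent creates: %d, driver command lists: %d\n",
                    threading.DriverConcurrentCreates,
                    threading.DriverCommandLists);
        device->Release();
        return 0;
    }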

[Charts: Civilization V]

Civilization V was an extremely weak showing for the HD 4000 when we looked at it last month, and it's even worse on the HD 2500. Civ players need not bother with Intel's processor graphics; go with an AMD APU or a discrete GPU instead.

Intel's HD 2500 & Quick Sync Performance Intel HD 2500: Compute, Synthetics & Power
Comments Locked

67 Comments

View All Comments

  • JarredWalton - Thursday, May 31, 2012

    Intel actually has a beta driver (tested on the Ultrabook) that improves Portal 2 performance. I expect it will make its way to the public driver release in the next month. There are definitely still driver performance issues to address, but even so I don't think HD 4000 has the raw performance potential to match Trinity unless a game happens to be CPU intensive.
  • n9ntje - Thursday, May 31, 2012

    Don't forget memory bandwidth. Both the CPU and GPU use the same memory on the motherboard.
  • tacosRcool - Thursday, May 31, 2012

    kinda a waste in terms of graphics
  • paraffin - Thursday, May 31, 2012

    With 1920×1080 being the standard these days I find it annoying that all AT tests continue to ignore it. Are you trying to goad monitor makers back into 16:10 or something?
  • Sogekihei - Monday, June 4, 2012

    The 1080p resolution may have become standard for televisions, but it certainly isn't so for computer monitors. These days the "standard" computer monitor (meaning, what an OEM rig will ship with in most cases whether it's a desktop or notebook) is some variant of 1366x768 resolution, so that gets tested for low-end graphics options that are likely to be seen in cheap OEM desktops and most OEM laptops (such as the integrated graphics seen here).

    The 1680x1050 resolution was the highest end-user resolution available cheaply for a while and is kind of a standard among tech enthusiasts. Sure, you had other offerings available like some (expensive) 1920x1200 CRTs, but most people's budget left them sticking to 1280x1024 CRTs or cheap LCDs, or if they wanted a slightly higher quality LCD, practically the only available resolution at the time was 1680x1050. A lot of people don't care enough about the quality of their display to upgrade it as frequently as performance-oriented parts, so many of us still have at least one 1680x1050 lying around, probably in use as a secondary or for some even a primary display, despite 1080p monitors being the same cost or lower when purchased new.
  • Beenthere - Thursday, May 31, 2012

    I imagine with the heat/OC'ing issues with the trigate chips, Intel is working to resolve Fab as well as operational issues with IB and thus isn't ramping as fast as normal.
  • Fritsert - Thursday, May 31, 2012

    Would the HQV score of the HD2500 be the same as the HD4000 in the Anandtech review? Basically would video playback performance be the same (HQV, 24fps image enhancement features etc.)?

    A lot of processors in the low power ivy bridge lineup have the HD2500. If playback quality is the same this would make those very good candidates for my next HTPC. The Core i5 3470T specifically.
  • cjs150 - Friday, June 8, 2012

    Also, does the HD2500 lock to the correct FPS rate, which is not exactly 24 FPS? AMD has had this for ages but Intel only caught up with the HD4000. For me it is the difference between an i7-3770T and an i5-3470T.
  • Affectionate-Bed-980 - Thursday, May 31, 2012

    This is a replacement for the i5-2400. Actually the 3450 was, but this is 100MHz faster. You should be comparing HD2000 vs HD2500 as well, as these aren't top tier models with the HD3000/4000.
  • bkiserx7 - Thursday, May 31, 2012

    In the GPU Power Consumption comparison section, did you disable HT and lock the 3770k to the same frequency as the 3470 to get a more accurate comparison between just the HD 4000 and HD 2500?
