Intel's HD 2500 & Quick Sync Performance

What makes the Core i5-3470 particularly interesting to look at is the fact that it features Intel's HD 2500 processor graphics. The main difference between the HD 2500 and HD 4000 is the number of EUs (execution units) on-die:

Intel Processor Graphics Comparison
              Intel HD 2500    Intel HD 4000
EUs           6                16
Base Clock    650MHz           650MHz
Max Turbo     1150MHz          1150MHz

At 6 EUs, Intel's HD 2500 has the same number of compute resources as the previous-generation HD 2000. In fact, Intel claims performance should be around 10-20% faster than the HD 2000 in 3D games. Given that even the HD 4000 is only getting close to the minimum level of 3D performance we'd like to see from Intel, chances are the HD 2500 will not impress. We'll get to quantifying that shortly, but the good news is that Quick Sync performance is retained:

[Chart: CyberLink Media Espresso 6.5 - Harry Potter 8 Transcode]

The HD 2500 actually does a little better than our HD 4000 here, but that's just normal run-to-run variance. Quick Sync does rely heavily on the EU array for transcode work, but it looks like this workload isn't heavy enough to distinguish the 6 EU HD 2500 from the 16 EU HD 4000. If your only use for Intel's processor graphics is transcoding, the HD 2500 appears indistinguishable from the HD 4000.

The bad news is I can't say the same about its 3D graphics performance.
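
A rough sense of why, using only the EU counts and clocks from the comparison table above (a back-of-envelope sketch in Python; it ignores memory bandwidth, fixed-function hardware, and driver efficiency, so treat it as an upper bound on the gap):

    # Relative peak EU throughput, taken straight from the table above
    hd2500 = 6 * 1150     # EUs x max turbo clock (MHz)
    hd4000 = 16 * 1150
    print(f"HD 4000 has {hd4000 / hd2500:.2f}x the raw EU throughput of the HD 2500")
    # -> 2.67x, since both parts run their EUs at the same clocks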

67 Comments

  • JarredWalton - Thursday, May 31, 2012 - link

    Intel actually has a beta driver (tested on the Ultrabook) that improves Portal 2 performance. I expect it will make its way to the public driver release in the next month. There are definitely still driver performance issues to address, but even so I don't think HD 4000 has the raw performance potential to match Trinity unless a game happens to be CPU intensive.
  • n9ntje - Thursday, May 31, 2012 - link

    Don't forget memory bandwidth. Both the CPU and GPU use the same memory on the motherboard.
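
To put a rough number on that shared pool, here is a back-of-envelope sketch assuming dual-channel DDR3-1600 (the fastest memory Ivy Bridge officially supports; not a figure from the review):

    # Theoretical peak memory bandwidth shared by the CPU cores and the iGPU
    channels = 2
    bytes_per_transfer = 8        # each channel is 64 bits wide
    transfers_per_sec = 1600e6    # DDR3-1600
    peak_gbs = channels * bytes_per_transfer * transfers_per_sec / 1e9
    print(f"{peak_gbs:.1f} GB/s")  # -> 25.6 GB/s for everything on the chip
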
  • tacosRcool - Thursday, May 31, 2012 - link

    kinda a waste in terms of graphics
  • paraffin - Thursday, May 31, 2012 - link

    With 1920×1080 being the standard these days, I find it annoying that all AT tests continue to ignore it. Are you trying to goad monitor makers back into 16:10 or something?
  • Sogekihei - Monday, June 4, 2012 - link

    The 1080p resolution may have become standard for televisions, but it certainly isn't so for computer monitors. These days the "standard" computer monitor (meaning what an OEM rig will ship with in most cases, whether it's a desktop or notebook) is some variant of 1366x768, so that is what gets tested for the low-end graphics options likely to be seen in cheap OEM desktops and most OEM laptops (such as the integrated graphics seen here).

    The 1680x1050 resolution was the highest resolution available cheaply to end users for a while and is something of a standard among tech enthusiasts. Sure, there were other offerings like some (expensive) 1920x1200 CRTs, but most people's budgets left them sticking with 1280x1024 CRTs or cheap LCDs, and if they wanted a slightly higher quality LCD, practically the only resolution available at the time was 1680x1050. A lot of people don't care enough about their display to upgrade it as frequently as performance-oriented parts, so many of us still have at least one 1680x1050 panel lying around, probably in use as a secondary (or for some even a primary) display despite 1080p monitors costing the same or less when purchased new.
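
For reference, the relative pixel counts of the resolutions mentioned above, since fill-rate and shading cost scale roughly with pixels drawn (a quick illustrative sketch, not data from the review):

    # Pixel counts of the resolutions discussed in the comments above
    for w, h in [(1366, 768), (1680, 1050), (1920, 1080)]:
        pixels = w * h
        print(f"{w}x{h}: {pixels / 1e6:.2f} MP ({pixels / (1366 * 768):.2f}x the pixels of 1366x768)")
    # -> 1.05 MP (1.00x), 1.76 MP (1.68x), 2.07 MP (1.98x)
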
  • Beenthere - Thursday, May 31, 2012 - link

    I imagine that with the heat/overclocking issues on the tri-gate chips, Intel is working to resolve fab as well as operational issues with IB, and thus isn't ramping as fast as normal.
  • Fritsert - Thursday, May 31, 2012 - link

    Would the HQV score of the HD 2500 be the same as the HD 4000's in the AnandTech review? Basically, would video playback quality be the same (HQV score, 24 fps playback, image enhancement features, etc.)?

    A lot of processors in the low-power Ivy Bridge lineup have the HD 2500. If playback quality is the same, that would make them very good candidates for my next HTPC, the Core i5-3470T specifically.
  • cjs150 - Friday, June 8, 2012 - link

    Also, does the HD 2500 lock to the correct frame rate, which is not exactly 24 fps? AMD has had this for ages, but Intel only caught up with the HD 4000. For me it is the difference between an i7-3770T and an i5-3470T.
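
For a sense of why locking to exactly the right rate matters here: film content runs at 24/1.001 fps (~23.976), so a display refreshing at a flat 24.000 Hz slowly drifts out of step with it (a back-of-envelope illustration):

    # Drift between ~23.976 fps film content and a flat 24.000 Hz refresh
    film_fps = 24 / 1.001          # ~23.976 fps
    display_hz = 24.0
    drift_per_second = display_hz - film_fps   # frames of error accumulated per second
    print(f"one dropped or repeated frame roughly every {1 / drift_per_second:.0f} seconds")
    # -> ~42 seconds, which shows up as a periodic stutter during playback
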
  • Affectionate-Bed-980 - Thursday, May 31, 2012 - link

    This is a replacement for the i5-2400. Actually the 3450 was, but this is 100MHz faster. You should be comparing HD 2000 vs. HD 2500 as well, since these aren't top-tier models with the HD 3000/4000.
  • bkiserx7 - Thursday, May 31, 2012 - link

    In the GPU Power Consumption comparison section, did you disable HT and lock the 3770k to the same frequency as the 3470 to get a more accurate comparison between just the HD 4000 and HD 2500?
