Intel's first 22nm CPU, codenamed Ivy Bridge, is off to an odd start. Intel unveiled many of the quad-core desktop and mobile parts last month, but only sampled a single chip to reviewers. Dual-core mobile parts are announced today, as are their ultra-low-voltage counterparts for use in Ultrabooks. One dual-core desktop part gets announced today as well, but the bulk of the dual-core lineup won't surface until later this year. Furthermore, Intel only revealed the die size and transistor count of a single configuration: a quad-core with GT2 graphics.

Compare this to the Sandy Bridge launch a year prior, where Intel sampled four different CPUs and gave us a detailed breakdown of die sizes and transistor counts for quad-core, dual-core, and GT1/GT2 configurations. Why the change? Various factions within Intel management have different feelings about how much or how little information should be shared. It's also true that at the highest levels there's a bit of paranoia about the threat ARM poses to Intel in the long run. Combine the two and you can see how some folks at Intel might feel it's better to be a bit more guarded. I don't agree, but this is the hand we've been dealt.

Intel also introduced a new part into the Ivy Bridge lineup while we weren't looking: the Core i5-3470. At the Ivy Bridge launch we were told about a Core i5-3450, a quad-core CPU clocked at 3.1GHz with Intel's HD 2500 graphics. The 3470 is nearly identical, but runs 100MHz faster. We're often hard on AMD for introducing SKUs separated by only 100MHz and a handful of dollars, so it's worth pointing out that Intel is doing exactly the same thing here. It's possible that 22nm yields are better than expected and the 3470 will simply take the place of the 3450 quickly. The two are technically priced the same, so I can see this happening.

Intel 2012 CPU Lineup (Standard Power)

| Processor | Core Clock | Cores / Threads | L3 Cache | Max Turbo | Intel HD Graphics | TDP | Price |
|---|---|---|---|---|---|---|---|
| Intel Core i7-3960X | 3.3GHz | 6 / 12 | 15MB | 3.9GHz | N/A | 130W | $999 |
| Intel Core i7-3930K | 3.2GHz | 6 / 12 | 12MB | 3.8GHz | N/A | 130W | $583 |
| Intel Core i7-3820 | 3.6GHz | 4 / 8 | 10MB | 3.9GHz | N/A | 130W | $294 |
| Intel Core i7-3770K | 3.5GHz | 4 / 8 | 8MB | 3.9GHz | 4000 | 77W | $332 |
| Intel Core i7-3770 | 3.4GHz | 4 / 8 | 8MB | 3.9GHz | 4000 | 77W | $294 |
| Intel Core i5-3570K | 3.4GHz | 4 / 4 | 6MB | 3.8GHz | 4000 | 77W | $225 |
| Intel Core i5-3550 | 3.3GHz | 4 / 4 | 6MB | 3.7GHz | 2500 | 77W | $205 |
| Intel Core i5-3470 | 3.2GHz | 4 / 4 | 6MB | 3.6GHz | 2500 | 77W | $184 |
| Intel Core i5-3450 | 3.1GHz | 4 / 4 | 6MB | 3.5GHz | 2500 | 77W | $184 |
| Intel Core i7-2700K | 3.5GHz | 4 / 8 | 8MB | 3.9GHz | 3000 | 95W | $332 |
| Intel Core i5-2550K | 3.4GHz | 4 / 4 | 6MB | 3.8GHz | 3000 | 95W | $225 |
| Intel Core i5-2500 | 3.3GHz | 4 / 4 | 6MB | 3.7GHz | 2000 | 95W | $205 |
| Intel Core i5-2400 | 3.1GHz | 4 / 4 | 6MB | 3.4GHz | 2000 | 95W | $195 |
| Intel Core i5-2320 | 3.0GHz | 4 / 4 | 6MB | 3.3GHz | 2000 | 95W | $177 |

The 3470 does support Intel's vPro, SIPP, VT-x, VT-d, AES-NI and Intel TXT, so you're getting a fairly full-featured SKU with this part. It isn't fully unlocked, however: the maximum overclock is only four bins (400MHz) above the default max turbo frequencies. The table below summarizes what you can get out of a 3470:

Intel Core i5-3470

| Number of Cores Active | 1C | 2C | 3C | 4C |
|---|---|---|---|---|
| Default Max Turbo | 3.6GHz | 3.6GHz | 3.5GHz | 3.4GHz |
| Max Overclock | 4.0GHz | 4.0GHz | 3.9GHz | 3.8GHz |
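The partial-unlock rule is simple enough to express as a quick calculation. This is just a sketch of the arithmetic; the 100MHz bin size and the 3470's per-core turbo frequencies are taken from the table above, and the function name is my own:

```python
# Partial-unlock rule for the Core i5-3470: the maximum overclock is
# four bins (4 x 100MHz) above the default max turbo at each core count.
BIN_MHZ = 100
DEFAULT_TURBO_MHZ = {1: 3600, 2: 3600, 3: 3500, 4: 3400}  # cores active -> MHz

def max_overclock_mhz(cores_active: int, bins: int = 4) -> int:
    """Highest allowed frequency for a given number of active cores."""
    return DEFAULT_TURBO_MHZ[cores_active] + bins * BIN_MHZ

for cores in sorted(DEFAULT_TURBO_MHZ):
    print(f"{cores}C: {max_overclock_mhz(cores) / 1000:.1f}GHz")
# -> 1C: 4.0GHz, 2C: 4.0GHz, 3C: 3.9GHz, 4C: 3.8GHz
```

Note that the ceiling tracks the default turbo table, which is why the 4C limit (3.8GHz) sits below the 1C/2C limit (4.0GHz).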

In practice I had no issues running at the max overclock, even without touching the voltage settings on my testbed's Intel DZ77GA-70K board.

It's really an effortless overclock, but you have to be OK with the knowledge that your chip could likely go even faster were it not for the artificial multiplier limitation. Performance and power consumption at the overclocked frequency are both reasonable:

Power Consumption Comparison

| Intel DZ77GA-70K | Idle | Load (x264 2nd pass) |
|---|---|---|
| Intel Core i7-3770K | 60.9W | 121.2W |
| Intel Core i5-3470 | 54.4W | 96.6W |
| Intel Core i5-3470 @ Max OC | 54.4W | 110.1W |

Power consumption doesn't go up by all that much because we aren't scaling the voltage up significantly to get to these higher frequencies. Performance isn't as good as a stock 3770K's in this well-threaded test simply because the 3470 lacks Hyper-Threading support:
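The reason voltage matters so much more than frequency is that dynamic CPU power scales roughly with C·V²·f: linearly with frequency but with the square of voltage. A small sketch of that relationship, using hypothetical voltages rather than anything measured in this review:

```python
# Dynamic power scales roughly as C * V^2 * f, so the relative power of a
# new operating point versus an old one is (V_new/V_old)^2 * (f_new/f_old).
def relative_power(v_new: float, f_new: float, v_old: float, f_old: float) -> float:
    """Dynamic power of the new (V, f) point relative to the old one."""
    return (v_new / v_old) ** 2 * (f_new / f_old)

# Same voltage, 3.4GHz -> 3.8GHz: power rises only with frequency, ~12%.
print(f"{relative_power(1.05, 3.8, 1.05, 3.4):.2f}x")
# If voltage also climbs ~5%, the squared term compounds the increase, ~23%.
print(f"{relative_power(1.10, 3.8, 1.05, 3.4):.2f}x")
```

This is why a multiplier bump that stays inside the chip's existing voltage range costs comparatively little power, while pushing past it gets expensive quickly.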

x264 HD Benchmark - 2nd pass - v3.03

Overall we see a 10% increase in performance for a roughly 13% increase in power consumption. Power-efficient frequency scaling is difficult to attain at higher frequencies. Although I didn't increase the default voltage settings for the 3470, at 3.8GHz (the max 4C overclock) the 3470 selects much higher voltages than it would at its stock 3.4GHz turbo frequency.
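The tradeoff is easy to check from the load figures above. Keep in mind these are total-system wall measurements, so the exact percentage rounds a little differently than the CPU-only change would; the ~10% speedup figure is the one quoted for the x264 second pass:

```python
# Perf/W tradeoff of the max overclock, from the table's load numbers:
# 96.6W stock vs 110.1W overclocked, with a ~10% speedup in x264 pass 2.
stock_w, oc_w = 96.6, 110.1
speedup = 1.10  # ~10% faster at the max 4C overclock

power_ratio = oc_w / stock_w            # system power increase under load
efficiency = speedup / power_ratio      # performance per watt vs. stock
print(f"power: {power_ratio:.2f}x, perf/W: {efficiency:.2f}x")
```

Performance per watt lands slightly below 1.0x, which is the expected shape of the curve: you pay a little efficiency for the extra frequency, but not much, because the voltage barely moves.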

66 Comments

  • shin0bi272 - Friday, June 01, 2012 - link

    any gamer with a good quad core doesn't need to upgrade their CPU. Who's going to spend hundreds of dollars to upgrade from another quad core (like, let's say, my i7 920) to this one for a whopping 7 fps in one game and 1 fps in another? That sounds like something an Apple fanboy would do... oh look, the new iSuch-and-Such is out and it's marginally better than the one I spent $x00 on last month, I have to buy it now! No thanks.
  • Sogekihei - Monday, June 04, 2012 - link

    This really depends a lot on what you have (or want) to do with your computer. Architectural differences are obviously a big deal or else instead of an i7-920 you'd probably be rocking a Phenom (1) x4 or Core2 Quad by your logic that having a passable quad core means you don't need to upgrade your processor until the majority of gaming technology catches up.

    Let's take the bsnes emulator as an example here, it focuses on low-level emulation of the SNES hardware to reproduce games as accurately as possible. With most new version releases, the hardware requirements gradually increase as more intricate backend code needs to execute within the same space of time to avoid dropping framerates; being that these games determined their running speed by their framerate and being sub-60 or sub-50 (region-dependent) means running at less than full speed, this could eventually be a problem for somebody wanting to use such an accurate emulator. From what I've heard, most Phenom and Phenom II systems are very bogged down and can barely get any games running at full speed on it these days and from my own experience, Nehalem-based Intel chips either require ludicrous clock speeds or simply aren't capable of running certain games at full speed (such as Super Mario RPG.) Obviously in cases such as this, the performance increases from a new architecture could benefit a user greatly.

    Another example I'll give is based on the probability through my own experiences dealing with other people that the vast majority of gamers DO use their rigs for other tasks too. Any intensive work with maths, spreadsheets, video or image editing and rendering, software development, blueprinting, or anything else you could name that people do on a computer nowadays instead of by hand in order to speed the process will see massive gains when moving to a faster processor architecture. For anybody that has such work to do, be it for a very invested-in hobby, as part of a post-secondary education course, or as part of their career, the few hundred dollars/euros/currency of choice it costs to update their system is easily worth the potentially hundreds or thousands of hours per upgrade cycle they may save through the more powerful hardware.

    I will concede that in today's market, the majority of gaming-exclusive cases don't yield much better results from increasing a processor's power (usually being GPU-limited instead) however that's a very broad statement and doesn't account for things that are heavily multithreaded (like newer Assassin's Creed games) or that are very processor-intensive (which I believe Civilization V can qualify as in mid- to late-game scenarios.)

    There will always be case-specific conditions which will make buying something make sense or not, but do try to keep in mind that a lot of people do have disposable income and will very likely end up putting it into their hobbies before anything else. If their hobbies deal with computers they're likely going to want to always have, to the best extent they can afford, the latest and greatest technology available. Does it mean your system is trash? Of course not. Does it mean they're stupid? No moreso than the man that puts $10 a week into his local lottery and never wins anything. It just comes down to you having different priorities from them.

    The only other thing I want to address is your stance on Apple products. Yes the hipsters are annoying, but you would likely lose the war if you wanted to argue on the upgrade cycle users take with Mac OSX-based computers. New product generations only come about once a year or so and most users wait 2-3 generations before upgrading and quite a few wait much longer than the average Linux/Windows PC user will before upgrading. The ones that don't wait are usually professionals in some sort of graphic arts industry (such as photography) where they need the most processing power, memory, graphics capabilities, and battery life possible and it's a justified business expense.
  • CeriseCogburn - Monday, June 11, 2012 - link

    People usually skip a generation - so from i7 920 we can call it one gen with SB being nearly the same as IB, so you're correct.

    But anyone on Core 2 or Phenom II or Athlon X2 or X4, yeah they could do it and be happy - and don't forget the SATA 6Gbps and USB 3.0 they get with the new board - so it's not just the CPU with IB and SB - you get hard drive and USB speed too.

    So with those extras it could drive a few people in your position - sata 6 card and usb 3 card is half the upgrade cost anyway, so add in pci-e 3 as well. I see some people moving from where you are.
  • ClagMaster - Saturday, June 02, 2012 - link

    The onboard graphics of the Ivy Bridge processors was never seriously intended for playing games. It is intended to replace chipset graphics to support office applications with large LCD monitors. And it adds transcoding capabilities.

    @Anand: If you want to do a more meaningful comparison of graphics performance for those that might be doing gaming, why not test and compare some DX9 games (still being written) or titles available 5 years ago? Real people play these games because they are cheap or free and provide as much entertainment as DX10 or DX11 games. Frame rates will be 60fps or slightly better. Or will your sponsors at nVidia, AMD or Intel not permit this sort of comparison?

    It's ridiculous to compare onboard graphics to discrete graphics performance. A dedicated GPU, optimized for graphics, will always beat an onboard graphics GPU for a given gate size.

    The Ivy Bridge graphics (performance/power consumption), if I interpret these comparisons correctly, is also inefficient compared to the processing capabilities of a discrete graphics card.
  • vegemeister - Wednesday, June 06, 2012 - link

    As you mentioned, I'd like to see some mention of the 2D performance. I use Awesome WM on a 3520x1200 X screen, and smooth scrolling can sometimes get choppy with my Graphics My Ass GPU.

    I'd like to upgrade my Core2 duo, but I'm not sure whether the HD2500 graphics in this chip will suffice, or if I need to be looking at higher end CPUs. I don't really care about the difference between shitty 3D and ho-hum 3D.
  • P39Airacobra - Tuesday, July 01, 2014 - link

    It's a shame that they still sell the GT 520 and GT 610 and the ATi 5450. When an integrated GPU like the HD 2500 outperforms a dedicated GPU, it's time to retire them from the market. I bought a 3470 and I am running an R9 270 with 8GB of 1600 Ripjaws. I tried out the HD 2500 on the chip just to see how it would do, and it honestly sucked, but for videos and gaming on very low settings it works; it actually surprised me. But I don't think I could ever stand to have an integrated GPU. What's the point in buying an i5 if you are only going to use the integrated GPU? It does not make sense. You may as well keep your old P4 if you are not going to add a real GPU to it. This is why I don't understand the point of an integrated GPU inside a high-end processor.
