It’s about time we got an interesting value processor to review from Intel. I simulated one in our Clarkdale review, but today I’m bringing you a full review of the most interesting dual-core Westmere for the desktop - the Core i3 530.

Priced at $113 (and selling for about $125 on the street), the 530 runs at 2.93GHz and features no turbo modes. It’ll run at 1.33GHz at its lowest frequency, and no faster than 2.93GHz at full load. The missing turbo boost is almost all you sacrifice, as the 530 still has a hefty 4MB L3 cache shared between both cores. Each core gets a 256KB 10-cycle L2, just like the i5s and i7s.

The un-core is clocked at 2.13GHz, down from 2.40GHz in the i5. That should hurt performance a bit compared to our simulated i3 in the launch article. Aside from turbo, the other thing you give up with the i3 is AES acceleration: Westmere's AES New Instructions (AES-NI) are disabled on all of the i3s, in typical Intel fashion. There has to be some reason for users to opt for a Core i5 instead.
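
For the curious, checking whether a given chip actually exposes AES-NI only takes a look at its CPU feature flags. Below is a minimal Python sketch, assuming a Linux box, that simply looks for the "aes" flag in /proc/cpuinfo; a Core i3-530 should come back False, while the Core i5/i7 Clarkdales come back True. (On Windows, a utility like CPU-Z reports the same thing in its supported-instructions list.)

```python
# Minimal sketch, assuming Linux: report whether the CPU advertises AES-NI
# by looking for the "aes" flag in /proc/cpuinfo. A Core i3-530 should
# print False; the Core i5/i7 Clarkdales should print True.

def has_aes_ni(cpuinfo_path="/proc/cpuinfo"):
    with open(cpuinfo_path) as f:
        for line in f:
            if line.startswith("flags"):
                flags = line.split(":", 1)[1].split()
                return "aes" in flags
    return False

if __name__ == "__main__":
    print("AES-NI supported:", has_aes_ni())
```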

Processor Core Un-core GPU Cores / Threads L3 Cache Max Turbo TDP Price
Intel Core i5-670 3.46GHz 2.40GHz 733MHz 2 / 4 4MB 3.73GHz 73W $284
Intel Core i5-661 3.33GHz 2.40GHz 900MHz 2 / 4 4MB 3.60GHz 87W $196
Intel Core i5-660 3.33GHz 2.40GHz 733MHz 2 / 4 4MB 3.60GHz 73W $196
Intel Core i5-650 3.20GHz 2.40GHz 733MHz 2 / 4 4MB 3.46GHz 73W $176
Intel Core i3-540 3.06GHz 2.13GHz 733MHz 2 / 4 4MB N/A 73W $133
Intel Core i3-530 2.93GHz 2.13GHz 733MHz 2 / 4 4MB N/A 73W $113
Intel Pentium G6950 2.80GHz 2.00GHz 533MHz 2 / 2 3MB N/A 73W $87

Sitting next to the 32nm CPU die is a 45nm GPU/memory controller die.

Like the GPU in the majority of Core i5 processors, the one here runs at 733MHz. The exception is our recently reviewed 661, which runs its GPU at 900MHz for those who want that extra bit of mediocre gaming performance.

From Intel, the closest competitor is the Core 2 Duo E7600, which runs at 3.06GHz but with a 3MB L2 cache. AMD provides the biggest threat with its Athlon II X4 630 and Phenom II X2 550 BE. The latter isn't on AMD's official price list, but you can still find it online today for $99.

In a market full of good alternatives, whether it’s an ultra-cheap quad-core or a solid dual-core, it’s time to find out if there’s any value in the Core i3 530.

Fixes Since Last Time

There were two outstanding issues in our Clarkdale review that needed fixing after CES. First and foremost was power consumption. We incorrectly assumed that Clarkdale's idle power consumption was worse than Lynnfield's due to the 45nm on-package chipset. As many of you pointed out, it was an issue with our ASUS H57 motherboard. After CES we switched over to Gigabyte's GA-H57M-USB3 and idle power consumption improved considerably. Since then ASUS appears to have fixed the problem, but our data for this review was still run with Gigabyte's board.

Unfortunately these sorts of issues aren't rare with any new motherboard/chipset release. Our ASUS H57 board had idle power issues, while our Gigabyte H57 board had overclocking issues. No one seems to get it right on the first try.

The second issue that needs correcting is the system power consumption while playing back an x264 video using integrated graphics. Our AMD numbers were unusually high in our initial review, which we've since corrected:

While playing H.264-encoded video the GPU does all of the heavy lifting, so there's no power advantage for Clarkdale to lean on. When watching a movie, the AMD system's power draw is indistinguishable from our Clarkdale test bed's.

We are still running into an issue with MPC-HC and video corruption with DXVA enabled on the 790GX, but haven't been able to fix it yet. Have any of you had issues with video corruption with AMD graphics and the latest stable build of MPC-HC for 64-bit Windows? Or should we chalk it up to being just another day in the AnandTech labs?

Comments

  • Anand Lal Shimpi - Friday, January 22, 2010 - link

    I'm a huge fan of competition, but I'm not sure I want to deal with the 3rd party chipset guys again. As the technologies got more sophisticated we started to see serious problems with all of the 3rd party guys (Remember VIA when we got PCIe? Remember the issues with later nForce chipsets?). As more features get integrated onto the CPU, the role of competition in the chipset space is less interesting to me.

    While I'd love to see competition there, I'm not willing to deal with all of the compatibility and reliability nightmares that happened way too often.

    Take care,
    Anand
  • Penti - Sunday, January 24, 2010 - link

    Also, the competition didn't help with K8 chipset prices, even though the chipset makers no longer had to do a memory controller and some registers/logic moved to the CPU, and even though single-chip chipsets were possible and available.

    I agree, I'm glad to see the third party chipsets gone. It would just be bad in the corporate space today, where features such as remote management need to be built in. (Better to just have that removed from the consumer products than having very different products.) Unless Intel/AMD gives away (sells) IP blocks to the third parties, I don't see the point. Extra features are perfectly capable of being implemented/integrated directly on the motherboard, and that would be enough to give incentive and pressure the chipset maker (Intel or AMD) to better themselves. Unless chipsets move to the SoC model, I guess that will do. It's not like VIA, nVidia and SiS beat Intel when they made chipsets for the QPB bus/P4. Plus I doubt they could make a cheaper southbridge for the Nehalem/DMI platform. It's still SATA, USB, PCIe, Ethernet MAC/PHY, audio and SPI/LPC and other legacy I/O, as well as the video/display output stuff.
  • geok1ng - Friday, January 22, 2010 - link

    The FPS numbers for this Intel IGP are too good to be true. Intel has cheated before with IGPs that didn't render all polygons and textures in order to achieve "competitive" frames per second numbers.

    Hence I request (once again) side-by-side screenshots to put aside any doubts that this "competitive" IGP from Intel has image quality similar to NVIDIA and ATI integrated graphics.
  • silverblue - Friday, January 22, 2010 - link

    I'm still not convinced by the IGP. Those 661 results are a 900MHz sample vs. the 700MHz HD 3300 on the 790GX board, and the 530 uses a 733MHz IGP. In every single case, the AMD solution wins, be it by a small margin (Dragon Age Origins) or a large one (CoD: MW2) against the 530, but in general, AMD's best is probably about 20-25% better, clock for clock, than Intel's - all depending on title of course. Overclocking the new IGP turns out excellent results; however, we're still comparing it to a relatively old IGP from AMD.

    If AMD updates their IGP and bumps up the clock speed, the gap will increase once again. There's nothing to say they can't bring out a 900MHz, 40nm flavour of the 3300 or better now that TSMC have sorted out their production issues. Intel's IGPs are on a 45nm process so AMD may produce something easily superior that doesn't require too much power. However... I'm still waiting.

    Intel are definitely on the right track, though they need to do something about the amount of work per clock.
  • IntelUser2000 - Monday, January 25, 2010 - link

    Silverblue, I don't know how it is on the GMA HD, but at least up to the GMA 4500, the Intel iGPUs were clocked in a similar way to the Nvidia iGPUs: the quoted clocks are for the shader core, while the rest, like the ROPs and TMUs, likely runs lower.

    While for AMD, 700MHz is for EVERYTHING.

    Plus, the 780/785/790 has more transistors than the GMA HD. The AMD chipset has 230 million transistors while the GMA HD package has 170 million.

    All in all, the final performance is what matters.
  • Suntan - Friday, January 22, 2010 - link

    I would agree. I question why it wasn’t compared against the newer (and usually cheaper) 785G mobos (which are ATI HD 4200 based systems).

    -Suntan
  • Anand Lal Shimpi - Friday, January 22, 2010 - link

    I chose to compare it to the 790GX because that's currently the fastest AMD IGP on the market. I wanted to put AMD's best foot forward. The 785G offers literally the same performance as the 780G; the more impressive product name comes from its DirectX 10.1 support.

    The 890GX is the next IGP from AMD. In our Clarkdale review I mentioned that Intel simply getting close to AMD's IGP performance isn't really good enough. Unfortunately the current roadmaps have the 890GX IGP running at 700MHz, just like the 790GX. The only question I have is whether or not the 890GX has any more shader cores (currently at 8). We'll find out soon enough.

    Take care,
    Anand
  • Suntan - Friday, January 22, 2010 - link

    [QUOTE]I chose to compare it to the 790GX because that's currently the fastest AMD IGP on the market.[/QUOTE]

    Fair enough. But your own tests in the 785 article show the two being basically identical in performance. The 790 might have some small advantage in certain places (gaming), but if a person wasn't happy with the 785 in that use, I'd put wagers that they wouldn't be happy with the 790 either.

    At this point, integrated gfx are completely capable for PCs for everything except gaming (and some NLE video editors that use heavy gfx accel for pre-visualizing EFX). In both those cases, even a $50 dedicated card will far overshadow all the integrated options, to the point that I think it isn't right to concentrate on game capability when picking integrated gfx.

    In my opinion, the last area for IGPs to compete in is video acceleration, where the 785 is at parity with or beats the 790 (although they both fail without the inclusion of digital TrueHD/DTS-HD MA support), and at least the 785 is usually cheaper.

    In any case, the new i3 really doesn't impress me *just* because it has integrated gfx on the chip. It seems that the same thing still holds true, namely: if you're building a computer for anything other than gaming, you can probably build one where an AMD will result in the same or better performance for less money (TrueHD/DTS-HD MA notwithstanding). If you're building one for gaming, you're probably not going to be using the integrated gfx anyway.

    -Suntan

  • archcommus - Friday, January 22, 2010 - link

    I don't understand what the big deal is about TrueHD/DTS-HD MA bitstreaming. I've been decoding in software and sending 6 channels over 3 analog cables (is this LPCM?) to my 5.1 system ever since I've owned one, and the sound quality is fantastic. Is there really a perceived quality difference with bitstreaming to a high quality receiver if you own a very high end sound system?
  • eit412 - Friday, January 22, 2010 - link

    TrueHD/DTS-HD MA use lossless compression, and if you bitstream them to the receiver instead of decoding in software there is less of a chance of interference (the longer the signal is in analog form, the more interference is possible). In most cases the difference is not distinguishable, but people love to say that theoretically my stereo is better than yours.
