Comments Locked

107 Comments


  • a1623363 - Wednesday, March 24, 2010 - link

    Using the same Gigabyte motherboard and same Core i3, I could only overclock my graphics to 900MHz. I pushed to 950MHz, but this caused errors. Serious YMMV on overclocking above 900MHz, especially to 1200MHz+!

    Overall, I have to say I am happy with system performance, but very disappointed in the integrated graphics. Here Intel is barely able to match the performance of ATI or NVIDIA parts from years ago. I have always had an integrated graphics machine and simply chose to play games on lower settings, but now I'm having to buy a Radeon HD 4770, as the performance of Intel's solution doesn't allow you to play anything made in recent years. Not to mention that games like Mass Effect 2 don't support Intel or S3 chipsets, even when Intel HD Graphics is above the minimum system requirements in terms of performance.

    I am keeping my system, but if I were buying again today, knowing that the integrated graphics is sub-par, I would take a closer look at AMD plus a third-party graphics card.
  • partha77 - Thursday, March 4, 2010 - link

    Hi! As a novice in this field, I'd like to know 3 things regarding the Intel Core i3-530 vis-a-vis the Intel Core 2 Duo E7500: 1) which one delivers more performance in general? 2) which one has a longer functional life span? 3) which one has more future upgradability?
  • crochat - Tuesday, February 16, 2010 - link

    It is really strange to me that all the reviews I've read about Intel processors with integrated graphics always test systems with discrete graphics cards. I don't play games and don't see the use of spending money on a graphics card if an IGP can deliver what I need. I suppose a graphics card has an impact not only on idle power consumption figures, but also on load power consumption figures. I wonder how the i3 530 IGP compares with an Athlon II X4 635 with, e.g., a 785G.
  • slikazn09 - Friday, January 29, 2010 - link

    4GHz sounds WHOO! But what temperature does it get up to when pushed to 4GHz? You thoughtfully quoted, in your final word, the options set out, which is pretty helpful (and I like it :]), but what about the heat when overclocked?!
  • slikazn09 - Friday, January 29, 2010 - link

    Buying options* - correcting myself from my last post. Any help would be greatly appreciated :). Should I buy a 3rd-party cooler to ensure long-term stability?
  • piasabird - Friday, January 29, 2010 - link

    It seems a comparison of the E7500 to the i3 would be beneficial. Does the i3 really run faster than an E7500?

  • ericore - Wednesday, January 27, 2010 - link

    Hence double the speed.
    Simple math, ladies and gents.
    Bandwidth is not a factor in this case.

    If you want to check it out for yourself, google: Intel ARK
  • kwrzesien - Tuesday, January 26, 2010 - link

    The Gigabyte GA-H57M-USB3 board has just been posted on Newegg:
    http://www.newegg.com/Product/Product.aspx?Item=N8...
  • marraco - Tuesday, January 26, 2010 - link

    Tom's Hardware has unveiled the awful 2D behavior of the most expensive NVIDIA and ATI cards:

    http://www.tomshardware.com/reviews/2d-windows-gdi...

    They perform slower than integrated chipsets. Sometimes 10X slower.

    I would like to see the same benchmarks on this integrated video.
  • dgingeri - Monday, January 25, 2010 - link

    This would make an excellent home/small business server as well. Low idle power consumption, low price, integrated video, and virtualization all combine to make for an excellent platform for 5-10 users for file sharing, web-based local apps, and minor SQL Server duties.

    I just wonder how it compares to the new AMD chips that came out today in server performance.
  • gfredsen - Monday, January 25, 2010 - link

    I know this borders on thread crapping, but can someone tell me why Fry's is selling the newer 45W AMD CPUs and virtually no one else is?
    I post this here because, for the price, I still believe the AMD solution to be the best for how I use a computer: HTPC and SOHO.
    Add in the price of a motherboard and DDR3, which I don't have, and the Intel i3 still offers me no advantage that I can see.
    I'll pass for now.
  • papapapapapapapababy - Sunday, January 24, 2010 - link

    "From Intel the closest competitor is the Core 2 Duo E7600, which runs at 3.06GHz but with a 3MB L2 cache"

    lol my old retro E6600 runs at 3.2GHz and also has 4MB of L2 cache, and it smokes my 3.2GHz E7300 in Photoshop... intel am fail.
  • smilingcrow - Sunday, January 24, 2010 - link

    If you can't get more than 3.2GHz from your E7300, it sounds like your motherboard = fail.
  • papapapapapapapababy - Sunday, January 24, 2010 - link

    diminishing returns, girl.
  • smilingcrow - Tuesday, January 26, 2010 - link

    Hi girl, I don't really need to know your gender, and the fact that you aren't very good at overclocking might reinforce in some people's minds that the fairer sex isn't so good with tech. Anyway, take a look at the forums at overclockers.com so you can get the best out of that CPU.
  • Fjodor2000 - Saturday, January 23, 2010 - link

    Anand, is there any way you could provide reliable measurements for the Core i3-530's idle power consumption when only the IGP is used? One of your earlier articles (http://www.anandtech.com/cpuchipsets/showdoc.aspx?...) indicated that it could reach as low as 27.6W (!); however, that article gave no details on which Clarkdale CPU or system configuration was used.
  • - Saturday, January 23, 2010 - link

    You're going to need a mini-ITX board, some G.Skill 1.35V memory, an SSD, and a picoPSU (probably with a 100W brick) to have any chance of reaching 27.6W idle.

    I know for a fact that Intel used G.Skill 1.35V memory (there was an article about it somewhere) and I'm sure they used every trick in the book to get the idle power down that low. I looked on Newegg for a mini-ITX board and couldn't find one, so you're probably going to have to wait if you want the ultimate power-sipping HTPC.
  • smilingcrow - Sunday, January 24, 2010 - link

    The G.Skill Eco RAM (1.35V) has a negligible impact on idle power consumption according to the only review I've seen, which isn't surprising as a single RAM stick at stock voltage doesn't consume much to start with. At load the test showed gains of between 3 and 5W.

    I’m not a fan of this site but it’s the only review I could find and probably even they couldn’t screw this up; on second thoughts…
    http://www.fudzilla.com/content/view/16833/1/1/3/
  • Fjodor2000 - Sunday, January 24, 2010 - link

    I see. It would be really interesting to see if it can be reproduced, and just how low it is possible to get the power consumption using such a system as you described.

    Also, I think it would be interesting to see the idle power consumption with the IGP only for a more "normal" system, i.e. a uATX motherboard (without idle power issues!), an Intel i3-530 CPU, 4GB RAM, a ~1TB low-power HDD, and NO external GFX card (the review for some reason currently only contains idle power consumption when used with an external GFX card).

    Anand, do you think it would be possible to run an idle power consumption test for the Core i3-530 setup you used, but without an external GFX card? After all, I suppose that will be the common setup for most i3-530 based systems.
  • kwrzesien - Monday, January 25, 2010 - link

    I just built a 530 system with the Gigabyte GA-H55M-UD2H board, an Arctic Freezer 7, a WD 640GB Blue, an LG BD-ROM/DVD-RW, 4GB of Mushkin (2 x 2GB 1600 C9) RAM, and an Antec 380D Green PSU in a P180 mini case (200mm and 120mm fans on low). Terrific build; everything works great, except the Freezer is blocking the first DIMM slot, so installing a second pair of sticks is going to be a problem.

    Idle power at the Win7 desktop, all stock settings, no discrete video card, is 41W! I was expecting ~60W, so I'm really surprised; this puts my Core i7 920 to shame - but that is also my 'everything' box. I highly recommend this entire setup, except maybe the Freezer - it is just too bulky in the fan cage. Don't go bigger on the PSU unless you need more than one 6-pin PEG connector or are installing 4+ hard drives. Next build (ordering today) will be the same with an SSD and a 1TB Green drive...
  • dandar - Saturday, January 23, 2010 - link

    Bought a 3-pack of Win7 Pro and these things run beautifully. Fast, and Win XP Mode is supported, so every app they had back with Win XP still runs. I'm very happy with this platform from Intel. This was a home run in my opinion.
  • raced295 - Friday, January 22, 2010 - link

    Correct me if I'm wrong, but I thought it was well understood that Intel uses processor power to accelerate GPU functions. In other words, the GPU is not nearly as fast as it appears; the drivers offload work to the CPU since it is very powerful. I will try to post links to support this, but I don't remember them offhand, as I thought this was common knowledge on "in depth" sites such as this. I believe this was going on at least in part on G45, at least with benchmark software. Also that the driver looks for benchmark exes and game exes, manipulates quality, and offloads to the CPU.

    It doesn't change much for the end user, as it's still faster, but bottom line, I don't think the GPU is as good as it appears. Net performance, however, is as good as shown, and this is brilliant of Intel.

    Submitted to the community for thought.
  • MrSpadge - Friday, January 22, 2010 - link

    So the integrated GPU can be overclocked really high. Nice, but we're not gaming on this thing anyway.

    No, what this really means is that the GPU voltage is set unnecessarily high by Intel! We'd be using these integrated GPUs to save power compared to discrete GPUs, not to waste it. It doesn't have sophisticated power management yet, but at least a reasonable voltage would be nice...
  • sandman1687 - Friday, January 22, 2010 - link

    On the fourth page it says that overclocking without increasing the voltage will not increase power consumption. This is not true; while power is directly proportional to frequency, it scales exponentially with voltage. Could you post some idle and load power figures when overclocked at 4GHz and even 3.3GHz? I'm always hesitant to raise the CPU voltage. Thanks.

    -sanders
  • Rajinder Gill - Saturday, January 23, 2010 - link

    As power consumption and efficiency have become more important, and manufacturing processes have improved, pushing a CPU as far as it will go without increasing its core voltage appears to be the most efficient way to overclock. You minimize any increases in power consumption while maximizing performance.

    It does not say the power consumption will not increase, but that you will somewhat minimize the increase by keeping the voltage at stock.

    regards
    Raja
  • MrSpadge - Saturday, January 23, 2010 - link

    You're right, but one also needs to keep the absolute value of the stock voltage in mind. A fine example is the initial 45nm Phenom II chips: reviewers applauded them for being able to OC rather high at stock voltage. However, "stock" in this case meant 1.35 - 1.40V. That's more than or equal to the maximum Intel gave its 65nm chips.

    So either one could argue "great stock voltage OC" or one could say "wasting power by setting an unnecessarily high stock voltage".

    MrS
  • MrSpadge - Friday, January 22, 2010 - link

    P ~ V^2, not exponential. Some leakage currents scale more strongly, but these don't dominate the overall power draw (otherwise the process / design would be pretty bad).
  • sandman1687 - Friday, January 22, 2010 - link

    Sorry, meant quadratic.
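For reference, the relationship being discussed in this sub-thread is that dynamic CPU power scales roughly as P ≈ C · f · V² (switched capacitance times frequency times voltage squared). A quick sketch of what that implies for the i3 530 overclock, using illustrative numbers rather than measured values (the ~1.10V stock voltage below is an assumption; the 0.16V bump is the one mentioned later in the comments):

    # Rough illustration of the scaling discussed above: dynamic power ~ C * f * V^2.
    # All numbers are illustrative assumptions, not measured i3-530 values.

    def relative_dynamic_power(freq_ghz, volts, base_freq_ghz=2.93, base_volts=1.10):
        """Dynamic power relative to a stock baseline, using P ~ f * V^2."""
        return (freq_ghz / base_freq_ghz) * (volts / base_volts) ** 2

    # Overclock to 4.0GHz at stock voltage: power rises only linearly with frequency.
    print(f"4.0GHz @ 1.10V: {relative_dynamic_power(4.0, 1.10):.2f}x baseline")  # ~1.37x

    # Overclock to 4.0GHz with a 0.16V bump: the squared voltage term dominates.
    print(f"4.0GHz @ 1.26V: {relative_dynamic_power(4.0, 1.26):.2f}x baseline")  # ~1.79x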
  • Finally - Friday, January 22, 2010 - link

    I compared power draw under LOAD for the Phenom II 965 BE, and here (http://www.anandtech.com/cpuchipsets/showdoc.aspx?...) you say it draws 274W (sic!), while in your database the same figure is much lower. How come?
  • janm9 - Friday, January 22, 2010 - link

    I am looking for the power consumption of an AMD Athlon X4 620/605e with the 785G chipset. I cannot find a review using just onboard graphics. The AnandTech benchmark database is nice, but it only lists power consumption with either the 5870 or the GeForce GTX 280.

    I myself have run into problems (artifacts) under Windows 7 with both Media Center and MPC-HC in 64-bit with DXVA too.

    Thank you for writing an interesting article :)
  • kwrzesien - Friday, January 22, 2010 - link

    When is that Gigabyte board going to be available? I've already bought two 530s from Microcenter for $99 each and one GA-H55M-UD2H board from Newegg to go into Antec P180s. I'm really hoping to get the USB3 ports and triple-power USB that are on the -USB3 model. It's been announced since mid-December and is on Gigabyte's website, but there's absolutely no sign of it in e-tail yet.
  • Rajinder Gill - Friday, January 22, 2010 - link

    Hi,

    We'll ask about this first thing Monday morning and get back to you with an answer if possible.


    regards
    Raja
  • kwrzesien - Monday, January 25, 2010 - link

    Raja,

    Thanks for looking into this! Still no sign on Newegg as of this morning. I need to order by tomorrow to get my friend's build out; maybe I should just look for USB 3.0 PCIe cards...

    Thanks,
    Kirk
  • Rajinder Gill - Tuesday, January 26, 2010 - link

    Hi.

    I'll paste the response I got back from GB this morning:

    We have 4 H57/H55 models on the NA market currently. The model names and Newegg selling prices are listed below:

    "H57M-USB3: $10+ over the H55M-USB3, waiting for posting from Newegg.

    H55M-USB3: $109.99, http://www.newegg.com/Product/Product.aspx?Item=N8...

    H55M-UD2H: $104.99, http://www.newegg.com/Product/Product.aspx?Item=N8...

    H55M-S2H: $89.99, http://www.newegg.com/Product/Product.aspx?Item=N8..."


    Hope this helps!

    Raja
  • kwrzesien - Tuesday, January 26, 2010 - link

    Raja,

    THANKS! Really, I just couldn't wait, so I ordered another UD2H and a $40 USB3 card; the second build can get upgraded with USB3 later. Looks like the H57M-USB3 would be a better price overall, but then you do lose two USB2 ports from the back panel - I hope they include a 2-port slot adapter on the USB3 model, because they sure don't in the UD2H model, which already has 2 internal USB2 headers.
  • Shadowmaster625 - Friday, January 22, 2010 - link

    An AMD motherboard, with a northbridge and IGP, is cheaper than an H55 motherboard that has no northbridge. I want to know why, and that should be the #1 question when it comes to the i3, and yet you didn't even address this issue.
  • Anand Lal Shimpi - Friday, January 22, 2010 - link

    I mentioned this in an earlier comment. It's the same reason that AMD motherboard prices didn't go down when we moved from the K7 to K8 - chipset prices remained the same.

    The H5x chipsets, despite most of the logic being shifted onto the CPU package, are no cheaper than the previous generation G4x chipsets. Both AMD and Intel have made it very clear that as they integrate more functions onto the CPU, they aren't going to lower chipset prices. Instead, profit margins go up.

    It's a fairly new platform so I'd expect average prices to drop as production ramps up, but that's the main reason the boards aren't any cheaper. I believe you can buy H55 boards for less than $90 on Newegg now, and then there's this ECS board that sells for under $80 (under $70 with MIR) - http://www.newegg.com/Product/Product.aspx?Item=N8...

    Take care,
    Anand
  • tno - Friday, January 22, 2010 - link

    I have a Q9300@3GHz/X48/4890 based system. I'm a pretty average user; the hardest work my CPU does is the occasional HD encode, and the system's hardest daily toll is gaming on one screen with an HD video on the other. I have not seen any slowdown or deficiencies in my system, and don't feel limited in the slightest. I haven't played the newest, most stressful games around, but on the whole, I don't see a big case for making the jump to Nehalem or Clarkdale. Indeed, I feel comfortable sitting on my rig till Sandy Bridge.

    So, am I nuts? Am I missing some hugely compelling reason to make the jump? Is it the efficiency? Or is this tick really not as big a deal as the last tick (Penryn)?

    tno
  • Taft12 - Friday, January 22, 2010 - link

    If you're a "pretty average" user, why did you buy a "pretty high end" motherboard and video card if you don't even play new games?? Shoulda gone for something half the price and upgraded twice as often :)
  • tno - Friday, January 22, 2010 - link

    My wife was on a bunch of away rotations, so I was home alone and bored. That led to a series of purchases that were in tune with the lifestyle of a guy with time and cash to spend. Then my wife came home and suddenly the time and the cash went away. Don't get me wrong, I love my wife, but it was a fun time, and had she been away much longer I'd be cruising with some wicked water-cooled i7 rig.
  • Anand Lal Shimpi - Friday, January 22, 2010 - link

    No you're pretty much right on the money. If you have a good system already, you're always better off waiting until the next big thing. In my eyes Penryn wasn't a very big deal if you already had Conroe. Clarkdale's real advantages are in power consumption and threaded performance. If you already have a quad-core CPU, chances are that you'll want to be looking at Lynnfield/Phenom II X4 or wait for Sandy Bridge if you can.

    Take care,
    Anand
  • tno - Friday, January 22, 2010 - link

    Cool! A reply from the man himself! Thanks, Anand! My leap was from a 2.4GHz Celeron to a PD805 to Penryn, so Penryn seemed like a revelation: highly efficient, easy to cool, fast, and quad-core. Now, if you happen to have any loose systems that you're not using and want to send my way so I can experience the Lynnfield difference myself, I won't object.

    tno
  • kwrzesien - Friday, January 22, 2010 - link

    I had an AMD 1.2GHz single-core with a GeForce 2 MX. It was a HUGE upgrade!
  • lopri - Friday, January 22, 2010 - link

    [QUOTE]We are still running into an issue with MPC-HC and video corruption with DXVA enabled on the 790GX, but haven't been able to fix it yet. Have any of you had issues with video corruption with AMD graphics and the latest stable build of MPC-HC for 64-bit Windows? Or should we chalk it up to being just another day in the AnandTech labs.[/QUOTE]

    Instead of such fleeting one-liners, how about telling us the title, format, and codec in question so that we can verify it? This is a fine example of yellow journalism.

    I'm still waiting for an answer whether 2560x1600 and dual-displays work with these CPUs. Considering the silence, however, I think I know the answer.
  • Anand Lal Shimpi - Friday, January 22, 2010 - link

    It's a Dark Knight rip we use. Take the original Blu-ray, use AnyDVD HD to strip out the DRM, re-encode to reduce file size, and toss it into an MKV container. The problem appears on all H.264 content played through MPC-HC, though.

    As far as resolution support goes, Intel lists 2560 x 1600 as the maximum resolution available over DisplayPort. For DVI/HDMI you're limited to 1920 x 1200. VGA will get you up to 2048 x 1536.

    There are four independent display ports, so in theory you should be able to run one 2560 x 1600 panel and one 1920 x 1200 (or two 25x16 panels if you had a board with dual DisplayPort outputs).

    Take care,
    Anand
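For readers curious about the re-encode step Anand describes above, here is a minimal sketch of that stage using ffmpeg driven from Python. The choice of ffmpeg/x264 is an assumption on my part (the comment doesn't say which encoder was used), and the sketch starts from an already-decrypted stream:

    # Minimal sketch (assumed tooling, not necessarily what was used for the test clip):
    # re-encode a decrypted Blu-ray stream to smaller H.264 and mux it into an MKV container.
    import subprocess

    def reencode_to_mkv(src: str, dst: str, crf: int = 20) -> None:
        """Re-encode src to H.264 at the given CRF quality and mux into MKV."""
        subprocess.run(
            [
                "ffmpeg",
                "-i", src,          # decrypted source, e.g. the main-title .m2ts
                "-c:v", "libx264",  # H.264 video, the codec MPC-HC decodes via DXVA
                "-crf", str(crf),   # quality-based rate control; lower = bigger file, better quality
                "-preset", "slow",
                "-c:a", "copy",     # pass the original audio track through untouched
                dst,
            ],
            check=True,
        )

    # Example (hypothetical filenames): reencode_to_mkv("main_title.m2ts", "movie.mkv")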
  • lopri - Friday, January 22, 2010 - link

    Thank you for the explanation, but unfortunately I couldn't replicate the 'problem' (what exactly?) you've experienced. I don't have The Dark Knight, so I tried Children of Men on a neighbor's 785G system I built for him. That title was chosen because its original content on the disc was encoded in VC-1, just like The Dark Knight. MediaInfo gave the following information:

    Video
    ID : 1
    Format : AVC
    Format/Info : Advanced Video Codec
    Format profile : High@L4.1
    Format settings, CABAC : Yes
    Format settings, ReFrames : 4 frames
    Muxing mode : Container profile=Unknown@4.1
    Codec ID : V_MPEG4/ISO/AVC
    Duration : 1h 49mn
    Bit rate : 14.2 Mbps
    Nominal bit rate : 14.5 Mbps
    Width : 1 920 pixels
    Height : 1 040 pixels
    Display aspect ratio : 16:9
    Frame rate : 23.976 fps
    Resolution : 24 bits
    Colorimetry : 4:2:0
    Scan type : Progressive
    Bits/(Pixel*Frame) : 0.296
    Stream size : 10.8 GiB (88%)
    Title : Video @ 14489 kbps
    Writing library : x264 core 67 r1165M 6841c5e

    Flawless playback in both windowed mode and full-screen mode, on a 30" LCD. Just to be sure, I tested The Dark Knight trailer, which is a VC-1 clip, and various H.264 content in .mkv, .mp4, and .m2ts. Using MPC-HC svn 1.3.3347 32-bit AND 64-bit binaries. The system had a WHQL driver dated 8/17/2009, installed via Windows Update. The only codecs installed are Matroska Splitter and AC3Filter.

    So there. Now, what exactly is the problem that I don't see but you do?

    WRT resolutions - Intel listed 2560x1600 on G45 as well. I even got an ADD2 (interesting choice of name, btw) card off eBay hoping it'd work, but that was simply a waste of money. I am as skeptical as can be on GMA after my bitter experiences with G35/G45, and it is puzzling why you can't verify that in your lab instead of being a messenger ("Intel says so").

    Would you feel bad at all if I said I purchased G35/G45 based on your reviews, only to be greatly disappointed? I couldn't even give away a G35-based system to a junior-high kid, because the kid is someone I see on and off and I feared potential embarrassment and unexpected calls for support.

    Your reviews are full of contradictions one after another, and I am concerned whether you've lost the sense and connection to the real world.
  • Shadowmaster625 - Friday, January 22, 2010 - link

    Given the level of integration, what is making these motherboards so expensive? When are we going to see $35 motherboards? What would keep the prices from coming down that low?
  • strikeback03 - Friday, January 22, 2010 - link

    IIRC the chipset itself currently costs $40.
  • Anand Lal Shimpi - Friday, January 22, 2010 - link

    Correct. Despite moving much of the "chipset" on-package, the actual H5x chipsets are no cheaper than their predecessors. Remember that as AMD and Intel integrate more onto the CPU they still want to preserve or increase profit margins. It's just fortunate for all of us that in the process of integration we actually get a performance benefit.

    Take care,
    Anand
  • Taft12 - Friday, January 22, 2010 - link

    Sounds like we are very much in need of competition in the 3rd-party chipset market like the good old days!

    Things are going in the wrong direction with NVIDIA exiting the market, and VIA and SiS long gone...
  • Anand Lal Shimpi - Friday, January 22, 2010 - link

    I'm a huge fan of competition, but I'm not sure I want to deal with the 3rd party chipset guys again. As the technologies got more sophisticated we started to see serious problems with all of the 3rd party guys (Remember VIA when we got PCIe? Remember the issues with later nForce chipsets?). As more features get integrated onto the CPU, the role of competition in the chipset space is less interesting to me.

    While I'd love to see competition there, I'm not willing to deal with all of the compatibility and reliability nightmares that happened way too often.

    Take care,
    Anand
  • Penti - Sunday, January 24, 2010 - link

    Also, the competition didn't help with K8 chipset prices, even though chipset makers no longer had to do a memory controller and some registers/logic moved to the CPU, and even though single-chip chipsets were possible and available.

    I agree; I'm glad to see the third-party chipsets gone. It would just be bad in the corporate space today, where features such as remote management need to be built in. (Better to just have that removed from the consumer products than to have very different products.) Unless Intel / AMD gives away (sells) IP blocks to the third parties, I don't see the point. Extra features are perfectly capable of being implemented / integrated directly on the motherboard, and that would be enough to give incentive and pressure the chipset maker (Intel or AMD) to better themselves. Unless chipsets move to the SoC model, I guess that will do. It's not like VIA, NVIDIA, and SiS beat Intel when they made chipsets for the QPB bus/P4. Plus I doubt they could make a cheaper southbridge for the Nehalem/DMI platform. It's still SATA, USB, PCIe, Ethernet MAC/PHY, audio, SPI/LPC and other legacy I/O, as well as the video/display output stuff.
  • geok1ng - Friday, January 22, 2010 - link

    The FPS numbers for this Intel IGP are too good to be true. Intel has cheated before with IGPs that didn't render all polygons and textures in order to achieve "competitive" frames-per-second numbers.

    Hence I request (once again) side-by-side screenshots to put aside any doubts that this "competitive" IGP from Intel has image quality similar to NVIDIA and ATI integrated graphics.
  • silverblue - Friday, January 22, 2010 - link

    I'm still not convinced by the IGP. Those 661 results are a 900MHz sample vs. the 700MHz HD 3300 on the 790GX board, and the 530 uses a 733MHz IGP. In every single case the AMD solution wins against the 530, be it by a small margin (Dragon Age: Origins) or a large one (CoD: MW2), but in general, AMD's best is probably about 20-25% better, clock for clock, than Intel's - all depending on title of course. Overclocking the new IGP turns out excellent results; however, we're still comparing it to a relatively old IGP from AMD.

    If AMD updates their IGP and bumps up the clock speed, the gap will increase once again. There's nothing to say they can't bring out a 900MHz, 40nm flavour of the 3300 or better now that TSMC have sorted out their production issues. Intel's IGPs are on a 45nm process so AMD may produce something easily superior that doesn't require too much power. However... I'm still waiting.

    Intel are definitely on the right track, though they need to do something about the amount of work per clock.
  • IntelUser2000 - Monday, January 25, 2010 - link

    Silverblue, I don't know how it is on the GMA HD, but at least up to the GMA 4500, the Intel iGPUs were clocked in a similar way to the NVIDIA iGPUs: the mentioned clocks are for the shader core, and the rest, like the ROPs and TMUs, likely run lower.

    While for AMD, 700MHz is for EVERYTHING.

    Plus, the 780/785/790 has more transistors than the GMA HD. The AMD chipset has 230 million transistors, while the GMA HD package has 170 million.

    All in all, the final performance is what matters.
  • Suntan - Friday, January 22, 2010 - link

    I would agree. I question why it wasn't compared against the newer (and usually cheaper) 785G mobos (which are Radeon HD 4200-based systems).

    -Suntan
  • Anand Lal Shimpi - Friday, January 22, 2010 - link

    I chose to compare it to the 790GX because that's currently the fastest AMD IGP on the market. I wanted to put AMD's best foot forward. The 785G is literally the same performance as the 780G; the more impressive product name comes from its DirectX 10.1 support.

    The 890GX is the next IGP from AMD. In our Clarkdale review I mentioned that Intel simply getting close to AMD's IGP performance isn't really good enough. Unfortunately the current roadmaps have the 890GX IGP running at 700MHz, just like the 790GX. The only question I have is whether or not the 890GX has any more shader cores (currently at 8). We'll find out soon enough.

    Take care,
    Anand
  • Suntan - Friday, January 22, 2010 - link

    [QUOTE]I chose to compare it to the 790GX because that's currently the fastest AMD IGP on the market.[/QUOTE]

    Fair enough. But your own tests in the 785 article show the two being basically identical in performance; the 790 might have some small advantage in certain places (gaming), but if a person wasn't happy with the 785 in that use, I'd wager they wouldn't be happy with the 790 either.

    At this point, integrated gfx are completely capable for PCs for everything except gaming (and some NLE video editors that use heavy gfx acceleration for pre-visualizing EFX). In both those cases, even a $50 dedicated card will far overshadow all the integrated options, to the point that I think it isn't right to concentrate on gaming capability when picking integrated gfx.

    In my opinion, the last area for IGPs to compete in is video acceleration, where the 785 is at parity with or beats the 790. (Although they both fail without the inclusion of digital TrueHD/DTS-HD MA support.) But at least the 785 is usually cheaper.

    In any case, the new i3 really doesn't impress me *just* because it has integrated gfx on the chip. It seems that the same thing still holds true, namely: if you're building a computer for anything other than gaming, you can probably build one where an AMD will result in the same or better performance for less money (TrueHD/DTS-HD MA notwithstanding). If you're building one for gaming, you're probably not going to be using the integrated gfx anyway.

    -Suntan

  • archcommus - Friday, January 22, 2010 - link

    I don't understand what the big deal is about TrueHD/DTS-HD MA bitstreaming. I've been decoding in software and sending 6 channels over 3 analog cables (is this LPCM?) to my 5.1 system ever since I've owned one, and the sound quality is fantastic. Is there really a perceived quality difference with bitstreaming to a high quality receiver if you own a very high end sound system?
  • eit412 - Friday, January 22, 2010 - link

    TrueHD/DTS-HD MA use lossless compression, and if you bitstream them to the receiver instead of decoding in software there is less of a chance of interference (the longer the signal is in analog form, the more interference is possible). In most cases the difference is not distinguishable, but people love to say that theoretically my stereo is better than yours.
  • Suntan - Friday, January 22, 2010 - link

    Then again, it may just be that those people do have better stereos (or more importantly, better rooms and speakers) than yours.

    In any case, 99% of this website’s existence is possible because of the tendency for people to try and get “just a little more” out of their electronics…

    -Suntan
  • Suntan - Friday, January 22, 2010 - link

    In a word, yes. If your room/components/setup is worth it, then it can make a big difference.

    Analog output (and no, what you are doing is not LPCM) can compete sound-wise with digital output (either LPCM or bitstreaming) if you have a quality soundcard (think Lynx 2B, not anything that says SoundBlaster), but even then, you lose other digital signal processing capabilities that a good AVR can offer (namely Audyssey MultEQ room correction and Audyssey Dynamic EQ).

    That said, I too think it is a little exaggerated, simply because most people really wouldn't notice the benefit in their situations.

    Lastly, I happen to believe that the big flap over LPCM vs. bitstream *is* completely overblown. Ymmv.

    -Suntan
  • Anand Lal Shimpi - Friday, January 22, 2010 - link

    I agree that 8-channel LPCM is pretty much good enough. There are issues with players downsampling audio when outputting 8-channel LPCM from TrueHD/DTS-HD MA sources, but to my ears I could never notice the difference.

    Also remember that the majority of users don't even have 8-channel audio setups. We're talking 5.1 at best, and honestly a huge advantage of these high def audio formats is the support for discrete rear surround channels.

    That being said, it's still a valid option to want. Blu-ray players have been able to do it for quite a while; there's no reason we shouldn't demand the same out of our HTPCs. Some users do prefer to keep the decoding on their receiver/pre-processor, and for them it's the only option.

    Hopefully by the end of this year all platforms will offer it and we can just assume its presence as a checkbox feature.

    Take care,
    Anand
  • cmdrdredd - Friday, January 22, 2010 - link

    The big problem is some players mess with the audio during the decode and don't output LPCM in a raw, untouched format. Bitstreaming means nothing is meddled with - no volume normalization and the like.
  • archcommus - Friday, January 22, 2010 - link

    Thanks all for the info, helped clear things up. With my 5.1 system and onboard audio with Windows 7 these days the software decoding and analog output is fine. But I guess in the future with a possible 7.1 system I would at least want LPCM over HDMI. I couldn't imagine needing bitstreaming unless I was building a true home theater.
  • Spivonious - Friday, January 22, 2010 - link

    LPCM is decoding the channels and sending them digitally over HDMI.

    I don't understand the big hype either, unless the decoder in your AVR is much better than the one in software. Otherwise the sound would be the same. I guess it's fun to get the AVR to display "DTS-HD".
  • blckgrffn - Friday, January 22, 2010 - link

    Can we see what the 4GHz power consumption looked like? Was SpeedStep still active? How about the OC'd graphics power consumption?

    Thanks :)
  • Anand Lal Shimpi - Friday, January 22, 2010 - link

    At 4GHz the i3 530 used 129W under load, up from 101.4W :)

    I don't remember if SpeedStep was still functioning when I adjusted the clock multiplier; I believe it was, but I'll need to double-check.

    Take care,
    Anand
  • blckgrffn - Saturday, January 23, 2010 - link

    Thank you!

    I guess it doesn't matter if SpeedStep was functioning if you could provide the idle power consumption at 4GHz as well.

    The information is very much appreciated.
  • Deaks2 - Friday, January 22, 2010 - link

    So were the overclocking tests done on the Asus or the MSI mainboard? Also, would the use of a P55 or H55 chipset affect the overclockability of this CPU?
  • marc1000 - Friday, January 22, 2010 - link

    It's interesting how close in performance the E8600 is to the i3... of course you lose some power efficiency, but if you already have a good discrete GPU and an E8600 (or another older Penryn clocked to E8600 levels, as is my case), then none of the new CPUs are ground-breaking deals... they are faster, of course, but they are also expensive. Not funny. Going from a P4 to a Penryn was really great, but from a Penryn to these new ones... it's just "good".
  • Grooveriding - Friday, January 22, 2010 - link

    Any chance of seeing some comparisons of this chip at 4GHz vs. an i7 920 at 4GHz? I'd love to get an idea of how it compares clock for clock in gaming against the 1366 platform.

    It would be interesting to see whether it's better just getting this if you're a gamer.
  • cmdrdredd - Friday, January 22, 2010 - link

    The 2.66GHz i7 920 vs. the 2.93GHz i3 530 is already there. The 2.66GHz i7 is on average 10fps faster. Overclocking both to 4GHz would give the same result, if not an even bigger gap. Why? Because the CPU at the lower clock speed is already faster, so equalizing the clock speeds doesn't mean equalizing performance.
  • strikeback03 - Friday, January 22, 2010 - link

    Even better, if they could add overclocked results they already have to Bench.
  • ltcommanderdata - Friday, January 22, 2010 - link

    I'm wondering if overclocking the IGP's shader units overclocks the memory controller as well, since they are on the same die? That would help explain the good performance scaling. Also, was power consumption significantly different with the IGP at 1200MHz? If not, then Intel should definitely have clocked their IGPs higher. Catching up to current-gen IGPs from NVIDIA and ATI is noteworthy for an Intel IGP, but presumably NVIDIA and ATI have their next-gen IGPs right around the corner, and Intel's IGP doesn't push new performance boundaries.
  • IntelUser2000 - Monday, January 25, 2010 - link

    No, the IGP is on its own clock domain. You can overclock the iGPU separately from everything, even the base clock. On the motherboards which allow overclocking of the iGPU on Clarkdale, you don't have multiplier options, but a straightforward frequency adjustment.
  • Abhilash - Friday, January 22, 2010 - link

    SYSmark is absurd.
  • Abhilash - Friday, January 22, 2010 - link

    http://www.anandtech.com/bench/default.aspx?p=112&...
  • karlkesselman - Friday, January 22, 2010 - link

    hi,

    On page 2, Load Power Consumption, you have the i7 870 using less power than the i5 750. This can't be. It's either a misprint, or the "load" test doesn't fully stress the 870, or maybe some hardware misconfiguration.

    Then there is the WoW test:
    the i5 750 gets 92 fps,
    the i3 530 gets 77 fps,
    and
    the i3 530 (OC @ 4GHz) also gets 92 fps.
    We know that WoW only uses 2 cores, so the i5 750 must have Turbo Boost enabled, running @ 3.2GHz. That explains why it gets 92 fps. But then the i3 530 @ 4GHz gets the same fps. This is either a mistake (was the test run on the same hardware?) or the i3 530 is less efficient than the i5 750 (at least running WoW; maybe because of the memory controller and/or the 8MB L3 cache, or both?).
    Also, in this case (the WoW test) it would be interesting if we could see the power consumption during the test (i5 750 compared to i3 530).
  • Anand Lal Shimpi - Friday, January 22, 2010 - link

    Our 870 has always used slightly less power than our 750 sample. A while ago Intel did away with having a single voltage for each product shipped. In my experience, the higher end chips are usually the ones that can run at the lowest voltages.

    All of our Core i7/i5 numbers are run with Turbo enabled, but remember that Clarkdale's memory performance isn't as good as Lynnfield. We see this manifest itself in more than just WoW. If you have the money, you're better off with Lynnfield. But at $113 you're at nearly half the price of the cheapest Lynnfield.

    Take care,
    Anand
  • MadMan007 - Friday, January 22, 2010 - link

    The Clarkdale CPUs are that much less efficient, likely because of the off-die but on-package memory controller, not to mention only 2 'real' cores. It's more like having a fast-connected northbridge in a traditional FSB arrangement than the on-die memory controller of Lynnfield. [H]ardOCP did their Clarkdale review with set speeds and no Turbo Boost, and Clarkdale needed a lot more clock speed to equal Lynnfield. That's why the i5-600 CPUs make little sense unless you desperately want the combination of certain features and integrated graphics; they are too close in price to the i5-750.
  • StormyParis - Friday, January 22, 2010 - link

    95 euros for the Gigabyte 13566 UD2H, vs. 80 for their 785G. That's a difference of 20-25 US dollars. At least both have DVI and HDMI, unlike the Intel Atom D510 board (what were they thinking?).
  • Calin - Friday, January 22, 2010 - link

    I would still prefer integrated graphics from AMD/ATI - but did you see (or feel) any graphical issues with the integrated graphics from Intel?

    I'm waiting for the next IGP from AMD/ATI; based on what the current competition is, it should be much better than what Intel has now.
  • Egowhip69 - Thursday, January 28, 2010 - link

    Picked up one of these things... along with a Gigabyte GA-H55M-UD2H board.

    Having AWFUL issues with random reboots. Changed the memory, PSU, HDD, you name it... then I uninstalled the Intel graphics drivers and changed the chip to an i5... no problems.

    Just to check, I threw the i3 back in... but with no Intel drivers... no reboots on a 3-day burn-in... added the drivers back... reboot within 45 min.

    Both on Win7 Pro 64-bit and XP Pro 32-bit.

    Intel's drivers are VERY immature at the moment...
  • bupkus - Friday, January 22, 2010 - link

    Anand, does the table showing the results of the 4GHz i3 530 overclock include a graphics overclock as well?
  • Calin - Friday, January 22, 2010 - link

    The game results with the 4GHz overclock were obtained with a heavy-duty video card; there's no way the integrated graphics would get such results.
    What I'd like to know is: was the integrated graphics chip active during that 4GHz overclock? And how far could one push the i3 with the internal graphics active (possibly downclocked)?
  • Anand Lal Shimpi - Friday, January 22, 2010 - link

    Correct. The IGP wasn't running, only the Radeon HD 5870. I haven't tried to figure out the max overclock while pushing both the CPU and the GPU, but I'd guess it'd be relatively similar. The two chips are physically separate, so as long as you can adequately remove the heat of the GPU you should be fine.

    Take care,
    Anand
  • notty22 - Friday, January 22, 2010 - link

    You're going to catch some flak for deeming this an overall better gaming CPU in a chart comparing it to an AMD 965.
  • nerdtalker - Friday, January 22, 2010 - link

    The i3 is creeping surprisingly close to the i7 920, too close for my comfort, in fact.

    /goes and overclocks i7 920 even more
  • kwrzesien - Friday, January 22, 2010 - link

    /goes and turns down the thermostat
  • vol7ron - Saturday, January 23, 2010 - link

    You increased the voltage by 0.16V AND decreased the multiplier.

    It's nice to see the overclock that achieved, but could you be consistent in what you present to us? I'd really like to know what made the overclock beneficial.


    Please, be aware of your control group in your tests and at least give us one of the following:

    1. (Stock Power + Stock Multiplier) vs. (Stock Power + [Lower] Multiplier)
    2. (Stock Power + Lower Multiplier) vs. ([Higher] Power + Lower Multiplier)
    3. (Stock Power + Stock Multiplier) vs. ([Higher] Power + Stock Multiplier)


    Notice: in each test there is only one thing that changes (in the brackets).

    That will help answer my question: Can the i3 530 overclock to ~4000MHz at a lower multiplier on stock power?


    vol7ron
  • Minion4Hire - Sunday, January 24, 2010 - link

    I think it was implied (or just directly stated) that he was unable to overclock the 530 past 3.3GHz in any manner until more voltage was applied. That could just be an "anomaly" of sorts with their 530, so it's probably best not to dwell on it. If you actually intend to buy and overclock the 530, you'll figure it out then. The small details and mindless minutiae really don't matter. It can hit 4GHz with relative ease; what more could you ask for?
  • vol7ron - Sunday, January 24, 2010 - link

    I took that to mean 3.3GHz was the highest he got at a stock multiplier. If what you say is correct, it'd be nice to see the highest overclock out of the box (stock power/multiplier) -- a benchmark is needed.

    "If you actually intend to buy and overclock the 530 you'll figure it out then."
    - I will give you time to retract this statement, since it is the most ignorant thing I've heard regarding a review site. After all, AnandTech.com's subtitle: "your source for hardware analysis and more." If overclocking CPUs is not part of hardware analysis, then I invite you to leave. When determining an i3 vs i7 buy, overclocking makes a big difference, especially on stock power.

  • AssBall - Monday, January 25, 2010 - link

    If you think comparing a 300-dollar CPU to a 120-dollar one is relevant, then I also invite your egotistical ass to leave. It was a good article, and you are just trolling.

    Set up your own multinational hardware site, then come and spout your anal-retentive horse shit.
  • jigglywiggly - Friday, January 22, 2010 - link

    AnandTech, you want to give me one?
  • lanvince - Friday, January 22, 2010 - link

    I would like to own one, frankly.
  • formulav8 - Friday, January 22, 2010 - link

    Anand, I'm not sure why you keep saying Intel has better integrated graphics than NVIDIA, and even AMD.

    Your own results show the AMD graphics besting both the i3 and the i5 660. AMD wins 3, Intel wins 2, and 1 is a tie.

    Also, it appears that where the i5 660 loses, it loses by quite a lot. AMD loses one test by up to 20% and another by about 15%. Intel loses by up to 30% in one test and almost 30% in another.

    So what's the deal? Am I simply reading your graphs wrong? And when you think about it, Intel's graphics having direct memory controller access and still not truly beating NVIDIA/AMD is pretty sad, you have to admit.

    But one thing is for sure: AMD CPUs are now behind in the lower midrange in quite a few areas. The best thing is you can get $50 mobos for AMD. Intel boards still cost more, even including rebates, unless things have changed recently...



    Jason
  • Penti - Sunday, January 24, 2010 - link

    He's not saying that. He just implies it's a better platform and that it's better for HTPC. It's really good enough if you don't game, so why the fuss? No IGP is really gameable. He has already implied that it might change with 880/890 integrated graphics.
  • 0roo0roo - Tuesday, April 20, 2010 - link

    I just find that more cores feel much more responsive in general system use while doing such encoding tasks, compared to a Core 2, so I have doubts they can be compared so simply/synthetically.
  • Ronstar - Thursday, May 20, 2010 - link

    Hi

    I bought a PC with an i3 2.93GHz CPU and 1GB and would like to upgrade the graphics card. I do not know if there is a correlation between the power of the CPU and what graphics card would work well, but I assume that a bottleneck could happen at the CPU, in which case I would not benefit from a very high-powered graphics card's capabilities. Then again, maybe I am wrong...

    Could someone please advise what the best graphics card is that would be worth upgrading to?

    Thank you plenty
    Ron
  • loucm - Sunday, August 1, 2010 - link

    Using the same motherboard (Gigabyte H55M-UD2H),
    I managed to get my i3 530 @ 4GHz stable with only 1.184V core voltage.
  • polarq - Tuesday, August 3, 2010 - link

    Hi, I have a Core i3-530 and an H55M-UD2H, bought a Cooler Master 212, and I'm trying to push the limits :) But with the GPU: in the BIOS there's an option to change it from 733MHz, which I did, to 900 - but it still shows 733, both in the BIOS and in CPU-Z.

    So I can freely input the value, but 733 still lights up white in the BIOS. Is there any possibility it was locked after it couldn't take 900? Can I unlock it again? I tried CLR_CMOS and loading fail-safe defaults, but nothing changed. Any advice, please?
  • Alejo1879 - Wednesday, February 2, 2011 - link

    Need opinion for Core i3 550.
    Hello people, I need some opinions about the Intel Core i3 550.
    Is this CPU faster than the Core i3 530? I think so, but on cpubenchmark.com, in the High End CPUs list (updated 1st of February 2011), the Core i3 530 is near the top and the i3 550 isn't.

    Intel Core i7 995X @ 3.60GHz 10,795
    Intel Core i3 530 @ 2.93GHz 3,546 $124.95*
    AMD Phenom II X4 B40 3,534 NA
    Intel Core i5 655K @ 3.20GHz 3,525 $192.95*
    Intel Core i7 820QM @ 1.73GHz 3,520 $546.00**
    Intel Core2 Extreme @ 2.26GHz 3,518 NA
    Intel Core2 Quad Q9100 @ 2.26GHz 3,428 $385.89**
    Intel Core i7-2620M @ 2.70GHz 3,415 NA
    Intel Core i5 661 @ 3.33GHz 3,295 $209.99*
    Intel Core i3 560 @ 3.33GHz 3,283 $149.99*
    AMD Phenom II X4 910 3,281 $204.19**
    AMD Phenom II X4 B93 3,275 NA
    Intel Core i5 650 @ 3.20GHz 3,120 $182.99*
    Intel Core i5 660 @ 3.33GHz 3,101 $207.99*
    Intel Core i3 550 @ 3.20GHz 3,026 $124.99*

    Is the Core i3 530 in fact better than the i3 540, 550, and even the i3 560?
    Please help.
  • sarp - Wednesday, January 11, 2012 - link

    I've got an i3 530 @ 2.93GHz.

    I've managed to get it to 4.00GHz with a 182MHz BCLK by changing the QPI, graphics, and RAM frequencies accordingly. Sometimes my screen just freezes with a solid color and I have to restart it in order to get to Windows again. Anyway, I want to boost Intel HD Graphics performance; its default clock is 733MHz, but since I've changed the BCLK to 182, to be on the safe side it is at 500MHz. I wonder, how much should I increase the graphics core voltage in order to get to 900MHz with a 182MHz BCLK?
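Since the i3 530's multiplier is locked, the overclocks discussed in this thread come from raising the base clock. A small sketch of the arithmetic, assuming the chip's stock 22x multiplier and 133MHz BCLK (the iGPU clock is set separately, per IntelUser2000's note earlier in the comments):

    # Sketch of how the Clarkdale core clock derives from BCLK (the multiplier is locked).
    # Assumes the i3-530's stock 22x multiplier and 133MHz base clock.
    CPU_MULTIPLIER = 22

    def cpu_clock_mhz(bclk_mhz: float, multiplier: int = CPU_MULTIPLIER) -> float:
        """Core clock = BCLK x multiplier."""
        return bclk_mhz * multiplier

    print(cpu_clock_mhz(133))  # ~2926 MHz, the stock 2.93GHz
    print(cpu_clock_mhz(182))  # ~4004 MHz, the 4GHz overclock discussed in this thread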
  • KinderJoy - Tuesday, September 28, 2021 - link

    What software are you using to overclock?
