ASRock X79 Extreme11 In The Box

For the price tag, we expect a lot of extras in the box of ASRock's high-end model.  Previously we have seen additional front USB panels and/or WiFi connectors in ASRock's packages.  With the X79 Extreme11, we get the following:

Rear IO Panel
Manual
Driver DVD
Six SATA cables
Two SATA to Molex Connectors
Two Short SLI Fingers
One Long SLI Finger
One Rigid 3-Way SLI Finger
Front USB 3.0 Panel

Personally, with the inclusion of the PLX and LSI chips, I would have expected either a full complement of SATA cables or a four-way SLI connector.  There would also have been scope to increase the front USB provision to either two panels, or a larger 5 1/4" bay with four ports and two connectors (depending on whether workstation-type cases come with USB 3.0).  With boards like this, it might be worthwhile for motherboard companies to ally with a case manufacturer on a suggested build scenario.

Voltage Readings

After my first publication of OCCT voltage readings, a few readers responded with more in-depth reasoning behind some of the results we were seeing.  With this in mind, I would like to describe again what we are doing with this test, and why.

Much of what an enthusiast overclocker does is monitor CPU temperature and voltage.  Whatever settings a user enters in the BIOS or OS are at the mercy of the motherboard, both in terms of actually applying the values and reporting them back.  As enthusiasts, we have to rely on the readings we get back, and hope that motherboard manufacturers are being honest with them.

Take CPU voltage.  What we as users see in CPU-Z or OCCT is a time-averaged value that hides any voltage ripple from the power delivery.  It is very easy for a motherboard manufacturer to mask this ripple, or to disregard slight deviations and report a constant value to the user.  The CPU voltage reading can also be taken at a variety of places on the power plane, which vary between motherboards and manufacturers, meaning that readings from different boards are not directly comparable.  Nevertheless, as enthusiasts, we will constantly compare value A with value B.
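
As a quick illustration of how averaging masks ripple, the following Python sketch uses entirely made-up numbers (it is not how CPU-Z, OCCT or any motherboard sensor actually works) - averaging a noisy, rippling rail collapses tens of millivolts of swing into one steady-looking figure:

```python
# Illustrative only: assumed 15 mV ripple and 2 mV noise on a 1.175 V rail.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1e-3, 10_000)                  # 1 ms of hypothetical samples
v_set = 1.175                                     # requested VCore (volts)
ripple = 0.015 * np.sin(2 * np.pi * 300e3 * t)    # assumed 15 mV, 300 kHz ripple
noise = rng.normal(0.0, 0.002, t.size)            # assumed 2 mV sensor noise
v_rail = v_set + ripple + noise

print(f"time-averaged reading: {v_rail.mean():.3f} V")    # looks rock steady
print(f"actual peak-to-peak swing: {(v_rail.max() - v_rail.min()) * 1000:.0f} mV")
```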

Whether or not I can achieve 4.7 GHz with 1.175 volts on a particular board is inconsequential - your motherboard may produce the same result with a reading of 1.200 volts.  The only way to test the actual value with a consistent methodology is via an oscilloscope connected to similar points on each board.  This may make taking an OCCT reading sound redundant.

However, motherboards have settings relating to load line calibration.  As load is applied to the CPU, the voltage across the processor decreases (VDroop).  Load line calibration essentially attempts to control this level of droop by increasing the voltage when it detects drops away from a fixed value.  Manufacturers have different ideas on how to modify LLC with respect to load, and on whether the level of modification should be controlled by the user.  Some manufacturers offer the option at a variety of levels, so that overclockers can be sure of the applied setting (even if it increases peak voltage, as explained by AnandTech in 2007).
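
As a rough sketch of the mechanism (assumed numbers throughout, not a model of ASRock's or any other vendor's implementation), VDroop can be treated as a load line, V_cpu = V_set - I_load * R_LL, with each LLC level shrinking the effective load-line resistance:

```python
# Illustrative load-line model with assumed values - not any vendor's firmware.
V_SET = 1.175            # BIOS-requested VCore (volts)
R_LL = 1.25e-3           # assumed load-line resistance (1.25 mOhm)
LLC_LEVELS = {"Auto (full droop)": 1.0, "Medium": 0.5, "Aggressive": 0.0}

def vcore(i_load: float, llc_factor: float) -> float:
    """Effective CPU voltage after droop, scaled by the LLC setting."""
    return V_SET - i_load * R_LL * llc_factor

for name, factor in LLC_LEVELS.items():
    idle = vcore(10.0, factor)     # ~10 A idle draw (assumed)
    load = vcore(120.0, factor)    # ~120 A full OCCT load (assumed)
    print(f"{name:20s} idle {idle:.3f} V -> load {load:.3f} V")
```

In this toy model, the more aggressive the LLC setting, the closer the loaded voltage stays to the requested value, at the cost of larger transient overshoot when the load is suddenly released.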

By running a full-load OCCT test, we are essentially determining both how the motherboard reports CPU voltage under load and how aggressively load line calibration is performing (from the point of view of a user without an oscilloscope or DVM).  If we have tested one motherboard and you have a different one, the variation in load voltage should describe the offset you may need to apply when comparing overclocks.
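
As a hypothetical example of that offset: if our test board reports 1.175 V under OCCT load and yours reports 1.200 V at identical settings, the difference is the correction to keep in mind.

```python
# Hypothetical numbers only - same CPU and BIOS settings, two different boards.
reviewed_board_load_v = 1.175   # load voltage reported by our test board (assumed)
your_board_load_v = 1.200       # load voltage reported by your board (assumed)
offset_mv = (your_board_load_v - reviewed_board_load_v) * 1000
print(f"apply roughly {offset_mv:+.0f} mV when comparing overclock settings")
```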

As with most ASRock boards, we see a small variation in the voltage reading under OCCT.  Note that as OCCT itself reports a time-smoothed version of any ripple, the ASRock solution may not be as smooth as it could be.  Nevertheless, I had no voltage issues on the board itself, and the X79 Extreme11 does a good job of keeping around the 1.160 volt region under load at stock settings.

Comments

  • cjs150 - Tuesday, September 4, 2012 - link

    All those right-angled, stacked SATA connectors, and yet the 24-pin ATX connector is still sticking straight up!

    Come on, it costs little to make this a right-angled connector, and it makes for much better cable management (especially if you are also using all the SATA connectors).

    Has anyone checked the accuracy of the ASRock Hardware monitor for temperatures? There are reports that on other ASRock boards these are significantly inaccurate.

    Ultimately, other than M/B p#rn, I'm not really sure what market this board is aimed at. For a workstation I would prefer dual CPUs and 48 PCI lanes.
  • dgingeri - Tuesday, September 4, 2012 - link

    I think that LSI 2308 chip is the same as the chip used in Dell's PERC H310 controller, with slightly different firmware/BIOS. (Dell customizes theirs to call it a PERC and label it as a Dell controller, but it still carries an LSI copyright.) If so, that's a very good controller, in my experience.
  • ComputerGuy2006 - Tuesday, September 4, 2012 - link

    I want Ivy Bridge-E; I'm not going to go from a 1366 setup to an X79 setup without knowing if Ivy-E is even coming out (much less whether it will work on the same mobo).
  • dgingeri - Tuesday, September 4, 2012 - link

    With AMD providing absolutely no competition in this space, I would say it is unlikely they'll come out with any updates worth spending money on. Think of the time with the P4, while AMD wasn't providing competition. Intel put out processors from 2.8 to 3.8GHz over the course of two years that cost more and gobbled up more electricity, yet provided minimal performance enhancement. (iirc, the 3.8GHz chip was only about 10% faster than the 2.8GHz chip because of memory bandwidth limitations and thermal throttling, yet Intel charged more than double the intro price of the 2.8GHz chip for it.) Intel without competition is just a money hog, gobbling up more and more money with little to show for it. I doubt Haswell will be anything special, either. IB certainly isn't.

    Oh, yeah, they say Haswell will be 10% faster than IB at the same clock rate. While AMD was providing competition, Intel was putting out parts that ran 50-60% faster per clock, and had faster clock rates to boot. So, at the same price point, we'd get a 75-100% boost per generation. (Core 2 had a drop in clock rate, sure, but it was WAY faster than the P4, giving us an 80%+ performance boost at the same clock rate. Coppermine was a huge boost in both clock rate and efficiency. Katmai was a huge boost in clock rate and efficiency. Same with Klamath and Deschutes. Yet the P4 generation was a huge stall point, and also the point where most of Intel's competitors got out of the race and AMD fell way behind.)

    I wish someone would come into the market and provide a little incentive to get Intel to move their butts forward, but we're not going to see that for probably another decade.
  • Master_shake_ - Tuesday, September 4, 2012 - link

    problem is Intel won't allow any more companies to get the x86 license to make desktop/notebook cpus.

    i want Nvidia to make one just to have a 3rd choice.
  • fteoath64 - Thursday, September 6, 2012 - link

    "i want Nvidia to make one just to have a 3rd choice.".

    Yeah, then Nvidia buys VIA and starts making an NV-nano as the Tegra3 of x86, for super tablets that would be weight-compatible with current 10.1" Android/iOS tablets ..... {pipe dream ...}

    Nvidia doing an x86 and ARM hybrid processor would be really cool for a new generation of UltraBooks that run Win8 and Android together. Imagine when docked you have both Win8 (external monitor) and ICS/JB on the tablet with touch. With Win8 tablets being much thicker, there is plenty of space for 2 SD card slots and 2 MicroSD slots.
  • fteoath64 - Saturday, July 12, 2014 - link

    When the discrete GPU market for high-end notebooks dries up, Nvidia might make a VIA play. For now, they cannot afford such an investment, especially when they have already sunk millions into Denver, the 64-bit ARMv8 core in the K1. If they produce a great 64-bit ARM core, they might have a great chance in the tablet and high-end mobile market, with something left over for the low-end and microserver markets.
  • Frallan - Wednesday, September 5, 2012 - link

    I was complaining the other day about AT becoming an iSite talking more about iWare than anything else.

    My honesty compels me to write in after the last few days and apologize. There have been a number of good, interesting computer and component articles in the last week that proved me wrong.

    Thank you AT, and keep up the good work.
  • BlueReason - Wednesday, September 5, 2012 - link

    "ASRock have potentially missed a trick here"

    It's becoming trendy for American-based tech blogs to use the British standard for subject/verb agreement when it comes to businesses. You could debate what it ought to be all day, but American professional writing standards dictate that companies be referred to as a singular entity. You can do whatever you like, of course, but just an fyi in case you submit a piece to a major American publication. They won't see your usage as fancy.
  • Razorbak86 - Monday, September 24, 2012 - link

    "Lighten up, Francis." -- Sergeant Hulka, Stripes (1981).

    Although you may view Ian Cutress' prose as "fancy", he was hardly being pretentious. He lives in London, and he was educated at the University of Oxford. That might not mean much to you, but feel free to Google a map or two and educate yourself about world geography.

    I lived and worked in the United Kingdom for 5 years as an American expatriate. My daughter was born in Aberdeen, Scotland, and my kids grew up with British accents. I can assure you that it is standard practice in the United Kingdom to refer to companies in the plural. Fortunately for me, the British people were very gracious hosts. Despite the subtle differences between my American dialect and the Queen's English, they always treated me and my family with great respect throughout our stay.

    So please be a little more polite when referring to one of the Senior Editors of AnandTech. You are, after all, communicating in HIS native tongue, not yours. ;-p
