70 Comments

  • Ilias78 - Wednesday, July 25, 2012 - link

    Great review for a great product :) Thank you! Reply
  • thewhat - Thursday, July 26, 2012 - link

    Mediocre POST time alone is making this MB less than great.

    Some manufacturers have shown that fast POST can be achieved. Why can't the rest do the same?
    Reply
  • IanCutress - Thursday, July 26, 2012 - link

    Typically manufacturers have safeguards in place for detecting memory, CPU, digital power delivery, or fan controllers that require initialization. Certain USB 3.0 or SATA controllers can also add a few seconds each to the POST time. This board has three USB 3.0 controllers, dual NICs, mSATA and the rest, so the result is unsurprising.

    Ian
    Reply
  • houkama - Friday, May 24, 2013 - link

    I disagreed, as 12 seconds each time I turn the computer on is hardly something that significantly reduces my enjoyment, so I went ahead and bought the board, and was pleasantly surprised to find my POST time was closer to 4 seconds. I'm certain that Ian is telling the truth, but in my setup it's just not a problem. Reply
  • greno - Thursday, August 02, 2012 - link

    The Marvell controllers do not support the TRIM function for SSDs.

    If you really test the drive and look for zeroed sectors, you'll see that TRIM does not work on Marvell controllers.
    Reply
  • MamiyaOtaru - Wednesday, July 25, 2012 - link

    no ps/2 port, no buy. Rest of it looks pretty neat though :( Reply
  • Spivonious - Wednesday, July 25, 2012 - link

    Wow, really? I haven't used a PS/2 device in over five years. Reply
  • johnsmith9875 - Wednesday, July 25, 2012 - link

    I'm a big fan of PS/2 keyboards, because USB keyboards are horrible at buffering keystrokes properly.
    A fast typist will notice the difference.
    Reply
  • SodaAnt - Wednesday, July 25, 2012 - link

    Not really, unless you have a really bad keyboard. I have USB keyboards you can pretty much hammer as fast as you can spam keys and you will never notice the difference. Reply
  • Einy0 - Wednesday, July 25, 2012 - link

    Agreed... You must have used a crappy keyboard, or something else was messed up with the PC/OS. My brother-in-law averages 150 WPM and doesn't have any issues with USB keyboards. The only limitation I know of for USB versus PS/2 is that USB keyboards can limit the number of simultaneous key presses. I've heard of some cheaper ones being limited, but most will do at least 7 simultaneous keys. Better ones will do 10 plus. Reply
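The rollover limits mentioned above trace back to the USB HID report format. As a rough illustration (not from the review; the layout comes from the HID boot protocol, and the decoder below is a hypothetical helper), a standard 8-byte boot-keyboard report can only carry six simultaneous key codes:

```python
# Why USB keyboards can cap simultaneous key presses: a HID
# *boot-protocol* keyboard report is 8 bytes - 1 modifier byte,
# 1 reserved byte, and only 6 key-code slots. Keyboards offering
# more than 6-key rollover need a larger, vendor-defined report.

def decode_boot_report(report: bytes):
    """Split an 8-byte boot keyboard report into (modifiers, keycodes)."""
    assert len(report) == 8
    modifiers = report[0]                      # bitmask: ctrl/shift/alt/gui
    keys = [k for k in report[2:] if k != 0]   # up to 6 non-zero key codes
    return modifiers, keys

# Left Shift (bit 1) held with three letter keys pressed at once:
mods, keys = decode_boot_report(bytes([0x02, 0x00, 0x04, 0x05, 0x06, 0x00, 0x00, 0x00]))
print(mods, keys)  # 2 [4, 5, 6]
```

Keyboards with full n-key rollover sidestep the cap by using a larger, non-boot report descriptor, which is why the limit varies by model.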
  • Aunt Fritzi - Wednesday, July 25, 2012 - link

    I have a 3-year-old Biostar motherboard (go ahead and laugh)... it has two PS/2 ports. A PS/2 keyboard is needed to bring up the BIOS tool at startup. Is that generally the case when mobos have PS/2 ports? Reply
  • Samus - Thursday, July 26, 2012 - link

    You need to enable "USB Legacy Keyboard" in the BIOS, and your USB keyboard will work. Reply
  • Samus - Thursday, July 26, 2012 - link

    PS/2 has a limitation of 5 keystrokes within 250ms of each other.

    The ultimate test is Stepmania (DDR) where you hit many, many keys at the same time.

    PS/2 is a joke for gaming or fast typing. It has no place in modern computing.
    Reply
  • Questor - Tuesday, July 31, 2012 - link

    My wife can type like lightning, mistake free. It astounds me to watch her. I made the switch from PS/2 to USB keyboards some years ago, and she has been blowing my mind with them without missing a beat since the switch.
    I am not half-bad at typing myself, and neither of us has noticed a difference.
    I am not saying you are wrong, I am just saying we have not noticed any issues.
    Reply
  • Einy0 - Wednesday, July 25, 2012 - link

    PS/2 is dead long live USB!!! Reply
  • Belard - Thursday, July 26, 2012 - link

    For my 1996 keyboard, the PS/2 port is a must. They don't make them like they used to... my $20 keyboard is easily better made than today's $20~80 keyboards.

    Really, today's keyboards from MS, Logitech and most others use stickers for the keys and have weak support. Mine is solid, heavy and will hopefully last another 10 years.

    Most PS/2>USB adapters don't work.
    Reply
  • Grok42 - Wednesday, July 25, 2012 - link

    I guess I'll be the lone supporter of PS/2 as well. I haven't found a replacement for my $20 IBM KB-8923 PS/2 keyboard. I don't even consider myself a picky keyboard guy either. I don't want a cheap, light piece of junk that moves around my desk as I type, and I don't want some 10 lb clickety-clackety old-school monster either. Most importantly, I like to keep the number of "shopping" and "email" buttons to a minimum. Seems that's an impossible list of needs these days.

    Of course, it might be possible to use this ps/2 keyboard with a ps/2 to USB converter but I haven't tried.
    Reply
  • Belard - Thursday, July 26, 2012 - link

    That IO panel is full of ports. If you need ps/2, go for a lower-end Gigabyte board. Their Z77 boards start at about $125 (or $80 in Dallas) and have ps/2 connectors.

    For my 1996 keyboard, the PS/2 port is a must. They don't make them like they used to... my $20 keyboard is easily better made than today's $20~80 keyboards.
    Reply
  • johnrysf - Saturday, August 04, 2012 - link

    Before y'all plunge off the road and into the weeds in the typing-speed etc. discussion that follows, let me mention that I have 2-3 PS/2 <---> USB pigtails that I've picked up free over the years. They're maybe 8" long. Perhaps this will offend your PC-related aspirations, but one of them even says "Radio Shack" on it.

    Life really is short. Geez.
    Reply
  • Nickel020 - Wednesday, July 25, 2012 - link

    First, nice review! I like the more in-depth single board reviews!

    You always point out the software situation with GB boards. While this is definitely an issue as far as fan control goes, for overclocking there's also the GB Tweak Launcher:
    http://gigabytedaily.blogspot.de/2012/04/gigabyte-...
    While not pretty to look at, I actually prefer this functionality-focused UI to a fancy one (while I like the Asus UI, the clicking does get a little tedious when trying a lot of different settings). I don't know whether you're not aware of this software, or don't mention it for another reason, but I think you should include it in the review. Or at least mention it, so people are aware of it.

    Another point is the voltage read points, which you also don't mention in the review. I think these are actually a major selling point for overclockers, and should be mentioned, if not used to check actual voltages against BIOS settings. That's also an issue that I have with your UD3H/GD65 etc. review: you say that the GD65 gains voltage read points over the UD3H, yet the UD3H *has* voltage read points (even though I like MSI's implementation far better than what Asus/GB are doing).
    http://www.anandtech.com/show/5793/intel-z77-mothe...

    That's it for now, I'll read the complete review now :)
    Reply
  • IanCutress - Wednesday, July 25, 2012 - link

    As a competitive overclocker, I have used GBTL when pushing the BCLK of these boards as far as my CPU will allow (http://hwbot.org/submission/2301438_). I like GBTL - no mess and no fuss. But it is understandable why they do not include it in the Support CD, and hence why it doesn't really get a mention here. I did touch upon it very briefly in the overclocking section of the Z77X-UD3H review back at Ivy launch. As for voltage read-points, they are mentioned briefly in the board features, but I am also in agreement that perhaps the implementation of other manufacturers is more beneficial in our very niche usage scenario :)

    Ian

    PS On the multi-board reviews I try not to take anything out from what is in a single board review. Every benchmark, test and bit of analysis in each of them gets put in :)
    Reply
  • Nickel020 - Wednesday, July 25, 2012 - link

    Oh, I missed the part about the GBTL in the UD3H review. While I haven't gotten around to playing with my UD3H, I have found the Asus AI Suite very practical for "normal" overclocking, and I believe the GBTL will also be a real benefit for anyone working out a good 24/7 overclock. If I didn't already know about it, I'd certainly want to read about it in your review if I was going to get this board for 24/7 OC.

    I also missed you mentioning the UD3H's voltage read points, but in any case there's still an error in the conclusion, where you implicitly state that the UD3H does not have voltage read points (in the part about the GD65).

    I know you run the same benchmarks, but I find the text/user experience more interesting than the benches, and there's definitely more text in a single-board review! I only care about the benches to see whether a board has a significant performance issue; since I'm not into competitive OC, I don't care about slight differences that I won't notice anyway.

    Concerning the benches, I'm also a little surprised that you somewhat praise GB for auto-overclocking the CPU. IIRC Anandtech has been opposed to that in the past, since it's technically overclocking and thus theoretically voids your CPU warranty. It also makes it hard to compare board performance when CPU settings are actually the same, such as when using a manual overclock. I know it's considerably more work, but I would love to see benches with the CPU forced to run at stock settings added to the charts; the current version is an apples-to-oranges comparison imho. For someone just looking at the charts (and not the text, as many do...) the current ones give a very wrong impression: they make it seem like Asus and GB perform better, when without the auto-overclock they might actually be worse...
    Reply
  • IanCutress - Wednesday, July 25, 2012 - link

    Most users of these boards never touch the BIOS, let alone update it. This is why we run the boards at default - some manufacturers are being more aggressive with their settings and that is what you are paying for. If that aggressive setting compromises stability, then that can also be an issue. Thus it is a like-for-like comparison, as if a user was taking the boards out of the box and then just strapping in a CPU.

    After all, if we start changing the application of Turbo modes, what else do we change? Setting the voltage equal on each board to get a VMM reading that is always the same across the range? How about disregarding any board that uses x8/x4/x4 PCIe 3.0 against x8/x8/x4 PCIe 2.0? Default is the choice because that's what most users will end up with. Visiting some LANs recently, you would be surprised how many people buy 2133+ kits of memory and not enable XMP. That's the reality of it.

    I used to be wary of this feature (as per my review of the P9X79 Pro, where I disabled it and was severely disgruntled), and still am, as it results in motherboard manufacturers artificially inflating some results beyond what you would expect. But this happened before on earlier chipsets, when one manufacturer would run a 100.5 BCLK, the next would use 101.3, and another even 102 BCLK, stating 'that's just how the design works'.

    There's nothing we can do to change this, so I am taking the position of sitting back and analysing what they are doing, and how aggressively they are pushing this philosophy. Any good reviewer will recognise what is pure statistical variation and not assign world-class status to a result that is 0.01% different.

    With regards to the warranty, it is a tough nail to hammer down. Would a pair of companies ever advertise that their default settings technically breach warranty? Or how would Intel take it, given that technically none of the cores ever went past the top turbo mode? Without a direct response on the issue, it's not worth speculating. I've known users to repeatedly and successfully RMA CPUs they've overclocked on LN2 way too hard and broken, so we don't really know where Intel will draw the line.
    Reply
  • Nickel020 - Thursday, July 26, 2012 - link

    It's certainly a matter of opinion. As an "enthusiast" I'm of the opinion that a board should not overclock without my knowledge/express wish (since I can easily do so myself). Practically, the overclock of course has no bearing on CPU warranty (the CPU also being the very last PC component that you're likely to need warranty on...).
    I agree that for the average user this is actually added value: a slight performance bonus at absolutely no cost other than a little more power consumption. Maybe point out both sides in future reviews? That way everyone's happy :)

    PS: Please do fix the error in the UD3H, GD65 conclusion, it's wearing me down ;)
    http://www.anandtech.com/show/5793/intel-z77-mothe...

    "For the price we lose PCI and mSATA over the Gigabyte, but gain SATA, voltage read points, [...]" <--- wrong, maybe say "better implemented voltage read points"? ;)
    Reply
  • Nickel020 - Thursday, July 26, 2012 - link

    Thanks Ian :) Reply
  • mystikl - Wednesday, July 25, 2012 - link

    No VGA port, no floppy connector, no buy . Reply
  • Dustin Sklavos - Wednesday, July 25, 2012 - link

    Seriously?

    First, it does have a VGA port. Why you would want to use one escapes me now, but it's there.

    Second...you still need a floppy drive and can't make do with a USB 2.0 one? Almost no modern motherboards include floppy connectors because floppy disks are horrendously outdated and that real estate can be better employed elsewhere.
    Reply
  • mystikl - Wednesday, July 25, 2012 - link

    I was actually making fun of the guy who posted the second comment! Why on earth people still need those ancient connectors is beyond me. Some may argue that certain ancient software doesn't run without that specific port, but software that old doesn't require a computer with a quad core, 16 GB RAM and 3 video cards. Reply
  • shin0bi272 - Wednesday, July 25, 2012 - link

    bios flashes on some boards still require a floppy disk... even on a quad core. Reply
  • SodaAnt - Wednesday, July 25, 2012 - link

    Luckily those boards have floppy ports then. Reply
  • Dustin Sklavos - Wednesday, July 25, 2012 - link

    Derp. Well, color me stupid.

    Nice one, though. ;)
    Reply
  • Belard - Thursday, July 26, 2012 - link

    Okay, you are stupid.

    Well, for another reason. Nobody makes a keyboard worthwhile enough to replace the one I use today... 1996 era. This keyboard *IS SO OLD*, it's not even PS/2! It's an AT connector, plugged into a PS/2 3" adapter into an extended PS/2 cable. A USB converter doesn't work.

    But gigabyte makes many boards with ps/2 ports.

    VGA is not needed on a high end board (lower end $60~120 boards have VGA).

    And actually, a floppy connector *IS NOT* needed on modern boards. When a modern board like this Gigabyte doesn't include a floppy connector, you can update the BIOS from within the OS or with a flash drive... far easier than a stupid old-school floppy drive (I keep one just in case).
    Reply
  • DigitalFreak - Wednesday, July 25, 2012 - link

    Did you test the VIA USB 3 chip performance, or only the Intel PCH controller? Reply
  • Mustang66 - Wednesday, July 25, 2012 - link

    Are the two red USB ports on the back 2.0 or 3.0? The paragraph says they're 3.0, but the feature list seems to imply they are 2.0. Reply
  • IanCutress - Wednesday, July 25, 2012 - link

    Sorry, the red ones are USB 2.0. USB 3.0 on every manufacturer is currently blue (for now).

    Ian
    Reply
  • cameleon - Wednesday, July 25, 2012 - link

    No dual-link DVI port on the HD 4000, so I can't run my old 30" display without a dedicated card. Reply
  • Craxit - Wednesday, July 25, 2012 - link

    Display-Port to DVI-D cable?

    Same prob here.
    Reply
  • PolarisOrbit - Wednesday, July 25, 2012 - link

    So Gigabyte finally dropped VIA as the onboard audio provider on their motherboards. About time- I almost swore off Gigabyte because I had so much trouble with that driver! Reply
  • rickon66 - Wednesday, July 25, 2012 - link

    I have this board and really like it. The board package includes a front panel USB 3 bracket so you can add USB3 to the front of your case. I thought that was a very nice addition. The board is built so stoutly that it could double as body armor. Reply
  • Craxit - Wednesday, July 25, 2012 - link

    Has anybody found out how to switch the board on by keyboard?

    I can wake it from sleep using the kbd, but not do a cold start.
    Reply
  • Belard - Thursday, July 26, 2012 - link

    It's in the BIOS power settings. I've been building some systems with its smaller sister boards. You can go to Gigabyte, track down the manual and look it up... it should be there. It also gives you the option to power up with a mouse.

    Even a wireless USB keyboard managed to power up the system (cool).
    Reply
  • shin0bi272 - Wednesday, July 25, 2012 - link

    I know Intel is capable of doing on-chip video and there have been boards with onboard video forever, but the trend of putting 9001 video ports on the back of the thing instead of, oh say, 1 is disturbing.

    Let's be honest: if you're a gamer and you want 3-way SLI, you don't need onboard video. Likewise, if you're not a gamer and you want to plug your monitor into the motherboard, you don't need 3-way PCIe 3.0 SLI. Pick one and go with it!

    Plus, if you wanted to, you could include a couple of adapters to go from DVI to VGA or DVI to HDMI and have 1 plug on the board itself, which would save space on the back I/O panel and allow for more important things like more eSATA or USB 3.0, or even that WiFi the review alluded to.

    This is a case of a motherboard manufacturer trying to please everyone with 1 board instead of making a gamer board, an HTPC board and a file-server board. Saves them money but screws the consumer.
    Reply
  • shin0bi272 - Wednesday, July 25, 2012 - link

    oh and if usb 3.0 is backwards compatible with 2.0 ... why include 2.0 at all? Reply
  • Dustin Sklavos - Wednesday, July 25, 2012 - link

    USB 3.0 support is still a little bit hinky; a fresh install of Windows 7 may not recognize your keyboard if it's plugged into a USB 3.0 port without drivers.

    And uh...I use two of the display outputs on the back of my motherboard. Multi-monitor isn't that uncommon these days.
    Reply
  • IanCutress - Wednesday, July 25, 2012 - link

    Each USB 3.0 controller has an associated bill-of-materials cost. You only get 4 USB 3.0 ports from the chipset, but 12 USB 2.0. USB 3.0, as Dustin says, is a bit flaky at times - technically the Intel USB 3.0 ports should work at boot, but they do not always; it depends on how the motherboard traces are routed.

    Regarding boards and video outputs: if the CPU has the capability, motherboard manufacturers get slammed if they don't include at least one or two video outputs, just in case a user wants them. Imagine I had this board and strapped in a few NVIDIA GPUs for CUDA programming. If I could, I'd use the onboard IGP for my display, then have the GPUs purely for computational needs, and still have all the PCIe 3.0 bandwidth I would need.

    Ian
    Reply
  • Grok42 - Wednesday, July 25, 2012 - link

    I don't think I've ever agreed and disagreed with a post so much before.

    I think it is about time that motherboards ship with the ability to run multi-monitor setups out of the box. Hopefully all four outputs can drive a monitor at once! What is crazy is that they are shipping with 3-4 DIFFERENT connectors! I think all graphics connectors are completely terrible. Not one of them could drive an iPad 3 screen even if the DVI was dual-link. This is why Apple is moving to Thunderbolt, I think, but it still isn't clear to me that a Thunderbolt port could drive a hi-res display like an iPad's. The next connector should be able to drive a 16K display so we can live with one connector for a decade. Monitors last 2x-4x the lifespan of a computer. Build a connector that will last!

    Of all the things we need more of, USB isn't one of them. At work we drive 24-48 USB devices on standard low-end Dell computers. Most DIY motherboards like this support at least 6 and more typically 10. If you need more than that, a simple hub, which you already have in your monitor/keyboard/mouse/toaster, will give you all you need.

    Now, the part where I think you're spot on is that they are trying to please everyone with one board. I would expand this to the entire industry. If you've seen any of my screeds about computer cases, you know that there is really only one type of case for sale: the one that sorta works for everyone but isn't great for anyone. The MB market is better, but still a mess. Right now they have lines broken into feature grades, with each higher-level board simply adding more stuff. Instead they should be aimed at what consumers want to build.

    If you are making a highly overclockable board with support for 3 PCIe graphics cards, do you really need/want 10 SATA ports? Who is overclocking their file server and running SLI? The problem is that if you back off to a lower-grade board, you lose something you do need, so your file server has SLI support even if you don't want it.
    Reply
  • epobirs - Thursday, July 26, 2012 - link

    You are completely wrong. The iPad 3 display is merely 2048x1536. Not a big deal for dual-link DVI, which had been used to drive 2560x1600 displays for many years before the 'Retina' designation came out of Apple's marketing department. The idea that the iPad 3 display is somehow the bleeding edge of screen tech is laughter-inducing. The only thing remarkable about it is the small size. Such resolutions are old news for large displays, especially in the professional markets. Keep in mind, the Retina designation is about pixel density, not just resolution.

    The only port on that panel that cannot drive a Retina display without breaking a sweat is the legacy VGA. DVI is showing its age but we have two successors already in HDMI and Display Port. Both of those are capable of driving 4K displays that won't be common in the consumer sector for several years. More importantly, the on-board GPU tops out at 4K, so equipping the board to drive anything greater is an utter waste.

    The newer ports are already designed with monitors in mind that most people won't be able to consider buying for a very long time. Nor do today's displays have the same longevity as CRTs did. Fortunately, they compensate by rapidly improving in bang for the buck. When my $300 1680x1050 22" monitor, which seemed an amazing bargain when first purchased, died after a bit over three years, I replaced it with a 27" 1080p screen for around $250. On another desk I put in an Acer 32" HDTV as the monitor for $250, just because I could. (I remember paying close to $1,000 for my first 17" CRT, which weighed close to 80 pounds.)

    Trying to design for what will be needed a decade from now is just a waste of time. Extremely few consumers would benefit, and there is a good chance NOBODY will benefit, because something will come along that changes things so much as to render your long-term plan badly obsolete. The payoff just isn't there. VGA has been around since 1987, but there aren't any displays from that era, or even ten years later, that are worth the trouble to use today.

    I'm reminded of back in 1999, when a certain type of Apple snob loved to go on about how the original 128K Mac had no Y2K issues. Who cares? If you were still relying on an '80s Mac in 1999, your life would have to be so miserable as to make Y2K terribly low on your list of troubles.

    As for USB and SATA ports, in a full sized ATX board I'd far rather have ports going unused than have to add more later. If I want minimalist I'll build with a smaller board and case. They each have their place.
    Reply
  • aaronb1138 - Wednesday, September 05, 2012 - link

    Even VGA can drive up to around 2560x1600 @ 60 Hz, but cable quality and length become a factor (you need a $15-30 shielded cable instead of a $5 one). I run a Sony FW900 at 2048x1280 @ 85 Hz over VGA cleanly (BNC or VGA connectors; both are equal with good cables).

    Dual-link DVI can run 2560x1600 @ 60 Hz at 10 bits per color (30-bit color).
    Reply
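The bandwidth figures traded in this sub-thread can be sanity-checked with a rough pixel-clock estimate. This is only a sketch: the 20% blanking overhead is an assumed round number (real CVT/GTF timings vary, and reduced blanking needs less), while the 165/330 MHz single/dual-link DVI limits come from the DVI spec:

```python
# Approximate pixel clock required for a given display mode. The 20%
# blanking overhead is a hypothetical round figure; reduced-blanking
# timings need noticeably less.

def pixel_clock_mhz(width, height, refresh_hz, blanking=1.20):
    """Estimated pixel clock in MHz for a display mode."""
    return width * height * refresh_hz * blanking / 1e6

# Single-link DVI tops out at a 165 MHz pixel clock, dual-link at 330 MHz.
print(round(pixel_clock_mhz(1920, 1200, 60)))  # 166 -> edge of single link
print(round(pixel_clock_mhz(2560, 1600, 60)))  # 295 -> needs dual link
print(round(pixel_clock_mhz(2048, 1280, 85)))  # 267 -> the FW900 mode above
```

The last line shows why the FW900 mode above sits in "good cables only" territory for analog VGA: it needs roughly the same pixel clock as a 30" dual-link DVI panel.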
  • Belard - Thursday, July 26, 2012 - link

    This is not a true 3x SLI board. It has the slots, but not the lanes to do full-blown x16/x16/x16 or even x8/x8/x8. At $140~170, it's an upper mid-range board.

    I build systems with Gigabyte mATX boards... they'll support x8/x8 SLI or two CrossFire cards, and they also have 3 x16 slots. Not bad for $80 (Microcenter discounts).

    So having the various types of video ports is very good for typical people who only use a single monitor. With the 4 types, everyone is covered. For a dual-monitor setup with DVI inputs, I used a DVI-DVI cable and spent $15 for an HDMI>DVI cable... not a big deal.

    Even $400 video cards will require adapter cables in multi-monitor setups.
    Reply
  • rickon66 - Wednesday, July 25, 2012 - link

    I still think this board offers great bang for the buck, especially since it is often available for $139.99 at Micro Center. Reply
  • jardows2 - Wednesday, July 25, 2012 - link

    Can anyone explain the value in multiple Ethernet ports? Outside of being server board, and some specialized workstations, the practicality (and added cost) of multiple Ethernet ports escapes me. Reply
  • IanCutress - Wednesday, July 25, 2012 - link

    Connecting to multiple networks, redundancy, teaming for better throughput, connecting via ICS, VM throughput, one NIC dedicated to backups, separation of traffic (i.e. you could have a combo web/database server on the same network and put all web traffic on one NIC and DB traffic on the other, which makes it easier to calculate loads per traffic type). If you're streaming from a NAS that supports teaming, the improved bandwidth can benefit users who stream from that device. Agreed, it is perhaps a niche scenario, but there are enough users who want it. The Realtek NIC + audio is a relatively cheap bundle, but some people prefer the Intel NIC. So why not have both, as long as the price for the user is reasonable.

    Ian
    Reply
  • Snotling - Wednesday, July 25, 2012 - link

    If your NAS has two ports... you can team up your NICs on both ends.

    If your two NICs have different chipsets, then it may be for compatibility reasons. Some businesses will want to use only the Broadcom, or only the Intel or Marvell... etc. Maybe at some point you can avoid downtime if a driver update causes a problem, either by being buggy or going missing.

    Load balancing, bridging networks, acting as a gateway or firewall... even if you do not actually run a server on the board, you may want to do it for test purposes or some weird networking condition, like having two different VPNs that require you to be on two different subnets.

    I admit most of this is exceptional, but the exceptions add up, and higher-end boards aim to cover the needs of those who may run into those situations or actually need them.
    Reply
  • Grok42 - Wednesday, July 25, 2012 - link

    I can't figure it out either. I've built boxes with many NICs before for routers, gateways and bridges. Almost all the servers I've built have had 4 NICs. However, I can't imagine using the two NICs on this board for anything. Why would I want to build a NAS box with SLI and overclocking? Why wouldn't I get a much different board and add a good discrete NIC card with multiple ports? At the consumer level I can't imagine doing any of this. My file server only has a single gigabit NIC and is WAY faster than I need. I can move GBs of files in just a few seconds between it and my workstations. At work we have 10 GbE, and we team ports to increase even that, so I know there are needs for higher speeds; I just can't figure out a reason at the consumer level this board is obviously focused on. Reply
  • Einy0 - Wednesday, July 25, 2012 - link

    I own this board, and it is amazing thus far. I haven't really had a chance to push it too hard yet... one of these weekends I will try some overclocking. The 3770K is so fast, I'm still getting used to it. I am really impressed with the Z77's SATA controllers. My Vertex 4 is topping out at about 562 MB/s for reads, and my 4-disk (500GB WD Blue) RAID5 array is hitting around 362 MB/s for reads. I would love one more USB 2.0 header, or a USB 3.0 to 2.0 header adapter. A non-Realtek audio codec would be terrific too... Reply
  • vailr - Wednesday, July 25, 2012 - link

    There are evidently two versions of the UD5H board:
    The older version has a space in between the 2nd & 3rd DDR3 slots, with blue capacitors.
    The newer version has no space in between the 2nd & 3rd DDR3 slots, with purple capacitors.

    Question: why doesn't Gigabyte provide drivers for the VIA USB 3.0 ports? There are some VIA USB 3.0 drivers on www.station-drivers.com, but those fail to install on Windows 7 64-bit.
    Reply
  • Sabresiberian - Thursday, July 26, 2012 - link

    One of Gigabyte's strengths is that they've long had dual Ethernet capability, but -

    Why only one Intel? Is it really that much more expensive to just put the best in here?

    I think your read of Gigabyte has been right on the money, Ian. I've long thought the same, and wondered why some media types blew their horn so loudly.

    ;)
    Reply
  • Zak - Saturday, July 28, 2012 - link

    I see no point in adding FireWire any more... I'd rather have two eSATA ports or another SPDIF output. And why have DVI, DP and VGA? Waste of space. I really doubt anyone needs all three simultaneously. If someone needs VGA, they can use a DVI or DP adapter. Reply
  • Zak - Saturday, July 28, 2012 - link

    Typo: "Overclocking on the UD5H was a mixed back of results" Reply
  • JimDicks - Saturday, July 28, 2012 - link

    This GB mainboard comes with a Marvell 9172 6 Gbit/s SATA controller, almost the same as my GB mainboard. When I recently bought a 6 Gbit/s SSD and connected it to the 'superb' Marvell, it only reached about 250 MB/s instead of the advertised 600 MB/s. A whole afternoon of searching and reading forums and specifications revealed that most of these 3rd-party chips have higher latency than the Intel/AMD south bridges, and reach a much lower data rate than advertised, because they are connected to the mainboard via 1, at most 2, PCIe 2.0 lanes. That means a controller with four 6.0 Gbit/s connectors would need 2.4 GB/s of transfer, yet it can only theoretically move 0.5 GB/s (1 lane) or 1.0 GB/s (2 PCIe 2.0 lanes) to the mainboard. In practice, the achievable PCIe speed is even less.

    I recommend that Anandtech check not only USB speeds, but also SATA speeds via the 3rd-party chips, the southbridge, and an external PCIe x8 SAS RAID controller (e.g. LSI MegaRAID SAS 9240 or 9260). The latter could also be used to check practical PCIe bandwidths.
    Reply
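The oversubscription described in this comment is easy to quantify. A minimal sketch, assuming round numbers of ~500 MB/s usable per PCIe 2.0 lane and ~600 MB/s per SATA 6 Gbit/s port (after 8b/10b encoding):

```python
# Ratio of aggregate SATA demand to a controller's PCIe 2.0 uplink.
# The per-lane and per-port figures are rounded assumptions.

PCIE2_LANE_MBS = 500   # usable MB/s per PCIe 2.0 lane
SATA3_PORT_MBS = 600   # MB/s per 6 Gbit/s port after 8b/10b encoding

def oversubscription(ports, lanes):
    """How many times the SATA ports could outrun the PCIe uplink."""
    return (ports * SATA3_PORT_MBS) / (lanes * PCIE2_LANE_MBS)

print(oversubscription(ports=4, lanes=1))  # 4.8x over a single lane
print(oversubscription(ports=4, lanes=2))  # 2.4x even with two lanes
```

The same arithmetic explains the ~250 MB/s observation above: a single-lane uplink caps the whole controller well below one drive's advertised interface speed, before controller latency is even counted.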
  • IanCutress - Tuesday, July 31, 2012 - link

    Please note that even if you have purchased a '6Gbit/s' SSD, it will never run at that speed. If you look at our SATA and USB testing, none of the peak speeds we see in our benchmarking ever reach the peak advertised by the port due to the limitations of the hardware. They more often than not do not even reach the peaks of the hardware due to latency or real-world situations. Most rated speeds are for compressible continuous data with a high queue-depth - not ever a realistic scenario.

    Testing every 3rd party controller adds testing time. Going from 2-3 boards a month do maybe one and a bit. Especially if they're all connected differently on the board (which we don't always know without specific chipset diagrams for the specific product, which are not always available). After all, peak tests are limited in their understanding anyway - take the new USB boost technologies from ASUS and ASRock. ASUS' implementation affects mainly short size block transfers by several order of magnitudes (as found out in our testing), rather than peak by any significant amount.

    I am quite astounded by your insistence that if you have many ports, they must all run at their peak speed at the same time. It just doesn't work that way. Have a gander at the chipset diagrams in our first Z77 reviews to understand this. Bandwidth is delivered to whatever needs it at the right time - the usage model of hammering all the ports at once is almost non-existent.

    Ideally we'd love to test everything at its peak, but we do not have an unlimited amount of cash to go and buy equipment. We're independent freelance reviewers making do with what kit we can get together or are offered.
    Reply
  • iCrunch - Sunday, July 29, 2012 - link

    I've become so spoiled by how seamlessly, quickly, and consistently Thunderbolt works. Add to that the fact that I can probably get most of these specs in an external Thunderbolt enclosure/dock solution fairly soon...albeit for a tad more money. Reply
  • Scootiep7 - Sunday, July 29, 2012 - link

    Wow, so Gigabyte finally took my complaints to heart and started releasing boards with something other than that miserable excuse for an audio codec, the ALC887. Bravo, good sirs - better late than never. Now get it done on more of your boards! Reply
  • Questor - Tuesday, July 31, 2012 - link

    I have this motherboard with an Ivy Bridge i5 3570K, currently at stock speed, with a Plextor 128 GB SATA III SSD as the OS drive. Even with the POST logo screen enabled, I get to the desktop log-in in about 7 seconds. Excluding the "artificially" added time to type my password, the full desktop loads in an average of 2.5 seconds from there. Not bad considering the programs I have loading at startup versus AnandTech's test setup.
    I have had my board since its early retail release, and this has not changed much even after replacing the original i5 2500K and Samsung SATA II 128 GB SSD with the above-mentioned components.
    Reply
  • Questor - Tuesday, July 31, 2012 - link

    In addition to my comments on the misleading boot-time commentary and the USB vs. PS/2 keyboard debate: my present build with this Gigabyte board is one of the best/favorite rigs of my 14 years of building and tweaking computers.
    The other few are the ASUS P8P67-PRO, Gigabyte MA790GP-UD4H (X2 with two cores unlocked to quad, and STILL running stable), EPoX EP-9NPA+ Ultra, and ASUS P5A with an AMD K6-350 (a severe electrical spike took it out after years of OCing - it jumped the surge protector)!
    Reply
  • minlian - Sunday, August 12, 2012 - link

    Is the DPC latency issue fixed on the Gigabyte Z77X-UD5H-WB-WiFi version, or is it the same motherboard with WiFi as the only difference? Reply
  • Triniman - Sunday, August 12, 2012 - link

    Ian Cutress wrote, "...a total of 10 USB 3.0 ports available (if you have enough USB 3.0 panels)."

    Does anyone know of any cases that would allow you to use all three onboard USB 3.0 headers?

    Also, is it correct that Gigabyte doesn't include drivers for these VIA controllers?
    Reply
  • ETruett - Tuesday, September 11, 2012 - link

    Hey. I am wondering if the Gigabyte GA-Z77X-UD5H motherboard will fit in the Dell E510 case. Or which case can I get? I would like to build a nice gaming PC that will not break me. I don't have much money. My email is: etruett9@gmail.com..... :-)) Reply
  • gkatsanos - Monday, November 26, 2012 - link

    Hi, it's weird how in none of the reviews was there a word about the problems (bugs) this M/B has. I ordered it a week ago, and even though I upgraded my BIOS to the latest version, it still suffers from problems. I had, for example, to disable the Marvell SATA controller and connect my drive to the Intel chipset one for the system to shut down properly... Other folks have had freezes etc. Google the model plus "problems" and you'll see. I am thinking of exchanging it for an Intel m/b or something more stable. Reply
