NVIDIA introduced the 680i chipset in early November to eager anticipation from users looking to try something different in the Intel processor world. A great deal of hype, fanfare, and media coverage surrounded the launch of the NVIDIA 600i family of chipsets. We reviewed the 680i chipset in depth at launch and came away very impressed with its capabilities for the upper-end enthusiast. Since then, most of the focus surrounding the 680i chipset has shifted from its impressive performance and flexibility to problems that seemed to plague the reference board designs from launch partners such as EVGA and BFG, and later the in-house designs from ASUS and abit. These problems centered on audio issues when using SLI, as well as data corruption or severe performance loss when utilizing SATA drives on the reference boards. The audio issues were solved with a quick BIOS update, although we found in our testing that installing the latest Microsoft DX9 updates also helped tremendously. The data corruption was an entirely different issue that seemed to center on users with RAID arrays but also spread to single-drive users under varying circumstances.

Several BIOS releases later, it seems as if the majority of the data corruption issues with the reference boards have been cured. Fortunately, this was an issue we did not witness in our testing of the ASUS or abit 680i motherboards, or with either of our two EVGA motherboards. While NVIDIA attributes these problems to signal timings on the motherboard, we still do not completely understand why the issue occurs on one board and not another. NVIDIA has commented that statistical variations in the electrical paths from board to board can result in some boards being affected and others not. We know from the board manufacturers that this chipset is very sensitive to electrical noise, which is one of the main reasons a specific set of voltages is required to reach the upper overclocking limits of a board. This set of voltage settings seems to differ from board to board, and in our initial opinion the difference comes down to having a very "tolerant" MCP. We are almost finished with our testing of several 680i/650i boards and will have a roundup ready in the near future, but at this time we commend NVIDIA for their continued persistence in trying to solve these problems.

While it appears the early problems have largely been solved, numerous users are still having issues with everything from USB keyboard compatibility to one of our main concerns at this time: quad core overclocking. Overclocking is never guaranteed, but paying over $250 for a 680i board that is usually advertised as being designed for the gamer and serious enthusiast, with extreme overclocking in mind, would lead one to think overclocking would not be a problem. The problems with quad core overclocking on the 680i chipset have been well documented, and it appears they will be solved shortly, or at least we hope so. Enough history; let's discuss the board we are previewing today.


Gigabyte launched the GA-N680SLI-DQ6 motherboard this week. We first discussed this motherboard in early November and then saw a working sample in early December. It has taken Gigabyte some time to get the board to market because they designed it in-house instead of simply reselling NVIDIA's reference board design. This can be both a positive and a negative depending on how you look at it. The negative is that Gigabyte is late to market; the hoped-for positive is that they learned from the problems surrounding the reference board launch and have something extra to offer the potential buyer looking at a 680i chipset equipped board.

We can definitely say that Gigabyte has something extra to offer on their 680i board. If we were to hold a Pimp My Board contest today, Gigabyte would win without breaking a sweat. This is not meant as a knock against Gigabyte; quite the contrary, they have added just about every feature you possibly can to a motherboard, and it all works together. Their marketing group has clearly been working overtime figuring out every potential way to use the word Quad in front of a feature: Quad-Core Optimized, Quad SLI Ready, Quad Gigabit LAN, Quad-Triple Phase, Quad BIOS, and Quad e-SATA are all touted on the website and product packaging. What really matters in the end is whether the board performs well, is stable, has high quality components, and is well supported. We think Gigabyte has met those basic criteria.

This leads us into today's performance preview of the Gigabyte GA-N680SLI-DQ6. In our article today we will briefly go over the board layout and features, provide a few important performance results, and discuss our issues with the board. We will provide a further review of this product and additional performance results in our upcoming roundup. With that said, let's take a look at this board now.


12 Comments


  • sirius4k - Thursday, May 17, 2007 - link

    The overview on Gigabyte's website said there would be some eSATA (Quad eSATA or something) ports. On this preview... the rear panel indicates no eSATA ports :S
    ---
    No eSATA means going back to Striker Extreme... of course.
  • yacoub - Friday, March 30, 2007 - link

    The reviews at NewEgg are tearing this board a gaping butthole. I'm staying away. :[
  • Gary Key - Monday, April 2, 2007 - link

    Every review at NewEgg was either a four or five star rating for this board. Where are the bad ones?
  • Binkt - Monday, March 19, 2007 - link

    Can someone over there put in a few PCI-E RAID cards in those extra PCI-E slots and see if they function? The Areca SATA RAID cards (ARC-12x0ML) are what I'm looking at right now. Pretty please?!

    There is a rather cryptic FAQ entry on using PCI-E for "graphics" slots on Areca's website in regards to this subject. I'd just like some more physical validation before plunking down the green.
  • erwos - Monday, February 26, 2007 - link

    Am I the only one who's totally and utterly confused as to why this board has four ethernet interfaces? I can see using two interfaces. I could even contemplate three for really weird setups. But what networking setup requires four gigabit interfaces? Are they supposed to be bonded, or used for fail-over?

    Speaking purely as a gamer, the MSI P6N Diamond looks like a better deal. It may be shorter on the ports, but that built-in X-Fi seems a lot more handy than a couple more SATA and ethernet ports.
  • Gary Key - Monday, February 26, 2007 - link

    1. XP Professional will show 3.25GB of RAM when 4GB is installed. The board will show 4GB at POST.

    2. The RAM timings will drop with 4x1GB when overclocking; at stock speeds with the F3 BIOS they require an additional 0.0125V to operate at the same timings.

    3. The timings matter when using 2x2GB compared to 2x1GB, 512MB; however, at the same timings we found 2x2GB was generally more stable and performance did not vary by more than a percent or two.

    4. If you use a 32-bit OS such as XP you are limited to 3.25GB of usable memory space.

    5. This board did not have an issue with Vista-64 recognizing 4GB or 8GB of memory. As stated in the article, we are still conducting memory compatibility testing, as certain modules perform better than others (stability, voltages, timings) even though they are based on the same ICs. Gigabyte still has some tuning work to do in this area.

    Thanks, more information will be in the roundup.
  • anandtech02148 - Saturday, February 24, 2007 - link

    For example, the DFI Infinity 975G requires 300 watts just to POST.
    Also, what is the idle/load power draw for this one? More electricity, more heat.
  • cornfedone - Saturday, February 24, 2007 - link

    ...or don't. As long as gullible, foolish fanboys buy these defective products, there is no FINANCIAL incentive for these unscrupulous companies to change their ways and deliver quality products.

    Obviously if every hardware review site on the planet can duplicate the unending operational (and often design/engineering) defects in these mobos, then certainly the mobo and chip makers could detect these defects BEFORE they ship this crap if they weren't intentionally pumping garbage out the door to suckers willing to pay $200 plus for a mobo that is a total POS.

    There is absolutely NO reason to release a defective hardware product today other than financial greed and/or technical incompetence. Hell most of the Asian mobo companies can't even make a friggin quality copy of a reference mobo from AMD or ATI so why would you expect them to deliver a properly functioning "performance mobo" priced at hundred of dollars more when they can't buy a clue?

    With any luck all of the slimy mobo makers will go tits-up soon and the real mobo companies will see an opportunity to provide quality mobos to the marketplace. At $200 a copy there is one Helleva incentive for honest, competent mobo companies to step forward and waste the Asian scum who are dumping crap into the marketplace. When a $200 plus mobo causes data corruption it's time for a massive class action lawsuit to end this consumer fraud and exploitation.

    Now is the time.
  • sdsdv10 - Tuesday, February 27, 2007 - link

    quote:

    At $200 a copy there is one Helleva incentive for honest, competent mobo companies to step forward and waste the Asian scum who are dumping crap into the marketplace.

    Cornfedone, what major motherboard manufacturer isn't in Asia? It appears you are painting all the current companies with the same brush: ASUS, Gigabyte, abit... Who would be left to be the "honest, competent mobo companies"?
  • tuteja1986 - Sunday, February 25, 2007 - link

    Well, the Gigabyte GA-N680SLI-DQ6 isn't even on sale yet; it goes on sale next month, so they still have time to fix the bugs. Anyway, I say buying the Striker at launch for $400 was a foolish thing to do since it was buggy as hell. It took them months to fix the problems.
