Lucid’s Hydra Unleashed: Part 1

by Ryan Smith on 1/7/2010 11:00 AM EST

47 Comments

  • liveonc - Tuesday, March 23, 2010 - link

    Hydra is still pretty raw, but can it be the One Chip to rule them all, One Chip to find them, One Chip to bring them all and in the darkness bind them, in the Land of Mordor where the Shadows lie? CPU, GPU, and GPGPU wars coming to a standstill, where it doesn't matter if you use Intel, AMD, NVIDIA, or ATI.
  • Focher - Wednesday, January 13, 2010 - link

    I think people should really give Lucid their due in regards to proving the underlying concept - that it is feasible to deliver mixed frame rendering in real time. Granted, the technology still seems immature but one has to remember that AMD and NVIDIA have both rejected the approach at this point.

    I'm still prepared to wait and see how the technology - and not just the current approach from Lucid - evolves. For example, perhaps AMD and NVIDIA will put some R&D effort into multi-GPU cards that are better equipped for mixed frame rendering. Having it all on the same board could alleviate some of the bottlenecks.
  • Baron Fel - Sunday, January 10, 2010 - link

    Crysis has a 91 at Metacritic and sold millions.

    Just wanted to point that out.
  • x86 64 - Saturday, January 9, 2010 - link

    I thought the Hydra didn't do SLI/CF through software? I thought that was one of the main benefits of Hydra: no software profiles were needed. The preliminary results you guys posted are less than impressive. Not to sound like a pessimist, but I figured it was too good to be true.
  • Focher - Wednesday, January 13, 2010 - link

    I think the term "profiles" isn't appropriate, as the review suggests it's more of a whitelist than any type of profile with customized settings for the specific game.
  • prophet001 - Friday, January 8, 2010 - link

    The implications of this technology are tremendous. I'm rather surprised at people brushing it off. It is fledgling and will obviously need some work but they will be able to do some really neat things once this matures. I'm thinking GPU farm via external PCI-E.
  • hyvonen - Friday, January 8, 2010 - link

    "We’ll start with Call of Juarez, which is one of the Hydra’s better titles. With our 5850s in Crossfire, we get 94fps, which is just less than double the performance of a single 5850 (49.5)."

    Don't you mean "... we get 94fps, which is more than double the performance of a single 5850 (49.5)"?

    Or, in other words, WTF happened - how do you get more than double the performance with CF?!? Something got messed up in your test here, bro.
  • Veerappan - Friday, January 8, 2010 - link

    Read it again... He's saying that the 94fps that they got is just slightly LESS THAN double 49.5 fps. So if a single 5850 gets 49.5 fps, double that is 99 fps.

    They got 94 fps, which is just a little bit less than 99.
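    A quick sanity check of the arithmetic, as a minimal sketch (Python; the numbers are just the ones quoted from the review):

        # Scaling math from the quoted Call of Juarez numbers.
        single_5850 = 49.5            # fps, one 5850
        crossfire   = 94.0            # fps, two 5850s in CrossFire

        scaling = crossfire / single_5850           # ~1.90x
        efficiency = crossfire / (2 * single_5850)  # ~0.95, i.e. ~95% of a perfect doubling
        print(f"{scaling:.2f}x scaling, {efficiency:.0%} of perfect doubling")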
  • jmurbank - Friday, January 8, 2010 - link

    To me, Lucid has something going, but they should have done it differently. If they had created the Hydra chip to be an on-board graphics chip and dispatcher, things would be different. Right now all they have is a dispatcher chip that uses a discrete graphics card to output video, which gives it multiple bottlenecks. It would be better if the Hydra output the graphics on its own through its own display port, while all the processing is done by the discrete graphics cards using stream processing technology like CAL (ATI) and CUDA (NVIDIA).

    Of course, bad drivers screw up everything. Have a look at ATI's history. ATI still makes poor software, but people do not mind. It seems people care more about performance than reliable and stable drivers. I care more about reliable and stable drivers, so it screws up my day if my computer crashes because of a driver.
  • beginner99 - Friday, January 8, 2010 - link

    The worst thing you can do is promise stuff you can't deliver. It's sad. After these first benchmarks, the tech will have a bad reputation even if it gets better over time. Intel's way would have been better: just don't release it at all if it's an underperformer.

    I do see that it must be extremely complex to get this running at all, so it's actually quite an achievement. But it's similar to cars: combustion engines have been optimized over the last 100 years, so no wonder no new technology can compete.
    Maybe in 1-2 years this will be usable. If Lucid is still alive by then... I don't believe many will buy this board.

    I also was rather surprised about CF. It used to be quite bad too, as I remember? Probably improved by a driver update? And who knows what NVIDIA or ATI is doing in their drivers. I assume they could put in stuff to cripple Hydra on purpose.
  • cesthree - Friday, January 8, 2010 - link

    Multi-GPU gaming already suffers from drivers that suck. You want the < 3% who actually run multi-GPU setups to throw Hydra driver issues into the mix? That doesn't sound appealing at all, even if I had thousands to throw at the hardware.

    Fastest Single GPU. Nuff Said.

    Although if Lucid can do this, then maybe ATI and Nvidia will get off their dead-bums and fix their drivers already.
  • Makaveli - Thursday, January 7, 2010 - link

    The major fail is most of the posts on this article; it's very early silicon with beta drivers, and most of you expect it to be beating Crossfire and SLI by 30%, when the big guys have had years to tune their drivers and they own the hardware. I would like to see where this is by next Christmas before I pass judgment. Just because you don't see it in front of your face doesn't mean the potential isn't there.

    Sometimes a little faith will go a long way.

  • prophet001 - Friday, January 8, 2010 - link

    I agree.
  • Hardin - Thursday, January 7, 2010 - link

    It's a shame the results don't look as promising as we had hoped. Maybe it's just early driver issues, but it looks like it's too expensive and not any better than Crossfire as it is. It doesn't even have DX11 support yet, and who knows when they will add it.
  • Jovec - Thursday, January 7, 2010 - link

    With these numbers, I wonder why they allowed them to be posted. They had to know they were getting much worse results with their chips than CF, and the negative publicity isn't going to do them any good. I suppose they didn't want to have another backroom showing, but that doesn't mean they should show at this stage.
  • jnmfox - Thursday, January 7, 2010 - link

    As has been stated, the technology is unimpressive; hopefully they can get things fixed. I am just happy to see one of the best RTSes ever made in the benchmarks again. CoH should always be part of AnandTech's reviews; then I wouldn't need to go to other sites for video card reviews :P.
  • IKeelU - Thursday, January 7, 2010 - link

    I was actually hoping AMD would buy this tech and integrate it into their cards/chipsets. Or maybe Intel. As it stands, we have a small company developing a supposedly GPU-agnostic "graphics helper" that is attempting to supplant what the big players are already doing with proprietary tech. They need support from mobo manufacturers and cooperation from GPU vendors (who have little incentive to help at the moment, given the desire to lock users into proprietary stuff). I really, really want the Hydra to be a success, but the situation is a recipe for failure.
  • nafhan - Friday, January 8, 2010 - link

    That's the same thing I was thinking through the whole article. The market they are going after is small, very demanding, and completely dependent on other companies. The tech is good, but I have a hard time believing they will ever have the resources to implement it properly. Best case scenario (IMO): AMD buys them once they go bankrupt in a year or so, keeps all the engineers, and integrates the tech into their enthusiast NB/SB.
  • krneki457 - Thursday, January 7, 2010 - link

    Anand, couldn't you use a GTX 295 to get approximate GTX 280 SLI figures? I read that Hydra doesn't work with dual-GPU cards, but couldn't you disable Hydra? You mentioned in the article that this is possible.

    As for the technology itself, as a lot of comments have already mentioned, I really don't see much use for it. Even if it worked properly, it would have been more at home on low- to mid-range motherboards.
  • Ryan Smith - Thursday, January 7, 2010 - link

    I'm going to be seriously looking at using hacked drivers to get SLI results. There are a few ways to add SLI to boards that don't officially support it.

    It's not the most scientific thing, but it may be worth bending the rules this once.
  • krneki457 - Friday, January 8, 2010 - link

    Sorry Ryan, just noticed you wrote the article. Well, it was just an idea for how to get at least some SLI results with as little hassle as possible. Presuming Hydra can be turned off to work only as a PCIe bridge, then this ought to work.
  • chizow - Thursday, January 7, 2010 - link

    Have you tried flashing the Trinergy BIOS for SLI support? It might kill off Hydra capabilities in the meantime and reduce the Hydra 200 to its basest form, a PCIe controller, but for purposes of measuring N-mode performance that should suffice. The other alternative would be to simply use the Trinergy's SLI results as a plug-in doppelganger, since it is identical to the Fuzion save for the NF200 vs. Hydra 200 serving as the PCIe switch.
  • jabber - Thursday, January 7, 2010 - link

    I think it has some promise. I think the ultimate aim is to be able to 'cobble' together a couple of GPUs of similar capability, have them work efficiently together and not have to worry about profiles. The profiles could just be handled seamlessly in the background.

    If they can push towards that then I'll give them the time.
  • chizow - Thursday, January 7, 2010 - link

    The technology does still rely on profiles, though. You don't need to set up game-specific profiles like with Nvidia; even though that kind of granularity is probably the best option, your choices are limited to a handful of somewhat generic performance/optimization profiles provided by Lucid.

    The scariest part of it all is that these profiles will rely on specific profiles/drivers from both Nvidia and AMD too. I'm pretty sure it's covered in this article, but it's covered for sure in Guru3D's write-up: Lucid only plans to release updates *QUARTERLY*, and those updates will only support specific drivers from Nvidia and ATI.

    Obviously, depending on Lucid's turnaround time, you're looking at significant delays in their compatibility with Nvidia/ATI, but you're also looking at potentially 3 months before an update supports an Nvidia/ATI driver for a newer game you're interested in playing. Just way too many moving parts, added complexity, and reliance on drivers/profiles, all for a solution that performs worse and costs more than the established AFR solutions.
  • danger22 - Thursday, January 7, 2010 - link

    Maybe the AMD 5000 cards are too new to have support for Hydra? What about trying some older, lower-end cards? Just for interest... I know you wouldn't put them in a $350 mobo.
  • vol7ron - Thursday, January 7, 2010 - link

    I like the way this technology is headed.

    Everyone is saying "fail," and maybe they're right because they wanted more from the release, but I think this still has potential. I would say either keep the funding going, or open it up to the community at large to hopefully adopt and improve it.

    The main thing is that down the road this will be cheaper, faster, better. When SSDs came out stuttering, people were also saying "fail."
  • shin0bi272 - Thursday, January 7, 2010 - link

    I know how you feel, but their original claim was linear scaling with a 7-watt chip on the mobo. It's not even as good as standard Crossfire (and probably not even standard SLI), so that's what's prompting the fail comments. Instead of getting 75fps in Call of Juarez with a pair of 5850s, they should be getting 99 or 100 according to their original claim. Don't get me wrong, it functions, and for a chip that's literally a couple of months old (maybe 24 since its announcement) that's great, but the entire point of Hydra was to do it better out of the box than the card makers were doing it.
  • shin0bi272 - Thursday, January 7, 2010 - link

    I had high hopes for this technology, but alas, it appears it is just not meant to be. Maybe it's the single PCIe x16 link they are using to try to feed two PCIe 2.0 x16 video cards... just saying. Would have been nice to be able to keep my 8800 GTX and add in a 5870, but oh well.
  • AznBoi36 - Thursday, January 7, 2010 - link

    Why would you spend $350 on this mobo and then another $350 for a 5870, just so you can use your old 8800 GTX for a minimal gain? You could spend $150 on a CF mobo plus two 4890s at $150 each, for a total of $450, and that would give a 5870 a run for its money.
  • shin0bi272 - Thursday, January 7, 2010 - link

    Oh, and the reason for the 5850s is that I really want the DX11 capabilities... I could go with two 4890s and end up paying less, yes, but it wouldn't be DX11.
  • shin0bi272 - Thursday, January 7, 2010 - link

    I know, that's what I was saying. The technology was supposed to be more worthwhile than this. Plus, you can't mix GPUs on a regular motherboard, so I'd have to get another 8800 GTX to match mine on my SLI-supported motherboard. Or, if I wanted to go with ATI's new cards, I'd have to get two 5870s ($400 ea.) and a new Crossfire mobo ($150) to go Crossfire instead. That's more expensive than getting this $350 mobo and adding a 5870 to my 8800 GTX. Even if I went with two 5850s at $300 each, it's still more expensive than buying this $350 mobo and one 5850. So you see why I really was hoping this tech would work better than it does.

    This would really do well in the budget mobo market, IMO, so that people who didn't want to pay $300+ for a motherboard and then buy two video cards could use an old card and get better performance than they would have by just getting the low-end mobo and using their old GPU.

    If they can get it to be truly linear (or close to it) like they originally claimed, then maybe some other motherboard makers will pick it up, but if they don't get it fixed it will end up falling by the wayside as a tech that missed its time (sort of like the hardware PhysX cards).

    Then again, the Crossfire 5850s in the Call of Juarez test got nearly linear scaling themselves, which is sort of new, isn't it? Isn't it the norm for Crossfire and SLI setups to do 40-50% better than single cards, not ~90%? Could just be my erroneous recollection, but I don't recall near-perfect doubling of fps with SLI or Crossfire before.
  • GourdFreeMan - Thursday, January 7, 2010 - link

    It is an amazing technological feat that they got this working at all, but in the few games in which it does work properly, the performance is frankly terrible. Look at what happens when you pair a 5850 and a GTX 280 -- equal or worse performance than a 5850 by itself, when theoretically you should get a ~75% gain over a single card.
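    To spell out where that ~75% would come from, here's a minimal sketch; the 0.75 relative-performance figure is an assumption for illustration, not a measured number:

        # Ideal mixed-GPU gain if the workload were split perfectly (illustrative).
        # Assumes a GTX 280 delivers roughly 0.75x a 5850's frame rate here.
        hd5850 = 1.00                      # normalized 5850 performance
        gtx280 = 0.75                      # assumed relative GTX 280 performance
        ideal_pair = hd5850 + gtx280       # perfect load balancing, zero overhead
        gain = ideal_pair / hd5850 - 1
        print(f"Ideal gain over a single 5850: {gain:.0%}")  # ~75%

    Anything at or below the single-card number means the pairing overhead is eating the entire theoretical benefit.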
  • FATCamaro - Thursday, January 7, 2010 - link

    This technology had fail written all over it. They unleashed a big sack of fail...
  • chizow - Thursday, January 7, 2010 - link

    Sadly, I think Lucid missed their window of opportunity, as the need for Hydra largely evaporated with X58, and certainly with P55's mainstream launch, which offered support for both CF and SLI on the same platform. The only real hope for Hydra was the prospect of vendor-agnostic multi-GPU with better-than-AFR scaling.

    Those lofty goals seem to be unrealistic now that we've seen the tech, and with its current slew of problems and its incredibly high price tag, I just don't see the technology gaining any significant traction over the established, supported multi-GPU AFR methods.

    The article touched on many of the key problems early on but never really drilled down on them; hopefully we'll see more in the next installment, especially on IQ and latency:

    quote:

    What you won’t see them focusing on is the performance versus AFR, the difference in latency, or game compatibility for that matter.


    Guru3D did an extensive review as well and found CF scaled significantly better than Hydra without fail. Add in the various vendor-specific feature compatibility questions and an additional layer of driver profiles that needs to be supported and synchronized between potentially three parties (Nvidia and ATI sync'd to each Lucid release), and you've got yourself a real nightmare from an end-user perspective.

    I'm impressed they got their core technology to work - I was highly skeptical in that regard - but I don't think we'll be hearing too much about this technology going forward. It's too expensive, too complicated, and suffers from poor performance and compatibility. I don't see the situation improving any time soon, and it's clearly going to be an uphill struggle to get their drivers/profiles in order with both older titles and new releases.
  • sheh - Thursday, January 7, 2010 - link

    I agree. It's interesting from a technical standpoint, but not many would want to go through all the fuss of SLI/CF (noise, heat, power) plus having to worry about the compatibility of two or three sets of drivers at the same time. And that's assuming costs weren't high and performance was better.

    I suspect in 1-2 years NV or ATI will be buying this company.

    (I'm somewhat surprised even SLI/CF exists, but maybe the development costs aren't too high or it's the bragging rights. :))
  • chizow - Thursday, January 7, 2010 - link

    quote:

    I suspect in 1-2 years NV or ATI will be buying this company.


    Not so sure about that; Intel has actually been pumping venture capital into Lucid for years, so I'm sure they're significantly invested in its future at this point. I actually felt Lucid's Hydra was going to serve as Intel's CF/SLI answer, not so much as a straight performance alternative, but rather as a vessel to make Larrabee look not so... underwhelming.

    Think about it: sell a Larrabee for $200-$300 that, on its own, is a terrible 3D rasterizer, pair it up with an established, competent GPU from Nvidia or ATI, and you'd actually get respectable gaming results. Now that Larrabee has been scrapped for the foreseeable future, I'd say Intel's financial backing and plans for Hydra are also in flux. As it is now, Hydra is in direct competition with the PCIe controllers Intel provides for little added cost, which support both SLI and CF natively (a licensing fee is needed for SLI). In comparison, the Hydra 200 chip reportedly costs an additional $80!
  • TemjinGold - Thursday, January 7, 2010 - link

    The issue I see is that X-mode will most commonly be used by people looking to save a few bucks when upgrading, by combining the card they already have with a new one they buy. Unfortunately, I seriously doubt that this is the same crowd that would shell out $350 for a mobo. That just leaves A and N modes, where the Hydra currently loses horribly to CF/SLI.

    If the Hydra were put on a cheap mobo, I might see where it could be appealing. But someone who spends $350 on a mobo will most likely just shell out for 2-3 new gfx cards at the same time, rather than going "gee, I could put this $50 to use if I reuse my old video card."
  • AznBoi36 - Thursday, January 7, 2010 - link

    Right on. If I were to spend that much on a mobo, I wouldn't be thinking about saving money by using an old video card, and in no way would I be mismatching cards anyway. Seeing all these performance issues, I wonder what 3-way would be like...
  • ExarKun333 - Thursday, January 7, 2010 - link

    3-way would be ideal. ;)
  • ExarKun333 - Thursday, January 7, 2010 - link

    Performance seems a little weak; hopefully driver fixes will improve the scaling. If it can't match SLI/CF performance, it probably won't survive. Less scaling in "X" mode is acceptable, but you would still hope for decent performance gains with mismatched cards.
  • Zstream - Thursday, January 7, 2010 - link

    Please!!!

    I couldn't care less if CF/SLI achieves 100FPS average when it dips down to 15FPS.
  • Mr Perfect - Thursday, January 7, 2010 - link

    He's absolutely right.

    When buying a video card, I'm not worried about the highest frame rate peak, or even the average framerate; I'm trying to avoid the dreaded minimum-FPS slideshow. If a card spends half of its time doing 100FPS and the other half at 15FPS, I'd rather buy the card that does a steady 50FPS.
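    As a quick illustration of why the average hides this, here's a minimal sketch using the made-up 100/15 split from above:

        # Why average FPS hides stutter: a card at 100 FPS half the time
        # and 15 FPS the other half still posts a healthy-looking average.
        t = 1.0                        # seconds spent in each half
        frames = 100 * t + 15 * t      # total frames rendered over both halves
        avg_fps = frames / (2 * t)     # time-weighted average = 57.5 FPS
        min_fps = 15.0                 # what you actually feel in-game
        print(f"average: {avg_fps} FPS, minimum: {min_fps} FPS")

    On paper that card averages higher than the steady-50FPS one, but the minimum is what determines whether the game feels playable.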
  • StraightPipe - Thursday, January 7, 2010 - link

    The article clearly stated that FRAPS doesn't work and that they are just grabbing FPS off the screen. Not very accurate or reliable, but it's better than nothing.

    Here's hoping that we'll get some more detail about minimum FPS in Part 2.
  • jnmfox - Thursday, January 7, 2010 - link

    QFT
  • GeorgeH - Thursday, January 7, 2010 - link

    I agree; the minimum frame rate is very important here, especially in X mode. I understand it might not be possible without better support from games and benchmarking tools, but if you can find a way, please do so.
  • AmbroseAthan - Thursday, January 7, 2010 - link

    Do the Hydra drivers allow Folding@Home to be run, or is that another sacrifice for using Hydra?

    I have hopes Lucid can get Hydra up and running pretty solidly, as the concept is wonderful. But it does seem multiple reviews right now are finding a lot of issues with the drivers.
