Visiting the SilverStone booth was rather interesting.  In a similar vein to MSI's GUS, which used Thunderbolt to carry PCIe data, SilverStone is developing its own at-home device capable of housing dual-width GPUs up to 450W.

This is still in the development cycle.  The main issue with MSI's device (and I'd assume SilverStone's as well) is how to deal with hot-plugging in the middle of intense GPU workloads.  Aside from frame buffer management, at the point in time when the device is detached there is no longer access to any data on the card – the PC or laptop then has to transfer what it knows to the frame buffer on the IGP and recalculate.  It sounds easy-ish, but it is not a trivial task by any means.
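The handover described above can be sketched in a few lines. This is a purely illustrative mock-up, not any vendor's actual driver logic; the class and function names (`IntegratedGPU`, `handle_surprise_removal`, etc.) are invented for the example:

```python
# Hypothetical sketch of the hot-unplug handover described above.
# Once the cable is pulled, nothing on the external card is reachable;
# only what the host already cached can be handed over to the IGP.

class IntegratedGPU:
    def __init__(self):
        self.framebuffer = None

    def adopt_framebuffer(self, pixels):
        # Copy the last frame the host still holds into IGP memory.
        self.framebuffer = list(pixels)

    def rerender(self):
        # Subsequent frames must be recalculated on the IGP from scratch.
        return "rendering on IGP"

def handle_surprise_removal(last_known_frame, igp):
    # The PC transfers what it knows to the IGP's frame buffer,
    # then continues rendering there.
    igp.adopt_framebuffer(last_known_frame)
    return igp.rerender()
```

The hard part in practice is everything this sketch glosses over: in-flight GPU commands, resources that only ever lived in the card's VRAM, and applications that assume the device never disappears.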

SilverStone told me there is no release date as such, as the device is still in the alpha phase of development, but they are aiming for the $200-250 price point without a GPU.


  • DanNeely - Friday, June 07, 2013 - link

    You're trying to flip it a different way than I'm suggesting. Spin it around the vertical axis so the card's fan is up against an exterior vent, not facing the internal volume of the enclosure.
  • hypopraxia - Friday, June 07, 2013 - link

    I have wanted and waited for exactly this product. An external GPU connected via Thunderbolt that feeds its frame buffer back to the internal display. -:excited:- :) Dedicated GPU at home, integrated GPU on the go. It's so... elegant.
  • twistedgamez - Friday, June 07, 2013 - link

    The frame buffer is not sent back to the original display in current solutions. I am quite sure that you need to plug an external display into the graphics card; I believe the Thunderbolt 1 cable essentially acts as an x4 PCIe extension cable.

    Look at the linked MSI page and the Sonnet Thunderbolt Echo Express, and the videos of them in action on YouTube.

    Thunderbolt currently runs PCIe x4 v2 - hopefully Thunderbolt 2 is x8 v2 or x4 v3...
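The link widths traded back and forth in this thread translate into raw bandwidth roughly as follows. A quick back-of-the-envelope sketch, assuming only the standard PCIe line rates and encoding overheads (real-world throughput is lower, and Thunderbolt's own channel limit caps the x4 v2 figure further):

```python
# Rough per-lane throughput (assumptions: standard PCIe line rates
# and encoding overheads; not measurements of any specific product).
GT_PER_LANE = {"2.0": 5.0, "3.0": 8.0}        # gigatransfers per second
ENCODING    = {"2.0": 8 / 10, "3.0": 128 / 130}  # 8b/10b vs 128b/130b

def lane_gbps(gen):
    # Usable gigabytes per second for one lane of the given generation.
    return GT_PER_LANE[gen] * ENCODING[gen] / 8  # bits -> bytes

def link_gbps(gen, lanes):
    return lane_gbps(gen) * lanes

print(link_gbps("2.0", 4))            # x4 v2: 2.0 GB/s
print(link_gbps("2.0", 8))            # x8 v2: 4.0 GB/s
print(round(link_gbps("3.0", 4), 2))  # x4 v3: ~3.94 GB/s
```

So the hoped-for x8 v2 or x4 v3 link would roughly double the raw bandwidth over x4 v2, before Thunderbolt's own channel limits are taken into account.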
  • jstapels - Friday, June 07, 2013 - link

    Unfortunate, but hopefully we will eventually get to the stage where these are designed more as "framebuffer" accelerators.
  • Death666Angel - Friday, June 07, 2013 - link

    This already exists in some fashion. Many people have hacked the ExpressCard slot in their laptops to take advantage of the x1 or x4 PCIe interface for external GPUs.
    But the problem of what to do when hot-plugging remains, I believe.
  • winterspan - Saturday, June 08, 2013 - link

    It's not a hack... you can buy ExpressCard-to-PCIe shells for this very purpose. However, ExpressCard (x1 PCIe) didn't have enough bandwidth for high-end GPUs. I believe the one I saw had an ATI 3770 back when the 4870 was king.
  • Ryan Smith - Friday, June 07, 2013 - link

    A GTX 480? Now these guys are just showing off. Haha.
  • Khenglish - Friday, June 07, 2013 - link

    With current Thunderbolt, graphics cards run on a restricted x4 PCIe 2.0 link. The bandwidth is only around 20% higher than x2 2.0. Optimus will allow you to use your internal LCD screen just as if you had a dedicated GPU inside the laptop, but there will be a significant performance hit due to needing to send the frame buffer back over that small x2 2.0-class link.

    NVIDIA has PCIe compression that kicks in when running x1 links. It would be nice if they expanded this to x2 and x4 links.

    As for the cost, it'd be nice if the price went down to that of the old BPlus adapter, but $250 is far more reasonable than what Sonnet wanted. They should also offer a PSU-less option, to cut costs for people who already have a desktop PSU lying around.
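The performance hit from sending the frame buffer back over the link, as described above, is easy to estimate. A sketch assuming a 1080p panel at 32-bit color and 60 fps (the resolution and refresh rate are illustrative assumptions, not from the comment):

```python
# Back-of-the-envelope check of why shipping the frame buffer back to
# the laptop's display hurts on a narrow link.
# Assumptions: 1920x1080 panel, 32-bit color (4 bytes/pixel), 60 fps.
width, height, bytes_per_pixel, fps = 1920, 1080, 4, 60

framebuffer_gbps = width * height * bytes_per_pixel * fps / 1e9
x2_gen2_gbps = 2 * 0.5   # two PCIe 2.0 lanes at ~500 MB/s usable each

print(round(framebuffer_gbps, 2))                 # ~0.5 GB/s of display traffic
print(round(framebuffer_gbps / x2_gen2_gbps, 2))  # ~half the x2 2.0 link consumed
```

Roughly half of an x2 2.0-class link's usable bandwidth goes to display traffic alone before any geometry or textures cross the cable, which is consistent with the significant performance hit described above.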
  • jtd871 - Friday, June 07, 2013 - link

    Sorry, Ian (and everybody else at AT). I really hate to link to an external review site, but there's already this bad beast:

    Shouldn't be too hard to substitute a Thunderbolt interface. Yes, it'll still have hot-plug issues.
  • Khenglish - Friday, June 07, 2013 - link

    When I used an external card through the ExpressCard slot, the "hotplug issues" that have been brought up here were a minor concern, especially when the external card is NVIDIA. The card is available in the "safely remove hardware" icon and you can eject it at any time. If you are actively rendering something off that card there will obviously be a problem, but after disabling/ejecting the card everything runs off the IGP as if the card was never there. The only issue is that games will reset settings on you when they see that they are being run by a different GPU.

    The big problems that people have are not enough PCI config space (mostly a problem on Macs and some Dells), which can be solved for all systems by enabling 36-bit allocation through a DSDT override. The second problem is some laptops being quirky when detecting an extra GPU at POST (refusing to POST, disabling the IGP, etc.), and that is solved by just hot-plugging the external card (the ExpressCard, not the PCIe slot).

    For those of you wondering about the performance on Thunderbolt, look at the x2 2.0 results in what I linked above. There are giant performance tables if you scroll down. x2.2 means an x2 PCIe 2.0 link, while x1.2opt means an x1 2.0 link with NVIDIA's PCIe compression running.
