The first time I laid eyes on this card I was visiting AMD's headquarters in Sunnyvale before the Radeon HD 5800 series launch event.  I could take photos of the 6 displays it was driving, but not the card itself.  So we'll start off with a picture of the things that set the Radeon HD 5870 Eyefinity 6 Edition card apart from its 3-display counterpart.

The most obvious changes are the display outputs.  While your standard 5870 has two DL-DVI outputs, one DisplayPort and one HDMI output, the Eyefinity 6 Edition has six mini DisplayPort connectors.

You can further convert two of those DP outputs into any combination of DVI, HDMI (only one can be HDMI) and VGA.  The remaining four connectors must remain DisplayPort due to the limited number of timing sources on the 5870.  The card will come with two mini DP to DP adapters, two passive mini DP to SL-DVI dongles and one passive mini DP to HDMI dongle.
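
If the combination rules sound convoluted, a minimal sketch makes them concrete. The validate_outputs helper below is hypothetical, our own illustration of the constraints just described, not an AMD or driver API:

```python
# Hypothetical helper illustrating the Eyefinity 6 output rules above;
# not an AMD API, just a sketch of the constraints described in the text.
VALID_TYPES = {"DP", "DVI", "HDMI", "VGA"}

def validate_outputs(outputs):
    """Return True if a list of up to six output types satisfies the
    card's adapter rules: at most two outputs converted away from
    DisplayPort, and at most one of those conversions to HDMI."""
    if len(outputs) > 6 or not all(o in VALID_TYPES for o in outputs):
        return False
    converted = [o for o in outputs if o != "DP"]
    return len(converted) <= 2 and converted.count("HDMI") <= 1

print(validate_outputs(["DP"] * 4 + ["DVI", "HDMI"]))          # True
print(validate_outputs(["DP"] * 3 + ["DVI", "HDMI", "HDMI"]))  # False: three converted, two HDMI
```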

Clock speeds have not changed.  The GPU still runs at an 850MHz core clock and the memory at 1.2GHz (4.8GHz data rate).  Memory size did change, however: the Eyefinity 6 Edition card ships with 2GB of GDDR5 to accommodate the resolutions this thing will be driving. As 256MB GDDR5 chips are still not in mass production (and won't be until later this year), AMD is using 16 x 128MB GDDR5 chips in 16-bit mode.
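
The arithmetic works out as you'd expect; here's a quick back-of-the-envelope check based on the specs above (our numbers, not AMD's bill of materials):

```python
# Back-of-the-envelope check of the 2GB memory configuration.
chips = 16
chip_capacity_mb = 128            # 1Gbit GDDR5 devices
chip_width_bits = 16              # each chip running in 16-bit mode

capacity_gb = chips * chip_capacity_mb / 1024     # 2.0 GB total
bus_width = chips * chip_width_bits               # 256-bit aggregate bus

base_clock_ghz = 1.2
data_rate_gtps = base_clock_ghz * 4               # GDDR5 transfers 4 bits/pin/clock
bandwidth_gbs = data_rate_gtps * bus_width / 8    # 153.6 GB/s

print(capacity_gb, bus_width, bandwidth_gbs)      # 2.0 256 153.6
```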

|                       | AMD Radeon HD 5970 | AMD Radeon HD 5870 Eyefinity 6 | AMD Radeon HD 5870 | AMD Radeon HD 5850 |
|-----------------------|--------------------|--------------------------------|--------------------|--------------------|
| Stream Processors     | 2 x 1600           | 1600                           | 1600               | 1440               |
| Texture Units         | 2 x 80             | 80                             | 80                 | 72                 |
| ROPs                  | 2 x 32             | 32                             | 32                 | 32                 |
| Core Clock            | 725MHz             | 850MHz                         | 850MHz             | 725MHz             |
| Memory Clock          | 1GHz (4GHz data rate) GDDR5 | 1.2GHz (4.8GHz data rate) GDDR5 | 1.2GHz (4.8GHz data rate) GDDR5 | 1GHz (4GHz data rate) GDDR5 |
| Memory Bus Width      | 2 x 256-bit        | 256-bit                        | 256-bit            | 256-bit            |
| Frame Buffer          | 2 x 1GB            | 2GB                            | 1GB                | 1GB                |
| Transistor Count      | 2 x 2.15B          | 2.15B                          | 2.15B              | 2.15B              |
| TDP                   | 294W               | 228W                           | 188W               | 151W               |
| Manufacturing Process | TSMC 40nm          | TSMC 40nm                      | TSMC 40nm          | TSMC 40nm          |
| Price Point           | $699               | $479                           | $390-420           | $300               |

As a result of the added memory, power consumption has also gone up slightly.  The Radeon HD 5870 Eyefinity 6 Edition requires both a 6-pin and an 8-pin PCIe power connector instead of the two 6-pin connectors of the stock 5870.
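
The connector change makes sense once you run the PCIe power numbers. The delivery limits below come from the PCIe specification; the arithmetic is ours, not AMD's:

```python
# Why the 8-pin connector: PCIe power delivery limits vs. board TDP.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150   # watts, per the PCIe spec

stock_5870 = SLOT_W + 2 * SIX_PIN_W            # 225W ceiling vs. a 188W TDP: fine
eyefinity6 = SLOT_W + SIX_PIN_W + EIGHT_PIN_W  # 300W ceiling vs. a 228W TDP: fine

# With two 6-pin connectors the 228W card would exceed its 225W budget,
# hence the move to a 6-pin + 8-pin combination.
print(stock_5870, eyefinity6)                  # 225 300
```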

The extra memory and the five adapters you get in the box do come at a price.  The Radeon HD 5870 E6 Edition is expected to retail for $479.  That's $100 more than the MSRP of the 5870, but only $59 more than its actual street price.  It remains to be seen where the street price of the 5870 E6 will end up, given that TSMC's 40nm production is still limited and yields, while improved, aren't yet perfect. These cards should be available immediately.

Update 4/1/2010: Launch prices appear to have missed their target. We're seeing the 5870 E6 sold out at $499, and in stock elsewhere at $549. This puts it at an $80 premium over the reference 1GB 5870.

Comments

  • frenchfrog - Wednesday, March 31, 2010 - link

    It would be so nice:

    -3 monitors for left-center-right views
    -1 monitor for rear view
    -2 monitors for gauges/GPS/map/flight controls
  • vol7ron - Wednesday, March 31, 2010 - link

    I'm not sure why a "wall" was created. Your problem with FOV is that you have too much of a 2D setup, rather than an easier-to-view 3D one.

    Suggestion: 3 stands.

    Center the middle pair on your seat.
    Adjust the right and left pairs so they're at a 15-25 degree slant, as if you were forming a hexadecagon (a 16-sided polygon @ 22.5 degrees).

    vol7ron
  • cubeli - Wednesday, March 31, 2010 - link

    I cannot print your reviews anymore. Any help would be greatly appreciated!
  • WarlordSmoke - Wednesday, March 31, 2010 - link

    I still don't understand the point of this card by itself, as a single card (no CF).

    It's too expensive and too gaming-oriented to be used in the workplace where, as someone else already mentioned, there have been cheaper and more effective solutions for multi-display setups for years.
    It's also too weak to drive the 6 displays it's designed for when gaming. Crysis (I know it's not a great example of an optimized engine, but give me a break here) is a 3-year-old game and isn't playable at under 25fps, and I can't imagine the next generation of games around the corner being more forgiving.

    My point is, why build a card to drive 6 displays when you could have 2 cards that drive 3 displays each and are more effective for gaming? I know this isn't currently possible, but that's my point: it should be; it's the next logical step.

    Instead of having 2 cards in CrossFire, where only one card has display output and the other just tags along as extra horsepower, why not use the cards in parallel: split the scene in two and use two framebuffers (one card driving the upper 3 screens and the other the lower 3 screens), practically making CrossFire redundant (or using it just for synchronizing the rendering)?

    This should be more efficient on so many levels. First, the obvious: half the screens => half the area to render => better performance. Second, if the scene is split in two, each card could load different textures, so less memory should be wasted than in CrossFire mode, where all cards need to load the same textures.
    I'm probably not taking the synchronization issues that could appear between the cards seriously enough, but they should be less obvious between distinct rows of displays, especially if the displays have bezels.

    Anyway, this idea of 2 cards with 3 screens each would have been beneficial both to ATI (more card sales) and to gamers: buy a card and three screens now, and maybe later, if you can afford it, buy another card and another three screens. Not to mention that ATI has several distinct models of cards that support 3 displays, so they could have made 6-display setups possible even on lower budgets.

    To keep a long story short(er), I believe ATI should have worked to make this possible in their drivers and scrapped this niche 6-display card idea from the start.
  • Bigginz - Wednesday, March 31, 2010 - link

    I have an idea for the monitor manufacturers (Samsung). Just bolt a magnifying glass to the front of the monitor that is the same width and height (bezel included). I vaguely remember some products similar to this for the Nintendo Game Boy & DS.

    Dell came out with their Crystal LCD monitor at CES 2008. Just replace the tempered glass with a magnifying glass and your bezel problem is fixed.
    http://hothardware.com/News/Dell_Crystal_LCD_Monit...
  • Calin - Thursday, April 1, 2010 - link

    A magnifying glass for such a large surface would be thick and heavy (and probably prone to cracking), and "thin" variations have image artefacts. I've seen a magnifying "glass" usable as a bookmark, and the image was good, but it definitely had issues.
  • imaheadcase - Wednesday, March 31, 2010 - link

    As much R&D as they invested in this, it seems better to have put it towards making their own monitors that don't have bezels. The extra black line between panels is a major downside to this card.

    ATI monitors + video card setup would be ideal. After all, when you're going to drop $1500 on a monitor + video card setup, what's a little more in price for streamlined monitors?
  • yacoub - Wednesday, March 31, 2010 - link

    "the combined thickness of two bezels was annoying when actually using the system"

    Absolutely!
  • CarrellK - Thursday, April 1, 2010 - link

    There are a fair number of comments to the effect of "Why did ATI/AMD build the Six? They could have spent their money better elsewhere..." To those who made those posts, I respectfully suggest that your thoughts are too near-term, that you look a bit further into the future.

    The answers are:

    (1) To showcase the technology. We wanted to make the point that the world is changing. Three displays wasn't enough to make that point, four was obvious but still not enough. Six was non-obvious and definitely made the point that the world is changing.

    (2) To stimulate thinking about the future of gaming, all applications, how interfaces *will* change, how operating systems *will* change, and how computing itself is about to change and change dramatically. Think Holodeck folks. Seriously.

    (3) We wanted a learning vehicle for ourselves as well as everyone else.

    (4) And probably the biggest reason of all: BECAUSE WE THOUGHT IT WOULD BE FUN. Not just for ourselves, but for those souls who want to play around and experiment at the edges of the possible. You never know what you don't know, and finding that out is a lot of fun.

    Almost every day I tell myself and anyone who'll listen: If you didn't have fun at work today, maybe it is time to do something else. Go have some fun folks.
  • Anand Lal Shimpi - Thursday, April 1, 2010 - link

    Thanks for posting Carrell :) I agree with the having fun part, if that's a motivation then by all means go for it!

    Take care,
    Anand
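
As an editorial aside, vol7ron's three-stand arrangement above is easy to put numbers on. A quick sketch, where the 53cm panel width is an assumed example (roughly a 24" 16:9 panel), not a figure from the comment:

```python
import math

# Sketch of the geometry vol7ron suggests: three stands of two stacked
# monitors, with the side stands yawed like facets of a regular hexadecagon.
sides = 16
facet_deg = 360 / sides                        # 22.5 degrees per facet
yaw = {"left": -facet_deg, "center": 0.0, "right": +facet_deg}

# Viewing distance at which each panel exactly subtends one facet:
panel_width_cm = 53.0                          # assumed: a typical 24" 16:9 panel
radius_cm = panel_width_cm / (2 * math.tan(math.radians(facet_deg / 2)))

print(yaw)                                     # {'left': -22.5, 'center': 0.0, 'right': 22.5}
print(round(radius_cm, 1))                     # ~133.2 cm from the screens
```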
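
Likewise, WarlordSmoke's two-card split is easy to illustrate as a viewport assignment. This is purely hypothetical; no shipping driver exposes anything like it:

```python
# Sketch of the split WarlordSmoke proposes: a 3x2 Eyefinity surface
# divided into two viewports, one per GPU. Hypothetical layout only.
cols, rows = 3, 2
panel_w, panel_h = 1920, 1080
surface_w, surface_h = cols * panel_w, rows * panel_h   # 5760 x 2160

viewports = {                                  # (x, y, width, height)
    "gpu0": (0, 0, surface_w, panel_h),        # top row of three panels
    "gpu1": (0, panel_h, surface_w, panel_h),  # bottom row of three panels
}

pixels_per_gpu = surface_w * panel_h           # 6,220,800 -- half the ~12.4MP total
print(viewports, pixels_per_gpu)
```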
