Once man can harness the capability of rearranging matter (elements, protons, neutrons, etc.), this graphical hologram will be a thing of the past.
All we'll need to do is provide existing matter, and the technology will be able to break it down and reconfigure it into another molecular structure, thus changing trash into food.
In coordination with that, the real holodeck will do the same thing, but instead of making food, it'll make people and inanimate objects. However, a computer program will track what it has made so that it can't "hurt" real humans.
That is the wave of the future, which may very well be possible with metaphysics.
Here's your damn holodeck: a 6-sided, fully enclosed cube, 16.8 MP per wall, in 3D. http://www.vrac.iastate.edu/c6.php I've been in it.
I'm surprised there's no mention of HDR. That's what's required for images to be indistinguishable from reality. Without HDR, all the new game engines, megapixels, and multi-monitors won't get you there.
I think instead of focusing on millions upon millions of pixels that the eye will not be able to perceive, why not focus on widescreen, high-resolution glasses?
To see a single image (it does not even need to be 3D) through glasses, where rotating the head rotates the displayed view too (while the controller, mouse, or keyboard controls the direction the in-game character faces).
Why spend precious resources on:
- first of all, pixels we're not able to perceive with the eye, and
- second of all, rendering high-quality images on the left monitors when our eyes are focused to the right (and we're not able to see them anyway)?
It makes more sense to go for glasses,
or some kind of sensor on the player's head that tells the PC where to focus its graphics.
Images shown in the corner of the eye don't need the highest quality, because we can't perceive images in the corners as well as where we focus our eyes.
Second, it would make more sense to spend research time on finding the ultimate gaming monitor.
The ultimate gaming monitor depends on how far away you are from the monitor.
For instance, a 1920x1080 screen might be perfect as a 22" monitor on a desk 2 feet away, while that same resolution might fit an 80" screen 10 feet away.
There is a need to research this optimal resolution/distance calculation, and then focus on those resolutions.
It makes no sense to put a 16000x9000 screen on a desk 3 feet away from you; it would just waste plenty of unnecessary GPU calculations.
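For what it's worth, that resolution/distance relationship can be roughed out from visual acuity: a 20/20 eye resolves about 1 arcminute, i.e. roughly 60 pixels per degree. A quick back-of-the-envelope sketch (the 60 px/degree figure, flat screens, and the panel dimensions are all simplifying assumptions):

```python
import math

def resolvable_pixels(width_in, height_in, distance_in, px_per_degree=60):
    """Rough horizontal/vertical pixel counts a 20/20 eye can still resolve
    for a flat screen of the given size (inches) at the given distance (inches)."""
    def pixels(extent):
        angle = 2 * math.degrees(math.atan((extent / 2) / distance_in))  # angle the screen subtends
        return round(angle * px_per_degree)
    return pixels(width_in), pixels(height_in)

# A 22" 16:9 panel is roughly 19.2" x 10.8"; 2 feet = 24":
print(resolvable_pixels(19.2, 10.8, 24))    # ~(2620, 1520) -> 1920x1080 is in the right ballpark
# An 80" 16:9 screen is roughly 69.7" x 39.2"; 10 feet = 120":
print(resolvable_pixels(69.7, 39.2, 120))   # ~(1940, 1110) -> again close to 1080p
```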
In practice, the "Single Large Surface" approach will not be as good as the O/S being aware of multiple monitors.
Why? Because the monitors do not present a single contiguous display surface - there are gaps. So if the O/S isn't aware, in certain monitor setups it's going to plonk a dialog box across a boundary or more, making it hard to read. And when you maximize a window it gets ridiculous. I don't think it helps to have your taskbar stretched across 3 screens either.
Nvidia actually does let you treat two displays as one (span mode), but they also allow you to expose them to Windows as separate displays (dual view). I prefer dual view to span, and don't have many problems with it.
I can quickly move windows to different monitors with hotkeys (or just plain mouse dragging). The problem I have is moving "full screen" stuff to different monitors usually doesn't work.
"Single Large Surface"/span mode is actually going to be suboptimal for most users. Just a kludge to support OSes that can't handle 6 displays in various configs.
FWIW, I suspect SLS will incorporate some sort of "hard frame" adjustment...if only because freeware "multimonitor" solutions like SoftTH already do so even now.
But, at the same time, I'll also say I wouldn't be surprised to see this driver team miss out completely, re: a version 1.0 release. Because it's happened before...
"um projectors? buy 6 cheap led projectors, 800x480 resolution each, easily make a surround setup with it too.
best part is no bezels. thats the ticket."
What in the world is that about? I think everyone here is looking at a minimum resolution of 1920x1080 or 1920x1200 for a 24-inch panel. To get that in a projector would cost about $2400 right now per projector.
For me to do that with this setup, I would need 6 of them.
Now if you are talking about 30-inch panels, then the resolution per panel is 2560x1600.
Try finding a projector that normal people can afford with that pixel count, and then multiply the price by 6.
Once Dell comes out with a 30-inch OLED panel without a bezel, I am in for 6 of them.
HD TV is already obsolete and it has been for a long time.
I don't really see the appeal of a wraparound display in the concave styles. No matter what you do it isn't going to be 3D and it's not going to fool anyone.
However, imagine if you took 2 monitors and angled them in the opposite manner, so that they were at a slight angle away from you. Convex in other words. Then you can generate a convincing 3D image. In theory, if you had a semi-spherical display, it could generate a very convincing 3D image. It would take some powerful software, but it would look cool as hell.
Don't know if it is mentioned above, but you can throw 3 (well, 6 now) LCD projectors onto angled screens and do away with the bezel lines altogether. And no, it does not work the same as projecting one big screen. The screens on the side definitely add to the experience.
(Take that "LCD" out of my comment above; massive brain failure.)
Right. Probably better to use DLP projectors, particularly since they are more flexible with regard to 3D technologies.
If you've ever used a multi-monitor setup, you know that the monitor edges don't mean jack squat. For a triple setup, you don't actually look at the outer monitors; they're for peripheral vision. A lot of people miss the point entirely on that. Now for a 6x setup, yes, this would be annoying, but for 3x or 5x setups, you don't even notice it. After racing on a 3x setup, racing on anything else is completely pointless. It's something that truly needs to be experienced to understand.
I agree, if for no other reason than that a 3x2 setup would have two of the six monitors' top/bottom edges close to the center of the large image. I'm quite sure it would be unworkable for me, in the long run.
That doesn't mean the vertical monitor edges in a 3-monitor rig are as intrusive, however, and I tend to believe they wouldn't be.
They won't be, if your brain can get past them, any more than the low-resolution, highly pixelated image of a large, close projection would be.
But the big question is "IF." Yes, I can look past the low res image out onto the imaginary road beyond...but others might find it more difficult.
And maybe you can easily ignore vertical edge on a flatscreen, while I find it more difficult to do so....
Projection solutions should make both of those issues easier to solve.
I agree too that the two peripheral monitors (or projections) in a three-monitor/projector setup should NOT require either high resolution or even focus--ideally they are seen out of the corners of the two eyes, and are important only for determining the overall sense of speed when driving--for viewing "objects passing by at high speeds."
The person who took the pictures for the article should have been standing further back. Displays this large actually don't look good if you're too close to them. Something like the 2x3 setup would require you to be a good 10-15 feet away from it to make the bezels less noticeable and really show off the massive resolution.
Home theater guys will tell you when you're buying a big screen TV that you need XX feet from the TV, minimum, for it to look good. Same goes for this setup. The best picture was the last one on the 2nd page, of Dirt 2 I believe. A little distance back really shows off how crisp and huge this display setup is. Seeing several balding white guys sitting 2 feet from this massive display doesn't really give you a sense of how it will look in person.
I can appreciate the desire NOT to see individual pixels in any given image, and, yes, I certainly share that ultimate goal...
...but at the same time, I think there IS something to be said for seeing a very large image up close--at a certain point, the "immersion" factor kicks in, and it really can help offset the downside of seeing a large image under low-resolution.
I drive racing simulations using an XGA-resolution projector, and I have it set up to throw a LARGE image not far from my eyes--a 60" horizontal image (4:3) only 36" or so from my eyes.
It certainly IS pixelated, yes. The Windows hourglass is just silly.
But when used to experience a "3D space" I find myself looking through the screen...and therefore past the pixels...into that 3D world.
I will agree in the next breath that seeing 2D chat can be a problem, however. :D So I often turn it off.
Yeah, it looks better from a distance than from up close. But most people won't use Eyefinity; the really great part of the card is its power. I mean, being able to run Dirt 2 at a playable frame rate at such a high resolution is quite impressive.
Can't wait till some PC magazines get their hands on one of those and show us the real difference in power compared with current cards.
7680x3200 is nice, but the DPI is practically the same. And if you sit right in front of them like the guy in the WoW picture, it's even confusing. To have a much higher effective DPI you would need to watch the display from a much greater distance (and then you couldn't read the chat). Of course, that doesn't apply to the panoramic view in simulators or racing games, where the 3-display setup can be useful (arcades, maybe), or in FPS games for extra space for, say, a map or the views of other squad members, like in the old Hired Guns on Amiga and PC.
The big attraction for me here is the possibility to run these outputs through projectors, rather than flatscreen monitors.
I DO spend most of my PC gaming time driving racing simulators (primarily "rFactor"), and do use a projector to throw up a LARGE, CLOSE image. Pixelation is an issue, but, IMO, the rest of the tradeoffs make it worthwhile to go this route.
What intrigues me about this new card/system are two things: (1) The possibility of running this card's output through two- or three-projector rigs, in which one or two "widescreen" projections (covering most or all of a full-surround 180-degree "dome/planetarium" space) are overlaid in the center with a smaller and more highly detailed/higher-resolution third projection. If such a rig could be melded with realtime headtracking/eyetracking inside a projection CAVE *or better yet, a dome*, it seems to me we might finally realize the holy grail: a full-surround simulation space, at fairly nominal cost.
(2) The possibility of enabling at least that smaller, central region for 3D (stereo) imaging. Obviously, since this is an AMD card, any stereo output would necessarily depend on alternative solutions to Nvidia 3D...but there is at least one of those solutions that might work: TI's "DLP-link," which apparently can be used to enable some new projectors (ViewSonic) with the new Crystal Eyes-5 shutter glasses to allow 3D output (all without using Nvidia's cards and 3D-specific drivers)....
...so let's amend that ^ to read: "A surround, 3D simulation space, at fairly nominal cost." Could it be we are finally getting close?
why is everyone whining about the monitors??? the interesting part is the new GPU. when will you guys get your hands on these new boards to dissect them for us?
win7 is about to launch at retail in less than a month, so it is about time to see those new DX11 boards hit the market!
Sigh. And here I thought Anandtech readers were a brighter group of people. A 6 monitor setup pumped out of one video card is incredible, no doubt about it. But to the average consumer it's not even close to practical. Everyone is talking about the six display setup capabilities, issues with bezel and LEDs as though they are considering taking advantage of this. Guys, read between the lines: the real story is a GPU that can play DX11 titles so well that even 6 monitors at 4 times the typical person's resolution aren't even enough to bring it to its knees.
No, I'm pretty sure the optimization is resolution scaling (largely memory bound) and not necessarily raw throughput (GPU bound). Unless they have more surprises.
That's why they would show a demo of WoW at ultra-high resolutions. Using FSAA or pixel shaders would be much more stressful.
(copy-paste from NeoGAF) My $85 4770 laughs at this news. Hell, my 9800GT and even the 8600GTS sit here unused and laugh at this fuking news. BASICALLY...
FUCK YOU ATI. FUCK YOU NVIDIA. FUCK YOU AMD. AND FUCK YOUR ASS INTEL. OH AND MICROSOFT. FUCK YOU TOO, YOU STUPID FUCKS. LET ME ASK YOU THIS...
Where the hell is my KZ2 caliber game, exclusive for the PC? MY GT5 CALIBER GAME? AH?
Crysis? Fuck that. YES, It was funny game for like a few days, but basically that's just throwing shit at my PC, so you FUCKS can justify selling your ridiculous hardware. That doesn't strike me as a good, intelligent, and honest effort. That's not efficient. That doesn't wow me. KZ2 does. GT5 does ( For the record, im no ps3 fan) And those games are running on a super piece of shit notebook gpu from 2005!!
So enough of this bullshit. ENOUGH! YOU WANT ME TO BUY YOUR STUPID HARDWARE? WOW ME. USE WHAT I HAVE FOR A FUKING CHANGE. PUT SOME FUCKING EFFORT ON IT. HIGHER ANTI ALIASING AND HIGHER RESOLUTION IS NOT GOING TO CUT IT ANYMORE. IM NOT ASKING FOR MUCH. 720P AND 30FPS IS GOOD ENOUGH FOR ME.
JUST TAKE WHATS LEFT AND SQUEEZE REALLY HARD. YOU KNOW? LIKE YOU FUCKS DO WITH THE CONSOLES. UNTIL THEN, FUCK YOU.
sometimes less is more. thats why i love my hdtv: less resolution (720p), more screen (42"), and thats why i hate desktop lcds: too much fuking resolution + tiny screens. ATI, this shit does not appeal to me at all. Give me a gpu that renders at low res and then scales my games (not movies) to 1080p lcd resolution so i can play crysis on a cheap desktop lcd. This Eyefinity gimmickry? > stupid.
In reading through the first page of comments, bezel issues were mentioned.
I personally wouldn't want a 2x3 panel setup because of those; any even number of panels puts the center of view in the middle of a bezel.
1x3 is great though, as race and flight simmers will attest. With that setup, most games (including shooters and RPGs) will give a dramatic peripheral view.
Unfortunately for Matrox, this more or less kills their $300 "TripleHead2Go" product.
Something I cannot do today is to have two displays of a cloned desktop, one being a different resolution than the other.
Why would I want to do that? Sometimes I would like to display a game on the television. It accepts VGA input (yes, yes, it's old tech), but I have to change the monitor to the same resolution as the TV in order to do that. You would think it would be so simple to display the same desktop on two monitors, but you can't do it if the resolutions aren't the same.
Obviously this card (and a hundred others) has the power to do that simple setup. I wonder if it lets you.
I had said this after the last launch of GPUs, but I think AMD/NVIDIA are on a very slippery slope right now. With the vast majority of people (gamers included) using 19-22" monitors, there are really no games that will make last gen's cards sweat at those resolutions. Most people will start to transition to 24" displays, but I do not see a significant number of people going to 30" or above in the next couple of years. This means that for the majority of people (let's face it, CAD/3D modeling/etc. is a minority) there is NO GOOD REASON to actually upgrade.
We're no longer "forced" to purchase the next great thing to play the newest game well. Think back to F.E.A.R., Oblivion, Crysis (crap coding, but still); none of those games, when they debuted, could be played even close to max settings on >19" monitors.
I haven't seen anything yet coming out this year that will tax my 4870 at my gaming resolution (currently a 19" LCD, looking forward to a 24" upgrade in the next year). That is 2 generations back (depending on what you consider the 4890) from the 5870, and the MAINSTREAM card at that.
We are definitely in the age where the GPU, while still the limiting factor for gaming and modeling, has surpassed what is required by the majority of people.
Don't get me wrong, I love new tech and this card looks potentially incredible, but other than new computer sales and the bleeding-edge crowd, who really needs these in the next 12-24 months?
we are quite close to the roof already, if the information about 2GB of RAM on the card is correct... and that's with the Z-buffer, stencil buffer, textures, everything dropped...
no, I'll write it in bits as well, so people are not puzzled about where the multiplication by 4 bytes comes from: 7680*3200*4 = 98,304,000 bytes.
And in bits: 7680*3200*32 = 786,432,000 bits.
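Continuing that arithmetic, here is a rough sketch of what a double-buffered 7680x3200 swap chain plus a depth/stencil buffer would occupy, assuming plain uncompressed 32-bit buffers (a real driver's allocations will differ):

```python
WIDTH, HEIGHT = 7680, 3200
BYTES_PER_PIXEL = 4                       # 32-bit RGBA; a D24S8 depth/stencil buffer is also 4 bytes/pixel

pixels = WIDTH * HEIGHT
color_buffer  = pixels * BYTES_PER_PIXEL  # one color buffer, matches the 98,304,000 bytes above
depth_stencil = pixels * BYTES_PER_PIXEL  # one depth/stencil buffer

total = 2 * color_buffer + depth_stencil  # double-buffered color (front + back) plus depth/stencil

MB = 1024 * 1024
print(f"one color buffer:     {color_buffer / MB:6.1f} MB")      # ~93.8 MB
print(f"front + back + depth: {total / MB:6.1f} MB")             # ~281.3 MB
print(f"share of a 2 GB card: {total / (2 * 1024 * MB):.1%}")    # ~13.7%
```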
They use the Trillian (Six?) card. AFAIK no specs have been leaked about this card. The 5870 will come in 1 and 2 gig flavours; maybe the "Six" will come with 4 as an option?
why is so much attention given to its support for 6 monitors? that's cool and all, but who on earth is gonna use that feature? seriously, let's write stuff for your target middle-class audience, techies that generally don't have $1600 nor the space to spend on 6 displays.
A 30" screen is more expensive than 2 24" screens. So when Samsung (and the others WILL follow suit) comes out with thin-bezel screens, high resolutions will become affordable.
So seriously, this is written for the "middle class" audience; you just have to understand the ramifications of this technology.
And as far as I know, OLED screens can be made without bezels entirely. I guess the screen manufacturers are going to push that tech faster now, since it actually can make a difference.
I remember when 17-inch LCDs looked horrible and cost almost $2000. This is a bargain by comparison, assuming you have the app to take advantage of it.
I could do this with two monitors long ago. This has more monitors and maybe fewer bugs ("just works"), but it's still a reimplementation of an old thing. A real multi-monitor setup with independent displays, where a maximized window does NOT span across all displays, is much more usable.
What old card gives you the option of running 6 screens from one card?
And that should be a consumer product, not a professional one. And if you actually read the article, you'll see that you CAN set up each monitor independently. Or in 5 groups. Or in 4. Or in 3. Or 2. Also as one big screen.
Watch this video closely. There are 24 1080p monitors being rendered by 5800-class Quadfire. Notice how the screens lag when the camera pans? Chances are that maybe you wouldn't notice when up close, but it certainly is distracting...
Well, OLED can do without the bezel, and I believe LED-backlit LCDs can manage it as well.
The good thing about this is, if it really does take off, we can finally GET RID OF TN panels, because having a poor vertical viewing angle would make the experience yuck.
It seems to me three or five monitors in a tilted single row configuration would do best at minimizing bezel annoyance while also giving a nice increase to vertical real estate.
I had to create an account to comment on this. I am running 2 ATI 4870's with 3 Dell 2408WFP's and a 42 inch Sony XBR on HDMI
6 Dell 3008WFP's would be sweet and at 80FPS.
My only question... WoW? An ATI 1x series card from 15 years ago can run WoW at 80FPS at full res...
Why not give us some info using a game that can take advantage of a card like that.
If you are going to pick wow, at least look at Guild Wars where the graphics can actually scale to the resolution and test the card out... need I say... Does it play Crysis at that res at 80FPS? lol
Agreed - if they wanted to demo an MMORPG they should've used EVE Online, at least it has a recent graphics engine that doesn't look like ass. WoW's minimum system requirements are hardware T&L for crying out loud... that's the original GeForce!
good to see the hardware manufacturers bastardizing moore's law. not only does the technology double in power every 18 months (or whatever), it also suspiciously doubles in price at the same time! well, at least the new tech does, to take up the slack for the price halving of the old tech.
There is a sweetspot for pixel density (~100 dpi) and minimal viewing distance (~20 inches). More Pixels and bigger screens just means bigger minimal distance from the screen.
Unless there are concerns with myopia, big but distant screens don't make much sense for reading or playing. (People beyond 35 tend to pull up and "hug" their monitors anyway.)
Bigger resolutions are interesting for advertisers who don't want their walls of flickering commercials appear blurry when approached.
Also 10 and more megapixels would make it possible to show photographs in native resolution.
There is no point in rendering a plain on a surrounding monitor setup, but rendering a long horizon with 3 monitors could be quite immersive indeed.
A setup of six 16:10 monitors has practically a CinemaScope aspect ratio of 2.4:1. Six $200 monitors could create a brighter alternative to the projector home cinema experience.
"A setup of six 16:10 monitors has practically a CinemaScope aspect ratio of 2.4:1. Six $200 monitors could create a brighter alternative to the projector home cinema experience."
No thanks, I'd take a single 1080p projector over 6 monitors. What's the point of all that res if you have distracting black bars running through it? LCD manufacturers could make super-large, super-high-res displays right now... if there was demand.
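That 2.4:1 figure checks out with a quick calculation, assuming a 3x2 grid of 16:10 panels and ignoring bezels:

```python
cols, rows = 3, 2                    # six panels in a 3x2 grid
panel_w, panel_h = 16, 10            # 16:10 aspect units

grid_aspect = (cols * panel_w) / (rows * panel_h)
print(grid_aspect)                   # 2.4 -> essentially CinemaScope (2.39:1)

# At 1920x1200 per panel the whole wall is:
print(cols * 1920, "x", rows * 1200) # 5760 x 2400, about 13.8 megapixels
```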
How about flight sim and motor racing games? This GPU coupled with a curved/wrap-around 100" LCD. How about a 4 GPU "personal supercomputer" setup like the Tesla? =)
There is one inherent problem that needs to be addressed with the curved wraparound: it has to have several FOVs, one for each monitor. We ran into this using the Evans & Sutherland laser projector (5Kx4K and 8Kx4K - yes, those were real resolutions, using 16 PCs). Generally a video game will use a single FOV, whereas in reality you would need one for each monitor. The other way is to get distortion correction working for the whole picture, but that involves rendering to an area bigger than the displayed resolution and also doing it on the fly, which will induce a frame of lag in rendering.
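A minimal sketch of the "one frustum per monitor" idea described above: each panel in an arc gets the same projection but a view yawed by that panel's angular offset, rather than one wide frustum stretched across all of them (the three-panel, 45-degree layout here is just an illustrative assumption):

```python
import math

def surround_views(num_panels, panel_fov_deg):
    """Yaw offset and horizontal FOV for each panel in an arc, so every
    panel gets its own correctly aimed frustum instead of sharing one."""
    centre = (num_panels - 1) / 2
    return [{"yaw_deg": (i - centre) * panel_fov_deg, "hfov_deg": panel_fov_deg}
            for i in range(num_panels)]

# Three panels, each covering 45 degrees of the player's surroundings:
for view in surround_views(3, 45.0):
    print(view)   # yaw -45 / 0 / +45 degrees, 45-degree FOV each

# For contrast, a single planar projection spanning the same 135 degrees
# stretches badly toward its edges: pixels-per-degree at the edge is
# 1/cos^2(67.5 deg), about 6.8x what it is at the centre.
half = math.radians(135 / 2)
print("edge/centre stretch for one 135-degree frustum:", round(1 / math.cos(half) ** 2, 1))
```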
I would just like to see an FPS that shows you what you would see with peripheral vision. Then I could see the point in having a three monitor setup for games. In real life, I can tell when someone is right beside me, but in an FPS, I can't unless I turn that way. Just having a wider view of what is in front of me doesn't do anything for me.
I believe Matrox's Triple Head gaming was supposed to do this, and not just offer a wider resolution. I don't think it had much support, but I do think that Quake III was one title that did support it.
The setup reminds me of the Philips Videowall technology of the 1980s. It consisted of a set of rear-projection monitors that could be individually addressed or spanned as one, with a barely visible seam between them that quickly became unnoticeable if you sat and watched a show.
We set one up in a retail location, and since it coincided with the VHS boom at the time, we showcased new titles on it, which led to really high sales. Even Montserrat Caballé came to the shop to give a short recital "broadcast" on site. (And, yes, once one disgruntled employee managed to put a truly nasty sex tape on at peak shopping hours.) I tell you, we wowed the rubes. But it was mega-$$$ at the time.
The Samsung concept display shown in the article looks attractive, and a first step toward getting it right. One can see IMax+3D home theaters in the offing in a few years.
Blurry textures, blocky models, leaves and vegetation made out of 2D sprites, and overly shiny, plastic, artificial-looking trees and rocks will still look as crappy at 50 megapixels as they do at 1.8 MP.
Especially when combined with cramped levels, repetitive gameplay, dumb AI and a completely non-interactive environment (being able to blow everything to pieces doesn't count).
Well, not quite, but a lot more advanced than what ATI is talking about. It's called a CAVE. You sit in a large cube, every wall of which has 2 projectors back-projecting onto it. Put on the 3D glasses and it's like the holodeck, in that everything is in 3D and surrounds you.
It's used for things like styling cars - designers can make the inside of a new car in CAD, then sit inside it to see if it really works.
Costs a fortune however and wouldn't fit in the average house :)
It's cool and all, but it makes it like you're playing all your games through a football (gridiron) helmet due to the edges of the monitors intersecting the display.
Also, unless you use one of the far right-side monitors as your primary desktop, you'll have that annoying run-off problem with the mouse cursor where it slides over to the display to the right when you just want to click the [X] for the window in your current monitor, just due to our ingrained nature of over-sliding the mouse to the upper-right corner when we want to close a window. :)
Using a left-side monitor as my primary, I can solve this with UltraMon by making my second display not only to the right but LOWER than my primary display, so that the cursor still stops at the upper right edge of my primary display.
I wonder how this system will handle it. Perhaps it will be smart enough or have a setting that allows you to keep the cursor from sliding beyond the right edge of the current screen when a program is maximized on that screen, so that you can hit the [X] easily like you can on a single display setup (or a right-side display in a multi-display setup).
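A tiny sketch of why that offset trick works, assuming Windows-style virtual-desktop coordinates (primary at the origin, y growing downward) and made-up monitor sizes: with the secondary placed lower than the primary, the strip just right of the primary's top edge belongs to no monitor, so the cursor stops there and the [X] stays easy to hit.

```python
from dataclasses import dataclass

@dataclass
class Monitor:
    left: int      # virtual-desktop coordinates, y grows downward
    top: int
    width: int
    height: int
    def contains(self, x, y):
        return (self.left <= x < self.left + self.width and
                self.top <= y < self.top + self.height)

primary   = Monitor(left=0,    top=0,   width=1920, height=1200)
secondary = Monitor(left=1920, top=300, width=1920, height=1200)  # deliberately 300 px lower

def cursor_escapes_right_edge(y):
    """True if a cursor pushed against the primary's right edge at height y
    would slide onto the secondary instead of stopping."""
    return secondary.contains(primary.width, y)

print(cursor_escapes_right_edge(10))    # False: near the [X] button the edge is a hard stop
print(cursor_escapes_right_edge(600))   # True: lower down, the cursor crosses over as usual
```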
As I understand it, you can't just maximize windows into screens. The applications and OS don't know your desktop is spread across several screens, so using maximize will maximize the window to cover all available screens.
Which kinda sucks if you want to have several different windows all fullscreen on their own monitor.
Can you have Windows manage the monitors instead of going through this AMD software trickery? I would imagine you would want them to appear to Windows as 6 separate monitors, but then turn on the AMD single-surface-whizbang when you launched an OpenGL or DirectX app.
I see they're touting Linux support for this. I hope they start taking their Linux drivers more seriously.
This will be huge news for guys using DMX and Chromium (not the Google browser, the other Chromium) to do the giant wall-o-monitors displays.
It says that you can manage windows within the software.
In the article it mentions taking 6 monitors and dividing it so 4 are one "screen" and the other two form a "second" screen. I presume that means within each grouping applications would maximise as they would if you had 2 physical monitors and were letting Windows control them.
It's like a grid system (which already exists within at least the NV driver options) but spread across multiple monitors in groups, I would assume.
Windows will see whatever ATI get their drivers to show it, so if ATI allow you to manipulate monitors into groupings, that's what Windows will see.
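As a rough illustration of that grouping idea (the layout and naming here are hypothetical, not AMD's actual driver interface): six 1920x1200 panels could be exposed to the OS as, say, a 2x2 group plus a 1x2 group, each advertising one combined mode.

```python
# Each group of physical panels is presented to the OS as one logical display
# whose mode is just the grid dimensions times the per-panel resolution.
PANEL_W, PANEL_H = 1920, 1200

def group_mode(cols, rows):
    return cols * PANEL_W, rows * PANEL_H

groups = {
    "group_A (2x2 grid)":  group_mode(2, 2),   # four panels -> one 3840x2400 surface
    "group_B (1x2 stack)": group_mode(1, 2),   # two panels  -> one 1920x2400 surface
}

for name, (w, h) in groups.items():
    print(f"{name}: {w}x{h}")
# A window maximized inside group_A fills 3840x2400 and never spills into group_B.
```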
This sort of setup isn't ideal for all games, and I doubt anyone would argue it is, but it is great for some titles.
In RTS games the borders don't matter.
In racing games the borders don't really matter either, and the added visual field is very advantageous if you are using an in-car view. Going from a 20" flanked by two 17" monitors to a single 24", I notice the loss of peripheral vision, and the monitor breaks weren't far off the roof-pillar positions anyway.
In flight sims, as has been said in the article, not a problem.
In FPS games maybe it will be more of a nuisance, but not every game is an FPS.
I would think it would be a problem in ANY game. It even looks like it from the screenshots in the article.
Look, the best way to implement a multi-monitor setup is what has been done in games already. Supreme Commander is a good example: you make a monitor setup with each monitor acting independently of the others.
Open the map on one monitor, the game on another, stats on one, etc.
Having a large screen with bezels like that does not impress or work to the user's advantage in a game. Having a multi-monitor setup with the game outputting the scenes you want to each monitor would be far more impressive. There are so many more ways a game could take advantage of that.
A game that implements those features well into its gameplay would drive the sales of these setups in the gaming market. But until then, it's never going to take off.
I agree with lonyo. The bezels aren't really that intrusive in some games. A 3x2 setup in an FPS would be a PITA, as that would mean my crosshair is cut in 2, but a 3x1 setup is just dandy.
Define 'maxed out'. Was multisampling maxed and triple buffering turned on? I can't imagine a single card could drive that resolution at 80fps. If so, wow, nvidia is in serious trouble and AMD is going to be getting -a lot- of customers..
Not necessarily. Simply put, triple buffering still allows the GPU to push frames as fast as it is able, but it throws out frames that aren't synched to the display refresh. So while the GPU may be rendering at 80fps internally, you only see 60fps (assuming a 60Hz display).
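A toy simulation of that point, under the simplifying assumptions of a 60Hz display and a GPU that finishes a frame every 12.5ms (80fps): with triple buffering the GPU keeps rendering flat out, but the display only picks up the newest completed frame at each refresh, so some rendered frames are never shown.

```python
REFRESH_HZ = 60       # display refresh rate
GPU_FPS    = 80       # rate at which the GPU finishes frames

render_interval  = 1.0 / GPU_FPS
refresh_interval = 1.0 / REFRESH_HZ

# times (over one second) at which frames finish rendering / the display refreshes
frames_done = [i * render_interval for i in range(1, GPU_FPS + 1)]
refreshes   = [i * refresh_interval for i in range(1, REFRESH_HZ + 1)]

shown, last_shown = 0, -1
for t in refreshes:
    # at each refresh the display scans out the newest completed frame
    newest = max((i for i, done in enumerate(frames_done) if done <= t), default=-1)
    if newest != last_shown:
        shown += 1
        last_shown = newest

print(f"rendered: {len(frames_done)} fps, displayed: {shown} fps, "
      f"never shown: {len(frames_done) - shown}")   # rendered: 80, displayed: 60, never shown: 20
```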
did you notice that mr shimpi is only impressed by anything intel says?
like the lynnfield review, come on, ''harder, stronger, longer''... give me a break.
Can someone do my sanity a favor and ban this idiot? Do you think he actually has a job? Hell, does anyone even think he has a frakking GED? I love how he thinks every review site is engaged in a mass conspiracy against AMD.
Yeah that's why the Radeon HD4870 review was "HD4870 - The Card To Get".
Almost every preview article (this is a preview article) doesn't have any kind of flashy subtitle/comment, just a quick summary of what it's about.
When the reviews of the ATI DX11 cards come out, I am sure they will have some sort of zany subtitle/comment about how amazing they are (since compared to current gen they are sure to be amazing, and I doubt by that time based on rumour we will have anything from nvidia except pre-release benchmarks if they feel desperate).
I didn't see this in the article. How does a graphics card with two outputs drive 6 displays? What hardware is needed to do this? Is there some kind of splitter or DP box they all plug into? I have two dell 3007wfp-hc's already and this is making me want to buy a 3rd, but I don't know if I need anything else to drive it.
I think they use the "Trillian" AIB, possibly also known as the "Six". It features six mini-DisplayPort outputs. Other articles on the net show ATi demonstrating a 24-monitor setup using 4(!!) Six cards.
I don't know which GPU the "Six" card uses; rumours mention both Cypress (aka R870) and Hemlock (aka R800). From the coolers shown, I think it uses "just" Cypress (single chip, not a duallie).
I also believe that the long so-called 5870 card shown in photos around the net is Hemlock (5870x2), not the 5870.
And for those of you concerned about your power bill, rumours state that the 5870 uses 28W(!!!!!!) in idle and 2D.
This ATi generation rocks; I only hope nVidia will get their card out and survive. Any way you look at it, their future is bleak: their chipset business is coming to an end except for AMD CPUs, and they're late with the GT300.
from page 1:
"The OEMs asked AMD for six possible outputs for DisplayPort from their notebook GPUs: up to two internally for notebook panels, up to two externally for conncetors on the side of the notebook and up to two for use via a docking station. In order to fulfill these needs AMD had to build in 6 lanes of DisplayPort outputs into its GPUs, driven by a single display engine. A single display engine could drive any two outputs, similar to how graphics cards work today."
I can't make much sense out of this however. Current cards have two independent display controllers. These two display controllers afaik don't really share anything, so referring to them as one display engine doesn't really make a lot of sense. So any rv870 really has 6 display controllers (though probably not that many internal tmds transmitters, so if you'd want to drive more than 2 dvi/hdmi displays I'd guess you're out of luck or you'd need a card with external tmds transmitters)?
Am I the only one here who thinks the real story is not the multi-monitor support, but rather the ridiculous GPU power driving them all? I realize they haven't fully disclosed the specs publicly, but 7000x3000 resolution at over 60fps? The article barely seems impressed by these numbers. Was this the performance expected from single-GPU setups for this generation? I didn't see this coming at all and I'm completely floored!
Also, I would just like to add that I have always preferred being able to segregate displays so that I can easily maximize multiple applications within their own screens. Having everything as "one giant display" has been possible for years and is less than desirable for everything BUT gaming... IMO.
Yes it's nuts and you can thank ATi for finally upping the texture units and ROPs. I think they've been the same in number, although they've gotten faster, since the x1k series!
Probably were. But look, it's a lvl 1 toon wandering the world. I wanna see the performance in a 25 man raid or simply wandering around Dalaran. Oh and the shadows better be maxed too!
LCDs require control electronics around all 4 sides, making the bezel a necessity, but it could easily be 1/4 the width of current monitors. I messed around with stitching the images from 3 rear-mounted projectors together. The image was seamless, but the price would be astronomical. That, and you have to have a VERY good screen to project onto, or all your wonderful resolution gets muddied.
Or the Nec X461UN, which looks very similar (btw you don't need the 460UTn, the 460UT would do as there's no use for the built-in PC in this scenario)
Those are really expensive (>5000 USD), are huge and low-res (1366x768). That's really for big video walls, not suitable for some monster gaming setup. But really, it shouldn't be much of a problem manufacturing 24 inch or so tfts with similar slim bezels. There just hasn't been a market for this up to now...
You make a good point, but the other side of the coin is also true - Intel processors are very strong, and AMD processors suck by comparison.
It's a pity ATI stopped making chipsets for Intel motherboards. They'd make money, Intel would still sell processors, and the only real loser would be NVIDIA. It's surprising how many chipsets NVIDIA sells. Like most people, I don't know many people who would buy NVIDIA chipsets, but it seems they sell them well through HP and Dell, where no one asks or knows the difference. ATI should really make chipsets for the Atom too. That would be a great combination.
I'm not sure why you think AMD's CPUs stink. All the benchmarks I've seen show they can run any game out there and push more than 30fps in all tested games not limited by the video card, and even push the frames to where the video card ends up the bottleneck. No?
Even compared to the i5, the PII 9650 held its own quite well in a lot of areas.
For the past few years Intel has definitely had the trashiest iGPUs and probably will at least for the foreseeable future. And I wouldn't count on Larrabee to change that all that much by the time it comes out. You can have the strongest CPU in the world, but if you have GPU trash like Intel's you can't game at good resolutions and speeds anyway, so you can't make good use of the fastest CPU in the world.
I think people sometimes instinctively balance things out, and forget they are even doing it.
Keep in mind that Phenom II processors are the same size as the Nehalems, and you're forced to compare the low end of a brain-damaged line with the highest-end AMD CPU, and still generally come out the loser. That's not a good CPU design, if you think about it.
I don't agree with your remarks about great processors not mattering if you don't have the GPU to go with it. Stuff like that just makes you sound irrational and too pro-AMD. Not everyone needs powerful 3D graphics, but more to the point, you can get an ATI card for your Intel motherboard. So, sorry, they don't need to make a good video card for you to benefit from it.
The Apple store on Michigan in Chicago is using something very similar to what's being shown here. If I recall correctly, they had 8 panels arranged 2 across by 4 down. They were running an animation showing all the iPhone apps that were available. I noticed the display from the other side of Michigan and was impressed enough to cross the street to see how they did it.
The device was probably purpose-made by the LCD manufacturer, as the seams were about as wide as a single thin bezel on Samsung monitors.
hmm, running that is pretty nice, especially with the framerate given, but god save us from the electricity bills... screens of this size with good colour reproduction take at least 80 watts each, giving half a kilowatt for the screens alone.
Anyway, I am eager to get a detailed review soon... and hopefully of the nVidia counterpart as well... $1000+ graphics cards are just nice to see in benchmarks, so fingers crossed for fast competition...
They are actually going to be $450 cards for the initial 5870's with 2GB of GDDR5 1300MHz ram on board.
So they are probably reasonable for what they offer.
I've yet to see someone use a monitor setup like that for gaming. It looks terrible with the lines between monitors. Video card manufacturers have been trying that for ages now, Matrox tried and failed at it already. Let it die. AMD.
Mission control stuff, large desktop for apps sure be nice. But to game on? yuk.
Case in point, look at the WoW screenshot... the screen is cut right in half where the character is located. lol
Now if some monitor company decides to make a monitor that lets you remove the bezel around the edges for a smooth setup, then we can talk.
Apparently you never played on a 3 monitor setup. (yeah a 6 monitor setup would suck)
I can tell from personal experience that it is awesome.
I have a Matrox TripleHead2Go setup with three 22" widescreens, so I have no problem with bezels in the middle of my screen.
Yeah, of course the bezels bug me, but just as much as the roof pillars do when I am driving my car.
And yeah, a 6-monitor setup just doesn't work in most games (except games like C&C); it would be the same as putting a band of duct tape at eye height around the car.
If you want to see for yourself, go over to Matrox's gaming site to see some screenshots from, for example, WoW,
or see how it looks on YouTube.
NFS is also awesome on three monitors.
And with the low prices of monitors, anyone can now have a 3x 22" setup.
What actually counts is that the graphics card is so powerful that it can support the high resolution!!! And this is just the first-generation DX11 card. I doubt that we will see DX12 anytime soon, so I guess the third-generation DX11 cards will rock the earth and break the walls!
quote: Case in point, look at the WoW screenshot... the screen is cut right in half where the character is located. lol
I wonder why they didn't use pivot to circumvent that problem. Five displays in a single row instead of two rows, and the center display will not cut the area with the character in half, while at the same time it provides the desired height at the cost of some width. That would work for me...
Perhaps have a look at Barco, also one of the preferred ATI/AMD board vendors, and see what they do with screens; they already have fully glued monitors where you can drive 2 GPUs to 1 monitor.
Please read the entire article before posting. It's clearly stated that AMD already has one manufacturer lined up to make monitors with very thin bezels, and for FPSs, you would use an odd number of displays so your crosshairs are in the center of a monitor. It's all in the article.
How many organs do I have to sell to buy a 60+ inch LED display with a resolution greater than 25 megapixels? Seriously though, yields would be so low on those.
quote: Now if some monitor company decides to make a monitor that lets you remove the bezel around the edges for a smooth setup, then we can talk.
Building such a monitor is not difficult. Making something run on it is.
By giving us the software, they open the door for a company to design such a monitor.
[quote]I've yet to see someone use a monitor setup like that for gaming. It looks terrible with the lines between monitors. Video card manufacturers have been trying that for ages now, Matrox tried and failed at it already. Let it die. AMD.[/quote]
I think that what makes this a brilliant thing isn't that you _can_ hook up 6 monitors to a single graphics card, but that it works as if it's just one monitor. I think THAT is the innovation that ATI's done - all other implementations I've seen basically treat each monitor as a separate entity, and have to handle them within the OS separately. This treats it all as a single 7680x3200 display. THAT is the cool part, I think.
I think that one of the things that Matrox did back in the day was some clever tricks to try and get the OS to recognize the displays correctly. Looking back on it, there was this concept of "primary display", "secondary display" and "tertiary display". Not "this is my display that's huge".
Perhaps it's just a "name swap", but I can see that this sort of thing is what you need for abstracting arbitrarily large displays (made of one, giant, single display or multiple smaller displays).
I could see it working if they just find a different mounting scheme.
It seems the only obstacle is the bezels, surely there's technology for that side of the coin as well to allow the panels to butt up against each other without a bezel.
I have already seen flight-sim stations where it won't be a problem even now, because the "cockpit" window frames already encompass a bezel.
Of course, once we have flexible, "affordable" OLEDs, maybe even wilder setups will be possible.
That is true but you're not seeing the big picture and yes that pun was intentional.
Current monitors strapped together like in the demo are worthless for things like L4D or Counter-Strike. But what about future monitors? I can see vendors building a single giant monitor from 3 or 6 or more panels, making them more or less seamless, and marketing those to the extremely hardcore gamer.
Ok, but then why not just have it be one monitor to begin with?
One whole panel will always be better than pieces strapped together even if they are without bezels. One single video port and cable, even if it is "six-link displayport" or some other freaky name will be better than multiple cables.
For games, yes, it's useless, but have you ever seen the gigapixel projects (and the displays for viewing very high-resolution images) that universities do? This will simplify them brutally: before, they had a PC for every screen or two and rendered with software. Now they can just connect them to a single desktop if 6 displays is all they need, or, when running with software rendering, cut down the number of PCs by a factor of 3. When you need high resolution you can't just use one big display (or projector), because the resolution will be too low for that stuff.
On the other hand, this is just for showing off the multi-monitor capabilities, and people do use more than one monitor in the real world, maybe not as one giant screen (though they do that too, of course), but that's not the point; that is just one of the modes you can use. And for those who use that mode it's now greatly simplified: before, it was hell getting acceleration and stuff working across both screens at the same time, or stretching some apps across the two screens; now that's no problem, as it's just one single surface for Windows and the applications. But as said, you simply can't build a single display with a resolution of, say, 11520 x 6480.
[quote]One single video port and cable, even if it is "six-link displayport" or some other freaky name will be better than multiple cables. [/quote]
Is there a problem I don't see?
It's the same as with RGB cables: you have 3 of them together and only the ends fan out.
Not seeing the problem here
How many monitors have you seen that can do a 10240x3200 display? I think that was part of the point of this demo as well: the throughput of the card. The technology ain't there yet. This is where the flexible panels of the future will shine.
Cost and manufacturing complexity. It is much, much easier and cheaper for them to build smaller displays. So it is much cheaper and easier for them to put 3 small displays in one package than to make one huge panel with the same resolution.
And having 6 panels strapped together makes 6 points of failure. If any one of the panels has bad pixels or another bad component, you will have a mess. Replacing one will probably lead to temperature differences, and right next to each other, that type of issue will show. I agree that it would need panels without bezels to work right for gaming. It would be annoying as a desktop too. Can you imagine trying to use IE or Firefox with lines in the middle of your web pages?
and having one giant panel makes 1 point of failure. if you have a problem, you replace the whole thing. this is MUCH more costly than replacing one panel out of six. indeed, it's probably more costly than replacing all six panels all together.
the technology for such a display has been around for at least a year, if not longer. i recall seeing a three panel display at a tech conference (can't remember which), where a high end computer was driving crysis on all three displays.
I see it as a good thing. Panels have an advantage of "if one goes bad replace it" as opposed to a giant monitor that gets a bad pixel and is irreplaceable due to cost. $150 and I'm good as new. With a $12K monitor, I imagine I'd just keep using it with a few bad pixels in it.
FullCon Solutions, LLC partnered with Duke University & Iowa State to promote the benefits of using 6-sided, fully immersive CAVEs in the AEC & marine industries. Full-scale analysis of 3D models is dramatically improving communication between all stakeholders. Ultimately, this technology allows our clients to design better, clarify expectations sooner, and profit from improved decision making.
Deke2009 - Sunday, September 20, 2009 - link
Will we be able to align displays similar to Matrox's TripleHead2Go?
TheOtherMrSmith - Saturday, September 19, 2009 - link
If you want to see a really interesting monitor design that could really benefit from this new video card tech, check out NEC's new goods:
# 42.8" diagonal
# 2880 x 900
# 0.02 ms response time
# 12-bit color range
http://www.homotron.net/2009/01/macworld_2009_nec_...
This thing is really amazing!
Deke2009 - Sunday, September 20, 2009 - link
Woohoo, 900 vertical res, what a piece of crap.
JonnyDough - Monday, September 14, 2009 - link
Samsung was a great choice. I <3 AMD.
I wonder what the power draw on the entire system, including monitors, was. There's a gray fuse box right next to the monitors... yikes!
ProDigit - Saturday, September 12, 2009 - link
I think instead of focusing on millions of millions of pixels that the eye will not be able to perceive, why not focus on widescreen, high resolution glasses?To see a single image (does not even need to be 3D) through glasses, and when rotating the head, the display rotates too (but the controller,mouse or keyboard controls the direction that the in game character faces).
Why spend precious resources on:
- First of all pixels we're not able to perceive with the eye
- Second of all, when we focus to the right with our eye, why render high quality images on the left monitors (when we're not able to see them anyways)?
Makes more sense to go for glasses.
,or, some kind of sensor on the head of the player,that will tell the pc, where to focus it's graphics to.
Images shown in the corner of the eye, don't need highest quality, because we can't perceive images in the corners as well as where we focus our eyes to.
Second of all;it would make more sense spending time in research on which monitor is the ultimate gaming monitor?.
The ultimate gaming monitor depends on how far you're away from the monitor.
For instance a 1920x1080 resolution screen might be perfect as a 22" monitor on a desk 2 feet away,while that same resolution might fit an 80" screen 10 feet away.
There is need for researching this optimal resolution/distance calculation, and then focus on those resolutions.
It makes no sense to put a 16000 by 9000 resolution screen on a desk 3 feet away from you,and will take plenty of unneccesary GPU calculations.
lyeoh - Saturday, September 12, 2009 - link
In practice, the "Single Large Surface" approach will not be as good as the O/S being aware of multiple monitors.Why? Because the monitors do not present a single contiguous display surface - there are gaps. So if the O/S isn't aware, in certain monitor setups it's going to plonk a dialog box across a boundary or more, making it hard to read. And when you maximize a window it gets ridiculous. I don't think it helps to have your taskbar stretched across 3 screens either..
Nvidia actually does let you treat two displays as one (span mode), but they also allow you to expose them to windows (dual view). And I prefer dual view to span, and don't have many problems with it.
I can quickly move windows to different monitors with hotkeys (or just plain mouse dragging). The problem I have is moving "full screen" stuff to different monitors usually doesn't work.
"Single Large Surface"/span mode is actually going to be suboptimal for most users. Just a kludge to support OSes that can't handle 6 displays in various configs.
kekewons - Saturday, September 12, 2009 - link
FWIW, I suspect SLS will incorporate some sort of "hard frame" adjustment...if only because freeware "multimonitor" solutions like SoftTH already do so even now.But, at the same time, I'll also say I wouldn't be surprised to see this driver team miss out completely, re: a version 1.0 release.
Because it's happened before......
snarfbot - Friday, September 11, 2009 - link
um projectors?buy 6 cheap led projectors, 800x480 resolution each, easily make a surround setup with it too.
best part is no bezels. thats the ticket.
SnowleopardPC - Sunday, September 13, 2009 - link
"um projectors?buy 6 cheap led projectors, 800x480 resolution each, easily make a surround setup with it too.
best part is no bezels. thats the ticket. "
What in the world is that about. I think everyone here is looking at a minimum resolution of 1920x1080 or 1920x1200 for a 24 inch panel. To get that in a projector would cost about $2400 right now per projector.
For me to do that with this step, I would need 6 of them.
Now if you are talking about 30 inch panels, then the resolution per panel is 2560x1600.
Try finding a projector that normal people can afford with the pixel count, and then multiply it by 6
Once Dell comes out with an OLED 30 inch panel with out a bezel I am in for 6 of them.
HD TV is already obsolete and it has been for a long time.
Shadowmaster625 - Friday, September 11, 2009 - link
I dont really see the appeal of a wraparound display in the concave styles. No matter what you do it isnt going to be 3D and its not going to fool anyone.However, imagine if you took 2 monitors and angled them in the opposite manner, so that they were at a slight angle away from you. Convex in other words. Then you can generate a convincing 3D image. In theory, if you had a semi-spherical display, it could generate a very convincing 3D image. It would take some powerful software, but it would look cool as hell.
diggernash - Friday, September 11, 2009 - link
Don't know if it is mentioned above, but you can throw 3 (well 6 now) LCD projectors onto angled screens and do away with the bezel lines all together. And no, it does not work the same as projecting one big screen. The screens on the side definitely add to the experience.diggernash - Friday, September 11, 2009 - link
take that LCD outta my comments, massive brain failurekekewons - Friday, September 11, 2009 - link
Right. Probably better using DLP projectors (particularly since the latter are more flexible with regard to 3D technologies.xetura - Friday, September 11, 2009 - link
If you've never used a multi-monitor setup, you know that the monitor edges don't mean jack squat. For a triple setup, you don't actually look at the outer monitors. They're for peripheral vision. Alot of people miss the point entirely on that. Now for a x6 setup, yes this would be annoying, but for 3x or 5x setups, you don't even notice it. After racing on a x3 setup, racing on anything else is completely pointless. It's something that truly needs to be experienced to understand.kekewons - Friday, September 11, 2009 - link
I agree, if for no other reason that a 6x2 setup would have two of the six monitor top/bottem edges close to the center of the large image. I'm quite sure it would be unworkable for me, in the long run.Doesn't mean the vertical monitor edges in a 3 monitor rig are as intrusive, however, and I tend to believe they wouldn't be, yes.
They won't be, if your brain can get past them, any more that the low resolution/hi-pixelization image of a large, close projection would be.
But the big question is "IF." Yes, I can look past the low res image out onto the imaginary road beyond...but others might find it more difficult.
And maybe you can easily ignore vertical edge on a flatscreen, while I find it more difficult to do so....
Projection solutions should make both of those issues easier to solve.
Agree too that the tow peripheral monitors (or projections) in a three monitor/projector setup should NOT require either hi-resolution or even focus--ideally they are seen out of the corners of the two eyes, and are important only for determining overall sense of speed when driving--for viewing "objects passing by at high speeds."
willssi - Friday, September 11, 2009 - link
The person who took the pictures for the article should have been standing further back. For displays this large, they actually don't look good if you're too close to them. Something like the 2 x 3 setup would require you be a good 10 - 15 feet away from it to make the bezels less noticeable and really show off the massive resolution.Home theater guys will tell you when you're buying a big screen TV that you need XX feet from the TV, minimum, for it to look good. Same goes for this setup. The best picture was the last one on the 2nd page, of Dirt 2 I believe. A little distance back really shows off how crisp and huge this display setup is. Seeing several balding white guys sitting 2 feet from this massive display doesn't really give you a sense of how it will look in person.
kekewons - Friday, September 11, 2009 - link
I can appreciate the desire NOT to see individual pixels in any given image, and, yes, I certainly share that ultimate goal......but at the same time, I think there IS something to be said for seeing a very large image up close--at a certain point, the "immersion" factor kicks in, and it really can help offset the downside of seeing a large image under low-resolution.
I drive racing simulations using a XGA resolution projector, and I have it set up to throw a LARGE image not far from my eyes--60" horizontal image (4:3) only 36" or so from my eyes.
It certainly IS pixelated, yes. The Windows hourglass is just silly.
But when used to experience a "3D space" I find myself looking through the screen...and therefore past the pixels...into that 3D world.
I will agree in the next breath that seeing 2D chat can be a problem, however. :D
So I often turn it off....
Noctifer - Friday, September 11, 2009 - link
Ye, it looks better from the distance rather than from close. But most people won't use Eyefinity, the really great part of the card is its power. I mean being able to run Dirt 2 on a playable frame rate on such a high resolution is quite impressive.Can't wait till some PC magazines get their hands on one of those and show us the real difference in power compared with current cards.
Zool - Friday, September 11, 2009 - link
7680 x 3200 is nice but the DPI is practically the same. And if you sit right in front of them like the guy in the WoW picture it's even confusing. To have much higher DPI you would need to watch the display from a much greater distance (and then you couldn't read the chat). Of course that doesn't apply to the panoramic view in simulators or racing games, where the 3-display setup can have a use (arcades maybe), and in FPS games for extra space for maybe a map, or the view from other squad members like in the old Hired Guns on Amiga and PC.
kekewons - Friday, September 11, 2009 - link
The big attraction for me here is the possibility to run these outputs through projectors, rather than flatscreen monitors. I DO spend most of my PC gaming time driving racing simulators (primarily "rFactor"), and do use a projector to throw up a LARGE, CLOSE image. Pixelation is an issue, but, IMO, the rest of the tradeoffs make it worthwhile to go this route.
What intrigues me about this new card/system are two things: (1) The possibility of running this card output thru two or three-projector rigs, in which one or two "widescreen" projections (covering most or all of a full-surround 180 degree "dome/planetarium" space) are overlaid in the center with a smaller and more highly detailed/higher resolution third projection. If such a rig could be melded with realtime headtracking/eyetracking inside a projection CAVE *or better yet, a dome*, it seems to me we might finally realize the holy-grail: A full-surround, simulation space, at fairly nominal cost.
(2) The possibility of enabling at least that smaller, central region for 3D (stereo) imaging. Obviously, since this is an AMD card, any stereo output would necessarily depend on alternative solutions to Nvidia 3D...but there is at least one of those solutions that might work: TI's "DLP-link," which apparently can be used to enable some new projectors (ViewSonic) with the new Crystal Eyes-5 shutter glasses to allow 3D output (all without using Nvidia's cards and 3D specific drivers)....
...so let's amend that ^ to read: "A surround, 3D simulation space, at fairly nominal cost."
Could it be we are finally getting close?
marc1000 - Friday, September 11, 2009 - link
Why is everyone whining about the monitors??? The interesting part is the new GPU. When will you guys get your hands on these new boards to dissect them for us? Win7 is about to launch retail in less than a month, so it is about time to see those new DX11 boards hit the market!
coffeehousejam - Friday, September 11, 2009 - link
Sigh. And here I thought Anandtech readers were a brighter group of people. A 6 monitor setup pumped out of one video card is incredible, no doubt about it. But to the average consumer it's not even close to practical. Everyone is talking about the six display setup capabilities, issues with bezel and LEDs as though they are considering taking advantage of this. Guys, read between the lines: the real story is a GPU that can play DX11 titles so well that even 6 monitors at 4 times the typical person's resolution aren't even enough to bring it to its knees.
jimhsu - Friday, September 11, 2009 - link
No, I'm pretty sure the optimization is resolution scaling (largely memory bound) and not necessarily raw throughput (GPU bound). Unless they have more surprises. That would be why they showed a demo of WoW at ultra-high resolutions. Using FSAA or pixel shaders will be much more stressful.
papapapapapapapababy - Friday, September 11, 2009 - link
Copy-paste from neogaf: My $85 4770 laughs at this news. Hell my 9800GT and even the 8600GTS sit here unused and laugh at this fuking news. BASICALLY...
FUCK YOU ATI. FUCK YOU NVIDIA. FUCK YOU AMD. AND FUCK YOUR ASS INTEL. OH AND MICROSOFT. FUCK YOU TOO, YOU STUPID FUCKS. LET ME ASK YOU THIS...
Where the hell is my KZ2 caliber game, exclusive for the PC? MY GT5 CALIBER GAME? AH?
Crysis? Fuck that. YES, It was funny game for like a few days, but basically that's just throwing shit at my PC, so you FUCKS can justify selling your ridiculous hardware. That doesn't strike me as a good, intelligent, and honest effort. That's not efficient. That doesn't wow me. KZ2 does. GT5 does ( For the record, im no ps3 fan) And those games are running on a super piece of shit notebook gpu from 2005!!
So enough of this bullshit. ENOUGH! YOU WANT ME TO BUY YOUR STUPID HARDWARE? WOW ME. USE WHAT I HAVE FOR A FUKING CHANGE. PUT SOME FUCKING EFFORT ON IT. HIGHER ANTI ALIASING AND HIGHER RESOLUTION IS NOT GOING TO CUT IT ANYMORE. IM NOT ASKING FOR MUCH. 720P AND 30FPS IS GOOD ENOUGH FOR ME.
JUST TAKE WHATS LEFT AND SQUEEZE REALLY HARD. YOU KNOW? LIKE YOU FUCKS DO WITH THE CONSOLES. UNTIL THEN, FUCK YOU.
papapapapapapapababy - Friday, September 11, 2009 - link
http://www.neogaf.com/forum/showthread.php?t=37255...
papapapapapapapababy - Friday, September 11, 2009 - link
Sometimes less is more. That's why I love my HDTV: less resolution (720p), more screen (42"), and that's why I hate desktop LCDs: too much resolution + tiny screens. ATI, this shit does not appeal to me at all. Give me a GPU that renders at low res and then scales my games (not movies) to 1080p LCD resolution so I can play Crysis on a cheap desktop LCD. This Eye-gimmicky thing? > stupid.
Wererat - Friday, September 11, 2009 - link
In reading through the first page of comments, bezel issues were mentioned. I personally wouldn't want a 2x3 panel setup because of those; any even number of panels puts the center of view in the middle of a bezel.
1x3 is great though, as race and flight simmers will attest. With that setup, most games (including shooters and RPGs) will give a dramatic peripheral view.
Unfortunately for Matrox, this more or less kills their $300 "TripleHead2Go" product.
justaviking - Friday, September 11, 2009 - link
Yes, that is indeed a cool demonstration. My question is "Can you mix resolutions?"
Something I cannot do today is to have two displays of a cloned desktop, one being a different resolution than the other.
Why would I want to do that? Sometimes I would like to display a game on the television. It accepts VGA input (yes, yes, it's old tech), but I have to change the monitor to the same resolution as the TV in order to do that. You would think it would be so simple to display the same desktop on two monitors, but you can't do it if the resolutions aren't the same.
Obviously this card (and a hundred others) has the power to do that simple setup. I wonder if it lets you.
7Enigma - Friday, September 11, 2009 - link
I had said this after the last launch of GPUs, but I think AMD/NVIDIA are on a very slippery slope right now. With the vast majority of people (gamers included) using 19-22" monitors, there are really no games that will make last gen's cards sweat at those resolutions. Most people will start to transition to 24" displays, but I do not see a significant number of people going to 30" or above in the next couple of years. This means that for the majority of people (let's face it, CAD/3D modeling/etc. is a minority) there is NO GOOD REASON to actually upgrade. We're no longer "forced" to purchase the next great thing to play the newest game well. Think back to F.E.A.R., Oblivion, Crysis (crap coding, but still); all of those games, when they debuted, could not be played even close to max settings on >19" monitors.
I haven't seen anything yet coming out this year that will tax my 4870 at my gaming resolution (currently a 19" LCD, looking forward to a 24" upgrade in the next year). That is 2 generations back (depending on what you consider the 4890) from the 5870, and the MAINSTREAM card at that.
We are definitely in the age where the GPU, while still the limiting factor for gaming and modeling, has surpassed what is required for the majority of people.
Don't get me wrong, I love new tech and this card looks potentially incredible, but other than new computer sales and the bleeding edge crowd, who really needs these in the next 12-24 months?
Zingam - Friday, September 11, 2009 - link
Everybody is gonna buy this now and nobody will look at the GF380 :D Vision and Blindeye technologies by AMD are must-have ones!!!
Nvidia and Intel are doomed!
Holly - Friday, September 11, 2009 - link
Please correct me if I am wrong, but it seems to me that the card simply lacks enough RAM to run at that resolution... Forgetting everything except the framebuffer:
7680*3200*32bits per pixel = 98,304,000 bytes
now add 4x FSAA... 98,304,000 * 4*4 = 1,572,864,000 bytes (almost 1.5 GB)
we are quite close to the roof already if the 2GB of RAM on the card information is correct... and we dropped the Z-buffer, stencil buffer, textures, everything...
Zool - Saturday, September 12, 2009 - link
"7680*3200*32bits per pixel = 98,304,000 bytes" u are changing bits to Bytes :P. Thats only 98,304,000/8 Byts.Holly - Sunday, September 13, 2009 - link
No, I wrote it in bits so people are not puzzled about where I took the 4-byte multiplication from... 7680*3200*4 = 98,304,000 bytes. If it were in bits: 7680*3200*32 = 786,432,000 bits.
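For anyone who wants to follow the arithmetic in the exchange above, here is a minimal sketch in Python (colour buffer only, assuming 32-bit colour, exactly as in the comment; Z-buffer, stencil and textures are still ignored):

width, height = 7680, 3200
bytes_per_pixel = 4                     # 32-bit colour = 4 bytes per pixel
base = width * height * bytes_per_pixel
print(f"plain colour buffer: {base:,} bytes (~{base / 2**20:.0f} MiB)")
# 98,304,000 bytes, the figure under discussion (bytes, not bits)

# The "* 4*4" step above corresponds to 4x supersampling in each axis (16 samples per pixel);
# plain 4x MSAA would store 4 colour samples per pixel instead.
print(f"16 samples/pixel: {base * 16:,} bytes (~{base * 16 / 2**30:.2f} GiB)")
print(f" 4 samples/pixel: {base * 4:,} bytes (~{base * 4 / 2**20:.0f} MiB)")

On a card with 2GB of memory the 16-sample case does indeed leave little headroom once everything else is added back in.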
Dudler - Friday, September 11, 2009 - link
They use the Trillian (Six?) card. AFAIK no specs have been leaked about this card. The 5870 will come in 1 and 2 gig flavours; maybe the "Six" will come with 4 as an option?
poohbear - Friday, September 11, 2009 - link
Why is so much attention given to its support for 6 monitors? That's cool and all, but who on earth is gonna use that feature? Seriously, let's write stuff for your target middle class audience, techies that generally don't have $1600 nor the space to spend on 6 displays.
Dudler - Friday, September 11, 2009 - link
A 30" screen is more expensive than 2 24" screens. So when Samsung (And the other WILL follow suit) comes with thin bezel screens, high resolutions will become affordable.So seriously, this is written for the "middle" class audience, you just have to understand the ramifications of this technology.
And as far as I know, OLED screens can be made without bezels entirely.. I guess the screen manufacturers are going to push that tech faster now, since it actually can make a difference.
jimhsu - Friday, September 11, 2009 - link
I remember when 17 inch LCDs looked horrible and cost almost $2000. This is a bargain by comparison, assuming you have the app to take advantage of it.
camylarde - Friday, September 11, 2009 - link
8th - Lynnfield article and everybody drools to death about it.
10th - 58xx blog post and everybody forgets Lynnfield and talks about AMD.
15th - Wonder what Nvidia comes up with ;-)
Azarien - Friday, September 11, 2009 - link
I could do this with two monitors long ago. This has more monitors and maybe has fewer bugs ("just works"), but it's still a reimplementation of an old thing. A real multi-monitor setup with independent displays, where a maximized window does NOT span across all displays, is much more usable.
Dudler - Friday, September 11, 2009 - link
So tell me: what old card gives you the option of running 6 screens from one card?
And that should be a consumer product, not a professional one. And if you actually read the article, you'll see that you CAN setup each monitor independently. Or in 5 groups. Or in 4. Or in 3. Or 2. Also as one big screen.
therealnickdanger - Thursday, September 10, 2009 - link
Watch this video closely. There are 24 1080p monitors being rendered by 5800-class Quadfire. Notice how the screens lag when the camera pans? Chances are that maybe you wouldn't notice when up close, but it certainly is distracting... http://www.youtube.com/watch?v=N6Vf8R_gOec
captcanuk - Thursday, September 10, 2009 - link
Watch this other video and the music will distract you from the lag: http://www.youtube.com/watch?v=tzvfzJq3VTU
iwodo - Thursday, September 10, 2009 - link
Well, OLED can do without the bezel, and I believe LED backlighting can be done as well. The good thing about this is, if it really does take off we can finally GET RID OF TN panels. Because having a poor vertical viewing angle would make the experience yuck.
Cmiller303 - Thursday, September 10, 2009 - link
that is fucking hideous
wagoo - Thursday, September 10, 2009 - link
Does the driver support tilted monitors? It seems to me three or five monitors in a tilted single row configuration would do best at minimizing bezel annoyance while also giving a nice increase to vertical real estate.
8000x2560 or 4800x2560? Great if it works..
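For reference, here is where those numbers most likely come from, assuming 30" 2560x1600 panels rotated into portrait (an illustration only, not something stated in the article):

# Each 2560x1600 panel turned on its side contributes 1600 horizontal
# and 2560 vertical pixels to a single-row portrait setup.
portrait_w, portrait_h = 1600, 2560
for panels in (3, 5):
    print(f"{panels} portrait panels: {panels * portrait_w} x {portrait_h}")
# 3 panels: 4800 x 2560, 5 panels: 8000 x 2560, matching the figures above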
Byrn - Friday, September 11, 2009 - link
Just what I was thinking... Looks like it will, from the article here: http://www.techradar.com/news/gaming/hands-on-ati-...
SnowleopardPC - Thursday, September 10, 2009 - link
I had to create an account to comment on this. I am running 2 ATI 4870's with 3 Dell 2408WFP's and a 42 inch Sony XBR on HDMI. 6 Dell 3008WFP's would be sweet, and at 80FPS.
My only question... WoW? An ATI 1x series card from 15 years ago can run WoW at 80FPS at full res...
Why not give us some info using a game that can take advantage of a card like that.
If you are going to pick wow, at least look at Guild Wars where the graphics can actually scale to the resolution and test the card out... need I say... Does it play Crysis at that res at 80FPS? lol
ipay - Friday, September 11, 2009 - link
Agreed - if they wanted to demo an MMORPG they should've used EVE Online; at least it has a recent graphics engine that doesn't look like ass. WoW's minimum system requirements are hardware T&L, for crying out loud... that's the original GeForce! AnandTech benchmarked WoW way back in 2005 (see http://www.anandtech.com/video/showdoc.aspx?i=2381...) and the cards there could almost hit 80FPS at 800x600, so I don't think it's that much of an achievement to hit the same performance on 10x the screen real estate.
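For what it's worth, a quick pixel-count comparison of the figures being thrown around (the "10x" above presumably compares against a typical 2009 desktop resolution rather than 800x600):

demo_pixels = 7680 * 3200
print(demo_pixels / (800 * 600))     # ~51x the pixels of the old 800x600 benchmark
print(demo_pixels / (1920 * 1200))   # ~10.7x a common 24" desktop resolution of the time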
Anubis - Friday, September 11, 2009 - link
lol no it cant
zebrax2 - Thursday, September 10, 2009 - link
Well he did say that he played Dirt 2 with playable frame rates.
araczynski - Thursday, September 10, 2009 - link
good to see the hardware manufacturers bastardizing Moore's law. not only does the technology double in power every 18 months (or whatever), it also suspiciously doubles in price at the same time! well, at least on the new tech to take up the slack for the price halving of the old tech.
know of fence - Thursday, September 10, 2009 - link
There is a sweet spot for pixel density (~100 dpi) and minimal viewing distance (~20 inches). More pixels and bigger screens just mean a bigger minimal distance from the screen. Unless there are concerns with myopia, big but distant screens don't make much sense for reading or playing. (People beyond 35 tend to pull up and "hug" their monitors anyway.)
Bigger resolutions are interesting for advertisers who don't want their walls of flickering commercials to appear blurry when approached.
Also, 10 or more megapixels would make it possible to show photographs in native resolution.
There is no point in rendering a plain on a surrounding monitor setup, but rendering a long horizon with 3 monitors could be quite immersive indeed.
A setup of six 16:10 monitors has practically a cinemascope aspect ratio of 2.4:1. Six $200 monitors could create a brighter alternative to the projector home cinema experience.
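As a rough check of the sweet-spot and aspect-ratio claims above, here is a small sketch using the common 1-arcminute visual acuity rule of thumb (the exact numbers depend on the viewer, so treat these as ballpark figures):

import math

def max_resolvable_dpi(distance_inches):
    """Pixel density beyond which a 20/20 eye can no longer resolve
    individual pixels, assuming ~1 arcminute of angular resolution."""
    one_arcmin = math.radians(1 / 60)
    return 1 / (distance_inches * math.tan(one_arcmin))

for d in (20, 36, 120):   # inches
    print(f'{d}" away: ~{max_resolvable_dpi(d):.0f} dpi is the useful ceiling')
# ~172 dpi at 20 inches, dropping to ~29 dpi at 10 feet, so more pixels on a
# bigger screen mostly just push the minimum comfortable distance further out.

# The cinemascope claim: a 3-wide by 2-high wall of 16:10 panels gives
print((3 * 16) / (2 * 10))   # 2.4, i.e. a 2.4:1 aspect ratio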
RubberJohnny - Friday, September 11, 2009 - link
"A setup of six 16:10 monitors has practically cinemascope aspect ratio 1:2,4. Six 200$ monitors could create a brighter alternative to the projector home cinema experience"No thanks, i'd take a single 1080p projector over 6 monitors. Whats the point of all that res if you have distracting black bars running through it? LCD manufacturers could make super large, super high res displays right now...if there was demand.
Randomblame - Thursday, September 10, 2009 - link
And I thought my WoW experience improved when I went to my 24in 1080p monitor; now I feel shamed. I must have this....
EJ257 - Thursday, September 10, 2009 - link
How about flight sim and motor racing games? This GPU coupled with a curved/wrap-around 100" LCD. How about a 4 GPU "personal supercomputer" setup like the Tesla? =)
stevejg61 - Thursday, September 10, 2009 - link
There is one inherent problem that needs to be addressed with the curved wrap-around: it has to have several FOVs, one for each monitor. We ran into this using the Evans & Sutherland laser projector (5Kx4K and 8Kx4K - yes, those were real resolutions, using 16 PCs). Generally a video game will use a single FOV, whereas in reality you would need one for each monitor. The other way is to get distortion correction working for the whole picture, but that involves rendering an area bigger than the displayed resolution and also doing it on the fly, which will induce a frame of lag in rendering.
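A minimal sketch of the geometry being described, with illustrative numbers (the monitor width and eye distance here are assumptions, not values from the article):

import math

monitor_width_m = 0.475   # visible width of one ~22" 16:10 panel (assumed)
eye_distance_m = 0.65     # eyes to the centre screen (assumed)

# Horizontal field of view that one flat, head-on monitor actually covers:
fov_single = 2 * math.degrees(math.atan(monitor_width_m / (2 * eye_distance_m)))
print(f"one monitor covers roughly {fov_single:.0f} degrees")   # ~40 degrees

# Rendering three angled monitors with a single wide frustum treats them as one
# flat plane; geometrically correct output needs one frustum per monitor (each
# rotated to match its physical angle), or a distortion-correction pass over a
# larger intermediate render, which is the extra work described above.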
james jwb - Thursday, September 10, 2009 - link
Those who've been keeping an eye on the Matrox TripleHead2Go are going to be drooling at this. This is easily the right way to do it. What is the bandwidth limit of this system - does it have six DisplayPorts' worth of bandwidth?
drmo - Thursday, September 10, 2009 - link
I would just like to see an FPS that shows you what you would see with peripheral vision. Then I could see the point in having a three monitor setup for games. In real life, I can tell when someone is right beside me, but in an FPS, I can't unless I turn that way. Just having a wider view of what is in front of me doesn't do anything for me.
Myrandex - Friday, September 11, 2009 - link
I believe Matrox's Triple Head gaming was supposed to do this, and not just offer a wider resolution. I don't think it had much support, but I do think that Quake III was one title that did support it.
Jason
Hlafordlaes - Thursday, September 10, 2009 - link
The setup reminds me of the Philips Videowall technology of the 1980's. It consisted of a set of rear-projection monitors that could be individually addressed or spanned as one, with a barely visible seam between them that quickly became unnoticeable if you sat and watched a show. We set one up in a retail location, and since it coincided with the VHS boom at the time, we showcased new titles on it, which led to really high sales. Even Montserrat Caballé came to the shop to give a short recital "broadcast" on site. (And, yes, once one disgruntled employee managed to put a truly nasty sex tape on at peak shopping hours). I tell you, we wowed the rubes. But it was mega-$$$ at the time.
The Samsung concept display shown in the article looks attractive, and a first step toward getting it right. One can see IMax+3D home theaters in the offing in a few years.
Flyboy27 - Thursday, September 10, 2009 - link
Dude...! $%&* Crysis! Seriously, it's not THAT good. OK, for the people that always ask: from now on, yes it will run Crysis, and no you shouldn't give a $%&*.
Pirks - Thursday, September 10, 2009 - link
I doubt it will render Crysis faster than 1 frame per second. Good luck enjoying your focking slideshow :)
JimmiG - Thursday, September 10, 2009 - link
Blurry textures, blocky models, leaves and vegetation made out of 2D sprites, and overly shiny, plastic, artificial looking trees and rocks will still look as crappy at 50 megapixels as they do at 1.8 MP. Especially when combined with cramped levels, repetitive gameplay, dumb AI and a completely non-interactive environment (being able to blow everything to pieces doesn't count).
Pirks - Thursday, September 10, 2009 - link
It'll be 10 or 20 years till they can render Crysis at that 7000x3000 resolution with playable framerates. *yawns*
Maybe my son lives long enough to see it... I hope
Griswold - Friday, September 11, 2009 - link
"Maybe my son lives long enough to see it... I hope"I hope you're kidding and not actually breeding...
Pirks - Friday, September 11, 2009 - link
I'll teach him how to taunt you too :P
sbuckler - Thursday, September 10, 2009 - link
Well, not quite, but a lot more advanced than what ATI is talking about. It's called a CAVE. You sit in a large cube, every wall of which has 2 projectors back-projecting onto it. Put on the 3D glasses and it's like the holodeck, in that everything is in 3D and surrounds you. It's used for things like styling cars - designers can make the inside of a new car in CAD and then sit inside it to see if it really works.
Costs a fortune however and wouldn't fit in the average house :)
jay401 - Thursday, September 10, 2009 - link
It's cool and all, but it makes it feel like you're playing all your games through a football (gridiron) helmet due to the edges of the monitors intersecting the display. Also, unless you use one of the far right-side monitors as your primary desktop, you'll have that annoying run-off problem with the mouse cursor where it slides over to the display to the right when you just want to click the [X] for the window in your current monitor, just due to our ingrained nature of over-sliding the mouse to the upper-right corner when we want to close a window. :)
Using a left-side monitor as my primary, I can solve this with UltraMon by making my second display not only to the right but LOWER than my primary display, so that the cursor still stops at the upper right edge of my primary display.
I wonder how this system will handle it. Perhaps it will be smart enough or have a setting that allows you to keep the cursor from sliding beyond the right edge of the current screen when a program is maximized on that screen, so that you can hit the [X] easily like you can on a single display setup (or a right-side display in a multi-display setup).
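For the curious, here is a hypothetical Windows-only sketch (Python over ctypes/user32) of the behaviour described above: confining the cursor to whichever monitor contains a given point so it cannot overshoot onto the next display. It only illustrates the stock Win32 ClipCursor call; it is not how UltraMon or AMD's driver actually implements this.

import ctypes
from ctypes import wintypes

user32 = ctypes.windll.user32
MONITOR_DEFAULTTONEAREST = 2

class MONITORINFO(ctypes.Structure):
    _fields_ = [("cbSize", wintypes.DWORD),
                ("rcMonitor", wintypes.RECT),
                ("rcWork", wintypes.RECT),
                ("dwFlags", wintypes.DWORD)]

user32.MonitorFromPoint.argtypes = [wintypes.POINT, wintypes.DWORD]
user32.MonitorFromPoint.restype = ctypes.c_void_p
user32.GetMonitorInfoW.argtypes = [ctypes.c_void_p, ctypes.POINTER(MONITORINFO)]

def clip_cursor_to_monitor_at(x, y):
    """Restrict cursor movement to the monitor containing the point (x, y)."""
    hmon = user32.MonitorFromPoint(wintypes.POINT(x, y), MONITOR_DEFAULTTONEAREST)
    info = MONITORINFO()
    info.cbSize = ctypes.sizeof(MONITORINFO)
    user32.GetMonitorInfoW(hmon, ctypes.byref(info))
    user32.ClipCursor(ctypes.byref(info.rcMonitor))

def release_cursor():
    user32.ClipCursor(None)   # restore free movement across all displays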
strikeback03 - Thursday, September 10, 2009 - link
As I understand it, you can't just maximize windows into screens. The applications and OS don't know your desktop is spread across several screens, so using maximize will maximize the window to cover all available screens. Which kinda sucks if you want to have several different windows all fullscreen on their own monitor.
HelToupee - Thursday, September 10, 2009 - link
Can you have Windows manage the monitors instead of going through this AMD software trickery? I would imagine you would want them to appear to Windows as 6 separate monitors, but then turn on the AMD single-surface-whizbang when you launched an OpenGL or DirectX app. I see they're touting Linux support for this. I hope they start taking their Linux drivers more seriously.
This will be huge news for guys using DMX and Chromium (not the Google browser, the other Chromium) to do the giant wall-o-monitors displays.
Lonyo - Thursday, September 10, 2009 - link
It says that you can manage windows within the software. In the article it mentions taking 6 monitors and dividing it so 4 are one "screen" and the other two form a "second" screen. I presume that means within each grouping applications would maximise as they would if you had 2 physical monitors and were letting Windows control them.
It's like a grid system (which already exists within at least the NV driver options) but spread across multiple monitors in groups, I would assume.
Windows will see whatever ATI get their drivers to show it, so if ATI allow you to manipulate monitors into groupings, that's what Windows will see.
Lonyo - Thursday, September 10, 2009 - link
This sort of setup isn't ideal for all games, and I doubt anyone would argue it is, but it is great for some titles. In RTS games the borders don't matter.
In racing games the borders don't really matter, the added visual field is very advantageous if you are using an in-car view. Going from a 20" flanked by two 17" monitors to a single 24", I notice the loss of peripheral vision, and the monitor breaks weren't far off the roof pillar positions anyway.
In flight sims, as has been said in the article, not a problem.
In FPS games maybe it will be more of a nuisance, but not every game is an FPS.
imaheadcase - Thursday, September 10, 2009 - link
I would think it would be a problem in ANY game. It even looks like it from the screenshots in the article. Look, the best way to implement a multi-monitor setup is what has been done in games already. Supreme Commander is a good example: you make a monitor setup with each monitor acting independently of the others.
Open map on one monitor, game on other, stats on one, etc.
Having a large screen with bezels like that is not impressive and doesn't work to the user's advantage in a game. Having a multi-monitor setup with the game outputting the scenes you want to each monitor would be far more impressive. So many more ways a game could take advantage of that.
The game that comes out with those features implemented well into the gameplay would drive the sales of these setups in the gaming market. But till then it's never going to take off.
Just my opinion :P
zebrax2 - Thursday, September 10, 2009 - link
I agree with Lonyo. The bezels aren't really that intrusive in some games. A 3x2 setup in an FPS would be a PITA as that would mean my crosshair is cut in two, but a 3x1 setup is just dandy.
skrewler2 - Thursday, September 10, 2009 - link
Define 'maxed out'. Was multisampling maxed and triple buffering turned on? I can't imagine a single card could drive that resolution at 80fps. If so, wow, nvidia is in serious trouble and AMD is going to be getting -a lot- of customers..
skrewler2 - Thursday, September 10, 2009 - link
Ok, derr, they obviously didn't have triple buffering turned on if they were getting 80fps.therealnickdanger - Friday, September 11, 2009 - link
Not necessarily. Simply put, triple buffering still allows the GPU to push as many frames as it is able to, but it throws out frames not synched to the display frequency. So while the GPU may be rendering at 80fps internally, you only see 60fps (assuming a 60Hz display). This explains it better:
http://www.anandtech.com/video/showdoc.aspx?i=3591...
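To put numbers on that, here is a toy simulation (assumed frame times, not a benchmark): the GPU keeps rendering at its own pace, and each refresh simply scans out the newest completed frame, so anything rendered above the refresh rate is silently dropped.

gpu_fps = 80          # assumed internal render rate
refresh_hz = 60       # assumed display refresh rate
sim_time = 1.0        # simulate one second

# Times at which the GPU finishes each frame (it never stalls waiting for vsync).
finished = [i / gpu_fps for i in range(1, int(sim_time * gpu_fps) + 1)]

shown = set()
for r in range(1, int(sim_time * refresh_hz) + 1):
    vsync = r / refresh_hz
    ready = [i for i, t in enumerate(finished) if t <= vsync]
    if ready:
        shown.add(ready[-1])          # the newest completed frame wins

print(f"rendered: {len(finished)}, actually displayed: {len(shown)}")
# ~80 rendered vs ~60 displayed: the extra frames cut latency but are never seen.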
snakeoil - Thursday, September 10, 2009 - link
Did you notice that Mr. Shimpi is only impressed by anything Intel says? Like the Lynnfield review - come on, "harder, stronger, longer", give me a break.
anyway not that i care.
Chlorus - Friday, September 11, 2009 - link
Can someone do my sanity a favor and ban this idiot? Do you think he actually has a job? Hell, does anyone even think he has a frakking GED? I love how he thinks every review site is engaged in a mass conspiracy against AMD.
Lonyo - Thursday, September 10, 2009 - link
Yeah, that's why the Radeon HD4870 review was "HD4870 - The Card To Get". Almost every preview article (this is a preview article) doesn't have any kind of flashy subtitle/comment, just a quick summary of what it's about.
When the reviews of the ATI DX11 cards come out, I am sure they will have some sort of zany subtitle/comment about how amazing they are (since compared to current gen they are sure to be amazing, and I doubt by that time based on rumour we will have anything from nvidia except pre-release benchmarks if they feel desperate).
skrewler2 - Thursday, September 10, 2009 - link
I didn't see this in the article. How does a graphics card with two outputs drive 6 displays? What hardware is needed to do this? Is there some kind of splitter or DP box they all plug into? I have two Dell 3007WFP-HCs already and this is making me want to buy a 3rd, but I don't know if I need anything else to drive it.
Dudler - Friday, September 11, 2009 - link
I think they use the "Trillian" AIB, possibly also known as the "Six". It features six mini-DisplayPort outputs. Other articles on the net show ATi demonstrating a 24 monitor setup using 4 (!!) Six cards. I don't know which GPU the "Six" cards use; rumours were both Cypress (aka R870) and Hemlock (aka R800). From the coolers shown, I think it uses "just" Cypress (single chip, not a duallie).
I also believe that the long socalled 5870 card shown in photos around the net is Hemlock(5870x2), not 5870.
And for you concerned about your power bill, rumours state that the 5870 uses 28W(!!!!!!) in idle and 2D.
This ATi generation rocks. I only hope nVidia will get their card out and survive. Anyway you look at it, their future is bleak. Their chipset business is coming to an end except for AMD CPUs, and they're late with the GT300.
snakeoil - Thursday, September 10, 2009 - link
Come on, you have a brain: what about two cards in Crossfire?
Uh, in the article they said one card was driving 6 monitors.. what the fuck does your comment even mean?
wifiwolf - Thursday, September 10, 2009 - link
from page 1:"The OEMs asked AMD for six possible outputs for DisplayPort from their notebook GPUs: up to two internally for notebook panels, up to two externally for conncetors on the side of the notebook and up to two for use via a docking station. In order to fulfill these needs AMD had to build in 6 lanes of DisplayPort outputs into its GPUs, driven by a single display engine. A single display engine could drive any two outputs, similar to how graphics cards work today."
mczak - Thursday, September 10, 2009 - link
I can't make much sense out of this, however. Current cards have two independent display controllers. These two display controllers AFAIK don't really share anything, so referring to them as one display engine doesn't really make a lot of sense. So any RV870 really has 6 display controllers (though probably not that many internal TMDS transmitters, so if you'd want to drive more than 2 DVI/HDMI displays I'd guess you're out of luck, or you'd need a card with external TMDS transmitters)?
mczak - Thursday, September 10, 2009 - link
Actually no, all passive DP/DVI converters should still work. You just need enough space on your desk :-)
therealnickdanger - Thursday, September 10, 2009 - link
Am I the only one here that thinks the real story is not the multi-monitor support, but rather the ridiculous GPU power driving them all? I realize they haven't fully disclosed the specs publicly, but 7000x3000 resolution at over 60fps? The article barely seems impressed by these numbers. Was this the performance expected from single-GPU setups for this generation? I didn't see this coming at all and I'm completely floored! Also, I would just like to add that I have always preferred being able to segregate displays so that I can easily maximize multiple applications within their own screen. Having everything as "one giant display" has been possible for years and is less than desirable for everything BUT gaming... IMO
MadMan007 - Friday, September 11, 2009 - link
Yes it's nuts and you can thank ATi for finally upping the texture units and ROPs. I think they've been the same in number, although they've gotten faster, since the x1k series!
Golgatha - Thursday, September 10, 2009 - link
No, I would agree that this resolution with those kinds of framerates is just nuts.skrewler2 - Thursday, September 10, 2009 - link
Yeah, seriously, the performance is unreal. I'm wondering if the settings were really maxed out..
AznBoi36 - Thursday, September 10, 2009 - link
Probably were. But look, it's a lvl 1 toon wandering the world. I wanna see the performance in a 25 man raid or simply wandering around Dalaran. Oh and the shadows better be maxed too!theslug - Thursday, September 10, 2009 - link
I could see it being practical if the bezel of each monitor wasn't visible. They would need the actual LCD panels attached to one another instead.
HelToupee - Thursday, September 10, 2009 - link
LCDs require control electronics around all 4 sides, making the bezel a necessity. It could easily be 1/4 the width of current monitors. I messed around with stitching the images from 3 rear-mounted projectors together. The image was seamless, but the price would be astronomical. That, and you have to have a VERY good screen to project onto, or all your wonderful resolution gets muddied.
USRFobiwan - Friday, September 11, 2009 - link
How about the Samsung 460UTn with just 4mm bezels...
mczak - Friday, September 11, 2009 - link
Or the NEC X461UN, which looks very similar (btw you don't need the 460UTn, the 460UT would do, as there's no use for the built-in PC in this scenario). Those are really expensive (>5000 USD), are huge and low-res (1366x768). That's really for big video walls, not suitable for some monster gaming setup. But really, it shouldn't be much of a problem manufacturing 24 inch or so TFTs with similar slim bezels. There just hasn't been a market for this up to now...
snakeoil - Thursday, September 10, 2009 - link
Wow, this is spectacular. Intel is in big trouble because Intel graphics are pretty much garbage
while AMD's graphics are real gems.
TA152H - Thursday, September 10, 2009 - link
You make a good point, but the other side of the coin is also true - Intel processors are very strong, and AMD processors suck by comparison. It's a pity ATI stopped making chipsets for Intel motherboards. They'd make money, Intel would still sell processors, and the only real loser would be NVIDIA. It's surprising how many chipsets they sell. Like most people, I don't know many who would actively choose NVIDIA chipsets, but it seems they sell them well through HP and Dell, where no one asks or knows the difference. ATI should really make chipsets for the Atom too. That would be a great combination.
formulav8 - Friday, September 11, 2009 - link
I'm not sure why you think AMD's CPUs stink. All the benches I've seen show they can run any game out there and push more than 30fps in all tested games not limited by the video card, and even push the frames to where the video card ends up the bottleneck. No? Even compared to the i5, the PII 9650 held its own quite well in a lot of areas.
For the past few years Intel has definitely had the trashiest iGPUs and probably will for the foreseeable future. And I wouldn't count on Larrabee to change that all that much by the time it comes out. You can have the strongest CPU in the world, but if you have GPU trash like Intel's you can't game at good resolutions and speeds anyway, so you can't make good use of the fastest CPU in the world.
Just my humble opinion :)
Jason
TA152H - Friday, September 11, 2009 - link
I think people sometimes instinctively balance things out, and forget they are even doing it. Keep in mind that Phenom II processors are the same size as the Nehalems, and you're forced to compare the low end of a brain-damaged line with the highest end AMD CPU, and it still generally comes out the loser. That's not a good CPU design, if you think about it.
I don't agree with your remarks about great processors not mattering if you don't have the GPU to go with it. Stuff like that just makes you sound irrational and too pro-AMD. Not everyone needs powerful 3D graphics, but more to the point, you can get an ATI card for your Intel motherboard. So, sorry, they don't need to make a good video card for you to benefit from it.
Kary - Thursday, September 10, 2009 - link
That might have looked really cool using projectors, since that would get rid of your borders... REALLY EXPENSIVE, but cool :)
Maybe on the side of a building
anandreader - Thursday, September 10, 2009 - link
The Apple store on Michigan in Chicago is using something very similar to what's being shown here. If I recall correctly, they had 8 panels arranged 2 across by 4 down. They were running an animation showing all the iPhone apps that were available. I noticed the display from the other side of Michigan and was impressed enough to cross the street to see how they did it. The device was probably purpose-made by the LCD manufacturer, as the seams were about as wide as a single thin bezel on Samsung monitors.
Holly - Thursday, September 10, 2009 - link
Hmm, running that is pretty nice, especially with the framerate given, but god save us from the electricity bills... screens of this size with good colour reproduction take at least 80 watts each... giving half a kilowatt for the screens alone. Anyway, I am eager to get a detailed review soon... and hopefully the nVidia counterpart as well... $1000+ graphics cards are just nice to see in benchmarks, so fingers crossed for fast competition...
teldar - Thursday, September 10, 2009 - link
They are actually going to be $450 cards for the initial 5870's with 2GB of GDDR5 1300MHz RAM on board. So they are probably reasonable for what they offer.
imaheadcase - Thursday, September 10, 2009 - link
I've yet to see someone use a monitor setup like that for gaming. It looks terrible with the lines between monitors. Video card manufacturers have been trying that for ages now; Matrox tried and failed at it already. Let it die, AMD. Mission control stuff, a large desktop for apps, sure, that'd be nice. But to game on? Yuck.
Case in point, look at the WoW screenshot.. the screen is cut right in half where the character is located. lol
Now if some monitor company decides to make a monitor that lets you remove the bezel around the edges for a smooth setup, then we can talk.
Havor - Saturday, September 12, 2009 - link
Apparently you never played on a 3 monitor setup. (Yeah, a 6 monitor setup would suck.) I can tell you from personal experience that it's awesome.
I have three 22" widescreens in a Matrox TripleHead2Go setup, so I have no problem with bezels in the middle of my screen.
Yeah, of course the bezels bug me, but just about as much as the roof beams do when I am driving in my car.
And yeah, a 6 monitor setup just doesn't work in most games (except games like C&C); it would be the same as putting a band of duct tape at eye height around the car.
If you want to see for yourself, go over to the gaming site from Matrox to see some screen dumps from, for example, WoW.
Or see how it looks on YouTube.
NFS is also awesome on three monitors.
And with the low prices of monitors, anyone can now have a 3x 22" setup.
Havor - Saturday, September 12, 2009 - link
Darn shitty quote and link system on Anandtech.
Matrox game site: http://www.matrox.com/graphics/surroundgaming/en/h...
WoW site: http://www.matrox.com/graphics/surroundgaming/en/g...
YouTube movies: http://www.youtube.com/results?search_query=triple...
Zingam - Friday, September 11, 2009 - link
What actually counts is that the graphics card is so powerful that it can support the high resolution!!! And this is just the first generation DX11 card. I doubt that we will see DX12 anytime soon, so I guess the third generation DX11 cards will rock the earth and break the walls!
pixel00706 - Friday, September 11, 2009 - link
Yeah, but imagine 6 cheapy data projectors... no lines, massive display :). Problem is an area big enough to project it onto.
Griswold - Friday, September 11, 2009 - link
"Case in point, look at the WoW screenshot.. the screen is cut right in half where the character is located. lol"
I wonder why they didn't use pivot to circumvent that problem. Five displays in a single row instead of two rows, and the center display will not cut the area with the character in half, while at the same time providing the desired height at the cost of some width. That would work for me...
duploxxx - Friday, September 11, 2009 - link
Perhaps have a look at Barco, also one of the preferred ATI/AMD board vendors, and see what they do with screens; they already have fully glued monitors where you can drive 2 GPUs to 1 monitor.
misuspita - Friday, September 11, 2009 - link
Those monitors are not thought out for that purpose. But http://www.seamlessdisplay.com/products_radius320.... these are.
Iketh - Friday, September 11, 2009 - link
Please read the entire article before posting. It's clearly stated that AMD already has one manufacturer lined up to make monitors with very thin bezels, and for FPSs, you would use an odd number of displays so your crosshairs are in the center of a monitor. It's all in the article.
RamarC - Thursday, September 10, 2009 - link
gotta agree. nice tech demo, but wouldn't you rather have a single 60+ inch LED display?
zebrax2 - Thursday, September 10, 2009 - link
How many organs do I have to sell to buy a 60+ inch LED display with a resolution greater than 25+ megapixels? Seriously though, yields would be so low on those.
taltamir - Thursday, September 10, 2009 - link
Building such a monitor is not difficult. Making something run on it is.
By giving us the software, they open the door for a company to design such a monitor.
erple2 - Thursday, September 10, 2009 - link
[quoting block]I've yet to see someone use a monitor setup like that for gaming. It looks terrible with the lines between monitors. Video card manufacturers have been trying that for ages now, Matrox tried and failed at it already. Let it die. AMD.[/quoting block]
I think that what makes this a brilliant thing isn't that you _can_ hook up 6 monitors to a single graphics card, but that it works as if it's just one monitor. I think THAT is the innovation that ATI's done - all other implementations I've seen basically treat each monitor as a separate entity, and have to handle them within the OS separately. This treats it all as a single 7500x3200 resolution display. THAT is the cool part, I think.
I think that one of the things that Matrox did back in the day was some clever tricks to try and get the OS to recognize the displays correctly. Looking back on it, there was this concept of "primary display", "secondary display" and "tertiary display". Not "this is my display that's huge".
Perhaps it's just a "name swap", but I can see that this sort of thing is what you need for abstracting arbitrarily large displays (made of one, giant, single display or multiple smaller displays).
sparkuss - Thursday, September 10, 2009 - link
I could see it working if they just find a different mounting scheme. It seems the only obstacle is the bezels; surely there's technology for that side of the coin as well, to allow the panels to butt up against each other without a bezel.
I have already seen flight sim stations that it won't be a problem for even now because the "cockpit" windows already encompass a bezel.
Course once we have flexible "affordable" OLEDs, maybe even wilder setups will be possible.
SkullOne - Thursday, September 10, 2009 - link
That is true, but you're not seeing the big picture, and yes, that pun was intentional. Current monitors strapped together like in the demo are worthless for things like L4D or Counter-Strike. But what about future monitors? I can see vendors building a single giant monitor out of 3 or 6 or more panels, making it more or less seamless, and marketing it to the extremely hardcore gamer.
Visual - Friday, September 11, 2009 - link
Ok, but then why not just have it be one monitor to begin with? One whole panel will always be better than pieces strapped together even if they are without bezels. One single video port and cable, even if it is "six-link displayport" or some other freaky name, will be better than multiple cables.
Penti - Sunday, September 13, 2009 - link
For games, yes, it's useless, but have you ever seen the gigapixel projects (and the displays for viewing very high resolution images) universities do? This will simplify that brutally; before, they had a PC for every screen or two and rendered with software. Now they can just connect them to a single desktop if 6 displays is all they need, or when running with software rendering cut down the number of PCs by a factor of 3. When you need high resolution you can't just use one big display (or projector), because the resolution will be too low for that stuff. On the other hand, this is just for displaying the multi-monitor capabilities, and people use more than one monitor in the real world, maybe not as one giant screen (they do that too, of course), but that's not the point - that is just one of the modes you can use. And for those who use that mode it's now greatly simplified; before, it was hell getting acceleration and stuff working across both screens at the same time, or stretching some apps across the two screens. This is now no problem, as it's just one single surface for Windows and the applications. But as said, you simply can't build a single display with a resolution of, say, 11520 x 6480.
Havor - Saturday, September 12, 2009 - link
[quote]One single video port and cable, even if it is "six-link displayport" or some other freaky name will be better than multiple cables.[/quote]
Is there a problem I don't see?
It's the same as with RGB cables: you have 3 of them together and only the ends fan out.
Not seeing the problem here
Ratinator - Friday, September 11, 2009 - link
How many monitors have you seen that can do a 10240 x 3200 display? I think that was part of the point of this demo as well: the throughput of the card. Technology ain't there yet. This is where the flexible panels of the future will shine.
Ratinator - Friday, September 11, 2009 - link
Meant 7680x3200
slowside - Friday, September 11, 2009 - link
Cost & manufacturing complexity. It is much, much easier & cheaper for them to build smaller displays. So it is much cheaper and easier for them to put 3 small displays in one package than to make one huge panel with the same resolution.
gamara - Friday, September 11, 2009 - link
And having 6 panels strapped together makes 6 points of failure. Any one of the panels has bad pixels or another bad component and you will have a mess. Replacing one will probably lead to temperature differences, and right next to each other that type of issue will show. I agree that it would need panels without bezels to work right for gaming. It would be annoying as a desktop too. Can you imagine trying to use IE or Firefox with lines in the middle of your web pages?
moriz - Sunday, September 20, 2009 - link
And having one giant panel makes 1 point of failure. If you have a problem, you replace the whole thing. This is MUCH more costly than replacing one panel out of six; indeed, it's probably more costly than replacing all six panels altogether. The technology for such a display has been around for at least a year, if not longer. I recall seeing a three panel display at a tech conference (can't remember which), where a high end computer was driving Crysis on all three displays.
JonnyDough - Monday, September 14, 2009 - link
I see it as a good thing. Panels have an advantage of "if one goes bad, replace it" as opposed to a giant monitor that gets a bad pixel and is irreplaceable due to cost. $150 and I'm good as new. With a $12K monitor, I imagine I'd just keep using it with a few bad pixels in it.
Zoomer - Saturday, September 12, 2009 - link
Solution: 6 * 1080p PROJECTORS!
gospastic - Thursday, September 10, 2009 - link
Enhanced immersion is lost when you have stupid horizontal and vertical bars running across the display area.
Sazar - Thursday, September 10, 2009 - link
I thought Dell was the first to embrace DisplayPort, as it was part of the group that brought it to fruition.
VaultDweller - Tuesday, September 15, 2009 - link
You are correct, but the article says that Apple was the first to embrace mini-DisplayPort (emphasis on the mini).
Totally - Thursday, September 10, 2009 - link
Dell wasn't the first to embrace DisplayPort, it's Dell's freaking baby. I'm guessing Samsung is the first to adopt it on the same scale as Dell.
FullCon Solutions - Wednesday, April 7, 2010 - link
FullCon Solutions, LLC partnered with Duke University & Iowa State to promote the benefits of using 6-sided, fully immersive CAVEs in the AEC & marine industries. Full-scale analysis of 3D models is dramatically improving communication between all stakeholders. Ultimately, this technology allows our clients to design better, clarify expectations sooner and profit from improved decision making. Would this be of any interest to you?
FullCon Solutions, LLC
David Fuller | President
shin0bi272 - Tuesday, November 8, 2011 - link
http://www.engadget.com/2007/08/10/omni-directiona...