134 Comments

  • EzioAs - Monday, July 1, 2013 - link

    Since mid-range cards aren't really that strong yet for 2560x1440(/1600), I'd say expecting them to be strong enough for 4K in 2-3 years is wishful thinking, and at the same time games will be even more demanding. Still, very nice to see the numbers though.

    Thanks Ian.
  • B3an - Tuesday, July 2, 2013 - link

    Yeah, there's no way even dual mid-range cards will handle games at 4K in just 2-3 years.

    I'm also expecting quite a graphics jump and especially a big VRAM usage jump because of the new consoles and the good amount of memory they have. We're going to start seeing very high res textures. So mid-range cards will also start needing 6GB+ for PC games, especially at 4K res in 2-3 years' time.
  • MordeaniisChaos - Sunday, July 7, 2013 - link

    Next year, video cards will double in density. 3 years from now, that will be doubled again. In other words, the systems in 3 years will be hugely powerful compared to what we have now, much less the nearly year-old Titan, which again, you are comparing to a new video card that will be twice as dense as a video card that is twice as dense as the Titan, approximately. Think of what card was released 4 years before the 680, and look at how they stack up performance-wise.
    Also keep in mind that VRAM is a big part of all of this, and all signs point to the next generation of cards having more VRAM standard.
  • turbosloth - Wednesday, July 24, 2013 - link

    Um, the GTX Titan isn't anywhere near a year old. I don't think that this meteoric rise in GPU power will happen quite as fast as you're expecting/hoping for, especially in reasonably priced consumer products.
  • Jergos - Saturday, November 23, 2013 - link

    MordeaniisChaos is citing Moore's Law http://en.wikipedia.org/wiki/Moore's_law. His post is entirely accurate.

    The main problem with the future of Moore's Law is the actual physical boundaries. Eventually we will get to the atomic size limit of transistors, which means that some serious innovation will have to occur to tiptoe around that boundary. We don't know when we'll hit that limit, but it'll happen. Until then MordeaniisChaos will be correct.
  • HSR47 - Saturday, January 25, 2014 - link

    That's an entirely inaccurate reading of Moore's law:

    Moore's law relates to the increase in transistor count, and transistor count does NOT track 1:1 with overall performance.
  • zerockslol - Thursday, May 1, 2014 - link

    The first Titan was released in Feb 2013; it's now May 2014, which is more than a year later.
  • Thatguy97 - Sunday, December 18, 2016 - link

    Lol
  • airmantharp - Tuesday, July 2, 2013 - link

    I'd think that mid-range cards would actually do a lot better at 4k if they simply all had their VRAM doubled- especially the Nvidia cards with 2GB/GPU. An 8GB GTX690 should be pretty potent at 4k if settings are balanced out a little.

    Note that I appreciate Ian pushing the sliders all the way to the right, to set a good performance baseline for 4K. I'm sure they'll have much more reasonable reviews when 4K60 panels in desktop form factors start dipping into the enthusiast-friendly <$1000 bracket in the next year or so.
  • ehpexs - Wednesday, July 3, 2013 - link

    Even a decent card like a 7950 has trouble at high resolutions. I run triple 2560x1440 27s. For one display it's fine, but for three, we're talking medium settings in Crysis 3 and BF3 with the cards being badly VRAM capped. That was one thing I felt was missing in this review: VRAM usage.
  • MordeaniisChaos - Sunday, July 7, 2013 - link

    It would be nice to see VRAM usage, agreed. In fact the whole thing could be more in-depth. Would be nice to see all of the performance and usage levels through a benchmark or something.
  • TallestJon96 - Friday, March 24, 2017 - link

    Well, reporting back now in 2017, and yeah, 4k is reserved for high end cards, we have only just gotten to mainstream 1440p.

    Interesting to look back on this one
  • nakquada - Monday, July 1, 2013 - link

    Why on earth would you want to cripple your framerate by using any form of SuperSampling/Anti-aliasing at that resolution?
  • aetherzero - Monday, July 1, 2013 - link

    This is an excellent question. Isn't the point of these rendering features to eliminate artifacts that are a result of the pixellation and line breaking at lower resolutions? Shouldn't these artifacts be inherently eliminated by the higher resolution?
  • xTRICKYxx - Monday, July 1, 2013 - link

    Apparently, antialiasing still has a place at 4K because aliasing is still visible at that PPI.
  • smartypnt4 - Monday, July 1, 2013 - link

    You need some AA at any resolution for a game to look its best. There are patterns that the human eye can pick out relatively easily that are still present on high-PPI/DPI monitors.

    Example: go take a look at Galaxy on Fire 2 running on the iPad 4. I know for a fact it runs at native resolution, but there are times when you notice aliasing going on, or at the very least some weird jagged edges, especially when looking at a ship against the black background of space.

    That said, SSAA is COMPLETELY unnecessary for this resolution. It'd be nice to get some analysis of what kind of settings we can expect out of games for the near-ish future at 4K. That's dependent on getting one of the reviewers a 4K TV to test on, which should be easy with those Seiki sets available.
  • IanCutress - Monday, July 1, 2013 - link

    I'd love one for testing, though shipping to the UK is a problem and I don't have space for a 50". Shame they don't sell locally.
  • smartypnt4 - Monday, July 1, 2013 - link

    That's true. I keep forgetting you're on the other side of the pond. I'm sure a solution will present itself, though. It's not like anyone really needs these in the next 6 months or so.
  • mapesdhs - Tuesday, July 2, 2013 - link


    Ian, just a thought, given the level of GPU power we're talking about here, and what future displays might require, perhaps European power outlets will just about be able to cope given their 3kW+ typical limit, but what about US power outlets? Despite the inevitable advances in lower voltage GPUs, etc., is it possible we'll see ever more heat & power problems in the future at the top-end of PC gaming? Even if it's not bad enough to be a problem at the wall socket (at least not for Europeans), what about venting the heat from out of a case? Makes me wonder whether the ATX standard itself is out of date; perhaps it needs to move up a size level.

    Btw, whereabouts are you in the UK?

    Ian.
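
    For reference, a back-of-the-envelope sketch of the outlet headroom question above, using assumed typical circuit figures (a 230V/13A fused UK plug versus a 120V/15A US branch circuit with the usual 80% continuous-load derating); these numbers are illustrative, not from the article:

    ```python
    def outlet_headroom_w(volts, amps, derate=1.0):
        """Approximate continuous power available from a single wall outlet."""
        return volts * amps * derate

    print(outlet_headroom_w(230, 13))        # ~2990 W from a UK 13 A fused plug
    print(outlet_headroom_w(120, 15, 0.8))   # ~1440 W continuous on a US 15 A circuit
    ```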
  • HSR47 - Saturday, January 25, 2014 - link

    "Shouldn't these artifacts should be inherently eliminated by the higher resolution?"

    Not really, no.

    The issue is that you're trying to render three dimensional objects on a two dimensional plane: that means that things get stretched and contorted. Also, these features are useful for blurring the edges between different rendered objects.
  • r3loaded - Monday, July 1, 2013 - link

    That's what I was wondering too. At such a high resolution and a high enough pixel pitch, any form of anti-aliasing is just pointless. 2x SSAA is practically equivalent to running the cards at 8K, a resolution which is still a decade or two away from being seen outside the lab.
  • nathanddrews - Monday, July 1, 2013 - link

    While we might have to wait a little longer in the USA, NHK Japan is on track to begin 8K (Super Hi-Vision) satellite broadcasts in 2016. The codec and compression methods have been finalized, commercial equipment is being prepared for launch... 8K will be here sooner than you think. I never would have thought we'd have 4K TV/monitors for $699 already, but we do.

    I've said it before, but GPU makers really need to get on the ball quick. Hopefully mainstream Volta and Pirates will be up to the task...
  • Vi0cT - Monday, July 1, 2013 - link

    Error, in 2016 Japan will start the trials for 8K, but it's not going to be rolled out until 2020 (if they get the Olympics). You can read the official paper from the Ministry of Internal Affairs and Communications.
  • Vi0cT - Monday, July 1, 2013 - link

    Add: The last comment may have come across as hostile, but there was no hostility intended :), so if you misunderstood it please excuse me.
  • nathanddrews - Monday, July 1, 2013 - link

    It's cool, I'm not butthurt.
  • Vi0cT - Monday, July 1, 2013 - link

    By 2020 Japan will start 8K broadcasts, and 4K broadcasts will start next year, so it's not that far.
  • mapesdhs - Tuesday, July 2, 2013 - link


    "... outside the lab"; not so, many movie studios are already editing in 8K, and some are
    preparing the way to cope with 16K material.

    Ian.
  • nathanddrews - Wednesday, July 3, 2013 - link

    I don't believe anyone is *editing* in 8K.

    Most movies shot on film (35/70MM) pre-2000 are scanned at 2K or 4K (a select few have been scanned at 8K), then sampled down to 1080p for Blu-ray with some to 4K for the new 4K formats rolling out right now. Those movies were edited on film using traditional methods. Any CGI was composited on the film. Those movies have an advantage of always being future proof as one can always scan at a higher resolution to get more fidelity. 35MM tops out at about 8K in terms of grain "resolution", while 65/70MM is good for 16K. Results vary by film stock and quality of elements.

    Most movies shot on film after the DI (digital intermediate) became mainstream had ALL of their editing (final color grading, contrast, CGI, etc.) completed in a 2K environment no matter what resolution the film was scanned at. "O Brother Where Art Thou" and the "Lord of the Rings" trilogy are examples of films that will never be higher native resolution than 2K. The same goes for digitally shot movies using the RED cameras. Even though they shot at 4K or 5K, the final editing and mastering was a 2K DI. "Attack of the Clones" was shot entirely in digital 1080p and will never look better than it does on Blu-ray. Only in the past couple of years has a 4K DI become feasible for production.

    10 years from now, the only movies that will fully realize 8K will be older movies shot and edited entirely on film or movies shot and edited in 8K digital. We'll have this gap of films between about 1998 and 2016 that will all be forever stuck at 1080p, 2K, or 4K.
  • purerice - Sunday, July 21, 2013 - link

    nathanddrews, thank you for your comment. I asked a relative who helps design movie theater projectors about that very issue. The answer is the industry has no answer, which you confirm. Most of the early digital films will depend on upscaling. On the plus side, in 20 years we should be ready for another LOTR remake anyway :)
  • Streetwind - Monday, July 1, 2013 - link

    That was the same thought that occurred to me as well. I'd love to see how the results would have been without antialiasing. As it stands I barely use it on my 1200p monitor (though admittedly a big reason for that is because of the glorified fullscreen blur filter they generally try to sell you as "antialiasing" nowadays).

    Too bad Ian didn't have more time to test.
  • IanCutress - Monday, July 1, 2013 - link

    I wish I did as well! Perhaps SSAA was properly overkill at that resolution; with more testing I would obviously like to see the differences. Though it may be a while before I can get a 4K in to test.
  • mavere - Monday, July 1, 2013 - link

    Retina iOS games running native with 2x MSAA will still display some jaggies here and there. Maybe 4x MSAA?
  • mapesdhs - Tuesday, July 2, 2013 - link


    Ian, there are kind of a few models available in the UK, if you have the moolah. ;)

    http://www.thruput.co.uk/products/monitors/56-8-Me...
    http://s3dpostproduction.co.uk/mitsubishi-56p-qf60...

    As for pricing though, interesting to contrast this:

    http://www.tnpbroadcast.co.uk/Product/TVLogic_LUM-...

    with this:

    http://www.gizmodo.co.uk/2013/04/seiki-50-inch-4k-...

    :D

    Another headline I found:

    http://www.sharp.co.uk/cps/rde/xchg/gb/hs.xsl/-/ht...

    Ian.

    PS. Alternative: use a splitter to feed the output into 4 separate HD displays? Or would that mess up the GPU's ability to detect that it's able to drive a 3840x2160 display?
  • makerofthegames - Monday, July 1, 2013 - link

    I game on a 1440p monitor, and antialiasing is still useful at that resolution. It's just not as important as at lower resolutions - on my old 1600x900, anything short of 8x AA seemed jaggy, but at 2560x1440, 2xMSAA is generally fine, maybe 4xMSAA if there's horsepower for it. In some games I actually prefer to use just FXAA, instead putting the horsepower behind some other setting. At 2160p, I would expect 2xMSAA to still be somewhat useful, although more than that is probably overkill (and 8xSSAA is already overkill at 1440p).

    Of course, overkill is fun sometimes. I actually run some older games (like UT3) with 8xSSAA, just because I can still get 80+ FPS with everything maxed out.
  • nathanddrews - Monday, July 1, 2013 - link

    I was thinking the same thing. At that PPI, don't you essentially get AA for free? Any chance you collected FPS with AA disabled? I would happily suffer through some very minor artifacts to get a smoother 4K experience. Not unlike when I used to play games at 1920x1200 without AA on my old Dell 17" laptop (Core Duo, 7950Go).

    While I know it's a different scenario than with fixed pixel displays, I currently do that with my FW900. I run it at 2560x1600@75Hz with some games, but I run them without AA and can't detect (most) artifacts.

    I know AT wants to push the bleeding edge here and that this is just a quick look, but $700-1200 for a 4K monitor (Seiki) isn't exactly in the same ultra-enthusiast territory as 4x Titans. That said, it would be nice to see some benchmarks at 4K with medium-to-high settings without AA for the less-wealthy enthusiasts.

    Also, does this monitor also operate well at 1080p@120fps?
  • chizow - Monday, July 1, 2013 - link

    Yeah 4K is effectively 1080p with 2x2 SSAA, or 4xSSAA. Does seem overkill, but even at higher resolutions, AA can be beneficial to reduce shimmering by reducing pixel transitions in motion.
  • iamezza - Monday, July 1, 2013 - link

    +1
  • airmantharp - Tuesday, July 2, 2013 - link

    My initial thought was that Ian was trying to set a long-term baseline; using these settings in the future, he'd be able to show how far future cards have come relative to the (future) old beasts.
  • BMNify - Sunday, July 7, 2013 - link

    Because it's there, and it clearly shows that at only 4K these GFX cores need a lot more work to come up to standard before people migrate to this standard. And note I said "at only 4K"....

    The reasoning is simple: the UK, Japanese and Chinese PBS are all working to implement real 8K over-the-air broadcasting technology today, and have been for several years now, with expected commercial retail availability by 2020... so this near-4K/2K standard is only the start of the Ultra HD transition, and there's not much relative time to get computer GFX up to speed at an affordable price point to make it a generic mid-range option by 2020, never mind 2023.
  • MordeaniisChaos - Sunday, July 7, 2013 - link

    Considering this is testing 4K resolutions, it's not really judging "real world" environments. So I think the SSAA and running it with everything pumped up is a good way to go. Especially considering we are on the cusp of games getting much better looking, so in a way it could very well be appropriate. We won't be playing games in 4K for quite some time just because the screens aren't really out there for the every man. By then the games will look much better, much denser, with lots of fancy new rendering techniques.
  • MordeaniisChaos - Sunday, July 7, 2013 - link

    Actually, some sort of comparison of "jaggedness" on a 4k screen vs a 1080 screen would be really nice to see. I'm curious just how much better the PPI will actually make edges look. I'm sure it won't eliminate jagged edges, but surely more than 2xSomethingAA would be enough. Only logical considering a lot of games don't benefit from 8x over 4xMSAA in any meaningful way. And something like 2xTXAA would probably look fantastic. The resolution helps with the blurring issue (which I never minded or really noticed too much, I was happy to have clean, smooth, non-crawling edges), and 2xTXAA should be pretty easy to run. Hell, and I can't believe I'm saying this, but maybe even... FXAA won't look terrible at those resolu-... *barfs* nope, can't say it.
  • designerfx - Monday, July 1, 2013 - link

    Wait, so an HD 7990 6GB beats out a single Titan?

    That's quite interesting. I wonder what would have happened with 2 HD 7990 6GBs?
  • nakquada - Monday, July 1, 2013 - link

    A 7990 is effectively somewhere between 2 x 7970 and 2 x 7970 GHz Edition. So a single 7990 would be much faster than a Titan. They are, however, in no way in competition, as AMD doesn't have a Titan-class single GPU. Its 7990 is head-to-head with a GTX 690 (which itself is effectively 2 x 680s).
  • Bandanawedgie - Monday, July 1, 2013 - link

    Perhaps 4x 7970s would be better, as Tom's Hardware struggled to get 2 to run without heat issues.
  • mapesdhs - Tuesday, July 2, 2013 - link


    It isn't a 7990 6GB. Each GPU has 3GB, so that's the max VRAM the host system can use. Adding the VRAM of each GPU is misleading marketing. Find a gaming test that runs on a Titan using (say) 4GB VRAM (eg. heavily modded Skyrim at mega res and AA), try it on a 7990, it'll tank.

    Ian.
  • Civver27 - Monday, July 1, 2013 - link

    Interesting, but how much difference would it have made to the frame rate to run the CPU at stock on the 4-GPU setup and the single-GPU setup? That way we would all know to what degree clocking your CPU to 5GHz really benefits your frame rates at these resolutions, which only cost the price of 4 x 1080p monitors.
  • EzioAs - Monday, July 1, 2013 - link

    I think at this high resolution, we won't (or shouldn't) be seeing a CPU bottleneck, but rather a GPU bottleneck. So, running the CPU at stock speeds or overclocked won't (shouldn't) make a difference. If there is one, it's probably very minuscule.
  • EzioAs - Monday, July 1, 2013 - link

    Actually, looking at the numbers back, Metro 2033 is probably the only title that has a CPU bottleneck with 4xTitan, although that could also be quad-SLI not scaling properly.
  • Kevin G - Monday, July 1, 2013 - link

    I'd expect to see a difference between PCI-E 2.0 and 3.0 as well as 8x and 16x lane configurations since SLI/CrossFire is really helpful at 4k. This would place socket 2011 chips at an advantage here.
  • jrs77 - Monday, July 1, 2013 - link

    I've said it before, and I'll say it again... 4k resolution is total bollocks and offers no improvements over the current 1080p standard.

    High density screens are for mobile devices, where you look at them from 30cm distance. And for this area it's already hitting the same wall as the 1080p resolution for TVs.

    Usually you sit some 70cm away from your PC screen, or in a more ergonomic way, the length of your arm. At this distance though, you won't really see single pixels anymore on a 22-24" 1080p screen, especially when talking about movies or games instead of still images.
    For a TV in the living room, the rule of thumb is 1m distance for every 15". So you'll sit at some 3m distance when looking at a 40" TV. At this distance you won't notice too much difference between 1080p and 4K. So the only thing a 4K resolution really does is unnecessarily increase bandwidth or storage space.

    For games... yeah... look at the results there. A quad-Titan setup to get 60FPS in Metro 2033 or Sleeping Dogs. This is totally the wrong direction imho. We want hardware that can drive 4K resolutions, but at the same time we all know that energy isn't getting any cheaper. And to have a PC that sucks 1kW while playing a video game :cough: sorry, but that's just stupid.

    Leave it at 1080p and get me hardware that allows me to play games like Metro at max settings with only 100-150W for the complete system, i.e. a 35W TDP CPU + 75W TDP GPU (75W is the maximum power draw over PCIe x16, so no extra PCIe power connector).
  • dishayu - Monday, July 1, 2013 - link

    Interesting perspective. Until about 2 months ago I would have supported your opinion without reservations. But ever since I bought my HTC One, I'm not so sure. Going from 720p screens to 1080p screens on mobile was supposed to be a pointless exercise as well, only increasing the compute requirements and not giving anything back in return. But it DOES genuinely look a LOT better than 720p screens of the same size, so I'll reserve my opinion until I get to see a 4K desktop panel, around 27-30", myself.

    For me, I believe the ideal screen would be a 27 inch 4K IPS/OLED panel at 120Hz.
  • Oxford Guy - Wednesday, July 3, 2013 - link

    A-MVA has better contrast than IPS.
  • EzioAs - Monday, July 1, 2013 - link

    Just because you don't see the benefit of higher resolutions and high PPI doesn't mean the rest of us have to follow. A few years later, there's a chance you'll get what you want (a 75W GPU that can max Metro). Of course there's an even higher chance that at the same time, games released will have graphics and physics that make Metro look more like Super Mario (okay, maybe that's going a bit overboard, but you get the point).
  • jrs77 - Monday, July 1, 2013 - link

    It's not about personal preferences, but about the physiology of the human eye.

    Higher PPI is only beneficial at closer distances, hence why mobile devices benefit from it. The farther you get away from the screen (PC usually ~80cm and TV 3 or more meters), however, the less difference you'll notice and the only thing that increases is the energy needed to drive the hardware.
  • ATWindsor - Monday, July 1, 2013 - link

    Needed resolution has nothing to do with distance (at least not directly), and everything to do with what field of view the screen covers in your vision, and that field of view is huge for a 30 inch monitor in regular settings. We can see details down to 8-10 arc-seconds in some circumstances, even down to 0.5 arc-seconds in extreme cases. We need well above 1080p for that, even above 4K. Even 0.5 to 1 arc-minute for pure pixel-resolving in non-special cases needs more than 1080p, and that's with 20/20 vision, which large portions of the public exceed.
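
    A quick sketch of that field-of-view arithmetic, assuming an illustrative ~64.6cm-wide 30" monitor viewed from 70cm (figures chosen for the example, not taken from the comment):

    ```python
    import math

    def pixels_needed(width_cm, distance_cm, acuity_arcmin):
        """Horizontal pixels needed so each pixel subtends `acuity_arcmin`
        arc-minutes across the screen's horizontal field of view."""
        fov_deg = 2 * math.degrees(math.atan((width_cm / 2) / distance_cm))
        return fov_deg * 60 / acuity_arcmin

    for acuity in (1.0, 0.5):   # 1 arc-minute (~20/20), then a finer 0.5 arc-minute case
        print(acuity, round(pixels_needed(64.6, 70, acuity)))
    # ~2970 px at 1 arc-minute and ~5950 px at 0.5 arc-minutes: already beyond 1920,
    # and beyond 3840 in the finer case, before even considering arc-second detail
    ```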
  • solipsism - Monday, July 1, 2013 - link

    Adding to ATWindsor's comment, as we move to larger and larger displays, either as a "TV set" in the HEC or a computer display on our desk, the "Retina" effect will go away.

    Right now a 50" 1080p "TV" and a 2560x1440 27" display may place the viewer far enough away that they can't see the pixels… but displays will continue to get larger.

    For those aforementioned resolutions and display sizes it's 6.5 and 2.6 feet to get the "Retina" effect with 20/20 (6/6) vision. That sounds to me like we are pretty much on the cusp of this, especially with the HEC display. You can buy a 65" 1080p for under $1500, and yet those doing so to replace a smaller 1080p HDTV might not be getting a better overall experience. You have to sit at least 8.45 feet away to get the effect.

    I'm glad that display technology, GPUs, and even codecs (H.265) are all lining up nicely to tackle this issue.

    PS: Note that 4K is exactly 3x720p and 2x1080p in each linear dimension.
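
    Those distances follow from the usual 1-arc-minute-per-pixel "Retina" threshold; a minimal sketch of the calculation (the 16:9 aspect assumption and the function name are just for illustration):

    ```python
    import math

    def retina_distance_ft(diag_in, horiz_px, aspect=(16, 9), arcmin=1.0):
        """Distance (feet) at which one pixel subtends `arcmin` arc-minutes."""
        w, h = aspect
        width_in = diag_in * w / math.hypot(w, h)   # screen width from the diagonal
        pitch_in = width_in / horiz_px              # size of one pixel
        return pitch_in / math.tan(math.radians(arcmin / 60)) / 12

    print(round(retina_distance_ft(50, 1920), 1))   # ~6.5 ft for a 50" 1080p TV
    print(round(retina_distance_ft(27, 2560), 1))   # ~2.6 ft for a 27" 1440p monitor
    print(round(retina_distance_ft(65, 1920), 2))   # ~8.45 ft for a 65" 1080p TV
    ```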
  • mapesdhs - Tuesday, July 2, 2013 - link


    jrs77, regret to say you're wrong wrt human eye visual fidelity. Humans can resolve far greater detail than the examples you give as being supposedly the most required for anyone, though of course it varies between individuals. However, just because one person can't tell the difference, I can guarantee others could.

    Rough approximation of human vision is about 8000 pixels across at near distance, 115K pix across at far horizon. We can also resolve a much greater number of shades of each colour than are currently shown on today's screens (which is why SGI's old high-end systems supported 12bits/channel as far back as the early 1990s, because it was regarded as essential for high fidelity visual simulation, especially for night time scenarios, sunrise/sunset/etc.)

    Indeed, some TV companies have suggested the industry should skip 4K technology completely and just jump straight to 8K once the tech is ready, though it seems plenty are willing to hop on the 4K bandwagon as soon as possible whatever happens.

    NB: Caveat to the above: how the eye resolves detail (including brightness, contrast, movement, etc.) is of course not uniform across one's field of vision. Imagine a future GPU with eye tracking which in real-time can adjust the detail across the screen based on where one is looking, reducing the load in parts of the screen that are feeding to our peripheral vision, focusing on movement & contrast in those areas instead of colour and pixel density. That would more closely match how our eyes work, reducing the GPU power required to render each frame. Some ways off though I expect. Even without eye tracking (which is already available in other products), one could by default focus the most detail in the central portion of the display.

    Ian.
  • Oxford Guy - Wednesday, July 3, 2013 - link

    The gamut available to the eye is much larger than the miniscule sRGB color space that still dominates computing today.
  • Zorkwiz - Monday, July 1, 2013 - link

    If you search for the famous "Viewing Distance When Resolution Becomes Noticeable" chart, you will see that at a 30-31" screen size, you have to be sitting about 4 feet back for 4k not to matter vs 1080p for someone with good vision. The full benefit of 4k on a 30" screen is visible at approximately 2 feet, which is pretty much exactly how far back I sit from the screen, so I think a 30" 4K panel would be just about perfect.
  • xTRICKYxx - Monday, July 1, 2013 - link

    Now we need to wait for 60hz/120hz 4k panels to become affordable....
  • Kjella - Monday, July 1, 2013 - link

    That's what I found out too. I could use a 30"/4K monitor, but my 60"/1080p TV is fine for my couch distance - now if they sold 100-120" 4K TVs or 4K projectors priced for mortals it would be different, else I'd have to get a lot closer.
  • Dribble - Monday, July 1, 2013 - link

    In the end it'll be worth it, but res is only one part of the package for a good gaming monitor - you need low input lag, 120hz refresh, good colours, etc. Are there any 4K monitors with 60hz refresh even, let alone 120hz (most are 30hz right now)?

    So right now you have to spend a fortune on something that ticks the resolution box lots of times, but has a lot of x's elsewhere.

    You'd be best with a 3*1080p @ 120hz surround with lightboost for fast paced games.
  • JeBarr - Monday, July 1, 2013 - link

    +1

    I'd take a single 23 or 24 inch 1080p 120Hz+ TN panel over a 4k 30/60Hz any and every day.
  • This Guy - Monday, July 1, 2013 - link

    I can see pixels on a 27" 2560*1440 monitor at around a metre. Full stops are still made up of just six pixels at my preferred text size and hence look blocky.

    Those rules of thumb are silly. Why would we have movie theaters that clearly break those rules unless people enjoyed massive screens?

    Next, these tests are at max settings, including max AA. Dual 770s can average 44fps in Sleeping Dogs at 7680*1440 (33% more pixels than 4K) with 2x AA and everything else maxed. The GPUs + CPU are rated for around 550W. So yes, 1kW for 4K is stupid.

    Lastly, if you want a 35W CPU and 75W GPU, try a laptop. Intel has been selling 35W quad cores that turbo to around 3GHz for the last three generations. ATI and Nvidia both have compelling products that will easily push 1080p at max if you go easy on the AA. Best part is you get an extra screen, UPS and a small form factor.
  • xTRICKYxx - Monday, July 1, 2013 - link

    Don't worry, we keep getting better performance per watt every year. It's a no-brainer that we will be able to have notebooks playing 4K games natively and use less power than they do now.

    Current GPU technology does not work well with 4K. But it will.
  • douglord - Monday, July 1, 2013 - link

    Of course you think 1080p is fine sitting 3m from a 40" screen. YOU CAN'T EVEN SEE YOUR TV!!! :P
    I've got less than 10' between me and a 65", and I'm installing a 120" drop-down screen in front of that. I have a 30" on my desktop and would go bigger if I could. 4K is not BS, but you need 4K content, disks that can store it uncompressed, players and screens. No upsampling etc...

    The delta for true 4K is almost as big as DVD to Blu-ray or cable to DVD.
  • Gigaplex - Tuesday, July 2, 2013 - link

    Why do you need to store it uncompressed? We have lossless compression for a reason.
  • tackle70 - Tuesday, July 2, 2013 - link

    Meh... 1080p is for people who sit far away from their screen and/or for people with lousy eyesight. I sit about 3 feet away from my 27" 1440p monitor and I can see the pixels quite easily. I'd love a higher resolution screen!

    For TVs, it's pointless because there's no 4k content for a TV and there won't be for a long time. But for a monitor attached to a high end PC, it's great!
  • CaedenV - Wednesday, July 3, 2013 - link

    well, if you insist on using a small 22-24" monitor, then I would have to agree with you that 4K would be overkill; But nobody is going to buy a 4K 22" monitor for their computer (though in time I expect 4K 10" tablets as an extension of 5" 1080p phones). We are going to be buying 4K monitors for our computers in the 35-45" range, and still sitting just as close to the monitor as we do currently. At those sizes and distances a 45" 4K monitor is going to have almost the exact same pixel density as your 22" 1080p screen. But the 45" screen will be huge and immersive, while your 22" screen is on the small side even by today's standards.

    I am currently staring at a 45" cardboard cutout which is sitting right behind my 28" monitor and it fits my field of vision quite nicely. It is big, and I am probably going to get a tan just from turning it on, but someday in the next few years that cardboard cutout will be a 4K monitor, and I am going to be a very happy nerd.

    For the living room 4K is going to be huge. As you mentioned, 3m distance equals a 45-50" 1080p TV. 4K has a similar rule, and you just double the size of the TV. At 3m you would technically want a 90-100" TV. The pixel density is the same, but the TV fills more of your vision out of sheer size. 90" is very large... but it is not so large that it is not going to fit in a house (though transporting it there may be a trick).

    But when you start talking about 8K, then the size doubles again. Meaning that the optimal size for an 8K set at 3m would be 180"... which is enormous! That is 9 feet away, but with a diagonal size of 15 feet! We are talking about a 7.5 foot tall screen that is 13 feet wide! That would not even fit in my front door, and the screen would be as tall as the walls in my home before you even add height for a stand and bezel.
    So when you start talking about 8K not being practical, then I will believe you, because it plainly isn't practical. I can even say with some certainty that I will probably never own a TV larger than 140" even if it was affordable, simply due to size constraints in my home. I may at some point own an 8K TV or monitor, but I am under no illusion thinking that I am going to see any great improvement between 4K and 8K for screen sizes that fit my field of vision. But if it becomes standard and affordable you are not going to hear me belly-aching about how "4K is good enough, and 8K brings nothing to the table but pain and misery". Instead I will get my eyes augmented so that I can appreciate the glory of 16K screens...

    Lastly, for the game results, keep in mind that these games were played with max settings, including max AA/AF turned way up, and Dirt was already playable with a single high-end GPU. At these resolutions AA and AF are essentially not needed (or maybe at a 2x setting?). This is not going to make these games all of a sudden playable for my GTX570... but a GTX1070 may be able to play more intense games at these resolutions with low AA at decent settings without requiring me to get a 2nd mortgage.
    Or put another way: 10 years ago we were playing on the PS2 which could not even play full res standard def games at 30fps. GTA Vice City, and Tony Hawk's Underground were cutting edge games of the time, and they look absolutely terrible by today's standards! Back then nobody was imagining us playing games at 1080p at 120 fps in 3D with passable physics and realistic textures while being on the verge of realistic lighting... But today you can do all of that, and while it requires decent hardware, it does not require a 4 Titan setup to achieve.

    Point is that we are still a year and a half away from general availability and a wide selection of 4K screens. And another 2 years after that before the price will hit a point where they start selling in real volume and a decent amount of 4K content becomes available. That puts us 3.5 years in the future which will be right on track for high end setups to be playing maxed out settings at nearly 60fps on these screens. Another 2 years after that (5 years from now) and mainstream cards will be able to manage these resolutions just fine. After that it is all gravy.
    Rome was not built in a day, and moving to a new resolution standard does not happen overnight. If you still like your 1080p screen 5 years from now, then buy a next-gen console when they come out and enjoy it! They will not be playing 4K games for another 9 years. But the PC will be playing at 4K resolution in 3-5 years, and we will pay extra to do it, but we will enjoy it. If nothing else, hitting the 4K 'maximum' will finally put an end to the chasing of graphics at the cost of all else, and we will start to see a real focus on storytelling, plot, and acting.
  • StealthGhost - Monday, July 1, 2013 - link

    They call it "4k" because that's how much you have to spend on GPUs to get playable FPS!

    And I thought I had it rough at 1440p
  • JeBarr - Monday, July 1, 2013 - link

    HAHA! good one

    multi monitor gamers already know the drill though :D
  • NLPsajeeth - Monday, July 1, 2013 - link

    The ASUS 4K60 monitor will not work with Nvidia cards for the time being. The monitor is being driven by two Rx chips via two HDMI cables or a single DisplayPort cable using MST. This means GPUs must support a 2x1 monitor setup to drive the ASUS 4K60. While Eyefinity supports 2x1, Nvidia Surround only supports 1x1 and 3x1, making this monitor useless for gaming.

    Nvidia has indicated that they will be including such support in R325+ but it still has not appeared yet in the R326 drivers that are available.
  • Kevin G - Monday, July 1, 2013 - link

    Weird. I didn't think it'd have such a handicap but looked it up in the manual. And yes, the Asus display will appear as two 1920 x 2160 displays using DP 1.2 with a 60 Hz refresh rate. This could allow for some funky things like independent rotation, resolution and scaling in Windows, even though both logical monitors are part of the same physical display.

    It does appear as one 3840 x 2160 resolution display when the refresh rate is set to 30 Hz over DP 1.2.
  • jjj - Monday, July 1, 2013 - link

    First time AnandTech tests at 4K, you should frame the article.
    The good thing about GPUs is that 20nm is about to arrive, then the pseudo 16/14nm soon after, and if they start to put the RAM on a silicon interposer they gain a lot of memory bandwidth too.
    Also, the 39-inch Seiki is not just announced, it can be pre-ordered at Sears and they list the release date as 08/05/13 (fingers crossed for a review and maybe them commenting more on where they are on using 2x HDMI inputs).
  • IanCutress - Monday, July 1, 2013 - link

    After testing, I know I'll want at least a 4K60 for gaming. 4K30 isn't going to cut it, I'd rather have 1080p120 or 1440p60.
  • dishayu - Monday, July 1, 2013 - link

    Or 1440p120, perhaps. Which can be achieved with most X-Star and QNIX panels, both of which sell for under $300 shipped. ;)
  • xTRICKYxx - Monday, July 1, 2013 - link

    I've been so tempted to buy a cheap Korean 1440p panel. But then I realize my GPU will not get good framerates.
  • dishayu - Tuesday, July 2, 2013 - link

    I don't really play games much. My HD6770 drives the panel at 100Hz+ and plays my casual games (UT3, Counter-Strike, Dota) just fine. And everything else looks pretty. :D
  • jjj - Monday, July 1, 2013 - link

    Yeah of course, even if the GPUs most have won't be enough to push 60 FPS. But it remains to be seen if 4K monitors will be the preferred gaming display. I just want a bigger screen with decent DPI and price for everything, and 4K will push prices down, but for gaming I do wonder if Oculus Rift and other similar products won't rule the market soon. Even for non-gaming use, glasses could take over; they can be a few times cheaper than big screens.
  • ludikraut - Monday, July 1, 2013 - link

    Having seen a Seiki 4K30 display, I definitely agree that 4K30 isn't going to cut it. It looked awful. For now I'll be waiting for a reasonably priced 55-60" 4K60 (or higher) screen before I replace my existing 1080P display.

    l8r)
  • mapesdhs - Tuesday, July 2, 2013 - link


    Ian, just a thought, did you use the default SLI mode when testing 3/4-way Titan? When testing 2/3-way 580s, I found the default was in some cases nowhere near as fast as selecting some other mode manually, eg. AFR2 boosted one of the 3DMark13 tests by 35% (though at the same time it dropped another by 10%).

    Ian.
  • BMNify - Sunday, July 7, 2013 - link

    "The good thing about GPUs is that 20nm is about to arrive,then the pseudo 16/14nm soon after and if they start to put the RAM on a silicon interposer they gain a lot of memory bw too."

    You don't really want an old silicon interposer for the next generations though; you really need the current and later "Wide IO(2)" with its generic 512-bit bus in a 4x128 configuration and terabits/s throughput interconnect, especially for the real 4K+/8K stuff in the pipeline now....

    Ask yourself this: how long is the real development timescale from one generation of CPU and GFX chip to the next, given the expected real 8K over-the-air retail devices by 2020ish...

    That's only two GFX generations away, maybe 3 at best... time's running out if they don't already have acceptable-throughput 8K + many audio channels (was it 128 channels or some such, I forget now) capable silicon in the lab right now.
  • SetiroN - Monday, July 1, 2013 - link

    No frame metering,

    doesn't matter.
  • MrSpadge - Monday, July 1, 2013 - link

    Do you mean you'd prefer perfectly steady 10 fps over average 60 fps with dips down to 30 fps, i.e. very inconsistent? Otherwise.. the test does matter.
  • chizow - Monday, July 1, 2013 - link

    Interesting results, thanks for posting this. Would certainly like to see some of the blanks filled in with card configs and SLI/CF, as well as some settings with AA disabled.

    Either way, it does look like we are in for a big spike in GPU requirements between higher resolutions, 3D and the next-gen consoles. Probably what Nvidia and AMD have been waiting for to spur sales.
  • tackle70 - Monday, July 1, 2013 - link

    Good to know my two overclocked 780s are ready for this... I don't need AA at such high resolution, and I could always add a third 780 down the road. Now I just hope someone puts out a bare bones 4k panel with DisplayPort for $1K or so
  • This Guy - Monday, July 1, 2013 - link

    Use 3x X-star or QNIX monitors with VESA stands, mount in portrait and bingo, 4320*2560 @ 100-135Hz for around 1k.

    If you want to minimise bezels remove the panel and electronics from the case and build a new wooden one for them. The bezel is about twice as thick as the metal edge around the panel so you can get them a fair bit closer.
  • tackle70 - Monday, July 1, 2013 - link

    Yeah, I know about those options but I am a one-screen guy... Just can't stand bezels. I'm happy with my 1440p monitor until 4K 60 Hz comes around.
  • nakquada - Monday, July 1, 2013 - link

    Metro 2033 is an awfully programmed game and I really don't know why it is used in benchmarks. The developers admitted themselves that the engine is horribly optimized. I'd like to see something like Battlefield 3 or Tomb Raider or Skyrim + Mods.
  • IanCutress - Monday, July 1, 2013 - link

    Games that have separate interfaces to run benchmarks are a godsend to testers. Normally while I have a benchmark running I can continue writing another article or set up the next test system / install an OS. Metro2033 does this better than most, and is a strenuous enough benchmark.

    Despite this, a lot of games and engines are not optimized. Plus it doesn't really matter if the engine is optimised for benchmarks - users who play the game are going to experience the same results as we do with the same settings. So an argument that 'this benchmark is not optimised' is quite a large non-sequitur in the grand scheme of things - the games are what they are and it's not for reviewers to optimise them. If I had had more time and preparation, I would have perhaps included Tomb Raider / Bioshock Infinite in there as well.
  • xTRICKYxx - Monday, July 1, 2013 - link

    Unoptimized game engines can be pretty good benchmarks. I've always wished you guys included an ARMA 3 benchmark (built-in benchmark support).
  • bminor13 - Monday, July 1, 2013 - link

    Well since that game hasn't been finished yet, any performance tests would probably lose meaning over time, as they optimize the engine and whatnot. Maybe they'll include it after the game gets released.
  • yougotkicked - Monday, July 1, 2013 - link

    I have to wonder how much improvement might be made with better engine programming. I realize that a lot of work has already gone into making game engines very efficient, but with the growth of GPGPU computing over the last few years there is a lot more work being done on GPU computing in general. Perhaps some creative programming may get us playable framerates at 4k sooner than we expect.
  • silenceisgolden - Monday, July 1, 2013 - link

    XBox Two (Four?) just made my day.
  • Aegwyn11 - Monday, July 1, 2013 - link

    This isn't 4K (4096x2160). It's UHD (3840x2160). UHD is exactly four times HD (1920x1080).
  • jadedcorliss - Monday, July 1, 2013 - link

    It seems like UHD or 3840x2160 is going to be a common resolution at some point and will be referred to as 4k.
  • Aegwyn11 - Monday, July 1, 2013 - link

    The problem here is that 4K is an existing term that refers to a digital cinema format that's been around for many years. Just because some screw up the terminology doesn't make it okay.
  • Gigaplex - Tuesday, July 2, 2013 - link

    The terminology regarding resolutions has been FUBAR for a very long time.
  • mapesdhs - Tuesday, July 2, 2013 - link

    Aegwyn11 is right.

    I asked a friend of mine at a movie studio about this last year. I hope he won't mind my quoting his very informative reply...

    "The differences between "HD" and "2K" are not just a matter of spatial
    dimensions. Although modern (ie. digital) HD is always either 1920x1080
    or 1280x720; there are a multitude of both frame and field rates - in
    fact over a dozen for each format (although the most widely used
    presently are 50Hz and 59.94Hz interlaced). The colour space for HD
    video is invariability 10-bit per component; linear.

    By contrast, 2K is generally regarded as a nebulous resolution - it can
    mean either 2048x1556, 2048x1536, 2048x3072, 1828x1556, 1828x1332 and at
    least six additional dimensions... The reason for such confusion is that
    when Kodak released their Cineon film scanner / workstation / printer
    system back in 1992 (and thus single-handedly invented both the concept
    and technology of the digital film intermediate) these figures
    represented the quarter resolutions of various 16mm / 35mm / 65mm film
    formats.

    For example, when scanning a full aperture 4-performation 35mm film frame
    at 6m (ie. noise level and thus differences are indistinguishable to the
    human eye) the 24.892mm x 18.669mm frame is sampled into a 4096x3112
    pixel image (4K); a quarter of which is 2048x1556. Additionally, in
    scanned film data colour space is almost always 10-bit per component
    logarithmic (roughly equivalent to 14-bit linear). Interestingly, new
    Kodak Vision3 film stocks are capable of recording two additional stops
    (in the shoulder section of the sensiometric curve); which require 16-bit
    per component linear scans to record highlights without clipping!

    Have you launched Cineon on any of your IRIX hosts? Kodak were so clever
    they practically spawned an entire industry (it was years ahead of its
    time - hence the quarter resolutions) ..."

    Ian.
  • jesh462 - Monday, July 1, 2013 - link

    I'd rather see benchmarks for the Oculus Rift at 1080p, the consumer version. I don't know why any PC gamer cares about 4k displays when the Oculus Rift is coming out so soon.
  • kyuu - Tuesday, July 2, 2013 - link

    Well let's see:

    1) The Oculus Rift makes no sense for games that are not FPV. Personally, there's only one FPV game coming out in the future that I'm interested in.

    2) Currently, there are only a very few games that support the Rift and none of them are being used for benchmarking. Even if they were, no one has a final release Rift because they aren't out yet.

    3) Running the Rift should be identical to running a 3D 1080p monitor as far as graphics performance is concerned, so there isn't a real need to specifically benchmark the Rift.

    4) Lots of people care more about higher resolutions than the Rift because, again, the Rift only makes sense for FPV games.
  • testbug00 - Monday, July 1, 2013 - link

    Why no 7970 6GB?

    Gigabyte does not have those 0.0?

    NOTE: is it possible to disable one of the GPUs on dual-GPU boards? I would guess not, but it would be cool if you could :)
  • jadedcorliss - Monday, July 1, 2013 - link

    Pure insanity. I feel that gaming at 1920x1080p or 2560x1440p should be something of a norm for the foreseeable future. I probably won't be able to be bothered to even think about gaming beyond that, and I'm not much of a 3D gaming or multiple monitor gaming fan.
  • hfm - Monday, July 1, 2013 - link

    Metro 2033.. or the year a single GPU will be able to do this...
  • JeBarr - Monday, July 1, 2013 - link

    I hope not. If I have to wait that long for a single GPU to run UHD at 30Hz then let's just stop and throw in the towel right now before any moar time and money is wasted.
  • JlHADJOE - Monday, July 1, 2013 - link

    3x 7950s beating 2x Titans makes AMD look like pretty awesome value.
  • titanmiller - Monday, July 1, 2013 - link

    It's a shame that the next gen consoles are coming into the world at the same time as 4K. This means that consumers will have to wait...5-10 years before they can buy a console with enough power to run 4K games.
  • Impulses - Tuesday, July 2, 2013 - link

    Timing's not that bad really, it'll probably take about that long for these newer displays to come down in price and become the norm... I doubt pushing a 4k-capable console in two year's time would really accelerate that process, but I'm not really into consoles these days so maybe I'm underestimating their impact. Hopefully PC GPUs move at a much faster pace, am I the only one that's thinking multiple 4k displays would make a badass setup? :p C'mon now, there's already a lot of people running 3x1440... I'd be one if I had an unlimited budget and space, 3x1200 will have to do for now!
  • Seegs108 - Monday, July 1, 2013 - link

    I think it's important to note the various typos in this article. This is not 4K resolution, this is UHD resolution. For an analogy, this is like calling 1080p '2K'. 4K is 4096 x 2160. UHD is 3840 x 2160. I find it annoying this is still an issue people keep making.
  • Gigaplex - Tuesday, July 2, 2013 - link

    Even though it's not correct terminology, the fault does not lie with this article. The entire industry is getting it wrong, and these UHD displays are being marketed as 4K.
  • mapesdhs - Tuesday, July 2, 2013 - link

    Yup; see my earlier post with some further details on this. However, not really any surprise that the consumer industry simplifies things for the sake of marketing, etc.

    Ian.
  • Seegs108 - Tuesday, July 2, 2013 - link

    Just imagine people calling 1920 x 1200 "1080p" in an article similar to this one. It seems people are either ignorant to what I'm talking about or they just don't care enough to make the differentiation between these two as they do others. Seems a little stupid, especially when talking about this on a technology forum where things are very specific to begin with.
  • Sabresiberian - Tuesday, July 2, 2013 - link

    Thanks for doing this Ian, very informative.
  • Touche - Tuesday, July 2, 2013 - link

    Nobody noticed 7950 scoring better than GTX680?
  • JlHADJOE - Tuesday, July 2, 2013 - link

    Yeah I noticed that too. I'm guessing the GK104s are becoming severely RAM or bandwidth limited at those high resolutions. Even in older benchmarks the Tahitis tended to claw back some ground, showing off their 384-bit bus to good effect at the highest resolutions (2560x1440/1600), so I guess with UHD the effect is even more pronounced.

    Kinda sad that the cheapest 384-bit part Nvidia has now starts at $650.
  • turco2025 - Tuesday, July 2, 2013 - link

    Hello, sorry for the question, but which brand is the keyboard you are using?
    Thanks.
  • Wolfpup - Tuesday, July 2, 2013 - link

    Wow, that's kind of amazing that TECHNICALLY you can get hardware today that seems to run high end games with high end settings at 4K. IF you're willing to buy 2-4 Titans that is lol
  • tackle70 - Tuesday, July 2, 2013 - link

    Oh stop being facetious. Some of us use our computer monitor also for movies/netflix/etc and having bezels is beyond horrible for that type of use.

    One large single screen is the way to go for an all purpose display, and it needs to be 60 Hz for gaming unless the price point is obscenely cheap (sub-$1k).
  • glockjs - Tuesday, July 2, 2013 - link

    thank you and...FML
  • brucek2 - Wednesday, July 3, 2013 - link

    I'd love to see a future AnandTech investigation into the practical implications of 4K for the average enthusiast. 4 Titans are likely out of reach for most, but are there ways that 4K can make life better on realistic budgets? As many have suggested, lower amounts of *AA are an easy start; if necessary could gamers even let the display upscale from 1080p while still taking advantage of the much higher resolution for non-gaming tasks?
  • haukionkannel - Wednesday, July 3, 2013 - link

    Ok. So practically, current generation GPU hardware is not quite ready for 4K... Nice! So there is really a good reason to push GPU technology forward. 4K will be near $1500 in 3 to 4 years in smaller screen sizes, so Nvidia and AMD have that much time to make it happen. Until then, 3 to 4 card combinations are the solution for those who really can afford these. The need for GPU upgrades has been stagnant for so long that this is actually refreshing!
    I would take a 4K screen now if I could afford it. Run all desktop applications in 4K mode and games in 1080p until I see enough GPU power for it with single or two card combinations.
  • damianrobertjones - Wednesday, July 3, 2013 - link

    "3840x2160" - I like the fact that it runs as it was intended, full res, nothing scaled etc. Shame that Apple and Google machines get so much praise while offering zero increased 'working' space.
  • Pastuch - Wednesday, July 3, 2013 - link

    You can't be serious. I can't even use my 2560x1440 monitor without large screen fonts, which look awful in Windows. I have 20/20 vision too, so it has nothing to do with eyesight. The fact that Android scales so beautifully is a huge advantage. The smallest 4K monitor with comfortable font sizes without scaling the UI is over 70 inches for the bulk of North Americans. Eyesight is NOT improving.
  • Pastuch - Wednesday, July 3, 2013 - link

    The smallest 4K monitor with comfortable font sizes IN WINDOWS without scaling the UI is over 70 inches for the bulk of North Americans.
  • pprime - Thursday, July 4, 2013 - link

    What I want to know is, since this is as near as makes no difference to 4x27" 1080 monitors, do I get the same effect of viewing 4 times the content, or do I just get 4 times the detail?

    Let's say in your game with the 27" the aspect ratio lets you see a 5 foot radius (hor) in front of you (It's an example), with this will I see a 10 foot radius, or will I see the same 5 feet just bigger.

    If it's the latter, what's the point for gaming?
  • Mithan - Monday, July 8, 2013 - link

    Happy with my 2 Dell 2407/2412m monitors.
    1920x1200 is fine for me.
  • bds71 - Tuesday, July 9, 2013 - link

    Did anyone notice the perfect scaling of Sleeping Dogs with the Titan and the 7950? Not sure why the 680 performed so completely and utterly horrendously. Ian: any insight?

    Titan:
    13.63 -> 27.4 -> 41.07 -> 57.78
    1 -> 2.01 -> 3.01 -> 4.24

    7950:
    11.1 -> -> 34.58
    1 -> -> 3.12

    680:
    6.25 -> 8.55
    1 -> 1.37 (ouch?)

    What allows the Titan and the 7950 to scale so perfectly (other than the obvious: true GPU-limited graphics vs resolution limiting)? And why does the 680 suck so badly with this title and scale so poorly (I'm thinking the driver simply isn't optimized)? Do you think this is indicative of 680 scaling in general? (This affects me personally because I'm looking to get a 2nd 690 specifically for 4K/UHD gaming!!)

    Maybe 4K (UHD for those who care) can shed some light on architectural differences not only between AMD and nVidia, but generationally between makers as well. I look forward to any insight you can offer (now, and down the road).

    Note: the new 55/65/73 in. Sony Wega line is advertised as both 4K and UHD - for those who are offended by this lack of distinction, blame the manufacturers :) For those who are interested, the costs are $5k, $7k, and a whopping $25k for the 73 (ouch!)
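
    The scaling figures in that comment are simply each FPS result divided by the single-card baseline; a tiny sketch using the numbers quoted above:

    ```python
    # Sleeping Dogs FPS figures quoted in the comment above
    titan_fps  = [13.63, 27.4, 41.07, 57.78]   # 1/2/3/4-way
    hd7950_fps = {1: 11.1, 3: 34.58}
    gtx680_fps = [6.25, 8.55]                  # 1/2-way

    def scaling(fps_list, baseline):
        """Scaling factor of each result relative to the single-card baseline."""
        return [round(f / baseline, 2) for f in fps_list]

    print(scaling(titan_fps, titan_fps[0]))          # [1.0, 2.01, 3.01, 4.24]
    print(round(hd7950_fps[3] / hd7950_fps[1], 2))   # 3.12 for 3-way 7950
    print(scaling(gtx680_fps, gtx680_fps[0]))        # [1.0, 1.37]
    ```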
  • geok1ng - Tuesday, July 9, 2013 - link

    The numbers provide ZERO evidence of a VRAM limitation at 4K resolutions. In Metro, the 6GB Titan is much slower than the 2x3GB 7990. In Dirt, the 6GB Titan is slower than the 2x2GB 690. In Sleeping Dogs, the 6GB Titan gets less than half the performance of the 2x3GB 7990, or a third of the performance of 3x3GB 7950s. As tested before, it's only at 3x1440p/3x1600p resolutions that you start to see some VRAM limitation in a few games. It's amazing how confirmation bias can work on the human mind and distort reality. There's not a single number in the article that could remotely speak of a VRAM limitation, but we have dozens of comments saying such. As xkcd once joked, "Dear God, I would like to file a bug report."
  • mac2j - Tuesday, July 9, 2013 - link

    Someone needs to beat the HDMI 2.0 group with their own cables until they release it.... I'm pretty sure they promised Q2-Q3 2013. It's crazy we can't get 4K60s right now (or more than 8-bit color) because a bunch of EEs can't get it together and roll out a long overdue cable. (And yeah, I know it's the content providers hassling them about security, but it's definitely time to tell them to STFU.)
  • Parablooper - Friday, July 12, 2013 - link

    Why no 7970/GE...? Crossfiring those tends to give better performance than a 7990. It's also AMD's flagship card so its absence is a little troubling.
  • corhen - Monday, July 15, 2013 - link

    How bad a review can you do? "i know, we will take a 4K monitor.. and then use AA that QUADRUPLES THE RESOLUTION!... AND THEN CLAIM YOU NEED 4 TITANS! GENIUS!"

    Hang your head in shame anandtech, you botched this.
  • Kidster3001 - Tuesday, July 16, 2013 - link

    I've tried. I just can't stay quiet any more.

    Why do we let them call this 4K? We've always called our monitors by vertical resolution. 720p, 1080p. 2560x1600 is not known as 2K or 2.5K. Now we have a 2160p monitor and they try to call it 4K. Marketing shouldn't win. We need to fight back!

    /rant LoL
  • Treckin - Tuesday, July 16, 2013 - link

    You guys apparently forgot that Oculus will obsolete the resolution race, if you ask me - you can just move your head to view more desktop real estate, and the price of the panels will track closely the price of cellphone and small form-factor tablet displays.

    I predict that precisely because of the graphics horsepower required to drive 8 million pixels - 4 Titans! - something like Oculus will dominate this market at least until the power of Titans can be had for ~$300... Which, looking at the history of performance improvement per generation, could be far longer than the 2-3 years hypothesized by the author.
  • klepp0906 - Tuesday, February 9, 2016 - link

    Love articles like this.

    "When 4K tvs are a thing in the home by 2023"

    /em looks at the 4K TV he bought two years ago.

    /em counts TVs on Best Buy's site and notices 4K is the vast majority of what's buyable.

    Only off by a little under a decade /shrug

    Yea, game developers need to take it easy. Spent 10k building my quad Titan beast and it's already a turd because I chose to go surround (roughly 5K res). I figure the benchmarks at 4K are close to what I get at 5K due to my overclocks.

    Ironically I've tried none of the aforementioned games due to a predisposition for mmorpgs.

    Btw anyone know where I can find a quad Titan vs quad 980ti review?
