
Part of my extra-curricular testing after Computex this year put a Sharp 4K30 monitor in my hands for three days, along with a variety of AMD and NVIDIA GPUs on an overclocked Haswell system.  With my test-bed SSD at hand and limited time, I was able to run my normal motherboard gaming benchmark suite at this crazy resolution (3840x2160) across several GPU combinations.  Many thanks to GIGABYTE for this brief but eye-opening opportunity.
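To put that resolution in perspective, here is a quick back-of-the-envelope pixel count (an editor's sketch, not part of the original testing):

```python
# 4K (3840x2160) is exactly four 1080p frames' worth of pixels,
# so fill-rate and shading work scale roughly accordingly.
px_4k = 3840 * 2160
px_1080p = 1920 * 1080

print(px_4k)              # 8294400 pixels per frame
print(px_4k // px_1080p)  # 4 -> four times the pixels of 1080p
print(px_4k * 60)         # 497664000 pixels per second at 60 FPS
```

Quadrupling the shaded pixels per frame is why the multi-GPU scaling below matters so much.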

The test setup is as follows:

Intel Core i7-4770K @ 4.2 GHz, High Performance Mode
Corsair Vengeance Pro 2x8GB DDR3-2800 11-14-14
GIGABYTE Z87X-OC Force (PLX 8747 enabled)
2x GIGABYTE 1200W PSU
Windows 7 64-bit SP1
Drivers: GeForce 320.18 WHQL / Catalyst 13.6 Beta

GPUs:

NVIDIA
GPU        | Model           | Cores / SPs | Core MHz | Memory Size | Memory MHz | Memory Bus
GTX Titan  | GV-NTITAN-6GD-B | 2688        | 837      | 6 GB        | 1500       | 384-bit
GTX 690    | GV-N690D5-4GD-B | 2x1536      | 915      | 2x2 GB      | 1500       | 2x256-bit
GTX 680    | GV-N680D5-2GD-B | 1536        | 1006     | 2 GB        | 1500       | 256-bit
GTX 660 Ti | GV-N66TOC-2GD   | 1344        | 1032     | 2 GB        | 1500       | 192-bit

AMD
GPU        | Model           | Cores / SPs | Core MHz | Memory Size | Memory MHz | Memory Bus
HD 7990    | GV-R799D5-6GD-B | 2x2048      | 950      | 2x3 GB      | 1500       | 2x384-bit
HD 7950    | GV-R795WF3-3GD  | 1792        | 900      | 3 GB        | 1250       | 384-bit
HD 7790    | GV-R779OC-2GD   | 896         | 1075     | 2 GB        | 1500       | 128-bit

For some of these GPUs we had several of the same model at hand to test.  As a result, we tested from one GTX Titan to four, 1x GTX 690, 1x and 2x GTX 680, 1x 660 Ti, 1x 7990, 1x and 3x 7950, and 1x 7790.  There were several more groups of GPUs available, but alas we did not have time.  Also, for the time being we are not doing any frame-time analysis on multi-GPU AMD setups, which we know can have issues; as I have not yet got to grips with FCAT personally, I felt it more beneficial to run numbers than to learn new testing procedures on limited time.

Games:

As I only had my motherboard gaming tests available and little time to download fresh ones (you would be surprised how slow internet in Taiwan can be in general, especially during working hours), we have our standard array of Metro 2033, Dirt 3 and Sleeping Dogs.  Each one was run at 3840x2160 and maximum settings via our standard Gaming CPU procedures (as high as each benchmark GUI allows).

Metro 2033, Max Settings, 3840x2160:

[Graph: Metro 2033, 3840x2160, Max Settings]

Straight off the bat is a bit of a shocker: to get 60 FPS we need FOUR Titans.  Three 7950s managed 40 FPS, though there was plenty of visible microstutter during the run.  For both of the lower-end cards, the 7790 and 660 Ti, the full-quality textures did not seem to load properly.

Dirt 3, Max Settings, 3840x2160:

[Graph: Dirt 3, 3840x2160, Max Settings]

Dirt is a title that loves MHz and GPU power, and thanks to its engine it is quite happy to run at around 60 FPS on a single Titan.  Understandably this means that for almost every other card you need at least two GPUs to hit this number, and more if you have the opportunity to run 4K in 3D.

Sleeping Dogs, Max Settings, 3840x2160:

[Graph: Sleeping Dogs, 3840x2160, Max Settings]

Similarly to Metro, Sleeping Dogs (with full SSAA) can bring graphics cards to their knees.  Interestingly, some scenes in the benchmark that ran well were counterbalanced by the indoor manor scene, which could run slower than 2 FPS on the more mid-range cards.  To see a full 60 FPS average with max SSAA, we are looking at a quad-SLI setup of GTX Titans.

Conclusion:

First of all, the minute you experience 4K with appropriate content, it is worth a long double take.  With a native 4K screen and a decent frame rate, it looks stunning.  Although you have to sit further back to take it all in, it is fun to get up close and see just how good the image can be.  The only downside to my testing (apart from some of the low frame rates) came when the realisation that you are at 30 Hz kicked in: the visual tearing in Dirt 3 during high-speed sections was hard to miss.

But the newer the game, and the more elaborate you wish to be with the advanced settings, the more horsepower 4K is going to require, and plenty of it.  Once 4K monitors hit a nice price point for 60 Hz panels (sub-$1500), the gamers who like to splash out on their graphics cards will start jumping on 4K screens.  I mention 60 Hz because the 30 Hz panel we were able to test on looked fairly poor in the high-FPS Dirt 3 scenarios, with clear tearing on the ground as the car raced through the scene.  Currently users in North America can get the Seiki 50” 4K30 monitor for around $1500, and Seiki recently announced a 39” 4K30 monitor for around $700.  ASUS are releasing their 4K60 31.5” monitor later this year for around $3800, which might bring about the start of the resolution revolution, at least for the high-end prosumer space.

All I will predict at this point is that driving screen resolutions up will demand a sharp increase in graphics card performance, as well as in multi-card driver compatibility.  No matter the resolution, enthusiasts will want to run their games with all the eye candy, even if it takes three or four GTX Titans to get there.  For the rest of us, on our one or two mid-to-high-end GPUs, we might have to wait 2-3 years for the prices of the monitors to come down and the power of mid-range GPUs to go up.  These are exciting times, and we have not even touched on what might happen in multiplayer.  The next question is where the consoles fit: gaming at 4K would be severely restrictive when using the equivalent of a single 7850 on a Jaguar core, even if it does have high memory bandwidth.  Roll on PlayStation 5 and Xbox Two (Four?), when 4K TVs in the home might actually be a thing by 2023.

16:9 4K Comparison image from Wikipedia

131 Comments

  • EzioAs - Monday, July 01, 2013 - link

    Just because you don't see the benefit of higher resolutions and high PPI doesn't mean the rest of us have to follow. A few years from now, there's a chance you'll get what you want (a 75W GPU that can max Metro). Of course, there's an even higher chance that by then, released games will have graphics and physics that make Metro look more like Super Mario (okay, maybe that's going a bit overboard, but you get the point).
  • jrs77 - Monday, July 01, 2013 - link

    It's not about personal preferences, but about the physiology of the human eye.

    Higher PPI is only beneficial at closer distances, which is why mobile devices benefit from it. The farther you get from the screen (usually ~80 cm for a PC and 3 or more metres for a TV), the less difference you'll notice, and the only thing that increases is the energy needed to drive the hardware.
  • ATWindsor - Monday, July 01, 2013 - link

    Needed resolution has nothing to do with distance (at least not directly), and everything to do with what field of view the screen covers in your vision; that field of view is huge for a 30-inch monitor in regular settings. We can see details down to 8-10 arc-seconds in some circumstances, even down to 0.5 arc-seconds in extreme cases. We need well above 1080p for that, even above 4K. Even 0.5 to 1 arc-minute for pure pixel resolving in non-special cases needs more than 1080p, and that's with 20/20 vision, which large portions of the public exceed.
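    These arc-minute figures can be sanity-checked with a little trigonometry. The sketch below is an editor's illustration; the 30" screen size and ~80 cm viewing distance are assumptions taken from this thread, not measurements. It computes how many horizontal pixels a display needs before each pixel falls below a given visual acuity:

```python
import math

# Horizontal field of view of a 30" 16:9 monitor viewed from ~80 cm,
# then the pixel count needed so that each pixel subtends no more
# than the stated acuity (in arc-minutes).
width_cm = 30 * 16 / math.hypot(16, 9) * 2.54   # panel width, ~66.4 cm
distance_cm = 80
fov_arcmin = 2 * math.degrees(math.atan(width_cm / 2 / distance_cm)) * 60

print(round(fov_arcmin))        # ~2705 px needed at 1.0 arc-minute acuity
print(round(fov_arcmin / 0.5))  # ~5410 px at 0.5 arc-minutes
```

    At 0.5 arc-minutes even 3840 horizontal pixels fall short, which supports the comment's point that some viewers can out-resolve 4K at desk distances.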
  • solipsism - Monday, July 01, 2013 - link

    Adding to ATWindsor's comment, as we move to larger and larger displays, whether as a "TV set" in the HEC or a computer display on our desk, the "Retina" effect will go away.

    Right now a 50" 1080p TV and a 2560x1440 27" display may place the viewer far enough away that they can't see the pixels… but displays will continue to get larger.

    For those aforementioned resolutions and display sizes, it's 6.5 and 2.6 feet to get the "Retina" effect with 20/20 (6/6) vision. That sounds to me like we are pretty much on the cusp of this, especially with the HEC display. You can buy a 65" 1080p set for under $1500, and yet those doing so to replace a smaller 1080p HDTV might not be getting a better overall experience: you have to sit at least 8.45 feet away to get the effect.

    I'm glad that display technology, GPUs, and even codecs (H.265) are all lining up nicely to tackle this issue.

    PS: Note that 4K is exactly 3x720p and 2x1080p per linear dimension.
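    The 6.5, 2.6 and 8.45-foot figures above follow from simple trigonometry: a pixel becomes indistinguishable to 20/20 vision once it subtends less than about one arc-minute. A quick sketch of that calculation (an editor's illustration, assuming 16:9 panels):

```python
import math

def retina_distance_ft(diagonal_in, horiz_px):
    """Distance at which one pixel of a 16:9 panel subtends 1 arc-minute."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)  # panel width from diagonal
    pitch_in = width_in / horiz_px                   # pixel pitch
    return pitch_in / math.tan(math.radians(1 / 60)) / 12

print(round(retina_distance_ft(50, 1920), 1))   # 50" 1080p  -> 6.5 ft
print(round(retina_distance_ft(27, 2560), 1))   # 27" 1440p  -> 2.6 ft
print(round(retina_distance_ft(65, 1920), 2))   # 65" 1080p  -> 8.45 ft
```

    The three results reproduce the distances quoted in the comment.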
  • mapesdhs - Tuesday, July 02, 2013 - link


    jrs77, regret to say you're wrong with regard to human eye visual fidelity. Humans can resolve far greater detail than the examples you give as being supposedly the most required for anyone, though of course it varies between individuals. However, just because one person can't tell the difference, I can guarantee others could.

    A rough approximation of human vision is about 8,000 pixels across at near distance, and 115K pixels across at the far horizon. We can also resolve many more shades of each colour than are currently shown on today's screens (which is why SGI's old high-end systems supported 12 bits/channel as far back as the early 1990s; it was regarded as essential for high-fidelity visual simulation, especially for night-time scenarios, sunrise/sunset, etc.).

    Indeed, some TV companies have suggested the industry should skip 4K technology completely and jump straight to 8K once the tech is ready, though it seems plenty are willing to hop on the 4K bandwagon as soon as possible whatever happens.

    NB: A caveat to the above: how the eye resolves detail (including brightness, contrast, movement, etc.) is of course not uniform across one's field of vision. Imagine a future GPU with eye tracking that can adjust detail across the screen in real time based on where one is looking, reducing the load in the parts of the screen feeding our peripheral vision and focusing on movement and contrast in those areas instead of colour and pixel density. That would more closely match how our eyes work, reducing the GPU power required to render each frame. Some way off though, I expect. Even without eye tracking (which is already available in other products), one could by default concentrate the most detail in the central portion of the display.

    Ian.
  • Oxford Guy - Wednesday, July 03, 2013 - link

    The gamut available to the eye is much larger than the minuscule sRGB color space that still dominates computing today.
  • Zorkwiz - Monday, July 01, 2013 - link

    If you search for the famous "Viewing Distance When Resolution Becomes Noticeable" chart, you will see that at a 30-31" screen size, you have to be sitting about 4 feet back for 4k not to matter vs 1080p for someone with good vision. The full benefit of 4k on a 30" screen is visible at approximately 2 feet, which is pretty much exactly how far back I sit from the screen, so I think a 30" 4K panel would be just about perfect.
  • xTRICKYxx - Monday, July 01, 2013 - link

    Now we need to wait for 60 Hz/120 Hz 4K panels to become affordable...
  • Kjella - Monday, July 01, 2013 - link

    That's what I found out too: I could use a 30"/4K monitor, but my 60"/1080p TV is fine at my couch distance. Now if they sold 100-120" 4K TVs or 4K projectors priced for mortals it would be different; otherwise I'd have to get a lot closer.
  • Dribble - Monday, July 01, 2013 - link

    In the end it'll be worth it, but resolution is only one part of the package for a good gaming monitor: you need low input lag, 120 Hz refresh, good colours, etc. Are there even any 4K monitors with 60 Hz refresh, let alone 120 Hz (most are 30 Hz right now)?

    So right now you have to spend a fortune on something that ticks the resolution box lots of times, but has a lot of x's elsewhere.

    You'd be best off with a 3x1080p @ 120 Hz surround setup with LightBoost for fast-paced games.
