26 Comments


  • blanarahul - Thursday, March 13, 2014 - link

    Didn't AMD say that Hawaii was a gaming-oriented chip? That's why it has 1/8-rate FP64 performance.
  • Ian Cutress - Thursday, March 13, 2014 - link

    Some of my compute algorithms are all FP32 - it's not as if all compute demands FP64, though some algorithms obviously do (e.g., solving PDEs at small scale).
  • Flunk - Thursday, March 13, 2014 - link

    Most password hashing algorithms, scrypt, stuff like that.
  • Darkstone - Thursday, March 13, 2014 - link

    Most hashing algorithms use integers, which is apparently the main reason AMD GPUs are preferred over NVIDIA's for such tasks: integer shift/rotate performance.
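A minimal sketch of the integer rotate operation mentioned above, in Python (the function name is illustrative, not from any real hashing library; the Salsa20/8 core inside scrypt is one well-known user of this primitive):

```python
MASK32 = 0xFFFFFFFF

def rotl32(x, r):
    """32-bit rotate-left: a single instruction on GPU hardware,
    and a hot-path primitive in many hashing kernels."""
    return ((x << r) | (x >> (32 - r))) & MASK32

print(hex(rotl32(0x80000001, 1)))  # 0x3
```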
  • The Von Matrices - Thursday, March 13, 2014 - link

    NVidia's Maxwell architecture now matches AMD's GCN at these operations.
  • npz - Friday, March 14, 2014 - link

    No it does not. It's better than Kepler, but it's still far behind.
  • npz - Friday, March 14, 2014 - link

    Shift and rotate are part of it, but there are many more integer operations, and all of them perform well on GCN. In addition, there is branching performance to consider, which practically no one talks about.
  • npz - Friday, March 14, 2014 - link

    Lots of very useful things are still FP32. Multimedia and 3D rendering are FP32 and integer (H.264 is integer-only, for example). Engineering work is still very usable in FP32 as long as you don't need absolute precision or can tolerate small rounding errors.
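A quick illustration of those "small rounding errors" using Python's standard library to round a double through single precision and back (a sketch; the helper name is made up):

```python
import struct

def to_f32(x):
    # Pack as a 4-byte IEEE 754 single, then unpack: this rounds
    # a Python float (double precision) to FP32 precision
    return struct.unpack('f', struct.pack('f', x))[0]

# FP32 keeps only ~7 significant decimal digits
print(to_f32(0.1))         # 0.10000000149011612
print(to_f32(0.1) == 0.1)  # False: the FP32 rounding error
```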
  • Lorthreth - Thursday, March 13, 2014 - link

    Looks like the Vapor has a single DisplayPort and the Toxic has 2 mini-DP.
  • PixyMisa - Friday, March 14, 2014 - link

    Looks like you're right. That's the first 290 card I've seen with two mini-DP. I'm looking at getting a couple of Dell's 24" 4K monitors, so I need two DP outputs. My existing 7950 has that, and my Linux box has a 7770 with two mini-DP, but almost no current-gen cards from either AMD or Nvidia offer this (except the workstation cards).
  • Alchemy69 - Thursday, March 13, 2014 - link

    8GB of video memory is the emperor's new clothes, bought only by kids with their parents' money because they've been told they need it to be hardcore gamers.
  • geekman1024 - Thursday, March 13, 2014 - link

    Yeah, and Bill Gates said 640K is enough.
  • Brooklands - Thursday, March 13, 2014 - link

    I heard that statement a few years ago, when 128MB of VRAM was considered "too much". It wasn't. And for some applications now, and even more in 1-2 years, 8GB won't be "too much". Games like Hitman: Absolution, Thief 4 and some mods can easily max out the VRAM of a Titan while achieving playable FPS at 4K resolution.

    http://translate.google.de/translate?sl=auto&t...
  • Mr Perfect - Thursday, March 13, 2014 - link

    Considering the PS4 and XBox 1 both have 8GB of RAM each, I was a little surprised that the new video cards were coming out with only 4GB. Granted, it's 8GB of shared RAM in the consoles, but next-gen games are going to start taking large pools of RAM for granted now.
  • piroroadkill - Thursday, March 13, 2014 - link

    Nah, Xbox 1 has 64MiB RAM.

    Xbox One has 8GiB unified.
  • ImSpartacus - Thursday, March 13, 2014 - link

    The consoles use that RAM for the entire system, not just the GPU.
  • rish95 - Thursday, March 13, 2014 - link

    Do the PS4 and Xbox One even have the horsepower to make good use of 8GB of RAM?
  • nathanddrews - Friday, March 14, 2014 - link

    It's probably better to have too much and not need it than to not have enough and need it. They did it under pressure from developers. If developers find a way to use it all, the PS4 will have a distinct advantage.
  • EzioAs - Thursday, March 13, 2014 - link

    So these kids, I'm guessing they used their parents' money as well to buy three 2560x1600 monitors to actually utilize the 8GB of memory that comes with these cards?
  • nathanddrews - Friday, March 14, 2014 - link

    That's not quite fair. You assume it's worthless and only bought for spoiled rich kids.

    First, it's been confirmed through testing that you need ~4GB if you plan on running 4xAA at 4K, depending on the game. If you start playing with higher settings, higher-res texture packs, and newer games designed to take advantage of more VRAM (thanks, Xbone/PS4), then 8GB won't seem too extreme.

    Second, you assume that only gamers want this, but there are many non-gaming applications that will eat up every last bit of VRAM you can throw at them, and many consumers are unwilling to spend the incredible premium on workstation cards.

    Third:
    http://www.overclock.net/t/1472145/got-4k
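For context on the ~4GB-at-4K figure above, a rough back-of-envelope render-target calculation (illustrative numbers only; in practice texture assets, not the framebuffer, dominate VRAM use):

```python
# Rough 4K render-target math at 4xAA (real engines allocate many
# more buffers than this single color+depth pair)
width, height = 3840, 2160
msaa = 4

color = width * height * 4 * msaa   # RGBA8 color samples
depth = width * height * 4 * msaa   # 32-bit depth/stencil samples
total_mb = (color + depth) / 2**20
print(round(total_mb))  # 253 MB for just one color+depth target
```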
  • The Von Matrices - Thursday, March 13, 2014 - link

    Do these cards use 16 4Gb modules or 32 2Gb modules?
  • Frenetic Pony - Thursday, March 13, 2014 - link

    Exactly what I've been waiting for; dedicated next-gen games should start eating up video RAM once they drop last gen as their minimum spec. Future-proof gaming, here we come!
  • MrSpadge - Friday, March 14, 2014 - link

    Sure... with higher-end Maxwells and 20nm almost around the corner.
  • JlHADJOE - Thursday, March 13, 2014 - link

    AMD: Future-proofing their GPUs against mining hardware compensation.
  • jasonelmore - Friday, March 14, 2014 - link

    Titanfall for PC treats video RAM like system memory. It's one of the only PC games I have that will fully saturate my GTX 780's 3GB of GDDR5 at the main menu.
  • Johnmcl7 - Friday, March 14, 2014 - link

    No he didn't:

    http://www.wired.com/politics/law/news/1997/01/148...

    John
