Today at GTC NVIDIA announced their next GTX Titan family card. Dubbed the GTX Titan Z (no idea yet on why it’s Z), the card is NVIDIA's obligatory entry into the dual-GPU/single-card market, finally bringing NVIDIA’s flagship GK110 GPU into a dual-GPU desktop/workstation product.

While NVIDIA has not released the complete details about the product – in particular we don’t know precise clockspeeds or TDPs – we have been given some information on core configuration, memory, pricing, and availability.

                        GTX Titan Z    GTX Titan Black  GTX 780 Ti   GTX Titan
Stream Processors       2 x 2880       2880             2880         2688
Texture Units           2 x 240        240              240          224
ROPs                    2 x 48         48               48           48
Core Clock              700MHz?        889MHz           875MHz       837MHz
Boost Clock             ?              980MHz           928MHz       876MHz
Memory Clock            7GHz GDDR5     7GHz GDDR5       7GHz GDDR5   6GHz GDDR5
Memory Bus Width        2 x 384-bit    384-bit          384-bit      384-bit
VRAM                    2 x 6GB        6GB              3GB          6GB
FP64                    1/3 FP32       1/3 FP32         1/24 FP32    1/3 FP32
TDP                     ?              250W             250W         250W
Transistor Count        2 x 7.1B       7.1B             7.1B         7.1B
Manufacturing Process   TSMC 28nm      TSMC 28nm        TSMC 28nm    TSMC 28nm
Launch Date             04/XX/14       02/18/14         11/07/13     02/21/13
Launch Price            $2999          $999             $699         $999

In brief, the GTX Titan Z is a pair of fully enabled GK110 GPUs. NVIDIA isn’t cutting any SMXes or ROP partitions to bring down power consumption, so each half of the card is equivalent to a GTX 780 Ti or GTX Titan Black, operating at whatever (presumably lower) clockspeeds NVIDIA has picked. And although we don’t have precise clockspeeds, NVIDIA has quoted the card as having 8 TFLOPS of FP32 performance, which would put the GPU clockspeed at around 700MHz, nearly 200MHz below GTX Titan Black’s base clock (to say nothing of boost clocks).
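The ~700MHz estimate falls out of simple arithmetic. A rough sketch, assuming the standard peak-FLOPS formula for GK110 (each CUDA core retires one fused multiply-add, counted as 2 FLOPs, per cycle):

```python
# Back out the implied GPU clock from NVIDIA's quoted 8 TFLOPS FP32 figure.
cores = 2 * 2880          # two fully enabled GK110 GPUs
ops_per_cycle = 2         # one FMA per core per cycle = 2 FLOPs
quoted_flops = 8e12       # NVIDIA's quoted aggregate FP32 throughput

clock_hz = quoted_flops / (cores * ops_per_cycle)
print(f"Implied clock: {clock_hz / 1e6:.0f} MHz")  # ≈ 694 MHz
```

That lands just under 700MHz, well below GTX Titan Black's 889MHz base clock, though boost behavior could push real-world clocks higher.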

On the memory front the GTX Titan Z is configured with 12GB of VRAM, 6GB per GPU. NVIDIA has also released the memory clockspeed specifications, telling us that the card won't be making any compromises there, operating at the same 7GHz memory clockspeed as the GTX Titan Black. This is something of an accomplishment given the minimal routing space a dual-GPU card provides.
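As a quick sanity check (inferred from the listed specs, not an NVIDIA-quoted figure), the per-GPU memory bandwidth follows directly from the 7GHz effective data rate and the 384-bit bus:

```python
# Per-GPU memory bandwidth implied by the spec table.
data_rate = 7e9                 # 7GHz effective GDDR5 data rate (transfers/sec per pin)
bus_width_bits = 384            # memory bus width per GPU
bandwidth = data_rate * bus_width_bits / 8   # bytes per second
print(f"{bandwidth / 1e9:.0f} GB/s per GPU")  # 336 GB/s, matching GTX Titan Black
```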

In terms of build the GTX Titan Z shares a lot of similarities with NVIDIA's previous-generation dual-GPU card, the GTX 690. NVIDIA is keeping the split blower design, with a single axial fan pushing out air via both the front and the back of the card, essentially exhausting half the hot air and sending the other half back into the case. We haven't had any hands-on time with the card, but NVIDIA is clearly staying with the black metal styling of the GTX Titan Black.

The other major unknown right now is power consumption. GTX Titan Black is rated for 250W, and meanwhile NVIDIA was able to get a pair of roughly 200W GTX 680s into the 300W GTX 690 (with reduced clockspeeds). So it’s not implausible that GTX Titan Z is a 375W card, but we’ll have to wait and see.

But perhaps the biggest shock will be price. The GTX Titan series has already straddled the prosumer line with its $1000/GPU pricing; GTX Titan was by far the fastest thing on the gaming market in the winter of 2013, while GTX Titan Black is a bit more professional-leaning due to the existence of the GTX 780 Ti. With GTX Titan Z, NVIDIA will be asking for a cool $3000 for the card, three times the price of a GTX Titan Black.

It goes without saying then that GTX Titan Z is aimed at an even more limited audience than the GTX Titan and GTX Titan Black. To be sure, NVIDIA is still targeting both gamers and compute users with this card, and since it is a GeForce card it will use the standard GeForce driver stack, but the $3000 price tag is much more within the realm of compute users than gamers. For gamers this may as well be a specialty card, like an Asus ARES.

Now for compute users this will still be an expensive card, but potentially very captivating. Per FLOP GTX Titan Black is still a better deal, but with compute users there is a far greater emphasis on density. Meanwhile the GTX Titan brand has by all accounts been a success for NVIDIA, selling more cards to compute users than they had ever expected, so a product like GTX Titan Z is more directly targeted at those users. I have no doubt that there are compute users who will be happy with it – like the original GTX Titan it’s far cheaper per FP64 FLOP than any Tesla card, maintaining its “budget compute” status – but I do wonder if part of the $3000 pricing is in reaction to GTX Titan undercutting Tesla sales.
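To put the per-FLOP observation in numbers, here is an illustrative back-of-the-envelope comparison using base clocks and launch prices from the spec table (not official NVIDIA figures, and ignoring boost clocks):

```python
# Rough $/GFLOP comparison between GTX Titan Black and GTX Titan Z.
def fp32_tflops(cores, clock_mhz):
    # Peak FP32: each core retires one FMA (2 FLOPs) per cycle.
    return cores * 2 * clock_mhz * 1e6 / 1e12

black_fp32 = fp32_tflops(2880, 889)   # ~5.1 TFLOPS at base clock
z_fp32 = 8.0                          # NVIDIA's quoted aggregate figure

# Both cards run FP64 at 1/3 the FP32 rate.
print(f"Titan Black: ${999 / (black_fp32 / 3 * 1000):.2f} per FP64 GFLOP")
print(f"Titan Z:     ${2999 / (z_fp32 / 3 * 1000):.2f} per FP64 GFLOP")
```

By this yardstick Titan Black works out to roughly half the cost per FP64 GFLOP of Titan Z, which is what makes the Titan Z's pitch rest on density rather than raw value.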

Anyhow, we should have more details next month. NVIDIA tells us that they’re expecting to launch the card in April, so we would expect to hear more about it in the next few weeks.

Source: NVIDIA


64 Comments


  • Shadowmaster625 - Wednesday, March 26, 2014 - link

    It wouldn't be worth $3000 even if it were 20nm and the card only had a TDP of 240W. But at least then you could argue the case for it. As it is, it's just a Fed funny-money grab.
  • mapesdhs - Thursday, March 27, 2014 - link

    An item is only ever worth what someone is willing to pay.

    If a professional can use this card to complete a task that results in a much greater
    return (by whatever means), then the investment will more than pay for itself, in
    which case the price isn't a problem at all.

    This is why SGI was able to sell IR-based RealityCentre systems for a million+,
    because companies that used them (eg. oil/gas) recovered their investment
    often in just a single session using the system. In the case of oil exploration,
    being able to more accurately determine where not to drill based on stereo
    visualisation of GIS data; every failed test drill wastes a lot of money, so a better
    hit rate made the RealityCentres an excellent investment.

    When I was an admin of an R.C. in the early 2000s, I asked an oil company guy
    about this; he estimated their $1.5M R.C. cost was recouped in just 6 seconds.

    Same concept applies today. The AE guy I know told me the most important
    thing for him is interactive responsiveness while working with the app, being able to
    see the results of changes as quickly as possible, which means he can make
    decisions about what to do sooner, or experiment with more alternatives in the
    same time frame. Thus, spending more on something that enables this will
    pay off in the long term.

    Ian.
  • amo_ergo_sum - Wednesday, March 26, 2014 - link

    Can someone say CPU bottleneck? I guess that's what DX12 is for.
  • MrManchego - Wednesday, March 26, 2014 - link

    But... does it play Crysis?
  • MrManchego - Wednesday, March 26, 2014 - link

    This is a great card to push the industry forward... in 3 years or less this will be mainstream.
  • Miqunator - Thursday, March 27, 2014 - link

    While I raised an eyebrow at the $1000 difference between buying two Titan Blacks and the Z, I wouldn't call anyone greedy.

    If it helps Nvidia push out good midrange cards for me at better prices, it's just silly to complain.
  • MrManchego - Thursday, March 27, 2014 - link

    This is the kind of GPU necessary to run truly immersive VR...
  • Sabresiberian - Friday, March 28, 2014 - link

    As I see it, there are 2 problems with this otherwise impressive piece of gear -

    1) It won't be all that long until Maxwell flagship products come out. I don't "need" to replace my graphics cards right now, and I'm going to wait for Maxwell in any case. Early indications are that it will be well worth waiting for.

    2) Price. 2 Titan Blacks would cost $800 less. 2 GTX 780 Ti's would cost as much as $1600 less. So, the only way this would make sense is if you wanted to install 2 of them on a mainboard that couldn't handle 4 separate discrete cards. And, really, how likely is someone who can afford $6000 for their video solution to own a mainboard with only 2 PCIe 3.0 slots?

    As far as it being some kind of productivity card and not intended for gaming - well, it might be used for that more often than for gaming, as are the previous Titans, but it is being sold by Nvidia as "the most powerful graphics card for gaming ever released" (I just received an Nvidia newsletter in which the Titan Z is described just that way). Despite Nvidia's billing though, I do think they are intending to reap the benefit of its suitability for purposes other than gaming, a part of the industry in which the graphics solutions cost a lot more than $3000.

    So, that $3000 might look extreme for someone building or upgrading a gaming rig (it certainly does to me), but looking at it from the perspective of a professional using it in a productivity scenario - it is a bargain. :)
  • mpdugas - Friday, March 28, 2014 - link

    ...and it still vents all that warmth INSIDE the case; the standard Nvidia cooler design is the best for avoiding that.
  • PoolePaul - Friday, March 28, 2014 - link

    Will this be able to run Minecraft in 1080p?
