Meet The GeForce GTX Titan

As we briefly mentioned at the beginning of this article, the GeForce GTX Titan takes a large number of cues from the GTX 690. Chief among these is that it’s a luxury card, and as such is built to similar standards. Consequently, like the GTX 690 before it, Titan is essentially in a league of its own when it comes to build quality.

Much like the GTX 690’s cooler was an evolution of the GTX 590’s, Titan’s cooler is an evolution of the cooler found on the GTX 580. This means we’re looking at a card roughly 10.5” in length using a double-wide cooler. The basis of Titan’s cooler is a radial fan (blower) sitting towards the back of the card, with the GPU, RAM, and most of the power regulation circuitry in front of the fan. As a result the bulk of the hot air generated by Titan is blown forward and out of the card. However it’s worth noting that, unlike most other blowers, the back side technically isn’t sealed, and while there is relatively little circuitry behind the fan, it would be incorrect to state that the card is fully exhausting. With that said, leaving the back side of the card open seems to be more about noise and aesthetics than heat management.

Like the GTX 580 but unlike the GTX 680, heat transfer is provided by a nickel-tipped aluminum heatsink attached to the GPU via a vapor chamber. We typically only see vapor chambers on premium cards due to their greater cost, or when space is at a premium. NVIDIA seems to be pushing the limits of heatsink size here, with the fins on Titan’s heatsink actually running beyond the base of the vapor chamber. Meanwhile, providing the thermal interface between the GPU itself and the vapor chamber is a silk-screened application of a high-end Shin-Etsu thermal compound; NVIDIA claims this compound offers over twice the performance of the GTX 680’s grease, although of all of NVIDIA’s claims this is the one we’re least able to validate.

Moving on, catching what the vapor chamber doesn’t cover is an aluminum baseplate that runs along the card, providing not only structural rigidity but also cooling for the VRMs and for the RAM on the front side of the card. Baseplates aren’t anything new for NVIDIA, but again this is something we don’t see a lot of except on their more premium cards.

Capping off Titan we have its most visible luxury aspects. Like the GTX 690 before it, NVIDIA has replaced virtually every bit of plastic with metal for aesthetic/perceptual purposes. This time the entire shroud and fan housing is composed of cast aluminum, which NVIDIA tells us is easier to cast than the mix of aluminum and magnesium the GTX 690 used. Meanwhile the polycarbonate window makes its return, allowing you to see Titan’s heatsink solely for the sake of it.

As for the back side of the card, in keeping with most of NVIDIA’s cards Titan runs with a bare back. The GDDR5 RAM chips don’t require any kind of additional cooling, and a metal backplate, while making for a great-feeling card, occupies precious space that in tight builds is better left open for airflow.

Moving on, let’s talk about the electrical details of Titan’s design. Whereas the GTX 680 used a 4+2 power phase design – 4 power phases for the GPU and 2 for the VRAM – Titan improves on this by moving to a 6+2 power phase design. I suspect the most hardcore of overclockers will be disappointed with Titan only having 6 phases for the GPU, but for most overclocking purposes this should be enough.

Meanwhile for RAM it should come as no particular surprise that NVIDIA is once more using 6GHz RAM. Specifically, NVIDIA is using 24 Samsung 2Gb 6GHz modules, which add up to the 6GB of RAM we see on the card; 12 modules sit on the front of the PCB with the other 12 on the rear. The overclocking headroom on 6GHz RAM seems to vary from chip to chip, so while Titan should have some memory overclocking headroom it’s hard to say just what the combination of luck and the wider 384-bit memory bus will allow.
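
For those keeping score, here’s a quick back-of-the-envelope sketch of how those figures add up. The module count, density, data rate, and bus width are taken from the paragraph above; the resulting ~288GB/s is in line with Titan’s rated memory bandwidth:

```python
# Sanity-check Titan's memory configuration using the figures quoted above.

modules = 24           # 12 chips on the front of the PCB, 12 on the rear
density_gbit = 2       # each Samsung module is 2Gb (gigabits)
capacity_gbyte = modules * density_gbit / 8
print(f"Total VRAM: {capacity_gbyte:.0f} GB")               # -> 6 GB

data_rate_gbps = 6     # "6GHz" GDDR5 = 6Gbps effective per pin
bus_width_bits = 384   # Titan's memory bus width
bandwidth_gb_s = data_rate_gbps * bus_width_bits / 8
print(f"Peak memory bandwidth: {bandwidth_gb_s:.0f} GB/s")  # -> 288 GB/s
```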

Providing power for all of this is a pair of PCIe power sockets, a 6-pin and an 8-pin, which together with the PCIe slot give the card a combined total of 300W of capacity. With Titan only having a TDP of 250W in the first place, this leaves quite a bit of headroom before ever needing to run outside of the PCIe specification.
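
As a minimal sketch of where that 300W figure comes from, assuming the standard PCIe power limits of 75W from the slot, 75W from a 6-pin connector, and 150W from an 8-pin connector:

```python
# Rough power-budget math for Titan, assuming standard PCIe limits.

slot_w = 75          # PCIe x16 slot
six_pin_w = 75       # 6-pin PCIe power connector
eight_pin_w = 150    # 8-pin PCIe power connector
titan_tdp_w = 250    # Titan's rated TDP

capacity_w = slot_w + six_pin_w + eight_pin_w
print(f"In-spec power available: {capacity_w} W")           # -> 300 W
print(f"Headroom above TDP: {capacity_w - titan_tdp_w} W")  # -> 50 W
```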

At the other end of Titan we can see that NVIDIA has once again gone back to their “standard” port configuration for the GeForce 600 series: two DL-DVI ports, one HDMI port, and one full-size DisplayPort. Like the rest of the 600 family, Titan can drive up to four displays, so this configuration is a good match. Though I would still like to see two mini-DisplayPorts in place of the full-size DisplayPort, in order to tap the greater functionality DisplayPort offers through its port conversion mechanisms.

Comments

  • mrdude - Tuesday, February 19, 2013 - link

    I doubt it, given the transistor count and die size. This thing isn't exactly svelte, with 7.1 billion transistors. The viable-chips-per-wafer must be quite low, hence the price tag.

    What I don't understand is why people would buy a $1000 GPU for compute. I can understand why somebody buys a ~$300 GPU to add a little extra horsepower to their small selection of applications, but if you're paying $1000 for a GPU then you're also expecting a decent set of drivers as well. But both AMD and nVidia have purposely neutered their consumer cards' performance for most professional tasks and applications. As a result, you can buy a cheaper FirePro or Quadro with professional drivers based on a smaller die/GPU (like a 7850 or 660 Ti) that will outperform this $1000 single GPU card in a variety of software.

    If I'm paying upwards of $1000 for a GPU, it sure as hell has to work. Buying a consumer grade GPU and relying on consumer (gaming) drivers just means that you'll almost never hit anywhere near the max theoretical throughput of the card. In essence, you're paying for performance which you'll never get anywhere close to.

    This is a perfect card for the fools who overspend on their gaming GPUs. For everyone else it's just a high-priced bore.
  • CeriseCogburn - Sunday, February 24, 2013 - link

    All those fools, we have been told over and over, and in fact very recently by the site's own, are here!

    That's what this is for, dimwit. Not for crybaby losers who can barely scrape up an HD 5750.

    Let's face it, every one of you whining jerks is drooling uncontrollably for this flagship, and if you're just a loser with a 450W power supply, no worries, they're being sold in high priced systems with that.

    You'd take it in a minute, happily, and max out your games and your 1920x1080 monitor in MOST games.

    I mean I have no idea what kind of poor all you crybabies are. I guess you're all living in some 3rd world mudhole.
  • madmilk - Thursday, February 21, 2013 - link

    They're clearly not in any kind of hurry, given how well Tesla is selling at 3 times the price. These are probably just the rejects, set to a higher voltage and TDP and sold to the consumer market.
  • mrdude - Thursday, February 21, 2013 - link

    Oh yea, nVidia is never going to jeopardize the cash cow that is the Tesla for the HPC crowd, or Quadro for the professional market. The margins there aren't worth giving up in order to bring GPU compute (and its drivers) to the mass market.

    This notion that this is a GPGPU card is silly, frankly. We can throw around the max theoretical GFLOPS/TFLOPS figures all we please, but the reality is that you'll never see anywhere close to those in professional applications. There are two reasons for that: Tesla and Quadro.
  • chizow - Tuesday, February 19, 2013 - link

    Yeah, totally agree with the post title, Nvidia has lost their fking minds.

    And PS: The X-Men *STILL* want their logo back.
  • CeriseCogburn - Sunday, February 24, 2013 - link

    This isn't 19G80 Kansas anymore, Dorothy.

    Do any of you people live in the USA?

    I mean really, how frikkin poor are all you crybabies, and how do you even afford any gaming system or any games?

    Are you all running low end C2D still, no SSDs, and 1280x1024, do you live in a box?

    How can you be in the USA and whine about this price on the very top end product for your lifetime hobby?

    What is wrong with you, is the question.
  • Pariah - Tuesday, February 19, 2013 - link

    In most cases, this card won't make sense. There are at least a couple of scenarios where it might make sense. One, in an ultra high-end gaming system. That means multiple Titan cards. Because these are single GPU cards, an SLI Titan setup should scale much better than an SLI 690 setup with 4 GPUs would, and that point goes even further with triple SLI Titans.

    Secondly, this card is smaller and uses less power than a 690, which means you can use it in much smaller cases, even some mini-ITX cases. That would be one helluva nice portable LAN box.
  • CeriseCogburn - Sunday, February 24, 2013 - link

    This card makes sense for anyone running a mid sandy bridge and 1920x1080 monitor.
    After I complained about the 1920X1200 reviews here, pointing out nVidia is 12% BETTER compared to amd in the former resolution, 50 raging amd fanboys screeched they have a 1920X1200 monitor they run all the time and they were more than willing to pop the extra $150 bucks for it over the 1920x1080...

    So we can safely assume MOST of the people here have a 1920x1080, for Pete's sake.
    A low-end Sandy is $50 to $80, same for a board, and DDR3 is the cheapest RAM.
    So for less than $200 max to prepare (use your old case + PSU), near everyone here is ready to run this card, and would find benefit from doing so.

    Now lying about that just because they don't plan on buying one is what most here seem to want to do.

  • Deo Domuique - Friday, March 8, 2013 - link

    This card should cost ~$600-650. Not a single cent more. The rest is à la Apple markup for the mindless consumer. Unfortunately, there are a lot of them.
  • trajan2448 - Tuesday, February 19, 2013 - link

    Obviously a great piece of technology. Interested to see what the overclockers can achieve.
    If it were $700 it would make a lot more sense. Nonetheless, fun to see some fanatics do a tri-SLI overclock and blow up their monitor.
