NVIDIA Releases GeForce GTX Titan Z
by Ryan Smith on May 28, 2014 10:45 AM EST
Back in March at GTC 2014, NVIDIA announced their forthcoming flagship dual-GPU video card, the GeForce GTX Titan Z. Based on a pair of fully enabled GK110 GPUs, NVIDIA was shooting to deliver around twice the performance of a single Titan Black in a single card form factor.
At the time of NVIDIA’s initial announcement, GTX Titan Z was scheduled for release in April. April of course came and went with no official word from NVIDIA on why it was delayed, and now towards the tail end of May the card is finally up for release. To that end NVIDIA sent out a press release a bit ago announcing the card’s availability, along with putting up its product page and confirming its final specifications.
 | GTX Titan Z | GTX Titan Black | GTX 780 Ti | GTX Titan |
Stream Processors | 2 x 2880 | 2880 | 2880 | 2688 |
Texture Units | 2 x 240 | 240 | 240 | 224 |
ROPs | 2 x 48 | 48 | 48 | 48 |
Core Clock | 706MHz | 889MHz | 875MHz | 837MHz |
Boost Clock | 876MHz | 980MHz | 928MHz | 876MHz |
Memory Clock | 7GHz GDDR5 | 7GHz GDDR5 | 7GHz GDDR5 | 6GHz GDDR5 |
Memory Bus Width | 2 x 384-bit | 384-bit | 384-bit | 384-bit |
VRAM | 2 x 6GB | 6GB | 3GB | 6GB |
FP64 | 1/3 FP32 | 1/3 FP32 | 1/24 FP32 | 1/3 FP32 |
TDP | 375W | 250W | 250W | 250W |
Width | Triple Slot | Double Slot | Double Slot | Double Slot |
Transistor Count | 2 x 7.1B | 7.1B | 7.1B | 7.1B |
Manufacturing Process | TSMC 28nm | TSMC 28nm | TSMC 28nm | TSMC 28nm |
Launch Date | 05/28/14 | 02/18/14 | 11/07/13 | 02/21/13 |
Launch Price | $2999 | $999 | $699 | $999 |
First and foremost, with NVIDIA initially holding back on publishing some of the specifications of the GTX Titan Z at its announcement in March, we now have the final two pieces of the puzzle: the card’s official GPU clockspeeds, and the TDP. Our earlier estimation of the core clock, based on NVIDIA’s performance figures, turned out to be correct, with the card shipping at 706MHz. Meanwhile the boost clock is revealed to be at 876MHz.
This makes for an especially large delta between the base and boost clocks – 170MHz – which is consistent with the TDP-constrained nature of this card. NVIDIA’s last dual-GPU card, GTX 690, also had a larger than average clock delta, so this is not unexpected, though it is the widest delta we’ve seen yet. What this means is that the performance of GTX Titan Z is likely to be more TDP sensitive than that of GTX Titan Black; in TDP-heavy scenarios the card will have to fall back to lower clockspeeds more often, while in TDP-light scenarios it should still have the chance to perform near its maximum boost clock. NVIDIA doesn’t publish that maximum boost clock, however, so while from these figures it’s reasonable to expect GTX Titan Z to underperform GTX Titan Black in SLI on average, it’s not possible to tell how peak performance will compare.
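As a quick illustration of just how wide that gap is, here's a minimal sketch that tallies the base-to-boost deltas straight from the spec table above (published clocks only, not measured behavior):

```python
# Base and boost clocks (MHz) as published in the spec table above
clocks = {
    "GTX Titan Z":     (706, 876),
    "GTX Titan Black": (889, 980),
    "GTX 780 Ti":      (875, 928),
    "GTX Titan":       (837, 876),
}

for card, (base, boost) in clocks.items():
    # The boost-base delta is a rough proxy for how much clockspeed the card
    # may have to give up when it runs into its TDP limit
    print(f"{card}: boost-base delta = {boost - base} MHz")
# GTX Titan Z's 170MHz delta is nearly double that of any of the single-GPU cards
```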
Meanwhile we also have final confirmation of the card’s TDP. As we suspected back in March, NVIDIA has configured the card with a 375W TDP, putting it roughly 50% higher than a single GTX Titan Black and indicating that, along with the wider range of clockspeeds, NVIDIA is aggressively binning GPUs for this part. This lower TDP (375W versus the 500W of two GTX Titan Blacks in SLI) means that while we expect GTX Titan Z to underperform GTX Titan Black in SLI, it should also significantly undercut the latter setup’s power consumption, improving overall power efficiency.
Looking at the power delivery mechanism itself, NVIDIA has also sent over a shot of the bare board, along with a bit of information on how it’s configured. GTX Titan Z uses 12 power phases (six per GPU), which as we can see mostly reside at the center of the card between the two GPUs. Delivering power to these VRMs is a pair of 8-pin PCIe power sockets, which combined with the PCIe slot itself allow up to 375W to be drawn, matching the card’s TDP.
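As a sanity check, that TDP lines up exactly with what the connectors can deliver within spec; a back-of-the-envelope sketch using the standard PCIe power ratings (NVIDIA hasn't published this breakdown itself):

```python
# Standard PCIe power delivery limits (per the PCI Express specification)
PCIE_SLOT_W = 75        # power available from the x16 slot itself
EIGHT_PIN_W = 150       # power per 8-pin PCIe connector

board_power_limit = PCIE_SLOT_W + 2 * EIGHT_PIN_W
print(board_power_limit)        # 375 -> matches the card's 375W TDP

# And relative to a single 250W GTX Titan Black:
print(board_power_limit / 250)  # 1.5 -> the "roughly 50% higher" figure above
```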
This 375W beast will in turn be cooled via a triple slot cooler, owing to the greater amount of heat to dissipate. Triple slot cards are commonly seen in high-end partner designs, but this marks the first time we’ve seen a triple slot card as a reference design. The triple slot design is also notable because, when coupled with the cooler’s axial fan design, it further increases the amount of space the card occupies. Axial fan designs such as the one used on GTX Titan Z need a PCIe slot’s worth of breathing room to operate, which means that altogether GTX Titan Z is going to take up 4 slots of space. This in turn is notable because it means that in principle GTX Titan Z won’t save any space compared to GTX Titan Black in SLI; the latter uses a tried and true blower design that allows the cards to be used directly next to each other (though it’s not preferable), likewise consuming 4 slots of space in an SLI configuration.
Moving on, today’s announcement also brings confirmation of the I/O port configuration and the number of displays supported by the card. NVIDIA’s specs say that GTX Titan Z will support up to 4 displays, indicating that all I/O ports are being routed through a single GPU. However NVIDIA’s port selection is a strange one for a $3000 card: 1x DVI-I, 1x DVI-D, 1x DisplayPort, and 1x HDMI. This is admittedly us being picky, but the inclusion of the HDMI port on a $3000 card is genuinely odd. The DVI ports make sense in as much as they work with legacy DVI displays at a time when a DisplayPort-to-DL-DVI adapter is $100, but the HDMI port offers neither flexibility nor cost savings. Replacing the HDMI port with a second DisplayPort would grant the card far more flexibility – including driving a second 4K@60Hz monitor – all the while still allowing HDMI through a simple passive DisplayPort-to-HDMI adapter. But I digress…
As far as pricing and availability are concerned, as per NVIDIA’s initial announcement the GTX Titan Z is retailing at $2999 (ed: or about £2350 in the UK), making it NVIDIA’s most expensive GeForce card yet. We’ve seen announcements from MSI, Zotac, and EVGA so far, so it looks like a decent selection of NVIDIA’s partners will be selling the card, though it’s not clear at this time which regions each of those partners will be selling in. With the GTX Titan cards thus far, NVIDIA has only let a couple of partners sell each card, since these are identical, low-volume products. In any case availability is immediate, with Newegg already listing the EVGA card as in stock as of press time.
Of course it goes without saying that $3000 is going to be a steep price to pay for GTX Titan Z, compared to both the AMD and even the NVIDIA competition. A pair of GTX Titan Blacks would run for $2000, a full $1000 less, and as we discussed before the triple slot design of the GTX Titan Z doesn’t afford much in the way of space savings over dual slot cards. That doesn’t mean we’re writing off GTX Titan Z – NVIDIA is many things, and diligent about their research is one of those – but it will be interesting to see what end users and OEM/boutique builders do with the card. The benefits of GTX Titan Z over two single-GPU cards are not as cut and dried as with NVIDIA’s other dual-GPU cards, which means that it’s more of a lateral move than usual.
A big part of how GTX Titan Z is going to be used will in turn depend on who the buyer is. NVIDIA’s compute group is pushing GTX Titan Z as the ultimate compute card at the same time as their gaming group is pushing it as the ultimate gaming card, and like NVIDIA’s other Titan cards this product will be serving two masters. That said it’s clear from NVIDIA’s presentations and discussions with the company that they intend it to be a compute product first and foremost (a fate similar to GTX Titan Black), in which case this is going to be the single most powerful CUDA card NVIDIA has ever released. NVIDIA’s Kepler compute products have been received very well by buyers so far, including the previous Titan cards, so there’s ample evidence that this will continue with GTX Titan Z. At the end of the day the roughly 2.66 TFLOPS of double precision performance on a single card (more than some low-end supercomputers, we hear) is going to be a big deal, especially for users invested in NVIDIA’s CUDA ecosystem.
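For those wondering where that number comes from, the theoretical throughput can be estimated from the specifications in the table above; a rough sketch, with the caveat that NVIDIA doesn't state which clockspeed its quoted figure assumes (the base clock is used here):

```python
# Theoretical throughput estimate for GTX Titan Z, using the published specs
gpus = 2
cuda_cores = 2880           # per GPU
flops_per_core_clock = 2    # one fused multiply-add counts as two operations
base_clock_ghz = 0.706

fp32_tflops = gpus * cuda_cores * flops_per_core_clock * base_clock_ghz / 1000
fp64_tflops = fp32_tflops / 3   # FP64 runs at 1/3 the FP32 rate on this card
print(f"FP32: {fp32_tflops:.2f} TFLOPS, FP64: {fp64_tflops:.2f} TFLOPS")
# Roughly 8.1 and 2.7 TFLOPS at the base clock, in line with the ~2.66 TFLOPS quoted
```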
Gaming on the other hand looks to be murkier. Certainly GTX Titan Z can and will be used as a gaming card (expect to see this one prove popular in high-end boutique systems), but NVIDIA faces extremely stiff competition from AMD’s recently released Radeon R9 295X2, which at $1500 retails for half the price of GTX Titan Z. Given GTX Titan Z’s sub-Titan Black clockspeeds and higher price, NVIDIA faces an uphill battle here on price and performance, and in light of this it makes a lot of sense that GTX Titan Z is positioned first and foremost as a compute card. Nonetheless, with NVIDIA controlling around 2/3rds of the discrete GPU market and GTX Titan Z consuming around 25% less power (on paper), we certainly expect it to appear in gaming systems, especially in builds where price is no object and two of them can be installed.
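For reference, the "around 25% less power (on paper)" figure is simply the ratio of the two cards' rated board power; a trivial check, assuming the 500W rating commonly cited for the R9 295X2:

```python
titan_z_tdp = 375       # W, per NVIDIA's published specifications
r9_295x2_tdp = 500      # W, assumed rating for AMD's dual-GPU card

savings = 1 - titan_z_tdp / r9_295x2_tdp
print(f"{savings:.0%} less power on paper")   # 25%
```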
Wrapping things up, the launch of GTX Titan Z appears to be the capstone of Kepler’s career over at NVIDIA. While we will likely see rebadges and reconfigurations over the coming generations, with NVIDIA now shipping a dual-GPU GK110 card they have assembled virtually every Kepler combination possible. And with that Kepler goes out with a bang, while over the longer term we turn our eyes towards NVIDIA’s new Maxwell architecture and what it might mean for the high end once it makes its way into NVIDIA’s most powerful GPUs.
Spring 2014 GPU Pricing Comparison
AMD | Price | NVIDIA
 | $3000 | GeForce GTX Titan Z
Radeon R9 295X2 | $1500 |
 | $1100 | GeForce GTX Titan Black
 | $650 | GeForce GTX 780 Ti
Radeon R9 290X | $550 |
 | $500 | GeForce GTX 780
58 Comments
grahaman27 - Wednesday, May 28, 2014 - link
It's just a money grab.
dagnamit - Wednesday, May 28, 2014 - link
Yeah, they're in the business of making money. Nvidia thinks that's what the market will bear. They're probably right. Titans have proven quite popular in a market that they've created (CUDA). Now they're reaping the benefits of that effort. Good for them.
If you're buying this card for gaming, you're rich or dumb, or both. The biggest problem is that its best use case for gaming, small form factor 1440p/4K, is thrown out the window by the triple slot cooler.
ddriver - Wednesday, May 28, 2014 - link
OpenCL + ATi gives you about 10 times better bang for the buck. Only people looking to waste money will go for nvidia compute solutions.
MadMac_5 - Wednesday, May 28, 2014 - link
There's a lot of inertia in the scientific community, though. For the longest time CUDA was the only API that was well documented enough to use, and as such there would be a lot of resistance to changing it. When code is written and released for free by academics or developed by a grad student/postdoc researcher who moves on to somewhere else after 2-4 years, it's a MASSIVE pain to re-write it for a different API like OpenCL. Hell, there's still legacy FORTRAN code running around the Physics community partly because no one has been able to dedicate the time to port it to something newer and make it as efficient.
SolMiester - Wednesday, May 28, 2014 - link
for about 100 x less the applications!
heffeque - Thursday, May 29, 2014 - link
You only need it to run your own scientific application, not everybody else's apps.
TheJian - Friday, May 30, 2014 - link
ROFL@OpenCL doing ANYTHING better than cuda. Only people wanting to waste money buy cards for opencl. CUDA gets real work done. I don't see AMD chanting OpenCL benchmark numbers VS CUDA at all. That means it sucks. Ryan etc won't test them vs each other either, so again it SUCKS or he would run it to make Cuda look bad. The problem is YOU CAN'T...ROFL. I'm not a fan of proprietary stuff unless it's better, and in this case it is. Gsync to, until proven otherwise by AMD (I suspect their solution will still have lag, and they'll wait on scalers just like NV who built their own rather than waiting). Cuda is taught in 100's of universities DUE to billions in funding from NV. It's in hundreds of pro apps.
In work, I don't care who it is I'm using, I just want to be faster. OpenCL isn't. Free crap generally is...crap. People don't like working for free, get over it or live with 2nd best for life.
Also AMD must be money grabbing too right? 2x290x is not $1500. But with a worse name in the business you can't charge what the king does. See Intel vs. AMD. Same story. You pay a premium for a premium product. Phase3 drivers for AMD means no $3000 money grab. People seem to miss why AMD can't charge what NV does. R&D like cuda, Gsync, drivers NOT in phase3...Get the point people? This isn't a comment on who I like, just stating the facts of business. Apple had a premium for a few years, but then everyone got retina like screens, fast chips etc etc...Now price parity at the top basically. If AMD stops getting phase 3 drivers, bad ref designs that can run as advertised, etc..etc...They will be able to charge more.
Evilwake - Sunday, June 8, 2014 - link
I don't use CUDA to play games these are suppose to be gaming cards or at least that's what Nvida is saying they are if they are not where are the pro drivers for it. You fanboys believe crap smells like grapes as long as its your team saying so, its a over price card that wants to be a gaming card and wants to be a pro card for that money if I need a pro card which come with pro drivers i spend the little extra for the reall thing.
Flunk - Thursday, May 29, 2014 - link
I don't think it's just that. I think they don't expect many people to buy them. It's there as a halo product to get people to buy their less expensive cards.
If you're a gamer going over 2x GTX 780 Ti doesn't make a lick of sense financially and if you're buying for compute multiple Titan Blacks makes more sense.
dstarr3 - Wednesday, May 28, 2014 - link
As a compute card, the price is generally pretty fair. You could spend a lot more. But as a gaming card, it's a right ripoff.