Saturday 3 May 2014

Nvidia GTX Titan

Nvidia GeForce GTX TITAN Black Review: feat. ZOTAC

Manufacturer: Zotac
UK price (as reviewed): MSRP £785
US price (as reviewed): MSRP $999

Last week, along with the well-received GTX 750 Ti (and GTX 750), Nvidia quietly launched the GTX Titan Black into retail channels. Nvidia hasn't sampled the card to media outlets, but Zotac has kindly furnished us with one so we can see what it can do. While the launch of the original Titan was heavily covered and a fine example of how to wave willies, Titan Black's performance gains over the GTX 780 Ti in games are going to be minimal, hence the fuss-free launch. The original Titan, which this card outright replaces, is, after all, less a gaming product than an entry-level professional compute one – apparently only around half of those sold were ever used to play games at all.


So, what exactly is the point of GTX Titan Black? Essentially, it's to ensure that the top card in Nvidia's GeForce product stack is technically the best one available for both gaming and GPGPU workloads. Previously, that title was split between the GTX 780 Ti for gaming and the GTX Titan for compute, which was something of an oddity given the massive price tag of the latter card (which the GTX Titan Black will sell for too). That's not to say that GTX Titan Black will suddenly become an attractive card on the price-performance scale, however.

GTX Titan Black utilises a fully enabled Kepler GK110 GPU

To achieve this, as you can see in the specs table below, GTX Titan Black combines the best parts of the GTX Titan and the GTX 780 Ti to produce the ultimate GK110 product. Like the GTX 780 Ti, its 15 SMXs are all fully enabled, giving it 2,880 CUDA cores and 240 texture units over GTX Titan's 2,688 and 224. It also borrows the GTX 780 Ti's memory frequency of 1.75GHz (7GHz effective) for a 16.7 percent increase in memory frequency and bandwidth over GTX Titan.

GTX Titan Black also carries over the elements of GTX Titan's design that give it the edge in compute tasks. As such, it has a full 6GB of GDDR5, double that of the GTX 780 Ti, as well as uncapped double precision performance – in non-Titan GK110 products, Nvidia's drivers artificially limit the throughput of the FP64 CUDA cores to 1/8 of their potential (1/24 rather than 1/3 of the FP32 rate, as shown in the table below) as a way of distinguishing the product lines and marking Titan out as the main GeForce compute card by default.
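To put that cap in context, here's a rough back-of-the-envelope sketch of what it means for theoretical peak throughput. These are our own illustrative figures, not Nvidia's, assuming the usual two FLOPs per clock per CUDA core (one fused multiply-add) and the base clocks from the table below; real-world throughput will be lower.

    # Theoretical peak throughput; real-world numbers will be lower.
    def peak_gflops(cuda_cores, clock_ghz, fp64_ratio):
        fp32 = 2 * cuda_cores * clock_ghz      # 2 FLOPs/clock (FMA) per core
        return fp32, fp32 * fp64_ratio         # (FP32 GFLOPS, FP64 GFLOPS)

    # GTX Titan Black: FP64 runs at the full 1/3 of the FP32 rate.
    print(peak_gflops(2880, 0.889, 1 / 3))     # ~5,121 FP32 / ~1,707 FP64 GFLOPS

    # GTX 780 Ti: drivers cap FP64 at 1/24 of FP32, i.e. 1/8 of its potential.
    print(peak_gflops(2880, 0.876, 1 / 24))    # ~5,046 FP32 /   ~210 FP64 GFLOPS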

Card: Nvidia GeForce GTX Titan Black 6GB | Nvidia GeForce GTX 780 Ti 3GB | Nvidia GeForce GTX Titan 6GB | Nvidia GeForce GTX 780 3GB

GPU
Architecture: Kepler | Kepler | Kepler | Kepler
Codename: GK110 | GK110 | GK110 | GK110
Base Clock: 889MHz | 876MHz | 836MHz | 863MHz
Boost Clock: 980MHz | 928MHz | 876MHz | 900MHz
Stream Processors: 2,880 | 2,880 | 2,688 | 2,304
Layout: 5 GPCs, 15 SMXs | 5 GPCs, 15 SMXs | 5 GPCs, 14 SMXs | 4 GPCs, 12 SMXs
Rasterisers: 5 | 5 | 5 | 4
Tessellation Units: 15 | 15 | 14 | 12
Texture Units: 240 | 240 | 224 | 192
ROPs: 48 | 48 | 48 | 48
FP64 Performance: 1/3 FP32 | 1/24 FP32 | 1/3 FP32 | 1/24 FP32
Transistors: 7.1 billion | 7.1 billion | 7.1 billion | 7.1 billion
Die Size: 561mm² | 561mm² | 561mm² | 561mm²
Process: 28nm | 28nm | 28nm | 28nm

Memory
Amount: 6GB GDDR5 | 3GB GDDR5 | 6GB GDDR5 | 3GB GDDR5
Frequency: 1.75GHz (7GHz effective) | 1.75GHz (7GHz effective) | 1.5GHz (6GHz effective) | 1.5GHz (6GHz effective)
Interface: 384-bit | 384-bit | 384-bit | 384-bit
Bandwidth: 336GB/sec | 336GB/sec | 288GB/sec | 288GB/sec

Card Specifications
Power Connectors: 1 x 6-pin, 1 x 8-pin PCI-E | 1 x 6-pin, 1 x 8-pin PCI-E | 1 x 6-pin, 1 x 8-pin PCI-E | 1 x 6-pin, 1 x 8-pin PCI-E
Stock Card Length: 267mm | 267mm | 267mm | 267mm
TDP: 250W | 250W | 250W | 250W
Typical Street Price: £785 | £500 | £770 | £385
The full array of Nvidia's consumer GK110 products
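As a quick sanity check of the memory figures above: bandwidth is simply the bus width multiplied by the effective data rate, which is also where the 16.7 percent uplift over the original Titan comes from. A minimal illustrative calculation, not an official Nvidia figure:

    # Bandwidth = bytes per transfer (bus width / 8) x effective transfer rate.
    def bandwidth_gb_s(bus_width_bits, effective_rate_ghz):
        return bus_width_bits / 8 * effective_rate_ghz

    titan_black = bandwidth_gb_s(384, 7.0)   # 336.0 GB/sec
    titan       = bandwidth_gb_s(384, 6.0)   # 288.0 GB/sec
    print(f"{(titan_black / titan - 1) * 100:.1f}% more bandwidth")  # 16.7%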

The final piece of the puzzle is a clock speed bump. The base clock is up from 876MHz in the GTX 780 Ti to 889MHz here, making it the fastest reference GK110 product available. It has an extended boost curve as well, as the boost clock is listed at 980MHz compared to 928MHz on the GTX 780 Ti – our Titan Black consistently hit 1,058MHz in real workloads, so it's not afraid to get its Boost on. Ultimately, this is what will net it a small advantage over the GTX 780 Ti in games, as the extra VRAM is unlikely to do much while FP64 performance is irrelevant. Compute performance gains over GTX Titan will be more tangible, however.

We were disappointed to learn that the card hasn't been given a makeover to match its name – the “Black” in the product name had us hoping for a black anodised version, which we think would look phenomenal (though it's still a great-looking card). The heatsink and “Titan” engraving on the card are black now, but that's about it. As with GTX Titan, Nvidia does not permit its board partners to ship custom cooling configurations for GTX Titan Black.


GTX Titan Black uses the familiar 267mm GK110 PCB. It has a 6-pin/8-pin PCI-E power connection combination, dual SLI connections and the standard selection of video outputs: dual-link DVI-I, dual-link DVI-D, HDMI, DisplayPort (so G-Sync is supported). A metal contact plate cools the twelve front SK Hynix memory chips as well as the 6+2 phase power VRMs, while a vapour chamber and heatsink tame the GPU. A single radial fan provides airflow, and the closed design means heat is essentially all exhausted out of the rear I/O panel. The additional twelve memory modules on the rear of the PCB are left uncooled.


While there will be no design variation between board partners, bundles are likely to vary. For its part, Zotac supplies its Splinter Cell games bundle (including Double Agent, Conviction and Blacklist), as well as a DVI to VGA adaptor and a few power cable adaptors. It's nothing special but doesn't feel like it's lacking anything either.

While we'll be running the GTX Titan Black through our benchmarks, this isn't much of a review in the traditional sense. At £785, we can already tell you that the GTX Titan Black is, like its predecessor, poor value from a gaming standpoint, and that the GTX 780 Ti will match its performance after a minute of overclocking. Nevertheless, we still can't help but want to see what the world's new fastest gaming GPU can do.

Oculus Rift

Every time virtual-reality company Oculus brings a prototype of its Rift headset to a show, it takes another big step forward. And the prototype at this year’s CES may be the biggest leap yet.
Last January, Oculus arrived at its first CES with a degree of uncertainty. It hadn't yet released the developer-only Rift headset it had Kickstarted the previous fall. In fact, few outside the company had even seen it. Programmer John Carmack had brought an early prototype to videogame show E3 that summer, but since then there'd been radio silence as Oculus' bare-bones staff worked heads-down on the developer unit. Last year's CES was in many ways Oculus' coming-out party. As it turned out, plenty of people attended: the Rift snagged “Best of CES” awards from everyone and their mother (including WIRED, though our mothers weren't involved in the voting).
Since then, Oculus has continually improved and refined the Rift en route to a consumer release later this year. The display has been kicked up to 1080p; the form factor has become sleeker. Perhaps most importantly for adoption, potential latency has been greatly reduced, alleviating much of the “simulator sickness” that can accompany wearing VR headsets. And now, with another CES upon us, others are getting in on the act; Sony announced a new head-mounted display for movie viewing and games. It should be noted, though, that this is unlikely to be a direct competitor to the Rift — Sony’s unit gives wearers a 45-degree field of vision, compared to the Rift’s staggering 110 degrees.
Oculus unveiled even more this morning. There’s a new demo, courtesy of Epic Games. There’s a new AMOLED screen. There’s low persistence, a display technology that mitigates motion blur and “smearing,” both of which can contribute to user discomfort. For the first time, Rift is capable of positional tracking, which allows users to lean and move within the game environment by simply moving their head. And there’s a new prototype — known as “Crystal Cove” — that incorporates it all, getting latency down to around 30 milliseconds (on its way to the sub-20 threshold that Oculus considers the holy grail).

The new demo is visually similar to a previous demo that Oculus used throughout 2013 to show off the Rift’s immersive 360-degree playspace. Both were designed by Epic Games, and both occur within the universe of “Elemental,” Epic’s Unreal Engine 4 demo. The new demo places the user inside the same stone cave, facing the same horned lava-god/monster being as in the previous demo (bear with us here). This time, users play a top-down tower-defense scenario while the horned lava-god/monster guy watches. As in the previous demos, the visual effects are plentiful.
Unlike the previous demos, however, it monitors the user’s head movements in real space and translates them into not just orientation changes (looking up, down, or behind you) but also actual motion, which previously was possible only by using a game controller in conjunction with the Rift. It utilizes an “outside-in” system: an externally mounted camera tracks small LED lights on the prototype’s faceplate, adding three “degrees of freedom” (forward/backward, left/right, and up/down) to the Rift’s tracking ability. Up until now, developers and early Oculus adopters have only been able to accomplish this by taping a Razer Hydra motion controller to the side of their Rift headsets. Now, though, leaning down while playing the demo brings you closer to the tower-defense game and lets you watch the armies you control firing turrets and launching minions. It’s the first look at an untethered VR experience.
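As a rough illustration of what that combination amounts to, the camera supplies a position for the headset while the existing sensors supply an orientation, and together they define a full 6-DOF head pose for the renderer. This is our own minimal sketch, not Oculus SDK code; the function names and quaternion math are assumptions for illustration only.

    import numpy as np

    def rotate(q, v):
        # Rotate vector v by unit quaternion q = (w, x, y, z).
        w, x, y, z = q
        u = np.array([x, y, z])
        return v + 2.0 * np.cross(u, np.cross(u, v) + w * v)

    def head_pose(orientation_q, led_position_m):
        # Orientation (3 rotational DOF) comes from the headset's sensors;
        # position (3 translational DOF) comes from the camera tracking the
        # faceplate LEDs. Together they give the renderer a 6-DOF eye pose.
        forward = rotate(orientation_q, np.array([0.0, 0.0, -1.0]))
        return led_position_m, forward

    # Leaning 10cm forward while looking straight ahead:
    print(head_pose(np.array([1.0, 0.0, 0.0, 0.0]), np.array([0.0, 0.0, -0.10])))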
“We’ll need some seat belts for people,” says Oculus CEO Brendan Iribe. “You want to stand up, you want to walk around.”
The demo also highlights the display’s low persistence. In previous prototypes, turning your head quickly caused your surroundings to blur, an effect caused by the device registering new movement before the frame had a chance to update. Iribe describes it as “the wrong image being stuck to your face.” That’s effectively gone now.
“In the past,” Iribe says, “people would have to stop moving to stare at something. With low persistence, you can continue to stare at an object or read text while you’re moving your head.”
A second demo allows users to play EVE: Valkyrie, a space dogfighting game that’s part of the EVE Online universe. Oculus brought the demo to E3 last year on its non-HD prototypes, but the company has updated it with the new feature set and the 1080p screen.
Of course, Oculus being Oculus, how the Crystal Cove prototype accomplishes low persistence and 6-DOF tracking is subject to change.
“This is just a feature prototype,” Iribe says. “It’s not at all representative of the final consumer look and feel. Once we feel like something is good enough and we’re confident we’ll be able to ship it with the consumer product, we feel good about announcing it. We still may change how it’s done, but we feel great about the positional tracking system. It’s been a year in the works, we’ve tried multiple different approaches, and this delivered the experience we were looking for.”
Even the display is subject to change. That’s why Oculus won’t even cop to the vendor it’s using for the screen. “We first showed HD without committing to what exactly it would be,” Iribe says. “It’s at least going to be 1080p, but we don’t know what screen we’re going to use, what size. We didn’t even know the resolution.”
Someday, all of those questions will be answered. Until then, there’s CES.