
NVIDIA to release GeForce Titan

I just read through the Tom's article... Great read. What I took from it is that a 690 is still king if you're going for a one-card solution. However, the performance of SLI Titans at 5760x1080/1200 and 2560x1600 hit me in the weak spot.

Now that I've seen the benches I don't want anything other than two. Amazing performance.
FFS, don't listen to Tom's. It's so bad these days. Half the time they don't even understand what they're doing, and the other half they're too lazy to run repeat benches for consistency. It's total trash.

Anand/TechReport > PCPer/TPU/Nordic > Everyone Else
 
Interesting what HardOCP are saying about 7970 Crossfire

Once again we also tested at 1080p to see at what level the GeForce GTX TITAN was playable at. We were pleasantly surprised to find the game playable at Extreme AA setting at 1920x1080 on GeForce GTX TITAN. Extreme AA enables the highest level of AA possible with High FXAA and High SSAA. At these settings, we are running at the highest possible settings the game supports at 1080p in quality settings and AA. We were averaging 40 FPS and never dropped below 36 FPS. Once again, it looks like the HD 7970 GE would be playable, but similar to above, it just wasn't in actual real-world gaming.

In the apples-to-apples test we are running at the High AA setting, which the GeForce GTX TITAN has no trouble delivering a playable experience. Now, it looks like the Radeon HD 7970 GE would be playable with its level of performance, however, the game is laggy and feels choppy despite the framerate showing what would normally be a good level for playability. The actual experience was different than the framerates show, it felt a lot slower than the framerates were showing. TITAN didn't experience this; it was perfectly smooth with no lag.

Another reason I'm leaning towards the Titan rather than going mGPU.
 
Crossfire, no matter what the numbers say, still lags behind and is still something I'd avoid. It was truly bad back in the day, and it still has issues today.
 
Also, better 95th and 99th percentile frame times than the 690 at 1080p. I knew the FPS benches would be hiding this. This is, for all intents and purposes, a better-performing card than the 690.

skyrim-19x10-per.png

You cherry-picked that graph. The 690 has better frame times in BF3, Crysis 3, and Sleeping Dogs, and it's a wash in FC3, Dirt 3, and Skyrim (just looking at the 1920x1080 graphs). Overall the 690 is faster in both frame rate and frame times. There are a few specific titles where the Titan edges out the 690 a bit, but that's not the norm. The Titan will also be less constrained in high-memory situations, and it's the better option for those looking to go SLI Titans over SLI 690s (i.e., 4x SLI 680s).
 
You cherry-picked that graph. The 690 has better frame times in BF3, Crysis 3, and Sleeping Dogs, and it's a wash in FC3, Dirt 3, and Skyrim (just looking at the 1920x1080 graphs). Overall the 690 is faster in both frame rate and frame times. There are a few specific titles where the Titan edges out the 690 a bit, but that's not the norm. The Titan will also be less constrained in high-memory situations, and it's the better option for those looking to go SLI Titans over SLI 690s (i.e., 4x SLI 680s).
I looked over them more, and you're right.

But the bolded part is repeating yourself. Frame rate is just a really bad and inaccurate measurement of the latter.

FPS and frame times measure the same thing. The former is just an average of the latter over each second, so an 'average FPS' is an average of an average, which any statistician will tell you is a really bad thing to do.
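To make that concrete, here's a quick sketch with made-up numbers (not from any review) showing how a handful of 100 ms hitches barely dents the averages while jumping out in the 99th-percentile frame time:

```python
# Synthetic trace: 290 smooth 16.7 ms frames plus 10 big 100 ms hitches.
frame_times_ms = [16.7] * 290 + [100.0] * 10

total_time_s = sum(frame_times_ms) / 1000.0
true_fps = len(frame_times_ms) / total_time_s        # frames / elapsed time

avg_frame_time = sum(frame_times_ms) / len(frame_times_ms)

# 99th-percentile frame time: the level the slowest 1% of frames exceed
worst = sorted(frame_times_ms)[int(len(frame_times_ms) * 0.99)]

print(f"overall FPS:         {true_fps:.1f}")      # looks fine (~51 FPS)
print(f"mean frame time:     {avg_frame_time:.1f} ms")
print(f"99th pct frame time: {worst:.1f} ms")      # the hitches show up here
```

The averages say "~51 FPS, smooth", while the percentile exposes the 100 ms stutters, which is exactly why the percentile-based benches tell a different story.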

d3-19x10-per.png


sleep-19x10-per.png


skyrim-19x10-per.png


crysis3-19x10-per.png


bf3-19x10-per.png
 
FFS, don't listen to Tom's. It's so bad these days. Half the time they don't even understand what they're doing, and the other half they're too lazy to run repeat benches for consistency. It's total trash.

Anand/TechReport > PCPer/TPU/Nordic > Everyone Else

They're all going to say the same thing: that SLI Titans offer great performance at extreme resolutions.

Or rather, I hope they all do.
 
Those high-memory situations and scenarios can't be overstated.

These high-end next-gen games are going to bring 2GB cards to their knees over the next 18 months. Titan and similar cards will keep going where cards with nominally equivalent performance have to drop texture quality or resolution in order to maintain framerates.

I think the reviews do a disservice by not mentioning the significance of 2GB vs 6GB with the future looming. Sure, most games don't hit 2GB of VRAM yet, but how much longer will that be true of the premier titles and franchises these expensive GPUs are purchased to play?

I've said it before: I wouldn't buy a 2GB card right now unless I absolutely had to. 3GB minimum. Preferably, I'd wait until the next wave of cards comes out; those will undoubtedly start at 4GB or so. If you have enough GPU power to get you by until December and don't have the money (or interest) to buy a Titan, I'd wait and see what comes down the pike.
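For a rough sense of scale, here's some back-of-envelope math (assuming 4 bytes per pixel and a matching depth buffer; real engine usage varies a lot) on what render targets alone cost at these resolutions:

```python
# Rough VRAM cost of one render target, in MiB.
# Assumptions: 4 bytes/pixel (RGBA8), msaa multiplies the buffer size.
def target_mb(width, height, bytes_per_pixel=4, msaa=1):
    return width * height * bytes_per_pixel * msaa / (1024 ** 2)

for w, h in [(1920, 1080), (2560, 1600), (5760, 1080)]:
    color = target_mb(w, h, msaa=4)   # 4x MSAA color buffer
    depth = target_mb(w, h, msaa=4)   # matching depth/stencil buffer
    print(f"{w}x{h}: ~{color + depth:.0f} MB for one 4x MSAA color+depth target")
```

And that's before textures, shadow maps, and the extra intermediate buffers a deferred renderer keeps around, which is why 2GB starts looking thin at surround resolutions.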
 
Those high-memory situations and scenarios can't be overstated.

These high-end next-gen games are going to bring 2GB cards to their knees over the next 18 months. Titan and similar cards will keep going where cards with nominally equivalent performance have to drop texture quality or resolution in order to maintain framerates.

I think the reviews do a disservice by not mentioning the significance of 2GB vs 6GB with the future looming. Sure, most games don't hit 2GB of VRAM yet, but how much longer will that be true of the premier titles and franchises these expensive GPUs are purchased to play?

I've said it before: I wouldn't buy a 2GB card right now unless I absolutely had to. 3GB minimum. Preferably, I'd wait until the next wave of cards comes out; those will undoubtedly start at 4GB or so. If you have enough GPU power to get you by until December and don't have the money (or interest) to buy a Titan, I'd wait and see what comes down the pike.
That's because speculation is a bad thing to do in a review that is otherwise supposedly based on empiricism.
 
Interesting what HardOCP are saying about 7970 Crossfire

Another reason I'm leaning towards the Titan rather than going mGPU.

Doesn't the frame-latency graph for Crossfire look like what you get when Vsync switches between 30 and 60 fps? It's really jerky. Even playing the Crysis 3 beta on my poor 7870 at the very highest settings with full SMAA at like 20 fps, it still felt smoother than 40 fps jumping all over the place. That constant motion is really important; it's probably why I found (sub-)30 fps 360 games in my youth to be much smoother than much of what I was used to on PC.
 
Doesn't the frame-latency graph for Crossfire look like what you get when Vsync switches between 30 and 60 fps? It's really jerky. Even playing the Crysis 3 beta on my poor 7870 at the very highest settings with full SMAA at like 20 fps, it still felt smoother than 40 fps jumping all over the place. That constant motion is really important; it's probably why I found (sub-)30 fps 360 games in my youth to be much smoother than much of what I was used to on PC.
They don't use frame latency graphs. Those are just FPS over time. *edit* Unless I'm not understanding you properly here, which I think is the case.

*edit2*

Tech Report review up!

http://techreport.com/review/24381/nvidia-geforce-gtx-titan-reviewed

bl2-99th.gif


bl2-beyond-16.gif


gw2-99th.gif


gw2-beyond-16.gif


dogs-99th.gif


dogs-beyond-16.gif


ac3-99th.gif


ac3-beyond-16.gif


fc3-99th.gif


fc3-beyond-16.gif
 
The thing is, radial fans would just blow air out the top. Imagine that the white thing is a radial fan; the arrows show where the air would go. The actual airflow is more curved, and in a blower card the air is directed one way. I don't know how the Titan avoids blowing too much air out the rear, but I'm guessing it's much easier for the air to go over the big heatsink (tighter fins over the VRM), which forces the majority of it to cool the GPU.


BTW, that pic took way too much effort. I gave the center "fan" 1024 edges, and thanks to Cycles rendering with 150 samples and a huge ring light illuminating everything, it took my 3570K 35 seconds to render that simple graphic :P
The white background is part of the light source, with rays being rendered and reflected, and it's still flawed with some random dots because 150 samples wasn't quite enough. As you can see, I need a Titan :(


You're right. Maybe my picture doesn't have enough polys XD The idea is that the "door" must be before the fan, not just on top, so you can get more airflow. I need another picture.
 
They don't use frame latency graphs. Those are just FPS over time. *edit* Unless I'm not understanding you properly here, which I think is the case.

*edit2*

Tech Report review up!

http://techreport.com/review/24381/nvidia-geforce-gtx-titan-reviewed

I was talking about the early tests at Tech Report where Crossfire had a near-constant 30-20-30-20 frame-time pattern on the graph, which is similar to the 17-33-17-33 you'd get with bad Vsync. In my personal experience that felt even worse than C3 running at sub-30 fps.
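To put numbers on that: two traces with the exact same average FPS can feel completely different, because the eye notices the frame-to-frame swing, not the mean. A quick sketch with made-up frame times:

```python
# Two traces with the same average FPS but very different cadence.
steady = [25.0] * 120            # constant 25 ms frames  -> 40 FPS
alternating = [30.0, 20.0] * 60  # 30/20/30/20 ms pattern -> also 40 FPS average

for name, trace in [("steady", steady), ("alternating", alternating)]:
    fps = len(trace) / (sum(trace) / 1000.0)
    # largest frame-to-frame change is what reads as jerkiness
    swing = max(abs(a - b) for a, b in zip(trace, trace[1:]))
    print(f"{name}: {fps:.0f} FPS avg, worst frame-to-frame swing {swing:.0f} ms")
```

Both traces report 40 FPS, but the alternating one lurches by 10 ms every single frame, which is why an FPS-over-time graph can look fine while the game feels choppy.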
You're right. Maybe my picture doesn't have enough polys XD The idea is that the "door" must be before the fan, not just on top, so you can get more airflow. I need another picture.
IMO the best thing would be getting rid of the thin coolers and giving GPUs proper cooling à la what you see on CPUs. That will probably not happen in the near future though :(
Are you thinking about a thinner fan that could draw air from top and bottom like this very accurate GPU shroud?
I suck at modelling :/
 
I think I managed to convince myself to purchase one of these...

How does Nvidia's triple-monitor setup work? Can you run mismatched resolutions, or do all three have to be the same?
 
So much this, Scott Wasson is right on the money:

With that said, I can't help but think most PC gamers would give up the double-precision support and accept a plastic cooling shroud and "only" 3GB of onboard memory in exchange for a price that's several hundred bucks lower. That was essentially the deal when the GeForce GTX 580 debuted for 500 bucks—or, if you want to point to an even larger chip than the GK110, when the GTX 280 started life at $650. A premium product like the Titan is no bad thing, but we're kind of hoping Nvidia follows up with something slightly slower and little more affordable, as well. At present, your best value-for-dollar proposition in this space from Nvidia likely comes from dual GTX 680s in SLI, which should perform very much like a GTX 690 at a lower overall price.

IMO the best thing would be getting rid of the thin coolers and giving GPUs proper cooling à la what you see on CPUs. That will probably not happen in the near future though :(
Are you thinking about a thinner fan that could draw air from top and bottom like this very accurate GPU shroud?
Most non-reference coolers are extremely similar to CPU coolers in design premise.

Even then, the best way to keep multiple cards cool in SLI remains this:


Can't hear the fans on the cards when they don't have any!
 
I think I managed to convince myself to purchase one of these...

How does Nvidia's triple-monitor setup work? Can you run mismatched resolutions, or do all three have to be the same?

Well, usually the point of it is to run at 5760x1080 or something similar, making the three screens effectively one big screen. Then if a game needs 1920x1080 because it doesn't support the wider resolution, it just runs on the middle monitor and the surrounding two remain normal desktop.
 
TR Review is U..... Dammit!

Giving it a look over. Love them TR GPU reviews.
Seriously. It makes every other website p'much obsolete.

Also, pre-orders on EK blocks are up. Sk3tch, Smokey, don't disappoint me here.

http://www.frozencpu.com/newproducts/list/p1/201302/NewProducts-Page1.html
I wonder what performance will be like if I downsample with this card.
All the TechReport benches are at 1440p.

*edit*

Pre-Order available from NCIXUS

http://us.ncix.com/products/?sku=80597&vpn=06G-P4-2790-KR&manufacture=eVGA
 
Most non-reference coolers are extremely similar to CPU coolers in design premise.

Yeah, but they emulate the crappy top-down coolers that don't match any case fan configuration, and they have to be made super thin. I was thinking more along the lines of a tower cooler with the GPU on its side.

I wonder if AIO GPU cooling will take off with non-reference coolers, since even a thin 120mm rad destroys most non-reference coolers, and a thick rad can keep two 7970 GHz Editions running cool.
 
Yeah, but they emulate the crappy top-down coolers that don't match any case fan configuration. I was thinking more along the lines of a tower cooler with the GPU on its side.

I wonder if AIO GPU cooling will take off with non-reference coolers, since even a thin 120mm rad destroys most non-reference coolers, and a thick rad can keep two 7970 GHz Editions running cool.
It's growing in popularity, but becomes extremely unwieldy and ugly as soon as you have two cards in the system.

There's a large number of top down coolers that handily outperform a lot of tower coolers, like the C12/C14 from Noctua. There's also the Prolimatech MK-26, which is pretty much on par with most of those AIO ad-hoc setups.
Do you guys think there will be a Titan with a custom cooler à la the Gigabyte Windforce series? I really like those.
As Kharma said, they will not be allowing non-reference designs outside of specialized EVGA waterblocks (which are not great).
 
That's because speculation is a bad thing to do in a review that is otherwise supposedly based on empiricism.

Such matters should be mentioned, however. The amount of RAM shouldn't simply be skipped over just because the software to take advantage of it isn't out yet.
 
Well, usually the point of it is to run at 5760x1080 or something similar, making the three screens effectively one big screen. Then if a game needs 1920x1080 because it doesn't support the wider resolution, it just runs on the middle monitor and the surrounding two remain normal desktop.
I don't have 3 27" monitors though :(.

I guess I'd only be able to utilize the other 2 for normal Windows mode, and stick to the main 27" for gaming? Unless I wanted to drop my 27" to 1920x1080 for all 3, hmm...
 
Sorry if this has already been addressed, but if I'm looking to buy a single GPU, what are the disadvantages of picking a 690 over the Titan? I'm thinking seriously about getting one soon.
 
A single one barely offers a performance jump over what I have now (5-15%...). But when I factor in the overclocking headroom, I get so, so tempted.

Then there is 2 of them in SLI... drool.
 
If ya don't mind the Asus brand, Newegg's got their preorder up for 02/28:
http://www.newegg.com/Product/Product.aspx?Item=N82E16814121724&Tpk=asus titan
From the looks of it, this thing should be pretty easy to get compared to the GTX 690.
I actually prefer ASUS in general when it comes to video cards... but no ROG DCU cooler :|. I'm really not a fan of duct coolers like the one used on that reference card, but I hear this one might do OK in terms of noise.
 
Definitely do not opt for a GTX 690. I have 3x 1080p 120hz monitors as well, and I've powered them with 4-way GTX 680 2GB and 4-way GTX 680 4GB - it's not an overly pleasant experience. You want to go 3-way GTX Titan (or 2-way and upgrade later). Tri-SLI is the sweet spot and it seems NVIDIA is forcing that this round (see earlier in this thread - looks like they only support 3-way as max).

I believe SLI Titans could potentially be better.

But dats RIDICULOUS.

Appreciate the responses! And yep, SLI Titans is ridiculous. Or does it make more sense than anything ever?

The idea of knowing I could play anything at max settings anytime is so much more satisfying than ever actually playing anything at max settings.

It's a disease, and lack of money is the only cure.
 