Titan X Launch / Review / Tears Thread

Tier 3 isn't a term used by MS or the DX group at all, from what I understand. Also... isn't that slide considered unofficial and possibly FUD?

Come on, we all know this stuff is conjecture and possibility. Based on history, though, it's more likely than not that Microsoft will add a Tier 3.
 
Come on, we all know this stuff is conjecture and possibility. Based on history, though, it's more likely than not that Microsoft will add a Tier 3.

I wouldn't doubt that something similar may occur, but at the same time it may not happen (who knows!), and it could well be inconsequential (like DX 10.1's adoption and use, but even more specific this time).

Back on topic... it is looking more and more like I may be joining the Titan X club. It doesn't help that my GF is egging me on regarding the purchase.
 
I wouldn't doubt that something similar may occur, but at the same time it may not happen (who knows!), and it could well be inconsequential (like DX 10.1's adoption and use, but even more specific this time).

Back on topic... it is looking more and more like I may be joining the Titan X club. It doesn't help that my GF is egging me on regarding the purchase.

I had a similar problem with my GF, best of luck!
 
Back on topic... it is looking more and more like I may be joining the Titan X club. It doesn't help that my GF is egging me on regarding the purchase.

Your GF wants you to spend a grand on a graphics card? You are one helluva lucky guy. Hold on to her! :D

I had a similar problem with my GF, best of luck!

Wait your GF was encouraging you too? And that's a problem? Is today opposite day?
 
Your GF wants you to spend a grand on a graphics card? You are one helluva lucky guy. Hold on to her! :D



Wait your GF was encouraging you too? And that's a problem? Is today opposite day?

Sorry! Misread that! English isn't my native language, so I assumed "egging me on" meant something like "holding me back". My bad. To this day, luckily, she hasn't found out about the Titan.
 
Is it true that the Titan X doesn't support all DX12 features in hardware? I'm not sure I want to spend so much on something that's going to run into that issue so soon.
No. Maxwell 2 fully supports the highest feature level of DX12 - FL12.1.

Tier 3 isn't a term used by MS or the DX group at all, from what I understand. Also... isn't that slide considered unofficial and possibly FUD?
Oh, there is a Tier 3, but it relates only to the support requirements of several individual features, not DX12 in general. There is no Tier 3 requirement in FL12.1, and there are no higher FLs, so such support is basically something "extra" on top of the highest DX12 feature level. Also, feature tiers are basically just numbers: if one GPU supports Tier 1 of a feature and another supports Tier 3 of the same feature, they both support that feature, but the amount of things that can be done with it differs.

Come on, we all know this stuff is conjecture and possibility. Based on history, though, it's more likely than not that Microsoft will add a Tier 3.

a. Read above.
b. A GPU must support one of four feature levels to be DX12 compatible. Supporting the highest tier of only one feature really means nothing on its own.
c. They could add a new feature level, sure - let's call it "12.2". But since they haven't done so already, it's quite possible there won't be any h/w capable of supporting it by launch time in June-August. That likely means Fiji - the only new GPU coming out in this timeframe - is either FL 12.1 or even 12.0. Thus there is no reason to add a higher feature level.
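
To make the per-feature-tier point concrete, here is a minimal C++ sketch (an illustration, not from anyone in this thread; it assumes the Windows 10 SDK headers and linking against d3d12.lib) that asks the D3D12 runtime which tier of a few such features the installed GPU supports:

```cpp
// Minimal sketch (assumes the Windows 10 SDK; link against d3d12.lib).
// Queries the per-feature tiers discussed above via CheckFeatureSupport.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    // Ask for the FL12.1 level discussed above; this fails on lesser hardware.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_1,
                                 IID_PPV_ARGS(&device)))) {
        printf("No FL12.1-capable device available\n");
        return 1;
    }
    D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS,
                                              &opts, sizeof(opts)))) {
        // Tiers are per-feature capability numbers, not an overall "DX12 tier".
        printf("Resource binding tier:        %d\n", (int)opts.ResourceBindingTier);
        printf("Tiled resources tier:         %d\n", (int)opts.TiledResourcesTier);
        printf("Conservative rasterizer tier: %d\n", (int)opts.ConservativeRasterizationTier);
    }
    return 0;
}
```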
 
I wouldn't doubt that something similar may occur, but at the same time it may not happen (who knows!), and it could well be inconsequential (like DX 10.1's adoption and use, but even more specific this time).

Back on topic... it is looking more and more like I may be joining the Titan X club. It doesn't help that my GF is egging me on regarding the purchase.

Damn, now that's a lucky guy. While I can't complain about my wifey, if I said I was dropping a grand on a video card... there may be some complications ;)
 
Anyone willing to share some ASIC Quality and OC values?

Currently testing +200MHz on the core, boosting to 1400MHz; so far it seems stable.

ASIC quality = 78.9%
 
Sorry! Misread that! English isn't my native language, so I assumed "egging me on" meant something like "holding me back". My bad. To this day, luckily, she hasn't found out about the Titan.

Oh. Haha! Secret stays with GAF! ;)

No. Maxwell 2 fully supports the highest feature level of DX12 - FL12.1.


Oh, there is a Tier 3, but it relates only to the support requirements of several individual features, not DX12 in general. There is no Tier 3 requirement in FL12.1, and there are no higher FLs, so such support is basically something "extra" on top of the highest DX12 feature level. Also, feature tiers are basically just numbers: if one GPU supports Tier 1 of a feature and another supports Tier 3 of the same feature, they both support that feature, but the amount of things that can be done with it differs.



a. Read above.
b. A GPU must support one of four feature levels to be DX12 compatible. Supporting the highest tier of only one feature really means nothing on its own.
c. They could add a new feature level, sure - let's call it "12.2". But since they haven't done so already, it's quite possible there won't be any h/w capable of supporting it by launch time in June-August. That likely means Fiji - the only new GPU coming out in this timeframe - is either FL 12.1 or even 12.0. Thus there is no reason to add a higher feature level.

Well explained. Will probably quote you when it inevitably comes up again :)
 
Your GF wants you to spend a grand on a graphics card? You are one helluva lucky guy. Hold on to her! :D



Wait your GF was encouraging you too? And that's a problem? Is today opposite day?

Yeah seriously wtf lol. I have to explain why spending over $500 on a card is worth it.
Or just flat out lie about how much I spent.
 
I don't even know what ASIC means....

"The ASIC Quality screenshot on the right can be evoked from GPU-Z's context menu and is individual for each graphics card and GPU. This feature has been developed for Nvidia’s Fermi (GX10x and GF11x) and AMD’s Southern Islands chips (Radeon HD 78xx and HD 79xx) and is supposed to indicate the quality of the specific GPU, in percent, based on electrical leakage data.

The GPU of our sample of the card has an ASIC quality of 76.6%. The higher this number, the lower voltage the GPU needs to work at the default clock rate and the higher overclocking results you can get with it by increasing its voltage.


According to Alexey Nikolaichuk (the author of RivaTuner and MSI Afterburner), the correlation between voltage and quality is like follows:

ASIC quality < 75% - 1.1750 V;
ASIC quality < 80% - 1.1125 V;
ASIC quality < 85% - 1.0500 V;
ASIC quality < 90% - 1.0250 V;
ASIC quality &#8804; 100% - 1.0250 V."
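
Purely as an illustration, the quoted bins boil down to a lookup like this C++ sketch (the function name is invented; the voltages and cutoffs are exactly the ones quoted above):

```cpp
// Toy lookup over the quoted quality-to-voltage bins; illustration only.
#include <cstdio>

double defaultVidForAsicQuality(double qualityPercent) {
    if (qualityPercent < 75.0) return 1.1750; // leakier chip, higher default voltage
    if (qualityPercent < 80.0) return 1.1125;
    if (qualityPercent < 85.0) return 1.0500;
    return 1.0250;                            // per the quoted table, 85-100% share one bin
}

int main() {
    // The two sample values mentioned in this thread:
    printf("78.9%% -> %.4f V\n", defaultVidForAsicQuality(78.9)); // 1.1125 V
    printf("66.4%% -> %.4f V\n", defaultVidForAsicQuality(66.4)); // 1.1750 V
    return 0;
}
```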
 
Mine says 66.4%. Which is probably shit, knowing my luck.

I have mine overclocked to +220 core, +400 memory, +112mV, 110% power. I THINK it's stable, but sometimes my games seem to crash for no damn reason. I attribute that to a likely unstable overclock... but otherwise it seems fine. Temps never exceed 70 degrees with my fan profile... benchmarks all complete with no problems. But I doubt I could ever push the overclock any higher.
 
Mine says 66.4%. Which is probably shit, knowing my luck.

I have mine overclocked to +220 core, +400 memory, +112mV, 110% power. I THINK it's stable, but sometimes my games seem to crash for no damn reason. I attribute that to a likely unstable overclock... but otherwise it seems fine. Temps never exceed 70 degrees with my fan profile... benchmarks all complete with no problems. But I doubt I could ever push the overclock any higher.

Benchmarks seemed stable for me at +220 and +400, but games kept on crashing randomly until I lowered the core overclock to +75, and memory to +250. Voltage tweaks didn't improve anything. Now, how's that for luck? Yeah, I'm seriously considering donating this particular card to my dad to replace his original Titan, and picking up an EVGA SC when they're in stock here in Vancouver.
 
Has Amazon started shipping these bad boys yet?

Not sure. I was wondering the same thing. I ordered an EVGA Titan X a week ago from Amazon and so far I have not been given a shipping date. Anyone else have any luck with Amazon or hear anything on when they will be getting more in?
 
Mine says 66.4%. Which is probably shit, knowing my luck.

I have mine overclocked to +220 core, +400 memory, +112mV, 110% power. I THINK it's stable, but sometimes my games seem to crash for no damn reason. I attribute that to a likely unstable overclock... but otherwise it seems fine. Temps never exceed 70 degrees with my fan profile... benchmarks all complete with no problems. But I doubt I could ever push the overclock any higher.

66.4 is on the low end. I think 59 is as low as it goes.
 
Leak testing, just a couple more hours!

[image: dsc01254q3pcd.jpg]
 
12.3? Yeah. That's the FUD I was referring to. 12.2 doesn't exist, let alone 12.3. That's what happens when leaks are misinterpreted by sites that don't know what they are talking about.

This was precisely why I asked him for a source. Not about being "touchy". Some of us just like clarification. Thanks for offering that!
 
One of the top features of the Titan X is the ability to let people on the internet know you own a Titan X. Compound scaling bonus for each Titan X after the first.
 
ASICs are 69.2% and 73.8% - they boost to 1430MHz in SLI; standard BIOS, NVIDIA reference cards.

I bet you are already power limited like me and many others. I'm keeping myself from flashing a higher TDP limit, and I don't know how much longer I can resist.
 
That's a weird loop. You've got CPU and GPUs adjacent, rather than separated by a radiator. The components will be receiving water that hasn't been cooled. Why did you make that decision?

That's not how custom loops work. As long as the res is directly feeding the pump, the order of the rest of the components doesn't really matter for temps/performance: the coolant cycles through the loop so quickly that it settles at a roughly uniform temperature everywhere. I don't think I've ever seen a custom build with a rad in between the GPU(s) and CPU.
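
A rough way to see why: at typical flow rates the coolant warms only about a degree crossing even a hot GPU block, so whichever component sits downstream barely notices. A toy C++ calculation (every figure here is an assumption, not a measurement from this build):

```cpp
// Toy steady-state estimate: coolant temperature rise across a block is
// dT = Q / (m_dot * c). All wattages and the flow rate are assumptions.
#include <cstdio>

int main() {
    const double flowKgPerSec = 0.06;  // roughly 1 GPM, a common loop flow rate
    const double cWater = 4186.0;      // specific heat of water, J/(kg*K)
    const double cpuWatts = 150.0;     // assumed CPU heat dump
    const double gpuWatts = 250.0;     // assumed GPU heat dump

    printf("Water dT across CPU block: %.2f C\n", cpuWatts / (flowKgPerSec * cWater)); // ~0.6 C
    printf("Water dT across GPU block: %.2f C\n", gpuWatts / (flowKgPerSec * cWater)); // ~1.0 C
    // The downstream block sees water at most ~1 C warmer, which is why
    // loop order has a negligible effect on component temperatures.
    return 0;
}
```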
 
Finally finished building and benching my new 5960X, 2x TITAN X, 16GB-2133 system. Found it could do 4.5GHz with decent RAM and CPU cache overclocks, but the temps and noise on the Nepton 120XL weren't fun. Switched down to 4.4GHz to hit a peak of 80-82C under 100% load for 1 hour+ at acceptable noise when using headphones, making it even quieter during gaming.

Similarly, dialled back my GPU clocks to maintain a relatively quiet system, in the end settling on 1,237MHz boost, with 1,853MHz memory and a modified fan curve.

Fire Strike Extreme: http://www.3dmark.com/3dm/6457783 (should be #18 when the HoF updates)
Fire Strike Ultra: http://www.3dmark.com/3dm/6458189 (#13 when HoF updates)

Extreme went from 9663 with OC'd i7-2600K and OC'd 980s, to 12398 with OC'd 5960X and OC'd 980s, to 16338 with OC'd 5960X and OC'd TITAN Xs.
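
Spelling out the arithmetic on those quoted Fire Strike Extreme scores (a throwaway C++ sketch):

```cpp
// Relative gains implied by the three quoted Fire Strike Extreme scores.
#include <cstdio>

int main() {
    const double oc2600kOc980s  = 9663.0;   // OC'd i7-2600K + OC'd 980s
    const double oc5960xOc980s  = 12398.0;  // OC'd 5960X + OC'd 980s
    const double oc5960xTitanXs = 16338.0;  // OC'd 5960X + OC'd TITAN Xs

    printf("CPU/platform upgrade: +%.1f%%\n", (oc5960xOc980s / oc2600kOc980s - 1.0) * 100.0);  // ~28%
    printf("GPU upgrade:          +%.1f%%\n", (oc5960xTitanXs / oc5960xOc980s - 1.0) * 100.0); // ~32%
    printf("Combined:             +%.1f%%\n", (oc5960xTitanXs / oc2600kOc980s - 1.0) * 100.0); // ~69%
    return 0;
}
```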

4-way isn't officially supported by NVIDIA

It is for TITAN X (our deep learning box is configured with 4-Way): http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-titan-x/specifications Perf will be limited by CPU, game engine, game's specific coding, and SLI profile. Does Tomb Raider 2013 scale with 4-Way? That game's usually the poster child for multi-GPU scaling.
 
I may have to start planning for an X99 + 5960X combo to pair with my cards. Getting stomped in benchmarks!

Love my Rampage IV Black Edition though
 
Supposedly they've made some sort of hardware change to fix the issue with my motherboard (and 3 others).

So I'm waiting on the new batch of cards to ship! On one card till then :p
 
Anyone willing to share some ASIC Quality and OC values?

Currently testing +200MHz on the core, boosting to 1400MHz; so far it seems stable.

ASIC quality = 78.9%

Damn, that is high for ASIC. Both of my cards are 68.9/69.4, but they boost in-game stable at 1450!!! For benchmarking I can hit 1495, but that is not stable for gaming, and if the benchmark runs too long it will crash. It is just strong enough to last through a benchmark run.

I am at +400 on the memory for benchmarking and +200 for gaming. I have not invested much time in memory OC, but since the memory modules run pretty damn hot, I think I'll leave them where they are for gaming.
 
4k in GPU 20fps in shadow of mordor :\

Did you do any OC on your card? I am getting right around 60, with some drops here and there, in SLI. 20fps seems low for a single Titan. Do you have the AA solution maxed out - FXAA plus some other stuff?
 
I may have to start planning for an X99 + 5960X combo to pair with my cards. Getting stomped in benchmarks!

Love my Rampage IV Black Edition though

I am in the same boat. My wife was fine with me getting two Titan X's, and then another $300 for waterblocks, backplates, and a few water-cooling supplies. She was watching me bench my cards and I was showing her that (at the time) I was in second place on the SLI Fire Strike Ultra leaderboard.

Out of the blue she made a comment about how all the other guys in the top 10 have that 5960 thing. She then said, do you need that??? I kind of hesitated and said that would require a new motherboard, new RAM, etc. She then said, "Well, you have been saying you wanted to do a rebuild, your bonus is here in a few days, why don't you do it?"

I basically fell on the floor at that point. However, for me it is not about benchmarks; it is about gaming and hitting 4K/60fps as much as possible, and going by some comments over at overclock.net, going from a 3930K to a 5960X won't gain me much for gaming, which is the main point of my rig.

So right now I am not sure what I want to do. My wife is basically handing me a blank $4,000.00 check to do what I want computer-wise.
 
Finally finished building and benching my new 5960X, 2x TITAN X, 16GB-2133 system. Found it could do 4.5GHz with decent RAM and CPU cache overclocks, but the temps and noise on the Nepton 120XL weren't fun. Switched down to 4.4GHz to hit a peak of 80-82C under 100% load for 1 hour+ at acceptable noise when using headphones, making it even quieter during gaming.

Similarly, dialled back my GPU clocks to maintain a relatively quiet system, in the end settling on 1,237MHz boost, with 1,853MHz memory and a modified fan curve.

Fire Strike Extreme: http://www.3dmark.com/3dm/6457783 (should be #18 when the HoF updates)
Fire Strike Ultra: http://www.3dmark.com/3dm/6458189 (#13 when HoF updates)

Extreme went from 9663 with OC'd i7-2600K and OC'd 980s, to 12398 with OC'd 5960X and OC'd 980s, to 16338 with OC'd 5960X and OC'd TITAN Xs.



It is for TITAN X (our deep learning box is configured with 4-Way): http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-titan-x/specifications Perf will be limited by CPU, game engine, game's specific coding, and SLI profile. Does Tomb Raider 2013 scale with 4-Way? That game's usually the poster child for multi-GPU scaling.

You need to go H2O so bad, Andy! That rig of yours could be so much more potent with a custom water loop. It is an investment, but the performance you gain from how cool your components run is worth it. My Titan X's in SLI under H2O never hit higher than 48°C (it was a warm day in the house); usually they are around 35-38°C under full load, hitting 1450 on the core with +200 on the memory.
 
GTX TITAN X 4-Way SLI + 5K Benchmarks:

https://youtu.be/XOTCTdzTPwE

The skinny is that 4-Way SLI scaling sucks balls.

VRAM usage across most games is stratospheric at 5K with max settings (doh!).

I really don't get why people keep expecting 4-way SLI to work great; 3-way just seems so much more worth it, especially when you consider how much these Titans cost. And I liked how he used 4xMSAA at 5K in BF. Sure, the cards weren't being used to their full potential, but he would still see a huge performance boost just using post-process AA. At that resolution FXAA/SMAA is actually really good and doesn't ruin the image quality like it does at 1080p.
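
Putting rough numbers on the MSAA point with a back-of-envelope C++ sketch (RGBA8 color and a 4-byte depth format are assumptions, not the video's actual settings):

```cpp
// Render-target footprint at 5K, with and without 4xMSAA (rough estimate).
#include <cstdio>

int main() {
    const double w = 5120.0, h = 2880.0;  // "5K"
    const double bytesPerPixel = 4.0;     // RGBA8 color; a 4-byte depth format is similar
    const double samples = 4.0;           // 4xMSAA stores 4 samples per pixel

    double msaaColorMB = w * h * bytesPerPixel * samples / (1024.0 * 1024.0);
    printf("4xMSAA color target: %.0f MB (plus a similar-sized depth target)\n", msaaColorMB); // ~225 MB
    // FXAA/SMAA instead run as a post-process on the resolved single-sample image:
    printf("1x color target:     %.0f MB\n", msaaColorMB / samples);                           // ~56 MB
    return 0;
}
```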

I wish I were rich; it would be fun to have a rig with 3 Titan Xs just to experience it. But I guess I'll have to settle for my slow 970. :((

I love my 970
 
You need to go H2O so bad, Andy! That rig of yours could be so much more potent with a custom water loop. It is an investment, but the performance you gain from how cool your components run is worth it. My Titan X's in SLI under H2O never hit higher than 48°C (it was a warm day in the house); usually they are around 35-38°C under full load, hitting 1450 on the core with +200 on the memory.

I would love to watercool, but I do so much component swapping it sadly wouldn't be feasible.
 
Well, I've had my two X's for a couple of days and I'm really loving them. Can't wait for The Witcher 3 and Arkham Knight to arrive and give me a couple of titles I can crank to "Max" and actually enjoy (not a fan of all the recent shit everyone benchmarks, like AC:U, FC4, SoM, Ryse, etc.).

SLI scaling on the X's seems a bit borked for some titles though.
The more recent titles seem to do fine. Ryse, for example, was getting ~40FPS with a single X and ~60 with two (Max with 2:2); could be better, but still decent.
However, I tried The Witcher 2 & Arkham City, both titles which had near-perfect scaling with my two 680's; now with the X's I'm only getting ~10FPS extra when I run with SLI enabled.

Perhaps it's just bad compatibility with older titles, or driver teething issues (is it a Maxwell problem in general?) - not really sure, but it's a little disappointing (if the scaling were the same as I was getting on Kepler, I could play The Witcher 2 at 4K with Ubersampling - essentially 8K - and probably still get ~60FPS).

I dunno - maybe you can shed some light on the matter, Andy? I doubt it's something that'll get fixed, and it's not really a huge issue (the games still run great), just odd to see such meh SLI scaling on titles that previously scaled so well. At least the upcoming big titles should get more attention and up-to-date SLI profiles, and I'm looking forward to them greatly.
 
Damn, that is high for ASIC. Both of my cards are 68.9/69.4, but they boost in-game stable at 1450!!! For benchmarking I can hit 1495, but that is not stable for gaming, and if the benchmark runs too long it will crash. It is just strong enough to last through a benchmark run.

I am at +400 on the memory for benchmarking and +200 for gaming. I have not invested much time in memory OC, but since the memory modules run pretty damn hot, I think I'll leave them where they are for gaming.

I'm leaving the core at +200 for daily use; I tried +250 but it would crash after a few runs of Fire Strike Ultra.
I have the same doubt about the memory. I haven't overclocked it yet, but after seeing the temperatures in reviews I'm a little worried - does it make any performance difference? I'm waiting on the EVGA backplate that they confirmed is in the works on their forums.

Did you do any OC on your card? I am getting right around 60, with some drops here and there, in SLI. 20fps seems low for a single Titan. Do you have the AA solution maxed out - FXAA plus some other stuff?

I was talking about the 5K video with 4 Titans doing ~20fps; that seems really low.
 
Well, I've had my two X's for a couple of days and I'm really loving them. Can't wait for The Witcher 3 and Arkham Knight to arrive and give me a couple of titles I can crank to "Max" and actually enjoy (not a fan of all the recent shit everyone benchmarks, like AC:U, FC4, SoM, Ryse, etc.).

SLI scaling on the X's seems a bit borked for some titles though.
The more recent titles seem to do fine. Ryse, for example, was getting ~40FPS with a single X and ~60 with two (Max with 2:2); could be better, but still decent.
However, I tried The Witcher 2 & Arkham City, both titles which had near-perfect scaling with my two 680's; now with the X's I'm only getting ~10FPS extra when I run with SLI enabled.

Perhaps it's just bad compatibility with older titles, or driver teething issues (is it a Maxwell problem in general?) - not really sure, but it's a little disappointing (if the scaling were the same as I was getting on Kepler, I could play The Witcher 2 at 4K with Ubersampling - essentially 8K - and probably still get ~60FPS).

I dunno - maybe you can shed some light on the matter, Andy? I doubt it's something that'll get fixed, and it's not really a huge issue (the games still run great), just odd to see such meh SLI scaling on titles that previously scaled so well. At least the upcoming big titles should get more attention and up-to-date SLI profiles, and I'm looking forward to them greatly.

What were your framerates and resolution on the 2 680s? What are you using and getting on TX SLI?

What is your system config for the new TX SLI system?
 