
NVIDIA to release GeForce Titan

I'm on the verge of buying one of these; my 6970s keep overheating in games and giving me a black screen.

Just need to be sure the problem is the graphics card and not something else before I put a titan in there...
 
Preordered on Newegg yesterday. Have they ever done preorders before? I'm just curious if I should try and buy one on Monday too, in case I have to wait for a second supply run.


Anyone have any experience with Newegg preorders?

Edit: Just noticed the release date on Newegg is 2/28 now. I'm not sure if that's what it was before. Is that the official date for the Titan now?
 
Preordered on Newegg yesterday. Have they ever done preorders before? I'm just curious if I should try and buy one on Monday too, in case I have to wait for a second supply run.


Anyone have any experience with Newegg preorders?

Edit: Just noticed the release date on Newegg is 2/28 now. I'm not sure if that's what it was before. Is that the official date for the Titan now?

That was the release date before.
 
Everything about this GPU makes it hard for me to control a nervous laugh.

The Tech Report review doesn't help either.

I love the crazy people selling GTX 680 SLIs and GTX 690s to get this.

Yea,

Dual 670s dominate the Titan and cost 40% less. What's the appeal?
Answer: I guess lack of micro-stuttering due to a single GPU?
 
Yea,

Dual 670s dominate the Titan and cost 40% less. What's the appeal?
Answer: I guess lack of micro-stuttering due to a single GPU?
Says who?

Dual 670s are $750-800. That's 20-25% less. The Titan is neck and neck with the 690, even surpassing it on a few benches.

I think you're reading the wrong reviews.
 
Says who?

Dual 670s are $750-800. That's 20-25% less. The Titan is neck and neck with the 690, even surpassing it on a few benches.

I think you're reading the wrong reviews.

Now I feel better about snapping up two 670s for $500. I couldn't pass that deal up.
 
Yea,

Dual 670s dominate the Titan and cost 40% less. What's the appeal?
Answer: I guess lack of micro-stuttering due to a single GPU?

To tell you the truth, I don't really know what microstuttering is. I've seen some really horrible examples on YouTube, but I don't see those on the GTX 690. Most stutter (when panning or strafing) I get in games (Dota 2, UE3-engine games) cleans right up when I turn VSync on. At higher refresh rates (100Hz) I do notice less smooth pans in UE3-engine games, but I'm not really sure that counts as microstutter. Anyone have any real-world examples on current 600-series Nvidia cards? Preferably on the GTX 690 so I can try to replicate it.

Oh yeah I remember, isn't Metro 2033 the poster boy for microstutter on Nvidia cards?
 
Take any game where you can have a smooth pan of some kind. Either running left/right in a platformer, running forward and watching the periphery, or spinning the camera around you with an analog stick in a 3rd person game. If it's not perfectly smooth, there's some microstutter.

Darksiders is the worst example I can think of.
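Not from any tool mentioned in this thread, just a rough way to quantify what's described above: microstutter is frame-to-frame variance, so the average fps can look fine while the individual frame times are all over the place. A hypothetical sketch (the 50% jump threshold is made up for illustration):

```python
# Rough microstutter check: look at frame-to-frame time deltas,
# not just average fps. Frame times are in milliseconds.
def stutter_report(frame_times_ms):
    n = len(frame_times_ms)
    avg = sum(frame_times_ms) / n
    # Count transitions where frame time jumps more than 50% vs the previous frame
    jumps = sum(
        1
        for prev, cur in zip(frame_times_ms, frame_times_ms[1:])
        if abs(cur - prev) > 0.5 * prev
    )
    return {
        "avg_fps": 1000.0 / avg,
        "jumpy_frames_pct": 100.0 * jumps / (n - 1),
    }

# Smooth 60 fps vs. an SLI-style alternating 10ms/23ms pattern:
# both average roughly 60 fps, but the second flags every transition.
smooth = stutter_report([16.7] * 100)
stuttery = stutter_report([10.0, 23.0] * 50)
```

That's why a benchmark average can hide the thing your eyes catch during a smooth pan.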
 
I want to upgrade my aging 570 so bad, but I honestly think I'd be just fine with a 4GB 670. The only games that give me a hard time are BF3, Witcher 2, and Crysis 2/3 with high-res textures... they choke up my 1.2GB of VRAM.
 
Take any game where you can have a smooth pan of some kind. Either running left/right in a platformer, running forward and watching the periphery, or spinning the camera around you with an analog stick in a 3rd person game. If it's not perfectly smooth, there's some microstutter.

Darksiders is the worst example I can think of.

I'll check out Darksiders. One thing, though: I did find Dishonored to be slightly stuttery when playing with the mouse, but on an analog stick, looking around is smooth as butter. That was one of the things that made me question what microstutter is. It could be the mouse and polling rate, and not actually the SLI.
 
Yea,

Dual 670s dominate the Titan and cost 40% less. What's the appeal?
Answer: I guess lack of micro-stuttering due to a single GPU?

You are 1080p right?

Sorry, quoted the wrong post lol. But I'm pretty sure a single Titan handles Crysis 3 at that resolution. Although I don't know if it's over 60fps.
 
Always and forever.

That's what I like about the Titan though. That smooth single card performance, and lots of it.

Single card Titan is barely enough though. I will still need to SLI it to get to where I want... mainly Hawken on Ultra at 120hz.

Right now I can only do Hawken at a pretty consistent 8.2ms frametime on Medium graphics and I want more! Realistically though, I might just be looking at High rather than Ultra judging by current benches, which is sort of depressing.

But yeah I would love for someone to point out to me how microstutter looks exactly because it doesn't seem like it is as cut and dry as Multi GPU = microstutter. Lots of factors are in play that can cause stutters; e.g. mouse drivers, panel refresh rate, even the game engine itself. Basically show me how it looks so I know exactly what to look out for.
 
But yeah I would love for someone to point out to me how microstutter looks exactly because it doesn't seem like it is as cut and dry as Multi GPU = microstutter. Lots of factors are in play that can cause stutters; e.g. mouse drivers, panel refresh rate, even the game engine itself. Basically show me how it looks so I know exactly what to look out for.

Pick up a used 5970 and strap yourself in, because you will be on the highway to stutteringtown.
 
I don't understand how anyone can play anything without VSync. Tearing is the absolute worst.
As noted above, I'll take tearing and stutters over input lag and a disconnected feeling.

Granted, I play on 120hz monitors, so I don't really notice a lot of tearing anyway.
 
I don't understand how anyone can play anything without VSync. Tearing is the absolute worst.

I don't get that much tearing.

That said, if you are on a 60hz monitor, and are pushing over 60FPS (and have screen tearing), then why not lock into 60FPS?

Unless I'm wrong and you can tell the difference. But I've always thought that on a 60Hz monitor, all you can really see is 60FPS, and anything above that is irrelevant. So if you are pushing 120FPS on a 60Hz monitor, it won't look any different from 60FPS. Can anyone with more expertise clear this up? EDIT: Okay, people are now talking about input lag lol. So I guess turning VSync on does make a difference.

For the record, I always leave mine off.
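A rough back-of-the-envelope for the 120fps-on-60Hz question above: with VSync off the panel still refreshes 60 times a second, but each refresh grabs a fresher frame the faster you render (which is also where the tearing comes from). A very simplified sketch, ignoring driver buffering and scanout time:

```python
# With VSync off, a new frame lands every 1000/fps ms, so when the
# 60Hz panel scans out, the newest frame is on average half an
# interval old. (Simplified: ignores buffering and scanout time.)
def avg_frame_age_ms(render_fps):
    frame_interval_ms = 1000.0 / render_fps
    return frame_interval_ms / 2.0

# At 60 fps the freshest frame averages ~8.3 ms old;
# at 120 fps it averages ~4.2 ms old, even on a 60Hz panel.
```

So the picture can't show more than 60 distinct frames a second, but what it shows is newer, which is the "more connected" feeling people describe.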
 
With Hawken, you will always be hard CPU bound due to the UE3 engine, it's just like T:A. What's your processor at?

It's a 3770K at 4.6GHz.

On a GTX 690 running 1080p, Ultra Textures:

Medium Graphics gets me 120 fps (small drops when the action heats up)
High drops to around 90 to 100 fps
Ultra around 70 to 90+ fps.

PhysX knocks the fps lower of course, even with a dedicated 650ti.
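For anyone converting between the frame times and framerates thrown around in this thread (8.2ms, 8.3ms, 120fps), it's just the identity fps = 1000 / frame_time_ms:

```python
# Convert between frames per second and milliseconds per frame.
def fps_to_frametime_ms(fps):
    return 1000.0 / fps

def frametime_ms_to_fps(ms):
    return 1000.0 / ms

# 120 fps works out to ~8.3 ms per frame, which is why "a consistent
# 8.3 ms frametime" and "locked 120 fps" mean the same thing here.
```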
 
You are 1080p right?

Sorry, quoted the wrong post lol. But I'm pretty sure a single Titan handles Crysis 3 at that resolution. Although I don't know if it's over 60fps.

1920x1200. My monitor has 1:1 hardware scaling, so for games like C3 i drop down to 1680x1050 to give my single N670 a fighting chance.
 
I don't get that much tearing.

That said, if you are on a 60hz monitor, and are pushing over 60FPS (and have screen tearing), then why not lock into 60FPS?

Unless I'm wrong and you can tell the difference. But I've always thought that on a 60Hz monitor, all you can really see is 60FPS, and anything above that is irrelevant. So if you are pushing 120FPS on a 60Hz monitor, it won't look any different from 60FPS. Can anyone with more expertise clear this up? EDIT: Okay, people are now talking about input lag lol. So I guess turning VSync on does make a difference.

For the record, I always leave mine off.

This is what I did on my 60hz monitor. Kept temps down too.
 
I don't get that much tearing.

That said, if you are on a 60hz monitor, and are pushing over 60FPS (and have screen tearing), then why not lock into 60FPS?

Unless I'm wrong and you can tell the difference. But I've always thought that on a 60Hz monitor, all you can really see is 60FPS, and anything above that is irrelevant. So if you are pushing 120FPS on a 60Hz monitor, it won't look any different from 60FPS. Can anyone with more expertise clear this up? EDIT: Okay, people are now talking about input lag lol. So I guess turning VSync on does make a difference.

For the record, I always leave mine off.

In my experiment I found that running Hawken at 120 fps on a 60hz display still gives me a much higher polling rate for my mouse and a more connected feeling. The feel of 120hz if you will!
 
Time to delid that sucker and get it to 5 GHz!

Seriously though, we couldn't play at a consistent 8.3ms in T:A unless our 2500/2600Ks were clocked to 4.9-5.0GHz, regardless of the video card.

I'm assuming you have to do the frame smoothing hack to get the lower frame times?
 
This is what I did on my 60hz monitor.

Interesting. So if you are on a 60Hz monitor, is it better to leave VSync on to lock in 60FPS, or is it better to leave it off?

Now I'm hearing about screen tearing *shifty eyes*

In my experiment I found that running Hawken at 120 fps on a 60hz display still gives me a much higher polling rate for my mouse and a more connected feeling. The feel of 120hz if you will!


Alright. I've always left it off anyways, so I guess I'll just keep it off. I don't even have screen tearing that much anyways. Was just curious.
 
Time to delid that sucker and get it to 5 GHz!

Seriously though, we couldn't play at a consistent 8.3ms in T:A unless our 2500/2600Ks were clocked to 4.9-5.0GHz, regardless of the video card.

I'm assuming you have to do the frame smoothing hack to get the lower frame times?

120fps was unlocked in a previous patch IIRC. But I did change the config before that.
Delid... haha... maybe one day!

Another thing I found out that I wanted to share: running my PC through my Denon amp to the HDTV added a significant amount of input lag. Connecting straight to the HDTV in PC mode alleviated most of it. Right now I'm looking to purchase an S/PDIF cable to get the audio signal out to the amp. Hopefully it won't cause any audio sync problems this way...
 
I will be - but not until next week. NVIDIA GTX 690 (OEM from Alienware - I picked it up on eBay when I went quad-SLI).

Mind if I ask how much you're asking? Pure curiosity... ;)
 
You guys saw the "announcement" video, right? It sounds and looks like a freaking Mass Effect commercial!

http://www.youtube.com/watch?v=0_xgeOZrnJI

 