
NVIDIA GeForce GTX 1080 Ti launch/review day - 2017/03/09

People crack me up with 60FPS talk... as if the standard refresh rate isn't 144/165 nowadays. Who the hell would buy a card this expensive on a 60Hz monitor?

I don't think you know what standard means.

I'd also love it if you could point me towards a 65 inch 4K monitor that runs at 144 or 165.
 

Celeras

Banned
I don't think you know what standard means.

I'd also love it if you could point me towards a 65 inch 4K monitor that runs at 144 or 165.

If you are gaming on a 65 inch 4K/60Hz "monitor" (lol) someone should revoke your wallet. 60Hz is a colossal downgrade.

Not that it's any of my business, I just said it cracks me up. I am unfortunately thoroughly aware of the existence of stupid people with too much disposable income.
 

Xyber

Member
4K 120Hz is still far from viable with today's GPUs unless you only play old games.

The 1080 Ti will be good enough for me at 1440p/165Hz, but it's the Volta Ti card I'm really waiting for. That's still far off, though. :(
 

Renekton

Member
People crack me up with 60FPS talk... as if the standard refresh rate isn't 144/165 nowadays. Who the hell would buy a card this expensive on a 60Hz monitor?
I believe if you max/near-max the settings on certain games (Dishonored 2, Mankind Divided, Ghost Recon, Watch Dogs 2), a GTX 1080 will only give you slightly north of 60fps on a 1080p screen.
 

Pagusas

Elden Member
People crack me up with 60FPS talk... as if the standard refresh rate isn't 144/165 nowadays. Who the hell would buy a card this expensive on a 60Hz monitor?

Going to be running a $5,000 75" 4K HDR screen, I think I'll be OK with 60Hz ;)
 
If you are gaming on a 65 inch 4K/60Hz "monitor" (lol) someone should revoke your wallet. 60Hz is a colossal downgrade.

Not that it's any of my business, I just said it cracks me up. I am unfortunately thoroughly aware of the existence of stupid people with too much disposable income.

I'll take my 65 inch 4K TV screen at 60 over a 27, 32, or even a 34 ultrawide any day.

Ideally I'd love native 4K at 144, but I don't have access to a time machine.

Playing on a screen that small is a literal downgrade. That cracks me up.
 

Grechy34

Member
This is the first year I'm not getting sucked into NVIDIA's hype wagon. For the last 2 years I've upgraded from the 980 to the 980 Ti and the 1080 to the 1080 Ti. I expected my 1080 to at least keep up with 4K consistently by now, but in the space of not even a year it seems to be struggling already, and 1440p is the sweet spot for the card. I'm not forking out another $500-600 (AU) for at least another 6 months. I still don't think 4K 60 is quite there yet, even though the benchmarks right now look good.
 
Jeez this thing runs hot.

Will be like having a small heater in your room when gaming :)

84 degrees is shocking. I would advise anyone and everyone to avoid the FE and wait for the custom cards, as always.
 
Yeah, I'll be waiting for Vega to see what they offer. But man, that is finally proportionate value relative to the 1070 in terms of performance. Would wait for an AIB card anyway, though.
 

jrcbandit

Member
People crack me up with 60FPS talk... as if the standard refresh rate isn't 144/165 nowadays. Who the hell would buy a card this expensive on a 60Hz monitor?

Who the hell wants to play games on a tiny 27 inch monitor (the vast majority of 1440p and 4K monitors)? And the ultrawide 34 inch monitors are typically about the same height as a 27 inch monitor.

I'll take my 4K 40 inch TV any day over a 27 inch display, and good luck running 4K much higher than 60Hz unless you only play old games or something graphics-light.
 
I'm about to put my MSI 780s up for sale and was wondering what a good price for one would be, and whether I should try selling them as a pair or individually. Is $160 too much?
 

Raticus79

Seek victory, not fairness
Who the hell wants to play games on a tiny 27 inch monitor (the vast majority of 1440p and 4K monitors)? And the ultrawide 34 inch monitors are typically about the same height as a 27 inch monitor.

I'll take my 4K 40 inch TV any day over a 27 inch display, and good luck running 4K much higher than 60Hz unless you only play old games or something graphics-light.

I have both and still wind up going with 4K60 over 1440p144 most of the time. It just depends on the game.

I'll probably ditch both for a proper HDR 4K/120 once they're available.
 
Yeah noise and throttling when overclocked are going to be problems.

Funny, as 'Vidya came out with the same BS about how they've made the cooler so good now.
 
Shocking?

That is the standard temperature for most modern high-end GPUs with a reference cooler.

If you want to spend $700 on a card with such a weak cooler, be my guest. If you want to buy one and then put a custom water loop on it, that's not so bad, but you're going to be missing a chunk of money from your bank account.
 

Weevilone

Member
84 degrees is shocking. I would advise anyone and everyone to avoid the FE and wait for the customs as always.

If 84 degrees is shocking then you haven't been paying attention. Reference cards either run up against the thermal limit or the power limit, period. It's been this way for years.
 

Xyber

Member
Who the hell wants to play games on a tiny 27 inch monitor (the vast majority of 1440p and 4K monitors)? And the ultrawide 34 inch monitors are typically about the same height as a 27 inch monitor.

I'll take my 4K 40 inch TV any day over a 27 inch display, and good luck running 4K much higher than 60Hz unless you only play old games or something graphics-light.

My 1440p 27" monitor fills up more of my vision than my 49" 4KTV, so I find playing on my monitor much more immersive than the TV. It's not all about the size, it's the distance to the screen that matters. And you would have to sit real close to a 40" screen to make it feel big.

I also mostly play FPS games and would never ever play that on the TV over my monitor.
 

BriGuy

Member
For the price and performance, it almost doesn't make sense to get anything other than a Ti card. Hop on a Ti schedule and replace it with another in however many years.
 
My 1440p 27" monitor fills up more of my vision than my 49" 4KTV, so I find playing on my monitor much more immersive than the TV. It's not all about the size, it's the distance to the screen that matters. And you would have to sit real close to a 40" screen to make it feel big.

I also mostly play FPS games and would never ever play that on the TV over my monitor.

I'll give you competitive shooters for sure.

As far as distance goes, I sit about 3.5 feet from my 65.


Edit: Just checked with a tape measure. It's a bit closer to 4 feet, actually.
 
I might just run my PC through my 50" 4K Samsung until I solve the 1440p DVI issue. Ordered an "active" adapter for $30; let's hope it works.
 

ss_lemonade

Member
If you are gaming on a 65 inch 4K/60Hz "monitor" (lol) someone should revoke your wallet. 60Hz is a colossal downgrade.

Not that it's any of my business, I just said it cracks me up. I am unfortunately thoroughly aware of the existence of stupid people with too much disposable income.

To each his own, I guess. The Witcher 3 looked amazing at 4K on my 65in KS9000 in the IQ department (my 780 could barely run it, lol), and I still see 60fps as smooth even though I'm used to my 144Hz monitor. Not sure why you think gaming on a 65in TV is lol-worthy though.
 

EatChildren

Currently polling second in Australia's federal election (first in the Gold Coast), this feral may one day be your Bogan King.
Very happy with the performance versus cost. My gut says I'll get one eventually, when the third party models release and we get a better idea of OC costs and cooling, but I'll see how I go.

At the moment I'm using an OC'd 1080 on a 144Hz 1440p G-Sync display and it's a wonderful pairing, but I'm someone who, despite my adoration for high framerates, also buys high-end hardware for the ability to crank game settings as high as possible. I'm not just in it for performance; I want my fancy bells and whistles and the nicest IQ possible. This is a massive draw for me to keep playing on PC, and the reality is this path, while expensive, benefits from each and every GPU advancement.

As I've noted in several threads already, my absolute dream is an HDR 4K 144Hz G-Sync display and a GPU that can drive it with performance in the ballpark of my 1080 at 1440p. Pipe dream for now, but we're getting there.
 

Celeras

Banned
To each his own, I guess. The Witcher 3 looked amazing at 4K on my 65in KS9000 in the IQ department (my 780 could barely run it, lol), and I still see 60fps as smooth even though I'm used to my 144Hz monitor. Not sure why you think gaming on a 65in TV is lol-worthy though.

Sounds like you have both? Grab a window on your desktop and move it in a circle. Then do it again on your 144Hz.

THAT is enough to make me cringe. With actual in-game motion? Full-on eye bleed.
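The arithmetic behind that window-drag test is simple: at 60Hz each frame persists ~16.7ms versus ~6.9ms at 144Hz, so anything moving at a given speed jumps more than twice as far between refreshes. A minimal sketch (Python; the 2560px screen width and one-second sweep are assumed numbers for illustration):

# An object crossing a 2560-px-wide screen in one second:
for hz in (60, 144):
    frame_time_ms = 1000 / hz   # how long each frame stays on screen
    jump_px = 2560 / hz         # distance covered between refreshes
    print(f"{hz} Hz: {frame_time_ms:.1f} ms/frame, ~{jump_px:.0f} px jump per frame")

That ~43px step per frame at 60Hz versus ~18px at 144Hz is the judder the test makes visible.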
 

PFD

Member
Jeez this thing runs hot.

Will be like having a small heater in your room when gaming :)

84 degrees is shocking. I would advise anyone and everyone to avoid the FE and wait for the custom cards, as always.

Par for the course for a 250W card.

Edit: are we really turning this into a display wars thread?
 
I needed this to suck. I've been putting off building a new PC and this makes it tough.

Just imagine how freaking amazing the next card will be. That's how I convince myself not to buy every card released.


Jeez this thing runs hot.

Will be like having a small heater in your room when gaming :)

84 degrees is shocking. I would advise anyone and everyone to avoid the FE and wait for the custom cards, as always.

I know they improved the cooler, but I was so disappointed by my 1080 FE reference cooler. I got the card for super cheap, but I'll probably never buy a reference card again.
 

prophecy0

Member
I game on both a 27" 1440p 144Hz G-Sync monitor and a 120" screen by way of a projector (1080p @ 60Hz). When I game on the projector I downsample the hell out of whatever game I'm playing and it looks great. I'm upgrading from a 980 Ti to a 1080 Ti because I can afford it. It will help me get closer to 144fps @ 1440p, and I can just go for pure IQ @ 60fps when I use my projector.

I sit close enough to my monitor that it is plenty immersive, even when compared to my 120" screen.

Edit: On the cooler front, I'll probably end up putting an EVGA hydro cooler on my FE card at some point down the line.
 

poodaddy

Member
Sounds like you have both? Grab a window on your desktop and move it in a circle. Then do it again on your 144Hz.

THAT is enough to make me cringe. With actual in-game motion? Full-on eye bleed.

Dude, why are you denigrating people's choices of display to play on? Some of us prefer to have our PCs hooked up to the living room TV as we game out there on a controller, so 60Hz will do just fine. You think my wife and daughter give a fuck about how many frames the display can display? Nah, they just want to be able to chill on the couch with me and enjoy games on a big screen. You literally called people who prefer this stupid earlier... come on, man. Some of us have families, and high refresh rate monitors are not ideal for living room situations, so we get the shiny new GPUs for improved picture quality and resolution instead; nothing stupid about that.
 
Just imagine how freaking amazing the next card will be. That's how I convince myself not to buy every card released.




I know they improved the cooler, but I was so disappointed by my 1080 FE reference cooler. I got the card for super cheap, but I'll probably never buy a reference card again.

In what ways were you disappointed?
 

Celeras

Banned
Dude, why are you denigrating people's choices of display to play on? Some of us prefer to have our PCs hooked up to the living room TV as we game out there on a controller, so 60Hz will do just fine. You think my wife and daughter give a fuck about how many frames the display can display? Nah, they just want to be able to chill on the couch with me and enjoy games on a big screen. You literally called people who prefer this stupid earlier... come on, man. Some of us have families, and high refresh rate monitors are not ideal for living room situations, so we get the shiny new GPUs for improved picture quality and resolution instead; nothing stupid about that.

So your wife and daughter don't give a fuck about how many frames are displayed, but they care about how many frames are displayed should you increase the picture quality without the "shiny new GPU". Alright, then.

I literally answered the man's direct question to me. Not sure why that involves or offends you and your daughter. Enjoy your setup any way you'd like. But you can't be surprised at those scoffing at the astronomical prices someone would pay to be so vastly inferior... on purpose.
 

Durante

Member
Shocking?

That is the standard temperature for most modern high-end GPUs with a reference cooler.
Yeah, nothing shocking about it.

It's also not like that's some random temperature the card just ends up at; it's exactly the temperature it is set to run at, and that's why it runs there.
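For anyone who wants to check what their own card is doing, here's a minimal sketch in Python that shells out to nvidia-smi (which ships with the driver). The query field names are standard, but treat the exact output formatting, and the single-GPU assumption, as assumptions:

import subprocess

# Current temperature, power draw, and the enforced power limit the card
# throttles against. Assumes a single GPU, so exactly one CSV line comes back.
out = subprocess.check_output(
    ["nvidia-smi",
     "--query-gpu=temperature.gpu,power.draw,power.limit",
     "--format=csv,noheader"],
    text=True,
)
temp, draw, limit = [s.strip() for s in out.split(",")]
print(f"GPU: {temp} C, drawing {draw} against a {limit} cap")

Depending on driver version, nvidia-smi -q -d TEMPERATURE also lists the temperature target directly, i.e. the 84 degrees these reference cards settle at.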
 

Leonidas

AMD's Dogma: ARyzen (No Intel inside)
For the price and performance, it almost doesn't make sense to get anything other than a Ti card. Hop on a Ti schedule and replace it with another in however many years.

What about the next x70 card that comes out next year with similar performance and half the price?
 

Pagusas

Elden Member
How unfortunate, I'm sorry for your loss. Screen tears on a 75" must be quite the sight to behold. Or do you opt for the input lag instead?

I can deal with 35ms of lag in HDR mode and 22ms in normal gaming mode. Input lag doesn't bother me in the games I play, so I'll always use V-Sync.
 

Xyber

Member
What about the next x70 card that comes out next year with similar performance and half the price?

It all comes down to when you want the performance. Have it now at a premium price, or in a year at a more reasonable price for the bigger audience.

The Ti cards should be bought at or close to launch to get your money's worth; otherwise it's better to just wait for the x70 GPU down the line.
 