Nvidia GeForce GTX 1080 reviews and benchmarks

Yay!!!!
 
JELLY

I should probably have asked EVGA whether I can pick up my FTW at their branch in Munich, it will arrive on Monday according to tracking :(
Then again I don't think they have an actual store there, so that wouldn't have worked out anyway.
 
Quick question, I just picked up a Zotac 1080 Amp Edition (not Amp Extreme). Is it a solid card with good overclocking potential, or am I missing out by not going for the Amp Extreme (or another 3rd party card)?
 
Any chance of an i5 4690K bottlenecking the 1080 at 3440x1440 if I've OCed it to 4.4GHz? Hopefully getting a 1080 and an LG 34UM97 soon and I want the best possible experience.
 
Finally got my EVGA Superclocked 1080, and it's the best! I'm coming off an R9 290 (and a defective one, at that), and the switch was dead easy. Literally took like 45 minutes all in. This card is actually a bit smaller too, which was a pleasant surprise.

Tried out DOOM and Witcher 3 just for kicks, and both can be maxed out with zero hitches in framerate, even with that stupid HairWorks thing on.

Now the next step is to get a 4K TV, because my current 1080p 60hz TV makes it so games cap out at 60FPS. Can't have that!

What FPS do you think you will be getting on a 4k/60hz tv?
 
Any chance of an i5 4690K bottlenecking the 1080 at 3440x1440 if I've OCed it to 4.4GHz? Hopefully getting a 1080 and an LG 34UM97 soon and I want the best possible experience.

I think you should be fine. I had an Ivy Bridge 3570K at 4.4GHz which worked great with a 16:9 144Hz monitor. I take it that the LG 34UM97 is a 60Hz monitor, right?
 
I thought some TVs were 120hz? Is that not a thing? Would that not allow me to bump my FPS?

I'm pretty dumb when it comes to this stuff.

120hz TVs aren't actually 120hz

They just use interlacing to bump their output to 120hz and can't actually accept 120hz signals.
 
120hz TVs aren't actually 120hz

They just use interlacing to bump their output to 120hz and can't actually accept 120hz signals.

It's not interlacing; it creates an entirely new frame using information from the previous and next frames, and the result is pretty good most of the time, at least on the Samsung and LG LED TVs I'm using.
 
aww man.

My 600W PSU only came with a 6+8 GPU power cable, but this thing is 8+8 and my system is telling me I have to plug in all the power connectors. I'm gonna have to get one of those damn 6-to-8-pin adapters. I have plenty of power but not enough pins. :-/

Party resumes on Monday unless people think I should just send it back instead. Of course I'd manage to pre-order one of the only cards that doesn't use a 6+8 setup. :-/
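
(Rough math for why it's a pin problem and not a watt problem, assuming the usual PCIe spec figures of 75W from the slot, 75W per 6-pin plug and 150W per 8-pin plug:)

```python
# Rough PCIe power-budget math, assuming the usual spec limits:
# 75W from the x16 slot, 75W per 6-pin plug, 150W per 8-pin plug.
PCIE_SLOT_W = 75
SIX_PIN_W = 75
EIGHT_PIN_W = 150

def available_power(six_pin=0, eight_pin=0):
    """Total power (watts) a card can pull while staying in spec."""
    return PCIE_SLOT_W + six_pin * SIX_PIN_W + eight_pin * EIGHT_PIN_W

print(available_power(six_pin=1, eight_pin=1))  # 6+8 layout -> 300 W
print(available_power(eight_pin=2))             # 8+8 layout -> 375 W
```

So an 8+8 board can draw up to roughly 375W within spec, way beyond the 180W TDP of a reference 1080; the extra plug is about headroom and having the right connectors, which is exactly the "plenty of power but not enough pins" situation.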
 
USPS hits my place every day by lunch time. But apparently when they have my 1080 FTW on the truck, they're going to show up at like 6PM or something.
 
I think you should be fine. I had an Ivy Bridge 3570K at 4.4GHz which worked great with a 16:9 144Hz monitor. I take it that the LG 34UM97 is a 60Hz monitor, right?

Yeah, 21:9, 60Hz, 3440x1440. I've found that GTAV is running a LOT better now that I've OCed, and that's only on a 970 at 1080p.
 
It's inducing input lag by design.

Yes, in order to create an additional in-between image, the screen needs to wait until it has at least two unique frames. This at minimum adds 16.7ms just by waiting for that extra frame (at 60fps), not to mention the added time for the TV to work its magic and create the third one based on the other two. This is why motion interpolation is generally a bad idea for video games, and why you should set your TV to gaming mode if possible to turn off any kind of post processing that adds input lag.
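
To put rough numbers on it (the processing time below is just a placeholder, real values vary per TV):

```python
# Minimum extra input lag from motion interpolation: the TV has to hold
# back one full source frame before it can build the in-between one.
def added_latency_ms(fps, processing_ms=0.0):
    """One frame time of buffering plus whatever the TV spends computing
    the interpolated frame (processing_ms is just a placeholder here)."""
    return 1000.0 / fps + processing_ms

print(round(added_latency_ms(60), 1))                    # 16.7 ms at 60fps, before any processing
print(round(added_latency_ms(60, processing_ms=20), 1))  # e.g. ~36.7 ms with 20 ms of processing
```

Game mode exists precisely to skip that whole pipeline.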
 
I thought some TVs were 120hz? Is that not a thing? Would that not allow me to bump my FPS?

I'm pretty dumb when it comes to this stuff.

All high-end TVs have panels that are at least 120Hz internally; the trouble is that in most instances you can't input 120Hz (only PCs can output it, and since the market interested in that capability is minuscule, TV manufacturers haven't been active in offering the feature). But some TVs at least unofficially accept an actual 1080p@120Hz input. That was part of why I went with the Sony X85C when buying a TV this spring.
 
If you do run into CPU-bottlenecked games, it might be worth looking into high-speed RAM: http://www.overclock.net/t/1487162/...affect-fps-during-high-cpu-overhead-scenarios. Yes, it's expensive, but it beats buying a new CPU, a motherboard and DDR4 sticks.

Thanks, any particular DDR3 high-speed sticks you'd recommend? I currently have 8GB of Corsair Vengeance LP. Low-profile RAM is preferable since I've got one of those massive Noctua coolers (and I only realised after OCing how little I was taking advantage of it). I'll be able to buy 16GB of good RAM once I've sold off my 970.
 
Thanks, any particular DDR3 high-speed sticks you'd recommend? I currently have 8GB of Corsair Vengeance LP. Low-profile RAM is preferable since I've got one of those massive Noctua coolers (and I only realised after OCing how little I was taking advantage of it). I'll be able to buy 16GB of good RAM once I've sold off my 970.

Unfortunately I don't know much about RAM sticks. Just try to pick something that doesn't have terrible timings (lower timings are better). What I'm using is 2400MHz Kingston HyperX Beast with 11-13-14 timings (the first number is the most important, then the second, etc.). I have no idea whether that's good or not. Timings dictate how quickly the RAM responds, while MHz tells you the bandwidth. Today's games seem to respond better to higher bandwidth than to tighter timings, so don't worry too much about it.
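
If you want to compare kits on paper, here's the rough conversion, treating the advertised MHz as the effective DDR transfer rate (which is how DDR3 kits are normally labelled); the CL9-1600 kit in the example is just a made-up comparison point:

```python
# Rough DDR3 comparison: CAS latency (the first timing number) vs. the
# effective transfer rate the kit is advertised at (e.g. "2400MHz" DDR3).
def first_word_latency_ns(cas_latency, effective_mhz):
    """Time until the first word of a read arrives, in nanoseconds.
    DDR transfers twice per clock, so the real clock is effective/2."""
    real_clock_mhz = effective_mhz / 2.0
    return cas_latency / real_clock_mhz * 1000.0

def peak_bandwidth_gbs(effective_mhz, bus_bytes=8, channels=2):
    """Theoretical peak bandwidth in GB/s (64-bit bus per channel)."""
    return effective_mhz * 1e6 * bus_bytes * channels / 1e9

print(round(first_word_latency_ns(11, 2400), 2))  # the HyperX Beast kit above: ~9.17 ns
print(round(first_word_latency_ns(9, 1600), 2))   # a hypothetical CL9-1600 kit: 11.25 ns
print(peak_bandwidth_gbs(2400))                   # ~38.4 GB/s peak in dual channel
```

Which is roughly why looser-looking timings on a faster kit aren't necessarily worse in absolute terms.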
 
Wanna ask this question once more for the top of the page:

Is the Zotac Amp! (Non-extreme) a solid card with solid overclocking potential, or would I be better off looking at a Strix or Amp Extreme?
 
Man, after looking at 980 ti prices I'm just not sure what to do anymore. I've got around £1,200 to spend and with both an LG 34UM95 and a 1080 I'm cutting it very close to budget. The utter shitshow of prices (with UK prices being pretty much identical to the dollar) and availability is scaring me off the purchase despite knowing it'll be an amazing card for 3440x1440p.

How long am I going to have to wait for supply to level out and RRP to be reached? Because I'm more than willing to spend £600 on this if it means I'll be fine for at least 2 or 3 years. I'm just not willing to spend £700.
 
My watercooled 1080 gets up to 2126MHz on the core, but that seems to be the max for me. Temperatures are around 40-45 degrees under full load; I was hoping to crack that 2200MHz on the core.
 
So I guess I wait a year and get an 1170 and that should push 4k pretty well

Volta is 2018, and based on the inventory issues it's probable the 1080/1070 launches were pushed forward from what was originally planned. Pair that with the 970/980 lasting almost 2 years (September 2014 and only just replaced) and it's quite possible there will be no 11XX cards next year, with Nvidia instead opting to launch Volta products in Q1/Q2 of '18. Obviously subject to change if AMD puts out a competitive product, but going by the 480, that's unlikely.
 