Nvidia GeForce GTX 1080 reviews and benchmarks

The 980 Ti is already doing that. Anything you throw at it, the 980 Ti will happily drive at 1440p on Ultra at >60 fps. Battlefront started to give it a run for its money, but it hangs in there. Going up to a 1080 gives you a 50% increase in power for good measure.

Please point me to a source that shows the 1080 being 50% more powerful than a 980 Ti...
 
The Titan X usually throttled back to its base clocks within a few minutes. The cooler it shipped with was hilariously inadequate, but because it was so powerful at the time compared to the 980, and so few people even bought one, no one noticed or cared.

I see. I wasn't in the market for a card back then, so I ignored it.
 
Grrr

I'm waiting for a custom card, but I hear they won't be out for a looooong time, supposedly so the ~*~*~*~Founders Edition*~*~*~ can keep selling for $700 as long as it can.

I think it was JayzTwoCents who said this.
 
The 980 Ti is already doing that. Anything you throw at it, the 980 Ti will happily drive at 1440p on Ultra at >60 fps. Battlefront started to give it a run for its money, but it hangs in there.
I guess you've never played The Witcher III or Rise of the Tomb Raider? There are plenty of important games that the 980ti can't max out. I wish people would stop making blanket statements like this.
 
Should be a lot... the G-Sync GAF thread is pretty big, and NV has at least 70% GPU market share.


I'd need to see some relevant numbers before I believe that. G-Sync isn't even talked about that much in other rigging/clocking communities I frequent. It's brought up, but not often.
 
I guess you've never played The Witcher III or Rise of the Tomb Raider? There are plenty of important games that the 980ti can't max out. I wish people would stop making blanket statements like this.



A 980ti can't max out the Witcher 3 at 1440p?
 
hmm

So since they aren't getting much better results performance-wise, does that mean the cooler is actually decent?

Too bad they couldn't give it more juice via an 8+6 or 8+8 pin setup.

No, it just means the card can't draw enough power through that single 8-pin. It wouldn't surprise me if it can be pushed to 2.4-2.5GHz on a partner board with more voltage to play with.
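For a sense of the headroom involved, here is a rough in-spec power-budget sketch; it assumes the PCIe limits of 75W from the slot, 75W per 6-pin, and 150W per 8-pin (the connector layouts shown are illustrative):

```python
# Rough PCIe power-budget math using spec limits (boards can and do
# exceed these, but reference designs usually respect them).
PCIE_SLOT_W = 75  # power available through the PCIe slot itself
PIN6_W = 75       # per 6-pin connector
PIN8_W = 150      # per 8-pin connector

def board_power_limit(pin8: int = 0, pin6: int = 0) -> int:
    """Total in-spec board power for a given connector layout."""
    return PCIE_SLOT_W + pin8 * PIN8_W + pin6 * PIN6_W

print(board_power_limit(pin8=1))          # 225 W -- single 8-pin (Founders Edition)
print(board_power_limit(pin8=1, pin6=1))  # 300 W -- 8+6 partner board
print(board_power_limit(pin8=2))          # 375 W -- 8+8 partner board
```

With the Founders Edition rated at 180W TDP, an 8+6 or 8+8 layout leaves considerably more room for voltage and clocks.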
 
Eh, when I say value, I meant a substantial difference in price for a substantial difference in performance. Usually, the x80 is substantially more expensive with noticeably, but only minimally, better performance. This time, it seems like a bigger leap than usual.

My mistake. Benchmarks pending of course.

The 1080 looks to be better value than the 980 was, but it is still poor value relative to the x70 cards. That is to be expected, though: it is the current high end, so you pay a premium for it, and that's how it is.
 
A 980ti can't max out the Witcher 3 at 1440p?
No, even without GameWorks maxed there are some pretty demanding scenes.
[benchmark screenshots]
Really? I find that hard to believe; it's at 40+ fps even on a 970.
To be fair, it's mostly fine; however, the game has too many synchronous locks to maintain a solid 60 fps on anything.
 
Nvidia shill, we've got an Nvidia shill over here! :p

If you're not getting any compensation for inane comments like this...Then...Wow, you're wasting your life.

Sorry for not being a fucking asshole 24/7.

The only Nvidia card I've ever owned was an MX440. I wonder how many you've owned? The last four cards I've had were an ATI 9500 non-Pro, an X1300 Pro, a 6950, and now a 7950.

Nvidia shill here boys. Got me.

I do think the Founders Edition shit is stupid, and I guess they are trying to take advantage of people. Then again, it's not really taking advantage of someone if they willingly pay for it.

I dunno, I show people respect IRL. Maybe I'm a different breed; I don't laugh and snicker at people in a crowd or make rude comments and shit to people just trying to do their job.
 
I wouldn't say it's the fool's choice. It's the choice for anyone who's budget-restricted, assuming it isn't beaten by Polaris 10's performance.

Tbh I'm not surprised by the 1070's specs. Nvidia weren't going to repeat what they did with the 970; the 970 effectively made the 980 redundant.

Whether Nvidia did it by accident or design, the 970 was a good move. Having a card be so relatively close in performance to the top end, but at a budget price drove sales. Yes, it probably reduced sales of the 980 a little bit, but that market is different anyway - it is for those that want the best performance with no compromises, and are willing to pay for it.

Having the 1070 be a step back from that seems like a mistake to me. It is still a good card, but it isn't such amazing value anymore; it's just another in a line of slightly better performance for slightly more money. Maybe they want to push people onto the 1080, but I think that could backfire, because the kinds of people in the market for a 970/1070 are not likely to spend another $300 for a bit more performance.
 
A 980ti can't max out the Witcher 3 at 1440p?

Not at a locked 60fps, mainly because the scene complexity is all over the place. Some areas are significantly more demanding due to high foliage asset density and other effects. The druid camp in Skellige is a common performance crusher. Then you have stuff like HairWorks, which scales with proximity to the viewpoint, so if the camera swoops in during the flamethrower Igni or while indoors, the performance again takes a massive hit.

But it's perfectly playable, in my opinion, and it still averages very high. I play maxed out at 1440p, with tweaks to make it even more demanding, and with a 980 Ti I still average 40-50 fps.
 
Not at a locked 60fps, mainly because the scene complexity is all over the place. Some areas are significantly more demanding due to high foliage asset density and other effects. The druid camp in Skellige is a common performance crusher. Then you have stuff like HairWorks, which scales with proximity to the viewpoint, so if the camera swoops in during the flamethrower Igni or while indoors, the performance again takes a massive hit.

But it's perfectly playable, in my opinion, and it still averages very high. I play maxed out at 1440p, with tweaks to make it even more demanding, and with a 980 Ti I still average 40-50 fps.


Interesting. Makes sense, though. I spent the majority of my time playing Wild Hunt at 4K on my 970s, but I'd sometimes drop down to 1440p when things seemed to get a little intense for whatever reason. Performance skyrocketed.


Looking forward to seeing how the 1080 does. We have benches, yes, but I want to see how it performs on a board partner's OCed 1080 with less/no AA on.

It should really perform well at 4K, and it should have no problem maintaining close to 60 fps at all times at 1440p with tweaked/lowered AA.
 
Not at a locked 60fps, mainly because the scene complexity is all over the place. Some areas are significantly more demanding due to high foliage asset density and other effects. The druid camp in Skellige is a common performance crusher. Then you have stuff like HairWorks, which scales with proximity to the viewpoint, so if the camera swoops in during the flamethrower Igni or while indoors, the performance again takes a massive hit.

But it's perfectly playable, in my opinion, and it still averages very high. I play maxed out at 1440p, with tweaks to make it even more demanding, and with a 980 Ti I still average 40-50 fps.
What settings are the heavy hitters at that resolution?
 
Whether Nvidia did it by accident or design, the 970 was a good move. Having a card be so relatively close in performance to the top end, but at a budget price drove sales. Yes, it probably reduced sales of the 980 a little bit, but that market is different anyway - it is for those that want the best performance with no compromises, and are willing to pay for it.

Having the 1070 be a step back from that seems like a mistake to me. It is still a good card, but it isn't such amazing value anymore; it's just another in a line of slightly better performance for slightly more money. Maybe they want to push people onto the 1080, but I think that could backfire, because the kinds of people in the market for a 970/1070 are not likely to spend another $300 for a bit more performance.
This is a good post. You're convincing me to go 70.
 
I really don't understand why people are so eager to burden themselves with a 4K native display right now. You're delusional if you think single-GPU 4K60 at high settings is happening any time soon. Games aren't standing still; plenty of games hit high-end GPUs hard at 1080p right now, and the consoles are getting a decent hike in power soon.

Oh, and those who passed up on a 980 Ti to wait for the 1080 are going to be kicking themselves when the 1080 Ti launches. Ti > Ti is where it's at.
 
Not at a locked 60fps, mainly because the scene complexity is all over the place. Some areas are significantly more demanding due to high foliage asset density and other effects. The druid camp in Skellige is a common performance crusher. Then you have stuff like HairWorks, which scales with proximity to the viewpoint, so if the camera swoops in during the flamethrower Igni or while indoors, the performance again takes a massive hit.

But it's perfectly playable, in my opinion, and it still averages very high. I play maxed out at 1440p, with tweaks to make it even more demanding, and with a 980 Ti I still average 40-50 fps.

What tweaks do you use? I'm very curious, as I have the exact same system as you for my backup PC; I wouldn't mind having The Witcher 3 on there.
 
The 980 Ti is still a monster card. Upgrading to a 1080 is a waste of money IMO; just wait for the 1080 Ti.

Some people really feel the urge to max everything out or just want to have the best thing out there. And that's okay; PC gaming is a hobby, not a science. It's fine as long as they don't build up debt or neglect responsibilities.

Still, I agree: in most cases going from the 980 Ti to the 1080 isn't necessary, because the new 1080 is still not good enough for 4K gaming (I really hoped it would be...). I, for example, will stay on my 980 and my 27" 1080p display and see if we get better GPUs and 4K HDR displays in 2017.

You mean, it looks hot? ;)

It seems to be. It's reaching 82°C during heavy gaming sessions, according to some reviews. That's not good at all.
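For context, that reading sits right at GPU Boost's thermal ceiling: the Founders Edition targets 83°C and sheds clock in small bins as it approaches it. A toy sketch of that behavior, where the ~13MHz bin size and the soft-start temperature are illustrative assumptions, not Nvidia's actual algorithm:

```python
# Toy model of GPU Boost-style thermal throttling: drop one boost bin
# per degree above a soft threshold, down toward the temperature target.
# Bin size and soft threshold here are assumptions for illustration.
TEMP_TARGET_C = 83   # 1080 Founders Edition temperature target
BIN_MHZ = 13         # approximate GPU Boost clock-bin size
SOFT_START_C = 75    # assumed temperature where bins start dropping

def boosted_clock(max_boost_mhz: float, temp_c: float) -> float:
    if temp_c <= SOFT_START_C:
        return max_boost_mhz
    bins_dropped = min(temp_c, TEMP_TARGET_C) - SOFT_START_C
    return max_boost_mhz - bins_dropped * BIN_MHZ

print(boosted_clock(1733, 70))  # 1733 MHz: running cool, full boost
print(boosted_clock(1733, 83))  # 1629 MHz: at target, ~100MHz shed
```

That is why reviews show the card settling noticeably below its peak boost clock once it heats up.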
 
I'd need to see some relevant numbers before I believe that. G-Sync isn't even talked about that much in other rigging/clocking communities I frequent. It's brought up, but not often.

Each of the 1440p G-Sync monitors has a gigantic forum thread on [H], overclock.net, and overclockers.co.uk.
 
The 980 Ti doesn't do that, nor will a 1080 (it's also not a 50% increase).

Please point me to a source that shows the 1080 being 50% more powerful than a 980 Ti...

Raw power? A stock 980 Ti is 5.63 TFLOPS, the 1080 is 9 TFLOPS. That's a tad more than 50% extra power. It's 8.4 TFLOPS for a 1500MHz 980 Ti and about 10.8 for a 2100MHz 1080, so roughly 27% more power at max OC. Depending on how hard you push your cards, you're looking at anywhere between roughly 25% and 45% extra power.

Oh wait, my bad, it's 8.2, not 9. So stock vs. stock it's more like 45%.
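(Those figures follow from the standard theoretical FP32 throughput formula of 2 FLOPs per CUDA core per clock, i.e. one fused multiply-add. A quick sketch using the published core counts, 2816 for the 980 Ti and 2560 for the 1080:

```python
# Theoretical FP32 throughput: 2 FLOPs (one fused multiply-add)
# per CUDA core per clock cycle.
def tflops(cores: int, clock_mhz: float) -> float:
    return 2 * cores * clock_mhz / 1e6

print(tflops(2816, 1000))  # ~5.63 TFLOPS, 980 Ti at base clock
print(tflops(2560, 1607))  # ~8.23 TFLOPS, 1080 at base clock
print(tflops(2816, 1500))  # ~8.45 TFLOPS, 980 Ti at a 1500MHz OC
print(tflops(2560, 2100))  # ~10.75 TFLOPS, 1080 at a 2100MHz OC
```

Theoretical TFLOPS compares shader throughput only, so it won't map one-to-one onto game benchmarks.)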
 
If you have a 980 Ti and are thinking about upgrading to the 1080, save the money and invest in a great G-Sync monitor instead. It's definitely a better upgrade than this sidegrade.
 
That's the beauty of waiting for the AIB cards from other vendors: they already overclock them for you.

I'd still wait; there are usually plenty of cards that come overclocked out of the box. You'll most likely have a faster, cheaper, and quieter card if you just wait a couple of weeks.

Then you have even more reason to wait for partner cards, which will be factory-overclocked :)

Sheesh... OK, I will wait. I am so impatient.
 
The 980 Ti is already doing that. Anything you throw at it, the 980 Ti will happily drive at 1440p on Ultra at >60 fps. Battlefront started to give it a run for its money, but it hangs in there. Going up to a 1080 gives you a 50% increase in power for good measure.

This is factually incorrect. Play The Division, or The Witcher III, or Rise of the Tomb Raider, or Dark Souls III, or GTA V, or Hitman, etc., and tell me that.

The 980 Ti is an excellent card for 1440p, but there are many games where you aren't going to get a locked 60fps at max settings.

And no, the 1080 doesn't provide a 50% increase in power over the 980 Ti. This post is all kinds of wrong.
 
Raw power? A stock 980 Ti is 5.63 TFLOPS, the 1080 is 9 TFLOPS. That's a tad more than 50% extra power. It's 8.4 TFLOPS for a 1500MHz 980 Ti and about 10.8 for a 2100MHz 1080, so roughly 27% more power at max OC. Depending on how hard you push your cards, you're looking at anywhere between roughly 25% and 45% extra power.

Oh wait, my bad, it's 8.2, not 9. So stock vs. stock it's more like 45%.

Neither figure is right: the 980 Ti isn't 5.6, nor is the 1080 8.2. Both cards basically never run at their base clocks, and even their rated boost clocks are conservative, as cards usually hit higher than that. The 980 Ti is 6+, the 1080 is 8.9+.
 
If you ask me, dropping 3- and 4-way SLI support is actually a surprisingly consumer-friendly move.

In 99% of real-world cases, it has been a terrible idea for a long time now, if not always. Still, it allowed NV to sell one or two additional GPUs to insane people, and they are giving that up.

Every extra card you add also adds an extra frame of input lag.
+ 3 frames of input lag with 4x SLI then
You couldn't pay me to play with that much input lag

Just the +1 from SLI is already throwing the baby out with the bathwater as far as I'm concerned, especially since SLI doesn't even scale at 100 percent.
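For rough numbers, a small sketch assuming the common rule of thumb that AFR adds about one frame of latency per extra GPU (an approximation, not a measurement):

```python
# Approximate added input lag under AFR SLI, assuming ~1 extra frame
# of latency per additional GPU (rule of thumb, not a measurement).
def added_lag_ms(num_gpus: int, fps: float) -> float:
    frame_time_ms = 1000.0 / fps
    return (num_gpus - 1) * frame_time_ms

print(added_lag_ms(2, 60))  # ~16.7 ms for 2-way SLI at 60 fps
print(added_lag_ms(4, 60))  # ~50.0 ms for 4-way SLI at 60 fps
```

At 60 fps, that +3 frames from 4-way SLI works out to roughly 50ms on top of the pipeline's existing latency.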

"Premium Cooling" as it chokes itself back to 1600~ for half of the gaming session.

Blower coolers have always sucked.

Nvidia are so full of shit with their 'premium materials'

I hope it's worth it to them to piss off a bunch of people who buy this only to find out their card throttles, all for a quick early-adopter cash grab...

You'd think they'd be smarter than this, but some beancounter must have decided the cash grab is worth more than the loss of user goodwill.
 
Some nice results coming in; it seems Nvidia have delivered with Pascal. I just need to see the 1070 results now so I can make up my mind on which one I'm getting. I'm not paying that extra premium for the Founders Edition, though; hopefully Asus or MSI will have some nice OC cards with even better coolers at a decent price.
 
With all this talk about the cards hitting 82°C with the fan at full power, people need to realise that a lot of review sites will be using open-air test beds. These don't offer the best cooling, so these temps should be lower inside a sealed case with plenty of air being pulled in via the front fans.

As for me, as a 980 Ti owner I'll be upgrading to the 1080 purely because I own a 165Hz G-Sync monitor, so the more frames I can get at 1440p, the better :)
 