Nvidia GeForce GTX 1080 reviews and benchmarks

If you ask me, dropping 3- and 4-way SLI support is actually a surprisingly consumer-friendly move.

In 99% of real-world cases, it has been a terrible idea for a long time now, if not always. Still, it allowed NV to sell one or two additional GPUs to insane people, and they're giving that up.

Agreed. I'd be interested to see how many developers have complained or asked Nvidia for SLI "to just go away". I don't see Nvidia simply making this decision without any merit.

Still, as much as one could argue against it, the way cards are going it doesn't make much sense. Unless you're a super enthusiast who simply has to have bleeding edge setups.

On a different note, has anyone seen any places that are taking pre-orders for the GTX 1080? Yeah, I've got a GTX 980 Ti, but I sold the GTX 970 from my HTPC and plan on moving the GTX 980 Ti there and putting the 1080 in my main PC. I'm surprised, as I'd figured Nvidia would be taking orders already, since they're selling cards directly this time around.
 
Downsampling, VR, and being more likely to play the latest games at 1080p max settings. I don't know why people keep saying the 1080 is overkill for 1080p; the graphics in AAA games aren't going to just stay the same forever.

It will for the near future, as most multiplats will be tied to their console counterparts.
 

Durante

Member
Wait, holy crap, this is a thing? I clearly haven't been paying close enough attention to the news.

Game changer if this actually works. I was planning to switch to AMD for my next GPU, but not if nVidia offers an easy way to get real triple buffering working.
Yeah, from the demonstration it's just another driver option (in addition to vsync and adaptive sync).
 

Gbraga

Member
So FastSync is basically real triple buffering that can be applied at the driver level and works in fullscreen mode?

It may not be revolutionary and deserving of a new name, but it sounds pretty damn good.
 

Knurek

Member
Yeah, from the demonstration it's just another driver option (in addition to vsync and adaptive sync).

To be fair, the demonstration mentioned FastSync being usable mostly in eSports scenario - high framerate gaming on low framerate monitor.

This is definitely something that should only be enabled in the NVIDIA Control Panel for those games that are running at frame rates well above the maximum refresh rate of your display. FastSync will by its very nature introduce some variability to the smoothness of your game, as it is "dropping" frames on purpose to keep you in time with the refresh of your monitor while also not putting backpressure on the game engine to keep timings lined up. At high frame rates, this variability isn't really an issue, as the frame times are so short (200 FPS = 5 ms) that +/- 5 ms is the maximum frame-to-frame variance you would see. At lower frame rates, say 45 FPS, you could get as much as a 22 ms variance.
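The variance numbers in that quote fall straight out of the frame-time math. A quick sketch of it (my own back-of-the-envelope, not from the review): since FastSync may drop a whole frame to stay aligned with the display refresh, the worst-case frame-to-frame variance is roughly one full frame time at the game's render rate.

```python
# Frame-time arithmetic behind the FastSync variance argument.
# Dropping one frame shifts presentation by up to one frame time,
# so worst-case frame-to-frame variance ~= one frame time.

def frame_time_ms(fps: float) -> float:
    """Time to render one frame, in milliseconds."""
    return 1000.0 / fps

for fps in (200, 45):
    ft = frame_time_ms(fps)
    print(f"{fps} FPS -> {ft:.1f} ms per frame, "
          f"worst-case variance ~{ft:.0f} ms")
```

Which is where the "+/- 5 ms at 200 FPS, up to 22 ms at 45 FPS" figures come from.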
 
I have it hooked up to my 50 inch TV, so 1080p is the limit right now. At that resolution, would there be any reason for a 1080? Any benefit in emulators like Dolphin?

With the 1070, I'd imagine you could max out games @1080P/60fps for 2-3+ years before having to worry about lowering many settings. With a 1080, you'd likely get a solid 3-4+ years. All speculation, of course.

Dolphin is very CPU dependent, and frankly any of the GPUs you're looking at will blow through Dolphin-emulated games like nothing.
 

Hawkian

The Cryptarch's Bane
Welp. Just ordered an HTC Vive.

I suppose that the date it actually ships will determine which overpriced iteration of this card is the one to get my dollar bills.
 

Weevilone

Member
Not really, throttling at boost speeds is a fairly consistent phenomenon across the few reviews I've checked.

I saw the occasional momentary dips in the HardOCP testing, and I saw that Tom's was able to get it down past the floor with synthetic stress.
 
EDIT: I just got far enough into the video. They are actually NOT binned chips, so there is no overclocking advantage there. I doubt their "better thermals" are going to compete with the partner coolers, which typically do a way better job than the reference designs and will STILL wind up being cheaper than the Founders Edition.

Why would anyone get this thing?

Purely for dick waving purposes.

And I guess people who are extremely impatient.
 

fred

Member
Sounds like the 1070 is the fool's choice. 1080 or 980 Ti, if you must buy a new card now.

I wouldn't say it's the fool's choice. It's the choice for anyone that's budget restricted, assuming that it isn't beaten by the Polaris 10's performance.

Tbh I'm not surprised by the 1070's specs, NVidia weren't going to repeat what they did with the 970, the 970 effectively made the 980 redundant.
 
I have it hooked up to my 50 inch TV, so 1080p is the limit right now. At that resolution, would there be any reason for a 1080? Any benefit in emulators like Dolphin?

Dolphin (like other intensive emulators) is CPU dependent, and at that almost entirely single-core dependent (a second core helps somewhat, but beyond that almost nothing). A poor graphics card does impact Dolphin performance (integrated graphics, for example, won't work well), but any decent GPU will suffice unless you're going for REALLY high rendering resolutions.

Not that I'm necessarily recommending anything, but don't discount the advantages of downsampling! Even on a 1080p screen, rendering a game at 4K will make things look a lot cleaner. I'm not sure that it looks so good I'd buy a better card for it, but it makes old games look SUPER nice!
 

Bboy AJ

My dog was murdered by a 3.5mm audio port and I will not rest until the standard is dead
Wow, the x80 card actually seems to be a better value than the x70 card. That hasn't been the case since... Before the 600 series?
 

J-Rzez

Member
The questions were good but the delivery and attitude of some of the people weren't.

No, they were great. We need more of this: calling out companies for their BS instead of softies going in thinking "better not show disgust or I won't get invited for more free food and swag."

I applaud these guys; if I saw them in public, drinks are on me.
 

Odrion

Banned
Where are the 1070 benchmarks? Why are people talking about the 1070 as if actual benchmarks are out?
Wow, the x80 card actually seems to be a better value than the x70 card. That hasn't been the case since... Before the 600 series?
Where are the alarming large amount of posts of people telling you the 1080 won't be stronger than the 980ti?
 

Odrion

Banned
How would the x80 be a better value than the x70? It costs 58% more, and at most there will probably be a 30% difference in performance.
Historically, no. But it could happen?

I mean, let me pull out Bullshit Math™...

edit: So the 1080 is, let's say, around 25% better than the 980 Ti. And that's $600.

And the price difference between the 1070 and 1080 (without that FE fucking bullshit) is 57%.

If the 1070 is anywhere near a 980 Ti, or heck, even if it's around 15% less, it will still probably be the better value.
 
Historically, no. But it could happen?

I mean, let me pull out Bullshit Math™...

Yeah, let me do some bullshit math for you. To be a better value than the 1070, the 1080 would have to be at least 58% faster, which would put the 1070 at around 980-level performance.

What might be true, though, is that the 1070 doesn't seem to have the same value advantage relative to the 1080 (25% of cores cut) that the 970 had compared to the 980.
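The 58% break-even figure above falls out of the MSRPs directly. A quick sketch (prices are the $379/$599 MSRPs quoted in the thread; the performance baseline is purely hypothetical, since no real 1070 benchmarks exist yet):

```python
# Perf-per-dollar break-even for the 1070 vs. 1080 value argument.
# Prices are the quoted non-FE MSRPs; performance is a placeholder.

price_1070 = 379.0
price_1080 = 599.0

premium = price_1080 / price_1070 - 1
print(f"1080 price premium over 1070: {premium:.0%}")

# For equal perf-per-dollar, the 1080 must outrun the 1070 by that
# same ratio:
perf_1070 = 1.00  # hypothetical baseline
breakeven_perf_1080 = perf_1070 * (price_1080 / price_1070)
print(f"1080 must be {breakeven_perf_1080:.2f}x the 1070 to break even")
```

So unless the 1080 is ~58% faster than the 1070, the cheaper card wins on perf-per-dollar.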
 

x3sphere

Member
Historically, no. But it could happen?

I mean, let me pull out Bullshit Math™...

edit: So the 1080 is, let's say, around 25% better than the 980 Ti. And that's $600.

And the price difference between the 1070 and 1080 (without that FE fucking bullshit) is 57%.

If the 1070 is anywhere near a 980 Ti, or heck, even if it's around 15% less, it will still probably be the better value.

Yeah, exactly. For the 1080 to be considered a better value, it would have to be only like 5% over the 980. That's not going to happen, as Nvidia already said it's faster than a Titan X. Sure, maybe they exaggerated about that and it's slower in some games, but it's not going to be anywhere near 15-20% slower; otherwise they were outright lying.

What might be true, though, is that the 1070 doesn't seem to have the same value advantage relative to the 1080 (25% of cores cut) that the 970 had compared to the 980.

That is what I expect.
 

Odrion

Banned
Yeah, exactly. For the 1080 to be considered a better value, it would have to be only like 5% over the 980. That's not going to happen, as Nvidia already said it's faster than a Titan X. Sure, maybe they exaggerated about that and it's slower in some games, but it's not going to be anywhere near 15-20% slower; otherwise they were outright lying.
Yeah, no way. Absolutely no way.

I'm sticking with my unfounded (well, influenced by memories of past card launches) belief that it'll be 10-15% less powerful than a reference 980 Ti, but will OC past the 980 Ti's reference performance so people can still get hyped. Your MSI G1 Gamer-whatevers and your EVGA Illuminati-blah cards will be factory clocked to be as powerful as, or more powerful than, a reference-clocked 980 Ti.

edit: The Titan X has 10% more CUDA cores than the 980 Ti, and the performance difference is nearly negligible. 25% may seem significant, but it still may not add up to much. Don't panic until there's hard evidence of its performance.
 

demigod

Member
This been posted yet? Pardon me if it has.

NVIDIA on founders edition.

This was very, very hard to watch...

So let me get this straight. Baldie says partners will release their custom coolers on the 27th at $649, while the Rip Me Off cards will be $699. Then he later says there will be cards priced higher than $699; I'm assuming those are the hydro/classified models. THEN he says partners will also have gimped old-reference-design cards at $599.

Whichever company releases their custom card at $599 gets my money. Not holding my breath, tho.
 
If you ask me, dropping 3- and 4-way SLI support is actually a surprisingly consumer-friendly move.

In 99% of real-world cases, it has been a terrible idea for a long time now, if not always. Still, it allowed NV to sell one or two additional GPUs to insane people, and they're giving that up.

There's a poster who sometimes links his videos here of various games running at 5K on a quad Titan X system, if I remember correctly, and I don't know how he plays his games like that. The microstutter makes his 40 to 70 fps look like <20.
 

Bboy AJ

My dog was murdered by a 3.5mm audio port and I will not rest until the standard is dead
Historically, no. But it could happen?

I mean, let me pull out Bullshit Math™...

edit: So the 1080 is, let's say, around 25% better than the 980 Ti. And that's $600.

And the price difference between the 1070 and 1080 (without that FE fucking bullshit) is 57%.

If the 1070 is anywhere near a 980 Ti, or heck, even if it's around 15% less, it will still probably be the better value.
Eh, when I said value, I meant a substantial difference in price for a substantial difference in performance. Usually, the x80 is substantially more expensive with noticeably, but only minimally, better performance. This time, it seems like a bigger leap than usual.

My mistake. Benchmarks pending of course.
 

Haint

Member
I wouldn't say it's the fool's choice. It's the choice for anyone that's budget restricted, assuming that it isn't beaten by the Polaris 10's performance.

Tbh I'm not surprised by the 1070's specs, NVidia weren't going to repeat what they did with the 970, the 970 effectively made the 980 redundant.

The 970 is the highest-selling enthusiast card they've had in years (likely in their entire history, actually) and was the primary contributor to doubling their dGPU sales YoY last year (single-handedly more than doubling 980, 980 Ti, and Titan sales combined). I'd say they're fools not to do everything in their power to repeat what they did with the 970. The market for $600 and $700+ cards is demonstrably very small, and the volume a fantastic $350 card can do will eclipse the profits and revenue they will ever see in the overpriced high end.
 

Renekton

Member
The 970 is the highest-selling enthusiast card they've had in years (likely in their entire history, actually) and was the primary contributor to doubling their dGPU sales YoY last year (single-handedly more than doubling 980, 980 Ti, and Titan sales combined). I'd say they're fools not to do everything in their power to repeat what they did with the 970.
It is possible existing 970 owners will stand pat this time, contented with 1080p/60/high. They ain't chasing the dragon like we are.

Plus, the 970 situation was only possible after the process node matured enough to get good yields. 16FF+ needs time.
 
Yes, when do the preorders officially start?

They'll start when various web stores update. Nobody ever announces these things in advance. Some really crazy people write scripts to refresh Newegg or another site continuously and notify them when a change is made.

I'm waiting for custom cooler cards, so I'm just going to let the ravening hordes DDoS Newegg for FE cards. Just say no to stock clocks.
 

Haint

Member
It is possible existing 970 owners will stand pat this time, contented with 1080p/60/high. They ain't chasing the dragon like we are.

Plus, the 970 situation was only possible after the process node matured enough to get good yields. 16FF+ needs time.

As a for-profit corporation, their goal is to have every customer chasing the dragon. To do that, you put out an even better product that makes even the contented holdouts want it. We'll have to wait and see what the supply is really like.
 

BigTnaples

Todd Howard's Secret GAF Account
So let me get this straight. Baldie says partners will release their custom coolers on the 27th at $649, while the Rip Me Off cards will be $699. Then he later says there will be cards priced higher than $699; I'm assuming those are the hydro/classified models. THEN he says partners will also have gimped old-reference-design cards at $599.

Whichever company releases their custom card at $599 gets my money. Not holding my breath, tho.



Wait, the 27th of this month we get custom-cooled OC variants from EVGA and the like?


Hell yes.
 

Renekton

Member
As a for profit corporation, their goal is to have every customer chasing the dragon. To do that, you put out an even better product that makes even the contented indifferents want it. We'll have to wait and see what the supply is really like.
Their demand probably isn't consistent the way Apple's is.

We'll soon find out where consumers stand when the $379 1070 and sub-$300 Polaris 10 come out. Maybe the 270/760 holdouts are feeling the squeeze right about now, and AMD is betting on that.
 
So let me get this straight. Baldie says partners will release their custom coolers on the 27th at $649, while the Rip Me Off cards will be $699. Then he later says there will be cards priced higher than $699; I'm assuming those are the hydro/classified models. THEN he says partners will also have gimped old-reference-design cards at $599.

Whichever company releases their custom card at $599 gets my money. Not holding my breath, tho.


Wait, the 27th of this month we get custom-cooled OC variants from EVGA and the like?


Hell yes.

Pretty sure baldie said that the partners would receive their boards on the 27th, not release them. I don't think we are getting any partner cards for a few weeks. Maybe we'll see them announced at Computex.
 

BigTnaples

Todd Howard's Secret GAF Account
Pretty sure baldie said that the partners would receive their boards on the 27th, not release them. I don't think we are getting any partner cards for a few weeks. Maybe we'll see them announced at Computex.


Ah fair enough. I can wait a few weeks for sure. 970 SLI is doing me really well right now.
 
Ah fair enough. I can wait a few weeks for sure. 970 SLI is doing me really well right now.

Yeah, I don't get how people are so impatient that they can't make it another couple of weeks, especially knowing that the cards will be better. There's really nothing out right now that demands a new card anyway. I'm running integrated graphics at the moment, but I'm definitely holding off, especially considering some of those thermal graphs that were posted today.
 

demigod

Member
Pretty sure baldie said that the partners would receive their boards on the 27th, not release them. I don't think we are getting any partner cards for a few weeks. Maybe we'll see them announced at Computex.

Just went back, and he says he doesn't know when they'll be available but believes they will start showing (to me, that sounds like selling) on the 27th. But yeah, I'm sure you're right about them not being available until a couple of weeks later; gotta get that Nvidia tax to work first!

There's already a couple of people in this thread ready to preorder the FE, lol.
 
Just went back, and he says he doesn't know when they'll be available but believes they will start showing (to me, that sounds like selling) on the 27th. But yeah, I'm sure you're right about them not being available until a couple of weeks later; gotta get that Nvidia tax to work first!

There's already a couple of people in this thread ready to preorder the FE, lol.

Oh OK, well that's slightly better news. I could've sworn the first thing he said about the 27th was that that's when the partners were receiving their boards. Too lazy to watch it again, and I don't feel like watching Nvidia trip all over themselves again. You'd think they would've been prepared for the questions they got, and the negativity, but it seemed to catch them off guard.
 

Odrion

Banned
It is possible existing 970 owners will stand pat this time, contented with 1080p/60/high. They ain't chasing the dragon like we are.

Plus 970 situation was only possible after the process node matured enough to get nice yields. 16FF+ needs time.
I wish Nvidia would adopt FreeSync and rename it Diet G-Sync or whatever. Even though they're selling $700 monitors, I think offering a significantly cheaper solution would help them in the long run.

Adaptive sync technology is a great way to give people an incentive to upgrade more often. "But it would allow older video cards to remain relevant as well." That's true! But it also means every frame gained matters. People on 1080p/60fps monitors are still mostly fine with their 970s or 780s if games aren't really begging for Pascal at that framerate and resolution, but if a new video card means the games they're currently playing go from 60 fps to 90 fps, that proposition becomes very tempting.
 

Renekton

Member
I wish Nvidia would adopt FreeSync and rename it Diet G-Sync or whatever. Even though they're selling $700 monitors, I think offering a significantly cheaper solution would help them in the long run.
I think Nvidia will profit more from GSync because owners are vendor-locked to Nvidia cards for a very long time.

How long does a monitor last, 10 years?
 

holygeesus

Banned
No, a 1080 would be complete overkill for a 1080p TV setup.

That depends, really. If you are not happy making compromises to hit 60 fps, neither a 980 Ti nor a 1070 will be enough to properly max out settings. That is with current games that have console equivalents, too, e.g. The Division and Rise of the Tomb Raider; neither game can be maxed out completely on my 6700K/980 Ti (OC) combo while keeping a stable 60.

That is now. Future games may well be more demanding still, so a 1080 is still a legit choice for 1080p gaming depending on your requirements.
 

Weevilone

Member
I think Nvidia will profit more from GSync because owners are vendor-locked to Nvidia cards for a very long time.

How long does a monitor last, 10 years?

Physically, maybe. But with aspect ratio changes, resolution changes, refresh rate improvements, etc., I'd say less.
 

Derp

Member
I just want a card that will maintain 1440p 60 FPS at max settings on any game (until a reasonable point in the future of course). I don't think that will exist for a long while :/
 
I just want a card that will maintain 1440p 60 FPS at max settings on any game (until a reasonable point in the future of course). I don't think that will exist for a long while :/

The 980 Ti is already doing that. Anything you throw at it, the 980 Ti will happily drive at 1440p on Ultra at >60 fps. Battlefront started to give it a run for its money, but it hangs in there. Going up to a 1080 gives you a 50% increase in power for good measure.
 

Odrion

Banned
I think Nvidia will profit more from GSync because owners are vendor-locked to Nvidia cards for a very long time.

How long does a monitor last, 10 years?
What's the adoption rate of $700 G-Sync monitors? Yeah, people on those monitors are stuck until they sell them, but what does that amount to, really?

Also, couldn't Nvidia just have "Diet G-Sync"-branded monitors using the FreeSync solution, and lock AMD users out of them, if that was an important factor?
 
The 980 Ti is already doing that. Anything you throw at it, the 980 Ti will happily drive at 1440p on Ultra at >60 fps. Battlefront started to give it a run for its money, but it hangs in there. Going up to a 1080 gives you a 50% increase in power for good measure.

The 980 Ti doesn't do that, nor will a 1080 (it's also not a 50% increase).
 