Nvidia GeForce GTX 1080 reviews and benchmarks

Disappointed at the 1080. Can't do 4K VR. The end.

lol what :-D
guessing this is a joke
Also, I'm pretty sure it can; it just depends on what level of detail you expect and which game you're talking about. Besides, no VR headset supports 4K anyway.
 
I always assume max settings, because otherwise a lot of video cards can do 4K if you lower your video settings enough.
 

So we're betting on the graphics quality of games stagnating because of consoles, so that one day we may enjoy 2x 4K at 90 Hz on max settings.
 
What determines your boost speed out of interest? My base clock is ~1250 and my memory 1952MHz but my boost is only 1338???

Each card has its own curve depending on the quality of the chip; two separate 980 Tis with an identical base clock can boost to different levels, for instance. From everything I've seen, higher ASIC quality GPUs boost higher off of a given base clock since they require less voltage.
 
Thanks for the detailed and insightful answer.

The same question is asked and answered on every page. If you wanted more details then it's pretty easy to find them.

Each card has its own curve depending on the quality of the chip; two separate 980 Tis with an identical base clock can boost to different levels, for instance. From everything I've seen, higher ASIC quality GPUs boost higher off of a given base clock since they require less voltage.

Thanks.
 
In my experience they absolutely do (to varying levels), even on their crappy stock cooler.

Only the unlocked chips; I think there are even time limits on the normal chips. And if we're talking reference design, then try using an Intel-made motherboard.
 
The PCB of a $700 "high-end" GPU!
Wow, that's a new low for Nvidia.

As we've been saying, it's literally nothing but an early adopter's tax and yes it's shameless and a new low.

It works, though, as people are now adopting the narrative that a decent PCB and aftermarket cooler have to cost $100 more than a trash-tier PCB and cooler (like this reference shit, ironically).


That video just shows how bad it really is. They're pinching pennies on a SEVEN HUNDRED dollar product; just ridiculous.

This is the kind of thing you'd expect on some MX or OEM version of a low-end part, not on (according to some superfans on this forum) a 'top shelf' part.
 
I got offered a couple-week-old EVGA 980 Ti FTW edition for $450. Do it??? I play at 1440p and 4K. I'll OC it further. I figure with SLI benchmarks doing well at 4K, I can add another down the line when prices drop further...
 

Depends on whether you can wait for the non-FE 1070 to come out (and assuming partner cards actually meet the non-FE MSRP).
 

This. I picked up an EVGA 980 Ti SC+ BP last weekend for a bit less than that. I have no regrets, but I did it because most of my quality gaming time over the next year (at least) will be in the next 3 months. I've also been sitting on a G-Sync monitor since August. I have it paired with a 3770K, and it's a magical experience.

With that said, I doubt finding a 980 Ti for $450 is going to be much of a problem going forward, so waiting to see what the market looks like once the third-party 1070s/1080s are out there makes a whole lot of sense.
 
Each card has its own curve depending on the quality of the chip; two separate 980 Tis with an identical base clock can boost to different levels, for instance. From everything I've seen, higher ASIC quality GPUs boost higher off of a given base clock since they require less voltage.

Actually, checking the boost using Afterburner, I'm hitting near 1500, so I suspect GPU-Z just states your minimum boost speed or something.
 
Perhaps that's why it's called boost. Intel CPUs don't maintain their turbo on all cores if you take a completely hands off approach.

It's also been shown that the FE does this and more with some pretty simple adjustments to fan speed and such. No modding or anything necessary. Of course, this might depend on the particular game, since data is still somewhat limited.

In its default state, my 4690K maintains a turbo clock much longer than the 1080.

Disappointed at the 1080. Can't do 4K VR. The end.

This is the epitome of unrealistic expectations.
 
Actually, checking the boost using Afterburner, I'm hitting near 1500, so I suspect GPU-Z just states your minimum boost speed or something.

GPU-Z doesn't match what you'll actually get in game. Inspector, however, has a pretty accurate estimated max boost readout that you'll maintain as long as you don't throttle due to power or temps.
 
Only the unlocked chips; I think there are even time limits on the normal chips. And if we're talking reference design, then try using an Intel-made motherboard.

I think post-Ivy Bridge it's very much up to the motherboard manufacturers to allow it. I typically don't use budget boards, so that's the most likely explanation.
 
Oh god, would people please stop... just don't buy the reference card if you have issues with it.

As someone who is interested in the 1080, I'm glad these discussions are being had in the thread. I like to be as informed as possible about my purchase and the technical specs/abilities of the card, beyond nVidia's marketing bullshit.

Why would anyone wanna stifle productive technical discussion?
 
Yeah, I'd much rather discuss the technical aspects than whine about the marketing, etc.
 
If that was a jab at me, then I'm not complaining about their marketing, but rather making a point that NO company would be forthcoming about the shortcomings of a product before release. The onus is on the reviewers, experts, and buyers to scrutinize the card and figure out what it can and cannot do.

All you're gonna get from nVidia are pretty nondescript graphs and "THIS IS THE BEST GTX CARD YET", which while true, leaves a lot in the dark for those of us hoping to upgrade from cards like the 980 Ti.
 
I was in complete agreement with you. You may be paranoid, but your take is on point.
 
Why would anyone wanna stifle productive technical discussion?

Well, it's not really productive; it won't change the reference/Founders card.

1) It's $100 too expensive (and much more in Europe); it's all about supply and demand. That alone should keep people away from it.
2) It can't sustain an overclock on its default fan/voltage curve (but can if you change it).
3) It has sub-par power delivery; everyone should know that just by looking at that lonely 8-pin power connector.
4) It almost certainly has the worst and noisiest cooler of all the 1080 cards that will ever be produced (maybe with the exception of the Galax... that thing looks horrible, but on their own page they have changed the pictures to the reference cooler).

We basically already knew all of this 10 days ago. My best advice is to not buy the reference/Founders card and to wait for the actual cards to come out...

Just like I wouldn't buy a motherboard made by Intel.
 
https://www.youtube.com/watch?v=myDYnofz_JE

Analysis by AdoredTV:
  • 3% faster than the GTX 980 Ti at 1430 MHz
  • 10% slower than the Titan X at 1500 MHz
  • Much better than an overclocked Fury X at ~1150 MHz
  • The Founder's Edition throttles and wasn't benched in a case, but rather on an open bench = a different thermal environment
  • The usual clock at the stock power and temp targets after 20 minutes of gaming is only the base clock...
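For anyone comparing those figures, here's a quick back-of-the-envelope normalization. The percentages are taken from the post above; the 100-point index and the reading of "10% slower" as a ratio are my own interpretation, so treat this as a rough sketch, not benchmark data:

```python
# Normalize the relative-performance figures quoted above to a common index.
# Baseline: the overclocked GTX 980 Ti at 1430 MHz = 100 (arbitrary index).
gtx_980ti = 100.0
gtx_1080 = gtx_980ti * 1.03        # "3% faster than GTX 980 Ti at 1430 MHz"
titan_x = gtx_1080 / (1 - 0.10)    # 1080 is "10% slower than Titan X at 1500 MHz"

print(f"980 Ti @ 1430 MHz:  {gtx_980ti:.1f}")
print(f"GTX 1080:           {gtx_1080:.1f}")   # 103.0
print(f"Titan X @ 1500 MHz: {titan_x:.1f}")    # ~114.4
```

On that reading, all three heavily overclocked cards land within roughly 15% of each other, which is why the throttling behavior matters more than the headline number.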

Aren't there only reference Titan Xs available?
 
If that was a jab at me, then I'm not complaining about their marketing, but rather making a point that NO company would be forthcoming about the shortcomings of a product before release. The onus is on the reviewers, experts, and buyers to scrutinize the card and figure out what it can and cannot do.

All you're gonna get from nVidia are pretty nondescript graphs and "THIS IS THE BEST GTX CARD YET", which while true, leaves a lot in the dark for those of us hoping to upgrade from cards like the 980 Ti.

There's nothing in the dark. Reference vs. reference is a good representation of what you'll see OC vs. OC as well. If you want an OC 1080 or are running an OC 980 Ti, the reference 1080 is obviously not for you. I'd argue that the 1080 in general is a bad upgrade for 980 Ti owners, but hey, it's not my money (I myself might jump to a 1080, if only out of curiosity).
 
The 980 Ti is a good overclocker, though. There are no guarantees that either the 1070 or the 1080 will be until we see the power situation of the aftermarket cards.
 
I'm thinking about a 1070. I'm currently on a 770 and I don't plan on moving beyond 1080p in the next 5 years so I suspect a 1070 should perform extremely well at least that long.
 
980tis dropped a bit further today on Amazon. Some of them have a $30 rebate as well.

Bargain-minded gamers should take note. Similar deals for the 390x too.

I'm thinking about a 1070. I'm currently on a 770 and I don't plan on moving beyond 1080p in the next 5 years so I suspect a 1070 should perform extremely well at least that long.

I'm definitely projecting here, but I'll offer up my opinion on the matter: can you possibly hold out until HBM2 cards are released? The folks that are dropping serious cash on these cards might be better served waiting 6-8 months, because HBM2 cards are going to be monstrous in terms of performance and allow for more compact designs.

That said, if you want something to last you for 5 years...why not the 1080?
 
Even on Amazon they aren't in the $400-500 range, and that's all you should pay for them, as the 1070 is a faster card. Anything over $379, in fact, is probably not a good decision. Low $400s is probably fine, but it's a tough call.
 
I'm definitely projecting here, but I'll offer up my opinion on the matter: can you possibly hold out until HBM2 cards are released? The folks that are dropping serious cash on these cards might be better served waiting 6-8 months, because HBM2 cards are going to be monstrous in terms of performance and allow for more compact designs.

That said, if you want something to last you for 5 years...why not the 1080?

1. HBM2 is going to be huge (which is why I'm waiting myself), but we have no idea what the prices might be, and for such a new, highly sought-after feature, it might be really expensive. Demand could greatly exceed supply.

2. For 5 years, I can definitely see the 1070 lasting, especially if you're OK with dialing down a few settings in a couple of years and not having AA maxed.
 
Weren't preorders from Nvidia supposed to go up today?

Apparently Newegg put them up already? I wasn't going to buy a reference card anyways.

I'm thinking about a 1070. I'm currently on a 770 and I don't plan on moving beyond 1080p in the next 5 years so I suspect a 1070 should perform extremely well at least that long.

A 1070 should kill 1080p for a long time. That said, even at 1080p there's supersampling and downsampling to consider.
 
Apparently Newegg put them up already? I wasn't going to buy a reference card anyways.



A 1070 should kill 1080p for a long time. That said, even at 1080p there's supersampling and downsampling to consider.
I don't know how I got it in my head that Nvidia was selling them through their own website. Blech, I need one of these ASAP.
 
In other words, it's exactly what it should be based on the currency conversion?

+1. I always find it hilarious when Canadians majorly overreact, as if they don't know what the exchange rate is.
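The exchange-rate point is easy to check yourself. Here's a minimal sketch of the arithmetic; the rate and tax values below are placeholder assumptions for illustration, not the actual figures at launch:

```python
# Convert a USD MSRP to a local-currency shelf price. The exchange rate
# and sales tax here are illustrative placeholders -- plug in your own.
def local_price(usd_msrp: float, exchange_rate: float, tax_rate: float = 0.0) -> float:
    """Return the local price: MSRP converted to local currency, then tax applied."""
    return usd_msrp * exchange_rate * (1 + tax_rate)

# Example: the $699 Founders Edition at a hypothetical 1.30 USD->CAD rate, 13% tax.
print(round(local_price(699, 1.30, 0.13), 2))  # 1026.83
```

If the listed local price is close to that product, it's just the conversion, not a regional markup.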

Y'all need to simmer down.
 
I swear someone here said they'd be selling them on nvidia.com today... What a disaster of a "launch."

I thought that was the 27th.

I know I've read in more than one place that nvidia.com would be selling them, and I signed up for the email blast from Nvidia to alert me when they'd be available.

And then somebody here a couple days ago said they'd be going up on nvidia.com for preorder on the 25th (today).

I really don't want to get caught with my pants down and have them go on sale while I'm at work and can't place my order.
 

I wonder what price point those are going to come in at. I hope it isn't $750, but I have a feeling they will be at $749. Why wouldn't they sell them for more than the $699 Nvidia wants? Those things are selling, and they're 'inferior.'
 