What I mean about OC differentials:
Well, the 2080Ti's upgrade path is not the 3080... so that's expected... like you said, wait for the 3080Ti.
Just to add. This is something most people are not taking into consideration, and it's IMO very important. The 1080 was a noticeable improvement over the 980Ti on release, but the 2080 barely scraped past the 1080Ti, which was a beast. The 3080 compared to the 2080Ti is back to 1080-vs-980Ti levels.
The x80 is at its core a Tier 2 card. 2080Ti owners will have to get the enthusiast tier, the 3090, or wait for a 3080 Super/Ti to get their money's worth.
> That is a very low OC. The card can push more.

No it can't, read the reviews.
> I'd wait before drawing any conclusion about OC capability, because the FE cards are hitting the power limit even at stock, and GDDR6X has some kind of ECC that allows clocking the VRAM up to unstable levels at the cost of performance, but without producing any artifacts etc., so unless you do a lot of testing you're not going to realize.

Huh? What are you waiting for that's going to change? Unless you shunt mod, then this is it. MAYBE Kingpin and the like will do better, but then you are paying a hefty premium for single-digit % gains. Hardly compelling.
Just to add.
2080TI is basically twice the price of the 3080.
> Soon you will be mine!!!
> But seriously, I hope my 750W power supply can still fork out enough for this beast.

No, better get 1500W to be sure.
Thanks! If I don't get one, I don't get one. I'm honestly still doing OK at 1440p on my GTX 1080, but definitely want to upgrade if I don't have to pay exorbitant prices.
> No it can't, read the reviews.

I read reviews...
What you are describing has already been encountered by the reviewers and they point it out, so you're not getting bad data from unstable OCs because they've weeded those out already.
TL;DW if you have a 2080 of any kind, you don't really need to upgrade.
If you have a 1080 Ti it's a massive increase.
Shame we can't have 3070 benchmarks yet.
> I'm a bit of a tech whore so I prolly will camp out for the 3090 lol. But I'm like you... shit, I got a 2080 Super and a beast of a system. It's not like there's anything coming out I won't be able to blast through. I got a 4K monitor and a huge 1440p 165Hz curved monitor, and tbh I actually prefer my 1440p one because of its size and I like the curve...
> But as my wife says, "You just can't wait for shit, you so got damn spoiled."

I'm camping for a 3090 too. I'm actually scouting the 3080 launch at the store tonight so I can plan for next week.
> I'm very firmly set against AMD. Even if their cards are very, very good, I won't buy. Only Nvidia.

Why would you be against a company? My previous system was Intel (i7-5960X) / Nvidia (980Ti), then I switched to AMD (3900X) / AMD (5700XT), and recently I switched my GPU back to Nvidia.
9700k and 2080s.
Not sure about that. The 2080 isn't that much faster than the 1080Ti in rasterization, and the 3080 is as much as 80% faster than the 2080 at 4K. It's still a damn awesome upgrade for its price.
> Nice.

Great value. I wonder how the 3090 will fit in this picture.
> If I wanted to upgrade from my 1080Ti, I am not sure my 700 watt PSU is enough. This is disappointing considering they are on a new, smaller process node.

It's more than enough. Most reviews are showing ~550W total power draw, and that's with RAM and CPU overclocks. 700 is plenty.
At least at 4K, people don't have to rush out and upgrade CPUs.
> Great value. I wonder how the 3090 will fit in this picture.

Well, the 3090 won't be winning any value awards. It'll easily be second or third worst on the list in value; it'll just be the king of performance. Look at it like this: you'll be paying more than 2x the 3080's price and probably getting only another 15-25% improvement in performance.
> The F is all this talk about 4K? The 3090 is all about 8K!

Maybe with DLSS. The 3090 won't be a great 8K card unless DLSS becomes a standard. It'll be a great top-of-the-line 4K card, though.
> Well, the 3090 won't be winning any value awards. It'll easily be second or third worst on the list in value; it'll just be the king of performance. Look at it like this: you'll be paying more than 2x the 3080's price and probably getting only another 15-25% improvement in performance.

I'm upgrading from a 2080Ti, which is almost at the bottom of that chart. I'm hoping for at least ~30% average improvement over the 3080. That would make me a happy boy.
Which is absolutely acceptable if you know what you should be expecting.
Looks like Nvidia pulled an AMD, and the 3080 would benefit a lot from being undervolted (assuming it's not a bottom-of-the-barrel bin).
> Starting to worry about my 750W; the power draw is not to be trifled with.

Are you telling me that a 750W could not be enough?
> It bugs me a little that it's being compared to the 2080 and not the Super. I guess it makes for a little more impressive numbers.

The 3080 should be compared to its equivalent card, which is the 2080.
> I'm upgrading from a 2080Ti, which is almost at the bottom of that chart. I'm hoping for at least ~30% average improvement over the 3080. That would make me a happy boy.

What are you playing that you can't wait? If I had a 2080Ti I'd wait, plain and simple, for a 3080Ti. Coming from the 10xx series, or at most the 2060, is where one should be considering an upgrade. There are gonna be some great revision models in ~6 months or so. I'd at least wait for the revisions, as there's no reason to get rid of a 2080Ti.
So reviews/benchmarks for the most part seem to be pretty good!
Rasterization:
Avg of +25-30% uplift vs stock 2080ti at 4K
Ray Tracing:
Minor uplift with mixed rendering
Major uplift with only Path Tracing
Productivity:
Great performance boost in Blender and most productivity applications. Much higher than in gaming.
Cooling:
Great custom cooler on the FE cards, keeps temps in line and is relatively quiet for the power draw. Great job by Nvidia here. (will AIBs fare as well?)
Power Draw:
No getting around it, this card draws a huge amount of power but I think we all knew that already based on the 320W TDP. If power draw is a big deal to you then this might not be the card for you. If you only care about performance then you are good to go. Will most likely need a minimum 750W PSU.
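As a rough sanity check on that 750W recommendation, here's a quick wattage sketch. Only the 320W GPU TDP comes from the spec sheet; the CPU and rest-of-system figures are assumptions for a high-end build, not measurements.

```python
# Rough PSU headroom estimate for a 3080 build.
# Only GPU_TDP_W is from Nvidia's spec; the other wattages are assumptions.
GPU_TDP_W = 320          # 3080 rated TDP
CPU_PEAK_W = 150         # assumed high-end CPU under gaming load
REST_OF_SYSTEM_W = 80    # assumed motherboard, RAM, drives, fans

total_draw = GPU_TDP_W + CPU_PEAK_W + REST_OF_SYSTEM_W  # ~550W, matching review measurements

def headroom_pct(psu_watts: int, draw: int) -> float:
    """Percentage of PSU capacity left unused at the given draw."""
    return 100 * (psu_watts - draw) / psu_watts

for psu in (650, 750, 850):
    print(f"{psu}W PSU: ~{total_draw}W draw, {headroom_pct(psu, total_draw):.0f}% headroom")
```

Under these assumptions a 750W unit keeps roughly a quarter of its capacity in reserve, which lines up with the ~550W whole-system figures reviewers reported.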
Overclocking:
Doesn't appear to have much OC potential in gaming. Not sure what that means for AIB cards but I'm sure they will be able to squeeze out a little more performance. Keep an eye on power usage though.
Price:
Compared to Turing? Absolutely great at $699 for 25-30% more performance than a 2080Ti.
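To put that price point in perspective, here's a back-of-the-envelope performance-per-dollar sketch. The figures are my own rough assumptions, not benchmark data: the 2080Ti at its $1,199 launch price as the 1.00 baseline, and the 3080 at $699 with a ~27% uplift (the midpoint of the 25-30% range above).

```python
# Back-of-the-envelope price-to-performance comparison.
# Baseline: stock 2080Ti = 1.00 relative performance at its $1,199 launch price.
# The 3080's 1.27 is the midpoint of the ~25-30% uplift quoted in reviews.
cards = {
    "2080Ti": {"perf": 1.00, "price": 1199},
    "3080":   {"perf": 1.27, "price": 699},
}

# Performance per dollar, normalized so the 2080Ti reads as 1.00x.
baseline = cards["2080Ti"]["perf"] / cards["2080Ti"]["price"]

for name, c in cards.items():
    ppd = c["perf"] / c["price"]
    print(f"{name}: {ppd / baseline:.2f}x the 2080Ti's performance per dollar")
```

By this rough math the 3080 delivers a bit over twice the performance per dollar of a launch-price 2080Ti, which is why the value argument keeps coming up in this thread.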
Does it live up to the hype?:
Well... no, but in fairness the hype was absolutely insane, with funky TFLOP numbers and Nvidia claiming 2x the performance of a 2080. Having said that, it seems to be overall a great card and a good buy; I think anyone getting this card will be really happy with the results.
Maybe not a great buy if you already have a 2080Ti; you may want to wait for an eventual 3080Ti or just splurge and go for the 3090. Memory is a little low for a 2020 flagship GPU, but we all know that new higher-memory models are just around the corner. If you are happy with 10GB VRAM then buy away, but those not in a rush might want to wait for the larger-memory models.
If you are a 1080ti owner then the consensus seems to be that this is a great upgrade and will be well worth the money.
If I could buy a 2080ti for $350 or so I'd buy one right now and just wait for the revisions.
I have to be honest, only 25-30% better than 2080ti without rtx doesn't sound all that hot after all the big talks...
Isn't that in line with the usual jump from top-tier card to top-tier card? Was the difference between the 1080Ti and 2080Ti that much lower?
I bought into the hype, yeah.

I'm not 100% sure about the 1080Ti vs the 2080Ti, but I think it was around 20-25% at 4K.
Of course given that the 1080ti wasn't the most amazing card in the world for 4K we are looking at a smaller jump in number of frames.
In 1080p I think there was almost no difference at all or at least within like 5-7%
I think the biggest issue was the price to performance ratio at 1200 dollars for a 2080ti, plus at the time most people were not using 4K monitors and were likely playing at 1080p/1440p compared to now.
I still think 2080Ti to 3080 (25-30%) is a solid enough jump, roughly in line with pre-release leaks and expectations in the tech community. The problem was that once the Nvidia hype machine started, a lot of people lost any sense of rationality and expected crazy performance boosts. A few people tried to bring expectations back down to reality, but it didn't take.
In closing, it is only really disappointing if you bought into the hype. Otherwise it comes in around where we expected for a better price but with lower memory than originally expected.
So in gaming: good but not mind blowing. That Blender performance does look really tasty though.
> I have to be honest, only 25-30% better than a 2080Ti without RTX doesn't sound all that hot after all the big talk...

The difference between the 2080Ti and 1080Ti was like 20-30%, while adding $500 on top of the card's price.
I don't know if the jump from my 2070super is enough...

The difference between the 2080Ti and 1080Ti was like 20-30%, while adding $500 on top of the card's price.
They essentially released the 1080Ti three times in the 20xx series with the 2070, 2070 Super, and 2080, and added RT and tensor cores on top. In practical application there was no value.
What the 30xx series is doing is bringing it back in line with what it used to be, at a slight price premium.
Jensen himself said that if you were on the 10xx series, this is the card for you.
People on the 20xx series are either inexperienced with GPU upgrades or were smoking crack rocks in thinking there'd be way higher jumps. Nvidia didn't do themselves any favors with their "2x the performance of the 2080" claim. More like 70% performance gains overall.
Which is fantastic. Nvidia shouldn't have hyped the cards more and should have just let the numbers speak for themselves, because the 30xx is a performer and way better value than the garbage value of the 20xx series.
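One thing worth spelling out in these percentage arguments: generational uplifts compound multiplicatively, they don't add. Here's a tiny sketch using the ballpark figures thrown around in this thread (~25% for 1080Ti to 2080Ti, ~27% for 2080Ti to 3080; both are rough thread numbers, not benchmarks):

```python
# Generational uplifts multiply rather than add.
# Ballpark figures from this thread, not benchmark results:
#   1080Ti -> 2080Ti: ~25%, 2080Ti -> 3080: ~27%
uplifts = [0.25, 0.27]

total = 1.0
for u in uplifts:
    total *= 1 + u  # compound each generation's gain

print(f"3080 vs 1080Ti: ~{(total - 1) * 100:.0f}% faster")
```

Two modest-looking generational jumps compound to nearly +60% over two generations, which is why 10xx-series owners see a much bigger upgrade in the 3080 than 20xx owners do.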
> I don't know if the jump from my 2070super is enough...

If I were in your situation I'd honestly wait. If Cyberpunk is what you are upgrading for, I'd really wait and see what DLSS can do for the title; I think it's gonna work really well for the game. People forget that CDPR is a PC developer at heart, and all their games have scaled very well across a ton of different hardware configurations. Just get out of the mindset of needing to max shit. It's a heartbreaking endeavor that'll leave you constantly disappointed.
I'm always impulsive when I have to buy a GPU!
> Just get out of the mindset of needing to max shit.

I was not on board with ultra settings for the last 10 years of PC gaming; I already know all of that.
I remember Ubersampling in Witcher 2 and how upset I was that I couldn't run it. Turns out, no one can run it, it's even hard to run to this day.
Max what you can for a frame rate that's acceptable, if you can't get that anymore then upgrade.
My 1080ti isn't doing it very well anymore at 3440x1440 120hz so it's time for ME to upgrade.
If I was sitting on 2070super or higher I'd just wait until the card can't do it anymore.
Though I get it, it's fun to jump into the mix on new hardware; the experience and camaraderie of new hardware with buddies and forum friends can be a really fun time.
> Starting to worry about my 750W; the power draw is not to be trifled with.

Imagine if they used an Intel system, max OC'd, and also OC'd the GPU; that 750W would be on the verge. Cutting it a bit too close IMO.
> That power draw and only 10GB VRAM. A big nope for me.

Imagine the power draw of the 3090 with 24GB of VRAM.
> Seems like we finally have proper 4K 60fps cards on the market.

Every time Nvidia releases new cards I hear this, because they use games made for last-gen cards with boosted fps. Even then, many current-gen games will not run at 4K 60fps... When the new-gen games start releasing next year, I will want to hear about this 4K 60fps with upcoming titles, during the lifetime of this GPU with only 10GB of VRAM going forward.
> Every time Nvidia releases new cards I hear this...

Huh? I said every single time there is no proper 4K GPU on the market unless you want 30fps.