RTX 5060 fails to beat 7-year-old flagship GPU RTX 2080 "Fine Wine™" Ti

Gaiff

SBI’s Resident Gaslighter
[benchmark chart: RTX 5060 vs RTX 2080 Ti]


But hey, at least it beats the PS5, right?

[benchmark chart: RTX 5060 vs PS5]


Yes, but actually, no.

[benchmark chart: RTX 5060 falling behind the PS5 when VRAM-constrained]


Crumbles to sub-PS5 level when its 8GB are constrained, and that's not exactly a rare occurrence. The 2080 Ti is 3 generations old. That'd be like a GTX 1060 being slower than a GTX 680.

The reason for the dismal performance of the RDNA2 cards is that they mostly used RT games to assess the capabilities of the 2080 Ti in modern titles, using the full range of its feature set.



On the one hand, the RTX 2080 Ti being solid despite being so old is a testament to how well it's aged despite its initial lukewarm reception. On the other hand, the RTX 5060 not beating it convincingly shows how much less NVIDIA gives compared to 7 years ago, especially in the budget segment.
 
While the 2080 Ti's price/performance was miserable when it launched, the ability to still run the latest version of DLSS has given it quite the long lifespan. Just looking back, it launched two years before the PS5/XSX did and offers 11GB and faster performance (especially RT) compared to the 16GB consoles. The fastest GPU 2 years before the PS4 launched was the 1.5GB GTX 580, which was about on par with the PS4 and had dramatically less VRAM than the console's 8GB unified pool. It also had no DLSS to keep it relevant.

So yeah, terrible price/performance at launch, amazing capability to actually stay competitive almost 7 years later. Although a large part of its relevancy can probably be attributed to the slowdown in rendering progress and Nvidia being a bastard when it comes to VRAM.
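To put rough numbers on that comparison, here's a quick sketch. The memory sizes are the well-known specs; note the console figures are the total unified pool, not the slice games actually get:

```python
# VRAM of the fastest GPU ~2 years before each console launch vs.
# that console's unified memory pool (total, not game-usable).
eras = [
    ("GTX 580 vs PS4 (2013)",      1.5,  8.0),
    ("RTX 2080 Ti vs PS5 (2020)", 11.0, 16.0),
]

for label, gpu_gb, console_gb in eras:
    print(f"{label}: {gpu_gb / console_gb:.0%} of console memory")
```

The 580 had under a fifth of the PS4's pool, while the 2080 Ti launched with roughly two-thirds of the PS5's, which goes a long way toward explaining the longevity gap.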
 
Memory must be the most expensive, most difficult thing to manufacture, based on how it's rationed, like we're stranded on an oil rig with monsters climbing up from the deep. If only we had more memory, we could fix the radio and call for help. Nope, we're all going down in flames instead.
 
My plan is to get a 4090 (relatively) cheap when the 6000 series hits and ride it out as long as possible. Seems to be the much wiser option.
 
Yes, but the fastest consumer Pascal card was the 1080 Ti, not the 1080, and the 4060 is barely faster. It does offer RT and DLSS though.
The Ti is a bit slower than the 4060, just looking to confirm.

It's interesting to see the disappointment that the 5000 series has been.
 
The difference-maker in the 50 series was supposed to be Mega Geometry, but there's still nothing really using mesh shading outside of Alan Wake 2. Nvidia just didn't seem ready for it, which means it will be an early-adopter thing like the 20 series was for RTX. It'll probably only start becoming mainstream in the 70 series, making the cards a bit pointless now.
 
Yeah, I've reconsidered my upgrade strategy, partly thanks to gains slowing down, prices going up, and baselines being pinned farther down for longer: Series S, long cross-gen periods, 8GB GPUs, high prices, maybe even Switch 2. A GPU 2x as powerful as a PS5 is attainable, and even if the price is kinda high, you could use it for a long-ass time. And a PS6 will end up getting most PS7 games.
 
Part of me thinks they're holding onto 8GB for dear life, as they know the performance difference between models is tiny (besides the 80-to-90 gap these past two gens).
 
"Nvidia" and "fine wine" in the same phrase is... ironic...

They created a few cards that were great for many, many years:

8800 GTX (8800 GT for mainstream), 1080 Ti (1070 for mainstream), 2080 Ti, and now the 4090 and 5090 will be great for the future (I think 16GB cards will be fine too).
 
The 5060 is budget tier; did we expect it to beat a 2080 Ti? I never did.
I asked this too, but historically the 60-class card HAS surpassed the high-end cards 3 generations later. It's not happening anymore. Gamers Nexus did a table on this a while ago demonstrating the diminishing returns for more money.
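Just to make the diminishing-returns point concrete: the MSRPs below are the launch list prices as I remember them, but the perf index is a made-up placeholder, not measured data, so treat this as a sketch of the calculation rather than the actual GN table:

```python
# Rough illustration of diminishing returns in the xx60 tier.
# MSRPs are launch list prices from memory; the perf index is a
# made-up placeholder (100 = GTX 1060), NOT measured data.
cards = [
    ("GTX 1060", 249, 100),
    ("RTX 2060", 349, 160),
    ("RTX 3060", 329, 195),
    ("RTX 4060", 299, 230),
    ("RTX 5060", 299, 260),
]

prev = None
for name, msrp, perf in cards:
    if prev is not None:
        pname, pmsrp, pperf = prev
        print(f"{pname} -> {name}: {perf / pperf - 1:+.0%} perf, "
              f"{msrp - pmsrp:+d} USD MSRP")
    prev = (name, msrp, perf)
```

Even with generous placeholder numbers, the gen-on-gen uplift shrinks every step while the price floor barely moves.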
 
The 5060 is budget tier; did we expect it to beat a 2080 Ti? I never did.

There was a time... when the xx70 card of the next generation matched the top dog of the previous gen, while xx60 cards performed like xx80 cards.

3070 = 2080 Ti (minus the VRAM), 1070 = 980 Ti, 1060 = 980, and a few others. But for sure they weren't consistent with that.
 
This many generations later... we should expect better.

I agree, but most developers have built framegen, resolution scaling, and various other technologies into their games so they don't have to optimize as much. So there's that. The game the OP posted is also a known resource hog that relies on said tech and is UE5, so it's no surprise.
 
"Nvidia" and "fine wine" in the same phrase is... ironic...

Ampere drivers aged super well; in fact, if you benchmarked the Ampere series vs RDNA 2 today, you would find quite a difference from the day-one review benchmarks.

Turing was also a shift in design that was aimed at the future. So while people didn't understand or see the benefit early on, it aged relatively well with modern games. The fact that it got DLSS 2 and the transformer model is beyond what I would have imagined in 2018.
 
I agree, but most developers have built framegen, resolution scaling, and various other technologies into their games so they don't have to optimize as much. So there's that. The game the OP posted is also a known resource hog that relies on said tech and is UE5, so it's no surprise.

They tested a lot of games, and the 5060, nay even the 4060, should easily outdo the 2080 Ti in RT and FrameGen, as they supposedly have newer-generation RT cores and Tensor Cores.

  • RT Benchmark: Cyberpunk 2077
  • RT Benchmark: Alan Wake 2
  • RT Benchmark: Avatar Frontiers of Pandora
  • RT Benchmark: Hitman World of Assassination
  • RT Benchmark: A Plague Tale Requiem
  • RT Benchmark: F1 24
  • Raster Benchmark: Black Myth Wukong
  • Raster Benchmark: Senua's Saga Hellblade 2
  • Raster Benchmark: Forza Horizon 5
  • Raster Benchmark: Alan Wake 2

The fact that the 5060 isn't able to beat the 2080 Ti even in old, well-optimized games like Hitman and Forza means it's not down to "lazy devs"; it's just that Nvidia isn't releasing components that are as good name-for-name as in previous generations. The xx70 should beat last gen's range-topper, and the xx60 should match the range-topper of two generations ago. Alas, the 5060 from 2025 can't beat a GPU from 2018.
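Side note on reading a suite like that: the overall verdict usually comes from a geometric mean of per-game ratios rather than a plain average, so one outlier title can't carry the result. A minimal sketch with placeholder fps numbers (not the actual review data):

```python
from math import prod

# Placeholder average-fps figures, NOT the real benchmark results --
# they only exist to show the aggregation method.
fps = {
    # game                  (2080 Ti, 5060)
    "Cyberpunk 2077 (RT)":  (46.0, 44.0),
    "Alan Wake 2 (RT)":     (38.0, 36.5),
    "Hitman WoA (RT)":      (72.0, 69.0),
    "Forza Horizon 5":      (98.0, 95.0),
}

# Geometric mean of the per-game 5060/2080 Ti ratios:
# the nth root of the product of the ratios.
ratios = [new / old for old, new in fps.values()]
geomean = prod(ratios) ** (1 / len(ratios))

print(f"5060 averages {geomean:.1%} of 2080 Ti performance here")
```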
 
The 1060 and 2060 were as fast as the previous gen's xx80 models. Everything went downhill after that, starting with the 3060, which, to be fair, at least had future-proof VRAM. The 4060 and 5060 are complete garbage, but people don't have a choice since the next card with a VRAM bump is $480.
 
They created a few cards that were great for many, many years: 8800 GTX, 1080 Ti, 2080 Ti...

Ampere drivers aged super well... Turing was also a shift in design that was aimed at the future.
Bro coma'd their way through the 1080 Ti.
So they at least had some hits? That would be hard to believe without people's anecdotes, tbh. I've always had to "find the one driver this card doesn't start getting issues with" for old Nvidia cards and keep it stored on the computer, because they all start having black screens or some weird glitch after a certain driver version. That has happened to me at least twice (probably once more, not sure about that one); once a card gets old enough, I can be sure current drivers will start conflicting somehow. That never happened to me with AMD, and I'm not even mentioning the low-VRAM artificial obsolescence BS they've been doing for years already.
 
The 2080 Ti, a fine wine that cost over a grand, and even more during the covid/miner/scalper shortage of 2020-2022. Sorry, but it was expensive then and it's still expensive now. A used 3070 or 3070 Ti would be a much better card for that kind of dosh.
 
The 2080 Ti, a fine wine that cost over a grand, and even more during the covid/miner/scalper shortage of 2020-2022. Sorry, but it was expensive then and it's still expensive now. A used 3070 or 3070 Ti would be a much better card for that kind of dosh.
Not with 8GB.
 
Crumbles to sub-PS5 level when its 8GB are constrained, and that's not exactly a rare occurrence. The 2080 Ti is 3 generations old. That'd be like a GTX 1060 being slower than a GTX 680.

It would actually be the equivalent of a 1060 being slower than a GTX 480. Refreshes don't count as generations.
 
While the 2080 Ti's price/performance was miserable when it launched, the ability to still run the latest version of DLSS has given it quite the long lifespan.
Along with all the 2000 series cards, the 2060 can only run some of the latest DLSS features: it gets the new Super Resolution and Ray Reconstruction models, but not Frame Generation (40 series) or Multi Frame Gen (50 series).
 
Nvidia is just mailing it in with gaming hardware at this point. They don't give a shit.


This interview with Jensen was enlightening for me. Crypto mining got popular? Scalp your own hardware on eBay. AI getting popular? Corner the market on hardware at insane margins. They're just winging it and riding the hot hand, and keep finding themselves in advantageous positions.

The RTX 50 series just feels like they're keeping one foot in the door for GPUs, but it's clearly not the company's focus at the moment.

I expect their luck to run out at some point but it's hard to argue with their results.
 