> Ok, my budget is 1500€…
Save more and you'll be able to afford the 5070.
> Seriously, what is out or coming out that would get anywhere close to taxing these things? I can justify highly expensive Quadro cards for my work machines, but for gaming how are these in any way being utilized?
4K 240Hz OLED monitors have been around nearly a year. All my games on a 4090 are not pushing anywhere near 240 fps, so...
Ok, my budget is 1500€…
> Here's hoping I can get a 5090 at launch and my 1000W PSU can handle it.
You forgot a 0.
> The 4080 is approaching DOA. Stalker 2 is regularly hitting 15+ GB; I'm debating getting a 7900XTX.
> I should've sprung for the 4090. This card is such a waste.
15+ GB VRAM with mods? Stalker 2 seems to be allocating around 10GB VRAM (real usage 8-9GB).
Some games might fill all available VRAM, but that doesn't mean they really need that much (there's a difference between what is allocated and what the process actually uses). Stuttering or performance degradation is the only way to know for sure that a game is really VRAM-limited.
As of now there are 3 games I'm aware of that can really use more than 16GB VRAM:
- Warhammer 40K: Space Marine 2 with the 4K texture pack (this texture pack also requires a 12-core 7900X3D CPU to decompress texture data fast enough).
- Indiana Jones at 4K native with PT and maxed-out settings.
- Cyberpunk 2077 with mods at 4K native with PT + FG.
The RX 7900 XTX 24GB would help you in Space Marine 2, assuming you have a 7900X3D (although I haven't noticed any difference in texture quality even when I made screenshot comparisons between ultra and high settings), but in PT games an AMD card would not benefit you in any way. In Indiana Jones, PT will not run at all on the AMD card, and in Cyberpunk at 4K native you will get an unplayable frame rate with PT. On an RTX 4080 you can at least play both games with PT without any problems at reasonable settings. For example, DLSS Quality in Cyberpunk is enough to stay within a 16GB VRAM budget. In Indiana Jones at 4K I also need to set the texture streaming budget to High (and I haven't noticed any texture quality reduction).
Besides these rare exceptions, most games use around 9-12GB VRAM at 4K. I would worry if my card had 12GB VRAM, but 16GB is still plenty. IMO VRAM requirements won't increase that much until 2028/2029, when developers start making games with PS6 hardware in mind.
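If you want to check the allocation-vs-real-usage point yourself, NVIDIA's NVML exposes both the device-wide counter and each process's own usage. A minimal sketch using the `pynvml` bindings (pip package `nvidia-ml-py`); note the per-process figure is often unavailable on Windows under WDDM, so this is most useful on Linux:

```python
# Minimal sketch: device-wide VRAM use vs. per-process usage via NVML.
# pip install nvidia-ml-py  (imports as `pynvml`)
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"device in use: {mem.used / 2**30:.1f} / {mem.total / 2**30:.1f} GiB")

# What each process actually holds; games show up as graphics clients.
procs = (pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle)
         + pynvml.nvmlDeviceGetComputeRunningProcesses(handle))
for p in procs:
    if p.usedGpuMemory is not None:  # None when the driver can't report it
        print(f"pid {p.pid}: {p.usedGpuMemory / 2**30:.1f} GiB")

pynvml.nvmlShutdown()
```

If the device-wide number sits near the card's capacity while the game's own figure is well below it, that's likely opportunistic caching rather than a real shortfall; stutter, as noted above, remains the deciding signal.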
I'll screenshot, but last night while I was playing it was bouncing around between 14,500 and 15,700 MB, and every time it approached 16GB I was getting some MASSIVE lags.
> Memory leak?
Hmm, you might be right.
> 16GB for the 5080? Scam. Dead in 2 years.
> 12GB for the 5070? Now that's a dead-on-arrival card.
I don't think people buying them care.
> I don't think people buying them care.
That's a good point. These cards alone draw, what, 2-3x the wattage of a console, and will become obsolete sooner than they should due to the low RAM amount.
It's hilarious how wasteful this is getting, considering the wattage the card will draw. GL with energy prices nowadays. Also: fuck the planet.
> That RAM setup on anything but the top cards....
The only good card there seems to be the 5070 Ti. Maybe the 5060 Ti.
Holy planned obsolescence Batman!
> I want to game at 4K 240 fps and even the 5090 won't handle that.
240 fps is surely approaching the realm of diminishing returns. 120 -> 240, are you really going to see those differences?
> 240 fps is surely approaching the realm of diminishing returns. 120 -> 240, are you really going to see those differences?
There are no diminishing returns yet with modern LCD screens.
> Yeah, it's crazy they don't offer anything between 16 and 32 GB. Unless those cards will be offered later.
Why is this crazy? Almost all HW makers pull this shit. They make every model below the top tier undesirable in some way and make the top-tier one 50% more expensive than the 2nd best. If I were going to upgrade my 4090, I would have no choice but to get a 5090 without accepting some downgrade. If the 5080 had 24 GB it would probably be significantly better than a 4090. It would also look like a good upgrade path for those on 3090s.
And God I hope they'll drop 8GB; that shit would be lame as fuck.
> Ok, my budget is 1500€…
You'll get a nice and cool RTX 5060 for it.
> 240 fps is surely approaching the realm of diminishing returns. 120 -> 240, are you really going to see those differences?
The jump from 60 to 120 is pretty great, but no, the jump from 120 to 240 isn't nearly as noticeable. I can still tell a difference, though.
> There are no diminishing returns yet with modern LCD screens.
Less motion blur with the OLEDs?
If we were talking about CRT technology, you would be right. But with LCDs, you need as high a refresh rate as you can get to completely get rid of the inherent motion blur.
I specifically bought a 240Hz panel instead of a 120/144Hz one because of this. And there is a big difference, assuming you can reach those numbers. I mostly test older games with unlocked fps to reach 240 fps (the Quake 1/2 Nightdive remasters are good for this too), and the difference in motion clarity between 120 fps and 240 fps is big. Though even at 240 fps you still don't reach the crystal-clear clarity of a CRT monitor at 75/85Hz (which I still have and test). But it's close, so I think at 360Hz you'll be 100% there.
It's pretty impressive how superior CRT technology is at this, and most people have no idea.
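Rough numbers behind the motion-clarity claim: on any sample-and-hold panel (LCD and OLED alike), a tracked object smears across roughly its on-screen speed times the frame hold time. A back-of-envelope sketch; the 960 px/s tracking speed and ~1.5 ms CRT phosphor persistence are illustrative assumptions, not measurements:

```python
# Sample-and-hold motion blur: smear ~= tracking speed * frame hold time.
speed_px_s = 960  # assumed eye-tracking speed of a moving object

for refresh_hz in (120, 240, 360):
    smear_px = speed_px_s / refresh_hz  # each frame is held for 1/refresh s
    print(f"{refresh_hz} Hz: ~{smear_px:.1f} px of smear")

# A CRT lights each phosphor for only ~1-2 ms regardless of refresh rate,
# so the same motion smears across only ~1-2 px -- hence the clarity gap.
print(f"CRT (~1.5 ms persistence): ~{speed_px_s * 0.0015:.1f} px")
```

The halving from ~8 px of smear at 120 Hz to ~4 px at 240 Hz is the visible difference described above, and ~2.7 px at 360 Hz lands close to CRT territory.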
> > 12GB on the 5070 is a joke.
> I'm glad I bought my 4070 Ti Super the other week.
That really seems to be a great card for the money, wise choice.
> 240 fps is surely approaching the realm of diminishing returns. 120 -> 240, are you really going to see those differences?
120 to 240 is clearly visible.
> Memory leak?
So nope, I played some Cyberpunk 2077 tonight without path tracing, and Stalker 2 again just to see, and they were topping 15GB at 4K DLSS.
My rig is as follows:
- 4080 Founders
- 7800X3D
- 32GB DDR5 @ 6000
Averaging 14-15GB of VRAM.
> Less motion blur with the OLEDs?
What do you mean? All modern flat panels have motion blur. All of them.
I just had a look, and there aren't that many 4090s for sale, and they seem to hold their price a lot better than 4080s. I might quickly sell and go for the 5090. Anyone know off-hand what the extra wattage needed between a 4090 and a 5090 might be? I'm sure I have overhead currently, but I'd rather do a straight GPU swap without the hassle of changing the PSU too.
> OLEDs have the lowest pixel response time.
They still have motion blur.
> If your PSU is newer and natively has one of the 12VHPWR connectors, then you shouldn't have too much to worry about, provided you're over ~1000W.
It's the Corsair RM1000x (2021) I have. Yeah, I use their bespoke 12VHPWR connector. Is the 5090 using that same connection again?!
> Kind of has to if it's 600 W.
And depending on the rest of the system (CPU, etc.), a 1000W PSU might not be enough for a 5090 setup.
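Rough headroom math on that: a quick sketch where every figure is an assumption for illustration (the 600 W GPU number comes from this thread; the rest are guesses):

```python
# Hypothetical 5090 build vs. a 1000 W PSU; all draws are assumptions.
gpu_w, cpu_w, rest_w = 600, 150, 100   # GPU, CPU, board/RAM/drives/fans
psu_w = 1000
transient_factor = 1.3                 # brief GPU spikes above steady draw

steady_w = gpu_w + cpu_w + rest_w
spike_w = gpu_w * transient_factor + cpu_w + rest_w
print(f"steady: ~{steady_w} W, spikes: ~{spike_w:.0f} W vs {psu_w} W PSU")
# ~850 W steady is workable, but ~1030 W transients leave almost no
# margin, which is why "1000 W might not be enough" is a fair worry.
```

Modern ATX 3.x units are rated to absorb short power excursions, so a quality 1000W PSU may survive it, but there is essentially no headroom left under these assumed numbers.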
16GB for the 5080? Scam. Dead in 2 years.
12GB for the 5070? Now that's a dead-on-arrival card.
And obviously there are no 18/20/22/24GB options. Just 32GB for 5000000 $.
> More of Jensen's e-waste. A more cucked fanbase doesn't exist.
Yep. He tried it with the Titan cards and found out that suckers will buy a $1000-more-expensive card that's 10% faster.