Honestly, what's the point of spending money on the highest-end GPU?

SNG32

Member
Since consoles are pretty much running on PC architecture, what's the point of buying top-of-the-line GPUs when the majority of people have a 1060 and developers keep that in mind when developing most games? Sure, the graphics may be shinier and you get a bit more fps, but compared to consoles the gap isn't as big as it was ten years ago.
 
 
Mostly to reach the monitor's high refresh rate at its native resolution. 60 fps is fine on a TV on the couch and shit, but on PC? Over 120 fps is amazing. Anyone saying they can't feel the difference is a fucking idiot. No, you don't need a 4K monitor; 1440p is the perfect resolution, and to reach those framerates at that res you'd generally need something above a 2080.
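For a quick sense of the numbers (a minimal sketch; the framerates are just illustrative values), the frame-time budget halves going from 60 to 120 fps:

```python
# Frame-time budget per target framerate (illustrative values only).
for fps in (30, 60, 120, 144, 240):
    print(f"{fps:>3} fps -> {1000 / fps:.1f} ms per frame")
# 60 fps leaves 16.7 ms per frame; 120 fps halves that to 8.3 ms,
# which also halves the frame's contribution to input-to-photon latency.
```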
 
I don't think a huge portion of people who buy the highest-end GPU even understand why they want it, and they only ever use a fraction of the power it's capable of for several years. It's new, it's the latest and greatest. Which I get; I make stupid purchases too.
 
What's the point of playing video games at all? We'll all be dead someday.


If you want a serious answer: more FPS (above 60, I mean) can mean better gameplay in some games, especially fast-paced ones. You also get the chance to play on turbo ultra max settings, which is great, and, if you get one with enough VRAM, to play any game at native 4K, which looks amazing.
You also get to game in the PC ecosystem, which has tons of advantages compared to console, but I'm not gonna list those because this has already been discussed hundreds of times.
 
Better framerates, better choices, mods. Better GPUs leave you with more choices and overhead for future decisions. My 1080 Ti would have easily lasted me another year playing current releases at high/medium settings at 60 fps; it came with 11GB of VRAM back in 2017 when I got it. I felt no need for newer hardware because mine was still performing better than the PS5/XSX when they came out.

People love to talk about high prices while they play on their PS5, knowing damn well they'll be picking up a PS5 Pro whenever that releases. You're better off just focusing on whatever hardware you need to play the games you want, the way you want to play them.

The value proposition drops heavily as you go into the upper echelons of cards. I'd say the 30xx cards at MSRP were good value, but then they were barely ever sold at MSRP, which massively changed that value proposition. The new 40xx series isn't nearly worth the money they're asking, and I'd bet we'll see prices come down significantly.

But there's no denying that playing a game at ultrawide resolution, with a second monitor playing or streaming something, in a nice comfy chair, keeps me in my gaming nirvana.
 
I can only speak for my best mate, who basically buys the best GPU each generation, and he does so because he loves his hobby and hardware.
He loves putting the RTX 4090 into his PC. He loves tuning that GPU so it delivers maximum performance at maximum power.
The sacred first few weeks of endlessly running benchmarks to check that everything is okay and see the gigantic performance gain.
He doesn't even tell anyone or play multiplayer games. He isn't active on any forums or social media, but when Nvidia announces a new generation, he needs the best card.

When he games, he wants to game knowing that he has the best card offering the best performance. It gives him peace of mind. Also, 4K 120Hz.
 
Since consoles are pretty much running on PC architecture, what's the point of buying top-of-the-line GPUs when the majority of people have a 1060 and developers keep that in mind when developing most games? Sure, the graphics may be shinier and you get a bit more fps, but compared to consoles the gap isn't as big as it was ten years ago.
You answered your own question.

For better graphics and more FPS.

[four screenshots]
DL2 looks nothing like this on consoles... if the console version is fine for you, then that's that; some people want ray-traced global illumination.
 
You answered your own question.

For better graphics and more FPS.

[four screenshots]
DL2 looks nothing like this on consoles... if the console version is fine for you, then that's that; some people want ray-traced global illumination.
I was stunned to see how much difference global illumination makes in Dying Light 2; it looks almost like two different games compared to when it's off.
 
Nothing is too good for that beautiful, sexy girlfriend/boyfriend/something-in-between (???) substitute box of electronics sitting next to them, making them hard/wet/something in between (???), in the only way they've ever known.
 
Sure, the graphics may be shinier and you get a bit more fps, but compared to consoles the gap isn't as big as it was ten years ago.
There has been a gap since roughly 2007. It's at its smallest the day a new console launches and then grows every year until the next generation of consoles launches. The current consoles are a generation behind the 30x0 series and about to be two behind. That's a pretty big gap.
 
Same reason as always. Console games will always target 30 fps or make resolution sacrifices to hit 60 fps. I want to play everything maxed out at 60 fps at the highest resolution.
 
I was stunned to see how much difference global illumination makes in Dying Light 2; it looks almost like two different games compared to when it's off.
Global illumination is the true holy grail of ray tracing.
We've been able to "fake" reflections for so long that I think the techniques are good enough to just get away with it.
But global illumination is still some way from being properly "faked" to the point that it's as game-changing as it is in Metro Exodus and Dying Light.

Probe-based GI does a really good job, but once you've gone RTGI you really can't go back; even SSRTGI is better than no GI.
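To make that concrete, here's a minimal sketch (2D grid, made-up irradiance values, not any engine's actual implementation) of why probe-based GI is an approximation: lighting is precomputed at sparse probes and interpolated at each shading point, so anything finer than the probe spacing gets blurred away, which is exactly the detail per-pixel RTGI recovers.

```python
# Minimal sketch of a probe-based GI lookup (2D grid, made-up values).
# Irradiance is precomputed at sparse probes and bilinearly interpolated
# at each shading point; detail finer than the probe spacing is lost.

def probe_gi(probes, x, y):
    """Bilinearly interpolate irradiance from a 2D probe grid at (x, y)."""
    x0, y0 = int(x), int(y)      # indices of the lower-left probe
    fx, fy = x - x0, y - y0      # fractional position between probes
    top = probes[y0][x0] * (1 - fx) + probes[y0][x0 + 1] * fx
    bot = probes[y0 + 1][x0] * (1 - fx) + probes[y0 + 1][x0 + 1] * fx
    return top * (1 - fy) + bot * fy

# 3x3 probe grid; the dark right-hand column stands in for a shadowed corner.
probes = [[1.0, 1.0, 0.2],
          [1.0, 0.8, 0.1],
          [0.9, 0.5, 0.0]]
print(probe_gi(probes, 1.5, 0.5))  # 0.525: a smooth blend, no sharp shadow edge
```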
 
You get better shadow resolution and sometimes slightly improved textures. You also get the option to turn RT off instead of it being off by default.
 
Same reason as always. Console games will always target 30 fps or make resolution sacrifices to hit 60 fps. I want to play everything maxed out at 60 fps at the highest resolution.

Yeah, but will you need a 4090 for that, or will a 4080 or even a 3090 be just fine for the rest of the generation?
 
You answered your own question.

For better graphics and more FPS.

[four screenshots]
DL2 looks nothing like this on consoles... if the console version is fine for you, then that's that; some people want ray-traced global illumination.
I'm gonna be honest, I'm only slightly impressed by the first and second pic (except for the low-poly, low-detail skeleton).

Don't know why, but it reminds me of this Days Gone pic:


First pic of that post
 
Because some people want far better performance than consoles. Also, the gap continues to widen, as PC hardware isn't stagnant like consoles; newer, more powerful hardware is constantly releasing.

I mean, why buy a PS5 when there are barely any PS5 games?
Why buy a Series X when you can just buy a Series S?
Why buy a Switch when you already have a toaster?
 
I can only speak for my best mate, who basically buys the best GPU each generation, and he does so because he loves his hobby and hardware.
He loves putting the RTX 4090 into his PC. He loves tuning that GPU so it delivers maximum performance at maximum power.
The sacred first few weeks of endlessly running benchmarks to check that everything is okay and see the gigantic performance gain.
He doesn't even tell anyone or play multiplayer games. He isn't active on any forums or social media, but when Nvidia announces a new generation, he needs the best card.

When he games, he wants to game knowing that he has the best card offering the best performance. It gives him peace of mind. Also, 4K 120Hz.
This is like the hardware version of "don't ask questions, just consume product."
 
Yeah, but will you need a 4090 for that, or will a 4080 or even a 3090 be just fine for the rest of the generation?
Chasing that maxed-out-settings dragon is a losing proposition for GPUs: unless you have the very top of the line you'll never get there, and even when you do, it still typically isn't worth it. The visual gains at the very high end are usually so small that it's almost never worth it. My buddy got a 3080 recently and he's pissed that he can't max out Cyberpunk 2077 and stay above 60 fps. It's like he hasn't been paying attention for the last 10 years of being a PC gamer.

Learning that you don't need to "max out" every game is a lesson some people have a lot of trouble learning.
 
You get better shadow resolution and sometimes slightly improved textures. You also get the option to turn RT off instead of it being off by default.
Hahaha, I like how you worded that.
Get the option to turn RT off, instead of it being off by default.
I'm gonna be honest, I'm only slightly impressed by the first and second pic (except for the low-poly, low-detail skeleton).

Don't know why, but it reminds me of this Days Gone pic:


First pic of that post
Wanna see what those same shots look like on console?
If you weren't impressed by them with GI turned on... imagine how unimpressed you'd be with it turned off.


P.S. The depth of field in that shot is atrocious.
A nigh-literal Vaseline filter.
 
Hahaha, I like how you worded that.
Get the option to turn RT off, instead of it being off by default.

Wanna see what those same shots look like on console?
If you weren't impressed by them with GI turned on... imagine how unimpressed you'd be with it turned off.


P.S. The depth of field in that shot is atrocious.
A nigh-literal Vaseline filter.
DL2 was another game, like Cyberpunk, that never really super impressed me graphically, even maxed out.

Also, literally every pic I see online looks better than what I played on PC + OLED.

I feel like people cherry-pick pics with these games (I mean, of course nobody takes screens of ugly-looking locations, but these games can look very ugly very often).

They feel wildly uneven, like the lows are extremely low and too frequent.
 
4K 120 fps and the waytwayshing... I personally don't care about that, so a 70-series card is best for me.
For most people's gaming needs, a 60- or 70-series is more than enough.
 
I enjoy having the fastest GPU, which is why I went from a 3090 to a 3090 Ti and will very likely buy a 4090 and then a 4090 Ti.
 
DL2 was another game, like Cyberpunk, that never really super impressed me graphically, even maxed out.

Also, literally every pic I see online looks better than what I played on PC + OLED.

I feel like people cherry-pick pics with these games (I mean, of course nobody takes screens of ugly-looking locations, but these games can look very ugly very often).

They feel wildly uneven, like the lows are extremely low and too frequent.

Same here; it's really just cherry-picking pictures.
 
Yeah, but will you need a 4090 for that, or will a 4080 or even a 3090 be just fine for the rest of the generation?
My 3080 can't run Cyberpunk at 4K 60 fps. I have to use DLSS Performance, which uses a 1080p base resolution, and it still tops out around 50 fps with ray-traced lighting set to Quality instead of the max Psycho setting.

I'm guessing the 4080 won't be able to run it at native 4K 60 fps with ray tracing enabled either, since Nvidia is being shady about revealing actual non-DLSS performance numbers. The Matrix demo also topped out at around 40 fps at native 4K.
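For reference, a rough sketch of the commonly cited DLSS 2.x per-axis scale factors (the helper function and exact numbers are illustrative; games can override these):

```python
# Internal render resolution per DLSS 2.x mode, using the commonly
# cited per-axis scale factors (illustrative; exact behavior varies by game).
DLSS_SCALE = {
    "Quality":           2 / 3,  # ~67% per axis
    "Balanced":          0.58,
    "Performance":       0.50,   # 4K output renders internally at 1080p
    "Ultra Performance": 1 / 3,
}

def internal_res(out_w, out_h, mode):
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in DLSS_SCALE:
    print(f"{mode}: {internal_res(3840, 2160, mode)}")
# Performance at 4K -> (1920, 1080), the 1080p base resolution mentioned above.
```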
 
DL2 was another game, like Cyberpunk, that never really super impressed me graphically, even maxed out.

Also, literally every pic I see online looks better than what I played on PC + OLED.

I feel like people cherry-pick pics with these games (I mean, of course nobody takes screens of ugly-looking locations, but these games can look very ugly very often).

They feel wildly uneven, like the lows are extremely low and too frequent.
Just like in real life:
a dynamic time of day means sometimes things look ugly, other times beautiful.
The console version looks uglier 10 times out of 10.

But that's not even the point.

We're talking about why you would buy a high-powered PC to play the same games... my point is that the games look better on PC and run at a higher resolution and refresh rate.
If that improvement isn't worth it for you, that's fine... no one forced anyone to play on PC or even to buy a high-powered GPU.
But those who appreciate the higher quality have hardware that can accommodate it.

There are people who still play on 1080p TN panels with washed-out colors.
That's them; you play on an OLED.
Effectively, OP could have reframed this question as: why do you want things to look better when you're actually playing the same shit?
Why did you pick up an OLED, why not keep playing on a 1080p TN TV? That OLED cost ~1000 dollars; a 1080p TN screen you can find in the bin for free.
 
Since consoles are pretty much running on PC architecture, what's the point of buying top-of-the-line GPUs when the majority of people have a 1060 and developers keep that in mind when developing most games? Sure, the graphics may be shinier and you get a bit more fps, but compared to consoles the gap isn't as big as it was ten years ago.
Finally, someone has seen through the BS that the GPU market is. It's an absolute waste of money when all it comes down to is a higher framerate and resolution. "A fool and his money" springs to mind.
 