Honestly what's the point of spending money on the highest gpu

*Buying a 3080ti and still rocking a 1080p60fps monitor*

Get a 27 inch G-sync compatible 240Hz monitor and tell me you only get slight improvements compared to a console.

This sounds like some peasant downplaying something he never tried.

Nothing more cringe on the internet than PC elitism; video games are a form of entertainment. Why do you think you should talk down to me and call me a peasant because of the platform you play videogames on? It's honestly laughable. I hope your Dad bought you your PC and you are under the age of 18.

I prefer to optimize for graphical fidelity on PC, since I already have a 4K120 OLED for console gaming so high refresh rate isn't really a differentiator between PC/console for me. I run a 27in 2K 165Hz LED G-sync compatible monitor that I got when I bought my PC. This gives me a good balance of being able to choose max settings and still being able to run games at more than 60fps. Yes I can see that Destiny 2's textures and particle effects look better than on my PS5. Overall OLED was still a more significant upgrade than anything my 3080Ti gave me. I don't think my purchase was worth it. Sorry that offends you.
 
Would it have been so hard to type that in the first place? How could you expect anyone to know that lol?

You might have thought you were saving time by not typing it out, but look... you just had to type it out in a second comment. That's more wasted time than if you had just told everyone wtf it was you were saying in the first place lol.
 
Raw power lets you brute-force poorly optimized PC ports (so, most of them) and underperforming game engines.
Also, VR needs power if you are serious about it.
Perf/W should always be the priority though; very high-end overclocked CPUs and GPUs tend to be much less efficient, and the marginal gains are a waste of electricity IMO.
 
My brother was a corpo salary man. Made lots of €€€.

Funny thing though: he never had the time to spend it. All day in meetings with other soulless drones pretending the world really needed his job.
He quit, became an artist, and now he works 3 days a week. He downgraded his Netflix subscription to basic.
 
Let's spend thousands of dollars to look at puddles.

These Dying Light pics though - damn!
What about the realistic bounce lighting and the contrast between light and dark? Look how the lighting with RT shades the entire scene. Massive gap, even in that low-quality image.
 
Nothing more cringe on the internet than PC elitism; video games are a form of entertainment. Why do you think you should talk down to me and call me a peasant because of the platform you play videogames on? It's honestly laughable. I hope your Dad bought you your PC and you are under the age of 18.

I prefer to optimize for graphical fidelity on PC, since I already have a 4K120 OLED for console gaming so high refresh rate isn't really a differentiator between PC/console for me. I run a 27in 2K 165Hz LED G-sync compatible monitor that I got when I bought my PC. This gives me a good balance of being able to choose max settings and still being able to run games at more than 60fps. Yes I can see that Destiny 2's textures and particle effects look better than on my PS5. Overall OLED was still a more significant upgrade than anything my 3080Ti gave me. I don't think my purchase was worth it. Sorry that offends you.


My ninja, didn't you say not one game has had you experience the 3080Ti at its fullest?
But now you're saying you have a 1440p 165Hz monitor and play Destiny 2 on it?
The 3080Ti can't even consistently reach that refresh rate; it averages between 120 and 140fps in Destiny 2.
So you literally have seen your GPU's full potential?

If you wanted HDR OLED, you've got the panel; connect your PC to it.


Just in case you forgot what you posted... it's below.

I bought a $2400 gaming rig with a 3080Ti in it about a year ago. It was a waste of money. Not one game has ever felt like it used the 3080Ti to its fullest potential. You can get slight improvements compared to next-gen consoles on games that are optimized well, for 5x the price.

Let's not forget that most games are not optimized well for higher-end hardware and don't scale at all past a certain point (ex. Escape from Tarkov where I get the exact same performance as my friend with a 2080).
 
I've never seen the point. I suppose it depends on what you value, but staying a few years behind offers a wonderful price/performance ratio and any differences are minimal imho.
 
The gap between consoles and the highest-end gaming rig is massive. Cyberpunk 2077 at native 4k with max everything, including RTX, looks like a completely different game than what is on the Series X and PS5.
 
i play Vampire Savior on my RTX 3090ti so i can post on the Steam discussions page telling everyone how rock hard my nips are
 
This topic causes more high blood pressure than working in a salt mine.

There may be a thrifty (or poor! It's not funny!) gamer vs. opulent gamer dynamic shaping up in this thread.

But I don't hate it because IT'S THE WEEKEND MUTHAFUCKAAAAAAAAHHHSSS!!!!!!

LET'S FIGHT ABOUT VIDEO GAMES!!!

SMOKE EM IF YOU GOT EM!!!

 


My ninja, didn't you say not one game has had you experience the 3080Ti at its fullest?
But now you're saying you have a 1440p 165Hz monitor and play Destiny 2 on it?
The 3080Ti can't even consistently reach that refresh rate; it averages between 120 and 140fps in Destiny 2.
So you literally have seen your GPU's full potential?

If you wanted HDR OLED, you've got the panel; connect your PC to it.


Just in case you forgot what you posted... it's below.

By your logic every poorly optimized game that comes out shows me the power of my 3080Ti because I can't max out the framerate on ultra settings. Destiny 2 is not exactly poorly optimized, but it is surely not built to get the most out of a 3080Ti. It is likely that much of Destiny 2's graphics are determined by code that was written in 2017, and let's be honest the game looks basically identical to D1 which launched in 2014. I imagine the engine determines quite a bit of the game's graphical presence, and they probably haven't made any significant graphics updates to it since 2014.

Yes, some games are more scalable based on general power (i.e. teraflops), and my 3080Ti will perform better than a 2080 when devs make use of this scalability. But how much better? Is it worth the cost over a console? No. I have not played a single game that was optimized to get the most out of a 3080Ti, because no developer is going to put time and resources into tricky performance optimizations for <3% of gamers, when the reality is that almost no dev gets to include everything they want in a game from a feature perspective, let alone graphics. Even where they want to, it is harder to do because optimizing a game for a nebulous "PC" rig is different than optimizing for console, where they know exactly what hardware they are working with and can make more assumptions in optimizing their code. OK, so a dev knows I have a 3080Ti. But what about my CPU, cooling… Not all PC gamers even have SSDs!

A spec sheet doesn't tell you everything. This is why games get their biggest graphical upgrades when a new console gen comes out: that is when the industry (barring cross-gen titles) decides the lower benchmark has been pushed up and devs can make stronger assumptions about what hardware their game will run on.
 
I'd rather just have a 1080p 165Hz monitor and max that out than waste all that power on 4k.
 
I can't even convince myself to upgrade my 980ti or 1080p monitor. It still runs every game pretty well and I do most of my gaming on consoles anyway.
 
Mostly to reach the monitor's high refresh rate at its native resolution. 60 fps is fine on a TV on the couch and shit, but on PC? Over 120 fps is amazing. Anyone saying they can't feel the difference is a fucking idiot. No, u don't need a 4k monitor; 1440p is the perfect resolution, and to reach those framerates at that res you'd generally need something over a 2080.
Love everything you said except for the part about "not needing 4k". I would say that depends on what screen size you prefer and how close you sit to it. 1440p on a 32 in monitor at keyboard-and-mouse distance is noticeably less sharp than at 27 in and below; it is actually about as sharp as a 24 in 1080p monitor. Once you get to 32 in and above, you will absolutely prefer 4k over 1440p. Now, most ppl say that 24-27 in is the ideal monitor size for esports since you can see the whole screen, but old guys like me just want immersion, so I won't use anything smaller than 32, which means it's 4k for me.
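The "32 in 1440p is about as sharp as 24 in 1080p" comparison checks out with simple pixels-per-inch arithmetic. A quick sketch (the `ppi` helper is just for illustration; the resolutions are the standard 2560x1440 and 1920x1080):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(2560, 1440, 32), 1))  # 1440p at 32 in -> ~91.8 PPI
print(round(ppi(1920, 1080, 24), 1))  # 1080p at 24 in -> ~91.8 PPI
print(round(ppi(2560, 1440, 27), 1))  # 1440p at 27 in -> ~108.8 PPI
```

At 32 in, 1440p lands at the same ~92 PPI as a 24 in 1080p panel, which is why the two look equally sharp at the same viewing distance, while 1440p at 27 in is noticeably denser.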

I run a custom almost-ultrawide res on a 48 in LG OLED, so it's something like 3400x1700. My 3080ti can't run everything maxed out at 120fps, and I have to drop to 60 or below to consider ray tracing, so I feel like I "need" a 4090 so I can use my full 4k screen, ultra, and RT. And I still won't get 90 or 120fps with the 4090 unless I disable ray tracing. I know I'm ridiculous, but this is the only thing I spend money on and I won't compromise.
 
Ah, the "I would rather have" crowd.

I remember when people would rather have 30 fps than pay extra for faster framerates, because you "can't feel or see the difference."

Then this gen having 120 fps was a game changer for the very same people.

It's easy to prefer what you have rather than what you can't achieve.
Bad take, the non-RT image here is better. RT is not consistent and the difference is not worth the $$$.
 


My ninja, didn't you say not one game has had you experience the 3080Ti at its fullest?
But now you're saying you have a 1440p 165Hz monitor and play Destiny 2 on it?
The 3080Ti can't even consistently reach that refresh rate; it averages between 120 and 140fps in Destiny 2.
So you literally have seen your GPU's full potential?

If you wanted HDR OLED, you've got the panel; connect your PC to it.


Just in case you forgot what you posted... it's below.
There is more to this than resolution and frame rate lmao. I think the point he was making is that no games push what those cards could really do in terms of real visuals, not just pixel count and fps.

Example would be Crysis on a high end PC back in the day versus Crysis on the Xbox 360. There was a clear advantage to having a high end PC, it wasn't just about running the game faster at a higher res.
 
Mostly to reach the monitor's high refresh rate at its native resolution. 60 fps is fine on a TV on the couch and shit, but on PC? Over 120 fps is amazing. Anyone saying they can't feel the difference is a fucking idiot. No, u don't need a 4k monitor; 1440p is the perfect resolution, and to reach those framerates at that res you'd generally need something over a 2080.
My son asked me to help as his computer had gone really laggy a couple of months back.

That smooth 60Hz everyone apparently loves was what his gaming monitor had suddenly locked itself to.

A quick change of the setting and oh, it's back to being smooth again. Thanks.
 
Realistic games are ugly because of how unrealistic they are; after a decade of waiting, we've just about gotten to the point where I find them more than just bearable, actually nice looking. I think a 4080 will just about be enough to make realistic games satisfying to me.
 
I kind of think it's funny that the most powerful GPUs, which can easily run 99% of all games, have the most options for faking higher resolution and frame rate.

To me it would make more sense for someone like Nvidia to make an HDMI pass-through DLSS device at a lower price.
The value of the difference is closer to $50, not $500+. And RT is not consistently better; I would rather have the clearer non-RT setting.
FobnNKY.png
lol those images aren't even the same angles.
 
I'll never understand people who "future proof".

An xx70 series card is roughly half the price of the top end and will let you play 100% of games at max settings.

Cause you save so much money, you could literally upgrade every 2nd year to the next xx70 series card, while the x90 or Titan user is holding onto that thing for 4-5 years to try to justify their purchase.
 
I'll never understand people who "future proof".

An xx70 series card is roughly half the price of the top end and will let you play 100% of games at max settings.

Cause you save so much money, you could literally upgrade every 2nd year to the next xx70 series card, while the x90 or Titan user is holding onto that thing for 4-5 years to try to justify their purchase.
Budget is relative.
 
I kind of think it's funny that the most powerful GPUs, which can easily run 99% of all games, have the most options for faking higher resolution and frame rate.

To me it would make more sense for someone like Nvidia to make an HDMI pass-through DLSS device at a lower price.

lol those images aren't even the same angles.
Not my fault the camera kept moving as it shifted; you can check the Overdrive RT trailer.
This is just an example of an issue that recurs in RT games. Developers can't just turn RT on and leave it; they still need to bake some lighting from scene to scene. In a Returnal developer talk, they showed how they needed to touch up RT images before copying the lighting files.
 
Nothing more cringe on the internet than PC elitism; video games are a form of entertainment. Why do you think you should talk down to me and call me a peasant because of the platform you play videogames on? It's honestly laughable. I hope your Dad bought you your PC and you are under the age of 18.

I prefer to optimize for graphical fidelity on PC, since I already have a 4K120 OLED for console gaming so high refresh rate isn't really a differentiator between PC/console for me. I run a 27in 2K 165Hz LED G-sync compatible monitor that I got when I bought my PC. This gives me a good balance of being able to choose max settings and still being able to run games at more than 60fps. Yes I can see that Destiny 2's textures and particle effects look better than on my PS5. Overall OLED was still a more significant upgrade than anything my 3080Ti gave me. I don't think my purchase was worth it. Sorry that offends you.
Has a PS5, a 27" 2K G-sync monitor, a 4K 120Hz OLED TV, and a 3080Ti in his PC - posting in a thread about why you would spend money on x.
Classic.
 
Has a PS5, a 27" 2K G-sync monitor, a 4K 120Hz OLED TV, and a 3080Ti in his PC - posting in a thread about why you would spend money on x.
Classic.

What does this even mean? I don't know if you missed it, but according to some of the "real" PC gamers in this thread I am a peasant because I don't have a 240Hz monitor. If I make a purchase, I have to get some amount of value/enjoyment out of it to justify the price, and I'm just giving my opinion on why, in the case of my $2K gaming PC, this didn't really happen. I think there are some great things about gaming on PC, but paying for high-end graphics has not been worth it. If I had spent about half the price I would feel better; I should've gone for a cheaper build.
 
Because I want to push my 4K 160Hz monitor, and for some the price isn't an issue. Though I totally get waiting for the more budget-friendly versions of these cards.
 
I'll never understand people who "future proof".

An xx70 series card is roughly half the price of the top end and will let you play 100% of games at max settings.

Cause you save so much money, you could literally upgrade every 2nd year to the next xx70 series card, while the x90 or Titan user is holding onto that thing for 4-5 years to try to justify their purchase.
I'm not sure that "future proofing" is making people hold on to their cards much longer; it's no Titan situation, but I see lots of people with 3080s interested in upgrading now. Maybe because DLSS 3 is 4000-series exclusive?

Anyhow, I jumped from 780Ti to 980Ti to 1080Ti, so I won't pretend to be wise here. Waiting for the AMD showcase is probably the most sane thing to do this time, but there is nothing sane about this generation of PC hardware anyway.
 
Bragging rights mostly, or if you have money to blow. GPU tech is changing too rapidly to go all in on the highest model. In 2-3 years there's already a newer and better card. You can sell it and get some of your money back, but I'd rather just go x070 or x080 and have a nice card for several years without wasting too much money.
 
Honestly what's the point of spending money on the highest gpu

Honestly, what's the point of spending money on the GPU?
Honestly, what's the point of the GPU?
Honestly, what's the point of spending money?
Honestly, what's the point of money?
Honestly, what's money?
Honestly, what's the point?
Honestly, what?
What?
 
Because I want to play my games at 4K 120 FPS on my LG C9 without compromise? Sure as hell not going to do it with a 1060.
 
Honestly what's the point of spending money on the highest gpu

Honestly, what's the point of spending money on the GPU?
Honestly, what's the point of the GPU?
Honestly, what's the point of spending money?
Honestly, what's the point of money?
Honestly, what's money?
Honestly, what's the point?
Honestly, what?
What?

I am not so sure that picking a GPU, earning money, or participating in transactions involving money are as controversial as paying absurd prices for flagship products.
 