640 (x 480) ought to be enough for anybody.
Nintendo endorses this message
> 640 (x 480) ought to be enough for anybody.

I never did that comparison with a 1440p 27" monitor, but I had a few and they were barely any better than 1080p.
I had a 1080p 27" monitor running 4K supersampling and a 4K 27" monitor running native. These were the results:

[comparison image]

Now, this was at 27". It's even more important on bigger screens.
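For anyone curious, the pixel-density gap at that size is easy to put numbers on. A quick sketch (plain math, nothing vendor-specific; 27" diagonal and 16:9 resolutions assumed):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch for a given resolution and panel diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K":    (3840, 2160)}.items():
    print(f'{name} @ 27": {ppi(w, h, 27.0):.0f} PPI')
# ~82, ~109, and ~163 PPI: native 4K at 27" has double the linear pixel
# density of 1080p, which a supersampled 1080p panel can't reproduce.
```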
> laughs in 1080p plasma

Don't you mean cries? It's time to move on, bud. Even the Kuro Elite plasmas (owned two, one calibrated) and the Panasonic VT60 just fail to impress these days. OLED has come so far and is relatively inexpensive. Even what are now going to be last-gen models have addressed the weaknesses of OLED to such a degree that I think we're nearing the point, like with cell phones, where the average flagship is so capable that you really don't need the latest model (or even the one before that).
> I have yet to game at 4K on PC. I look forward to it, actually. Whenever I'm ready to spend another 700 dollars on a monitor. Til then, 1440p at max frames on ultra is more than enough.

Right, PC is crazy expensive just for a monitor to go up to that.
I'm still on 1080p.
Not due to money issues, but my monitors just refuse to kick the bucket and I'm not gonna waste money
And honestly?
Games look great in 1080p!
Realistically, 1440p is enough pixels for games. Native 4K at the expense of frame rate is a waste. Games need frames more than pixels right now, so drop the internal resolution and do AI upscaling from 1440p to 4K for me, all day every day.
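The pixel math behind that take, as a rough sketch (assumes GPU cost scales about linearly with pixels shaded, which ignores fixed per-frame costs like geometry and CPU work):

```python
# Rough pixel-count math behind rendering at 1440p and upscaling to 4K.
native_4k = 3840 * 2160       # 8,294,400 px
internal_1440p = 2560 * 1440  # 3,686,400 px

ratio = internal_1440p / native_4k
print(f"1440p internal is {ratio:.0%} of native 4K pixel work")       # ~44%
print(f"Native 4K shades {native_4k / internal_1440p:.2f}x the pixels")  # 2.25x
```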
> $700? There are 4K monitors for around $300.

Post a link and I'll tell you why I'm not considering it.
> Try that without any upscalers or frame generation, on the highest settings. Don't think so.

What are you talking about, man? Perfect example: I can run FF15 at 8K with a 3080 Ti; a 4090 would destroy it easily. In fact, it can run it easily above 50 fps.
> Those calculators are bullshit. I have a 48 inch CX OLED that I've been using as a monitor for well over a year. Switching back and forth between 1800p and 4K at a little over a meter viewing distance, it's obvious that one is sharper than the other. 1800p is a great in-between resolution, because 1440p is completely insufficient at sizes greater than 32 inches for PC gaming purposes. Again, the calculator is bullshit; it says anything more than 1440p is unnoticeable at 3.2 feet. I guess all those manufacturers making 4K 32-inch monitors are doing it for the lulz.

Dude, you are talking about an upscaled picture, and since upscaling destroys edge sharpness (especially without an integer ratio), of course you can see the difference. Upscaled 1440p looks very bad on a 4K display, but the same 1440p looks amazing on a 2560x1440 display.
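For what it's worth, here's roughly what those viewing-distance calculators are computing; a minimal sketch, assuming a 16:9 panel and the usual 60-pixels-per-degree 20/20-acuity cutoff (perception doesn't actually fall off a cliff at that number, which is part of the disagreement):

```python
import math

def pixels_per_degree(h_pixels: int, screen_width_m: float, distance_m: float) -> float:
    """Approximate angular pixel density across the screen width."""
    fov_deg = 2 * math.degrees(math.atan(screen_width_m / (2 * distance_m)))
    return h_pixels / fov_deg

# 48" 16:9 panel: width = diagonal * 16 / sqrt(16^2 + 9^2) ~= 1.06 m.
width_m = 48 * 0.0254 * 16 / math.hypot(16, 9)
for name, h in [("1440p", 2560), ("1800p", 3200), ("4K", 3840)]:
    print(f"{name}: {pixels_per_degree(h, width_m, 1.1):.0f} ppd at 1.1 m")
# Roughly 50, 62, and 75 ppd -- straddling the ~60 ppd 20/20 threshold,
# so a visible difference at this size and distance is plausible.
```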
> 4K is overrated by people who can't achieve 8K.

8K is for sour blueberries; 64K is where it's at.
> When we get affordable GPUs that can do 4K at 240+ fps, people will stop signal-boosting 1440p over 4K. The problem with 4K is that it comes at a higher fps cost than 1440p, but eventually 4K fps will reach an acceptable level for everyone who's not an esports gamer, just like 720p, 1080p and 1440p did.

That's what "sweet spot" means. If 4K were free in GPU cost, all of us would go 4K "just because," even when not getting the benefits due to screen size and pixel density. It's not free, though; we have to find a balance between performance and IQ, and to be fair, 30 fps looks ugly as fuck to many of us, so 4K at 30 fps isn't even an option.
> Dude, you are talking about an upscaled picture, and since upscaling destroys edge sharpness (especially without an integer ratio), of course you can see the difference. Upscaled 1440p looks very bad on a 4K display, but the same 1440p looks amazing on a 2560x1440 display.

Acts like I haven't had 1440p monitors before... I know how upscaling works. I had some of the highest-end displays during the Xbox 360/PS3 and PS4/Xbox One gens, and in each, having the best TV (best overall picture combined with low input lag) usually meant having one with a higher resolution than the native res of the games. A handful of decent-looking 1080p titles really popped when I had my Pioneer plasma, but most relied on scaling (the Xbox in the 360's case and, I believe, the TV in the PS3's, as that system did no internal scaling if I recall). When the first 4K OLEDs came out, you were better off with the 1080p model for most gaming needs. With image reconstruction, the need to stick to the display's native res is fast becoming antiquated. People who drop major coin on 4080s/90s and then play on a $350-400 monitor are wasting money.
> It’s weird how for a long time PC gamers were playing at a very high res. Then, since 1080p, console gamers started to want higher res. Now it has switched.

Probably because 1080p is already pretty good anyway at the monitor sizes most people use, especially those for competitive gaming. Also, as PC gaming is about choices, nothing stops a PC player from using a 4K screen for single-player stuff if that's what they want.
On PC it’s always about flexibility.
I can’t wait to get a 1440p monitor and a Zen 4 to run all the games I want from the past 10 years.
> 4K 30fps and 1440p 60fps: I know which I'll choose. We aren't playing screenshots.

30fps is not screenshots. It's perfectly fine.
> What are you talking about, man? Perfect example: I can run FF15 at 8K with a 3080 Ti; a 4090 would destroy it easily. In fact, it can run it easily above 50 fps. And that's a high-end game; anything below that, a 4090 can run fine at 8K.

Of course some games will run, but not all of them, not even close. But that's not the point: 4K is overrated and a waste of resources in many cases.
77" OLED here, aside from a tiny bit of ghosting for very fine geometry I`ve had barely any artifacting with DLSS at Quality setting since dlss 2.0 released and with the current version native resolution is now completely obsolete for me.I dont know about that, but what I do know is that DLSS artifacts are way worse and more noticable on Plasma or any TV tech with High-Motion Resolution than DF admits (LCD "Blur" hides the artifacts).
I personally need Native because I game on Plasma.
Consoles should still aim at 1440p but for PC its Native all the way.
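Worth noting what DLSS Quality actually renders internally. A quick sketch using the commonly documented DLSS 2.x per-axis scale factors (exact factors can vary by game and version):

```python
# Internal render resolutions behind DLSS output modes (commonly
# documented DLSS 2.x per-axis scale factors; treat as approximate).
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 1 / 2,
    "Ultra Performance": 1 / 3,
}

out_w, out_h = 3840, 2160  # 4K output
for mode, s in DLSS_SCALE.items():
    w, h = round(out_w * s), round(out_h * s)
    print(f"{mode}: renders {w}x{h}, upscaled to {out_w}x{out_h}")
# Quality at 4K renders 2560x1440 -- "4K DLSS Quality" is 1440p internally.
```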
> The color blue is the best.

Nah, red is the real deal. Blue is overrated.
> Acts like I haven't had 1440p monitors before... I know how upscaling works. I had some of the highest-end displays during the Xbox 360/PS3 and PS4/Xbox One gens, and in each, having the best TV (best overall picture combined with low input lag) usually meant having one with a higher resolution than the native res of the games. A handful of decent-looking 1080p titles really popped when I had my Pioneer plasma, but most relied on scaling (the Xbox in the 360's case and, I believe, the TV in the PS3's, as that system did no internal scaling if I recall). When the first 4K OLEDs came out, you were better off with the 1080p model for most gaming needs. With image reconstruction, the need to stick to the display's native res is fast becoming antiquated. People who drop major coin on 4080s/90s and then play on a $350-400 monitor are wasting money.

So if you know how upscaling works, then why were you whining about 1440p picture quality when upscaled to 4K, when you should know the same picture would look totally different on a 2560x1440 monitor?
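On the integer-ratio point, the factors are easy to check; a minimal sketch ("integer" here just means each source pixel maps to a whole number of output pixels, so there's no fractional resampling):

```python
from fractions import Fraction

def scale_ratio(src, dst):
    """Per-axis upscale factor (assumes matching 16:9 aspect ratios)."""
    return Fraction(dst[0], src[0])

pairs = {
    "1080p -> 4K": ((1920, 1080), (3840, 2160)),
    "1440p -> 4K": ((2560, 1440), (3840, 2160)),
    "1800p -> 4K": ((3200, 1800), (3840, 2160)),
}
for name, (src, dst) in pairs.items():
    r = scale_ratio(src, dst)
    tag = "integer" if r.denominator == 1 else "non-integer"
    print(f"{name}: x{float(r):.2f} ({tag})")
# 1080p -> 4K is exactly 2x; 1440p (1.5x) and 1800p (1.2x) are fractional,
# which is why those upscales soften edges without reconstruction.
```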