
Digital Foundry: Silent Hill 2 Remake PC "Visuals Scale Beyond PS5 - But #StutterStruggle Cannot Be Avoided"

rodrigolfp

Haptic Gamepads 4 Life
There is a latency cost to DLSS but the whole point of it is to increase your framerate by lowering rendering time (thus gaining overall). For a given framerate though you end up with ever so slightly higher latency.
If there is, it's compensated by the boost in frame rate.
 

Gaiff

SBI’s Resident Gaslighter
By system latency I mean the "PC latency" and "console latency" if using nvidia terminology then.

Yeah it's not much, was just making it clear that DLSS does increase latency because you stated it doesn't.
Yeah, but at this point we're getting into semantics. There's almost no scenario where DLSS will increase latency compared to native, unlike with frame generation. The upscaling cost does add to latency, but the lower resolution and faster rendering time completely erase it. The only time you will see an increase in latency is when you hit a CPU limit and DLSS still takes its few milliseconds to render without increasing the frame rate, because the load on the CPU isn't any lighter. However, why would you use DLSS under these circumstances? I guess maybe to lower GPU power consumption?
Frame generation increases latency. DLSS doesn't.
"Technically" there is due to the fact that upscaling does have a frame time cost, which increases latency, but we're talking single digits in frame time and probably less than 1-2ms of latency.
There is a latency cost to DLSS but the whole point of it is to increase your framerate by lowering rendering time (thus gaining overall). For a given framerate though you end up with ever so slightly higher latency.
This. In practicality though, you will pretty much never see your latency increase by turning on DLSS.
 
Last edited:

rodrigolfp

Haptic Gamepads 4 Life
"Technically" there is due to the fact that upscaling does have a frame time cost, which increases latency, but we're talking single digits in frame time and probably less than 1-2ms of latency.
I don't doubt. But as I said, boosting the frame rate significantly compensates that and makes the latency lower in the end.
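To put hypothetical numbers on that tradeoff (illustrative frame times, not measurements from any game): suppose a native 4K frame takes 25ms, while rendering at a lower internal resolution takes 12ms plus a 2ms upscale pass.

```python
# Hypothetical frame times in ms; illustrative only, not measured values.
NATIVE_RENDER_MS = 25.0    # render a frame at native resolution
INTERNAL_RENDER_MS = 12.0  # render at the lower internal resolution
UPSCALE_COST_MS = 2.0      # assumed cost of the upscale pass

def fps(frame_ms):
    return 1000.0 / frame_ms

native_ms = NATIVE_RENDER_MS
dlss_ms = INTERNAL_RENDER_MS + UPSCALE_COST_MS

print(f"native: {native_ms:.1f} ms/frame ({fps(native_ms):.0f} fps)")
print(f"DLSS:   {dlss_ms:.1f} ms/frame ({fps(dlss_ms):.0f} fps)")
# The upscale pass adds 2 ms, but the frame as a whole finishes 11 ms
# sooner, so end-to-end latency still drops.
assert dlss_ms < native_ms
```

Same logic as the post above: the upscaler's cost is real, but the frame-time saving from the lower internal resolution swamps it.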
 

Thebonehead

Gold Member
Came close to actually and unironically asking my wife to record my hands while testing games… what a goddamn moron. This place does no good for me.
Season 4 Reaction GIF by The Office
 

Three

Gold Member
Yeah, but at this point we're getting into semantics. There's almost no scenario where DLSS will increase latency compared to native, unlike with frame generation. The rendering cost does add to latency, but the lower resolution and faster rendering time completely erase it. The only time you will see an increase in latency is when you hit a CPU limit and DLSS still takes its few milliseconds to render without increasing the frame rate, because the load on the CPU isn't any lighter.
"Technically" there is due to the fact that upscaling does have a frame time cost, which increases latency, but we're talking single digits in frame time and probably less than 1-2ms of latency.



This. In practicality though, you will pretty much never see your latency increase by turning on DLSS.
You just said there is almost no scenario and then you posted the scenario. It's in the video you posted too with Fortnite. I'm not saying this is huge or not worth enabling DLSS/PSSR for the improved image quality or performance. I just made a correction by stating that DLSS has a small latency cost because you stated it doesn't increase it.
 
Last edited:

Gaiff

SBI’s Resident Gaslighter
You just said there is almost no scenario and then you posted the scenario. It's in the video you posted too with Fortnite. I'm not saying this is huge or not worth enabling DLSS/PSSR for the improved image quality. I just made a correction by stating that DLSS has a small latency cost because you stated it doesn't increase it.
Once again, you're being nitpicky. In practical terms, does it increase latency? No, it doesn't. If you turn on DLSS, will your latency increase or decrease? It will decrease. Now if you wanna go "but ackshually, it takes 1-4ms for the upscaling to process, increasing system latency" then go right ahead, but you're completely missing the point of the discussion.

As for the "almost no scenario," I think this still applies. Who turns on DLSS when they're hitting 300-400fps (or CPU limits)? In real life, you will virtually never run into scenarios where DLSS increases latency. With frame generation, if you don't have Reflex to compensate, you will see scenarios where your input latency increases, so that's a real and legitimate concern. That's why we all say that DLSS decreases latency and frame generation increases it. We're talking in terms of what the user experiences, not what happens when you break down the rendering process into chunks and take the single-digit milliseconds of DLSS that are offset by the tens of ms of rendering time it saves anyway.
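That CPU-limit corner case can be sketched with a toy model (all numbers hypothetical): frame pacing is gated by whichever of the CPU or GPU is slower, so when the CPU is the bottleneck, DLSS shortens the GPU's work without raising fps, while its upscale pass still sits in the frame's latency chain.

```python
# Toy model of the CPU-limited case (hypothetical numbers, in ms).
def frame_time(cpu_ms, gpu_ms):
    # The frame can't finish faster than the slower of the two.
    return max(cpu_ms, gpu_ms)

CPU_MS = 10.0            # CPU work per frame; DLSS doesn't lighten it
GPU_NATIVE_MS = 8.0      # GPU already outpaces the CPU at native res
GPU_DLSS_MS = 4.0 + 2.0  # lower-res render plus an assumed 2 ms upscale

native = frame_time(CPU_MS, GPU_NATIVE_MS)  # 10.0 ms
dlss = frame_time(CPU_MS, GPU_DLSS_MS)      # still 10.0 ms
# No frame-rate win, yet the upscale cost remains in the latency chain.
assert native == dlss
```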
 
Last edited:

Three

Gold Member
Once again, you're being nitpicky. In practical terms, does it increase latency? No, it doesn't. If you turn on DLSS, will your latency increase or decrease? It will decrease. Now if you wanna go "but ackshually, it takes 1-4ms for the upscaling to process, increasing system latency" then go right ahead, but you're completely missing the point of the discussion.
I don't understand why you're up in arms about something I've clarified several times: that it has a minor cost. It depends entirely on what you're turning it on for. That's the point. If I play at 1080p native, then decide that I want to use DLSS to upscale to 1440p/4K with a base 1080p, have I done so without a latency cost? You said it was latency-free, right?

I just made a small correction that said DLSS has a latency cost but usually it's used to increase framerate anyway. That was it, and yet you seem very adamant to argue about it.
 
Last edited:

Gaiff

SBI’s Resident Gaslighter
I don't understand why you're up in arms about something I've clarified several times, about it having a minor cost. It depends entirely on what you're turning it on for. That's the point. If I play at 1080p native then decide that I want to use DLSS to upscale to 1440p with a base 1080p. Have I done so without a latency cost? You said it was latency free right?

I just made a small correction that said DLSS has a latency cost but usually it's used to increase framerate anyway. That was it, and yet you seem very adamant to argue about it.
I'm not up in arms about anything. Why do you think everyone is utterly confused by what you say until you clarify and we all go, "Oh, yeah, I see what you mean"? Because you're not approaching this discussion like anyone else. When I say DLSS is latency-free, I mean in practical terms, because flipping it on will cause the latency to decrease compared to off. That's what matters. That it adds a bit of latency which it then more than cancels out anyway is all we care about.

Are you correct about what you say? 100%. Does it add to the discussion or help in any kind of way? Not really.
 

Three

Gold Member
I'm not up in arms about anything. Why do you think everyone is utterly confused by what you say until you clarify and we all go, "Oh, yeah, I see what you mean." Because you're not approaching this discussion like anyone else. When I say DLSS is latency-free, I mean in practical terms because flipping it on will cause the latency to decrease compared to off. That's what matters. That it adds a bit of latency that it completely eliminates and then some anyway, is what we care about.

Are you correct about what you say? 100%. Does it add to the discussion or help in any kind of way? Not really.
What exactly are you arguing about then if you're not up in arms? You seem to be resorting to this silly "nobody finds your posts useful" thing and keep repeating the same thing. Everything was clear from the beginning. This was my first post here:
I don't think so? DLSS doesn't increase input lag, nor does XeSS. What increases input lag is frame generation,

DLSS also does increase latency by a percentage but it reduces rendering time by lowering res anyway. The whole point is to increase native framerate. For a given base res though vs native base without DLSS your input latency would be higher (along with frametime) with DLSS.

That was it, but you're here arguing over and over about what, exactly, if it was clear enough?
 

Gaiff

SBI’s Resident Gaslighter
That was it, but you're here arguing over and over about what exactly if it was clear enough?
I'm arguing that your point was not only completely useless but also missed the point of the discussion.

I don't think so? DLSS doesn't increase input lag, nor does XeSS. What increases input lag is frame generation,

Do you know why I said this? Because Vick quoted a dev stating that Reflex was developed as a response to the latency introduced by DLSS. Given the context that DLSS in practical terms reduces latency, why would you then join the discussion to point to the 1-4ms DLSS takes to process as if it was relevant to NVIDIA developing Reflex? Because we both know NVIDIA didn't develop Reflex to get rid of those single-digit or below input latencies brought about by DLSS. That's what I'm arguing. You're not only nitpicky but your points were poorly constructed until you clarified. They added nothing to the discussion, and on the contrary, made it diverge into a pointless tangent.

Now if we could go back to discussing useful stuff instead of "ackshually, it increases latency but decreases it by an even greater amount!" that'd be great.
 
I wonder if Alex decided to compare PS5 with an RTX 4090?
There are no PS5 vs PC (same spec) benchmarks for GoW Ragnarök or SH2.
It would be helpful if he showed benchmarks on the most popular PC spec on Steam.
Don't know about SH2 (probably the usual PS5 vs PC using UE5) but GoW Ragnarok on PC is getting shockingly trounced by the PS5 version. Probably a new record of PS5 outperforming PC. Close to the metal advantage is real with that one.
 
Last edited:

sachos

Member
So all I'm really asking for is this at 60fps, which should be feasible looking at 4070 DLSS Balanced High Settings + RT performance above 60fps.
I don't know man, look at the 7700 XT min fps performance at native 1080p with Epic settings and RT on: https://www.techpowerup.com/review/silent-hill-2-fps-performance-benchmark/5.html
Even dropping the settings to High or Medium would not put you at 68 fps, which is what you need for a locked 4K60 with PSSR Performance, taking into account the upscaling ms cost.
Only way I see it is if the RDNA4 RT is much better than what the 7700 XT can do.
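The 68 fps figure falls out of simple frame-budget arithmetic, assuming roughly a 2ms upscaler cost (the exact PSSR cost is an assumption here):

```python
# Back-of-envelope behind the "68 fps" target (assumed ~2 ms PSSR cost).
TARGET_FPS = 60
FRAME_BUDGET_MS = 1000.0 / TARGET_FPS  # 16.67 ms total per frame

UPSCALE_COST_MS = 2.0  # assumed upscaler frame-time cost

# Whatever the upscaler eats must come out of the rendering budget.
render_budget_ms = FRAME_BUDGET_MS - UPSCALE_COST_MS  # 14.67 ms
required_base_fps = 1000.0 / render_budget_ms
print(round(required_base_fps))  # -> 68
```

In other words, to lock 60fps with the upscaler in the frame, the GPU has to render the base image as if it were targeting roughly 68fps.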
 

Three

Gold Member
I'm arguing that your point was not only completely useless but also missed the point of the discussion.
My point wasn't completely useless. It just pointed out that you can't have DLSS for better image quality without a latency cost associated with it, in case people thought DLSS was latency-free from your post.
Do you know why I said this? Because Vick quoted a dev stating that Reflex was developed as a response to the latency introduced by DLSS. Given the context that DLSS in practical terms reduces latency, why would you then join the discussion to point to the 1-4ms DLSS takes to process as if it was relevant to NVIDIA developing Reflex?
It wasn't relevant to that; that's why. I didn't say it was why Reflex was created. It was just a small correction about the latency cost of DLSS.
You're not only nitpicky but your points were poorly constructed until you clarified.
Make up your fucking mind at least:
Why do you think everyone is utterly confused by what you say until you clarify and we all go, "Oh, yeah, I see what you mean."
 

proandrad

Member
PC gaming is getting worse every year for triple-A games. You can't brute-force your way out of stuttering. Developers can't even blame it on all the different variations of hardware when every PC gets stuttering. Either they aren't putting the time and effort into these PC ports, or Unreal has made it too easy to make games and new developers never learn the more advanced knowledge to fix these issues.
 

SpokkX

Member
Is there ANY example of a 100% stutter free ue5 game?

Hellblade 2 ran great but also had traversal stutters

Is this really an engine issue by this point?
 

rodrigolfp

Haptic Gamepads 4 Life
PC gaming is getting worse every year for triple-A games. You can't brute-force your way out of stuttering. Developers can't even blame it on all the different variations of hardware when every PC gets stuttering. Either they aren't putting the time and effort into these PC ports, or Unreal has made it too easy to make games and new developers never learn the more advanced knowledge to fix these issues.
More games with some problems is infinitely better than no more games with zero problems.
 

Killer8

Member
But, although at 30fps, PS5 version has RT and according to Alex Epic settings. Ray Reconstruction and the much better HW reflections are the real clear step on PC.
Well any reflection setting actually looks better than any on PS5, which is funny.

However, despite what some people would try to tell you, in reality it doesn't appear stutter is "just the same or worse on PS5".

Here a 7800X3D + 4090 using DLSS Performance drops to 52 because of traversal stutter, when the same traversal on PS5 stays at a locked 60fps even in the presence of enemies, or drops a single frame to 59 and in the worst cases to 58.

DXdH9y8.gif






Would be nice to get 1:1, but as is, it looks like stutter isn't close to being a real issue on PS5. In the first video there's that guardrail zone, which is the most covered area to show stutter on PC, and it stays at a locked 60fps on PS5, or at worst 59 for a fraction of a second.
All Performance Mode drops, like the beginning woods walk or the multiple-enemies pool fight, appear to be related to insufficient computational power only.


I think Alex was desperate to not let PS5 have the win:

"Reading online though, I see a lot of misconceptions that the situation is much better on PS5 and unfortunately, I can say it is not"

>goes on to show in his own video how the stutter situation is literally better on PS5
 

analog_future

Resident Crybaby
Just want to chime in to say that installing the newest UltraPlus version (1.0.2), the PureDark DLSS Frame Gen mod, and the Unlocked FPS Cutscene mod will make the PC experience damn near perfect.


I'm now running the game maxed out, 4K DLSS Quality, and the game now runs at a virtually locked 100+ fps and stutters have been pretty much completely eliminated on my 7800X3D/4090 rig. Wonderful experience now.
 
Last edited:

Vick

Gold Member
Please, tests you posted are done with 60Hz refresh rate, majority of PC players don't use that (and don't remember what it is). Even having the same framerate, jump from 60Hz to 120Hz refresh rate reduces input lag.
Happy Nicolas Cage GIF


We are now refuting a comparison paid for by Nvidia (and using shield equip as reference in GoW), made to showcase native input latency on equal settings… because of 120Hz monitors, claiming even the same framerate would see a reduction in input lag?

Meaning the elementary notion that the same console versions with lower native input latency would benefit just the same by switching to 120Hz somehow escaped your thought process?

I actually had a little time and I did it myself. Fucking GAF..
Not even going to bother using notoriously instantaneous responsive games like Astro Bot or Spider-Man, but two third person games I already had installed that once released on PC sparked lots of complaints about their input latency:

giiMdnr.gif


Slowed down 50%:

4gpb2xQ.gif


Slowed down 30%:

BSJDEn4.gif


sg5a5oH.gif


ImYQvvl.gif


I rigorously always and only use the same Calibrated mode I use for everything, BD, games, and compressed material. Never settled for Game modes and crap like that and never will.
Also not sure if DualSense being wireless made a difference, but don't give a literal shit either way.
 

Bojji

Member
Happy Nicolas Cage GIF


We are now refuting a comparison paid for by Nvidia (and using shield equip as reference in GoW), made to showcase native input latency on equal settings… because of 120Hz monitors, claiming even the same framerate would see a reduction in input lag?

Meaning the elementary notion that the same console versions with lower native input latency would benefit just the same by switching to 120Hz somehow escaped your thought process?


I actually had a little time and I did it myself. Fucking GAF..
Not even going to bother using notoriously instantaneous responsive games like Astro Bot or Spider-Man, but two third person games I already had installed that once released on PC sparked lots of complaints about their input latency:

giiMdnr.gif


Slowed down 50%:

4gpb2xQ.gif


Slowed down 30%:

BSJDEn4.gif


sg5a5oH.gif


ImYQvvl.gif


I rigorously always and only use the same Calibrated mode I use for everything, BD, games, and compressed material. Never settled for Game modes and crap like that and never will.
Also not sure if DualSense being wireless made a difference, but don't give a literal shit either way.

YES, 120Hz refresh rate will lower input lag even with the same frame rate, on consoles too. You can test it with a game like the Uncharted 4 remaster: you can have the same locked 60fps with 60Hz refresh and with 120Hz refresh, and the 120Hz refresh will have less input lag. Why?

1. You run at the native refresh rate of the panel (that means less input lag).
2. The game is not hitting the vsync limit like at 60Hz, which causes quite a big input lag increase.

Vast majority of PS5 games are stuck with 60hz, vsync experience. I always choose 120hz refresh modes when available.
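Point 2 can be sketched with a toy model (illustrative; real pipelines add buffering on top): with vsync on, a finished frame waits for the next refresh tick, and at 120Hz those ticks come twice as often, so the average queueing delay halves even at the same 60fps.

```python
# Average extra wait a finished frame spends queued for the next vsync,
# assuming frames finish at uniformly random points in the refresh interval.
def avg_vsync_wait_ms(refresh_hz):
    interval_ms = 1000.0 / refresh_hz
    return interval_ms / 2.0  # mean of a uniform wait over one interval

wait_60 = avg_vsync_wait_ms(60)    # ~8.3 ms average added wait at 60 Hz
wait_120 = avg_vsync_wait_ms(120)  # ~4.2 ms -- half the delay at 120 Hz
assert wait_120 == wait_60 / 2
```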
 
Last edited:

Vick

Gold Member
YES, 120hz refresh rate will lower input lag even with the same frame rate on consoles too.
Exactly. So how would using a 120Hz monitor somehow invalidate that graph showing PC games having natively more input latency than PS5 ones?
 

Bojji

Member
Exactly. So how would using a 120Hz monitor somehow invalidate that graph showing PC games having natively more input latency than PS5 ones?

This would be more like the normal experience on PC; not many people use 60Hz. Games would also not hit the vsync wall when displayed at a higher refresh rate.

1080p is the most common resolution on PC (Steam surveys), but monitors are 120/144/240+ Hz.
 
Last edited:

Kangx

Member from Brazile
I dont know man, look at the 7700XT min fps performance on native 1080p with Epic settings and RT On https://www.techpowerup.com/review/silent-hill-2-fps-performance-benchmark/5.html
Even dropping the settings to High or Med would not put you at 68 FPS wich is what you need for locked 4K60 PSSR Performance taking into account the upsclaing ms cost.
Only way i see it is if the RDNA4 RT is much better than what the 7700XT can do.
I think it is possible. Take a look at this video.



It is all about optimization. The Epic preset will tank AMD GPUs; lowering shadows and shaders will save tons of fps. Also, the beginning forest area is pretty heavy; it's not really indicative of how the game runs in most areas. Indoors, at lower res around 1080p, it can gain 15 to 20 fps compared to the forest area. I see why Bloober optimized the way they did for the PS5.

Go to 13:30 in the video: at 4K FSR Quality, the 6800 XT averages around 58fps in the forest area, so it should be 60fps all the time in the city and indoors. So, I can see the Pro going for DRS from Performance to Balanced with 4K PSSR. I think it would not lock the fps, especially at the beginning, but it will be a lot better than the PS5.
 

Shin-Ra

Junior Member
This would be more like the normal experience on PC; not many people use 60Hz. Games would also not hit the vsync wall when displayed at a higher refresh rate.

1080p is the most common resolution on PC (Steam surveys), but monitors are 120/144/240+ Hz.
What are the steam survey results for connected monitor max refresh and actual/average game output refresh? Does Steam even record that data?
 

Shin-Ra

Junior Member
I have no idea, they only show resolutions.
Why are you so sure a majority of PC gamers don’t play this sort of game with 60fps or even lower as a target?

Please, tests you posted are done with 60Hz refresh rate, majority of PC players don't use that (and don't remember what it is). Even having the same framerate, jump from 60Hz to 120Hz refresh rate reduces input lag.



I like that you ignore almost 3x latency reduction here with reflex in some games. And as I said above, most PC players use refresh rate higher than 60Hz.
This would be more like the normal experience on PC; not many people use 60Hz. Games would also not hit the vsync wall when displayed at a higher refresh rate.

1080p is the most common resolution on PC (Steam surveys), but monitors are 120/144/240+ Hz.
 

Bojji

Member
Why are you so sure a majority of PC gamers don’t play this sort of game with 60fps or even lower as a target?

I was talking about refresh rate alone, not fps.

First 120Hz monitors appeared ~14 years ago, I don't think many people buy 1080p 60hz monitors at this point (I'm not even sure they make them).
 

Killer8

Member
Just want to chime in to say that installing the newest UltraPlus version (1.0.2), the PureDark DLSS Frame Gen mod, and the Unlocked FPS Cutscene mod will make the PC experience damn near perfect.


I'm now running the game maxed out, 4K DLSS Quality, and the game now runs at a virtually locked 100+ fps and stutters have been pretty much completely eliminated on my 7800X3D/4090 rig. Wonderful experience now.

If this really does all of this I will double dip to play on my 4070.
 

SKYF@ll

Member
Reducing image quality to reach 120Hz with Vsync off is the most effective way to reduce input lag.
However, UE5 games are too heavy and the frame rate is too low even on expensive PCs.
nmN4rXB.jpg
 

Bojji

Member
rodrigolfp, Bojji, Thebonehead, owandeseis, Mister Wolf, Gaiff

I can't believe you guys made me hold a phone between the legs like a whore to record my hands playing, only to completely disappear from the Thread..

Angry What Is Love GIF by Mike O'Hearn


I'll now spam these Gifs under every single one of rodrigolfp's mentions of input lag, till the end of time.

You did some good work (y)

There should be more options to reduce input lag on PS5, system ability to run all games in 120Hz would be a great start (Xbox has it...) because developers clearly don't give a fuck.
 
rodrigolfp, Bojji, Thebonehead, owandeseis, Mister Wolf, Gaiff

I can't believe you guys made me hold a phone between the legs like a whore to record my hands playing, only to completely disappear from the Thread..

Angry What Is Love GIF by Mike O'Hearn


I'll now spam these Gifs under every single one of rodrigolfp's mentions of input lag, till the end of time.
Not sure what you are talking about; I've been away and haven't followed this thread. Is it the gifs you posted? Yes, that's what 80-130ms latency is. It's noticeable, especially in the jumping and the shooting gifs, that the animations don't match the button presses. I never said it was unplayable or anything near that; I'd be totally fine playing any single-player game like that, except maybe stuff like Formula 1 or FIFA as I said before, but I still play it since the console version is superior to the PC one.



Here you can see another video of shooting with different input lag; only the 30ms one feels instant even in the slow-motion part, but absolutely none of them feels bad to play.

Same goes for a game like League of Legends: playing with 70ms lag (plus the 30-40ms from your computer) is fine, but if you get to play at 20ms instead you'll notice a big difference and will think people playing with that ping are kinda cheating. It simply depends on the game.
 
PC gaming is getting worse every year for triple-A games. You can't brute-force your way out of stuttering. Developers can't even blame it on all the different variations of hardware when every PC gets stuttering. Either they aren't putting the time and effort into these PC ports, or Unreal has made it too easy to make games and new developers never learn the more advanced knowledge to fix these issues.

I thought it was already acknowledged to be a UE5 issue? Do we still not know?
 

Vick

Gold Member
Not sure what you are talking about; I've been away and haven't followed this thread. Is it the gifs you posted? Yes, that's what 80-130ms latency is. It's noticeable, especially in the jumping and the shooting gifs, that the animations don't match the button presses
Well, I suppose you're kind of gifted then. Here's the same "jump" (it's a roll; an upgraded, longer, direction-controllable mid-animation roll) gif advancing frame by frame, and between the O press and the rolling animation (and related in-game camera boost forward) there appears to be a single frame:

5cqTjbo.gif


Here again at regular speed:

sg5a5oH.gif


And these aren't games praised for their low input latency, like Insomniac ones for instance, PC ports of both games faced plenty of complaints.

i never said it was unplayable or anything near that, i'd be totally fine with playing any single player like that except maybe stuff like Formula 1, or Fifa as i said before, but i still play it since console version is superior to PC one.
I know it's OT, like everything else at this point, but is this really still the case?
Haven't played a FIFA in about a decade I'd reckon, but I clearly remember at the time the "Next-Gen" versions were only on console while for some stupid reason PC was only getting the previous-gen one. Can't believe they're still doing this crap… or is it inferior this time for some other, minor reason?



Here you can see another video of shooting with different input lag, only the 30ms one feels instant even in the slow motion part, but absolutely none feels bad to play.

Same goes for a game like League of Legends, playing with 70ms lag (plus the 30-40ms from your computer) is fine, but if you get to play at 20ms instead you'll notice a big difference and will think people playing with that ping is kinda cheating, it simply depends on the game.

Honestly, the only one I could tell was not exactly instantaneous at normal speed, at least in the first video, was the lower left one.
 
Well I suppose you're kind of gifted then, here's the same "jump" (it's a roll, an upgraded, longer and direction-controllable mid-animation roll) Gif advancing frame by frame, and between the O pressing and the rolling animation (and related in-game camera boost forward) there appears to be a single frame:



And these aren't games praised for their low input latency, like Insomniac ones for instance, PC ports of both games faced plenty of complaints.


I know it's OT, like everything else at this point, but is this really still the case?
Haven't played a FIFA in about a decade I'd reckon, but I clearly remember at the time the "Next-Gen" versions were only on console while for some stupid reason PC was only getting the previous-gen one. Can't believe they're still doing this crap… or is it inferior this time for some other, minor reason?


Honestly only one I could notice was not exactly instantaneous in normal speed, at least in the first video, was the lower left.
I checked the first gif where you move, which is the only one where I can really tell when you did the movement/pressed the button, and I can only count 5 frames, which would mean just 80ms of latency. The video seems recorded at 30 FPS though, so I have no clue about the impact and how accurate this is, but yeah, very good, and as you said, it's a third-person action game, which is not a great game to judge by.

About FIFA, I know it's the same next-gen version, but the PC one has a separate market. I play it ratkid mode with the FUT cards, and pretty much 80%+ of the game's sales are on PlayStation, which makes me play it on console since there are many, many more cards for sale there.

And yes, I'm sure basically 100% of single-player gamers wouldn't notice the difference between 50ms and 100ms in a game that has fast reactions, and pretty much every player who hasn't played a lot of competitive games. Meanwhile, when some pro players get an extra 20-30ms in a fighting game, League of Legends, or fast-paced shooters, they think it's simply unplayable; that's why many of them just care about having absurdly high FPS.

It just depends, as I said, but yes, console versions at 60 FPS are pretty much perfect already, while PC ones are just better because of the VSync removal and, if you can afford it, the extra FPS, but this has diminishing returns so the difference is small.
 

Gaiff

SBI’s Resident Gaslighter
rodrigolfp, Bojji, Thebonehead, owandeseis, Mister Wolf, Gaiff

I can't believe you guys made me hold a phone between the legs like a whore to record my hands playing, only to completely disappear from the Thread..

Angry What Is Love GIF by Mike O'Hearn


I'll now spam these Gifs under every single one of rodrigolfp's mentions of input lag, till the end of time.
It really wasn’t a big deal to me lol. You were the one who was the most invested in that discussion. I chimed in a bit and then just let you guys go at it.
 

Vick

Gold Member
About Fifa, i know it's the same nextgen version, but PC one has a separated market, i play it ratkid mode with the FUT cards, and pretty much 80%+ of the game sales are on Playstation , which makes me play it on console since there's many many more cards for sale there.
Oh, I see. I thought it was some technical reasons like in the past.

It just depends as i said, but yes, console versions at 60 FPS are pretty much perfect already
Well, surely not on most LCDs I've seen, where 60fps not only feels much worse than 30fps does on my panel, but looks infinitely blurrier as well.
If I was forced to play on any other TV/Monitor we have in the house I would unironically stop playing. One of the reasons I would never consider playing while on vacation.

It really wasn’t a big deal to me lol. You were the one who was the most invested in that discussion. I chimed in a bit and then just let you guys go at it.
I know, I know. Just pulling you guys' legs.

200.gif
 