> Is that supposed to look great? Or were you being sarcastic? Because that doesn't look very next-gen at all.

Agreed. PCs couldn't possibly hope to run all that next-gen goodness.
> It is what it is. Consoles always have their limitations, be it resolution, performance, controllers, etc. People choose to play on PC to avoid those problems...

See, I don't think this is right either. There's this bizarre schism in the gaming space where it's a mentality of my platform vs. the other platforms, but it shouldn't be the case. Console gamers shouldn't shit on PC gamers for this botched job, and PC gamers shouldn't have to retaliate. We should be on the same side and shit on the publishers for delivering a shoddy product. Our side is the one of the consumers, not PC/console or what have you.
Doesn't seem as bad as reported:
> No, the difference between the 4 and 8GB texture settings is the LOD pop-in, not the actual textures themselves. I see pop-in all the time now that I'm forced to settle for 4GB, when I was running the game just fine at native 4K 60 fps with hair strands on high a couple of weeks ago.

In RE4 there was no difference between the 3-4GB and 8GB texture settings; people just wanted to max out textures even if their system was weak. Not my fault if people want to play everything on ultra even when turning down a setting has no visual quality loss and can fix problems. I had a crash as well when I was putting everything on ultra; I just needed to turn one or two settings down a notch to not have any problem.

In Hogwarts you just needed to turn down RT to solve most of the problems; I clearly remember you saying the same thing in the OT. The game was faaaaaaar from unplayable, same for Dead Space. Callisto on PS5 had more stuttering than those two combined.

For every one of those bad ports you named, which weren't even broken for everyone, we had Atomic Heart, Dead Island 2, Forspoken and some others that run well.

And I'll tell you a secret: stuttering in most cases is not related to how good your GPU is. If a game has shader-compilation stutter, it is the same for everyone (see the sketch after this post), so I experienced the same stuttering as you in most cases.

I think Wild Hearts was way more broken than RE4 and the DS remake tbh; that one was really unplayable on PC, no matter how strong your PC is.

Is it a good year for PC gamers? No, but it's hardly the hell on earth that some of you want to promote.
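To make the shader-compilation point concrete, here's a minimal D3D12-style sketch; the names (`GetPso`, `PrecompileAll`, `g_psoCache`) are made up for illustration, not from any actual game. `CreateGraphicsPipelineState` runs the driver's shader compiler on the CPU, so if a game creates a pipeline the first time a material is drawn, the render thread stalls for the same tens of milliseconds on every GPU tier; a faster graphics card doesn't help.

```cpp
// Sketch only: why shader/PSO compilation stutter is CPU/driver-bound.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdint>
#include <unordered_map>
#include <utility>
#include <vector>

using Microsoft::WRL::ComPtr;

// Hypothetical PSO cache keyed by material permutation.
std::unordered_map<uint64_t, ComPtr<ID3D12PipelineState>> g_psoCache;

// Lazy path: hitches on the first draw of each material, on any GPU.
ID3D12PipelineState* GetPso(ID3D12Device* device, uint64_t key,
                            const D3D12_GRAPHICS_PIPELINE_STATE_DESC& desc)
{
    auto it = g_psoCache.find(key);
    if (it == g_psoCache.end()) {
        ComPtr<ID3D12PipelineState> pso;
        // The driver compiles the shaders HERE, on the CPU, right now.
        // Error handling omitted for brevity.
        device->CreateGraphicsPipelineState(&desc, IID_PPV_ARGS(&pso));
        it = g_psoCache.emplace(key, std::move(pso)).first;
    }
    return it->second.Get();
}

// The fix games usually patch in: walk every known permutation behind a
// loading screen so the cache is warm before the first frame renders.
void PrecompileAll(
    ID3D12Device* device,
    const std::vector<std::pair<uint64_t,
        D3D12_GRAPHICS_PIPELINE_STATE_DESC>>& permutations)
{
    for (const auto& [key, desc] : permutations)
        GetPso(device, key, desc);  // pays the compile cost up front
}
```

That's why a precompilation step at boot fixes the stutter for everyone at once: it moves the same fixed CPU cost off the render thread, regardless of which GPU you own.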
> Fixed that for you

I'm just going to stop you right there, because you have no clue what you're talking about in regards to this dogshit port.
In this thread:
Corporate-loving Nvidia fans, triggered and desperate that a pre-release version of a game
> Every AMD sponsored game has bad performance. The moment they start getting bundled with AMD parts is like a kiss of death:
> Callisto Protocol
> Company of Heroes 3
> The Last of Us
> Redfall (loading)
> Dead Island 2 (loading)

Yep, I've said it before: AMD-sponsored games end up having cancer performance on PC. Now Jedi Survivor. Dead Island 2 is the exception so far, but that game is also not that demanding afaik.
> See, I don't think this is right either. There's this bizarre schism in the gaming space where it's a mentality of my platform vs. the other platforms, but it shouldn't be the case. [...] Our side is the one of the consumers, not PC/console or what have you.

Yes, but how many can really have this mentality on forums, 20-30%? Maybe the ones that simply play games and don't comment anywhere; they can enjoy games and have no problem with others enjoying games on different platforms.
Even the PS5 version has stutter
Sony GAF where is your I/O god now?
> You playing Redfall with people or solo?

Nope, I still have to finish DI2 and I chose to give priority to Redfall. Too many games between April and May.

It's an Arkane game, so I'm going solo.
> its fun to watch them get humbled though, every single ported game recently has been a mess

Ah yes, because Dead Space and RE4 had issues on PC. No wait, that was PS5...
Let me be clear: for most of the time I spent with the game before publishing the review, the optimization on PlayStation 5 was a disaster. The graphical "performance" mode had little to do with performance, and if I had to estimate the amount of time the game hit 60 FPS, it would be maybe 5%; the rest hovers, by eye, around 40 frames per second. At the last moment, however, we received access to the "day 0" patch, which actually improved the performance of the game. It's still not perfect, though.
Looks like another VRS + FSR 2.0 tragedy... I hate this pixelated break-up crap.
> They've not been told that if PC players downgrade settings to PS5 levels, games not only avoid most of the aforementioned issues but run better on mid-range hardware than on the PS5, as long as the port isn't an absolute broken mess like TLOU.

Learn how to use paragraphs when you post your utter nonsense.
Even better: don't post at all
Why do all these single-platform PS5 warriors insist on posting in PC threads?
3080 with patch
"With DirectStorage PCs are hitting 20GB/s."
Those maximum speeds depend not only on the speed of the PCI-E interface, but also the speed of the NME-drive and your graphics card. A 4090 is going to perform better than a 1660 card here since texture decompression happens on the graphics card. I did the DirectStorage 1. 1 benchmark on my PC (with an RTX 3080) and I got great results (16 GB/sec) but I would need to have a 4090 to hit the max.
So the truth would be that "Absolute high-end PCs are hitting 20GB/s bandwidth" but it all depends on your particular configuration. Whereas all PS5s can reach that target.
Another thing: since DirectStorage uses the graphics card for texture decompression, this will have an effect on the graphics performance. That's not the case on consoles that have dedicated IO chips.
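For anyone curious what that looks like in practice, here's a rough sketch of a DirectStorage 1.1 request, not any shipping game's actual loader; the file path and sizes are placeholders, and error handling plus the fence wait are omitted. The `CompressionFormat` field is what routes GDeflate decompression onto the GPU, which is exactly why loading competes with rendering on PC while consoles hand the same work to a dedicated unit.

```cpp
// Sketch: enqueue a GPU-decompressed read with DirectStorage 1.1.
#include <dstorage.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdint>

using Microsoft::WRL::ComPtr;

void LoadCompressedAsset(ID3D12Device* device, ID3D12Resource* destBuffer,
                         uint32_t compressedSize, uint32_t uncompressedSize)
{
    ComPtr<IDStorageFactory> factory;
    DStorageGetFactory(IID_PPV_ARGS(&factory));

    ComPtr<IDStorageFile> file;
    // Placeholder path for illustration.
    factory->OpenFile(L"assets/textures.gdeflate", IID_PPV_ARGS(&file));

    // A queue bound to the D3D12 device: the GPU consumes these requests.
    DSTORAGE_QUEUE_DESC queueDesc = {};
    queueDesc.SourceType = DSTORAGE_REQUEST_SOURCE_FILE;
    queueDesc.Capacity   = DSTORAGE_MAX_QUEUE_CAPACITY;
    queueDesc.Priority   = DSTORAGE_PRIORITY_NORMAL;
    queueDesc.Device     = device;
    ComPtr<IDStorageQueue> queue;
    factory->CreateQueue(&queueDesc, IID_PPV_ARGS(&queue));

    DSTORAGE_REQUEST request = {};
    request.Options.SourceType      = DSTORAGE_REQUEST_SOURCE_FILE;
    request.Options.DestinationType = DSTORAGE_REQUEST_DESTINATION_BUFFER;
    // This line is the whole story: GDeflate is decompressed by the GPU
    // in a compute pass, so asset loading shares the GPU with rendering.
    request.Options.CompressionFormat = DSTORAGE_COMPRESSION_FORMAT_GDEFLATE;
    request.Source.File.Source = file.Get();
    request.Source.File.Offset = 0;
    request.Source.File.Size   = compressedSize;
    request.UncompressedSize   = uncompressedSize;
    request.Destination.Buffer.Resource = destBuffer;
    request.Destination.Buffer.Offset   = 0;
    request.Destination.Buffer.Size     = uncompressedSize;

    queue->EnqueueRequest(&request);
    queue->Submit();  // in real code: EnqueueSignal() a fence and wait on it
}
```

So the benchmark numbers scale with both the SSD and the GPU doing the decompression, which is why a 3080 tops out below a 4090 on the same drive.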
> Why do all these single-platform PS5 warriors insist on posting in PC threads?

They want to know if they paid $70 to play an inferior game.
This is the last thing I wanted to share, but somehow CPU performance has gotten *worse* with the day-1 patch. I have no idea why. My best guess, though I never bothered to check, is that the pre-release build didn't incorporate Denuvo and the day-1 patch did; that would add some extra CPU overhead. Regardless of the reason, the patch didn't help.
> Sony GAF where is your I/O god now?

Using/owning the "best" piano ever doesn't make you Mozart, and using/owning the best [insert_tool] doesn't make you a good [insert_job] ¯\_(ツ)_/¯
Ooouf, those RT effects.
Thanks AMD!
> Usually at this point PC CPUs were anywhere from 5-10x faster (going all the way back to PS1), so only 2x is a drastic slowing of pace. GPU delta is a bit harder to directly compare since architectures didn't line up so neatly until PS4, but even there, 3x was far more accessible than 4x the price of a console for the GPU alone.

I said it in a reply post, but those estimates seem very conservative. My 5600X could run A Plague Tale: Requiem at more than 80 fps before being bottlenecked by the GPU at 1080p; at 1440p that CPU plus an RX 6700 XT ran the game from around 60 to over 70 fps. That's a game said to be CPU-bottlenecked on consoles, which is why it's limited to 30-40 fps at 1440p upscaled to 4K there. With Forspoken the story is basically the same; it just loads about 80% slower on my PC compared to PS5 (but that's 3 seconds on PS5 vs. 5 or so on my PC). I'm bottlenecked by PCIe 3.0, but people with a better GPU and PCIe 4.0 even beat the PS5 in that regard, and it has the most powerful I/O system of the consoles.
> Many people already know that, 4060 Ti/4060/4050 are literally DOA GPUs.

The 4050 could be excused, IMO. It's a low-end GPU (watch Nvidia charge like $350 for it), so if it has 8GB, that's acceptable.
> It's laziness, with a big side of not giving a fuck about the product you're putting out there for customers to spend their hard-earned $70 on. It's a "we'll fix it later" mentality. And by the way, the console versions suck just as well, so it looks like priorities went right out the window. Maybe delay these shit games to please paying customers instead of shareholders?

But that's not a developer call. It would be their publisher's, which is EA in this case.
> BTW, haven't seen new reports on the PC version. How is it running for people here in GAF? Or did basically everybody decide to hold off, lol.

The game runs fine on a 5800X and 3080, so false alarm. Running 5120x1440 (32:9), max graphics, ray tracing on, widest FOV, FSR2 off, vsync off.
Ok, first impressions. Not bad. My specs:
- 3080 + 32 GB RAM + i7-1700k
- 1440p, Epic settings, no ray tracing: 55-65 fps
- VRAM usage: 7GB

Quite a few micro stutters, but nothing like Hogwarts or other games with shader-compilation stutter. The game starts with a gigantic stutter, which made me laugh, but other than that it's been fine. Performance doesn't fluctuate too much. Some cutscenes do go up to 120 fps, but gameplay mostly hovers around 55-65 fps in the linear intro.

Aside from the micro stutters every now and then, I haven't seen any glitches, bugs, or crashes. GPU utilization is roughly 75-99%; definitely CPU-bound in some areas. I didn't get the HDR bug GamingTech was talking about, so maybe the day one patch fixed all these issues.

The game feels really, really good. Kind of a slow opening, but the combat feels tight; visuals remind me of Ratchet, though not as pretty of course. I really like all the new changes to the stances and NPCs.

P.S. The EA app uses 600 MB of VRAM by itself, though I did see it go down to 200 MB after I quit out of the game. No idea why these apps need dedicated VRAM.
Yep. Combat feels weighty yet responsive and far more brutal than the first game. I legit said wow when the game switched stances on me and I cut through half a dozen stormtroopers.

4090 + Ryzen 7700 here. Same experience as you. I am playing 4K, max everything with ray tracing, and even though I didn't look at my FPS, it felt pretty smooth except for the micro stutters. I wish they weren't there, but it's a very minor gripe. The game looks awesome and the HDR really pops on the OLED. The combat is good and the dismemberment is awesome. I finished a quick battle and there were arms and legs all over the place. I cut off another enemy's arm, and as I did you could see the bright red burn mark that reminded me of the old cigarette lighters in cars that would glow red on the ends.
> RTX 3070's 8GB VRAM issue is the same garbage tactics as expected from NVIDIA's GTX 970's fake 0.5 GB VRAM.

Do some people read the stupid shit that they tap before posting? PC gamers are paying customers, just like you are. Devs shit out a garbage port, and instead of going after the ones responsible, you go "hur hur, serves you right PC gamers"? Shit on NVIDIA for cheating you out of VRAM, but mocking PC gamers for what is essentially a botched port that is solely the fault of the company is utterly moronic.
> 2GB VRAM PC GPUs were rendered obsolete with the PS4 / XBO era.

No, it's just incompetent developers who don't give a shit.