> It takes too long for devs to optimize for PC, so they rely on patches instead of delaying the game.
> If they don't patch this on day one, it deserves every bit of the review bombing; this is not acceptable.
> Where is the goddamn integrity, PC devs? Do you enjoy releasing stuttering shit we have to play on low?

You are expecting integrity from devs who are trying to pass off PS4-quality visuals as next-gen games just so they can charge $70 for them?
> What is the latency of I/O access on a PS5 vs a PC? And how does it impact this specific game?

Sorry in advance for a lengthy response.
> Wait, is this a meme now or are people being serious?

It is serious. See separate reply.
> It takes too long for devs to optimize for PC, so they rely on patches instead of delaying the game.
> If they don't patch this on day one, it deserves every bit of the review bombing; this is not acceptable.
> Where is the goddamn integrity, PC devs? Do you enjoy releasing stuttering shit we have to play on low?

If only playing on low actually helped. I've tried turning everything down to low in Fallen Order, playing at the lowest resolution the game allowed and limiting it to 30 fps, and it had zero effect on the stuttering. There's no way to get around the software being poor.
> I expected better from them honestly.
> A hail mary accusation, but does the game have Denuvo?

Yes, it does. It sure doesn't help, but Denuvo, bad as it is, can't be solely responsible for that mess.
No, this trend is chiefly an issue with DX12 and UE4, which is ubiquitous. It started long before the new consoles were even out. The stuttering is mainly due to shader compilation.
Sorry in advance for a lengthy response.
The trend you are all seeing, where more and more games have stuttering issues and frame drops on PC that you do not see on the consoles, comes down primarily to the following.
In a CGI movie, a single character model can have up to 100 textures, and high-resolution textures are roughly 50-100 MB each uncompressed. In other words, a single character model carries 5-10 GB of texture data under optimal conditions. Games obviously use less. However, in the past there has been a hard cap on how many textures a game could utilize and at what resolution. Since consoles (as well as most PCs) had mechanical drives, all textures were mainly loaded when a 'level' loaded; nothing was loaded on the fly. On PC this resulted in a texture budget of around 4 GB (based on the median graphics card in use).
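The arithmetic behind those figures is easy to sketch. This uses the rough numbers quoted above, not measurements:

```python
# Back-of-the-envelope texture budget, using the rough figures above
# (~100 textures per film-quality character, ~50-100 MB each uncompressed).
textures_per_character = 100
texture_size_mb_low, texture_size_mb_high = 50, 100

low_gb = textures_per_character * texture_size_mb_low / 1000
high_gb = textures_per_character * texture_size_mb_high / 1000
print(f"Per character: {low_gb:.0f}-{high_gb:.0f} GB of texture data")
# -> Per character: 5-10 GB of texture data
# A single film-quality character alone would blow past the ~4 GB VRAM
# budget of the median last-generation graphics card.
```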
With the new generation of consoles having SSDs, they can load textures on the fly, so most games have started to do that; as a consequence, the number of textures in use has increased dramatically, as has their resolution.
On a console this is very easy, since RAM and VRAM are the same pool and the GPU is allowed to pull data straight from the SSD without checks and balances. On a PC the GPU cannot access RAM or the SSD directly at all, since that would be a security risk. The GPU has to go through the kernel (with the associated driver overhead) and ask for a file to be fetched from RAM or the SSD and loaded into VRAM before it can be accessed. This introduces a lot of latency, and fairly dramatic frame drops if a texture is not where it needs to be to build a frame. DirectStorage tries to make this faster and smoother, but it is still a far cry from what both the XSX and the PS5 can achieve, due to the limitations of the PC architecture.
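As a rough illustration of that path difference, here is a toy latency model. All numbers, including the kernel round-trip cost, are invented placeholders; this sketches the argument, it does not measure anything:

```python
# Toy model: a console GPU with unified memory pulls a texture straight
# from the SSD in one copy; a PC has to go through the kernel/driver and
# make two copies (SSD -> system RAM, then RAM -> VRAM over PCIe).
# All timing numbers below are invented for illustration only.

def console_fetch_us(size_mb, ssd_gb_per_s=5.5):
    # One direct copy: SSD -> unified memory.
    return size_mb / (ssd_gb_per_s * 1000) * 1e6

def pc_fetch_us(size_mb, ssd_gb_per_s=5.5, ram_gb_per_s=57.6,
                kernel_round_trip_us=50.0):
    # Same raw SSD speed assumed, to isolate the path overhead:
    # two kernel round trips plus an extra RAM -> VRAM copy.
    ssd_to_ram_us = size_mb / (ssd_gb_per_s * 1000) * 1e6
    ram_to_vram_us = size_mb / (ram_gb_per_s * 1000) * 1e6
    return 2 * kernel_round_trip_us + ssd_to_ram_us + ram_to_vram_us

size_mb = 64  # one high-resolution texture
print(f"console: {console_fetch_us(size_mb):.0f} us")
print(f"pc:      {pc_fetch_us(size_mb):.0f} us")
```

Even with identical raw SSD speeds, the PC path comes out slower here purely from the extra copy and the kernel transitions; DirectStorage narrows that gap but does not remove the extra hop.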
This tweet still holds, despite a lot of PC lovers fighting its implications (and I love my PC; I have a custom loop, a 4090, etc.):
This game has very poor texture streaming on the PS5, so it isn't using the PS5 I/O or Cerny's SSD solution.

Here you can see it taking 15 seconds to load textures. The rocks in the distance literally look like PS3- or PS2-quality rocks until they spawn in after 15 seconds.

This isn't like Hogwarts, RE4, or TLOU. This game is poorly optimized even on the PS5.
It's okay, you have the best-looking game in the industry, Elden Ring, to play on PC instead.
> Keep enjoying bad IQ, framerates, and input lag while paying $70.

Keep enjoying paying thousands of dollars for the privilege of playing substandard console ports.
This situation is becoming untenable now. I have a 4090, and I'm having to refuse to buy the new games that are supposed to stretch its legs.
Where and how does this end?
> It might end if game reviewers stopped having such low standards and stopped crawling so far up the rectum of publishers that they start seeing the light on the other end again.
> A game like this on PC deserves nothing more than a 5/10 at best.

Yeah, something is wrong with the process: they are reviewing the console version, and that's the score people see.
> Keep enjoying paying thousands of dollars for the privilege of playing substandard console ports.

Thousands, plural?
> This situation is becoming untenable now. I have a 4090, and I'm having to refuse to buy the new games that are supposed to stretch its legs.
> Where and how does this end?

It won't end until these lazy devs are called out and put on full blast by the media. Things have gotten this bad because of the journalists' complacency.
> Yet another AMD-sponsored game with VRAM issues. I'm so surprised...

Where do you see VRAM issues on a 4090? And the custom I/O?
I don't expect IGN, GameSpot, Easy Allies, and Rock Paper Shotgun to stand up for PC ports, but Gamers Nexus, JayzTwoCents, Linus, and all these YouTubers with millions of subscribers covering PC hardware have dropped the ball big time. People buy hardware to play software, and all their hardware reviews mean nothing if the software ships in unplayable states that take months to fix. I played TLOU last night and had the same issues I had at launch. A month after launch.
The problem is that those people aren't given pre-release access to PC games in the same way that the obedient hack "gaming journalists" are.
So they aren't even being given the opportunity to do their jobs in that sense.
> As usual, wait six months and you get it cheaper with better performance.

This is the big-brain strat.
> I've been gaming on PC since 2003. I get it. My PCs would always last until the next-gen consoles came and VRAM would become the bottleneck, forcing me to upgrade.

I do not doubt that, and my argument really had very little to do with the PS5 and whether they utilized the PS5-specific I/O solutions or not.

My point, though, concerns games having comparatively larger issues on PC than on consoles lately (stuttering and frame drops). People really underestimate the challenges of improving graphics by using larger amounts of high-resolution textures; the PC environment has a real disadvantage here that is not easily remedied (compared to the XSX and PS5).

I think this will continue to be challenging for PCs this generation.
> Keep enjoying paying thousands of dollars for the privilege of playing substandard console ports.

I don't pay a cent for substandard ports. I wait for patched versions at less than $60...
I've been gaming on PC since 2003. I get it. My PCs would always last until the next-gen consoles came and VRAM would become the bottleneck, forcing me to upgrade. My memory is hazy for my 2003 PC, but I distinctly remember the day my GTX 570 defaulted all settings to very low and refused to load textures in CoD: Advanced Warfare. A GPU just as powerful as the PS4.

I completely understand the VRAM side of things. I argued alongside you in other threads just last year. What's going on recently is more than just VRAM-related. The TLOU medium textures are not even PS3 quality and they take up 10 GB? The PS3 had 256 MB of VRAM.

PCs also have the added benefit of system RAM. This has been the standard for the last three gens. I understand if the PS5 and XSX have this inherent advantage, but I have a system with almost 2x faster VRAM bandwidth, 50% faster CPU clocks, DDR5 RAM, and an SSD that's 30% faster than the PS5's. That should be enough to compensate for whatever secret sauce Cerny and Jason Ronald put into the XSX and PS5. The fact that these issues are resolved with some patches shows that it's not hardware-related; it's just time-related. They are willing to ship these games knowing full well they stink.

Any game with shader stutters is proof of that. Run Gotham Knights, Sackboy, or Elden Ring for the first time and it will stutter like crazy. Unplayable. There is no way they don't catch that. TLOU took 40 minutes to 2 hours to build shaders. ND said they are investigating why that could be. Bullshit. They knew. This happens with every single CPU: AMD, Intel, 8-core, 16-core, 4.0 GHz or 5.0 GHz. Doesn't matter. And they pretended it was an isolated issue. GTFO.
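The first-run stutter pattern described above is exactly the behavior of a compile-on-miss pipeline cache. A minimal sketch (class name and structure are made up; real drivers cache compiled pipeline state in much the same shape):

```python
# Minimal compile-on-miss pipeline cache: the first frame that needs a
# given shader pays the compilation cost (the visible hitch); every
# later frame hits the cache. A precompilation pass at boot, as some
# games do, just moves all the misses into a loading screen.

class PipelineCache:
    def __init__(self):
        self._cache = {}
        self.compiles = 0  # each increment is one stuttering frame

    def get(self, shader_key):
        if shader_key not in self._cache:
            self.compiles += 1  # slow path: compile now, mid-frame
            self._cache[shader_key] = f"compiled:{shader_key}"
        return self._cache[shader_key]

cache = PipelineCache()
for frame in range(3):
    for key in ("rock", "water", "character"):
        cache.get(key)

print(cache.compiles)  # -> 3: all compiles land on the first frame
```

This is why the stutter hits every CPU regardless of core count or clocks: it is a cold-cache problem, not a throughput problem, and it is trivially reproducible on a first run.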
> The problem is that those people aren't given pre-release access to PC games in the same way that the obedient hack "gaming journalists" are.
> So they aren't even being given the opportunity to do their jobs in that sense.

As representatives of PC gaming, I think they need to make a big stink about that.
> Is the first one any good, or can I skip it?

The first game is great, and by now it's been heavily optimized on PC. If you haven't already played it, I'd recommend picking it up first. By the time you're done with it, this new game will (hopefully) be in a better technical state and potentially discounted, depending on how long the first one takes you.
> I don't pay a cent for substandard ports. I wait for patched versions at less than $60...

And your PC cost...?
> And your PC cost...?

Cheap AF, to run gazillions of non-shitty ports, ex-shitty ports (after patches), good built-from-the-ground-up games, and emulated games.
> And your PC cost...?

Not thousands.
> Cheap AF, to run gazillions of non-shitty ports, good built-from-the-ground-up games, and emulated games.

Deflection is as good as admission in my book.
> Deflection is as good as admission in my book.

How are the gazillions of games with shitty frame rates, resolutions, and input lag on consoles? All already patched? Are Gotham Knights and A Plague Tale: Requiem already patched to 4K 60 fps ultra settings???
> Yes, Tim Cook. You can have my two grand for a gaming laptop for one of your lovely MacBooks instead.

Wait, is this not coming to Steam?
> I have a question for PC gamers. How fast is DDR5 3600 RAM in terms of GB/s? The PS5 can pull 5.5 GB/s from the SSD into the VRAM. The XSX is capped at 2.4 GB/s, but they used DDR3 RAM in conjunction with the ESRAM just last gen. Surely DDR4 and DDR5 RAM can pull data from the Gen 4 7 GB/s SSDs and then push it to VRAM at a much faster rate than the PS5 I/O can.
> Hogwarts was taking up to 25 GB of my system RAM. Another 9 GB in my VRAM. That's half the fucking game. Just how slow is this DDR5 that it can't do what the PS5 I/O is doing?

3600 DDR4 is 57.6 GB/s. 6000 DDR5 would be 96 GB/s. No consumer SSD (even with compression) comes close to that.
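For anyone checking the reply's numbers: peak theoretical DDR bandwidth is transfers per second x 8 bytes per 64-bit transfer x channel count. A quick sketch, assuming a dual-channel configuration (the common case, not stated in the thread):

```python
# Peak theoretical bandwidth of DDR memory, dual-channel assumed:
# (mega-transfers/s) x (8 bytes per 64-bit transfer) x (channels),
# converted from MB/s to GB/s.
def ddr_bandwidth_gb_per_s(mt_per_s, channels=2, bytes_per_transfer=8):
    return mt_per_s * bytes_per_transfer * channels / 1000

print(ddr_bandwidth_gb_per_s(3600))  # DDR4-3600 -> 57.6
print(ddr_bandwidth_gb_per_s(6000))  # DDR5-6000 -> 96.0
```

So the RAM itself is an order of magnitude faster than any SSD in this discussion; the bottleneck the thread is arguing about is the I/O path and copies, not raw memory speed.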