Yes, you can't match all settings perfectly on PC with the Xbox One X version, but the most expensive settings are already on low or medium, so you can't expect a drastic performance difference from there.
An RX 580 on PC runs RDR2 with similar settings at 4K at around 20 fps, and you need over 30 fps (around 35 fps average) to play at a locked 30 fps. It's possible to gain an additional 3-5 fps with further tweaks, but then the game looks worse than the Xbox One X version.
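A quick sanity check on the gap being claimed here (the 20 fps and 35 fps figures are the ones from this post, not independent benchmarks):

```python
# Rough headroom math for a locked 30 fps target.
# Both figures are the ones claimed in the post, not measurements.
measured_avg = 20.0   # claimed RX 580 average at 4K, Xbox-like settings
target_avg = 35.0     # average claimed as needed for a stable locked 30

uplift_needed = target_avg / measured_avg - 1
print(f"uplift needed: {uplift_needed:.0%}")  # 75%
```

So by these numbers, 3-5 fps of tweaking doesn't come close to the ~75% uplift a locked 30 would require.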
RDR2 is a PC port, but games on the PS4 Pro also show better per-TFLOPS efficiency on the Xbox One X. With only 40% more TFLOPS, the Xbox One X can render 2x more pixels than the PS4 Pro GPU. That's proof MS customized their GPU with amazing results, and at this point I don't even like to compare the Xbox One X GPU to a standard RX 580.
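A sketch of where the "40% more TFLOPS, 2x the pixels" claim comes from, assuming the commonly quoted spec figures (~6.0 vs ~4.2 TFLOPS) and a typical 2160p vs 1440p resolution split (actual resolutions vary per game):

```python
# Raw TFLOPS ratio vs pixel-throughput ratio.
# Assumed figures: Xbox One X ~6.0 TFLOPS at 3840x2160,
# PS4 Pro ~4.2 TFLOPS at a typical ~2560x1440 (varies per title).
xbx_tflops, ps4p_tflops = 6.0, 4.2
xbx_pixels = 3840 * 2160
ps4p_pixels = 2560 * 1440

tflops_ratio = xbx_tflops / ps4p_tflops
pixel_ratio = xbx_pixels / ps4p_pixels
print(f"TFLOPS advantage: {tflops_ratio - 1:.0%}")   # ~43%
print(f"pixel advantage:  {pixel_ratio:.2f}x")       # 2.25x
```

Whether that gap proves custom-GPU magic or just different resolution targets is exactly what the replies below argue about.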
I expect MS will also prepare some specific GPU optimizations for the Xbox Series X, so their 12 TF console will be even faster than the numbers suggest.
If the most expensive settings are already above low on the Xbox One X, and the Xbox One X has no problem running it at 30 fps, then why even bother putting them lower than low on PC to start with?
Because it can't run it.
Also, that 580 is running it at ~28 fps, not 20 fps, at 4K with higher settings than the Xbox One X. And it's considered a mess of a launch on PC. So your numbers are wrong, and that's also why your conclusion is wrong.
The fact that we know PC RDR2 has higher settings out of the gate than the Xbox One X, and that we can verify it relatively easily ourselves, makes that game an utter joke for comparing how hardware performs in comparable situations. Maybe they even nuked more stuff in the game that we can't see, like lower tessellation in buildings we never look at (the Crysis idea). We could nuke RDR2 down the same way to get a leap in performance. We could even make specific areas less complex until the houses are boxes.
Pretty sure the PS4 Pro uses a 470 and the Xbox One X uses a 580, which is a more optimized architecture than the 400 series. That explains your other point: it's a different generation of card with an improved architecture.
You are making another weak analogy. 3 GB cards were able to do it for a few years because consoles didn't have good bandwidth to push memory anyway. If you are happy with the 2080 Ti getting 3 fps more than the PS5 at 4K, good for you.
Yeah, bandwidth is where consoles fell flat on the PS4, not the dog-shit CPU and terrible GPU solution. Oh, by the way, didn't you hear it's now on a single SoC, which will make it a super PC architecture competing with high-end hardware, with more VRAM than PCs have and 8 more cores than PCs have, all for only 400 bucks? Meanwhile a guy with a 3-year-old i7 could slam a new 300-buck GPU into his box and already have a PS4 Pro, but at 60 fps, years before it even releases. It's laughable at best.
Sorry mate, I have been around for a while, and the PR shit those console companies are pushing is beyond laughable. Massive respect to Phil Spencer for not walking into this hype bullshit again (remember "the power of the cloud"), just straight up giving us the data we want, saying what the performance is like, and being realistic.
Hell, even with Red Dead Redemption 2, people were convinced the Xbox One X was running stuff at ultra settings before the settings came out on PC, and started chanting their victories. Man, I remember when the Xbox One X got announced and everybody and their mom was declaring that box to be as fast as a 1080, a card that's what, 20% slower than a 5700 XT, which Phil says is 100% faster than the Xbox One X. Yeah, there you go. Stop believing the hype and start believing actual numbers, because that's what it pushes at the end of the day.
The 2080 Ti sits ~35% above the 5700 XT performance-wise, and 35% is a generational leap. So your idea of the 2080 Ti only pushing 3 fps more at 4K would mean the PS5 runs the game at ~9 fps while the PC sits at ~12 fps, i.e. both are unplayable.
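The fps figures in that hypothetical follow directly from the 35% gap: if a 35% faster card is only 3 fps ahead, the baseline has to be tiny. A quick check (the 35% and 3 fps are this post's numbers, not benchmarks):

```python
# If a 35% faster card is only 3 fps ahead, solve for the baseline:
# base * 1.35 = base + 3  ->  base = 3 / 0.35
ratio = 1.35       # claimed 2080 Ti vs 5700 XT performance ratio
delta_fps = 3.0    # claimed fps advantage

base_fps = delta_fps / (ratio - 1)
faster_fps = base_fps * ratio
print(f"slower card: {base_fps:.1f} fps, faster card: {faster_fps:.1f} fps")
# slower card: 8.6 fps, faster card: 11.6 fps
```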
And 35-40% matters if you want to hit 60 fps at 4K; it's close to the difference between 40 and 60 fps, and at 4K any extra percentage counts, let alone 35-40%. If you don't think this is a big upgrade, you probably never played on or owned higher-end systems or followed that market. People spend bucketloads of money for a 30% performance increase any day.
The only way to make the 2080 Ti obsolete is to push a next-generation AMD GPU into that box that performs faster than a 2080 Ti, and AMD has nothing like that at this point. Frankly, Microsoft would just release an Xbox Series X1 a year later with that GPU in it if they cared, so what's the point?
Also, PC gamers do not sit at 4K, which will give that 2080 Ti even longer legs than those consoles have, unless they start dropping resolutions, and they will, you can count on that straight out of the gate.
The reason I say Death Stranding is that it's a visually impressive late-generation game that runs on a console-based engine. It should give a much better picture than some of the other games that have been compared.
Death Stranding is not interesting. They could slam out a PC version with 30x the draw distance at low settings, only renderable on a 16-core Ryzen with 2x 2080 Tis. That doesn't make the original PS4 faster than a 30 TFLOPS PC with double the cores.
If it needs higher requirements, then the reasons are:
1) It's focused on current-day technology: they look at what PC gamers have and build their version around it (which mostly happens). This ups the base quality of the game.
2) It's dog-shit optimized (aka Skyrim in the PS3 era). Which is highly likely to be the case if that engine has never pushed out PC titles in its life. And no, optimisation isn't an issue on PC, as pretty much 90% of games (and a metric ton more PC games release) have zero issues with optimisation. It ain't the 2000s anymore.
Then there's also the fact that hardware fluctuates in performance. Nvidia and AMD will never have an equal, steady performance output on every game, because they are different cards with different drivers that do things differently. In one game they can be as far as 50% apart; in another it's only 15%. That's why you always look at averages across multiple games and never single any one game out, because that would be utterly pointless.
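The per-game-spread vs average point can be sketched like this (the game labels and percentages are made up purely for illustration, matching the 15-50% range mentioned above):

```python
# Hypothetical per-game performance gaps between two cards.
# Individual games swing wildly; the cross-game average is the
# number worth quoting, not any single title.
gaps_percent = {
    "game_a": 50, "game_b": 15, "game_c": 30,
    "game_d": 22, "game_e": 38,
}

average_gap = sum(gaps_percent.values()) / len(gaps_percent)
print(f"per-game range: {min(gaps_percent.values())}%-{max(gaps_percent.values())}%")
print(f"average gap: {average_gap:.0f}%")
# per-game range: 15%-50%
# average gap: 31%
```

Cherry-picking game_a or game_b alone would put the gap at 50% or 15%; only the average says anything about the hardware.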
This is why Death Stranding performance means nothing. It only means something to people who want to push an agenda, or who hope to reinforce their idea that console optimisation is still a thing in this day and age, while consoles lose more and more performance by locking hardware down and using the same tools as the PC platform.
Optimisation means nothing more than lowering settings in this day and age. PC can optimize its games all day long, far further than consoles ever will. I can make RDR2 run at 30 fps on a 2-core potato CPU with an integrated GPU, looking like this, for 48 bucks. That setup is 4x slower than the base PS4.
Or the same setup with Control running at double the PS4 framerate:
Shitty YouTube screenshot, so it looks blurry, but it's actually sharp-looking if you play it. Also, resolution can be pushed down big time, something consoles can't adjust beyond some very basic settings.
Etc etc.