No excuses this time. If, by 2028, the 6700XT's relative performance remains consistent, it would suggest that coding to the metal was a myth. If it's slower, it implies this benefit has been downplayed for decades. Let's see how it goes.
I hope you are aware that DirectX 12, at least from a GPU perspective, is about as close to coding to the metal as you can get on PC
it has been 4 years, and the only major outlier between my 3070 and PS5 has been The Last of Us Part 1. all other games run as fast as one should expect from this GPU. however, one important note is that you often have to reduce texture quality below PS5 levels
in certain games if you don't want to either
1) experience bad 1% lows
2) get half the performance of your GPU due to VRAM limitations (Spider-Man and Ratchet)
3) get worse texture quality because the streamer cannot fit all the high quality textures at the same time
there's another case where some games will simply reduce texture quality automatically in the distance, which you cannot control as an 8 GB user. then it depends on how much it impacts your experience. in Unreal Engine games, texture streaming is quite refined and can be transparent to the end user if done right. Avatar is another example where the texture streamer works very efficiently: the game practically has no VRAM issues on any modern GPU, and it also does not seem to produce N64-like low quality textures, so it must be doing something right.
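to make the streaming idea concrete, here is a minimal sketch of a distance-based texture streamer that stays inside a fixed VRAM pool. it is a generic illustration under assumed data structures, not Unreal's or Avatar's actual code:

```cpp
// Generic illustration only (not Unreal or Snowdrop code): demote the mips of
// far-away textures first whenever the streaming pool exceeds its VRAM budget.
#include <algorithm>
#include <cstdint>
#include <vector>

struct StreamedTexture {
    uint64_t mip0Bytes;        // size of the highest-resolution mip
    float    distanceToCamera; // assumed to be updated every frame
    int      residentMips;     // top mips kept in VRAM, 1..4 in this sketch
};

// Dropping one top mip roughly quarters the footprint.
static uint64_t footprint(const StreamedTexture& t)
{
    return t.mip0Bytes >> (2 * (4 - t.residentMips));
}

void fitToBudget(std::vector<StreamedTexture>& textures, uint64_t poolBudgetBytes)
{
    // Far textures lose detail before anything close to the camera.
    std::sort(textures.begin(), textures.end(),
              [](const StreamedTexture& a, const StreamedTexture& b) {
                  return a.distanceToCamera > b.distanceToCamera;
              });

    uint64_t total = 0;
    for (const auto& t : textures) total += footprint(t);

    for (auto& t : textures) {
        while (total > poolBudgetBytes && t.residentMips > 1) {
            total -= footprint(t);
            --t.residentMips;          // stream out the highest resident mip
            total += footprint(t);
        }
        if (total <= poolBudgetBytes) break;
    }
}
```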
all of this is why the 6700XT should be a decent GPU to focus on in the future. the RX 6700, however, is not, because a 10 GB GPU only gets you about a 9 GB usable DXGI budget, which is still below the 10 GB of GPU memory the PS5 and Series X often get in games (and with 8 GB GPUs, you often get around 7 GB of usable DXGI memory budget).
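for reference, the "usable DXGI budget" is something an application can query directly through IDXGIAdapter3::QueryVideoMemoryInfo. a minimal C++ sketch (error handling trimmed, and the first enumerated adapter is assumed to be the discrete GPU):

```cpp
// Query the VRAM budget the OS grants this process, i.e. the "usable DXGI
// budget" (typically ~1 GB less than the physical VRAM on 8-10 GB cards).
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    if (FAILED(factory->EnumAdapters1(0, &adapter))) return 1; // first GPU assumed

    ComPtr<IDXGIAdapter3> adapter3;
    if (FAILED(adapter.As(&adapter3))) return 1;

    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);

    // 'Budget' is what the app can commit without being penalized by the OS.
    printf("VRAM budget: %.2f GB, current usage: %.2f GB\n",
           info.Budget / (1024.0 * 1024.0 * 1024.0),
           info.CurrentUsage / (1024.0 * 1024.0 * 1024.0));
    return 0;
}
```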
to give a concrete example: the 3070 gets its performance cut in half if you use very high textures (and the game still appears to fully utilize the GPU). very high textures are the PS5-equivalent textures, so if you want to use them, the 3070's performance relative to the PS5 suffers as a result. in other words, it is not possible to have the same quality textures and keep the relative performance in this specific title. mind you, texture quality shouldn't matter for performance, but in their specific implementation, it does. aside from texture quality, the raster load of the game is the same as on PS5 (I matched settings according to DF)
this happens on 8 GB GPUs in Nixxes ports because of a specific implementation of their own:
Renowned PC-porting studio Nixxes reflects on its first project for Sony, speaks directly to #stutterstruggle. (www.gamedeveloper.com)
"Nixxes' solution was to create its own non-unified memory management system, which ranked and tracked a hierarchy of calls' importance in the code, then measured them by "distance to 512 MB" in order to prioritize which large chunks of memory would make the most sense in a shift from VRAM to system RAM. So if you've ever decided to push your Spider-Man gameplay on a GPU that wasn't quite up to the task, you can thank Nixxes for implementing an invisible reduced-stutter option."
Practically, 8-10 GB GPUs like the 3070, 3080 and 6700 will have certain performance problems, or they will have to make compromises on texture quality (which potentially leads to worse visuals than the PS5, admittedly). However, it shouldn't be as severe as what 2 GB GPUs experienced. With 2 GB GPUs, you could not escape the performance problems, and you would still get low-res textures because the VRAM ratio between the console and 2 GB was insanely big.
the PS4 often had 3.5-4 GB of GPU memory for games (and 1-1.5 GB of CPU memory). in this case, it is better to think in ratios instead of raw numbers (see the small snippet after this list):
2 GB GPU memory to 4 GB console memory budget = 2x difference
7 GB GPU memory to 10 GB console memory budget = 1.4x difference
9 GB GPU memory to 10 GB console memory budget = 1.11x difference
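the same ratio math as a trivial snippet, assuming the usable budgets mentioned above (7 GB on 8 GB cards, 9 GB on 10 GB cards):

```cpp
// Trivial sketch of the ratio argument: usable GPU memory vs. the console's
// typical game budget (~4 GB on PS4, ~10 GB on PS5 / Series X).
#include <cstdio>

int main()
{
    struct { const char* gpu; double usableGb; double consoleGb; } cases[] = {
        {"2 GB card vs PS4",          2.0,  4.0},
        {"8 GB card (7 GB budget)",   7.0, 10.0},
        {"10 GB card (9 GB budget)",  9.0, 10.0},
    };
    for (const auto& c : cases)
        printf("%-26s -> console/GPU ratio: %.2fx\n", c.gpu, c.consoleGb / c.usableGb);
    return 0;
}
```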
technically, games should scale better, and the severe performance problems that the GTX 770 and similar cards experienced should not happen at the same intensity on 8 GB cards. but none of this matters, and no one should recommend an 8 GB GPU to anyone. however, not every game is guaranteed to use 10 GB of GPU memory on console; it depends on how much CPU-bound data the game uses. some games may use more, some less.
this is all why I don't take any 8 GB GPU to PS5 comparisons seriously, both from a PS5 perspective and a GPU perspective. From the PS5 perspective, even if the 8 GB GPU outperforms the PS5, there's a possibility the 8 GB GPU loaded lower quality textures here and there, which would potentially be unfair to the PS5. Since this is not something reviewers or players can control, it is best to stick with
- RTX 3060 12 GB
- RX 6700XT
and nothing in between (no 3070, no 4060, no 6600xt).
those weird outliers that happen with the 4060, 3070 and 6600xt? they also happen WITHIN the same family of GPUs. there are now instances where the 3060 outperforms the 4060 by a large margin DUE TO VRAM:
this is why one should not jump to conclusions about such outliers between 8 GB GPUs and consoles. it is not because of console magic, it is because the VRAM is being thrashed (sadly)
I mean, even 12 GB GPUs are not impervious to these kinds of issues if you go overboard with ray tracing settings and overload the VRAM. the GPU just loses performance
this is why I hate NVIDIA's insistence on 8 GB GPUs. the 4060 will be put into similar situations over and over again in the future, and people will keep saying how it aged horribly compared to the PS5, while the 12 GB 3060 will most likely still have the same relative performance versus the PS5 even in 2028.