The only thing that's bugging me is the fucking GPU. I'm not sure I'm convinced that 8 TFLOPS of Navi will provide longevity in terms of power for the console.
There are two conflicting arguments I keep hearing from the two sides. One says that 8-9 TFLOPS is very good because it's power efficient, while the other side says it's not good enough for longevity and won't be enough to power 4K games and such.
I am gonna need some help lmao.
thanks.
There are 2 schools of thought:
1) The PC-centric one focuses on rasterization (drawing/shading 3D triangles efficiently) first and foremost (hence why nVidia cards are so popular with their smart rendering solution, i.e. the tiled rasterization tech that came from Gigapixel) and uses the CPU for the rest of the calculations (AI/NPC simulation).
Compute is rarely used on PCs for many reasons (lack of standardization, a hardware baseline that's all over the place, PC-focused programmers not being fond of low-level stuff, PCIe latency).
For example, Witcher 3 uses the CPU for AI pathfinding (which tanks the framerate on Jaguar consoles), while console-exclusive games (like Uncharted 4) use the GPGPU to run that algorithm:
Is it possible to offload Novigrad's AI calculations to the GPGPU? If ND had made Witcher 3, then yeah.
Would Uncharted 4 MP (it has up to 10 NPCs with their own autonomous AI behavior) run at 60 fps without smart coding like that? Hell no!
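To make the contrast concrete, here's a minimal, purely hypothetical sketch of the same per-NPC "move toward the current path goal" update written once as a CPU loop and once as a GPU kernel. This is not code from CDPR's or ND's engines, and CUDA is used only because it's the shortest way to show the idea (on a console this would be a GCN compute shader); the Agent struct, function names, and numbers are all made up for illustration:

```cuda
// Hypothetical example: per-NPC "move toward current path goal" update,
// written as a CPU loop and as a CUDA kernel. All names/values are made up.
#include <cuda_runtime.h>
#include <math.h>
#include <stdio.h>

struct Agent {
    float x, y;      // current position
    float gx, gy;    // current goal handed out by the pathfinder
};

// CPU version: one core walks every NPC, every frame.
void update_agents_cpu(Agent* a, int n, float speed, float dt) {
    for (int i = 0; i < n; ++i) {
        float dx = a[i].gx - a[i].x, dy = a[i].gy - a[i].y;
        float len = sqrtf(dx * dx + dy * dy) + 1e-6f;
        a[i].x += speed * dt * dx / len;
        a[i].y += speed * dt * dy / len;
    }
}

// GPU version: one thread per NPC, thousands updated in parallel.
__global__ void update_agents_gpu(Agent* a, int n, float speed, float dt) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    float dx = a[i].gx - a[i].x, dy = a[i].gy - a[i].y;
    float len = sqrtf(dx * dx + dy * dy) + 1e-6f;
    a[i].x += speed * dt * dx / len;
    a[i].y += speed * dt * dy / len;
}

int main() {
    const int n = 10000;  // a big, Novigrad-style crowd
    Agent* agents;
    cudaMallocManaged((void**)&agents, n * sizeof(Agent));
    for (int i = 0; i < n; ++i) agents[i] = {0.0f, 0.0f, (float)i, 100.0f};

    // One simulated frame on the GPU (the CPU loop above is the alternative).
    update_agents_gpu<<<(n + 255) / 256, 256>>>(agents, n, 1.5f, 1.0f / 60.0f);
    cudaDeviceSynchronize();

    printf("agent 42 moved to (%.3f, %.3f)\n", agents[42].x, agents[42].y);
    cudaFree(agents);
    return 0;
}
```

The point isn't the math, it's the shape of the work: on the CPU, one core grinds through every NPC every frame (exactly the kind of thing that chokes a Jaguar), while on the GPU thousands of threads do it at once and the CPU is left free for everything else.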
2) The console-centric one focuses on compute and less beefy CPUs.
Cerny made specific forward-thinking customizations (8 ACEs, the volatile L2 bit, etc.) to a GCN 1.0 part (Radeon 7970M) that brought it up to GCN 1.1 feature parity and gave longevity to the PS4 platform, despite the Jaguar CPU's deficiencies.
These customizations let studios like ND, Santa Monica, and Guerrilla Games show that the PS4 can punch above its weight with games like Uncharted 4, God of War, HZD, etc.
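On the ACE side, here's a similarly hedged sketch of what "more compute queues" buys you, i.e. asynchronous compute. ACEs are AMD/GCN hardware queues, so this is not how a PS4 title actually submits work; CUDA streams are used purely as a stand-in for the concept of feeding the GPU from multiple independent queues, and both kernels are dummies:

```cuda
// Conceptual stand-in for async compute: two independent queues feeding the
// GPU, so compute work can overlap "graphics" work instead of waiting behind
// it. ACEs are AMD/GCN hardware queues; CUDA streams are only an analogy.
#include <cuda_runtime.h>
#include <stdio.h>

__global__ void fake_graphics_work(float* out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) out[i] = out[i] * 0.5f + 1.0f;    // stand-in for shading work
}

__global__ void fake_ai_compute(float* out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) out[i] = out[i] * 2.0f - 1.0f;    // stand-in for GPGPU AI/physics
}

int main() {
    const int n = 1 << 20;
    float *gfx, *sim;
    cudaMalloc((void**)&gfx, n * sizeof(float));
    cudaMalloc((void**)&sim, n * sizeof(float));
    cudaMemset(gfx, 0, n * sizeof(float));
    cudaMemset(sim, 0, n * sizeof(float));

    // Two queues instead of one: the scheduler is free to run the two
    // kernels concurrently and fill otherwise idle GPU units.
    cudaStream_t gfx_queue, compute_queue;
    cudaStreamCreate(&gfx_queue);
    cudaStreamCreate(&compute_queue);

    fake_graphics_work<<<(n + 255) / 256, 256, 0, gfx_queue>>>(gfx, n);
    fake_ai_compute<<<(n + 255) / 256, 256, 0, compute_queue>>>(sim, n);

    cudaDeviceSynchronize();   // wait for both queues before "presenting"
    printf("frame done: graphics and compute submitted on separate queues\n");

    cudaStreamDestroy(gfx_queue);
    cudaStreamDestroy(compute_queue);
    cudaFree(gfx);
    cudaFree(sim);
    return 0;
}
```

With a single queue, the "AI" work would sit in line behind the "graphics" work; with separate queues, the scheduler can overlap them and keep otherwise idle GPU units busy, which is the whole point of going from 2 to 8 ACEs.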
The problem with the PC-centric school of thought is that they don't make any effort to understand the other side and where they're coming from. That's why there are so many (usually fruitless) debates.
They think 8-9 TF would be enough for a console that is meant to last until 2030. They're judging next-gen specs by taking into account current-gen games. This is a fool's errand and it never works.
They also think that using off-the-shelf PC parts would be fine, but it clearly isn't. Cerny doesn't seem to think that way.
TL;DR: 8-9 TF might be enough if you only care about 3D graphics, but if you want to do graphics + compute at the same time, then it definitely doesn't fit the bill for next-gen game design/programming.
Better to be conservative than to dream and have those dreams crushed.
I currently have $1000 stashed away for PS5/SNEK and I expect the best from both Sony and MS.
Even if we get 8 TF consoles, I'm still gonna buy them, but I certainly won't expect a ridiculous $499 MSRP. It's either the one (premium/$499) or the other (weak sauce/$349). At worst, I may just save some money.
