
Next-Gen PS5 & XSX |OT| Console tEch threaD

Status
Not open for further replies.
I love watching how Mr. "as is, ass sources" just makes educated guesses, and everyone else picks the information up as if it came from a second-hand insider. It's funny, because what he does could be done by anyone with even a little understanding of the hardware.
It would be better to improve your own level of knowledge instead of listening to whatever anyone tells you. Get better.
 

Reindeer

Member
Will RT be a giant waste of time on these consoles? Wouldn't players prefer higher res and HDR?
Not necessarily. They need some form of RT because it's the next big evolution. Even if it's applied cheaply on the base consoles, they could always increase the effect with patches on their mid-gen Pro consoles. Having no RT now makes no sense when in a few years it will become the norm and won't be so expensive.
 

StreetsofBeige

Gold Member
Hope we get more options next gen. Considering the way PC handles RT, I'll be glad to turn RT off if I can. Not a huge fan of this RT hype.
If consoles give the option to turn off RT for resolution and frame-rate boosts, buh bye RT.

If I want to stare at unusually bright and bloomy visuals, I'll find a copy of an old PS1 game where the colours were on steroids. Those old PS games often beamed that fake lens flare effect into your eyeball to boot.
 

Raploz

Member
I'm excited for the jump in CPU power for next-gen. Most games today feel really lifeless with simple physics. I want realistic water simulation, smoke sim, + particles, deformable objects/soft/hard body physics, + destruction. Everything!! Of course it depends more on the devs than only on raw power (BotW has got good physics on a weak Wii U!), but I hope it becomes at least more common next-gen.

(And I know some of that can be calculated by the GPU, but with a powerful CPU the GPU can focus on graphics 😄)
 
Will RT be a giant waste of time on these consoles?
Ray tracing is needed mostly to save the developer time when implementing various effects. Shadows, reflections, occlusion, and possibly lighting will become better and more precise (though not all at once). The most interesting thing is that it will not destroy GPU performance, as many think; on the contrary, you can get rid of many cascaded shadow maps, especially in the background. I will not go into detail about how this is done, but fixed-function RT hardware may eliminate the need to do the main calculations on the GPU's compute units, giving a better rendering solution in the end.
 

Reindeer

Member
Couldn't agree more about how performance-hungry RT on PC is. Unless they push a less demanding solution.
I think it was DF that said it won't be utilised like on PC. They were basically saying on consoles you could customise it any way you want to save performance.
 

Fake

Gold Member
I think it was DF that said it won't be utilised like on PC. They were basically saying on consoles you could customise it any way you want to save performance.
They at least need to show how they'll put it together. Even if it doesn't work the same way as on PC, it will probably still be performance-demanding; maybe not as brutal as on PC, but still.
 

StreetsofBeige

Gold Member
I'm excited for the jump in CPU power for next-gen. Most games today feel really lifeless with simple physics. I want realistic water simulation, smoke sim, + particles, deformable objects/soft/hard body physics, + destruction. Everything!! Of course it depends more on the devs than only on raw power (BotW has got good physics on a weak Wii U!), but I hope it becomes at least more common next-gen.

(And I know some of that can be calculated by the GPU, but with a powerful CPU the GPU can focus on graphics 😄)
Sounds good.

I'd like more resources put into better AI too so enemies and NPCs don't feel like 1998.

But what we're going to get is probably the same stuff, but better hair and RT effects as priorities.
 

TheStruggler

Report me for trolling ND/TLoU2 threads
You just can't flick a switch. However, MS made up for the lackluster OG One with the better S and the awesome X. Both contain 4K Blu-ray players, and they bumped up the frequency of both the GPU and CPU to give it a bit more. They gave excellent BC upgrades with enhanced visuals and performance, for nothing. They have been buying up first-party studios and putting together new ones such as The Initiative.
That has been a massive investment in first-party support. So while this gen won't benefit, next gen will. So I really look forward to what Sony's first parties put out, and what MS's first parties come out with.
Historically you are correct that MS screwed up their first parties. I mean, they let Bungie walk away for nothing, and they turned one of Nintendo's best developers, Rare, into a shell of its former self. Absolutely crazy.
But like I said, if you haven't seen a new change of direction, you haven't been looking.
Now if only Nintendo decided to go balls out with another new high-tech console.
Flip a switch... I mean, they've been slowly fucking up since the end of the 360 era with the slowdown of critical games, and again, they had all this gen to get their shit together and they didn't. They had a pretty good start, better than Sony, and then fell over gasping for air 5 minutes into the race. They had 6-7 years to straighten up and didn't.
What the hell has releasing a sub-par game got to do with the leadership? If Crackdown is the best thing you can hold against them, well, they are doing pretty well.
And as for "childish", I have no idea what you are talking about.
Much like with Sony and Nintendo, it seems they have a golden seal of approval when executives visit their game studios. Yoshida wasn't happy at all with God of War, and Cory said they drastically improved the game; likewise, Days Gone was night and day when it came out. MS seems to mismanage a lot. As for childish, have you seen their recent portfolio? Their games have been cartoony, looking like Pixar movies. That's not my bag, and while everything doesn't have to be realistic, be a little mature with your content. Xbox used to be known as the "shooter box", with mature content for its gamers; they diverted and drastically changed their portfolio and expected gamers to accept it with open arms, when all it did was piss off their fanbase and drive them to a brand that has the content they want, like God of War, The Last of Us, Ghost of Tsushima, Bloodborne, etc.
 

Fake

Gold Member
I'm excited for the jump in CPU power for next-gen. Most games today feel really lifeless with simple physics. I want realistic water simulation, smoke sim, + particles, deformable objects/soft/hard body physics, + destruction. Everything!! Of course it depends more on the devs than only on raw power (BotW has got good physics on a weak Wii U!), but I hope it becomes at least more common next-gen.

(And I know some of that can be calculated by the GPU, but with a powerful CPU the GPU can focus on graphics 😄)
It's not only that Ryzen is really that good; Jaguar is also super crap. The gap is huge.
 

joe_zazen

Member
Just a reminder, dual gpu not dead despite r600.

//Gemini Modes
#define PPSMC_GeminiModeNone 0 //Single GPU board
#define PPSMC_GeminiModeMaster 1 //Master GPU on a Gemini board
#define PPSMC_GeminiModeSlave 2 //Slave GPU on a Gemini board

https://cgit.freedesktop.org/~agd5f...u11_driver_if_navi10.h?h=amd-staging-drm-next

From the AMD Linux drivers for Navi. No such dual-GPU code appeared for Vega, because they had killed CrossFire. So dual GPU is a thing for Navi.

Plus a pastebin from last July:

AMD 7nm Navi Next Gen RDNA, 36 Dual-CU (72CU), 64 shaders, @1.55GHz clock speed. 14.2TF
https://pastebin.com/YHRVUqQB

So there you go. A $1000 (as per Pachter) PS5. Looking pretty sweet.
 

Dargor

Member
Just a reminder, dual gpu not dead despite r600.



https://cgit.freedesktop.org/~agd5f...u11_driver_if_navi10.h?h=amd-staging-drm-next

from amd linux drivers for navi. No such dual code appeared for vega because they had killed crossfire. So dual gpu is a thing for navi.

plus pastebin from last july


https://pastebin.com/YHRVUqQB

So there you go. A $1000 (as per pachter) ps5. Looking pretty sweet.

So, PS5 confirmed to be just a PS4 with a PS5 sticker slapped on top?
 

FranXico

Member
Just a reminder, dual gpu not dead despite r600.



https://cgit.freedesktop.org/~agd5f...u11_driver_if_navi10.h?h=amd-staging-drm-next

from amd linux drivers for navi. No such dual code appeared for vega because they had killed crossfire. So dual gpu is a thing for navi.

plus pastebin from last july


https://pastebin.com/YHRVUqQB

So there you go. A $1000 (as per pachter) ps5. Looking pretty sweet.
Elaborate: how can we reconcile that with a supposedly easy-to-use SDK? Because I see that kind of design as a source of headaches for developers.
 

StreetsofBeige

Gold Member
Just a reminder, dual gpu not dead despite r600.



https://cgit.freedesktop.org/~agd5f...u11_driver_if_navi10.h?h=amd-staging-drm-next

from amd linux drivers for navi. No such dual code appeared for vega because they had killed crossfire. So dual gpu is a thing for navi.

plus pastebin from last july


https://pastebin.com/YHRVUqQB

So there you go. A $1000 (as per pachter) ps5. Looking pretty sweet.
This rumour has been floating around for a few days.

PS5 will have dual AMD 5700 GPUs.

Sounds more like funny trolling. No chance IMO.
 
How effective would that virtual RAM be against, say, +8GB of VRAM? Or some additional 4GB of DDR4 for the OS, for example?
It's obvious, isn't it? Think:
You have, say, 12GB of main memory and 5GB/s of reads from an SSD, versus 20GB of memory and 100MB/s from an HDD.
So in 1 second you get +5GB; in 5 seconds you get +25GB of swap memory that can, theoretically, be read from the SSD. The developer just needs to create special markup and a mapping of read priorities and data granularity.
With 20GB of RAM, you only get +500MB of swap memory in 5 seconds of real time. And that's without counting that you will constantly have to wait for loading from the HDD between levels and sectors while new data is uploaded to memory. This is essentially no different from what we have now. Yes, it will be slightly faster thanks to the new CPU, but not significantly. Do you want to sit and wait again? I personally don't, and I have no reason to worry about the amount of RAM; 16GB is more than enough. For the same reason, I don't worry about whether 9.2 TFLOPS in PS5 is true against XSX's 12 TFLOPS (in which there is no special faith). Why? I'll write more later, when the data is confirmed or not.
 

xPikYx

Member
Just a reminder, dual gpu not dead despite r600.



https://cgit.freedesktop.org/~agd5f...u11_driver_if_navi10.h?h=amd-staging-drm-next

from amd linux drivers for navi. No such dual code appeared for vega because they had killed crossfire. So dual gpu is a thing for navi.

plus pastebin from last july


https://pastebin.com/YHRVUqQB

So there you go. A $1000 (as per pachter) ps5. Looking pretty sweet.
#again #14tf #ps5 #believe

 

Reindeer

Member
The main reason I have a hard time believing that dual-GPU theory is that it sounds like MisterXMedia-style trolling.
It also sounds incredibly expensive. Sony would be bleeding money selling that thing.
I think that crazy theory started on Reset after some members hit depression when the 9.2 TFLOPS number was leaked.
 

joe_zazen

Member
#again #14tf #ps5 #believe


lol.
#team2019
#teamdualgpu

The main reason that I have a hard time believing that dual GPU theory is that it sounds like trolling after MisterXMedia.
Also sounds incredibly expensive. Sony would be bleeding money selling that thing.

i am stretching, sure. But we only have like one more month of fun speculating.

I think that crazy theory started on Reset after some members hit depression when 9.2 tflops number was leaked.

naw. Pastebin july 2019.
 

SlimySnake

Flashless at the Golden Globes
The main reason that I have a hard time believing that dual GPU theory is that it sounds like trolling after MisterXMedia.
Also sounds incredibly expensive. Sony would be bleeding money selling that thing.
yeah, it's basically the same as the hidden-gpu-in-the-power-brick theory from last gen. now the roles have reversed.

there are simply cheaper ways to get to 14 tflops. 1.4 ghz for 40 cus is 80w; if 36 cus is 75w, you are looking at 150w for dual gpus. you could have one big gpu with 72 cus, and it would be smaller and more power efficient.

i remember this rumor from back then. this was when amd had just announced the rdna gpus and revealed the dual-CU makeup of a shader engine. it's just some guy from gaf or era making shit up.
 

joe_zazen

Member
Who knows what Sony has, but from an efficiency standpoint, that's not a great way to get to 14 (cost or power).

they apparently have this weird hardware-based BC that requires a gpu with 36 cus for some reason. so maybe, given that, this is the best way.
 
Hmm, dual GPUs of 36 CUs each? Do they share a memory bus and blending units? [Of course not.] And through what will they communicate to work more like a single chip? Infinity Fabric? Hmm, pretty good. And the delays/latencies? Yes, a static framebuffer memory shared between the two GPUs could help; about 0.5-1GB at 1TB/s should be enough. The price of such a solution? [Something just broke through the sky.]
 
Don't believe them... believe in yourself!

It's hard to believe in anything if people keep shitting on my hype train! :messenger_face_steam::messenger_loudly_crying:

Why couldn't they just go with 72 CUs and deactivate half the GPU for back compat?

For PS5 to be in beast mode, it wouldn't necessarily activate all 72 compute units; some would be deactivated for yields, heat, etc. Hence it would be 68 activated and 4 deactivated. For me it is really hard to believe that PS5 would have the exact same number of compute units as PS4 Pro (despite it being RDNA). Besides the I/O, the SSD for virtual RAM, and no loading screens, what exactly is the genius of Mark Cerny? There is nothing special about the PS4 and PS4 Pro. It is just an average console in terms of raw specs. Even back in 2012, with the leaks of Durango and Orbis, people were calling it average and dead on arrival, but something that was necessary for the transition to x86 architecture.

I still don't know what "dual compute unit" means, but let's just say it is 36 compute units; what is so special about that? It seems so lackluster in numbers. I just want both consoles on an equal playing field, for fuck's sake, and this is just throwing a wrench in and pissing me off.
 
I'm excited for the jump in CPU power for next-gen. Most games today feel really lifeless with simple physics. I want realistic water simulation, smoke sim, + particles, deformable objects/soft/hard body physics, + destruction. Everything!! Of course it depends more on the devs than only on raw power (BotW has got good physics on a weak Wii U!), but I hope it becomes at least more common next-gen.

(And I know some of that can be calculated by the GPU, but with a powerful CPU the GPU can focus on graphics 😄)

I agree. Although RT is important, developers, and more specifically artists and animators, need to get over this barrier of motion stiffness. Static, rigid animations are getting old as fuck. Sure, the texture work looks great and looks real in still pictures, but characters move like dolls, puppets, mannequins. If PS5 and XSX can display realistic cinematic motion indistinguishable from a movie scene, then they will truly bring us to next gen.
 

joe_zazen

Member
Its hard to believe in anything if people keep shitting on my hype train! :messenger_face_steam::messenger_loudly_crying:



For PS5 to be on beast mode, it wouldn't necessarily activate all 72 compute units, some or several would be deactivated for yields, heat etc. Hence it would be 68 fully activated and 4 deactivated. For me it is really hard to believe that PS5 would have the same exact number of compute units as PS4 Pro (despite it being RDNA). Besides the I/O, SSD drive for virtual RAM and no loading screens, what exactly is the genius of Mark Cerny? There is nothing special about the PS4 and PS4Pro. It is just an average console in terms of raw specs. Even back in 2012 with the leaks of durango and orbis, people were calling it average and dead on arrival, but something that was necessary to transition to x86 architecture.

I still dont know what dual compute unit means, but lets just say that it is 36 compute units, what is so special about it? It seems so lackluster in numbers. I just want both consoles on equal playing field for fucks sake, and this is just throwing a wrench and pissing me off.

i wouldn't say it is pissing me off, but i am not buying a hot, energy-guzzling 2GHz 9TF ps5 unless it is $199. so i am looking for alternative theories that don't ignore the info we have. two 36-CU chips would give us a 13+ TF ps5, satisfying all the people who have said ps5>xsx and pachter's $1000 comment, while not contradicting anything in the github papers.

if it isn't dual gpu, ps5 looks like a serious POS. more energy, more heat, less power... lol, wtf.
 
i wouldnt say it is pissing me off, but i am not buying a hot, energy guzzling 2ghz 9tf ps5 unless it is $199. So i am looking for alternative theories that dont ignore the info we have. two 36 cu chips would give us a 13+ ps5, satisfying all the people who have said ps5>xsx and pachter’s $1000 comment, while not contradicting anything in the github papers.

if it isnt dual gpu, ps5 looks like a serious POS. more energy, more heat, less power...lol, wtf.

Maybe they’re actively trying to hit $399?
 