Next-Gen PS5 & XSX |OT| Console tEch threaD

So, you are making up the 30% with no proof.
I got a warning for that. It was a calculation based on the assumption that the PS5 wouldn't have VRS, for which I indeed have no proof. It was my assumption, so to be clear: there is no proof of the PS5 lacking VRS, so the 30% power difference isn't confirmed.
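For context on why VRS even enters this kind of back-of-the-envelope estimate: coarse shading rates let the GPU run the pixel shader once per block of pixels instead of once per pixel. The sketch below is a toy illustration with an invented tile mix; it is not the calculation referenced above, and every percentage in it is an assumption.

```python
# Toy illustration of why Variable Rate Shading (VRS) can save GPU work:
# tiles shaded at a coarser rate (e.g. 2x2) invoke the pixel shader once per
# block of pixels. The tile mix below is invented purely for illustration;
# it is NOT the basis of anyone's "30%" figure.

FRAME_PIXELS = 3840 * 2160   # 4K frame

# Hypothetical split of the frame by shading rate (fractions sum to 1.0)
tile_mix = {
    (1, 1): 0.60,   # full-rate shading (edges, detail, focal areas)
    (2, 1): 0.25,   # half-rate
    (2, 2): 0.15,   # quarter-rate (flat or motion-blurred regions)
}

invocations = sum(
    FRAME_PIXELS * fraction / (rx * ry)
    for (rx, ry), fraction in tile_mix.items()
)

saving = 1 - invocations / FRAME_PIXELS
print(f"Pixel-shader invocations: {invocations / 1e6:.2f} M "
      f"(~{saving:.0%} fewer than shading every pixel)")
```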
 
Even they know that; that's why we saw a test on GitHub with bigger bandwidth... I hope they can increase it again, but maybe the BOM makes that not viable.

That probably means the PS5 is going to be $500 anyway. Otherwise it will be disappointing to know they could have increased their budget a little more and gone with faster RAM.
 
I got a warning for that. It was a calculation based on the assumption that the PS5 wouldn't have VRS, for which I indeed have no proof. It was my assumption, so to be clear: there is no proof of the PS5 lacking VRS, so the 30% power difference isn't confirmed.
OK, my bad, I didn't read between the lines. But I can see that happening too. Xbox is a very power-centric brand; Sony relies more on other values.
 
Of course not. Sony will unveil the PS5 at their own event in the next few weeks, according to Jason Schreier.
Inside Xbox is also a separate event, and Geoff has confirmed it will be part of Summer Game Fest. I don't see why the same would not happen with a possible PS5 reveal event. Besides, why would they miss the opportunity to use that space?
 
As per Cerny:

"There is enough power available for both GPU and CPU to run at their max. The developers do not have to choose one over the other."

So no.
At max frequency? Yeah, I'd expect both to be able to execute NOPs at peak frequency all the time. But maximum utilization for both? We'll see about that.
 
At max frequency? Yeah, I'd expect both to be able to execute NOPs at peak frequency all the time. But maximum utilization for both? We'll see about that.
I can't take any more of this astroturfing of false facts.

"adds Cerny. "It's quite short, if the game is doing power-intensive processing for a few frames, then it gets throttled."

 
I'm curious what ratio of native 4K versus the latest upscaling techniques we'll get. I imagine with RT, many games will need to sacrifice native res. Not that it's a bad thing; it's perfectly acceptable.

It depends on what type of upscale it is. If it's an AI upscale that generates a native 4K image from a lower internal resolution, you might not notice much. Any checkerboard or basic upscale technique that renders fewer than native pixels will be smeared/soft in comparison to a native image. Both systems will try to target native resolutions; neither side wants to be known as the console with that Vaseline-smeared look, IMO. Here's looking at you, Xbox One.
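To put rough numbers on the pixel budgets being traded here, a quick comparison of shaded pixels per frame; the resolutions are just common examples, not claims about any specific game.

```python
# Rough pixel-budget comparison for native 4K vs. two common upscaling setups.
# Resolutions are illustrative examples only.

def pixels(width, height):
    return width * height

native_4k = pixels(3840, 2160)       # ~8.29 M shaded pixels per frame
native_1440p = pixels(2560, 1440)    # ~3.69 M
checkerboard_4k = native_4k // 2     # checkerboarding shades ~half the 4K grid
                                     # and reconstructs the rest

print(f"Native 4K:       {native_4k / 1e6:.2f} M pixels shaded")
print(f"Checkerboard 4K: {checkerboard_4k / 1e6:.2f} M (~50% of native)")
print(f"1440p upscaled:  {native_1440p / 1e6:.2f} M (~44% of native)")
```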
 

 
Question for the more tech-savvy guys: why hasn't Sony continued to support hardware-accelerated checkerboarding in the PS5?
Because it was a trick to reach 4K output.
The PS5 has enough power for that, but I believe they will have something similar to DLSS, with deep AI, that is an evolution over checkerboarding.
 

Yeah, this again proves my earlier point: he just parrots information which is already known. RedGamingTech already said weeks ago that the PS5 "may" have features from future architectures like RDNA 3. As for the exact quote, he was told by a dev "you should look at what Sony has done in the past in incorporating features from future chips into their APU" (not the exact quote, but it was along those lines). RedGamingTech is also a good source since he doesn't seem to have a bias; if anything he's usually an Xbox apologist, lmao. But here we have fake insiders like Tidux relaying the information in a cryptic manner, giving the impression they have inside information, yet getting defensive when you call them insiders because they never claimed to be such.
 
It depends on what type of upscale it is. If it's an AI upscale that generates a native 4K image from a lower internal resolution, you might not notice much. Any checkerboard or basic upscale technique that renders fewer than native pixels will be smeared/soft in comparison to a native image. Both systems will try to target native resolutions; neither side wants to be known as the console with that Vaseline-smeared look, IMO. Here's looking at you, Xbox One.
There are lots of checkerboard examples with perfectly fine IQ.
 
I mean, does any of this come as a surprise to any of you?
Both Microsoft and Sony have done this shit before. Sony themselves put Vega features in the PS4 Pro a full year before Vega made its public launch.
 
I mean, RDNA 2 isn't even out yet in any capacity, yet it has RDNA 3 features? Let's just stick to RDNA 2 for both consoles for now. It's getting ridiculous.


This isn't a new thing: the PS4 brought forward 8 ACEs from the future GCN pipeline, and the 360 was a whole year ahead on unified shaders. Why is it unbelievable that console GPUs based on RDNA 2 would be bringing forward features they wanted from AMD's pipeline?
 
If you don't believe the RDNA 3 rumours, that's fine, but I think they're true. RedGamingTech is a reliable source compared to fake insiders like Tidux. We have already seen Sony incorporate features from future GPUs into their own APU; a good example is Rapid Packed Math from the AMD Vega GPUs being incorporated into the PS4 Pro's Polaris-based chip, and Vega had not even been released at the time. Considering that, it's not out of the question to think the PS5 would have features from RDNA 3. I guess it also doesn't fit the power narrative of the Series X having a better GPU than the PS5.
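For reference, the Rapid Packed Math example above is easy to put numbers on: RPM packs two FP16 operations into each FP32 lane, which is where the PS4 Pro's oft-quoted 8.4 TFLOPS FP16 figure comes from. A quick back-of-the-envelope check using the publicly quoted CU count and clock:

```python
# Back-of-the-envelope check on Rapid Packed Math (double-rate FP16) for the
# PS4 Pro, using its publicly quoted GPU configuration.

CU_COUNT = 36          # active compute units
ALUS_PER_CU = 64
CLOCK_GHZ = 0.911      # PS4 Pro GPU clock

# Peak FP32: each ALU does one FMA (2 FLOPs) per clock.
fp32_tflops = CU_COUNT * ALUS_PER_CU * 2 * CLOCK_GHZ / 1000
# RPM packs two FP16 operations into each FP32 lane, doubling peak FP16 rate.
fp16_tflops = fp32_tflops * 2

print(f"FP32: {fp32_tflops:.2f} TFLOPS")  # ~4.20
print(f"FP16: {fp16_tflops:.2f} TFLOPS")  # ~8.40 with Rapid Packed Math
```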
 
Finally, those teraflops numbers are theoretical maxima. They do give an indication of relative GPU capability, but they do not in any way mean the GPUs are literally running that many floating-point operations every second. My point being, there is no such thing as a sustained X or Y TF.
The XSX GPU is not going to run at 12 TF at all times, because it does not need to. It runs with a fixed 1.825 GHz clock, but that does not mean the GPU is fully utilized at all times. That is where notions such as "race to idle" and bottlenecks come into play. Note that both console designs actively try to prevent bottlenecks as much as possible, which is another difference from last gen.

The same applies to the PS5: it has 10.3 TF on paper, but in practice it may be pushing 8-9 most of the time.
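As a reminder of how those headline figures are derived: they are peak theoretical FMA throughput (2 FLOPs per ALU per clock) at the maximum clock, not a measured sustained rate. The sketch below just reproduces that arithmetic from the public CU counts and clocks.

```python
# How the headline TFLOPS figures are derived: peak theoretical FMA throughput
# (2 FLOPs per ALU per clock) at the maximum clock, not a sustained measurement.

def peak_tflops(cus, clock_ghz, alus_per_cu=64, flops_per_alu=2):
    return cus * alus_per_cu * flops_per_alu * clock_ghz / 1000

xsx = peak_tflops(cus=52, clock_ghz=1.825)  # Series X: fixed clock
ps5 = peak_tflops(cus=36, clock_ghz=2.23)   # PS5: maximum (variable) clock

print(f"Series X peak: {xsx:.2f} TFLOPS")  # ~12.15
print(f"PS5 peak:      {ps5:.2f} TFLOPS")  # ~10.28
```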
 
This isn't a new thing: the PS4 brought forward 8 ACEs from the future GCN pipeline, and the 360 was a whole year ahead on unified shaders. Why is it unbelievable that console GPUs based on RDNA 2 would be bringing forward features they wanted from AMD's pipeline?
Good point. Maybe I'm wrong. We will find out soon enough.
 
we could even see an Assassin's Creed Unity situation where MS likely pays off devs not to take advantage of the other console's strengths, or to ensure parity.

See, if (pardon me) shit like this is true, why even care about the differences between the consoles? Why have this or that SSD and I/O, or this or that TF count, if at the end of the day two DNA-matching, identical boxes should be created? Slap "Sony" on one and "Microsoft" on the other and be done.

There's no "silly ass" point in having disparity of one pays someone else to totally eliminate that kind of unique advantage...I've read shit like this throughout the years and I've never seen proof, not that I'm equating that to "never happened" but if this true, regardless which corporation, I don't understand the point of both consoles being significantly different upon closer examination when money can simply make them both the cery

Finally, those teraflops numbers are theoretical maxima. They do give an indication of relative GPU capability, but they do not in any way mean the GPUs are literally running that many floating-point operations every second. My point being, there is no such thing as a sustained X or Y TF.
The XSX GPU is not going to run at 12 TF at all times.

See, I remember seeing a nice vid by NXGamer some time ago, I believe it's the "a new generation is born" PS5 video, and if I'm not mistaken he mentioned this exact point concerning TFs... and I've also seen that very point defended with good evidence elsewhere. I'm not sure why so many seem to imply strongly that

As per Cerny:

"There is enough power available for both GPU and CPU to run at their max. The developers do not have to choose one over the other."

So no.

I'll never understand why so many don't study that Cerny GDC presentation before speaking about the PS5's engineering with certainty. I've been prepared since the day it aired to see how many do not understand its significance, or neglect/forget to understand the facts and science within.

Are we saying that suddenly this guy is legit? Makes no sense to me.

Not necessarily. "We're" just stating what was stated, no intellectual investment or belief in what was stated necessary.
 
Do you think that Microsoft and Sony have the same relationship with AMD?

I recall an article from a long time ago saying Sony helped build some of the new features in RDNA 2.
Would Microsoft be allowed to use those features, or would they be exclusive to just Sony?

FFS, I don't know why, but every time I want to write RDNA 2 I end up writing R2-D2. Screw you, Star Wars.
 