CatLady
Selfishly plays on Xbox Purr-ies X
> I don't like commenting on insider stuff, but the information from insiders was all wrong.
Oooh absolutely, understatement of the year. It was all pretty much BS except

> Is that Kojima? haha
Yes, he's doing Death Stranding 2 for SeX because it's a complex game.
> All true.... Except in those cases, the GPU boost and CPU boost work independently. Both clock up and down as needed and do not interfere with each other, unlike the PS5, which has been confirmed to need to balance between the CPU and GPU workload.
Confirmed by who? All this stuff will be revealed at next year's GDC, when devs start to talk about their next-gen games. Game design is a balancing act, mate, and even more so on consoles, which have always been about adjusting your scope within a power budget. Want a higher frame rate? You lower the resolution and other things.
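A crude illustration of that scope-within-a-budget point: at a fixed GPU throughput, resolution and frame rate trade off against each other almost directly. A minimal sketch in Python, with invented numbers (not any console's actual specs):

```python
# Assume the GPU can shade a fixed number of pixels per second,
# pegged (arbitrarily) to sustaining 4K at 30 fps.
PIXELS_PER_SECOND = 3840 * 2160 * 30

RESOLUTIONS = {"4K": (3840, 2160), "1440p": (2560, 1440), "1080p": (1920, 1080)}

for name, (w, h) in RESOLUTIONS.items():
    fps = PIXELS_PER_SECOND / (w * h)   # frames that budget can fill per second
    print(f"{name}: ~{fps:.0f} fps")    # 4K: ~30, 1440p: ~68, 1080p: ~120
```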
Do you think they'll put every second console into a cloud rack just to offer a feature not many people will appreciate? Also remember that a lot of decisions are made in the USA, a country which is still medieval in terms of Internet standards (data caps? I forgot about those 20 years ago).
> Question was not about the norm but whether both can be at absolute max frequency. The answer seems to be no; otherwise he would not be talking about the worst case and throttling when that happens. And yes, that is not the norm. But that was not the question. The talk about frequency is kind of strange. Why is it so hard to run the CPU at 3 GHz fixed if the Xbox is doing it way above that?
That will never happen. And why is the worst-case design target being extrapolated to be the norm? They expect the norm for the clocks to be at or around those max clocks, without the need to drop frequency.
> This is the first time I'm hearing about the L2/L3 cache bandwidth advantage. Where was this mentioned? Also, is the faster rasterisation going to make up a 17% gap in TFLOPS? The PS5 only has 22% higher clock speeds, while the Series X has 44% more CUs; I don't see how faster clocks on 16 fewer CUs will make a dent.
Not sure if someone already told you, but if you go back to 32:30 in the Road to PS5 video, Cerny talks about how increasing the clock speed on the GPU also boosts rasterisation and the cache's bandwidth. The reason Cerny went with a lower CU count is that it's easier to fully load the GPU that way. And with the higher cache bandwidth, the memory can feed more data to the GPU in less time, while the cache scrubbers, in tandem with the coherency engines, allow new data to flow in without the GPU stalling.
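For reference, the TFLOPS figures being argued over fall straight out of CU count × clock. A quick check in Python, using the publicly quoted specs (36 CUs at 2.23 GHz vs 52 CUs at 1.825 GHz, 64 shader lanes per CU, 2 FLOPs per lane per cycle for a fused multiply-add):

```python
def tflops(cus: int, ghz: float) -> float:
    # CUs x 64 lanes x 2 FLOPs/cycle (FMA) x clock in GHz
    return cus * 64 * 2 * ghz / 1000

ps5 = tflops(36, 2.23)    # ~10.28 TFLOPS
xsx = tflops(52, 1.825)   # ~12.15 TFLOPS
print(f"PS5: {ps5:.2f} TF, XSX: {xsx:.2f} TF, gap: {xsx / ps5 - 1:.0%}")
```

So the raw-compute gap is roughly 18% in the Series X's favour; the argument in this thread is over how much of that the PS5's higher clocks claw back in rasterisation and cache bandwidth, which the TFLOPS number alone doesn't capture.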
Devs CONFIRMED Xbox Series X FASTER than PS5 | Velocity Architecture and Kraken on Next Generation
Yeah, he's been running for the title of Master deBater for a while now. Arguing just for the sake of arguing...
> Not sure if someone already told you, but if you go back to 32:30 in the Road to PS5 video, Cerny talks about how increasing the clock speed on the GPU also boosts rasterisation and the cache's bandwidth. [...]
Those idiots who bought Core 2 Duos should have gone with high-clocked P4s. Those idiots who bought the low-clocked 2080 Ti should have got the higher-clocked 2060 for more performance. Just because Mark "NetBurst" Cerny has to sell his design to the public does not make it correct. Wanna bet the next high-end Nvidia cards are not low-core-count, super-high-clocked designs? It's all PR, plain and simple: he way overspent on the SSD, there was limited budget left for the APU, so clock it just short of exploding. It's fine to admit he is wrong here, unless you're saying Nvidia is also wrong lol.
> Confirmed by who? All this stuff will be revealed at next year's GDC, when devs start to talk about their next-gen games. [...]
By Cerny, AKA Sony? Who else?
> I hope you get a bit better every day.
I'm at 90%.
> By Cerny, AKA Sony? Who else?
Stop lying. Present proof with a timestamp. You have no idea how SmartShift works. You have no idea how the PS5 works.
> Those idiots who bought Core 2 Duos should have gone with high-clocked P4s. [...]
Rip open a 2060 and tell me if it has any cache scrubbers. Find an I/O block from any desktop part that contains a DMAC, a Kraken decompression chip, and coherency engines like the ones the PS5's I/O has. I'll wait....
Quick question. I hardly ever look at this thread anymore, and it has about a gazillion pages, way too many to try to catch up on. SOOO, can anyone tell me if we ever found out what that game @OsirisBlack was describing in great detail a while ago?
I'm at 90%.
My family is not.
Mom, dad, wife, son... everyone's sick. My dad is very sick.
Yes, nightmare.
Where is this from??
> Those idiots who bought Core 2 Duos should have gone with high-clocked P4s. [...]
How does that first sentence apply to two consoles that have the same architecture? And why ignore that the CPU on XS runs at a higher speed than the PS5's? This argument doesn't work; it seems almost as if certain people are trying to say the PS5 is barely more powerful than a PS4 Pro or something.
I don't doubt that - I think the questions are mainly about how mixing slow/fast access will impact performance.
Yes - while the CPU providing a fast path to 'its' memory was always part of the PS3 design (so the GPU got fast access either way), RSX wasn't built the other way around.
It's a mix - the GC had a really slow pool in the mix, but the Wii upgraded all of its memory to roughly the same bandwidth, so it was really just latency differences between the SRAM and DRAM pools. The PSP was interesting: bandwidth was largely the same, and the main use for embedded memory was lower latency/direct access for the 'owning' chip (no bus contention).
> I guess that could be seen as a bad thing, given how we are on a forum to debate... oh wait.
Give it a rest, mate.
> No, that is exactly how it works.
> By design they can not both be at 100% frequency and 100% power budget. Otherwise you could make the clocks fixed and it would not make a difference.
> And Cerny never said they're both at max speed at the same time. The whole design philosophy only makes sense that way. If the SoC could run the GPU and CPU at 100% frequency under load, it would be the same classic console design as the PS4 or Series X.
> Again, that is literally impossible under load. And without load it also does not matter how high your frequency is, because it's not needed anyway.
Another one denying Cerny's words lol
> He did not.
Another one denying Cerny's words lol
Cerny said both can be at 100% at the same time... in fact, he said most of the time that is what happens.
> I know that. And that is exactly what I said.
> 100% GPU and 100% CPU at 100% power target is impossible.
> Yes, they continually run in boost mode. Guess where the boost starts? Not at 100% max frequencies.
> It's at 80%, 85% or 90%.
Nope.
> Of course he did lol
He did not.
If he did, it would be easy for you to find a video or quote of it.
And if he did, he would've said something wrong.
another YouTube expert?
> I'm at 90%.
> My family is not.
> Mom, dad, wife, son... everyone's sick. My dad is very sick.
> Yes, nightmare.
My dad is 73 years old, diabetic, with high blood pressure. I sincerely hope he won't get it - at least not until a solid cure or a vaccine is available.
> In other words, if the GPU is drawing a lot of power, it will keep the CPU from performing at its max capacity, and vice versa. [...]
It's more so that if the workload is more GPU-intensive, power will be diverted from the CPU to the GPU in order for the system to actually execute the workload. Without SmartShift, the GPU would not have enough power to take on the workload while the CPU has more power than it really needs. It should also be noted that laptops usually run at a very low TDP compared to consoles; the Asus Zephyrus G14 has a TDP of 35W, iirc. That is what SmartShift is for.
> Nope.
> 100% CPU and 100% GPU at the same time.
> CPU at 3.5 GHz and GPU at 2.23 GHz at the same time.
I did, and he never said it.
Of course he did lol
Road to PS5... watch it... you were linked it last page but ignored it.
For real - devs say this, devs say that, and I have yet to see any dev actually say anything about next-gen consoles personally, by himself...
The SSDs are on the wrong sides - 825GB vs 1TB.
That's it boys, pack it in, PS5 just lost the war... /s
Fastest AND most powerful... they really are going into this headstrong, and it makes me wonder if they are actually paying for the creation and spread of FUD on forums, given that they are marketing against the PS5 with the 'fastest' messaging. Some people claimed the GitHub fiasco was bought and paid for to create FUD and make PS fans root for impossibly high TF figures, only to be left feeling bad when the official announcement came. The insider debacle would also make sense if MS planned that strategy and is now going all in on the console war themselves.
Well, the PS brand is a giant, so I don't see them winning this generation either.
This is for the technically illiterate. The GPU of the current gen is too large, but the image describes the situation perfectly.
> I did, and he never said it.
> Just stop spreading misinformation.
> Just give me the quote or timestamp.
> I'll give you $1000 when you find it.
I already stated that I'm not going to look for it. I don't need nor want your money. The bottom line is that he did say it, but in a sugar-coated kind of way, because, well, he has to sell his inferior console as being equally capable, if not more so. It's the difference between saying "I ate half your ice cream" and "I left some of your ice cream for you".
Brad Sams: "Cerny is lying about 5.5 GBps raw speed. My Microsoft-programmed brain is not complex enough to understand how we allowed Sony to get a 2.25x advantage in I/O. Hence, it must be a lie."
> Stop lying. Present proof with a timestamp. You have no idea how SmartShift works. You have no idea how the PS5 works.
I'm not going to go through the entire presentation to provide you with a timestamp. Go do that yourself. You claim I have no idea how SmartShift works? The fact that it is a laptop feature already says quite a lot, but let me quote what it is:
> AMD SmartShift is a new technology that is being introduced with the new AMD Ryzen 4000 Mobile processor family.
> It can be simply described as a smart power distribution technique to dynamically improve CPU or GPU performance within a limited power budget.
> How Does AMD SmartShift Work?
> In a typical laptop, the CPU and GPU each have their own pre-defined power budget.
> The CPU and GPU will individually adjust their power consumption according to the load, using less power for longer battery life, but never exceeding their power budget even when there is a need for more performance.
And here's a video as a bonus.
In other words, if the GPU is drawing a lot of power, it will keep the CPU from performing at its max capacity, and vice versa. SmartShift is an advantage only when you are comparing it with a system using the exact same power without SmartShift. If you have a system that is allowed to use more power, the one with SmartShift will generally simply lose, because it is by default limited by power constraints.
And remember that the specs given for the PS5 were the MAX performance numbers, so SmartShift is already accounted for. It can even be argued that they are painting it as better than it really is, since it will not be able to run at its max GPU speed and CPU speed at the same time.
Tell me again that I don't know how SmartShift works or how the PS5 works.
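The mechanism being described boils down to a shared power budget that gets re-split between the two chips. A minimal sketch of that idea (invented numbers and a naive pro-rata policy - an illustration of the concept, not AMD's actual algorithm):

```python
TOTAL_BUDGET_W = 100.0  # hypothetical shared CPU+GPU power budget

def split_budget(cpu_demand_w: float, gpu_demand_w: float) -> tuple[float, float]:
    """Grant each chip what it asks for if the total fits the budget;
    otherwise scale both down so the cap is never exceeded."""
    total = cpu_demand_w + gpu_demand_w
    if total <= TOTAL_BUDGET_W:
        return cpu_demand_w, gpu_demand_w
    scale = TOTAL_BUDGET_W / total
    return cpu_demand_w * scale, gpu_demand_w * scale

# GPU-heavy scene: the power the CPU isn't using is free for the GPU...
print(split_budget(cpu_demand_w=30, gpu_demand_w=65))  # (30.0, 65.0)
# ...but if both chips demand their max at once, neither gets it.
print(split_budget(cpu_demand_w=50, gpu_demand_w=80))  # (~38.5, ~61.5)
```

Which is exactly the point of contention in this thread: under a fixed total budget, "both at max" is only possible when the combined demand happens to fit.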
> Hint: random read/write speed.
5.5 GB/s is sequential read only; that is the fastest an SSD can do and the mode most used in games... you basically don't change the game assets (no writes), and the files are already stored in a sequential way to be read.
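For anyone who wants to see the sequential-vs-random gap on their own machine, here's a rough sketch (the file name and sizes are arbitrary, and because everything goes through the OS page cache the absolute numbers will be inflated; real SSD benchmarks use direct I/O, and 4 KiB blocks for the random case, where the gap is far larger):

```python
import os, random, time

PATH, CHUNK, CHUNKS = "scratch.bin", 1 << 20, 256   # 256 x 1 MiB = 256 MiB

with open(PATH, "wb") as f:                         # create a scratch file
    f.write(os.urandom(CHUNK * CHUNKS))

def bench(offsets):
    """Read CHUNK bytes at each offset and return throughput in GB/s."""
    start = time.perf_counter()
    with open(PATH, "rb") as f:
        for off in offsets:
            f.seek(off)
            f.read(CHUNK)
    return CHUNK * len(offsets) / (time.perf_counter() - start) / 1e9

sequential = [i * CHUNK for i in range(CHUNKS)]     # front-to-back
shuffled = random.sample(sequential, len(sequential))
print(f"sequential: {bench(sequential):.2f} GB/s")
print(f"random:     {bench(shuffled):.2f} GB/s")
os.remove(PATH)
```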
> I did, and he never said it.
> Just stop spreading misinformation.
> Just give me the quote or timestamp.
> I'll give you $1000 when you find it.
He did. Watch it... stop being lazy.
> Burden of providing proof lies on you, since you are claiming that Cerny said that in that video; and since he did not say it, I can't find it in that video.
> There is more to SmartShift than that; go do your homework better.
Oh. So that's how it works. When it is something that you claim, i.e. that there's more to SmartShift than that, then suddenly the burden of proof is not on the one making the claim anymore, and I have to go look for it myself? GTFO.
That tweet has nothing on the follow-up
I'm sure you are following the PS5 thread on the other place. It is absolutely hilarious; I'm convinced that bloke 'SPDIF' sent out a beacon to that Discord clique, given how all of them assembled in the space of a few minutes to champion his post and offer support for what Brad Sams wrote.
> What this is showing is how SmartShift works, which, as part of the talk pointed out, is decidedly not how the PS5 behaves.
> In a worst-case scenario with the highest-power operations on both the CPU and GPU, a "couple" of percent drop in clock speed drops power by 10%. They expect it to run at or near the peak most of the time.
> It's flipping the idea on its head somewhat. On a PC laptop with SmartShift, you'd have a TDP that the cooler can remove and shared power between the CPU and GPU; if the GPU is being maxed out and the CPU isn't, it would give more of that TDP to the GPU, the cooling being the hard limit. SmartShift isn't running near its peak clocks most of the time; it's an opportunistic TDP user.
> The PS5 flips this around by saying the power output is constant and the cooling isn't the bottleneck; the CPU and GPU are actually capped from going higher because they built the system around a constant power output rather than a constant frequency.
It really isn't as different as they are advertising it to be. In both cases (laptops and PS5), a certain max power consumption target is given (45W, for example), and the frequency is allowed to go wild up to that power target. Those limits are determined by real factors like max cooler size and heat output. But minimum voltages also play a role here, and the higher you go in frequency, the more voltage you need, so you have to cap the frequency to avoid burning things. That's how they know exactly how to design the cooler and what power envelope is allowed. The difference is that in laptops temperature is set as the limiting factor, while in the PS5 it's the workload.
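That "couple of percent of clock for ~10% of power" figure is at least plausible under the usual rule of thumb that dynamic power scales roughly with the cube of frequency (voltage tends to scale almost linearly with clock near the top of the curve). A rough check - the cubic relationship is my assumption here, not something stated in the talk:

```python
# power ~ f * V^2, and V ~ f near the top of the V/f curve => power ~ f^3
for drop in (0.01, 0.02, 0.03, 0.04):
    relative_power = (1 - drop) ** 3
    print(f"{drop:.0%} clock drop -> ~{1 - relative_power:.0%} less power")
# 1% -> ~3%, 2% -> ~6%, 3% -> ~9%, 4% -> ~12%
```

So a 3-4% downclock buying back roughly 10% of the power budget is consistent with the numbers quoted above.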