Next-Gen PS5 & XSX |OT| Console tEch threaD

Makes sense. Not that aiming for 60FPS is bad, though; with all the CPUs, GPUs, RAM, SSDs, "no bottlenecks" etc., it would be shocking (in a bad way) if games still ran at a pathetic 30FPS.

I think that is the first thing you have ever posted I agree with lol.

Yes, 60 FPS should take precedence. I mainly play COD on console and 30 FPS never feels fluid. I also loved the last God of War in performance mode (60 FPS), reminded me of games like Ninja Gaiden Black back in the day.

Bring back the fast gameplay.
 
Wait, if some subscribers have this issue now, doesn't that mean they can leak everything inside the issue like today or something?

LOL. Now you see why I was confused! The issue with the controller on the cover is the June issue and only says it has details about the controller. It's the preview of the July edition inside this issue (the Tidux tweet on the last page) that says the PS5 blowout is happening.

Hopefully that clears it up!
 




ALL ABOARD THE HYPE TRAIN

Are we going to make a new GaF hype train?

 
I hope that helps set the record straight regarding Github.

If both consoles appear in the same leak, it implies they were developed by the same team at AMD, which in itself is a recipe for disaster. It's unheard of, and I don't think either customer would like that shit.
I.e. the leak is either from an MSFT-oriented team or a Sony-oriented team. Because it aligns perfectly with XBSX, it's probably from the former.
End of story.
 
Maybe it's surprising because they turned it into a Monster Hunter clone. I dunno, sometimes studios fuck up sequels, it's not unheard of. Lost Planet 1 to Lost Planet 2 was a crazy departure, for example.
 
Looks like the DualSense's unannounced features will also be revealed, with the word "everything" underlined along with a "tech in detail" section and a "design secrets" section dedicated just to the controller.

Can't wait to see the games though, fucking excited rn.
Paddles bby
 
I wasn't following leaks during that time frame so I'm a bit out of the loop.
Would you mind posting the sources/links that cover these two steppings you mentioned?
Not only was the original taken down by AMD; Chris 1515 from Era stored it in his Google Drive and got a takedown notification. So basically you can't really post the leak files online without AMD getting all over your ass.

Why do you think it was a regression test?
It was called a regression test in the files :)

So because of this, the timeframe wouldn't allow for an architecture change rdna1->rdna2 (new code name chip)?
Yes, 2019 was too late to move to RDNA 2; the move from Ariel to Oberon A0 happened in 2018. Some people on Era have timestamps on the files; I don't, because I saw an aggregated version of the files. But what we do know, because of Flute, is that Sony still had 532GB/s GDDR6 in July 2019, exactly when DRAM prices stopped dropping and started spiking. That's one of the reasons I think Sony really wanted 500GB/s+ memory bandwidth, but DRAM prices killed that idea.

If both consoles appear in the same leak, it implies they were developed by the same team at AMD, which in itself is a recipe for disaster. It's unheard of, and I don't think either customer would like that shit.
I.e. the leak is either from an MSFT-oriented team or a Sony-oriented team. Because it aligns perfectly with XBSX, it's probably from the former.
End of story.
They weren't developed by the same team. The leak came from an intern whose job was partly to aggregate test data. The intern wasn't careful and stored all of the data in a public Github repo, so once it leaked, it contained data for over 10 GPUs. XSX and PS5 were just a small part of that leak, but the most interesting one.
 
The leak came from an intern whose job was partly to aggregate test data.

From all teams in the company?
And only one silicon per console vendor?
Too many stretches.
It implies that one team can routinely know exactly what the other team is doing?
Really...
 
Not only was the original taken down by AMD; Chris 1515 from Era stored it in his Google Drive and got a takedown notification. So basically you can't really post the leak files online without AMD getting all over your ass.
not even screencaps :/ ?
Yes, 2019 was too late to move to RDNA 2; the move from Ariel to Oberon A0 happened in 2018. Some people on Era have timestamps on the files; I don't, because I saw an aggregated version of the files. But what we do know, because of Flute, is that Sony still had 532GB/s GDDR6 in July 2019, exactly when DRAM prices stopped dropping and started spiking. That's one of the reasons I think Sony really wanted 500GB/s+ memory bandwidth, but DRAM prices killed that idea.
I don't know about the same RAM thing; maybe they were testing for bottlenecks to decide if the extra bandwidth is worth it, and Samsung 16Gb/16Gbps chips are still at the sample stage, with 18Gbps not even listed.
Besides, wouldn't console manufacturers do long-term, high-volume deals?
 
Maybe it's surprising because they turned it into a Monster Hunter clone. I dunno, sometimes studios fuck up sequels, it's not unheard off. Lost Planet 1 to Lost Planet 2 was a crazy departure for example.

Isn't there a rumor of Coop in the game? Maybe the surprising part is that the game is built around that.
 
Those were two versions of the XBSX devkit.

That's a new one, and, uh...."interesting" theory. Doesn't hold up for me, tho.

As someone who knows the Github leak like the back of my hand, I would say it was totally legit. Regarding the timeline, it was probably something like this:
Ariel B0 (iGPU) - RDNA 1 36 CUs @2Ghz with 448GB/s memory bandwidth (256-bit, 14Gbps GDDR6 chips).
Oberon A0 (iGPU) - A move from RDNA 1 to RDNA 2, still 2Ghz, still 36 CUs. Memory bandwidth upgraded to 510GB/s (256-bit, 16Gbps chips).
Oberon B0 (iGPU) - Unknown changes, memory bandwidth upgraded to 532GB/s (256-bit, underclocked 18Gbps chips).
------------------------------------------------------
Flute (the whole APU, separate leak) - unknown changes, still 2Ghz, memory bandwidth still 532GB/s, has an actual date - July 2019.
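
For anyone who wants to sanity-check those bandwidth figures, it's just bus width times per-pin data rate. A quick Python sketch (the 256-bit bus is from the leak discussion above; the effective Gbps for the 510/532 figures are back-calculated, so treat those as implied rather than confirmed):

# GDDR6 bandwidth (GB/s) = bus width (bits) * per-pin data rate (Gbps) / 8
BUS_BITS = 256  # bus width reported for Ariel/Oberon in the leak

def bandwidth_gbs(gbps_per_pin: float) -> float:
    return BUS_BITS * gbps_per_pin / 8

def implied_gbps(bandwidth: float) -> float:
    # work backwards from a leaked bandwidth figure to the effective per-pin rate
    return bandwidth * 8 / BUS_BITS

print(bandwidth_gbs(14.0))   # 448.0 GB/s -> Ariel B0 with 14 Gbps chips
print(implied_gbps(510))     # ~15.94 Gbps effective for the Oberon A0 figure
print(implied_gbps(532))     # ~16.63 Gbps effective, i.e. 18 Gbps chips underclocked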

Those are the leaks. Now, for some Q&A regarding the leak:
Q: How do we know it's real?
A: The Github repo was removed over an AMD copyright claim; it means AMD held the copyright to that data, which means it's 100% real AMD data. In addition, everything on the hardware level was 100% right; the only thing that changed from the Github leak to the reveal of both consoles was clocks, which again proves it was real (clocks change all the time over development).

Q: How do you know Ariel -> Oberon was a change from RDNA 1 to RDNA 2?
A: A stepping isn't there for architectural changes; it's there for things like improving clock speeds, improving testability of the chip, fixing bugs, improving power usage and so on. When a real architecture change happens, like adding more CUs or actually changing major stuff in the chip, the code name changes instead of just the stepping. It means that from Ariel to Oberon something architectural changed. We have references for Ariel being Navi based, which means RDNA 1. The CU count didn't change, memory controllers remained the same and so on. On top of that, we know that today the PS5 is RDNA 2. So the logical conclusion is that the Ariel to Oberon change was the RDNA 1 to RDNA 2 shift.

Q: Wait, wasn't the Github leak missing RT tests? It means Oberon A0 and B0 didn't have RT, right?
A: The Oberon leak was regression testing. In regression testing, all you test is the old chip's abilities vs. the new chip's abilities in order to see if you f---ed something up. Ariel was RDNA 1; it had no hardware-level RT capabilities, so no regression test against Ariel will test RT. Or in other words, just because RT tests weren't a thing in those tests (because it was regression testing) doesn't mean Oberon A0 and B0 didn't have RT hardware.

Q: It's been so much time since Oberon B0; haven't the PS5 and XSX completely changed since?
A: Actually, by early 2019 both consoles were pretty final. At that point, you can't do much other than tweak using steppings. It means that things like clocks could change, but the actual silicon itself remained the same. Remember that even though both consoles only launch in late 2020, near-final consoles had already been built for testing during 2019. For instance, Phil Spencer already got his fully working (and buggy) XSX last year, and the SOC inside that console was probably already fabbed in summer 2019 in a pretty bug-free state.

I hope that helps set the record straight regarding Github.

Fucc it, I was gonna reply to the other replies but you basically said what needed to be said.

I don't see the harm in claiming Arden was the XSX chip and Oberon was the PS5 chip, because aside from a few small details (Arden's disabled CUs for retail, which follows what they did for Scorpio; the higher clock on Oberon, which could've been the C0/E0 revision that has a log date of December 2019), they line up more or less perfectly. And the whole thing regarding the Navi 10 listing for Oberon has already been explained countless times by multiple folks (also given that RDNA1 never exceeded 40 CUs, and PS5 is in fact 40 CUs, on that metric it technically qualifies as RDNA 1, even if the feature set is RDNA2).

The other thing is, it would be incredibly odd if Arden and Oberon were never in reference to the XSX and PS5 chips, because we have zero other chips, leaks etc. with GPUs matching up with what the final systems have. And we've never run into that type of scenario leading up to a console generation. So it's really just the simplest answer to assume those chips are for those platforms, and were the whole time.

However, it was also always a good thing for people to have reservations about whether that info was representative of everything regarding the systems, because in truth it was not. That doesn't make the info any less valuable when it came to speculation on both systems, however.

What the hell is happening. TLoUPII, CP2077, PS5... Jeez

Something happened with Cyberpunk? Hopefully nothing bad like leaks or massive delay :S
 
From all teams in the company?
And only one silicon per console vendor?
Too many stretches.
It implies that one team can routinely know exactly what the other team is doing?
Really...
Not all teams, some teams. Not just one silicon per console, many, and for other products too. Trying to dismiss the Github leak as some kind of fake, at this point when we already know AMD holds the copyright to that data and that it was 100% accurate regarding both systems' silicon, isn't really going anywhere. Github was real, 1000%, but it was a peek into development, not final chips.

not even screencaps :/ ?
We do have some screencaps, but you can't capture the whole leak, the amount of data is unbelievable.

I don't know about the same RAM thing; maybe they were testing for bottlenecks to decide if the extra bandwidth is worth it, and Samsung 16Gb/16Gbps chips are still at the sample stage, with 18Gbps not even listed.
Besides, wouldn't console manufacturers do long-term, high-volume deals?
If they were testing for bottlenecks and 448GB/s was enough, they would have stopped at 510GB/s when they saw that 510GB/s was overkill. But they went even higher and got to 532GB/s with Oberon B0 and it stayed that way in Flute which was months after Oberon B0. So basically they started with 448GB/s, moved to 510GB/s, and then moved to 532GB/s and stayed at 532GB/s for months. We can't tell anything for sure, but it really seems to imply that Sony wanted 500GB/s+ for the PS5. And it's not that surprising. I mean, the 5700 has 448GB/s and that's a sub-8TF card; on top of that, add a 9GB/s SSD, an up-to-20GB/s audio chip, a powerful CPU, and RT & denoising, which all hog a lot of memory bandwidth. Even without the Github and Flute leaks I probably would have assumed Sony wanted more than 448GB/s and that DRAM prices made them go with 14Gbps.
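
To put rough numbers on that bandwidth-sharing point, a back-of-the-envelope sketch (Python; the SSD and audio figures are the ones quoted above, the CPU figure is purely an assumed placeholder):

# Back-of-the-envelope: what's left of a 448 GB/s pool for the GPU
# once the other clients on the same bus take their cut.
total_gbs = 448   # 256-bit bus with 14 Gbps GDDR6
ssd_gbs = 9       # decompressed SSD stream (figure quoted above)
audio_gbs = 20    # worst-case audio chip figure quoted above
cpu_gbs = 20      # ASSUMED placeholder for CPU traffic, not a leaked number

gpu_share = total_gbs - (ssd_gbs + audio_gbs + cpu_gbs)
print(f"GPU is left with roughly {gpu_share} GB/s of the {total_gbs} GB/s pool")  # ~399 GB/s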

I'm not sure why people happily assume MS gave up on 20GB and went with a weird uneven GDDR6 chip setup because of DRAM prices (which is probably true) but for some reason deny that Sony probably used 14Gbps also because of DRAM prices.
 
Not all teams, some teams. Not just one silicon per console, many, and for other products too. Trying to dismiss the Github leak as some kind of fake, at this point when we already know AMD holds the copyright to that data and that it was 100% accurate regarding both systems' silicon, isn't really going anywhere. Github was real, 1000%, but it was a peek into development, not final chips.

Again, I can believe that one team is careless and its testing data was available to the intern, but both at the same time?
 
Again, I can believe that one team is careless and its testing data was available to the intern, but both at the same time?

It seems the GitHub leak was staged. Not saying that it was fake but it certainly wasn't some random thing that was posted on the internet. There was planning behind the whole process.
 
We do have some screencaps, but you can't capture the whole leak, the amount of data is unbelievable.
Can you share some links? DMs if you must
If they were testing for bottlenecks and 448GB/s was enough, they would have stopped at 510GB/s when they saw that 510GB/s was overkill. But they went even higher and got to 532GB/s with Oberon B0 and it stayed that way in Flute which was months after Oberon B0. So basically they started with 448GB/s, moved to 510GB/s, and then moved to 532GB/s and stayed at 532GB/s for months.
Not saying 448 isn't a bottleneck (more is always better), just that they tested how much of a difference it would make and whether that difference was worth the cost. If it really made a substantial difference they would take a short-term loss rather than compromise the system's capabilities for the entire gen.
I'm not sure why people happily assume MS gave up on 20GB and went with a weird uneven GDDR6 chip setup because of DRAM prices (which is probably true) but for some reason deny that Sony probably used 14Gbps also because of DRAM prices.
Price definitely plays a role but I don't think it was the only factor. It's too much of a coincidence that both companies came to the same bandwidth proportionate to compute power (GB/s per TF) when CPU usage is accounted for. They likely consider it the sweet spot in performance per price, and RDNA2 is more bandwidth efficient. The RTX 2080 "only" has 448GB/s.

Would 16Gb chips at 16-18Gbps even be available for mass production in enough quantities to meet the demands of a console launch come December?
 
GitHub leak was legit but out of date, it's as simple as that. Why is this hard to understand?

Obviously it's quite incredible that it happened, and I imagine it leaves a stain on AMD's rep amongst its partners, who would want to be able to trust it never happens again.
 
I think at least SOME of the issue with people acknowledging the GitHub data is that some others have tried to tie it into the suggestion that Sony somehow panicked and had to up the clocks in a vain, pathetic attempt to catch up to Microsoft. So they don't want to acknowledge GitHub for that reason.

Bottom line, it's HIGHLY unlikely that there was any panic. The PS5 seems to have been designed around a certain philosophy, and lowering costs by using a smaller die with fewer CUs, clocked high and properly cooled, is a strategy. Some may not agree with it, but it's far more likely than just some slapdash engineering and planning on the part of a company that has been making consumer electronics very successfully for DECADES. Acknowledging that GitHub was showing earlier tests on the PS5 chips has NOTHING to do with what Sony planned or what their strategy was.

IMHO that's what the denial of GitHub is about. Maybe not. Just seems to me a bit of a defensive reaction to some of the more strident Xbox fans. ;)

This is a good point. My issue with people who reacted that way, though, is...why would they let other people sway their own individual logical conclusions?

I know there were people using Github in a negative way towards PS5 to say BS such as it didn't have hardware RT, but that didn't stop me from brushing off that type of speculation because I knew that wasn't necessarily what the data was saying, just people with their own agenda interpreting data that was malleable enough to be warped into said agenda. Why should I blame the data for people using it irresponsibly?

As for PS5 clock, well,....we don't know when they shifted to 2.23 GHz, but it definitely had to have been after June 2019, and after Oberon B0, because the first two Oberon revisions were 2 GHz. Cerny even said in his presentation they "had trouble" hitting their performance targets with fixed frequency strategy, and they would not have been able to go with variable frequency without getting to an RDNA2 chip (Oberon) to test, using Smartshift as part of that strategy. Something that, of course, they could've been planning for a long time ago knowing the roadmap and getting estimates on PPW gains in RDNA2.

However, just the fact Cerny says they were having issues hitting the targets on fixed frequency strategy, makes me think they might've still been planning for a fixed frequency strategy even after they got Oberon in, and using Smartshift more passively and (possibly) a lighter cooling system to help see if hitting the high GPU clocks would be possible that way. For what they wanted, it wasn't, so they probably decided to shift to variable frequency strategy afterwards.

I probably shouldn't say "decided" in this context; I don't want it to seem like they only figured to do it half a year or so ago. They had probably been considering it as a 'B-plan' for a good while, before even getting the Ariel chips tbh, and considered the possibility of rolling with it if needed. But it may've been a "B-plan" because it would require more cooling costs, and there have already been some reports on cooling being a pretty notable cost component of PS5's BOM (Cerny even told DF that when the teardown comes, people will be impressed with the cooling, but the need to mention that at all suggests the cooling could be a bit extravagant, aka costly).

Seeing as how they've already gone with 14 Gbps chips likely because the Japanese side wanted to keep costs down, I'd assume the decision to go variable frequency was out of necessity to hit certain performance targets, and might've been something Cerny and the American side insisted on. But a concession had to be made on the RAM to offset the additional costs needed for cooling to sustain the variable frequency strategy, and the decision to implement it probably came after testing on the Oberon B0 revision was coming along.

All of that is just personal speculation, but I feel fairly confident in it. Of course, even if it were true, it's not like Sony will come out and speak directly on any of the leaks or testing data, same as MS. There's no reason to do so. But it's still fun to speculate :)

It seems the GitHub leak was staged. Not saying that it was fake but it certainly wasn't some random thing that was posted on the internet. There was planning behind the whole process.

Is this a roundabout to the whole "Github is partially owned by MS therefore MS must've conspired to leak out PS5 testing info" thing? Because I guess that would be a step up from the "Github is partially owned by MS therefore the Oberon data must be FUD" stuff that preceded it.

Honestly I don't know if Github was staged or not, no one does. But there really isn't too much about it any different from console spec leaks of the previous gen. The only difference is where that data was gathered from.

Would 16Gb chips at 16-18Gbps even be available for mass production in enough quantities to meet the demands of a console launch come December?

I'm also of similar thought that it wasn't solely down to price. However, in regards to the price factor, it would seem the Japanese side had the influence there.

You probably won't agree with my reasoning behind the variable frequency strategy for PS5 I talked about in reply to Sinthor, but if that was indeed the case there, I would assume the pricing factor for rising DRAM costs weighed a lot more into going with 14 Gbps chips than it would have beforehand.
 
also given that RDNA1 never exceeded 40 CUs, and PS5 is in fact 40 CUs, on that metric it technically qualifies as RDNA 1, even if the feature set is RDNA2
Come on now, this is shitty logic and you know it.
Judging a chip's microarchitecture by the number of CUs it has? By your logic an entry-level RDNA3 card would actually be RDNA1 because it doesn't go over 40 CUs. Hell, the 5700XT is actually GCN because it doesn't go over 64 CUs 🤷‍♂️ 🤡
 
Is this a roundabout to the whole "Github is partially owned by MS therefore MS must've conspired to leak out PS5 testing info" thing? Because I guess that would be a step up from the "Github is partially owned by MS therefore the Oberon data must be FUD" stuff that preceded it.

Nah I haven't seen any evidence that Microsoft had anything to do with it. I was making a reference to that suspicious Discord.

I have no idea why you brought up that crazy, illogical theory, to be honest. It's just weird.
 
Makes sense that a PS5 reveal would be in May; since the rumor is that they start manufacturing in June, they need to get ahead of any leaks (photos) from the plant.
 
Ah, just caught an interesting article concerning Bleeding Edge and its advertising this morning 😅:


MS♥️Discord....again? (All jokes aside it's speculation on my part to believe MS had anything to do with the first "Discord-gate" lol). Interesting only due to context.
Man, that's an all time low
 
...

If they were testing for bottlenecks and 448GB/s was enough, they would have stopped at 510GB/s when they saw that 510GB/s was overkill. But they went even higher and got to 532GB/s with Oberon B0 and it stayed that way in Flute which was months after Oberon B0. So basically they started with 448GB/s, moved to 510GB/s, and then moved to 532GB/s and stayed at 532GB/s for months. …...

I'm not sure why people happily assume MS gave up on 20GB and went with a weird uneven GDDR6 chip setup because of DRAM prices (which is probably true) but for some reason deny that Sony probably used 14Gbps also because of DRAM prices.
They maybe just tested the three memory speeds to extrapolate the signal integrity trend against memory speed, because by year 3 they might not be able to buy the slower modules, and they all might be rated faster. Not knowing how that will impact your product would be failed due diligence.

As for the XSX not doing 20GB, it's odd people would assume it was for cost, because the quote in the DF puff piece was that they had issues with signal integrity and the non-unified setup was a compromise (IIRC).

tomshardware has the rumoured memory specs for Navi 23 aligned with the PS5 - including the 448GB/s - and Navi 21 aligning with the XSX setup. Presumably Navi 23 is the more advanced product if it is releasing later, which seems inconsistent with the inferred GitHub context IMHO.
 
Come on now, this is shitty logic and you know it.
Judging a chip's microarchitecture by the number of CUs it has? By your logic an entry-level RDNA3 card would actually be RDNA1 because it doesn't go over 40 CUs. Hell, the 5700XT is actually GCN because it doesn't go over 64 CUs 🤷‍♂️ 🤡

I said you could technically go by that metric, not that you necessarily should. We do have on record though that RDNA1 was limited to a 40 CU design and if the reference to Navi 10 for the Oberon tests were in fact reflective of Oberon and not Ariel iGPU profile tests, then the CU count could've been the measure in that listing.

That's the only potential route of explanation for Navi 10 in those Oberon tests if people insist Ariel iGPU profile testing was not a thing. That's the road it would lead them down to, else it takes them off the bridge and into the ocean to drown for a phantom chip that doesn't exist 🤷‍♂️

Nah I haven't seen any evidence that Microsoft had anything to do with it. I was making a reference to that suspicious Discord.

I have no idea why you brought up that crazy, illogical theory, to be honest. It's just weird.

Ah okay, my mistake. That theory, about the MS/Github and FUD stuff, I never personally believed it. Would've been a lawsuit in the making if it were ever even partially true. But there were people legit claiming it as true in the thread when speculation prior to the PS5 presentation was still going on.
 
I said you could technically go by that metric, not that you necessarily should. We do have on record though that RDNA1 was limited to a 40 CU design and if the reference to Navi 10 for the Oberon tests were in fact reflective of Oberon and not Ariel iGPU profile tests, then the CU count could've been the measure in that listing.

That's the only potential route of explanation for Navi 10 in those Oberon tests if people insist Ariel iGPU profile testing was not a thing. That's the road it would lead them down to, else it takes them off the bridge and into the ocean to drown for a phantom chip that doesn't exist 🤷‍♂️



Ah okay, my mistake. That theory, about the MS/Github and FUD stuff, I never personally believed it. Would've been a lawsuit in the making if it were ever even partially true. But there were people legit claiming it as true in the thread when speculation prior to the PS5 presentation was still going on.


Nah, I read some comments from that Discord that made it seem like they had the leak before anyone else. Then from there they proceeded to spread FUD from it.

It's true that Microsoft does own GitHub. But would they be crazy enough to stage something like this to damage their competitor and main supplier? I bet you that if they did, they would be going through a ton of lawsuits right now and AMD would be one of the plaintiffs. That's why I never believed that theory.
 
No need for conspiracy theories. There's room for many types of stories to be told.

It's not a conspiracy theory, it's out there in the open; just read some so-called professional reviews. That game got review bombed, and the user reviews say it all, a rare case where the user score is much higher than the Metascore:

[Metacritic screenshot: user score vs. Metascore]
 
Again, I can believe that one team is careless and its testing data was available to the intern, but both at the same time?
Do you want to share AMD's organization tree with us? Considering the data is 100% AMD copyrighted, can we move on already?

Can you share some links? DMs if you must
You can see some screenshots here, but that's like 0.1% of the data in that leak.

Not saying 448 isn't a bottleneck (more is always better), just that they tested how much of a difference it would make and whether that difference was worth the cost. If it really made a substantial difference they would take a short-term loss rather than compromise the system's capabilities for the entire gen.
My guess is, because they stayed at 532GB/s for months, that it was their intended bandwidth. I guess it also fits Cerny's "no bottlenecks" strategy.

Price definitely plays a role but I don't think it was the only factor. It's too much of a coincidence that both companies came to the same bandwidth proportionate to compute power (GB/s per TF) when CPU usage is accounted for. They likely consider it the sweet spot in performance per price, and RDNA2 is more bandwidth efficient. The RTX 2080 "only" has 448GB/s.
AMD has been more bandwidth-hungry for generations. RDNA remedied a lot of that, but RDNA 1 is still hungrier than Turing. And still, the 2080 SUPER really benefited from having more bandwidth (496GB/s), and it doesn't have a CPU, a 9GB/s SSD, and a 20GB/s sound chip sitting on the same bus. Regarding sweet spots and GB/s per TF, it's not the way I would define memory bandwidth requirements. Bandwidth requirements are defined by the jobs the GPU has at the moment, not by how many CUs or MHz it has. Obviously a more powerful GPU will probably receive heavier workloads, but if both consoles are aiming at the same thing, the one with the higher GB/s number will have an easier time doing the same thing.
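
For what it's worth, here's how the GB/s-per-TF numbers being quoted work out from the announced retail specs (a rough Python sketch; using the 560GB/s fast pool for the XSX is itself a simplifying assumption, since its other 6GB sits at 336GB/s):

# FP32 TFLOPS = CUs * 64 shaders * 2 ops per clock * clock (GHz) / 1000
def teraflops(cus: int, ghz: float) -> float:
    return cus * 64 * 2 * ghz / 1000

ps5_tf = teraflops(36, 2.23)    # ~10.28 TF
xsx_tf = teraflops(52, 1.825)   # ~12.15 TF

print(f"PS5: {448 / ps5_tf:.1f} GB/s per TF")  # ~43.6, using the full 448 GB/s pool
print(f"XSX: {560 / xsx_tf:.1f} GB/s per TF")  # ~46.1, using only the 560 GB/s fast pool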

Would 16Gb chips at 16-18Gbps even be available for mass production in enough quantities to meet the demands of a console launch come December?
No idea. There are products that use them, 2080 SUPER for instance, but I don't know if manufacturers can keep up with console volume.

They maybe just tested the three memory speeds to extrapolate signal integrity against memory speed increases, because by year 3 they might not be able to buy the cheaper modules and they all might be rated faster. Not knowing how that will impact your product would be failed due diligence.

As for the XSX not doing 20GB, it's odd people would assume it was for cost, because the quote in the DF puff piece was that they had issues with signal integrity and the non-unified setup was a compromise (IIRC).

tomshardware has the rumoured memory specs for Navi 23 aligned with the PS5 - including the 448GB/s - and Navi 21 aligning with the XSX setup. Presumably Navi 23 is the more advanced product if it is releasing later, which seems inconsistent with the inferred GitHub context IMHO.
We can make a lot of assumptions about why they did what they did. I would find it hard to believe they tested 532GB/s for months "just in case", and GDDR6 at 14Gbps isn't going to run out considering it's all the same modules at different clock speeds. But who knows, anything is possible; I'm just taking the Github and Flute data as some more evidence, not much more than that. When we see RDNA 2 cards on the market, we will know better. If ~10TF RDNA 2 cards are ~448GB/s or higher, we will know PS5's main bottleneck is the memory bandwidth.

BTW, the signal integrity remark in the DF story was about why they went with 320-bit instead of the X1X's 384-bit memory interface, not about 16GB vs 20GB. Signal integrity shouldn't be affected by that, considering both the 16GB and 20GB setups on XSX have the same bus.
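
For context on that same-bus point, here's how the XSX's asymmetric setup works out arithmetically (a Python sketch based on the announced retail config: ten 14Gbps chips on a 320-bit bus, six 2GB + four 1GB):

# XSX retail memory: ten GDDR6 chips at 14 Gbps on a 320-bit bus (32 bits per chip),
# six 2GB chips + four 1GB chips = 16GB total.
def bandwidth_gbs(bus_bits: int, gbps: float = 14.0) -> float:
    return bus_bits * gbps / 8

print(bandwidth_gbs(10 * 32))  # 560 GB/s: the 10GB "GPU-optimal" region striped across all ten chips
print(bandwidth_gbs(6 * 32))   # 336 GB/s: the remaining 6GB lives only on the six 2GB chips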
 
It's not a conspiracy, it's out there in the open; just read some so-called professional reviews. That game got review bombed, and the user reviews say it all, a rare case where the user score is much higher than the Metascore:

[Metacritic screenshot: user score vs. Metascore]

I know what you are saying, but it already happened and now the media will move on and latch on to the next thing. Days Gone unfortunately didn't live up to its potential - I really liked it - and it came out with some performance issues and bugs, which was a shame.

Writing-wise, as long as they increase the complexity of the characters and raise the emotional stakes of the situations outside the main character's conflict, they will be fine and convert the naysayers. I would say half of the cutscenes were really amateurish, with really bad editing and pacing. But there's a great game in there if you allow it to breathe, and I have a feeling Sony will let Bend do their thing while investing more resources in their next game. If it's a sequel I expect great things; if it's not, well, maybe we get something even better. Not every game needs a sequel.
 
I think that is the first thing you have ever posted I agree with lol.

Yes, 60 FPS should take precedence. I mainly play COD on console and 30 FPS never feels fluid. I also loved the last God of War in performance mode (60 FPS), reminded me of games like Ninja Gaiden Black back in the day.

Bring back the fast gameplay.

I thought most, if not all, CoDs were generally 60fps?
 