Next-Gen PS5 & XSX |OT| Console tEch threaD

Come on now, this is shitty logic and you know it.
Judging a chip's microarchitecture by the number of CUs it has? By your logic, an entry-level RDNA3 card would actually be RDNA1 because it doesn't go over 40 CUs. Hell, the 5700 XT is actually GCN because it doesn't go over 64 CUs 🤷‍♂️ 🤡
The idea that he believes there won't be low- and mid-range RDNA 2 cards is telling lol
 
I know what you are saying, but it already happened, and now the media will move on and latch on to the next thing. Days Gone unfortunately didn't live up to its potential - I really liked it - and it came out with some performance issues and bugs that were a shame.

Writing-wise, as long as they increase the complexity of the characters and raise the emotional stakes of the situations outside the main character's conflict, they will be fine and convert the naysayers. I would say half of the cutscenes were really amateurish, with really bad editing and pacing. But there's a great game in there if you allow it to breathe, and I have a feeling Sony will let Bend do their thing while investing more resources in their next game. If it's a sequel I expect great things; if it's not, well, maybe we get something even better. Not every game needs a sequel.

Not sure, man. The storytelling was pretty good and very naturally acted, unlike the overacted shit you find in every other game. Simply said, I see it otherwise; that's why I don't want them to regress, I want them to build on it. The other characters are really well built up through the story.

The bugs I'm not sure about; I didn't get it at launch because of the shitty reviews it received, got it later on sale, and was shocked at how underrated it is. It's not a matter of either one of us being wrong, but in my opinion it has some of the best organic storytelling, protagonist, and gameplay of this gen.
 
Last edited:
I said you could technically go by that metric, not that you necessarily should. We do have it on record, though, that RDNA1 was limited to a 40 CU design, and if the reference to Navi 10 in the Oberon tests was in fact reflective of Oberon and not of an Ariel iGPU profile test, then the CU count could've been the measure in that listing.
Ah, got you. Still shitty logic that could be used as ammo for console warfare. Judging a GPU microarchitecture by the number of streaming processors is just dumb; good GPU architectures scale from the lowest to the highest shader counts.

And do we know for sure there's a CU limit for RDNA? Navi 14, for instance, has 24 CUs per SE, so technically an RDNA1 GPU with 2 SEs could have up to 48 CUs, if not more. As it was designed for scalability, they likely just didn't bother designing a stopgap big card due to yields and focused on RDNA2 instead.
 
The idea that he believes there won't be low- and mid-range RDNA 2 cards is telling lol

Don't put words in my mouth. You'd have to be incredibly dense to think that was the case, or to think I would believe it to be the case, when we already know mobile APUs are coming based on RDNA2 with significantly cut-down CU counts.

You're just looking for an angle to take against me, but if that's the best you can do, it reeks of desperation.

Ah, got you. Still shitty logic that could be used as ammo for console warfare. Judging a GPU microarchitecture by the number of streaming processors is just dumb; good GPU architectures scale from the lowest to the highest shader counts.

And do we know for sure there's a CU limit for RDNA? Navi 14, for instance, has 24 CUs per SE, so technically an RDNA1 GPU with 2 SEs could have up to 48 CUs, if not more. As it was designed for scalability, they likely just didn't bother designing a stopgap big card due to yields and focused on RDNA2 instead.

It's never been 100% confirmed or directly commented on by AMD, and I doubt it ever will be, but I would also assume clock-to-power limitations (taking RDNA1's sweet-spot range into account here) swayed them against pushing beyond 40 CUs: getting desirable performance relative to Nvidia cards, at reasonable die sizes beyond 40 CUs on RDNA1, without drawing insane amounts of power for minuscule frequency gains, wasn't worth pursuing.

Hence shifting to RDNA2. Just a theory.

BTW, the signal-integrity remark in the DF story was about why they went with 320-bit instead of the X1X's 384-bit memory interface, not about the 16GB vs 20GB question. Signal integrity shouldn't be affected by that, considering both the 16GB and 20GB setups on XSX use the same bus.

If you put it that way, I guess that might in some ways redeem what MLID was saying about the "technical difficulties" behind MS going with a 320-bit bus instead of 384-bit. The idea of it being caused by RAM capacity never made a lot of sense; you can look at PC and see how signal-integrity issues potentially crop up as the number of channels increases, not the amount of physical RAM.

Something still tells me he wasn't considering the bus size as the potential source of that technical complication, though, which is why I found their analysis in that regard kind of questionable. Might need to re-watch that part to make sure.
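To make the channels-vs-capacity point concrete, here's a quick back-of-envelope sketch in Python. The chip counts are the commonly reported retail XSX configuration (an assumption on my part, not from the DF piece): both capacity options use the same ten 32-bit GDDR6 chips, so the bus, and therefore the signal-integrity picture, is identical.

```python
# Both XSX capacity options sit on the same 320-bit bus (10 x 32-bit
# GDDR6 chips); only the chip densities differ. Signal integrity scales
# with channel count, not capacity, which is the point above.
CHIP_BUS_BITS = 32  # each GDDR6 chip contributes a 32-bit interface

def describe(chips_gb):
    """Return (total capacity in GB, total bus width in bits)."""
    return sum(chips_gb), len(chips_gb) * CHIP_BUS_BITS

print(describe([2] * 6 + [1] * 4))  # 16 GB option -> (16, 320)
print(describe([2] * 10))           # 20 GB option -> (20, 320)
```

Same 320-bit answer either way, which is why capacity alone shouldn't change the signal-integrity story.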
 
Not sure, man. The storytelling was pretty good and very naturally acted, unlike the overacted shit you find in every other game. Simply said, I see it otherwise; that's why I don't want them to regress, I want them to build on it. The other characters are really well built up through the story.

The bugs I'm not sure about; I didn't get it at launch because of the shitty reviews it received, got it later on sale, and was shocked at how underrated it is. It's not a matter of either one of us being wrong, but in my opinion it has some of the best organic storytelling, protagonist, and gameplay of this gen.

I have to agree with this strongly. I hadn't gotten Days Gone at first because though it looked really cool to me, what I was reading about it gave me pause. Sounded like it was super buggy and maybe broken. I didn't pay attention to the reviews talking about how male-centric it was or how it was based on a "macho worldview" because frankly, I don't give a shit about that. Me caveman! Me shoot things and go BOOOM! :)

But it looked really cool so I ended up getting it probably 3 weeks to a month after release. Apparently they'd fixed the bugs by then because I had no issues. The game was, IMHO, much better than it had been reviewed. Sure, it's a certain type of story. Show me a game based on a (basically) zombie apocalypse with the modern world gone that's centered on "woke" principles and story line and I'll show you one LOUSY game that shouldn't have been made! lol. So I really liked it.

Look, I think there's room in games to tell all kinds of stories. When a game isn't centered on a specific character and type of character, I want and expect to be able to customize my character, make it male or female or look like whatever I want. If the game IS centered on a specific character and type, like say the Last of Us was....that's fine too. I'm not going to ding TLOU for having a white dude in it, and I wouldn't rip that game or a similar one for having any other race character as the center. That's all good. Same thing with other aspects that people deem "woke." When things are central to the story, all good and as I said, there's a LOT of room for storytelling in games.

What I DO object to, is when such things are shoveled into a game just "because it's time" or "because it's 2020 now." If there's not a reason central to the story, game or character, I don't care how many women the main character bangs. Likewise if the main character was gay or anything else. We're seeing this shoveling in a lot of movies right now. Instead of making a good strong balanced female character and making it about that, the movie creators make an unrealistic caricature of a woman and not only that, they have to make all the male characters useless just to show even more how awesome the female character is. (See: The Last Jedi as one good example of this). Also, isn't it insulting that to make a strong woman character they so often make the woman act MALE? She's twice as ass kicking and beer chugging as her male colleagues, just to show how strong she is. Can't a woman be strong without being "butch?" Same for men...can't men be sensitive, caring or idealistic without acting feminine? Now, of course there's a place for masculine women and feminine men as well, but it just seems that things are out of balance.

The other problem is making things "woke" by breaking the world lore and story. Dr. Who is a great example of this. Throughout the history of the show and books there were male and female TimeLords. It was even joked about that a gender transition in a regeneration would be a disaster...rumored to have happened but doubtless untrue. Then "Because it's time" they broke all that lore and made the new Doctor regenerate as a female. Wouldn't have bothered me if that had been a part of the show and story's lore, etc. but it wasn't. They shoehorned it in and then told everyone if they didn't like it, it was because you were a misogynist.

Anyway, enough of that. Bottom line, I agree when it comes to games. There's lots of room to tell lots of stories and to keep characters diverse as well. Everyone should get to play at least some games as a character that resembles themselves or whatever image they so choose. I just think it's the wrong thing to do to try and force or shoehorn things into those stories in the name of being "woke." It doesn't feel legitimate and I think it is frankly insulting.
 
…..

When we see RDNA 2 cards on the market, we will know better. If ~10TF RDNA 2 cards have ~448GB/s or higher, we will know the PS5's main bottleneck is memory bandwidth.

BTW, the signal-integrity remark in the DF story was about why they went with 320-bit instead of the X1X's 384-bit memory interface, not about the 16GB vs 20GB question. Signal integrity shouldn't be affected by that, considering both the 16GB and 20GB setups on XSX use the same bus.

Only if the RDNA2 cards are narrow and fast, and even that is inconclusive, because a discrete GPU in a PC needs more bandwidth than an HSA system for the round trip with data - on a unified HSA system the data just stays in place at zero bandwidth cost.

Signal integrity would be affected at 560GB/s for both GPU and CPU because there is more energy, and so proportionally more heat. That's why, with the CPU constantly feeding the GPU instructions - copying from the 3.5GB pool to the 10GB pool - we know the XSX's real working bandwidth will be lower than the PS5's without extensive optimisation and designs that maximize forward-only traffic from CPU to GPU. Any work the XSX needs done by the GPU and returned to the CPU will be costly and bring the average XSX memory bandwidth down disproportionately.
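The "real working bandwidth" idea can be sketched with a simple blended model. This assumes accesses serialize on the shared bus (so time adds harmonically, not bandwidth linearly), and the 20% slow-pool traffic share is a made-up illustration, not a measured number:

```python
# Back-of-envelope blend of XSX memory bandwidth using the publicly
# quoted peak figures. The slow-pool traffic share passed in is purely
# illustrative.
FAST_POOL_GBPS = 560.0  # 10 GB "GPU optimal" pool
SLOW_POOL_GBPS = 336.0  # 6 GB pool (3.5 GB of it usable by games)

def blended_bandwidth(slow_fraction):
    """Effective GB/s if `slow_fraction` of traffic hits the slow pool.

    Accesses share one bus, so per-GB transfer times add; the blend is
    a weighted harmonic mean, not a weighted average of bandwidths.
    """
    time_per_gb = slow_fraction / SLOW_POOL_GBPS + (1 - slow_fraction) / FAST_POOL_GBPS
    return 1.0 / time_per_gb

print(round(blended_bandwidth(0.0)))  # all fast-pool traffic -> 560
print(round(blended_bandwidth(0.2)))  # 20% slow-pool traffic -> 494
```

Even a modest share of slow-pool traffic pulls the blended figure well under the 560GB/s headline, which is the gist of the argument above.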
 
You probably won't agree with my reasoning behind the variable frequency strategy for PS5 I talked about in reply to Sinthor,
I disagree that it was a last-minute plan B switch. Variable clock frequency, along with the software solutions to give developers control over power budgets and make it work seamlessly, isn't something they came up with in the span of 6 months or even a year. This was integrated into the SoC design, and that's before even getting into their exotic cooling solution patent. Why spend all this money on R&D if they are not going to use it?

Cerny claimed 2GHz was an unreachable target with the old fixed-frequency strategy; the infamous 9.2TF wouldn't even work with fixed clocks. So what would you consider to be plan A?
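For anyone wondering where the 9.2TF figure comes from, it falls out of the standard FP32 formula (CUs x 64 shader ALUs x 2 ops per clock x clock), assuming the PS5's 36 CUs:

```python
# Quick check of the teraflop numbers being thrown around
# (standard FP32 formula: CUs * 64 ALUs * 2 ops/clock * clock in GHz).
def tflops(cus, ghz):
    return cus * 64 * 2 * ghz / 1000.0

print(round(tflops(36, 2.00), 2))  # 36 CUs at 2.00 GHz -> 9.22 TF (the "infamous 9.2")
print(round(tflops(36, 2.23), 2))  # 36 CUs at 2.23 GHz -> 10.28 TF (announced PS5)
```

So the leaked 9.2TF and the announced 10.28TF are the same 36 CU chip at different clocks, which is why the clock strategy is the whole argument here.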
 
And still, the 2080 SUPER really benefited from having more bandwidth (496GB/s), and it doesn't have a CPU, a 9GB/s SSD, and a 20GB/s sound chip sitting on the same bus
20GB/s is a peak for the Tempest Engine; Cerny even said devs have to be mindful of how much bandwidth they allocate to it, so it won't be typical usage
The 2080 still has to stream all of its data through the PCIe port, so that will take more than 9GB/s
My guess is, because they stayed at 532GB/s for months,
How do we know this? Time between leaks doesn't necessarily translate to time during development
And will 18Gbps chips even be available for mass production this year? I think it's plausible they tested for bottlenecks and decided it wasn't worth the cost
Bandwidth requirements are defined by the jobs the GPU has at the moment, not by how many CUs or MHz it has. Obviously a more powerful GPU will probably receive heavier workloads, but if both consoles are aiming at the same thing, the one with the higher GB/s number will have an easier time doing it.
The more powerful GPU needs more bandwidth to materialize its compute advantage. A 10TF GPU won't reach the same highs as a 12TF GPU; their bandwidth consumption isn't equal
For example, the XSX GPU can run at 18-21% higher resolution going by the compute delta, and in practice it has 20% extra bandwidth to materialize that resolution advantage.
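Sanity-checking those percentages against the announced peak figures (peaks only; real workloads won't scale this cleanly, and the "20% in practice" presumably already discounts XSX's slower 6 GB pool):

```python
# Announced peak specs; the "in practice" bandwidth figure in the post
# is a discounted estimate, so only the raw deltas are computed here.
XSX_TF, PS5_TF = 12.15, 10.28   # FP32 teraflops
XSX_BW, PS5_BW = 560.0, 448.0   # GB/s (XSX fast pool vs PS5 flat)

print(round((XSX_TF / PS5_TF - 1) * 100))  # compute delta -> 18 (%)
print(round((XSX_BW / PS5_BW - 1) * 100))  # raw bandwidth delta -> 25 (%)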
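Sanity-checking those percentages against the announced peak figures (peaks only; real workloads won't scale this cleanly, and the "20% in practice" presumably already discounts XSX's slower 6 GB pool):

```python
# Announced peak specs; the "in practice" bandwidth figure in the post
# is a discounted estimate, so only the raw deltas are computed here.
XSX_TF, PS5_TF = 12.15, 10.28   # FP32 teraflops
XSX_BW, PS5_BW = 560.0, 448.0   # GB/s (XSX fast pool vs PS5 flat)

print(round((XSX_TF / PS5_TF - 1) * 100))  # compute delta -> 18 (%)
print(round((XSX_BW / PS5_BW - 1) * 100))  # raw bandwidth delta -> 25 (%)
```

The raw bandwidth gap (25%) exceeds the compute gap (~18%), which is the post's point once the slow pool is discounted to roughly 20%.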
 
It's not a conspiracy theory, it's out there in the open; just read some so-called professional reviews. That game got review bombed, and the user reviews say it all - a rare case where the user score is much higher than the Metascore:

783732391.jpg
Personally, I believe the user score is usually garbage. If a site with a name like IGN can have a bias, the people who write user
reviews can be even worse; it's not unusual to see 100s and 0s among those reviews, and those reviewers are just fanboys or haters (just look at this forum).

The Metascore is not perfect, but it more closely reflects the general feeling. That doesn't mean a game scoring 60 is garbage;
it only means it's barely above average, or just acceptable. You can even like an 80 game more than one that has a 90,
but one is an opinion and the other is a general consensus.

Usually the people who say they don't believe in the scores are people who don't like where
their game was classified or how it was rated.

I usually follow a couple of sites that I know share my taste, and from those I can tell whether a title will be to my liking.

Maybe the only place a user score is useful is for things like games on Steam, when you want to know if the new updates
solved some issue, but even then there's review bombing.
 
You know what, now I believe the PS5 is using Jaguars. Those bastards at Sony are lying to us on the spec sheet:
you know, Jaguar CPUs which can somehow do the same as Zen 2 but are still Jaguars.

And they convinced all the devs to say that the SSD was the most important thing and, incidentally,
not to mention that the GPU could no longer do ray tracing.

 
Yes, and my follow-up questions were not addressed in the comment I replied to.
  • GitHub is outdated and likely using an older chip as a placeholder (RDNA1). The latest GitHub leak claimed RDNA1 without HW RT.
  • How can you know the chip test was for BC only? And if it was, wouldn't you find it reasonable to assume they were using an RDNA1 placeholder for testing BC?
Lastly, this bit is aimed at you: going from 1.8 to 2.23GHz is no small change; this had to be their target from the beginning, not some reactionary change, especially considering their variable clocks and cooling systems. It's not something you come up with overnight - console chip designs & specs are locked 2 years before launch. Making unplanned changes can compromise yields and set them back months.
You just contradicted your own argument by saying GitHub is outdated. It is less than a year old. Then you said things have to be locked down 2 years out. How would it be an old chip if everything has to be locked down already?
 
It's not a conspiracy theory, it's out there in the open; just read some so-called professional reviews. That game got review bombed, and the user reviews say it all - a rare case where the user score is much higher than the Metascore:

783732391.jpg


A lot of the reviewers didn't test the game after the day-one patch that resolved a lot of the issues some reviewers talked about, so when it was released to the public, a lot of users didn't have those problems and were calling out these reviewers
 
You just contradicted your own argument by saying GitHub is outdated. It is less than a year old. Then you said things have to be locked down 2 years out. How would it be an old chip if everything has to be locked down already?
I don't know how the leaks' timeframe aligns with actual development (i.e. how soon after the chip taped out did it leak?). Also, having locked specs doesn't mean final silicon has taped out.
Anyway, if you continue reading you'll see DrKeo clarified why Oberon was likely RDNA2
 
A lot of the reviewers didn't test the game after the day-one patch that resolved a lot of the issues some reviewers talked about, so when it was released to the public, a lot of users didn't have those problems and were calling out these reviewers
In all fairness, a reviewer can only review what's in front of them. Either provide a relatively bug-free game for review, or do not send out review copies prior to retail release.

Playing the game on a Pro right now, far into it. Flawless performance and bug-free today.
 
In all fairness, a reviewer can only review what's in front of them. Either provide a relatively bug-free game for review, or do not send out review copies prior to retail release.

Playing the game on a Pro right now, far into it. Flawless performance and bug-free today.
Oh, I understand that. They just copped a lot of backlash from fans because they were apparently told the patch would fix a lot of issues when they got the review copies sent to them, and like I said, when retail copies went out and the patch was installed, the game was far better without as many issues.
 
So, no teraflops this time, Phil?
Not to sound mean, but if you've been following closely, the general consensus on BOTH next-gen consoles is that the CPU and SSD I/O provide the biggest changes next gen. These components are actually changing the way games are created.
There is a developer on ResetEra who actually stated that the next next gen (PS6 and Xbox ?) is the generation when, with these new processes much more refined, you're going to start seeing some insane things.
 
Not to sound mean, but if you've been following closely, the general consensus on BOTH next-gen consoles is that the CPU and SSD I/O provide the biggest changes next gen. These components are actually changing the way games are created.
There is a developer on ResetEra who actually stated that the next next gen (PS6 and Xbox ?) is the generation when, with these new processes much more refined, you're going to start seeing some insane things.


Who said that?

It wouldn't surprise me. Next-next gen is when RT is going to be used abundantly, and not just as some neat trick here and there. Think: 40-60 TF.
 
Not all teams, some teams. Not just one chip per console, many, and for other products too. Trying to dismiss the GitHub leak as some kind of fake, at this point where we already know AMD holds the copyright to that data and that it was 100% accurate regarding both systems' silicon, isn't really going anywhere. GitHub was real, 1000%, but it was a peek into development, not final chips.


We do have some screencaps, but you can't capture the whole leak; the amount of data is unbelievable.


If they were testing for bottlenecks and 448GB/s was enough, they would have stopped at 510GB/s once they saw 510GB/s was overkill. But they went even higher and got to 532GB/s with Oberon B0, and it stayed that way in Flute, which was months after Oberon B0. So basically they started at 448GB/s, moved to 510GB/s, then moved to 532GB/s and stayed there for months. We can't tell anything for sure, but it really seems to imply Sony wanted 500GB/s+ for the PS5. And it's not that surprising. I mean, the 5700 has 448GB/s and that's a sub-8TF card, and on top of that add a 9GB/s SSD, an up-to-20GB/s audio chip, a powerful CPU, and RT & denoising, which hog a lot of memory bandwidth. Even without the GitHub and Flute leaks I probably would have assumed Sony wanted more than 448GB/s and that DRAM prices made them go with 14Gbps.

I'm not sure why people happily assume MS gave up on 20GB and went with a weird uneven GDDR6 chip setup because of DRAM prices (which is probably true), but for some reason deny that Sony probably went with 14Gbps for the same reason.

Wow, very interesting info if true. It's really shocking that such sensitive and detailed info got leaked. Anyway, it's a shame they had to settle for 14Gbps. The RAM setup is easily the worst part of these consoles.
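For reference, the GB/s figures being discussed fall straight out of bus width times per-pin data rate. A tiny sketch, assuming standard GDDR6 and the commonly cited retail bus widths (the leaked 510/532 figures don't land on standard speed grades, so they were presumably test configurations rather than retail parts):

```python
# Peak GDDR6 bandwidth: bus pins * Gbps per pin / 8 bits per byte.
def gddr6_bandwidth(bus_bits, gbps_per_pin):
    """Return peak bandwidth in GB/s for a given bus and speed grade."""
    return bus_bits * gbps_per_pin / 8

print(gddr6_bandwidth(256, 14))  # PS5: 256-bit @ 14 Gbps -> 448.0
print(gddr6_bandwidth(320, 14))  # XSX: 320-bit @ 14 Gbps -> 560.0
```

So 448GB/s is exactly a 256-bit bus on 14Gbps chips, and moving to 18Gbps chips on the same bus would have been the obvious (but pricier) route to 500GB/s+.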
 
It's not a conspiracy theory, it's out there in the open; just read some so-called professional reviews. That game got review bombed, and the user reviews say it all - a rare case where the user score is much higher than the Metascore:

783732391.jpg
I'd say that 8/10 is quite generous for Days Gone. I enjoyed the game, even got the platinum, but when you compare it to the best of the best, I think 7/10 is fair. It's a good game and we can expect improvements in a sequel.

So, no teraflops this time, Phil?
Reports started flowing in that multiplatform performance isn't really ahead of the competition, or even falls behind. Time to advertise something else. We'll soon see a lot of shifting narratives from the Discord gang too.
 
The bandwidth change from 500+ to below 500 is allegedly one of the issues on which the Japan side disagreed with the US side of PlayStation.
 
I've been gone for a month, so I'm asking whoever would be kind enough to answer: have there been any leaks or hints from insiders here about any next-gen games or event dates? I'll even take unverified rumors lol
 
I have to agree with this strongly. I hadn't gotten Days Gone at first because though it looked really cool to me, what I was reading about it gave me pause. Sounded like it was super buggy and maybe broken. I didn't pay attention to the reviews talking about how male-centric it was or how it was based on a "macho worldview" because frankly, I don't give a shit about that. Me caveman! Me shoot things and go BOOOM! :)

But it looked really cool so I ended up getting it probably 3 weeks to a month after release. Apparently they'd fixed the bugs by then because I had no issues. The game was, IMHO, much better than it had been reviewed. Sure, it's a certain type of story. Show me a game based on a (basically) zombie apocalypse with the modern world gone that's centered on "woke" principles and story line and I'll show you one LOUSY game that shouldn't have been made! lol. So I really liked it.

Look, I think there's room in games to tell all kinds of stories. When a game isn't centered on a specific character and type of character, I want and expect to be able to customize my character, make it male or female or look like whatever I want. If the game IS centered on a specific character and type, like say the Last of Us was....that's fine too. I'm not going to ding TLOU for having a white dude in it, and I wouldn't rip that game or a similar one for having any other race character as the center. That's all good. Same thing with other aspects that people deem "woke." When things are central to the story, all good and as I said, there's a LOT of room for storytelling in games.

What I DO object to, is when such things are shoveled into a game just "because it's time" or "because it's 2020 now." If there's not a reason central to the story, game or character, I don't care how many women the main character bangs. Likewise if the main character was gay or anything else. We're seeing this shoveling in a lot of movies right now. Instead of making a good strong balanced female character and making it about that, the movie creators make an unrealistic caricature of a woman and not only that, they have to make all the male characters useless just to show even more how awesome the female character is. (See: The Last Jedi as one good example of this). Also, isn't it insulting that to make a strong woman character they so often make the woman act MALE? She's twice as ass kicking and beer chugging as her male colleagues, just to show how strong she is. Can't a woman be strong without being "butch?" Same for men...can't men be sensitive, caring or idealistic without acting feminine? Now, of course there's a place for masculine women and feminine men as well, but it just seems that things are out of balance.

The other problem is making things "woke" by breaking the world lore and story. Dr. Who is a great example of this. Throughout the history of the show and books there were male and female TimeLords. It was even joked about that a gender transition in a regeneration would be a disaster...rumored to have happened but doubtless untrue. Then "Because it's time" they broke all that lore and made the new Doctor regenerate as a female. Wouldn't have bothered me if that had been a part of the show and story's lore, etc. but it wasn't. They shoehorned it in and then told everyone if they didn't like it, it was because you were a misogynist.

Anyway, enough of that. Bottom line, I agree when it comes to games. There's lots of room to tell lots of stories and to keep characters diverse as well. Everyone should get to play at least some games as a character that resembles themselves or whatever image they so choose. I just think it's the wrong thing to do to try and force or shoehorn things into those stories in the name of being "woke." It doesn't feel legitimate and I think it is frankly insulting.

Indeed.

It is super annoying when some "minority" is brute-forced into a game/movie just because some people can't handle reality and/or overreact to everything.

Like when making a medieval game set in Europe -> there should be 99-100% white characters in most cases, and that should be fine. Or even a game set up until the 90s-00s in many countries. I think I was around 6-10 years old when I saw my first black person in real life - the only black girl in my school - and not until the 00s and 10s was it common to see anyone other than white people here.

I have talked to a black person maybe once or twice in my life; it just doesn't happen.

And yet some people cry that every movie and game must have "variety". Make a 100% black game? Nothing wrong with that, so the same should apply to every ethnicity. And it's funny how it's fine to replace white characters with black ones, but replace some known black character with a white one - like a white Black Panther? It should be an equal thing to do, with the same rules. But it would explode the heads of crazy people.

Thing is, the people who cry about it are also lazy, talentless and stupid, because instead of writing and making good, deep new characters, they just want to copy-paste a black person into the shoes of a non-black one. And they forget all the other ethnic groups in the process. And on a global scale whites are more of a minority than blacks anyway.

A sane person doesn't need a character to share their gender/sexuality or ethnicity to like the game.

Just look at Tomb Raider - is anybody complaining that you play as a woman? No. IMO it is cool to see the story from other angles, as long as it is written that way and not for political reasons. Well, some women complained that Lara is too fit... as if a fat person could do what she does.

I think the problem is that these kinds of people just take things too personally; instead of watching a movie or playing a game, they take it as a personal attack or something stupid like that.

Days Gone is a good game, a nice story of a biker dude's survival in a zombie world. If someone wants to make a game about trans-gay-dudes' adventures, nobody is stopping them. But sales figures could

And what does this have to do with next gen?

Well, I fear that this retard-snowflake culture will ruin many games next gen; they kind of ruined The Last of Us 2 already, or at least I get the feeling it was made for political reasons and to show how "woke" they are, instead of purely for artistic and storytelling reasons.
 
Do you want to share AMD's organization tree with us? Considering the data is 100% AMD-copyrighted, can we move on already?

We do know that one of the leaked chips was XBSX. It's easy to assume that the other one was also Xbox-related.
It's pretty far-fetched to assume that tests for Sony leaked too. And we do know it's not what Sony said their hardware would be like.
So, to me it's obvious that all the chips were for MSFT unless we get some other info.
Yet everybody somehow automatically assumed PS5. Not to mention that there were previous leaks that XBSX would be 9TF.
 
Well, I fear that this retard-snowflake culture will ruin many games next gen; they kind of ruined The Last of Us 2 already, or at least I get the feeling it was made for political reasons and to show how "woke" they are, instead of purely for artistic and storytelling reasons.
Ruined? Seriously? Who says this cannot happen in an artistic story? It happens frequently in real life, so why not in games and movies? Disregarding topics like this entirely would be disrespectful and stupid.
 
If you expect one console to be able to run games that look like the Quixel demos while the other can't, you are in for a disappointment. This coming generation, the graphical difference between the consoles will probably be the lowest ever.

Of course. But one console will show a better level of detail, if taken advantage of, because of its innovative SSD and I/O.

And the LOD difference will show in YouTube comparison videos, as opposed to screen output resolution.
 
Ruined? Seriously? Who says this cannot happen in an artistic story? It happens frequently in real life, so why not in games and movies? Disregarding topics like this entirely would be disrespectful and stupid.
Yeah, I agree. With things like Battlefield V it's simply brain-dead to push identity politics. But in a tale that will undoubtedly be deeper than your usual FPS pop-off, ideas should flow freely if handled with nuance and self-awareness.
 
We do know that one of the leaked chips was XBSX. It's easy to assume that the other one was also Xbox-related.
It's pretty far-fetched to assume that tests for Sony leaked too. And we do know it's not what Sony said their hardware would be like.
So, to me it's obvious that all the chips were for MSFT unless we get some other info.
Yet everybody somehow automatically assumed PS5. Not to mention that there were previous leaks that XBSX would be 9TF.

I think the codenames and their association with Shakespeare match the PS5, considering its audio chip is named Tempest. The clocks also hinted at BC. Cerny also mentioned 2GHz specifically in the GDC video, so that's another thing.

It's a head-scratcher for sure how the hell this happens, as it hints at borderline corporate espionage.

It sucks they cut the bandwidth, but considering they both landed on the same ratios, it must have been a price/performance balance. At the end of the day, these are consoles.
 
I have to agree with this strongly. I hadn't gotten Days Gone at first because though it looked really cool to me, what I was reading about it gave me pause. Sounded like it was super buggy and maybe broken. I didn't pay attention to the reviews talking about how male-centric it was or how it was based on a "macho worldview" because frankly, I don't give a shit about that. Me caveman! Me shoot things and go BOOOM! :)

But it looked really cool so I ended up getting it probably 3 weeks to a month after release. Apparently they'd fixed the bugs by then because I had no issues. The game was, IMHO, much better than it had been reviewed. Sure, it's a certain type of story. Show me a game based on a (basically) zombie apocalypse with the modern world gone that's centered on "woke" principles and story line and I'll show you one LOUSY game that shouldn't have been made! lol. So I really liked it.

Look, I think there's room in games to tell all kinds of stories. When a game isn't centered on a specific character and type of character, I want and expect to be able to customize my character, make it male or female or look like whatever I want. If the game IS centered on a specific character and type, like say the Last of Us was....that's fine too. I'm not going to ding TLOU for having a white dude in it, and I wouldn't rip that game or a similar one for having any other race character as the center. That's all good. Same thing with other aspects that people deem "woke." When things are central to the story, all good and as I said, there's a LOT of room for storytelling in games.

What I DO object to, is when such things are shoveled into a game just "because it's time" or "because it's 2020 now." If there's not a reason central to the story, game or character, I don't care how many women the main character bangs. Likewise if the main character was gay or anything else. We're seeing this shoveling in a lot of movies right now. Instead of making a good strong balanced female character and making it about that, the movie creators make an unrealistic caricature of a woman and not only that, they have to make all the male characters useless just to show even more how awesome the female character is. (See: The Last Jedi as one good example of this). Also, isn't it insulting that to make a strong woman character they so often make the woman act MALE? She's twice as ass kicking and beer chugging as her male colleagues, just to show how strong she is. Can't a woman be strong without being "butch?" Same for men...can't men be sensitive, caring or idealistic without acting feminine? Now, of course there's a place for masculine women and feminine men as well, but it just seems that things are out of balance.

The other problem is making things "woke" by breaking the world lore and story. Dr. Who is a great example of this. Throughout the history of the show and books there were male and female TimeLords. It was even joked about that a gender transition in a regeneration would be a disaster...rumored to have happened but doubtless untrue. Then "Because it's time" they broke all that lore and made the new Doctor regenerate as a female. Wouldn't have bothered me if that had been a part of the show and story's lore, etc. but it wasn't. They shoehorned it in and then told everyone if they didn't like it, it was because you were a misogynist.

Anyway, enough of that. Bottom line: when it comes to games, I agree. There's lots of room to tell lots of stories and to keep characters diverse as well. Everyone should get to play at least some games as a character that resembles themselves or whatever image they so choose. I just think it's the wrong thing to do to try and force or shoehorn things into those stories in the name of being "woke." It doesn't feel legitimate and I think it is frankly insulting.

I agree, I don't care whose perspective the game follows (be it a flea, human being, alien, whatever) as long as there is a story to be told. If it's a good story, so much the better.

Usually people screaming "there's a hidden agenda!" are the people who actually have an agenda of their own.
 
Who said that?

It wouldn't surprise me. Next-next gen is when RT is going to be abundantly used and not just some neat trick used here and there. Think: 40-60 TF.
Senior visual artist on ResetEra. I can't find the post now, but yeah, he essentially said this gen will be devs learning how to best take advantage of the new way of developing games, without hard drive limitations and with the increased CPU muscle.

I really don't think 40-60 TF though. At best we may see high 20s. Can't remember who said that you need 40-50 TF to get to photorealistic games.

EDIT: Ah, it was Tim Sweeney who said that ^^
 
Last edited:
Can someone explain to me why some people try so hard to paint the Github leak as false?

The specs are out now, in a way. Why keep saying that the Github leak was false?

Is there something in the Github leak that somehow makes the PS5 look bad?
 
Senior visual artist on ResetEra. I can't find the post now, but yeah, he essentially said this gen will be devs learning how to best take advantage of the new way of developing games, without hard drive limitations and with the increased CPU muscle.

I really don't think 40-60 TF though. At best we may see high 20s. Can't remember who said that you need 40-50 TF to get to photorealistic games.

EDIT: Ah, it was Tim Sweeney who said that ^^

In my opinion, 20 TFLOPS will be possible with the mid-gen "PS5 Pro" and "XSX Next" consoles, if they do the same as this generation.
 
Can someone explain to me why some people try so hard to paint the Github leak as false?

The specs are out now, in a way. Why keep saying that the Github leak was false?

Is there something in the Github leak that somehow makes the PS5 look bad?
I don't understand why ppl even give a %uck about the github leaks at this point. Gtf over it already; real, certified console info has already been given, with more to come 🤦‍♂️🤦‍♂️
 
Can someone explain to me why some people try so hard to paint the Github leak as false?

The specs are out now, in a way. Why keep saying that the Github leak was false?

Is there something in the Github leak that somehow makes the PS5 look bad?
Nope. It just fuels a theory that Sony upclocked the PS5 GPU to narrow the gap between PS5 and XSX.
And some people think so highly of Cerny (he is obviously the second coming of God) that it just can't be possible that he would make a late frequency adjustment because he underestimated Microsoft.
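Since the upclock theory keeps coming back, here's the back-of-the-envelope math it rests on. The formula (CUs × 64 shaders × 2 FLOPs/clock × frequency) is the standard RDNA peak-FP32 calculation; the 36 CU / 52 CU counts and final clocks are the official specs, while the 2.0 GHz PS5 figure is only what the GitHub leak reportedly showed, so treat that one as unconfirmed:

```python
# Peak FP32 TFLOPS for an RDNA-style GPU:
# CUs x 64 shaders per CU x 2 FLOPs per clock x clock (GHz) / 1000.
def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

# PS5 at the leaked (unconfirmed) 2.0 GHz test clock vs. the final 2.23 GHz,
# and XSX at its official fixed 1.825 GHz.
print(f"PS5 @ 2.0 GHz (leak):  {tflops(36, 2.0):.2f} TF")    # ~9.22 TF
print(f"PS5 @ 2.23 GHz (final): {tflops(36, 2.23):.2f} TF")  # ~10.28 TF
print(f"XSX @ 1.825 GHz:        {tflops(52, 1.825):.2f} TF") # ~12.15 TF
```

That jump from ~9.2 to ~10.28 TF on the same 36 CUs is exactly why the "late upclock" reading of the leak exists; whether the clock change was reactive or planned is the part nobody outside Sony can prove.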
 
Nope. It just fuels a theory that Sony upclocked the PS5 GPU to narrow the gap between PS5 and XSX.
And some people think so highly of Cerny (he is obviously the second coming of God) that it just can't be possible that he would make a late frequency adjustment because he underestimated Microsoft.
There's a huge chunk of Cerny's presentation devoted to why those frequencies were chosen and the reasoning behind them.
 
Can someone explain to me why some people try so hard to make the Github leak false?

The specs are out now in a way. Why keep saying that the Github leak was false?

Is there something in the Github leak that somehow makes the PS5 look bad?
Because who gives a fuck? What is the point? How about we just use official specs? Good idea right?
 
Nope. It just fuels a theory that Sony upclocked PS5 GPU to narrow the gap between PS5 and XsX.
And some people think so highly of Cerny (he is obviously second comig of God), that it just can't be possible that he would make a late frequency adjustment because he underestimated Microsoft.

"Underestimated Microsoft"

Maybe you're overestimating how much Xbox matters to the team engineering the Playstation 5. "The losers focus on winners, winners focus on winning." - Anonymous
 
There's a huge chunk of Cerny's presentation devoted to why those frequencies were chosen and the reasoning behind them.
Not saying he didn't.

But ask yourself a question: if it were true, do you think Cerny would tell you that in his presentation?

(Answer is NOPE)

But I won't bother talking about it anymore. I just provided nosseman an explanation. Because if I dared to say it again, it would be "he is spreading FUD", "Discord something something", "Xbot fanboys incoming" and other shit from Cerny's believers here.
 
Wait, so people subscribed to the Official PlayStation Magazine in the UK get the digital copy a week before its physical copy goes on sale (next Tuesday, May 5th), and that day is supposed to be today. Based on what's on the magazine's June issue cover, I think we're getting a DualSense info blowout with a Horizon Zero Dawn 2 reveal in a few hours.


[image: OPM UK June issue cover]
 
Last edited:
Wait, so people subscribed to the Official PlayStation Magazine in the UK get the digital copy a week before its physical copy goes on sale (next Tuesday, May 5th), and that day is supposed to be today. Based on what's on the magazine's June issue cover, I think we're getting a DualSense info blowout with a Horizon Zero Dawn 2 reveal in a few hours.


[image: OPM UK June issue cover]

I can't imagine with the mass starvation (for those waiting for any news from Sony) that anything would be held back lol...there will be leaks of the mag everywhere 🤔....leaks and quarantine....leaks.and.quarantine lol.

What a crazy ass time here as well...folks are getting themselves banned left and right every week 😅...
 
Last edited: