
Ali Salehi, a rendering engineer at Crytek, contrasts the next-gen consoles in an interview (Update: tweets/article removed)

GHG

Member
I get what you're saying, but Cell Processor the Series X setup is not. It's not even in the same stratosphere. Also, didn't you work for Lionhead?

The thing is we don't even have a rough percentage figure for what they are typically able to extract from the Series X.

All he's said is that getting to 100% is difficult (or words to that effect). If they are typically able to reach 90-95%, for example, that's still better than the PS5 running at 100% all of the time (which it won't). It also means there's more room for improvement across the console's lifecycle as the developer tools improve (which he also alludes to).

Which is why all of this upset makes no sense. If you take away the headline statements, which are what's being used as ammunition, then there are still a lot of things he said that are favourable to the Series X.
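The utilization point above is just multiplication; a hypothetical sketch (the 90-95% figures are the example given above, the peak TFLOPS numbers are the public specs):

```python
# Hypothetical check: a 12.15 TFLOPS GPU at 90-95% utilization vs.
# a 10.28 TFLOPS GPU at a (generous) 100%. Peak figures: public FP32 specs.
XSX_PEAK = 12.15
PS5_PEAK = 10.28

for util in (0.90, 0.95):
    effective = XSX_PEAK * util
    print(f"XSX at {util:.0%}: {effective:.2f} TFLOPS "
          f"({'above' if effective > PS5_PEAK else 'below'} PS5 peak)")
```

Even at 90% utilization the effective throughput (about 10.94 TFLOPS) stays above the PS5's peak, which is the point being made.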
 

rnlval

Member
None of this changes the intrinsic capability of the hardware itself. Existing consoles are built under these same processing and rendering paradigms with semi-custom design and yet they perform exactly as expected because the hardware is predictable given it's all sourced from the x86 PC sector.

I appreciate your posts because they actually come with a modicum of seriousness, but at the same time I find them just as ridiculous as the hyperbolic drivel shelled out by others here. Both CPUs will perform similarly to a Ryzen 1600AF (assuming it had 8 cores), one slightly better than the other. One GPU will perform around the level of a 2070 Super, the other around the level of a 2080 Super.

This isn't groundbreaking or controversial, it's common sense, because like the above discrete hardware the application of these components in these consoles is equally predictable.
The Ryzen 1600AF is based on Zen 1.x, without proper 256-bit AVX2 hardware.
 

Ashoca

Banned
No.

He said a bunch of stuff you don't happen to like. Whether it's "accurate" or not is another discussion, but the fact is he's either worked on the consoles or works closely alongside those who do. You've done neither. For that reason I'll take his word over yours, no matter how much you want to cry about him being "wrong", only to run away from the discussion whenever you find yourself out of your depth or your rebuttals are disproven.

So because this whole thing is hurting your fragile ego, which is intrinsically linked to the performance of a particular brand of console, you are now deriving satisfaction from seeing this guy potentially lose his job. That's sad to say the least.

What? It's just a plastic box? Why would anybody be "hurt"? lol

I think the problem is that there was a lot of inaccurate information in this article, and it was very biased towards Sony. Looking through the engineer's Twitter and Instagram feeds, you could clearly see that he is a Sony fanboy, so no wonder he was very biased.

Now they've deleted everything and he even got fired. If everything he said was true and made sense, he wouldn't have been fired and wouldn't have deleted everything. It was just very embarrassing to Crytek as a whole, seeing how clueless their engineers are.
You have to understand that Crytek has very high standards; their games are known for having the best graphics and the most talented engineers... and then you see an employee publicly spouting so much nonsense? Yeah, that's embarrassing.

Anyway, relax. We will find out as soon as the games are released and shown.
 

Panajev2001a

GAF's Pleasant Genius
Anybody with their head firmly fastened to their shoulders knows that whole spiel was littered with inaccuracies and nonsense; regardless of the reason for the firing, it wasn't credible information, and it was veiled in fanaticism.
Despite your displayed sense of superiority over anyone saying something positive about the PS5 (he did not really say anything negative about the XSX, not sure what you are up in arms about), the content of what he said is not the reason he could have been fired for, and you know this.

An HR- and Marketing-unsanctioned interview potentially causing fractures in the relationship with a key supplier... but no, they would have been OK if he'd just talked about how amazing 12 TFLOPS is and how that was really the key versus a much less useful extra speed boost for the SSD, which would only get a marginal loading-time reduction. Oh, they would have kept him if he'd just said that :rolleyes:.

Also there was no AMD leaker, the whole GitHub situation wasn't intentional. You have a really bad habit of comparing things that are incomparable.

Intentional or not wouldn't make a lot of difference from an HR point of view, but you are sure welcome to test this in your FTSE 250 company; I am not going to in the one I work in. The two are comparable, despite that not suiting your argument, which is the biggest difference yet again.
 
Last edited:

Panajev2001a

GAF's Pleasant Genius
If everything he said was true and made sense, he wouldn't have been fired
Are you having a laugh? Before the release, you potentially break NDA and piss off one of the companies keeping your own company afloat (oh yeah, MS would have been fine having even slightly but potentially damaging PR out there that could impact pre-orders, had the guy been more factual, maybe producing benchmarks... :rolleyes:) in an interview that HR and Marketing did not sign off on, and nothing happens?!

MS got pissed, and Crytek was able to claim plausible deniability and cut that guy's head off, metaphorically. Case closed and lesson imparted to all other employees.
 
Last edited:

rnlval

Member
What he said may have negatively impacted Crytek's reputation and relationship with Microsoft, but it wasn't factually wrong in any way (you just want it to be so because you're a fan of Xbox).
Crytek's reputation in the PC gaming sector is already in negative territory, and Crytek has a history of bashing gaming PCs.

My response to Ali Salehi's dissing of DirectX is: fuck off, Crytek.
 

ZywyPL

Banned
We all know it's going to happen :)
I don't think any Sony first-party game will disappoint in that department.
Sony always delivers games no matter the hardware.

Don't get me wrong, Sony's 1st party studios will undoubtedly pull out amazing-looking games. They already do; they will be making them for 5-7 years, delaying them 3-4 times, ND will crunch their employees to death, but eventually we will get games that look beyond what even a 10k PC has. So what? I'm too old to get excited about graphics, especially with real-time RT finally becoming a thing.

For me the questions are: will their studios finally break the 30FPS barrier? Will they put more focus on actually fun/enjoyable gameplay, rather than slow/sloppy/sluggish controls? Will they have any MP proposition that will let me play their games with my friends for countless hours, rather than finishing the game over the weekend and that's it?

The graphics will be forgotten really fast; usually within half a year a new technically impressive game shows up. No one remembers TO1887, KZ:SF, DC, Ryse, Crysis 2/3 etc. after all the initial hype they got because of the graphics, and there are solid reasons behind that. Even UC4 seems to be forgotten, and it was supposed to be the PS4's messiah, but now all that many people say about it is how bad it is compared to previous installments, its SJW agenda, and whatnot; no one cares about its graphics anymore. From all the studios they have, only Insomniac has a great track record of making games that are fun to play, like REALLY fun to play.


That position is not correct, as there are proper analogs in the PC world for the performance of similar architectures, such as AMD's Radeon Pro SSG: How the Radeon Pro SSG Works - The Basics | GamersNexus - Gaming PC Builds & Hardware Benchmarks

But that's a professional application, where even a mere 10-15FPS (as shown in the video) is considered real-time, as opposed to a few minutes of rendering. Will it be applicable to 60FPS games? That's the real question.
 

ruvikx

Banned
Sounds like we just had a 20-page discussion on an opinion-based article written by an entry-level employee.

I said elsewhere this is the most boring console war ever & this entire debacle is simply more evidence of that. Some random interview given by a random guy who isn't even one of the main guys in his studio shouldn't be used by console warriors to make a point about their box being better. But it was.

If he really was fired, he could also blame the people who translated his words and the others who put them on a pedestal all over social media because they're desperate for validation, i.e. those who brought him to the attention of his bosses. One man's loss is another man's gain, though; some lucky guy is going to get hired by Crytek. For whoever that person is, Christmas might have come early.
 

Hobbygaming

has been asked to post in 'Grounded' mode.
Don't get me wrong, Sony's 1st party studios will undoubtedly pull out amazing-looking games [...] From all the studios they have, only Insomniac has a great track record of making games that are fun to play, like REALLY fun to play.




But that's a professional application, where even a mere 10-15FPS (as shown in the video) is considered real-time, as opposed to a few minutes of rendering. Will it be applicable to 60FPS games? That's the real question.
Meh. If you want every game to be 60FPS, stick to PC. Many of my favorite games have been 30FPS, and they're fun games with great gameplay, like God of War and Spider-Man.

No one has forgotten Uncharted 4; it sold 17 million and won many awards in 2016. It's one of the greatest games, and I still play the campaign every year.
 

Panajev2001a

GAF's Pleasant Genius
Don't get me wrong, Sony's 1st party studios will undoubtedly pull out amazing-looking games [...] From all the studios they have, only Insomniac has a great track record of making games that are fun to play, like REALLY fun to play.




But that's a professional application, where even a mere 10-15FPS (as shown in the video) is considered real-time, as opposed to a few minutes of rendering. Will it be applicable to 60FPS games? That's the real question.

I think that improving performance while making the console easier to program for and as surprise-free as possible (as in the "shit, why is performance tanking like crazy here... what is the arcane magic to get the performance stated in the manuals?!?" kind of surprises) is what helps developers focus more on gameplay.

That is why, after PS4 and introducing his Time to Triangle metric (and seeing developers convinced by his arguments once they tried to code for PS4), he is now stressing that the Time to Triangle for PS5 is even lower than it was for PS4, despite the performance improvements and higher ceiling.

 
Last edited:
Anybody with their head firmly fastened to their shoulders knows that whole spiel was littered with inaccuracies and nonsense; regardless of the reason for the firing, it wasn't credible information, and it was veiled in fanaticism.

"Anybody with their head firmly fastened to their shoulders knows that whole spiel was littered with inaccuracies and nonsense"

:messenger_grinning_smiling: LOL, nice logical fallacy
 

ZywyPL

Banned
That is why, after PS4 and introducing his Time to Triangle metric (and seeing developers convinced by his arguments once they tried to code for PS4), he is now stressing that the Time to Triangle for PS5 is even lower than it was for PS4, despite the performance improvements and higher ceiling.

Yeah, I remember that whole TTT thing from Cerny's PS4 reveal, how he said it was back to what it was in PSX days. Unfortunately, as it turned out, it has nothing to do with the actual reality, at least for us gamers: games now take way more time to develop, half a decade or even more; they get delayed more times than ever before; they come out in the worst state ever (basically betas, sometimes even alphas), heavily bugged, with the full content not there on time. It was a nice promise back in early 2013, but I'm not falling for it again. If anything, I expect things to get even worse, especially on the artists' side, who will have to spend even more time crafting even more detailed assets.
 

Panajev2001a

GAF's Pleasant Genius
how he said it's back to what it was in PSX days, but unfortunately as it turned out it has nothing to do with the actual reality, at least for us gamers

You really think that?! I fail to see how the games that are pushing the boundaries of what these puny base consoles can do (leaving the updated HW refreshes aside), and the wealth and diversity of games Xbox One and PS4 got (and the quality of AA and indie products), would be possible if that did not matter, or if it really had "nothing to do with the actual reality for us gamers" as you said (especially compared to PS3).

Sorry, but I see that as shortsighted and a bit of involuntary revisionist history regarding the efforts both MS and Sony put into improving how games are made. MS and Sony enjoyed the benefits of this too, with blossoming digital sales and tons more content coming through their storefronts (fewer and fewer PC exclusives, more genres, etc...).

He made a promise and kept it; I posted one clear piece of evidence from the developers that finally ported the KEX engine to PS4 for DOOM 64. There is more to development than making easy-to-code-for and powerful HW; there is nothing more any console maker can do about that, and nothing they promised could be fixed automagically.
 
Last edited:

Leyasu

Banned
The thing is we don't even have a rough percentage figure for what they are typically able to extract from the Series X. [...] If you take away the headline statements, which are what's being used as ammunition, then there are still a lot of things he said that are favourable to the Series X.
It should also be noted that no console has ever been maxed out within its first couple of years.

The software that comes in 20/21 will not look the same as games in 23/24. As the gen progresses, devs will learn to get even more from the hardware.
 

Neofire

Member
That we aren't creating topics on an interview with Lady Bernkestel, since there we would also see the bias. Just like we do now. Developers can also be fanboys, and looking at his Twitter, this one clearly is.
Never said we were, but can you refute anything he said on a technical level? How many games has Lady Bernkastel worked on? Or you, for that matter? He's worked on both systems. Everyone has some level of bias, but in this case it doesn't dismiss his technical knowledge of game development.
 

pawel86ck

Banned
Nobody is saying they are a burden... what he said is that it's hard to use all of them for rendering.

Your 2080 Ti vs 2070 Super is a good example.
4352 SPs @ 1545MHz vs 2560 SPs @ 1770MHz.
The difference in SPs is 70%.
If you factor in the clocks... ~50%.
Does the RTX 2080 Ti show over 50% more performance than the RTX 2070 Super?
Nope. It averages 23% better.

How much can that RTX 2070 Super close the gap with a 100MHz, 200MHz or bigger overclock?

It's the same if you compare the 2080 Super and the 2080 Ti: the utilization of SPs decreases, so performance doesn't scale like in an ideal scenario, because more SPs sit unutilized at render time.

The same happens with AMD cards... you double the CUs but the performance can't double.

AMD did a good job decreasing the wave size in RDNA, which gives it better utilization of the CUs in parallel than GCN, but it is still an issue they need to work on.

You can watch the GPU % usage in RivaTuner or another app... the utilization of the 2080 Ti is way lower than the 2070 Super's in the same game... that is a clear example of what the Crytek guy said.
I don't have a 2080 Ti, but I have seen many gameplay videos, and at higher resolutions GPU usage was almost always at 99% (besides CPU-limited scenarios), so what you wrote is simply a lie.



99% GPU usage almost the entire time!


Also, although TechPowerUp suggests the 2080 Ti is 20% faster than both the 2070 and the 1080 Ti, in reality it's very hard to find benchmarks that show only a 20% difference at 4K.

[benchmark chart: wolfenstein-2-3840-2160.png]


Here the 2080 Ti shows 39% better performance compared to the 2070 Super.

[benchmark chart: hellblade_3840-2160.png]


Here, for example, the 2080 Ti is 51% faster than the standard 2070 and the 1080 Ti. The bigger, slower-clocked chip (2080 Ti) is still better in the end, and it will be exactly the same on consoles, meaning the 56-CU 12TF GPU will easily beat the 36-CU 10TF one.
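As a sanity check on the scaling numbers being thrown around in this exchange, the theoretical ratios can be reproduced (SP counts and clocks as quoted above; this is a sketch of the arithmetic, not a benchmark):

```python
def tflops(shaders, clock_mhz):
    """Peak FP32 TFLOPS: shaders x clock x 2 ops/clock (FMA), MHz -> TFLOPS."""
    return shaders * clock_mhz * 2 / 1e6

ti  = tflops(4352, 1545)  # RTX 2080 Ti,    ~13.45 TFLOPS
sup = tflops(2560, 1770)  # RTX 2070 Super, ~9.06 TFLOPS

print(f"SP ratio:          {4352 / 2560:.2f}x")  # ~1.70 -> the "70%" figure
print(f"Theoretical ratio: {ti / sup:.2f}x")     # ~1.48 -> the "~50%" figure
# Measured 4K gaps cited in this exchange run ~23-39%, below the ~48%
# theoretical ceiling: utilization of the wider chip rarely scales linearly.
```

This is exactly why both sides can be "right": the theoretical gap and the measured gap are different numbers.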
 
Last edited:

ZywyPL

Banned
You really think that?!

That's not what I think, that's just the reality: games take way longer to develop than ever before, despite that TTT being just a month or two, just like on PSX. Now it will supposedly be even less than a month, but Cerny even said during his presentation that this doesn't mean developers won't take as much time as they want/need to materialize their visions for their games.

We saw Death Stranding being pulled out within what, two years? But that was on a well-established and polished engine, on the well-known PS4 platform. If the current generation lasted another 3-5 years, I'm sure we would see such development-time decreases across all the gaming studios out there. But it's a new generation; everything resets. Engines and tools have to be adjusted and enhanced, and devs will have to learn new technologies and techniques and get to know what the consoles are truly capable of. It will take years again until devs get familiar with the new systems; I reckon 3-4 years again (2023-24) before we start seeing the so-called "true" next-gen games.
 
I'm glad he was fired for spreading misinformation based on his fanboy-biased perspective.

12 is greater than 10.
3.8 is greater than 3.5.

Basic numbers and logic prevailed in the end.
He wasn't fired, lol.

He explained that the teraflop figure is a theoretical max number, and the XSX has the advantage there. If the theoretical max can be reached, then the XSX will push more pixels. He even said the XSX will have a resolution advantage; he didn't say otherwise.

He just thinks the PS5 is easier to code for, and he likes the fast-data approach Sony has taken, which is similar to NVIDIA's.

That's all.
 

GymWolf

Gold Member
Which is why the 2080 Ti, with more SMs (CUs) at greatly lower frequencies, absolutely crushes the 2080 Super, right?

You guys are flat-out talking out of your asses, trying to extrapolate from incorrect information in a veiled attempt at making it appear correct. Wrong is wrong.
can i crush a 2080ti with my 2070super?
 

Muddy

Member
Seriously, why do Xbox fans get triggered by people praising PS5 hardware? It's not like he said the XSX is weak or bad. Is it simply because it's the only thing Microsoft has over Sony, a hardware advantage? Because in terms of software there's only one winner, which is Sony. Enjoy it while it lasts, because eventually it's all about the games. No gamer likes to see their console collecting dust, no matter how many teraflops it has, just like the Xbox One X currently is.
 
Last edited:

replicant-

Member
Only shows he really was under NDA, that he had access to both dev kits, and that he knew what he was talking about.

An incredibly naïve take.

I suspect he was fired for breaking a term of his employment: confidentiality. It doesn't matter whether the details were correct or not.
 
Is he fired?
Edit - He wasn't fired, lol.

Oh, I thought he was. Haha. Crytek must be standing with him on what he said, then.

To be fair, Matt from Era (is he a dev?) and a dev from DICE are basically saying the same thing as the Crytek dev: both consoles are very close to each other despite the XSX having more teraflops. It looks like it has a lot of bottlenecks.
 

rnlval

Member
Nobody is saying they are a burden... what he said is that it's hard to use all of them for rendering.

Your 2080 Ti vs 2070 Super is a good example.
4352 SPs @ 1545MHz vs 2560 SPs @ 1770MHz.
The difference in SPs is 70%.
If you factor in the clocks... ~50%.
Does the RTX 2080 Ti show over 50% more performance than the RTX 2070 Super?
Nope. It averages 23% better.

How much can that RTX 2070 Super close the gap with a 100MHz, 200MHz or bigger overclock?

It's the same if you compare the 2080 Super and the 2080 Ti: the utilization of SPs decreases, so performance doesn't scale like in an ideal scenario, because more SPs sit unutilized at render time.

The same happens with AMD cards... you double the CUs but the performance can't double.

AMD did a good job decreasing the wave size in RDNA, which gives it better utilization of the CUs in parallel than GCN, but it is still an issue they need to work on.

You can watch the GPU % usage in RivaTuner or another app... the utilization of the 2080 Ti is way lower than the 2070 Super's in the same game... that is a clear example of what the Crytek guy said.
With recent driver updates:

[benchmark chart: relative-performance_3840-2160.png]


RTX 2080 Ti vs RTX 2070 Super at 3840 x 2160 resolution

185 / 133 = 39% difference.

--


The difference is smaller at 2560 x 1440 resolution:

[benchmark chart: relative-performance_2560-1440.png]


RTX 2080 Ti vs RTX 2070 Super at 2560 x 1440 resolution

174 / 131 = 33% difference. <---- bottleneck

The reason for larger GPUs is robust 4K performance.


Notice the memory bandwidth difference between the RTX 2080 Ti's 616 GB/s and the RTX 2070 Super's 464 GB/s:

616 / 464 = 1.328, or a 32.8% advantage.

At 4K resolution, the RTX 2080 Ti's gain from its higher TFLOPS over the RTX 2070 Super is larger than the ~33% memory-bandwidth-related gain.


The RTX 2080 Ti continues to scale when somebody replaces its GDDR6-14000 chips with the GDDR6-15500 chips from two RTX 2080 Super cards.

NVIDIA is holding an RTX 2080 Ti Super edition in reserve while there's no competition from AMD.

Some RTX 2080 Ti cards with Samsung GDDR6-14000 chips can overclock their memory into the GDDR6-15000 range.

The next stop is GDDR6-16000+ rated chips.
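The bandwidth side of the argument above is equally checkable; a small sketch using the GB/s figures as quoted in the post:

```python
# Memory-bandwidth ratio for the two cards discussed above.
BW_2080_TI  = 616  # GB/s
BW_2070_SUP = 464  # GB/s

ratio = BW_2080_TI / BW_2070_SUP
print(f"Bandwidth advantage: {ratio:.3f}x (~{(ratio - 1) * 100:.1f}%)")
# ~1.328x, i.e. ~32.8% - the same ballpark as the performance deltas
# quoted above, which is the basis of the bandwidth-bottleneck argument.
```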
 
Last edited:

Renozokii

Member
I told you guys some Xbox fanboys would attack this thread and pretend that they're more educated than a rendering engineer working for Crytek 🤭

Why are we acting as if this stuff is a matter of opinion? If you had the budget to build a PC with the same specs as one of the two consoles, are you building the PS5 or the Xbox Series X? The Xbox Series X is more powerful, period.

Look at this answer to whether better development tools will matter or not at the end of the gen.


"No, because the PlayStation API generally gives devs more freedom, and usually at the end of each generation, Sony consoles produce more detailed games. For example, in the early seventh generation, even multi-platform games for both consoles performed poorly on the PlayStation 3. But late in the generation Uncharted 3 and The Last of Us came out on the console. I think the next generation will be the same. But generally speaking XSX must have less trouble pushing more pixels. (He emphasizes on “only” pixels)"

This simply makes no sense. The PS3 was MORE powerful than the 360; it just took a long time for that to come to fruition since developing for the Cell sucked dick. Of course the games getting nearly 100% from the console had an edge. I can also add that he took a broad question and honed in on only first-party developers. Yeah, maybe at the end of the PS5 and Series X gen, Naughty Dog might put out a game that looks as good as or better than any Series X game. But that's Naughty Dog. For every first-party game there are dozens or hundreds of third-party ones.
 
I told you guys some Xbox fanboys would attack this thread and pretend that they're more educated than a rendering engineer working for Crytek 🤭

They kept dismissing the multitude of developers speaking about the many advantages of the PS5 by saying those are first-party devs (although many of them are not). Now a third-party developer from a well-respected development team has said the same, and some people still have the balls to call him out as though they know better than him.
 

FranXico

Member
Seriously why do Xbox fans get triggered by people praising PS5 hardware? Its not like he said XSX is weak or bad? Is it simply because its the only thing Microsoft has over Sony? A hardware advantage because in terms of software theres only one winner which is Sony? Enjoy it while it lasts because eventually its all about the games. No gamer likes to see their console collecting dust no matter how many teraflops it has. Just like the Xbox one X is currently.
Oh don't worry, Microsoft has also been busy doing what they do best, bribing third-parties left and right to keep games away from PS5. Just wait ;). And don't expect the usual suspects to cry foul about exclusives being "anti-consumer" anymore... ;)
 
Oh don't worry, Microsoft has also been busy doing what they do best, bribing third-parties left and right to keep games away from PS5. Just wait ;). And don't expect the usual suspects to cry foul about exclusives being "anti-consumer" anymore... ;)

The irony...
 

benjohn

Member
Are you having a laugh? Before the release, you potentially break NDA and piss off one of the companies keeping your own company afloat (oh yeah, MS would have been fine having even slightly but potentially damaging PR out there that could impact pre-orders, had the guy been more factual, maybe producing benchmarks... :rolleyes:) in an interview that HR and Marketing did not sign off on, and nothing happens?!

MS got pissed, and Crytek was able to claim plausible deniability and cut that guy's head off, metaphorically. Case closed and lesson imparted to all other employees.
I really feel bad for the guy. I post on a certain Persian gaming forum, and some idiots have found an old tweet of his. In the tweet he posted a picture of some kind of cheese and called it "cream" because it reminded him of his childhood; it tastes like the stuff from back in the day. They're calling him stupid, someone who can't tell cheese from cream: he's stupid, so everything he says is stupid. Pathetic. Fanboyism has no limits.
 

rnlval

Member
Oh, I thought he was. Haha. Crytek must be standing with him on what he said, then.

To be fair, Matt from Era (is he a dev?) and a dev from DICE are basically saying the same thing as the Crytek dev: both consoles are very close to each other despite the XSX having more teraflops. It looks like it has a lot of bottlenecks.
Define "very close to each other".
 

John254

Banned
Oh don't worry, Microsoft has also been busy doing what they do best, bribing third-parties left and right to keep games away from PS5. Just wait ;). And don't expect the usual suspects to cry foul about exclusives being "anti-consumer" anymore... ;)
WTF are you talking about?
 

rnlval

Member
Why are we acting as if this stuff is a matter of opinion? If you had the budget to build a PC with the same specs as one of the two consoles, are you building the PS5 or the Xbox Series X? The Xbox Series X is more powerful, period.

Look at this answer to whether better development tools will matter or not at the end of the gen.


"No, because the PlayStation API generally gives devs more freedom, and usually at the end of each generation, Sony consoles produce more detailed games. For example, in the early seventh generation, even multi-platform games for both consoles performed poorly on the PlayStation 3. But late in the generation Uncharted 3 and The Last of Us came out on the console. I think the next generation will be the same. But generally speaking XSX must have less trouble pushing more pixels. (He emphasizes on “only” pixels)"

This simply makes no sense. The PS3 was MORE powerful than the 360; it just took a long time for that to come to fruition since developing for the Cell sucked dick. Of course the games getting nearly 100% from the console had an edge. I can also add that he took a broad question and honed in on only first-party developers. Yeah, maybe at the end of the PS5 and Series X gen, Naughty Dog might put out a game that looks as good as or better than any Series X game. But that's Naughty Dog. For every first-party game there are dozens or hundreds of third-party ones.
The X360 has extra fixed-function hardware compared to NVIDIA's RSX, e.g. the X360 GPU's tessellation hardware, whose workload needs to be moved to the Cell's SPUs on PS3.

The Xbox 360 GPU's ROPs have similar functionality to DirectX 12_1's ROV, as revealed by the PC's Xbox 360 emulator. Such functionality needs to be moved to the SPUs.

From https://forum.beyond3d.com/posts/1460125/

------------------------

"I could go on for pages listing the types of things the spu's are used for to make up for the machines aging gpu, which may be 7 series NVidia but that's basically a tweaked 6 series NVidia for the most part. But I'll just type a few off the top of my head:"

1) Two ppu/vmx units
There are three ppu/vmx units on the 360, and just one on the PS3. So any load on the 360's remaining two ppu/vmx units must be moved to spu.
2) Vertex culling
You can look back a few years at my first post talking about this, but it's common knowledge now that you need to move as much vertex load as possible to spu otherwise it won't keep pace with the 360.
3) Vertex texture sampling
You can texture sample in vertex shaders on 360 just fine, but it's unusably slow on PS3. Most multi platform games simply won't use this feature on 360 to make keeping parity easier, but if a dev does make use of it then you will have no choice but to move all such functionality to spu.
4) Shader patching
Changing variables in shader programs is cake on the 360. Not so on the PS3 because they are embedded into the shader programs. So you have to use spu's to patch your shader programs.
5) Branching
You never want a lot of branching in general, but when you do really need it the 360 handles it fine, PS3 does not. If you are stuck needing branching in shaders then you will want to move all such functionality to spu.
6) Shader inputs
You can pass plenty of inputs to shaders on 360, but do it on PS3 and your game will grind to a halt. You will want to move all such functionality to spu to minimize the amount of inputs needed on the shader programs.
7) MSAA alternatives
Msaa runs full speed on 360 gpu needing just cpu tiling calculations. Msaa on PS3 gpu is very slow. You will want to move msaa to spu as soon as you can.
8) Post processing
360 is unified architecture meaning post process steps can often be slotted into gpu idle time. This is not as easily doable on PS3, so you will want to move as much post process to spu as possible.
9) Load balancing
360 gpu load balances itself just fine since it's unified. If the load on a given frame shifts to heavy vertex or heavy pixel load then you don't care. Not so on PS3 where such load shifts will cause frame drops. You will want to shift as much load as possible to spu to minimize your peak load on the gpu.
10) Half floats
You can use full floats just fine on the 360 gpu. On the PS3 gpu they cause performance slowdowns. If you really need/have to use shaders with many full floats then you will want to move such functionality over to the spu's.
11) Shader array indexing
You can index into arrays in shaders on the 360 gpu no problem. You can't do that on PS3. If you absolutely need this functionality then you will have to either rework your shaders or move it all to spu.
Etc, etc, etc...
 

Bojanglez

The Amiga Brotherhood
Oh don't worry, Microsoft has also been busy doing what they do best, bribing third-parties left and right to keep games away from PS5. Just wait ;). And don't expect the usual suspects to cry foul about exclusives being "anti-consumer" anymore... ;)
Yeah, I expect they'll go all out with this. People cried foul because the Modern Warfare 2 remaster came out on PS4 a month earlier, but I expect they'll forget that when it suits.

For MS it just makes business sense to try and fill the void until their internal studios get up to pace. We'll see how Phil spins this: he's been pretty cool since he started in his role, promoting things like cross-platform play, but being perceived as taking games away from other platforms may alter how people see him.

I think both platforms will do it to an extent. I'll buy both consoles at launch, but will probably not buy any third-party exclusives at launch in order to discourage the practice.

... but never say never 😆
 

FranXico

Member
Yeah, I expect they'll go all out with this. People were crying foul because modern warfare 2 remaster has come out on PS4 a month earlier, but I expect they'll forget that when it suits.
They both always try to do it. But let's not pretend we don't know who would always win a bidding war by default.
 
X360 has extra fix function hardware when compared to NVIDIA RSX e.g. X360 GPU's tessellation hardware which needs to be moved to CELL's SPUs.
Ron, it's so easy to make out your posting style lol.
 
It's called an exclusivity deal. It happens to be legal, but it basically amounts to paying a third party to delay a release on a competing platform. See: the Modern Warfare 2 campaign's timed exclusivity on PS4.

When was the last time MS moneyhatted a game, Rise of the Tomb Raider? I'm sorry, but it's Sony that has been "bribing" devs, and I don't expect them to stop next gen.
 