
AMD CEO Lisa Su reminisces about designing the PS3's infamous Cell processor during her time at IBM

Thick Thighs Save Lives

NeoGAF's Physical Games Advocate Extraordinaire


Just after Computex 2024, AMD CEO Lisa Su sat down with Stratechery for an extended interview about solving hard problems throughout her career, including her time at IBM and her contributions to the PlayStation legacy both there and afterward at AMD. As she notes, "I've been working on PlayStation for a long time, if you think about it. PlayStation 3, 4, 5...[like the common thread] across multiple companies, yes."

Now, even if you're familiar with the PlayStation 3 and its reputation for being difficult to program for, thanks to its IBM PowerPC-based Cell processor, you most likely didn't know before this week that Lisa Su had any involvement with it. Details are few and far between, but we did manage to find the earliest statement in which a pre-AMD-CEO Lisa Su commented on the matter.

According to Lisa Su, then-Director of Emerging Products at IBM in 2001, "We [IBM] started with a clean sheet of paper and sat down and tried to imagine what sort of processor we'd need five years from now." The decision that IBM, Sony, and Toshiba made was to create a CPU with an extreme focus on parallelization.

Today, that approach is fairly common thanks to multicore CPUs, Simultaneous Multi-Threading (SMT, or Hyper-Threading in Intel's marketing), and even dedicated efficiency cores. But SMT wouldn't emerge until 2002, and the first consumer multicore CPUs from AMD and Intel wouldn't be seen until 2005. And, of course, the first-ever multicore CPU, IBM's POWER4, was released for workstation and server use in 2001, the same year the company was planning the PS3's Cell processor.

The interviewer points out that Sony's PlayStation 3 is viewed as one of its least successful consoles, which is true. The PlayStation 3 pretty much lost the generation handily to Nintendo's cheap, casual-friendly Wii and Microsoft's less powerful but easier-to-develop-for Xbox 360. The complexity of the architecture meant that cross-platform games didn't always perform as well as they should have on PS3, though as developers (particularly first-party studios) mastered the hardware, the most visually stunning console games of the latter half of the generation ended up being Sony exclusives, like Uncharted 3 and its ilk.

Lisa said, "The Cell processor was extremely ambitious at that time, thinking about the type of parallelism it was trying to get out there. Again, I would say, from a business standpoint, it was certainly successful. As you rank things, I think history will tell you that there may be different rankings."

"My perspective is, the console era has gone through phases [...] but once you went to HD, you had tremendous increase in cost of asset creation, you had developers heavily motivated to support multiple processors, you had game engines coming along. Suddenly, no one wanted to go to the burden of differentiating on the Cell; they just wanted to run on the Cell," Su explained.

Interestingly, the seventh generation of home consoles (PlayStation 3, Xbox 360, Nintendo Wii, plus the Wii U after it) also marks a shift in AMD's allegiances to console manufacturers. AMD produced graphics chips for both Nintendo and Xbox, and all three console manufacturers used the PowerPC CPU architecture. But come the eighth generation (PS4, Xbox One, Switch), both Xbox and PlayStation had switched fully to AMD-powered x86 CPU and GPU architectures. Nintendo, meanwhile, pivoted to an Nvidia-powered SoC design (with Arm CPU cores) for the Switch and its new hybrid console focus.

With the added context of this interview, one can't help but wonder whether Lisa Su's presence as a unifying thread throughout the last three generations of PlayStation hardware is more than coincidence. It could just be corporate happenstance, but going from a product director and engineer working on the PS3 at IBM to senior vice president at AMD (in 2012, and CEO in 2014), setting the future course of both PlayStation and Xbox hardware, is truly impressive.

It's also a great win for AMD in general to provide the hardware behind the two biggest consoles on the market for two consecutive console generations (with a third upcoming). No matter who wins the console war between Sony and Microsoft, AMD wins, and that's the kind of thinking that earns you a CEO spot.

 

LordOfChaos

Member
The Cell was interesting enough that we're still talking about it 17 years later; there's something to be said for that, and no console hardware since has been like it.

Was it worth it? I'm sort of mixed on that, but I do wish we could have seen the alternate world where the late stage of Cell, when developers really had it down pat, wasn't just making up for how lacklustre the last-minute RSX was compared to Xenos. Launching it with undercooked dev tools hurt it too; IIRC developers eventually got together and made something akin to PIX for it.

Ultimately it added a lot of SIMD to the CPU just as things were about to move much of that work to the GPU. It had interesting early ideas. It took years for desktop CPUs to get that much SIMD, but GPUs quickly dwarfed it.
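To make "a lot of SIMD" concrete, here's a trivial sketch of the idea in C++, purely illustrative and nothing to do with actual Cell code: the same loop written scalar and with 128-bit SSE intrinsics, which happens to be the same vector width as an SPU register.

```cpp
#include <emmintrin.h>  // SSE intrinsics; 128-bit vectors, the same width as an SPU register
#include <cstdio>

// Scalar version: one multiply-add per loop iteration.
void scale_add_scalar(const float* a, const float* b, float* out, int n) {
    for (int i = 0; i < n; ++i)
        out[i] = a[i] * 2.0f + b[i];
}

// SIMD version: four multiply-adds per iteration in one 128-bit register.
// Assumes n is a multiple of 4, just to keep the sketch short.
void scale_add_simd(const float* a, const float* b, float* out, int n) {
    const __m128 two = _mm_set1_ps(2.0f);
    for (int i = 0; i < n; i += 4) {
        __m128 va = _mm_loadu_ps(a + i);
        __m128 vb = _mm_loadu_ps(b + i);
        _mm_storeu_ps(out + i, _mm_add_ps(_mm_mul_ps(va, two), vb));
    }
}

int main() {
    float a[8] = {1, 2, 3, 4, 5, 6, 7, 8}, b[8] = {8, 7, 6, 5, 4, 3, 2, 1}, out[8];
    scale_add_simd(a, b, out, 8);
    for (float v : out) std::printf("%.0f ", v);  // 10 11 12 13 14 15 16 17
}
```

Same arithmetic, a quarter of the iterations; Cell's bet was pairing seven SPEs with little else but this kind of throughput.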
 

HL3.exe

Member
The Cell is actually still a powerhouse all these years later. Its PPUs, if honed well, can outperform even modern high-end x86 CPUs on single-threaded speed alone.

The main problem was just that: terribly complex to program for, and not at all an efficient use of a programmer's time and investment.
 

DeepEnigma

Gold Member
The Cell is actually still a powerhouse all these years later. Its PPUs, if honed well, can outperform even modern high-end x86 CPUs on single-threaded speed alone.

The main problem was just that: terribly complex to program for, and not at all an efficient use of a programmer's time and investment.
It was a physics beast at the time too, compared to x86.

A real-world example of this is how physics took a downgrade from GTA4 to GTA5.
 

HL3.exe

Member
It was a physics beast at the time too, compared to x86.

A real-world example of this is how physics took a downgrade from GTA4 to GTA5.
Neh, that's just the result of the generalized CPU leap at the time. GTA 4 ran great (even better) on the X360's PowerPC CPU. Going from a few-hundred-MHz CPU in the PS2 for San Andreas to what was then a huge leap, a tri-core 3.2 GHz console CPU, resulted in an actual next-gen gameplay/simulation leap. I don't think we'll ever see that kind of leap in consoles again.

But the main problem with both consoles (and what had a regressive effect on GTA 5 compared to 4, as an example) was the limited RAM pools, and the PS3 struggled with its weak GPU on top of being a bitch to program for. The Cell CPU could have been great if its instruction set had been more common and easier to work with, and if it hadn't been hamstrung by 2004-era memory/GPU components.
 

DeepEnigma

Gold Member
Neh, that's just the result of the generalized CPU leap at the time. GTA 4 ran great (even better) on the X360's PowerPC CPU. Going from a few-hundred-MHz CPU in the PS2 for San Andreas to what was then a huge leap, a tri-core 3.2 GHz console CPU, resulted in an actual next-gen gameplay/simulation leap. I don't think we'll ever see that kind of leap in consoles again.

But the main problem with both consoles (and what had a regressive effect on GTA 5 compared to 4, as an example) was the limited RAM pools, and the PS3 struggled with its weak GPU on top of being a bitch to program for. The Cell CPU could have been great if its instruction set had been more common and easier to work with, and if it hadn't been hamstrung by 2004-era memory/GPU components.
Physics wasn't as good on the PS4/XB1 as on the PS360 until some devs tapped into GPGPU to make up for the shortcomings. Benchmarks were run showing there could be far more physics actions on screen than what we got on the PS4/XB1.

As for the Cell/Xenon

Cell: 1 PPU, the most traditional core, and seven enabled SPUs (one reserved for the OS, so six for games), which are cores with a lot of the traditional stuff pulled out (no prefetching, very little cache, a local memory to manage manually instead) but a lot of SIMD power for the time (Single Instruction, Multiple Data, a.k.a. large-scale number crunching).

Xenon: basically 3 PPUs, but with more registers than the PPU in Cell had. Much more traditional. Half the SIMD power on paper, but easier to tap into, as the traditional cores take care of a lot for the programmer.



The Cell was essentially early to being decent at what GPUs are very good at today: lots of number crunching in a "straight line," so to speak; things that are branchy and unpredictable are where both suffer. And GPUs are several orders of magnitude faster than the Cell today, so we don't really need or want a CPU like that again. They bet on the CPU when more things would eventually shift to the GPU.
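The "local memory to manage manually" part is what made SPU code look so alien. Here's a rough sketch of the streaming pattern in plain C++, with memcpy standing in for the SPU's asynchronous DMA engine (buffer size and names invented for illustration; on a real SPU the transfers would be mfc_get/mfc_put calls overlapping with the compute):

```cpp
#include <cstddef>
#include <cstdio>
#include <cstring>
#include <vector>

constexpr std::size_t CHUNK = 1024;  // an SPU's local store was only 256 KB, so data was streamed in tiles

// Double-buffered streaming: while one buffer is being crunched, the next
// chunk is (on a real SPU) already in flight from main memory.
void process_stream(const float* main_mem, float* out, std::size_t n_chunks) {
    static float buf[2][CHUNK];                        // the "local store"
    std::memcpy(buf[0], main_mem, sizeof(buf[0]));     // "DMA in" chunk 0
    for (std::size_t c = 0; c < n_chunks; ++c) {
        if (c + 1 < n_chunks)                          // start the "DMA" for the next chunk
            std::memcpy(buf[(c + 1) & 1], main_mem + (c + 1) * CHUNK, sizeof(buf[0]));
        float* cur = buf[c & 1];
        for (std::size_t i = 0; i < CHUNK; ++i)        // number-crunch the resident chunk
            cur[i] *= 0.5f;
        std::memcpy(out + c * CHUNK, cur, sizeof(buf[0]));  // "DMA out" the results
    }
}

int main() {
    std::vector<float> in(4 * CHUNK, 2.0f), out(4 * CHUNK);
    process_stream(in.data(), out.data(), 4);
    std::printf("%.1f\n", out[0]);  // 1.0
}
```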


To get more complicated than that, it's a heck of a lot of reading.


http://www.redgamingtech.com/sony-playstation-3-post-mortem-part-1-the-cell-processor/
 

SHA

Member
She did something worthwhile, but her legacy in the tech industry doesn't matter, because someone will make something completely different and take her place.
 

Puscifer

Member
The Cell was interesting enough that we're still talking about it 17 years later; there's something to be said for that, and no console hardware since has been like it.

Was it worth it? I'm sort of mixed on that, but I do wish we could have seen the alternate world where the late stage of Cell, when developers really had it down pat, wasn't just making up for how lacklustre the last-minute RSX was compared to Xenos. Launching it with undercooked dev tools hurt it too; IIRC developers eventually got together and made something akin to PIX for it.

Ultimately it added a lot of SIMD to the CPU just as things were about to move much of that work to the GPU. It had interesting early ideas. It took years for desktop CPUs to get that much SIMD, but GPUs quickly dwarfed it.
And it's not like some of the different techniques weren't carried forward, either. There's a slide out there about how they used the CPU for Infamous Second Son, and how they intentionally reserved percentages of the CPU cores and GPU for certain tasks.

 

LordOfChaos

Member
The Cell is actually still a powerhouse all these years later. Its PPUs, if honed well, can outperform even modern high-end x86 CPUs on single-threaded speed alone.

The main problem was just that: terribly complex to program for, and not at all an efficient use of a programmer's time and investment.

Source? The Cell was basically way ahead of where CPUs were on SIMD for years, but they've since overtaken it and now offer far more Gflops even on SIMD, especially parts with AVX-512. What task is it faster at than a modern high-end 2024 CPU, and what's the proof?

It's why we can now emulate the PS3, with all the overhead that comes with emulating such an alien architecture; it was novel and ahead in some areas for years, but we're comfortably past it now. A 234-million-transistor CPU is not likely to be better than modern multi-billion-transistor ones; it just put all its skill points into one area, which made it novel for a while.
 

DeepEnigma

Gold Member
Damn, I did not expect to be cited from a 7-year-old comment lol
Adding to my earlier comment.

Ubisoft's motion cloth testing.




In summary:

Xbox 360 CPU: 34 dancers
PS3's Cell: 105 dancers
PS4 CPU: 98 dancers
Xbox One CPU: 113 dancers
Xbox One GPGPU: 830 dancers
PS4 GPGPU: 1600 dancers
 

Drew1440

Member
Don't forget the Cell was supposed to function as part of the PS3's GPU in the original design, with the other part being designed by Toshiba. It's a shame the Cell and Intel Larrabee designs didn't take off, or that PowerPC couldn't compete with modern x86 or Arm processors.
 

Romulus

Member
The Cell was interesting enough that we're still talking about it 17 years later; there's something to be said for that, and no console hardware since has been like it.

Was it worth it? I'm sort of mixed on that, but I do wish we could have seen the alternate world where the late stage of Cell, when developers really had it down pat, wasn't just making up for how lacklustre the last-minute RSX was compared to Xenos. Launching it with undercooked dev tools hurt it too; IIRC developers eventually got together and made something akin to PIX for it.

Ultimately it added a lot of SIMD to the CPU just as things were about to move much of that work to the GPU. It had interesting early ideas. It took years for desktop CPUs to get that much SIMD, but GPUs quickly dwarfed it.


Because it was terrible. All the other Sony consoles were at least somewhat developer-friendly.
 

Clear

CliffyB's Cock Holster
The bottom line was simple: to get the best out of the Cell you had to specialize for it. Without that, it wasn't nearly as good.

This manifested itself in things like UE3 requiring massive modification to run at parity.

The lesson learned was that the amount of effort required to achieve a result is generally more impactful than anything else.
 

Sophist

Member
The book The Race For A New Game Machine: Creating the Chips Inside the Xbox 360 & the PlayStation 3 explains how IBM fooled the Sony and Toshiba engineers living on-site, who were unknowingly designing and testing the CPU that would also power the Xbox 360. It was like a spy movie.


An excerpt:
Soon there was a dramatic increase in the number of IBM-only meetings, both one-on-ones and group gatherings, and the tension in the Design Center was thick enough to cut. Behind closed doors, my team argued that working for Microsoft was a conflict of interest for them and for their Sony and Toshiba friends, who were unknowingly designing and verifying logic that would be going into their competitor’s product.

“Do you think Intel worries about delivering the same chip to Dell and to Gateway?” I said. “It’s up to those companies to provide product differentiation. We just have to change the way we’ve been thinking about our core and, to be frank, we’ve been living a fairy tale. It was nice while it lasted, but now it’s time to get real. IBM is entirely within their rights to market the PowerPC core to multiple customers. You’ve been at all-hands meetings where Akrout and Kahle showed a number of products, besides the PlayStation 3, that will carry this chip in the future. Well, the future is here, and its name is Microsoft. So get with the program.”

It was a hard line to pull, but I knew that I had to get the team past the emotional upheaval. I was already tired of coddling them. We were wasting time with all this yakking about something we couldn’t change. It was time to move on, to get back to work.

One young, idealistic engineer came to me with tears brimming in his eyes, completely stressed out by the Microsoft deal, an unwilling participant in what he considered unethical shenanigans. He was one of our very brightest engineers, and the last thing I wanted was for him to become so disillusioned that he left the company. I moved him to an area where he would only be dealing with STI work. That seemed to work for him, but he continued to be a vocal (almost belligerent) opponent of our work on the Xbox chip. He reveled in bringing up touchy topics in mixed company—in other words, in front of our Sony and Toshiba partners—just to watch us squirm as we tried to keep the lid on Pandora’s box. Mickie and I frequently had to rein him in before he pulled others down that rabbit hole yet again. Jim Kahle could have taken lessons from this outspoken young radical!

Things really got exciting when Microsoft launched a travelling roadshow to kick off the Xbox 360 project at the various design sites. I dreaded the day they came to Austin, because we couldn’t exactly invite the whole core design team to the meeting. It would really liven things up to invite the Sony and Toshiba partners to the Xbox kickoff meeting, wouldn’t it? These were strange times at IBM. Akrout agreed to allow Steve Curtis’s team to go to the meeting along with a handful of my core IBM designers. We hoped it would be enough critical mass to convince Microsoft that we really had a dedicated design team. They, of course, still had no idea that the PlayStation 3 team was developing their core. It still seemed deceptive, not being able to invite the entire core team to the meeting. Here I was again, caught in a strange, international spy drama.

The roadshow invitation arrived via e-mail, vaguely indicating 10:00 A.M., 1st floor, Building 908, which was a neutral building in the same complex as the STI Design Center. Realizing I still needed to find the specific meeting room, I took the short hike over from my office and arrived a little early. I wandered around the first floor and poked my head into some conference rooms, but I couldn’t find anybody I recognized. I soon wound my way into the main cafeteria. There was a large, open space in the back of the cafeteria that often doubled as a conference room for large audiences. The room was already two-thirds full, and people were still gathering. A group near the front was setting up the video presentation for the Xbox 360 meeting. I almost choked. This was an open meeting room. Many of the Sony and Toshiba engineers from the STI Design Center ate lunch in this cafeteria every day and would most certainly stumble onto this meeting!

I grabbed the first manager I could find, someone from Steve Curtis’s organization whose name escaped me.

I gripped his arm and stepped in close, bringing our faces nearly nose to nose. “Are you crazy?” I whispered. I know he sensed the urgency in my voice, because he turned a shade paler. “You can’t have this meeting here. You need to find a conference room where you can close the door for privacy. The Sony and Toshiba engineers cannot be exposed to this!”

The manager stammered, “Don’t you think you’re overreacting?”

That pushed me over the top and I marched to the front of the conference room area. With a little yank, I unplugged the overhead projector. I pointed to the gape-mouthed manager and said, “You. Grab that projector.” I faced the crowd and waved my arms until I got everyone’s attention. “Folks, we’re moving this show to another conference room. Follow me.”

I led the way to the second floor and (luckily) found a large, empty conference room. The bewildered group of engineers tagging along behind me grumbled and whispered, but I ignored it. I assigned people to collect more chairs, arrange the room, and set up the projector, all before our Microsoft guests entered. The meeting was a success, but I made a few enemies among the IBM folks. There were a lot of “who do you think you are” stares, but I couldn’t let it bother me. I had to protect IBM’s interests as well as those of my Japanese friends.
 

HL3.exe

Member
Source? The Cell was basically way ahead of where CPUs were on SIMD for years, but they've since overtaken it and now offer far more Gflops even on SIMD, especially parts with AVX-512. What task is it faster at than a modern high-end 2024 CPU, and what's the proof?

It's why we can now emulate the PS3, with all the overhead that comes with emulating such an alien architecture; it was novel and ahead in some areas for years, but we're comfortably past it now. A 234-million-transistor CPU is not likely to be better than modern multi-billion-transistor ones; it just put all its skill points into one area, which made it novel for a while.
Source: MVG and tech forums. MVG spoke to an Insomniac coder who mentioned the single-threaded performance possibilities. But extracting that performance with efficient code is the problem, in a world where the Cell CPU knowledge base isn't the standard (far from it). It's also why emulation of games that tapped deep into PPU utilization is still a problem on modern hardware (combined with the common overhead emulation brings).

I'm not saying it's some magic chip (it definitely isn't), but there is a reason CPU manufacturers are dabbling with the idea of bringing integrated PPU clusters back into traditional CPU designs.

 

Duchess

Member
The PlayStation 3 pretty much lost the generation handily to Nintendo's cheap, casual-friendly Wii and Microsoft's less powerful but easier-to-develop-for Xbox 360.
I'd disagree here. All things considered (hardware sales, software sales, quality of games), I think the PS3 scraped a win, but it was the kick up the arse Sony needed after coming off the highs of the PS1's and PS2's success.
 

HL3.exe

Member
Physics wasn't as good on the PS4/XB1 as on the PS360 until some devs tapped into GPGPU to make up for the shortcomings. Benchmarks were run showing there could be far more physics actions on screen than what we got on the PS4/XB1.

As for the Cell/Xenon
Yep, I too lament the fact that consoles focus less on CPU leaps. There was this comment by Cerny about developers wanting a proper IPC/core-speed leap. But Cerny insisted it would affect the size of the GPU and the amount of memory, and devs immediately backed down from the idea, as visuals and scope are more easily shown and sold than complex, nuanced simulation logic.

Multi-threaded development has always been an issue when working with deterministic game logic. It's one of the reasons game-logic complexity (collision fidelity, AI, simulation, etc.) has severely stagnated over the last 15 years (combined with the high risk in development costs).
 

LordOfChaos

Member
Source: MVG and tech forums. MVG spoke to an Insomniac coder who mentioned the single-threaded performance possibilities. But extracting that performance with efficient code is the problem, in a world where the Cell CPU knowledge base isn't the standard (far from it). It's also why emulation of games that tapped deep into PPU utilization is still a problem on modern hardware (combined with the common overhead emulation brings).

I'm not saying it's some magic chip (it definitely isn't), but there is a reason CPU manufacturers are dabbling with the idea of bringing integrated PPU clusters back into traditional CPU designs.


So, source? Link? This seems like something that was read in 2015, when it was impressive that the Cell could still hold up in specific ways 10 years later; it's hard to realize it's almost 10 years after that now, lol. The Cell as a processor put most of its skill points into SIMD, forgoing things like prefetching and large caches in favour of small, manually managed local memories and SPEs with wide-for-the-time SIMD units. But eventually more regular CPUs caught up even to that min-maxing.

I've seen a few dozen of these startups claiming big speedups in my time; the problem is that some tasks are inherently serial, and vast armies of computer scientists haven't overcome this, only mitigated it with power-hungry features like speculative execution. It might be possible to speed up one edge case that much; you can get an FPGA or ASIC that does one specific task thousands of times faster than a CPU, but making it universal will, I bet, prove impossible. And it doesn't prove anything about the Cell.
 
Physics wasn't as good on the PS4/XB1 as on the PS360 until some devs tapped into GPGPU to make up for the shortcomings. Benchmarks were run showing there could be far more physics actions on screen than what we got on the PS4/XB1.

As for the Cell/Xenon
GTA5 was developed for the PS360 first and foremost; the PS4/XB1 remaster came later on.

Jaguar has 102.4 Gflops with AVX; it's actually a bit faster than the Xbox 360's Xenon CPU.

GTA5 was never a PS3 exclusive built to take full advantage of Cell.
 
Source: MVG and tech forums. MVG spoke to an Insomniac coder who mentioned the single-threaded performance possibilities. But extracting that performance with efficient code is the problem, in a world where the Cell CPU knowledge base isn't the standard (far from it). It's also why emulation of games that tapped deep into PPU utilization is still a problem on modern hardware (combined with the common overhead emulation brings).

I'm not saying it's some magic chip (it definitely isn't), but there is a reason CPU manufacturers are dabbling with the idea of bringing integrated PPU clusters back into traditional CPU designs.

The "PPU" mentioned in the article has absolutely no correlation to Cell's PPU.

Cell was revolutionary back in 2005 (just like the Amiga was revolutionary back in 1985), but eventually PCs caught up.

Adding to my earlier comment.

Ubisoft's motion cloth testing.


How many people know that the "puny" Jaguar has 5 times higher IPC than Cell's PPU? (OoO vs in-order makes a hell of a difference)

Half the frequency (1.6 vs. 3.2 GHz), so 5x the IPC at half the clock still works out to about 2.5 times faster in general-purpose/scalar instructions, and it has 8 general-purpose cores (Cell has 1 general-purpose core + 8 SIMD cores).

There's a reason many AAA games became open-world RPG collectathons during the 2nd half of the PS4 generation (as soon as devs mastered multi-threading in their game engines)...

~

Btw, it looks like we might need an AVX-512 CPU (maybe Zen 4/5 on PS6) for proper PS3 emulation:


Each SPU has a huge register file (128 registers x 128-bit) and it seems AVX-512 can match it (32 registers x 512-bit).
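For what it's worth, the width comparison is easy to picture in code. A minimal AVX-512 sketch (assumes an AVX-512-capable CPU and compiling with something like -mavx512f; purely illustrative):

```cpp
#include <immintrin.h>  // AVX-512 intrinsics; requires AVX-512F hardware
#include <cstdio>

int main() {
    alignas(64) float a[16], b[16], c[16];
    for (int i = 0; i < 16; ++i) { a[i] = float(i); b[i] = float(100 - i); }

    // One 512-bit register holds 16 floats, versus 4 floats per 128-bit
    // SPU register (the SPU compensated with 128 such registers).
    __m512 va = _mm512_load_ps(a);
    __m512 vb = _mm512_load_ps(b);
    _mm512_store_ps(c, _mm512_add_ps(va, vb));

    for (float v : c) std::printf("%.0f ", v);  // prints "100" sixteen times
}
```

One 512-bit register holds as much data as four SPU registers, though the SPU had 128 of its 128-bit registers to play with.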
 

SonGoku

Member
the 2nd half of the PS4 generation (as soon as devs mastered multi-threading in their game engines)...
I wonder what happened with this. I was excited that this gen we would have a massive CPU-related jump, going from devs being forced to master multi-threading to get the most out of the Jaguar cores to applying those skills on Zen 2, which not only has >2x the clock speed but also much higher IPC.

Yet here we are, with poorly optimized games already struggling on these new CPUs and nothing to show for it.
Btw, it looks like we might need an AVX-512 CPU (maybe Zen 4/5 on PS6) for proper PS3 emulation
Following Cerny's/Sony's design philosophy, do you think this will be the case, or will they gimp/compromise AVX instructions to push the envelope elsewhere (GPU power budget, CPU frequency, saving die space, etc.)?
 
1) I wonder what happened with this. I was excited that this gen we would have a massive CPU-related jump, going from devs being forced to master multi-threading to get the most out of the Jaguar cores to applying those skills on Zen 2, which not only has >2x the clock speed but also much higher IPC.

2) Yet here we are, with poorly optimized games already struggling on these new CPUs and nothing to show for it.

3) Following Cerny's/Sony's design philosophy, do you think this will be the case, or will they gimp/compromise AVX instructions to push the envelope elsewhere (GPU power budget, CPU frequency, saving die space, etc.)?
1) They keep churning out massive open-world RPG collectathons, sprinkled with an extra dose of wokeness. As I said, this trend started during the 2nd half of the PS4 generation (the 1st half had games like The Order 1886, Uncharted 4 etc.)

Cell (just like Emotion Engine) might be a multi-core processor, but it has a very different philosophy compared to Jaguar/Zen or even Xenon.

Cell was specifically made for arcade/linear games (think of Uncharted 2, Super Stardust HD), while x86 CPUs tend to excel in open-world games (think of Skyrim).

Do you honestly think Cell was the ideal processor for a PC-esque game like Skyrim?

Games like Skyrim have tons of activities (NPCs here and there doing stuff) that can be parallelized, as long as you have lots of general purpose (not SIMD) cores!

That's why Skyrim ran like crap on PS3 and most games during the PS3 era were linear.

2) I don't think it's a matter of poor optimization when they push the GPU too much and there's no DLSS equivalent to lessen the workload.

3) https://www.neogaf.com/threads/ken-...-true-visionary.1670756/page-3#post-269324667

I agree AVX-512 is too power-hungry, especially in native 512-bit mode (Zen 4 still has 256-bit vector units, so it has to split 512-bit instructions), but the biggest change AVX-512 brings to x86 CPUs is that it closely matches SPU instructions:


The newest PS5 Slim revision already uses 6nm; we expect the PS5 Pro to use 5nm, so the PS6 is going to use 3nm.

I personally think it's doable, but I wouldn't mind a native Cell comeback either. :)

Would be nuts to have it as a co-processor at 3nm!
 

LordOfChaos

Member
The newest PS5 Slim revision already uses 6nm; we expect the PS5 Pro to use 5nm, so the PS6 is going to use 3nm.

I personally think it's doable, but I wouldn't mind a native Cell comeback either. :)

Would be nuts to have it as a co-processor at 3nm!
3) https://www.neogaf.com/threads/ken-...-true-visionary.1670756/page-3#post-269324667


The Cell-as-a-co-processor idea would be cool, but if it was going to be done it should have been done at the outset of the PS5, or really the PS4, as some secret sauce; the skills to really use it have been lost, and almost no one would bother, apart from running native PS3 games. Modern toolchains don't support it either, and it would probably be lost on them, even if some of its ideas evolved and influenced the industry after.
 

SonGoku

Member
1) They keep churning out massive open-world RPG collectathons, sprinkled with an extra dose of wokeness. As I said, this trend started during the 2nd half of the PS4 generation (the 1st half had games like The Order 1886, Uncharted 4 etc.)

Cell (just like Emotion Engine) might be a multi-core processor, but it has a very different philosophy compared to Jaguar/Zen or even Xenon.

Cell was specifically made for arcade/linear games (think of Uncharted 2, Super Stardust HD), while x86 CPUs tend to excel in open-world games (think of Skyrim).

Do you honestly think Cell was the ideal processor for a PC-esque game like Skyrim?

Games like Skyrim have tons of activities (NPCs here and there doing stuff) that can be parallelized, as long as you have lots of general purpose (not SIMD) cores!

That's why Skyrim ran like crap on PS3 and most games during the PS3 era were linear.

2) I don't think it's a matter of poor optimization when they push the GPU too much and there's no DLSS equivalent to lessen the workload.

3) https://www.neogaf.com/threads/ken-...-true-visionary.1670756/page-3#post-269324667
I wasn't making an argument for or against Cell.
My main focus was on multi-threading skills from the PS4 not transitioning to the PS5 with any leaps or improvements. Considering how much they were able to get out of the puny Jaguar cores, they haven't done anything with Zen 2 that they couldn't have done on Jaguar, besides high-fps modes.

My follow-up question would be: are PS5 games as multi-threading-optimized/focused as PS4 games?
 
I wasn't making an argument for or against Cell.
My main focus was on multi-threading skills from the PS4 not transitioning to the PS5 with any leaps or improvements. Considering how much they were able to get out of the puny Jaguar cores, they haven't done anything with Zen 2 that they couldn't have done on Jaguar, besides high-fps modes.

My follow-up question would be: are PS5 games as multi-threading-optimized/focused as PS4 games?
A Plague Tale: Requiem seems to take advantage of Zen 2 to offer a next-gen leap (improved rat physics).

I don't think they abandoned multi-threading this gen, quite the contrary.
 

Shut0wen

Banned
The Cell was interesting enough that we're still talking about it 17 years later, there's something to be said about that and no console hardware since has been like that.

Was it worth it, sort of mixed on that, but I do wish we could have seen the alternate world where the late stage of Cell when developers really had it down pat wasn't just making up for how lacklustre the last minute RSX was in comparison to Xenos. And launching it with undercooked dev tools for it did hurt it, iirc developers just got together and made something akin to PIX for it eventually.

Ultimately it added a lot of SIMD to the CPU when things would soon move a lot of that to the GPU. It had interesting early ideas. It took years for us to get that much SIMD on desktop CPUs, but GPUs quickly dwarfed it.
True. It's a shame it took almost 8 years to finally get a free-roam game that managed to run well, which was GTA5; every other free-roam game eventually crashed, especially Red Dead and Bethesda games.
 

LordOfChaos

Member
I wasn't making an argument for or against Cell.
My main focus was on multi-threading skills from the PS4 not transitioning to the PS5 with any leaps or improvements. Considering how much they were able to get out of the puny Jaguar cores, they haven't done anything with Zen 2 that they couldn't have done on Jaguar, besides high-fps modes.

My follow-up question would be: are PS5 games as multi-threading-optimized/focused as PS4 games?

Clearly, console programming has gotten too easy, too automatically fast; people don't have to invent new branches of data science to take advantage anymore.


We need...Cell Broadband Engine™ 3 in PS6!
 

SonGoku

Member
Clearly, console programming has gotten too easy, too automatically fast; people don't have to invent new branches of data science to take advantage anymore.


We need...Cell Broadband Engine™ 3 in PS6!
Can't tell if this is sarcasm, but that's the feeling I get too: the PS5 is very friendly to develop for, and the CPU is "fast enough" that most devs just do the bare minimum on the CPU without bothering to get the most out of it via smart, efficient multi-threaded code.

I don't know about Cell, but some sort of bespoke accelerator would be nice, maybe something to aid with ray tracing and AI image reconstruction. An NPU of sorts.
 
Can't tell if this is sarcasm, but that's the feeling I get too: the PS5 is very friendly to develop for, and the CPU is "fast enough" that most devs just do the bare minimum on the CPU without bothering to get the most out of it via smart, efficient multi-threaded code.

I don't know about Cell, but some sort of bespoke accelerator would be nice, maybe something to aid with ray tracing and AI image reconstruction. An NPU of sorts.
Just like the PS4 Pro had experimental RPM/2xFP16 support (the PS4 Pro was a testbed for next-gen features, a proto-PS5 if you will), the PS5 Pro will have experimental DLSS-esque (PSSR) support that will become mainstream on the PS6.

But these features tend to help GPU-bound games, not CPU-bound ones.

Even though we don't have Task Manager in consoles to see CPU usage, I'm pretty sure they max out all 7 available cores.

Which graphics/game engine isn't multi-threaded these days?
 

SonGoku

Member
Even though we don't have Task Manager in consoles to see CPU usage, I'm pretty sure they max out all 7 available cores.

Which graphics/game engine isn't multi-threaded these days?
Speaking merely from PC experience, most games, even those that are heavily multi-threaded, don't get anywhere close to maxing out all cores. They max out 1 or 2, and the others sit at 50% or lower.
The explanation I often see is that multi-threading is hard in games because of latency, and that it's not trivial to fragment a main thread across multiple cores.
But these features tend to help GPU-bound games, not CPU-bound ones.
Yeah, I reckon that; I was just talking in the context of bringing back some bespoke accelerator that makes the console punch way above its weight, for example a hypothetical custom RT accelerator that, with enough effort put into optimizing code around it, can provide 2 to 4x the RT performance of a similarly specced off-the-shelf card.
 
1) Speaking merely from PC experience, most games, even those that are heavily multi-threaded, don't get anywhere close to maxing out all cores. They max out 1 or 2, and the others sit at 50% or lower.
The explanation I often see is that multi-threading is hard in games because of latency, and that it's not trivial to fragment a main thread across multiple cores.

2) Yeah, I reckon that; I was just talking in the context of bringing back some bespoke accelerator that makes the console punch way above its weight, for example a hypothetical custom RT accelerator that, with enough effort put into optimizing code around it, can provide 2 to 4x the RT performance of a similarly specced off-the-shelf card.
1) Latency has nothing to do with it.

Multi-threading is not easy, because certain types of code tend to have dependencies that cannot be parallelized.

I'll give you two real-world, layman's examples of what I'm talking about, without any technicalities:

a) Nine women cannot shorten the pregnancy process to one month. This is called a single-threaded task.

No matter how many "women" (cores) you throw at it, pregnancy will always take 9 months.

b) You have a relatively successful supermarket business, but there's only one cashier (pipeline). Bottlenecks are inevitable, especially during rush hours.

What do you do as a business to increase your efficiency and reduce complaints? You hire more cashiers! Yes, it costs money, but it's definitely worth it, unless you don't like having happy customers. ;)

This is an inherently parallelizable/multi-threaded task.
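The textbook way to state those two examples is Amdahl's law: if a fraction p of the work is parallelizable, n cores give you a speedup of 1 / ((1 - p) + p/n). A quick sketch with made-up numbers:

```cpp
#include <cstdio>

// Amdahl's law: overall speedup on n cores when a fraction p of the work
// is parallelizable and the remaining (1 - p) is a serial "pregnancy".
double amdahl(double p, int n) { return 1.0 / ((1.0 - p) + p / n); }

int main() {
    // Illustrative numbers: even with 95% parallel work, 8 cores yield
    // only ~5.9x, and the serial 5% caps the speedup at 20x forever.
    std::printf("p=0.95, n=8   -> %.2fx\n", amdahl(0.95, 8));
    std::printf("p=0.95, n=512 -> %.2fx\n", amdahl(0.95, 512));
}
```

The serial fraction is the pregnancy; no number of extra cashiers helps with it.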

I'd argue that most modern AAA games (especially open-world/RPG ones) have tons of parallelizable activities, because if you think about it, it's a huge map with tons of activities, NPCs, etc. It's mostly branchy if-then-else code that needs lots of general-purpose, out-of-order CPU cores.

Linear games, on the other hand, tend to benefit more from SIMD cores (games like Uncharted, A Plague Tale, etc.).

That's why I made a clear distinction between Cell and Xenon/Jaguar/Zen.

2) I don't expect anything like that; most innovations in the IT space have already been discovered.

I'd love to be proven wrong, but I won't hold my breath that we're gonna have yet another Amiga/Cell moment any time soon.
 

LordOfChaos

Member
Can't tell if this is sarcasm, but that's the feeling I get too: the PS5 is very friendly to develop for, and the CPU is "fast enough" that most devs just do the bare minimum on the CPU without bothering to get the most out of it via smart, efficient multi-threaded code.

I don't know about Cell, but some sort of bespoke accelerator would be nice, maybe something to aid with ray tracing and AI image reconstruction. An NPU of sorts.

It was sarcasm, hah. But there is probably something to be said for hardware and software toolchains just making everything automatically kind of fast enough, and not leaving much incentive for really down-in-the-weeds optimization like creating entire job systems and auto-multitasking schemes for the SPEs, etc.

I won't say it's worse this way; with how development costs and timelines have gone, systems that are easier to use are an overall boon. But it has also flattened out the outliers.
 