
Cell Processor to Run at 4.6 GHz

golem said:
whoa... 85°C with a heatsink? impressively scorching...

The temp was already discussed in the months-old thread :P But in summary, most seemed to agree that with a refined manufacturing process, and a fan attached to it, this thing would be more "normal" in terms of its operating temp.

Still, if it's operating in a stable fashion at 85°C...
 
What does a console even need that much MHz for? They must be planning some hella complicated physics and real-time systems to justify it. Man, no wonder developers are practically planning when to shut up shop; this industry should never have gotten this big with so many idiots (suits who don't play games) in charge.
 
Koshiro said:
What does a console even need that much MHz for? They must be planning some hella complicated physics and real-time systems to justify it. Man, no wonder developers are practically planning when to shut up shop; this industry should never have gotten this big with so many idiots (suits who don't play games) in charge.

The "Cell" is designed with networked, node based processing in mind. Sony is going to use the excess processing power of tens of millions of connected PS3 Cells to produce the massive computational power necessary to build monster, ultra powerful robots.

These 10 story tall, unstoppable Aibo-looking robot dogs are then going to march on and destroy Redmond, Washington.
 
Cell's performance is dependent on the number of Streaming Processors per core, though. So comparing just the gigahertz ratings of Cell's 4.6 GHz SP units and Xenon's 3.5 GHz PPC core doesn't automatically mean anything.
 
nitewulf said:
Cell's performance is dependent on the number of Streaming Processors per core, though. So comparing just the gigahertz ratings of Cell's 4.6 GHz SP units and Xenon's 3.5 GHz PPC core doesn't automatically mean anything.


True, fingers crossed for 8. Most people seem to be assuming that, but it may not work out like that. That's one missing piece of the puzzle that should be revealed in Feb.
 
krypt0nian said:
If true, Xenon looks to be pre-owned at this point.

Why? At this point, we really don't know anything concrete about any of these systems, and as a number of posters in this thread have already pointed out, clock speed means almost nothing these days. Graphics chips, memory, and system bandwidth are all at least as important.

For an example of this, take a look at the showdowns between Mac G5s and Intel P4s. The P4 has a much higher clockspeed than the G5, but is regularly owned in certain applications. For instance, the P4 usually outperforms the G5 in Office applications, but in something like Photoshop, the P4 isn't even close. Current G5s run at 2.5 GHz while P4s are in the 3.8 GHz range.
 
Kung Fu Jedi said:
Why? At this point, we really don't know anything concrete about any of these systems, and as a number of posters in this thread have already pointed out, clock speed means almost nothing these days. Graphics chips, memory, and system bandwidth are all at least as important.

For an example of this, take a look at the showdowns between Mac G5s and Intel P4s. The P4 has a much higher clockspeed than the G5, but is regularly owned in certain applications. For instance, the P4 usually outperforms the G5 in Office applications, but in something like Photoshop, the P4 isn't even close. Current G5s run at 2.5 GHz while P4s are in the 3.8 GHz range.

Well, clockspeed doesn't count for nothing... it's one factor.

Also, the comparisons you draw between the G5 and P4 possibly/probably won't hold true for a faster Cell chip versus Xbox 2's chip, if the other parts of the equation hold up (like the number of APUs). Like I said earlier, if there are 8 APUs in there, that Cell chip will be doing a LOT per cycle.

That's just based on speculation though... we still don't know for sure what MS are doing with their CPU.
 
gofreak said:
Well, clockspeed doesn't count for nothing... it's one factor.

Also, the comparisons you draw between the G5 and P4 possibly/probably won't hold true for a faster Cell chip versus Xbox 2's chip, if the other parts of the equation hold up (like the number of APUs). Like I said earlier, if there are 8 APUs in there, that Cell chip will be doing a LOT per cycle.

That's just based on speculation though... we still don't know for sure what MS are doing with their CPU.

I don't disagree with you at all, and have pretty much said the same things in my previous posts in this thread. I just thought it was silly to say that Xbox Next was "pre-owned" because of clockspeed alone.
 
Let's say I buy a PS3 with Cell and a multimedia hub with Cell in it. Could both Cells work together for increased performance?

That's the impression I'm getting.
 
seismologist said:
Let's say I buy a PS3 with Cell and a multimedia hub with Cell in it. Could both Cells work together for increased performance?

That's the impression I'm getting.

That's one of the fundamental ideas of the architecture, yes. But how that'll manifest itself in real life... that's anyone's guess at this point. I'm not sure how hard they'll push that idea with the first products or PS3, but it'd certainly be interesting if they did. There is a Sony patent out there that discusses how rendering could be distributed over Cell processors connected by a less-than-ideal network (i.e. the internet), using a game as the illustrating example, so perhaps they are planning to use stuff like that from the get-go. But who knows.
 
There's an assumption here that the 4.6 GHz version of the CELL will be the one used in the PS3. I would imagine the high-end CELL would be used for the powerful home media center server, while the PS3 would get something a little less bleeding edge.
 
Oh, clockspeed counts alright. It depends on what the Cell can do per cycle, though. Clockspeed doesn't matter in an Intel CPU anymore because their pipelines are so friggin huge and they don't do very much per cycle. Clockspeed DOES matter on an AMD because IPC is higher, and once they get clocked high it's over, hence AMD's return to the performance crown. Could be a similar situation here. I know the IBM chip in the Xenon is on point (I still don't believe the clock speed though). If the chip being used for the Xenon is in any way related to the chip that IBM fabs for Apple workstations, they can barely get those things clocked over 2.5 GHz... and that's with water cooling. Now the friggin successor is going to be clocked and fabbed in the millions at 3.5 GHz in 2005!?... ehhh, I dunno about that.
 
DonasaurusRex said:
If the chip being used for the Xenon is in any way related to the chip that IBM fabs for Apple workstations, they can barely get those things clocked over 2.5 GHz... and that's with water cooling. Now the friggin successor is going to be clocked and fabbed in the millions at 3.5 GHz in 2005!?... ehhh, I dunno about that.

Yeah, I agree with you here! Steve Jobs promised a 3 GHz G5 within a year, and they didn't make it and still have a ways to go. 2.5 GHz G5s are nice, especially when they come in a dual-processor system, but I have no idea how IBM plans to get mass quantities of 3.5 GHz chips done in time for a Xenon launch. Unless they're not revealing something yet. Like I said earlier, we really don't have specs for any of the next-gen systems, just lots of speculation at this point.
 
gofreak said:
On a side note, I find it kind of funny that a lot of people seem to be perennially surprised at the possibility of PS3 being more powerful than Xbox 2
Hm, I thought everybody knew that already!?
Power: PS3>>>>Xenon>>Revolution
Well, let us wait and see
 
gofreak said:
There is a Sony patent out there that discusses how rendering could be distributed over Cell processors connected by a less-than-ideal network (i.e. the internet), using a game as the illustrating example, so perhaps they are planning to use stuff like that from the get-go.
I don't know about that; that'd mean people would get different performance in games depending on how many Cell devices they own/have access to. It doesn't seem like a suitable design to be implemented in general games. Perhaps a few niche titles may take advantage of such technology.
 
MS's got nothing to worry about.


laws2kuk1
The fact that the PS3 will have a cell processor at the speed of 4.6GHz with multithread technology still doesn't exactly beat the Xbox 2 from what I've heard. The Xbox 2 will have 3 3GHz multithread processors, which totals 9GHz. Also, with there being 3 processors, each with multithread technology, it can work on triple the data at one time as Sony's cell processor.

Well, that's just my 2 cents on the facts.


MetroFighter
4.6GHz just sounds weak... that would be high-tech now, but one year from now that will be average, and a year after that, it'll be nothing. And I thought the Xbox 2 would be 3 3.5GHz processors, but nothing's been officially released yet... even 3 2.0GHz processors would beat the 1 4.6GHz processor...


Otogi
Just because the cell chip is 4.6 doesn't mean it will be 4.6 in the PlayStation. I'm pretty sure IBM and Sony are showing off the max power the chip is capable of.

But it really doesn't matter, since the cell chip will be the main processor, not the graphics processor; they went to crapvidia for that.


NeoNemesis
AzWanksta, the reason this article is on TXB is because Sony is competition, and we need to know what the competition has in store. Also, 4.6 GHz is powerful, no doubt about that, but if the rumours are true, then the Xbox 2 will have the PS3 beat, with 3 cores running at 3.5+ GHz each.

pr0phets0f0old
Processors are cool. They would mean no lagging, correct? What about the graphics? Will it be a 128 or 256?
 
nitewulf said:
I don't know about that; that'd mean people would get different performance in games depending on how many Cell devices they own/have access to. It doesn't seem like a suitable design to be implemented in general games. Perhaps a few niche titles may take advantage of such technology.

The presumption is it would only be in online games, only leveraging other PS3s also playing that game, and everyone's machine could use others', thus making it "fair".
 
Azih said:
There's an assumption here that the 4.6 GHz version of the CELL will be the one used in the PS3. I would imagine the high-end CELL would be used for the powerful home media center server, while the PS3 would get something a little less bleeding edge.

What powerful home media center server? A little box with a big hard drive, whacking some bytes down a cable now and then? Doing a spot of MPEG4 decoding? Hardly power-packed. I'd fully expect the PS3 to have the most powerful version of CELL available (in quantity) at time of launch. Later ones may be more powerful, but PS3 is the raison d'être of CELL.
 
bigNman said:
no way the original Xbox is more powerful than Revolution.

EDIT:
actually I misread that ^ heh (thought he said Xbox 2). I agree with it now though. Apologies for misreading.
 
Xbox2's triple-core CPU is said to push ~84 Gflops. I can't remember if that was the exact amount, but it was 80~90 Gflops. Xbox2 is likely to have one of these sub-100 Gflop triple-core CPUs, but the newest revealed patents indicate that there could be more CPUs; it is flexible. It's not out of the realm of possibility that MS could have 2 triple-core CPUs, but that's still under 200 Gflops (80~90 Gflops * 2).

A single Cell Processor Element (1 CPU core plus 8 APUs) is said to push 256 Gflops.

The PS3 CPU could be made from 1 to 8 of these 256 Gflop Cell Processor Elements. Let's say Sony goes for a 2-PE CPU: that's 512 Gflops. That would outperform even 2 triple-core CPUs in Xbox2 (160~180 Gflops) and totally blow away a single triple-core Xbox2 CPU (80~90 Gflops).

4.6~4.8 GHz vs 3.5 GHz in no way indicates, by clockspeed numbers alone, what PS3 will be like vs Xbox2.

Clockspeed matters, yes, but it's also about how much performance you have per clock.
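If you want to sanity-check those figures, here's the back-of-envelope math. A quick sketch, assuming (not confirmed spec) that each APU retires a 4-wide SIMD fused multiply-add per cycle, i.e. 8 flops/cycle, which is the usual way people get to 256 Gflops:

```python
# Back-of-envelope peak-flops math for the rumoured parts.
# Assumption (not confirmed spec): each Cell APU does a 4-wide SIMD
# fused multiply-add per cycle = 8 flops/cycle.

def cell_gflops(clock_ghz, pes=1, apus_per_pe=8, flops_per_cycle=8):
    """Peak Gflops for a Cell CPU built from `pes` Processor Elements."""
    return clock_ghz * pes * apus_per_pe * flops_per_cycle

print(cell_gflops(4.0))            # 1 PE @ 4.0 GHz -> 256.0
print(cell_gflops(4.0, pes=2))     # 2 PEs @ 4.0 GHz -> 512.0
print(cell_gflops(4.6))            # 1 PE @ 4.6 GHz -> 294.4

# Xbox2 side, taking the rumoured ~85 Gflops per triple-core CPU:
print(2 * 85)                      # two such CPUs -> ~170 Gflops
```

These are peak numbers under that assumption; real workloads won't come close on either machine.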
 
gofreak said:
The presumption is it would only be in online games, only leveraging other PS3s also playing that game, and everyone's machine could use others', thus making it "fair".

Ok, I'm going to get a few things off my chest. Some things I'm just not seeing with Cell concerning graphics.

I'm reading things here and there about Cell, but this is one thing I can look at and roll my eyes a bit. Maybe it's the overconfidence forum members have in the tech. Let's say that nice Cell-equipped LCD flat-panel hi-def display has 2 Cell processors in it. The set also includes some ultra-high-bandwidth connection to the PS3 to aid in whatever. That is easier to accept; hell, at my college they built a node-based "supercomputer" using a bunch of Alienware PCs connected by high-speed interconnects.

However, I certainly have my doubts about something like this being done over the net. I doubt your Cell-equipped set combined with your PS3 will help someone's lonely PS3 render more things on screen fast enough to provide two identical experiences. Certain aspects of the net should already bring doubts to mind, but also, how do we take the NVIDIA hardware into account? Let me be bold for a moment and say that the rate at which NVIDIA's tech will churn through pixels will be leaps and bounds beyond the rate at which computed data can be brought over the net by any number of devices, where real-time graphics are concerned.

Sure enough, there are many things that benefit from that type of computing. But for the vast amounts of visuals on the screen, I don't see it. Perhaps someone can show me the light.

I can picture, however, some type of MMO network powered by a Cell cluster and aided by every PS3 connected to it in some way. Although rather limited, it would still benefit, since the cluster can bank on the spec of every PS3. It knows what it can do with every PS3 at any given time and can work accordingly. (All PS3s will be similarly equipped.) Most of the things concerning the actual player could be computed on the PS3 and sent back to the cluster, since the PS3 would technically be an extension of the Cell cluster.

Let's say you are watching some type of cutscene in your PS3 MMO game. The cluster knows what's being used and what's not. It already knows the hardware, the software platform, and the way the scheduler synchronizes/communicates between processes/threads, and thus the Cell cluster can take some processes out of its own queue and queue them right up on your PS3. The process is computed and the results are sent right back to the cluster. That's one less process for the cluster to keep in its queue, and the scheduler can give another process its chance in the queue while the other is being computed somewhere else. In the world of multiprocessing, processes have priorities, so high-priority processes such as real-time computations get the majority of Cell's time. When exactly do the lower-priority processes get their Cell time? Let someone else's PS3 handle them while the Cell cluster churns away at the high-priority processes, making the system more efficient. Of course, we are not talking about physics or graphics computations, but rather aiding in the scope of running an MMO game: real-time tracking, managing weather in the MMO world, etc.

Of course this is a rough idea, and I don't expect everyone to be on the same page if I busted out the tech talk. But I think this is enough for everyone to understand (hopefully), although it's vague if you're in the industry. (For example: how would feedback be handled between the PS3 and the Cell cluster? What if another high-priority process begins computing but requires the results of a process sent out to someone's PS3? How long do we wait until a process sent out to someone's PS3 is considered lost? How do we estimate a wait time that gives a good balance? If we don't wait long enough, we cut off too many processes that were on their way back; if we wait too long, we stall. The more processes you lose, the more work you have to redo to get them computed. When should we decide to just compute the process locally? The more "misses", the more redundant work must be done. What if the process that requires the computed data is a critical process that must lock a Cell until it's done computing, which ultimately depends on that process sent out? Sure, we can just reload the process and compute it locally, but for how many processes do you want to do this? Many of these issues will bring any computing cluster to its knees. Even local thread and process synchronization is one hell of a mess, with many chances for the system to come crashing down; I get chills thinking about something spread across the net.)

Anyway, that is what I think when I hear about distributed computing with Cell over networks. Maybe someone can explain to me how the graphical side of things would be handled. As much as I love graphics, I don't know enough about graphics processing to convince myself that someone's PS3, with their Cell-equipped set, will also improve someone else's visuals on their PS3 over the net.
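To make the bookkeeping concrete, here's a toy sketch of the offload-with-deadline idea I'm describing. Everything in it (class names, the timeout value, the priority scheme) is hypothetical; it just illustrates the miss/recompute trade-off, not how Sony would actually do it:

```python
import heapq

# Toy sketch: a Cell cluster runs high-priority work locally, farms
# low-priority processes out to connected PS3s, and recomputes anything
# that misses its deadline. All names and numbers are hypothetical.

OFFLOAD_TIMEOUT = 0.5  # seconds before an offloaded job counts as lost

class Cluster:
    def __init__(self):
        self.queue = []       # min-heap of (priority, job_id); 0 = highest
        self.in_flight = {}   # job_id -> deadline for jobs out on a PS3

    def submit(self, job_id, priority):
        heapq.heappush(self.queue, (priority, job_id))

    def schedule(self, now):
        while self.queue:
            priority, job = heapq.heappop(self.queue)
            if priority == 0:
                print(f"cluster computes {job} locally")
            else:
                # low priority: hand it to an idle PS3, note the deadline
                self.in_flight[job] = now + OFFLOAD_TIMEOUT
                print(f"offloaded {job} to a PS3")

    def result_arrived(self, job_id):
        # the PS3 sent results back in time; forget the deadline
        self.in_flight.pop(job_id, None)

    def reap_lost(self, now):
        # a "miss": the job never came back, so redo the work locally
        for job, deadline in list(self.in_flight.items()):
            if now > deadline:
                del self.in_flight[job]
                self.submit(job, priority=0)  # escalate, compute locally

# e.g. weather simulation for the MMO world gets offloaded,
# player-facing work stays on the cluster:
c = Cluster()
c.submit("player_hit_detection", priority=0)
c.submit("region_weather_update", priority=5)
c.schedule(now=0.0)
c.reap_lost(now=1.0)   # weather job timed out -> re-queued locally
c.schedule(now=1.0)
```

The pain point is exactly the one above: every job that misses its deadline is redundant work, and a critical process blocked on a lost job is worse still.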


doncale said:
Xbox2's triple-core CPU is said to push ~84 Gflops. I can't remember if that was the exact amount, but it was 80~90 Gflops. Xbox2 is likely to have one of these sub-100 Gflop triple-core CPUs, but the newest revealed patents indicate that there could be more CPUs; it is flexible. It's not out of the realm of possibility that MS could have 2 triple-core CPUs, but that's still under 200 Gflops (80~90 Gflops * 2).

A single Cell Processor Element (1 CPU core plus 8 APUs) is said to push 256 Gflops.

The PS3 CPU could be made from 1 to 8 of these 256 Gflop Cell Processor Elements. Let's say Sony goes for a 2-PE CPU: that's 512 Gflops. That would outperform even 2 triple-core CPUs in Xbox2 (160~180 Gflops) and totally blow away a single triple-core Xbox2 CPU (80~90 Gflops).

4.6~4.8 GHz vs 3.5 GHz in no way indicates, by clockspeed numbers alone, what PS3 will be like vs Xbox2.

Clockspeed matters, yes, but it's also about how much performance you have per clock.

I'm still not sure what role exactly NVIDIA's tech is playing in the PS3. Do you know? You are directly comparing Xenon's CPUs as if they would be doing the exact same thing as PS3's Cell chips. We don't know that; whatever Xenon's CPUs lack, the ATI tech may make up for. Of course, when we talk about power we mostly mean graphics. Sure, there are AI and physics, but we can fake those to different degrees, and in some cases don't even have to come close to the real thing, since we only need certain aspects of them. I have questions all over the place concerning both these systems. I don't see how so many people are coming up with specifics. But we will know soon enough.
 
4.6GHz just sounds weak... that would be high-tech now, but one year from now that will be average
We're not talking about PCs here!
The PS2 has 300 MHz and beats the shit out of a lot of PC games (MGS3, Silent Hill 3).
Well, let us wait and see
 
All I know is that all the MHz and Flops will be irrelevant once we see what these things can do.

Look at what Capcom did with the little Cube in RE4. While RE4 might not be THE most technically advanced game ever, it is certainly one of the most impressive games overall, ever. Imagine Xenon and PS3 pushed as hard as the Cube is in RE4. It's gonna be amazing.
 
mrklaw said:
What powerful home media center server? A little box with a big hard drive, whacking some bytes down a cable now and then? Doing a spot of MPEG4 decoding? Hardly power-packed. I'd fully expect the PS3 to have the most powerful version of CELL available (in quantity) at time of launch. Later ones may be more powerful, but PS3 is the raison d'être of CELL.

I thought that one of the reasons IBM entered the CELL arrangement was to build CELL-based servers and share the development cost of the technology with other parties. It sounded more like everyone had their own reasons to participate in CELL development and no one party was the "reason" for CELL.
 
"4.6Ghz just sounds weak... that would be high tech now, but one year from now that will be average"

If you could somehow equate it to Intel's processors, the hell it's average. Intel has 32-bit, 3.8 GHz processors now, and plans to have dualcore 32-bit 3.0 GHz processors by the end of this year.

Cell is 64-bit 4.6 GHz multicore. Owned on all fronts...if they were comparable.
 
Intel has 64-bit processors, what are you talking about? All dual-core processors will have 64-bit functionality as well; I don't see why they wouldn't when the single-core ones support it.
 
I'm not aware of any Intel 64-bit processors on the market.
 
COCKLES said:
The days of MHz/GHz defining processor speed are long over. It's all about the efficiency of the chip. Even Intel have said they are giving up the race for endless GHz improvements.
That's because the poor design of their recent chips prevents them from upping the speed without heat problems. Not to mention the inefficient design of most of their chips in general, regardless of clock speed. So naturally they're going to discount processor speed when, for the next year or two, they can't compete and will be making little headway until they get their new chip designs in. The P4 and Prescott really hurt Intel as far as efficiency and design go.
 
Real-time resource management system for real-time applications...

HAHAHAHAHAHAH. Best marketing BS quote I have ever heard.

So resource management is usually pre-rendered? :) Freaking multithread balancing is a big duh next gen; I love it when I read shit like this. It will be RTRM for RTA in the next PS3 release.

And 90nm. What happened to 65, Sony??? :lol :lol :lol :lol
 
Oh, that's why. I don't really keep up with stuff outside the Pentium line.

Still, the Xeon isn't dual-core and is 1.6 GHz behind, and the Itanium (I've never heard of those) isn't even comparable because it costs over $1000. That would buy you three PS3s.
 
teh_pwn said:
Oh, that's why. I don't really keep up with stuff outside the Pentium line.

Still, the Xeon isn't dual-core and is 1.6 GHz behind, and the Itanium (I've never heard of those) isn't even comparable because it costs over $1000. That would buy you three PS3s.

First, these are retail costs (hell, I posted a Newegg link). The processor won't cost that much for Intel to make.

Second, I just posted a 3.0 GHz Xeon EM64T because it's reasonably cheap, common, and reasonably fast. Intel has much faster ones (they also have obscene profit margins on their high-end chips).

Yes, no dual core yet, of course.
 
Izzy said:
MS's got nothing to worry about.
Considering the 4.6GHz refers to a 9-core CPU, using the teamxbox logic that equals ~42GHz :p

doncale said:
A single Cell Processor Element (1 CPU core plus 8 APUs) is said to push 256 Gflops.
That's for the 4GHz part, though. At a hypothetical 4.6, the number would be accordingly higher.
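(For the curious: assuming peak flops scale linearly with clock, that's 256 × 4.6/4.0 ≈ 294 Gflops per PE.)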
 
teh_pwn said:
Oh, that's why. I don't really keep up with stuff outside the Pentium line.

Still, the Xeon isn't dual-core and is 1.6 GHz behind, and the Itanium (I've never heard of those) isn't even comparable because it costs over $1000. That would buy you three PS3s.

Pentiums are the desktop version of Xeons... which will be dual-core in a matter of months here.

Itanium, or Itanic as some call it, was supposed to be Intel's/HP's next-gen processor that fully left x86 behind. Using the EPIC architecture, it was supposed to be the future. Unfortunately, that was back in about '97, and since the Itanium isn't good at running x86 software, you can see the inherent problem when 90% of all software is written and compiled for that ISA. Itanic never took off, prices never came down, and it's pretty much now a high-end part for people building supercomputers, if they actually have software written for the chip. Ultimately, the argument that the EPIC architecture was the way to go became moot, because now in 2005 it is also dated and doesn't even have the software support to fall back on; even Microsoft dumped them recently. That's huge, because Intel and MS are bedfellows. In the end, AMD was right: why force a platform change when the ISA can be worked around? Hence the reason there's even a link to Intel EM64T processors.
 
Phoenix said:
I thought that one of the reasons IBM entered the CELL arrangement was to build CELL-based servers and share the development cost of the technology with other parties. It sounded more like everyone had their own reasons to participate in CELL development and no one party was the "reason" for CELL.

But those are SERVER servers, not home media ones. A home media server would be a big hard drive with network connectivity, sitting under your TV for recording and playback of TV and streaming from your PC. It doesn't need a big-ass chip.

Server servers, like IBM are planning, are the big things in companies handling thousands of data streams concurrently, sending data back and forth, keeping company IT systems moving. They need a lot of computing power in total, but usually come in sections, so I would only expect each section to need a relatively low-powered CELL.

In fact, I don't know what applications, other than PS3, need such a powerful cutting-edge design. Maybe they don't, and it was just a sweetener from Sony to help keep their costs down?
 
The actual Cell setup planned for PS3 is in the hands of developers now. It's obviously not general information, but hardly a trade secret. Dean C over at Beyond3D, along with other posters, has said that "Cell = the entire CPU; in PS3's case likely to be 1 PU (Processor Unit) with 8 SPUs (Synergistic (stupid name) Processing Units)". I think MrSingh, who posts here, also thinks it will be 1-8.
If this is the case, and given that both XB2 and PS3 are using next-gen GPUs, which whichever way you look at it will be very similar in terms of spec, we get two machines of very similar performance and certainly hardly any difference in what you see on screen.
 
Pug said:
The actual Cell setup planned for PS3 is in the hands of developers now. It's obviously not general information, but hardly a trade secret. Dean C over at Beyond3D, along with other posters, has said that "Cell = the entire CPU; in PS3's case likely to be 1 PU (Processor Unit) with 8 SPUs (Synergistic (stupid name) Processing Units)". I think MrSingh, who posts here, also thinks it will be 1-8.
If this is the case, and given that both XB2 and PS3 are using next-gen GPUs, which whichever way you look at it will be very similar in terms of spec, we get two machines of very similar performance and certainly hardly any difference in what you see on screen.

An 8-APU, 1-PE Cell CPU will likely be quite a bit more powerful than Xenon's CPU setup, from what's rumoured.

Functionally I expect the GPUs to be similar, but in terms of performance there may be differences...

Dean C and others, AFAIK, are just speculating, too. Solid PS3 specs aren't being talked about yet, I don't think. Some developers have been complaining about the lack of PS3 info...
 
Gofreak, the expected CPU setup is in the hands of developers, for sure! Dean C is under NDA; he's giving as much information as he can without breaking it. The information is there, but just like XB2 it is not set in stone yet. Both MS and Sony will wait till the last minute to finalise details. I know that a south coast developer in the UK has been told to expect things to change on XB2. By the way, although a single Cell will be more powerful than a 3-core Xbox2 setup, there are things to take into account.

Ultimately, MS is trying to launch a machine before the PS3 which will "on screen" look very similar to what the PS3 can do. From all the information we have, and even if, say, the PS3 goes 2-16, the fact that Sony has gone with Nvidia makes me think they will be extremely similar in the real world.
 
marsomega said:
Ok, I'm going to get a few things off my chest. Some things I'm just not seeing with Cell concerning graphics.

I'm reading things here and there about Cell, but this is one thing I can look at and roll my eyes a bit. Maybe it's the overconfidence forum members have in the tech. Let's say that nice Cell-equipped LCD flat-panel hi-def display has 2 Cell processors in it. The set also includes some ultra-high-bandwidth connection to the PS3 to aid in whatever. That is easier to accept; hell, at my college they built a node-based "supercomputer" using a bunch of Alienware PCs connected by high-speed interconnects.

However, I certainly have my doubts about something like this being done over the net. I doubt your Cell-equipped set combined with your PS3 will help someone's lonely PS3 render more things on screen fast enough to provide two identical experiences. Certain aspects of the net should already bring doubts to mind, but also, how do we take the NVIDIA hardware into account? Let me be bold for a moment and say that the rate at which NVIDIA's tech will churn through pixels will be leaps and bounds beyond the rate at which computed data can be brought over the net by any number of devices, where real-time graphics are concerned.

Sure enough, there are many things that benefit from that type of computing. But for the vast amounts of visuals on the screen, I don't see it. Perhaps someone can show me the light.

I can picture, however, some type of MMO network powered by a Cell cluster and aided by every PS3 connected to it in some way. Although rather limited, it would still benefit, since the cluster can bank on the spec of every PS3. It knows what it can do with every PS3 at any given time and can work accordingly. (All PS3s will be similarly equipped.) Most of the things concerning the actual player could be computed on the PS3 and sent back to the cluster, since the PS3 would technically be an extension of the Cell cluster.

Let's say you are watching some type of cutscene in your PS3 MMO game. The cluster knows what's being used and what's not. It already knows the hardware, the software platform, and the way the scheduler synchronizes/communicates between processes/threads, and thus the Cell cluster can take some processes out of its own queue and queue them right up on your PS3. The process is computed and the results are sent right back to the cluster. That's one less process for the cluster to keep in its queue, and the scheduler can give another process its chance in the queue while the other is being computed somewhere else. In the world of multiprocessing, processes have priorities, so high-priority processes such as real-time computations get the majority of Cell's time. When exactly do the lower-priority processes get their Cell time? Let someone else's PS3 handle them while the Cell cluster churns away at the high-priority processes, making the system more efficient. Of course, we are not talking about physics or graphics computations, but rather aiding in the scope of running an MMO game: real-time tracking, managing weather in the MMO world, etc.

Of course this is a rough idea, and I don't expect everyone to be on the same page if I busted out the tech talk. But I think this is enough for everyone to understand (hopefully), although it's vague if you're in the industry. (For example: how would feedback be handled between the PS3 and the Cell cluster? What if another high-priority process begins computing but requires the results of a process sent out to someone's PS3? How long do we wait until a process sent out to someone's PS3 is considered lost? How do we estimate a wait time that gives a good balance? If we don't wait long enough, we cut off too many processes that were on their way back; if we wait too long, we stall. The more processes you lose, the more work you have to redo to get them computed. When should we decide to just compute the process locally? The more "misses", the more redundant work must be done. What if the process that requires the computed data is a critical process that must lock a Cell until it's done computing, which ultimately depends on that process sent out? Sure, we can just reload the process and compute it locally, but for how many processes do you want to do this? Many of these issues will bring any computing cluster to its knees. Even local thread and process synchronization is one hell of a mess, with many chances for the system to come crashing down; I get chills thinking about something spread across the net.)

Anyway, that is what I think when I hear about distributed computing with Cell over networks. Maybe someone can explain to me how the graphical side of things would be handled. As much as I love graphics, I don't know enough about graphics processing to convince myself that someone's PS3, with their Cell-equipped set, will also improve someone else's visuals on their PS3 over the net.

I wasn't talking about someone with multiple Cell devices helping others with none (although that is possible, I guess). I was just talking about a network of PS3s helping each other in online games, where, for example, one person's view may require more processing than another's (one example talked about was a game where you could adopt a bird's-eye "commander" view or a first-person "soldier" view in an MMO battle game; different PS3s would have different processing requirements depending on your view, and those that are less stressed might be in a position to help the others with more taxing rendering demands). I'm not necessarily saying this is feasible now, but it is something Sony is looking into, so such applications are being considered for some point in the future. I'll try to dig up a link to the patent on this if you want more detail, but after reading it, it was the first time I thought "yes, this could possibly work" with regards to Cell's distributed processing capability.
 
Pug said:
Gofreak, the expected CPU setup is in the hands of developers, for sure! Dean C is under NDA; he's giving as much information as he can without breaking it. The information is there, but just like XB2 it is not set in stone yet. Both MS and Sony will wait till the last minute to finalise details. I know that a south coast developer in the UK has been told to expect things to change on XB2.

Thanks, it's news to me; I'll have to take a closer look at Dean C's posts then...

To clarify, I don't expect anything more than 1 or 2 PEs myself. I still think 1 PE with 8 APUs at ~4 GHz will take the performance crown in terms of CPUs next gen, but I guess we'll wait and see.

edit - I've been looking through a few of DeanoC's posts on B3D, and his most recent ones don't suggest he has any solid inside info or firm info from Sony. Just last week he was saying "I personally think 1 or 2 PU(s) [PEs] is more likely." It doesn't sound like he's been given solid info here. Can you point me toward where he talks about that, or about not being able to say much? Thanks...
 
Pug said:
If this is the case, and given that both XB2 and PS3 are using next-gen GPUs, which whichever way you look at it will be very similar in terms of spec, we get two machines of very similar performance and certainly hardly any difference in what you see on screen.
It depends how you define "very similar" performance.
At a glance, 2-3x higher performance sounds like "a lot", but the reality is that we have machines at least that far apart on the market today, and the way the games perform visually is indeed quite similar.

But nonetheless, I expect the differences will still be quite enough for fanboys to shove in each other's faces every chance they get. ;)
 
"To clarify, I don't expect anything more than 1 or 2 PEs myself. I still think 1 PE with 8 APUs at ~ 4Ghz will take the performance crown in terms of CPUs next gen, but I guess we'll wait and see."


There is little doubt that in GFLOP the cell will have a performance advantage. I don't htink that is doubt. But say MS delayed launch till say the same time as PS3, I don't think the clock speed in the 3 core unit would be upto 4.6Ghz even at that time so delaying the XB2 on CPU performance is a no brainer of MS. They are not going to able to get hold of CPU to run at that Gflop rating. I think a lot is resting on the ATI GPU which most people expect to the best GPU out there. I'm sure the Nvidia chip will be just as powerful but I'd be suprised if Nividia pulled a length on ATI seeing as they've been playing catch up for a fair while now.
 
Pug said:
There is little doubt that in Gflops the Cell will have a performance advantage; I don't think that is in doubt. But say MS delayed launch till, say, the same time as PS3: I don't think the clock speed in the 3-core unit would be up to 4.6 GHz even then, so delaying XB2 over CPU performance is a no-brainer for MS. They are not going to be able to get hold of a CPU that runs at that Gflop rating. I think a lot is resting on the ATI GPU, which most people expect to be the best GPU out there. I'm sure the Nvidia chip will be just as powerful, but I'd be surprised if Nvidia pulled a length on ATI, seeing as they've been playing catch-up for a fair while now.

NVidia haven't been playing catch-up with ATi recently; in fact, it was ATi that was playing catch-up with its latest card, which finally met the performance of an NVidia card released some months before it. Feature-wise, NVidia are also more fully compliant with newer specs (SM3.0, floating-point precision, etc.). NVidia and ATi are quite evenly matched in terms of ability, GF5xxx-style fiascos aside. I would expect whatever GPU goes into PS3 to be at least as powerful as the GPU in Xenon, although it may be quite a bit different, so as to render direct comparisons difficult.
 
Fafalada, oh god, there are going to be differences, such as "this spark is better than that spark" etc. But in terms of geometry, nah, hardly any difference. I don't expect one machine to do something as fundamental as, say, bump mapping whilst the other can't. We should get more "ART" arguments, mind, which will probably be worse than "this game has more polys than that" etc.
Anyway, you're the developer: do you see any significant onscreen advantages for a 3-core XB2 with a next-gen ATI GPU versus a single- or dual-Cell PS3 with an Nvidia next-gen GPU?
 
Gofreak, yep, the latest 6800 is a great card, but it's the first in a while. As I said, I think MS is pinning its hopes on the fact that although the ATI GPU will be off the line 6 months earlier than Nvidia's, the spec differences will be marginal. And I tend to think they will be. There's no doubt that PS3 will have all the performance advantages; I just can't see any of these advantages translating into anything majorly tangible on screen. That's why I think MS's jump to the next-gen market late this year is very well timed indeed.
 
Pug said:
Gofreak, yep, the latest 6800 is a great card, but it's the first in a while. As I said, I think MS is pinning its hopes on the fact that although the ATI GPU will be off the line 6 months earlier than Nvidia's, the spec differences will be marginal. And I tend to think they will be. There's no doubt that PS3 will have all the performance advantages; I just can't see any of these advantages translating into anything majorly tangible on screen. That's why I think MS's jump to the next-gen market late this year is very well timed indeed.

It's possible that we're hitting a ceiling in terms of how much power we can actually translate into measurable or visual differences in our games, but I'm kinda willing to bet that we haven't got there yet. The same talk about hitting ceilings happens every generation, and although it's true that to the casual eye there may be little difference between games on different platforms this generation, I think "gamers" will pick up on even the most minor of differences and turn them into the "BIGGEST FUCKING DEAL EVAR". It happened with jaggies last/this generation; who would have thought before that we'd be arguing over how smooth our lines are? Or how low-res our shadow maps were? Or how unrealistic our physics were? For sure, all the next-gen systems, right now, just look like massively powerful beasts; how on earth could we become unhappy with any of them? The problem (?) is, gamers tend to assimilate new technology very quickly; they become more technically sophisticated, and their eye becomes quite sharp. All it takes is for one little niggly difference to have a buzzword attached to it, and suddenly it's a big deal.

Visual/quantifiable differences aside, there is a psychological advantage to having the most powerful hardware on paper. Xbox certainly attracted a lot of people based on the notion that it was the most powerful system available.
 