
PlayStation 3 Cell chip aims high

SiegfriedFM said:
Rendering flat copies of your character and sending it out to everyone within reasonable range, 30-60 times per second, is stupid.

Uh, maybe that's because it's not what they were discussing; also your idea is not better in any shape or form, with all due respect. This is the most basic site that has some cool project pages that are pretty accessible (like the image warping one), but if you search around for image-based rendering techniques you'll find a plethora of papers.

UNC-Chapel Hill Link
 
Another interesting article here, with some info apparently leaked from the ISSCC papers, but some of it seems old (like the Zimmons "25GFlop" comment):

http://www.eet.com/conf/isscc/showA...SNDBGCKH0CJUMEKJVN?articleId=54200580&kc=3681

This bit, if accurate, is interesting:

According to one of the ISSCC papers on the Cell design, a single chip implements a single processing element. The initial chips are being built in 90-nm SOI technology, with 65-nm devices reportedly sampling.
 
marsomega said:
The bandwidth inside current video cards on the market is several times higher than AGP bandwidth, or PCI Express bandwidth for that matter. If texturing over the AGP/PCI bus is still nowhere near fast enough, compared to the card handling it itself, to be practical, how the hell do you assume a broadband connection will be sufficient for a PS3 to share texture data/buffer data etc. with another PS3?
bingo
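For scale, here's a rough comparison (all the figures below are my own ballpark assumptions for 2005-era hardware, not official specs):

```python
# Rough bandwidth comparison (ballpark assumed figures, not official
# specs) showing why streaming texture data over broadband is orders
# of magnitude too slow compared to a card's local memory.

GB = 1e9  # bytes

links = {
    "GDDR3 local video memory": 30.0 * GB,   # high-end card, approx.
    "PCI Express x16":           4.0 * GB,   # per direction, approx.
    "AGP 8x":                    2.1 * GB,
    "Broadband (8 Mbit/s DSL)":  8e6 / 8,    # bits -> bytes
}

local = links["GDDR3 local video memory"]
for name, bw in links.items():
    print(f"{name:26s} {bw/GB:9.3f} GB/s  ({local/bw:8.0f}x slower than local)")
```

Even with generous assumptions for the broadband side, you're tens of thousands of times short of local memory bandwidth.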
 
Lol, actually that pic from the NY times DOES look like a chip, now that I've had a second look. It just looked like a chest of drawers at first glance :lol It looks like an 8-APU PE? It seems quite small..
 
if that's a 1pe 8 apu cell, and it's capable of the numbers i've seen discussed.

HOLY SHIT.


But we'll see soon enough i suppose
 
seanoff said:
if that's a 1pe 8 apu cell, and it's capable of the numbers i've seen discussed.

HOLY SHIT.


But we'll see soon enough i suppose

Note that the pic shows the core(s), not the complete CELL chip, once packaging, pin-outs etc. are added the size is likely to be much larger.
 
seanoff said:
if that's a 1pe 8 apu cell, and it's capable of the numbers i've seen discussed.

HOLY SHIT.


But we'll see soon enough i suppose

That'd be at 90nm too. At 65nm the possibilities would be greater, but who knows.

Though I still don't think 4 PEs will happen in PS3. Maybe 2 if they get to 65nm (which is still a big if imo). There's probably more to the chip as a whole than is shown in that pic, which will add to size.

I'll stick to a 1PE PS3 CPU for now ;)
 
Elios83 said:
If they're already sampling 65nm chips then a 2 Cell 65nm PS3 processor in early 2006 could be a given.

Hmm..I'd be cautious about that for now. Sampling at 65nm doesn't mean things are working out well at 65nm, necessarily. The samples coming back may have high defect rates etc. Not to mention cost. Maybe they'll shed some light on the potential at 65nm over the next couple of days (after all, at the moment we're just going on what one article has said about that - I'd prefer to hear something directly from Sony or IBM on that).
 
90nm is already pretty advanced fabbing compared to many other chip manufacturers. I'd expect Sony to have planned PS3 around 90.

65 would just be used to eventually bring some cost/size/heat savings, allowing for loss reduction for Sony in the longer term.

I don't expect anything other than a 1PE Cell PS3 - I just want it, or one of the other consoles to amaze me.
 
gofreak said:
Hmm..I'd be cautious about that for now. Sampling at 65nm doesn't mean things are working out well at 65nm, necessarily. The samples coming back may have high defect rates etc. Not to mention cost. Maybe they'll shed some light on the potential at 65nm over the next couple of days (after all, at the moment we're just going on what one article has said about that - I'd prefer to hear something directly from Sony or IBM on that).


That's true, but they have a full year until the supposed PS3 release in March 2006 to optimize production.
That's a lot of time, much more than what GPU makers have, for example.
 
Pug said:
It is more than likely that the PS3 will be a 1PE APU unit.

That would be still very powerful but disappointing.
It would mean the gap between PSX and PS2 CPU is bigger than PS2 to PS3 CPU.
 
Elios83 said:
That would be still very powerful but disappointing.
It would mean the gap between PSX and PS2 CPU is bigger than PS2 to PS3 CPU.


yeah I think it probably means that. although the PS1 CPU (MIPS R3000a + GTE) was mainly a non-floating-point processor, doing all or most of its work in integer or fixed point, so the whole PS2 CPU was a huge, huge leap from the PS1 CPU. now if we go from a floating-point based CPU (6.2 GFlops) to a CPU with balanced floating-point and integer performance (256 GFlops / 256 Gops) it seems that it might not be as much of a leap. yet, that could also be totally untrue and we cannot jump to such conclusions til more is known about the PS3 CPU.
 
anyway, here are a few more articles, i think no older than Feb 6-7

http://news.ft.com/cms/s/6b31ebfe-786b-11d9-9961-00000e2511c8.html

IBM, Sony, Toshiba to reveal ‘superbrain chip’
By Chris Nuttall in San Francisco
Published: February 6 2005 18:29 | Last updated: February 6 2005 18:29

Semiconductor designers from International Business Machines, Sony and Toshiba will reveal on Monday the inner workings of a “supercomputer on a chip” they claim could revolutionise communications, multimedia and consumer electronics.

The Cell microprocessor has been under development by the three companies since 2001 in a laboratory in Austin, Texas.

Its unveiling at the International Solid State Circuits Conference in San Francisco has been eagerly awaited, and products containing Cell, including Sony's PlayStation 3 games console, are expected as early as next year.

Advance reports suggest the chip is significantly more powerful and versatile than the next generation of micro-processors announced by the consortium's competitors, Intel and AMD.

The two leading chipmakers are just moving from 32-bit to 64-bit computing and to dual-core processors, essentially two “brains” on a single chip. Cell is understood to have at least four cores and be significantly faster than Intel and AMD chips.

“This is probably going to be one of the biggest industry announcements in many years,” said Richard Doherty, president of the Envisioneering research firm. “It's going to breathe new life into the industry and trigger fresh competition.”

Cell is being presented as an architecture capable of wide-ranging functions and powerful parallel processing that will allow it to distribute its work among the different cores in order to perform many tasks at once.

The consortium says this will improve the quality of video delivered over the broadband internet and increase the fidelity of computer games. The Cell developers have already produced a prototype of a computer workstation with supercomputer capabilities.

High-definition TVs from Sony and Toshiba, a Sony home server for broadband content and the PlayStation 3 all featuring Cell are due to appear in 2006.

Cell's architecture is described as scalable from “small consumer devices to massive supercomputers”.

The consortium's rivals have questioned whether Cell's potential can be realised and are working on alternative multi-tasking methods. Intel has just brought forward to this year the release on desktop PCs of virtualisation technology known as Vanderpool. This can split a microprocessor into any number of virtual processors to perform different tasks across a network from a central location.

IBM is expected to begin pilot production of the Cell chip at its 300mm wafer plant in New York state in the first half of this year.
--

http://news.ft.com/cms/s/6b31ebfe-786b-11d9-9961-00000e2511c8.html

Cell,' Developed in Austin, Texas, Could Revolutionize Computing


(financialwire.net via COMTEX) -- February 7, 2005 (FinancialWire) Austin, Texas, can not be satisfied with being the home of the world's best cyclist, one of the world's top tennis players, the best barbecue (Stubbs) and some of the top creative talent in motion pictures and music.

Now its IBM (IBM), Sony (SNE) and Toshiba (TOSBF) Design Center for Cell Technology partnership has had to create what may be the world's smallest and fastest microprocessor, the "Cell."

The Cell is expected to be introduced today in San Francisco.

It is expected to be especially useful in the new digital living room, as well as high-performance engineering and scientific markets.

The chip is said to have a potential performance speed of 256 billion mathematical computations per second, which would give the chip a place among the top 500 supercomputers.

It controls an array of eight additional processors that the design team refers to as synergistic processing elements, or S.P.E.'s. Each of the S.P.E.'s is a 128-bit processor in its own right, according to published reports.

The Cell has components that have switched at 5.6 GHz, making it well suited to high-bandwidth uses. Its relatively low cost could bring home digital entertainment components in at significantly lower prices.

5.6 GHz? maybe that's just a typo. Cell was meant to have parts that are 4.6 GHz.

--

http://www.pcpro.co.uk/news/69058/ibm-sony-to-lift-the-lid-on-the-cell-processor.html

Monday 7th February 2005
IBM, Sony to lift the lid on the 'Cell' processor 11:26AM
Details of the Playstation 3 processor are expected to be revealed by Sony, IBM and Toshiba later today

The Internet rumour mill has long churned through possible details of the next-gen processor. But the technical specs are finally being revealed at a technology conference, according to Reuters.

The processor - dubbed the `Cell` - is reportedly based on the core of IBM's existing PowerPC processor line, such as those appearing in Apple desktops and servers. A multi-threading, multi-core 64-bit chip, it is designed to be capable of massive floating point processing and to be optimised for 'broadband rich media applications', such as delivering movies.

IBM and Sony claim that a one rack Cell processor-based workstation will reach 16 teraflops per second. Furthermore, the Cell processors will support clustering to act as one huge parallel processing unit. It was back in November 2004, that they showcased the next-gen Cell processor for the first time.

'Cell processor-based workstations will totally change the digital content creation environment,' boasted Masayuki Chatani, corporate CTO of Sony Computer Entertainment. 'Its overwhelming power will be demonstrated at every aspect in the development of all kinds of digital entertainment content, from movies, broadcast programs to next generation PlayStation games.'
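For what it's worth, that 16 teraflops rack figure is at least self-consistent with the 256 GFlops per-chip number floating around - a quick sanity check (assuming both press figures are accurate):

```python
# Back-of-envelope check: how many 256-GFLOPS Cell chips would a
# 16-TFLOPS rack imply? Both numbers come from press reports; the
# per-chip figure is the theoretical peak quoted around ISSCC.

per_chip_flops = 256e9   # 256 GFLOPS peak per Cell chip
rack_flops = 16e12       # 16 TFLOPS claimed for one rack

chips_needed = rack_flops / per_chip_flops
print(f"{chips_needed:.1f} chips per rack")
```

That comes out to 62.5, so a rack of 64 chips would just clear the claim - a perfectly ordinary cluster size, nothing magical.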
-


http://www.computerandvideogames.com/r/?page=http://www.computerandvideogames.com/news/news_story.php(que)id=114631

SONY READY TO UNVEIL PS3 CELL CHIP

New Cell processor may be very fast indeed...

10:39 The countdown to the next-generation begins in earnest as Sony fire the first tangible shot with the unveiling of the Cell processor - the chip which will power the PS3 - in San Francisco today.

Breath will be bated at the snappily named International Solid-State Circuits Conference in San Francisco as Sony, IBM and Toshiba draw back the curtain on a technology which is reputedly ten times faster than the current generation and has been described as a "supercomputer on a chip".

Designed for high-end graphical workstations, the Cell chip is alleged to feature four power cores, is supposed to run at 4.6GHz and will apparently be capable of 16 trillion (that's a one with a lot of noughts behind it) calculations per second. It's being lined up to sock it to Microsoft and Nintendo in terms of sheer processing grunt per machine or PGPM(the new trademarked C&VG measurement of such things).


Of course all this is all just rumour and speculation until we see the full technical specs and then have them explained to us V-E-R-Y S-L-O-W-L-Y by the legion of boffins we keep on hand for such occasions, but as soon as we know more on the PS3's tech specs, you will.

--
http://news.bbc.co.uk/1/hi/technology/4242447.stm
PlayStation 3 chip to be unveiled
Close-up of PlayStation 2 console, Sony
The successor to the PlayStation 2 will be launched in 2006
Details of the chip designed to power Sony's PlayStation 3 console will be released in San Francisco on Monday.

Sony, IBM and Toshiba, who have been working on the Cell processor for three years, will unveil the chip at a technology conference.

The chip is reported to be up to 10 times faster than current processors.

It is being designed for use in graphics workstations, the new PlayStation console, and has been described as a supercomputer on a chip.

Sony has said the Cell processor could be used to bridge the gap between movies and video games.

Special effects and graphics designed for films could be ported for use directly in a video game, Sony told an audience at the E3 exhibition in Los Angeles last year.

'Ideal technology'

Cell could also be marketed as an ideal technology for televisions and supercomputers, and everything in between, said Kevin Krewell, the editor in chief of Microprocessor Report.

The chip will be made of several different processing cores that work on tasks together.

The PlayStation 3 is expected in 2006 but developers are expecting to get prototypes early next year to tune games that will appear on it at launch.

Details of the chip will be released at the International Solid State Circuits Conference in San Francisco.

Some details have already emerged, however.

When put inside powerful computer servers, the Cell consortium expects it to be capable of handling 16 trillion floating point operations, or calculations, every second.

Detailed graphics

The chip has also been refined to be able to handle the detailed graphics common in games and the data demands of films and broadband media.

IBM said it would start producing the chip in early 2005 at manufacturing plants in the US. The first machines off the line using the Cell processor will be computer workstations and servers.

A working version of the PS3 is due to be shown off in May 2005 but a full launch of the next generation console is not expected to start until 2006.

"In the future, all forms of digital content will be converged and fused onto the broadband network," Ken Kutaragi, chief operating officer of Sony, said last year.

"Current PC architecture is nearing its limits," he added.

--

http://www.nytimes.com/2005/02/07/t...78e28e3f45125c3d&ei=5040&partner=MOREOVERNEWS
Smaller Than a Pushpin, More Powerful Than a PC
By JOHN MARKOFF

Published: February 7, 2005

The Cell microprocessor could enhance home entertainment.

SAN FRANCISCO, Feb. 6 - In a new volley in the battle for digital home entertainment, I.B.M., Sony and Toshiba will announce details Monday of their newest microprocessor design, known as Cell, which is expected to offer faster computing performance than microprocessors from Intel and Advanced Micro Devices.

Anticipation of the announcement, to be made at an industry conference here, has touched off widespread industry speculation over the impact of the new chip technology, which promises to enhance video gaming and digital home entertainment.

Sony plans to use the new Cell in its PlayStation 3, likely to be introduced in 2006, and Toshiba plans to use the chip in advanced high-definition televisions, also to be introduced next year.

However, many industry executives and analysts say that Cell's impact may ultimately be much broader, staving off the PC industry's efforts to dominate the digital living room and at the same time creating a new digital computing ecosystem that includes Hollywood, the living room and high-performance scientific and engineering markets.

"There is a new game in town, and it will revive an industry that has been kind of sleepy for the last few years," said Richard Doherty, a computer industry analyst and president of Envisioneering, a market research company in Seaford, N.Y.

The Cell's introduction also comes at a time when the computer industry has largely given up investing in fundamentally new processor designs and has instead chosen to use the additional space available on the newest generation of chips to place multiple processors and thus add performance.

The Cell chip, computer experts said, could have a theoretical peak performance of 256 billion mathematical operations per second. With that much processing power, the chip would have placed among the top 500 supercomputers on a list maintained by scientists at the University of Mannheim and the University of Tennessee as recently as June 2002.
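If accurate, that 256 billion ops/sec figure decomposes neatly - here's one plausible breakdown (my own assumption about how the peak is counted, not a confirmed spec):

```python
# Plausible (assumed, not confirmed) decomposition of the 256-GFLOPS
# peak: 8 SPEs x 4-wide SIMD (128-bit registers / 32-bit floats) x
# 2 ops per lane (a fused multiply-add counts as 2 flops) x 4 GHz.

spes = 8
simd_width = 4
ops_per_lane = 2
clock_hz = 4e9

peak = spes * simd_width * ops_per_lane * clock_hz
print(f"{peak/1e9:.0f} GFLOPS")  # 256
```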

"This is extremely impressive," said Kevin Krewell, editor in chief of Microprocessor Report, an industry technical publication, "and it proves that architectural innovation isn't dead."

Several computer industry executives warned, however, that despite the Cell's impressive specifications, success is not guaranteed for any new design in the computer industry. For example, Intel and Hewlett-Packard have spent more than a decade and hundreds of millions of dollars on the Itanium and the chip has yet to find a receptive market.

The Cell has a modular design based on a slightly less powerful I.B.M. processor that is currently in G5 64-bit desktop computers from Apple Computer. Additionally, the Cell architecture is distinguished by the fact that it controls an array of eight additional processors that the design team refers to as synergistic processing elements, or S.P.E.'s. Each of the S.P.E.'s is a 128-bit processor in its own right.

The Cell has some components that in the lab switch at 5.6 GHz, and several people familiar with the design said that it was both more flexible than is generally understood and that it has been designed with high bandwidth communications, such as high-speed data links to homes, in mind.

"Cell has been optimized for broadband-rich applications," said Jim Kahle, I.B.M.'s director of technology at the Design Center for Cell Technology, the headquarters in Austin, Tex., for the I.B.M., Sony and Toshiba partnership.

He said that I.B.M. had refined a technology also being developed by Intel called "virtualization," which is designed to isolate applications from one another. Originally used in mainframe computing applications, the technology is now being exploited by consumer electronics designers to run demanding applications like video decompression and decryption simultaneously.

One significant risk for Sony and I.B.M. is that the Sony PlayStation 3 game machine is likely to be introduced later than the next generation of Xbox from Microsoft. The PlayStation 2 beat the Xbox to market and Microsoft was never able to catch up, meaning that it lost hundreds of millions of dollars on its bet on the video game market.

In its next version of the Xbox, Microsoft plans to shift from using Pentium chips from Intel to a PowerPC microprocessor from I.B.M. The chip will have two PowerPC processor cores, but it will not be as radically new as the I.B.M. Cell design that Sony plans to use, said one executive who is familiar with the Microsoft project.

That will make for a fascinating rivalry: Sony is betting that its computer horsepower advantage will be large enough to give it a quality advance over Microsoft, even if it arrives late.

"Our goal with the Cell is to be an order of magnitude faster," said Lisa Su, an I.B.M. executive in charge of technology development and licenses.

Many industry executives believe that because of its low cost, the Cell is a harbinger of a fundamentally new computing era that will push increasingly into consumer applications.

"I think it will aid in some of the convergence between consumer and corporate I.T. and this will accelerate amazingly from the consumer side," said Andrew Heller, a former I.B.M. processor designer who is now chairman of Heller & Associates, a consulting firm in Austin, Tex.

One area of wide speculation is whether Apple might become a partner in the Cell alliance in the future. Apple is already the largest customer for the PowerPC chip, and it would be simple for the company to take advantage of the Cell design. Several people familiar with Apple's strategy, however, said that the computer maker had yet to be convinced that the Cell technology could provide a significant performance advantage.


http://www.totalvideogames.com/pages/articles/index.php?article_id=7157
CELL Processor Revealed Today

07/02/2005
By: Chris Leyton


Sony, Toshiba and IBM set to lift the lid on the CELL processor at the International Solid State Circuit Conference...

The STI Group (Sony, Toshiba, IBM) confirmed plans to finally unveil the CELL processor and describe it in detail for the first time later today at the International Solid State Circuits Conference in San Francisco. Despite information dripping slowly through over the last few years, specific information on the “supercomputer-on-a-chip” has been sparse to say the least, and the CELL processor has built up an almost mythical presence.

Many consider the CELL processor to be the biggest threat to the Intel/AMD stranglehold on the computer market; however its use goes far further than that, designed to run portable electronics, home entertainment devices and powerful computers.

Set to make its debut in computer workstations and servers before making its grand debut in the Playstation3 next year, the CELL processor will also feature in a wide range of appliances, from televisions to supercomputers and everything else in between.

Based on the core of IBM's Power processor line, the CELL processor contains multiple cores which effectively allow it to run as numerous chips in one, capable of massive floating-point operations (flops) and optimized for supercomputer workloads and broadband entertainment, such as movies and other forms of digital media on demand. When utilised in a server set-up, the CELL processor is believed to be capable of handling up to 16 trillion flops every second!

Fundamentally built around distributed computing, it's possible to stack CELL processors for increased performance, although whether this aspect will feature in the Playstation3 looks unlikely.

Naturally, trying to usurp the x86 technology which has stood the test of time is a difficult task that Sony, Toshiba and IBM will have to face; however at this stage, before the CELL has even been demonstrated in use, it's fair to say that the next few years will be an interesting time for technophiles.
 
mrklaw said:
90nm is already pretty advanced fabbing compared to many other chip manufacturers. I'd expect Sony to have planned PS3 around 90.

It's very unlikely. Look at the shortages of 90nm parts for the PS2 and the PSP - they have no spare capacity to manufacture more 90nm parts.
 
ThirdEye said:
It's very unlikely. Look at the shortages of 90nm parts for the PS2 and the PSP - they have no spare capacity to manufacture more 90nm parts.

While they have heavily invested with IBM in 65nm manufacturing plants.
 
Elios83 said:
While they have heavily invested with IBM in 65nm manufacturing plants.

I'm pretty sure the PS3 Cell CPU will be at 65nm and the GPU at 90nm. Knowing Sony, they will put as many PEs in there as possible and look for cost reduction later. That said, I think a 4PE aka 1TFlop Cell CPU is not impossible.

Fredi
 
Elios83 said:
That would be still very powerful but disappointing.
It would mean the gap between PSX and PS2 CPU is bigger than PS2 to PS3 CPU.
Are you sure? :)

PS1->PS2
Core:
R3000@33mhz -> R5900@300Mhz ~ 9x jump + Architectural improvements of R59k (dual issue, slightly larger cache, extended instruction set) which aren't easily quantifiable.

Math:
PS1 GTE(33Mhz) - 2 cycles for dot product
PS2 VU(300mhz*2) - 1 cycle for dot product
Roughly 36x jump + again architectural improvements, VUs have standalone operation, actual floating point etc.

PS2->PS3
Core:
R5900@300mhz -> Unnamed PPC core @ 4Ghz (perhaps higher, but let's go with this for now) ~ 13x jump + architectural improvements (likely much better caching scheme, some form of OOE etc.).

Math:
(single cycle dot product on both)
2*VU@300mhz -> 8*SPU@4Ghz ~ 52x jump + architectural improvements (standalone DMA operation on SPUs, more powerful instruction set, larger local memories etc.).

I admit, I had to make some assumptions on PS3 side, but I didn't assume anything drastic like claim the PPC core is a G5 (which would make the difference to R59k much bigger).
Anyway, in terms of raw performance the jump is a good 50% bigger this time (possibly more, if the thing will really be more than 4Ghz).
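For anyone who wants to check the arithmetic, here it is spelled out (same assumptions as above - 4Ghz clock, 8 SPUs; the last ratio actually comes out nearer 53x):

```python
# Re-derive the generational jump ratios from the post above, using
# the same assumptions (4 GHz PS3 core, 8 SPUs, 1-cycle dot products).

# Core clock jumps
ps1_to_ps2_core = 300 / 33      # R3000@33MHz -> R5900@300MHz  (~9x)
ps2_to_ps3_core = 4000 / 300    # R5900@300MHz -> PPC@4GHz     (~13x)

# Dot products per second (units * clock / cycles-per-dot)
ps1_math = 1 * 33e6 / 2         # GTE: 2 cycles per dot product
ps2_math = 2 * 300e6 / 1        # 2 VUs, 1 cycle per dot product
ps3_math = 8 * 4e9 / 1          # 8 SPUs @ 4 GHz (assumed)

print(f"core: {ps1_to_ps2_core:.0f}x then {ps2_to_ps3_core:.0f}x")
print(f"math: {ps2_math/ps1_math:.0f}x then {ps3_math/ps2_math:.0f}x")
```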
 
Fafalada said:
Are you sure? :)

PS1->PS2
Core:
R3000@33mhz -> R5900@300Mhz ~ 9x jump + Architectural improvements of R59k (dual issue, slightly larger cache, extended instruction set) which aren't easily quantifiable.

Math:
PS1 GTE(33Mhz) - 2 cycles for dot product
PS2 VU(300mhz*2) - 1 cycle for dot product
Roughly 36x jump + again architectural improvements, VUs have standalone operation, actual floating point etc.

PS2->PS3
Core:
R5900@300mhz -> Unnamed PPC core @ 4Ghz (perhaps higher, but let's go with this for now) ~ 13x jump + architectural improvements (likely much better caching scheme, some form of OOE etc.).

Math:
(single cycle dot product on both)
2*VU@300mhz -> 8*SPU@4Ghz ~ 52x jump + architectural improvements (standalone DMA operation on SPUs, more powerful instruction set, larger local memories etc.).

I admit, I had to make some assumptions on PS3 side, but I didn't assume anything drastic like claim the PPC core is a G5 (which would make the difference to R59k much bigger).
Anyway, in terms of raw performance the jump is a good 50% bigger this time (possibly more, if the thing will really be more than 4Ghz).


Thanks for the details :)
But I still think that, considering the Cell technology's potential and its scalability, a 90nm 256 Gigaflops CPU in 2006 would be disappointing.
I'm pretty confident the PS3 CPU will be a 65nm part with a 2 Cell core.
 
If Fafalada guessed right, you have a bigger jump from the PlayStation 2's CPU to the PlayStation 3's CPU than from the PSOne's CPU to the PlayStation 2's CPU... we have not seen the I/O link connecting all the devices together, plus we have yet to see how much RAM the PlayStation 3 will have and how fast it will be. Do you know about the GPU? Not yet ;).

I think IBM, SCE/Sony, Toshiba, nVIDIA and RAMBUS know what they are doing.
 
How long until we have solid word on what is really going on?


Cheers,
bbyybb.
 
doncale said:
probably hours to get word, and days to digest :)

The info will probably be dispersed over the next few days - only one of the four or five presentations is today, and it may not even be the most useful from an information perspective.

Also, I missed this little nugget on my first read through the NYTimes article, but it seems significant given the speculation over what the PU will be:

The Cell has a modular design based on a slightly less powerful I.B.M. processor that is currently in G5 64-bit desktop computers from Apple Computer

Is that our PU? A slightly cut-down version of the G5? Or a slightly underclocked version of the G5, with the APUs etc. at the higher clockspeeds (if that'd even make sense)?
 
More Photos! :lol I want a blue one in my PS3..

cell21iw.jpg


cell9vw.jpg


Methinks STI are taking advantage of the ISSCC as more than "just" a techie thing, and using it almost more as a PR event.
 
Damn is it horribly wrong if looking at the picture of a chip makes me hungry?

That G5 thing is just the typical speculation everyone has been making for the Xenon CPU too; since it's known to be a PPC ISA, hyping is easiest by tying it to some existing product.
And of course, what better than IBM's highest-end desktop CPU - it makes it sound so much more impressive than guessing it was based on some other core, or god forbid, even a new design.
 
One feasible, and (at least it seems to me) easy-to-implement feature of network-distributed, game-related computing, with no performance hit and no lag, would be the world-evolution calculation. Remember how the Blue Box guys complained about Fable that they could not do the world evolving as it would require too much CPU power?

Well now think that your game world is split between several areas, and your local console calculates changes only to the area you are presently exploring, so that you see the changes/updates instantly. All other areas evolve (trees grow, characters die of old age, etc.) at the same pace, but the calculation of their changes is passed onto other consoles on the network that are not doing much at the moment. As those changes do not have to update in realtime (you wouldn't see them until you get to that area anyways), the calculation/update lag would not be important. That way you free up your local CPU of a massive calculation burden, and your game suffers no loss because of that.
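Just to illustrate, a toy sketch of what I mean (everything here is hypothetical, obviously not a real PS3 API):

```python
# Toy sketch of the idea (hypothetical, not any real console API):
# simulate the player's current zone locally every frame, and hand
# far-away zones to idle network peers, applying their accumulated
# results only when the player eventually travels there.

class Zone:
    def __init__(self, name):
        self.name = name
        self.age = 0                # stand-in for "trees grow, NPCs age"

    def evolve(self, ticks=1):
        self.age += ticks

class World:
    def __init__(self, zone_names, player_zone):
        self.zones = {n: Zone(n) for n in zone_names}
        self.player_zone = player_zone
        self.pending = {}           # zone name -> ticks owed by remote peers

    def frame(self):
        # Local console: only the active zone is simulated in realtime.
        self.zones[self.player_zone].evolve()
        # Every other zone's update is "mailed out" to idle peers.
        for name in self.zones:
            if name != self.player_zone:
                self.pending[name] = self.pending.get(name, 0) + 1

    def travel_to(self, name):
        # On a zone transition (a load screen), apply whatever the peers
        # have computed; the lag is invisible because the player couldn't
        # observe the zone in the meantime anyway.
        self.zones[name].evolve(self.pending.pop(name, 0))
        self.player_zone = name

world = World(["forest", "town", "castle"], player_zone="town")
for _ in range(100):
    world.frame()
world.travel_to("forest")
print(world.zones["forest"].age, world.zones["town"].age)  # 100 100
```

The point being: the remote results only ever need to be merged at a load boundary, so latency doesn't matter.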

This is something like what MMORPG games are doing, probably more complex I think, but with no dedicated central servers needed.
 
gofreak said:
Lol, actually that pic from the NY times DOES look like a chip, now that I've had a second look. It just looked like a chest of drawers at first glance :lol It looks like an 8-APU PE? It seems quite small..

lol, okay I guess I'm not the only one that thought it was a chest of drawers :)
 
Marconelly,
there's no reason why that's not possible for any of the other consoles, let alone current gen consoles. I don't really see how Cell's architecture would really be a benefit in this type of situation since it won't be able to handle the synchronization issues internally. I mean, what would happen when you start getting to the edges of zones and results haven't been returned yet? Once the results come back do all the trees in the area all of a sudden grow or something? It's a non-trivial software problem that I believe would require far more intelligence built-in than Cell has.
 
Marconelly said:
One feasible, and (at least it seems to me) easy-to-implement feature of network-distributed, game-related computing, with no performance hit and no lag, would be the world-evolution calculation. Remember how the Blue Box guys complained about Fable that they could not do the world evolving as it would require too much CPU power?

To simulate an entire land evolving in real time WHILE you're playing a game is out of the reach of the Xbox and of the PS3. Sure, you can distribute that across a large number of consoles, but you've just made life suck much, much more for yourself as a developer. Now you have to synchronize the various networked nodes (PS3s) in your simulation and build in a level of fault tolerance beyond what you have in current server farms. Anyone can power off their PS3 at any time, their machine isn't dedicated to your game, and unless you're dumbing down the graphics the remote PS3s should be spending most of their time rendering content - not doing simulation. Then there is the sweet, sweet lag, and the more detrimental open-door hacks you expose your world to now that each of the remote nodes (which you can no longer secure, because you don't have physical access to them) can be exploited separately.

No.

This is something like what MMORPG games are doing, probably more complax, I think, and with no dedicated central servers needed.

The average MMORPG consists of huge arrays of servers and databases to maintain the world. Those servers are responsible for keeping track of player state, combat, etc. All of this has to be orchestrated in a controlled environment to prevent (easy) exploits and reduce the amount of traffic between players. If you're the only one in the world, or in one area of the world, that's fine. But if there are 100 people in that area of the world - you now leave the responsibility of synchronizing that part of the world to 100 people, and you have to make sure that each of those 100 remote nodes is changing/evolving the world at the same rate.

Peer to Peer massively multiplayer games will still be well beyond the reach of the PS3 both from an architectural perspective and CPU horsepower perspective. Distributing games across remote consoles is impractical for a large number of reasons.
 
there's no reason why that's not possible for any of the other consoles, let alone current gen consoles. I don't really see how Cell's architecture would really be a benefit in this type of situation since it won't be able to handle the synchronization issues internally. I mean, what would happen when you start getting to the edges of zones and results haven't been returned yet? Once the results come back do all the trees in the area all of a sudden grow or something? It's a non-trivial software problem that I believe would require far more intelligence built-in than Cell has.
Yeah, I was talking more in general than Cell specific. I'm aware this would probably be possible even today, but I just wanted to give an example of network-distributed game computing.

At the edges of zones in my simple example, the game would stop to load, and possibly calculate any remaining changes that have not yet been returned from outside machines. Then, when it loads, you would be presented with an evolved scene that has been calculated all the while.

I think the benefit of the cell could be that it would be easy to find a "spare" computing unit on the network, as some of the cores or some of the APUs on many machines would not be allocated to execute anything in some of the games, and those could be easily used to perform their outsourced function with absolutely no hit to the game that is being played locally.
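The "spare unit" idea above could be sketched roughly like this - a purely hypothetical toy, with made-up names, not anything from Sony's actual Cell software model: a coordinator only hands outsourced work to units that the local game has left unallocated, so the local game is untouched.

```python
# Hypothetical sketch: outsource work only to units the local game isn't using.
# All names here are invented for illustration.

def find_spare_apus(nodes):
    """Return (node_id, apu_index) pairs that the local game left unallocated."""
    spare = []
    for node_id, apus in nodes.items():
        for apu_index, allocated in enumerate(apus):
            if not allocated:          # unit is idle -> safe to borrow
                spare.append((node_id, apu_index))
    return spare

# Each node reports which of its 8 APUs its own game has claimed (True = busy).
network = {
    "ps3-A": [True] * 8,                # fully loaded, nothing to borrow
    "ps3-B": [True] * 6 + [False] * 2,  # two idle APUs
}

print(find_spare_apus(network))  # -> [('ps3-B', 6), ('ps3-B', 7)]
```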
 
Don't you guys know we can NEVER let these supercomputers be omnipresent and talk to each other? Have you learned nothing from the Terminator movies specifically and Sci-Fi in general?

The computers will be smart enough to realize that they can take over the world and kill us all! PS3 = Skynet!
 
and unless you're dumbing down the graphics the remote PS3s should be spending most of their time rendering content - not doing simulation. Then there is the sweet sweet lag and the more detrimental open door hacks that you open your world up to now that each of the remote nodes (which you no longer can secure because you don't have physical access to) can be exploited seperately.
Why should those external machines be doing rendering when they are calculating something you don't even see? They should just be calculating what your machine will render once you get to that now-distant area. For that same reason, the lag wouldn't matter, and the error management would be easy - remember, these machines are calculating something you are not going to see anytime soon. Sure, the outside nodes can be exploited, but the central game you're playing could easily perform some checks to see if the returned data is valid (something like what SETI@home etc. are doing), and perhaps even use more external machines for the same task, then compare results (again, what SETI@home is doing).
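The SETI-style cross-check being suggested here can be sketched in a few lines - a toy model with invented names, not anything announced for Cell: issue the same work unit to several untrusted nodes and accept a result only when enough of them agree.

```python
# Toy sketch of SETI@home-style result validation: send the same work unit
# to several untrusted nodes, then accept the majority answer.
from collections import Counter

def validate(results, quorum=2):
    """results: answers from independent nodes for the same work unit.
    Return the answer reported by at least `quorum` nodes, else None."""
    answer, count = Counter(results).most_common(1)[0]
    return answer if count >= quorum else None

# Two honest nodes agree; one exploited node returns a bogus value.
print(validate([42, 42, 9999]))  # -> 42 (accepted)
print(validate([42, 7, 9999]))   # -> None (no agreement, recompute)
```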

Peer to Peer massively multiplayer games will still be well beyond the reach of the PS3 both from an architectural perspective and CPU horsepower perspective. Distributing games across remote consoles is impractical for a large number of reasons.
Yet, I have the feeling this is exactly what they are shooting for with Cell. I think it's too soon to proclaim something like this to be beyond the reach of something we know so little about. Or it may be impossible on some large level, but possible on smaller, limited situation level. Say, in my example, the game would stop to load and finish any unfinished calculations before entering the zone that was manipulated by an outside machine.

*Edit*, I missed your MMORPG comment. My example was a single player game, btw.
 
The computers will be smart enough to realize that they can take over the world and kill us all!
You are forgetting it's IBM creating these chips. Once the GRID becomes self-aware (I think I predicted 2012 for the Cell userbase to be high enough), it will engage in an infinite chess game with itself, leaving the world in the stone age.
 
Honestly, I think CELL will remotely do my laundry, wipe my ass and ensure I always have a full glass of my favorite beverage in hand. :lol
 
Marconelly said:
Why should those external machines be doing rendering when they are calculating something you don't even see?

So these other players are just donating their PS3s for your enjoyment? Sure, if you've got a lot of unused machines sitting around, cool. But I seriously doubt that if I saw my machine saying "computing world for Marconelly, please wait until he's finished playing so I can render stuff for you", it would stay plugged into the wall. Cell != free CPU and bandwidth for everyone else, because it assumes that those machines are doing nothing for the other people that own them... and that those other people don't mind donating their machines for someone else's enjoyment.

Sure, the outside nodes can be exploited, but the central game you're playing could easily perform some checks to see if the returned data is valid (someting like SETI@Home, etc. are doing) and perhaps even use more external machines for the same task, then compare results (again, what SETI@home is doing)

First problem: SETI@home isn't a game. No other computer in the world uses the results of any other node for anything.

Second problem: SETI still uses a central server. As a work segment is computed, it is sent back to a central server for validation. The mechanism that SETI uses for this is built for slow, long-running tasks that don't need to be distributed. That isn't the same data model nor technology model as an MMORPG.


Yet, I have the feeling this is exactly what they are shooting for with Cell. I think it's too soon to proclaim something like this to be beyond the reach of something we know so little about.

This is a fundamental design problem, not a technology problem.

Or it may be impossible on some large level, but possible on smaller, limited situation level. Say, in my example, the game would stop to load and finish any unfinished calculations before entering the zone that was manipulated by an outside machine.

What about all the other people in the zone? This is a massively multiplayer game, after all. Your machine can't just stop doing work on that zone now that you've left it - what about the other 100-1000 people in the zone that you just left? How do you deal with the 'starvation' problem, where areas that players don't visit often don't get updated? How do you deal with the world-synchronization problem with the ten thousand other people in the game with you (best case)? Since there is no central server, how are people going to join the world - if there is just one world, you're talking about a distribution problem across potentially tens of millions of PS3s!
 
So these other players are just donating their PS3s for your enjoyment? Sure, if you've got a lot of unused machines sitting around, cool. But I seriously doubt that if I saw my machine saying "computing world for Marconelly, please wait until he's finished playing so I can render stuff for you", it would stay plugged into the wall. Cell != free CPU and bandwidth for everyone else, because it assumes that those machines are doing nothing for the other people that own them... and that those other people don't mind donating their machines for someone else's enjoyment.
Again, as I said above, your game would be completely unaffected. Only the spare PU/APU units will be used for outsourced calculation. This is assuming that some (many?) games simply will not be programmed with such parallelism in mind as to use every APU in your console.

Second problem SETI uses a central server still.
Well, in my example, your local machine is the central server. It's performing the checks, and it calculates any missing data before you enter the next zone.

What about all the other people in the zone? This is a massively multiplayer game after all.
Actually, I was talking about a Fable-like, single player game all the while. I don't know if I made that clear enough.
 
check it..

http://www.realworldtech.com/forums...PostNum=3098&Thread=1&entryID=45958&roomID=13

Name: David Wang (dwang@realworldtech.com) 2/7/05

Today is the day that the CELL processor family gets announced.
I'll be writing a few things about the various papers at ISSCC on CELL, but before the news conference starts and the papers get their official unveiling, some interesting data has already been presented in the technical digests.

The CELL processor presented has one 64b PPC core acting as the traditional scalar processor, complete with its own L2. The PPE (PowerPC Processing Element) is connected to 8 SPEs (Synergistic Processing Elements). The SPEs are the magic glue that is supposed to contain an enormous amount of compute power, and a bunch of them gets you the enormously large flop rating that we've all heard much about.

Some stats.

1. 90nm SOI process.

2. Logic depth is functionally equivalent to about 20 FO4 (est), but circuit speed equivalence is 11 FO4 per stage. The short pipestage circuit depth is reached with "circuit efficiencies" and Dynamic logic !?!

3. With a per-stage delay of 11 FO4, the schmoo plots show that the SPEs can crank from 3.2 GHz @ 0.9V Vdd to 5.2 GHz @ 1.3V Vdd. The entire chip has a similar frequency/voltage range, but to get to 5.2 GHz @ 1.3V, each SPE will eat 11~12W. Add in the rest and the chip will get really hot. 4 GHz @ 1.1V = 4W per SPE seems to be the nominal range.

4. Die size per SPE is 2.5 x 5.81 mm^2. The entire chip with 8 SPEs seems to be about 17.2 x 12 mm^2. That seems to be an awfully large chip for IBM. The CPU to be used in PS3/Xbox2 will probably be the 65nm version, or it'll have to have fewer SPEs.

6. As previously announced, the off chip I/O interface is Rambus Redwood and the memory interface is XDR. Similar clocking/deskewing schemes. Looks to be about ~50 GB/s BW to memory, and 50~100 GB/s to I/O.

I'll write up articles as the papers are presented.
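Two back-of-envelope checks on the quoted stats (my arithmetic, not from the digest): the classic dynamic-power model P ~ C·V²·f lets you sanity-check the per-SPE wattage, and the quoted die dimensions let you estimate how much of the chip the SPEs occupy.

```python
# Back-of-envelope checks on the quoted ISSCC stats (illustrative arithmetic).

# 1) Dynamic power scales roughly as P ~ C * V^2 * f. Scaling the nominal
#    SPE point (4 W @ 4 GHz, 1.1 V) up to 5.2 GHz, 1.3 V:
p_predicted = 4.0 * (1.3 / 1.1) ** 2 * (5.2 / 4.0)
print(f"{p_predicted:.1f} W")   # -> 7.3 W; the reported 11~12 W suggests
                                #    leakage climbs steeply at 1.3 V

# 2) Die area: eight SPEs at 2.5 x 5.81 mm^2 each, on a 17.2 x 12 mm^2 die.
spes_area = 8 * 2.5 * 5.81      # 116.2 mm^2
die_area = 17.2 * 12.0          # 206.4 mm^2
print(f"{100 * spes_area / die_area:.0f}% of die")   # -> 56% of die
```

So by these figures the SPEs alone take a bit over half the die, which fits the "awfully large chip" remark above.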
 
The CPU to be used in PS3/Xbox2 will probably be the 65nm version or it'll have to have fewer SPE's.
As far as I remember, the Xbox 2 CPU won't even be the same thing (no SPEs, two Power cores...). Or was that leaked info all wrong, and PS3 and Xbox 2 will use the same Cell-based CPU?

Also, has there ever been any talk about a Cell with less than 8 SPEs?

What happened to the speculation that the physical Cell chip is supposed to have four of these "core + 8 SPEs" units on it?
 
Phoenix said:
To simulate an entire land evolving in real time WHILE you're playing a game is out of the reach of the Xbox and of the PS3. Now sure, you can distribute that across a large number of consoles, but you just made life suck much, much more for yourself as a developer. Now you have to synchronize the various networked nodes (PS3s) in your simulation and build in a level of fault tolerance that is beyond what you have in your current server farms. Anyone can power off their PS3 at any time, their machine isn't dedicated to your game, and unless you're dumbing down the graphics, the remote PS3s should be spending most of their time rendering content - not doing simulation. Then there is the sweet, sweet lag, and the more detrimental open-door hacks that you open your world up to now that each of the remote nodes (which you can no longer secure, because you don't have physical access to them) can be exploited separately.

No.



The average MMORPG consists of huge arrays of servers and databases to maintain the world. Those servers are responsible for keeping track of player state, combat, etc. All of this has to be orchestrated in a controlled environment to prevent (easy) exploits and reduce the amount of traffic between players. If you're the only one in the world, or in one area of the world, that's fine. But if there are 100 people in that area of the world - you now leave the responsibility of synchronizing that part of the world to those 100 people, and you have to make sure that each of those 100 remote nodes is changing/evolving the world at the same rate.

Peer to Peer massively multiplayer games will still be well beyond the reach of the PS3 both from an architectural perspective and CPU horsepower perspective. Distributing games across remote consoles is impractical for a large number of reasons.


Oddly enough, no matter how you say it, some people won't take it in. Hell, didn't I even go into lecturing about processes/threads in another post?

But to add to what you said: some people don't know that process/thread handling is not perfect. We aim for the middle ground between all outcomes, worst and best. When you extend that to include the internet as part of the computing structure, you have nothing but hell, which further escalates the programming hell not only at the high level but at the low level as well. Then again, you have to study or research this from the ground up to realize the many pitfalls you would have to address - many of which we don't have solutions for, just probabilities of failure.

Even if the effort required is shot to hell because you have five-year-old boys/girls in sweatshops programming day and night (or EA), you still haven't outweighed the cons of going with this.

For example.

Let's say you need a frame of memory. You have two reads per memory operation: one reads the page table information, and the other accesses the page. Registers on the CPU are expensive, but let's say you fork over a little cash for some extra registers to build a sort of TLB lookup table. Of course, these registers are few, since they are considerably more expensive, so let's say they hold only 100 references versus, say, the 32k stored in the page table. When a memory reference is in your TLB table, you'll have one memory access + the time to get the info from the registers (registers on the CPU will be much faster), versus two memory accesses. That's great, but what if the TLB misses - in other words, it doesn't have the memory reference you are looking for? Then you have the time to access the TLB + the time to access memory twice, so it ends up being worse. So you have to design an algorithm that best suits the computing characteristics. As always, things aren't so perfect, and it's almost never the most obvious solution. This is just a small example. (Only 32k? Are you kidding me?) Things get way up there, not to mention this is just fast memory and even faster registers that don't even take I/O into account.

But the point of that example is that in the process of adding something to make it considerably faster, you just as easily make it considerably worse. Now imagine taking the nature of the internet into account concerning high performance computing.

EDIT:

Oh. My. God. I am not getting anything done today. Deposits to the bank, orders I have to make, credit cards to pay, and appointments. Damn you all. :-(
 