PSM: PS4 specs more powerful than Xbox 720

Status
Not open for further replies.
It always makes more sense to do it on the GPU, unless you've sunk otherwise wasted money into a failed CPU design.

depends what you're doing.

You can't do *everything* on the GPU. Well, maybe you could, but then you're taking time away from it doing other things. You could almost not bother with a decent CPU, just something good enough to handle basic housekeeping, and let a GPGPU do everything else. But I can't help thinking that wouldn't give the best results.

Modern multicore, multithreaded CPUs have plenty of grunt for tasks that free up the GPU to do other, more graphically focused things.
 
If they can integrate the necessary SPU sauce for BC into the 'main' CPU rather than on a separate chip, it would be better and more usable, I think.

Power7+ or Power8 sounds like it could be an elegant solution. IBM has said that Cell development would be rolled into their next lines. Both of these are slated to incorporate 'accelerators' - remind you of anything? :)

There's been a lot of talk about Cell being a dead end, about Sony being foolish to use it again, but I think it depends what you mean by 'Cell'. Power7+/Power8 may well be wrapping in some of the key bits of Cell - or in a custom variant, be able to - which if they did would yield something close to a next-gen cell/power hybrid.

Yes, I think for the next lap it'd be more fruitful to improve Cell in other ways rather than scaling up the number of cores. Tighter integration of the SPUs with the CPU *and* GPU would enable more flexible algorithms on both sides. Currently, developers have to manage the concurrency and synchronization manually. Memory sharing performance and versatility should be improved. More flexible LocalStore usage would be great. Clocking it up to keep up with the GPU may be good. The SPUs won't be able to compete with the GPU threads for brute-force graphics work, but their flexible memory controller and cores should let them run a different class of math and algorithms faster. They should be more nimble than the GPU's long pipeline but more efficient than a traditional CPU's ad hoc nature.

The isolated memory and autonomous nature of the SPU is also useful for OS security. I suspect Sony may yet rely on an improved version. The data locality mantra is always useful in any architecture; a trained SPU programmer should be able to fly here too.
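To make the "data locality mantra" concrete, here's a toy Python sketch of the SPU-style pattern: the programmer explicitly streams fixed-size tiles into a small local buffer, computes on them, and moves on, instead of letting the core chase pointers through main memory. All names here (LOCAL_STORE_SIZE, dma_get) are illustrative, not real Cell SDK calls.

```python
LOCAL_STORE_SIZE = 256  # elements, standing in for the SPU's 256 KB local store

def dma_get(main_memory, offset, count):
    """Simulate a DMA transfer from main memory into the local store."""
    return main_memory[offset:offset + count]

def process_stream(main_memory):
    """Sum of squares, computed one local-store-sized tile at a time."""
    total = 0
    for offset in range(0, len(main_memory), LOCAL_STORE_SIZE):
        tile = dma_get(main_memory, offset, LOCAL_STORE_SIZE)  # explicit fetch
        total += sum(x * x for x in tile)                      # compute on local data
    return total
```

On real hardware the point of this structure is that the next tile's DMA can be kicked off while the current tile is being processed, hiding the memory latency the post is talking about.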
 
To the anti-32-SPU naysayers.

Fluid processing. Liquids and smoke. Many more particles and debris for explosions. Better environmental destructibility. Soft bodies. Cloth. Generally less static worlds.

Yes, the GPU renders the world, but the CPU moves it and gives it life. Dynamic environments are FUN by their very nature. Most humans have the desire to destroy things but they know they can't do it in real life. Destruction simulation subconsciously attracts and satisfies us.
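The reason particle and debris workloads suit lots of cores is that each particle's update reads only its own state, so the work is embarrassingly parallel. A toy Python sketch of one Euler integration step (DT, GRAVITY and the tuple layout are illustrative choices, not any engine's real API):

```python
DT = 0.016       # ~60 Hz timestep
GRAVITY = -9.81  # m/s^2

def step_particle(p):
    """Advance one particle (x, y, vx, vy) by one timestep."""
    x, y, vx, vy = p
    vy += GRAVITY * DT
    return (x + vx * DT, y + vy * DT, vx, vy)

def step_world(particles):
    # Each call to step_particle is independent of the others, so this
    # loop could be split across however many cores/SPUs you have.
    return [step_particle(p) for p in particles]
```

That per-element independence is exactly what both SPUs and GPU compute exploit; the debate in this thread is only about which silicon runs it.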
 
To the anti-32-SPU naysayers.

Fluid processing. Liquids and smoke. Many more particles and debris for explosions. Better environmental destructibility. Soft bodies. Cloth. Generally less static worlds.

Yes, the GPU renders the world, but the CPU moves it and gives it life. Dynamic environments are FUN by their very nature. Most humans have the desire to destroy things but they know they can't do it in real life. Destruction simulation subconsciously attracts and satisfies us.

Most devs aren't even taking advantage of quad core CPUs, let alone 32 threads.
It's far more advantageous for everyone to have a more standard CPU design. Like with Vita.
 
Annnnnnd the next-gen pissing contest has begun.

Hardware has become so advanced since the launch of 360/PS3 that whoever has the technical edge probably won't matter
unless MS trolls themselves by giving their next console only 2GB of memory as previously rumored, lulz
 
Annnnnnd the next-gen pissing contest has begun.

Hardware has become so advanced since the launch of 360/PS3 that whoever has the technical edge probably won't matter
unless MS trolls themselves by giving their next console only 2GB of memory as previously rumored, lulz

Isn't 2GB still pretty good for a dedicated console? 360 only had 512MB and some of the games on it continue to blow me away visually...

Seeing as the 720 will most likely have at least 4 times that much, and much faster too, I think that'll still be adequate, no?
 
Isn't 2GB still pretty good for a dedicated console? 360 only had 512MB and some of the games on it continue to blow me away visually...

Seeing as the 720 will most likely have at least 4 times that much, and much faster too, I think that'll still be adequate, no?

Hmm, the thing is it's not just a "game console" anymore, it's a multi-media entertainment hub that's always connected.

I'd say 4GB of RAM for PS4/720

since back when the 360 came out, 1GB desktop PCs were the norm

now that 8GB desktops are the norm, well, half that
 
Yes, the GPU renders the world, but the CPU moves it and gives it life. Dynamic environments are FUN by their very nature. Most humans have the desire to destroy things but they know they can't do it in real life. Destruction simulation subconsciously attracts and satisfies us.
That's a very weird analysis of the human psyche, not a reason why SPEs should be used. Most of the things you mention just need a powerful GPU and a reasonable amount of memory. The computationally intensive stuff you mention requires a reasonable amount of floating-point performance, definitely not 32 SPEs.

StevieP is right.
 
Most devs aren't even taking advantage of quad core CPUs, let alone 32 threads.
It's far more advantageous for everyone to have a more standard CPU design. Like with Vita.

Most of the things I mentioned scale very well on lots of cores. You guys are acting like no one has ever coded for more than 4 cores before. What is CUDA?
 
Hmm, the thing is it's not just a "game console" anymore, it's a multi-media entertainment hub that's always connected.

I'd say 4GB of RAM for PS4/720

since back when the 360 came out, 1GB desktop PCs were the norm

now that 8GB desktops are the norm, well, half that

I'd say 3GB with a higher bandwidth.
 
Most devs aren't even taking advantage of quad core CPUs, let alone 32 threads.
It's far more advantageous for everyone to have a more standard CPU design. Like with Vita.

Yes and no. You'd want commonality, but you also want unique features to stand out.

Part of Vita's security is enforced via the proprietary memory card and locked up I/O (No standard USB file system access). The Cell security kernel survived multiple assaults with open USB and HDD access.
 
Most of the things I mentioned scale very well on lots of cores. You guys are acting like no one has ever coded for more than 4 cores before. What is CUDA?

Guess where CUDA exists? On the GPU.

Hmm, the thing is it's not just a "game console" anymore, it's a multi-media entertainment hub that's always connected.

I'd say 4GB of RAM for PS4/720

since back when the 360 came out, 1GB desktop PCs were the norm

now that 8GB desktops are the norm, well, half that

Don't compare desktop PCs.
You can't get the same chip densities out of the type of memory game consoles use as the types PCs use. If you're going with a unified pool of GDDR5, for example, the max you're going to get on a console this/next year is 2GB.
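A back-of-envelope check of where that 2GB figure comes from, assuming era-typical parts (a 256-bit bus, 32-bit-wide GDDR5 chips, and 2Gbit as the densest chips then shipping; those assumed figures are mine, not from the post):

```python
bus_width_bits = 256    # assumed GPU memory bus
chip_width_bits = 32    # each GDDR5 chip drives a 32-bit slice of the bus
chip_density_gbit = 2   # densest GDDR5 chip assumed available

chips = bus_width_bits // chip_width_bits   # chips needed to fill the bus
total_gbit = chips * chip_density_gbit      # total capacity in gigabits
total_gbyte = total_gbit / 8                # -> capacity in gigabytes
```

Under those assumptions you need 8 chips to fill the bus and land at exactly 2GB, which is why "just add more RAM" isn't free: more capacity means denser chips (which may not exist yet) or more chips than the bus naturally supports.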

Yes and no. You'd want commonality, but you also want unique features to stand out.

Unique features that will define next gen are the interface and the services provided.
 
Guess where CUDA exists? On the GPU.

OpenCL runs on the CPU, SPU and GPU. You can tap all of them at the same time, especially if they are good at sharing data.
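The idea being described is one workload dispatched across several devices at once. As a conceptual sketch only (plain Python threads standing in for OpenCL device queues; `kernel` and `dispatch` are made-up names, not OpenCL API):

```python
from concurrent.futures import ThreadPoolExecutor

def kernel(chunk):
    """Stand-in for a compute kernel: double every element."""
    return [x * 2 for x in chunk]

def dispatch(data):
    """Split the work, run both halves concurrently, merge the results."""
    mid = len(data) // 2
    with ThreadPoolExecutor(max_workers=2) as devices:
        half_a = devices.submit(kernel, data[:mid])  # "CPU device" queue
        half_b = devices.submit(kernel, data[mid:])  # "GPU device" queue
        return half_a.result() + half_b.result()
```

In real OpenCL the split is done by enqueueing the same kernel on multiple command queues; the hard part, as the post says, is making the data sharing between devices cheap.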

Unique features that will define next gen are the interface and the services provided.

Sure, but I'm talking about hardware features for enabling developers to do more.
 
Sure, but I'm talking about hardware features for enabling developers to do more.
SPEs enabled developers to do much more on the PS3, and they haven't delivered.

Hardware uniqueness is almost always problematic if it goes against what everybody else is doing. In that respect the PS3 is somewhat analogous to the Sega Saturn, which was very powerful but very complex and used different polygons (quads). That power is kind of nice if it is used well, but otherwise problematic if you want software ported to your system.

Besides, it's very questionable whether SPEs enable developers to do anything more than on more conventional hardware, especially if the silicon spent on SPEs is spent on other components.
 
That's a very weird analysis of the human psyche, not a reason why SPEs should be used. Most of the things you mention just need a powerful GPU and a reasonable amount of memory. The computationally intensive stuff you mention requires a reasonable amount of floating-point performance, definitely not 32 SPEs.

StevieP is right.

You realize the GPU has to render too, right? And even the latest Nvidia GPUs can't do both at the same time; they have to switch back and forth, although switching time improved drastically from the GTX 200 series to Fermi. But still, I'd rather have the best possible graphics AND physics, not xor. I guess I just wish GPU companies weren't trying to make CPUs and CPU companies weren't trying to make GPUs; the overlap is just a waste of transistors. Have you seen an Intel die shot lately? The graphics portion is like a third of the die area, even on the high-end chips where people are most likely going to be getting a graphics card anyway. I'd rather have 2 more cores. Less useless than an idle hunk of silicon. This is also why I hope APUs with HIGH END graphics take off more.
 
That's a very weird analysis of the human psyche, not a reason why SPEs should be used. Most of the things you mention just need a powerful GPU and a reasonable amount of memory. The computationally intensive stuff you mention requires a reasonable amount of floating-point performance, definitely not 32 SPEs.

StevieP is right.

Sony either continues with a Cell-compatible CPU or they give up all software (operating system level and game content) from this generation going forward.

Given that Cell was designed from the beginning as a scalable architecture, I do expect to see the SPEs make a return, even if they don't go beyond 8 (6?) of them for BC.

SPUs do have some advantages over GPGPU, though, including ease and flexibility of programming, oddly enough, so a pool of SPUs present for BC could still earn their keep in PS4, potentially.

It'll be very interesting to see what Sony announces.
 
SPEs enabled developers to do much more on the PS3, and they haven't delivered.

Some did. You just didn't acknowledge them. ^_^

Also, it doesn't mean the first attempt is perfect. They can always improve on what they did with the SPU further. It doesn't mean the next time is going to be the same thing, you know.

Even on Vita, the CPU MediaEngine can share data with the GPU like on PS3. I wish they'd publish more info about their proprietary extension to the GPU though.
 
You realize the GPU has to render too, right?
I'm not suggesting that they should use the GPGPU functionality at all. It's just that some of the stuff you mentioned were plain rendering tasks or close to it.

@jonabbey: I agree that the Cell or parts of it will make a return for backwards compatibility.

Some did. You just didn't acknowledge them. ^_^

Also, it doesn't mean the first attempt is perfect. They can always improve on what they did with the SPU further. It doesn't mean the next time is going to be the same thing, you know.
I know, but you'll agree that underused potential is not a very good reason to choose it next time, right? Of course some exclusives used it very well, but they still used most of the SPEs to compensate for a weak GPU...
 
So guys, what if the PS4 had a gig of XDR2 main and a gig of GDDR5 video, and simply threw in 4-8 gigs of standard dual-channel PC DDR3? I feel like that would help a lot with having high-res textures (I'm talking Skyrim HD mod level of textures). It should be fast enough, right?
 
I'm not suggesting that they should use the GPGPU functionality at all. It's just that some of the stuff you mentioned were plain rendering tasks or close to it.

@jonabbey: I agree that the Cell or parts of it will make a return for backwards compatibility.

I know, but you'll agree that underused potential is not a very good reason to choose it next time, right? Of course some exclusives used it very well, but they still used most of the SPEs to compensate for a weak GPU...

They can make it more usable for developers, perhaps?

Most of the PS3 problems developers b*tched about were the limited memory, not the SPUs. Even the GPU's vertex limit was overcome by the SPUs, and they further improved that part to augment animation and rendering work.

It doesn't have to be SPU "as is". IBM and Sony and Toshiba can improve on that setup with 5-6 years worth of technological advancement. The GPUs have improved significantly too, so they will need to find a balance somewhere.
 
It's been a few years since I've stopped caring about the battle between Playstation and XBOX. I have a gaming PC so the outcome has never been terribly important to me, but I look forward to what the next couple of years brings.
 
They can make it more usable for developers, perhaps?

Most of the PS3 problems developers b*tched about were the limited memory, not the SPUs. Even the GPU's vertex limit was overcome by the SPUs, and they further improved that part to augment animation and rendering work.

It doesn't have to be SPU "as is". IBM and Sony and Toshiba can improve on that setup with 5-6 years worth of technological advancement.

Yeah, like more local store for the SPUs.
 
so guys. what if the ps4 had a gig of xdr2 main and a gig of gddr5 video, and simply threw in 4-8 gigs of standard dual channel pc ddr3. i feel like that would help a lot with having high res textures. (im talking like Skyrim HD mod level of textures.) It should be fast enough right?

Introducing bottlenecks is never good. A more balanced approach means less memory but faster throughput of said memory.
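To put numbers on "less memory but faster throughput", peak bandwidth is just bus width times transfer rate. A small sketch, using era-typical figures I'm assuming for illustration (dual-channel DDR3-1600 at 1.6 GT/s on a 128-bit bus vs. 256-bit GDDR5 at 5 GT/s):

```python
def bandwidth_gb_s(bus_bits, transfer_rate_gtps):
    """Peak bandwidth in GB/s = bus width in bytes * transfers per second."""
    return (bus_bits / 8) * transfer_rate_gtps

ddr3 = bandwidth_gb_s(128, 1.6)   # dual-channel DDR3-1600
gddr5 = bandwidth_gb_s(256, 5.0)  # 256-bit GDDR5 at 5 GT/s
```

Under those assumptions the GDDR5 pool moves data roughly 6x faster than the DDR3 pool, which is why bolting a big slow DDR3 pool onto a console risks exactly the bottleneck described above.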

Also, count the memory chips on the... 360 motherboard, for instance:
[image: 360 motherboard]


It doesn't have to be SPU "as is". IBM and Sony and Toshiba can improve on that setup with 5-6 years worth of technological advancement. The GPUs have improved significantly too, so they will need to find a balance somewhere.

IBM / Toshiba are pretty much done with the Cell.

Throwing all that stuff away so you can have your 'conventional' system sounds so lamish it's actually funny. There is nothing Sony gains, absolutely nothing, by going the route of other has-beens. In fact they would have to pay royalties for a conventional CPU, whereas they own the Cell Broadband Engine with IBM and Toshiba.

"Has beens"?
Look, to put it more simply - you have a certain amount of die space. Either you spend it to continue the Cell, or you give third party developers what they prefer to work with. The 3rd party exclusive is pretty much dead, so multiplatform is a huge deal. Also, any design from a more conventional CPU would be owned and customized to a degree. Both MS and Sony have learned this lesson. As an example, the main CPU in the Cell (PPE) was actually sold to Microsoft - who has 3 of pretty much the exact same CPUs inside their console.
 
Most devs aren't even taking advantage of quad core CPUs, let alone 32 threads.
It's far more advantageous for everyone to have a more standard CPU design. Like with Vita.

What devs are you talking about?

Just what do you think the dynamic cycle of GT5 runs on? Or its incredible physics system? That God of War 3 MLAA. Yeah. All the physics, particles, effects etc. in Killzone are running on SPUs. More SPUs means more of these. It's simple logic.

The PlayStation 3's entire security system is dependent on the Cell. That isolated SPU. The entire backlog depends on the SPUs. Vidzone, Music Unlimited and a whole host of other services are dependent on the Cell architecture.

Throwing all that stuff away so you can have your 'conventional' system sounds so lamish it's actually funny. There is nothing Sony gains, absolutely nothing, by going the route of other has-beens. In fact they would have to pay royalties for a conventional CPU, whereas they own the Cell Broadband Engine with IBM and Toshiba.


From a financial and a gaming point of view it makes complete sense for Sony to stick with its Cell. And this will also mean that Polyphony Digital won't take a billion years for GT6.
 
They can make it more usable for developers perhaps ?
They could. If I were Sony, however, I'd keep that R&D money in my pocket and simply reuse the new Wii U architecture IBM has lying around.
Just what do you think the dynamic cycle of GT5 runs on? Or its incredible physics system? That God of War 3 MLAA. Yeah. All the physics, particles, effects etc. in Killzone are running on SPUs. More SPUs means more of these. It's simple logic.
All of these can be done on a normal architecture as well and don't warrant its reuse. Exclusive games used the SPEs well, but Cell is dead, and these things are a given on next-gen hardware (even if it's a HD6670).
The PlayStation 3's entire security system is dependent on the Cell. That isolated SPU. The entire backlog depends on the SPUs. Vidzone, Music Unlimited and a whole host of other services are dependent on the Cell architecture.
Whaha, if Sony's engineers wrote that stuff to run on Cell only, they deserve to be fired.
Throwing all that stuff away so you can have your 'conventional' system sounds so lamish it's actually funny. There is nothing Sony gains, absolutely nothing, by going the route of other has-beens. In fact they would have to pay royalties for a conventional CPU, whereas they own the Cell Broadband Engine with IBM and Toshiba.
What has Sony gained from using the Cell except for some nice graphics (nice compared to a system released a year earlier) in God of War and Uncharted?
 
It's been a few years since I've stopped caring about the battle between Playstation and XBOX. I have a gaming PC so the outcome has never been terribly important to me, but I look forward to what the next couple of years brings.

You should care. The power of these consoles will determine how PC games look for the next 6 years. Remember, the age of advanced PC-only games is over. They're all going to be console ports with 32x antialiasing.
 
SPEs enabled developers to do much more on the PS3, and they haven't delivered.

Hardware uniqueness is almost always problematic if it goes against what everybody else is doing. In that respect the PS3 is somewhat analogous to the Sega Saturn, which was very powerful but very complex and used different polygons (quads). That power is kind of nice if it is used well, but otherwise problematic if you want software ported to your system.

Besides, it's very questionable whether SPEs enable developers to do anything more than on more conventional hardware, especially if the silicon spent on SPEs is spent on other components.

Most of the problems for ports come from the split memory pool more than Cell, to tell the truth.
I always wonder how the PS3 would have been if Cell had been paired with a better GPU (the PS3's GPU sucks), but the Blu-ray price messed that up.

Also, most of the time when some of us say Sony should use Cell, we don't mean the same thing as in the PS3 but certain parts of it.
 
IBM / Toshiba are pretty much done with the Cell.

Nope. IBM said they may add Cell-like features to their future processors. Toshiba is involved in fabbing the next CPU for Sony, right?


They could. If I were Sony, however, I'd keep that R&D money in my pocket and simply reuse the new Wii U architecture IBM has lying around.

For all we know, it may be the same base architecture, but Sony may want to add SPU-like features to it. That R&D money spent can still continue to serve Sony, while they tap their Vita and general-purpose software. It's actually more interesting to see how the SPU complements the modern GPU. We already know how the SPU can work with the CPU.
 
Introducing bottlenecks is never good. A more balanced approach means less memory but faster throughput of said memory.

Also, count the memory chips on the... 360 motherboard, for instance:
[image: 360 motherboard]




IBM / Toshiba are pretty much done with the Cell.

But the textures that aren't going to be on screen need to be stored somewhere that's faster than an HDD or, worse, the Blu-ray drive. My PC doesn't seem to have any trouble, and all it has is DDR3 to store the off-screen textures in.

And also, the OS footprint can live in the DDR3 pool so as not to interfere as much.
 
So guys, what if the PS4 had a gig of XDR2 main and a gig of GDDR5 video, and simply threw in 4-8 gigs of standard dual-channel PC DDR3? I feel like that would help a lot with having high-res textures (I'm talking Skyrim HD mod level of textures). It should be fast enough, right?

Sounds like $599 USD.
 
Nope. IBM said they may add Cell-like features to their future processors. Toshiba is involved in fabbing the next CPU for Sony, right?
Sony bought back Toshiba's share in a Cell plant in 2010. In other words: Toshiba has bailed out. IBM has only stated a promise upon which they have yet to deliver, and it didn't even involve the actual Cell-architecture but just "Cell-like features". That could refer to the PS4 CPU though.
Not even. A 4 gig kit of DDR3-1600 (800 MHz) = 30 bucks RETAIL, which means like 50 cents for Sony to include it (exaggerating obviously, but it's dirt cheap nonetheless).
Except that it would also increase motherboard complexity, size and failure rate and the associated costs very significantly. You can't just plug this stuff in.

Rule 1 of hardware speculation: Do not use Newegg as your source.
 
Most of the problems for ports come from the split memory pool more than Cell, to tell the truth.

And the PS3's OS reserving more memory in each pool than the 360's OS does in its unified pool.

If the PS3 had more memory the split pool wouldn't have mattered as much, but coming out a year after 360 with less available RAM has made things difficult for all multiplat developers.

Not much Sony could do about that with the expenses they had at launch for the BRD. Those 405nm violet laser diodes were *not* cheap back then.
 
Sony bought back Toshiba's share in a Cell plant in 2010. In other words: Toshiba has bailed out. IBM has only stated a promise upon which they have yet to deliver, and it didn't even involve the actual Cell-architecture but just "Cell-like features". That could refer to the PS4 CPU though.

Toshiba bailed out but may be hired to help Sony fab the next CPU, according to the grapevine.

IBM acknowledged that heterogeneous architectures may still be in their future CPUs (as opposed to standard homogeneous ones). Cell-like features are based on fundamental principles of computing (overcoming the memory wall, rewarding data locality, explicit cache control, large numbers of cores, a fast internal bus, isolated memory, ...). As long as these traits are kept where appropriate, I don't see why it's a problem.

Technologies always evolve. We drop some parts, we gain new ones.

I think it's more rewarding for them to figure out whether modern GPUs can be enhanced by Cell-like features. Gamers would understand and appreciate that marketing message better than an enhanced CPU.
 
Sony bought back Toshiba's share in a Cell plant in 2010. In other words: Toshiba has bailed out. IBM has only stated a promise upon which they have yet to deliver, and it didn't even involve the actual Cell-architecture but just "Cell-like features". That could refer to the PS4 CPU though.
Except that it would also increase motherboard complexity, size and failure rate and the associated costs very significantly. You can't just plug this stuff in.

Rule 1 of hardware speculation: Do not use Newegg as your source.

The PCB would be different but cost virtually the same? A couple extra pieces of plastic? I guess they would need to incorporate a DDR3 controller in the northbridge... and how is it any more failure-prone than anything else on the board? I don't hear about PC memory modules frying left and right...

And this is how we know to ignore this.

u_u
 
They could. If I were Sony, however, I'd keep that R&D money in my pocket and simply reuse the new Wii U architecture IBM has lying around.
All of these can be done on a normal architecture as well and don't warrant its reuse. Exclusive games used the SPEs well, but Cell is dead, and these things are a given on next-gen hardware (even if it's a HD6670).
Whaha, if Sony's engineers wrote that stuff to run on Cell only, they deserve to be fired.
What has Sony gained from using the Cell except for some nice graphics (nice compared to a system released a year earlier) in God of War and Uncharted?

Cell is only dead when Sony says it's dead. It helps if you know that Sony bought back its Cell fab production from Toshiba. There's nothing stopping Sony from designing Cell 2.0. IBM would love nothing more than Sony spending a few dollars on them to develop it.

As for the part where you said it "doesn't warrant its reuse", well, I can only smile in amusement. I am giving you reasons why Sony should keep its Cell and you're telling me all that could be done on a normal architecture? Well, news flash: yes, it could be done. But it would be a complete waste of time and money.

You want Sony developers to start coding new tools and libraries, design a new operating system from scratch, design new security from scratch, throw out backwards compatibility, pay royalties on top of R&D, and spend enormous time and effort doing so.

When, surprise surprise, Sony has all that covered right now with the PS3. Are you having a laugh, mate? You might as well tell Sony to go back to DVDs.
 
A system PSM knows nothing about will be more powerful than another system PSM knows nothing about.

Seems like a very valid article from PSM; I suppose they have some hard facts and proof there, from their "insider" sources.
 
Why would they use a 460, when the 28nm Kepler model would be revised as the 660? I honestly think they can do all that with a 670/770 GPU and keep it at $399 with no loss, or break even.

I doubt they can, that's why I suggested a 460. A 460 in a closed-system environment would be a beast of a GPU and much better than what MS is doing with that shitty-ass 6670.
 