PSM: PS4 specs more powerful than Xbox 720

UMA with 2GB GDDR5 on a 256 bit bus (4GB if 4Gb chips are available in time).

if you think 4Gb chips are going to be available soon enough, could you take the hit on initial board complexity and cost, factoring in moving to 4Gb chips ASAP with a board revision?
 
So, um, I haven't posted in this thread yet, but I do wonder why PSM would report something like that. Also, wouldn't that mean their sources know how much power is behind both consoles? Hogwash, imo.
 
if you think 4Gb chips are going to be available soon enough, could you take the hit on initial board complexity and cost, factoring in moving to 4Gb chips ASAP with a board revision?

I'd wager not. That would mean 16 chips, not really feasible when your competitors already have a major cost advantage. Unless you're happy to release a huge console at a significant loss?
 
I'd wager not. That would mean 16 chips, not really feasible when your competitors already have a major cost advantage. Unless you're happy to release a huge console at a significant loss?

You seem to really know your stuff. I don't know much about this tech stuff, but would you wager that Nvidia's Maxwell will be anywhere close to getting us Avatar-style graphics?
 
I think Sony will just use Cell as the main CPU and beef up the RAM and GPU. (shrug)

This would be my guess.

I'm willing to bet that the PS4 will end up being the most powerful console, but it's a moot achievement considering third parties won't put in the effort to get the most out of it. Unless they have some major third-party exclusive lined up.
 
This would be my guess.

I'm willing to bet that the PS4 will end up being the most powerful console, but it's a moot achievement considering third parties won't put in the effort to get the most out of it. Unless they have some major third-party exclusive lined up.

How will it be moot if games that run at 720p on the Wii U and the next Xbox possibly run at 60 fps or 1080p on the PS4?
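For scale, some quick arithmetic (an editorial aside, not from any poster) on what that resolution gap means in raw pixel throughput:

// Pixel math behind the 720p-vs-1080p point.
#include <cstdio>

int main() {
    long p720  = 1280L * 720;   //   921,600 pixels per frame
    long p1080 = 1920L * 1080;  // 2,073,600 pixels per frame
    std::printf("1080p pushes %.2fx the pixels of 720p\n",
                double(p1080) / double(p720));  // prints 2.25x
    // Doubling the framerate on top of that (30 -> 60 fps) makes it 4.5x.
    return 0;
}

So a meaningfully faster GPU isn't a marginal difference; it's roughly the difference between 720p30 and something approaching 1080p or 60 fps.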
 
This would be my guess.

I'm willing to bet that the PS4 will end up being the most powerful console, but it's a moot achievement considering third parties won't take the effort to get most of it. Unless they have some major third party exclusive lined up.

A couple of graphics programmers spending a few days tweaking some code is enough to get a significant improvement out of a higher-end GPU. There's so much, from shadow filtering/resolution, AA, framerate, LOD, and texture filtering/resolution to the precision of the shaders used, that can provide a significant boost to visuals from a few lines of code. A faster GPU will always be taken advantage of.
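To make that concrete, here is a minimal, purely hypothetical sketch (the names come from no real engine) of the kind of quality knobs a port team can turn up in a few lines once extra GPU headroom exists:

// Hypothetical per-platform quality settings; illustrative only.
#include <cstdio>

struct QualitySettings {
    int   shadowMapSize;        // shadow resolution
    int   msaaSamples;          // hardware anti-aliasing level
    float lodBias;              // negative = keep high-detail models longer
    int   anisotropy;           // texture filtering quality
    bool  halfPrecisionShaders; // trade shader precision for speed
};

// Pick settings from a rough GPU budget in GFLOPS. The tier thresholds are
// invented; the point is that scaling up costs almost no engineering time.
QualitySettings forGpuBudget(float gflops) {
    if (gflops < 500.0f)  return {1024, 2,  0.0f,  4, true};  // baseline console
    if (gflops < 1500.0f) return {2048, 4, -0.5f,  8, true};  // mid tier
    return                       {4096, 8, -1.0f, 16, false}; // high-end GPU
}

int main() {
    QualitySettings q = forGpuBudget(2000.0f); // a ~2 TFLOP class GPU
    std::printf("shadows=%d msaa=%dx aniso=%dx\n",
                q.shadowMapSize, q.msaaSamples, q.anisotropy);
}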
 
-4 CELL processors daisy-chained together
-Dual Radeon HD 7670 SLI
-16 GB XDR2

I have inside sources.

There's no reason to SLI two low-end GPUs when it would be cheaper and simpler to use a single 7870/7850. They're rumored to give 6970/6950 performance at around 90 watts. Dual-GPU setups are more expensive, add unnecessary board complexity, and are more of a pain to get performance from.
 
Sony is done with all Cell investments. They have no reason to include it in the next gen, and they want to help devs by including a more user-friendly CPU than Cell. IBM's Power7 is perfect for them. Just throw in a modern GPU architecture. It doesn't need to be the highest spec, but something around an HD 7870 would be perfect.
 
So is there any way the PS4 will have decent backwards compatibility with the PS3?


People have been saying for years that they are ditching Nvidia for ATI, which will screw things up.

Are modern IBM CPUs anything even close to the Cell?



I guess they could do the early PS3 thing of sticking all the old chips in the console to be able to run old games, but that will probably cost a bunch and raise the price.

Plus, if they remove it later, it would screw up any plans to keep selling old PS3 games over PSN, since only some owners would be able to buy them.
 
What config of Cell?

One good thing Sony has going for it going into the next gen is the scalability of the Cell architecture. It'll be easier for them to attain that 8x-12x PS3 computational power than it would be for MS.
Sony aren't going to spend significant R&D designing a whole new chip from the ground up. That ship has sailed.

People really need to pay closer attention to the Vita.
 
So is there any way the PS4 will have decent backwards compatibility with the PS3?


People have been saying for years that they are ditching Nvidia for ATI, which will screw things up.

Are modern IBM CPUs anything even close to the Cell?



I guess they could do the early PS3 thing of sticking all the old chips in the console to be able to run old games, but that will probably cost a bunch and raise the price.

Plus, if they remove it later, it would screw up any plans to keep selling old PS3 games over PSN, since only some owners would be able to buy them.
No modern CPU is BC with Cell or capable of easily emulating it. Don't expect Sony to have BC; even if the hardware makes it technically possible without major investment, their HD rereleases have proven the business case against BC.
 
No modern CPU is BC with Cell or capable of easily emulating it. Don't expect Sony to have BC; even if the hardware makes it technically possible without major investment, their HD rereleases have proven the business case against BC.

That's basically free money for publishers. I expect a lot of this generation's games to be reworked for next gen.
 
But they can't HD-remake games that are already HD. "PS3 Originals" could be lucrative, though, as long as the PS4 can run them.

'Full HD classics'

We do want that native 1080p, AA'd (+ 60fps?) Uncharted 2. I know I do.

There was this UC3 60fps video on EG; it looked brilliant.
 
Very significant, actually. It won't be a patch on a contemporary high-end PC rig (or even a high-end PC rig of today), but it can be a true generational leap.

Something like:

4 wide CPU cores with OoOE and decent SIMD capabilities (either AMD x86 or Power7 based).
UMA with 2GB GDDR5 on a 256 bit bus (4GB if 4Gb chips are available in time).
~2 Teraflop AMD GCN based GPU (I guess a Kepler based alternative would do but I'm not seeing it).

Released in Q4 2013 for $400.

I was thinking the Xbox 3 would be practically identical until all the recent SoC and 6670 rumours started sprouting up, but I'm not so sure anymore. It's safe to say this would blow the currently rumoured next Xbox (and Wii U, for that matter) out of the water.
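A quick sanity check on those numbers (the 5.5 Gbps GDDR5 data rate and the shader count/clock below are assumptions picked to illustrate, not rumours):

// Back-of-the-envelope maths for the speculated specs above.
#include <cstdio>

int main() {
    // Memory bandwidth = bus width (bits) / 8 * effective data rate (Gbps/pin).
    double busBits = 256.0, dataRateGbps = 5.5;         // 5.5 Gbps is assumed
    double bandwidthGBs = busBits / 8.0 * dataRateGbps; // = 176 GB/s

    // GPU throughput = shader ALUs * 2 ops (multiply-add) * clock (GHz).
    double shaders = 1280.0, clockGHz = 0.8;            // hypothetical GCN config
    double tflops = shaders * 2.0 * clockGHz / 1000.0;  // = 2.048 TFLOPS

    std::printf("bandwidth: %.0f GB/s, compute: %.2f TFLOPS\n",
                bandwidthGBs, tflops);
}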
Although I agree that specs like this should be possible in principle (though I'm quite sure a 256-bit bus is an unnecessary expense if the GPU has EDRAM), this is not the way to think about it.

Thinking about what is on the edge of practical possibility won't give very realistic results anyway. Consider the current market conditions: a market that has not rewarded high-end graphics but rather novel input ideas. Add to that that Sony will not feel obligated to create a $400 console they could potentially lose money on, considering a similar strategy has gained them nothing these last years and their opponents aren't doing it either, and I wouldn't be sure this is what they're aiming for. It's technically possible, sure, but is it a good business decision? Seems unlikely.

A 2 TFLOPS GPU is nice and all, but consider that an AMD engineer has been quoted as saying the Wii U GPU does 1 TFLOPS (realistically it'll probably do 0.7-0.8), and the next Xbox GPU will probably do better; 2 TFLOPS isn't going to constitute a generational leap over the others either. In current market conditions I doubt Sony will invest that much in a GPU when they could use the money to build a smaller console with their own input method.

What I'm trying to say is that this is still wishful thinking. Not in the sense that it's technologically infeasible, like many other suggestions here, but in the sense that it disregards what kind of market we're in. Although if any of the three is going to be the most powerful it's going to be Sony's, you can't expect a $399 base console with roughly the same power requirements and build cost (sans the extraordinary Blu-ray and Cell expenses) as the PS3. I wouldn't be surprised if Sony went with a SoC as well; if they went with an AMD CPU, that would almost confirm it. That said, if they shipped a GPU matching what you said, I'd be very pleasantly surprised.
 
No modern CPU is BC with Cell or capable of easily emulating it. Don't expect Sony to have BC; even if the hardware makes it technically possible without major investment, their HD rereleases have proven the business case against BC.

Your position is that Sony is not going to use a Cell (read: IBM PPU + Sony SPU) hybrid, even an updated, more efficient version. I think you are wrong.

All your arguments use old news that no longer applies. You also do not take into account that Sony is a CE media company first and a game console manufacturer LAST.

The money invested in the Cell by Toshiba, Sony and IBM, and also in the Blu-ray drive, had as its long-range goal supporting High Definition and Ultra High Definition video. The Cell is superb at video and vector math processing, as in video codecs, and is a programmable processor fast enough to support games.

The Cell can support 4K (HEVC h.265) with software routines, but Intel and AMD CPUs need dedicated hardware. Sony is using PS3 code to upscale 1080p video to 4K in the soon-to-be-released top-end Sony Blu-ray players and in the Sony 4K video projector, and Toshiba is using a Cell processor to do the same in their 4K TV. UHD is coming during the life of the PS4, and that more than doubles the processing power needed.
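To see why this workload maps well onto SPU-style hardware, here is a toy 2x upscale inner loop in plain C++ (grayscale, and nothing like Sony's actual code): the loop body is regular, branch-light multiply-accumulate over pixel streams, exactly what a wide SIMD unit is built for.

// Toy 2x bilinear upscale, illustrative only.
#include <cstdint>
#include <vector>

std::vector<uint8_t> upscale2x(const std::vector<uint8_t>& src, int w, int h) {
    std::vector<uint8_t> dst(4 * w * h);                  // output is 2w x 2h
    for (int y = 0; y < 2 * h; ++y) {
        int sy = y / 2, sy1 = (sy + 1 < h) ? sy + 1 : sy;
        int fy = y & 1;                                   // row blend weight
        for (int x = 0; x < 2 * w; ++x) {
            int sx = x / 2, sx1 = (sx + 1 < w) ? sx + 1 : sx;
            int fx = x & 1;                               // column blend weight
            // Weighted average of a 2x2 neighbourhood (weights always sum to 4).
            // On Cell, a batch of these collapses into a few SIMD multiply-adds.
            int p = src[sy  * w + sx ] * (2 - fy) * (2 - fx)
                  + src[sy  * w + sx1] * (2 - fy) * fx
                  + src[sy1 * w + sx ] * fy * (2 - fx)
                  + src[sy1 * w + sx1] * fy * fx;
            dst[y * 2 * w + x] = static_cast<uint8_t>(p / 4);
        }
    }
    return dst;
}

int main() {
    std::vector<uint8_t> frame(1920 * 1080, 128);  // flat grey 1080p test frame
    auto up = upscale2x(frame, 1920, 1080);        // -> 3840 x 2160
    return up[0] == 128 ? 0 : 1;
}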

For the above reason alone, SPUs will be in the PS4 and can be used to support BC. There is NO current CPU powerful enough to emulate SPUs... they have to be included to support BC.

Power/heat/efficiency is going to be a BIG issue for all CE electronics. Part of the PS3's/PS4's/Xbox's use this year and in the future is to support Home Theater or RVU viewing on the home TV, or serving media and data to handhelds and tablets. When doing so, most of the GPU and CPU needs to be turned off to conserve power.

If I were Microsoft, I'd contract with Sony/Toshiba/IBM to get a few SPUs to use for codecs. Far from Sony abandoning the Cell SPUs, I expect Microsoft to be using them in their next-generation Xbox, as well as a Blu-ray drive, as it appears a modern Blu-ray drive can read 4-layer Blu-ray discs fast enough that, with h.265, it can support 4K media.

The Cell SPU with enough cache can be thought of as the smallest possible general-purpose CPU element (building block) for everything video-related, as in cameras, Blu-ray players, game consoles, TVs, etc. I expect that when Sony jumps to a die size under 28nm at their Nagasaki plant (I think the PS4 processor will be manufactured by IBM for the first year or so, and later at a smaller die size by Sony in Nagasaki, like they did with the PS3 Slim), they will be using SPUs for everything.
 
@Jeff_rigby: If 4K is going to be a major thing next gen (it won't be), AMD's UVD chip can do what Cell can at 10% of its power usage and at virtually no added cost. SPEs (they're called SPEs, not SPUs) are really nothing special.
 
@Jeff_rigby: If 4K is going to be a major thing next gen (it won't be), AMD's UVD chip can do what Cell can at 10% of its power usage and at virtually no added cost. SPEs (they're called SPEs, not SPUs) are really nothing special.
Cite, please. Dedicated hardware decoding is usually more efficient, especially compared with a general-purpose CPU like AMD's or Intel's. ARM processors include a NEON co-processor, similar in idea to an SPE or SPU, for use with codecs.

Cell does have a couple of issues that need to be corrected, and with sufficient cache it should be as efficient as AMD's dedicated hardware. That's the point I am making: an SPE-equipped chip does not need dedicated hardware; AMD and Intel do.

Also, why do both Intel and AMD have dedicated hardware to support 4K if it's not going to be a major thing next generation? Are you assuming that the current trend of the Xbox and PS3 supporting Home Theater is not going to be even more true next generation?

https://www-01.ibm.com/chips/techlib/techlib.nsf/techdocs/76CA6C7304210F3987257060006F2C44 said:
The Cell Broadband Engine architecture defines a single-chip multiprocessor consisting of one or more Power Processor Elements (PPEs) and multiple high-performance Synergistic Processor Elements (SPEs). The Synergistic Processor Unit (SPU) is part of the SPE in the Cell Broadband Engine Processor. The SPU instruction set architecture (ISA) provides 7-bit register operand specifiers to directly address 128 registers using a pervasive single instruction, multiple data (SIMD) computation approach for both scalar and vector data. This specification describes the SPU Instruction Set Architecture.
I was using the SPU term correctly as the smallest element in the Cell SPE.
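For anyone unfamiliar with that quote's "SIMD for both scalar and vector data" idea, here is a plain C++ illustration (these are not the real SPU intrinsics): every value lives in a 128-bit register, and a scalar operation is simply a vector operation with one meaningful lane.

// Stand-in for an SPU 128-bit register holding four floats.
#include <cstdio>

struct Vec4 {
    float lane[4];
};

// Fused multiply-add across all lanes: the SPU's bread-and-butter operation.
Vec4 madd(Vec4 a, Vec4 b, Vec4 c) {
    Vec4 r;
    for (int i = 0; i < 4; ++i) r.lane[i] = a.lane[i] * b.lane[i] + c.lane[i];
    return r;
}

int main() {
    Vec4 a = {{1, 2, 3, 4}}, b = {{2, 2, 2, 2}}, c = {{0.5f, 0.5f, 0.5f, 0.5f}};
    Vec4 r = madd(a, b, c);
    // A "scalar" madd on the SPU would be this same instruction with only
    // lane 0 carrying useful data.
    std::printf("%g %g %g %g\n", r.lane[0], r.lane[1], r.lane[2], r.lane[3]);
}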
 
Sony aren't going to spend significant R&D designing a whole new chip from the ground up. That ship has sailed.

People really need to pay closer attention to the Vita.

Why would it cost significant R&D to do that? I mean, a scaling of 'vanilla' Cell?

I'm not advocating the approach necessarily, just to be clear; I just don't know why we'd dismiss it on a cost basis.

Wouldn't it be just about the same, investment-wise, as any other customisation of existing tech they might go with?

If they did want to go that route, it likely wouldn't be quite a clean/simple scaling of what's already there; they'd probably at least want to replace the PPE with another Power core, for example, and use different external I/O interfaces. That would require Sony-specific work to customise another (off-the-shelf, no doubt) core to talk to the EIB, etc. But in that case we're still way, way off the kind of from-scratch R&D that went into the first PS3 Cell.
 
Why would it cost significant R&D to do that? I mean, a scaling of 'vanilla' Cell?

I'm not advocating the approach necessarily, just to be clear; I just don't know why we'd dismiss it on a cost basis.

Wouldn't it be just about the same, investment-wise, as any other customisation of existing tech they might go with?

If they did want to go that route, it likely wouldn't be quite a clean/simple scaling of what's already there; they'd probably at least want to replace the PPE with another Power core, for example, and use different external I/O interfaces. That would require Sony-specific work to customise another (off-the-shelf, no doubt) core to talk to the EIB, etc. But in that case we're still way, way off the kind of from-scratch R&D that went into the first PS3 Cell.
Yup, and Sony owns the rights to the SPE/SPU IP, so it costs them only the silicon, while using an IBM Power PPE costs more. Still, there is a need for more, easier-to-use general-purpose IBM PPE cores: at least 2, possibly 4, but not 6 as in the Xbox or Wii U. With the silicon real estate freed by not using 2 PPE cores, Sony could include 8 SPEs (I think). A reasonable guess would be 4 PPEs and 8 SPEs, with more cache and power management to turn off cores not in use.
 
The fact that it's not major doesn't mean non-existent. UVD chips are deployed in things other than consoles too. Of course 10% is not the literal number, but you're comparing an almost-full processor from a dead architecture to a small dedicated block included on even the tiniest and cheapest GPUs.

The suggestion that Sony, and even Microsoft, should include an entire coprocessor in their consoles to do software decoding, when a small block already designed into the GPUs could do the same, is ridiculous.

Cell may find its uses in the PS4, but it is by no means necessary for Sony to include it.
 
@Jeff_rigby: If 4K is going to be a major thing next gen (it won't be), AMD's UVD chip can do what Cell can at 10% of its power usage and at virtually no added cost. SPEs (they're called SPEs, not SPUs) are really nothing special.





Cell may find its uses in the PS4, but it is by no means necessary for Sony to include it.

It will be necessary if they want to:

1) Keep backwards compatibility. No CPU out there can emulate the Cell.
2) Keep the PS3 operating system.
3) Keep the same security system.
4) Keep the 7 years of programming tools and libraries that both first- and third-party devs compiled, which will allow an easy transition to next gen. No more Naughty Dog crying about making the new Uncharted on the PS4.
5) Not pay any royalties to use someone else's CPU.


-4 CELL processors daisy-chained together
-Dual Radeon HD 7670 SLI
-16 GB XDR2

I have inside sources.


Gran Turismo running on 4 PS3s

http://www.sonyinsider.com/2008/11/19/gran-turismo-in-4-times-the-resolution-of-full-hd/
 
The fact that it's not major doesn't mean non-existent. UVD chips are deployed in things other than consoles too. Of course 10% is not the literal number, but you're comparing an almost-full processor from a dead architecture to a small dedicated block included on even the tiniest and cheapest GPUs.

The suggestion that Sony, and even Microsoft, should include an entire coprocessor in their consoles to do software decoding, when a small block already designed into the GPUs could do the same, is ridiculous.

Cell may find its uses in the PS4, but it is by no means necessary for Sony to include it.

We are talking about 50 million PS4s or next-generation Xboxes; the real estate occupied by a dedicated hardware codec decreases the number of chips that can be produced from each $20,000 wafer. Several SPUs can do double duty in a PS4 or Xbox, while a UVD can only process video. A dedicated UVD could cost $1 or more per chip, plus IP costs.
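Rough silicon-cost arithmetic behind that claim (the $20,000 wafer figure is from the post above; the die sizes and yield are invented purely for illustration):

// Estimated extra cost of a dedicated decode block on a console SoC.
#include <cstdio>

int main() {
    double waferCost = 20000.0;                  // $ per wafer (from the post)
    double waferArea = 3.14159 * 150.0 * 150.0;  // mm^2 of a 300 mm wafer
    double socArea = 200.0, uvdArea = 5.0;       // hypothetical block sizes, mm^2
    double yield = 0.7;                          // assumed usable-die fraction

    double chipsWithout = waferArea / socArea * yield;
    double chipsWith    = waferArea / (socArea + uvdArea) * yield;
    double extraPerChip = waferCost / chipsWith - waferCost / chipsWithout;

    // A couple of dollars per chip times a 50-million-unit run adds up quickly.
    std::printf("extra cost: $%.2f per chip, $%.0fM over 50M consoles\n",
                extraPerChip, extraPerChip * 50e6 / 1e6);
}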
 
We are talking about 50 million PS4s or next-generation Xboxes; the real estate occupied by a dedicated hardware codec decreases the number of chips that can be produced from each $20,000 wafer. Several SPUs can do double duty in a PS4 or Xbox, while a UVD can only process video. A dedicated UVD could cost $1 or more per chip, plus IP costs.
What are you talking about? How does including the UVD even begin to compare with the costs and complexity associated with the SPEs?

@vandaliser: you should know that the PS3's security system contains a critical design flaw and needs to be rewritten anyway. I do think it's likely the PS3 chip will be included for BC. None of your other arguments justify continued use of an old, weird architecture that gets underused in games, either.
ugh, Sony's marketing really worked well...
 
It's not a design flaw. That's Sony Japan's engineers fucking up their cryptography. If they had done their job right, the PS3 would have remained unhacked forever.

As for the underused-in-games bit, I can only surmise you actually haven't played any PS3 games.
 
If that's not a design flaw, what is? The point is: they need a new system.

We have been over this stuff many times already. I suggest you read up on this thread. I won't argue with your Sony-tinted glasses.
 
It will be necessary if they want to:

1) Keep backwards compatibility. No CPU out there can emulate the Cell.
2) Keep the PS3 operating system.
3) Keep the same security system.
4) Keep the 7 years of programming tools and libraries that both first- and third-party devs compiled, which will allow an easy transition to next gen. No more Naughty Dog crying about making the new Uncharted on the PS4.
5) Not pay any royalties to use someone else's CPU.

Gran Turismo running on 4 PS3s

http://www.sonyinsider.com/2008/11/19/gran-turismo-in-4-times-the-resolution-of-full-hd/
Wow, I hadn't come across GT outputting 4K since they did it a few years ago, displayed across four 1080p panels. Shows how serious GT, Sony and Japan are about 4K.

DCKing said:
If that's not a design flaw, what is? The point is: they need a new system.
It's just code that was burned into one of the SPUs.

There is a reason all game consoles this generation use PowerPC cores, and that reason has not changed despite the advertising by Intel and AMD. AMD's 6-core CPUs cannot run a 100% duty cycle without overheating; Intel might be better at 22nm. Games are generally repetitive tasks that a RISC chip can do faster with less power. The SPU goes RISC one better, as a super-RISC with an even smaller instruction set. The only thing with a smaller instruction set would be a dedicated hardware codec.

But for general-purpose OS functions like a web browser, an AMD CPU is probably a better choice. Are we talking game consoles or PCs here?
 
What are you talking about? How does including the UVD even begin to compare with the costs and complexity associated with the SPEs?

@vandaliser: you should know that the PS3's security system contains a critical design flaw and needs to be rewritten anyway. I do think it's likely the PS3 chip will be included for BC. None of your other arguments justify continued use of an old, weird architecture that gets underused in games, either.
ugh, Sony's marketing really worked well...

His point is that SPUs can be used for many things. A dedicated video decode block can't help you in your physics simulation, or render 3D sound, or blend animations, etc. If your problem is "how should we apportion our transistor budget in a game console to allow for 4K video playback?", choosing an architecture that can pull double duty makes a lot of sense.

Also, did you get your talking points from 2007? What makes you think SPUs are underused? All the multiplatform titles out there are leaning pretty heavily on them. They are a solved problem, and that solution is, coincidentally, the same one everyone will be using next generation to keep their "traditional" 12- or 16-thread CPUs busy. Cell isn't old and weird; it's still ahead of its time. It took Intel six years to catch up to Cell's vector capabilities, and a fairly inexpensive upgrade would leave Intel in the dust again. When you add the benefits of backwards compatibility and security to proven excellence for games and video, it's hard to imagine a better choice out there.
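The "solved problem" here is the job-queue model. A minimal sketch (illustrative, not any engine's actual scheduler) of how work chopped into small independent jobs keeps any number of workers busy, whether those workers are SPUs or conventional hardware threads:

// Tiny job queue: workers pull job indices from a shared atomic counter.
#include <atomic>
#include <cstdio>
#include <thread>
#include <vector>

int main() {
    const int kJobs = 1024, kWorkers = 12;   // e.g. a 12-thread CPU
    std::vector<float> results(kJobs);
    std::atomic<int> next{0};

    auto worker = [&] {
        // Each worker grabs the next unclaimed job until none remain.
        for (int j; (j = next.fetch_add(1)) < kJobs; )
            results[j] = j * 0.5f;           // stand-in for real work
    };

    std::vector<std::thread> pool;
    for (int i = 0; i < kWorkers; ++i) pool.emplace_back(worker);
    for (auto& t : pool) t.join();
    std::printf("done: %g ... %g\n", results[0], results[kJobs - 1]);
}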
 
It's just code that was burned into one of the SPUs.
Changing it is going to be a pain if they want to stay compatible. As someone with a master's in computer security, I was amazed that they got it this wrong :-\
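For the record, the mistake is public knowledge: as fail0verflow demonstrated, Sony signed with a fixed "random" nonce k instead of generating a fresh one per ECDSA signature. Given two signatures (r, s1) and (r, s2) over message hashes h1 and h2 made with the same k, anyone can recover the nonce and then the private signing key d:

k = (h1 - h2) / (s1 - s2)  mod n
d = (s1 * k - h1) / r      mod n

where n is the curve order. With a fresh random k per signature, neither equation is usable.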
There is a reason all game consoles this generation use PowerPC cores, and that reason has not changed despite the advertising by Intel and AMD. AMD's 6-core CPUs cannot run a 100% duty cycle without overheating; Intel might be better at 22nm. Games are generally repetitive tasks that a RISC chip can do faster with less power. The SPU goes RISC one better, as a super-RISC with an even smaller instruction set. The only thing with a smaller instruction set would be a dedicated hardware codec.

But for general-purpose OS functions like a web browser, an AMD CPU is probably a better choice. Are we talking game consoles or PCs here?
The conjecture is either a Wii U-like architecture or the next iteration of Bulldozer. It's not going to be a PC part, so I expect AMD will fix the overheating, but in general RISC does have more advantages. AMD's and Intel's internal workings have shifted much closer to RISC, however, so it's probably too close to call at this point.

@Brad Grenz: I call them ineffective because the Xbox 360 can do almost all of the same things the PS3 can on a relatively normal architecture, released a year earlier, using less silicon, without a Cell-like architecture. The Cell compensated for the RSX well, but for its complexity it has not delivered. The Xbox 360 was simply much more effective.
 
Meh, Sony should use Cell instead of going backwards for the sake of making development easier for third-party devs. What Sony has done on consoles this gen with Cell has been amazing.
 
Why would it cost significant R&D to do that? I mean, a scaling of 'vanilla' Cell?

I'm not advocating the approach necessarily, just to be clear; I just don't know why we'd dismiss it on a cost basis.

Wouldn't it be just about the same, investment-wise, as any other customisation of existing tech they might go with?

If they did want to go that route, it likely wouldn't be quite a clean/simple scaling of what's already there; they'd probably at least want to replace the PPE with another Power core, for example, and use different external I/O interfaces. That would require Sony-specific work to customise another (off-the-shelf, no doubt) core to talk to the EIB, etc. But in that case we're still way, way off the kind of from-scratch R&D that went into the first PS3 Cell.

I don't think Sony is going to use Cell again, but if they do, I agree it can't cost that much in R&D.
Scaling Cell should not cost Sony that much, truth be told, and they already know the problems with it from the PS3.
Plus, the days of Cell being a pain in the ass to work with are over; more devs have problems with the PS3's split memory.
 
Meh, Sony should use Cell instead of going backwards for the sake of making development easier for third-party devs. What Sony has done on consoles this gen with Cell has been amazing.

They need to make things easier on human beings if they want to continue in this field.
 
They need to make things easier on human beings if they want to continue in this field.

We are not in 2007 anymore. People have moved on. When third-party devs such as EA are using the SPUs to great effect on Battlefield, you can safely say the transition-hell days are over.

The bottleneck right now for the PS3 is the RAM.
 
I think Sony will just use Cell as the main CPU and beef up the RAM and GPU. (shrug)

I think so too. They obviously need more RAM and faster read speeds, but they should keep Blu-ray. This time PSN will be integrated from the start, too. Get rid of the bottlenecks and make the system more developer-friendly, all for under $500. Update the DualShock as well, and maybe come up with some sort of Move/DualShock hybrid all-in-one controller and a more advanced Eye camera.
 