
IGN rumour: PS4 to have '2 GPUs' - one APU based + one discrete

HyperionX

Member
Probably.
It can also be $200 / $400 if they don't go batshit insane on specs.

As I just said, $400 unsubsidized would be a stretch if they want a 256-bit bus and something comparable to, say, an AMD 7850.

I can't see anything less than $500 unsubsidized this round. Not unless they are pulling a Wii U-style console.
 

Ashes

Banned
Price war with MS? GL. :p

Not to mention MS will cheat by offering subsidized plans.

Sony should focus on creating a unique gaming experience, and offer a FREE PSN that rivals / overtakes the paid XBL.

Consoles don't have price wars, they have value adjustments. :)

I think the subsidised console route is a cool idea, but you can only do it with the first console. I don't mind a 2 year contract at £20, but the buck stops at one. Which is basically what MS want: lock out the competition. But I think you've raised a very important point.

Sony has unique stuff - a hit or miss affair though. :/
 
YW.

For me, I assumed you meant overall performance; that's why I said I couldn't answer that. And the assumed amount is 2GB of GDDR5.
That's potentially misleading. If we are talking memory "attached to logic" in the SOC, then it won't be GDDR5; if we are talking 3D stacked memory as system memory, then it won't be GDDR5 either. 2 gigs of system memory with 100 or fewer megs in the SOC is possible.

3D stacked memory is faster than GDDR5 and will eventually be cheaper. Eventually is key here. The design of the PS4 has aspects that cannot change, so the first unit made must use what will eventually be cheaper, even though it costs more now.

This is all speculation at this point, so we can't say that GDDR5 won't be used; we also should not insist that GDDR5 will be used. In the wide memory bus discussion above, the 256- and 384-bit wide buses were about speeding up memory transfers. 3D stacked memory is as fast as XDR2 and would not need as wide a bus as GDDR3 or GDDR5. Up to this point 3D stacked memory was either not available or too expensive; that's changing, and it impacts future hardware designs.

A wider bus is more energy efficient, so SOC handheld designs will have a wider bus and use 3D stacked ultra-wide I/O memory at a slower clock speed because it's energy efficient. 3D stacked memory is faster, eventually cheaper, and more energy efficient than GDDR5.
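To make the wide-bus trade-off concrete, here is the standard peak-bandwidth arithmetic; a sketch only, where the bus widths and data rates are typical/illustrative figures, not leaked PS4 specs:

```python
# Rough memory-bandwidth arithmetic (illustrative figures, not leaked specs).
# Peak bandwidth (GB/s) = bus width (bits) / 8 * effective data rate (GT/s).

def peak_bandwidth_gbs(bus_width_bits, data_rate_gtps):
    """Peak theoretical bandwidth in GB/s for a given bus width and data rate."""
    return bus_width_bits / 8 * data_rate_gtps

# GDDR5 on a 256-bit bus at a typical 5.0 GT/s effective rate:
print(peak_bandwidth_gbs(256, 5.0))   # 160.0 GB/s

# A hypothetical wide-I/O 3D stack: 512-bit bus at a slower 2.0 GT/s:
print(peak_bandwidth_gbs(512, 2.0))   # 128.0 GB/s at a lower clock (and lower power)
```

The point of the second line is the one made above: a wider bus lets the stack run at a much lower clock for comparable bandwidth, which is where the energy efficiency comes from.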
 

Proelite

Member
Obviously not in the console space. Something on a Pitcairn LE (underclocked at that) would be a pretty good ceiling.

That's why the 7950 is batshit insane. :p

Although my version of batshit insane is probably different from ShockingAlberto's, because in the context of his posts a traditional generation leap is batshit insane.

7850 with 2GB of GDDR5 would absolutely be a traditional generation leap.

For me batshit insane is packing as much as possible into a 250-watt TDP envelope, and you can definitely do more than a 7850 in a 250W TDP envelope, as Chittagon's build had a 7850 and drew 180W total.
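A quick back-of-envelope check of that headroom claim (both figures are the ones cited in the thread, not measurements of mine):

```python
# Headroom left in a 250 W envelope if a whole 7850-based PC draws 180 W.
envelope_w = 250
cited_7850_system_w = 180   # full-system draw cited for Chittagon's build
print(envelope_w - cited_7850_system_w)   # 70 W of headroom for a bigger GPU/CPU
```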
 

i-Lo

Member
Wii U will not be on par with PS4 or Xbox 3, but it should be able to get ports from those consoles.

The way I see it so far it's shaping up like last gen where power-wise:

Wii U = PS2

PS4 = GC

Xbox 3 = Xbox

I really hope against this. The next generation, like this one, will be thoroughly influenced by third-party software devs, and as such I can't imagine the conditions will favour PS4 if it's not on par with XB3. Heck, it's bad enough that this gen we see a ton of 360 vs PS3 game comparison vids on which many people base their decisions, and it'll only get worse next gen.

If it turns out like GC at the end of the day, I just hope Sony's first parties keep them afloat.
 

Proelite

Member
That's potentially misleading. If we are talking memory "attached to logic" in the SOC, then it won't be GDDR5; if we are talking 3D stacked memory as system memory, then it won't be GDDR5 either. 2 gigs of system memory with 100 or fewer megs in the SOC is possible.

3D stacked memory is faster than GDDR5 and will eventually be cheaper. Eventually is key here. The design of the PS4 has aspects that cannot change, so the first unit made must use what will eventually be cheaper, even though it costs more now.

This is all speculation at this point, so we can't say that GDDR5 won't be used; we also should not insist that GDDR5 will be used. In the wide memory bus discussion above, the 256- and 384-bit wide buses were about speeding up memory transfers. 3D stacked memory is as fast as XDR2 and would not need as wide a bus as GDDR3 or GDDR5. Up to this point 3D stacked memory was either not available or too expensive; that's changing, and it impacts future hardware designs.

A wider bus is more energy efficient, so SOC handheld designs will have a wider bus and use 3D stacked ultra-wide I/O memory at a slower clock speed because it's energy efficient. 3D stacked memory is faster, eventually cheaper, and more energy efficient than GDDR5.

I think it'll be easier for everyone if you just listed the specs of what you would reasonably expect from PS4 and x720 in 2013.
 

Ashes

Banned
If it launches with value - Jaguar-based APU, plus a moderate Southern Islands discrete GPU, plus 2GB: $299

PS2 launch price.

It won't be though.
 

onQ123

Member
Wii U will not be on par with PS4 or Xbox 3, but it should be able to get ports from those consoles.

The way I see it so far it's shaping up like last gen where power-wise:

Wii U = PS2

PS4 = GC


Xbox 3 = Xbox

does not compute!



if the Wii U is the PS2 & the PS4 is GC that would mean that Wii U will outperform the PS4 in some areas
 
I think it'll be easier for everyone if you just listed the specs of what you would reasonably expect from PS4 and x720 in 2013.
What can reasonably be stated is the technology that MUST be used to build a next-generation game console: Fusion HSA - SOC - 3D stacked memory - process efficiencies like using a CPU to prefetch data for a GPU or FPGA - and whether an FPGA is going to be used at all.

The GPU or CPU going into a next-generation console can only be guessed from the power envelope and a definition of what next-generation performance should be, and that is impacted by the above. Too many variables, so we fall back to the performance of developer platforms, which should be a MINIMUM spec.

What I keep seeing is an attempt to use old technology to predict next generation, and statements that it's going to be weak in order to meet power and cost goals.

The new technologies are going to be available in 2013, but many are not expecting them to be used. I've provided cites to support my view.
 

Proelite

Member
What can reasonably be stated is the technology that MUST be used to build a next-generation game console: Fusion HSA - SOC - 3D stacked memory - process efficiencies like using a CPU to prefetch data for a GPU or FPGA - and whether an FPGA is going to be used at all.

The GPU or CPU going into a next-generation console can only be guessed from the power envelope and a definition of what next-generation performance should be, and that is impacted by the above. Too many variables, so we fall back to the performance of developer platforms, which should be a MINIMUM spec.

What I keep seeing is an attempt to use old technology to predict next generation, and statements that it's going to be weak in order to meet power and cost goals.

The new technologies are going to be available in 2013, but many are not expecting them to be used. I've provided cites to support my view.

200-250W TDP.

Can you work with that? Give me the most powerful console for 200-250W TDP, prioritizing RAM, GPU, CPU and then everything else.

Quantify the CPU in terms of threads, clock speed, and gflops.
The GPU in terms of core count, clock speed, and teraflops.
RAM in terms of speed and amount.
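For reference, the teraflop figures in these spec discussions come from a standard formula; a minimal sketch, where the 7850 numbers are its public desktop specs and the CPU parameters are purely illustrative:

```python
# Standard peak-FLOPS arithmetic used in these spec debates.
# One shader does a fused multiply-add per clock = 2 floating-point ops.

def gpu_tflops(shader_count, clock_ghz):
    """Peak single-precision TFLOPS for a GPU."""
    return shader_count * clock_ghz * 2 / 1000

def cpu_gflops(cores, clock_ghz, flops_per_cycle):
    """Peak GFLOPS for a CPU; flops_per_cycle depends on the SIMD width."""
    return cores * clock_ghz * flops_per_cycle

print(gpu_tflops(1024, 0.86))   # 7850-class part: ~1.76 TFLOPS
print(cpu_gflops(4, 3.2, 8))    # illustrative 4-core CPU, 8 flops/cycle: ~102 GFLOPS
```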
 

Donnie

Member
does not compute!



if the Wii U is the PS2 & the PS4 is GC that would mean that Wii U will outperform the PS4 in some areas

Why shouldn't it? Newer hardware isn't always better in every way, in fact it very rarely is. PS2 and GC could outperform Xbox in some ways. The point is PS4 will be the more powerful system overall, like GC was more powerful overall than PS2.
 
That's potentially misleading. If we are talking memory "attached to logic" in the SOC, then it won't be GDDR5; if we are talking 3D stacked memory as system memory, then it won't be GDDR5 either. 2 gigs of system memory with 100 or fewer megs in the SOC is possible.

3D stacked memory is faster than GDDR5 and will eventually be cheaper. Eventually is key here. The design of the PS4 has aspects that cannot change, so the first unit made must use what will eventually be cheaper, even though it costs more now.

This is all speculation at this point, so we can't say that GDDR5 won't be used; we also should not insist that GDDR5 will be used. In the wide memory bus discussion above, the 256- and 384-bit wide buses were about speeding up memory transfers. 3D stacked memory is as fast as XDR2 and would not need as wide a bus as GDDR3 or GDDR5. Up to this point 3D stacked memory was either not available or too expensive; that's changing, and it impacts future hardware designs.

A wider bus is more energy efficient, so SOC handheld designs will have a wider bus and use 3D stacked ultra-wide I/O memory at a slower clock speed because it's energy efficient. 3D stacked memory is faster, eventually cheaper, and more energy efficient than GDDR5.

It's not misleading, as it only sets aside assumptions. If there were facts being ignored, you would be correct.

does not compute!



if the Wii U is the PS2 & the PS4 is GC that would mean that Wii U will outperform the PS4 in some areas

I'm using the K.I.S.S. method. ;)
 
200-250W TDP.

Can you work with that? Give me the most powerful console for 200-250 TDP, prioritizing ram, gpu, cpu and then everything else.

Quantizes CPU in terms of threads, clock speed, and gflops.
GPU in terms of core count, clock speed, and teraflops.
Ram in terms of speed and amount.
Too many variables, as I said. Throw away the power for now and look to what we have all agreed is next generation and what the rumored developer platforms support = 2 TFLOPS. The minimum is the developer platform and the max is limited by the power envelope (not counting cost). Assuming the developer platform is using hardware at the power envelope set for a game console, we can do the following:

Now look at the efficiencies created by the new technologies, total them up, and, for example, if they give us a 100% performance increase we can halve the developer GPU specs to get the same performance. This results in a cheaper console that would easily fit in the target power envelope. Now, does Sony want a killer console, and is it willing to subsidize it? The limit would then be the power envelope, which looks close to the developer platform, so we could then have a near 4 TFLOPS game console. Which way will Sony go?

Too many variables!
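That argument reduces to simple arithmetic; a sketch, where the 2 TFLOPS baseline is the rumored dev-platform figure from this thread and the 100% efficiency gain is the poster's hypothetical, not a measured number:

```python
# The 2 TFLOPS baseline is the thread's rumored dev-platform figure;
# the 100% uplift from HSA/SOC/stacked memory is a hypothetical.
dev_platform_tflops = 2.0
efficiency_gain = 1.0   # assumed 100% uplift

# Option A: cheaper console - halve the raw GPU, keep dev-platform performance.
cheap_raw_tflops = dev_platform_tflops / (1 + efficiency_gain)        # 1.0 raw
# Option B: killer console - keep the full power envelope plus the uplift.
killer_effective_tflops = dev_platform_tflops * (1 + efficiency_gain) # ~4.0 effective
print(cheap_raw_tflops, killer_effective_tflops)
```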
 

Ashes

Banned
Sorry, I'm completely out of the loop about this. IGN rumoured low-end GPUs for what?

1. See OP for the IGN APU rumour. Note it's a first-gen APU.
2. Trinity, the second-gen APU, is coming out now, so the PS4 spec will at least be that.
3. Next year sees the third-gen APU. One presumes the PS4 will have that APU.

The 3rd-gen APU comes with Steamroller cores - the high-performance part. Thus we expected the PS4 to use an APU based on this chip.
But now somebody on GAF posted information gleaned from a friend who works at AMD, claiming that the plan has changed to use the lower-powered 3rd-gen APU with Jaguar cores.

Nobody knows. Nothing is confirmed. And on top of that, everything is changing in manufacturing as well. A fairly big shift.
 

McHuj

Member
I'd say TGS has a good shot. Last big show of the year.

Would not surprise me. While MS may not view Nintendo as competition, I'm sure Sony does, especially on their home turf in Japan. I won't be surprised in the least if Sony announces a PS4 at TGS with a Q2 2013 launch in Japan (US/EU to follow for the holidays). I'm not sure they can afford to give Nintendo a year's head start in Japan.
 

teiresias

Member
FPGA was mentioned by the Sony CTO; there are many use cases that would apply to a game console (speech and video recognition), and it's more efficient in both power and performance than a CPU or GPU. Cost has come down, and we may be talking about a lower-cost, smaller array. If it gives an advantage, reduces heat, and is economical, then why not?

You do understand that AMD and IBM are designing process-optimized building blocks with standards that allow a SOC to be built using "building blocks" 2.5D-attached to a SOC substrate. My point was that a line of FPGAs is also part of the building blocks that Sony and Microsoft can use to build the next-generation game console SOC.

Video/sound processing for motion tracking & voice controls that won't take away from the GPU & CPU, & video streaming for Remote Play - all standard without the GPU/CPU taking a hit.

I'm aware of all of this, but the main draw of FPGAs is in their name "Field Programmable Gate Array", meaning they can instantiate synthesized hardware components and some can do it dynamically now with the right architecture, allowing for rapid prototyping and design changes without rolling new silicon to do it.

Still, unless you're talking about a game telling the console that it needs to synthesize specific blocks for specific processing functions - so the FPGA wouldn't necessarily be doing the same thing for every game - it still seems more economical in the long run to identify these components and build ASIC versions of them. For a company that can afford to send ASIC designs out to a foundry (and that needs parts in console-market quantities), an optimized ASIC design is going to be more power and heat efficient than the same design synthesized and instantiated in an FPGA.

The thought that a game designer could tell the FPGA to be something different depending on the game is interesting, but I question whether it has enough application to make it worth the trouble. Going by developer sentiment about how "difficult" a console can be to develop for, I think throwing a library of FPGA-instantiable SOC components at them will be met with some . . . resistance.
 
Would not surprise me. While MS may not view Nintendo as competition, I'm sure Sony does, especially on their home turf in Japan. I won't be surprised in the least if Sony announces a PS4 at TGS with a Q2 2013 launch in Japan (US/EU to follow for the holidays). I'm not sure they can afford to give Nintendo a year's head start in Japan.

That would be just before the Wii U launch, possibly. A bit like their NGP reveal, but forget about that.

Back when the Wii U was new news, there seemed to be a popular argument that whoever releases a rival machine earlier and closer to Wii U would do a lot better than a company waiting a few years and releasing a more advanced, expensive design.
Is that still the consensus, or is Wii U seen as less of a potential multi-platform target now?
 

i-Lo

Member
I'd say TGS has a good shot. Last big show of the year.

Not going to happen. It won't get the same exposure as it would during E3. Add to that the fact that it'll take attention away from the PS3, especially if it receives another small price drop during holiday 2012. Lastly, various first parties would have to divert resources to make showcases that are sort of pointless if the console is meant to be out sometime around holiday 2013.
 

onQ123

Member
I'm aware of all of this, but the main draw of FPGAs is in their name "Field Programmable Gate Array", meaning they can instantiate synthesized hardware components and some can do it dynamically now with the right architecture, allowing for rapid prototyping and design changes without rolling new silicon to do it.

Still, unless you're talking about a game telling the console that it needs to synthesize specific blocks for specific processing functions - so the FPGA wouldn't necessarily be doing the same thing for every game - it still seems more economical in the long run to identify these components and build ASIC versions of them. For a company that can afford to send ASIC designs out to a foundry (and that needs parts in console-market quantities), an optimized ASIC design is going to be more power and heat efficient than the same design synthesized and instantiated in an FPGA.

The thought that a game designer could tell the FPGA to be something different depending on the game is interesting, but I question whether it has enough application to make it worth the trouble. Going by developer sentiment about how "difficult" a console can be to develop for, I think throwing a library of FPGA-instantiable SOC components at them will be met with some . . . resistance.

Well, seeing as it's a console & they don't know everything that they will be using with it in the future, having an FPGA will be better than designing an ASIC that can't adapt.
 

Ashes

Banned
That looks good and it makes me wonder what the performance number would be with a Radeon 7850. Of course, the system RAM configuration would be quite different.

And also note that, apart from one vague inconsistent post on NeoGAF, we really don't have any reason to believe it won't be Kaveri with Steamroller cores.

Edit: and in any case, the Jaguar-based APUs use Sea Islands graphics too.

Source:

http://www.anandtech.com/show/5491/amds-2012-2013-client-cpugpuapu-roadmap-revealed
 
I'm aware of all of this, but the main draw of FPGAs is in their name "Field Programmable Gate Array", meaning they can instantiate synthesized hardware components and some can do it dynamically now with the right architecture, allowing for rapid prototyping and design changes without rolling new silicon to do it.

Still, unless you're talking about a game telling the console that it needs to synthesize specific blocks for specific processing functions - so the FPGA wouldn't necessarily be doing the same thing for every game - it still seems more economical in the long run to identify these components and build ASIC versions of them. For a company that can afford to send ASIC designs out to a foundry (and that needs parts in console-market quantities), an optimized ASIC design is going to be more power and heat efficient than the same design synthesized and instantiated in an FPGA.

The thought that a game designer could tell the FPGA to be something different depending on the game is interesting, but I question whether it has enough application to make it worth the trouble. Going by developer sentiment about how "difficult" a console can be to develop for, I think throwing a library of FPGA-instantiable SOC components at them will be met with some . . . resistance.
Very valid arguments, and a very good description of FPGAs and how they're programmed. onQ123 is probably on target with "they don't know everything that they will be using with it in the future, having an FPGA will be better than designing an ASIC that can't adapt". For us on the outside, we can only go by the Sony CTO statement and others who follow the industry closely, like Charlie at SemiAccurate:

http://semiaccurate.com/2012/03/02/sony-playstation-4-will-be-an-x86-cpu-with-an-amd-gpu/ said:
"There is alot of weird talk coming out of Sony engineers and programmable logic, AKA an FPGA, is just one of the things".

"Expect stacked memory and lots of it, we have been hearing about this for a year plus now".
We also have Hirai's statement that the PS4 SOC would be used in medical imaging. Economy of scale from building at game-console volumes gives Sony an advantage, but only if it's not a general-purpose SOC competing with AMD Fusion APUs in PCs. It must have some hardware difference that makes it a unique BEST USE case for medical imaging. An FPGA in the SOC would make it so.

http://www.altera.com/literature/wp/wp-01173-opencl.pdf?f=hp&k=gh

FPGAs have many uses and can be very simple circuits used as glue logic, to turn off damaged or unused logic, or to compensate for damaged elements in an image sensor. It's not confirmed how the FPGA is to be used, and I may be overstating its use by giving examples of the more glamorous applications.

http://www.design-reuse.com/articles/6733/fpga-coprocessors-hardware-ip-for-software-engineers.html said:
It is widely recognized that FPGAs are very efficient for the implementation of many computationally complex digital signal processing algorithms. In comparison with programmable DSP processors, they can deliver a lower-cost and lower-power solution for a variety of algorithms. FPGAs, however, do not offer the same flexibility and ease of design as DSP processors. FPGA coprocessors are blocks of hardware IP that can easily be integrated into a processor-based system in order to offload some of the most computationally intensive tasks.

A combination of standardized hardware interfaces, design automation tools to assemble a system, and a standardized software API forms the concept of FPGA coprocessors. The design automation tools and software API make it possible for system and software engineers to make use of hardware IP with a minimum of actual FPGA design. The standardized interfaces provide orthogonality. If an IP designer conforms to the standards, an IP block can be used as a coprocessor with any of the supported processors. In a similar way, once the necessary interface hardware and software drivers have been created, all FPGA coprocessor IP can be used with that processor.
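As a toy illustration of the coprocessor pattern the article describes - offload a hot function to a pre-synthesized FPGA block behind a standard software API, with a CPU fallback - here is a minimal sketch. Every name in it (FpgaCoprocessor, load_bitstream, run) is hypothetical; no real vendor SDK is being quoted:

```python
# Hypothetical sketch of the FPGA-coprocessor pattern described above.
# FpgaCoprocessor, load_bitstream and run are invented names; they stand in
# for whatever a real vendor API would provide.

class FpgaCoprocessor:
    def __init__(self):
        self.loaded = None   # name of the currently configured IP block

    def load_bitstream(self, block):
        # A real system would configure the gate array here with a
        # pre-synthesized IP block (e.g. an FFT or motion-tracking core).
        self.loaded = block

    def run(self, block, data):
        if self.loaded != block:
            raise RuntimeError(f"bitstream for {block!r} not loaded")
        return f"{block}: {len(data)} samples processed in hardware"

def fft(samples, coproc=None):
    """Offload to the FPGA block when one is configured, else run on the CPU."""
    if coproc is not None and coproc.loaded == "fft":
        return coproc.run("fft", samples)
    return f"fft: {len(samples)} samples processed in software"

coproc = FpgaCoprocessor()
coproc.load_bitstream("fft")
print(fft(list(range(1024)), coproc))   # hardware path
print(fft(list(range(1024))))           # software fallback
```

The fallback path is the "orthogonality" idea from the quote: the same software API works whether or not the coprocessor IP is present.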
 

i-Lo

Member
IGN said that XB3 has a 6670 and PS4 has a 7670 (which is the same chip as the 6670).

Ironically, they claimed that Wii U has a 4850 last year.

The last rumour from BrainStew was that the gfx chip in PS4 was going to have 18 Compute Units, equivalent to the unreleased Radeon HD 7790.

EDIT: Also, I have asked this once before - if the chip is capable of downsampling in real time, then will adding more AA @1280x720 be less efficient than, say, downsampling from 1680x945?
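For scale, the raw pixel counts behind that question (shading cost only; real MSAA resolves far more cheaply than shading every extra pixel, so this is an upper bound):

```python
# Raw pixel counts for the downsampling question (upper-bound shading cost;
# MSAA in practice is much cheaper than full supersampling).
native = 1280 * 720        # 921,600 pixels
supersampled = 1680 * 945  # 1,587,600 pixels
print(supersampled / native)   # ~1.72x the pixels for a 1680x945 -> 720p downsample
# True 2x supersampling at 720p would shade 2.0x the pixels by comparison.
```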
 
The last rumour from BrainStew was that the gfx chip in PS4 was going to have 18 Compute Units, equivalent to the unreleased Radeon HD 7790.

EDIT: Also, I have asked this once before - if the chip is capable of downsampling in real time, then will adding more AA @1280x720 be less efficient than, say, downsampling from 1680x945?

How does the 7790 compare to the Wii U GPU and the rumored Xbox GPU?

Is the 7790 a lot better than the 7670?
 
Not going to happen. It'll not get the same exposure as it would during E3. Add to that the fact that it'll take away attention from PS3 especially if it receives another small price drop during holiday 2012. Lastly, various first parties would have to divert resources to make showcases that are sort of pointless if the console is meant to be out sometime around holiday 2013.

How would it take attention away from the PS3? As you said, TGS exposure would be mostly to the Asian market and enthusiasts, while if the PS3 were to get a price drop, that would be all over TVs, the internet, and print ads. Holiday 2013 is a long way away.

As for your point about the showcases, that's standard fare. Almost every game that has a demo at E3 has considerable time and resources taken away from the actual game being made. Game development is usually halted for a best attempt at making a complete, bug-free vertical slice. You may think it's pointless, but it has always been around this time frame, the 360 being the only recent anomaly.

The more I think about it, the more I believe it could be around TGS, if E3 is skipped. There is always about a year between reveal and launch, and if E3 is skipped the only venues I can think of are TGS or CES. Unless of course Sony/MS do their own shows. I would wager we know about at least the PS4 before next E3.
 

Mario007

Member
How would it take attention away from the PS3? As you said, TGS exposure would be mostly to the Asian market and enthusiasts, while if the PS3 were to get a price drop, that would be all over TVs, the internet, and print ads. Holiday 2013 is a long way away.

As for your point about the showcases, that's standard fare. Almost every game that has a demo at E3 has considerable time and resources taken away from the actual game being made. Game development is usually halted for a best attempt at making a complete, bug-free vertical slice. You may think it's pointless, but it has always been around this time frame, the 360 being the only recent anomaly.

The more I think about it, the more I believe it could be around TGS, if E3 is skipped. There is always about a year between reveal and launch, and if E3 is skipped the only venues I can think of are TGS or CES. Unless of course Sony/MS do their own shows. I would wager we know about at least the PS4 before next E3.

Sony has done a 'PlayStation meeting' for both the PS3 and Vita, so I'd say they'll have one for the PS4 too. Also, right now Sony is concentrating on the Vita; they said it themselves - one of the reasons the PSP dropped off so quickly was because they moved resources over to the PS3 very soon after the PSP's launch.
 

BurntPork

Banned
The last rumour from BrainStew was that the gfx chip in PS4 was going to have 18 Compute Units, equivalent to the unreleased Radeon HD 7790.

EDIT: Also, I have asked this once before - if the chip is capable of downsampling in real time, then will adding more AA @1280x720 be less efficient than, say, downsampling from 1680x945?

Yeah, I know.

Wait, 7790? But the 7850 only has 16 active CUs...
 
Sony has done a 'PlayStation meeting' for both the PS3 and Vita, so I'd say they'll have one for the PS4 too. Also, right now Sony is concentrating on the Vita; they said it themselves - one of the reasons the PSP dropped off so quickly was because they moved resources over to the PS3 very soon after the PSP's launch.

Well, the PS3 meeting was after the E3 reveal, but I think they actually revealed the Vita at last year's one. Hmm.

Sony is definitely concentrating on the Vita right now, but they are entering a risky bubble here. If they are as hell-bent on catching MS in 2013 as they say, they are gonna have to show something at some point - within about a year's time from now. As I said, it would be before the next E3. The Vita reveal meeting was at the end of Jan, so maybe around that time.
 