So it would be something like this:
Subsidized version: $300.
Unsubsidized version: $600?
Probably.
It can also be $200 / $400 if they didn't go batshit insane on specs.
Price war with MS? GL.
Not to mention MS will cheat by offering subsidized plans.
Sony should focus on creating a unique gaming experience, and offer a free PSN that rivals / overtakes the paid XBL.
As I just said, $400 unsubsidized would be a stretch if they want a 256-bit bus and something comparable to, say, an AMD 7850.
For me, I assumed you meant overall performance; that's why I said I couldn't answer that. And the assumed amount is 2GB of GDDR5.
Obviously not in the console space. Something on a Pitcairn LE (underclocked at that) would be a pretty good ceiling.
Batshit insane is something like a 7950/7970.
Wii U will not be on par with PS4 or Xbox 3, but it should be able to get ports from those consoles.
The way I see it so far it's shaping up like last gen where power-wise:
Wii U = PS2
PS4 = GC
Xbox 3 = Xbox
That's potentially misleading. If we are talking memory "attached to logic" in the SOC then it won't be GDDR5, and if we are talking 3D stacked memory as system memory then it won't be GDDR5 either. 2 GB of system memory with 100 MB or less in the SOC is a possibility.
3D stacked memory is faster than GDDR5 and will eventually be cheaper. Eventually is key here. The design of the PS4 has aspects that cannot change, so the first units made must use what will eventually be cheaper even though it costs more now.
This is all speculation at this point, so we can't say that GDDR5 won't be used, but we also should not insist that it will be. In the wide-memory-bus discussion above, the 256- and 384-bit-wide buses were meant to speed up memory transfers. 3D stacked memory is as fast as XDR2 and would not need as wide a bus as GDDR3 or GDDR5. Until now, 3D stacked memory was either unavailable or too expensive; that's changing, and it impacts future hardware designs.
A wider bus is more energy efficient, so SOC handheld designs will use a wider bus with 3D stacked ultra-wide-I/O memory at a slower clock speed. 3D stacked memory is faster, eventually cheaper, and more energy efficient than GDDR5.
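To put rough numbers on the width-vs-clock trade-off, here's a back-of-the-envelope sketch; the bus widths and transfer rates below are illustrative assumptions, not leaked specs:

```python
# Bandwidth (GB/s) = (bus width in bits / 8) * effective transfer rate in GT/s

def bandwidth_gb_s(bus_width_bits, transfer_rate_gt_s):
    return bus_width_bits / 8 * transfer_rate_gt_s

# GDDR5 on a 256-bit bus at an assumed 5.5 GT/s effective rate:
print(bandwidth_gb_s(256, 5.5))   # 176.0 GB/s

# A hypothetical stacked wide-I/O configuration: a 512-bit interface at
# only 1.0 GT/s still moves 64 GB/s -- the width-for-clock trade that
# saves power.
print(bandwidth_gb_s(512, 1.0))   # 64.0 GB/s
```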
does not compute!
If the Wii U is the PS2 and the PS4 is the GC, that would mean the Wii U will outperform the PS4 in some areas.
What are the latest WiiU rumours saying?
Wii-U is probably going to end up being the most powerful of the consoles, based on the latest rumors.
That's logically not possible. Ever.
PowerPC > Jaguar Cores.
I think it'll be easier for everyone if you just listed the specs of what you would reasonably expect from PS4 and x720 in 2013.
What can be reasonably stated is the technology that MUST be used to build a next-generation game console: Fusion HSA, SOC, 3D stacked memory, process efficiencies like using a CPU to prefetch data for a GPU or FPGA, and whether an FPGA is going to be used, for instance.
Guessing the GPU or CPU going into a next-generation console can only be done from the power envelope and a definition of what next-generation performance should be, but that is impacted by the above. Too many variables, so we fall back on the performance of developer platforms, which should be a MINIMUM spec.
What I keep seeing are attempts to use old technology to predict next generation, and statements that it's going to be weak in order to meet power and cost goals.
The new technologies are going to be available in 2013, but many are not expecting them to be used. I've provided cites to support my view.
Jeff-Rigby is probably typing up a huge post right now.
What are the chances of a PS4 reveal at E3, Gamescom or the Tokyo Game Show?
Too many variables, as I said. Throw away the power question for now and look at what we have all agreed is next generation and what the rumored developer platforms support = 2 TFLOPS. The minimum is the developer platform, and the max is limited by the power envelope (not counting cost). Assuming the developer platform is using hardware at the power envelope set for a game console, we can work with the following: 200-250W TDP.
Can you work with that? Give me the most powerful console for 200-250W TDP, prioritizing RAM, GPU, CPU and then everything else.
Quantify the CPU in terms of threads, clock speed, and GFLOPS.
The GPU in terms of core count, clock speed, and teraflops.
RAM in terms of speed and amount.
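For reference, here's the arithmetic behind that kind of breakdown, as a sketch with made-up example parts rather than rumored specs:

```python
# Peak throughput = units * clock * operations per cycle.

def cpu_gflops(cores, clock_ghz, flops_per_cycle):
    return cores * clock_ghz * flops_per_cycle

def gpu_tflops(shaders, clock_ghz, ops_per_cycle=2):  # 2 = fused multiply-add
    return shaders * clock_ghz * ops_per_cycle / 1000

# Hypothetical 8-core CPU at 3.2 GHz with 8 FLOPs/cycle per core:
print(cpu_gflops(8, 3.2, 8))    # 204.8 GFLOPS

# Hypothetical 1280-shader GPU at 800 MHz:
print(gpu_tflops(1280, 0.8))    # 2.048 TFLOPS, near the 2 TFLOPS figure above
```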
Sorry, I'm completely out of the loop about this. IGN rumoured low-end GPUs for what?
Pretty sure it's a joke about the low-end GPUs IGN mentioned for them.
I'd say TGS has a good shot. Last big show of the year.
FPGA was mentioned by the Sony CTO, there are many use cases that would apply to a game console (speech and video recognition), and it's more efficient in both power and performance than a CPU or GPU. Costs have come down, and we may be talking about a smaller, lower-cost array. If it gives an advantage, reduces heat, and is economical, then why not?
You do understand that AMD and IBM are designing process-optimized building blocks, with standards that allow a SOC to be built from blocks 2.5D-attached to a SOC substrate. My point was that a line of FPGAs is also part of the building blocks that Sony and Microsoft can use to build the next-generation game console SOC.
Video/sound processing for motion tracking and voice controls that won't take away from the GPU and CPU, plus video streaming for Remote Play, all standard without the GPU/CPU taking a hit.
Would not surprise me. While MS may not view Nintendo as competition, I'm sure Sony does, especially on their home turf in Japan. I won't be surprised in the least if Sony announces a PS4 at TGS with a Q2 2013 launch in Japan (US/EU to follow for the holidays). I'm not sure they can afford to give Nintendo a year's head start in Japan.
I'm aware of all of this, but the main draw of FPGAs is in their name "Field Programmable Gate Array", meaning they can instantiate synthesized hardware components and some can do it dynamically now with the right architecture, allowing for rapid prototyping and design changes without rolling new silicon to do it.
Still, unless you're talking about a game telling the console to synthesize specific blocks for specific processing functions, so that the FPGA wouldn't necessarily be doing the same thing for every game, it seems more economical and more cost effective in the long run to identify these components and build ASIC versions of them. For a company that can afford to send ASIC designs out to a foundry (and would need parts in the quantities required for supplying consoles to market), an optimized ASIC design is going to be more power and heat efficient than the same design synthesized and instantiated in an FPGA.
The thought that a game designer could tell the FPGA to be something different depending on the game is interesting, but I question whether it has enough application to make it worth the trouble. Going by developer sentiment about how "difficult" a console can be to develop for, I think throwing a library of FPGA-instantiable SOC components at them will be met with some . . . resistance.
Found a source showing that these days a CPU/APU doesn't really lose out to the big dogs:
http://www.winmatrix.com/forums/ind...-3850-llano-apu-benchmark-gaming-performance/
That looks good, and it makes me wonder what the performance numbers would be with a Radeon 7850. Of course, the system RAM configuration would be quite different.
Very valid arguments, and a very good description of FPGAs and how they're programmed. onQ123 is probably on target with "they don't know everything that they will be using with it in the future having a FPGA will be better than designing a ASIC that can't adapt". For us on the outside, we can only go by the Sony CTO statement and others who follow the industry closely, like Charlie at SemiAccurate.
We also have Hirai's statement that the PS4 SOC would be used in medical imaging. Economy of scale from building at game-console volumes gives Sony an advantage, but only if it's not a general-purpose SOC competing with AMD Fusion APUs in PCs. It must have some hardware difference that makes it a unique and BEST USE case for medical imaging. An FPGA in the SOC would make it so.
http://semiaccurate.com/2012/03/02/sony-playstation-4-will-be-an-x86-cpu-with-an-amd-gpu/ said:
"There is a lot of weird talk coming out of Sony engineers and programmable logic, AKA an FPGA, is just one of the things."
"Expect stacked memory and lots of it, we have been hearing about this for a year plus now."
http://www.design-reuse.com/articles/6733/fpga-coprocessors-hardware-ip-for-software-engineers.html said:
"It is widely recognized that FPGAs are very efficient for the implementation of many computationally complex digital signal processing algorithms. In comparison with programmable DSP processors, they can deliver a lower-cost and lower-power solution for a variety of algorithms. FPGAs, however, do not offer the same flexibility and ease of design as DSP processors. FPGA coprocessors are blocks of hardware IP that can easily be integrated into a processor-based system in order to offload some of the most computationally intensive tasks.
A combination of standardized hardware interfaces, design automation tools to assemble a system, and a standardized software API forms the concept of FPGA coprocessors. The design automation tools and software API make it possible for system and software engineers to make use of hardware IP with a minimum of actual FPGA design. The standardized interfaces provide orthogonality. If an IP designer conforms to the standards, an IP block can be used as a coprocessor with any of the supported processors. In a similar way, once the necessary interface hardware and software drivers have been created, all FPGA coprocessor IP can be used with that processor."
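As a purely hypothetical sketch of the offload pattern that article describes (every name below is invented for illustration; no real FPGA toolchain API is being quoted):

```python
class FPGACoprocessor:
    """Illustrative stand-in for a standardized FPGA-coprocessor API."""

    def __init__(self, bitstream_path: str):
        # In real hardware this would program the FPGA fabric with a
        # pre-synthesized IP block (an FFT, a motion tracker, etc.).
        self.bitstream_path = bitstream_path

    def run(self, input_buffer: bytes) -> bytes:
        # Stand-in for: DMA the data in, let the hardware block process
        # it, DMA the result back while the CPU/GPU do other work.
        return input_buffer  # placeholder pass-through

# A game could load the block it needs per workload instead of spending
# CPU/GPU time on it:
voice = FPGACoprocessor("bitstreams/voice_recognition.bit")
result = voice.run(b"\x00" * 1024)  # dummy microphone samples
```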
IGN said that XB3 has a 6670 and PS4 has a 7670 (which is the same chip as the 6670).
Ironically, they claimed that Wii U has a 4850 last year.
The last rumour from BrainStew was that the gfx chip in PS4 was going to have 18 Compute Units, equivalent to the unreleased Radeon HD 7790.
EDIT: Also, I have asked this once before: if the chip is capable of downsampling in real time, then will adding more AA at 1280x720 be less efficient than, say, downsampling from 1680x945?
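A rough way to frame that question is to count shaded pixels only; this ignores bandwidth and AA-resolve costs, so it's a simplification:

```python
def pixels(width, height):
    return width * height

native     = pixels(1280, 720)   #   921,600 pixels
downsample = pixels(1680, 945)   # 1,587,600 pixels

print(downsample / native)       # ~1.72x the shading work

# So downsampling from 1680x945 costs roughly what ~1.7x supersampling
# would; whether that beats adding more AA at 1280x720 depends on the
# AA method, since MSAA doesn't shade every sample the way
# supersampling does.
```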
How does the 7790 compare to the Wii U GPU and the rumored Xbox GPU?
Is the 7790 a lot better than the 7670?
Not going to happen. It won't get the same exposure as it would during E3. Add to that the fact that it'll take attention away from the PS3, especially if it receives another small price drop during holiday 2012. Lastly, various first parties would have to divert resources to make showcases, which is sort of pointless if the console is meant to be out sometime around holiday 2013.
How would it take attention away from the PS3? As you said, TGS exposure would mostly be to the Asian market and enthusiasts, while if the PS3 were to get a price drop, that would be all over TV, the internet, and print ads. Holiday 2013 is a long way away.
As for your point about the showcases, that's standard fare. Almost every game that has a demo at E3 has considerable time and resources taken away from the actual game being made. Game development is usually halted for a best attempt at making a complete, bug-free vertical slice. You may think it's pointless, but it has always been done around this time frame, the 360 being the only recent anomaly.
The more I think about it, the more I believe it could be around TGS if E3 is skipped. There is always about a year between the reveal and launch, and outside of E3 the only venues I can think of are TGS or CES. Unless of course Sony/MS do their own shows. I would wager we know about at least the PS4 before next E3.
Sony has done a 'PlayStation Meeting' for both the PS3 and Vita, so I'd say they'll have one for the PS4 too. Also, right now Sony is concentrating on the Vita; they said it themselves that one of the reasons the PSP dropped off so quickly was because they moved resources over to the PS3 very soon after the PSP's launch.
Yeah, I know.
Wait, 7790? But the 7850 only has 16 active CUs...
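For context on those CU counts: GCN packs 64 shaders per Compute Unit at 2 FLOPs per cycle (fused multiply-add); the 800 MHz clock for the rumored 18-CU part is just an assumption:

```python
def gcn_tflops(compute_units, clock_ghz):
    # 64 shaders per CU, 2 FLOPs per shader per cycle (FMA)
    return compute_units * 64 * 2 * clock_ghz / 1000

print(gcn_tflops(16, 0.86))  # ~1.76 TFLOPS -- HD 7850 (16 CUs @ 860 MHz)
print(gcn_tflops(18, 0.80))  # ~1.84 TFLOPS -- the rumored 18-CU chip
print(gcn_tflops(20, 1.00))  # ~2.56 TFLOPS -- HD 7870 (20 CUs @ 1 GHz)
```

So an 18-CU chip would sit between the 7850's 16 CUs and the 7870's 20, which is why the then-unreleased "7790" was the closest label for it.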