PSM: PS4 specs more powerful than Xbox 720

Status
Not open for further replies.
The PCB would be different but cost virtually the same? A couple of extra pieces of plastic? I guess they would need to incorporate a DDR3 controller in the north bridge... And how is it any more failure-prone than anything else on the board? I don't hear about PC memory modules frying left and right...



u_u


When you classify adding a whole new memory controller, memory bus and 4+ memory chips to a console PCB as a "couple extra pieces of plastic", it's time to stop proposing silly ideas.
 
Never said that. Either way, it's a cheaper idea than the 8 gigs of XDR other people want.

Who the hell suggested 8GB of XDR?

Board complexity and size constraints don't just vanish over time.

OK, just for curiosity's sake, if nothing else. Please tell me the precise memory setup you propose, including bus widths, proposed interconnects and the precise number and density of the RAM chips you want to use. Then tell me how that setup is going to one day produce a $100-$150 console after die shrinks, and the RAM chips and densities you'd be using at that point. So many forget that the cost in the here and now is only a minor part of the equation. You need to propose a solution that will reduce in cost quickly as well and explain how adding another memory bus is going to help towards that goal.

Those constraints mean that at an absolute maximum you've got a 256-bit bus to "spread out" across these three memory pools. You're never going to be able to get a console down to the ~$100 range if you need to produce a board with more than 8 RAM chips, a wider memory bus and all the complexity that goes with that.
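To put rough numbers on that argument, here is an illustrative Python sketch. It assumes a simple single-rank layout where every chip has a 32-bit interface; the figures are for illustration, not confirmed specs of any console.

```python
# Illustrative only: how chip count and bus width fall out of a target
# capacity. Assumes one rank of chips, each with a 32-bit interface.
def memory_config(total_gb, chip_density_gbit, io_per_chip_bits=32):
    """Return (chip_count, bus_width_bits) for a single-rank layout."""
    chips = int(total_gb * 8 / chip_density_gbit)  # GB -> gigabits
    return chips, chips * io_per_chip_bits

# 2 GB built from 2 Gbit chips: 8 chips on a 256-bit bus --
# right at the limits argued above.
print(memory_config(2, 2))
```

Going past 8 chips or a 256-bit bus means more traces and board layers, and a floor on how small the package can shrink, which is exactly the cost-reduction problem being described.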
 

But Newegg can sell me 16GB for $15.
 
Does anyone honestly think at this point that the new consoles will have less than 4 gigs? Can't you get 4 gigs for like 20 dollars now? Two years from now it's going to be even cheaper.
 

You know what would be awesome? If the PS4 was actually my gaming PC and I could open it up and slot more memory in there! Yeah!
Running Windows 8, of course *rips hair out like brain_stew*

I'm expecting a midrange Kepler GPU, with 2 GB XDR2/3 for the CPU plus 2 GB GDDR5 for the GPU.

It is more likely that the PS4 will have AMD chips in it.
 
Who the hell suggested 8GB of XDR?

Board complexity and size constraints don't just vanish over time.

OK, just for curiosity's sake, if nothing else. Please tell me the precise memory setup you propose, including bus widths, proposed interconnects and the precise number and density of the RAM chips you want to use. Then tell me how that setup is going to one day produce a $100-$150 console after die shrinks, and the RAM chips and densities you'd be using at that point. So many forget that the cost in the here and now is only a minor part of the equation. You need to propose a solution that will reduce in cost quickly as well and explain how adding another memory bus is going to help towards that goal.

Read the thread lol, one guy said 16 gigs of XDR2.

Why can't we have 4 4-gigabit XDR2 chips and 4 2-gigabit GDDR5 chips? 256-bit. Is 3 gigs of RAM seriously too much? The RAM jump from console to console is usually 8-16x. Is Moore's law slowing down or something?
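For what it's worth, the arithmetic on that proposal checks out. A quick sanity check in Python, using only the figures from the post itself:

```python
# 4x 4 Gbit XDR2 + 4x 2 Gbit GDDR5, as proposed above.
xdr2_gb = 4 * 4 / 8    # 4 chips x 4 Gbit = 2.0 GB
gddr5_gb = 4 * 2 / 8   # 4 chips x 2 Gbit = 1.0 GB
total_gb = xdr2_gb + gddr5_gb
print(total_gb)        # 3.0 GB

# Versus the PS3's 0.5 GB total, that is only a 6x jump --
# below the historical 8-16x the post mentions.
print(total_gb / 0.5)
```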
 

4Gb XDR2 chips aren't produced yet. How can you guarantee sufficient supply for launch? How can you get costs to the level you want with only one potential memory supplier?


When the GPU is the one major component where performance is most dependent on bandwidth, is a 128-bit bus really sufficient? Is ~3x the bandwidth of RSX really sufficient when you're targeting 1080p and RSX was already severely bandwidth starved?

Where does the DDR3 pool fit in?

How do you propose developers deal with this complexity?

Is it cost efficient to create high bandwidth interconnects between components and memory pools?
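The bandwidth point can be made concrete. Peak bandwidth is bus width (in bytes) times the effective data rate; RSX's actual figures were a 128-bit bus of GDDR3 running at 1400 MT/s effective. A small Python sketch:

```python
def bandwidth_gb_s(bus_bits, data_rate_mt_s):
    """Peak theoretical bandwidth in GB/s: bytes per transfer x rate."""
    return bus_bits / 8 * data_rate_mt_s / 1000

rsx = bandwidth_gb_s(128, 1400)   # RSX's GDDR3: ~22.4 GB/s
print(rsx, 3 * rsx)               # 3x RSX is still only ~67 GB/s
```

~67 GB/s for a 1080p console is the number being questioned above.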
 

I scrapped the DDR3 because you're mean. And I never said 128-bit, I said 256-bit data width. And Sony is releasing later than everyone else, on the right side of the density jump. They could use 4-gigabit chips.
 


If it was up to me I wouldn't settle for a console with less than Avatar graphics!
 

Then this, sir, is your console of choice:
[image: 85940485_a63a57a00d.jpg]

And its cooling system:
[image: weta-digital-water-cooling-servers.gif]
 

No need for all of that, we can just fake it.
If devs get 95% close, most people won't be able to tell either way.
Please note I'm not saying that next gen we're going to get Avatar-like graphics; we're a long way from that.

I would be happy if we could get better hair, clothes and shadows, just to name a few things, for next gen.
 
Read the thread lol, one guy said 16 gigs of XDR2.

Why can't we have 4 4-gigabit XDR2 chips and 4 2-gigabit GDDR5 chips? 256-bit. Is 3 gigs of RAM seriously too much? The RAM jump from console to console is usually 8-16x. Is Moore's law slowing down or something?

By 16 I mean 16 split down the middle: one half for main memory and the other for the graphics part.

If there is one hardware company which can fit things onto a motherboard, it's Sony. C'mon Sony, get Rambus and IBM together for a little talk.
 
How bad is it to have 12 chips? How much does each cost, etc.? How many chips did the PS3 have?

8, and they still haven't managed to get it sub-$200. The 360 started with 8 and is down to 4 now. The cost of the chips is only a small part of it; it's the board complexity and the minimum size of the CPU/GPU after die shrinks that is the big issue.

Do you think Sony are going to release a console with higher costs than the PS3? Do you think they're going to release a console that can't be aggressively cost reduced?
 
By 16 I mean 16 split down the middle: one half for main memory and the other for the graphics part.

If there is one hardware company which can fit things onto a motherboard, it's Sony. C'mon Sony, get Rambus and IBM together for a little talk.

So that's 32 2Gb GDDR5 chips and 16 4Gb XDR2 chips.

Piece of cake.
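The chip-count arithmetic behind that jab, for anyone following along (illustrative Python; densities taken from the posts above):

```python
def chips_needed(pool_gb, chip_gbit):
    """Chips required to build a memory pool of the given size."""
    return pool_gb * 8 // chip_gbit  # GB -> gigabits, then divide

gddr5_chips = chips_needed(8, 2)   # 8 GB of graphics RAM from 2 Gbit GDDR5
xdr2_chips = chips_needed(8, 4)    # 8 GB of main RAM from 4 Gbit XDR2
print(gddr5_chips, xdr2_chips)     # 32 and 16 -- 48 total, six times
                                   # the PS3's 8 chips
```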
 
8, and they still haven't managed to get it sub-$200. The 360 started with 8 and is down to 4 now. The cost of the chips is only a small part of it; it's the board complexity and the minimum size of the CPU/GPU after die shrinks that is the big issue.

Do you think Sony are going to release a console with higher costs than the PS3? Do you think they're going to release a console that can't be aggressively cost reduced?

Well, I don't know about in today's economy, but they typically sell consoles at a loss at first. Remember, the PS3 cost like 800 bucks to make or something at launch. And this time around Sony doesn't have to worry about an expensive Blu-ray drive.
 
Still, I question how much BC really means to people, as I'm someone who doesn't really care about it after the first year.

It depends... if we don't get, say, another WipEout in quite a while, I'd like to be able to play the old one on the new system. Like I played the old Silent Hills on PS3, when there were no real horror games available. And I'd like all the PSN content I've bought to work on the new system as well.


EDIT:
When the GPU is the one major component where performance is most dependent on bandwidth, is a 128-bit bus really sufficient? Is ~3x the bandwidth of RSX really sufficient when you're targeting 1080p and RSX was already severely bandwidth starved?

I just remembered that the PS3 and 360 have a 128-bit bus, and I wondered how much benefit a 256-bit bus would offer. I guess it's unlikely the new consoles would use a 512-bit bus?
 
How far away are we from Holographic Versatile Disc becoming mainstream?

Of course they are still prototypes so they will be expensive, but the drive's expected initial cost is around 15,000 dollars, with each disc costing over 120. Of course, as more get produced, prices drop, etc... but still not anytime soon.
 
Depends what you're doing.

You can't do *everything* on the GPU. Well, maybe you could, but then you're taking time away from it doing other things. You could almost not bother with a decent CPU, just something good enough to handle basic housekeeping, and let a GPGPU do everything else. But I can't help thinking that wouldn't give the best results.

Modern multicore, multithreaded CPUs have plenty of grunt for tasks that free up the GPU to do other, more graphically focused things.

The problem with a console is the more silicon you spend on a CPU, the less you can spend on the GPU. Highly parallel tasks are going to work better on a GPU, so why not leave the CPU to focus on tasks that aren't easily threaded as well as feeding the GPU?

It's the direction I believe Sony will take.
 

Isn't the Vita an indication of how Sony will approach the PS4 next gen?
 
It's been said quite a few times before that devs are creating assets and having to scale them down for these consoles, so there won't be this big an increase in dev time/cost compared to this generation. I've never quite understood this; what exactly are people talking about? Anyone?
 
How far away are we from Holographic Versatile Disc becoming mainstream?
Recent articles are pointing to 4K Blu-ray being released in 2013, after HEVC (H.265) is published. It's speculated that a quad-layer disc and H.265 would work with a current Blu-ray drive, but a player would need a more powerful CPU and more video buffer memory to support 4K Blu-ray. The PS3 Slim might be able to support 4K Blu-ray.

But what about UHD resolutions above 4K, like the 8K that is coming in 10 years? Holographic drives are here now using a new GE plastic, and they can use a slightly modified Blu-ray drive. Two years ago it was speculated that a PS4 would have a Blu-ray drive able to use the GE plastic discs. The problem is that the equipment Sony has to manufacture Blu-ray discs would have to be replaced to support a format using the GE plastic. The PS4 drive may (sleeper) support the GE plastic for holographic storage to support 8K, but for the next 10 years or so 4K Blu-ray will most likely be quad-layer Blu-ray using an H.265 codec.

phosphor112 is correct in that without the GE plastic, which has a higher refractive index, much more powerful lasers are needed, and two lasers are used to record. This, plus no economy of scale, means costs are very high.
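A back-of-the-envelope check on the quad-layer idea (illustrative Python; the 60 Mbps average HEVC rate is an assumption, not anything from a spec):

```python
layer_gb = 25                      # standard Blu-ray layer capacity
capacity_gb = 4 * layer_gb         # quad layer: 100 GB
avg_bitrate_mbps = 60              # assumed average 4K HEVC bitrate
runtime_hours = capacity_gb * 8 * 1000 / avg_bitrate_mbps / 3600
print(capacity_gb, round(runtime_hours, 1))  # 100 GB, ~3.7 hours of video
```

Roughly a feature film plus extras, which is why a quad-layer disc plus a stronger decoder is considered plausible for 4K.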
 
Isn't the Vita an indication of how Sony will approach the PS4 next gen?

I'd say it is, yeah. A developer focused machine with mostly off the shelf components that delivers fantastic results from day one while allowing Sony to break even within the first year of launch while selling at a mass market friendly price. What more would you want from a console?
 

brain_stew, I realise Cell isn't "off the shelf" but could it fit into the equation? I mean of course it could work but would mature dev tools and familiarity be enough to promote it above, say, another PowerPC CPU?

Or will Cell, if it does indeed evolve, be very different from the iteration in the PS3? I'm thinking substantive changes to the SPUs etc.

Difficult questions to answer I guess; seeing as for a layman like me they're difficult to ask!
 

The only "off the shelf" Cell is the one in the PS3. Any CPU that would be useful for a next generation console would require a complete redesign and all the significant investment that goes along with that. The quicker Sony write off Cell as a failed experiment, the better. Spend those transistors on the GPU so that all developers can get fantastic results without a significant time investment and more development time can be spent on things that matter.
 
I'd say it is, yeah. A developer focused machine with mostly off the shelf components that delivers fantastic results from day one while allowing Sony to break even within the first year of launch while selling at a mass market friendly price. What more would you want from a console?

I think it would be great, especially the developer friendliness and the flexibility in price. But how big of a leap can the PS4 have over this gen with these parts?
 

Very significant, actually. It won't be a patch on a contemporary high end PC rig (or even a high end PC rig of today) but it can be a true generational leap.

Something like:

4 wide CPU cores with OoOE and decent SIMD capabilities (either AMD x86 or Power7 based).
UMA with 2GB GDDR5 on a 256 bit bus (4GB if 4Gb chips are available in time).
~2 Teraflop AMD GCN based GPU (I guess a Kepler based alternative would do but I'm not seeing it).

Released in Q4 2013 for $400.

I was thinking the Xbox3 would be practically identical until all the recent SoC and 6670 rumours started sprouting up, but I'm not so sure anymore. It's safe to say this would blow the currently rumoured Xbox3 (and Wii U for that matter) out of the water.
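Plugging plausible numbers into that spec shows why the 256-bit GDDR5 choice matters (illustrative Python; the 5 Gbps per-pin rate is an assumption about 2013-era GDDR5):

```python
bus_bits = 256
pin_rate_gbps = 5.0                     # assumed GDDR5 per-pin data rate
peak_gb_s = bus_bits / 8 * pin_rate_gbps
print(peak_gb_s)                        # 160.0 GB/s -- roughly 7x the
                                        # RSX's ~22.4 GB/s
```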
 
No need for all of that, we can just fake it.
They already did fake a lot when they made Avatar; they also computed a lot of the graphics with GPUs to make the process faster. (PantaRay used GPUs for calculating spherical harmonics/GI.)

For next generation I wouldn't mind 2- or 3-way asymmetric processing (fast OoO CPU core/s for single-thread performance, a slower CPU within the GPU for housekeeping, and a buttload of APUs for raw power).
This, with a buttload of memory, for the ability to do more interesting games.

It's been said quite a few times before that devs are creating assets and having to scale them down for these consoles, so there won't be this big an increase in dev time/cost compared to this generation. I've never quite understood this; what exactly are people talking about? Anyone?
In the current generation we pretty much already do multimillion-polygon models with ZBrush/Mudbox and a 'low polygon' model to transfer the properties onto (normal map, color, specular, etc.).
In the next generation the process will change a little, but more in terms of which properties will be transferred (add either z or vector displacement, more information about surface/subsurface, and so on).

There will be a need for more refined work, but a lot of the work is already done for the current generation of assets, even though information is lost in the process.
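The "transfer the properties" step described here boils down to baking high-poly detail into textures on the low-poly model. The core encoding is simple; a minimal sketch of how a unit normal is packed into an 8-bit RGB texel:

```python
def encode_normal(nx, ny, nz):
    """Map a unit-length tangent-space normal from [-1, 1] into 8-bit RGB."""
    return tuple(round((c * 0.5 + 0.5) * 255) for c in (nx, ny, nz))

# A normal pointing straight out of the surface gives the familiar
# 'flat blue' of tangent-space normal maps:
print(encode_normal(0.0, 0.0, 1.0))  # (128, 128, 255)
```

Vector displacement baking works the same way in principle, except the stored quantity is a signed offset rather than a direction, so it needs higher precision than 8 bits per channel.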
 
Off-the-shelf parts? I hope you guys never, ever get the chance to design the PS4. Sony has always been terrible at software; they always distinguished themselves by their hardware philosophies, and yet you guys want them to design another run-of-the-mill hardware spec like the 720 and the Wii U.
 
So that's 32 2Gb GDDR5 chips and 16 4Gb XDR2 chips.

Piece of cake.

Who said anything about GDDR5? Get that last-gen retarded RAM away from the PS4. XDR it is, for both main memory and graphics.

High-speed bandwidth is the way to go. And if there is one company that can design motherboards, it's Sony.
 
Seems likely if the PS4 is set to release later than the 720. It wouldn't make sense for them to release second if the system wasn't more powerful.
 
Very significant, actually. It won't be a patch on a contemporary high end PC rig (or even a high end PC rig of today) but it can be a true generational leap.

Something like:

4 wide CPU cores with OoOE and decent SIMD capabilities (either AMD x86 or Power7 based).
UMA with 2GB GDDR5 on a 256 bit bus (4GB if 4Gb chips are available in time).
~2 Teraflop AMD GCN based GPU (I guess a Kepler based alternative would do but I'm not seeing it).

Released in Q4 2013 for $400.

I was thinking the Xbox3 would be practically identical until all the recent SoC and 6670 rumours started sprouting up, but I'm not so sure anymore. It's safe to say this would blow the currently rumoured Xbox3 (and Wii U for that matter) out of the water.

What could Sony do if they took a small loss in terms of GPU (Kepler or AMD)?
 
Using a field programmable gate array for this:

http://www.cs.cornell.edu/~wenzel/bidir.pdf

Yields promising results.

A highly space-efficient processor architecture using floating point arithmetic was presented. The recursive kd-tree intersection code was mapped into an efficient, pipelined hardware design. A multi-core version of this architecture was implemented on an FPGA and both verified using test-benches and a specially designed global illumination renderer. While the performance of the prototype did not meet the initial expectations, much improvement will be possible using more adequate evaluation hardware. A custom floating point core design made it possible to fit the exceptional number of 126 floating point cores onto a single FPGA. The designed architecture proved to be highly efficient in the number of required clock cycles per intersection operation.

The single most important future work will be to acquire more adequate evaluation hardware, and to adapt the project to it. With some effort, it might be possible to design a hardware implementation of the ray generator, which runs directly on the FPGA – thus significantly lowering the communication bandwidth requirements. Additional areas of future work include such optimizations as mailboxing or studies on how variations in pipelining and cache design affect the performance of the intersection core.

What if billions of transistors were put into a hardware bidirectional path tracer? We could have Avatar graphics, like, right now with current technology if someone would invest some money into researching this. Who's with me?!
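The intersection core that paper maps into hardware is, at heart, the same test you would write in software. A minimal Python sketch of the principle using a ray/axis-aligned-box slab test (the paper's pipeline actually traverses kd-tree nodes and intersects triangles, so this is only the flavour of the operation):

```python
def ray_aabb_hit(origin, inv_dir, box_lo, box_hi):
    """Slab test: does a ray (origin + t*dir, t >= 0) hit the box?
    inv_dir holds the reciprocals of the ray direction components."""
    t_near, t_far = 0.0, float("inf")
    for o, inv, lo, hi in zip(origin, inv_dir, box_lo, box_hi):
        t1, t2 = (lo - o) * inv, (hi - o) * inv
        if t1 > t2:
            t1, t2 = t2, t1
        t_near, t_far = max(t_near, t1), min(t_far, t2)
    return t_near <= t_far

# Ray from the origin along (1, 1, 1) hits the box spanning [1, 2]^3:
print(ray_aabb_hit((0, 0, 0), (1.0, 1.0, 1.0), (1, 1, 1), (2, 2, 2)))  # True
```

A pipelined hardware version can retire one of these comparisons per clock, which is the "clock cycles per intersection" efficiency the quoted conclusion is measuring.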
 
What could Sony do if they took a small loss in terms of GPU (Kepler or AMD)?

It's not about taking a loss, really. It's about designing a machine to strict size and power constraints. If you want to fit a better GPU in there, then you're going to have to either dramatically increase the size of the system or use super expensive cooling, neither of which is desirable.

Then you run into more problems: if you've got a faster GPU then you need to increase bandwidth alongside it, but using anything more than a 256-bit bus is going to mess with cost reductions later on. So you look at alternatives like eDRAM, but with MRTs prevalent and a 1080p target resolution, 64MB is really the absolute minimum you can get away with, and that becomes cost prohibitive.

Honestly, given a 2013 launch and a 28nm process, a 2 teraflop / 2 billion transistor GPU is the peak of what is reasonably possible imo. Step past that point and the cost/heat/power/size of the system can start to spiral out of control.
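The 64MB eDRAM floor falls straight out of framebuffer arithmetic (illustrative Python; the four-target G-buffer is an assumption about a typical deferred renderer, not a quoted spec):

```python
width, height = 1920, 1080
bytes_per_pixel = 4            # one 32-bit render target
render_targets = 4             # assumed G-buffer: albedo, normals, depth, misc
gbuffer_mb = width * height * bytes_per_pixel * render_targets / 2**20
print(round(gbuffer_mb, 1))    # ~31.6 MB before any antialiasing

# 4x MSAA multiplies the storage -- already far past 64 MB:
print(round(gbuffer_mb * 4, 1))
```

So a 1080p G-buffer alone eats half the pool, and any multisampling blows through it, which is why the eDRAM route gets called cost prohibitive above.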
 