Adding 4GB of GDDR5 isn't free, and the motherboard, power supply, and cooling add something, do they not?
Well put -- power supplies, being tangible goods, are not free. They cost something more than $0, as they have since they were first created.
That was mostly based on speculation and the idea of logical progression. Sony could have been holding out for it but another solution was made available. 16 modules is going to take up a ton of space on the board and, as others have stated, will have heat and power repercussions.
I think the only thing that can be certain is that Sony has been thinking about this for a good deal of time and is just now letting everyone know of the change (devs included).
If Sony is targeting hitch-free 15-20 Mbps (roughly half of Blu-ray's maximum bitrate), then a 16GB flash drive would definitely help.
But I think the flash chip is likely used in conjunction with the ARM chip so that the HDD does not need to be powered on in low-power mode when downloading updates and the like.
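The bitrate figures above make the flash drive's usefulness easy to sanity-check. A rough sketch (the 16GB capacity and 20 Mbps figure come from the post; decimal units are an assumption for simplicity):

```python
# Rough estimate: how long a flash buffer of a given size lasts at a
# given video bitrate. Uses decimal units (1 GB = 1000 MB) for simplicity.
def buffer_minutes(capacity_gb: float, bitrate_mbps: float) -> float:
    capacity_megabits = capacity_gb * 1000 * 8  # GB -> megabits
    seconds = capacity_megabits / bitrate_mbps
    return seconds / 60

# At 20 Mbps, a 16 GB flash chip holds roughly 107 minutes of video.
print(round(buffer_minutes(16, 20)))  # 107
```

So even at the high end of that bitrate range, the flash chip could buffer well over an hour of footage without touching the HDD.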
Who buys that much at a time? You buy a few million, wait for the price to drop, then buy a few million more. Do you think Sony or MS is buying 70 million graphics cards at one time? Going by video cards it's 150 bucks, but when you buy 20 million gigs... what do you think it costs?
In the past, the biggest obstacle in late-generation development has been RAM; Sony has essentially gone for future-proofing, RAM-wise. I also feel that they hope they can squeeze a decent amount of innovation out of the RAM they are offering developers. Larger open worlds, complex AI, etc.
i-Lo said: Pertaining to CUs, can anyone outline the benefits of having 18 unified CUs as opposed to a 14+4 configuration where those general-purpose ones could have been individually used for GPU-related tasks (effectively making it 18 CUs)? Also, is this a product of customization, given I don't remember a normal Pitcairn's CUs being programmable for such diverse purposes?
I assume the benefit of 18 unified CUs is that you could technically use, for example, a 10+8 configuration (or any other combination, really) if your game required it. Essentially, the benefit is flexibility, similar to that of a unified RAM pool.
That makes a lot of sense; however, that's why I said "part" of it (that can be reserved), and recording media would be done while actively playing the game in normal power mode.
Although I'm not complaining about it, I find the choice to go with 8GB of GDDR5 to be quite perplexing. The PS4 combines what is essentially a mid-range GPU (Radeon 7850) and a slightly-less-than-mid-range CPU with more high-speed RAM than the Titan.
There's a reason why high-end PC graphics cards aren't paired with so much video RAM: modern games, even the most graphically-intensive ones, don't use anywhere near 8GB -- in most cases not even 2GB.
So either the OS and streaming/recording functions are going to require massive RAM (like 2-4GB worth), and/or developers will start using the RAM to constantly load data from the Blu-ray/storage device into RAM long before it needs to be processed by the console, so as to eliminate loading.
It's just a strange choice to pair mid-spec compute parts with absurd amounts of RAM -- like a puppy with enormous paws: the paws make sense when he grows up into a full-size dog, but in this case the puppy will never grow up because they're not going to upgrade the graphics card down the road.
Maybe they're thinking ahead to 2015 and beyond when PC graphics cards will all be carrying 8GB GDDR5 or faster and want to ensure that RAM levels are still comparable, even though the compute power of the future PC cards will be many times greater? Or maybe just looking at the past and seeing that RAM prices nosedive over time and figuring that the investment in more GDDR5 now will pay off in time.
In any case, if what Digital Foundry has reported about Durango is true, the performance gap between the PS4 and Durango is fairly massive. Would love to have been a fly on the wall at the meetings over at Microsoft this morning.
Ah, thanks. Hypothetically, can you vary the number of CUs used for graphics versus general-purpose tasks depending on task priority in a real-time game environment, instead of drawing a line in the sand from the beginning?
Certainly multitasking is part of this. Last gen, we finally saw Sony and MS push forward with their plans of a console being a media hub. This gen we're going to see a paradigm shift that places the consoles where Sony and MS always wanted them to be: a true "center of the living room" (if not house) hub. Not just for media, but for things like social, potentially DVRing, home automation, etc. They want fast switching between all of those tasks.
We will see; I'm predicting at least a $100 price difference or a six-month-later launch. It's 8 gigs per console: that's 8 million gigs for just 1 million consoles... so yeah, you will be buying millions of gigs of RAM.
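The per-console arithmetic scales linearly, which is why bulk pricing dominates the discussion. A back-of-envelope sketch (the $/GB figure is a placeholder assumption, not a real quote):

```python
# Back-of-envelope: total GDDR5 volume and cost for a console run.
# The usd_per_gb value is an illustrative assumption only.
def ram_bill(consoles: int, gb_per_console: int, usd_per_gb: float) -> float:
    return consoles * gb_per_console * usd_per_gb

# 1 million consoles at 8 GB each = 8 million GB of GDDR5.
total_gb = 1_000_000 * 8
print(total_gb)  # 8000000

# At an assumed $5/GB in volume, that's $40M per million consoles.
print(ram_bill(1_000_000, 8, 5.0))  # 40000000.0
```

At those volumes, even a $1/GB swing in contract pricing moves the bill of materials by $8 per console.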
Really, you are gonna use that excuse?! Maybe for current gen, but developers know how to work with 8GB. They will use more than half of that. Games (PC included) were only limited to that size because of the PS360. Not the case anymore, sonny.
Clamshell design.
Every generation has dealt with this. So if you want your hardware to remain viable for ports in the long run, you need a lot of memory. And I can guarantee you both Sony and MS want this gen to last at least as long as the last one. This is the only economically feasible way to do it. Actually, I'd argue it's the only way to realistically do it regardless of cost.
A top-end i5 has a far greater power and thermal footprint; not exactly an ideal candidate for such a, presumably, compact box. Plus, it'd raise the BOM another $50+.
8-core Jaguar ain't too shabby, not sure what other solutions Sony could've gone with.
PC has a bigger selection of games than the twins combined, across more genres even. Accessibility? You can't be serious.
I don't want to argue on diversity of games. Ultimately I just think PC gaming has a different diversity.
On accessibility, of course I'm serious. PC gaming is clearly less accessible. For one thing, it requires you to keep track of the specs of your computer, the type of computer you have, etc. For another, it simply requires you to tweak your settings to get it right. If I have a PlayStation, Xbox, or Nintendo, I need to know nothing about the game other than what system it was made for. I pop it in and play it. There's nothing more to it. This isn't true of PC games, and I'm not even talking about constant patching and installing multiple discs as was common in years past. I'm just talking about having to connect to Blizzard's server to play Starcraft, having to tweak a game's settings to get it to even run on my laptop, or finding out that there's no cross-play for Civ 5 between PC and Mac after I buy the game on Steam (who also wouldn't refund me). With Dark Souls on PS3, I just put the game in the machine, created a character, and played.
Look, I'm not saying PC gaming is bad. Not at all. It's cool for people who are into it. But PC gaming just isn't as accessible as console gaming, and PC gamers who try to argue otherwise are as delusional as gun owners who claim guns aren't the problem. I have a computer that cost $2,400 and I can't play Civ 5 with my friends on the other side of the country. That's an accessibility problem.
You're forgetting about flash games and Facebook games! The most accessible games available.
Yep. I think you've nailed it right there. The extra cost now is minuscule compared to the benefit in 2018 and beyond.
This is also why I don't think Sony will charge as much as some people think up front for the PS4 -- in fact I'd bet the rumored $429/529 is probably what they're leaning towards.
Oh c'mon, that's a bit broad within the definition of "PC games". But I concede the point. Also, excuse the rantiness; I'm a little (but not too) drunk.
Having the ARM chip for stand-by mode doesn't get enough praise. I love that move. An ARM chip will keep that functionality running with super low power consumption. The entire design of everything is just so elegant.
Good point.
Launching close to the next Xbox is a huge improvement over last generation, but if the PS4 is $100+ more... big problem.
Yeah, there's no way this comes out with the cheapest model at more than $450. Max $500-550 for the high-end model if they go multi-SKU (which, knowing Sony, they will). Tretton hinted that it'll be cheaper than the PS3's launch.
Right, but in the previous generation the PS3 had under 500MB of usable RAM, and it was split between the CPU and GPU. And high-end graphics cards of that time were shipping with a minimum of 512MB of VRAM (768MB in the case of the 8800GTX, which released just before the PS3), vs. the 256MB dedicated to RSX.
So in that case the PS3 had half of the video RAM of its PC contemporaries. In the case of PS4 we have 8GB of GDDR5 while high-end cards at the end of this year will probably all be in the 2-3GB range.
It's like Sony realized they had a headache, and decided to fix it by hooking up a morphine drip.
To be fair, I'm pretty sure Durango was rumored to be doing something similar before (and I sort of expected it, given the emphasis on non-gaming functionality in rumors). Panajev2001a said: Cerny method, baby.
It sounds great, but I'm worried that early hardware is going to be plagued with problems like the early 360s and PS3s were.
Launching a year later, with third-party developers having issues with the hardware, tools, and maintaining feature/IQ parity in multiplatform titles for some time, versus launching side by side, with developers happy with the tools, SDK, and hardware setup? Really?
So what are the chances of MS going for 12GB of DDR3 or more? That would still work out cheaper than jumping to GDDR5 and would give developers even more to play with.
Except that on PCs you aren't recording video using only 1GB of RAM. When I record video, albeit lossless 1080p, it takes up a decent chunk of RAM, and unless the PS4 is going to use a comparatively slow HDD, they'll end up using RAM for the video recording feature (which I love). I'd say 2GB is a decent estimate, especially given their ambitions for multitasking.
Well, I have an external recorder that can record lossless 1080p and it has only 128MB of RAM inside. Why would the PS4 need 2GB of RAM to do the same thing?
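The 128MB figure is plausible if the encoder flushes to storage continuously, since the buffer only needs to cover seconds, not minutes. A quick sketch (the 20 Mbps encoded bitrate is an assumption for illustration):

```python
# How many seconds of encoded video fit in a RAM buffer of a given size?
# Assumes the encoder streams compressed output and the buffer only
# absorbs the gap between encode rate and disk write rate.
def buffer_seconds(buffer_mb: float, bitrate_mbps: float) -> float:
    return (buffer_mb * 8) / bitrate_mbps  # MB -> megabits, then / Mbps

# A 128 MB buffer holds ~51 s of 20 Mbps encoded video.
print(buffer_seconds(128, 20))  # 51.2
```

On that math, a dedicated encoder with a small buffer is enough; a multi-gigabyte RAM reservation would only be needed if frames were held uncompressed before encoding.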
Does anyone here have an idea of how this 8-core Jaguar CPU compares with a current mid-range 4-core Intel i5 in total processing power?
It's at least 50% slower than an i5-2500K.
Maybe PS4 is the reason Nvidia and AMD aren't launching their nextgen series until winter. Sony needs to secure all the GDDR5 chips they can.
How does it compare in terms of FLOPS?
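One rough way to compare is theoretical peak single-precision throughput: cores x clock x FLOPs per cycle. The clocks and per-cycle figures below are assumptions based on public architecture details (Jaguar's two 128-bit FP pipes, Sandy Bridge's AVX units), not confirmed console specs:

```python
# Theoretical peak single-precision GFLOPS: cores * GHz * FLOPs/cycle.
# All inputs are assumptions for illustration, not confirmed specs.
def peak_gflops(cores: int, ghz: float, flops_per_cycle: int) -> float:
    return cores * ghz * flops_per_cycle

# 8 Jaguar cores at an assumed 1.6 GHz, 8 SP FLOPs/cycle per core:
print(peak_gflops(8, 1.6, 8))   # 102.4
# i5-2500K: 4 Sandy Bridge cores at 3.3 GHz, 16 SP FLOPs/cycle (AVX):
print(peak_gflops(4, 3.3, 16))  # 211.2
```

By that crude measure the i5 has roughly twice the peak FLOPS, though real-world throughput depends heavily on how well code uses the vector units.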