
Digital Foundry: PS4 Spec Analysis

onQ123

Member
I hadn't even thought of that; that's pretty insane. I don't remember that ever happening before: the Xbox 360 couldn't fit a whole original Xbox game's worth of content in its memory, the PS3 couldn't for PS2 games, etc.

And along with the memory being larger, they can make more effective use of streaming, since more assets can be evicted from memory as the scene changes. Hopefully a lot of devs do it right and bring us games with few load screens.

GameCube had 43MB of RAM & N64 cartridges were only 32MB I think.
 

KingJ2002

Member
well... it was smart for Sony to invest now... as we get closer to the era of diminishing returns and the issue of the "uncanny valley" starts to come up... the system will be prepared to handle "CG visuals" for the long term... especially with their cloud service being expandable in its feature set.

i'm sure PS5 will be a completely different beast altogether.
 

KKRT00

Member
For me that Killzone 4 build must be in an alpha stage, because it looks to me like a next-gen COD, not something GG would produce. The LOD issues are hideous and, considering the geometry, really strange; also they haven't really improved particles much from KZ3, except for resolution and art. Shadows are past-gen too, with many similarities to past KZ games.
HDR and reflections are mostly what's new here.
 
Yeah, that's why pretty much all launch consoles suffered the YLOD defect... (including my European 60GB launch console)



Absolute nonsense. The PS3 launched in November 2006 in NA and in March 2007 in Europe. Crysis launched in November 2007. I bought my gaming PC in July 2008 (which at the time was somewhat outdated and rather middle-class in terms of performance) and I have only upgraded the OS/external peripherals since.

(CPU: Intel Core 2 Quad Q9450, 2666 MHz (4x 2.66GHz), MB: Asus P5K, RAM: 4GB DDR RAM (too lazy to look up which one), GFX: nVIDIA GeForce GTX 260 896MB (Zotac))
And I can still run every modern game at 1080p maxed out (most of the time even with 16xAA/AF!) at 30-60fps if not more.

Also, you guys should be aware that at least ~2GB will be reserved for the OS. Sure, it has a little co-processor thing for the recording/streaming, but it still needs lots of RAM (especially if you want good quality videos, i.e. 720p streaming/recording).

Considering you can already do that on a PC with 1GB of RAM and a bloated Windows OS, I think the rumored 512MB for the OS is more than enough for all these functionalities.
 
You know, thinking about it, I think it's just that the diversity on consoles and PCs is different. I stand by the rest though. PC gaming is a huge pain in the ass.

I agree that for some less technologically inclined the PC can be a slight annoyance, but the view that you have to constantly search for drivers and look up errors is 5-8 years outdated. It's still not as simple as on consoles, but the way console games have headed this generation, the differences in simplicity are narrowing rapidly.

Considering you can already do that on a PC with 1GB of RAM and a bloated Windows OS, I think the rumored 512MB for the OS is more than enough for all these functionalities.

Except that on PCs you aren't recording video using only 1GB of RAM. When I record video, albeit lossless 1080p, it takes up a decent chunk of RAM, and unless the PS4 is going to use a comparatively slow HDD, they'll end up using RAM for the video recording feature (which I love). I'd say 2GB is a decent estimate, especially given their ambitions for multitasking.
 

genjiZERO

Member
I agree that for some less technologically inclined the PC can be a slight annoyance, but the view that you have to constantly search for drivers and look up errors is 5-8 years outdated. It's still not as simple as on consoles, but the way console games have headed this generation, the differences in simplicity are narrowing rapidly.

I don't really mean drivers; I mean things like having to sign in to whatever Blizzard's or whoever's server wants me to, finding out that Civilisation 5 doesn't have cross-play between Mac and PC, and having to tweak the game settings so it doesn't slow down constantly. With a console I just put the disc in the machine and press play. I can't say that I've had a single PC gaming experience that didn't require me to do something besides just playing the damn thing.
 
I don't really mean drivers; I mean things like having to sign in to whatever Blizzard's or whoever's server wants me to, finding out that Civilisation 5 doesn't have cross-play between Mac and PC, and having to tweak the game settings so it doesn't slow down constantly. With a console I just put the disc in the machine and press play. I can't say that I've had a single PC gaming experience that didn't require me to do something besides just playing the damn thing.

Yeah, PC gaming does take a couple more steps. What I'm saying is, with mandatory installs, constant updates when booting, games freezing, having to clear the cache (I had to do this recently on the 360 for FIFA 13) and more, console gaming is moving towards PC levels of complexity. I hope Sony and MS have stricter QA requirements going forward.
 

genjiZERO

Member
Yeah, PC gaming does take a couple more steps. What I'm saying is, with mandatory installs, constant updates when booting, games freezing, having to clear the cache (I had to do this recently on the 360 for FIFA 13) and more, console gaming is moving towards PC levels of complexity. I hope Sony and MS have stricter QA requirements going forward.

Amen to that. I do find the constant patching etc. to be super annoying.
 

bill0527

Member
Yeah, PC gaming does take a couple more steps. What I'm saying is, with mandatory installs, constant updates when booting, games freezing, having to clear the cache (I had to do this recently on the 360 for FIFA 13) and more, console gaming is moving towards PC levels of complexity. I hope Sony and MS have stricter QA requirements going forward.

I don't expect that's going to happen.

PS+ already has some of this covered, at least the patching part, by downloading and installing patches. I've got mine set to do it at 2am when I know I'll be in bed, so I wake up the next day, turn on the system, and get an overnight patch report showing what's been done.
 

Skyzard

Banned
Pretty disappointed in the CPU and GPU, but I guess you can't have everything. I assume affordable PCs will get significantly further ahead in about 4 or 5 years as well, so basically a repeat of the previous gen. Really wish they went for something like a top-end i5 at least. This will do... let me watch that 1080p video of Killzone.
 

spisho

Neo Member
I am comparing hardware. Crysis runs on high-end graphics cards from the end of 2006/early 2007 (think 8800 GTX), and the PS3 never really ran Crysis all that well throughout its lifetime, even after they retooled the game for consoles.

Crysis on PS3 leaves a lot of performance on the table by doing all of its rendering on the RSX. The problem with current-gen consoles is the limited memory, which is especially critical on the PS3 in order to utilize all of the processing power it has. If the PS3 had twice the memory it wouldn't have to pare back geometry, and shaders would run a great deal better utilizing the SPUs.
 
Pretty disappointed in the CPU and GPU, but I guess you can't have everything. I assume affordable PCs will get significantly further ahead in about 4 or 5 years as well, so basically a repeat of the previous gen. Really wish they went for something like a top-end i5 at least. This will do... let me watch that 1080p video of Killzone.

Top end i5 has a far greater power and thermal footprint, not exactly an ideal candidate for such a, presumably, compact box. Plus, it'd raise the BOM another $50+.

8-core Jaguar ain't too shabby, not sure what other solutions Sony could've gone with.
 

VariantX

Member
Uh, incorrect. The 7850 BLOWS away the 6870.

Consider that the 7850 is able to run Crysis 2 at high settings @ 1080p, 60fps.

As far as graphics cards go, the 7850 is pretty damn powerful. It will run virtually every game on the market at 1080p 60fps, including Crysis 3.

It's like 20% better than a 6870; I wouldn't say that blows it away or anything.
 

statham

Member
Anyone think Sony is pulling a PS3? Announce how much more powerful your system is and that it'll launch alongside the next Xbox, only to be delayed a year? I don't think so, but they have done this before.
 
Anyone think Sony is pulling a PS3? Announce how much more powerful your system is and that it'll launch alongside the next Xbox, only to be delayed a year? I don't think so, but they have done this before.

Not the same situation, I think. I remember reading that they had to change their specs quite late, so that probably was a factor. I'm not sure it'll be the same this time.
 

statham

Member
Not the same situation, I think. I remember reading that they had to change their specs quite late, so that probably was a factor. I'm not sure it'll be the same this time.
4GB to 8GB is a pretty huge deal; that's not free watts or a simple motherboard change.
 

Strike

Member
Not the same situation, I think. I remember reading that they had to change their specs quite late, so that probably was a factor. I'm not sure it'll be the same this time.
Blu-ray and Cell were factors in the PS3's delay since they were new technology and yields were low. Since the PS4 is mostly using off-the-shelf parts, they shouldn't run into this problem. 8GB of GDDR5 might be one, but I hope for their sake that they actually manage to pull it off.
 

onQ123

Member
I stopped reading when he mentioned the 1000 dollar Titan, what sensationalist tripe.

"To illustrate the extent of the achievement, Nvidia's $1000 graphics card - the GeForce Titan - offers "just" 6GB of onboard GDDR5."

What was wrong with that? It was used to get the point across that 8GB of GDDR5 is a big achievement & that even the new high-end GPUs are only just getting 6GB of GDDR5.
 

Raistlin

Post Count: 9999
Except that on PCs you aren't recording video using only 1GB of RAM. When I record video, albeit lossless 1080p, it takes up a decent chunk of RAM, and unless the PS4 is going to use a comparatively slow HDD, they'll end up using RAM for the video recording feature (which I love). I'd say 2GB is a decent estimate, especially given their ambitions for multitasking.
Even a slow HDD has no problems storing an HD stream while also reading data concurrently. That's how DVRs work.

PS4 has dedicated ICs for decoding/encoding video. It wouldn't be a jump to assume they're essentially going straight from the framebuffer to the HDD.
 

USC-fan

Banned
Yeah, HDDs have no trouble recording HDTV streams. My media center PC can record 6 HDTV shows and stream 3 HD shows to extenders, no problem at all. I'm sure it can handle more than that; no slowdowns at all.

This is on a green 5400RPM 1TB HDD. Not high performance at all.
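
For a rough sanity check (the per-stream bitrate and drive throughput below are assumed figures, not from this thread: roughly 19.4 Mbit/s per ATSC broadcast stream and ~90 MB/s sustained for a green 5400RPM drive), that workload is only a small fraction of what the drive can sustain:

```python
# Back-of-envelope check: can a "green" 5400RPM drive keep up with that DVR workload?
# Assumed figures (not from this thread): ~19.4 Mbit/s per ATSC HDTV stream,
# ~90 MB/s sustained sequential throughput for the drive.

ATSC_STREAM_MBPS = 19.4       # broadcast HDTV bitrate, Mbit/s (assumed)
DRIVE_THROUGHPUT_MBS = 90.0   # sustained drive throughput, MB/s (assumed)

recorded, streamed = 6, 3
total_mbit_s = (recorded + streamed) * ATSC_STREAM_MBPS
total_mb_s = total_mbit_s / 8          # Mbit/s -> MB/s

print(f"Aggregate DVR load: {total_mb_s:.1f} MB/s, "
      f"about {total_mb_s / DRIVE_THROUGHPUT_MBS:.0%} of the assumed drive throughput")
# -> roughly 22 MB/s, i.e. around a quarter of what even a slow drive can sustain
```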
 

daveo42

Banned
4GB to 8GB is a pretty huge deal; that's not free watts or a simple motherboard change.

They've either been debating this for quite some time (well before the latest dev kits went out) or Sony was waiting on confirmation of bigger modules (1GB).

Safe to assume it'll be 2.5D?
 

statham

Member
4GB is at least $50 of GDDR5 (the 7850 2GB vs 1GB is a $35 difference), and I would imagine the 4GB they already have is another $10-15 bucks. Let's just say the touch controller is $5 more and the graphics card is $15 more. We are looking at a console that costs at least $100 more, from a company that can't afford it.
They both will have cameras, so that's a wash. I really bet we see a $299 Xbox and a $499 PS4,
or the PS4 is not launching this year. Or it'll be highly limited.
 

i-Lo

Member
They've either been debating this for quite some time (well before the latest dev kits went out) or Sony was waiting on confirmation of bigger modules (1GB).

Safe to assume it'll be 2.5D?

I recently heard that 512MB modules only became available this January. Are they really expecting double the density already?
 
Even a slow HDD has no problems storing an HD stream while also reading data concurrently. That's how DVRs work.

PS4 has dedicated ICs for decoding/encoding video. It wouldn't be a jump to assume they're essentially going straight from the framebuffer to the HDD.

Good point, forgot about the dedicated encoder. I was working with lossless 1080p60 numbers, which can be up to 3.5GB per minute, something 7200RPM drives still struggle with for long periods. The PS4 will be storing the compressed output from the dedicated encoder, I imagine at a decent compression level. Probably 200MB per minute, max.
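
For what it's worth, a quick back-of-envelope sketch lands in the same ballpark as those numbers. The figures it assumes (24-bit RGB frames, a ~6:1 lossless compression ratio, and a ~25 Mbit/s hardware-encoder bitrate) are guesses, not anything confirmed:

```python
# Rough numbers behind the lossless-vs-compressed comparison above.
# Assumed figures (not confirmed anywhere): 24-bit RGB frames, ~6:1 lossless
# compression, ~25 Mbit/s for the hardware encoder's output.

width, height, bytes_per_pixel, fps = 1920, 1080, 3, 60

raw_per_min = width * height * bytes_per_pixel * fps * 60        # bytes per minute, uncompressed
print(f"Raw 1080p60:       {raw_per_min / 1e9:.1f} GB/min")       # ~22.4 GB/min

lossless_ratio = 6                                               # assumed compression ratio
print(f"Lossless (~6:1):   {raw_per_min / lossless_ratio / 1e9:.1f} GB/min")  # ~3.7 GB/min

encoder_mbps = 25                                                # assumed encoder bitrate, Mbit/s
encoded_per_min = encoder_mbps / 8 * 1e6 * 60                    # bytes per minute
print(f"Hardware encoder:  {encoded_per_min / 1e6:.0f} MB/min")   # ~190 MB/min
```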
 
4GB is at least $50 of GDDR5 (the 7850 2GB vs 1GB is a $35 difference), and I would imagine the 4GB they already have is another $10-15 bucks. Let's just say the touch controller is $5 more and the graphics card is $15 more. We are looking at a console that costs at least $100 more, from a company that can't afford it.
They both will have cameras, so that's a wash. I really bet we see a $299 Xbox and a $499 PS4,
or the PS4 is not launching this year. Or it'll be highly limited.

You would think that after being so wrong in the past, some people would finally stop the armchair analysis and concede that:

a) no one has any idea about the return on investment or the research and development that goes into any of this, and

b) pretending that you do is pointless.

But here we are.
 

KidBeta

Junior Member
Good point, forgot about the dedicated encoder. I was working with lossless 1080p60 numbers, which can be up to 3.5GB per minute, something 7200RPM drives still struggle with for long periods. The PS4 will be storing the compressed output from the dedicated encoder, I imagine at a decent compression level. Probably 200MB per minute, max.

You can easily get away with 10Mbit/s for 1080p.

The entire 15-minute stream ends up being 1125MB or something along those lines.

EDIT: Which is 75MB per minute.
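
The arithmetic checks out:

```python
# Sanity check of the figures above: a 15-minute buffer at 10 Mbit/s.
bitrate_mbps = 10                            # Mbit/s
minutes = 15

total_mb = bitrate_mbps / 8 * 60 * minutes   # MB for the whole buffer
print(f"{minutes} min at {bitrate_mbps} Mbit/s = {total_mb:.0f} MB total, "
      f"{total_mb / minutes:.0f} MB per minute")
# -> 1125 MB total, 75 MB per minute
```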
 

Zinthar

Member
Although I'm not complaining about it, I find the choice to go with 8GB of GDDR5 quite perplexing. The PS4 combines what is essentially a mid-range GPU (Radeon 7850) and a slightly less than mid-range CPU with more high-speed RAM than the Titan.

There's a reason why high-end PC graphics cards aren't paired with so much video RAM: modern games, even the most graphically-intensive ones, don't use anywhere near 8GB -- in most cases not even 2GB.

So either the OS and streaming/recording functions are going to require massive RAM (like 2-4GB worth), and/or developers will start using the RAM to constantly load data from the blu-ray/storage device into RAM long before it needs to be processed by the console so as to eliminate loading.

It's just a strange choice to pair mid-spec compute parts with absurd amounts of RAM -- like a puppy with enormous paws: the paws make sense when he grows up into a full-size dog, but in this case the puppy will never grow up because they're not going to upgrade the graphics card down the road.

Maybe they're thinking ahead to 2015 and beyond when PC graphics cards will all be carrying 8GB GDDR5 or faster and want to ensure that RAM levels are still comparable, even though the compute power of the future PC cards will be many times greater? Or maybe just looking at the past and seeing that RAM prices nosedive over time and figuring that the investment in more GDDR5 now will pay off in time.

In any case, if what Digital Foundry has reported about Durango is true, the performance gap between the PS4 and Durango is fairly massive. Would love to have been a fly on the wall at the meetings over at Microsoft this morning.
 

statham

Member
You would think that after being so wrong in the past, some people would finally stop the armchair analysis and concede that:

a) no one has any idea about the return on investment or the research and development that goes into any of this, and

b) pretending that you do is pointless.

But here we are.
OK. What are your costs?
 

i-Lo

Member
Good point, forgot about the dedicated encoder. I was working with lossless 1080p60 numbers, which can be up to 3.5GB per minute, something 7200RPM drives still struggle with for long periods. The PS4 will be storing the compressed output from the dedicated encoder, I imagine at a decent compression level. Probably 200MB per minute, max.

Let me ask: could the rumoured 16GB flash chip play a part here?
 

Fafalada

Fafracer forever
RibbedHero said:
but the view that you have to constantly search for drivers and look up errors is 5-8 years outdated.
It's gotten better, primarily thanks to the market being console-driven and the PC getting ports that had the benefit of stricter QA.
But it's not an issue that has disappeared: if you only care about games being "playable" (regardless of performance/looks), you'll run into it much less, but it does still happen.
If you're an enthusiast (e.g. you tweak settings to play games at a specific framerate and/or quality), the need for constant tweaking increases exponentially.
 

daveo42

Banned
I recently heard that 512MB modules only became available this January. Are they really expecting double the density already?

That was mostly based on speculation and the idea of logical progression. Sony could have been holding out for it, but another solution was made available. 16 modules are going to take up a ton of space on the board and, as others have stated, will have heat and power repercussions.

I think the only thing that can be certain is that Sony has been thinking about this for a good deal of time and is just now letting everyone know of the change (devs included).
 

Raistlin

Post Count: 9999
You can easily get away with 10Mbit/s for 1080p.

The entire 15-minute stream ends up being 1125MB or something along those lines.

EDIT: Which is 75MB per minute.
Vudu HDX's nominal bitrate isn't even 10Mbit... it's something like 8Mbit. Granted, that's at 24p, so 30p would probably bump it up to around 10.

Again though, that's Vudu HDX. I'd be shocked if Sony is bothering with that sort of quality. This is meant for sharing on social sites, so if anything, it would be closer to Netflix bitrates. Actually, now that I think about it, do we even know if it's 1080p? 720p is plenty for the use case.


What would be cool is if Sony allows you to configure it. However if it is going straight to HDD, they'd probably want to keep things more static in order for devs to have a better feel for available bandwidth for caching/loading while it's going on.
 

Zinthar

Member
4GB is at least $50 of GDDR5 (the 7850 2GB vs 1GB is a $35 difference), and I would imagine the 4GB they already have is another $10-15 bucks. Let's just say the touch controller is $5 more and the graphics card is $15 more. We are looking at a console that costs at least $100 more, from a company that can't afford it.
They both will have cameras, so that's a wash. I really bet we see a $299 Xbox and a $499 PS4,
or the PS4 is not launching this year. Or it'll be highly limited.

They can easily afford to take a small to moderate loss on the hardware initially, even assuming your random guessing for the hardware costs is correct (hint: you're incredibly wrong; the APU they're using is not all that pricey).

Early adopters buy a large number of games and accessories, and that's factored into the initial price, as is the negative PR associated with launching a console at $599.

Additionally, the PS3 was so expensive because Cell had poor yields and was expensive to produce, and it packed a first-gen Blu-ray drive. The APU of the PS4 will cost nowhere near as much as Cell + RSX, nor will the Blu-ray drive or hard drive.
 
Although I'm not complaining about it, I find the choice to go with 8GB of GDDR5 quite perplexing. The PS4 combines what is essentially a mid-range GPU (Radeon 7850) and a slightly less than mid-range CPU with more high-speed RAM than the Titan.

There's a reason why high-end PC graphics cards aren't paired with so much video RAM: modern games, even the most graphically-intensive ones, don't use anywhere near 8GB -- in most cases not even 2GB.

So either the OS and streaming/recording functions are going to require massive RAM (like 2-4GB worth), and/or developers will start using the RAM to constantly load data from the blu-ray/storage device into RAM long before it needs to be processed by the console so as to eliminate loading.

It's just a strange choice to pair mid-spec compute parts with absurd amounts of RAM -- like a puppy with enormous paws: the paws make sense when he grows up into a full-size dog, but in this case the puppy will never grow up because they're not going to upgrade the graphics card down the road.

Maybe they're thinking ahead to 2015 and beyond when PC graphics cards will all be carrying 8GB GDDR5 or faster and want to ensure that RAM levels are still comparable, even though the compute power of the future PC cards will be many times greater? Or maybe just looking at the past and seeing that RAM prices nosedive over time and figuring that the investment in more GDDR5 now will pay off in time.

In any case, if what Digital Foundry has reported about Durango is true, the performance gap between the PS4 and Durango is fairly massive. Would love to have been a fly on the wall at the meetings over at Microsoft this morning.

In the past the biggest obstacle in late-generation development has been RAM, so Sony has essentially gone for future-proofing, RAM-wise. I also feel they hope they can squeeze a decent amount of innovation out of the RAM they are offering developers: larger open worlds, more complex AI, etc.

Let me ask: could the rumoured 16GB flash chip play a part here?

If Sony is targeting hitch-free 15-20Mbps (roughly half of Blu-ray's maximum bitrate) then the 16GB flash chip would definitely help.

But I think the flash chip is more likely used in conjunction with the ARM chip, so that the HDD does not need to be powered on in low-power mode when downloading updates and the like.
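
As a rough sketch (treating 16GB as 16 billion bytes and using the bitrates guessed at above), a chip that size would hold a fair amount of footage:

```python
# How much footage fits on the rumoured 16GB flash chip at those bitrates?
# (16GB treated as 16e9 bytes; the bitrates are the guesses from the post above.)
flash_bytes = 16e9

for mbps in (15, 20):
    seconds = flash_bytes * 8 / (mbps * 1e6)
    print(f"{mbps} Mbit/s: about {seconds / 60:.0f} minutes of video")
# -> roughly 142 minutes at 15 Mbit/s, 107 minutes at 20 Mbit/s
```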
 

DasDamen

Member
GameCube had 43MB of ram & N64 cartridges was only 32MB I think.

The Sega Saturn had 5MB of RAM (expandable to 9MB) and the largest Genesis game was 5MB (let's forget about Sega CD games for the moment).

Also, the N64 had 4MB of RAM (expandable to 8MB) and the largest SNES games were 6MB.

BTW, Resident Evil 2 for N64 was 64MB.
 