PS4 has 8 GB of GDDR5 RAM

It won't matter. If Microsoft decides to go with less, then all developers will go with less. That would suck for Sony... again

At PS4's meeting, they specifically mentioned ease of porting between PC and PS4. There is a graphical difference between the PC versions of, say, BF3, The Witcher, Oblivion and their console counterparts. So why would PC devs take detail away from their PS4 ports?
Perhaps you are talking about console multiplatforms that are mainly built on Xbox 3 then ported. But even that is not indicative of how multiplats would run on PS4/Xbox 3.

PS3 and 360 had the same amount of memory, yet 360 has/had better ports due to its memory being unified. Now, PS4 has more memory than Xbox 3 AND is much faster. If memory design was a part of the reason why 360 has better looking ports, why can't the same work for PS4 as well?
 
It won't matter. If Microsoft decides to go with less, then all developers will go with less. That would suck for Sony... again

So you are telling me that a game on GDDR5 would suck because a developer "decides" to go with GDDR3? How does that make sense? There will always be an advantage even if it is simply faster loading times for PS4.

A nerd correct me if I'm wrong.
 
So you are telling me that a game on GDDR5 would suck because a developer "decides" to go with GDDR3? How does that make sense? There will always be an advantage even if it is simply faster loading times for PS4.

A nerd correct me if I'm wrong.
It's called designing the game for the lowest common denominator.
 
Won't 720 have more powerful specs than Wii U?
Sorry to break this to you, but Wii U will only be included in multiplatform games that are cross-gen and will be grouped in with the PS3 and 360 in that it'll have the gimped version.

With the more ambitious games in the next gen, if there's a Wii U game coming out of an IP at the same time as PS4/Durango it'll most likely be a separate spin-off game like how Call of Duty was handled this gen. Either that or it'll be seriously gimped like GRAW was for PS2 or Quantum of Solace for Wii.

Wii was the lowest common denominator this gen too, but games were still made for PS3/360, and the spec that developers coded for was a lowest common denominator of those two, taking into account the shortcomings of both the 360 and the PS3.
 
At PS4's meeting, they specifically mentioned ease of porting between PC and PS4. There is a graphical difference between the PC versions of, say, BF3, The Witcher, Oblivion and their console counterparts. So why would PC devs take detail away from their PS4 ports?
Perhaps you are talking about console multiplatforms that are mainly built on Xbox 3 then ported. But even that is not indicative of how multiplats would run on PS4/Xbox 3.

PS3 and 360 had the same amount of memory, yet 360 has/had better ports due to its memory being unified. Now, PS4 has more memory than Xbox 3 AND is much faster. If memory design was a part of the reason why 360 has better looking ports, why can't the same work for PS4 as well?

It's amazing how Durango, since the first specs leaked, has always been rumored to have 8GB of DDR3. Sony announces they'll have 8GB of GDDR5 and suddenly people forget that Durango was always rumored to have 8GB.
 
It's amazing how Durango, since the first specs leaked, has always been rumored to have 8GB of DDR3. Sony announces they'll have 8GB of GDDR5 and suddenly people forget that Durango was always rumored to have 8GB.

I think it's the fact that it's so much faster, or something.

What I'm wondering is if everyone forgot that these systems will have more complex OSes. The Share button on the PS4, for example, takes the last 15 seconds of gameplay and throws it on the internet. So uh, where do you suppose that 15 seconds comes from? Thin air? Or a big chunk out of the RAM? Unless we've got cores specifically for doing that, and well designed to do so, the PS4 likely won't be able to hyper-compress it to end-user compression levels before sticking it all in RAM, so that's likely a sizeable chunk of the RAM right there, not to mention doubling as a Gaikai server, and all these other things.
Yet we're gonna have "8GB of textures in dat Killzone". Come on guys, y'all know better than this. Games will get FAR less than 8GB. I'd say 4 is a good number to expect for each game to have, just for itself. Will it be higher? Yeah, probably, but if we look at 4 then we won't be disappointed.
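
For scale, a rough back-of-the-envelope (assuming 32-bit 1080p frames at 30FPS; purely my own numbers): even 15 seconds of footage held raw would be a serious chunk of memory, which is the heart of that worry unless dedicated hardware compresses it on the fly.

```python
# How much RAM 15 seconds of *uncompressed* 1080p gameplay would need.
# Frame format (32-bit RGBA) is an assumption; the point is the order of magnitude.
frame_mb = 1920 * 1080 * 4 / 1_000_000     # ~8.3 MB per frame
raw_15s_gb = frame_mb * 30 * 15 / 1_000    # 30 FPS for 15 seconds
print(f"~{raw_15s_gb:.1f} GB raw")         # ~3.7 GB if nothing compresses it first
```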
 
I think it's the fact that it's so much faster, or something.

What I'm wondering is if everyone forgot that these systems will have more complex OSes. The Share button on the PS4, for example, takes the last 15 seconds of gameplay and throws it on the internet. So uh, where do you suppose that 15 seconds comes from? Thin air? Or a big chunk out of the RAM? Unless we've got cores specifically for doing that, and well designed to do so, the PS4 likely won't be able to hyper-compress it to end-user compression levels before sticking it all in RAM, so that's likely a sizeable chunk of the RAM right there, not to mention doubling as a Gaikai server, and all these other things.
Yet we're gonna have "8GB of textures in dat Killzone". Come on guys, y'all know better than this. Games will get FAR less than 8GB. I'd say 4 is a good number to expect for each game to have, just for itself. Will it be higher? Yeah, probably, but if we look at 4 then we won't be disappointed.

The PS4 will be using the dedicated encoding chip to encode the video in real time; it will probably end up being 300-400MB if they are aiming for YouTube quality, or 600-800MB if they are aiming higher. So I reckon a 1.5GB OS.
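
Rough math behind those figures (a sketch with assumed bitrates, nothing Sony has confirmed): an encoded recording is just bitrate times duration, so a 15-minute buffer lands in the few-hundred-MB range.

```python
# Back-of-the-envelope buffer sizes for a 15-minute h.264 recording.
# The bitrates below are assumptions (typical streaming quality), not official figures.
def buffer_size_mb(bitrate_mbps: float, minutes: float = 15) -> float:
    """Size in MB of an encoded stream at the given bitrate for the given duration."""
    bits = bitrate_mbps * 1_000_000 * minutes * 60
    return bits / 8 / 1_000_000  # bits -> bytes -> MB

for label, mbps in [("YouTube-ish quality", 3.5), ("higher-quality 1080p", 7.0)]:
    print(f"{label}: ~{buffer_size_mb(mbps):.0f} MB for 15 minutes")
# YouTube-ish quality: ~394 MB, higher-quality 1080p: ~788 MB -- right in the range above.
```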
 
The PS4 will be using the dedicated encoding chip to encode the video in real time; it will probably end up being 300-400MB if they are aiming for YouTube quality, or 600-800MB if they are aiming higher. So I reckon a 1.5GB OS.

I wonder if you'll also be able to just save it to the HDD.
 
I think it's the fact that it's so much faster, or something.

What I'm wondering is if everyone forgot that these systems will have more complex OSes. The Share button on the PS4, for example, takes the last 15 seconds of gameplay and throws it on the internet. So uh, where do you suppose that 15 seconds comes from? Thin air? Or a big chunk out of the RAM? Unless we've got cores specifically for doing that, and well designed to do so, the PS4 likely won't be able to hyper-compress it to end-user compression levels before sticking it all in RAM, so that's likely a sizeable chunk of the RAM right there, not to mention doubling as a Gaikai server, and all these other things.
Yet we're gonna have "8GB of textures in dat Killzone". Come on guys, y'all know better than this. Games will get FAR less than 8GB. I'd say 4 is a good number to expect for each game to have, just for itself. Will it be higher? Yeah, probably, but if we look at 4 then we won't be disappointed.

Didn't watch the PS4 conference, I take it, or read any analysis afterwards? They specifically cited a companion chip in the system to do background processes, including managing downloads and compressing video for the Share function. I'd imagine they could stream that video to the hard drive rather than have it take up room in RAM; after all, that's what happens when most people compress video on their PCs.
 
The PS4 will be using the dedicated encoding chip to encode the video in real time; it will probably end up being 300-400MB if they are aiming for YouTube quality, or 600-800MB if they are aiming higher. So I reckon a 1.5GB OS.

I don't know a HUGE amount about video encoding, but doesn't it have to take the raw, 1920x1080 image, in basically PNG quality, and then downsample/encode it to be sent to YouTube? Sending the video might only take 600MB, but I'd imagine from start to end, getting to that point would be a bit larger.
 
I don't know a HUGE amount about video encoding, but doesn't it have to take the raw, 1920x1080 image, in basically PNG quality, and then downsample/encode it to be sent to YouTube? Sending the video might only take 600MB, but I'd imagine from start to end, getting to that point would be a bit larger.

They would be encoding it into h.264 straight into the RAM; they wouldn't be storing the raw frames, they're too big.
 
So you're telling me that it can take Frame A, B, and C, figure out between them where the image similarities are to compress them in terms of what-to-change data (don't know the technical term), and then compress between the pixels, all without those frames ever having to go to RAM? You can't operate on an HDD like RAM, at least not at the speed that Gaikai needs to operate on if someone is going to be playing your game. I know it has a companion chip -- but I suspect that said chip will be operating under the same RAM that the rest of the system will be. It's not a companion system-on-a-chip. This would be like an integrated video card using up some of your system RAM.

You wouldn't use the HDD because it's way too god damn slow for this sort of thing. Gaikai's tech is specifically designed to be lightning fast, and for that to happen it can't be massively bottlenecked by the HDD being used as RAM. Honestly? You ask me? This is why they went for GDDR5 RAM. Not for games (though that benefits), but for Gaikai.
 
So you're telling me that it can take Frame A, B, and C, figure out between them where the image similarities are to compress them in terms of what-to-change data (don't know the technical term), and then compress between the pixels, all without those frames ever having to go to RAM? You can't operate on an HDD like RAM, at least not at the speed that Gaikai needs to operate on if someone is going to be playing your game. I know it has a companion chip -- but I suspect that said chip will be operating under the same RAM that the rest of the system will be. It's not a companion system-on-a-chip. This would be like an integrated video card using up some of your system RAM.

It'll be compressing it on the fly, into the RAM, yeah. It won't be working on the HDD (I thought of that too) because it will cause thrashing with the constant data flow.
 
It'll be compressing it on the fly, into the RAM, yeah. It won't be working on the HDD (I thought of that too) because it will cause thrashing with the constant data flow.

And the compression uses 0 RAM, is what you are telling me. Am I missing some huge computer science revolution where a chip is able to take input and output without having anything to give to it? Framebuffer grabs the frame, sends it to GaiChip, but unless the GaiChip has some massive 4MB bus to it, wouldn't it need the RAM to hold it in between cycles of the GaiChip processing the frames?

I'm not saying I know exactly what I'm talking about here, but I'm pretty sure that the GaiChip is going to have to pull from something, and that said something is probably RAM. The frames can't be grabbed pre-compressed, so the frames will take up space, not to mention audio and whatever the hell else I'm forgetting.
 
I don't want to open another RAM thread for this question.
Kotaku was the first to rumor 8GB of GDDR5 RAM, but they also mentioned something like 2.2GB of VRAM. What is it? Is it extra, or does it come out of those 8GB? What's its role? (if true)
 
And the compression uses 0 RAM, is what you are telling me. Am I missing some huge computer science revolution where a chip is able to take input and output without having anything to give to it? Framebuffer grabs the frame, sends it to GaiChip, but unless the GaiChip has some massive 4MB bus to it, wouldn't it need the RAM to hold it in between cycles of the GaiChip processing the frames?

I'm not saying I know exactly what I'm talking about here, but I'm pretty sure that the GaiChip is going to have to pull from something, and that said something is probably RAM. The frames can't be grabbed pre-compressed, so the frames will take up space, not to mention audio and whatever the hell else I'm forgetting.

The framebuffer is already in RAM, why would you need to duplicate the same frame elsewhere just for the compression algorithm?
 
And the compression uses 0 RAM, is what you are telling me. Am I missing some huge computer science revolution where a chip is able to take input and output without having anything to give to it? Framebuffer grabs the frame, sends it to GaiChip, but unless the GaiChip has some massive 4MB bus to it, wouldn't it need the RAM to hold it in between cycles of the GaiChip processing the frames?

I'm not saying I know exactly what I'm talking about here, but I'm pretty sure that the GaiChip is going to have to pull from something, and that said something is probably RAM. The frames can't be grabbed pre-compressed, so the frames will take up space, not to mention audio and whatever the hell else I'm forgetting.

I would suspect the compressor would be pulling data from whatever part of the system pushes the final frame to the TV. Probably somewhere near this

http://www.vgleaks.com/orbis-displayscanout-engine-dce/

But I could be wrong, it's a shot in the dark. Either way, a single 1080p frame isn't too big (it's about 8MB). But that's per frame; at 30FPS, for 15 minutes, it gets pretty big.
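
To put a number on "pretty big" (a rough sketch, assuming 32-bit frames; the 8MB-per-frame figure above checks out):

```python
# Why you can't just keep raw frames around: uncompressed 1080p adds up fast.
WIDTH, HEIGHT, BYTES_PER_PIXEL = 1920, 1080, 4   # assuming 32-bit RGBA
FPS, MINUTES = 30, 15

frame_mb = WIDTH * HEIGHT * BYTES_PER_PIXEL / 1_000_000   # ~8.3 MB per frame
total_gb = frame_mb * FPS * MINUTES * 60 / 1_000           # every frame, uncompressed

print(f"one frame: ~{frame_mb:.1f} MB")    # ~8.3 MB, matching the figure above
print(f"15 min raw: ~{total_gb:.0f} GB")   # ~224 GB -- hence encoding to h.264 on the fly
```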
 
And the compression uses 0 RAM, is what you are telling me. Am I missing some huge computer science revolution where a chip is able to take input and output without having anything to give to it? Framebuffer grabs the frame, sends it to GaiChip, but unless the GaiChip has some massive 4MB bus to it, wouldn't it need the RAM to hold it in between cycles of the GaiChip processing the frames?

I'm not saying I know exactly what I'm talking about here, but I'm pretty sure that the GaiChip is going to have to pull from something, and that said something is probably RAM. The frames can't be grabbed pre-compressed, so the frames will take up space, not to mention audio and whatever the hell else I'm forgetting.

The video compressor chips AMD has in their latest GPUs just read the framebuffer directly. Since you need a framebuffer it doesn't have to duplicate anything. Any working memory it needs is built into the chip. So it inputs the raw framebuffer from memory, and outputs a compressed h.264 stream to a small buffer in main memory that is periodically written out to the hard drive. This buffer wouldn't be large and would be accounted for in the 512MB OS reservation.
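
A minimal sketch of that "small buffer, periodically written out" pattern (the buffer size and bitrate below are illustrative assumptions, not figures from any leak):

```python
# Toy model of a hardware encoder writing its compressed stream into a small RAM
# buffer that gets flushed to the HDD whenever it fills. Numbers are illustrative.
ENCODED_BITRATE_MBPS = 7.0   # assumed h.264 output rate
BUFFER_MB = 32               # assumed staging buffer inside the OS reservation

seconds_per_flush = BUFFER_MB * 8 / ENCODED_BITRATE_MBPS
print(f"a {BUFFER_MB} MB buffer holds ~{seconds_per_flush:.0f} s of video before a flush")
# ~37 s per flush: the RAM cost stays fixed at the buffer size, while the full
# 15-minute recording only ever lives on the hard drive.
```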

I don't want to open another RAM thread for this question.
Kotaku was the first to rumor 8GB of GDDR5 RAM, but they also mentioned something like 2.2GB of VRAM. What is it? Is it extra, or does it come out of those 8GB? What's its role? (if true)

They were describing a development kit that, at the time, was basically a PC with 8GB of DDR3 main memory and a video card with 2.2GB of memory. This was from before final silicon of the Orbis design, with its unified 8GB of GDDR5, was available.
 
That's what made sense to me... Why use RAM when you have more HDD space for a non-critical task?

Oh, I meant instead of uploading to Ustream or Facebook, just save the file directly to the HDD or a USB stick or something. People could then take those files and put some proper edits together.

But yeah, I do wonder where it's going to store the video while it's recording it. I think it was rumoured to record the last 15 minutes, constantly dropping the oldest footage to make room for new data.
 
15 minutes? Man I heard that wrong, I thought it was 15 seconds. O_x
Eh, like I said I don't know -everything-, it just makes a lot of sense to me that the jump to 8GB GDDR5 would be largely influenced by the Gaikai based features. I'm expecting anywhere from 4-6GB for the actual games, depending on whatever. None of this 8 or 7 just for the game stuff.
 
15 minutes? Man I heard that wrong, I thought it was 15 seconds. O_x
Eh, like I said I don't know -everything-, it just makes a lot of sense to me that the jump to 8GB GDDR5 would be largely influenced by the Gaikai based features. I'm expecting anywhere from 4-6GB for the actual games, depending on whatever. None of this 8 or 7 just for the game stuff.

The 15-minute rumour was from Edge, who have been very accurate all around.

As for the reason for the RAM, I don't know that the kind of things Gaikai is supposed to be bringing to the table are going to be that RAM-intensive.
 
15 minutes? Man I heard that wrong, I thought it was 15 seconds. O_x
Eh, like I said I don't know -everything-, it just makes a lot of sense to me that the jump to 8GB GDDR5 would be largely influenced by the Gaikai based features. I'm expecting anywhere from 4-6GB for the actual games, depending on whatever. None of this 8 or 7 just for the game stuff.

The whole point of Gaikai is it can run on anything that can stream h.264 video. A phone, tablet, Roku, laptop, PC, console, whatever. It needs almost no RAM. Even the video recording needs almost no RAM because the videos will be saved to the built-in hard drive as they record.

The jump to 8GB is completely about making bigger, better games.
 
wink-point-kaz.gif

Haha, I just realized everyone is a Blu-Ray head, and the Upper class PC fellow is observing all this.
 
What I'm wondering is if everyone forgot that these systems will have more complex OSes. The Share button on the PS4, for example, takes the last 15 seconds of gameplay and throws it on the internet. So uh, where do you suppose that 15 seconds comes from? Thin air? Or a big chunk out of the RAM? Unless we've got cores specifically for doing that, and well designed to do so, the PS4 likely won't be able to hyper-compress it to end-user compression levels before sticking it all in RAM, so that's likely a sizeable chunk of the RAM right there, not to mention doubling as a Gaikai server, and all these other things.

Yet we're gonna have "8GB of textures in dat Killzone". Come on guys, y'all know better than this. Games will get FAR less than 8GB. I'd say 4 is a good number to expect for each game to have, just for itself. Will it be higher? Yeah, probably, but if we look at 4 then we won't be disappointed.

No way encoding takes that much memory. Sony's NEX-5 camera encodes AVCHD 1920x1080 at 17Mbps. That thing's running off a small 1080mAh battery and, IIRC, a 32MB Mobile DDR SDRAM chip.

4GB is not a good number to expect at all. So before the upgrade, developers could count on having 0GB-2GB for games?
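
On the encoder-memory point above: a hardware h.264 encoder's working set is only a few reference frames, which is why a camera can manage it off a 32MB chip. A rough sketch, with the frame format and reference count as assumptions:

```python
# A hardware h.264 encoder holds a handful of YUV reference frames, not the recording.
WIDTH, HEIGHT = 1920, 1080
YUV420_BYTES_PER_PIXEL = 1.5   # 4:2:0 chroma subsampling (assumed)
REFERENCE_FRAMES = 4           # small reference window (assumed)

frame_mb = WIDTH * HEIGHT * YUV420_BYTES_PER_PIXEL / 1_000_000   # ~3.1 MB per frame
print(f"working set: ~{frame_mb * REFERENCE_FRAMES:.0f} MB")     # ~12 MB -- fits a 32MB chip
```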
 
The 15-minute rumour was from Edge, who have been very accurate all around.

As for the reason for the RAM, I don't know that the kind of things Gaikai is supposed to be bringing to the table are going to be that RAM-intensive.
Ustream's CEO confirmed the 15 minute recording buffer.

Co-founder and CEO Brad Hunstable says that your gameplay footage—up to fifteen minutes’ worth according to Sony—can be captured and shared on your social networks the easy way: Instead of recording your gameplay, uploading it to Ustream, then sharing it from Ustream to Facebook or Twitter, “you can do all that directly within the console. The goal of this is to make it simple and easy for people to record and live stream to their friends, wherever they may be watching.”

But is it as simple as pressing a button? Specifically, the new Share button found on upcoming PS4 DualShock controllers? Hunstable neatly sidesteps the question. “When it comes to questions about anatomy, I’ll defer to Sony to communicate those.”

So then is Durango all but confirmed to have 8GB too now? Looking back at this old rumor thread I see that it was already pegged at 8 gigs, so I'm guessing it was known around the industry that both platforms would have 8GB RAM.
Pretty much confirmed, considering the rumor came from leaked documents/devkits SuperDaE got his hands on. With him getting PS4 stuff right that no one had posted before (the Share button), and the recent raid on his house initiated by MS, there seems to be some truth behind his claims.

As for the RAM bump for the PS4 being known in the industry, I think it was a very recent move. Up until the start of the month everyone was saying 4GB, except for EDGE.
 
So you've accounted for pretty much all launch consoles somehow? Yea ok.

My launch 60GB is running just fine still.

Same here. Never had any issues with my launch PS3 (my friend's PS3 never YLODed either, for what it's worth...), although to be fair, I'm not a very heavy user.
 
It'll be compressing it on the fly, into the RAM, yeah. It won't be working on the HDD (I thought of that too) because it will cause thrashing with the constant data flow.

I highly doubt it. I mean, DVRs have been recording straight to HDDs for years without any problem.
 
It's amazing how Durango, since the first specs leaked, has always been rumored to have 8GB of DDR3. Sony announces they'll have 8GB of GDDR5 and suddenly people forget that Durango was always rumored to have 8GB.

I don't think anyone ever forgot. Initially there was concern that Sony would be blown away if they only had 2GB of RAM (very early rumours). Then the bump to 4GB put us in a 'no man's land' of trying to dissect 4GB fast vs 8GB fast, plus whether less RAM hindered the ability of the PS4 to actually make use of its more powerful (on paper) GPU.


TBH, now they both seem fairly level in terms of RAM. If Xbox is really 102GB/s then that is very fast for DDR, and adding in the ESRAM should give similar levels of performance. That is reassuring, more than 'omg sony rox'.
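
For context on those bandwidth figures: peak bandwidth is just effective transfer rate times bus width. The configurations below are illustrative guesses, not confirmed console specs.

```python
# Peak memory bandwidth = effective transfer rate x bus width (in bytes).
def peak_bandwidth_gbs(mt_per_s: float, bus_bits: int) -> float:
    return mt_per_s * 1_000_000 * (bus_bits / 8) / 1_000_000_000

for label, rate, bus in [("DDR3-2133, 256-bit (assumed)", 2133, 256),
                         ("GDDR5 @ 5.5 GT/s, 256-bit (assumed)", 5500, 256)]:
    print(f"{label}: ~{peak_bandwidth_gbs(rate, bus):.0f} GB/s")
# ~68 GB/s vs ~176 GB/s -- which is why a rumored 102GB/s figure for DDR would stand out.
```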
 
I don't think anyone ever forgot. Initially there was concern that Sony would be blown away if they only had 2GB of RAM (very early rumours). Then the bump to 4GB put us in a 'no man's land' of trying to dissect 4GB fast vs 8GB fast, plus whether less RAM hindered the ability of the PS4 to actually make use of its more powerful (on paper) GPU.


TBH, now they both seem fairly level in terms of RAM. If Xbox is really 102GB/s then that is very fast for DDR, and adding in the ESRAM should give similar levels of performance. That is reassuring, more than 'omg sony rox'.

Yeah, this is the first time consoles will not be limited by RAM. It's gonna be really interesting to see what very talented developers like Naughty Dog and Polyphony Digital will be able to do with all this RAM.
 
This might be a stupid question but it's been bugging me. It's about live streaming on the PS4 via Ustream. You don't need a PS4 to be able to watch a person playing and streaming from their PS4, is that correct? So you can watch on a PC, tablet, or anything with the internet as long as you can get to Ustream's website, is that correct? Just making sure, because I'm actually pretty excited about this feature (I watch people playing video games a bit too much, as you can guess).
 