PS4 RAM vs. Durango RAM: How big of a difference does it make?

I see that you jump to conclusions easily. bgassassin is the only person saying that the Jaguar cores were beefier in Durango, and he's also saying a bunch of other negative things. You can't just cherry-pick the information to push your agenda; take it all or leave it.

As it stands right now, Edge has the only article that got it 100% right. And they're saying the same negative things (minus the CPU upgrade) about the Xbox that bgassassin is saying. At this point, if MS announces that second-hand games are playable and there's no always-online mandate, then those are megatons IMO.

Regardless of what they say ...TODAY... you cannot rule out Sony going there eventually too if the market dictates it. And by market I mean consumers, publishers, developers, partners.
 
I'm estimating a core version in there... I'm saying there will be two.

At most, the PS4 will probably cost $80/$90 more to make if they get a bad deal on the RAM, I suspect.

What are you taking out of the premium to make up for this $220?
 
Funny, I remember people going on and on singing the virtues of the EDRAM in the 360. I guess RAM speed doesn't matter now that the advantage isn't flowing in a certain direction, right?
EDRAM != ESRAM

EDRAM was a big upgrade over the (128-bit) GDDR3 solution, which was a paltry 22.6GB/s. With GDDR5, that's essentially a non-issue now.
 
And also, you can increase RAM, not change RAM types. No going from DDR3 to DDR4.

Well sure. You can also only increase RAM if larger chips are available, which is why MS is likely stuck at 8GB of DDR3 unless they want to A. pay for a full mobo rebuild that can accommodate more chips (not happening) or B. pay out the ass for cutting-edge layered DDR3 chips.

The only reason Sony was even able to make this jump from 4GB to 8GB is advancements in the production of GDDR5 chips. Either they designed for a set number of chips and, as GDDR5 has stepped up from 1Gbit to 2Gbit to 4Gbit chips, made iterative upgrades, or they budgeted for 4GB of RAM using 1Gbit chips (32 chips on a console mobo sounds crazy) and are now getting 2Gbit chips for a much better price due to lagging PC hardware sales hurting GDDR5 demand.

Either way, it looks like Sony is about to become the dominant consumer in the GDDR5 market.
 
umm let's see:

Cost: I'll wager PS4 at $579 (see, they didn't go $599 US), 720 at $299-$399

Friends who want to stay on Live

Xbox exclusives

You are insane if you think the PS4 will be $579. They will both release at $399 or less for their lower model. Everyone will have to buy new consoles, and what makes you think they would stay on Live versus going over to the PS4 with free online play? If MS comes out with some nice exclusive games, that could be the only worthwhile reason to have one, but their output over the last few years does not instill faith.
 
I'm estimating a core version in there... I'm saying there will be two (or subsidized with Live to make it seem a lot cheaper).

What would a $300 core pack consist of? If every 720 doesn't come with a Kinect and a hard drive then that pretty much ruins the whole point of their new system.
 
At most, the PS4 will probably cost $80/$90 more to make if they get a bad deal on the RAM, I suspect.

What are you taking out of the premium to make up for this $220?

I'm suggesting that, thanks to a lower-cost engineered machine, MS can afford to subsidize and will do so to undercut Sony and its bold move of spending a billion adding to the RAM. I'm suggesting subsidized versions with cheap payments; I'm picturing smaller storage units again, possibly with fewer features for the low-end casual market.


The only problem is they need to keep one price closer to the PS4's at first so as not to create the perception that it is far less of a machine, so the numbers might look more like $349-$479.

But MS will use every advantage this gen, IMO, to grab and keep grabbing market share on cost and cool features, not worry about numbers on a spec sheet, and let the games appear, to most people, to look the same.
 
Dave Perry from Gaikai was thanking Sony at the press conference for finally greenlighting the program. I'm willing to bet the Gaikai stuff wasn't in the developer builds yet, and the video recording and streaming might take up a lot of RAM. I don't know how much RAM the games themselves will actually get. 6GB maybe?

Multiple rumors had already referenced the streaming features back when dev kit details were leaking, so developers knew about it and it was part of the OS footprint they were quoting. Even then, sources were saying that the Share features had almost zero system footprint.

Also, why would it take up a lot of RAM? It's the equivalent of throwing a cheap video capture/compression card on the board and duplicating the video out, saved to a flash drive (if the 16GB internal flash rumor is true) or the local HDD. It's an incredibly simple, low-cost solution. It will be entirely reliant on an elegant software implementation, not eating up system hardware.
 
You are insane if you think the PS4 will be $579. They will both release at $399 or less for their lower model. Everyone will have to buy new consoles, and what makes you think they would stay on Live versus going over to the PS4 with free online play? If MS comes out with some nice exclusive games, that could be the only worthwhile reason to have one, but their output over the last few years does not instill faith.



good luck with that
 
I'm suggesting that, thanks to a lower-cost engineered machine, MS can afford to subsidize and will do so to undercut Sony and its bold move of spending a billion adding to the RAM. I'm suggesting subsidized versions with cheap payments; I'm picturing smaller storage units again, possibly with fewer features for the low-end casual market.

And you'd be wrong.

MS is, by all accounts, using at least as heavily engineered parts. Supposedly their silicon deviates more from ATi's original fabrications than Sony's does, and as a result they would have had higher R&D costs and will see lower early yields in production.

Also, the 32MB ESRAM + 8 GB DDR3 is not a major cost savings for MS if Sony is getting their GDDR5 at the wholesale prices the biggest shark in the tank can demand (which they've now instantly become in the GDDR5 market). Not only is this forcing MS to include more customized silicon (like the beefy data manager leaked a few weeks back), but 32MB of ESRAM isn't cheap and having coders spend time tweaking libraries to make all of this something other than a nightmare for developers isn't free either.

MS isn't going cheap on their BoM relative to Sony, they're going with a different focus. Why would MS want such low-latency system memory? Because it's better for non-gaming operations. Why would MS skimp on in-the-box graphics silicon? Because Kinect 2.0 needs its own silicon to reduce lag and strengthen its feature set, and the sound chip apparently needs to be a freak as well. Why would their memory and CPU footprint be so much larger than Sony's? Because they want their OS to run very differently (hence display planes).

MS isn't going cheap, they're going multi-purpose. Sony is building a cutting edge GPU-in-a-box. MS is building the Windows 8 TV Box.

At this point Sony is probably at best second, if not third, on the list of companies MS is targeting their next system to compete against, with Apple first and Google a legitimate contender for the #2 seat. If you don't think the in-home applications of iOS and Android have MS worried about its core market (Windows), you aren't paying attention.
 
And you'd be wrong.

MS is, by all accounts, using at least as heavily engineered parts. Supposedly their silicon deviates more from ATi's original fabrications than Sony's does, and as a result they would have had higher R&D costs and will see lower early yields in production.

Also, the 32MB ESRAM + 8 GB DDR3 is not a major cost savings for MS if Sony is getting their GDDR5 at the wholesale prices the biggest shark in the tank can demand (which they've now instantly become in the GDDR5 market). Not only is this forcing MS to include more customized silicon (like the beefy data manager leaked a few weeks back), but 32MB of ESRAM isn't cheap and having coders spend time tweaking libraries to make all of this something other than a nightmare for developers isn't free either.

MS isn't going cheap on their BoM relative to Sony, they're going with a different focus. Why would MS want such low-latency system memory? Because it's better for non-gaming operations. Why would MS skimp on in-the-box graphics silicon? Because Kinect 2.0 needs its own silicon to reduce lag and strengthen its feature set, and the sound chip apparently needs to be a freak as well. Why would their memory and CPU footprint be so much larger than Sony's? Because they want their OS to run very differently (hence display planes).

MS isn't going cheap, they're going multi-purpose. Sony is building a cutting edge GPU-in-a-box. MS is building the Windows 8 TV Box.

At this point Sony is probably at best second, if not third, on the list of companies MS is targeting their next system to compete against, with Apple first and Google a legitimate contender for the #2 seat. If you don't think the in-home applications of iOS and Android have MS worried about its core market (Windows), you aren't paying attention.

I agree with a lot of what you said, except for the part where I think they are going to have a BoM around $80-$90 cheaper. And even if it's a little less, or not... they will exercise their luxury to place units in homes with some smaller entry-level units or the Live-subsidized payment plan that has been rumored (or is that confirmed?).


He's right, you know. There is no way the PS4 would be sold above $399. Not even with the 8GB GDDR5 RAM upgrade.



Hey, any of us could be right, we're just (educated) guessing :lol but

Sony released specs for its new console last night, revealing a custom AMD processor, dual-camera PlayStation Eye to detect motion, and 8GB of memory among other things. We're not sure whether anything significant will change for the "final specs," nor how much the final product will cost, though Tretton says he hopes the price point will be lower than the PlayStation 3's original $599.

That doesn't instill hope if I'm counting on $399.
 
Didn't some guy in another thread say investors were disappointed with Durango's specs? If that's true, it might have led to something being changed.

That was probably Shayan, a new member who said that Orbis would have 8GB of RAM the day before the conference, but at no point mentioned it anywhere on GAF before Edge touted it at the start of Feb.

Now, he could well be telling the truth about where he gets his info, and it could all be accurate. I do not know. I'm certainly not going to trust someone blindly on the basis of one correct statement that had already been reported by a major gaming publication first.

Since getting that right, he has been stating a lot more things as facts than he was before: going so far as to tell us where he gets his info, telling us that Bungie were supposed to live-demo Destiny but didn't because of Killzone, that Guerrilla convinced Sony to change the RAM eight weeks ago to deal a blow to Microsoft, and that they upped it to 8GB because investors (the people he says give him his info) were unimpressed.

I mean no disrespect to the guy, but it pays to be sceptical until people have proved time and again that they are accurate. That's why people like GopherD are always believed; they also say very little.
 
That was probably Shayan, a new member who said that Orbis would have 8GB of RAM the day before the conference, but at no point mentioned it anywhere on GAF before Edge touted it at the start of Feb.

Now, he could well be telling the truth about where he gets his info, and it could all be accurate. I do not know. I'm certainly not going to trust someone blindly on the basis of one correct statement that had already been reported by a major gaming publication first.

Since getting that right, he has been stating a lot more things as facts than he was before: going so far as to tell us where he gets his info, telling us that Bungie were supposed to live-demo Destiny but didn't because of Killzone, that Guerrilla convinced Sony to change the RAM eight weeks ago to deal a blow to Microsoft, and that they upped it to 8GB because investors (the people he says give him his info) were unimpressed.

I mean no disrespect to the guy, but it pays to be sceptical until people have proved time and again that they are accurate. That's why people like GopherD are always believed; they also say very little.

Outside of what he said, the footage they did show of Destiny DID look unimpressive compared to Killzone 4. Bungie being Bungie :(
 
I agree with a lot of what you said, except for the part where I think they are going to have a BoM around $80-$90 cheaper. And even if it's a little less, or not... they will exercise their luxury to place units in homes with some smaller entry-level units or the Live-subsidized payment plan that has been rumored (or is that confirmed?).

How?

$80-$90 cheaper is significantly more than the cost Sony added by going from 4GB to 8GB of GDDR5. So you think the Xbox was already ~$50 cheaper on BoM yet somehow still in the same technological ballpark?

Also, subsidizing via payments isn't a lower MSRP and 99% of consumers understand this. That only works if your competitor isn't kicking your ass with sans-subscription hardware.

Even if that was a working strategy, why do you think Sony wouldn't adopt the same thing? This is the kind of marketing strategy the entire industry would know about months in advance and Sony would literally have retailers begging them to do the same thing (allowing them to move more software at zero additional cost, and clear out more hardware inventory faster).

This isn't some grand master plan MS can spring at the last minute and there won't be a magic BoM bullet waiting to be fired. Sony is building a gaming console, MS is building a home media convergence center. They differ for that reason and that reason alone.
 
JohnnySasaki86 believes the CPU info was already there. I really don't know who to believe anymore.



I'll eat crow if you're right. Crytek made me eat crow already.


We'll see when VGleaks posts whatever they're gonna post about the CPU. I just think it was pretty vague, and therefore it's very possible it could have been that way all along. They didn't even call them Jaguar CPUs, just 8 cores clocked at 1.6GHz. I guess this could have been a change that happened after the Feb 2012 summit.
 
How?

$80-$90 cheaper is significantly more than the cost Sony added by going from 4GB to 8GB of GDDR5.

How do you figure? I've read that that's exactly the difference, if not more. The first 4GB was probably a wash compared to MS on the BoM.


Also, I do not know the ratio of or desire for PS+ memberships, but if you are giving away the online experience for free with the machine, how can you sell it as a payment plan that appears to be part of your online experience?

Also, again, I think MS has more cash on hand and is more willing to carry these kinds of deals. But we are getting away from the strategy of MS having a lower-priced SKU again this gen, which worked so well this past gen. So yeah, cost will go far.
 
It's tough to tell from this angle, but I think Sony has the high ground:

[image: two rams squaring off on a hillside]


Yeah, but in that picture the one on the high ground is at a disadvantage. Those horns are worthless up there. The one at the bottom can skewer with more ease. This picture might actually be symbolic of what's about to happen.
 
How do you figure? I've read that that's exactly the difference, if not more. The first 4GB was probably a wash compared to MS on the BoM.

So you're saying that the system with 4GB of memory was a wash with MS on BoM, but it has now jumped $90 due to an additional 4GB of memory? GDDR5 is expensive as memory goes, but it sure as hell isn't that expensive. Wholesale prices, sans bulk discounts, put the entire 8GB of GDDR5 at ~$100 give or take, and you can bet that Sony is getting one hell of a bulk discount.

You are grossly overestimating the cost of the additional memory.
 
IIRC Microsoft doubled the 360's RAM, losing billions, because Epic showed them how crappy GeoW would look otherwise. Of course it will make a difference.

However, I don't think MS will react. They're going for a different model this time. And with their behaviour towards first-party games, I feel they don't really care about gaming; they just tolerate gaming for other goals.
 
Yeah, but in that picture the one on the high ground is at a disadvantage. Those horns are worthless up there. The one at the bottom can skewer with more ease. This picture might actually be symbolic of what's about to happen.

You don't get how rams fight at all, do you?

Rams never skewer; it's head-to-head collisions consisting of bludgeoning blows until one backs down. The lower ram is already backing up, because at that angle the top ram would destroy him with all the momentum and superior angle his downhill approach would afford him.

This is how the vast majority of horned mammals fight. Same with deer, moose, goats, bulls, etc..
 
So you're saying that the system with 4GB of memory was a wash with MS on BoM, but it has now jumped $90 due to an additional 4GB of memory? GDDR5 is expensive as memory goes, but it sure as hell isn't that expensive. Wholesale prices, sans bulk discounts, put the entire 8GB of GDDR5 at ~$100 give or take, and you can bet that Sony is getting one hell of a bulk discount.

You are grossly overestimating the cost of the additional memory.

Perhaps, but we do not know, and more than a few people besides myself are speculating that MS's BoM is that much less. You and I do not even know what the Durango has in it yet :lol

And we don't even know yet what will be included in a PS4 SKU, actually, if you think about it.
Neither does Sony, if you read Tretton's quote above.
 
IIRC Microsoft doubled the 360's RAM, losing billions, because Epic showed them how crappy GeoW would look otherwise. Of course it will make a difference.

However, I don't think MS will react. They're going for a different model this time. And with their behaviour towards first-party games, I feel they don't really care about gaming; they just tolerate gaming for other goals.


Why would they create 4 new IPs if they didn't care about gaming? Or did I miss something?
 
No. Only a speed benefit to the extent of 32MB. The 8GB of DDR3 would, however, still remain at the same lesser bandwidth.



Wait, I thought Durango's 'data move engines' along with the 'ESRAM' were supposed to raise the bandwidth to around 170GB/s, close to that of GDDR5. Combine that with the lower latency of DDR3, and Durango comes out ahead.
 
Wait, I thought Durango's 'data move engines' along with the 'ESRAM' were supposed to raise the bandwidth to around 170GB/s, close to that of GDDR5. Combine that with the lower latency of DDR3, and Durango comes out ahead.
It's still lower than the 360's bandwidth

[image: Xbox 360 vs. PS3 memory bandwidth comparison chart]


:D
 
Wait, I thought Durango's 'data move engines' along with the 'ESRAM' were supposed to raise the bandwidth to around 170GB/s, close to that of GDDR5. Combine that with the lower latency of DDR3, and Durango comes out ahead.

Lower latency does not affect graphics much; most GPUs are designed around hiding latency.

Also, you can't add bandwidth together like that.

You have a tiny amount of RAM that is 102GB/s, and if anything needs to be moved there, it has to move there from the 68GB/s pool. Therefore, if something is bigger than 32MB, you're not going to get the entire 170GB/s out of it.
 
Lower latency does not affect graphics much; most GPUs are designed around hiding latency.

Also, you can't add bandwidth together like that.

You have a tiny amount of RAM that is 102GB/s, and if anything needs to be moved there, it has to move there from the 68GB/s pool. Therefore, if something is bigger than 32MB, you're not going to get the entire 170GB/s out of it.

Unlike the 360, you won't have to fit everything in the embedded RAM this time around.
 
Unlike the 360, you won't have to fit everything in the embedded RAM this time around.

That's not the point. The point is that you will probably never get anywhere near 170GB/s from the system because of the constant moving between the DDR3 and ESRAM you will have to do just to use the 102GB/s bandwidth of the ESRAM.
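Rough back-of-envelope illustration (my own toy model, assuming the leaked 68GB/s DDR3 and 102GB/s eSRAM figures, and assuming the copy and the read happen back to back rather than overlapping): if a buffer has to be staged out of DDR3 before the eSRAM's bandwidth can touch it, the copy itself burns DDR3 bandwidth, so the effective rate lands nowhere near 68 + 102.

```c
/* Toy model only: effective bandwidth when data must be staged from DDR3
 * into eSRAM before it can be read at eSRAM speed. The 68/102 GB/s numbers
 * are the leaked Durango figures; the serial copy-then-read assumption is
 * a simplification, not how the real hardware schedules traffic. */
#include <stdio.h>

int main(void) {
    const double ddr3_bw  = 68.0;   /* GB/s, main DDR3 pool            */
    const double esram_bw = 102.0;  /* GB/s, 32MB embedded pool        */
    const double data_gb  = 0.128;  /* 128MB of data, staged in chunks */

    double copy_time = data_gb / ddr3_bw;   /* pulling it out of DDR3  */
    double read_time = data_gb / esram_bw;  /* consuming it from eSRAM */

    double effective = data_gb / (copy_time + read_time);
    printf("Effective bandwidth for staged data: %.1f GB/s\n", effective);
    /* Prints ~40.8 GB/s -- far from 170, because the copy isn't free. */
    return 0;
}
```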
 
You don't get how rams fight at all, do you?

Rams never skewer; it's head-to-head collisions consisting of bludgeoning blows until one backs down. The lower ram is already backing up, because at that angle the top ram would destroy him with all the momentum and superior angle his downhill approach would afford him.

This is how the vast majority of horned mammals fight. Same with deer, moose, goats, bulls, etc..


lol, Not really. But if you replaced the top one with Kaz and the bottom one with Ballmer, who would win? I'd guess Kaz would go for an elbow drop only to wind up bouncing from Ballmer's stomach and falling over the cliff.
 
Lower latency does not affect graphics much; most GPUs are designed around hiding latency.

Also, you can't add bandwidth together like that.

You have a tiny amount of RAM that is 102GB/s, and if anything needs to be moved there, it has to move there from the 68GB/s pool. Therefore, if something is bigger than 32MB, you're not going to get the entire 170GB/s out of it.


Oh, okay. I was just reading this article that said it can run in 'parallel' with the DDR3 for a combined bandwidth of 170GB/s. Do you know if that is true? link - http://www.eurogamer.net/articles/df-hardware-spec-analysis-durango-vs-orbis
 
Naughty Dog and co. had to actually dig in and write some parts of their games in assembly to even unlock that though. That's some obtuse ass design right there.

Make no mistake, they will do that for the PS4, but they won't have to do it right out of the gate; obviously, no one will. This is what Carmack alludes to with regard to the untapped potential of the PC. If you could magically code an engine in assembly that would work across many PC configurations and driver configs, then PCs would be brilliant. But there are so many dang timing issues and CPU frequencies and such that it's a PITA to do that kind of optimizing.
 
Make no mistake, they will do that for the PS4, but they won't have to do it right out of the gate; obviously, no one will. This is what Carmack alludes to with regard to the untapped potential of the PC. If you could magically code an engine in assembly that would work across many PC configurations and driver configs, then PCs would be brilliant. But there are so many dang timing issues and CPU frequencies and such that it's a PITA to do that kind of optimizing.

As long as you're targeting just Windows machines, you'll have uniform access to 99% of assembly instructions for x86.

Then again, it's not really worth it for most things.
 
That's not the point. The point is that you will probably never get anywhere near 170GB/s from the system because of the constant moving between the DDR3 and ESRAM you will have to do just to use the 102GB/s bandwidth of the ESRAM.

You don't need to move anything. If you want to target both pools for render targets, you can do so. Unlike the 360, you can render to the eSRAM AND the main memory. The eSRAM is there for bandwidth-intensive ops (framebuffer ops: render targets, screen-space ops, post-processing). Those are the rendering elements that eat bandwidth; texture and geometry use just a fraction of your available bandwidth. These framebuffer ops need only a small amount of memory but a lot of bandwidth; the reverse is the case for texture and geometry. The Durango would have run into trouble if its eSRAM had the restriction that was in the 360 implementation, but it doesn't. The eSRAM is read/write/modify. So no matter how large your G-buffer (deferred rendering) is, no matter how many MRTs (multiple render targets) you need to render, you can split them across the different memory pools, which are then combined at the display unit. This is why the Durango memory setup is good enough to do what it needs to do.

You are never going to need 170/176GB/s of bandwidth for texture and geometry; framebuffer ops will eat most of that.
 
The problem was the 360 couldn't actually access all that bandwidth, much like what MS is rumored to be doing with Durango.
Can whatever the PS4 is doing access all the bandwidth that comes with using GDDR5? If it was a problem with the current gen, I guess it's feasible it could also be a problem for the next. It would be a hell of a thing if people talk up faster RAM and the PS4 can't use it.

But I literally know nothing about how the architecture works in anything, so there's a fair chance what I'm saying is completely off base.
 
You don't need to move anything. If you want to target both pools for render targets, you can do so. Unlike the 360, you can render to the eSRAM AND the main memory. The eSRAM is there for bandwidth-intensive ops (framebuffer ops: render targets, screen-space ops, post-processing). Those are the rendering elements that eat bandwidth; texture and geometry use just a fraction of your available bandwidth. These framebuffer ops need only a small amount of memory but a lot of bandwidth; the reverse is the case for texture and geometry. The Durango would have run into trouble if its eSRAM had the restriction that was in the 360 implementation, but it doesn't. The eSRAM is read/write/modify. So no matter how large your G-buffer (deferred rendering) is, no matter how many MRTs (multiple render targets) you need to render, you can split them across the different memory pools, which are then combined at the display unit. This is why the Durango memory setup is good enough to do what it needs to do.

You are never going to need 170/176GB/s of bandwidth for texture and geometry; framebuffer ops will eat most of that.

But if you want to move anything from DDR3 to eSRAM to read it at faster than 68GB/s, then that will also eat up bandwidth, and so will moving it back if you fill up all the eSRAM but need more of it. The eSRAM is also not unlimited in space; once you fill it all up, you either have to be content with just 68GB/s for the rest of your data, or you move it to the DDR3 and then back to the eSRAM if you need it again.

The eSRAM is barely big enough for 2x 1080p buffers at 4 bytes per pixel with 2x MSAA.
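Quick math on that claim, for anyone who wants to check it (just the assumptions from the line above: 4 bytes per pixel, 2x MSAA, two targets):

```c
/* Sizing two 1080p render targets against a 32MB eSRAM budget, using the
 * assumptions stated above (RGBA8 = 4 bytes/pixel, 2x MSAA = 2 samples). */
#include <stdio.h>

int main(void) {
    const long width = 1920, height = 1080;
    const long bytes_per_pixel = 4;   /* e.g. RGBA8         */
    const long msaa_samples    = 2;   /* 2x MSAA            */
    const long num_buffers     = 2;   /* two render targets */

    long bytes = width * height * bytes_per_pixel * msaa_samples * num_buffers;
    printf("Two 1080p targets at 2x MSAA: %.1f MiB (budget: 32 MiB)\n",
           bytes / (1024.0 * 1024.0));   /* ~31.6 MiB -- barely fits */
    return 0;
}
```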
 
As long as you're targeting just Windows machines, you'll have uniform access to 99% of assembly instructions for x86.

Then again, it's not really worth it for most things.

It's not worth it because you can write a routine that's 50% faster in assembly on the machine you spec'd it for, but another machine could have issues and only be 20% faster. There are entire data specs for the different architectures you have to deal with just within one PC generation.

Granted, I should admit that compilers for x86 have become amazingly good; GCC in particular is a beast on x86-64. I still think they'll do some hardcore assembly on the PS4 later in its life.
 
Hence the rumor that it is a 256-bit AVX unit instead of the 128-bit part in a vanilla Jaguar core.
Probably something else, as it's a standard feature for Jaguar and it does not seem to double overall performance:
xbitlabs said:
In a bid to boost IPC by 15%, Jaguar introduces 128-bit floating point unit (FPU) with enhancements and double-pumping to support 256-bit AVX instructions as well as an innovative integer unit with new hardware divider, larger schedulers and more out-of-order resources.
source: http://www.xbitlabs.com/news/cpu/di...ext_Generation_Jaguar_Micro_Architecture.html
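For what it's worth, here's a rough sketch (my own, not from any leak) of what the 128-bit vs. 256-bit distinction means at the instruction level: a 256-bit AVX add covers 8 floats per instruction versus 4 for a 128-bit SSE add, but on a double-pumped 128-bit FPU like Jaguar's the wider instruction is split into two 128-bit ops internally, so the instruction count drops without the throughput doubling.

```c
/* Illustration only: 128-bit SSE vs 256-bit AVX adds over the same data.
 * Build with AVX enabled (e.g. -mavx). On a double-pumped 128-bit FPU
 * such as Jaguar's, the single AVX add below still issues as two 128-bit
 * micro-ops, so fewer instructions does not mean double the throughput. */
#include <immintrin.h>
#include <stdio.h>

int main(void) {
    float a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    float b[8] = {8, 7, 6, 5, 4, 3, 2, 1};
    float out[8];

    /* 128-bit SSE path: two adds, 4 floats each */
    for (int i = 0; i < 8; i += 4) {
        __m128 va = _mm_loadu_ps(a + i);
        __m128 vb = _mm_loadu_ps(b + i);
        _mm_storeu_ps(out + i, _mm_add_ps(va, vb));
    }

    /* 256-bit AVX path: one add, 8 floats */
    __m256 wa = _mm256_loadu_ps(a);
    __m256 wb = _mm256_loadu_ps(b);
    _mm256_storeu_ps(out, _mm256_add_ps(wa, wb));

    printf("out[0] = %.0f, out[7] = %.0f\n", out[0], out[7]); /* 9 and 9 */
    return 0;
}
```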
 
It's not worth it because you can write a routine that's 50% faster in assembly on the machine you spec'd it for, but another machine could have issues and only be 20% faster. There are entire data specs for the different architectures you have to deal with just within one PC generation.

Granted, I should admit that compilers for x86 have become amazingly good; GCC in particular is a beast on x86-64. I still think they'll do some hardcore assembly on the PS4 later in its life.

That's what CPUID is for :D
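Something like this, for instance (a minimal sketch, assuming GCC/Clang's <cpuid.h>; a real engine would pick its optimized code paths based on the reported bits):

```c
/* Minimal sketch: runtime feature detection via CPUID so per-machine
 * optimized routines can be selected. Assumes GCC/Clang's <cpuid.h>. */
#include <cpuid.h>
#include <stdio.h>

int main(void) {
    unsigned int eax, ebx, ecx, edx;

    /* Leaf 1 reports the standard feature flags in ECX/EDX. */
    if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx)) {
        printf("CPUID leaf 1 not supported\n");
        return 1;
    }

    int has_sse42 = (ecx >> 20) & 1;  /* ECX bit 20: SSE4.2 */
    int has_avx   = (ecx >> 28) & 1;  /* ECX bit 28: AVX    */

    printf("SSE4.2: %s, AVX: %s\n",
           has_sse42 ? "yes" : "no",
           has_avx   ? "yes" : "no");
    /* Dispatch to SSE/AVX-tuned routines here based on the flags. */
    return 0;
}
```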
 
But if you want to move anything from DDR3 to eSRAM to read it at faster than 68GB/s, then that will also eat up bandwidth, and so will moving it back if you fill up all the eSRAM but need more of it. The eSRAM is also not unlimited in space; once you fill it all up, you either have to be content with just 68GB/s for the rest of your data, or you move it to the DDR3 and then back to the eSRAM if you need it again.

The eSRAM is barely big enough for 2x 1080p buffers at 4 bytes per pixel with 2x MSAA.

Then you avoid moving data, hence the MRTs. And just so you know, these bandwidths are not constantly in use; a situation can arise where the ALUs are fully occupied and you are compute-bound. At that point your bandwidth is not in use.

The eSRAM is big enough for 2x 1080p buffers; if you need more, you target the main memory. Btw, not all framebuffer ops are the same size, don't forget that.
 
Then you avoid moving data, hence the MRTs. And just so you know, these bandwidths are not constantly in use; a situation can arise where the ALUs are fully occupied and you are compute-bound. At that point your bandwidth is not in use.

The eSRAM is big enough for 2x 1080p buffers; if you need more, you target the main memory. Btw, not all framebuffer ops are the same size, don't forget that.

True, but I've heard that's something official sources may be saying:

"68GB/s or 102GB/s for render bandwidth", implying that both cannot be used at the same time.
 
True, but I've heard that's something official sources may be saying:

"68GB/s or 102GB/s for render bandwidth", implying that both cannot be used at the same time.

Of course you can. They are two separate memory pools. That you can render to both is precisely one of the reasons the setup is better than the 360 setup. READ THE VGLEAKS DOC. Seriously. This is really not an argument.
 
This thread is kind of weird. The rumours said that the PS4 would have 4GB of RAM, when it really has 8GB. I'm betting that the Xbox 720 rumour is also wrong and it will also have 8GB of GDDR5.
 
Of course you can. They are two separate memory pools. That you can render to both is precisely one of the reasons the setup is better than the 360 setup. READ THE VGLEAKS DOC. Seriously. This is really not an argument.

My doco says different.
 