VGLeaks Rumor: Durango Memory System Overview & Example

Since when do GPUs require low latency? They have been using GDDR for years. The argument was that low latency helped CPUs far more than GPUs: small data sets, AI, physics, etc. Time for new theories, I guess.

It can help, especially for compute shaders apparently. Cache misses and memory stalls happen on GPUs too.

Supposedly one reason Nvidia GPUs are better per flop than AMD's is better, lower-latency caches.
 
I think the specialized audio unit hype died down when the PS4 was revealed to have something similar, if I'm not mistaken.

The way each console handles audio could make worlds of difference, but people only seem to care how long their GDDR is.

Sony has always had extremely capable audio processors, and I'd imagine MS is doing something competent as well. It really won't make a difference in the end.
 
The info you are referring to is from this year.

But the information was taken from a document from last year.

Oldest info is from Feb last year. That's hardly 2 years.

I said over a year old, not 2.


Why are people hanging on to this like it's gospel? It may be true, but Sony has proven that things change. Everyone seems to accept that the Orbis has an overclocked CPU. But then acts like MS is forced to keep theirs at the stock speed.
 
But the information was taken from a document from last year.



I said over a year old, not 2.

The document alluded to last year (the Durango conference stuff) contained none of the information presented here.

So it cannot have been from there.
 
Seems like an overly complicated design.


Strange how it seems like MS have done a Sony and went very complicated and Sony went in the opposite direction.


Weird.
 
I think you need to actually read the article and the second-page example; it says more about the system than anything yet released. If correct, this is a gold mine. Of course, you could glance at the graph, call it old, and ask for something else you will not read or comprehend.


This isn't new info, it's just a usage example made up by vgleaks. You can apply the same logic they used to any system (given the level of detail that we already had)
 
Seems like an overly complicated design.


Strange how it seems like MS have done a Sony and went very complicated and Sony went in the opposite direction.


Weird.

it's more fun to discuss imo though.

it seems like durango speculation threads are bigger than ps4 ones. cause on ps4 there's really nothing to discuss.

i wouldn't presume the design is overly complicated, it just seems to be designed with 8gb ram in mind. at that time ms thought it either impossible or overly expensive to do 8gb gddr5, and so that necessitated the (common in consoles) embedded ram pool. ps2 had it, gamecube (and by extension wii) i believe, 360, wii u, etc.

sony showed 8gb gddr5 isn't impossible, but i'd bet the costs will be quite stiff, so that advantage likely remains for ms.
 
Sony has always had extremely capable audio processors, and I'd imagine MS is doing something competent as well. It really won't make a difference in the end.
I always thought it'd make a difference considering numbers thrown around where devs ended up using up to half of the processing power of current gen consoles (or was that 360 specifically?) just to process audio. I think the difference in efficiency could make a difference, even if consoles now are much more powerful than they were in 2005. Maybe 'worlds' of difference is exaggerating, but still.
 
Wait what?

RAM quantity?

Last time I checked they both got 8GB
Latency, I was told is not a problem due to cache memory and high bandwidth
The CPUs are basically the same
The PS4 GPU is 50% more powerful than the 720s

I don't see how you come to this conclusion.

Last time I checked, GPUs since DirectX 10 can create and destroy vertices (geometry shaders).
 
The document alluded to last year (the Durango conference stuff) contained none of the information presented here.

So it cannot have been from there.

Yes, but this is not an official document. This is more of a breakdown and summary of the original document about how the system will work. There is no new information leaked from MS here.
 
I always thought it'd make a difference considering numbers thrown around where devs ended up using up to half of the processing power of current gen consoles (or was that 360 specifically?) just to process audio. I think the difference in efficiency could make a difference, even if consoles now are much more powerful than they were in 2005. Maybe 'worlds' of difference is exaggerating, but still.

yeah, supposedly ms saw devs were really eating up a lot of the 360 cpu on audio, or something.

i'm also sure kinect voice recognition played a big part in needing a beefy dedicated audio chip.
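
To give a rough sense of why software audio adds up, here's a toy back-of-envelope estimate; the voice count and per-voice cost are illustrative guesses, not figures from the 360 or either new console:

```cpp
#include <cstdio>

int main() {
    // All numbers below are illustrative guesses for a busy game scene.
    const double sample_rate    = 48000.0; // samples per second, per voice
    const double voices         = 256.0;   // concurrently playing voices
    const double ops_per_sample = 100.0;   // resampling + filtering + mixing + effect sends

    const double ops_per_second = sample_rate * voices * ops_per_sample;
    std::printf("~%.1f billion ops/s just for the audio mix\n", ops_per_second / 1e9);
    return 0;
}
```

Around a billion operations per second is a meaningful slice of an in-order core before any game code runs, which is exactly the kind of load a dedicated DSP is meant to absorb.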
 
I think the specialized audio unit hype died down when the PS4 was revealed to have something similar, if I'm not mistaken.

The way each console handles audio could make worlds of difference, but people only seem to care how long their GDDR is.

So what's special about the audio?
 
This isn't new info, it's just a usage example made up by vgleaks. You can apply the same logic they used to any system (given the level of detail that we already had)

I understand the example is made up, but they used the details which are new to us (beyond the flow chart). What is with the concerted effort to discredit this info? Isn't this what everyone wants, more technical details on the systems?
 
Question: so the GPU can read from both pools at 170 GB/s, but wouldn't the data have to be transferred from the DRAM into the ESRAM first at 68 GB/s, in 32 MB chunks? So even if you can read both at 170 GB/s, it takes a lot of lower-bandwidth steps before it even gets to that point, correct?

Yes, the goal there seems to be the DMEs moving data from DDR3 to ESRAM during the cycles the GPU is doing some other work, so when it needs that data it's already there for it to use...

Apparently another possible scenario would be streaming data from the CPU cache into the ESRAM; that could also lead to some interesting uses.
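
If that's the intent, it's essentially classic double buffering through the move engines: while the GPU works on the chunk already sitting in ESRAM, a DME copies the next chunk in from DDR3, so the slow bus hides behind GPU work. A rough sketch of the pattern; every function and type name here is a made-up placeholder for illustration, not anything from an actual SDK:

```cpp
// Hypothetical placeholders for the rumored move-engine / GPU interfaces.
struct Tile { unsigned char data[32 * 1024 * 1024]; };      // a 32 MB chunk in DDR3
void dme_copy_async(void* esram_dst, const Tile* ddr3_src); // kick off a DME copy
void dme_wait(void* esram_dst);                             // block until that copy lands
void gpu_consume_from_esram(void* esram_src);               // GPU work reading the tile

// Stream tiles through two ESRAM staging slots so the GPU only ever reads the
// fast pool while the DDR3 copy for the next tile happens in the background.
void stream_through_esram(const Tile* ddr3_tiles, int tile_count, void* esram_slot[2])
{
    if (tile_count <= 0) return;

    // Prime the pipeline: start copying tile 0 before any GPU work is issued.
    dme_copy_async(esram_slot[0], &ddr3_tiles[0]);

    for (int i = 0; i < tile_count; ++i) {
        dme_wait(esram_slot[i % 2]);                  // tile i has landed in ESRAM

        if (i + 1 < tile_count)                       // overlap the next copy with GPU work
            dme_copy_async(esram_slot[(i + 1) % 2], &ddr3_tiles[i + 1]);

        gpu_consume_from_esram(esram_slot[i % 2]);    // GPU reads at ESRAM bandwidth
    }
}
```

The same shape would work for the CPU-cache-to-ESRAM scenario; only the source of the copy changes.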
 
Wait what?

RAM quantity?

Last time I checked they both got 8GB
Latency, I was told is not a problem due to cache memory and high bandwidth
The CPUs are basically the same
The PS4 GPU is 50% more powerful than the 720s

I don't see how you come to this conclusion.
But Durango has more RAM.

Durango: 8192 MB DDR3RAM + 32 MB ESRAM = 8224
PS4: 8192 MB GDDR5 = 8192

8224 > 8192

Similarly, Durango has same amount of bandwidth as PS4; 68 + 102 = 170 GB/s.

Oh and ignore the PS4 CPU clock bump rumor to 2GHz. Also ignore the latency cycles.

The PS4 GPU is not 50% more powerful but the Durango is 33% less.

I think I covered everything.
 
I understand the example is made up, but they used the details which are new to us (beyond the flow chart). What is with the concerted effort to discredit this info? Isn't this what everyone wants, more technical details on the systems?

Clearly no, not everyone cares about vgleak's analysis of information we already know (and I don't see how it is based on more than the flowchart when it is literally following the flowchart and deducting some amount for 'real world' estimates). What we want is new data (as in information on the areas we don't know about yet, not some fantasy of new specs). While vgleaks is under no obligation to provide that, it isn't discrediting the article to say that it is uninteresting.
 
Not for some. People feel the need to validate their purchases. Both systems will be more than capable of providing gamers with great experiences. The power battle just provides people an opportunity to bicker. People love to bicker.

Justifying a purchase of a console to anyone, especially strangers on the Internet, is bizarre to me. I couldn't give a shit as long as I enjoy it, but that's me, maybe a little boring. However, I used to enjoy the PMs from the N4G kids.
 
Why are people hanging on to this like it's gospel? It may be true, but Sony has proven that things change. Everyone seems to accept that they have overclocked their CPU. But then acts like MS is forced to keep theirs at the stock speed.

Courtesy of me:
[image: FCQwrbx.png]

They can increase their CPU clock speed, but some people are suggesting a GPU clock increase would produce much more heat.

I always thought it'd make a difference considering numbers thrown around where devs ended up using up to half of the processing power of current gen consoles (or was that 360 specifically?) just to process audio. I think the difference in efficiency could make a difference, even if consoles now are much more powerful than they were in 2005. Maybe 'worlds' of difference is exaggerating, but still.

The DSPs are there in both consoles precisely so audio WON'T take a hit on CPU performance. The PS3's audio processing didn't take that much of the CPU because the Cell is very capable in that regard.
 
Clearly no, not everyone cares about vgleak's analysis of information we already know (and I don't see how it is based on more than the flowchart when it is literally following the flowchart and deducting some amount for 'real world' estimates). What we want is new data (as in information on the areas we don't know about yet, not some fantasy of new specs). While vgleaks is under no obligation to provide that, it isn't discrediting the article to say that it is uninteresting.

If the details of the memory subsystem are uninteresting, I wonder what is interesting. Understanding how data flows between the IO, CPU, and GPU is vital for understanding the system itself. Glancing at the flow chart and denouncing it as "old" is ignoring the content of the text. They leaked the CPU details, the GPU details, the Move engines, the Kinect details, and the display planes. What exactly are you guys expecting? You don't like the hardware details and are hoping for something more exciting?

I'd love to see the MB and the APU itself, but that doesn't get to the heart of the system, just how they implemented it.
 
Coming out of my sarcastic mode... the GPU logic on the die would take up (a lot) more space than the Jaguar cores (which are tiny), so it's reasonable to expect that increasing the CPU clock would come at a lower cost than increasing the GPU's.

Unless they changed the cooling system, there would be no cost increases other than testing.

It's just that clocking 12 (much larger) CUs at a higher rate, versus 8 small CPU cores, would increase the heat by a bit.
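
To put rough numbers on that intuition: first-order CMOS dynamic power scales roughly with C·V²·f, so at a fixed voltage a clock bump adds heat in proportion to how big the block already is. The power shares below are made-up placeholders just to show the shape of the argument, not leaked figures:

```cpp
#include <cstdio>

int main() {
    const double soc_power_w = 100.0; // placeholder total SoC power budget
    const double gpu_share   = 0.60;  // placeholder: 12 CUs dominate the die
    const double cpu_share   = 0.15;  // placeholder: 8 Jaguar cores are comparatively tiny
    const double clock_bump  = 0.10;  // a 10% frequency increase

    // At fixed voltage, +10% frequency is roughly +10% dynamic power for that block.
    std::printf("extra heat from +10%% GPU clock: ~%.1f W\n", soc_power_w * gpu_share * clock_bump);
    std::printf("extra heat from +10%% CPU clock: ~%.1f W\n", soc_power_w * cpu_share * clock_bump);
    return 0;
}
```

In practice a higher clock often needs a voltage bump too, which makes the GPU case worse than this linear estimate.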

(it was a compliment) :)

I know ;]

Some people just don't like to think about these small details.
 
Unless they changed the cooling system, there would be no cost increases other than testing.

It's just that clocking 12 (much larger) CUs at a higher rate, versus 8 small CPU cores, would increase the heat by a bit.
I didn't mean cost in the literal sense, but rather from a thermal budget point of view.
 
I'm not knowledgeable enough to discuss this, but I would be shocked if 3rd parties refused to tap into the PS4's extra power for the sake of visual parity.
That seems so dumb to me; I'm for each machine being made the most of.

Just like the PS3 right?
 
Seems like an overly complicated design.


Strange how it seems like MS have done a Sony and went very complicated and Sony went in the opposite direction.


Weird.

The Giant Bomb crew spoke about this a Bombcast or two ago.

Jeff said everything he's heard gives him the impression that Sony and Microsoft have switched roles: the PS4 is easy to work with and Sony has been gracious with developers. The implication was that Microsoft has been arrogant and the new XBOX is not as simple to develop for.

We'll see. At this point, I just want the damn thing announced.
 
Just like the PS3 right?

Not an accurate comparison. Devs found the PS3 difficult to develop for from the word go, so any additional optimization was likely out of the question unless it was the one console you were developing for (As in, Sony owned studios who did in fact show off some of that extra power).

However, whether or not devs take advantage of any additional PS4 power, or whatever ends up being more powerful for that matter, is still an open question none of us can answer. I certainly wouldn't take it as a given that they would.
 
So with this audio chip can we expect the New Xbox games to feature DTS and/or LPCM 5.1/7.1 audio?

Also, is there anything in here to suggest that Microsoft has an answer to Sony's Share Button?

Not an accurate comparison. Devs found the PS3 difficult to develop for from the word go, so any additional optimization was likely out of the question unless it was the one console you were developing for (As in, Sony owned studios who did in fact show off some of that extra power).

However, whether or not devs take advantage of any additional PS4 power, or whatever ends up being more powerful for that matter, is still an open question none of us can answer. I certainly wouldn't take it as a given that they would.

There were a number of studios that made good use of the PS3, beyond first party like Valve, Criterion and Visceral...
 
So with this audio chip can we expect the New Xbox games to feature DTS and/or LPCM 5.1/7.1 audio?

Yes. They also have the storage capabilities (50gb disc).

Also, is there anything in here to suggest that Microsoft has an answer to Sony's Share Button?
I'm not too sure. I think there was a rumor somewhere saying there was, but I have no idea of the source.

There were a number of studios that made good use of the PS3, beyond first party like Valve, Criterion and Visceral...

Indeed. Those examples were few and far between though.
 
But Durango has more RAM.

Durango: 8192 MB DDR3RAM + 32 MB ESRAM = 8224
PS4: 8192 MB GDDR5 = 8192

8224 > 8192

Similarly, Durango has same amount of bandwidth as PS4; 68 + 102 = 170 GB/s.

Oh and ignore the PS4 CPU clock bump rumor to 2GHz. Also ignore the latency cycles.

The PS4 GPU is not 50% more powerful but the Durango is 33% less.

I think I covered everything.

Are you trolling here? Because you cannot simply add numbers together.
That's like saying 2 bathrooms in your house let you poop twice as fast.
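
In back-of-envelope terms, using the rumored 68 GB/s and 102 GB/s figures (the traffic splits are just illustrative), the two pools only behave like one big bus if your traffic happens to split across them in the right ratio:

```cpp
#include <algorithm>
#include <cstdio>

int main() {
    const double ddr3_bw  = 68.0;   // GB/s, rumored DDR3 bandwidth
    const double esram_bw = 102.0;  // GB/s, rumored ESRAM bandwidth

    // f = fraction of the GPU's memory traffic that can be served out of ESRAM.
    for (double f : {0.0, 0.3, 0.6, 1.0}) {
        // Assuming perfect overlap, whichever pool takes longer to serve its
        // share is the bottleneck, so the effective combined rate is:
        double effective = 1.0 / std::max(f / esram_bw, (1.0 - f) / ddr3_bw);
        std::printf("ESRAM share %3.0f%% -> ~%3.0f GB/s effective\n", f * 100.0, effective);
    }
    return 0;
}
```

Only at the ideal 60/40 split do you see something like 170 GB/s; anything that has to live in DDR3 (which is most of 8 GB, given 32 MB of ESRAM) pulls the effective number back toward 68.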
 
Yes. They also have the storage capabilities (50gb disc).

I'm not too sure. I think there was a rumor somewhere saying there was, but I have no idea of the source.



Indeed. Those examples were few and far between though.
The notion behind the share button (and some of the other features that stem from it) seems incredibly important to me as a community-building tool, a development from cross-game/party chat. So much so that I figured, in the build-up to these reveals, that Microsoft would be the one to do it rather than Sony, as a natural extension of XBL.

I'd imagine that they would have their own version of it though.
 
But Durango has more RAM.

Durango: 8192 MB DDR3RAM + 32 MB ESRAM = 8224
PS4: 8192 MB GDDR5 = 8192

8224 > 8192

Similarly, Durango has same amount of bandwidth as PS4; 68 + 102 = 170 GB/s.

Oh and ignore the PS4 CPU clock bump rumor to 2GHz. Also ignore the latency cycles.

The PS4 GPU is not 50% more powerful but the Durango is 33% less.

I think I covered everything.

Did you hear the Saturn was also 256-bit because it had so many 32-bit processors?
 
I hate to do the old B3D text dump, but ERP is one of the few there that has any real technical insight that is trustworthy.

Don't get me wrong, the primary purpose of the fast memory pool is to increase the overall bandwidth; they had to add a fast pool the moment they decided on DDR3. Having said that, they selected a low-latency solution because they saw value in it.

Yes, GPUs have caches; the question becomes how effective they are. It's hard to quantify without running a lot of tests on a lot of existing titles.

The system supports rendering to either memory pool; I suspect any real renderer would render to both inside a frame. There is the issue of how much data copying you end up doing, and the DMEs are there for a reason, but it all eats bandwidth.

PRTs make it feasible to know pretty much exactly what parts of what textures were actually used in the last frame, but you still have to use that knowledge effectively.

As I've said before, the split memory, bandwidth, and ROP count would still be the things that concern me most in the design, but I wouldn't judge anything without actually using it.

My guess is the quantity of memory was important early in the design, which dictated DDR3, which in turn dictated the fast memory pool; statistical data and manufacturing complexity probably indicated using SRAM instead of eDRAM.

Sony matching the 8GB is a big deal IMO.

http://forum.beyond3d.com/showpost.php?p=1718716&postcount=2025


The last bolded part is what I was guessing and have posted here. They wanted 8GB total, and that made the choice DDR (GDDR5 for 8GB was out of the question when they made the choice). The low bandwidth dictated the use of a faster cache, and the choice of eSRAM over eDRAM came down to the ability to fab it on the APU itself. They did not start with the decision to use eSRAM for latency; it was just a side benefit after all the choices were made.
 
yeah, i'm curious to see launch prices. Sony has a bigger opportunity for post-release price reduction, but they are going to be starting from a higher base.

I think it is the natural outcome of wanting two things

1. Unified memory
2. 8GB


At the time of the decision that meant DDR... and the rest of the design is to mitigate bandwidth issues. See my post above.

If they gave up #1 or #2, they could have done 4GB GDDR unified, or 4GB DDR and 4GB GDDR split, but they didn't want to sacrifice either one.
 