VGLeaks rumor: Durango CPU Overview

How long do you think it takes to produce a console? I'm curious.

Because right now you are saying it takes less than seven months to completely redesign the GPU, fix all the bugs, and get it rolling :/.

There is also this really odd thing where some posters seem to think that Sony and MS both design their consoles based on what the other guy is doing, with knee-jerk reactions to rumours.

Both have a silicon and dollar budget in mind, with a certain amount of flex. Both will know the general capability of different GPU/CPU options based on their expected launch times. It's entirely unsurprising that both ended up with similar technology, because they're launching around the same time using the same vendor.

The main difference is that MS looks like they wanted to lock down lots of RAM early on, which meant reducing risk and going with DDR3 (either for cost reasons or lack of faith that GDDR5 would increase in density in time), while Sony focused on putting more silicon aside for the GPU, putting the bandwidth outside the chip, and using GDDR5 - presumably they had good confidence that densities would allow them 4GB, with a stretch target of 8GB if possible.


Speculation - if 4Gbit chips weren't available until the second half of the year, would Sony have gambled on a short delay to get the RAM, or shipped with 4GB?
 
I guess people should calm down about the RAM upgrade for launch games then? It will, most likely, be the second generation of games that will take advantage of it.

That would be my bet. They'll likely get some ok bumps out of it for launch, but all these developers have been designing assets around 4GB of ram. No one is going to scrap months or even years of art assets 9 months before release just to take advantage of the new memory.

Reduced loading times, superior draw distances, things like that are viable. The significant jump in art assets this could allow isn't going to happen with any game that has gotten very far into real production though.
 
That would be my bet. They'll likely get some ok bumps out of it for launch, but all these developers have been designing assets around 4GB of ram. No one is going to scrap months or even years of art assets 9 months before release just to take advantage of the new memory.

Reduced loading times, superior draw distances, things like that are viable. The significant jump in art assets this could allow isn't going to happen with any game that has gotten very far into real production though.

The good news about the lower RAM initially is that Sony would (hopefully) have focused on a lean OS, so as not to waste precious fast RAM.
 
That would be my bet. They'll likely get some ok bumps out of it for launch, but all these developers have been designing assets around 4GB of ram. No one is going to scrap months or even years of art assets 9 months before release just to take advantage of the new memory.

Reduced loading times, superior draw distances, things like that are viable. The significant jump in art assets this could allow isn't going to happen with any game that has gotten very far into real production though.

Especially when they are on a very tight deadline to be ready for the launch.

Yeah, that's what I was saying. Sorry if I wasn't clear. Second Son is the most likely to use it well since it's an open world game.

That would be my bet too.
 
People here on NeoGAF have the docs too; Thuway (and Reiko?), for example, said the eSRAM was half the transistors of a GTX 680.
No way
http://www.neogaf.com/forum/showpost.php?p=48509538&postcount=285
Scanning around the internet, it seems that 64Mbit SRAM chips have been produced on TSMC's 28nm process with a cell size of 0.127mm²/Mbit.

For standard 6T-SRAM that's 32.5mm² for 32MB. And for 1T-SRAM it would be 6mm².
On 28nm 32MB of eSRAM would be 33mm² for 6T, and about 6-10mm² for the 1T variant.

Nowhere near 120mm².
Or ~60mm2 max
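To sanity-check the numbers above, here's a quick back-of-the-envelope sketch. The 0.127mm²/Mbit figure is the quoted TSMC 28nm 6T cell density; the 1T scaling factor is my own assumption, and real arrays need extra periphery area on top of the raw cells, so treat these as lower bounds:

```python
MM2_PER_MBIT_6T = 0.127   # quoted 28nm 6T-SRAM cell density (cells only)
ONE_T_SCALE = 1 / 5.4     # assumed 1T-SRAM density advantage (assumption)

def sram_area_mm2(megabytes, mm2_per_mbit=MM2_PER_MBIT_6T):
    """Area of the raw cell array for `megabytes` of SRAM."""
    megabits = megabytes * 8
    return megabits * mm2_per_mbit

area_6t = sram_area_mm2(32)               # ~32.5 mm^2 for 32MB of 6T
area_1t = sram_area_mm2(32) * ONE_T_SCALE # ~6 mm^2 for the 1T variant
print(f"32MB 6T-SRAM: {area_6t:.1f} mm^2, 1T: {area_1t:.1f} mm^2")
```

That reproduces the ~32.5mm² figure above; doubling it for periphery and routing overhead gets you to the ~60mm² ballpark, nowhere near 120mm².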
 
The other possibility is that it's really 64MB of eSRAM with 32MB fused off for yields. If that is the case, then one thing MS can do is sacrifice yields for the higher RAM amount.

But who knows what specs are real. It's my understanding that the Durango specs that leaked from SuperDaE or whoever (which seem to be the source of VGLeaks) are almost a year old, and MS still has a good four-plus months before they have to start production. Some of these specs may have been based on the idea of launching in 2012. DDR3 may be swapped for DDR4, and the eSRAM and other things could be increased. Or MS may be happy with the specs and just massively undercut the price of the PS4. Best-guess estimates put the PS4 at least $100 more to produce.
 
Isn't it a shitty practice, releasing those "leaks" in dribs and drabs and not saying how old the info is?

Just like their first leak - the Gears of War: Judgment trailer

VGLeaks
[leaked screenshots: gowj51.jpg, gowj9.jpg, gowj111.jpg]

Day-after official trailer
[matching frames: vlcsnap-2013-02-28-11hbo2x.png, vlcsnap-2013-02-28-11dlokv.png, vlcsnap-2013-02-28-11omrih.png]
 
The other possibility is that it's really 64MB of eSRAM with 32MB fused off for yields. If that is the case, then one thing MS can do is sacrifice yields for the higher RAM amount.

But who knows what specs are real. It's my understanding that the Durango specs that leaked from SuperDaE or whoever (which seem to be the source of VGLeaks) are almost a year old, and MS still has a good four-plus months before they have to start production. Some of these specs may have been based on the idea of launching in 2012. DDR3 may be swapped for DDR4, and the eSRAM and other things could be increased. Or MS may be happy with the specs and just massively undercut the price of the PS4. Best-guess estimates put the PS4 at least $100 more to produce.

That's how long it takes to make a console: if they were planning on launching in 2012, they should have been done in 2011. Maybe they were planning on launching in 2012, but things went wrong and they have to launch in 2013 now.
 
Isn't it a shitty practice, releasing those "leaks" in dribs and drabs and not saying how old the info is?

Just like their first leak - the Gears of War: Judgment trailer

VGLeaks
[leaked screenshots: gowj51.jpg, gowj9.jpg, gowj111.jpg]

Day-after official trailer
[matching frames: vlcsnap-2013-02-28-11hbo2x.png, vlcsnap-2013-02-28-11dlokv.png, vlcsnap-2013-02-28-11omrih.png]


They've been dragging these old docs out for almost two months now; it's at the point where I question their practices. We still don't know the power of the audio blocks either.
 
Well, I hope that VGLeaks is just trying to grab some hits and decided to basically do an overview of stock Jaguar...

If MS isn't doing anything beyond stock with the CPU, then I'm as disappointed with MS as I can possibly get, going by everything we know about Durango.
 
6 transistors per bit = 48 transistors per byte
49,152 transistors per KB
50,331,648 transistors per MB
1,610,612,736 transistors for 32MB, which is nearly 1.6 billion transistors. The Nvidia GTX 680 is 3.5 billion.
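The arithmetic above, spelled out (6 transistors per bit for a standard 6T SRAM cell):

```python
def sram_transistors(n_bytes, transistors_per_bit=6):
    """Transistor count for an SRAM array of `n_bytes` (cells only)."""
    return n_bytes * 8 * transistors_per_bit

KIB, MIB = 1024, 1024**2
assert sram_transistors(KIB) == 49_152        # per KB
assert sram_transistors(MIB) == 50_331_648    # per MB

total = sram_transistors(32 * MIB)
print(f"{total:,} transistors")  # 1,610,612,736 -- ~1.6 billion,
                                 # vs ~3.5 billion for a GTX 680
```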

The size will depend on the density used: the higher the density, the more heat and the harder it is to fab.
In 2007, Intel's 291Mbit (~37MB) eSRAM was already smaller than half a GTX 680.
Right now 32MB of eSRAM can be 33mm²; I can't see them needing 140-150mm² to solve the problem.

Plus, Thuway kept saying stuff that made no sense before he got banned.
 
That's how long it takes to make a console: if they were planning on launching in 2012, they should have been done in 2011. Maybe they were planning on launching in 2012, but things went wrong and they have to launch in 2013 now.

I'm not sure what you're trying to say, but it doesn't take long to produce a console. It's a few weeks' turnaround from when they start producing them to when they hit stores. Even at two months, that means for a traditional November launch production wouldn't have to start until the end of August into September, depending on exact dates. Even the PS4 isn't fully final yet; clock speeds can and will change.
 
If we're speculating likelihood, I think it's even more likely that the people leaking specs don't understand them. E.g. they read "double-pumped vector unit" and think "oh wow, it's double-pumped, probably twice as fast as stock Jaguar!".

And plus they're naive... basically all of our insiders are "speculators" who get bits of info from real insiders who can't say anything, and they have no basic technical knowledge at all to interpret it, so they add their own take to it, which is a problem since they have no clue at all.
 
Well, that's disappointing to know that consoles are not a match for 2012 PC hardware. :/

Don't try to compare PC hardware to Console hardware. Put this in perspective: the original Xbox cpu provided by Intel was based on a mobile version of a hybrid Pentium 3/Celeron CPU.
 
Both PS4 and the next Xbox are based on the same identical AMD APU technology.
All the differences so far depend on a single element: the memory system.
Microsoft wanted a lot of memory capacity for their non-gaming ambitions, and at the design stage they thought they could only achieve that with DDR3. Sony initially settled on a lower capacity, thinking bandwidth was more important for games and 4GB was enough, so they designed the system around GDDR5.
To overcome the bandwidth limitations of DDR3 while keeping the die size of the APU the same, Microsoft had to swap a few compute units and ROPs in the GPU for 32MB of embedded memory.
For all other purposes, it's absolutely realistic to think that AMD is providing the same APU.
Honestly, I think Microsoft might be a bit angry and frustrated internally at the moment, because right at the last minute, thanks to the introduction of 4Gbit GDDR5 modules, 8GB of GDDR5 has become feasible at a reasonable cost, which has somewhat invalidated their design approach.
But at this stage there's nothing they can do if they want to release their product in 2013. Still, they'll enjoy a cheaper BOM because of DDR3 vs GDDR5, and that could offset the financial burden of including Kinect 2 in every console.
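For reference, the peak-bandwidth arithmetic behind that DDR3-vs-GDDR5 tradeoff works out like this. The bus widths and transfer rates below are the rumored, unconfirmed specs discussed in this thread, so treat them as assumptions:

```python
def peak_bandwidth_gbs(bus_bits, transfer_rate_gtps):
    """Bus width in bits x transfer rate in GT/s -> peak GB/s."""
    return bus_bits / 8 * transfer_rate_gtps

ddr3  = peak_bandwidth_gbs(256, 2.133)  # Durango rumor: 256-bit DDR3-2133
gddr5 = peak_bandwidth_gbs(256, 5.5)    # PS4 rumor: 256-bit GDDR5 @ 5.5 GT/s
print(f"DDR3: {ddr3:.1f} GB/s, GDDR5: {gddr5:.1f} GB/s")
```

On the same 256-bit bus, GDDR5's higher transfer rate is worth roughly 2.5x the bandwidth, which is the gap the 32MB of embedded memory is meant to paper over.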

solid post & great summary.
 
I'm not sure what you're trying to say, but it doesn't take long to produce a console. It's a few weeks' turnaround from when they start producing them to when they hit stores. Even at two months, that means for a traditional November launch production wouldn't have to start until the end of August into September, depending on exact dates. Even the PS4 isn't fully final yet; clock speeds can and will change.

It takes quite a lot of time to change the APU... not a few weeks. That's what he was referring to.
 
How do you guys figure out how good it is from that figure? Personally I think it's the colors. Doesn't red/orange mean more power and green/blue mean easier to program for? Experts, please correct me if I'm wrong.
 
It takes quite a lot of time to change the APU... not a few weeks. That's what he was referring to.
Right, but it's not a few weeks we are talking about; it can be upwards of a year and a half, depending on when the VGLeaks documentation is from. I'm also not actually talking about changing anything except re-enabling hardware that may have been disabled to help yields, like if Sony had enabled the extra SPE in the Cell processor. eSRAM / CUs / ROPs may all be disabled depending on yields. So if you have, say, a 14-CU setup, you disable 2 CUs to increase yields: any 2 of the 14 could be bad in that setup and you would still get more good chips than bad. If enough chips were coming back from the fab early on with 14 working CUs, they could decide to just ship the 14-CU version. That doesn't need months to change. Same goes for everything else. Even RAM wouldn't be a big deal, as many modern GPUs come with the ability to use different RAM types. If the memory controller supports DDR4, then a 2012 console may have specced DDR3 because DDR4 wasn't in production; but if it's available for 2013, it suddenly becomes an option. That also wouldn't require months to change.
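The yield argument above can be sketched with a toy binomial model: if any 2 of 14 CUs are allowed to be bad, far more dice pass. The per-CU defect probability here is a made-up illustrative number, not anything from the leaked docs:

```python
from math import comb

def yield_with_redundancy(total_units, needed_units, p_unit_good):
    """P(at least `needed_units` of `total_units` CUs are defect-free)."""
    return sum(
        comb(total_units, k)
        * p_unit_good**k
        * (1 - p_unit_good) ** (total_units - k)
        for k in range(needed_units, total_units + 1)
    )

p = 0.97  # assumed probability a single CU is defect-free (illustrative)
need_all_14 = yield_with_redundancy(14, 14, p)  # every CU must work
allow_2_bad = yield_with_redundancy(14, 12, p)  # 12-of-14 is enough
print(f"all-14: {need_all_14:.1%}, 12-of-14: {allow_2_bad:.1%}")
```

The 12-of-14 yield is dramatically higher, which is why shipping with spare units disabled is attractive, and why enabling them later is a cheap change if yields turn out well.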
 
Depends how you look at it; the dev kits would have 8GB of GDDR5 because dev kits generally have 2x the final amount. So it might take advantage of some of it.

Yeah, the current latest dev kits do have 8GB of unified GDDR5, so the devs have ~8-9 months to take whatever advantage of it they can.
 
Right, but it's not a few weeks we are talking about; it can be upwards of a year and a half, depending on when the VGLeaks documentation is from. I'm also not actually talking about changing anything except re-enabling hardware that may have been disabled to help yields, like if Sony had enabled the extra SPE in the Cell processor. eSRAM / CUs / ROPs may all be disabled depending on yields. So if you have, say, a 14-CU setup, you disable 2 CUs to increase yields: any 2 of the 14 could be bad in that setup and you would still get more good chips than bad. If enough chips were coming back from the fab early on with 14 working CUs, they could decide to just ship the 14-CU version. That doesn't need months to change. Same goes for everything else. Even RAM wouldn't be a big deal, as many modern GPUs come with the ability to use different RAM types. If the memory controller supports DDR4, then a 2012 console may have specced DDR3 because DDR4 wasn't in production; but if it's available for 2013, it suddenly becomes an option. That also wouldn't require months to change.

Well yes, that's also where the rumor of a higher clock speed comes from: if your yields are good... The rumored CU count is lower due to size issues and having the eSRAM on chip, though, so I wouldn't think they'd be disabled; they'd be gone.
 
Both PS4 and the next Xbox are based on the same identical AMD APU technology.
All the differences so far depend on a single element: the memory system.
Microsoft wanted a lot of memory capacity for their non-gaming ambitions, and at the design stage they thought they could only achieve that with DDR3. Sony initially settled on a lower capacity, thinking bandwidth was more important for games and 4GB was enough, so they designed the system around GDDR5.
To overcome the bandwidth limitations of DDR3 while keeping the die size of the APU the same, Microsoft had to swap a few compute units and ROPs in the GPU for 32MB of embedded memory.
For all other purposes, it's absolutely realistic to think that AMD is providing the same APU.
Honestly, I think Microsoft might be a bit angry and frustrated internally at the moment, because right at the last minute, thanks to the introduction of 4Gbit GDDR5 modules, 8GB of GDDR5 has become feasible at a reasonable cost, which has somewhat invalidated their design approach.
But at this stage there's nothing they can do if they want to release their product in 2013. Still, they'll enjoy a cheaper BOM because of DDR3 vs GDDR5, and that could offset the financial burden of including Kinect 2 in every console.

Well, if the Durango APU is actually using SRAM for the 32MB of embedded memory, as many people insist, that means it's probably bigger and more expensive to manufacture than the PS4 APU, despite being less powerful. Second, they supposedly added new processing chips to Kinect 2 rather than removing them, so it should be substantially more expensive to manufacture than the PS4's stereo camera (assuming both are bundled). Those two things could largely offset the additional cost of the GDDR5 memory in the PS4. I wouldn't assume the PS4 is cheaper to make; I think they will be very close to the same cost to manufacture.

I honestly can't tell whether you guys are serious or trolling.

I mean, really? MS had no idea that in a timespan of 8 years they would be able to get at least an 8x increase over the 360's amount of memory, even using the fastest memory available, despite having made that same jump in half the time before? I could understand that scenario if they had decided that this time their memory budget would be much lower than it was for the 360 (which we have no indication was the case); other than that, it's just plain stupidity if that was their reasoning.

And what baffles me even more is that you sound like none of those changes could have been actual improvements to the design... It's impossible that MS research realized that a significant amount of GPU resources goes to waste on memory operations, and found that dedicating a portion of the transistors to low-latency memory could improve performance more than spending those transistors on more processing power. It's impossible that they realized that waiting for sync between the CPU and GPU can also eat a huge portion of the frame time, and that the same low-latency memory, coupled with dedicated hardware to move data in parallel, could also improve performance. It's impossible that they decided a single large bus could be too costly in the long term, especially since buses don't shrink well, and realized that two half-width buses to different RAM modules could provide similar bandwidth at a lesser cost, and that the same data-mover hardware could be used, with customizations on the GPU, so it sees both as a single pool...

No, none of this is possible; the only reasonable explanation is that MS was handed a very powerful chip and, to support a very stupid decision, spent a lot of time and money making that chip slower and costlier to produce. And the decision is now even more stupid because the competitor has all the advantages, even the one that was the sole reason for the stupid decision in the first place.

I mean, really, just think for one second of what you guys are saying...
 
In 2007, Intel's 291Mbit (~37MB) eSRAM was already smaller than half a GTX 680.
Right now 32MB of eSRAM can be 33mm²; I can't see them needing 140-150mm² to solve the problem.

Plus, Thuway kept saying stuff that made no sense before he got banned.

He's not talking about die area, just transistor count. Also, you can't only consider the memory cells when estimating die area. There's no way Intel could fit that amount of SRAM into such a small area in 2007, and neither could TSMC fit 32MB into 33mm². I agree it's not going to be 120mm² or more, of course. That's the kind of size you'd be looking at on a 40nm process; on 28nm it's going to be more like 60mm² (since memory cells shrink quite well).

EDIT: Oh just noticed from an earlier post that you came to a similar conclusion as me, around 60mm2.
 
COD is like crack though. I hate on that game every time we see a new trailer or announcement but just like a battered spouse I return every year for my punishment.

smh lol

When people say GCN when talking about these processors, are they referring to the GameCube?

Are they still using GameCube architecture in their processors?

GCN = Graphics Core Next. It's AMD's latest GPU technology.
 