VGLeaks rumor: Durango CPU Overview

Same CPU! IMO there's no word on how many cores or how much memory Sony will reserve, but I assume at least one. So at best you'd have a difference of one core, which is way under 30%.

He was referring to the rumour that Sony are running the AMD CPU @2GHz rather than the stock 1.6GHz that Durango is running it at.
 
But I think they already did that by reserving 512MB. Let's be realistic here, I don't think the PS4 OS will need more than 128-256MB for all its functionality, considering that Vita, which has a lot of OS functionality, reserves something around 90MB for the OS.

And let's also remember Xbox 360 only uses 32MB.

If Microsoft really is reserving 3GB of memory for Durango's OS, then at this point it's still baffling what it'll be used for. Even the full PC version of Windows 8 will run on a PC with only 1GB of RAM, so the footprint must be much less than that. Never mind the RT version, which would likely be what runs if W8 on Durango actually happens.

Of course, the 3GB footprint rumour could just be utter bollocks, which is more likely.

EDIT: Unless multi-tasking apps are going to be a big part of Durango..
 
Same CPU! IMO there's no word on how many cores or how much memory Sony will reserve, but I assume at least one. So at best you'd have a difference of one core, which is way under 30%.


Once again, I'd assume at least 1GB of PS4 RAM reserved, probably more, so at current rumors a 7-5 GB edge at best.


This one's true, provided the rumors are correct and ERP's eSRAM theories don't hold any water.

I suppose, but there's the eSRAM and DMEs. For the millionth time, straight comparing main memory bandwidth is not the full picture.



Sure. Not sure they're not overkill for 1080p, but yes.

The bottom line is that the CPU and RAM amounts are the same, and beyond that, any OS reservations for either machine are speculation and subject to later change.


http://www.eurogamer.net/articles/df-hardware-orbis-unmasked-what-to-expect-from-next-gen-console

"Additional hardware: GPU-like Compute module, some resources reserved by the OS"


"We also have hard data on Orbis's memory set-up. It features 4GB of GDDR5 - the ultra-fast RAM that typically ships with the latest PC graphics cards - with 512MB reserved for the operating system. This is in stark contrast to the much slower DDR3 that Durango will almost certainly ship with. Microsoft looks set to be using an offshoot of eDRAM technology connected to the graphics core to offset the bandwidth issues the use of DDR3 incurs. Volume of RAM is the key element in Durango's favour - there'll be 8GB in total, with a significant amount (two sources we've spoken to suggest 3GB in total) reserved for the OS.

There'll be a relatively high CPU overhead too, with potentially two cores reserved for the customisable apps Microsoft wants to run in parallel with gameplay. Orbis has no such ambitions and may power past the new Xbox simply because it focuses its resources on out-and-out games power. There's always the possibility that Microsoft has looked at the prior success of Nintendo and its own Kinect and come to the conclusion that chasing after the maximum in raw horsepower isn't the way to win the next console war."
 
Global Unichip lands sensor chip orders for Microsoft Xbox 720, says paper

IC design service company Global Unichip reportedly has entered, for the first time, the supply chain of Microsoft and will design and ship its ASIC solutions for the Kinect sensors of Microsoft's Xbox 720 gaming machines, according to a Chinese-language Economic Daily News (EDN) report.

Linky
 
Global Unichip lands sensor chip orders for Microsoft Xbox 720, says paper

IC design service company Global Unichip reportedly has entered, for the first time, the supply chain of Microsoft and will design and ship its ASIC solutions for the Kinect sensors of Microsoft's Xbox 720 gaming machines, according to a Chinese-language Economic Daily News (EDN) report.

Linky

That's good news. I was wondering when we were going to hear something about the Kinect co-processor.
 
Global Unichip lands sensor chip orders for Microsoft Xbox 720, says paper

IC design service company Global Unichip reportedly has entered, for the first time, the supply chain of Microsoft and will design and ship its ASIC solutions for the Kinect sensors of Microsoft's Xbox 720 gaming machines

...sigh, they really are going to persist with that pile of shit Kinect, aren't they? Sums up everything wrong about Xbox these days.

Bye-bye MS, it was fun while it lasted, you'll get no more of my money so long as you keep going down this casual gamer road you seem so determined to take.
 
...sigh, they really are going to persist with that pile of shit Kinect, aren't they? Sums up everything wrong about Xbox these days.

Bye-bye MS, it was fun while it lasted, you'll get no more of my money so long as you keep going down this casual gamer road you seem so determined to take.

Just because it wasn't up to par the first time around doesn't mean it's going to be a pile of shit. Everything has to start somewhere and improve as time goes by. Controllers weren't always so good, and neither were touch screens or any other control interface at the beginning.


I'm sure the first mouse & keyboard weren't a pleasure to work with.
 
...sigh, they really are going to persist with that pile of shit Kinect, aren't they? Sums up everything wrong about Xbox these days.

Bye-bye MS, it was fun while it lasted, you'll get no more of my money so long as you keep going down this casual gamer road you seem so determined to take.

This doesn't mean it won't have a standard controller. Just as PS4's new dual-lens camera doesn't negate its controller either.

Some of us have our consoles in our lounges and have real life friends. As a result, it's fun to play Kinect/Move/Wii games every now and again in that social setting. However, it doesn't stop me from also enjoying games like Forza, DmC, Gears etc, nor should it stop you.
 
I'm fine with Kinect as an option, as long as it doesn't interfere too much with normal games. I think it had a lot of potential, but really needed to track faster and have better fidelity (finger tracking).
 
Just because it wasn't up to par the first time around doesn't mean it's going to be a pile of shit. Everything has to start somewhere and improve as time goes by. Controllers weren't always so good, and neither were touch screens or any other control interface at the beginning.


I'm sure the first mouse & keyboard weren't a pleasure to work with.

This doesn't mean it won't have a standard controller. Just as PS4's new dual-lens camera doesn't negate its controller either.

Some of us have our consoles in our lounges and have real life friends. As a result, it's fun to play Kinect/Move/Wii games every now and again in that social setting. However, it doesn't stop me from also enjoying games like Forza, DmC, Gears etc, nor should it stop you.


No, no, no... just give me a box and a controller. Anything else is just marketing gimmicks thought up by accountants and 21-year-old hipster cunts who've just joined MS and are trying to impress their bosses.

I don't need Kinect and I don't want it.
 
This doesn't mean it won't have a standard controller. Just as PS4's new dual-lens camera doesn't negate its controller either.

Some of us have our consoles in our lounges and have real life friends. As a result, it's fun to play Kinect/Move/Wii games every now and again in that social setting. However, it doesn't stop me from also enjoying games like Forza, DmC, Gears etc, nor should it stop you.

Including Kinect in every box probably adds $60-80 to the BoM; that cost has to come from somewhere, and it looks like it has come from the APU silicon budget. The PS4Eye is much, much less sophisticated and will not add anywhere near that much to the BoM for PS4, allowing Sony to push a bit harder on specs but maintain a similar price structure.

I'm fine with Kinect as an option, as long as it doesn't interfere too much with normal games. I think it had a lot of potential, but really needed to track faster and have better fidelity (finger tracking).

Kinect is probably going to be on all the time if they are setting aside processing power and RAM for it. That means games might have to use it by default.
 
I thought the rumor mill indicated an always-on, mandatory calibration of Kinect was necessary for Durango to function. I also recall that it may be related to advertising.
 
I thought the rumor mill indicated an always-on, mandatory calibration of Kinect was necessary for Durango to function. I also recall that it may be related to advertising.

Well, to be honest, Mark Cerny during the PS4 unveil also mentioned, when talking about the controller & depth cameras, that PS4 will be aware of the gamer's position in 3D space. Probably not to the same extent of sophistication as Xbox, but it will be there for the devs to use.
 
Including Kinect in every box probably adds $60-80 to the BoM; that cost has to come from somewhere, and it looks like it has come from the APU silicon budget. The PS4Eye is much, much less sophisticated and will not add anywhere near that much to the BoM for PS4, allowing Sony to push a bit harder on specs but maintain a similar price structure.



Kinect is probably going to be on all the time if they are setting aside processing power and RAM for it. That means games might have to use it by default.

Not HAVE to use it... but certain things will always be available to games to do with what they please.

Who is playing, for example. Are there other people around? Maybe even what the expression of the one playing is? What is being said? UIs have never had this feature built in. My PC here doesn't really know that it's really me typing, or whether I am smiling or laughing. Durango will know that, and hopefully devs will be able to do something interesting with it. Durango will just have a lot more metadata to play with.

As a dev I can think of reasons why this could be useful. With all the crazy ideas that came out of Kinect on PC, I am sure we will see things we have not thought about. The fact that Durango might automatically pause the game if we run off to open the door, for example, is a small thing, but people will quickly get used to small stuff like that.

Well, to be honest, Mark Cerny during the PS4 unveil also mentioned, when talking about the controller & depth cameras, that PS4 will be aware of the gamer's position in 3D space. Probably not to the same extent of sophistication as Xbox, but it will be there for the devs to use.

If it is not included on every machine, they will not.
 
...sigh, they really are going to persist with that pile of shit Kinect, aren't they? Sums up everything wrong about Xbox these days.

Bye-bye MS, it was fun while it lasted, you'll get no more of my money so long as you keep going down this casual gamer road you seem so determined to take.

This is an example of why I love that MS has had the success they have had with Kinect. People only bash MS for Kinect, while pretending Move isn't on PS4 and ignoring that Nintendo's Wii-U's biggest push for launch games was all casual.

Kinect was an all-out success for the time frame it was released and what MS was able to pull off.

But, in the end, everyone knew that it would take a new console designed with Kinect embedded for it to reach its full potential.

Even if the leaks about power remain as they are, MS's true core fans will be there for the next box, while it can succeed even the mighty Wii as next gen's go-to family console with an improved Kinect.

If they pull this off at E3, the next Xbox is going to fly off the shelves regardless of the usual hate in any forum on the net.


No, no, no... just give me a box and a controller. Anything else is just marketing gimmicks thought up by accountants and 21-year-old hipster cunts who've just joined MS and are trying to impress their bosses.

I don't need Kinect and I don't want it.

Yup, and there are people who own iPhones that have never played Angry Birds in their entire lives. But it would be strange to pretend that a cellphone with casual apps somehow doesn't let me make calls as its primary function.

At the end of the day, motion-control devices aren't for everyone and never were. Not everyone liked Wii-motes/Wii Fit or Guitar Hero/Rock Band in the past, just like some people don't like Move or Kinect at the moment.

Which is why, if I'm MS, I continue to improve on what has made me successful thus far, and the option to play with Kinect has clearly brought them big success at the end of this gen.
 
I do not think that the eSRAM will help with rendering power much (if at all); it's there solely to increase the bandwidth and assist the DDR3.
They've chosen eSRAM over eDRAM; I'd say really low latency is their priority, not bandwidth.

With fewer transistors (compared to 6T eSRAM; are there alternatives to that?) they could have chosen 128MB of eDRAM, or 96MB of eDRAM with attached ROPs and a way higher internal bandwidth than 102GB/s.
Instead they've chosen the more costly eSRAM.

So no, I don't think the eSRAM "is there solely to increase the bandwidth and assist the DDR3"; it's there to provide some kind of big cache with really low latency. The big question is why they think low latency is so important in their architecture. I doubt we'll get an answer to this question soon.
 
This is an example of why I love that MS has had the success they have had with Kinect. People only bash MS for Kinect, while pretending Move isn't on PS4 and ignoring that Nintendo's Wii-U's biggest push for launch games was all casual.

Kinect was an all-out success for the time frame it was released and what MS was able to pull off.

To people who use emotive phrases like these, promises always appear far more important than quality. Kinect could be so much better if MS actually put the required effort and research into improving its very apparent pitfalls. However, due to clever marketing and buyers like the above, they've clearly settled on "it's just about good enough, so why spend money improving it" - if the rumored specs are anything to go by.

You may complain or draw parallels to the Wiimote or even the Move, but both of those actually work. On few to no occasions would you find yourself fighting the controls when using them, even when things get frantic. Kinect, on the other hand, is a lag-filled, inaccurate mess when it's actually doing its main function, which is tracking full-body movement. Lesser functions like pose recognition and hand tracking are a bit laggy but within reason, at least most of the time.

Kinect can be so much better, but it's clear it might never get there if we don't talk with our wallets.
 
To people who use emotive phrases like these, promises always appear far more important than quality. Kinect could be so much better if MS actually put the required effort and research into improving its very apparent pitfalls. However, due to clever marketing and buyers like the above, they've clearly settled on "it's just about good enough, so why spend money improving it" - if the rumored specs are anything to go by.

You insane? Kinect 2.0's specs are a generational step up from the old one, if we can believe the rumors.
 
They've chosen eSRAM over eDRAM; I'd say really low latency is their priority, not bandwidth.

With fewer transistors (compared to 6T eSRAM; are there alternatives to that?) they could have chosen 128MB of eDRAM, or 96MB of eDRAM with attached ROPs and a way higher internal bandwidth than 102GB/s.
Instead they've chosen the more costly eSRAM.

So no, I don't think the eSRAM "is there solely to increase the bandwidth and assist the DDR3"; it's there to provide some kind of big cache with really low latency. The big question is why they think low latency is so important in their architecture. I doubt we'll get an answer to this question soon.

Low latency is nearly always only really relevant to CPU tasks, if I recall correctly - i.e. moving lots of small files around.

So you could have answered your own question there. Is that eSRAM geared more towards the Xbox 3 OS/Kinect?...
 
Low latency is nearly always only really relevant to CPU tasks, if I recall correctly - i.e. moving lots of small files around.

So you could have answered your own question there. Is that eSRAM geared more towards the Xbox 3 OS/Kinect?...

The eSRAM is not geared towards the OS or Kinect any more or less than it is towards feeding the GPU. The eSRAM is fundamental to Durango's whole design.
 
You insane? Kinect 2.0's specs are a generational step up from the old one, if we can believe the rumors.

Exactly how is it a generational "step up"? It now tracks 26 instead of 20 joints, yes, but they appear to be focusing on tracking more individuals (6) rather than tracking the same number of individuals more accurately, and the latency improvement appears to be minimal.

In theory it should be able to track 78 joints across 2 individuals, so I hope that will eventually pan out, leading to accurate finger, hand and wrist tracking. With augmented reality that could become quite useful; however, based on the state Kinect was released in and is currently in, it would be naive of me to hope for the best.

So realistically, no, it's not a generational step. It's merely a step in the right direction. A leap would have been decreasing the latency to as low as possible.
 
People seem to forget that latency "advantages" in APU setups are really negligible. In fact, APUs have consistently shown that speed (bandwidth) and quantity matter more than latency and frequency.

http://www.legitreviews.com/article/1652/3/

This is supposed to be one of the huge appeals of APUs: their ability to have the CPU and GPU accommodate for each other's deficiencies.

Of course, we have never seen anything on the scale of the PS4's APU (all of these tests involved RAM with sub-60GB/s bandwidth on low-power APUs), so the results might be up in the air, but if they are consistent, higher bandwidth will improve the overall performance of an APU more than lower latency.
 
They've chosen eSRAM over eDRAM; I'd say really low latency is their priority, not bandwidth.

With fewer transistors (compared to 6T eSRAM; are there alternatives to that?) they could have chosen 128MB of eDRAM, or 96MB of eDRAM with attached ROPs and a way higher internal bandwidth than 102GB/s.
Instead they've chosen the more costly eSRAM.

So no, I don't think the eSRAM "is there solely to increase the bandwidth and assist the DDR3"; it's there to provide some kind of big cache with really low latency. The big question is why they think low latency is so important in their architecture. I doubt we'll get an answer to this question soon.
The eSRAM's 102GB/s should be compared to the 360's eDRAM-to-GPU speed of 32GB/s, not the eDRAM's internal bandwidth. If the eSRAM still had internal bandwidth like the 360's eDRAM, it would be a lot faster than 256GB/s.

Including Kinect in every box probably adds $60-80 to the BoM; that cost has to come from somewhere, and it looks like it has come from the APU silicon budget. The PS4Eye is much, much less sophisticated and will not add anywhere near that much to the BoM for PS4, allowing Sony to push a bit harder on specs but maintain a similar price structure.
Kinect 2's BoM will likely be even lower than $56 (Kinect 1's BoM), since MS will use their own tech this time.
 
You insane? Kinect 2.0 specs are a generation step up from the old one if we can believe the rumors.

What does "generation step" imply? Small step?

23% increase in effective depth resolution and a few more skeletal points tracked isn't huge.
 
Decent jump if it's real
Active IR camera should help it a lot
[image: rumored Kinect 1 vs Kinect 2 spec comparison chart]
 
Exactly how is it a generational "step up"? It now tracks 26 instead of 20 joints, yes, but they appear to be focusing on tracking more individuals (6) rather than tracking the same number of individuals more accurately, and the latency improvement appears to be minimal.

In theory it should be able to track 78 joints across 2 individuals, so I hope that will eventually pan out, leading to accurate finger, hand and wrist tracking. With augmented reality that could become quite useful; however, based on the state Kinect was released in and is currently in, it would be naive of me to hope for the best.

So realistically, no, it's not a generational step. It's merely a step in the right direction. A leap would have been decreasing the latency to as low as possible.

It is tracking hands now.
 
What does "generation step" imply? Small step?

23% increase in effective depth resolution and a few more skeletal points tracked isn't huge.

I've asked you this before and you never answered... Where are you getting 23%? I get 2.8x, or 180% more.

Also, from 40 joints to 150 is more than a few.
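For what it's worth, the "2.8x" figure lines up with the depth-map resolutions usually quoted in these rumors; the 320x240 and 512x424 numbers below are assumptions from the rumor mill, not confirmed specs:

```python
# Hypothetical check of the "2.8x" depth-resolution claim. The
# 320x240 (Kinect v1) and 512x424 (rumored Kinect 2) figures are
# assumptions taken from the usual rumors, not confirmed specs.
v1_pixels = 320 * 240   # 76,800 depth samples per frame
v2_pixels = 512 * 424   # 217,088 depth samples per frame

ratio = v2_pixels / v1_pixels
print(round(ratio, 2))  # -> 2.83, i.e. roughly 180% more depth pixels
```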
 
What does "generation step" imply? Small step?

23% increase in effective depth resolution and a few more skeletal points tracked isn't huge.
It's true the camera hasn't changed much. But the processing power and bandwidth available have been significantly improved, which should allow for faster, more accurate tracking.
 
What does "generation step" imply? Small step?

23% increase in effective depth resolution and a few more skeletal points tracked isn't huge.

Double the resolution, 40 joints vs 150 total, better depth and less latency. It looks like a 4 to 5x power increase, and as a result it also has a host of new middleware solutions. What did you expect? 4K resolution?
 
Not fingers though, which, imo, would really be a generational leap.

The new minimum required distance is 0.5 meters, so I can see this tracking fingers. It's a higher-fidelity camera setup with more processing power, so it's completely possible that it can track fingers.
 
Not fingers though, which, imo, would really be a generational leap.

It is tracking some states of the hand (open, closed, one or a few fingers raised)... but yeah, not all movement of all fingers. The question is whether that is really the point of a device like this. Probably more useful if you want a virtual keyboard.

That being said... Kinect has the resolution to track fingers, as a few PC implementations have shown in an ideal setup.
 
Not being held back by the USB is going to be huge.

& it can track the pointer finger and thumb, for pew-pew-pew moments.
 
It is tracking some states of the hand (open, closed, one or a few fingers raised)... but yeah, not all movement of all fingers. The question is whether that is really the point of a device like this. Probably more useful if you want a virtual keyboard.

That being said... Kinect has the resolution to track fingers, as a few PC implementations have shown in an ideal setup.

So does a normal webcam.
 
So no, I don't think the eSRAM "is there solely to increase the bandwidth and assist the DDR3"; it's there to provide some kind of big cache with really low latency. The big question is why they think low latency is so important in their architecture. I doubt we'll get an answer to this question soon.

I completely agree with you. I think the eSRAM will be used as somewhat of a software-controlled cache (kind of like an L3 equivalent).

According to VGLeaks, for the CPU an L1 miss costs around 17 cycles and an L2 miss around 144-160 cycles. Since the GPU runs at half the clock, it sees half those numbers. So basically, if the GPU blows through both caches, you stall for 80 or so cycles to get data from the DDR3.

Acert on Beyond3D mentioned that he heard the SRAM latency is around 16-20 cycles. If true, that's 4-5x lower latency than DDR3 and will definitely reduce the number of stalls in cases where the L2 cannot be prefetched.
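Plugging those rumored cycle counts together gives a rough feel for the ratio; averaging the quoted ranges is my own simplification, and all the inputs are rumors:

```python
# Rough stall-cost comparison using only the cycle counts quoted above;
# averaging the quoted ranges is a simplification of mine.
l2_miss_cpu = (144 + 160) / 2    # ~152 CPU cycles to DDR3 on an L2 miss
l2_miss_gpu = l2_miss_cpu / 2    # ~76 GPU cycles (GPU runs at half clock)
esram_gpu   = (16 + 20) / 2      # ~18 cycles, if the SRAM rumor holds

print(round(l2_miss_gpu / esram_gpu, 1))  # -> 4.2, inside the claimed 4-5x
```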

My theory is that this design is driven by tile-based rendering, where the main DDR3 RAM acts mainly as a read-only buffer for the inputs (textures, vertices, etc.) and a write-only buffer for the final rendered tile.
If the tiles are sized appropriately, then I think all the intermediate data should stay in the SRAM and not need to be written back to main memory. The move engines are then used for all the tile-data prefetch from main DDR and the final write to DDR. You could have a triple-buffered scheme, where you're moving data for the next tile into the SRAM, processing the current tile, and writing the previously processed tile back to main memory, all in parallel. That way all the data movement is hidden and the GPU can just process continually.
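That triple-buffered scheme can be sketched as a simple three-stage schedule; everything here (the function, the stage names) is an illustrative model of the theory above, not anything from the leaks:

```python
def tile_pipeline_schedule(n_tiles):
    """Per-step (prefetch, process, writeback) tile indices for the
    triple-buffered scheme: while tile i is shaded out of SRAM, a move
    engine fetches tile i+1 from DDR3 and another writes tile i-1 back.
    None marks an idle stage while the pipeline fills or drains."""
    schedule = []
    for step in range(n_tiles + 2):
        prefetch  = step     if step < n_tiles else None
        process   = step - 1 if 0 <= step - 1 < n_tiles else None
        writeback = step - 2 if 0 <= step - 2 < n_tiles else None
        schedule.append((prefetch, process, writeback))
    return schedule

# After the two-step fill, all three stages overlap every step,
# so the data movement is hidden behind the GPU's processing.
print(tile_pipeline_schedule(3))
```

For three tiles this prints `[(0, None, None), (1, 0, None), (2, 1, 0), (None, 2, 1), (None, None, 2)]`: the middle step shows prefetch, shading and writeback all busy at once.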




The eSRAM's 102GB/s should be compared to the 360's eDRAM-to-GPU speed of 32GB/s, not the eDRAM's internal bandwidth. If the eSRAM still had internal bandwidth like the 360's eDRAM, it would be a lot faster than 256GB/s.

In the 360, the ROPs were in the eDRAM die. From the leaked specs and what supposed insiders have mentioned, the ROPs on Durango are not on the eSRAM die, so there is no internal bandwidth.
 
I completely agree with you. I think the eSRAM will be used as somewhat of a software-controlled cache (kind of like an L3 equivalent).

According to VGLeaks, for the CPU an L1 miss costs around 17 cycles and an L2 miss around 144-160 cycles. Since the GPU runs at half the clock, it sees half those numbers. So basically, if the GPU blows through both caches, you stall for 80 or so cycles to get data from the DDR3.

Acert on Beyond3D mentioned that he heard the SRAM latency is around 16-20 cycles. If true, that's 4-5x lower latency than DDR3 and will definitely reduce the number of stalls in cases where the L2 cannot be prefetched.

My theory is that this design is driven by tile-based rendering, where the main DDR3 RAM acts mainly as a read-only buffer for the inputs (textures, vertices, etc.) and a write-only buffer for the final rendered tile.
If the tiles are sized appropriately, then I think all the intermediate data should stay in the SRAM and not need to be written back to main memory. The move engines are then used for all the tile-data prefetch from main DDR and the final write to DDR. You could have a triple-buffered scheme, where you're moving data for the next tile into the SRAM, processing the current tile, and writing the previously processed tile back to main memory, all in parallel. That way all the data movement is hidden and the GPU can just process continually.






In the 360, the ROPs were in the eDRAM die. From the leaked specs and what supposed insiders have mentioned, the ROPs on Durango are not on the eSRAM die, so there is no internal bandwidth.

Well, let's hope MS's tools are set up well for that, because wasn't tile-based rendering partly their approach with the 360, and it didn't really work out? Does tile-based rendering work OK with deferred rendering engines?
 
Low latency is nearly always only really relevant to CPU tasks, if I recall correctly - i.e. moving lots of small files around.

So you could have answered your own question there. Is that eSRAM geared more towards the Xbox 3 OS/Kinect?...

If it's relevant to the CPU, then it has a pretty good chance of improving compute performance on GPUs too.

And the VGLeaks documents explicitly say the eSRAM helps the ROPs maintain their peak throughput, albeit in a different manner than the eDRAM on 360 (instead of having the ROPs coupled to the RAM with insane amounts of bandwidth, the eSRAM is used to keep the ROPs' caches fed all the time, which allows their peak to be achieved)... So yeah, in this design they most likely focused on lower latency rather than insane amounts of bandwidth limited to a certain portion of the pipeline...
 
Well, let's hope MS's tools are set up well for that, because wasn't tile-based rendering partly their approach with the 360, and it didn't really work out? Does tile-based rendering work OK with deferred rendering engines?

I don't believe tile-based approaches work well with deferred rendering. On the 360, the eDRAM contained just the framebuffer. On Durango, the SRAM can contain anything and is more general.

A lot of mobile GPUs like PowerVR and Adreno use tiling, since it's more efficient when you don't have huge external bandwidth but do have embedded memory.

I would hope of all the companies that put out tools, MS would be most capable because managing an embedded RAM certainly adds complexity.
 
Well, let's hope MS's tools are set up well for that, because wasn't tile-based rendering partly their approach with the 360, and it didn't really work out? Does tile-based rendering work OK with deferred rendering engines?

You are mixing up TBDR and the need to tile when your framebuffer exceeds the 10MB eDRAM capacity.

Low latency is important for compute jobs (I think this is why Sony increased the ACEs in the GPU, to better hide latency when doing GPGPU stuff, but that is speculation on my part), and apparently, per VGLeaks, ROP throughput is latency sensitive, so yeah, they went with eSRAM for a reason.
 
They've chosen eSRAM over eDRAM; I'd say really low latency is their priority, not bandwidth.

With fewer transistors (compared to 6T eSRAM; are there alternatives to that?) they could have chosen 128MB of eDRAM, or 96MB of eDRAM with attached ROPs and a way higher internal bandwidth than 102GB/s.
Instead they've chosen the more costly eSRAM.

So no, I don't think the eSRAM "is there solely to increase the bandwidth and assist the DDR3"; it's there to provide some kind of big cache with really low latency. The big question is why they think low latency is so important in their architecture. I doubt we'll get an answer to this question soon.

The answer is that it's not, really; all modern graphics cards are designed not to be fazed by large latencies.
 
Not being held back by the USB is going to be huge.

& it can track the pointer finger and thumb, for pew-pew-pew moments.

Honestly, above almost everything else this is the best upgrade. The difference between the narrow bandwidth and a wider one, even on the v1 Kinect, would have been massive.
 
Honestly, above almost everything else this is the best upgrade. The difference between the narrow bandwidth and a wider one, even on the v1 Kinect, would have been massive.

Karak, do you have any new MS tidbits to share? Not even saying it has to be tech related, just anything new...
 
I do most of my gaming on PC, but me and the kids have had a lot of fun with the Kinect. We have the space for it, and sometimes it's fun to jump around like a lunatic. Our family enjoyed the Kinect Disneyland game, but the controls were unresponsive and the game often chugged along at about 20 fps.

Just as I appreciate Sony's willingness to take a risk with offbeat and *gasp* artistic games, I like that MS is advancing this tech. It's not hard to imagine that soon every laptop, TV, and refrigerator will recognize your presence, identify you, and understand your preferences (and probably try to sell you something).

Although there is an opportunity cost to be paid, I'm sure there will be no shortage of games for the hardcore. Hopefully, their console will have the power to improve and expand on the experience.
 
So yeah, in this design they most likely focused on lower latency rather than insane amounts of bandwidth limited to a certain portion of the pipeline...

Yes, the rendering portion, the part that makes the pretty graphics. Both system memory and GPU memory have been going towards higher latency and higher bandwidth for... ever? DDR -> DDR2 -> DDR3 -> DDR4: the trend is higher latency and higher bandwidth. GDDR3 -> GDDR5: pure bandwidth for rendering. Yet MS is going to buck the trend? I don't buy it; they are using eSRAM for its bandwidth, just like they did with the eDRAM. The lower latency will not hurt, but it is not the main driver. They know that 68GB/s with so many sub-systems using the memory will run into contention and bandwidth issues, so they added a separate pool of fast cache to offload the main memory.

Why use DDR3 with the highest CAS timings found in DRAM if they want low latency and don't care about bandwidth? They are using the highest-latency, highest-bandwidth DDR3 available today for the bandwidth. But it is not high enough, so they went back to the 360/PS2 model of a fast scratchpad to alleviate the bottleneck.
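As a back-of-envelope illustration of that contention argument: only the 68GB/s and 102GB/s figures come from the leaks, while the 30GB/s of non-GPU traffic is an invented, purely illustrative number.

```python
# Back-of-envelope bandwidth budget. Only the 68GB/s (DDR3) and
# 102GB/s (eSRAM) figures come from the leaks; the 30GB/s of
# CPU/Kinect/OS traffic is an assumed, illustrative number.
ddr3_bw       = 68.0   # GB/s, shared by every sub-system
esram_bw      = 102.0  # GB/s, separate on-die pool
other_traffic = 30.0   # GB/s, assumed non-GPU load on DDR3

gpu_ddr3_share = ddr3_bw - other_traffic
print(gpu_ddr3_share)             # 38.0 GB/s left for the GPU on DDR3 alone
print(gpu_ddr3_share + esram_bw)  # 140.0 GB/s if render traffic lives in eSRAM
```

Under that assumption, moving the render targets into the eSRAM pool more than triples the bandwidth the GPU effectively sees, regardless of latency.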
 