Kotaku Rumor: Microsoft 6 months behind in game production for X720 [Pastebin = Ban]

can someone tell me if this is true?

"But this can be applied in PS Orbis? Basically to implement virtual memory management it is not necessary that there are two levels of memory, taking the top sufficient storage capacity for the image (color, depth, and stencil) buffers and textures needed for the scene and there is a lower level. For PS Orbis the caches of the GPU do not have enough storage capacity for this and the GDDR5 is a single level of memory for all of the GPU. Obviously the ESRAM and all the mechanism implementation costs in the space that is a sacrifice in terms of computation capability. But the biggest advantage comes from the fact that this allows access to large amounts of memory per frame without having to rely on huge band widths from expensive high-wattage as the GDDR5 memory. The reason why Xbox 8/Durango uses GDDR5 is not by the fact that then the thing would be completely redundant, the GDDR5 exists on the GPUs of face to avoid the Texture Trashing by the use of a higher bandwidth, the use of virtual memory on the GPU and Virtual Texturing are another solution to the same problem that both come into conflict within a system."

coz this would be the answer to why they went with DDR3 + ESRAM + move engines
Which forum did you copy and paste this from?
Interesting. Is it using USB 3.0 or a proprietary high-speed bus?

Rumours pointed to it being proprietary. Should mean less lag.
 
Rumours pointed to it being proprietary. Should mean less lag.

So that gives Durango a leg up over Sony then if the processing for Kinect is done by the Kinect itself, presumably freeing up cores for other system tasks.
 
So that gives Durango a leg up over Sony then if the processing for Kinect is done by the Kinect itself, presumably freeing up cores for other system tasks.

Rumours (even more rumours!) about the sound chip indicated that it might have been doing quite a bit of the Kinect legwork too.

That aside, (as with 360) Kinect will only require processing when it's actually being used. So for multiplatform games, Kinect won't make any difference. There is no leg up over Sony here, unless we're talking Kinect vs Move games.
 
Well, a cost difference of $60 per unit would cost them $600,000,000 over the first 10 million PS4 console sales alone, and a good few billion dollars over the lifecycle. We'll have to just wait and see whether that advantage was really that commercially valuable, in comparison to Durango's USPs (Kinect, DVR features etc). These cost differences don't seem like much when we're talking about an individual unit, but when you extrapolate to the volumes, you can see why every penny counts.

MS did well to spend that 1.5 Billion to upgrade the 360 from 256MB to 512, though, didn't they?

If 8GB is the difference between outselling Durango 2:1 and being outsold 1:2, then it was the smart move, right? And remember, in my example they aren't losing $600 million, they're making $600 million less than they might have.
 
Rumours (even more rumours!) about the sound chip indicated that it might have been doing quite a bit of the Kinect legwork too.

That aside, (as with 360) Kinect will only require processing when it's actually being used. So for multiplatform games, Kinect won't make any difference. There is no leg up over Sony here, unless we're talking Kinect vs Move games.

If Kinect 2.0 is mandatory (all signs point to 'yes'), and PS4 is getting the camera bar packed in too, it's likely a lot of future multi-platform games will have Kinect-ish features down the line.
 
So that gives Durango a leg up over Sony then if the processing for Kinect is done by the Kinect itself, presumably freeing up cores for other system tasks.

A proprietary bus doesn't tell us much about where the processing is done. Though I would think a high-speed bus would actually imply that the processing is done system side rather than Kinect side (i.e. passing back all the raw data to Durango for processing => a hefty bandwidth requirement). PS4's cam uses a proprietary connection and I'm pretty sure related processing is done PS4 side too.
 
You're not making an argument here. It's "fishy" but you don't understand any of the technical, economic or market reasons for any of the decisions? You are seeing contradictions where none exist. Each company did what they did because it gave what they hoped would be their best chance for success. For Sony that meant pleasing devs (and hardcore gamers) with the friendliest and most powerful design they could afford. For MS that meant ensuring costs were under control in the long term, while making sure they had the resources to accomplish their multimedia aspirations. Both had budgets, vague ideas of what tech would be available when, and targets for when they needed to ship a system. There's nothing fishy about that.
I don't disagree, but I don't think MS's decision to go with 8GB RAM is as removed from choices about developer wants as you seem to imply. Seems equally as plausible that they canvassed developers very early, who told them that they wanted a lot of RAM, so they decided very early on that they would go with DDR3 because it was the only way they could guarantee it.

I mean, if you buy into the idea, as I do, that PS4 and Xbox 3 have very similar multimedia aspirations, there's no reason to assume that MS would have originally planned to have twice (or even four times) as much RAM for those functions. Seems to me far more likely that it would be something developers requested.

Also apologies if you've already answered this point, but you seem to be making a lot of hay over the fact that Durango is a headache to develop for, but it looks to me just like a souped up 360 architecture. There's a main memory pool and EDRAM just like on 360; only now you don't have to put the framebuffer in the EDRAM if you don't want to (seems to me rather a moot point though, I can't imagine a developer would want to put anything else there). That hardly looks like a headache, and heading out of a generation with exactly the same system architecture in 360, I can't imagine developers complaining about it in the same way they did the PS3.
 
A proprietary bus doesn't tell us much about where the processing is done. Though I would think a high-speed bus would actually imply that the processing is done system side rather than Kinect side (i.e. passing back all the raw data to Durango for processing => a hefty bandwidth requirement). PS4's cam uses a proprietary connection and I'm pretty sure related processing is done PS4 side too.

Those are most frequently proprietary ports rather than specific protocols anyway. The PS4Eye has a proprietary input, but its bandwidth specifications indicate that it's still probably a USB3 protocol.
MS will very probably use USB3 for kinect2 too, even with a specific connector, so that they can continue their "kinect for windows" offer without having to change the hardware.
 
I don't disagree, but I don't think MS's decision to go with 8GB RAM is as removed from choices about developer wants as you seem to imply. Seems equally as plausible that they canvassed developers very early, who told them that they wanted a lot of RAM, so they decided very early on that they would go with DDR3 because it was the only way they could guarantee it.

I mean, if you buy into the idea, as I do, that PS4 and Xbox 3 have very similar multimedia aspirations, there's no reason to assume that MS would have originally planned to have twice (or even four times) as much RAM for those functions. Seems to me far more likely that it would be something developers requested.

Oh, sure. I'd lump keeping devs happy with the RAM under the heading of multimedia aspirations, though. When I say that I mean it in the literal sense of multiple media, including games, video, television, music, etc.

Also apologies if you've already answered this point, but you seem to be making a lot of hay over the fact that Durango is a headache to develop for, but it looks to me just like a souped up 360 architecture. There's a main memory pool and EDRAM just like on 360; only now you don't have to put the framebuffer in the EDRAM if you don't want to (seems to me rather a moot point though, I can't imagine a developer would want to put anything else there). That hardly looks like a headache, and heading out of a generation with exactly the same system architecture in 360, I can't imagine developers complaining about it in the same way they did the PS3.

Well, I mostly keep bringing it up because many in these threads like to repeatedly claim Durango's ESRAM gives it some major latency benefit compared to PS4. If you do the simple thing like you suggest, that advantage doesn't present at all. If you do want to exploit a potential latency advantage you have to jump through all sorts of hoops to make it happen, creating a big headache for yourself. So basically every time someone pops into a thread like this to claim the ESRAM was chosen for latency benefits and not just to provide the needed framebuffer bandwidth, I like to point out the ramifications of that claim.

so no one can answer? :(

It's hard to parse what that quote is trying to argue. It seems like he's basically saying DDR3+ESRAM was a cheaper solution than GDDR5 without too great a performance penalty. That much is true. If he's claiming that virtual texturing solutions are uniquely beneficial to Durango, that's less true, if only because the PS4 isn't dealing with the kinds of drawbacks virtual textures help work around in the case of Durango.
 
Well, I mostly keep bringing it up because many in these threads like to repeatedly claim Durango's ESRAM gives it some major latency benefit compared to PS4. If you do the simple thing like you suggest, that advantage doesn't present at all. If you do want to exploit a potential latency advantage you have to jump through all sorts of hoops to make it happen, creating a big headache for yourself. So basically every time someone pops into a thread like this to claim the ESRAM was chosen for latency benefits and not just to provide the needed framebuffer bandwidth, I like to point out the ramifications of that claim.
Realistically though, I think the latency is a red herring. Much like how everybody went on about tiling on the 360 for 'free' AA, most developers will forgo the extra effort and just dump the framebuffer there.
 
So that gives Durango a leg up over Sony then if the processing for Kinect is done by the Kinect itself, presumably freeing up cores for other system tasks.

It is not processed by kinect. On Durango memory and CPU time are reserved at OS level and kinect 'standard functions' are available for all games.

They no longer have to implement kinect in their code, it's just there waiting for them to use.

Kinect processing + higher OS memory overhead are far more worrying than different routes to the same effect (memory set ups) or a few flops difference.
 
That aside, (as with 360) Kinect will only require processing when it's actually being used. So for multiplatform games, Kinect won't make any difference. There is no leg up over Sony here, unless we're talking Kinect vs Move games.

If the design documents leaked by VGleaks are still relevant, then that's not true. Kinect processing resources will be reserved at all times, that's one way of encouraging developers to implement Kinect functionality. But there are many rumors now, including the one that claims the 360 SoC will be used for the OS and Kinect processing, freeing up the main hardware entirely. I guess we'll find out in two weeks, once developers are finally free to talk about it.
 
Oh, sure. I'd lump keeping devs happy with the RAM under the heading of multimedia aspirations, though. When I say that I mean it in the literal sense of multiple media, including games, video, television, music, etc.



Well, I mostly keep bringing it up because many in these threads like to repeatedly claim Durango's ESRAM gives it some major latency benefit compared to PS4. If you do the simple thing like you suggest, that advantage doesn't present at all. If you do want to exploit a potential latency advantage you have to jump through all sorts of hoops to make it happen, creating a big headache for yourself. So basically every time someone pops into a thread like this to claim the ESRAM was chosen for latency benefits and not just to provide the needed framebuffer bandwidth, I like to point out the ramifications of that claim.



It's hard to parse what that quote is trying to argue. It seems like he's basically saying DDR3+ESRAM was a cheaper solution than GDDR5 without too great a performance penalty. That much is true. If he's claiming that virtual texturing solutions are uniquely beneficial to Durango, that's less true, if only because the PS4 isn't dealing with the kinds of drawbacks virtual textures help work around in the case of Durango.

reading what he wrote, it's MS implementing virtual memory... to avoid texture thrashing

as John Carmack was asking for back in 2000


http://aaronm.nuclearglory.com/vt/johnc.plan.htm
"Name: John Carmack
Email: johnc@idsoftware.com
Description: Programmer
Project: Quake 3 Arena

3/7/00
------
This is something I have been preaching for a couple years, but I
finally got around to setting all the issues down in writing.

First, the statement:

Virtualized video card local memory is The Right Thing."

____

texture thrashing is a problem that he describes this way:

"Almost all of the drivers made a purely LRU memory management. This works correctly while total textures need in a frame to fit into memory once have been loads. The minimum you need a little more than memory that fits into the card, you will see how performance falls sharply. If you have 14 MB of textures to render a frame, your graphics card and it has only 12 MB of available buffers of image, instead of having to upload 2MB that do not fit. You will have to make the CPU to generate 14 MB of command of traffic that can make to the frame rate of a single digit in many drivers."


now the idea to solve this problem is Virtual texturing


and coming back to Durango, he explains how the virtual memory tables + ESRAM + move engines give the console 100% hardware support for virtual texturing.


but the interesting thing is he's saying the caches of the PS4 GPU can't handle virtual memory management

"But this can be applied in PS Orbis? Basically to implement virtual memory management it is not necessary that there are two levels of memory, taking the top sufficient storage capacity for the image (color, depth, and stencil) buffers and textures needed for the scene and there is a lower level. For PS Orbis the caches of the GPU do not have enough storage capacity for this and the GDDR5 is a single level of memory for all of the GPU."

both are trying to solve the same problem, texture thrashing

Sony with higher bandwidth
MS with virtual memory and virtual texturing

and he's trying to explain why MS has an advantage in this... is this possible?

and again, is it true that the PS4 can't handle virtualization of memory?! (I really don't know, sorry)
 
I highly doubt ms would require kinect but not have dedicated hardware for it.

I would be surprised if they didn't. Dedicated hardware (read: asic in best case) would mean guaranteed response- and execution-time which to me is crucial for a concept like kinect.
 
Just noticed VG leaks have started calling the next Xbox 'Xbox Infinity' rather than Durango in their latest posts.

More evidence that it is likely the final name or them just picking up on a bandwagon?
 
Realistically though, I think the latency is a red herring. Much like how everybody went on about tiling on the 360 for 'free' AA, most developers will forgo the extra effort and just dump the framebuffer there.

I agree. Any benefit from low latency access to the ESRAM is probably so marginal in a typical workload that it isn't worth the effort for devs to chase. That said, 32MB is still a tight squeeze for a full 1080p frame meaning it could still result in some compromises.
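For anyone wondering where the "tight squeeze" comes from, some back-of-the-envelope numbers (my own arithmetic, assuming common 32-bit colour and depth formats, nothing from the leaks):

Code:
#include <cstdio>

int main() {
    const double MiB = 1024.0 * 1024.0;
    const int w = 1920, h = 1080;
    for (int msaa : {1, 2, 4}) {
        double colour = w * h * 4.0 * msaa;   // RGBA8 colour buffer
        double depth  = w * h * 4.0 * msaa;   // D24S8 depth/stencil
        std::printf("%dx MSAA: %.1f MiB colour + depth at 1080p\n",
                    msaa, (colour + depth) / MiB);
    }
    // ~15.8 MiB with no MSAA, ~31.6 MiB at 2x, ~63.3 MiB at 4x --
    // and that's before any extra render targets a deferred renderer needs.
}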

It is not processed by kinect. On Durango memory and CPU time are reserved at OS level and kinect 'standard functions' are available for all games.

They no longer have to implement kinect in their code, it's just there waiting for them to use.

Kinect processing + higher OS memory overhead are far more worrying than different routes to the same effect (memory set ups) or a few flops difference.

It's not a binary question of hardware processing in Kinect versus software processing on Durango. Some things may get done by chips built into the camera, and others may still have to be done on the host processor. For example, Kinect 2 might generate skeletons itself, but it will still be up to the main processor to manage voice commands.
 
I agree. Any benefit from low latency access to the ESRAM is probably so marginal in a typical workload that it isn't worth the effort for devs to chase. That said, 32MB is still a tight squeeze for a full 1080p frame meaning it could still result in some compromises.
According to Digital Foundry, Ninja Gaiden 2's framebuffer filled 99.975% of the 360's 10MB EDRAM. Could MS just be banking on developers plonking the framebuffer in the ESRAM and using shader-based AA?
 
can someone tell me if this is true?

"But this can be applied in PS Orbis? Basically to implement virtual memory management it is not necessary that there are two levels of memory, taking the top sufficient storage capacity for the image (color, depth, and stencil) buffers and textures needed for the scene and there is a lower level. For PS Orbis the caches of the GPU do not have enough storage capacity for this and the GDDR5 is a single level of memory for all of the GPU. Obviously the ESRAM and all the mechanism implementation costs in the space that is a sacrifice in terms of computation capability. But the biggest advantage comes from the fact that this allows access to large amounts of memory per frame without having to rely on huge band widths from expensive high-wattage as the GDDR5 memory. The reason why Xbox 8/Durango dosnt uses GDDR5 is not by the fact that then the thing would be completely redundant, the GDDR5 exists on the GPUs of face to avoid the Texture Trashing by the use of a higher bandwidth, the use of virtual memory on the GPU and Virtual Texturing are another solution to the same problem that both come into conflict within a system."

coz this would be the answer to why they went with DDR3 + ESRAM + move engines

Got this from the AMD OpenCL programmer's guide.
[attached image: GPU spec table from the AMD OpenCL programmer's guide]



So I'm pretty sure that Microsoft took that into account in the design of Durango. You read small amounts of data from the ESRAM, so you need to access it a lot of times, and the lower latency can help with that.
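That matches the usual latency vs bandwidth trade-off. A toy model (hypothetical latency numbers, since actual Durango figures aren't public) of why lots of small accesses care about latency far more than about the headline bandwidth:

Code:
#include <cstdio>

int main() {
    const double peakGBs = 100.0;                        // ~peak bandwidth, GB/s
    const double latencies_ns[] = {50.0, 300.0};         // "ESRAM-ish" vs "DRAM-ish" (made up)
    const double sizes_bytes[]  = {64, 4096, 1048576};   // cache line, page, 1 MB
    for (double lat : latencies_ns) {
        for (double sz : sizes_bytes) {
            // 1 GB/s is one byte per nanosecond, so sz / peakGBs is the transfer time in ns.
            double time_ns = lat + sz / peakGBs;
            std::printf("latency %3.0f ns, %7.0f B transfer -> %6.2f GB/s effective\n",
                        lat, sz, sz / time_ns);
        }
    }
    // Small transfers are dominated by latency; only large streaming reads
    // get anywhere near the quoted peak figure.
}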
 
I highly doubt ms would require kinect but not have dedicated hardware for it.

Echo cancellation is hardware they specifically added for Kinect inside their audio block. bkillian (a former MS audio engineer) said the processing for that would take 1-2ms for a single core on the 360.
 
Just noticed VG leaks have started calling the next Xbox 'Xbox Infinity' rather than Durango in their latest posts.

More evidence that it is likely the final name or them just picking up on a bandwagon?



Noticed that as well. And they seem to be saying it pretty matter-of-factly as well. Hmmm...
 
and he's trying to explain why MS has an advantage in this... is this possible?

and again, is it true that the PS4 can't handle virtualization of memory?! (I really don't know, sorry)

He's wrong about that. There's nothing to prevent virtualizing textures on PS4. It just doesn't help a ton because it's a flat memory pool where every region is fast. With megatexturing, it'd be more about making sure the textures for the current area get loaded from disk to RAM when appropriate.

The idea that Durango was explicitly engineered for superior virtual texture support has been kicking around for a while, and it's always just been wishful thinking from the "special sauce" brigade. Frankly, the ESRAM is too small for virtual texturing to be super beneficial in the way some claim. If you've got 2GB of textures to read for a frame, copying them into the ESRAM 32MB at a time is not exactly a good use of resources. Like that old Carmack example, virtual texturing is mostly beneficial for situations where you can't quite fit all your textures into VRAM. It gives you an elegant method to fall back to system memory.
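For reference, the core idea behind virtual texturing is pretty simple and isn't tied to either console. A stripped-down sketch (my own illustration, not any engine's actual code): a page table maps virtual texture pages to cache slots, and a miss just queues a streaming request instead of stalling:

Code:
#include <cstdint>
#include <cstdio>
#include <queue>
#include <unordered_map>

struct VirtualTexture {
    static const int PageSize = 128;                 // 128x128 texel pages
    std::unordered_map<uint64_t, int> pageTable;     // virtual page -> cache slot
    std::queue<uint64_t> pendingLoads;               // pages to stream in
    int nextSlot = 0;

    uint64_t pageId(int x, int y, int mip) const {
        return (uint64_t(mip) << 48) | (uint64_t(y / PageSize) << 24) | uint64_t(x / PageSize);
    }

    // Per-sample lookup (in a real renderer this is driven by a GPU feedback pass).
    int lookup(int x, int y, int mip) {
        uint64_t id = pageId(x, y, mip);
        auto it = pageTable.find(id);
        if (it != pageTable.end()) return it->second;   // resident: sample it
        pendingLoads.push(id);                          // miss: request streaming
        return -1;                                      // caller falls back to a coarser mip
    }

    void streamPendingPages() {                         // run between frames
        while (!pendingLoads.empty()) {
            pageTable[pendingLoads.front()] = nextSlot++;  // pretend we loaded the page
            pendingLoads.pop();
        }
    }
};

int main() {
    VirtualTexture vt;
    std::printf("first touch: slot %d\n", vt.lookup(5000, 3000, 0));   // miss -> -1
    vt.streamPendingPages();
    std::printf("after stream: slot %d\n", vt.lookup(5000, 3000, 0));  // now resident
}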
 
MS did well to spend that 1.5 Billion to upgrade the 360 from 256MB to 512, though, didn't they?

Yup. IMO it singlehandedly saved the generation for them.

Rein said MS told him "you just cost us a billion" or whatever when they did the change. He replied no I made a billion happy gamers or something.

But in reality, he probably saved MS who knows how many dollars by allowing them to be competitive technically with PS3 for 7 extremely long years....that should have been his real response.

However, it's difficult to prove. I thoroughly don't believe it, but there are probably those who honestly believe a 360 with 256MB of RAM would have done just as well, crazy as it sounds. The whole "graphics don't matter / only needs to be good enough / Joe Public doesn't care" etc etc crowd...

Honestly, if Durango specs are as rumored I will probably be on the other side of that argument. HOWEVER big the difference may be, I will say the next gen stuff (meaning BF4, Shadow Fall, Second Son gameplay screens) I've seen so far indicates a real possibility of diminishing returns. Which is honestly kind of frightening to me...

I remember probably the "scariest" quote for me was a little, off-the-wall, totally unnoticed one by Cevat Yerli in some interview about Crysis 3 months ago. He made an offhand comment to the effect of "well, it's much more difficult to wow gamers with graphical increases now than it was back in Crysis 1's day". I kind of tried to put that out of my head since I hate the idea of diminishing returns.

Another thing along that vein that really sticks with me was when Laa Yosh at B3D said of the UE4 Infiltrator demo, "As a CG house employee I would have a hard time telling a potential customer what we could do better than that"

But still, I don't believe in diminishing returns! LOL.

One thing I'm sure of diminishing returns or no, next gen will be worthwhile. BF4 while it may have fallen short of my next gen dreams, still looks really, really, really good.

Not to mention just doing a PC highest settings, 1080P, 60 FPS port of Crysis 1 alone pretty much requires next gen power. And that's definitely worth it.
 
I would be surprised if they didn't. Dedicated hardware (read: asic in best case) would mean guaranteed response- and execution-time which to me is crucial for a concept like kinect.

Why?

They had a processor in Kinect this gen and decided that having it take up a significant amount of the 360's processing was preferable to the cost of the hardware going up.

There are fewer barriers to doing that now than there were then?
 
Total bandwagon. It won't be called that.

because?



EDIT:

What are the odds they start either slowly releasing a few juicy things, or stuff leaks?

presumably we've got key MS Game Studios people covered on Twitter so we can see who starts heading over to the MS campus?
 
Why?

They had a processor in Kinect this gen and decided that having it take up a significant amount of the 360's processing was preferable to the cost of the hardware going up.

There are fewer barriers to doing that now than there were then?

Yes, and this caused the delays with Kinect v1. Also keep in mind that the CPU cores in the AMD APU are arguably much faster than the 360's cores... And keep in mind that if you use the CPUs for this they are still CPUs, so response times are not really predictable, and MS has had enough time to build special hardware for Kinect's processing needs (at least the expensive ones).
 
Yes, and this caused the delays with Kinect v1. Also keep in mind that the CPU cores in the AMD APU are arguably much faster than the 360's cores... And keep in mind that if you use the CPUs for this they are still CPUs, so response times are not really predictable, and MS has had enough time to build special hardware for Kinect's processing needs (at least the expensive ones).

What you are saying is you would like them to have done that.

There's no compelling reason for them to add hardware for kinect when the processing can be done on the main system... and they can plan for that on day 1.

From what I've read the biggest issue with Kinect one was the bandwidth of USB 2.0 - that's kind of gone away (and is likely the reason for the reduction in latency from 90ms to 60ms we see in leaked documentation).
 
What you are saying is you would like them to have done that.

There's no compelling reason for them to add hardware for kinect when the processing can be done on the main system... and they can plan for that on day 1.

From what I've read the biggest issue with Kinect one was the bandwidth of USB 2.0 - that's kind of gone away (and is likely the reason for the reduction in latency from 90ms to 60ms we see in leaked documentation).

We'll see :) It's almost useless to discuss now, but to me it would be reasonable not to use too much of the main processing resources for Kinect.
 
Yup. IMO it singlehandedly saved the generation for them.

Rein said MS told him "you just cost us a billion" or whatever when they did the change. He replied no I made a billion happy gamers or something.

But in reality, he probably saved MS who knows how many dollars by allowing them to be competitive technically with PS3 for 7 extremely long years....that should have been his real response.

However, it's difficult to prove. I thoroughly don't believe it, but there are probably those who honestly believe a 360 with 256MB of RAM would have done just as well, crazy as it sounds. The whole "graphics don't matter / only needs to be good enough / Joe Public doesn't care" etc etc crowd...
We don't really know this though. In a hypothetical parallel universe where MS may have stuck with the 256MB of RAM (and had a cheaper price point to boot), we don't know if Sony would actually have put 512MB of RAM in the PS3.
 
because?



EDIT:

What are the odds they start either slowly releasing a few juicy things, or stuff leaks?

presumably we've got key MS Game Studios people covered on Twitter so we can see who starts heading over to the MS campus?

I kind of hope stuff doesn't leak but you just know someone at MS or an ad agency is gonna press the trigger early.
 
This isn't really on topic, but since we're talking about specs anyway, one thing that has puzzled me a little is the eSRAM bandwidth reported by VGLeaks.

Why so (apparently) low?

The PS2 used embedded DRAM 13 years ago and achieved 48GB/s - not much shy of half what is reported for Durango's eSRAM.

Sony mentioned that they considered using embedded memory and it could have offered up to a terabyte per second of bandwidth.

So why invest so much die space in embedded memory on Durango and then go for 'only' 102GB/s?

Is it something that could be a candidate for a late upgrade?
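For what it's worth, one way of reading that number (my own arithmetic, assuming the leaked 800MHz GPU clock is right): 102.4 GB/s ÷ 800 MHz = 128 bytes per GPU clock, i.e. a 1024-bit path. That looks less like "the fastest embedded memory they could build" and more like "a width matched to what the GPU can consume per cycle", which would explain why it isn't PS2-style headline bandwidth.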
 
I'm still holding out that Durango is some 3.0TF monster with raytracing chips! Why? Because trends, yo. MS has always used hardware a generation ahead of what's on the market (based on a paradigm shift, but still! Also this trend doesn't include the original Xbox, because that was old MS and it doesn't count)

We don't really know this though. In a hypothetical parallel universe where MS may have stuck with the 256MB of RAM (and had a cheaper price point to boot), we don't know if Sony would actually have put 512MB of RAM in the PS3.

I don't think there was ever any indication that Sony was ever going with anything other than 256/256. IIRC all rumors pointed towards it as well.
 
It is not processed by kinect. On Durango memory and CPU time are reserved at OS level and kinect 'standard functions' are available for all games.

They no longer have to implement kinect in their code, it's just there waiting for them to use.

Kinect processing + higher OS memory overhead are far more worrying than different routes to the same effect (memory set ups) or a few flops difference.

do you mean there is reserved RAM and CPU time for kinect regardless of whether you use it?

upside would be no perceived loss of power because it is already reserved, but the downside is that if you don't use it, you don't get any of that power back
 
do you mean there is reserved RAM and CPU time for kinect regardless of whether you use it?

upside would be no perceived loss of power because it is already reserved, but the downside is that if you don't use it, you don't get any of that power back

That should be the case because Kinect is never "off", it's always waiting for voice commands or gesture inputs for system-level functions, if not game level ones.
 
The VGLeaks docs 'confirm' that a standard set of CPU/GPU/memory resources is reserved for Kinect processing, whether the game uses it or not. It's basically MS saying 'go on, you might as well use it!'. Though certain unspecified types of non-standard use cases will come out of the application budget.
 
"All access to the GPU in Durango memory using virtual addresses, and therefore pass through a translation table before settled in the form of physical address. This layer of indirection solves the problem of fragmentation of memory hardware resources, a single resource can occupy several non-contiguous pages of physical memory without penalty.

Virtual addresses can take aim pages in the main RAM, in the ESRAM, or can not be mapped. The Shader read and writes the pages not mapped in well defined results, including optional error codes, rather than block the GPU. This ability is important for the support of resources in "tiles", which are partially resident in physical memory."


yeah
can the cpu access the esram in durango?

Not directly. You have to go through the GPU to access it.

[attached image]
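If it helps, the translation layer that quote describes is just ordinary page-table indirection. A minimal sketch of the idea (entirely illustrative, not Durango's actual page table format): virtual pages map to physical pages in either DRAM or ESRAM, and reads of unmapped pages return a defined value instead of faulting:

Code:
#include <cstdint>
#include <cstdio>
#include <unordered_map>

enum class Pool { DRAM, ESRAM, Unmapped };

struct PageEntry { Pool pool; uint32_t physicalPage; };

struct GpuPageTable {
    static const uint32_t PageBytes = 64 * 1024;        // 64 KB pages (illustrative)
    std::unordered_map<uint32_t, PageEntry> entries;    // virtual page -> mapping

    uint32_t read(uint64_t virtualAddr) const {
        uint32_t vpage = uint32_t(virtualAddr / PageBytes);
        auto it = entries.find(vpage);
        if (it == entries.end() || it->second.pool == Pool::Unmapped)
            return 0;                                    // defined result, no GPU hang
        // A real GPU would now issue the access to DRAM or ESRAM;
        // here we just report where it would have gone.
        std::printf("virtual page %u -> %s physical page %u\n", vpage,
                    it->second.pool == Pool::ESRAM ? "ESRAM" : "DRAM",
                    it->second.physicalPage);
        return 1;
    }
};

int main() {
    GpuPageTable pt;
    pt.entries[0] = {Pool::DRAM, 42};                    // one page mapped into DRAM
    pt.entries[1] = {Pool::ESRAM, 7};                    // one page mapped into ESRAM
    pt.read(100);                                        // lands in DRAM
    pt.read(70000);                                      // lands in ESRAM
    std::printf("unmapped read returns %u\n", pt.read(10ull * 64 * 1024));
}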
 
If this is true I expect to see moneyhatting for exclusives and tons of money spent on the press conferences. Buy the next gen. Classic MS strategy.
 
can someone tell me if this is true?

"But this can be applied in PS Orbis? Basically to implement virtual memory management it is not necessary that there are two levels of memory, taking the top sufficient storage capacity for the image (color, depth, and stencil) buffers and textures needed for the scene and there is a lower level. For PS Orbis the caches of the GPU do not have enough storage capacity for this and the GDDR5 is a single level of memory for all of the GPU. Obviously the ESRAM and all the mechanism implementation costs in the space that is a sacrifice in terms of computation capability. But the biggest advantage comes from the fact that this allows access to large amounts of memory per frame without having to rely on huge band widths from expensive high-wattage as the GDDR5 memory. The reason why Xbox 8/Durango dosnt uses GDDR5 is not by the fact that then the thing would be completely redundant, the GDDR5 exists on the GPUs of face to avoid the Texture Trashing by the use of a higher bandwidth, the use of virtual memory on the GPU and Virtual Texturing are another solution to the same problem that both come into conflict within a system."

coz this would be the answer to why they went with DDR3 + ESRAM + move engines

Apples and oranges are both fruits which give you energy via carbohydrates, fibre and vitamins.

On paper they are different routes to doing the same job.

In reality there's a bunch of qualitative reasons you might choose one over the other. Just like the question you are raising.

I think we should just accept that they have been targeting similar costs and power budgets and have been using the same chip vendors. They likely had different objectives... but the truth is they are likely to be more similar than they are different. It's unrealistic to assume otherwise unless one of them was prepared to take a significant loss again on the hardware (categorically not happening) or had a much lower end unit price in mind.

'On paper' everyone thought the PS3 would wipe the floor with the 360... in reality the PS3 has only achieved consistent parity for multiplat releases in the last year to 18 months.

That's a very long-winded way of saying that at this point we should wait and see how things look 'onscreen' - MS have seen Sony's reveal and demos and will pitch their own accordingly.

The real acid test is going to be on the 21st, followed by responses from the showfloor at E3.

The minutiae of each console's memory system will largely become both obvious and irrelevant.
 