OXM: Why splitting Xbox One's OS RAM allocation is good for developers

Then where did you hear that the OS allocation is less complicated on the Xbone than on the PS4? Because that makes no sense.

You're probably right. I had forgotten about the ESRAM on the XB1.

No, thuway means the eSRAM + DDR3 setup is more complicated than just a single pool of GDDR5.

No more complicated than what developers have spent the last 8 years learning to develop for. Isn't that how the 360 was set up?
 
No carnival of stupid in here please.

Interesting. Reading now.

The carnival of stupid is what Microsoft is doing.

And how annoying that you feel the need to tell us what to do or not do in this thread before you could even be bothered to read anything.
 
Well that isn't true.

It's becoming more true by the day. In 5 years, many people who are tech heads and play video games will be upgrading to better TV sets, and those TVs will be running the applications. My smart HDTV already does many things, like Netflix, better than the 360.

Not needing a Gold account is also a plus.
 
Can the title be changed to reflect that the article is talking about OS allocation and not actual split RAM pools?
 
I guess if we're talking physical amounts of RAM, yes, but for the purposes of this thread I was referring to the split between addressable/non-addressable game RAM and the RAM reserved for the OS.
  • Addressable Game RAM = developers can use it as they want.
  • Non-addressable game RAM = the OS handles it on behalf of the game; developers can request it but don't need to worry about it, it's just virtual memory handled by the OS.
  • OS RAM = developers can't touch it at all.
It's not very difficult to understand (rough sketch below).
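To put those three buckets into something concrete, here's a toy C++ sketch. The 5GB/3GB split is the figure quoted elsewhere in this thread; the requestAssistPages() call is purely hypothetical and not any real XDK or SDK function.

```cpp
// Toy model of the three memory categories above. The 5 GB / 3 GB split is
// the figure quoted in this thread; requestAssistPages() is hypothetical,
// not any real console SDK call.
#include <cstdio>

struct MemoryMap {
    double gameAddressableGiB = 5.0; // developers use this however they want
    double osReservedGiB      = 3.0; // OS and apps only; games never see it

    // "Non-addressable game RAM": the title asks for extra pages, the OS
    // decides where they live and treats them as ordinary virtual memory.
    void requestAssistPages(double gib) {
        std::printf("OS granting %.1f GiB of OS-managed pages to the title\n", gib);
    }
};

int main() {
    MemoryMap m;
    std::printf("Directly addressable by the game: %.1f GiB\n", m.gameAddressableGiB);
    m.requestAssistPages(0.5);   // the game requests it, the OS worries about it
    std::printf("Reserved for OS and apps:         %.1f GiB\n", m.osReservedGiB);
}
```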
 
My problem with all this secondary app stuff is that I already have all of this on my tablet, even while sitting in front of a TV. I just don't care about yet another device giving me access to boring, ubiquitous stuff like social networks, communication, and media.

Couldn't agree more.
 
Seems pretty straight-forward to me. Most of the partition is devoted to games, but MS is plugging this as a "center of the room" device, so it makes sense that the other bits are devoted to apps and OS.
 
Naturally, this has provoked a certain amount of upset among those who'd rather each and every byte of memory was set aside for the sole, exclusive purpose of (e.g.) rendering every fold in Batman's cape.

Those still disgruntled by the withholding of RAM that might be devoted exclusively to sharper gun renders

Look how they try to ridicule those of us who want the resources of the machine we buy to be usable for.....you know......actual games.
 
The Xbox One does have a unified RAM pool (as did the 360, and as the PS4 does too).

What a non-unified RAM pool means is that there is a separation between VRAM and RAM. For example, the PS3 had 512MB of RAM: 256MB of it could be accessed by the GPU, and 256MB by the CPU. Developers couldn't shift resources between the two depending on their needs, as the split was fixed by the system. With a unified RAM system, going with the PS3 example, developers would have 512MB to allocate between CPU/GPU resources as they see fit for their title.

With the X1, developers simply have to think of it as 5GB of unified RAM. They can allocate those resources between CPU/GPU as they see fit, just as they could on the 360. The system reserves 3GB for the OS and apps for multi-tasking, but that's invisible to the developer (taken care of by the hypervisor OS). It doesn't make development any harder.
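To make the difference concrete, here's a minimal C++ sketch using the PS3 figures above (256MB + 256MB versus one 512MB pool). The budget types are made up purely for illustration; they are not any real SDK API.

```cpp
// Minimal sketch of why a unified pool is convenient, using the PS3 figures
// from the post above (256 MB CPU + 256 MB GPU vs. one 512 MB pool).
// The budget types are made up for illustration; this is not any SDK API.
#include <cstdint>
#include <cstdio>
#include <stdexcept>

struct SplitBudget {                      // PS3-style: two fixed pools
    std::uint32_t cpuFreeMB = 256, gpuFreeMB = 256;
    void allocGpu(std::uint32_t mb) {
        if (mb > gpuFreeMB) throw std::runtime_error("out of VRAM, even if CPU RAM is free");
        gpuFreeMB -= mb;
    }
    void allocCpu(std::uint32_t mb) {
        if (mb > cpuFreeMB) throw std::runtime_error("out of CPU RAM, even if VRAM is free");
        cpuFreeMB -= mb;
    }
};

struct UnifiedBudget {                    // unified: one pool, the game picks the split
    std::uint32_t freeMB = 512;
    void alloc(std::uint32_t mb) {
        if (mb > freeMB) throw std::runtime_error("out of memory");
        freeMB -= mb;
    }
};

int main() {
    SplitBudget ps3;
    ps3.allocGpu(200);      // texture-heavy scene
    // ps3.allocGpu(100);   // would throw: the 256 MB GPU pool is fixed
    ps3.allocCpu(50);       // CPU-side data has its own, separate limit

    UnifiedBudget unified;
    unified.alloc(200);     // textures
    unified.alloc(100);     // more textures: fine, it's all one budget
    unified.alloc(50);      // CPU-side data draws from the same pool

    std::printf("unified pool left: %u MB\n", unified.freeMB);
}
```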

Please, people, read what this guy has to say before making more stupid comments like "AHA! So the RAM on the Xbone is split! Sony listens! MS suxx. Cerny is god!".

Anyway, I for one very much enjoy the possibility of various apps like FB or Internet Explorer running alongside my game using the W8 snap feature. I use it all the time on my PC, and will on the Xbox One.
 
Cerny: "The #1 request of developers was a unified RAM pool"

Xbox mag: "Why splitting the RAM is good for developers"

I DON'T KNOW WHO TO BELIEVE ANYMORE.

Not even the same subject; he's clearly talking about the 3GB of the unified RAM set aside for the OS, similar to the 100 pages of 'discussion' we had on the PS4.

Both consoles have a unified RAM pool.
 
Is it a coincidence that this comes right after the DF PS4 article?

The article tells me that some of the memory is used to run Windows and some to run system-wide Kinect features. Neither of those are features that appeal to me.

Can you feel that? They're coming.

tumblr_lunt92GhbX1r6984go1_400.gif

Dinosaur coming confirmed.
 
  • Addressable Game RAM = developers can use it as they want.
  • Non-addressable game RAM = the OS handles it on behalf of the game; developers can request it but don't need to worry about it, it's just virtual memory handled by the OS.
  • OS RAM = developers can't touch it at all.
It's not very difficult to understand.

Right, which is why I explained what I was referring to. It seems that others were having trouble understanding.
 
You're probably right. I had forgotten about the ESRAM on the XB1.



No more complicated than what developers have spent the last 8 years learning to develop for. Isn't that how the 360 was set up?

I am not saying that the eSRAM setup is hard for developers, but that it's more complicated than the GDDR5 setup the PS4 has.
 
The article tells me that some of the memory is used to run Windows and some to run system-wide Kinect features. Neither of those are features that appeal to me.

Using console resources to provide an operating system isn't a feature that appeals to you?
 
The carnival of stupid is what Microsoft is doing.

And how annoying that you feel the need to tell us what to do or not do in this thread before you could even be bothered to read anything.

The article is about RAM. That's all I needed to know.

And I agree with your first point, by the way. :]
 
How sad, I would rather get better games for a game console than sacrifice resources for other crap that I can already do on my HDTV.
 
My problem with all this secondary app stuff is that I already have all of this on my tablet, even while sitting in front of a TV. I just don't care about yet another device giving me access to boring, ubiquitous stuff like social networks, communication, and media.

Same here. I always have either my notebook or my tablet on the living room table, I don't need a third device for it.
 
Interesting read.

To be fair, developers seem to be fine with 5GB of main memory for games. Hence, if Microsoft had gone for a games-only console, they most probably would have included 4GB instead of 8GB. They could even have considered 4GB of GDDR5, just like Sony did.

An interesting question is how much of the 3GB of memory reserved for apps can still be assigned to games later in the console's life cycle.
 
Look how they try to ridicule those of us who want the resources of the machine we buy to be usable for.....you know......actual games.

Yet they spend a good chunk of the article explaining that, nowadays, most people buying new consoles want more than just games to run well on that machine for that price. A lot of words are spent saying the obvious: that the traditional console is in the past, and that most people want and demand more from their boxes. Old-style consoles focused entirely on games are done, and have been for some time; that's why every current or upcoming box is more concerned with what it can do outside of games. It's not ridicule, it's an observation of market reality: to succeed, your new box needs more than a pure games focus. People who aren't satisfied with that can still build gaming PCs if they like, but consoles are and have always been a compromise of price and ability, even before the multitasking, app-using gamers of 2013.
 
Right, which is why I explained what I was referring to. It seems that others were having trouble understanding.
I don't know. From what we've heard the RAM split on the Xbox One seems much less complicated than the RAM split on the PS4.
I'm telling you it's not complicated. Developers have a unified pool of RAM they can address; in addition, they can request further assist RAM from the OS, which the OS manages for them. They don't need to do anything, which takes away the complication of addressing two virtually split pools of RAM. Oversimplification, of course.
 
Having such a large focus on OS features sheds light on why Microsoft went with DDR3 RAM rather than a more bandwidth-heavy RAM like GDDR5.

The fact is, PCs today still use DDR3 because of one thing - It is easier on the CPU, and thus easier on an OS. DDR3 + eSRAM allows Microsoft to easily incorporate OS-specific features while simultaneously allowing developers access to the higher bandwidth RAM they want, albeit in small bits. 32MB is plenty for storing a large framebuffer though.
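As a quick back-of-the-envelope check on that 32MB figure, here's a small C++ sketch. The render-target formats (RGBA8 colour, a 32-bit depth/stencil buffer) are my own assumptions for illustration, not anything stated in the article.

```cpp
// Rough arithmetic behind "32 MB is plenty for a large framebuffer".
// The formats chosen here (RGBA8 colour, 32-bit depth/stencil) are
// illustrative assumptions, not anything stated in the article.
#include <cstdio>

int main() {
    const double mib = 1024.0 * 1024.0;
    const int w = 1920, h = 1080;

    const double colorMiB = w * h * 4 / mib;   // RGBA8: ~7.9 MiB
    const double depthMiB = w * h * 4 / mib;   // D24S8: ~7.9 MiB

    std::printf("1080p colour target: %.1f MiB\n", colorMiB);
    std::printf("1080p depth target:  %.1f MiB\n", depthMiB);
    std::printf("colour + depth:      %.1f of 32 MiB eSRAM\n", colorMiB + depthMiB);
    // A deferred renderer with several G-buffer targets starts to squeeze,
    // which is why 32 MB works but needs careful planning.
}
```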

In my experience, the RAM discrepancy between Xbox One and PS4 is being way overplayed. Xbox One is more difficult to develop for, yeah, but since the 360 had nearly the same setup, there is no steep learning curve to get adjusted to using eSRAM or eDRAM.
 
Where Microsoft and Sony seem to be missing the bus here is that we already have this stuff.


From the article:

"There has been an explosion of devices," Multerer went on. "There are phones, there are tablets, the whole way that people interact and that they live with devices has fundamentally changed. I walk around with a phone all time, everybody I know walks around with phones. The expectation of the next gen gamer is that these things are just there. It's a rapidly changing ecosystem of applications that sit on a rapidly changing ecosystem of devices - fundamentally different to the consoles of the past."

He is, of course, right that there has been an explosion of devices. But the jump in logic that we then want that on a gaming device (or while watching TV, even) seems off the mark to me. When they showed the TV stuff, what I thought to myself was "I can do that shit on my iPad. Why do I need it on screen?" These so-called "next gen gamers" have iPads. They don't need Twitter running in the background. They don't need some sort of IMDB shit coming up while watching a movie. We already have that stuff and already love the way we can get it. Tablets won. The end. Putting that functionality into a gaming system isn't going to make anyone throw their iPad in the trash and use the Xbox One's functionality.


I don't see anyone (or at least not vast numbers of people) asking for this stuff. We like TV how it is. We like videogames how they are. Tablets and phones and shit can add to the experience, and we have them already if that's what we want to do. But to spend energy on this stuff on your system seems so foolish. It was like when Microsoft had a huge E3 hubbub about "We have Twitter and Facebook!" and the general reaction was "so what? I can already access those on 1000 other devices." They never gained traction as apps. It was a complete waste of time.



I do think having a nimble OS can be a good thing. Both need the overhead to record gameplay. I think this is a worthwhile feature. I like suspend/resume... it's actually one of the reasons I like the Vita so much, and I'm excited to see it in use on consoles. I also think running background apps which help gaming, like Skype, will be handy. Right now my friends and I use Skype on our phones/laptops while we play games online (codecs for online chat are awful...) but if it was built into the system that would be fine. Not entirely needed, since we already have access to it, but it would be OK.


So much of this other stuff though... ugh.
 
I am not saying that the eSRAM setup is hard for developers, but that it's more complicated than the GDDR5 setup the PS4 has.

In terms of an absolute scale of "difficulty" I'd wager that they're both extremely easy to develop for compared to last gen.

Here's an artistic rendition:

Difficult........................................................................................................................................Less Difficult
|-----------------PS3-------------360---------------------------PC----------------------------------XB1-PS4--|
 
Having such a large focus on OS features sheds light on why Microsoft went with DDR3 RAM rather than a more bandwidth-heavy RAM like GDDR5.

The fact is, PCs today still use DDR3 because of one thing - It is easier on the CPU, and thus easier on an OS. DDR3 + eSRAM allows Microsoft to easily incorporate OS-specific features while simultaneously allowing developers access to the higher bandwidth RAM they want, albeit in small bits. 32MB is plenty for storing a large framebuffer though.

In my experience, the RAM discrepancy between Xbox One and PS4 is being way overplayed. Xbox One is more difficult to develop for, yeah, but since the 360 had nearly the same setup, there is no steep learning curve to get adjusted to using eSRAM or eDRAM.

Are you saying that GDDR5 is harder for the CPU to use than DDR3? Pretty sure the only reason CPUs use DDR3 and not GDDR5 is because DDR3 is way cheaper and GDDR5 is overkill for what a CPU needs. GDDR5 has no negative impact on CPUs or OSes.
 
In terms of an absolute scale of "difficulty" I'd wager that they're both extremely easy to develop for compared to last gen.

Here's an artistic rendition:

Difficult........................................................................................................................................Less Difficult
|-----------------PS3-------------360---------------------------PC----------------------------------XB1-PS4--|
This is correct; people are making it seem harder than it actually is. PC is not very difficult either, since DirectX handles memory management.
 
I don't get why people are saying it's harder to develop for. This article is talking about splitting the ONE universal RAM into different partitions.

It's NOT describing a PS3-style two-RAM-pools situation.
 
Yet they spend a good chunk of the article explaining that, nowadays, most people buying new consoles want more than just games to run well on that machine for that price. A lot of words are spent saying the obvious: that the traditional console is in the past, and that most people want and demand more from their boxes. Old-style consoles focused entirely on games are done, and have been for some time; that's why every current or upcoming box is more concerned with what it can do outside of games. It's not ridicule, it's an observation of market reality: to succeed, your new box needs more than a pure games focus. People who aren't satisfied with that can still build gaming PCs if they like, but consoles are and have always been a compromise of price and ability, even before the multitasking, app-using gamers of 2013.

Come on, the two quotes I gave are clearly meant to paint those of us who want all resources devoted to games as being obsessed with unimportant details.

It's an obvious attempt to trivialize the criticism.
 
An interesting question is how much of the 3GB of memory reserved for apps can still be assigned to games later in the console's life cycle.
Probably none (also true for Sony). The split is the result of budgeting memory resources for a game and an unknown number of apps running simultaneously. It's a different modus operandi than with the current generation of consoles.

What we can expect is some optimization in the Xbox OS and in the application OS, saving RAM within each allocation, but I don't expect the split to change. That wouldn't make a lot of sense. We will probably be running more applications, and maxing out the 3GB, by the end of the generation rather than at the beginning.
 
Nice article but...

checking Twitter while sitting through a Call of Duty cutscene, for instance.

Really? Who the **** does this? I can understand maybe looking up a walkthrough while playing a game, but checking twitter? Seriously?

"They're sitting watching a movie and they're texting all the time. "

I hate people who do this because they usually end up missing something, and start asking me questions about the movie I'm actually trying to watch.

I love the idea of switching between games and apps, but I want it for things that actually relate to the game I'm playing. For example being able to load up Halo Waypoint in a second screen while I play a new Halo game and immediately view content as I unlock it (like the terminal vids), or playing Defiance while I listen to Across the Badlands on Twitch. Or having a collectibles map at the side of my screen while I hunt pigeons in Grand Theft Auto. Stuff I already do with my computer.
 
My problem with all this secondary app stuff is that I already have all of this on my tablet, even while sitting in front of a TV. I just don't care about yet another device giving me access to boring, ubiquitous stuff like social networks, communication, and media.

That's great, not everyone does. Conversely you have an equal number of people who complain that they don't want to have to use a second screen to access content.

As with everything it is a balancing act between how different people like to access their entertainment and social media.

Also interesting how many people comment without reading the article.
 
The fact is, PCs today still use DDR3 because of one thing - It is easier on the CPU, and thus easier on an OS. DDR3 + eSRAM allows Microsoft to easily incorporate OS-specific features while simultaneously allowing developers access to the higher bandwidth RAM they want, albeit in small bits. 32MB is plenty for storing a large framebuffer though.
Are you saying that GDDR5 is harder for the CPU to use than DDR3? Pretty sure the only reason CPUs use DDR3 and not GDDR5 is because DDR3 is way cheaper and GDDR5 is overkill for what a CPU needs. GDDR5 has no negative impact on CPUs or OSes.
It's the great latency defence.
 
The problem with this demand for ever prettier art assets is that it's increasingly hard to reconcile with the aforesaid player lust for volatility - both for new software and for the perpetual updating and expansion of that software. Games that are optimised for a certain memory setup may be threatened by the introduction of a new app, as Multerer proceeded to illustrate.

"We find ourselves in a position where if we want to change the [Application Programming Interfaces] and make a bigger buffer or talk to a service that has slightly different requirements, and we need 10 more bytes of RAM - 10 bytes - some game is going to start crashing. We have to be extremely careful and offer up a very predictable environment to the game developers to get the best games on your console."

Hence, Microsoft's decision to run apps and games in separate partitions, so that the two sets of requirements can coexist. "In the application world, what the next-gen gamer wants is lots of change. 'I want lots of apps, I want lots of services, I want to talk to services that may decide to change their APIs every couple of months or so, turn off old ones and expect all new ones' - but games don't work that way.

This is nuts. Apps and multitasking don't sell consoles, games do. They should be devoting most of the resources to games, not to worrying about useless apps.
 
In terms of an absolute scale of "difficulty" I'd wager that they're both extremely easy to develop for compared to last gen.

Here's an artistic rendition:

Difficult........................................................................................................................................Less Difficult
|-----------------PS3-------------360---------------------------PC----------------------------------XB1-PS4--|

Why would PC be significantly more difficult to develop for than Xbox One and PS4? We have reports that some developers are having great success leading on PC and porting to XB1 and PS4 from there, since there are so few major differences.
 
I thought splitting up the RAM was an obvious benefit that everyone would agree with.

It makes sense for the multimedia aspect of the hardware.

It's this exact reason that Sony is doing the same thing for the PS4.
 
Are you saying that GDDR5 is harder for the CPU to use than DDR3? Pretty sure the only reason CPUs use DDR3 and not GDDR5 is because DDR3 is way cheaper and GDDR5 is overkill for what a CPU needs. GDDR5 has no negative impact on CPUs or OSes.

GDDR5 memory has large latency issues. I know some people think that these are alleviated on the PS4 due to its out-of-order CPU, but that is not the case.

Latency doesn't affect a GPU much because a GPU hides it by keeping thousands of threads in flight and switching between them; a CPU core has far less parallelism to hide behind. Out-of-order execution only lets the CPU look a short window ahead for independent work - it can cover a few dozen cycles, not a full round trip to main memory.

When latency gets too high, a CPU can stall.

If you have ever overclocked the RAM in your PC, the most beneficial thing to do is to lower your latencies, not increase the clock speed.
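If anyone wants to see the latency effect rather than argue about it, here's a small, self-contained C++ sketch of my own (nothing to do with either console): dependent random loads versus a plain sequential pass over the same array. Absolute numbers depend on your machine; the gap between the two is the point.

```cpp
// Dependent, cache-missing loads (pointer chasing) stall a CPU core on
// memory latency; independent sequential loads are limited by bandwidth
// and prefetching instead. Purely illustrative; numbers vary by machine.
#include <chrono>
#include <cstdio>
#include <numeric>
#include <random>
#include <utility>
#include <vector>

int main() {
    const std::size_t n = 1u << 24;              // 16M entries, ~128 MB
    std::vector<std::size_t> next(n);
    std::iota(next.begin(), next.end(), 0);

    // Sattolo's algorithm: build one big cycle, so the chase never settles
    // into a short, cache-resident loop.
    std::mt19937_64 rng{42};
    for (std::size_t k = n - 1; k > 0; --k) {
        std::uniform_int_distribution<std::size_t> pick(0, k - 1);
        std::swap(next[k], next[pick(rng)]);
    }

    auto t0 = std::chrono::steady_clock::now();
    std::size_t i = 0, sum = 0;
    for (std::size_t k = 0; k < n; ++k) { i = next[i]; sum += i; }  // latency-bound
    auto t1 = std::chrono::steady_clock::now();
    for (std::size_t k = 0; k < n; ++k) sum += next[k];             // bandwidth-bound
    auto t2 = std::chrono::steady_clock::now();

    using ms = std::chrono::duration<double, std::milli>;
    std::printf("pointer chase: %.0f ms   sequential: %.0f ms   (checksum %zu)\n",
                ms(t1 - t0).count(), ms(t2 - t1).count(), sum);
}
```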
 
I don't see anyone (or at least not vast numbers of people) asking for this stuff. We like TV how it is. We like videogames how they are. Tablets and phones and shit can add to the experience, and we have them already if that's what we want to do. But to spend energy on this stuff on your system seems so foolish. It was like when Microsoft had a huge E3 hubbub about "We have Twitter and Facebook!" and the general reaction was "so what? I can already access those on 1000 other devices." They never gained traction as apps. It was a complete waste of time.

The way I see it, both Sony and Microsoft are going to be competing HEAVILY with Google and Apple for living room dollars, and probably sooner rather than later. Google and Apple will waltz into the living room with the benefit of a huge app and game ecosystem behind them, but both are currently hamstrung by most of their apps/games being in the wrong aspect ratio (4:3 for Apple, 16:10 for Google). Microsoft smartly went with 16:9 as the dominant aspect ratio for their cross-device ecosystem, so they have the benefit of "plug-and-play" for apps/games on a TV. This will probably matter much more mid-way through the generation when Google and Apple play their hand, and if one box has 95% of the games that the other box does, and also has a strong app and service ecosystem underneath, that will drive value.
 