OXM: Why splitting Xbox One's OS RAM allocation is good for developers

Why would PC be significantly more difficult to develop for than the Xbox One and PS4? We have reports that some developers are having great success leading on PC and porting to XB1 and PS4 from there, since there are so few major differences.



Developing for a PC is easy. The easiest, probably.


But the PC as a platform presents an enormous amount of hardware variation to deal with. It's not uncommon at all for PC games to be flat-out broken on certain video cards or sound cards or... whatever. Different hardware, different OSes, different everything in nearly every customer's box. It's not easy.
 
It sounds bad on paper but it's just how it is.

If people want features like resume, downloading in the background, instant-on, internet, etc., they're going to have to pay for it with RAM.

When it comes down to it, games are only going to have to run at 1080p. It's not like they need that full 8GB to impress. Look at the kind of quality we're getting out of something like The Last of Us, which is working with 512MB of memory total. Even 1GB would be huge, but games are going to have 5+ GB to work with.

There's nothing to worry about at all.
 
No, thuway means the eSRAM + DDR3 setup is more complicated than just a single pool of GDDR5.

Correct me if I'm wrong, but isn't this the same setup developers have been using on the 360 for nearly a decade?

What suddenly makes it more complicated? The different RAM? I know the 360 used eDRAM. Is eSRAM harder/more complicated to use?
 
Why would PC be significantly more difficult to develop for than the Xbox One and PS4? We have reports that some developers are having great success leading on PC and porting to XB1 and PS4 from there, since there are so few major differences.

While I don't agree that it would be 'significantly' more difficult, consoles do have the advantage over PC of being a single target with fixed, unchanging specs.
 
This is nuts. Apps and multitasking don't sell consoles, games do. They should be devoting most of the resources to game development, not worrying about useless apps.

Well, it's a good thing that both boxes give huge (by comparison) chunks of RAM and CPU/GPU cycles to games, then. Game developers aren't suddenly running out and developing apps just because the box can run them.

Why would PC be significantly more difficult to develop for than the Xbox One and PS4? We have reports that some developers are having great success leading on PC and porting to XB1 and PS4 from there, since there are so few major differences.

It was just an uneducated guess on my part. I assumed developing for a closed box would be easier than for an open box.
 
  • Addressable game RAM = developers can use it as they want.
  • Non-addressable game RAM = the OS handles it on behalf of the game; developers can request it but don't need to worry about it, as it's just virtual memory managed by the OS.
  • OS RAM = developers can't touch it at all.
It's not very difficult to understand. (For a concrete illustration of the middle category, see the sketch below.)
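
For anyone who wants that middle category made concrete, here's a minimal sketch in plain C using Unix mmap. It is not any console SDK (all names and sizes are illustrative stand-ins); it just shows the difference between memory a program allocates and manages itself and address space it merely reserves while the OS handles the paging:

```c
/* Minimal sketch: self-managed allocation vs OS-managed virtual memory.
   Plain C on a Unix-like system, nothing console-specific; sizes are arbitrary. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <sys/mman.h>

int main(void) {
    /* "Addressable game RAM": the program allocates it and manages
       every byte itself. */
    size_t game_bytes = 64 * 1024 * 1024;
    char *game_pool = malloc(game_bytes);
    if (!game_pool) return 1;
    memset(game_pool, 0, game_bytes);   /* touched, so actually resident */

    /* "Non-addressable / OS-managed": reserve address space and let the
       OS decide when to back it with physical pages. Nothing is paged
       in until it's touched; the OS does the bookkeeping. */
    size_t virt_bytes = 256 * 1024 * 1024;
    char *virt = mmap(NULL, virt_bytes, PROT_READ | PROT_WRITE,
                      MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
    if (virt == MAP_FAILED) { free(game_pool); return 1; }

    printf("reserved %zu MiB of OS-managed address space at %p\n",
           virt_bytes >> 20, (void *)virt);

    munmap(virt, virt_bytes);
    free(game_pool);
    return 0;
}
```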

Exactly, and what they're also trying to say is that some game functionality could be split out and built as a separate app that runs off the 3GB of RAM reserved for the OS.
 
Correct me if I'm wrong, but isn't this the same setup developers have been using on the 360 for nearly a decade?

What makes it suddenly more complicated? The different RAM?

No, it has not gotten any more complicated. Sony just made a setup that is less complicated than the one the Xbone is using.
 
Correct me if I'm wrong, but isn't this the same setup developers have been using on the 360 for nearly a decade?

What makes it suddenly more complicated? The different RAM?

I think "more complicated" is a bit of a misnomer. While technically it is, it's also what people have been doing for years... so it's not complicated.


A unified pool is "easier," but that's like saying that touch is better for surfing the web... it may be for some people, but for people who are used to doing it with a mouse/KB, it's not exactly hard to surf the web. Y'know?


That's a shit example, but hopefully my point came across.
 
Developing for a PC is easy. The easiest, probably.


But the PC as a platform presents an enormous amount of hardware variation to deal with. It's not uncommon at all for PC games to be flat-out broken on certain video cards or sound cards or... whatever. Different hardware, different OSes, different everything in nearly every customer's box. It's not easy.

Yeah, but that's not the lion's share of development. Just making the game and getting it to work on a PC is the easiest part of all; the complication comes when you try to ensure compatibility and performance across a variety of setups. It's not a no-effort process, clearly, but even that can be mitigated to some degree by setting relatively high system requirements and dropping support for older OSes.

It was just an uneducated guess on my part. I assumed developing for a closed box would be easier than for an open box.

Yes and no, see above. A closed box can also be "harder" in the sense that you may run into the hard limits of the system's capabilities before you get everything working the way you want it.
 
No, it has not gotten any more complicated. Sony just made a setup that is less complicated than the one the Xbone is using.

Ah, okay. That makes a lot more sense. I can see how it might be more complicated than the PS4's setup, but developers are all used to it by now, so I don't think there will be any issues with them getting to grips with it.

I think "more complicated" is a bit of a misnomer. While technically it is, it's also what people have been doing for years... so it's not complicated.

A unified pool is "easier," but that's like saying that touch is better for surfing the web... it may be for some people, but for people who are used to doing it with a mouse/KB, it's not exactly hard to surf the web. Y'know?

That's a shit example, but hopefully my point came across.

Yeah, that's a pretty bad example but the point is fully understood.
 
The 360 utilized eDRAM.

Definitions of both are below.

SRAM (Static Random Access Memory): the word "static" indicates that it does not need to be periodically refreshed, as SRAM uses bistable latching circuitry (i.e., flip-flops) to store each bit. Each bit is stored as a voltage. Each memory cell requires six transistors, giving the chip low density but high speed. However, SRAM is still volatile in the (conventional) sense that data is lost when powered down. Its disadvantages are that it is more expensive and consumes more power than DRAM.

In high-speed processors (such as the Pentium), SRAM is known as cache memory and is included on the processor chip. However, high-speed cache memory is also included external to the processor to improve total performance.

DRAM (Dynamic Random Access Memory): its advantage over SRAM is its structural simplicity. Only one transistor (a MOSFET gate) and a capacitor (to store a bit as a charge) are required per bit, compared to six transistors in SRAM. This allows DRAM to reach very high density. It also consumes less power and is cheaper than SRAM (except when the system size is less than 8K).

The disadvantage is that since it stores each bit as a charge, which leaks, the information needs to be read and rewritten every few milliseconds. This is known as refreshing the memory, and it requires extra circuitry, adding to the cost of the system.
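
To put the six-transistors-versus-one figure in console terms, here's a back-of-the-envelope calculation in plain C. The 6T/1T1C per-bit counts come from the definitions above; real chips add decoders, sense amplifiers, and other overhead, so treat these as rough lower bounds:

```c
/* Rough cell-count arithmetic for SRAM vs DRAM using the per-bit
   figures above (6 transistors vs 1 transistor + 1 capacitor). */
#include <stdio.h>

int main(void) {
    const unsigned long long bits = 32ULL * 1024 * 1024 * 8; /* 32 MB */

    unsigned long long sram_t = bits * 6;  /* 6T SRAM cell  */
    unsigned long long dram_t = bits * 1;  /* 1T1C DRAM cell */

    printf("32 MB as SRAM: ~%.2f billion transistors\n", sram_t / 1e9);
    printf("32 MB as DRAM: ~%.2f billion transistors\n", dram_t / 1e9);
    /* ~1.6 billion transistors for the SRAM cells alone is a huge slice
       of an SoC's budget, which is why embedded SRAM pools stay small
       (tens of MB) while main memory is DRAM (GBs). */
    return 0;
}
```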
 
GDDR5 memory has large latency issues. I know some people think these are alleviated in the PS4 due to out-of-order CPUs, but that is not the case.

Latency does not affect a GPU much because GPUs are capable of executing instructions in parallel - CPUs cannot. CPUs are linear and can only issue one instruction at a time. Out-of-order execution simply changes the way the CPU finds the next instruction; it does not allow a CPU to issue two instructions at the same time.

When latency gets too high, a CPU can stall.

If you have ever overclocked the RAM in your PC, the most beneficial thing to do is to lower your latencies, not increase the clock speed.

Could you please stop spreading bullshit?

There is no inherent latency with GDDR5.

The difference in latency between DDR3 and GDDR5 on PC is down to the memory controllers and what kind of use case they are optimized for.

Do you know the specifications of the PS4's GDDR5 memory controller? If not, please STFU.
 
The RAM isn't split like the PS3's was. All this is showing is the resources allocated to running apps, and how devs could use that to, say, run their game in the game partition but supplement it with an app in the Windows partition that runs right alongside it.
 
GDDR5 memory has large latency issues. I know some people think these are alleviated in the PS4 due to out-of-order CPUs, but that is not the case.

Latency does not affect a GPU much because GPUs are capable of executing instructions in parallel - CPUs cannot. CPUs are linear and can only issue one instruction at a time. Out-of-order execution simply changes the way the CPU finds the next instruction; it does not allow a CPU to issue two instructions at the same time.

When latency gets too high, a CPU can stall.

If you have ever overclocked the RAM in your PC, the most beneficial thing to do is to lower your latencies, not increase the clock speed.

In a PC environment I would agree with you. But:

1. The PS4 uses an APU with AMD memory controllers, which are pretty low latency.
2. The Onion and Garlic buses help reduce latency.
3. There is a reason why Sony went the extra step for GPGPU (extra ACEs). They want devs to start shifting some computations to the GPGPU, which, like you said, is latency tolerant (see the sketch below for what latency-bound actually means on the CPU side).
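
For anyone wondering what latency-bound actually looks like in practice, here's a minimal pointer-chasing sketch in plain C. Every load depends on the previous one, so even an out-of-order CPU can't overlap them and the loop runs at roughly one memory latency per iteration. Nothing in it is PS4-, DDR3-, or GDDR5-specific; it just illustrates the mechanism the posts above are arguing about:

```c
/* Pointer-chasing sketch: dependent loads expose raw memory latency.
   Plain C, illustrative only; not a rigorous benchmark harness. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N (1u << 24)  /* 16M nodes * 8 bytes = 128 MiB, far bigger than cache */

int main(void) {
    size_t *next = malloc((size_t)N * sizeof *next);
    if (!next) return 1;

    /* Sattolo's algorithm builds one random cycle through all N nodes,
       so the walk below defeats caches and hardware prefetchers. */
    for (size_t i = 0; i < N; i++) next[i] = i;
    srand(42);
    for (size_t i = N - 1; i > 0; i--) {
        size_t j = (size_t)rand() % i;  /* sketch: assumes RAND_MAX is large */
        size_t t = next[i]; next[i] = next[j]; next[j] = t;
    }

    /* Each iteration must wait for the previous load to complete, so
       total time is roughly N * memory latency; no amount of
       out-of-order machinery can overlap a dependent chain. */
    clock_t t0 = clock();
    size_t p = 0;
    for (size_t i = 0; i < N; i++) p = next[p];
    double secs = (double)(clock() - t0) / CLOCKS_PER_SEC;

    printf("~%.1f ns per dependent load (end node %zu)\n",
           secs / N * 1e9, p);
    free(next);
    return 0;
}
```

Run it next to a plain sequential sum over the same array and the gap between latency-bound and bandwidth-bound access is usually an order of magnitude or more.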
 
Exactly, and what they're also trying to say is that some game functionality could be split out and built as a separate app that runs off the 3GB of RAM reserved for the OS.

Yeah, I thought this was pretty cool, actually. I wonder if Sony will be allowing the same sort of functionality as well.
 
Come on, the two quotes I gave are clearly meant to paint those of us who want all resources devoted to games as being obsessed with unimportant details.

It's an obvious attempt to trivialize the criticism.

It's hard to find fault with their decisions for unreleased hardware, since there is a distinct lack of evidence proving that they are the wrong ones, or that they cost the gamers making those complaints anything real rather than theoretical and assumed. We all know more memory is better, but at some point, fully-enabled streaming from modern architectures designed to accommodate it at the hardware level, plus better-designed memory subsystems and utilization, can probably fully offset any loss of potential gains from having more.

MS wants to avoid or attack the memory wall with ESRAM in their architecture and added custom logic to manage it; Sony wants to avoid or attack the same impediment with more bandwidth from a single off-chip pool. Both have new software and hardware designs allowing them to better manage memory usage regardless of the hardware differences that come from their deeper customizations. I don't see any reason to worry until we have final games on final hardware repeatedly exhibiting issues that don't go away after a more mature second generation of software hits.
 
I think both MS and Sony were smart to logically partition off the OS that way, regardless of the physical aspects. I'm sure these numbers will dwindle as they optimize the crap out of it, too. Getting these things into functional condition is always the biggest feat.
 
Correct me if I'm wrong, but isn't this the same setup developers have been using on the 360 for nearly a decade?

What suddenly makes it more complicated? The different RAM? I know the 360 used eDRAM. Is eSRAM harder/more complicated to use?

The "more complicated," if I'm correct, is in relation to the PS4's memory architecture.
 
Both have new software and hardware designs allowing them to better manage memory usage regardless of the hardware differences that come from their deeper customizations. I don't see any reason to worry until we have final games on final hardware repeatedly exhibiting issues that don't go away after a more mature second generation of software hits.

Next-gen complaints: Hardware Jank
 
Does this info shed new light on this? I kind of thought we understood the partitioning of the memory. We just don't know how well it works yet, and what trade-offs, if any, come with this setup.
 
It's the exact same thing that devs worked with on the 360...

Not really true. The Xbox 360's eDRAM was hardwired for the Z-buffer and frame buffer. The Xbox One's ESRAM supports the Z-buffer, frame buffer, textures, and compute data. To get everything to fit in ESRAM, developers will have to use tiling. Many Xbox 360 developers avoided tiling; they just lowered the resolution.
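
Some back-of-the-envelope arithmetic makes the tiling point concrete. The 32MB ESRAM and 10MB eDRAM capacities are the widely reported figures for the two consoles; the render-target formats below are just typical examples, not numbers from any real engine:

```c
/* Rough render-target sizing vs on-chip memory budgets.
   Illustrative only; real engines vary formats and counts. */
#include <stdio.h>

/* Size of a w x h render target in MiB at the given bytes per pixel. */
static double target_mib(int w, int h, int bytes_per_pixel) {
    return (double)w * h * bytes_per_pixel / (1024.0 * 1024.0);
}

int main(void) {
    const double esram_mib = 32.0;  /* Xbox One ESRAM */
    const double edram_mib = 10.0;  /* Xbox 360 eDRAM, for comparison */

    double color = target_mib(1920, 1080, 4); /* RGBA8 color, ~7.9 MiB */
    double depth = target_mib(1920, 1080, 4); /* D24S8 depth, ~7.9 MiB */
    double gbuf  = 3 * color;                 /* three extra G-buffer targets */

    printf("color + depth at 1080p: %.1f of %.1f MiB ESRAM - fits\n",
           color + depth, esram_mib);
    printf("add a 3-target G-buffer: %.1f MiB - overflows, so tile\n",
           color + depth + gbuf);
    printf("(the 360's %.1f MiB eDRAM couldn't even hold color + depth\n"
           " at 1080p, hence the lowered resolutions last gen)\n",
           edram_mib);
    return 0;
}
```

The point of the comparison: the One's 32MB holds a basic 1080p color + depth pair comfortably, but a deferred renderer's full set of targets overflows it, which is exactly where the tiling mentioned above comes in.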
 
Does this info shed new light on this? I kind of thought we understood the partitioning of the memory. We just don't know how well it works yet, and what trade-offs, if any, come with this setup.

No, in typical MS fashion.
 
I think both MS and Sony were smart to logically partition off the OS that way, regardless of the physical aspects. I'm sure these numbers will dwindle as they optimize the crap out of it, too. Getting these things into functional condition is always the biggest feat.

I don't think the allocations will change, whether for MS or Sony. They are the result of estimating how many resources have to be held back for the simultaneous execution of an unknown number of apps. The 3GB allocation will be even more handy later in the generation, when we'll have more apps at our disposal, so I don't think it will ever make sense to reduce it.

I don't expect the memory split to change; the OS in each VM could be optimized, though, but the one in the game VM is already said to be "very thin" (which I understand as small, with little overhead).
 
Does this info shed new light on this? I kind of thought we understood the partitioning of the memory. We just don't know how well it works yet, and what trade-offs, if any, come with this setup.

Those still disgruntled by the withholding of RAM that might be devoted exclusively to sharper gun renders may be mollified by the following: those simultaneously-running apps could form part of the game in some way. Developer fondness for networked features such as DICE's Battlelog services shows no sign of abating, and it's possible the Windows-based partition might handle certain of these services in future.

That part is new info, sorta, and could be interesting.
 
I don't understand posts like this. Sony literally just released PR stating that they aren't telling anyone what their OS RAM reservation is. Shouldn't you say that this is in typical Sony fashion, then?

I believe he's referring to Microsoft never having revealed any further information regarding hardware specifications, in contrast to the information Sony presented at their reveal.
 
MS: Hi!
Devs: Hi!
MS: Here's 8GB of RAM!
Devs: Cool!
MS: Now, use 5GB for Games!
Devs: Cool!
MS: And 3GB for companion/Windows Apps!
Devs: Cool!
MS: You cool?
Devs: We're cool!

It's that simple. And why are people suggesting that development on the XB1 will be hard? Devs have years of experience using eDRAM on the 360; the XB1 is no different.

As easy as the PS4? No. Hard? Hell no.
 
This is nuts. Apps and multitasking don't sell consoles, games do. They should be devoting most of the resources to game development, not worrying about useless apps.

I think it's too early to say anything about what's going to sell these consoles other than the games. A ton of people bought a PS3 just to have a Blu-Ray drive, or as their primary Netflix device.

Additional features outside of the box's marquee feature (games yo) drive value, and informed consumers use those additional features to influence their purchasing decisions.

I believe he's referring to Microsoft never having revealed any further information regarding hardware specifications, in contrast to the information Sony presented at their reveal.

Ah, I agree; I want to know much more about the XB1's hardware too. I'm holding out hope that the GPU upclock rumor is the reason they're being coy about it.

In terms of OS RAM reservation, though, Microsoft has been much more forthcoming, and oddly enough, got crucified for it.
 
I don't think the allocations will change, whether for MS or Sony. They are the result of estimating how many resources have to be held back for the simultaneous execution of an unknown number of apps. The 3GB allocation will be even more handy later in the generation, when we'll have more apps at our disposal, so I don't think it will ever make sense to reduce it.

I don't expect the memory split to change; the OS in each VM could be optimized, though, but the one in the game VM is already said to be "very thin" (which I understand as small, with little overhead).

Could be. I was just thinking back on this gen and how the PS3's OS dwindled in size but increased in functionality. I'm not on the programming side of things, so I was also just assuming that logically partitioned RAM would be easier to reallocate later.
 
I think that as long as developers are happy with the amount of RAM available (be it on XB1/PS4), there's no reason to ride the rides at the carnival of stupid. Most people freaking out have no idea how this shit works anyway (I am one of those schmucks).
 
I think it's too early to say anything about what's going to sell these consoles other than the games. A ton of people bought a PS3 just to have a Blu-Ray drive, or as their primary Netflix device.

The difference is that you can get literally all of this for little money right now, whereas Blu-ray players were very expensive at the time of the PS3's launch. I, for instance, have a Netflix client in (1) my iPad, (2) my Apple TV, (3) my TV, (4) my PS3, (5) my 360, (6) my Wii U, and (7) on my notebook, if I would bother to connect it to the TV (have I forgotten anything?). Netflix (and similar stuff) is becoming sort of a spam feature. Seriously, if anybody does not have a device capable of displaying Netflix on their TV, buying a $499 game console for that purpose is not the first idea that would come to mind. There are way cheaper alternatives.
 
To be fair, developers seem to be fine with 5GB of main memory for games. Hence, if Microsoft had gone for a games-only console, they most probably would have included 4GB instead of 8GB. They could even have considered 4GB of GDDR5, just like Sony did.

An interesting question is how much of the 3GB of memory reserved for apps can still be assigned to games later in the console's life cycle.

Actually, an equally interesting question (for me at least) is how much of the 3GB can be used for/by Apps, etc.

I generally assumed that the 3 gigs was reserved for the three OSes, and for future growth if needed. But wouldn't they also have to guarantee a set amount for apps at all times as well?
 
I think that as long as developers are happy with the amount of RAM available (be it on XB1/PS4), there's no reason to ride the rides at the carnival of stupid. Most people freaking out have no idea how this shit works anyway (I am one of those schmucks).

A year from now, gamers will be playing great games on next-gen hardware. No one is going to be thinking about ESRAM, bandwidth, OS allocations, or even GDDR5 - not when you have zombies or whatnot hunting you down in the heat of battle... or retreat.
 
The difference is that you can get literally all of this for little money right now, whereas Blu-ray players were very expensive at the time of the PS3's launch. I, for instance, have a Netflix client in (1) my iPad, (2) my Apple TV, and (3) my TV (and (4) on my notebook, if I would bother to connect it to the TV). Netflix (and similar stuff) is becoming sort of a spam feature. Seriously, if anybody does not have a device capable of displaying Netflix on their TV, buying a $499 game console for that purpose is not the first idea that would come to mind. There are way cheaper alternatives.

You're right. The good thing about these consoles is that they're going to be able to play games - games that Apple/Google/Samsung/Facebook/Amazon won't be able to get. I don't think anyone is saying that people are going to go out and buy a $500 XB1 or $400 PS4 just to run apps; they'll buy them for the games, and the OS features and apps are value-adds that may or may not influence consumers' purchasing decisions.

Actually, an equally interesting question (for me at least) is how much of the 3GB can be used for/by Apps, etc.

I generally assumed that the 3 gigs was reserved for the three OSes, and for future growth if needed. But wouldn't they also have to guarantee a set amount for apps at all times as well?

Both consoles have RAM reserved for the operating system: 2-3GB for the PS4 and 3GB for the XB1. This doesn't mean that that's the size of the OS, rather that Sony/MS must make guarantees to developers about how much RAM they'll be able to address for games. As for the app question, I believe we'll see more about this when Microsoft details their cross-platform development plans for the XB1.
 
It's really hard to get excited about being told they're allocating RAM this way so games will still work when they add shit you probably won't care about down the road. Also:
Those still disgruntled by the withholding of RAM that might be devoted exclusively to sharper gun renders may be mollified by the following: those simultaneously-running apps could form part of the game in some way. Developer fondness for networked features such as DICE's Battlelog services shows no sign of abating, and it's possible the Windows-based partition might handle certain of these services in future.
Wowsers!
 
I don't understand posts like this. Sony literally just released PR stating that they aren't telling anyone what their OS RAM reservation is. Shouldn't you say that this is in typical Sony fashion, then?

Sony commented on a rumor-filled, misinformed article. Microsoft released their own article with no new information.

They're not really comparable.
 
Sony commented on a rumor-filled, misinformed article. Microsoft released their own article with no new information.

They're not really comparable.

It was a troll attempt, and Lee was pointing it out. Nothing more. Awesome troll, considering we're still talking about it.
 