(RUMOR) Xbox One GPU reserve getting smaller soon, down from 10% to 2%

From the Eurogamer article:



So, developers get access to 8 of the reserved 10 percentage points, putting games at up to 98% of the GPU's performance.

Look at it this way.

Let's say the theoretical max compute is 100 units to simplify the math.

Game A is using 90% which is 90 units.

It now gains access to up to 8% more, which translates to 8 units of power.

Before it was running at up to 90 units. Now it runs at up to 98. It has gained 8 units, or 8/90 ≈ an 8.89% gain.
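
If it helps, here's the same arithmetic as a quick Python sketch. The numbers are just the simplified 100-unit example above, not real hardware figures:

```python
# The simplified 100-unit example from above, nothing more.
total_units = 100        # theoretical max compute, simplified
before = 90              # units available to games with the 10% reserve
after = 98               # units available once the reserve drops to 2%

absolute_gain = after - before            # 8 units, i.e. 8% of the whole GPU
relative_gain = absolute_gain / before    # 8 / 90 -> ~8.89% more than before

print(absolute_gain)                      # 8
print(round(relative_gain * 100, 2))      # 8.89
```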
 
Look at it this way.

Let's say the theoretical max compute is 100 units to simplify the math.

Game A is using 90% which is 90 units.

It now gains access to up to 8% more, which translates to 8 units of power.

Before it was running at up to 90 units. Now it runs at up to 98. It has gained 8 units, or 8/90 ≈ an 8.89% gain.

Why would you talk in relative terms instead of absolute? It is in fact using 98% of the GPU, and thus that's the only metric that matters.
 
Look at it this way.

Let's say the theoretical max compute is 100 units to simplify the math.

Game A is using 90% which is 90 units.

It now gains access to up to 8% more, which translates to 8 units of power.

Before it was running at up to 90 units. Now it runs at up to 98. It has gained 8 units, or 8/90 ≈ an 8.89% gain.
You're both right. 8% more of the GPU is available, leading to a ~9% gain over its previous specs. It just depends on what you're looking at.
IMO, using the absolute gain makes more sense (8%).
 
Why would you talk in relative terms instead of absolute? It is in fact using 98% of the GPU, and thus that's the only metric that matters.

Because the relative impact is the actual impact you can expect. It's a bit difficult to explain with such small numbers, so let's instead imagine games started with access to 40% and suddenly gained access to a total of 80%. They'd gain access to a raw 40% more, which would result in an effective 100% relative gain in power for the games in question - and that's the difference players and developers alike would be able to expect to see. The 40% figure wouldn't really mean anything in terms of impact. Hopefully that makes it clear why the 8.89% is a more useful figure than the 8%.
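
To put that side by side in code form (purely illustrative numbers; `relative_gain` is just a name I made up for this post):

```python
# Relative gain depends on how much of the GPU you already had,
# not just on the raw percentage points freed up. Illustrative numbers only.
def relative_gain(before_pct, after_pct):
    return (after_pct - before_pct) / before_pct

print(relative_gain(90, 98))   # ~0.089 -> the case at hand, roughly an 8.9% effective gain
print(relative_gain(40, 80))   # 1.0    -> the hypothetical case above, a 100% effective gain
```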
 
Oh god, not that it's 50% more or 33% less argument.

That's not it at all. "50% more" is well defined. If you have 100 units you have 150 now.

Access to a raw 8% more of the GPU is not well defined because the actual impact depends on how much of it you already have access to. For example if games only had access to 16% of the GPU to begin with, and then received access to a raw 8% more, then this would NOW be a "50% more or 33% less issue" as you mentioned. The point being that the raw figure is not really informative.
 
That's not it at all.

In this discussion, that's exactly what it is. The issue is the frame of reference - 8% of the GPU has been freed up for games; games have ~9% more GPU power to utilize than before. All depends on the angle you look at it.
 
Just follow the steps.
They first say Kinect is required to be plugged in at all times.
Whoops, people don't like that, so it gets changed, and they go as far as advertising that fact.

It was dumb to me just for the fact that if something happened to Kinect, then the whole system was unusable.

The big games are not using Kinect. Titanfall etc.

Devs shouldn't be forced into using Kinect features if they can't think of anything that would be good for what they make.

Now, if this turns out to be true, there's even less need for Kinect, because they are taking away resources dedicated to it.

Again though -- 2% is still there for voice. Many games use voice control with Kinect. Those games could still be improved with the other 8%.

People like paying less, and they can't lower the price significantly and keep Kinect in the box. It's going to happen no matter how much some of you keep your blinders on.

They could if they wanted to. Overall, there are no signs right now that scream "they are going to get rid of Kinect".

This news is simply for games that don't use Kinect motion control (video), which is more than likely going to be true for the majority of Xbox One games.
 
I always felt it was a mistake to make Kinect part of the One. Not only is it a terrible experience to use for gameplay (minus voice, maybe), it's sucking up CPU and GPU power for no good reason. To top it off, it's adding extra cost to the machine.

There really isn't much more to say than that.
 
Perhaps for those who are more into FPS than math, here's an easier way to look at it. If you have a game where the GPU is the bottleneck running at an average 100FPS, then this change would theoretically result in a game that now runs at 108.89FPS on average, not 108. As I said, the difference is minuscule, but misuse of numbers is a pet peeve of mine.
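
A quick sanity check on that figure, assuming the idealised, purely GPU-bound case:

```python
# Idealised, purely GPU-bound case: frame rate scales linearly with available GPU time.
base_fps = 100.0
gpu_before = 0.90    # fraction of the GPU games could use with the 10% reserve
gpu_after = 0.98     # fraction with a 2% reserve

print(round(base_fps * gpu_after / gpu_before, 2))   # 108.89, not 108.0
```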
 
I always felt it was a mistake to make Kinect part of the One. Not only is it a terrible experience to use for gameplay (minus voice, maybe), it's sucking up CPU and GPU power for no good reason. To top it off, it's adding extra cost to the machine.

There really isn't much more to say than that.

But in the opinions of many who use the system, it works well with the UI and makes getting into games fast.

Also, the Kinect Sports demo works surprisingly well. It wasn't terrible at all. I might even end up getting the full game.
 
I don't believe an 8% power improvement will be perceptible outside of benchmarks, certainly not with the CPU, DDR3 and ESRAM bottlenecks. Let's hope I'm wrong.
 
Wow. It's crazy that MS might be losing Snap in order to get the 8% bump. The funny thing is that I was planning on buying an XB1 over the next 2 months, and the reason I'm being pushed to buy one is the Snap/Skype functionality. If they lose that, they lose my sale for the foreseeable future.

It doesn't mean they are losing Snap. "Dodd says that the Xbox One is currently on an 80-20 split"; this means the console is moving to an 88-12 split.

Microsoft released the Xbox One with an OS that could do far more than Sony's but was very unoptimised. They needed to reserve more power until they could refine the OS at a later date. Don't be surprised if we see an extra gig of RAM freed up in the future on these consoles.

It would be interesting to know how much Sony has reserved for the PS4 OS to grow into. Typically it's better to take more at the start and give back.
 
Call me skeptical, but I don't believe this rumor. That reservation is expected to decrease, sure, but decreasing it by this much and leaving it entirely up to the developers really doesn't seem like it would make very much sense when important OS features are taken into account. I don't see that entire 8% of the reservation being wiped away just like that while maintaining existing OS features.
 
In this discussion, that's exactly what it is. The issue is the frame of reference - 8% of the GPU has been freed up for games; games have ~9% more GPU power to utilize than before. All depends on the angle you look at it.

My issue is simply that people will misinterpret the data. Look at two true statements.

1. Games have access to 8% more of the GPU.
2. Games will see a performance gain of up to 8.89%.

Both statements are accurate, but the first is extremely misleading and largely irrelevant, because most people will naturally assume it means that if you gain access to 8% more of the GPU, your game will see a performance gain of up to 8% - which is incorrect.
 
Call me skeptical, but I don't believe this rumor. That reservation is expected to decrease, sure, but decreasing it by this much and leaving it entirely up to the developers really doesn't seem like it would make very much sense when important OS features are taken into account. I don't see that entire 8% of the reservation being wiped away just like that while maintaining existing OS features.

It's not being wiped away. The balancing act will simply be left up to the developers. But I do agree with skepticism here. If games start expecting access to 98% and certain OS functions are expecting access of up to 8% then you're going to have poor performance in one or both if they're both requesting max access simultaneously.
 
Same game, same assets, only the RESOLUTION would change. What's the problem?

Well, you've more than doubled the size of your framebuffer, for one thing. Good luck fitting it into the 32MB of ESRAM that you'd already committed to the 720p engine.

Game coding isn't just a matter of turning a tap on harder to 'get more game out'. These aren't PCs.
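
For a rough sense of the scale involved, here's a back-of-the-envelope sketch. It assumes a single 32-bit-per-pixel render target, and `framebuffer_mib` is just an illustrative name; real engines keep several targets (colour, depth, G-buffer) in that 32MB of ESRAM at once:

```python
# Back-of-the-envelope framebuffer sizes, assuming 4 bytes per pixel and a
# single render target. Real engines juggle several such targets in ESRAM.
def framebuffer_mib(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / (1024 * 1024)

print(round(framebuffer_mib(1280, 720), 1))    # ~3.5 MiB at 720p
print(round(framebuffer_mib(1920, 1080), 1))   # ~7.9 MiB at 1080p (2.25x the pixels)
```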
 
I don't believe an 8% power improvement will be perceptible outside of benchmarks, certainly not with the CPU, DDR3 and ESRAM bottlenecks. Let's hope I'm wrong.

Probably not, but it's still nice considering how far behind XB1 multiplatforms currently are... Getting 2-3fps more out of a 30fps game means you're less likely to get any dips.
 
The 8% figure doesn't sound plausible to me. I'd guess that they are optimizing system-level functions by attempting to schedule free resources dynamically to them instead of giving them a fixed budget. That would free some part of that 8%. But in order to reduce that reservation altogether, you would have to do the same with snapped apps, likely crippling or canceling Snap in the process.

Currently, it seems that they are employing some sort of time-slice scheduling, since that provides the best form of isolation and reliability. A scheduler that dynamically assigns different execution units of the GPU hardware to different processes is possible, but that must necessarily break isolation and reliability/predictability for at least some concurrently running processes.

However, those characteristics are especially important if you're scheduling resources between two independent applications that cannot have any assumptions about each other. If some snapped app wants to render animations while the game is maxing out its GPU budget, and you cannot find a dynamic scheduling that finds enough "holes" to satisfy both needs in time, one application will "starve".

My best guess is that something was lost in translation: they are trying to dial down system functions that use parts of that 8%, but not the budget assigned to snapped apps.

Personally, I would welcome them dropping Snap, but I don't think that's likely.
 
I'm not sure if this is even worth getting excited about. This DF article clearly shows a 7790 GPU performing better for multi-platform titles.

Problem is, the Xbox One GPU, though based on a 7790, is cut back and doesn't perform as well as an actual 7790.

Microsoft really should have just put in an off-the-shelf R7 260X, haha.

But the obsession with the SoC pretty much screwed shit up.

Held back both consoles really, IMO.
 
Don't think it will make a meaningful difference, but obviously every bit helps.
Even better if it somehow means we are on the way to a Kinect-less SKU.
 
Problem is, the Xbox One GPU, though based on a 7790, is cut back and doesn't perform as well as an actual 7790.

Microsoft really should have just put in an off-the-shelf R7 260X, haha.

But the obsession with the SoC pretty much screwed shit up.

Held back both consoles really, IMO.

It also improved yields, helped get mass production up to speed, and enabled both companies to produce a shit ton of consoles on a whim. Why would you want an insane supply chain just for the kicks of a few more gigaflops to throw around? None of these guys were ever going to put in anything past GCN 2.0. The processes just aren't tested enough to ensure reliability. And the last time I checked, reliability + cheaper prices >>>> 10-20% better performance.
 
It also improved yields, helped get mass production up to speed, and enabled both companies to produce a shit ton of consoles on a whim. Why would you want an insane supply chain just for the kicks of a few more gigaflops to throw around? None of these guys were ever going to put in anything past GCN 2.0. The processes just aren't tested enough to ensure reliability. And the last time I checked, reliability + cheaper prices >>>> 10-20% better performance.

I don't disagree. But following that same thought... an 8-9% boost in available GPU is not going to make a meaningful impact either.
 
I don't believe an 8% power improvement will be perceptible outside of benchmarks, certainly not with the CPU, DDR3 and ESRAM bottlenecks. Let's hope I'm wrong.

It'll result in like 3 extra frames per second. People expecting like 720 -> 1080 or 30fps -> 60fps will be on suicide watch.
 
Yeah, can anyone see what's in that category?
It's the only game with this word in the category section. No other game on the store has this in that spot. Kind of weird. It's in the description section where details like genre and amount of players would be listed.

Edit: I think you're onto something.
 
Seems to work for Apple. I actually wouldn't be surprised if we start seeing 18 month 'generations' as a reaction to Steam Boxes if they take off.

Except Apple's (and really all mobile device manufacturers') method of incremental yearly phone releases doesn't completely obsolete their previous hardware, hence leading to immense fragmentation. That is to say, if the iPhone 6 launched and I wanted to keep my iPhone 5s, I wouldn't be dead in the water in terms of app accessibility. 99% of apps will work perfectly fine without any real performance loss.

This sort of business model doesn't work for consoles unless they can get the profit margins and sheer volume these mobile guys get. Apple can eat the cost involved in R&D by selling a crapload of devices with a huge amount of profit attached to each sale. Other companies get their designs from other companies and just rework them. You can't do this with consoles, which by definition are closed-box devices and make money off software and services, not hardware.

The Steambox can probably have yearly iterations because it's a desktop. iBUYPOWER isn't eating billions in R&D costs. They're getting the software from Valve, getting the hardware from typical PC vendors, and aren't eating costs to make the product more price-attractive.
 