EDGE: "Power struggle: the real differences between PS4 and Xbox One performance"

There is no huge gap, where are you getting this from? The recent PS4 footage of the UI and performance clearly shows how fast it is, and there are tons of new features and overall improvements. We will also see a full breakdown of the PS4 UI/PSN from IGN within the next 7 days, so you will see more of what I'm talking about. These are completely different consoles; this isn't PS3 and 360.

I'm basing this mostly on previous gen, which I think will translate heavily into next gen. We aren't talking about new hardware built from the ground up. The networks will build mostly on what is already there. Xbox Live and PSN, while having different UIs on the new consoles, will be essentially a cross-gen service. I think Sony will close the gap a little this time but it will still be a significant gap.

I don't want to derail the thread since it is about Hardware performance and not a Live/PSN thread but I felt like I should defend my statement.
 
I'm basing this mostly on previous gen, which I think will translate heavily into next gen.
[...]
I don't want to derail the thread since it is about Hardware performance and not a Live/PSN thread but I felt like I should defend my statement.
I don't think you've defended your statement at all. You've just asserted it once again.

You could at least list some items that make up this huge gap instead of speaking in generalities that can't be analyzed.
 
I'm basing this mostly on previous gen, which I think will translate heavily into next gen. We aren't talking about new hardware built from the ground up. The networks will build mostly on what is already there. Xbox Live and PSN, while having different UIs on the new consoles, will be essentially a cross-gen service. I think Sony will close the gap a little this time but it will still be a significant gap.

I don't want to derail the thread since it is about Hardware performance and not a Live/PSN thread but I felt like I should defend my statement.

Says who? I mean this is entirely conjecture. And not even conjecture based on what they've shown so far.
 
And a little fun fact at the end, the raster/triangle/geometry throughput from the front-end is 6.6% higher on the Xbox One GPU.

Well, since you seem very attached to fair facts and comparisons:
The PS3 had half the triangle setup rate of the 360, and we never saw any game with lower polycounts on assets, not even in the worst multiplatform ports. First-party games were really heavy in that regard, more so than 360 exclusives, because you hit limitations long before filling the theoretical peak of the setup engine.
It does give a tick to the X1 in the comparison chart, but it's the only one it has, and it's irrelevant.
 
I'm invested in the Microsoft brand; the Xbox 360 was very good to me. I'm sticking with the Xbox One mostly because of the Xbox Live ecosystem and, to a lesser degree, Kinect and the controller.

But to be perfectly honest, all the tech talk has me both worried and kind of irritated. Looking at the facts, my system of preference is going to have a significant power disadvantage. If the power disadvantage were due to Microsoft working on improving Live and adding Kinect, then I wouldn't be so irritated, but I honestly feel like MS was surprised by the type of machine Sony could put out, so they lowballed all of us gamers expecting Sony to put out a similarly spec'd device.

None of this is going to sway my choice, due to the huge gap between Live and PSN (in my opinion). But I'd be lying if I said I wasn't worried about it.
My guess is that MS thought the PS4 would have the rumored 4GB of RAM, so they figured 8GB would be enough for the XB1. But Sony doubled the RAM at the last minute, and lower specs plus an equal amount of RAM certainly puts MS in a rough spot.
 
That has nothing to do with any alleged reservation of GPU time by the OS. The crucial technology that allows Microsoft to schedule GPU time reliably in slices is the virtualization by the hypervisor OS.
Both have a second graphics pipeline that sends work queues to the shaders, and those queues have a higher priority than the other pipelines, so both do reserve some processing time for something.
 
I challenge that. Why exactly should the PS4 reserve GPU time for the OS, when there is no feature needing GPU time, whereas on the XBO there are snapped Metro apps and Kinect?


Even though the PS4 doesn't have snap, the OS will need resources for multitasking, running the GUI when prompted by the user, etc. How much is reserved for the OS is unknown.
 
That is no indication of an alleged fixed reservation of GPU time by the OS. The crucial technology that allows Microsoft to schedule GPU time reliably in slices is the virtualization by the hypervisor OS.

Apart from that, if there are no known OS features that would need GPU time while a game is running then, by Occam's razor, there probably are none.

Aren't you using the Occam's razor incorrectly?

It states that among competing hypotheses, the hypothesis with the fewest assumptions should be selected.

The moment you get into things that are probably not there, you are making more assumptions. 1.31 vs 1.84 has the fewest assumptions at this point. Though a 10% reservation is extremely unlikely for the PS4, assuming there is none is still an added assumption (a reasonable one, but an assumption nonetheless). It is only fair to keep the overhead factor on hold until we don't have to make that assumption for the PS4.
 
Why? There is a large gap in absolute pixels that would influence that, so it's not a given that everyone will be able to see the difference in both cases.

I don't know. In my opinion, the eye only notices pixel density; the absolute pixel count is irrelevant for qualitative assessment. That's the reason we normalize differences to percentages, right? For the same screen size, a 20% density difference has to be less noticeable than a 30% density difference, until one reaches a point of diminishing returns, which I don't think happens before 1080p.
 
Both have a second graphics pipeline that sends work queues to the shaders, and those queues have a higher priority than the other pipelines, so both do reserve some processing time for something.

No, the presence of additional, reserved command buffers does not indicate a reservation of GPU time.

An enforced fixed reservation of resources is only necessary if you are running at least two processes concurrently that cannot negotiate/schedule resource consumption with each other. This is the case when you run a game concurrently with an arbitrary app, both computing and rendering stuff at the same time. Both programs have been developed independently, and neither knows anything about the other. Hence, an authority, in this case the virtualization OS, has to guarantee a fixed amount of resources at any given time for both programs to run without mutual corruption (e.g. the app causing frame dips in the game when both are requesting their peak resource demands).
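To make that concrete, here's a toy sketch (Python, invented numbers) of why an enforced slice protects the game from an app it knows nothing about. This is not how the actual hypervisor works; the frame budget and the 10% slice are just illustrative assumptions.

```python
# Toy illustration only: a fixed, enforced slice for the side app means the game's
# GPU budget never shrinks, no matter how greedy the app gets.
GPU_FRAME_MS = 16.6                    # one 60 Hz frame of GPU time (assumed)
APP_SLICE_MS = GPU_FRAME_MS * 0.10     # fixed slice guaranteed to the snapped app (assumed)

def schedule_frame(game_demand_ms, app_demand_ms):
    # The app never gets more than its reserved slice...
    app_time = min(app_demand_ms, APP_SLICE_MS)
    # ...and the game always keeps the rest of the frame.
    game_time = min(game_demand_ms, GPU_FRAME_MS - APP_SLICE_MS)
    return game_time, app_time

# Even if the snapped app suddenly spikes, the game's guaranteed budget is untouched:
print(schedule_frame(game_demand_ms=15.0, app_demand_ms=1.0))   # roughly (14.94, 1.0)
print(schedule_frame(game_demand_ms=15.0, app_demand_ms=8.0))   # roughly (14.94, 1.66)
```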

Even though the PS4 doesn't have snap, the OS will need resources for multitasking, running the GUI when prompted by the user, etc. How much is reserved for the OS is unknown.

Why would you need to render something if that something is not displayed?

Aren't you using the Occam's razor incorrectly?

Nope. Hypothesis A ("PS4 reserves GPU time") needs the additional ad-hoc assumption that there is some currently unknown feature that needs a fixed reservation of GPU time. Hypothesis B ("PS4 does not reserve GPU time") does not need that assumption. It is hence the more economical, and as a result of basic probability theory, the more probable hypothesis.
 
The GPU power has to be weighted by the number of ACEs and compute queues.

Let's say you have a 1 TFLOPS GPU; that's its theoretical maximum performance. The same goes for any GPU with the same architecture; efficiency stays the same (except with compute).
It means that in practice you'll never actually reach that 1 TFLOPS. What compute allows is to get closer to the 1 TFLOPS by filling, as much as it can, the times where nothing happens. It's millisecond stuff, since the GPU is working most of the time, but with the power we have today, even milliseconds allow for a lot of stuff.

So if your 1.31 TFLOPS X1 GPU has 10% less because of the reservation, that means 131 GFLOPS less. It does have an impact, but an impact on something you wouldn't have reached anyway.

The same thing happens with the PS4, but it has a lot more ACEs, so its GPU will be even more capable of closing in on its theoretical maximum of 1.84 TFLOPS, and it won't have the 10% reservation to begin with. The reservation could even be 0%, because the quick OS display could be handled during the lost cycles of the GPU, or even on the CPU; it's not as demanding as Kinect + snap.
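A back-of-the-envelope version of that argument (Python). Only the 1.31/1.84 TFLOPS peaks and the XB1's ~10% reservation come from the discussion; the utilization figures are invented purely to illustrate the "more ACEs keep the GPU busier" point.

```python
# Effective throughput = peak, minus the OS slice, scaled by how busy the queues keep the GPU.
def effective_gflops(peak_tflops, os_reserve, utilization):
    return peak_tflops * 1000 * (1 - os_reserve) * utilization

xb1 = effective_gflops(1.31, os_reserve=0.10, utilization=0.80)   # utilization is an assumption
ps4 = effective_gflops(1.84, os_reserve=0.00, utilization=0.85)   # assumed: more ACEs -> slightly busier

print(f"10% of 1.31 TFLOPS = {1.31 * 1000 * 0.10:.0f} GFLOPS off the top")   # 131 GFLOPS
print(f"XB1 ~{xb1:.0f} GFLOPS usable, PS4 ~{ps4:.0f} GFLOPS usable")
```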
 
Says who? I mean this is entirely conjecture. And not even conjecture based on what they've shown so far.

Sure it's conjecture. What is so wrong with basing your opinions on how current gen went? Sony had much better first party support this gen. They invested in their studios and reaped the benefits. Most of us think that Sony will have a better first party offering this gen as well because they have already established that base. Could MS do better than Sony this gen despite being behind right now? Absolutely, but I tend to think that they won't due to the investments Sony has already made.

Same thing goes for the internet service. MS is heavily invested into the Xbox Live ecosystem so I tend to think they are going to continue to be the leader in that regard. That doesn't mean Sony can't overtake them.

When it comes to these hardware power differences, they are cold hard facts and it’s impossible to argue that Sony isn’t ahead. But when it comes to Live/PSN, why not base it on previous gen until more facts come out? Microsoft has talked a lot about an improved reputation system and smart match. Sony has made some improvements as well, but until I see more information out there I’m going to look at current gen as the benchmark.

edit: Fixed some spelling. Spoiled by browser spell check and we have an old crappy IE explorer at work ;)
 
I don't know. In my opinion, the eye only notices pixel density; the absolute pixel count is irrelevant for qualitative assessment. That's the reason we normalize differences to percentages, right? For the same screen size, a 20% density difference has to be less noticeable than a 30% density difference, until one reaches a point of diminishing returns, which I don't think happens before 1080p.
I disagree with the premise.

If the screen size stays the same, the 720p pixels are bigger. Let's just be generous and not mathematically correct and say they get twice the size.

That means a 20% difference at 720p is equivalent to a 40% difference at 1080p.
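For reference, here's the exact pixel-count arithmetic behind that hand-wave (a quick sketch; whether noticeability really scales with pixel area is a separate question):

```python
# Same screen size assumed throughout.
px_1080p = 1920 * 1080           # 2,073,600 pixels
px_720p  = 1280 * 720            #   921,600 pixels

area_ratio = px_1080p / px_720p  # how many 1080p pixels fit in one 720p pixel's area
print(area_ratio)                # 2.25, a bit more than the "twice the size" used above

# By the same loose scaling, a 20% gap at 720p covers about as much screen area
# as a 20% * 2.25 = 45% gap at 1080p.
print(0.20 * area_ratio)         # 0.45
```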
 
Sort of.

The biggest problems this generation came not from fancy shader effects and shit but from core game systems not being possible on PS360 levels of memory. See: Skyrim, Fallout 3 and Fallout New Vegas all being complete shitshows on the consoles.

That memory limitation is also what held back PC games, more so than the CPU/GPU limitations of the consoles.

With the PS4 and Xbone, both consoles have 8GB of RAM with ~5-6GB accessible to games, contrasted against the 512MB the 360 and PS3 had. That is fucking huge and will allow for massive changes in games, from BF4 finally having 64 players on consoles to future RPGs like Witcher 3 having massive open worlds with little to no compromise.
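To put a rough number on "huge" (the exact OS/game memory split on each console is an assumption here, not a confirmed figure):

```python
# Rough ratio of memory available to game systems, using the ~5-6 GB estimate above.
last_gen_mb = 512
next_gen_mb = 5.5 * 1024          # midpoint of the ~5-6 GB estimate

print(next_gen_mb / last_gen_mb)  # ~11x more memory for core game systems
```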

Next-gen games will, in time, open up. The boring linear corridor shooter bullshit of CoD should give way to proper open level design like we saw in Crysis before Crytek neutered the franchise to work on the PS360 consoles. Witcher 3 is the harbinger of RPGs to come; expect the next-gen Fallout, Elder Scrolls and even terrible Bioware RPGs to open up to a scope and scale unprecedented on consoles.

All due to the memory.

Xbone is substantially weaker than the PS4 and it's going to show from day one and only become more egregious as time passes, but that's just graphics. In terms of core gameplay systems and scope they both have roughly the same amount of available memory and are going to allow for better, grander games.

Yeeee
 
I know this is GAF and all, and I'm new and everything...but why don't we just wait for the games? *shrug*

Because this is a specs thread for people who want to talk about specs. If you want to talk about games, there are plenty of other threads.
 
I know this is GAF and all, and I'm new and everything...but why don't we just wait for the games? *shrug*

Because we have threads about the games, and we have threads about 'the real differences in performance' between the consoles.

If you want to talk about just the games, other threads exist just for that as well. There's no point going into those threads and saying, can't we just talk about the hardware instead?

You see?
 
None of this is going to sway my choice, due to the huge gap between Live and PSN (in my opinion). But I'd be lying if I said I wasn't worried about it.

care to explain where this "huge gap" is?...i use both PSN and XBL just about every day...and i don't see it...
 
I'm basing this mostly on previous gen, which I think will translate heavily into next gen. We aren't talking about new hardware built from the ground up. The networks will build mostly on what is already there. Xbox Live and PSN, while having different UIs on the new consoles, will be essentially a cross-gen service. I think Sony will close the gap a little this time but it will still be a significant gap.

I don't want to derail the thread since it is about Hardware performance and not a Live/PSN thread but I felt like I should defend my statement.

All this negative talk about PSN is nonsense. It has a bad reputation that has stuck with the network even though it's a legitimate service now. I have absolutely no issues with download speeds, which is another thing people complain about.

Also, all the marketing BS MS has been throwing out there about "the power of the cloud" is nonsense. For example, "drivatars" are nothing new or revolutionary. Tekken has had "Ghost Battle" for a while now and it's pretty much the same thing.

One thing I'll give MS the advantage on is the option for developers to use Azure at a discount. But that still DOES NOT guarantee dedicated servers in every MP game.
 
I'm invested in the Microsoft brand; the Xbox 360 was very good to me. I'm sticking with the Xbox One mostly because of the Xbox Live ecosystem and, to a lesser degree, Kinect and the controller.

But to be perfectly honest, all the tech talk has me both worried and kind of irritated. Looking at the facts, my system of preference is going to have a significant power disadvantage. If the power disadvantage were due to Microsoft working on improving Live and adding Kinect, then I wouldn't be so irritated, but I honestly feel like MS was surprised by the type of machine Sony could put out, so they lowballed all of us gamers expecting Sony to put out a similarly spec'd device.

None of this is going to sway my choice, due to the huge gap between Live and PSN (in my opinion). But I'd be lying if I said I wasn't worried about it.

I think you're betting on the wrong horse. It is not guaranteed that Live will be better than PSN next gen. No doubt PSN was terrible in comparison to Live this gen, but I do think Sony are going to improve PSN. The PS App, sharing, instant match start, cross chat, multitasking, sleep mode, remote download, social features, Instant Game Collection, Gaikai, etc. show that Sony is putting in some effort. I'd wait before saying for sure which system will have the better online ecosystem.
 
Sure it's conjecture. What is so wrong with basing your opinions on how current gen went? Sony had much better first party support this gen. They invested in their studios and reaped the benefits. Most of us think that Sony will have a better first party offering this gen as well because they have already established that base. Could MS do better than Sony this gen despite being behind right now? Absolutely, but I tend to think that they won't due to the investments Sony has already made.

Same thing goes for the internet service. MS is heavily invested into the Xbox Live ecosystem so I tend to think they are going to continue to be the leader in that regard. That doesn't mean Sony can't overtake them.

When it comes to these hardware power differences, they are cold hard facts and it’s impossible to argue that Sony isn’t ahead. But when it comes to Live/PSN, why not base it on previous gen until more facts come out? Microsoft has talked a lot about an improved reputation system and smart match. Sony has made some improvements as well, but until I see more information out there I’m going to look at current gen as the benchmark.

edit: Fixed some spelling. Spoiled by browser spell check and we have an old crappy IE explorer at work ;)

No offense, but I can't understand a single thing you're talking about.

Can you name something specific that you don't like about PSN? What do you like more about Live?

All I read is, "Blah blah PSN might catch up to Live some day, but... Blah blah".

Catch up to what???
 
Sure it's conjecture. What is so wrong with basing your opinions on how current gen went? Sony had much better first party support this gen. They invested in their studios and reaped the benefits. Most of us think that Sony will have a better first party offering this gen as well because they have already established that base. Could MS do better than Sony this gen despite being behind right now? Absolutely, but I tend to think that they won't due to the investments Sony has already made.

Same thing goes for the internet service. MS is heavily invested into the Xbox Live ecosystem so I tend to think they are going to continue to be the leader in that regard. That doesn't mean Sony can't overtake them.

When it comes to these hardware power differences, they are cold hard facts and it’s impossible to argue that Sony isn’t ahead. But when it comes to Live/PSN, why not base it on previous gen until more facts come out? Microsoft has talked a lot about an improved reputation system and smart match. Sony has made some improvements as well, but until I see more information out there I’m going to look at current gen as the benchmark.

edit: Fixed some spelling. Spoiled by browser spell check and we have an old crappy IE explorer at work ;)

It's been discussed several times before on GAF. The issues with PSN on PS3 are PS3 hardware issues (limited system RAM & outdated Wi-Fi chip). None of these are issues for the PS Vita & the PS4. Also, Sony is moving PSN from Amazon Web Services to Rackspace's Openstack. So, PSN will be running on a new cloud service.
 
This. But until we know what it is, I would like to maintain 1.31 vs 1.84. It is only fair that comparisons are made on an even plane where the exact numbers are known for both platforms. We can always change them when more authoritative info gets out, right?

no.

the 10% GPU reservation is confirmed. Xbone will never have 1.31 tf available for games.

it's 1.18 vs 1.84.
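For anyone following the numbers, here's the arithmetic behind that claim; note that the assumption that the PS4 reserves nothing is exactly what's being argued about.

```python
# Where the 1.18 figure comes from, and what it does to the percentage gap.
xb1_peak      = 1.31                  # TFLOPS
xb1_for_games = xb1_peak * 0.90       # minus the confirmed ~10% OS reservation
ps4_peak      = 1.84                  # any PS4 reservation is unknown; assumed zero here

print(round(xb1_for_games, 2))                 # 1.18
print(round(ps4_peak / xb1_peak - 1, 2))       # 0.4  -> "+40%" on raw peak numbers
print(round(ps4_peak / xb1_for_games - 1, 2))  # 0.56 -> "+56%" if the PS4 reserves nothing
```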
 
Even though the PS4 doesn't have snap, the OS will need resources for multitasking, running the GUI when prompted by the user, etc. How much is reserved for the OS is unknown.

How much of a graphics card does the PC reserve when you switch from a fullscreen game to the Windows desktop?

It is not the same thing as what XBOX1 is doing.
 
I think you're betting on the wrong horse. It is not guaranteed that Live will be better than PSN next gen. No doubt PSN was terrible in comparison to Live this gen, but I do think Sony are going to improve PSN. The PS App, sharing, instant match start, cross chat, multitasking, sleep mode, remote download, social features, Instant Game Collection, Gaikai, etc. show that Sony is putting in some effort. I'd wait before saying for sure which system will have the better online ecosystem.

I really hope these systems will let you suspend your game and run apps without having to quit. That's got to be my number one wanted feature, I'm tired of having to save and quit when my wife wants to watch netflix!
 
I really hope these systems will let you suspend your game and run apps without having to quit. That's got to be my number one wanted feature, I'm tired of having to save and quit when my wife wants to watch netflix!

Pretty sure this is confirmed on XBO and likely on PS4. I know PS4 has suspend but I don't know if you can go into Netflix during that.
 
This comparison looks better than it is.

The PS4 GPU also sacrifices some processing power to the OS layer.
The TMUs/ROPs and everything else are coupled to the clocks, so the ratio is the same.

1.31 TF vs. 1.84 = +40% (shader/ALU throughput)
41 GTexels/s vs. 57.6 = +40% (texture/TMU throughput)
The pixel fillrate is 88% higher, not 100%. (ROPs)

And a little fun fact at the end, the raster/triangle/geometry throughput from the front-end is 6.6% higher on the Xbox One GPU.


I don't expect much difference which you can immediately point out in most of the games.
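If anyone wants to check those figures, here's a quick back-of-the-envelope sketch. The CU/ROP counts and clocks are the widely reported ones (XB1: 12 CUs / 16 ROPs @ 853 MHz, PS4: 18 CUs / 32 ROPs @ 800 MHz), and the per-CU rates (64 shaders, 4 texture units, 2 primitives per clock on the front-end) are standard GCN assumptions rather than something stated in this thread:

```python
def gpu_rates(cus, rops, mhz, prims_per_clock=2):
    ghz = mhz / 1000.0
    return {
        "TFLOPS (ALU)":     cus * 64 * 2 * ghz / 1000.0,  # 64 shaders/CU, 2 FLOPs/clock
        "GTexels/s (TMU)":  cus * 4 * ghz,                # 4 texture units per CU
        "GPixels/s (ROP)":  rops * ghz,
        "Gprims/s (setup)": prims_per_clock * ghz,
    }

xb1 = gpu_rates(cus=12, rops=16, mhz=853)
ps4 = gpu_rates(cus=18, rops=32, mhz=800)

for key in xb1:
    a, b = xb1[key], ps4[key]
    leader, pct = ("PS4", b / a - 1) if b > a else ("XB1", a / b - 1)
    print(f"{key:17s} XB1 {a:6.2f}  PS4 {b:6.2f}  -> {leader} ahead by {pct:.1%}")
```

Running it reproduces the post's ratios: roughly +40% ALU and texture throughput and +88% pixel fillrate for the PS4, and about +6.6% front-end geometry throughput for the Xbox One thanks to the upclock.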

You're right to factor the upclock into fillrate, etc. But there are a few problems with your math as well. You are not accounting for the 10% reduction in GPU resources such as fillrate and texture units, which would bring the numbers back down again. Kinect and snap will use more than just CUs to do their job.
 
the number of people with headsets on PSN is shockingly low

and that is somehow proof that the PSN is a vastly inferior service?...not seeing the connection...

not to mention the majority of people i encounter on XBL (including myself) are in private party chats anyway so the lobbies are just as silent...
 
and that is somehow proof that the PSN is a vastly inferior service?...not seeing the connection...

not to mention the majority of people i encounter on XBL (including myself) are in private party chats anyway so the lobbies are just as silent...
How do you know if the lobbies are silent if you're in a private chat? Just saying.
 
Nope. Hypothesis A ("PS4 reserves GPU time") needs the additional ad-hoc assumption that there is some currently unknown feature that needs a fixed reservation of GPU time. Hypothesis B ("PS4 does not reserve GPU time") does not need that assumption. It is hence the more economical, and as a result of basic probability theory, the more probable hypothesis.

Though I agree with this explanation, given the context, I'm not comparing those two hypotheses.

Assuming no foul play, here are the hypotheses that I think describe this Teraflop argument better.

Hypothesis A: "PS4 is 50% more powerful, hence we would see that commensurate difference in games (1.84 vs 1.31)"

Hypothesis B: "PS4 is 56% more powerful, hence we would see that commensurate difference in games (1.84 vs 1.18)"


Firstly, you might argue that neither A nor B is really a hypothesis. But though the numbers themselves are fact, the inference being made rests on assumptions about what is actually available in-game. Hence, both A and B are hypotheses.

If you take A, both numbers are confirmed figures and the assumptions being made are restricted to what the respective platforms can actually deliver based on those figures.

If you take B, though both are again confirmed figures, the OS overhead has been confirmed only for one platform, not the other. So there is an added assumption that the other platform will not have any overhead.

So until the OS overhead (or the lack of it) is either discovered by someone or outright confirmed by Sony, B has more assumptions than A.
 
You are talking utter bullshit.
Xbone NEEDS to reserve 10% because they want you to be able to watch TV while gaming, IT'S IN ALL THE ADVERTS. They can't have the game slow down, so you need a GPU ceiling for what you expect the side apps to use so as not to interfere with the game OS.


I've been watching your posts all day and I'm finding them increasingly.....shillish.

Incredible. I'd never have thought I would be at the receiving end of such an attack. That single comment has permanently turned me away from responding to you again. It was nice knowing you JoeTheBlow. You can continue watching my posts if you like. Have a fantastic day! :)
 
Though I agree with this explanation, given the context, I'm not comparing those two hypotheses.

Assuming no foul play, here are the hypotheses that I think describe this Teraflop argument better.

Hypothesis A: "PS4 is 50% more powerful, hence we would see that commensurate difference in games (1.84 vs 1.31)"

Hypothesis B: "PS4 is 56% more powerful, hence we would see that commensurate difference in games (1.84 vs 1.18)"


Firstly, you might argue that neither A nor B is really a hypothesis. But though the numbers themselves are fact, the inference being made rests on assumptions about what is actually available in-game. Hence, both A and B are hypotheses.

If you take A, both numbers are confirmed figures and the assumptions being made are restricted to what the respective platforms can actually deliver based on those figures.

If you take B, though both are again confirmed figures, the OS overhead has been confirmed only for one platform, not the other. So there is an added assumption that the other platform will not have any overhead.

So until the OS overhead (or the lack of it) is either discovered by someone or outright confirmed by Sony, B has more assumptions than A.

i disagree completely...based on the fact that A is the only option that contains information that has been confirmed to be incorrect...saying that the GPU resources available to developers are 1.84 v 1.31 is categorically incorrect...it's not even a hypothesis, just an incorrect statement...

B certainly contains the assumption that the PS4 does not have a set reserve of GPU resources...but given the information that we know it is the only "hypothesis" that contains correct information...
 
i disagree completely...based on the fact that A is the only option that contains information that has been confirmed to be incorrect...

saying that the GPU resources available to developers are 1.84 v 1.31 is categorically incorrect

Did I say that's what is available for developers? You are misrepresenting my argument. My point is that we don't exactly know what is available to the developers. So if we were to continue talking about numbers, we can only use the numbers that place both platforms on equal footing. I'm tabling the "what is available to developers" comparison until we know the figures for both. Otherwise we are speculating. And I'm fine with speculating, I'm just questioning those who quote it as fact.
 
So there is an added assumption that the other platform will not have any overhead.

Nope. If we follow Bayesian reasoning (and Occam's razor is just a special case of Bayesian reasoning), we need to consider the entirety of our background knowledge before we estimate probabilities, since all probabilities are conditional on the established evidence. And part of that evidence is (a) that we know that the XBO reserves GPU time for snap and Kinect and (b) that PS4 has neither snap, nor Kinect, nor any other comparable feature. Hence, any assumption that ignores the XBO's need for GPU time reservation as well as any assumption that proposes some need for the PS4 to reserve GPU time needs justification. Otherwise it is ad-hoc and thereby subject to Occam's razor.
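A toy numeric version of that point, with both priors invented purely for illustration, just to show the conjunction rule at work:

```python
# A hypothesis that needs an extra ad-hoc assumption can never be more probable
# than that assumption itself: P(A and X) = P(X) * P(A | X) <= P(X).
p_unknown_ps4_feature   = 0.2   # invented prior: some undisclosed PS4 feature needs a fixed GPU slice
p_reserve_given_feature = 0.9   # invented: if such a feature exists, Sony reserves for it

p_hyp_A = p_unknown_ps4_feature * p_reserve_given_feature   # "PS4 reserves GPU time" -> 0.18
p_hyp_B = 1 - p_hyp_A                                       # "PS4 does not"          -> 0.82

print(p_hyp_A, p_hyp_B)  # hypothesis A is capped by P(X), however generous the other factor is
```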
 
no.

the 10% GPU reservation is confirmed. Xbone will never have 1.31 tf available for games.

it's 1.18 vs 1.84.

Again, what is available for games is unknown for PS4. So I don't want to quote what is available for games as fact, but you are. I'm saying that the argument about what is available for games cannot be based on fact until we know the number for PS4. Meanwhile, the only argument we can rely on is raw power, regardless of what is available.
 
Though I agree with this explanation, given the context, I'm not comparing those two hypotheses.

Assuming no foul play, here are the hypotheses that I think describe this Teraflop argument better.

Hypothesis A: "PS4 is 50% more powerful, hence we would see that commensurate difference in games (1.84 vs 1.31)"

Hypothesis B: "PS4 is 56% more powerful, hence we would see that commensurate difference in games (1.84 vs 1.18)"


Firstly, you might argue that neither A nor B is really a hypothesis. But though the numbers themselves are fact, the inference being made rests on assumptions about what is actually available in-game. Hence, both A and B are hypotheses.

If you take A, both numbers are confirmed figures and the assumptions being made are restricted to what the respective platforms can actually deliver based on those figures.

If you take B, though both are again confirmed figures, the OS overhead has been confirmed only for one platform, not the other. So there is an added assumption that the other platform will not have any overhead.

So until the OS overhead (or the lack of it) is either discovered by someone or outright confirmed by Sony, B has more assumptions than A.

Not really, because example A is just as much an assumption as example B. You're not making any sense. In example A you're assuming both systems are using the same amount of GPU resources for the OS. Example B is more accurate, since Xbox has a confirmed 10% GPU reservation and one can deduce through reasoning that Sony will not need to dedicate GPU resources to the OS. So example B is more of a calculated assumption.
 
Hypothesis A: "PS4 is 50% more powerful, hence we would see that commensurate difference in games (1.84 vs 1.31)"

Hypothesis B: "PS4 is 56% more powerful, hence we would see that commensurate difference in games (1.84 vs 1.18)"

Note: 1.84 vs 1.31 is closer to a 40% increase, not 50%. However, the real-world performance gap devs are saying is 'obvious' to see indicates a 50% gap, which negates this. Therefore there's no denying that the 10% reduction is playing a role. I know you want to ignore it, but that just doesn't hold up given everything we know. Again: the examples given by devs are more important than even the raw specs, and that is a large gap. But please keep coming up with explanations as to why we should ignore the known facts about OS usage.
 
Did I say that's what is available for developers? You are misrepresenting my argument. My point is that we don't exactly know what is available to the developers. So if we were to continue talking about numbers, we can only use the numbers that place both platforms on equal footing. I'm tabling the "what is available to developers" comparison until we know the figures for both. Otherwise we are speculating. And I'm fine with speculating, I'm just questioning those who quote it as fact.

this entire thread is a discussion of how the specs of these consoles influence the games we will see on our TVs...

either way you're not putting them on equal footing...because it is a KNOWN FACT that the Xbone reserves 10% of its GPU resources for the OS functions...period...

it makes "hypothesis A" just flat out incorrect in the context of this thread...
 
Again, what is available for games is unknown for PS4. So I don't want to quote what is available for games as fact, but you are. I'm saying that the argument about what is available for games cannot be based on fact until we know the number for PS4. Meanwhile, the only argument we can rely on is raw power, regardless of what is available.

The thing is that Microsoft has highlighted why they need the reservation - for Kinect and Snap.

Sony doesn't really have either feature, except for the Eye if you really want to argue that. So they have no real need for an OS reservation because they aren't running hardware-accelerated apps alongside games.
 
Nope. If we follow Bayesian reasoning (and Occam's razor is just a special case of Bayesian reasoning), we need to consider the entirety of our background knowledge before we estimate probabilities, since all probabilities are conditional on the established evidence. And part of that evidence is (a) that we know that the XBO reserves GPU time for snap and Kinect and (b) that PS4 has neither snap, nor Kinect, nor any other comparable feature. Hence, any assumption that ignores the XBO's need for GPU time reservation as well as any assumption that proposes some need for the PS4 to reserve GPU time needs justification. Otherwise it is ad-hoc and thereby subject to Occam's razor.

What I'm proposing is to ignore both, since a valid comparison needs both. How can it be subject to Occam's razor if it is taken off the table until further evidence?
 