150MHz CPU boost on XBO, now in production

Price?

Both the XB1 and PS4 have stronger GPUs than the average Steam user's machine, according to the Steam hardware survey.

Price didn't stop them last generation from bringing out machines as good as high-end desktop PCs, or even better.

This generation, they are just going for the big money.

Oh, and btw, fuck Steam statistics on this!
We have heard enough excuses, our balls are blown. Let's not make excuses ourselves for their failures.

They thought that what Nintendo did with the Wii was smart? They were "jealous"?
Well, it might have been smart at first, but look at what's happening now.. dear Microsoft/Sony and your "genius" decision makers..
 
And you're wrong. A 10% increase in performance will be used by the devs. More power is more power.

If I'm not mistaken, didn't you say before the bump that the difference between the two wouldn't be that much at all?

If that extra 40% of GPU power didn't matter much, then this 10% bump must be meaningless?
 
Ok, I tried googling "Xbox 360 reserve GPU" but found nothing. Do you mind providing links? I'm not saying that you're lying, but I really wanna know why the X360 would reserve some GPU time/cores.

I never heard of that either.

It's really not controversial, people. The XB1 runs a full Metro app in parallel to the game. Metro is hardware-accelerated. Hence, it needs non-trivial GPU resources.

[Image: XBOX-One-Snap.jpg - Xbox One Snap mode, game and snapped app side by side]


That separate Windows 8-based OS+app on the right side does not render and display itself out of thin air. This is not comparable to switching between mutually suspending applications.
 
If I'm not mistaken, didn't you say before the bump that the difference between the two wouldn't be that much at all?

If that extra 40% of GPU power didn't matter much, then this 10% bump must be meaningless?

It's certainly narrowing the gap. Thanks for paying attention.
 
Ok, I tried googling "Xbox 360 reserve GPU" but found nothing. Do you mind providing links? I'm not saying that you're lying, but I really wanna know why the X360 would reserve some GPU time/cores.
As I said (I'm on my phone), it was ERP on B3D who said specifically that the 360 has a reserve, back when the news about the One first came out...

Sorry, no, I'm not lying. I have no reason to, as I'm just trying to come to a conclusion on a mystery like everyone else. Not trying to prove anyone wrong. :)


And el torro is usually right, but he is speculating as well.
 
You took the wrong lesson. You need a fast CPU as well as a fast GPU. The Jaguar CPU, even with 8 cores, is underpowered. Any clock speed increase helps.
Of course you need both to be good. I'm talking about CPU vs GPU. Games seem to benefit more from a better GPU than a better CPU.
 
I never heard of that either.

It's really not controversial, people. The XB1 runs a full Metro app in parallel to the game. Metro is hardware-accelerated. Hence, it needs non-trivial GPU resources.

[Image: XBOX-One-Snap.jpg - Xbox One Snap mode, game and snapped app side by side]

The real question is... is Jodie making a booty call, and is the person playing Forza aware or not?
 
10% of what, 110 GFLOPS? (I think I've seen that number.) So even though the PS4 CPU clock speed isn't confirmed AFAIK, we are assuming it's 1.6 vs 1.75GHz - that's an 11 GFLOPS advantage for the X1, without factoring in possible extra overheads that the X1 may have over the PS4 anyway...

Incredible :p
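For what it's worth, here's roughly where my numbers come from. A quick sketch of the peak-FLOPS math, assuming 8 Jaguar cores each doing 8 single-precision FLOPs per cycle (the commonly cited figure for this architecture, not confirmed console specs):

```python
# Rough theoretical peak for an 8-core Jaguar CPU.
# Assumes 8 single-precision FLOPs per core per cycle (128-bit SIMD,
# one 4-wide multiply + one 4-wide add) -- an assumption, not official specs.
CORES = 8
FLOPS_PER_CYCLE = 8

def peak_gflops(clock_ghz):
    return CORES * FLOPS_PER_CYCLE * clock_ghz

ps4 = peak_gflops(1.6)   # rumored clock -> 102.4 GFLOPS
xbo = peak_gflops(1.75)  # announced clock -> 112.0 GFLOPS
print(f"PS4 ~{ps4:.1f}, XBO ~{xbo:.1f}, delta ~{xbo - ps4:.1f} GFLOPS")
```

So on paper the gap is about 10 GFLOPS, in the same ballpark as the 11 above, and it's dwarfed by the GPU numbers either way.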
 
It's positive Xbox One news. What did you expect? :P

That persecution complex... XB1 threads would get less shit in them if there weren't uninformed posters like theKayle throwing bullshit FUD around like it was confetti. There has been a lot of good discussion in this thread, which has been destroyed by a handful of posters.
 
10% of what, 110 GFLOPS? (I think I've seen that number.) So even though the PS4 CPU clock speed isn't confirmed AFAIK, we are assuming it's 1.6 vs 1.75GHz - that's an 11 GFLOPS advantage for the X1, without factoring in possible extra overheads that the X1 may have over the PS4 anyway...

Incredible :p
Because in no way is it possible that the PS4 will have more overhead... Look, we don't know how many resources the PS4 reserves for its OS. So don't start throwing around "X reserves more resources than Y does."
 
Just out of boredom I overclocked a 3770K by 150 MHz... the difference in real-world gaming is... none, though some benchmarks did show a gain of 1 fps. I overclocked an "old" 660 Ti 3 GB by just 75 MHz and got 5 fps extra. Of course these figures don't mean anything, but what does mean anything in this huge thread anyway, since we DO NOT KNOW the clock of the PS4 CPU?
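Worth keeping in mind, though, that 150 MHz is a much smaller relative bump on my desktop chip than on these consoles. A quick check (assuming a stock 3.5GHz 3770K and the rumored 1.6GHz Jaguar base clock):

```python
# Relative size of a +150 MHz bump at different base clocks.
# Base clocks are assumptions: 3.5 GHz stock 3770K, 1.6 GHz rumored Jaguar.
BUMP_MHZ = 150
for name, base_mhz in [("3770K", 3500), ("XBO Jaguar", 1600)]:
    print(f"{name}: +{BUMP_MHZ} MHz = +{BUMP_MHZ / base_mhz:.1%}")
# 3770K: +150 MHz = +4.3%
# XBO Jaguar: +150 MHz = +9.4%
```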
 
I'm still seeing a lot of questions about whether the PS4 can/will have the same kind of upclock, and I just want to do my best to provide my insight into the matter. Both consoles seem to be betting on HSA/GPGPU features, and in order to reach maximum efficiency the frequencies of the GPU and CPU have to be perfect multiples of one another. An 800MHz GPU and a 1600MHz CPU are exactly in line with that. As soon as the 853MHz for the XBO GPU was announced, I guessed that the final clock speed of the CPU would be 1706MHz.

That's where this gets a bit interesting. The CPU and GPU are no longer perfect multiples, which I'm going to guess is due to some wiggle room outlined by AMD in terms of efficiency.
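To make the multiples concrete (the 1706MHz figure is my own guess from above, not anything announced):

```python
# CPU/GPU clock ratios (MHz). 1706 is my guess, not an announced clock.
configs = [
    ("original",  1600, 800),  # exactly 2x
    ("my guess",  1706, 853),  # would also be exactly 2x
    ("announced", 1750, 853),  # ~2.05x -- no longer a clean multiple
]
for label, cpu, gpu in configs:
    print(f"{label}: CPU/GPU = {cpu / gpu:.3f}")
```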

For the PS4, it seems that Sony valued efficiency very highly. I don't see them changing the clock speeds in case it would have any effect on the efficiency of the GPGPU/HSA/hUMA features. The PS4 also has a stronger GPU, and any alteration in CPU clock speed would also have to be reflected within an outlined range on the GPU, which would add considerably more power and heat to the PS4 than it did to the XBO. I'm not saying it's impossible, just that it's less likely, since they already seem to have gone as high as they could go on the GPU side. More likely they're working to free up 7 cores for games, or to divide OS processes between two partial cores, since that would be easier for devs to scale CPU threading for. I think that's a far more likely scenario since it wouldn't have an effect on TDP at all.

That said, KZSF multiplayer targeting 60fps after previously targeting 30 may mean that the console is coming in a bit more powerful than it was before. All 30fps launch games will probably have a considerable bit of overhead, and some more power could have pushed it to the point where a target of 60 is possible.
 
That's if you maintain that you want to compare them; I was merely comparing a 1.75GHz Xbone to a 1.6GHz Xbone; the difference is basically negligible.


As someone who has become what I would call a bit of a "Sony fan", I wish people like you would cut this shit out; y'all make us look fucking bad.

10% for free is nothing to scoff at, and close competition is a good thing. If Sony fucking dominate like fuck, you'll see launch-PS3 behaviour all over again.
I WANT Xbox fucking idiots to buy more Xboxes - I want the "bros" on that system, I want the Gears fans and Halo fans there.

This is a positive for the Xbox One and a positive for gaming.
I love my PS3 for the incredible and unique exclusives.
I love it for the fact I can replace the HDD with any old drive.
I love(d)? it for the fact that at the time I could charge the controller with a common generic cable, not a proprietary one like MS's.
I love the single-player focus of the PS3 (at least in my eyes) - it's where all the good SP games are found.

I do not want Sony to utterly dominate the next round, and I don't want MS to fail - if Sony gets too fucking big and cocky, I can regretfully fall back on Microsoft. Without them I'm fucked; I wouldn't even consider Nintendo.


Long story short, 10% is pretty fucking sweet. "Sony people", please stop pulling MS to shreds over things; when they say /genuinely/ dumb shit, roast them, but this ain't even close to dumb.
 
As I said (I'm on my phone), it was ERP on B3D who said specifically that the 360 has a reserve, back when the news about the One first came out...

Sorry, no, I'm not lying. I have no reason to, as I'm just trying to come to a conclusion on a mystery like everyone else. Not trying to prove anyone wrong. :)


And el torro is usually right, but he is speculating as well.

Like I said, I'm sure you're not lying, or at least not lying that you read this information on B3D. But the OS reserving GPU time/cores usually means that the OS can and will draw/overlay things on the screen, especially while playing games. AFAIK the X360 doesn't do this.
 
Because in no way is it possible that the PS4 will have more overhead... Look, we don't know how much each system reserves for its OS. So don't start throwing around "X reserves more resources than Y does."

I know at this point it's all ifs and buts, but J. Rigby pretty much confirmed the PS4 has a secondary chip controlling the other overheads, and it's ARM-based.. because the PSEye data is a raw feed etc (it's here on GAF). That means, unless someone brings other info to light, the X1 overheads are on the CPU? OS/Kinect.
 
Long story short, 10% is pretty fucking sweet. "Sony people", please stop pulling MS to shreds over things; when they say /genuinely/ dumb shit, roast them, but this ain't even close to dumb.
Nice post.
And yes, this is good news for the Xbox.
But still, it's nearly nothing at all. Some of you are acting like this is a big boost, but it isn't.
So why is that (!) a problem for you? It's not like they added new hardware (CUs or ROPs).
It's a small boost during the optimization process of the XBone. It's fine.
There is still a nearly 40% difference in GPU, plus the memory difference, between the PS4 and the Xbone. So what a few people are telling you here is: calm down.
 
I know at this point it's all ifs and buts, but J. Rigby pretty much confirmed the PS4 has a secondary chip controlling the other overheads, and it's ARM-based.. because the PSEye data is a raw feed etc (it's here on GAF). That means, unless someone brings other info to light, the X1 overheads are on the CPU? OS/Kinect.
The PSEye data is not a raw feed ;) J. Rigby corrected himself on that. And the PSEye still depends on the CPU/GPU in the PS4, while the Kinect for the X1 has its own processors to handle the data.

Someone correct me if I'm wrong ;)
 
That persecution complex... XB1 threads would get less shit in them if there weren't uninformed posters like theKayle throwing bullshit FUD around like it was confetti. There has been a lot of good discussion in this thread, which has been destroyed by a handful of posters.
The term persecution complex is getting used way too much around here, my friend. I don't feel persecuted in the least, lol. We are talking about videogames after all, and just because one poster is throwing "bullshit" around doesn't make it more acceptable for ten other people to come in here and make the conversation even worse.
 
So if I am reading you guys right:

Total CPU capability:

XB1 > PS4?

Yes, you are correct when comparing CPUs.

Clock speed: 1.6GHz (PS4) vs 1.75GHz (XBone)
CPU bandwidth: less than 20GB/s (PS4) vs 30GB/s (XBone)
This is going by the accurate VGLeaks specs we have been using.

Now the GPU... yeah, the PS4's GPU totally dominates the XBone's.
 
Like I said, I'm sure you're not lying, or at least not lying that you read this information on B3D. But the OS reserving GPU time/cores usually means that the OS can and will draw/overlay things on the screen, especially while playing games. AFAIK the X360 doesn't do this.

It also makes a difference whether you have fixed scheduling (as in the virtualization-based case of the XB1), or whether individual functions temporarily compete for resources, as notifications most probably do. Of course, notifications use up some resources, but that is nowhere near comparable to the permanent execution of an entire application with a hardware-accelerated UI and non-trivial graphics capabilities in another virtual partition. In addition, XB1 apps are not predictable, since there will be many apps from many developers with many different resource needs. Something like a notification is a predictable, controlled, and isolated feature.
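To put the distinction in toy form (the numbers here are made up for illustration, not real reservation figures):

```python
# Toy model: a fixed partition always costs its slice, while an on-demand
# feature like a notification only costs something while it's on screen.
# All numbers are made up for illustration.
GPU_BUDGET = 100  # arbitrary units

def game_share(fixed_reservation, notification_active, notification_cost=2):
    # The fixed reservation is gone whether or not the snapped app is busy;
    # the notification cost applies only while one is actually showing.
    cost = fixed_reservation + (notification_cost if notification_active else 0)
    return GPU_BUDGET - cost

print(game_share(10, False))  # fixed partition, idle or not: 90
print(game_share(0, True))    # notification popping up briefly: 98
print(game_share(0, False))   # nothing happening: 100
```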
 
That's where this gets a bit interesting. The CPU and GPU are no longer perfect multiples, which I'm going to guess is due to some wiggle room outlined by AMD in terms of efficiency.

I haven't heard of those kinds of 'perfect multiple' configurations since the days of the Pentium 4.

Nowadays, with current architectures, the best configuration is the one where every component is clocked as high as it can go.
 
It also makes a difference whether you have fixed scheduling (as in the virtualization-based case of the XB1), or whether individual functions temporarily compete for resources, as notifications most probably do. Of course, notifications use up some resources, but that is nowhere near comparable to the permanent execution of an entire application with a hardware-accelerated UI and non-trivial graphics capabilities in another virtual partition. In addition, XB1 apps are not predictable, since there will be many apps from many developers with many different resource needs. Something like a notification is a predictable, controlled, and isolated feature.

Maybe they will have a mode where they put all the app states on the HDD when a game wants all the resources.
 
Maybe they will have a mode where they put all the app states on the HDD when a game wants all the resources.

Doesn't really matter, since the user can apparently summon a snapped app at any time. So the game can never claim those additional resources.
 
Just out of boredom I overclocked a 3770K by 150 MHz... the difference in real-world gaming is... none, though some benchmarks did show a gain of 1 fps.

The flawed logic here is that you're assuming that any PC game uses 100% of the CPU.
 
As someone who has become what I would call a bit of a "Sony fan", I wish people like you would cut this shit out; y'all make us look fucking bad.

10% for free is nothing to scoff at, and close competition is a good thing. If Sony fucking dominate like fuck, you'll see launch-PS3 behaviour all over again.
I WANT Xbox fucking idiots to buy more Xboxes - I want the "bros" on that system, I want the Gears fans and Halo fans there.

This is a positive for the Xbox One and a positive for gaming.
I love my PS3 for the incredible and unique exclusives.
I love it for the fact I can replace the HDD with any old drive.
I love(d)? it for the fact that at the time I could charge the controller with a common generic cable, not a proprietary one like MS's.
I love the single-player focus of the PS3 (at least in my eyes) - it's where all the good SP games are found.

I do not want Sony to utterly dominate the next round, and I don't want MS to fail - if Sony gets too fucking big and cocky, I can regretfully fall back on Microsoft. Without them I'm fucked; I wouldn't even consider Nintendo.


Long story short, 10% is pretty fucking sweet. "Sony people", please stop pulling MS to shreds over things; when they say /genuinely/ dumb shit, roast them, but this ain't even close to dumb.

None of this is relevant to my post. At no point did I say that the clock bump is a bad thing. It obviously isn't; it can only be a good thing. The stuff about competition is basically irrelevant, and I've argued at length in other threads that I don't want a Sony-dominated videogame market.

The fact remains that a 10% clock speed bump is negligible. It translates to a handful more FPS (if that). You probably wouldn't notice the difference between the old Xbone clock speeds and the new ones if they were running side by side, unless you hooked it up to one of DF's machines and got a numerical readout of the framerate.

At no point did I pull MS to shreds; your post is a wild overreaction to what I think most people would agree is a pretty neutral assessment. If you think there are people who are being overly critical of MS, you're barking up the wrong tree by singling me out.
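And for the best-case math behind "a handful more FPS": assume a game that is completely bound by the bumped component and scales perfectly linearly with clock, which is already generous:

```python
# Best-case framerate gain from the CPU upclock (1.6 -> 1.75 GHz), assuming
# a fully CPU-bound game that scales linearly with clock -- an optimistic
# simplification; real gains would be smaller.
OLD_MHZ, NEW_MHZ = 1600, 1750
for base_fps in (30, 60):
    print(f"{base_fps} fps -> {base_fps * NEW_MHZ / OLD_MHZ:.1f} fps")
# 30 fps -> 32.8 fps
# 60 fps -> 65.6 fps
```

A couple of frames at 30fps, a handful at 60, and that's the ceiling.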
 
Hmm, sounds alright to me. I'm getting a PS4 at launch, but I may very well get an XOne down the line pending a price drop, so any improvements they can make sound like a good thing.
 
Like I said, I'm sure you're not lying, or at least not lying that you read this information on B3D. But the OS reserving GPU time/cores usually means that the OS can and will draw/overlay things on the screen, especially while playing games. AFAIK the X360 doesn't do this.


Yes, the 360 displays the mini dash over top of a game, letting you choose music, check downloads, go to the Marketplace, etc.

I'm guessing the PS4 will have overlays of some sort graphically?
 
Of course you need both to be good. I'm talking about CPU vs GPU. Games seem to benefit more from a better GPU than a better CPU.

I have a 3-core Phenom II (which is about as powerful as the PS4 CPU, if not a bit more so - and in the real world it's faster, since games don't need to support 8 threads just to get the full performance out of it) and an HD 6870 (about as powerful as the Xbox One GPU, and 33 percent less so than the PS4's), and at this point I will upgrade my CPU before I upgrade my GPU again (even though the CPU upgrade is more costly, as I need a new mobo and RAM as well).

My minimum fps (those frame drops that cause stutter) would go way up in many games, and in some games (NS2, PlanetSide 2, Battlefield 3, Total War, Arma) I would gain nothing from a faster GPU, as I'm CPU-bottlenecked in those.

I don't think the PS4 CPU and an HD 7850 are a good balance; the Xbox One GPU is a bit less disproportionate.
 
Price didn't stop them last generation from bringing out machines as good as high-end desktop PCs, or even better.

This generation, they are just going for the big money.

Oh, and btw, fuck Steam statistics on this!
We have heard enough excuses, our balls are blown. Let's not make excuses ourselves for their failures.

They thought that what Nintendo did with the Wii was smart? They were "jealous"?
Well, it might have been smart at first, but look at what's happening now.. dear Microsoft/Sony and your "genius" decision makers..
You salty, bro?
 
None of this is relevant to my post. At no point did I say that the clock bump is a bad thing. It obviously isn't; it can only be a good thing. The stuff about competition is basically irrelevant, and I've argued at length in other threads that I don't want a Sony-dominated videogame market.

The fact remains that a 10% clock speed bump is negligible. It translates to a handful more FPS (if that). You probably wouldn't notice the difference between the old Xbone clock speeds and the new ones if they were running side by side, unless you hooked it up to one of DF's machines and got a numerical readout of the framerate.

At no point did I pull MS to shreds; your post is a wild overreaction to what I think most people would agree is a pretty neutral assessment. If you think there are people who are being overly critical of MS, you're barking up the wrong tree by singling me out.


If you can't work out from the wording of my post that it's not specifically targeted at you, then I don't know what else to tell you; besides quoting you, it's pretty clear I went off on a tangent at all the people belittling this improvement for the sake of whining.

How you can't get that from my post is beyond me; it seems pretty apparent to me.
 
It also makes a difference whether you have fixed scheduling (as in the virtualization-based case of the XB1), or whether individual functions temporarily compete for resources, as notifications most probably do. Of course, notifications use up some resources, but that is nowhere near comparable to the permanent execution of an entire application with a hardware-accelerated UI and non-trivial graphics capabilities in another virtual partition. In addition, XB1 apps are not predictable, since there will be many apps from many developers with many different resource needs. Something like a notification is a predictable, controlled, and isolated feature.

Thanks for giving those details.

Yes, the 360 displays the mini dash over top of a game, letting you choose music, check downloads, go to the Marketplace, etc.

I'm guessing the PS4 will have overlays of some sort graphically?

Does the game pause? If not, does it continue to play normally with no slowdown/IQ loss?
 
Just so I understand: when all we had were leaks and spec sheets, people were able to extrapolate power and TFLOPS and GFLOPS and XFLOPS and make assumptions based on them. But now that the XB1 CPU is stated to be clocked higher than the PS4's rumored clock, we're back to "it's not even final yet" and "we don't know" and "oh, you have the numbers now, care to share"?
 