150MHz CPU boost on XBO, now in production

Status
Not open for further replies.
Regardless of how it compares against PS4 even after the clock increases of both chips, the only thing that matters is how it affects the longevity of the HW and if it is basically unaffected then the clock speed increase cannot be viewed as anything but a good thing for an Xbox One developer. It still does not affect my purchase decision, but props to them anyways: It is a ballsy/risky move at this point in time.
 
If all of these games end up running at a targeted ~60+fps it won't matter. If it is in the 30's it still won't matter. If the framerate is so poor that it dips into the mid/high 20's then it may very well finally start to matter, sure. But it's not going to do a damn thing for making it somehow more graphically capable than the competition.

But really given the power of both PS4 and Xbone there's no reason for any game to be <30fps at any time. 40-60fps should be what to expect at the very least I'd think, constant 60fps being the most practical. But we shall see.

Hell, I get 40-60fps on my FX 6300 with a GTX 660.

You are not playing next-gen games. 60fps in everything is certainly not practical, especially with such weak CPUs. 2.25x the resolution and the DX11 feature set are already pushing the consoles, and games will add more features and scope further into the consoles' lives to push them harder.
 
Wonder if IGN has concerns about overheating.

Great news. Apparently they have cooling headroom, which is not surprising given the big case. I would not be concerned about the failure rate; I am sure Microsoft's engineers tested the hardware properly. There is no way Microsoft would take a risk after last gen. If anything, the only variables that might "compromise" each other are clock speed and fan noise, but I don't expect them to increase fan noise noticeably after they stressed the importance of silent operation.
 
Sorry but saying something like this is just ignorant and completely wrong. Natural Selection 2 could easily be done on consoles, you'd just have to optimize it for further multithreading. NS2 is poorly coded and inefficient when it comes to using the CPU, hence the CPU bottleneck.

Yeah I'm sure if they coded it to 8 metals (periodic table.gif)
 
Clearly MS has wished they'd made a stronger console ever since the 180, lol. I'd rather they just drop the price though.

Nah, probably Microsoft saying they needed to bump up cpu/gpu performance to stay comparable to PS4. Don't think they necessarily regret making a weaker console, just like they didn't really regret going with DVD for the x360.
 
Sorry but saying something like this is just ignorant and completely wrong. Natural Selection 2 could easily be done on consoles, you'd just have to optimize it for further multithreading. NS2 is poorly coded and inefficient when it comes to using the CPU, hence the CPU bottleneck.

Ehm, between the UI (elaborate Lua stuff, similar to MMOs) and the thousands of entities to keep track of in the map (infestation, the many, many AI units like MACs and babblers and whips, the buildings, item drops, etc.), the game actually has a good reason to be demanding on the CPU.

There is nothing even remotely similar on the consoles.

NS2 is not poorly coded; it uses 3 threads (most modern games only use 2 anyhow, since they are designed for consoles and not demanding on the CPU at all), with very few games such as SC2 and BF3 actually being able to use more than 2 properly.

- game logic (the entities, the AI, etc.) runs on one thread, as in most games, and THIS is where NS2 has so much more going on than most other games, which is why it is so demanding.

- rendering (objects etc.) runs on a separate thread; the engine supports multithreaded rendering, but since the game logic is where the majority of the processing happens, more cores will not scale performance linearly (obviously).

- prediction (netcode related) runs on a separate thread; it accounts for a small amount of processing but does scale up late-game as more is going on, so extra CPU cores help performance here.

But obviously strong single-core performance is important in this game, as one core needs to be able to keep up with the game logic; otherwise the other cores will just be waiting on it and you won't see the benefits.
(But again, a decent amount of non-game-logic processing IS offloaded to other cores, which helps a lot with performance, since that one core can spend 100 percent of its time on the game logic.)

If the game was poorly programmed it would run on one thread, it doesn't, and it isn't.
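The three-thread split described above can be sketched roughly like this. This is a toy Python illustration, not NS2's actual code (the Spark engine is C++/Lua); the queue hand-off, thread names and workloads here are all assumptions made for the sketch:

```python
import queue
import threading

# Toy sketch of the three-thread split described above: one game-logic
# thread is the serial bottleneck, while rendering and prediction
# consume its snapshots on separate threads.

snapshot_queue = queue.Queue()      # game logic -> renderer
prediction_queue = queue.Queue()    # game logic -> prediction/netcode
frames_rendered = []
predictions_done = []

def game_logic(ticks):
    # Entities/AI must advance one tick at a time on a single thread.
    world = {"tick": 0, "entities": 1000}
    for _ in range(ticks):
        world["tick"] += 1
        snapshot = dict(world)          # copy handed off to the consumers
        snapshot_queue.put(snapshot)
        prediction_queue.put(snapshot)
    snapshot_queue.put(None)            # sentinels shut the consumers down
    prediction_queue.put(None)

def renderer():
    while (snap := snapshot_queue.get()) is not None:
        frames_rendered.append(snap["tick"])     # stand-in for drawing

def prediction():
    while (snap := prediction_queue.get()) is not None:
        predictions_done.append(snap["tick"])    # stand-in for extrapolation

workers = [threading.Thread(target=renderer),
           threading.Thread(target=prediction)]
for t in workers:
    t.start()
game_logic(60)                          # main thread plays the logic thread
for t in workers:
    t.join()

print(len(frames_rendered), len(predictions_done))  # prints: 60 60
```

Note that the consumer threads can only ever go as fast as the logic thread feeds them, which is exactly the "other cores waiting on that one core" point above.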

My point stands: NS2 is not possible on the consoles, not even close, especially considering that a single modern CPU core is much more powerful than all 3 Xbox 360 cores combined.
Making the game logic multithreaded would still not allow the game to run on the 360.

If NS2 were designed as a multiplatform game it would not have been released in its current form; it would have the mechanics of NS1 instead (no dynamic infestation, no elaborate UI, no power-node mechanic).
Another example of how design depends on the CPU: the CPU-intensive Lua code behind the UI is the reason NS2 has mod support to begin with. No Lua, no mods in this game.
The things you miss out on, but never know about if they never happen, eh.

That is the point I was making: a better CPU allows for more elaborate/interesting/new game mechanics, and the CPUs in the PS4/XB1 mean there are certain things we will not see next gen that we would have seen if they had had better CPUs.
This is why BF3 on consoles only supported 24 players. Same exact reason.
Next gen there will be the equivalent of 24-player Battlefield in some games that get decent PC ports, but most games will just be designed around the limitations from the ground up.

That is why every little bit of increase on the cpu front helps.

The 'GPGPU' shit is only useful for particles, some physics and some tessellation; it won't help the vast majority of calculations that currently happen (and will continue to happen) on the CPU.
So yeah, enjoy the next-gen equivalent of 24-player Battlefield 3 (comprehensive reading before replying please, before you go "Battlefield 4 has 64 players on consoles") with shiny GPGPU particles, instead of meaningful gameplay mechanics.

Gee, maybe because it's a legitimate concern?
Only it isn't.
RROD had nothing to do with power consumption; it was caused by the solder and the shitty motherboard design (the clamp holding the cooler to the motherboard, to be precise).
300W GPUs with adequate cooling and a decent design will happily run for 10 years in a PC without breaking down.
A 30W CPU and a 60W GPU are not suddenly going to 'cause RROD' with a small overclock.
 
Well, 50MHz on the GPU and 150MHz on the CPU, when we are talking about well-designed, good-quality mobile parts, really means nothing for reliability.
I bump my laptop's GTX clock by 200MHz and the GDDR5 by 400MHz and only get like 5-10°C more temperature in long sessions, plus a good performance boost.
BUT
if it's not well designed and good quality = problems. I had an almost identical laptop before; an 80MHz boost and things would start falling apart.


About the Xbone consumer now, the Xbone day-one customer: I think we would have been much happier with a public execution of the stupid motherfucker who decided in the first place not to go all-out in the Xbone graphics department than with any news of minor attempts to rectify that guy's hardware ideas and decisions.
Still, the upclock is not bad, and many of us predicted, back when the specs were only "rumors", that MS should probably bump the clock speeds a little bit.
But I would prefer a public hanging 8)
 
I find it incredibly amusing that a couple months ago this forum was convinced of rumors that said poor X1 yields were resulting in a down clocking of the specs. Now here we are with both the GPU and CPU clock speed upgraded and some people are like whatever who cares or MS is desperate. lol.
 
They both want to fit in a 100W envelope; eight cores plus a GPU approach a 100W TDP. I suppose the One has more thermal headroom with that oversized cooler and a smaller GPU.

Or the chips are getting built with improved characteristics.

Its probably that.
 
Only it isn't.
RROD had nothing to do with power consumption; it was caused by the solder and the shitty motherboard design (the clamp holding the cooler to the motherboard, to be precise).
300W GPUs with adequate cooling and a decent design will happily run for 10 years in a PC without breaking down.
A 30W CPU and a 60W GPU are not suddenly going to 'cause RROD' with a small overclock.

It sure as hell took MS forever to figure it out. And how can you say it isn't a concern??
 
I find it incredibly amusing that a couple months ago this forum was convinced of rumors that said poor X1 yields were resulting in a down clocking of the specs. Now here we are with both the GPU and CPU clock speed upgraded and some people are like whatever who cares or MS is desperate. lol.

This for real. The titan jokes, the rrod jokes, all the jokes are so old and trollish.
"Oh no, they downclocked it 150MHz, inferior system and yield problems!"
"They upclocked it 150MHz? Who cares, it's inferior and has yield problems!"
 
Nothing is coded to the metal anymore. This saying really needs to die in a fire; its benefits were vastly overstated to begin with.

Having a fixed target platform is still a substantial benefit. That there are thicker layers of abstraction between the application code and the hardware (in the form of a more present OS and more abstract APIs) does not change that. You can still optimize everything from the software architecture down to algorithms and implementation details in hot loops. Things like instruction- and data-cache sizes, individual bottlenecks and hardware-specific capabilities still matter.
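As a toy example of the "cache sizes still matter" point: with a fixed console target you know the L1 data-cache size exactly, so a tuning decision like a blocked-loop tile size can be baked in rather than detected at runtime. The cache size, element size and three-array working set below are illustrative assumptions, not real console figures:

```python
# Illustrative only: picking a tile size for a blocked matrix traversal
# when the target's L1 data-cache size is fixed and known up front.
# (32 KiB cache, 4-byte elements, 3 live tiles are assumptions.)

L1_DCACHE_BYTES = 32 * 1024   # assumed 32 KiB L1 data cache
ELEM_BYTES = 4                # 32-bit floats

def tile_side(cache_bytes=L1_DCACHE_BYTES, elem_bytes=ELEM_BYTES,
              working_arrays=3):
    """Largest square tile such that `working_arrays` tiles
    (e.g. an A-tile, B-tile and C-tile in a matmul) fit in cache."""
    elems_per_tile = cache_bytes // (elem_bytes * working_arrays)
    return int(elems_per_tile ** 0.5)

print(tile_side())  # prints: 52 for the assumed 32 KiB cache
```

On PC the same code would have to query or guess the cache size per machine; on a console the constant is simply known.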
 
I find it incredibly amusing that a couple months ago this forum was convinced of rumors that said poor X1 yields were resulting in a down clocking of the specs. Now here we are with both the GPU and CPU clock speed upgraded and some people are like whatever who cares or MS is desperate. lol.

This forum was concerned about yields. Poor yields usually force a downclock if lots of the chips are leaky; a high hard-defect rate means there are fewer of them to go around.

Ring any bells?
 
No, RROD shouldn't be a concern in 2013. Modern x86 processors like Jaguar have safety mechanisms that shut down the whole system before overheating.

Any mass-produced electronic device (especially with the Xbone boasting its 5 billion transistors) should raise some concern in potential buyers. I don't care if it's a video game system or a brand-new fancy TV.

There's a reason lots of people avoid buying stuff at launch. I really don't think it's a big leap to have some concerns.
 
Good news, although it is strange that MS is bumping the clock frequencies when the machine is supposed to be in production. I mean, the stress tests must have been done at 1.6GHz and 800MHz, not at the new clocks.
 
It sure as hell took MS forever to figure it out. And how can you say it isn't a concern??
I don't think they figured out the X-clamp stupidity that late, or even "late" at all; never mind the PR back then.
Anyone with an idea of how electronics work would spot that the stupid X-clamp was the problem at first glance.
I think the way they produced it made them unable to fix it until the next mobo shrink/redesign.
I also think the core team probably had a close estimate, after a short while, that the majority of first-design 360s were going to die.
But they couldn't do anything, so they just sat and watched the spectacle with popcorn in hand, or something close to that...
 
Superior cooling: bigger box, larger fan, external power supply. It would be really strange if they couldn't run at higher rates than the PS4, while still being quieter.

Ah, the armchair-engineer mentality of "bigger must be better"...

That said, the original PS3 launched with a 380W PSU, and the PS4's is a fraction of that.

Power dissipation is not really an issue compared to 8 years ago.
 
Does that stop motherboards warping and lead-free solder failing under normal use?

Yes...
Do you see PCBs and motherboards warping in PCs?
I know people think consoles use magical fairy tech, but it's the same shit.

The Xbox 360 was a design failure (the polymer used warped under heat, and the new lead-free solder broke under stress, which caused the heatsink to physically detach from the die). 8 years have passed, and every single OEM and manufacturer out there can now produce motherboards, coolers and PCBs that don't break with lead-free solder.
If MS manage to fuck it up where no one else does, it will be an extraordinary feat of incompetence.

This has nothing to do with '5 billion transistors' or 'architectures' or 'optimisation' or even power consumption.
It was bad design of a very basic component, no more difficult than making a functional hairdryer or coffee grinder.
There is no way they can fuck this up again.

If the xbox one has a high failure rate it'll have absolutely nothing to do with why the xbox 360 failed.
 
Ah, the armchair-engineer mentality of "bigger must be better"...

That said, the original PS3 launched with a 380W PSU, and the PS4's is a fraction of that.

Power dissipation is not really an issue compared to 8 years ago.

It's Krilekk. Don't bother with him. He was Juniored for posting insane things.
 
Yes...
Do you see PCBs and motherboards warping in PCs?
I know people think consoles use magical fairy tech, but it's the same shit.

The Xbox 360 was a design failure (the polymer used warped under heat, and the new lead-free solder broke under stress, which caused the heatsink to physically detach from the die). 8 years have passed, and every single OEM and manufacturer out there can now produce motherboards, coolers and PCBs that don't break with lead-free solder.
If MS manage to fuck it up where no one else does, it will be an extraordinary feat of incompetence.

This has nothing to do with '5 billion transistors' or 'architectures' or 'optimisation' or even power consumption.
It was bad design of a very basic component, no more difficult than making a functional hairdryer or coffee grinder.
There is no way they can fuck this up again.

If the xbox one has a high failure rate it'll have absolutely nothing to do with why the xbox 360 failed.

They managed to before, hence the apprehension.
 
They managed to before, hence the apprehension.

Better not buy a Fiat 500 then, you know, since the Pinto tended to explode when rear-ended.
Exploding is on my mind every time I see a new car release these days.

I could understand if you said: "RROD was such a miserable failure, the MS hardware division are a bunch of incompetents, and I have no trust in them producing reliable hardware this time or ever; I can see them fucking up in some way or another. And we now know that if they fuck up, they will probably do like last time and silently put their foot down as they drive the hardware failure off a cliff, while the unsuspecting consumer who buys this shit THEY know is broken suffers for it 2 years down the line."
That makes sense.

But not "oh, they overclocked the CPU, so my conclusion is that it is now more likely to RROD". That does not make sense, since the problem that caused RROD is now well known, and unless they do it on purpose they cannot reproduce it.
 
I can't really go through last 28 pages, but let's get this into perspective:

Xbox One: GPU @853MHz = 1.31TF + CPU @1.75GHz = 0.112TF = 1.42TF TOTAL
PS4: GPU @800MHz = 1.84TF + CPU @1.6GHz = 0.102TF = 1.94TF TOTAL

That's still 36.5% lead for PS4 compared to Xbox One, down from 45.8% before GPU & CPU up-clocks.
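The arithmetic above can be reproduced like this. The shader counts (768 vs 1152), 2 FLOPs per clock per shader ALU (one FMA), and 8 FLOPs per cycle per Jaguar core are the commonly cited assumptions, and the exact lead percentage shifts a few tenths depending on rounding:

```python
# Back-of-envelope peak-FLOPS comparison for the two consoles.
# Assumptions: 2 FLOPs/clock per GPU shader (FMA), 8 Jaguar cores
# at 8 FLOPs/cycle each; widely reported shader counts.

def gpu_tflops(shaders, mhz):
    return shaders * 2 * mhz / 1e6      # 2 FLOPs/clock per shader

def cpu_tflops(cores, ghz, flops_per_cycle=8):
    return cores * ghz * flops_per_cycle / 1e3

xb1 = gpu_tflops(768, 853) + cpu_tflops(8, 1.75)   # ~1.31 + 0.112
ps4 = gpu_tflops(1152, 800) + cpu_tflops(8, 1.6)   # ~1.84 + 0.102
lead = (ps4 / xb1 - 1) * 100

print(f"XB1 {xb1:.2f} TF, PS4 {ps4:.2f} TF, lead {lead:.1f}%")
```

With unrounded intermediates this lands at roughly a 36.8% lead, close to the ~36.5% quoted above.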

I mean that's great for Xbox and multiplatform games in general - no doubt about it, but nowhere near as significant as MS would like everyone to believe.

Problem is, Sony is staying quiet, allowing MS to lead the general public into thinking the Xbox is more powerful, on top of other, much less concrete and thus very relative, claims about 'value proposition', integrated entertainment, a better launch lineup, etc. Has Sony got too confident again?
 
Better not buy a Fiat 500 then, you know, since the Pinto tended to explode when rear-ended.
Exploding is on my mind every time I see a new car release these days.

You are being ridiculously obtuse with that statement.

When people say RROD they are generally (in my mind at least) talking about the huge failure rate of the 360 and how many replacement units they have moved through the years.
 
Asked earlier but I don't think it was covered..

If the CPU isn't power hungry, and is designed to run at around 2GHz, why did both the Xbox and PS4 have it pegged at 1.6GHz initially? That seems very conservative, and it isn't that strong a CPU in the first place, so you'd think they'd want as much as possible from it.

Concern from AMD about a complex APU? A clock-rate multiple based on the expected GPU speed? The GPU was 800MHz, which is half the CPU speed. But then Xbox has increased both so they are no longer clean multiples, which messes that idea up.
 
I figured as much but thought there might be some news in those 29 pages...

Nothing; all the other comments are off topic. However, Albert Penello commented in here:
C'mon seriously? That was not a shot - only you guys would think that. I was in a meeting on my phone, shot out a quick tweet. Srsly.

Sony haven't announced the CPU speed, as people have noted. Plus - when I take a pot-shot at the competition I'm gonna be way more obvious than that. I'm not a fan of the passive-aggressive.

As I'm sure you've figured out - you don't announce something like this at the same time you're working on it. Obviously this has been in the works, but now that we're in full production we felt OK about announcing it.

Devs should start seeing this soon - so any performance gains were happening on the 1.6 boxes. (everything at PAX was on the 1.6). This is a boost to developers on top of the optimizations going on.

Lastly - can I PLEASE finally get you over this "we're having production issues" thing that's going around? We increased the GPU 6%. We increased the CPU almost 10%. We have been showing retail boxes. We are now in full production.

If at this point these facts don't outweigh random rumors...
 
My 2009 CPU seems fine for modern PC games. I somehow doubt their CPU is less powerful than my 2009 CPU. This is certainly not a bad thing, but with audio offloaded to another chip, I'd much rather see a more powerful GPU than anything CPU related.
 
Better not buy a Fiat 500 then, you know, since the Pinto tended to explode when rear-ended.
Exploding is on my mind every time I see a new car release these days.

I could understand if you said: "RROD was such a miserable failure, the MS hardware division are a bunch of incompetents, and I have no trust in them producing reliable hardware this time or ever; I can see them fucking up in some way or another. And we now know that if they fuck up, they will probably do like last time and silently put their foot down as they drive the hardware failure off a cliff, while the unsuspecting consumer who buys this shit THEY know is broken suffers for it 2 years down the line."
That makes sense.

But not "oh, they overclocked the CPU, so my conclusion is that it is now more likely to RROD". That does not make sense, since the problem that caused RROD is now well known, and unless they do it on purpose they cannot reproduce it.

So all the last minute changes inspire confidence in you?

And yet if Fiat made a car that blew up I wouldn't rush out to buy their next model they make, that would be sheer and utter insanity, no matter how many promises they made.
 
It sure as hell took MS forever to figure it out. And how can you say it isn't a concern??

And really, I think first-gen failures of any product like this are always a concern. First-version crappiness is not a non-issue, and given the 360's history it's understandable that the concern might be even stronger in people's minds.
 
And really, I think first-gen failures of any product like this are always a concern. First-version crappiness is not a non-issue, and given the 360's history it's understandable that the concern might be even stronger in people's minds.

Exactly.

Even if you take the RROD fiasco out of the equation, you still have to have some concern about first production runs of big electronics like this. I'm even worried about my PS4 pre-order. Less so than the Xbone, but still.
 
I can't really go through last 28 pages, but let's get this into perspective:

Xbox One: GPU @853MHz = 1.31TF + CPU @1.75GHz = 0.112TF = 1.42TF TOTAL
PS4: GPU @800MHz = 1.84TF + CPU @1.6GHz = 0.102TF = 1.94TF TOTAL

That's still 36.5% lead for PS4 compared to Xbox One, down from 45.8% before GPU & CPU up-clocks.

I mean that's great for Xbox and multiplatform games in general - no doubt about it, but nowhere near as significant as MS would like everyone to believe.

Problem is, Sony is staying quiet, allowing MS to lead the general public into thinking the Xbox is more powerful, on top of other, much less concrete and thus very relative, claims about 'value proposition', integrated entertainment, a better launch lineup, etc. Has Sony got too confident again?

Your summary also ignores that CPU flops are not GPU flops, and that flops on one architecture are not flops on another (you can compare the two GPUs' flops since they use the same architecture, but not the CPUs'), as well as the bandwidth difference.

The HD 5770 on PC had a newer, more efficient architecture than the HD 4870 and the same number of ROPs and SPs, and should have been faster, but because of its crippled memory bandwidth (80GB/s instead of 115GB/s) the 4870 still outperformed it at 1080p+ and with AA.
 