150MHz CPU boost on XBO, now in production

Asked earlier but I don't think it was covered...

If the CPU isn't power hungry, and is designed to run at around 2GHz, why did both Xbox and PS4 have it pegged at 1.6 initially? That seems very conservative, and it isn't that strong a CPU in the first place, so you'd think they'd want as much as possible from it.


I believe it's so it doesn't have any production problems or overheating issues, but I could be wrong.

I just wonder if the PS4 is also in production. Someone ask @ysop lol
 
Now with 150MHz more RROD.

You tried, but I don't think this works at all. What did you want to get from posting this? What do you think it added to the conversation or thread in general? Did you want a laugh, or was this just a drive-by posting for no real reason?

You could tell it was an instant "junior" post before even seeing the avatar.
 
I believe it's so it doesn't have any production problems or overheating issues, but I could be wrong.

I just wonder if the PS4 is also in production. Someone ask @ysop lol

Well, we're still waiting for FCC documents for the Xbone, considering it supposedly passed certification way before the PS4. Of course, no one can find those documents though.
 
Sony still has the huge advantage of the higher memory bandwidth. A good programmer can save almost anything, but he can't save memory bandwidth.

Yeah, I'm aware of that and it's not what I meant. Just look at the press's reaction to this: most will say that the Xbox has a 1.75GHz processor vs. 1.6GHz in the PS4, without any comment on what that means for overall system performance or how it changes the performance gap between the two systems.

Most people, and I'm sure a lot of the press too, will simply think: "The Xbox's CPU is 9% faster than the PS4's, while its GPU is 7% faster, which leads to the Xbox being stronger than the PS4 by 16%!1!1!!!"

:)
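For anyone curious about the actual arithmetic behind why those two percentages don't just add up, here is a rough back-of-the-envelope check in Python. It uses the announced clocks plus the widely reported GCN shader counts (768 for XB1, 1152 for PS4); those shader counts aren't from this thread, so treat them as assumptions rather than gospel.

Code:
# Rough sketch: what the clock bumps do (and don't) mean.
# Shader counts below are the widely reported figures, assumed for illustration.
xb1_cpu_old, xb1_cpu_new = 1.60, 1.75      # GHz
xb1_gpu_old, xb1_gpu_new = 800, 853        # MHz
xb1_shaders, ps4_shaders = 768, 1152       # assumed shader counts
ps4_gpu_clock = 800                        # MHz

cpu_gain = xb1_cpu_new / xb1_cpu_old - 1   # ~9.4%
gpu_gain = xb1_gpu_new / xb1_gpu_old - 1   # ~6.6%

# Peak GPU throughput: shaders * 2 ops per clock * clock, in TFLOPS
xb1_tflops = xb1_shaders * 2 * xb1_gpu_new * 1e6 / 1e12    # ~1.31
ps4_tflops = ps4_shaders * 2 * ps4_gpu_clock * 1e6 / 1e12  # ~1.84

print(f"XB1 gains: CPU {cpu_gain:.1%}, GPU {gpu_gain:.1%}")
print(f"GPU peak: XB1 {xb1_tflops:.2f} TFLOPS vs PS4 {ps4_tflops:.2f} TFLOPS")
# The two gains apply to different parts of the system, so they never sum
# into one "16% stronger" number, and the GPU gap barely moves.

Of course nobody in the mainstream press is going to run that, which is exactly the point.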
 
Exactly.

Even if you take the RROD fiasco out of the equation, you still have to have some concern given how first production runs of big electronics like this can go. I'm even worried about my PS4 pre-order. Less so than the Xbone, but still.

Yeah. It's a well-known thing with Apple products too.

Given the components, case size and internal PSU of the PS4, I'd probably be more concerned, all things being equal.
 
So all the last-minute changes inspire confidence in you?

And yet if Fiat made a car that blew up, I wouldn't rush out to buy the next model they make; that would be sheer and utter insanity, no matter how many promises they made.

I covered that in the top part of my post, didn't I?
I don't trust MS to give half a shit about the consumer if something goes wrong, and I don't trust them to make good hardware, but I'm not worried about an overclock causing an RROD.
If they know it's broken right now, I know they will still release it as long as they think the failures will happen after the warranty expires, because that is what MS do and that is who they are: evil, corporate, anti-consumer shits.

If you want to be worried about the Xbox One, sure, but not because of an overclock (more like a de-underclock, since a similar CPU in a laptop will be clocked at 2GHz). It's a 1.75GHz netbook CPU in a 5x larger shell; let's get real here.
It's not like it's a 5GHz Bulldozer CPU that produces 200W of heat.
 
The GeekWire article's related posts section got a chuckle out of me:

Related Posts
Memo: Microsoft online vet Mehdi moving to Xbox business

The next Xbox: What to expect from Microsoft's new console
Xbox exec: 'We have a more complete value proposition than our competitors'

Best Products for Relieving Itchy, Dry Vaginas
 
RROD is only a concern for 1st run. They are not dumb. They have contingency plans, and even before those, they made sure to overengineer in terms of heat.
 
Your summary (ignoring that CPU flops =/= GPU flops, and that flops on one architecture =/= flops on another; you can compare the GPU flops to the other GPU's flops since they use the same architecture, but not to the CPU's) also ignores the bandwidth difference.

The HD 5770 on PC had a newer, more efficient architecture than the HD 4870, with the same number of ROPs and SPUs, and should have been faster, but because of its crippled memory bandwidth (80GB/sec instead of 115GB/sec) the 4870 still outperformed it at 1080p+ and with AA.

I'm fully aware of the GDDR5 vs. DDR3+ESRAM, hUMA vs. no/maybe hUMA, etc. differences, and I thought those weren't debatable, so I concentrated on the PUBLIC IMAGE these latest upclocks create and how the PUBLIC (not GAF / geeks) thinks about it :)
 
According to the Killzone Shadow Fall PDF, the PS4 CPU is running at 1.6GHz, and as I mentioned earlier: if the balance between parts of the system is already optimal, then there's no need to change it just because Microsoft does another 180.

Making any change at all is not doing a 180. Improving the CPU clock is not a 180. A 180 degree turn would be to reverse course on a stance, not make an incremental improvement.

Oh, bad news for the Xbox One again, gee, can't take a break.

Except that this is only good news and there is no negative angle to it.
 
I covered that in the top part of my post, didn't I?
I don't trust MS to give half a shit about the consumer if something goes wrong, and I don't trust them to make good hardware, but I'm not worried about an overclock causing an RROD.

Oh, after the 360 disaster and the costs associated with that, I think MS would have taken more care at all stages so they don't have another clusterfuck. That's why, I'm sure, the whole back is a vent.
 
Yeah, I'm aware of that and it's not what I meant. Just look at the press's reaction to this: most will say that the Xbox has a 1.75GHz processor vs. 1.6GHz in the PS4, without any comment on what that means for overall system performance or how it changes the performance gap between the two systems.

Most people, and I'm sure a lot of the press too, will simply think: "The Xbox's CPU is 9% faster than the PS4's, while its GPU is 7% faster, which leads to the Xbox being stronger than the PS4 by 16%!1!1!!!"

:)

Sony could just argue with almost 2 TFLOPS of compute power. The PlayStation still has the fact-sheet advantage.

Tbh, I think nobody cares.
 
Asked earlier but I don't think it was covered...

If the CPU isn't power hungry, and is designed to run at around 2GHz, why did both Xbox and PS4 have it pegged at 1.6 initially? That seems very conservative, and it isn't that strong a CPU in the first place, so you'd think they'd want as much as possible from it.

Concern from AMD with a complex APU? A clock rate multiple based on expected GPU speed? The GPU was 800MHz, which is half the CPU speed. But then Xbox has increased both, so they are no longer clean multiples, which messes that idea up (quick ratio check at the end of this post).
Jaguar isn't "designed" to run at 2GHz; that just happens to be the upper end of the clocks AMD decided on for the desktop SKUs.

The whole 1.6GHz target seems to have come from AMD itself as the sweet spot for the CPU; they probably ran tests to determine that perf/W was best around that clock, and both MS and Sony went with it, at least for their initial specs.

The clock rate of the CPU isn't tied to the GPU's; just take a look at any AMD APU and you'll find that the CPU cores have dynamic clocks.
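As promised, the ratio check is trivial; a two-line Python sketch using only the clocks quoted in this thread:

Code:
# Is the CPU clock still a clean 2x multiple of the GPU clock?
print(1600 / 800)   # 2.0   -> the original clocks line up exactly
print(1750 / 853)   # ~2.05 -> after the upclocks, no longer a clean multiple

So whatever the original reason for 1.6GHz was, a strict clock-multiple requirement clearly wasn't it.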
 
Jaguar isn't "designed" to run at 2GHz; that just happens to be the upper end of the clocks AMD decided on for the desktop SKUs.

The whole 1.6GHz target seems to have come from AMD itself as the sweet spot for the CPU; they probably ran tests to determine that perf/W was best around that clock, and both MS and Sony went with it, at least for their initial specs.

The clock rate of the CPU isn't tied to the GPU's; just take a look at any AMD APU and you'll find that the CPU cores have dynamic clocks.

Anandtech also notes a huge jump in TDP to go from 1.6 to 2.0 GHz.
 
The 1.6 GHz clock speed was likely for yields, not power consumption. Same reason the GPUs are at ~800 MHz even though we know GCN does fine at 1000 MHz.

Anandtech also notes a huge jump in TDP to go from 1.6 to 2.0 GHz.

The heat concerns are unsubstantiated for a few reasons:

1. TDP != power consumption, especially when you're isolating the CPU from an SoC. Anand showed this himself in his Kabini review when he recorded the entire laptop drawing 11.5W under CPU load (the A4-5000 is rated at 15W).
2. The 15W -> 25W jump includes a GPU clock boost of 20%.
3. The Opteron X1150 (CPU-only Jaguar) shows otherwise. A 100% increase in CPU speed (1 GHz -> 2 GHz) results in only an 88% increase in TDP (9W -> 17W).
4. Jaguar consumes so little power that even if a 25% bump to 2 GHz in the XB1 resulted in a 50% increase in power, you're looking at an extra 10W at most (rough estimate of the actual bump below).
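And for the bump that actually shipped (1.6 -> 1.75 GHz), a very crude estimate is to linearly interpolate between the two Opteron X1150 data points above. This is a sketch under an assumed linear clock-to-power relationship (real scaling is worse if voltage rises too), not a measurement:

Code:
# Linear interpolation between the Opteron X1150 points: 9W @ 1.0 GHz, 17W @ 2.0 GHz.
# Purely illustrative; assumes the XB1's Jaguar cluster behaves similarly and that
# voltage stays constant, which is an assumption, not a known fact.
def est_tdp(ghz, lo=(1.0, 9.0), hi=(2.0, 17.0)):
    (c1, w1), (c2, w2) = lo, hi
    return w1 + (ghz - c1) * (w2 - w1) / (c2 - c1)

delta = est_tdp(1.75) - est_tdp(1.6)
print(f"Estimated extra CPU power for 1.6 -> 1.75 GHz: ~{delta:.1f} W")  # ~1.2 W

A watt or two of extra CPU draw is noise next to the GPU and the rest of the box.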
 
I'm fully aware of the GDDR5 vs. DDR3+ESRAM, hUMA vs. no/maybe hUMA, etc. differences, and I thought those weren't debatable, so I concentrated on the PUBLIC IMAGE these latest upclocks create and how the PUBLIC (not GAF / geeks) thinks about it :)

Dude...

The (non-geek) public doesn't know about XB1 upclocks...
 
Anandtech also notes a huge jump in TDP to go from 1.6 to 2.0 GHz.
There was a GPU upclock in there too, but I think they made that call based on their discussions with AMD, which is why they haven't removed or retracted that bit. (Most of the time Anand gets his info directly from AMD/Intel/Nvidia.)
 
It's like a water drop on a hot plate. Both consoles are underpowered, XB1 even more so, and it already shows before the gen has even started.
 
Isn't it pretty normal to tweak things up to the final minute?

I know it was the case for the Wii U, with Nintendo releasing new devkits like every month. I'm sure the PS4 is also still being tweaked.
 
Isn't it pretty normal to tweak things up to the final minute?

I don't think that this is a real "last-minute" change. People confuse the time of announcement with the time of decision. And yes, it is perfectly possible that final clocks are determined once all the other engineering results (like case design) are fixed. The prior clocks could just have been deliberately conservative targets that were guaranteed to be met and that developers could rely on as a baseline.
 
I'm actually a little concerned about all the changes being made to the Xbox.

All the DRM issues are to be reversed with a Day One patch.
What if you don't have internet?

All these BIOS changes (I assume) must be delivered by a software update.
What if you don't have internet and buy a game that uses these upped specs but doesn't come pre-loaded with the software update?
 
With two last-minute overclocks now, I wonder if any potential problems might lie in the power supply rather than in overheating. I'm sure they thought of this, but last-minute overclocks with no power supply changes seem like they could affect the long-term reliability of the console. Although I am certainly no expert...
 
Nothing; all the other news comments are off topic. However, Albert Penello commented in here.

That should be in the OP. It all makes sense to me. Only NeoGAF could somehow turn this negative: an increase in performance, therefore (for some reason) a greater chance of failure, and "PS4 is still more powerful!"

Who cares? If you weren't going to buy an Xbox, it doesn't affect you.

If you were, it's better. Done.
 
It's like a water drop on a hot plate. Both consoles are underpowered, XB1 even more so, and it already shows before the gen has even started.

Underpowered compared to what? Their price tag? Their TDP? Fully specced PCs?

It's not like we got incredible-looking launch-window games last gen either. Both consoles will see huge improvements over time.
 
With two last-minute overclocks now, I wonder if any potential problems might lie in the power supply rather than in overheating. I'm sure they thought of this, but last-minute overclocks with no power supply changes seem like they could affect the long-term reliability of the console. Although I am certainly no expert...
The power designs are mostly overcompensated, so that concern is probably unwarranted IMO.
 
C'mon seriously? That was not a shot - only you guys would think that. I was in a meeting on my phone, shot out a quick tweet. Srsly.

Sony haven't announced the CPU speed, as people have noted. Plus - when I take a pot-shot at the competition I'm gonna be way more obvious than that. I'm not a fan of the passive-aggressive.

As I'm sure you've figured out - you don't announce something like this at the same time you're working on it. Obviously this has been in the works, but now that we're in full production we felt OK about announcing it.

Devs should start seeing this soon - so any performance gains were happening on the 1.6 boxes. (everything at PAX was on the 1.6). This is a boost to developers on top of the optimizations going on.

Lastly - can I PLEASE finally get you over this "we're having production issues" thing that's going around? We increased the GPU 6%. We increased the CPU almost 10%. We have been showing retail boxes. We are now in full production.

If at this point these facts don't outweigh random rumors...
Thanks! So DR3 having fewer FPS problems at PAX makes sense now. I am happy.
 
I'm actually a little concerned about all the changes being made to the Xbox.

All the DRM issues are to be reversed with a Day One patch.
What if you don't have internet?

All these BIOS changes (I assume) must be delivered by a software update.
What if you don't have internet and buy a game that uses these upped specs but doesn't come pre-loaded with the software update?

The device will not run games without the day 1 patch. You need an initial Internet connection.

This upclock has been in production units for a while now. They just felt comfortable to ANNOUNCE it today.
 
I'm pretty confident in the reliability, as they said the team who designed the Trinity 360 revision designed the X1, and those consoles seem rock solid. Looking at the size of it and the fan, cooling seems to have been at the top of the agenda; the overkill to make sure of it has probably left loads of headroom for these clock-speed bumps.

More performance is nothing but good news in my book, and only devs really know what optimisations MS have put into their hardware to help it get optimal output; counting basic TFLOP figures isn't the total picture. I'm sure game engines changing to be threaded correctly for the eight-core CPU will make a huge difference.
 
Ehm, between the UI (elaborate Lua stuff, similar to MMOs) and the thousands of entities to keep track of in the map (infestation; the many, many AI units like MACs, Babblers, Whips, etc.; the buildings; item drops; and so on), the game actually has a good reason to be demanding on the CPU.

There is nothing even remotely similar on the consoles.

NS2 is not poorly coded; it uses 3 threads (most modern games only use 2 anyhow, since they are designed for consoles and not demanding on the CPU at all), with very few games such as SC2 and BF3 actually being able to use more than 2 properly.

- Game logic (the entities, the AI, etc.) runs on one thread. This is the case in most games, and THIS is where NS2 has so much more going on than most other games, and why it is so demanding.

- Rendering (objects, etc.) runs on a separate thread. The engine supports multithreading for rendering, but since the game logic is where the majority of the processing happens, having more cores will not linearly scale performance at all (obviously).

- Prediction (netcode-related) runs on a separate thread. It accounts for a small amount of processing but does scale up in the late game as more shit is going on, so having more CPU cores helps performance here.

But obviously having strong single-core performance is important in this game, as it is needed to run the game logic; you need one core that can keep up with the game logic, or else the other cores will just be waiting on that one core and you won't see the benefits.
(But again, a decent amount of non-game-logic processing IS offloaded to other cores, which helps a lot with performance, since that one core can spend 100 percent of its time on the game logic; see the rough sketch below.)

If the game were poorly programmed it would run on one thread; it doesn't, and it isn't.
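To make that thread split concrete, here is a heavily simplified Python sketch of the same layout (game logic / rendering / prediction). It is purely illustrative; the names and structure are mine, not actual Spark engine code.

Code:
import queue
import threading
import time

snapshots = queue.Queue()   # the game-logic thread publishes world snapshots here

def game_logic():
    # The single thread that has to keep up: entities, AI, infestation, etc.
    for tick in range(5):
        time.sleep(0.03)            # stand-in for simulating thousands of entities
        snapshots.put({"tick": tick})
    snapshots.put(None)             # signal shutdown

def renderer():
    # Consumes whatever the logic thread last produced.
    while (snap := snapshots.get()) is not None:
        print(f"render tick {snap['tick']}")

def prediction():
    # Netcode-related work; small, but it grows in the late game.
    for _ in range(5):
        time.sleep(0.01)            # stand-in for client-side prediction

threads = [threading.Thread(target=f) for f in (game_logic, renderer, prediction)]
for t in threads:
    t.start()
for t in threads:
    t.join()
# If game_logic can't keep up, the renderer just sits waiting on the queue,
# which is exactly the "other cores wait on that one core" problem.

Obviously the real engine is nothing like this simple, but it shows why one slow core bottlenecks everything even with a clean thread separation.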

My point stands: NS2 is not possible on the consoles, not even close, especially considering a single modern CPU core is much more powerful than all 3 Xbox 360 cores combined.
Making the game logic multithreaded would still not allow the game to run on the 360.

If NS2 were designed as a multiplatform game it would not have been released in its current form; it would have had the mechanics of NS1 instead (no dynamic infestation, no elaborate UI, no power-node mechanic).
Another example of design limitations: the CPU-intensive Lua code for the UI is why NS2 has mod support to begin with; no Lua, no mods in this game.
The things you miss out on but never know about if they never happen, eh.

That is the point I was making: a better CPU allows for more elaborate/interesting/new game mechanics, and the CPUs in the PS4/XB1 mean there are certain things we will not see next gen that we would have seen if they had had better CPUs.
This is why BF3 on consoles only supported 24 players, for the exact same reason.
Next gen there will be the equivalent of 24-player Battlefield in some games which get decent PC ports, but most games will just be designed from the ground up around the limitations to begin with.

That is why every little bit of increase on the CPU front helps.

The 'GPGPU' shit is only useful for particles, some physics and some tessellation; it won't help with the vast majority of calculations that currently happen (and will continue to happen) on the CPU.
So yeah, enjoy the next-gen equivalent of 24-player Battlefield 3 (comprehensive reading before replying please, before you go "Battlefield 4 has 64 players on consoles") with shiny GPGPU particles, instead of meaningful gameplay mechanics.

Tell you what I'll do... I'll drop a message tomorrow to the NS2 devs since my cousin's office is right next to theirs. I'll post their reply for you, and if they say you're full of shit, then you have to donate $20 to charity. If they say I'm wrong, I'll donate $20 to charity. Deal?

Now with 150MHz more RROD.

This post made with 150MHz more stupid.
 
Bottom line:
the Xbox 360 was a beast back in 2005.
I had many burn out on me, but at least they died gloriously.

This time, with the Xbone being underpowered and all, it will be a very not-glorious death if it happens.
The same also holds for the PS4.


You give us mid-range stuff? For starters, you are very stupid for doing that, but that's another story.
At the very least, they'd better not break like the last ones, you motherfuckers! Both!
 
Bottom line:
the Xbox 360 was a beast back in 2005.
I had many burn out on me, but at least they died gloriously.

This time, with the Xbone being underpowered and all, it will be a very not-glorious death if it happens.
The same also holds for the PS4.


You give us mid-range stuff? For starters, you are very stupid for doing that, but that's another story.
At the very least, they'd better not break like the last ones, you motherfuckers! Both!

Do either of the consoles come with any warranty? At least one year or two, and my Amex will add an extra one on top of that.
 
Not sure how this makes any sense. If they're boosting the specs obviously they're confident it won't blow up in their faces like the 360 did.

Wanting to avoid another round of RROD or not, I have no idea how you cannot see that making last-minute changes to any product makes it look worse planned. That aside, you still cannot argue that they're left with as much time to test it properly.
 
Do either of the consoles come with any warranty? At least one year or two, and my Amex will add an extra one on top of that.

Here in the EU, everything electronic has a 2-year warranty, whether the producer likes it or not.
At least that's how it was until last year. I don't know, with all the shitstorm in the EU, whether it has changed recently, but I think I'd have heard something if it did.
 
Wanting to avoid another round of RROD or not, I have no idea how you cannot see that making last-minute changes to any product makes it look worse planned. That aside, you still cannot argue that they're left with as much time to test it properly.

They don't need to test it now, it is already done. These changes are not made on a whim.
 
Here in the EU, everything electronic has a 2-year warranty, whether the producer likes it or not.
At least that's how it was until last year. I don't know, with all the shitstorm in the EU, whether it has changed recently, but I think I'd have heard something if it did.

1 year in the UK.
 
Do either of the consoles come with any warranty? At least one year or two, and my Amex will add an extra one on top of that.

To be honest, the warranty services usually suck. You wait like a month for the process and you get a band-aid-fixed refurb in return.

I think at some point the 360 only had a 90-day warranty. I was burning through those original ones like crazy, but the slim revised unit has been extremely reliable. I really hope they carry over whatever they did to fix things.
 
Wanting to avoid another round of RROD or not, I have no idea how you cannot see that making last-minute changes to any product makes it look worse planned. That aside, you still cannot argue that they're left with as much time to test it properly.

Not as last-minute as you think; it sounds from Albert like they have been working on it for some time, and the yields are there to improve the clocks, so they did.

It's not like they went crazy; the bumps are very modest, and I'm certain they are well within their window.
 