150MHz CPU boost on XBO, now in production

Just so I understand: when all we had were leaks and spec sheets, people were able to extrapolate power and TFLOPS and GFLOPS and XFLOPS and make assumptions based on this. But now that the XB1 CPU is stated as clocked higher than the PS4's rumored clock, we're back to "it's not even final yet" and "we don't know" and "oh, you have the numbers now, care to share"?
Just so I understand, can you find all these posts, not dated within the last few days, that talk about the CPU a lot when comparing the two systems?

If you're asking whether the systems were compared based on leaks, articles that tried to interpret those leaks, individual tidbits revealed by either company, and final confirmations, then yes, they were compared. Usually with the caveat of "assuming the currently known information is valid, and specs are always subject to change."
 
Is the 1.6GHz figure what Sony stated for the CPU clockspeed or what everyone keeps assuming?

Didn't VGLeaks say 1.6GHz?

The X1 box is so huge and has so many vents that an upclock seemed possible, and perhaps even needed to compete with the PS4.
This upclock will perhaps bring launch games that were CPU-limited, like Dead Rising 3 running at 15~25fps, into a more stable 25~30fps range...

/Just assuming Dead Rising 3 is CPU-limited; I can hardly believe a game with those graphics is GPU-limited. But then again, you have Killer Instinct being 720p, and that killed all the hype. I don't feel like supporting devs that cut that big a corner. I'd rather have shittier lighting, then.
 
Was this posted?

Digital Foundry's take on this news.

http://www.eurogamer.net/articles/digitalfoundry-xbox-one-cpu-speed-increased-in-production-hardware

The CPU performance increase, in combination with its cluster of custom "Data Move Engines", strongly suggests that while PlayStation 4 has an undoubted graphics hardware advantage, the Xbox One's CPU capabilities are a significant step ahead.

The DMEs are there to move data around to address bandwidth issues. Since there is only one pool of memory on the PS4, it doesn't need them.

Only a hack like Leadbetter would use the DMEs to claim that the Xbone's CPU is significantly more capable than the PS4's.
 
Yeah, there was a slide during the Killzone: SF demonstration that showed the CPU at 1.6GHz. There was also the document from DF's 4.5GB memory article that showed 2 of the 8 CPU cores going to the OS, just like on the Xbox One.
 
If you can't work out from the wording of my post that it's not specifically targeted at you, then I don't know what else to tell you. Aside from quoting you, it's pretty clear I went off on a tangent at all the people belittling this improvement for the sake of whining.

How you can't get that from my post is beyond me; it seems pretty apparent to me.
As someone who has become what I would call a bit of a "Sony fan", I wish people like you would cut this shit out; y'all make us look fucking bad.
Seems very much to me like you're singling me out.
 
Was this posted?

Digital Foundry's take on this news.

http://www.eurogamer.net/articles/digitalfoundry-xbox-one-cpu-speed-increased-in-production-hardware

The CPU performance increase, in combination with its cluster of custom "Data Move Engines", strongly suggests that while PlayStation 4 has an undoubted graphics hardware advantage, the Xbox One's CPU capabilities are a significant step ahead.

Well, only if the PS4 reserves 2 cores for the OS as well. If the PS4 reserves less than 1 core, as was rumored, then it will be the PS4 that has a CPU advantage.
 
Man, after my experiences with a perpetually red-ringed 360, the last thing I want to hear about the Xbox One is that they are overclocking the GPU and CPU in a last-minute defensive panic.
 
Just so I understand, can you find all these posts, not dated within the last few days, that talk about the CPU a lot when comparing the two systems?

If you're asking whether the systems were compared based on leaks, articles that tried to interpret those leaks, individual tidbits revealed by either company, and final confirmations, then yes, they were compared. Usually with the caveat of "assuming the currently known information is valid, and specs are always subject to change."

Why would I need to find those posts? It's fairly obvious it was used whenever calculations were made, or were all the vs spec posts ignoring the CPU completely?

Man, after my experiences with a perpetually red-ringed 360, the last thing I want to hear about the Xbox One is that they are overclocking the GPU and CPU in a last-minute defensive panic.

Wow. Ignorance at its finest.
 
You are not playing next-gen games. 60fps in everything is certainly not practical, especially with such weak CPUs. 2.25x the resolution and the DX11 feature set are already pushing the consoles, and games will have more features and scope further into the consoles' lives to push them more.
But if we are already pushing next-gen hardware fairly hard, particularly on the CPU side, the extra 150MHz isn't going to do anything. My rig, for example, is already getting taxed pretty hard playing the highest end of current-gen stuff; simply OCing it by a couple hundred MHz does not make a significant impact on performance one way or the other. Something like a $400 GPU with more shaders and memory bandwidth would do a lot for me, though. That is the kind of thing MS needs and can't get, so they're resorting to these weak clock speed gains and using them as a band-aid.

This has less to do with what gen we're talking about and more to do with how much gain you are actually going to get in environments that are heavily taxing the hardware by default.
 
Why would I need to find those posts? It's fairly obvious it was used whenever calculations were made, or were all the vs spec posts ignoring the CPU completely?
You're trying to insinuate something that I don't think took place, so I think it would be fair to ask for evidence.

The CPU was assumed to be the same, in addition to being less important for graphics, so the vast majority of comparisons were about the GPU capabilities and memory system.
 
Doesn't the PS4 use a core for audio?

Well, these are some quotes from an audio programmer from that same B3D thread where Bkilian talks about the Xbox ONE audio chip.

http://beyond3d.com/showthread.php?t=63677&page=9

Relab said:
I would roughly estimate a very high quality reverb will use 6-10% of a single Jaguar core with 32 sample buffers and around 768KB of memory at 44.1kHz.


Relab said:
It takes a lot of time (1-3 years) to design a high-end reverb algorithm from scratch in our field, so it's obviously way faster to use convolution from a game developer's point of view. But since both Sony and MS have moved to multiple x86 cores and more RAM compared to the previous generation, there are completely new technologies available (with much better quality).

With 5 algorithmic reverbs of the same quality as skysound (from the Forza video), you would use 35-50% of a single Jaguar core.

So surely the Xbox ONE has a very powerful audio chip, but it does not seem like performing some audio tasks on the CPU would require an entire core just for audio.
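
To put those estimates in context, here's a back-of-the-envelope sketch; the per-reverb cost is Relab's ballpark figure, not a measurement, and I'm assuming the costs simply add linearly:

```python
# Rough CPU-budget sketch using Relab's estimates from the B3D thread.
# These are his ballpark figures, not measurements, and costs are
# assumed to add up linearly across reverb instances.

PER_REVERB_COST = 0.10  # up to ~10% of one Jaguar core per high-quality reverb

def core_usage(num_reverbs, cost_per_reverb=PER_REVERB_COST):
    """Fraction of a single Jaguar core consumed by algorithmic reverbs."""
    return num_reverbs * cost_per_reverb

print(f"5 reverbs:  ~{core_usage(5):.0%} of one core")   # ~50%, the top of his 35-50% range
print(f"10 reverbs: ~{core_usage(10):.0%} of one core")  # a full core, worst case
```

So even a worst-case pile of ten high-quality algorithmic reverbs only eats about one core, which is the point.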
 
None of this is relevant to my post. At no point did I say that the clock bump is a bad thing. It obviously isn't; it is only a good thing. The stuff about competition is basically completely irrelevant, and I've argued at length in other threads that I don't want a Sony-dominated videogame market.

The fact remains that a 10% clockspeed bump on the GPU is negligible. It translates to a handful more FPS (if that). You probably wouldn't notice the difference between the old Xbone clockspeeds and the new ones if they were running side by side, unless you hooked it up to one of DF's machines and got a numerical readout of the framerate.
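
To put rough numbers on "a handful more FPS (if that)", here's the absolute best case, assuming framerate scales linearly with clock, which it almost never does:

```python
# Best-case framerate gain from a clock bump, assuming performance is
# entirely clock-bound and scales linearly -- an upper bound, not reality.

def best_case_fps(old_fps, old_clock_mhz, new_clock_mhz):
    return old_fps * (new_clock_mhz / old_clock_mhz)

# The CPU bump this thread is about: 1600 MHz -> 1750 MHz (+9.4%)
print(f"{best_case_fps(28, 1600, 1750):.1f} fps")  # ~30.6, and only if fully clock-bound
```

In practice games are bound by a mix of things, so the real gain sits somewhere between zero and that.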

At no point did I tear MS to shreds; your post is a wild overreaction to what I think most people would agree is a pretty neutral assessment. If you think there are people who are being overly critical of MS, you're barking up the wrong tree by singling me out.

I know what you are doing, but again, this is not how the XB1 architecture works. The CPU advantage starts at 9.38% and could end up over 31% in certain cases.
 
Well, these are some quotes from an audio programmer from that same B3D thread where Bkilian talks about the Xbox ONE audio chip.

http://beyond3d.com/showthread.php?t=63677&page=9

So surely the Xbox ONE has a very powerful audio chip, but it does not seem like performing some audio tasks on the CPU would require an entire core just for audio.

In fact, this is what Bkilian answered to this one:

"Is this a convolution reverb? What is the size of the impulse? How much bandwidth will it use? I suspect it isn't a convolution, because the memory requirement for that would be way higher.

Irrespective of those things, a game like Forza would like to use multiple convolution reverbs, and would probably use an entire Jaguar core, if not more, to do it."

Re-read, please: "an entire Jaguar core... if not more, to do it."
 
Well, these are some quotes from an audio programmer from that same B3D thread where Bkilian talks about the Xbox ONE audio chip.

http://beyond3d.com/showthread.php?t=63677&page=9

Just to stress my point again: he is talking about a specific effect at high quality, not about "audio" in general. I see people here won't differentiate, again. This example is, by the way, very different from "raycasting on the GPU"; some people mistook that as "PS4 uses the GPU for audio". We are talking here about different tasks for different purposes, none of which are mandatory for a game nor generally relevant to all games.
 
I know what you are doing, but again, this is not how the XB1 architecture works. The CPU advantage starts at 9.38% and could end up over 31% in certain cases.

No, it does not. If the PS4 reserves less than 1 core for the OS, it will be 8 cores for the PS4 versus 6 cores for the Xbox ONE, which would give the PS4 a CPU advantage. So until we know how many cores are reserved for the OS on the PS4, and what the final clock of its CPU is, we can't really say which console will have a CPU advantage.
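
Since both sides keep waving this arithmetic around, here's a quick sketch of it. The PS4's 1.6GHz clock and its OS core reservation are rumors, only the XB1's 1.75GHz is confirmed, and "cores × GHz" is a crude stand-in that ignores everything else that matters:

```python
# Aggregate CPU throughput (available cores x clock) under different
# OS-reservation scenarios. PS4 figures are rumored, not confirmed, and
# "core-GHz" ignores memory, offload hardware, and everything else.

def aggregate_ghz(total_cores, os_cores, clock_ghz):
    return (total_cores - os_cores) * clock_ghz

xb1 = aggregate_ghz(8, 2, 1.75)                 # 6 usable cores -> 10.5 core-GHz
for ps4_os_cores in (1, 2):
    ps4 = aggregate_ghz(8, ps4_os_cores, 1.6)   # rumored 1.6 GHz clock
    delta = (ps4 - xb1) / xb1
    print(f"PS4 reserving {ps4_os_cores} core(s): {ps4:.1f} vs {xb1:.1f} -> PS4 {delta:+.1%}")

# PS4 reserving 1 core(s): 11.2 vs 10.5 -> PS4 +6.7%
# PS4 reserving 2 core(s): 9.6 vs 10.5 -> PS4 -8.6%
```

The whole "advantage" flips on a number neither company has confirmed, which is the point.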
 
No, it does not. If the PS4 reserves less than 1 core for the OS, it will be 8 cores for the PS4 versus 6 cores for the Xbox ONE, which would give the PS4 a CPU advantage. So until we know how many cores are reserved for the OS on the PS4, and what the final clock of its CPU is, we can't really say which console will have a CPU advantage.

Right, we will see.
 
BS like what? Explain, please.

It's too hard to explain, because you quite literally do not know enough about the things you are talking about to realise when you are spouting absolute nonsense. That's not to say other people in this thread haven't been as well; it just seems to be your job to regurgitate and misinterpret things.


No-one can form a coherent sentence on B3D either?

Coherency is a hard problem to solve ;)
 
Or, you know, 28fps to 30.
Congratulations: the Xbone can play games at 30fps in 2013, according to StevieP, because he assumes so for the sake of argument. Next generation is HERE!

Seriously, man. There's no reason any game should run at <30fps, wimpy clock boost or not. If it's running that slowly, there's a bigger problem, and it lies with the devs. You are being a bit... paranoid in assuming everything is going to run between 25-38fps this gen or something.

You're missing the point entirely. Dolphin was mentioned as an example of something requiring more CPU power, just like some games require more from your CPU depending on the load and what they're doing. There are recent RTS and FPS games (one I've mentioned - Planetside 2) that require a VERY beefy CPU to keep up, with less emphasis on the GPU than other titles (of which I also listed an example - Crysis). Each game demands different things than other games, depending on its type, scope, and what's happening on screen. In some cases a more powerful GPU will help more; in other cases a more powerful CPU will help more. Yes, we're talking about a barely-there increase in this thread. But it's a net positive for what's already a weak CPU.
I'm not missing the point at all; your Dolphin comparison was completely invalid and has absolutely no place in this debate whatsoever. You are wrong. It's not practical to even list Dolphin as an example of a game needing more power, because it is emulation of a game intended for nothing but a GameCube/Wii, and it requires high single-core performance, game hacks, etc. to run optimally. I understand everything else you are trying to say about extra clock cycles needed on either the CPU or GPU side depending on the game; you don't have to lecture me as though I've never played a game on PC that demands better CPU or GPU performance than I currently have. But I can tell you that even OCing by an extra 400MHz or more barely does anything for performance. And going from 29fps to 30fps in some instances is not as life-changing as you would suggest.

If I had more shaders, faster GPU memory, and faster system memory, on the other hand, that would probably help a lot, even with games that demand a lot from the CPU.

But you are silly to bring in this whole lecture over an overall 200MHz boost in system clock speed. It's not going to do anything significant. What the Xbone needs is more memory bandwidth, more shaders, more ROPs, and a significantly higher CPU clock (i.e. 2.5-3.0+ GHz; it and the PS4 are both lacking there), etc., if it wants a legitimate performance boost that will really stand out.

Nothing is coded to any metal anymore. This saying really needs to die in a fire. Its benefits were already vastly overstated to begin with.
No kidding. But I meant that the games on Xbone are designed and optimized to run on Xbone, however CPU and/or GPU demanding they may be, and thus comparing emulation of Wii games on an unintended platform is... poppycock.

No, it's not essentially nothing. It's essentially something, just not very much of something. It's a positive.
It is not worth pages and pages and pages of discussion. It's a net positive like having $1000 in my checking account and, unlike yesterday, a $10 bill in my wallet today in addition to that.

Nothing's going to be as bad as 580p without even so much as blurry post AA, sure. I agree with that.
Could've fooled me for a second there.

But people are going to start getting pissed when they see some games at 720p/30. It isn't going to take long. It's not laziness at all; developers are the opposite of lazy the majority of the time. It's a choice that is made to maximize eye candy. There is no platform (except PC) where you can have a consistent resolution and framerate to your liking. On the consoles, developers choose the priorities, not you. And as you've seen, many of them prefer eye candy at the expense of literally everything else. This will continue no matter the console.
And as I have maintained, there is enough power on both platforms to ensure that we won't be locked into 720p/30 hell or what have you. I agree that many times it will be a choice and not laziness, but generally speaking, games that stick out with exceptionally poor performance will come down not to dev "choice" but to shitty/lazy coding, and given the pretty open nature of both platforms, that will probably happen because it'll be easier for inexperienced devs to try something. On the flip side of the coin, because both platforms are essentially PCs with some extra optimizations, they should be extremely easy to develop for compared to last time, if the dev has any semi-credible level of knowledge of how to design a game. So that, combined with more powerful hardware to begin with, should prevent a lot of games from getting "locked" at a shitty fps. FWIW, some games look just fine at a constant 30; it just depends on the game.

There is nobody here that I am aware of stating that the Xbone is the better-spec'd machine, especially on the GPU side. However, if the Sony Jaguar is at 1.6GHz and the MS Jaguar is at 1.75GHz, which one is better? It's a small net positive. That's all this is.
It isn't practical to dissect it like that and ask which one is better. It's barely a net positive, but a positive nonetheless, sure. But Sony has more things there to help make games look and play better. A raw 150MHz overclock on the CPU side with no further optimizations to the GPU, no extra memory bandwidth, etc. is not enough to outclass PS4, but to a lot of people that is what it is going to look like.

No, I don't get to tell you anything. But it's worth repeating that if you're this invested in the difference between 2 consoles, it's probably best you invest more into the rig you're sporting. There are GPUs releasing that offer double the raw performance of the consoles for around the same price, for example. A 2TF difference is a lot more than a 500GF difference (hence what seems like hypocrisy here). If you want the best framerates/IQ/etc. as you've been saying, you're already in second place purchasing these things before the consoles even release.
I'm not in a race here. I am just calling a spade a spade; this boost is designed to look like something that gives Xbone an edge when it barely does anything; a small 'net positive'. That's it. As I said I am not getting a new GPU anytime soon, I'd much rather wait and see how next-gen plays out after a year or so and buy a PS4, or reluctantly buy an Xbone if it happens to take over the world.

As always (for me at least) purchasing the consoles is a way into their ecosystem for first party platform exclusives that interest me, and not which one performs a slight bit better than the other.
I agree, but it is nice to see Sony have such a well-designed, easy-to-code-for platform this time around, and it's laughable to see MS try to compete in that regard. They should just keep flexing their financial muscle; that is how they'll win. 150MHz is not.
 
Are you going to educate me or just call me names?

To summarise:

These aren't last-minute changes; they just announced them now.

The fan/heatsink on the XB1 is very big. There is a lot of room in there to prevent a repeat of the RROD.

If you look at the last line of Xboxes, they were very reliable. The Xbox One has been tested, and they have designed it to run quiet and cool; that is also why there is so much space inside.

Although it's impossible to say for sure, it's obvious that RROD will not happen on a widespread level, as that would be very damaging to the company.

I don't think you have to let reliability affect your purchase. If you don't want an Xbox, don't get one; if you do, then don't let the reliability of a different system, created 7 or so years ago, affect your choice.
 
I know what you are doing, but again, this is not how the XB1 architecture works. The CPU advantage starts at 9.38% and could end up over 31% in certain cases.

You sound very certain. What if the PS4's OS 'uses' only 1 core, and audio-related tasks can also be off-loaded to a custom chip? In that case, you'd have 6x1.75=10.5 for the Xbox vs. 7x1.6=11.2 for the PS4, i.e. Sony's console would still have a ~6.7% advantage over the Xbox despite the upclock.

Maybe let's hold off on categorical statements until we know the specs of both systems 100% and hear devs' perspectives as well?
 
Not really. 150MHz isn't much of a boost. Golf clap for MS, though; I guess their cooling won't be shit like the 360's.

150MHz x 8 cores isn't much of a boost? Are you sure about that?

It's a good boost; it'll be even more relevant later in the new console cycle.
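
Though it's worth keeping in mind that "150MHz x 8 cores" and "9.4%" describe the same bump; multiplying across cores makes the absolute number bigger but doesn't change the relative gain. Simple arithmetic, using the XB1's own 1.6GHz starting clock:

```python
# "150 MHz x 8 cores" = 1.2 GHz of aggregate clock, which sounds large,
# but the relative gain is the same ~9.4% per core or across all eight.

old_ghz, new_ghz, cores = 1.6, 1.75, 8  # XB1's original and new CPU clocks
print(f"Aggregate gain: {(new_ghz - old_ghz) * cores:.1f} GHz")  # 1.2 GHz
print(f"Relative gain:  {new_ghz / old_ghz - 1:.1%}")            # 9.4%
```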
 
You're trying to insinuate something that I don't think took place, so I think it would be fair to ask for evidence.

The CPU was assumed to be the same, in addition to being less important for graphics, so the vast majority of comparisons were about the GPU capabilities and memory system.

But you just stated the evidence I need. The CPU was assumed to be the same and accepted. Now that the XB1 CPU is 'greater', it seems we can't assume anymore based on these posts:

Yes, we don't know the PS4 CPU clock speed.

We don't know what it is. Only what was rumored.

And not related:

Are you going to educate me or just call me names?

If you have read the thread, or have dealt with any type of product roll-out or manufacturing related to digital goods, you know this type of increase isn't reactionary. They didn't look at the PS4 and go:

"Scotty, I need more megahurtz!"
"I can't do it captain, all I got is maybe 10%"
"Then do it! The PS4 warship is heading right for us, QUICK MAN!"
 
I don't think you have to let reliability affect your purchase. If you don't want an Xbox, don't get one; if you do, then don't let the reliability of a different system, created 7 or so years ago, affect your choice.
But it was created by the same company.

In my 20 years as a gamer, with more than 25 consoles, so far only hardware from Microsoft has died on me (and I take great care of my hardware): the HDD of the first Xbox (although that wasn't made by MS, the console as a whole is an MS product), the notoriously shitty wireless 360 controller adapter for PC, and of course we all know what happened with the 360.

Why should we trust them now? It's a company that either doesn't care about reliability or can't design good-quality hardware.


If you look at the last line of Xboxes, they were very reliable.
After how many revisions? And how many years? And how much time and money wasted by gamers?
 
If I had more shaders, faster GPU memory, and faster system memory, on the other hand, that would probably help a lot, even with games that demand a lot from the CPU.


It isn't practical to dissect it like that and ask which one is better. It's barely a net positive, but a positive nonetheless, sure. But Sony has more things there to help make games look and play better. A raw 150MHz overclock on the CPU side with no further optimizations to the GPU, no extra memory bandwidth, etc. is not enough to outclass PS4, but to a lot of people that is what it is going to look like.


I'm not in a race here. I am just calling a spade a spade; this boost is designed to look like something that gives Xbone an edge when it barely does anything; a small 'net positive'. That's it. As I said I am not getting a new GPU anytime soon, I'd much rather wait and see how next-gen plays out after a year or so and buy a PS4, or reluctantly buy an Xbone if it happens to take over the world.


I agree, but it is nice to see Sony have such a well-designed, easy-to-code-for platform this time around, and it's laughable to see MS try to compete in that regard. They should just keep flexing their financial muscle; that is how they'll win. 150MHz is not.

I'm not even sure why I bothered in the first place.

Look, it's pretty simple here, guys. Not sure why this is a 40-page thread.

The Xbox has: a better CPU and better audio hardware. This will help some games in some aspects more (one example would be the number of units on screen).
The PS4 has: a better GPU and a better memory subsystem. This will help more in games and aspects that are GPU-bound (one example would be the on-screen visuals).

This isn't rocket science. It's a net positive for the Xbox, albeit a small one.
 
I know what you are doing, but again, this is not how the XB1 architecture works. The CPU advantage starts at 9.38% and could end up over 31% in certain cases.

Stahp plz! Kayle, we know you love the X1, but this is silly, bro. The X1 is outperformed by the PS4 in any graphical benchmark; everyone knows that. It may (most likely will) outperform it in both audio processing and CPU ability. Games will look kind of similar (3rd party) and a fair bit better on first party. Who gives a shit? Bring on the games.

Software sells hardware
 