Or, you know, 28fps to 30.
Congratulations. The Xbone can play games at 30fps in 2013 according to StevieP, who assumes so for the sake of argument. Next generation is HERE!
Seriously, man. There's no reason any game should run <30fps, wimpy clock boost or not. If it is running that slowly, there's a bigger problem, and it lies with the devs. You are being a bit... paranoid in assuming everything is going to run between 25-38 fps this gen or something.
You're missing the point entirely. The reason Dolphin was mentioned was as an example of something requiring more CPU power. Just like some games require more from your CPU, depending on the load and what they're doing. There are recent RTS games and FPS games (one I've mentioned - Planetside 2) that require a VERY beefy CPU to keep up, with less emphasis on the GPU than other titles (of which I also listed an example - Crysis). Each game demands different things than other games, depending on its type, scope, and what's happening on screen. In some cases, a more powerful GPU will help more. In other cases, a more powerful CPU will help more. Yes, we're talking about a barely-there increase in this thread. But it's a net positive for what's already a weak CPU.
I'm not missing the point at all; your Dolphin comparison was completely invalid and has absolutely no place in this debate whatsoever. You are wrong. It's not practical to even list Dolphin as an example of a game needing more power, because it is emulation of a game never intended for anything but a GameCube/Wii, and it requires high single-core performance, game hacks, etc. to run optimally. I understand everything else you are trying to say about extra clock cycles being needed on either the CPU or GPU side depending on the game; you don't have to lecture me as though I've never played a PC game that demands better CPU or GPU performance than I currently have. But I can tell you that even overclocking by an extra 400MHz or more barely does anything for performance. And going from 29fps to 30fps in some instances is not as life-changing as you suggest.
If I had more shaders, faster GPU memory and faster system memory, on the other hand, that would probably help a lot, even with games that demand a lot from the CPU.
But you are silly to bring in this whole lecture over an overall 200MHz boost in system clock speed. It's not going to do anything significant. What the Xbone needs is more memory bandwidth, more shaders, more ROPs, a significantly higher CPU clock (i.e. 2.5-3.0+ GHz, which both it and the PS4 are lacking), etc., if it wants a legitimate performance boost that is really going to stand out.
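For reference, here is the back-of-envelope arithmetic behind that "not going to do anything significant" claim, as a minimal sketch. The exact figures (1.6GHz to 1.75GHz on the CPU, 800MHz to 853MHz on the GPU) are the commonly reported numbers for the boost, not something established in this thread, so treat them as assumptions:

```python
# Back-of-envelope look at the clock bump, using the commonly reported
# figures as assumptions: CPU 1.6 -> 1.75 GHz, GPU 800 -> 853 MHz.
cpu_old, cpu_new = 1.60, 1.75   # GHz
gpu_old, gpu_new = 800, 853     # MHz

cpu_gain = (cpu_new / cpu_old - 1) * 100   # ~9.4% more CPU clock
gpu_gain = (gpu_new / gpu_old - 1) * 100   # ~6.6% more GPU clock

# Frame-time view of "29fps vs 30fps": the actual per-frame difference.
ms_per_frame_29 = 1000 / 29   # ~34.5 ms
ms_per_frame_30 = 1000 / 30   # ~33.3 ms

print(f"CPU clock gain: {cpu_gain:.1f}%")
print(f"GPU clock gain: {gpu_gain:.1f}%")
print(f"29 vs 30 fps: {ms_per_frame_29 - ms_per_frame_30:.1f} ms per frame")
```

Even in a perfectly CPU-bound scenario, a ~9% clock bump takes a 29fps game to a little over 31fps at best; real games are rarely that purely CPU-bound, so the practical gain is smaller still.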
Nothing is coded to any metal anymore. This saying really needs to die in a fire. Its benefits were already vastly overstated to begin with.
No kidding. But I meant that the games on Xbone are designed and optimized to run on Xbone, however CPU and/or GPU demanding they may be, and thus comparing emulation of Wii games on an unintended platform is... poppycock.
No, it's not essentially nothing. It's essentially something, just not very much of something. It's a positive.
It is not worth pages and pages and pages of discussion. It's a net positive, like having $1,000 in my checking account and, unlike yesterday, an extra $10 bill in my wallet on top of that.
Nothing's going to be as bad as 580p without even so much as blurry post AA, sure. I agree with that.
Could've fooled me for a second there.
But people are going to start getting pissed when they see some games at 720p/30. It isn't going to take long. It's not laziness at all. Developers are the opposite of lazy the majority of the time. It's a choice that is made to maximize eye candy. There is no platform (except PC) where you can have a consistent resolution and framerate to your liking. On the consoles, developers choose the priorities. Not you. And as you've seen, many of them prefer eye candy at the expense of literally everything else. This will continue no matter the console.
And as I have maintained, there is enough power on both platforms to ensure that we won't be locked into 720p/30 hell or what have you. I agree that many times it will be a choice and not laziness, but generally speaking, the games that stick out with exceptionally poor performance will come down not to dev "choice" but to shitty/lazy coding, and given the fairly open nature of both platforms, it will probably happen because it'll be easier for inexperienced devs to try something. On the flip side of the coin, because both platforms are essentially PCs with some extra optimizations, they should be extremely easy to develop for compared to last time, if the dev has any semi-credible level of knowledge of how to design a game. That, combined with more powerful hardware to begin with, should prevent a lot of games from getting "locked" at a shitty fps. FWIW, some games look just fine at a constant 30. It just depends on the game.
There is nobody here that I am aware of stating that the Xbone is the better-spec'd machine, especially on the GPU side. However, if the Sony Jaguar is at 1.6GHz and the MS Jaguar is at 1.75GHz, which one is better? It's a small net positive. That's all this is.
It isn't practical to dissect it like that and ask which one is better. It's barely a net positive, but a positive nonetheless, sure. But Sony has more things there to help make games look and play better. A raw 150MHz overclock on the CPU side with no further optimizations to the GPU, no extra memory bandwidth, etc. is not enough to outclass PS4, but to a lot of people that is what it is going to look like.
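To put rough numbers on why a CPU clock edge alone doesn't offset what Sony has on the GPU side, here is a minimal sketch. The shader counts and clocks (768 ALUs at 853MHz vs 1152 ALUs at 800MHz) are the commonly cited public specs, assumed here rather than taken from anything in this thread:

```python
# Rough shader-throughput comparison, assuming the commonly cited specs:
# Xbone: 768 shader ALUs @ 853 MHz; PS4: 1152 shader ALUs @ 800 MHz.
# Each ALU is counted as 2 FLOPs per cycle (fused multiply-add).
def gflops(alus, mhz):
    return alus * 2 * mhz / 1000.0

xbone = gflops(768, 853)   # ~1310 GFLOPS
ps4   = gflops(1152, 800)  # ~1843 GFLOPS

print(f"Xbone: {xbone:.0f} GFLOPS, PS4: {ps4:.0f} GFLOPS")
print(f"Gap: {ps4 - xbone:.0f} GFLOPS ({(ps4 / xbone - 1) * 100:.0f}% in PS4's favour)")
# A ~9% CPU clock advantage on one side does nothing to close a ~40% raw GPU gap.
```

That roughly half-a-teraflop gap is where the "500gf difference" mentioned below comes from.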
No, I don't get to tell you anything. But it's worth repeating that if you're this invested in the difference between two consoles, it's probably best you invest more into the rig you're sporting. There are GPUs releasing that offer double the raw performance of the console for around the same price, for example. A 2tf difference is a lot more than a 500gf difference (hence what seems like hypocrisy here). If you want the best framerates/IQ/etc. as you've been saying, you're already in second place by purchasing these things before the consoles even release.
I'm not in a race here. I am just calling a spade a spade: this boost is designed to look like something that gives the Xbone an edge when it barely does anything; it's a small "net positive". That's it. As I said, I am not getting a new GPU anytime soon; I'd much rather wait and see how next gen plays out after a year or so and buy a PS4, or reluctantly buy an Xbone if it happens to take over the world.
As always (for me at least), purchasing the consoles is a way into their ecosystems for the first-party platform exclusives that interest me, not a matter of which one performs slightly better than the other.
I agree, but it is nice to see Sony have such a well-designed, easy-to-code-for platform this time around, and it's laughable to see MS try to compete in that regard. They should just keep flexing their financial muscle; that is how they'll win. 150MHz is not.