DF: Xbone Specs/Tech Analysis: GPU 33% less powerful than PS4

What the hell is going on?

What are the actual figures?

I understand the maths you guys are explaining, but no one seems to have clarified what the actual figures are if it isn't 50% or whatever.


Sony PS4: 1.8 Teraflops of GPU grunt (confirmed)

MS XBone: 1.2 Teraflops of GPU (unconfirmed, but the sources were accurate for the PS4 and everything surrounding the XBone reveal; that, coupled with MS's reluctance to talk specifics, makes the rumours pretty much confirmed)
 
The GPU advantage the PS4 enjoys is nice. But where the PS4 absolutely murders the Xbone is on sheer bandwidth to main memory. 176 GB/s versus 68 GB/s is more or less night and day for developers. We're talking about situations where the same game using the same assets might be forced to use FXAA on the Xbone but can use MSAA on the PS4 because of the sheer bandwidth advantage.

I think in immediate visual benefit terms, the customizations Cerny talked about that they did with the PS4 GPU, and the GPU itself, will be the biggest differentiator between the two consoles when comparing graphical fidelity of games.

Details here, first bolded part:

http://www.neogaf.com/forum/showthread.php?t=532077
 
Perhaps you should read the Digital Foundry article then?
Or this thread about the article?

Smart arse.
Sony PS4: 1.8 Teraflops of GPU grunt (confirmed)

MS XBone: 1.2 Teraflops of GPU (unconfirmed, but the sources were accurate for the PS4 and everything surrounding the XBone reveal; that, coupled with MS's reluctance to talk specifics, makes the rumours pretty much confirmed)
Ah, so it is 33% then. Thanks.
 
I think in immediate visual benefit terms, the customizations Cerny talked about that they did with the PS4 GPU, and the GPU itself, will be the biggest differentiator between the two consoles when comparing graphical fidelity of games.

Details here, first bolded part:

http://www.neogaf.com/forum/showthread.php?t=532077

That's an interesting read. I think for first-party games this might be quite an interesting advantage for the PS4. Sony presently has a huge first-party stable of studios which can make use of these customizations. That will certainly help the PS4 for Sony's own exclusive titles.

But for the 3rd party multi-plat titles, they'll probably dev on the Xbone as the baseline and port up to the PS4. So for most of those games, the biggest difference will probably be AA level and maybe a few LoD differences, like twiddling the settings in PC games between Medium and High.
 
Couldn't find any mention of the OS's RAM usage.

Where did the "3GB of RAM reserved for OS" rumour come from? Speculation? Based on what?
 
I think in immediate visual benefit terms, the customizations Cerny talked about that they did with the PS4 GPU, and the GPU itself, will be the biggest differentiator between the two consoles when comparing graphical fidelity of games.

Details here, first bolded part:

http://www.neogaf.com/forum/showthread.php?t=532077

No, it will be the bandwidth and sheer higher number of shader cores.

The things Cerny talked about are to make compute a bit easier.
 
Nah, the interesting part is that it'll most likely look even better at launch, with extra polish and development on final dev kits. Sony first-party games will blow console-gamer minds graphics-wise. Belieee that.

The only games that will look unimpressive are EA Sports titles. Everything else will be what I expect it to be.
 
The 3 OSes need a GB each, or something like that.

I think 5GB is more than enough for games. Some people don't.
I'm sure 5GB of DDR3 is plenty; after all, 512MB has been plenty for years. But when your competitor has 8GB of GDDR5 (minus a gig or so for the OS) it starts to look a little lacklustre.
 
Or this thread about the article?

Smart arse.

Ah, so it is 33% then. Thanks.

It makes more sense to say 50%.

If nVidia releases a new graphics card that performs 50% better than their last, most people don't say "nVidia's old card performs 33% worse than their new card". They say "nVidia's new card performs 50% better than their old card". It makes more sense to use the low number as the baseline when comparing.
 
When the other console has an extra 2GB available, it is not enough.

What have you or anyone else seen to make this claim?

What if, at E3, Sony says they need about 2GB for their OS?

Then it's 5 vs 6. It'll just be Heavenly Sword again, arguing over how many NPCs we can fit on the screen.
 
It'll probably look like that. But until somebody is playing that at an E3 demo station, I would hold off. Considering the past...transgressions.

They were playing it on Jimmy Fallon. And Jimmy played it, not some GG guy carefully acting out a predefined cutscene - it was clearly realtime and controllable.

I think 50% is a good overall figure for the improvements:

50% more GPU power
almost 50% more RAM
more than 50% more main memory bandwidth (but that'll be mitigated by the eSRAM on Xbox; plus, with a less powerful GPU you don't need the higher bandwidth anyway)
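The "50% more" claims above are easy to sanity-check against the figures quoted in this thread (the RAM split and the XBone teraflop number are still rumoured, so treat these inputs as the thread's claims, not confirmed specs):

```python
# PS4-vs-XBone advantage using XBone as the baseline, with the
# (partly rumoured) numbers quoted in this thread.
specs = {
    # metric: (PS4, XBone)
    "GPU (TFLOPS)":              (1.8, 1.2),
    "game RAM (GB, rumoured)":   (7.0, 5.0),
    "main memory b/w (GB/s)":    (176.0, 68.0),
}

for metric, (ps4, xbone) in specs.items():
    advantage = (ps4 - xbone) / xbone * 100  # percent more than XBone
    print(f"{metric}: PS4 has {advantage:.0f}% more")
```

Which comes out to 50% more GPU, 40% more game RAM, and roughly 159% more raw main memory bandwidth (before accounting for the eSRAM).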
 
It makes more sense to say 50%.

If nVidia releases a new graphics card that performs 50% better than their last, most people don't say "nVidia's old card performs 33% worse than their new card". They say "nVidia's new card performs 50% better than their old card". It makes more sense to use the low number as the baseline when comparing.

Exactly.
 
Even just an extra GB is nothing to joke about.

Until a game developer shows us the improvement within their game engine, we really won't know.

If a team uses all 5GB on an Xbox One, or something close to that, we can't understand what a 25 or 50 percent 'boost' is supposed to look like.*

*Multiplatform game
 
Considering basically all the rumors have been correct, I don't know why people think the PS4's OS might suddenly need an extra GB. Especially when considering how late the RAM bump came. There's no way they were ever planning on only having 2GB for games.
 
They were playing it on Jimmy Fallon. And Jimmy played it, not some GG guy carefully acting out a predefined cutscene - it was clearly realtime and controllable.

I think 50% is a good overall figure for the improvements:

50% more GPU power
almost 50% more RAM
more than 50% more main memory bandwidth (but that'll be mitigated by the eSRAM on Xbox; plus, with a less powerful GPU you don't need the higher bandwidth anyway)

Haven't been on the gaming side in a minute. Forgot about that exhibition.
 
The 3 OSes need a GB each, or something like that.

I think 5GB is more than enough for games. Some people don't.

However much RAM you have is always 'enough' and also 'not enough'. As a developer you cut your cloth to fit, but you'll always want more.

Arguably, MS's machine is nicely balanced. For the GPU power it has, you don't need massive external bandwidth, and 5GB of RAM is probably enough - it may not be able to use much more effectively.

The PS4, on the other hand - when it was originally 4GB - could be argued to be unbalanced: the GPU would have been able to take advantage of more RAM. Now that it's 8GB it is better balanced. And as the GPU is more powerful and the bandwidth higher, it will be more capable of using that extra RAM.

So I think with the current specs both machines are fairly balanced, just that the PS4 is balanced at a higher level :)
 
Considering basically all the rumors have been correct, I don't know why people think the PS4's OS might suddenly need an extra GB. Especially when considering how late the RAM bump came. There's no way they were ever planning on only having 2GB for games.

Well, I wouldn't be surprised if part of the 8GB deal was to give the OS a little bit more RAM. But I would be very surprised if Sony designed a bloated OS; it's insane to think that the OS needs more than 1 to 1.5GB of RAM.
 
Considering basically all the rumors have been correct, I don't know why people think the PS4's OS might suddenly need an extra GB. Especially when considering how late the RAM bump came. There's no way they were ever planning on only having 2GB for games.

Precautionary measure. 512MB may have seemed like plenty beforehand but who knows if it will later on, so why not? 7GB is still twice as much as developers first expected and 2GB more than XBO devs will have.

Oh and I agree with moniker. PS4 being 50% more powerful than the XBO makes far more sense as a comparison than the other way around.
 
So... is there a possibility of a 1GHz GPU in the XBOne (and PS4) or not? I read (don't know if it was here) that it's not possible with Jaguar.
 
(gif: ibkLh2II7UZlqe.gif)


Still blows me away every time I see it. That was the next-gen 'wow' moment right there. The moment that definitely wasn't at the Xbox One reveal.

And it uses only about half of the PS4's total RAM, since Sony bumped up the specs. Are you fucking kidding me, Sony/GG? o_O
 
So... is there a possibility of a 1GHz GPU in the XBOne (and PS4) or not? I read (don't know if it was here) that it's not possible with Jaguar.
What you read is probably from Anandtech, specifically this bit:
Anandtech said:
Microsoft can’t make up the difference in clock speed alone (AMD’s GCN seems to top out around 1GHz on 28nm), and based on current leaks it looks like both MS and Sony are running their GPUs at the same 800MHz clock.
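For what it's worth, the 1.2/1.8 TF figures fall straight out of that 800MHz clock and the leaked shader counts (768 shaders / 12 CUs for the XBone, 1152 / 18 CUs for the PS4 - those counts are from the same rumours, not official). Peak single-precision FLOPS for GCN is just shaders × clock × 2, since each shader can do one fused multiply-add (two floating-point ops) per cycle:

```python
def gcn_peak_tflops(shaders, clock_ghz):
    # Peak single-precision throughput: each shader retires one
    # fused multiply-add (2 FLOPs) per clock.
    gflops = shaders * clock_ghz * 2
    return gflops / 1000  # GFLOPS -> TFLOPS

print(f"PS4:   {gcn_peak_tflops(1152, 0.8):.2f} TF")  # ~1.84 TF
print(f"XBone: {gcn_peak_tflops(768, 0.8):.2f} TF")   # ~1.23 TF
```

So the rumoured "1.8 vs 1.2" is really 1.84 vs 1.23, rounded down, and the whole gap comes from the 18-vs-12 CU count at an identical clock.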
 
XboxOne GPU is 250% more powerful than the one in the WiiU. Or the WiiU is 70% less powerful if you prefer.

(numbers for WiiU unconfirmed)

Edit: oh god I called out people for not knowing math and I made a basic mistake.

How does this bode for Wii U down ports?
 
Until a game developer shows us the improvement within their game engine, we really won't know.

If a team uses all 5GB on an Xbox One, or something close to that, we can't understand what a 25 or 50 percent 'boost' is supposed to look like.*

*Multiplatform game
Go play some PC games then. The hardware here is pretty damn similar and there are more than enough games out there now that will take advantage of a GTX 680 with 4GB of GDDR5 in collaboration with 8GB of DDR3 (with about 1-2 GB taken up by the OS).

That's the kind of gap you're talking about here, just not fully optimized. On PS4 and XB1 that difference will be optimized for each experience. It makes no sense to just hand-wave a 2GB difference and say it doesn't matter until you see it.

Well, I wouldn't be surprised if part of the 8GB deal was to give the OS a little bit more RAM. But I would be very surprised if Sony designed a bloated OS; it's insane to think that the OS needs more than 1 to 1.5GB of RAM.

I'd assume they just straight doubled it. I wouldn't even be surprised if the OS memory is effectively "discrete" from the rest of the system, so going from 256MB chips to 512MB chips would inherently have doubled it even if Sony wasn't 100% positive they'd need all of it.

Sony got caught with their pants down on OS utilization last generation and had to make a lot of sweeping improvements to remain competitive. I wouldn't be surprised at all if they're future-proofing the PS4's OS allocation when they can do so and still maintain great hardware symmetry and a competitive advantage over MS.
 
It makes more sense to say 50%.

If nVidia releases a new graphics card that performs 50% better than their last, most people don't say "nVidia's old card performs 33% worse than their new card". They say "nVidia's new card performs 50% better than their old card". It makes more sense to use the low number as the baseline when comparing.
But it's not comparing PS5 to PS4 or 360 to Xbone, so your analogy doesn't fit. It's comparing the new number (for the Xbone) to the already known numbers (of the PS4), so doing the 33% one makes sense. Of course, both are perfectly viable depending on the point of view.
 
How does this bode for Wii U down ports?

Wii U is to Xbone/PS4 in the same way Wii was to 360/PS3. A different generation in terms of hardware.

The Wii U is incredibly doomed at this point from the standpoint of 3rd party development, in much the same way the Wii was.
 
But it's not comparing PS5 to PS4 or 360 to Xbone, so your analogy doesn't fit. It's comparing the new number (for the Xbone) to the already known numbers (of the PS4), so doing the 33% one makes sense. Of course, both are perfectly viable depending on the point of view.
Also true.

It makes sense to start from the newest product. E.g. our new car has 33% less CO2 emission, rather than: our previous car model had 50% more CO2 emission.

Big numbers sound better though :p
 
Just thought I'd add my math lesson to the pile...

There is a .6 teraflop difference between the PS4 and XB1.

.6 is 33% of 1.8
.6 is 50% of 1.2

.6 less than 1.8 is 33% less
.6 more than 1.2 is 50% more
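That whole 33%-vs-50% asymmetry is just a question of which number you divide by; a quick sanity check of the math lesson above:

```python
ps4, xbone = 1.8, 1.2   # teraflops, as quoted in this thread
diff = ps4 - xbone       # 0.6 TF either way

# Same 0.6 TF gap, two different baselines:
print(f"XBone is {diff / ps4:.0%} less powerful than PS4")    # 33% (PS4 as baseline)
print(f"PS4 is {diff / xbone:.0%} more powerful than XBone")  # 50% (XBone as baseline)
```

Both statements are correct; they just answer "less than what?" and "more than what?" with different denominators.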
 
M°°nblade;58774421 said:
Also true.

It makes sense to start from the newest product. E.g. our new car has 33% less CO2 emission, rather than: our previous car model had 50% more CO2 emission.

Big numbers sound better though :p

I still don't understand it after thinking about it for a few hours. Well then, I was always shit at maths in school.
 
What the hell is going on?

What are the actual figures?

I understand the maths you guys are explaining, but no one seems to have clarified what the actual figures are if it isn't 50% or whatever.

GAF isn't very good at math, unfortunately.

Here's my attempt.

PS4 = 1.8 TF
XBO = 1.2 TF

Difference = 1.8 - 1.2 = 0.6 TF difference .. So by what percentage is PS4 (GPU) better?

0.6/1.8 * 100 = 33% MOAR FLOPS

If you compare both GPUs directly you get 1.2:1.8 = 2:3 POWER RATIO

TLDR; PS4 has a GPU that can output 33% more than XBO .. Is that a huge difference? Only the games will tell us. But there are many other factors like CU, RAM etc
 
M°°nblade;58775669 said:

Shit yes you are right, comparing the FLOPS of XBO it is 50%, I was working backwards from PS4's number.. then there's the rest of the hardware.

What do we know about the CPU and Audio processing for both machines? Are they the same?
 