DF: Xbone Specs/Tech Analysis: GPU 33% less powerful than PS4

It's not more RAM, but it is higher-bandwidth RAM with more latency. It's a distinct advantage, but that doesn't mean it hasn't been turned into a stupid meme.
Please tell me the latency of DDR3-2133 vs. GDDR5-5500, in nanoseconds. You will notice that the latency differences are negligible.
 
Theoretical FLOPS scale 100% with clock.

Just a pity actual performance doesn't. Especially since we have actual performance figures for GPUs which are similar to those found in these consoles:
The 7790, which is slightly more powerful than the XBone GPU (two more CUs and a 200MHz clock bump),
And the 7850, which is slightly less powerful than the PS4 GPU (two fewer CUs).

In those benchmarks the 7850 shows on average > 30% more performance than the 7790 across the board.
Further to that, the 7850 is clocked nearer to 800MHz while the 7790 runs at 1000MHz.

Sure, I'm leveraging inference, but the real numbers speak for themselves.
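For anyone who wants to sanity-check the FLOPS claim, here's a rough sketch of the standard GCN math (CUs × 64 shaders × 2 ops per clock × clock). The CU counts and clocks are just the figures being thrown around in this thread (rumored console clocks, reference card clocks), so treat them as illustrative rather than official:

```python
# Rough sketch: theoretical GCN throughput = CUs * 64 shaders * 2 ops/clock * clock.
# CU counts and clocks below are the numbers discussed in this thread, not official specs.

def theoretical_tflops(cus, clock_mhz):
    return cus * 64 * 2 * clock_mhz * 1e6 / 1e12

gpus = {
    "XBone GPU (12 CUs @ 800 MHz)":  (12, 800),
    "PS4 GPU   (18 CUs @ 800 MHz)":  (18, 800),
    "HD 7790   (14 CUs @ 1000 MHz)": (14, 1000),
    "HD 7850   (16 CUs @ 860 MHz)":  (16, 860),
}

for name, (cus, clock) in gpus.items():
    print(f"{name}: {theoretical_tflops(cus, clock):.2f} TFLOPS")
```

On paper the 7850 actually comes out slightly below the 7790, yet it benchmarks well ahead of it, which is exactly why theoretical FLOPS alone don't settle this.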
 
So what consequences will MS and Sony both going with AMD Jaguar have for PC gamers? I really know nothing about CPU and GPU architecture or about software development, but for example, would one suddenly be better off with an AMD Bulldozer as compared to an Ivy Bridge processor? Also, will there be a sudden jump in PC requirements for maxing the settings on next-gen multiplats, or will the more modern architecture of the next-gen consoles actually make it easier on PCs?
 
Can someone explain the meaning of "coding to the metal"?

As I understand it, that means writing code that talks directly to the processor without any middleware/engine to consider. Is this correct? Do games coded down to the metal not have an engine to speak of? What games have been coded to the metal?

It's always intrigued me, but I've never had a full understanding of the concept.
It's using every ounce of power that a specific architecture provides instead of finding the middle ground between thousands of PC configurations.

As an aside, it's also possible to literally code to metal: you can have code synthesized directly into silicon (or onto an FPGA).
 
It's not more RAM, but it is higher-bandwidth RAM with more latency. It's a distinct advantage, but that doesn't mean it hasn't been turned into a stupid meme.

You seem really irritated by these people who are excited about the PS4 RAM. Is there a reason this is so irritating? 8 gigs of GDDR5 was a rather unprecedented announcement, and still is. There isn't a single mass market GPU that has that much RAM on it.
 
Is it more, faster RAM or is it not?

This isn't like Cell where there's this unknown hidden potential.

Potential comes from the customized Compute functionality this time around in the PS4 GPU. Killzone: SF only uses a fraction of this functionality, according to those recent GG tech slides.
 
So what consequences will MS and Sony both going with AMD Jaguar have for PC gamers? I really know nothing about CPU and GPU architecture or about software development, but for example, would one suddenly be better off with an AMD Bulldozer as compared to an Ivy Bridge processor? Also, will there be a sudden jump in PC requirements for maxing the settings on next-gen multiplats, or will the more modern architecture of the next-gen consoles actually make it easier on PCs?

There will be no consequences for the PC gamer using an Intel CPU. The main thing we should see is more threads being used in PC ports. Right now, we typically see ports using 1-2 cores, and the next-gen systems having 8 cores will definitely increase the amount of CPU power needed.

I don't believe this will translate to AMD > Intel in the PC world. Intel's CPUs are pretty beastly and should handle the tasks thrown at them with no issues. They're powerful and very efficient, so anybody with an i5/i7 quad or higher shouldn't be worried at all, especially since those Intel CPUs will be running at a clock speed 2-3 times faster than the consoles while also boasting significantly better performance per clock cycle.
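To put a rough number on that, here's a pure back-of-envelope sketch: "throughput units" as cores × clock × relative per-clock performance. The 1.6GHz Jaguar clock is the rumored console figure and the 2x per-clock advantage for a desktop Ivy Bridge core is an assumption for illustration, not a measured benchmark:

```python
# Back-of-envelope only: "throughput units" = cores * clock (GHz) * relative IPC.
# The 1.6 GHz Jaguar clock is the rumored console figure; the 2x IPC factor for
# a desktop Ivy Bridge core is an illustrative assumption, not a benchmark result.

def throughput_units(cores, clock_ghz, relative_ipc):
    return cores * clock_ghz * relative_ipc

console_jaguar = throughput_units(cores=8, clock_ghz=1.6, relative_ipc=1.0)
desktop_quad   = throughput_units(cores=4, clock_ghz=3.4, relative_ipc=2.0)

print(f"8-core console Jaguar: {console_jaguar:.1f}")  # ~12.8
print(f"Quad-core desktop i5 : {desktop_quad:.1f}")    # ~27.2 under these assumptions
```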
 
Please tell me the latency of DDR3-2133 vs. GDDR5-5500, in nanoseconds. You will notice that the latency differences are negligible.

Sort of. In terms of single-application processing (even across multiple threads) the differences are negligible. In terms of multi-application/execution processing the latency is a little more meaningful.

Honestly, MS' DDR3 is very well suited for the extra OS layer stuff above and beyond just playing the game. Likewise PS4's GDDR5 is very well suited for running a game as fast as possible. In the end it means that PS4 will be at a slight disadvantage over latency with various OS/social overhead stuff, while the XBONE will be at a slight disadvantage with memory bandwidth while churning out tons of textures and huge frames.
 
GPUs are less sensitive to latency than CPUs!

The fact that you did not know this gives you no right to abuse people!

DDR latency has been steadily going up for generations too. I've seen no push from Intel or AMD to lower RAM latencies other than CPU caches.


Sort of. In terms of single-application processing (even across multiple threads) the differences are negligible. In terms of multi-application/execution processing the latency is a little more meaningful.

Honestly, MS' DDR3 is very well suited for the extra OS layer stuff above and beyond just playing the game. Likewise PS4's GDDR5 is very well suited for running a game as fast as possible. In the end it means that PS4 will be at a slight disadvantage over latency with various OS/social overhead stuff, while the XBONE will be at a slight disadvantage with memory bandwidth while churning out tons of textures and huge frames.

I think cache misses and memory latencies are measured in clock cycles, which are orders of magnitude below the timescales of application smoothness and user interactivity.
 
DDR latency has been steadily going up for generations too. I've seen no push from Intel or AMD to lower RAM latencies other than CPU caches.

Caches are the way to try to offset it; short of replacing DRAM with something lower latency (which I'm not sure exists, and definitely not in a mass-market form), not much will fix it.
 
Xav de Matos on the new Joystiq podcast.


"the graphical capabilities are the same on the systems, except the ability to power that memory and speed up to next process is faster on the PS4".

"there have been no developers that have come out yet, to say that there are limitations vs one system's GPU to the next. It appears that the games themselves will be sort of similar in that vein, other than the DDR is faster".




The gaming media taking their notes from Microsoft PR.

Was this really said?
 
Seems like Sony has this one in the bag. The only thing that will kill them at this point is the price.

Multiplats will be made with the weaker console in mind anyway though.
 
Sort of. In terms of single-application processing (even across multiple threads) the differences are negligible. In terms of multi-application/execution processing the latency is a little more meaningful.
What does that have to do with anything? I'm strictly discussing the actual time in nanoseconds you can read/write to GDDR5 vs. DDR3, with the specific clocks in the PS4 and XBone. GDDR5 takes more cycles, but it's clocked more than twice as fast, so it's all a wash independent of what you're running.
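To make the nanoseconds point concrete, here's a quick sketch. The CAS cycle counts below are plausible placeholders rather than datasheet values; the point is that more cycles at a much higher clock comes out to roughly the same absolute time:

```python
# Illustrative only: absolute CAS latency (ns) = CAS cycles / clock (MHz) * 1000.
# The cycle counts here are placeholders for illustration, not datasheet numbers.

def cas_latency_ns(cas_cycles, clock_mhz):
    return cas_cycles / clock_mhz * 1000

# DDR3-2133: bus clock ~1066 MHz, CL14 assumed
print(f"DDR3-2133  CL14 @ 1066 MHz: {cas_latency_ns(14, 1066):.1f} ns")
# GDDR5-5500: command clock ~1375 MHz, CL15 assumed
print(f"GDDR5-5500 CL15 @ 1375 MHz: {cas_latency_ns(15, 1375):.1f} ns")
```

Both land in the low teens of nanoseconds, i.e. a wash.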
 
Sort of. In terms of single-application processing (even across multiple threads) the differences are negligible. In terms of multi-application/execution processing the latency is a little more meaningful.

Honestly, MS' DDR3 is very well suited for the extra OS layer stuff above and beyond just playing the game. Likewise PS4's GDDR5 is very well suited for running a game as fast as possible. In the end it means that PS4 will be at a slight disadvantage over latency with various OS/social overhead stuff, while the XBONE will be at a slight disadvantage with memory bandwidth while churning out tons of textures and huge frames.

The memory latency trade off for apps is very likely not going to make a difference. I'd hypothesize the user will be unable to detect any difference.
 
Pretty much.
This is a no-brainer; there's not really any discussion to be had aside from just how much more powerful the PS4 is.

The PS4 has general-purpose RAM that is faster than the specialized eDRAM of Xbone. It has the same CPU, but the Xbox will presumably gate away more of the CPU processing power for the three Operating Systems needed for their multi-tasking thing. GPU of the PS4 is straight-up 50% faster, and it has 50% more CUs.

PS4 spanks the Xbone back and forth, forever.

I'm getting the PS4, too, but there's going to be a lot of very disappointed people out there if they honestly believe that the PS4 has this significant power advantage over the next Xbox. The PS4 is the stronger console with the more advanced hardware, which means it most certainly will have an advantage in the large majority of cases.

However, the visual difference between games on the two machines will not be anywhere close to as significant as some people here believe. What you're more likely to see is simply a lower resolution on the Xbox One version, but more or less the same visual fidelity will be maintained. And people who claim they can tell the difference between resolutions without pixel counting or Digital Foundry are simply not believable. Even outside of resolution, most people will be hard-pressed to find differences between multiplatform games on these systems. They will need a ton of help from Digital Foundry and other places, and if that's what it takes for you to even spot any differences, then is it truly as significant as you think?

The PS4 also isn't better in all circumstances. The Xbox One actually has a latency advantage over the PS4, and that lower latency is a better fit for feeding data to the console's CPU. And clearly you don't understand enough about the way the OS will work if you somehow believe that its existence will give the PS4 an even further advantage. What's more, the Xbox One has a very powerful audio chip that appears, from what we know, to be quite a bit more advanced and powerful than anything inside the PS4. In other words, the Xbox One won't have to dedicate nearly as many CPU resources to audio as will be the case for the PS4. Audio was so intensive last gen that it often took up close to two full cores on the Xbox 360. Even if the entire 360 CPU had been dedicated to audio, it couldn't match what the Xbox One's audio chip can do.

The PS4 does indeed have the bandwidth advantage, but you also need to consider that there are simply fewer GPU execution resources on the Xbox One in the first place, so it may not necessarily require as much bandwidth. The Xbox One's eSRAM is not an eDRAM variant. It's 6T-SRAM, which is rather low latency. There are situations in which that low latency will be extremely helpful to Xbox One GPU performance. Again, the PS4 is the stronger console, and that's clear as day. I'm not arguing that. But you are going to be extremely disappointed if you're expecting light years of difference between the two in games. This isn't a Wii U-level gap. What I'm saying is that the PS4 isn't in another generational class from this new Xbox. They are in the same capability range, with the PS4 simply being more capable at higher resolutions. But if a developer simply lowers the resolution for the Xbox One version, or even goes 720p, where does that leave you? Just forum bragging rights against an Xbox One game that will still look nearly as good as or equivalent to the PS4 version but simply runs at a lower resolution? The Xbox One GPU is a slightly weaker 7790 with access to notably more graphics memory and bandwidth. The Xbox One isn't weak. It's just not stronger than the PS4. Big, big difference.

Developers are going to properly optimize their games on both machines. They won't just allow the Xbox One version to run like crap in comparison to the PS4 version. They won't just have the Xbox One version looking like crap in comparison to the PS4 version. Does anyone truly believe that the new Xbox won't have fantastic looking exclusives? The specs don't mean the games can't be incredible.
 

However, the visual difference between games on the two machines will not be anywhere close to as significant as some people here believe.
What you're more likely to see is simply a lower resolution on the Xbox One version, but more or less the same visual fidelity will be maintained. And people who claim they can tell the difference between resolutions without pixel counting or Digital Foundry are simply not believable. Even outside of resolution, most people will be hard-pressed to find differences between multiplatform games on these systems. They will need a ton of help from Digital Foundry and other places, and if that's what it takes for you to even spot any differences, then is it truly as significant as you think?
Who are you kidding? The visual fidelity delta will be more pronounced than 360 versus PS3. This is akin to GameCube --> Xbox. The Xbox will run its games at 900p with lower quality settings. If you think that is nothing to worry about, then hey, you lack the critical eye.

Like I've said before, if a PS4 game is 1080p 30 FPS at Ultra, it will be 900p 30 FPS at High on Xbox. First-party titles will only magnify the (more available) RAM, bandwidth, and GPU advantages. Microsoft bet against the hardcore gamer.
 
I don't think you understand the point I'm trying to make. Did people enjoy DMC any less because it didn't look as good as Ninja Gaiden? I don't believe so, at least I didn't and I'm sure the same can be said for many here at GAF.

However, now a smaller gap is suddenly a big deal to many here, and I'm sure it'll affect their opinion of Xbone games even though they could still be quality games.

You're telling me this should be considered normal?

I don't know why you feel it necessary to come into a thread meant to discuss a Digital Foundry article about comparing console specs and then complain that people are discussing console specs too much.

It seems highly defensive to take the tired line of "it's about the gameplay guys!" in a thread clearly titled to indicate what its focus is to be.
 
By the way, for people who keep harping on the low latency of the eSRAM: the CPU cannot see it. The eSRAM is only connected to the GPU, as far as we know from the leaks. Keep this in mind.

The latency difference between GDDR5 and DDR3 is probably negligible, at least going by one benchmark I have seen. But this ignores the fact that the PS4 has special hardware to help with this: the "Onion" and "Garlic" paths.

http://www.sisoftware.net/?d=qa&f=gpu_mem_latency

[image: gp_lat_global.png]
 
I don't understand why people are saying that the PS4 will likely end up just having higher resolutions or something. This is talked about as opposed to framerate increases because the PS4 isn't twice as powerful. But the jump from 720p to 1080p also requires about 2x the resources. What two resolutions do people have in mind?
 
I don't understand why people are saying that the PS4 will likely end up just having higher resolutions or something. This is talked about as opposed to framerate increases because the PS4 isn't twice as powerful. But the jump from 720p to 1080p also requires about 2x the resources. What two resolutions do people have in mind?

1080p isn't necessarily double the resources. If a game runs at 30 FPS at 1080p, it won't magically run at 60 FPS at 720p. It depends on engines, and how games are designed. However, there is a good chance that if you see a game running at 30 FPS 1080p on Xbone, it will run at 45-50 FPS on PS4, all things being equal. Which is NOTHING to scoff at.
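If you assume a frame is purely GPU-limited (which real games rarely are, so treat this as a best case), the math behind that 45 FPS estimate is just frame time scaling inversely with GPU throughput:

```python
# Naive sketch: if a frame were purely GPU-limited, frame time would scale
# inversely with GPU throughput. Real games rarely are, so this is a best case.

def scaled_fps(base_fps, gpu_power_ratio):
    base_frame_time_ms = 1000.0 / base_fps
    return 1000.0 / (base_frame_time_ms / gpu_power_ratio)

print(f"{scaled_fps(30, 1.5):.0f} FPS")  # 30 FPS with ~1.5x the GPU -> ~45 FPS
```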
 
I don't understand why people are saying that the PS4 will likely end up just having higher resolutions or something. This is talked about as opposed to framerate increases because the PS4 isn't twice as powerful. But the jump from 720p to 1080p also requires about 2x the resources. What two resolutions do people have in mind?
Resolutions that are around 50% higher (or a bit less).

Something like 1440x1080 on XB1 and 1920x1080 on PS4.
Of course, there are many other possibilities that require minimal developer effort. Like the same resolution, but SMAA 4x on PS4 and SMAA T2x on XB1.
 
Resolutions that are around 50% higher (or a bit less)? Why is this hard to understand?

Something like 1440x1080 on XB1 and 1920x1080 on PS4.
Of course, there are many other possibilities that require minimal developer effort. Like the same resolution, but SMAA 4x on PS4 and SMAA T2x on XB1.

But 1440x1080 is an awkward resolution to display on a typical TV. Are games actually likely to end up rendering at resolutions which don't neatly fit on a 1920x1080 screen?

It's the other stuff - like the AA you talk about - that I more expect to see the PS4 do a better job on. That and not falling below 30 fps nearly as often.
 
You seem really irritated by these people who are excited about the PS4 RAM. Is there a reason this is so irritating? 8 gigs of GDDR5 was a rather unprecedented announcement, and still is. There isn't a single mass market GPU that has that much RAM on it.

I'm not irritated at all by the inclusion of GDDR5, I'm annoyed by the rampant meme-ification of it.

On a personal level, I think it's really cool that they went that route.
 
But 1440x1080 is an awkward resolution to display on a typical TV. Are games actually likely to end up rendering at resolutions which don't neatly fit on a 1920x1080 screen?
Considering the number of games which don't render at 720p this generation, yes, I don't expect all games to render at exactly 1920x1080 either on the upcoming systems, particularly on XB1.

Of course all of them will output 1080p (and hopefully render the UI at that resolution if developers have any sense).
 
But 1440x1080 is an awkward resolution to display on a typical TV. Are games actually likely to end up rendering at resolutions which don't neatly fit on a 1920x1080 screen?

It's the other stuff - like the AA you talk about - that I more expect to see the PS4 do a better job on. That and not falling below 30 fps nearly as often.

You wouldn't send that out to the TV; you'd scale it on the console to 1920x1080, so the TV would always see 1080p.
 
Considering the number of games which don't render at 720p this generation, yes, I don't expect all games to render at exactly 1920x1080 either on the upcoming systems, particularly on XB1.

Of course all of them will output 1080p (and hopefully render the UI at that resolution if developers have any sense).

Huh, I didn't know there were so many sub-720p games on consoles this time around. Googling around, I see you're right. Then yeah, objection retracted.
 
1080p isn't necessarily double the resources. If a game runs at 30 FPS at 1080p, it won't magically run at 60 FPS at 720p. It depends on engines, and how games are designed. However, there is a good chance that if you see a game running at 30 FPS 1080p on Xbone, it will run at 45-50 FPS on PS4, all things being equal. Which is NOTHING to scoff at.

This is true.

Also of note - while triple buffering has been commonplace on PC for a long time, it has been extremely rare to see it used in the console space. The reason is that the RAM pools were so small that developers couldn't spare even the few extra MB that would be required to hold the extra framebuffers.

I may be off base, but I wouldn't be surprised at all if the RAM pools in both of these consoles allowed developers to use triple buffering. I doubt many games will be filling the RAM to the point where they can't spare the extra framebuffers, so we won't be stuck with a 30-or-60 (or horrible tearing) world with these consoles. You might also see a PS4 game locked at 60 and the X1 version hovering in the 40+ range. With triple buffering, this won't cause the screen-tearing eye bleed that the same framerates would cause on the current consoles.
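For anyone wondering why a "few extra MB" was ever a problem, here's the rough back-buffer math, assuming a plain 32-bit RGBA target (real renderers keep plenty of other buffers around too):

```python
# Rough back-buffer cost: width * height * bytes per pixel, assuming a plain
# 32-bit RGBA target. Depth, G-buffer and HDR targets are not counted here.

def framebuffer_mb(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / (1024 * 1024)

per_buffer = framebuffer_mb(1920, 1080)
print(f"One 1080p colour buffer: {per_buffer:.1f} MB")      # ~7.9 MB
print(f"Three buffers          : {3 * per_buffer:.1f} MB")  # ~23.7 MB, trivial in 8 GB
```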
 
Yeah, but the CPUs on these systems aren't exactly beefy, and require good parallelism. If the quad-core 3.2GHz CPUs had made it, I might be more optimistic about 60fps, at least in the short term.

Longer term as devs get used to the CPUs - and also perhaps leverage GPGPU more - cpu and general processing may become less of a hold back on framerate.

I am making an assumption about the CPU being the one holding things back but just a hunch.

Aren't most games today GPU constrained? On PC you need to significantly downgrade resolution and effects to start becoming CPU limited and that limit is usually way above 60 fps anyway.

COD will be 60 fps and while it doesn't look great by any measure, it's an OK baseline for the quality we can expect at 60 frames this gen.
 
Was this really said?

Those are exact quotes.

On the page prior I also quoted a guy from IGN saying something similar. Microsoft intentionally showed a limited specs sheet on Tuesday, and it was for this reason. People see 8 core CPU, 8 gigs of ram, and think "must be the same".
 
Not sure where to dump this, but they say custom a lot.

http://www.tomshardware.com/news/Xbox-One-APU-Jaguar-Radeon-Next-Generation,22726.html

"There are similarities about the [PS4 and XB1] APUs you can find based on data that's public from Sony and Microsoft," said AMD spokesman Travis Williams who said he was not in the position to disclose additional hardware specs. "However, I can say each APU was customized and tailor-made for Microsoft and Sony and their respective console and experience design points."
 
But 1440x1080 is an awkward resolution to display on a typical TV. Are games actually likely to end up rendering at resolutions which don't neatly fit on a 1920x1080 screen?
Well. GT5 is rendered at 1280x1080 then upscaled, but that's because PS3 only has horizontal hardware scaling. The 16:9 equivalent of 1440x1080 is 1664x936.
By the way, 1440x1080 isn't 33% fewer pixels than 1080p. That would be 1280x1080, or 1568x882 in 16:9 format.
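For reference, the pixel counts in question, relative to full 1920x1080:

```python
# Pixel counts from this exchange, relative to full 1920x1080.

full_hd = 1920 * 1080

for w, h in [(1440, 1080), (1280, 1080), (1664, 936), (1568, 882)]:
    p = w * h
    print(f"{w}x{h}: {p:>9,} px ({p / full_hd:.0%} of 1080p)")
```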
 
XBone comes with Kinect 2; PS4 doesn't come with Move 2. So if MS has an incredible Kinect 2 game at launch, XBone could be the new Wii...
 
I need to see multiplats. The PS3 was supposed to be a beast machine compared to the paltry and underpowered 360. The end result was the multiplats were usually inferior on PS3 and that even on exclusives, the two looked about equal. I mean, if this is that PS3 type of power that not even 1st party devs can tap into, who cares if there is 33% more of it?
 
But 1440x1080 is an awkward resolution to display on a typical TV. Are games actually likely to end up rendering at resolutions which don't neatly fit on a 1920x1080 screen?

You can scale the output buffer to 1080p. This happens now with many games.
 
I'm getting the PS4, too, but there's going to be a lot of very disappointed people out there if they honestly believe that the PS4 has this significant power advantage over the next Xbox.
It's 50% more powerful. That's significant enough to make the PS4 version the superior version of every multiplatform release by default.
Crisper IQ, higher framerate and better AA are the most conservative, basic improvements people can safely expect.
 
Around 30% when you look at 580 to 680. But that transition was when NV started selling their mid-level chips as high-end. And 580 was to 480 as 780 is to 680. The last "real" generational upgrade (IMHO of course) at NV was 285 to 480, and that was 70%.

Interesting, thanks!

No, it means that the PS4 GPU is roughly 50% more powerful than the Xbone GPU. They're same generation GPUs.

That's true, but it's a pretty big power gap in terms of PC GPU iterations.
 
Wouldn't surprise me. Xav is well-meaning, but kind of clueless on the technical side of things.

Like 95% of people on podcasts...and well, forums too.

Most people aren't like me...

It's comical when Weekend Confirmed starts talking tech usually... and they're probably some of the better ones.

Ooh, and sales numbers. They're so stupid about sales numbers. Their NPD discussions would just make me cringe.
 