
Ubisoft GDC EU Presentation Shows PlayStation 4 & Xbox One CPU & GPU Performance - RGT

Jonnax

Member
Why is UbiSoft using DirectX 11 on the PS4? It is my understanding that while the PS4 has a DirectX 11 compatibility wrapper around its own API, it is not the most efficient way to access the hardware. Is this a correct interpretation?
[slide image: playstation-4-directx-11-HLSL.jpg]

It's a comparison. The next slide shows the PS4 version.
 

charsace

Member
No, but from what I understand the buses work in tandem when needed.

EDIT:

As he was talking about the lack of bandwidth, I posted the image as it shows the system bandwidth across the buses.

Interesting. So I may have understood it wrong.
If you look at the chart you can see that the GPU can't use all the RAM bandwidth.
 

Locuza

Member
Why is UbiSoft using DirectX 11 on the PS4?
http://twvideo01.ubm-us.net/o1/vault/gdceurope2014/Presentations/828884_Alexis_Vaisse.pdf
Look at page 71 and onward.
DX11 is only used as a comparison.

Doesn't GPGPU work against image quality? I'm pretty sure I read somewhere Cerny saying devs can up performance without sacrificing graphical quality. I imagine there will be tradeoffs, but GPGPU doesn't take away from rendering and vice versa.
In contrast to the PC, you can use GPGPU much more efficiently.
Cerny was talking about two different command streams, made possible by the ACEs:
one for graphics and many for compute tasks.
Without the ACEs you have only one stream, with compute and graphics workloads mixed together, which can end in under-utilisation and resource conflicts.
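A toy way to picture what the extra queues buy you. Made-up numbers and a deliberately crude model, nothing like real GPU scheduling, but it shows why serialising everything through one queue hurts:

#include <algorithm>
#include <cstdio>
#include <vector>

int main() {
    // Toy workloads in arbitrary time units - not real GPU behaviour.
    std::vector<int> graphics = {4, 6, 5}; // draw passes
    std::vector<int> compute  = {3, 2, 4}; // GPGPU jobs

    int g = 0, c = 0;
    for (int t : graphics) g += t;
    for (int t : compute)  c += t;

    // One mixed queue (no ACEs): everything serialises.
    int mixed = g + c;

    // Graphics queue plus async compute queues (ACEs): compute overlaps
    // graphics, so the frame costs roughly the longer of the two.
    int overlapped = std::max(g, c);

    std::printf("mixed queue: %d units, async compute: %d units\n",
                mixed, overlapped);
    return 0;
}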
 

CLEEK

Member
PS3 3 times as powerful as Xbox 360? Never seen that in Ubi games.

That metric was referring to their CPUs only, not overall performance of the PS3 and 360.

The Cell was a beast compared to the Xenon, but the RSX was notably weaker than the Xenos. The best looking PS3 games were the ones that did the opposite of this Ubi presentation. They offloaded GPU tasks to the Cell.
 

Duxxy3

Member
I think this generation will be full of better looking 7th gen games and, hopefully, finally prices of $129/$149. Improvements that you would normally see with a better CPU are not likely to happen.

We're a year into this gen and we have yet to see a price drop. Both companies must be making a nice-sized profit on these systems already.
 

maneil99

Member
Intel's chips are much, much more efficient than AMD's: they use less power and produce less heat, all while being more powerful. The decision to go with AMD chips would have been a cost-reduction one, as Intel is not cheap. But Intel is better than AMD in every metric for CPUs.
Not for APUs. Going Intel would have required a dedicated GPU.
 

CLEEK

Member
I'm confused with all the "well ya they're offloading stuff to the GPU" like what? I mean that doesn't make sense to me. Help.

Historically, GPUs were just used to calculate the geometry, lighting and texture, and then draw the final image. The CPU was there to do all the game logic, the AI, the physics calculations, the net code etc.

CPUs are designed to do a multitude of complex things.
GPUs are designed to do only a few things very, very quickly.

If you can get the complex things that the CPU would do broken down into lots of simple things, the GPU can run them.

Current gen consoles are not balanced: there isn't an even split between CPU and GPU, and far more of the performance is provided by the GPU. Unless you can tap that, you will be limited by your CPU. GPGPU, which lets you run CPU tasks on the GPU, has been around for a while. Many modern supercomputers are grids of GPUs, whereas previously they were grids of CPUs.

The AMD GPUs in the XB1 and PS4 have inbuilt silicon for GPGPU, with the PS4 being specifically customised to be good at it (as this generation goes on, this will be a key differentiator between the two consoles).

In this Ubi example, calculating the movement of a dancer was a CPU task. This gen, the CPUs are not a generational leap over their predecessors. The only way to get a generational leap in what is calculated is performing those calculations on the GPU.
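To make "broken down into lots of simple things" concrete, here's a minimal sketch: a per-particle update written with C++17 parallel algorithms. It runs on CPU threads here, but the shape of the loop - tiny, independent updates over a big array - is exactly what ports to a compute shader. Illustrative only: this is not Ubisoft's actual dancer/cloth solver, and all the names are made up.

#include <algorithm>
#include <execution>
#include <vector>

struct Particle { float x, y, z, vx, vy, vz; };

// Each update touches only its own particle, so the runtime is free to
// run as many of them side by side as the hardware allows.
void step(std::vector<Particle>& points, float dt) {
    std::for_each(std::execution::par_unseq, points.begin(), points.end(),
        [dt](Particle& p) {
            p.vy -= 9.81f * dt; // gravity
            p.x  += p.vx * dt;
            p.y  += p.vy * dt;
            p.z  += p.vz * dt;
        });
}

int main() {
    std::vector<Particle> dancer(100000, Particle{0, 0, 0, 0, 0, 0});
    step(dancer, 1.0f / 60.0f);
    return 0;
}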
 

Lady Gaia

Member
I totally buy that CPU bottlenecks are the reason why AC:Unity runs at 30fps on both consoles, and that AI-constrained metrics like the number of people in crowds are similar. I might even buy that for game balancing reasons it's not worth having ~7% more people in Xbox One crowds.

What I haven't seen is a coherent explanation from Ubisoft regarding why the resolution cap is the same on both consoles. They could come out and say "CPU bound for framerate and address space limited for resolution" but they didn't. They tried to pass both off as CPU-constrained when that's simply not the case for resolution. I guess there could be a post-processing pass on the CPU — but that would be insane for the exact reasons this presentation illustrates so well.
 

Shin-Ra

Junior Member
Posting images of Uncharted 4's vertical slice means nothing. If you are impressed by that, just imagine what ND could have achieved with better hardware.
I missed the part where you explain how it means nothing.

Surely the point is to create interactive audiovisual experiences that really impress people. Naughty Dog (or Quantic Dream if you don't trust ND!) seem to have surpassed those expectations.
 
PC GAF must be laughing their heads off.

As a member of PC GAF, I'm not laughing. I want technology to progress and that isn't going to happen when the lowest common denominator is that low. I do feel slightly vindicated, though, because I and many others were accused of having an agenda for stating the blindingly obvious ever since the next-gen console specs were released: these consoles are really weak, both compared to contemporary PC technology and in relative power at launch compared to the PS3 and 360.

In terms of CPU, it is clear that GPGPU is going to have to be pushed hard if these consoles are to last for five or six more years. I do not know, however, whether using the GPU for CPU tasks costs GPU resources.
 
It's a comparison. The next slide shows the PS4 version.

If I'm understanding the slides right, all they are saying is that DX11 on the XB1 has an implied buffer copy, while on the PS4 the copy has to be done explicitly. It's still going through DX11 in both scenarios, although the PlayStation side uses DX11 extensions that Sony has added to the API to manually manage the buffer. This is evident because it says DX11 on both slides.
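In plain desktop D3D11 terms, an explicit GPU-side copy is just something like the fragment below. A sketch, not the slide's actual code: CopyComputeOutput is a made-up name, and the device context and buffers are assumed to be created elsewhere. On stock DX11 the runtime can stage this sort of copy for you behind the API.

#include <d3d11.h>

// GPU-side, whole-resource copy; dst and src must match in size and type.
void CopyComputeOutput(ID3D11DeviceContext* ctx,
                       ID3D11Buffer* dst, ID3D11Buffer* src)
{
    ctx->CopyResource(dst, src);
}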
 

Foxyone

Member
Is GPGPU gonna be a thing on PC too? If GPUs are doing a lot of the work in new games though, what are the CPUs gonna be doing, sitting half-idle?
 

Kayant

Member
Not really, as Sony never revealed the final clock speed as I said, and don't some other benchmarks show the PS4 CPU is faster?

No, they confirmed it more than once, as shown by Vizzeh's post and here (Guerrilla Games, pages 7-8) - http://www.slideshare.net/DevCentralAMD/mm-4085-laurentbetbeder
and this presentation by Ubisoft.

The 2.0GHz figure was just a silly rumor.

Now, according to details revealed in the benchmark, PS4 is actually running at 2GHz in order to produce 14 MB/s of textures versus 12 MB/s of textures for the Xbox One.

That's not true, that's just the OP's speculation at the time as to why the difference was happening.
 
Is GPGPU gonna be a thing on PC too? If GPUs are doing a lot of the work in new games though, what are the CPUs gonna be doing, sitting half-idle?

PC games have been using compute shaders and general-purpose stuff for a long time already. Now it will only increase, since the HD twins can do it too.
 

Futurematic

Member
I have to imagine AMD must've made a hell of an offer to Sony/MS, coupled with the fact that NVIDIA already pissed away any goodwill with either manufacturer, so AMD everything probably seemed logical at that point.
AMD are the only manufacturer that could make these APUs, which is going to save MS and Sony a lot of money in the long run.

I could see an argument that Nvidia could have done a Maxwell/Denver APU that would have killed[1]. But 2014 would likely be the earliest, and given Nvidia's pricing it probably wouldn't be cost effective until TSMC finally gets volume 20nm production.

Anything else, of course, would be impossible because Sony/MS needed APU-level cheapness.


[1] A somewhat cut-down 970 plus 4-6 Denver cores should do nicely. The ARM CPUs would make for somewhat more porting work, but ARM is everywhere, unlike Cell.
 
Is GPGPU gonna be a thing on PC too? If GPUs are doing a lot of the work in new games though, what are the CPUs gonna be doing, sitting half-idle?

Recent releases have listed a GPU with 6GB of RAM in their requirements and only used single threads on the CPU to push not-particularly-impressive-by-PC-standards graphics, so....
 

le-seb

Member
I could see an argument that Nvidia could have done a Maxwell/Project Denver ARM CPU APU that would have killed
And I could see an argument that both MS and Sony have been burnt from dealing with Nvidia in the past, and would probably refuse to work with them again. At least, not as long as they can work with another company.
 
Recent releases have listed a GPU with 6GB of RAM in their requirements and only used single threads on the CPU to push not-particularly-impressive-by-PC-standards graphics, so....

The 6GB is way above what's recommended. It was an enthusiast option, and one that did not even require that much VRAM at that.
 

JaseC

gave away the keys to the kingdom.
Recent releases have listed a GPU with 6GB of RAM in their requirements and only used single threads on the CPU to push not-particularly-impressive-by-PC-standards graphics, so....

Well, it should be noted that what Warner and even Digital Foundry failed to clarify with regard to Mordor's recommended "requirement" of 6GB VRAM is that it's only necessary at 1080p if the resolution setting is set to 200%; when the game is rendering natively at 1080p, it uses up around 3.5GB. Then there's The Evil Within, which runs just fine on 2GB cards despite Bethesda insisting that people with <4GB cards wouldn't be able to play the game at 1080p. Watch Dogs, on the other hand, comes to mind as a game that was actually accurate with its VRAM recommendations -- even at 1680x1050 I'm not able to play the game with Ultra textures without it becoming a microstuttering mess (I've a pair of 2GB 670s).
 
PC games have been using compute shaders and general purpose stuff for a long time already. Now it will just increase that the HD twins can do it too.

True, but PCs have typically had split memory: general RAM and video memory on the graphics card. The problem was that to split the work between the CPU and GPU, data had to be copied back and forth between these two separate memories, which was inefficient. The consoles have access to a unified pool of memory, so no copying is required. On top of that, the PS4 has extra modifications to the chips to help synchronize access to that shared memory so that the CPU and GPU are not stepping all over one another. PCs are getting that capability, but it hasn't always been there.
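A toy illustration of that copy tax. The memcpy is only a stand-in for a bus transfer (real drivers and DMA engines behave differently), but the round-trip pattern is the one that unified memory removes:

#include <chrono>
#include <cstddef>
#include <cstdio>
#include <cstring>
#include <vector>

int main() {
    const std::size_t N = 64u * 1024 * 1024; // 64 MB payload
    std::vector<char> sysRam(N, 1), vram(N);

    // Split pools (classic PC): data crosses the bus before the "GPU"
    // can touch it, and crosses back afterwards.
    auto t0 = std::chrono::steady_clock::now();
    std::memcpy(vram.data(), sysRam.data(), N);  // upload
    // ... GPU work would happen here ...
    std::memcpy(sysRam.data(), vram.data(), N);  // readback
    auto t1 = std::chrono::steady_clock::now();

    auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(t1 - t0).count();
    // Unified pool (PS4/XB1): CPU and GPU address the same bytes, so
    // this round trip simply doesn't exist.
    std::printf("copy overhead: %lld ms per round trip\n", (long long)ms);
    return 0;
}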


I also disagree with everybody complaining that the CPUs are too weak. It's been a trend to speed computations up by moving them over to the GPU. The new consoles were just built for that reality and use more of their resources beefing up the GPU at the expense of the CPU. That's perfectly acceptable because the CPU will be doing less work anyway. We are just experiencing growing pains as devs get comfortable using the GPU more. We ran into a similar problem last gen when devs had to get used to multithreaded programming.
 
As a PC gamer both the advances in multicore utilization and GPU compute forced by the limitations of the consoles are actually something that's very welcome. I'm sure this is something AMD appreciates too, considering their weak CPU performance and great GPU compute capabilities. On the other hand, it also means games that require that CPU time are bound to be 30 FPS on console, which in turn will eventually mean a bunch of bad ports not really optimized to work beyond 30 FPS.
 

virtualS

Member
Essentially devs need to keep pushing code traditionally done on the CPU onto the GPU. In this scenario PS4 has a 100% advantage over the XBOne.

With such a large disparity between the two boxes, will developers bother with multi-platform games when Microsoft's console is at such a large disadvantage?

Last generation on consoles the opposite needed to occur. Cell had a massive advantage, but that large disparity yielded little fruit on the multi-platform side. Of course this required much time and investment... but have things changed?

PCs throw another interesting dimension into the mix. There's plenty of scope for GPGPU offload in PC land, but such techniques are being locked down to vendor-specific (Nvidia) cards through development deals. Also, with significantly more powerful CPUs there is perhaps less of an incentive to push code onto GPUs.

We REALLY need PS4 to lead development this gen as this will also greatly benefit the PC side of things. Hopefully with sales massively on Sony's side this will occur this generation.

Microsoft know this shift in coding paradigm will significantly harm them in a comparative sense in the console space and I'm sure they're doing everything in their power to curb it.

AC Unity should be running at 60fps at 1080p on PS4 with heavy use of compute. I'm assuming AI code can be ported to run on GPUs with time and effort... perhaps not. If not then it should be running at 30fps at 1080p on PS4.
 
PCs throw another interesting dimension into the mix. There's plenty of scope for GPGPU offload in PC land, but such techniques are being locked down to vendor-specific (Nvidia) cards through development deals. Also, with significantly more powerful CPUs there is perhaps less of an incentive to push code onto GPUs.

Just wanna say that this is not completely true really. Yes, Nvidia makes deals for CUDA specific stuff in games... but that does not mean GPGPU driven stuff is vendor specific. There are tons of examples in modern game engines where this is NOT the case.
 
I don't quite get the CPU results. The last Substance benchmark showed a 15% advantage for the PS4 in CPU processing. There was a user here called Matt who basically confirmed PS4 CPU > XBO CPU. Today, this new benchmark shows a 15% advantage for the XBO in CPU processing.

The hard data from Sony and MS seems to be that the XBO's 1.75GHz is about 9% faster than the PS4's 1.6GHz. But the benchmark shows a 15% difference. I don't understand any of this. Can somebody explain why the tides have turned in CPU performance?
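For reference, the clock maths alone, with just the two published clocks plugged in (the closing comment is a guess, not hard data):

#include <cstdio>

int main() {
    // Published clocks: XB1 at 1.75 GHz, PS4 at 1.6 GHz.
    double clockRatio = 1.75 / 1.6;   // ~1.09 -> ~9% from clocks alone
    double observedRatio = 1.15;      // the ~15% gap quoted in this thread

    std::printf("clocks explain %.1f%%, slides show %.1f%%\n",
                (clockRatio - 1.0) * 100.0, (observedRatio - 1.0) * 100.0);
    // The leftover ~6 points would have to come from software (OS/SDK
    // reservations and scheduling), not the silicon itself.
    return 0;
}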
 

stryke

Member
I don't quite get the CPU results. The last Substance benchmark showed a 15% advantage for the PS4 in CPU processing. There was a user here called Matt who basically confirmed PS4 CPU > XBO CPU. Today, this new benchmark shows a 15% advantage for the XBO in CPU processing.

The hard data from Sony and MS seems to be that the XBO's 1.75GHz is about 9% faster than the PS4's 1.6GHz. But the benchmark shows a 15% difference. I don't understand any of this. Can somebody explain why the tides have turned in CPU performance?

June SDK update.
 

AmFreak

Member
I don't quite get the CPU results. The last Substance benchmark showed a 15% advantage for the PS4 in CPU processing. There was a user here called Matt who basically confirmed PS4 CPU > XBO CPU. Today, this new benchmark shows a 15% advantage for the XBO in CPU processing.

The hard data from Sony and MS seems to be that the XBO's 1.75GHz is about 9% faster than the PS4's 1.6GHz. But the benchmark shows a 15% difference. I don't understand any of this. Can somebody explain why the tides have turned in CPU performance?

Maybe because, contrary to what people here think, this isn't a benchmark; this is a "the console CPUs are really weak - use the GPU when you can - here is how we have done it" presentation.
 
So why did Microsoft and Sony opt for relatively weak CPUs by AMD instead of chips by Intel? A lot of products nowadays have Intel chips as their CPUs; in Microsoft's case, the Surface runs with i3, i5, and i7. Perhaps they will use profits from this gen to create powerful hardware next generation? Phones, tablets, and other stuff can be crazy expensive, but gaming consoles can't?
 

The Llama

Member
So why did Microsoft and Sony opt for relatively weak CPUs by AMD instead of chips by Intel? A lot of products nowadays have Intel chips as their CPUs; in Microsoft's case, the Surface runs with i3, i5, and i7. Perhaps they will use profits from this gen to create powerful hardware next generation? Phones, tablets, and other stuff can be crazy expensive, but gaming consoles can't?

Two reasons IMO.

First is that they wanted a combined CPU+GPU in one package, to save on costs. AMD is the way to go for that.

Second is they wanted to break even on costs from the start (or come very close to breaking even), and decided to spend more of their limited budget on the GPU rather than the CPU, believing that in the long run a stronger GPU would be more useful than a stronger CPU.
 
So why did Microsoft and Sony opt for relatively weak CPUs by AMD instead of chips by Intel? A lot of products nowadays have Intel chips as their CPUs; in Microsoft's case, the Surface runs with i3, i5, and i7. Perhaps they will use profits from this gen to create powerful hardware next generation? Phones, tablets, and other stuff can be crazy expensive, but gaming consoles can't?

because they want lots of monies, they wasted lots of monies last gen on RROD and the Cell
 

Duxxy3

Member
So why did Microsoft and Sony opt for relatively weak CPUs by AMD instead of chips by Intel? A lot of products nowadays have Intel chips as their CPUs; in Microsoft's case, the Surface runs with i3, i5, and i7. Perhaps they will use profits from this gen to create powerful hardware next generation? Phones, tablets, and other stuff can be crazy expensive, but gaming consoles can't?

They want to make a profit on the systems. That's all there is to it. Just my own speculation, but if both systems dropped to $299 I would guess they would be profitable after two games sold.
 

Faustek

Member
We REALLY need PS4 to lead development this gen as this will also greatly benefit the PC side of things. Hopefully with sales massively on Sony's side this will occur this generation.

I'm hoping for that but I doubt it :/

MS will do the money thing if it goes to shit for them :(
 
So why did Microsoft and Sony opt for relatively weak CPUs by AMD instead of chips by Intel? A lot of products nowadays have Intel chips as their CPUs; in Microsoft's case, the Surface runs with i3, i5, and i7. Perhaps they will use profits from this gen to create powerful hardware next generation? Phones, tablets, and other stuff can be crazy expensive, but gaming consoles can't?

First answer is cost. AMD was willing to give Sony and Microsoft a better deal, and we have already seen how price sensitive the console market is.

Second is that (contrary to what MS originally proposed) consoles are special purpose gaming machines. They are not trying to strike a balance to run every conceivable type of software possible. The trend for gaming has been to use the GPU more for things like AI, collision detection, and physics instead of the CPU. While it requires more work, you really do get more bang for your buck if you can offload those tasks to the GPU. So if you recognize those trends and are designing a console to last 8-10 years, you'd use your limited resources to beef up your GPU at the expense of your CPU.
 

Pain

Banned
I think this generation will be full of better looking 7th gen games and, hopefully, finally prices of $129/$149. Improvements that you would normally see with a better CPU are not likely to happen.

We're a year into this gen and we have yet to see a price drop. Both companies must be making a nice-sized profit on these systems already.
Xbox One has had multiple price drops/promotions. Where have you been?
 

JordanN

Banned
Phones, tablets, and other stuff can be crazy expensive, but gaming consoles can't?
But a console is not a phone or a tablet.

You're not signing up for a contract the way you are with a phone. And a tablet's number one function is not gaming; it's closer to a laptop/PC.

The game industry already has a history of releasing expensive consoles, and they all failed.

3DO, PS3, Neo-Geo. There has never been acceptance for consoles at $600 or more.
 
because they want lots of monies, they wasted lots of monies last gen on RROD and the Cell

They want to make a profit on the systems. That's all there is to it. Just my own speculation, but if both systems dropped to $299 I would guess they would be profitable after two games sold.

First answer is cost. AMD was willing to give Sony and Microsoft a better deal, and we have already seen how price sensitive the console market is.

Second is that (contrary to what MS originally proposed) consoles are special purpose gaming machines. They are not trying to strike a balance to run every conceivable type of software possible. The trend for gaming has been to use the GPU more for things like AI, collision detection, and physics instead of the CPU. While it requires more work, you really do get more bang for your buck if you can offload those tasks to the GPU. So if you recognize those trends and are designing a console to last 8-10 years, you'd use your limited resources to beef up your GPU at the expense of your CPU.

So if price is a major factor, do you think Intel will offer the console manufacturers a good price on CPUs next gen? Intel is currently developing heat-efficient (Broadwell?) CPUs, so surely Sony and Microsoft would be interested in that. It just saddens me that other consumer electronics sell for high prices, but gaming consoles can't be slightly more expensive without fear of losing sales.
 

The Llama

Member
So if price is a major factor, do you think Intel will offer the console manufacturers a good price on CPUs next gen? Intel is currently developing heat-efficient (Broadwell?) CPUs, so surely Sony and Microsoft would be interested in that. It just saddens me that other consumer electronics sell for high prices, but gaming consoles can't be slightly more expensive without fear of losing sales.

Probably not. Intel controls something like 80% of the CPU market and doesn't really have any incentive to give either console manufacturer a good deal.
 

Duxxy3

Member
Probably not. Intel controls something like 80% of the CPU market and doesn't really have any incentive to give either console manufacturer a good deal.

Yep. Intel has no reason to change anything that they're doing right now. They're insanely profitable and are multiple generations ahead of anything AMD is doing. The only thing they might have to worry about would be anti-trust laws.
 

Biker19

Banned
It is really irritating to me just how low MS and Sony went on these CPUs. Just terrible.

I agree, though if they were to go with stronger CPUs like the i5 or the i7, it would've made the consoles more expensive, like gruenel mentioned earlier.

Maybe next gen they'll go with stronger CPUs that'll be as powerful as the PS3's CPU or better, but much cheaper and more developer-friendly for 3rd party developers as well (like with the Xbox 360 & PS4).
 
Maybe because, contrary to what people here think, this isn't a benchmark; this is a "the console CPUs are really weak - use the GPU when you can - here is how we have done it" presentation.
But why the tide change in CPU performance? Is there a difference in the CPU structure that makes each CPU better suited to certain tasks? Or did the new June SDK give the XBO CPU a ~30% performance boost?
 
I could see an argument that Nvidia could have done a Maxwell/Denver APU that would have killed[1]. But 2014 would likely be the earliest, and given Nvidia's pricing it probably wouldn't be cost effective until TSMC finally gets volume 20nm production.

Anything else, of course, would be impossible because Sony/MS needed APU-level cheapness.


[1] A somewhat cut-down 970 plus 4-6 Denver cores should do nicely. The ARM CPUs would make for somewhat more porting work, but ARM is everywhere, unlike Cell.

I highly doubt an ARM APU would be viable even now for the current generation of consoles. The 64-bit cores are only now beginning to reach products outside Apple, and they scale poorly on the high-performance side of things; they are designed for efficiency in tight TDP/power envelopes. The x86 cores still dominate in applications where that's less of a constraint. Jaguar might not look like much vs an i7, but it's more than capable enough vs the highest-end ARM cores.

The only viable alternative to AMD was Intel. They could have made a 22nm APU with cut-down Core i CPU cores (no need for turbo boost etc. on consoles) and a scaled-up Iris GPU. That solution would have been too expensive (Intel loves its 60%+ gross margins), and I don't think there was much interest on Intel's part in the first place. AMD needed the design wins and was willing to do the extra work at a much smaller margin. Imo there was no alternative.
 
So if price is a major factor, do you think Intel will offer the console manufacturers a good price on CPUs next gen? Intel is currently developing heat-efficient (Broadwell?) CPUs, so surely Sony and Microsoft would be interested in that. It just saddens me that other consumer electronics sell for high prices, but gaming consoles can't be slightly more expensive without fear of losing sales.

Could be, but I think I remember that Intel was like "we make so much profit on our chips that it's not worth it to make cut-rate deals on consoles." I also think there is some bad blood between Intel and the console makers. Finally, AMD might be able to offer a better CPU/GPU package than Intel. Not sure about that last one; I haven't been following Intel's integrated graphics capabilities.

Oh, and backwards compatibility will be easier if the console manufacturers stick with AMD. My wild-ass prediction is that this console cycle will be a lot shorter than people expect. However, the next gen of consoles won't replace the current gen, but will be an upgrade to it. I think Microsoft will push this trend because they will want to hit the reset switch as soon as possible on this gen. They made their name on being the most powerful console, and yet now they are reduced to claiming resolution doesn't matter. That has got to hurt.
 
As a member of PC GAF, I'm not laughing. I want technology to progress and that isn't going to happen when the lowest common denominator is that low.

FYI, that lowest common denominator is still higher than the vast majority of Windows PCs. Want to blame the lack of technical progress on something? Blame the consumers who won't spend more on hardware, be it console or PC.
 