
Ubisoft GDC EU Presentation Shows PlayStation 4 & Xbox One CPU & GPU Performance - RGT

So 40% more detail isn't substantial!?

I believe the crux of his argument is that compared to Xbox One games PS4 titles don't have 40% "better graphics". That while the resolution difference is there, PS4 games don't have 40% better effects, better shaders, better geometry or better animations. They are the same games, only a bit shinier.

They have to live with console limitations, though, which I'm sure was Can Crusher's point.
Otherwise, why would PC gamers complain about being dragged down by them?

Consoles are the lowest common denominator in multiplatform games, that is very much true. Some limitations in game design are inevitable but thankfully some very important factors like resolution, framerate and advanced graphical effects are usually not an issue on PC. Even if consoles didn't exist there would still be other lowest common denominators, like Intel integrated GPUs, so I don't blame consoles at all for that. I am happy that the bar has been raised even by that small amount.
 

Percy

Banned
I believe the crux of his argument is that compared to Xbox One games PS4 titles don't have 40% "better graphics". That while the resolution difference is there, PS4 games don't have 40% better effects, better shaders, better geometry or better animations. They are the same games, only a bit shinier.

It's not just resolution differences though. There's plenty of games that also have better performance and occasionally better visual effects on PS4 compared to Xbox One.
 

KidJr

Member
I'm not sure this analogy is helpful. ACEs are for scheduling and dispatching work to the GPU's multiprocessors (= compute units = CUs). This AnandTech article explains it a bit.



The way I see it is the more different tasks you want to do on the GPU, possibly in parallel, the more ACEs you need if you don't want them to become a bottleneck. If there's no bottleneck for a specific application on the other hand, more ACEs don't do anything for performance.
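
To make that concrete, here is a loose CPU-side analogy of my own (not how GCN hardware actually works): several independent submission queues feeding a fixed pool of workers. If a single queue can already keep every worker fed, extra queues change nothing; they only pay off when submission itself is the bottleneck.

```cpp
#include <functional>
#include <iostream>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

// One "ACE-like" submission queue: tasks go in, workers pull them out.
struct WorkQueue {
    std::queue<std::function<void()>> tasks;
    std::mutex m;
    void push(std::function<void()> t) {
        std::lock_guard<std::mutex> g(m);
        tasks.push(std::move(t));
    }
    bool pop(std::function<void()>& t) {
        std::lock_guard<std::mutex> g(m);
        if (tasks.empty()) return false;
        t = std::move(tasks.front());
        tasks.pop();
        return true;
    }
};

int main() {
    const int kQueues = 8;    // the "ACEs"
    const int kWorkers = 18;  // the "CUs"
    std::vector<WorkQueue> queues(kQueues);
    // Independent jobs spread across all queues.
    for (int q = 0; q < kQueues; ++q)
        for (int i = 0; i < 1000; ++i)
            queues[q].push([] { volatile double x = 0; for (int k = 0; k < 100; ++k) x += k; });
    // Workers drain whichever queue has work; none of them sits idle while
    // any queue still holds tasks, which is the whole point of having several.
    std::vector<std::thread> workers;
    for (int w = 0; w < kWorkers; ++w)
        workers.emplace_back([&, w] {
            std::function<void()> t;
            for (int q = w % kQueues, empties = 0; empties < kQueues; q = (q + 1) % kQueues)
                if (queues[q].pop(t)) { t(); empties = 0; } else { ++empties; }
        });
    for (auto& th : workers) th.join();
    std::cout << "all queues drained\n";
}
```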



In terms of theoretical peak performance? Yes. Taking a look at this, a current Haswell CPU core can do up to 32 FLOPs per cycle. The Core i5-4430 (the slowest i5) clocks at 3 GHz and has 4 cores: 3*4*32 = 384 GFLOP/s. The PS3's Cell (with one SPU deactivated) has 204.8 GFLOP/s.
In the real world the gap is even bigger for most applications, of course, since Cell rarely gets near its peak outside hand-tuned SPU code.
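
For anyone checking the arithmetic, here's where those figures come from (my own back-of-the-envelope sketch; the 32 comes from Haswell's two AVX2 FMA ports, each 8 single-precision lanes wide, with an FMA counting as two ops):

```cpp
#include <cstdio>

int main() {
    const double flops_per_cycle = 2 * 8 * 2;          // 2 FMA ports * 8 lanes * 2 ops = 32
    const double i5_peak = 3.0 * 4 * flops_per_cycle;  // 3 GHz * 4 cores = 384 GFLOP/s
    const double cell_peak = 204.8;                    // PS3 Cell figure as quoted above
    std::printf("i5-4430: %.0f GFLOP/s vs Cell: %.1f GFLOP/s\n", i5_peak, cell_peak);
}
```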


edit:



The PS3 had an Nvidia GPU, the Xbox 360 an AMD GPU. Even if they were the same architecture, though, clock speed doesn't mean anything without the number of shader units.
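
As a concrete illustration, using the current consoles' widely reported specs (peak single precision = shader ALUs * 2 ops per FMA * clock), note that the Xbox One's higher clock doesn't come close to closing the gap:

```cpp
#include <cstdio>

int main() {
    // Peak GFLOP/s = ALUs * 2 (fused multiply-add) * clock in GHz.
    std::printf("PS4: %.1f GFLOP/s\n", 1152 * 2 * 0.800); // 18 CUs * 64 ALUs, 800 MHz
    std::printf("XB1: %.1f GFLOP/s\n",  768 * 2 * 0.853); // 12 CUs * 64 ALUs, 853 MHz
}
```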

Sorry, my post was unclear; the analogy was more about how you get GPU compute working in general, as opposed to ACE queues specifically.

The question, though, is: will more ACEs benefit GPGPU in PS4 games?
Will there even be games that use GPU compute in a way that needs more ACEs for wavefront scheduling?
The increased ACE count was mostly designed for very high-end cards that reach 5-6 TFLOPs, not for 1.8 TFLOPs. And we still don't know whether it's a technology designed more for GPGPU-based research applications than for games.

It could be a boost for the PS4, but it might not be, and for now we don't have any proof that it helps in real-world game scenarios.

I mean, I get what you're saying... there are no games using all of those queues because no game is that heavily optimised for GPU compute... so all of that theoretical power is just that: theoretical.

And I highlighted the bolded part because, funnily enough, a friend of mine works for a firm that uses GPU compute heavily for research. While the algorithms he uses are very different from games (his stuff is all about applying machine learning techniques to large data sets), the logic is still the same: GPU compute is a beast at parallel processing and number crunching, whether that's in research or in a game engine. Industry isn't a measure of how useful GPU compute is at all; what matters is how parallel your code is, so it keeps all the GPU execution pipelines busy, and whether it has regular memory access patterns. All of which gaming lends itself to, no?

So I hear what you're saying, and yeah, there are no real game scenarios right now that prove it, but surely you can see it's just a matter of time before there are?
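
For what it's worth, the "regular memory access patterns" point looks like this in miniature; a trivial sketch (assuming a C++17 compiler with parallel algorithms) where every element is independent and memory is walked contiguously, which is exactly the shape of work both a game engine and a research number-cruncher can hand to wide parallel hardware:

```cpp
#include <algorithm>
#include <cstdio>
#include <execution>
#include <vector>

int main() {
    std::vector<float> in(1 << 20, 2.0f), out(in.size());
    // Same arithmetic for every element, no branches, no scattered reads:
    // the loop body parallelises across however many lanes you throw at it.
    std::transform(std::execution::par_unseq, in.begin(), in.end(), out.begin(),
                   [](float x) { return x * x + 1.0f; });
    std::printf("out[0] = %.1f\n", out[0]); // 5.0
}
```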
 
It's not just resolution differences though. There's plenty of games that also have better performance and occasionally better visual effects on PS4 compared to Xbox One.

No doubt, but the level of quality for the casual observer can't really be described as 40% better. Double the framerate or advanced graphical effects would indeed be considered significant. A bit steadier framerate and a bit sharper image are not easily noticeable.
 
I believe the crux of his argument is that compared to Xbox One games PS4 titles don't have 40% "better graphics". That while the resolution difference is there, PS4 games don't have 40% better effects, better shaders, better geometry or better animations. They are the same games, only a bit shinier.

Well of course, it takes more than a 40% power difference to make certain things better along with resolution and frame rate.
 

omonimo

Banned
No doubt, but the level of quality for the casual observer can't really be described as 40% better. Double the framerate or advanced graphical effects would indeed be considered significant. A bit steadier framerate and a bit sharper image are not easily noticeable.
The difference in multiplats between PS4 & Xbone should be more noticeable than in the past generation.
 

Percy

Banned
No doubt, but the level of quality for the casual observer can't really be described as 40% better. Double the framerate or advanced graphical effects would indeed be considered significant. A bit steadier framerate and a bit sharper image are not easily noticeable.

Pretty sure there are games that are "double the framerate" on PS4 compared to Xbox One.
 
well at least now we know that albert penello was right when he said that MS weren't going to just give sony a 40% speed advantage.

they're giving sony a 100% speed advantage.

And boy can you tell when you look at the games!

Pretty sure there are games that are "double the framerate" on PS4 compared to Xbox One.

The closest example you're gonna find is Tomb Raider, and even there it's not doubling the framerate.
 
The difference in multiplats between PS4 & Xbone should be more noticeable than in the past generation.

Oh I agree, I never thought the differences between PS3 and 360 multiplatform games were ever big enough to warrant those huge discussions. Still, if I were Sony I'd tell developers to have resolution parity with the Xbox and push framerate and quality higher. The result would be immediately noticeable even to the untrained eye.
 

Paz

Member

Oh my god this is amazing, it's no wonder gamers are intensely distrustful of anything they are told when this kind of stuff happens on a routine basis.

Also this thread is pretty great, the Cell was an insane piece of hardware and I'm not surprised to see it able to do intensive stuff like the test linked in the OP. A good friend of mine is an engine programmer and for the entire PS360 generation all I heard was how sad it was that the PS3 didn't have a half decent GPU because even without one they were still achieving ridiculous things using the Cell.

Krazy Ken was so close yet so far.
 
PC gamers are the only ones that don't have to "live with it" though; they have the means to power through any limitations. If you feel this discussion is something that doesn't interest you or that you don't understand, you can just ignore it. No one is forcing you to participate.

This thread isn't about PC though, and no, you really can't "power through" design decisions. Your tiresome and constant finger-pointing has reached the point of nonsense. Nothing will change the design of these consoles, or the fact that games will be designed around their limitations. It doesn't matter if you have a hexacore at home.
 
This thread isn't about PC though, and no, you really can't "power through" design decisions. Your tiresome and constant finger-pointing has reached the point of nonsense. Nothing will change the design of these consoles, or the fact that games will be designed around their limitations. It doesn't matter if you have a hexacore at home.

OK, thank you for your input and criticism.
 
Oh I agree, I never thought the differences between PS3 and 360 multiplatform games were ever big enough to warrant those huge discussions. Still, if I were Sony I'd tell developers to have resolution parity with the Xbox and push framerate and quality higher. The result would be immediately noticeable even to the untrained eye.

Nah, having the game at 1080p is a much better marketing point and easier to understand than some effects.
All they have to say is that it matches your TV; can't get any simpler than that.
If people are saying consumers can't tell resolutions apart, then how are they supposed to see better effects like AA, particle effects, frame rate, etc.?

EDIT: As for the GPGPU stuff, I expect only PS4-exclusive games to try to use it much until later on in the gen.
 
The bad CPU in both machines has always been a bad idea IMHO. I know why they did it, but a much better CPU would have made game development a lot more flexible.
 

Kinthalis

Banned
This thread isn't about PC though, and no, you really can't "power through" design decisions. Your tiresome and constant finger-pointing has reached the point of nonsense. Nothing will change the design of these consoles, or the fact that games will be designed around their limitations. It doesn't matter if you have a hexacore at home.

Wow. ok...
 

adamsapple

Or is it just one of Phil's balls in my throat?
How relevant are these numbers? The PS3-360 gap in "final performance" seems baffling.

Pretty sure most, if not all, Ubi games on PS3/360 were voted better on 360 (per Digital Foundry).
 

Vizzeh

Banned
CPUs, I'm sure, were always likely to be an Achilles' heel for both consoles; I suspect that is why Cerny heavily customised the PS4 for compute. He made a fair few speeches on fine-grained asynchronous compute and how the future will be impacted by it.

I'm surprised Ubisoft etc. haven't started to utilise this compute; perhaps it's not in their budget either, as they tailor their engines to the worst-performing denominator. 1080p was obviously a priority for Sony too, so anything Ubi says about the PS4's inability to hit it seems like hot gas... use compute instead of the weak CPUs?

(Not that CPUs should affect hitting 1080p anyway.)
 

eot

Banned
How relevant are these numbers? The PS3-360 gap in "final performance" seems baffling.

Pretty sure most, if not all, Ubi games on PS3/360 were voted better on 360 (per Digital Foundry).

That's because games are more than just cloth simulators.
 

ethomaz

Banned
I'm a little confused about how increasing the resolution to 1080p would increase AI and CPU usage.
 
So 40% more detail isn't substantial!?

Entirely subjective. He feels the difference isn't that substantial, and questions himself for that. But it's fine to feel that way. Same can be said for the opposite, if you feel it's substantial for you, then it is. There's nothing more to it.
 
Slides 71-73 are highlighting how they go from reference(DirectX) to PS4 implementation and are clearly labeled as such.

Ahh... now that I look at it again I see your point. However, the thing keeping me from fully believing that is the diagram on slide 72. It specifically states "On DirectX 11" and shows the full compute shader work (steps 1, 2 & 3) along with a CopyResource to a buffer without needing to copy the working buffer. That can't happen on stock DirectX 11, so it has to be on GNMX, the PS4 emulation of DirectX 11.

On the other hand, an alternative interpretation is that slides 72 and 73 aren't meant to be taken separately but as a series. In that case slide 72 could simply show the intention of what is to be done, while slide 73 shows what actually happens, along with the implicit buffer copy. This theory is backed up by the title switching from "On DirectX 11:" on slides 72 & 73 to "On PS4:" on slide 74, indicating the difference in how the two systems operate.
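
For reference, the DX11 side of that pattern would look roughly like the sketch below (hypothetical resource names, all objects created elsewhere; the point is that the wait between the dispatch and the copy is inserted by the runtime/driver, not written by the programmer, whereas on PS4 the developer schedules that dependency explicitly):

```cpp
#include <d3d11.h>

// Sketch of the call order from the slides: compute steps 1-3, then a copy.
void RunComputeAndCopy(ID3D11DeviceContext* ctx,
                       ID3D11ComputeShader* cs,
                       ID3D11UnorderedAccessView* workUav,
                       ID3D11Resource* workBuffer,
                       ID3D11Resource* readbackBuffer) {
    ctx->CSSetShader(cs, nullptr, 0);
    ctx->CSSetUnorderedAccessViews(0, 1, &workUav, nullptr);
    ctx->Dispatch(64, 1, 1);  // steps 1, 2 & 3 of the compute work
    // DX11 guarantees the copy sees the finished dispatch; the driver adds
    // the sync point (and possibly a duplicated buffer) behind your back.
    ctx->CopyResource(readbackBuffer, workBuffer);
}
```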
 

th4tguy

Member
I'm a little confused about how increasing the resolution to 1080p would increase AI and CPU usage.

If Ubi is offloading a lot from the CPU to the GPU, that leaves fewer resources for the GPU to do things like render at a higher resolution. I still don't think the PS4 and Xbone should be the same. Perhaps 900p was needed because of the amount of offloading that is happening to the GPU. I would think we'd see a higher frame rate on the PS4 then, but if it wasn't solid and fluctuated between the 30s and 50s, I can see Ubi locking it at 30 to keep it smooth.
 

ypo

Member
And boy can you tell when you look at the games!



The closest example you're gonna find is Tomb Raider, and even there it's not doubling the framerate.

Totally agree with u bro. Looking at games like cod ghost and tomb raider u can totally see it. Its almost 2000% difference. U be like MS dun fucked. What a mess. Hi-fi.
 

Inuhanyou

Believes Dragon Quest is a franchise managed by Sony
They started ACU years ago; their first work on it was on PC. GPGPU will not be a part of this work; this is for the future, probably the next project.
 
This thread isn't about PC though, and no, you really can't "power through" design decisions. Your tiresome and constant finger-pointing has reached the point of nonsense. Nothing will change the design of these consoles, or the fact that games will be designed around their limitations. It doesn't matter if you have a hexacore at home.

It always ends like this.
 

goonergaz

Member
Entirely subjective. He feels the difference isn't that substantial, and questions himself for that. But it's fine to feel that way. Same can be said for the opposite, if you feel it's substantial for you, then it is. There's nothing more to it.

It can't be entirely subjective; in any situation 40% is substantial (pay rise, discount, fuel efficiency, etc.) and not subjective. What's subjective is that he feels he can't see that difference.
 

Konosuke

Member
Is CELL better than Jaguar or not? I am wondering if it would be better to just have CELL + AMD GPU. Would it be too expensive?
 

Inuhanyou

Believes Dragon Quest is a franchise managed by Sony
Is CELL better than Jaguar or not? I am wondering if it would be better to just have CELL + AMD GPU. Would it be too expensive?

CELL is only powerful at certain things. Unless coded for its specific advantages, it's useless as a general-purpose CPU. Jaguar is much more flexible at all-purpose tasks, but weaker in overall output.

To begin with...

Cell is a dead technology that benefited no one in the industry and only served to put Sony out of first place and into the ire of third-party devs for 8 years. Sony wanted to reduce power consumption, reduce cost, and simplify their design, and AMD had the right APU for the job.
 
How relevant are these numbers? The PS3-360 gap in "final performance" seems baffling.

Pretty sure most, if not all, Ubi games on PS3/360 were voted better on 360 (per Digital Foundry).

I'll give you the simple answer. The chart you're seeing in the OP though is purely showing a CPU comparison. That's only one component of the console. The PS3 had a much better CPU than the 360. BUT, the 360 had a much better GPU than the PS3. So that's why they were very similar consoles in terms of power.
 

buckeye13

Banned
And boy can you tell when you look at the games!



The closest example you're gonna find is Tomb Raider, and even there it's not doubling the framerate.

What are we, one year into next gen, with many devs working cross-platform? As developers actually start to push graphics, I think the difference will be much easier to see as this gen goes on. Look at the 360 launch games, for example; most looked like uprezzed Xbox games. It takes devs a bit of time.
 

goonergaz

Member
What are we, one year into next gen, with many devs working cross-platform? As developers actually start to push graphics, I think the difference will be much easier to see as this gen goes on. Look at the 360 launch games, for example; most looked like uprezzed Xbox games. It takes devs a bit of time.

This, really. I think the ceiling will be hit sooner on XBO and we will start seeing a bigger gap as devs start utilising the CUs. Certainly 1st-party PS4 games should look clearly better - ND have a lot to live up to! (Setting myself up for disappointment!)
 

Cole Slaw

Banned
Wow. ok...
What's wrong with what he said? He's pretty much right. This thread is about the CPUs in the PS4 and XB1, not the CPU in your PC.

It's not like people go into PC threads and argue that an i7 is holding back the top 500 supercomputers. That would be utterly inane, yet that's what alexandros has pretty much allowed himself to do, post after post, as if he is a bearer of some unique knowledge.

These consoles are cheap and weak when compared to a $1,000 PC? No duh. The only way that would be a contribution to any thread is if someone really asked whether a $1,000 PC is going to have a better CPU than a $400 console, but that is supposed to be common sense and doesn't require derailing a whole thread.
 
What's wrong with what he said? He's pretty much right. This thread is about the CPUs in the PS4 and XB1, not the CPU in your PC.

It's actually about the difficulties in programming games for those consoles due to the decision to go with weak CPUs, and the constraints that that results in.

Given games made for the PS4 and Xbone are also games made for the PC, and given that the OP is pretty much outright saying "improvements in games are going to be mostly constrained to purely aesthetics as a result", it isn't irrelevant at all to question that decision by the platform holders.
 
What's wrong with what he said? He's pretty much right. This thread is about the CPUs in the PS4 and XB1, not the CPU in your PC.

It's not like people go into PC threads and argue that an i7 is holding back the top 500 supercomputers. That would be utterly inane, yet that's what alexandros has pretty much allowed himself to do, post after post, as if he is a bearer of some unique knowledge.

These consoles are cheap and weak when compared to a $1,000 PC? No duh. The only way that would be a contribution to any thread is if someone really asked whether a $1,000 PC is going to have a better CPU than a $400 console, but that is supposed to be common sense and doesn't require derailing a whole thread.

The CPU decision looks cheap in comparison to PCs that are much, much cheaper than $1,000.

The thread has devolved into a debate about how different games will be, based upon the moderate/sideways jump in CPU performance. It is fine to mention the comparison to PC components.
 
What's wrong with what he said? He's pretty much right.

No he isn't. Saying both "PC multiplatform games will be designed with the console limitations in mind" and "this is a console thread, what does PC have to do with this" is a contradiction. He directly contradicts himself in the same sentence. And I didn't derail anything; every single post was a response to comments other users had already made.

Edit: removed some unnecessary parts.
 

Kinthalis

Banned
What's wrong with what he said? He's pretty much right. This thread is about the CPUs in the PS4 and XB1, not the CPU in your PC.

It's not like people go into PC threads and argue that an i7 is holding back the top 500 supercomputers. That would be utterly inane, yet that's what alexandros has pretty much allowed himself to do, post after post, as if he is a bearer of some unique knowledge.

These consoles are cheap and weak when compared to a $1,000 PC? No duh. The only way that would be a contribution to any thread is if someone really asked whether a $1,000 PC is going to have a better CPU than a $400 console, but that is supposed to be common sense and doesn't require derailing a whole thread.

Lol at the $1,000!!! DOLLARS!!! PC!!!

Nonsense. These CPUs are low-end compared to what's in your typical $500 gaming PC.

More importantly, this discussion has evolved to include the topic of what these weak CPUs mean for the future of multi-plat games. Including the PC in the discussion is useful for a number of reasons, not the least of which is that it too is affected.
 

vpance

Member
Consoles are the baseline, and we're still going to get amazing looking games beyond what we've seen so far. Yes, we will see something more impressive than Unity. Just think of it as 10x previous gen instead of 1/4 of Joe Schmo's PC and it'll be easier to fathom and not stress out about it all.
 

Locuza

Member
That can't happen on stock DirectX 11, so it has to be on GNMX, the PS4 emulation of DirectX 11.
1. Okay, then maybe the representation of the process is wrong in your opinion, but that doesn't necessarily lead to "this must be how DX11 works on PS4", since DX11 is not available on PS4.
2. If GNMX were a translation layer, why didn't they simply write "GNMX"?
3. They say "porting"; if GNMX were an emulation of DX11, why would you need to port anything at all?

So my final statement is: it's a comparison.
First they show you the synchronisation points, on the next slide the buffer duplication.
After that they compare it with the PS4 model: no implicit synchronisation or buffer duplications.
GNMX is more a DX11-ish type of API, abstracting away certain details.
Not a DX11 wrapper/emulator.
Something suitable for small (independent) studios.
Since GNMX also has a good amount of CPU overhead, no big production game with a CPU limit should be using it.
 

truth411

Member
The PS3's RSX was notoriously weak in comparison to Xenos and its unified shader architecture back then. Much of the optimising for Cell this gen in multiplatform games was just to get them up to par because of the split RAM and weak GPU.
Not trying to derail the thread, but that's not true; "notoriously weak" is very exaggerated wording. It's better to say the 360 GPU was more flexible; the RSX was a shader monster. The only area where the RSX was significantly weaker than the 360 GPU was vertex calculations, but Cell could do vertex calculations, and the PS3 was DESIGNED for Cell and the RSX to work together; it was NOT an afterthought. So ultimately it's a pretty moot point, as shown by 1st-party developers.
Back to your regularly scheduled program.....
 

Kinthalis

Banned
Consoles are the baseline, and we're still going to get amazing looking games beyond what we've seen so far. Yes, we will see something more impressive than Unity. Just think of it as 10x previous gen instead of 1/4 of Joe Schmo's PC and it'll be easier to fathom and not stress out about it all.


I think we'll certainly see improvements, but on consoles those improvements are NOT going to be equivalent to what we saw last gen. There are major differences in the state of 3D rasterization between then and today: in talent with the requisite expertise, in asset creation, in APIs AND in hardware.
 

Ryoku

Member
People are getting pissed when PCs are even mentioned. Jeez, grow up. Stop playing the victim of irrelevant arguments. As long as it is used to support an argument pertaining to the topic, anything is relevant--including PCs.
 
I think we'll certainly see improvements, but on consoles those improvements are NOT going to be equivalent to what we saw last gen. There are major differences in the state of 3D rasterization between then and today: in talent with the requisite expertise, in asset creation, in APIs AND in hardware.

I think a lot of console-only folk are going to be hella disappointed with just how quickly the disparity grows between what devs can do on console with their multiplats and what devs can do on PC with theirs.

The CPUs in the consoles simply don't have the grunt to carry out the evolution that we'll be seeing over the next 4 years as PC tech changes.

And I'm sure both Sony and especially MS are intimately aware of the fact that this will need to be a shorter generation.
 
The bolded parts are wrong. And I'd love to see your source for the claim that Cell is equivalent to an Atom.

PS3 offloaded GPU tasks to the CPU, as the SPUs were designed to handle such tasks. As for Core i5 GPGPU, only you know what you're talking about.

Performance-wise? Look at this:

[image: CPU performance comparison chart]

With the PS4 and Xbox One CPUs themselves equivalent to an overclocked Atom processor, the performance is nearly comparable. Now, how do you think the PS4 and Xbox One CPUs would perform in that chart if they had a variant as powerful as an i5 or i7 processor?
 