You're wrong imo. 40% more GPU power isn't that substantial. But it's enough to boost the resolution from 900p to 1080p for example.
So 40% more detail isn't substantial!?
They have to live with console limitations, though, which I'm sure was Can Crusher's point.
Otherwise, why would PC gamers complain about being dragged down by them?
I believe the crux of his argument is that, compared to Xbox One games, PS4 titles don't have 40% "better graphics": while the resolution difference is there, PS4 games don't have 40% better effects, better shaders, better geometry or better animations. They are the same games, only a bit shinier.
I'm not sure this analogy is helpful. ACEs are for scheduling and dispatching work to the GPU's multiprocessors (= compute units = CUs). This AnandTech article explains it a bit.
The way I see it is the more different tasks you want to do on the GPU, possibly in parallel, the more ACEs you need if you don't want them to become a bottleneck. If there's no bottleneck for a specific application on the other hand, more ACEs don't do anything for performance.
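To make that bottleneck idea concrete, here's a toy sketch of my own — an analogy only, with nothing to do with how real GCN hardware actually works. It treats each ACE as an independent queue that can dispatch one task per tick onto a free compute unit; every number in it is made up for illustration.

```python
# Toy model: ACEs as dispatch queues, CUs as workers.
# Each tick, every ACE may dispatch one pending task to a free CU;
# a task then occupies its CU for task_ticks ticks.
# (Illustrative analogy only, not real GPU behavior.)

def ticks_to_finish(num_tasks, num_aces, num_cus, task_ticks=4):
    pending = num_tasks
    in_flight = []                      # remaining ticks per running task
    ticks = 0
    while pending or in_flight:
        # retire finished work, age the rest
        in_flight = [t - 1 for t in in_flight if t > 1]
        # each ACE may dispatch one task, as long as a CU is free
        dispatched = min(num_aces, pending, num_cus - len(in_flight))
        pending -= dispatched
        in_flight += [task_ticks] * dispatched
        ticks += 1
    return ticks

print(ticks_to_finish(64, 2, 18))    # few ACEs: dispatch-bound, CUs sit idle
print(ticks_to_finish(64, 8, 18))    # more ACEs: finishes sooner, now CU-bound
print(ticks_to_finish(64, 32, 18))   # extra ACEs beyond saturation: no gain
```

With few queues the dispatch rate caps throughput even though compute units are free; past the point where the CUs saturate, adding more queues changes nothing — which is the "no bottleneck, no benefit" point above.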
In terms of theoretical peak performance? Yes. Taking a look at this, a current Haswell CPU core can do up to 32 FLOPs per cycle. The Core i5-4430 (the slowest i5) clocks at 3 GHz and has 4 cores: 3 × 4 × 32 = 384 GFLOP/s. The PS3's Cell (with one SPU deactivated) has 204.8 GFLOP/s.
Real-world performance for most applications is a lot higher on the Haswell anyway, of course, since the Cell's peak was hard to reach in practice.
edit:
The PS3 had an Nvidia GPU, the Xbox 360 an AMD GPU. Even if they were the same architecture, though, clock speed doesn't mean anything without the number of shader units.
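The peak-throughput arithmetic above is just clock × cores × FLOPs-per-cycle; a quick sketch with the post's own numbers (theoretical peak only, not a benchmark):

```python
# Theoretical peak: clock (GHz) * cores * FLOPs per core per cycle.
# Ignores turbo, memory bandwidth, and everything else real workloads hit.

def peak_gflops(clock_ghz, cores, flops_per_cycle):
    return clock_ghz * cores * flops_per_cycle

# Core i5-4430: 3 GHz, 4 cores, 32 SP FLOPs/cycle (2x 256-bit FMA units)
print(peak_gflops(3.0, 4, 32))   # 384.0
```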
The question is, though: will more ACEs benefit GPGPU in PS4 games?
Will there even be games that use GPU compute in a way that needs more ACEs for wavefront scheduling?
The ACE count upgrade was mostly designed for very high-end cards that reach 5-6 TFLOPs, not for 1.8 TFLOPs. And still, we don't know whether it's technology designed more for GPGPU-based research applications than for games.
It could be a boost for the PS4, but it also might not be, and for now we don't have any proof that it helps in real-world game scenarios.
It's not just resolution differences though. There's plenty of games that also have better performance and occasionally better visual effects on PS4 compared to Xbox One.
No doubt, but the level of quality for the casual observer can't really be described as 40% better. Double the framerate or advanced graphical effects would indeed be considered significant. A bit steadier framerate and a bit sharper image are not easily noticeable.
Well, at least now we know that Albert Penello was right when he said MS weren't going to just give Sony a 40% speed advantage.
They're giving Sony a 100% speed advantage.
Pretty sure there are games that are "double the framerate" on PS4 compared to Xbox One.
The difference in multiplats between PS4 & Xbone should be more noticeable than last generation.
PC gamers are the only ones that don't have to "live with it" though; they have the means to power through any limitations. If this discussion doesn't interest you or you don't understand it, you can just ignore it. No one is forcing you to participate.
This thread isn't about PC though, and no, you really can't "power through" design decisions. Your tiresome and constant finger-pointing has reached the point of nonsense. Nothing will change the design of these consoles, or the fact that games will be designed around their limitations. It doesn't matter if you have a hexacore at home.
a $899 ps4 probably
Oh I agree, I never thought the differences between PS3 and 360 multiplatform games were ever big enough to warrant those huge discussions. Still, if I were Sony I'd tell developers to have resolution parity with the Xbox and push framerate and quality higher. The result would be immediately noticeable even to the untrained eye.
How relevant are these numbers? The PS3-360 gap in "final performance" seems baffling.
Pretty sure most, if not all, Ubi games on PS3/360 are voted as better on 360 (per Digital Foundry).
Slides 71-73 are highlighting how they go from reference (DirectX) to the PS4 implementation, and are clearly labeled as such.
I'm a little confused about how increasing the resolution to 1080p would increase AI and CPU usage.
And boy can you tell when you look at the games!
The closest example you're gonna find is Tomb Raider, and even there it's not doubling the framerate.
Somehow this data proves misterxmedia was right all along.
Entirely subjective. He feels the difference isn't that substantial, and questions himself for that, but it's fine to feel that way. The same can be said for the opposite: if it feels substantial to you, then it is. There's nothing more to it.
The bad CPU in both machines has always been a bad idea IMHO. I know why they did it, but a much better CPU would have made game development a lot more flexible.
I believe in the future developers will become better at programming for 8 cores.
Is CELL better than Jaguar or not? I am wondering if it would be better to just have CELL + AMD GPU. Would it be too expensive?
Where are we, one year into next gen, with many devs working cross-platform? As developers actually start to push graphics, I think the difference will be much easier to see as this gen goes on. Look at the 360 launch games, for example: most looked like up-rezzed Xbox games. It takes devs a bit of time.
What's wrong with what he said? He's pretty much right. This thread is about the CPUs in the PS4 and XB1; not the CPU in your PC.
Wow. ok...
It's not like people go into PC threads and argue that an i7 is holding back the top 500 supercomputers. That would be utterly inane, yet that's what alexandros has pretty much allowed himself to do, post after post, as if he is a bearer of some unique knowledge.
These consoles are cheap and weak when compared to a $1,000 PC? No duh. The only way that would be a contribution to any thread is if someone really asked whether a $1,000 PC is going to have a better CPU than a $400 console, but that is supposed to be common sense and doesn't require derailing a whole thread.
That can't happen on stock DirectX 11, so it has to be GNMX, the PS4's emulation of DirectX 11.
1. Okay, then maybe the representation of the process is wrong in your opinion, but that doesn't necessarily lead to "this must be how DX11 works on PS4", since DX11 is not available on PS4.
The PS3's RSX was notoriously weak in comparison to the Xenos and its unified shader architecture back then. Much of the optimizing for Cell this gen in multiplatform games was just to get them up to par, because of the split RAM and weak GPU.
Not trying to derail the thread, but that's not true; "notoriously weak" is very exaggerated wording. It's better to say the 360 GPU was more flexible; the RSX was a shader monster. The only area where the RSX was significantly weaker than the 360 GPU was vertex calculations, but Cell could do vertex calculations, and the PS3 was DESIGNED for Cell and the RSX to work together. It was NOT an afterthought, so ultimately it's a pretty moot point, as shown by 1st-party developers.
Consoles are the baseline, and we're still going to get amazing looking games beyond what we've seen so far. Yes, we will see something more impressive than Unity. Just think of it as 10x previous gen instead of 1/4 of Joe Schmo's PC and it'll be easier to fathom and not stress out about it all.
I think we'll certainly see improvements, but on consoles those improvements are NOT going to be equivalent to what we saw last gen. There are major differences between the state of 3D rasterization back then and today, in terms of talent with the requisite expertise, asset-creation APIs, AND hardware.
Bolded are wrong. And I'd love to see your source for the claim the Cell is equivalent to an Atom.
PS3 offloaded GPU tasks to the CPU, as the SPUs were designed to handle such tasks. As for Core i5 GPGPU, only you know what you're talking about.