
Ubisoft GDC EU Presentation Shows Playstation 4 & Xbox One CPU & GPU Performance- RGT

I see so many posts wishing an Intel CPU was used; I understand that Intel has the most efficient x86 CPUs compared to AMD in both power consumption and instructions per cycle, but doesn't Intel cripple their low-end CPUs by removing instructions and cores to reduce cost?

Considering the cost restrictions of the current consoles, would we have seen anything more than a dual-core Atom with Hyper-Threading or a dual-core Celeron missing instructions like AVX?

The current generation of Atom has respectable performance, I think close to Core 2 territory in terms of IPC, but considering when the consoles were being planned, I think we would have ended up with the last-gen Atom, which would be slower and have fewer cores compared to the AMD CPU the PS4 and Xbox One have today.
Sort of, but every CPU maker does this to an extent. It's for the same reason the 360 had a tri-core that was actually a quad-core processor with one core disabled for improved yields (not every CPU on a wafer is perfect).
 

Realyn

Member
What is considered low end though? My PC still uses a Core 2 Duo and only has 2 GB of RAM. I can't even run L4D2 at 60 fps at 900p, and I play most PC games at low or medium presets, at low resolutions, at around 30 fps.

Well, that's what, 2006-2007 specs? I would say that in a rapidly changing business that's very much low end. It's only two years after the 360's release.
 
Are YOU kidding?

That's more of a sprint and nowhere close to replicating the kind of lighting and particle effects of light speed in Second Son. Graphically the two games aren't even remotely comparable.

The discussion has nothing to do with graphics - it's about whether an open world like Infamous SS could have been done on last-gen hardware. The point the other guy raised was whether or not traversal like Infamous's could be done - as in, dashing quickly around a city. Arguably what's going on in that video is actually faster traversal around an open world than in Infamous.
 
What is considered low end though? My PC still uses a Core 2 Duo and only has 2 GB of RAM. I can't even run L4D2 at 60 fps at 900p, and I play most PC games at low or medium presets, at low resolutions, at around 30 fps.

A low end gaming PC is a rig with a contemporary entry-level graphics card and CPU. Your specs are 360/PS3 tier.
 

Etnos

Banned
Owning both consoles, and as of right now, almost a year after release, I would say there is no game on the PS4 that looks 40% better or has substantially superior fidelity. People keep parroting that 1080p vs. 900p is 40% more pixels... it doesn't translate into 40% better visuals, not even close (diminishing returns).

Of course, this may change later in the generation.
 
The discussion has nothing to do with graphics - it's about whether an open world like Infamous SS could have been done on last-gen hardware. The point the other guy raised was whether or not traversal like Infamous's could be done - as in, dashing quickly around a city. Arguably what's going on in that video is actually faster traversal around an open world than in Infamous.
If that's his assertion, didn't Prototype already do the superspeed running up walls?
 

UnrealEck

Member
I bet the Emotion Engine from the PS2 would wipe the floor with the latest i7. Remember, Saddam was going to buy hundreds of PS2s to build a super weapon with. Incredible stuff!
joking.
 
Owning both consoles, and as of right now, almost a year after release, I would say there is no game on the PS4 that looks 40% better or has substantially superior fidelity. People keep parroting that 1080p vs. 900p is 40% more pixels... it doesn't translate into 40% better visuals, not even close (diminishing returns).

Of course, this may change later in the generation.

I don't know what 40% better visuals mean, at all. The phrase does not make any sense.
 

Xando

Member
Owning both consoles, and as of right now, almost a year after release, I would say there is no game on the PS4 that looks 40% better or has substantially superior fidelity. People keep parroting that 1080p vs. 900p is 40% more pixels... it doesn't translate into 40% better visuals, not even close (diminishing returns).

Of course, this may change later in the generation.

How do you see 40% better visuals though?
The only thing you can see is whether it looks better or worse than another game.
 

Salvadora

Member
How do you see 40% better visuals though?
The only thing you can see is whether it looks better or worse than another game.
40% more p

1080 ÷ 100 x 40 = 432

1080 + 432 = 1512

The PS4 should be displaying games at 1512p if the Xbox One version is 1080p.
 

Etnos

Banned
I don't know what 40% better visuals mean, at all. The phrase does not make any sense.

Higher fidelity overall

How do you see 40% better visuals though?

You know what... I wonder the same: how come one of my consoles that is 40% more powerful doesn't translate that into 40% more fidelity? Sure, games look slightly better (in some cases), but not really 40% better.

40% more power should be a substantial improvement in fidelity, I'm guessing... maybe I'm wrong.
 

BeforeJam

Neo Member
I read somewhere that Cell even does some things better than an i7, so it's not hard to imagine it being better than the PS4's CPU. We're talking about a CPU that did graphics rendering in 2006 and was really good at parallel processing.

The idea behind Cell was the 'processor to end all processors.' Despite being a bitch to code for, it has aged very well.

40% more p

1080 ÷ 100 x 40 = 432

1080 + 432 = 1512

The PS4 should be displaying games at 1512p if the Xbox One version is 1080p.


Also TIL that 1080p Xbox games should easily be able to downsample from 1440p on PS4.
 

Fafalada

Fafracer forever
cheesekao said:
What is considered low end though?
On enthusiast message boards it's mostly price-bracketed (and low end starts at sub-$200). As you point out, the real world is completely different, and the actual low-end tail of PC hardware extends far beyond that (and reaches well below even the last gen of console hardware in places).

EventHorizon said:
If the XB1 or PC were being described, wouldn't they be labeled?...at least once?...somewhere?
Slides 71-73 highlight how they go from the reference (DirectX) implementation to the PS4 implementation, and they are clearly labeled as such.
 

megathor

Member
I'd like 1080p/30fps, no bugs, and weapon and clothing animations that don't clip through the fucking character. Seriously, it's been happening since AC1. You run and the sword goes through your character and the robe goes through his legs. How is this still happening?

Cloth is hard, so most games only simulate the visual part and ignore the physics interaction with the rest of the game; there's no real solution to it at the minute, unfortunately. Also, no bugs is just never going to happen. It's too expensive to ship bug-free software for it to be worth it in anything other than space travel/defence.
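
To give a rough idea of what "only simulating the visual part" means in practice, here's a minimal, hypothetical Python sketch of the textbook approach: Verlet-integrated cloth points held together by distance constraints, with no collision against the character at all. The names and constants are made up for illustration, not taken from any real engine.

Code:
# Minimal sketch of "visual-only" cloth: Verlet-integrated point masses plus
# distance constraints, with no collision against gameplay geometry.
# Purely illustrative; constants and structure are not from any shipped engine.

GRAVITY = -9.81
DAMPING = 0.99
DT = 1.0 / 60.0

class Point:
    def __init__(self, x, y, pinned=False):
        self.x, self.y = x, y        # current position
        self.px, self.py = x, y      # previous position (velocity is stored implicitly)
        self.pinned = pinned         # pinned points follow the character instead of physics

def integrate(points):
    # Verlet step: next position from current and previous positions.
    for p in points:
        if p.pinned:
            continue
        vx = (p.x - p.px) * DAMPING
        vy = (p.y - p.py) * DAMPING
        p.px, p.py = p.x, p.y
        p.x += vx
        p.y += vy + GRAVITY * DT * DT

def satisfy_constraints(points, links, rest_length, iterations=4):
    # Pull linked points back toward their rest distance; more iterations = stiffer cloth.
    for _ in range(iterations):
        for a, b in links:
            pa, pb = points[a], points[b]
            dx, dy = pb.x - pa.x, pb.y - pa.y
            dist = (dx * dx + dy * dy) ** 0.5 or 1e-6
            push = (dist - rest_length) / dist * 0.5
            if not pa.pinned:
                pa.x += dx * push
                pa.y += dy * push
            if not pb.pinned:
                pb.x -= dx * push
                pb.y -= dy * push

# One vertical strand hanging from a pinned point, simulated for a second at 60 fps.
points = [Point(0.0, -i * 0.1, pinned=(i == 0)) for i in range(10)]
links = [(i, i + 1) for i in range(9)]
for _ in range(60):
    integrate(points)
    satisfy_constraints(points, links, rest_length=0.1)
print(round(points[-1].y, 3))  # the free end has settled under gravity

The robe clipping through legs happens because nothing in a loop like this ever checks the character's leg geometry; adding per-vertex collision against the body is exactly the part that gets expensive.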
 

stryke

Member
I know right... If only measuring visuals were as simple as comparing synthetic GPU benchmarks.

It is simple. You're the one making it subjective and complicated.

But sure, go ahead and complain about "I see no 40%" when you don't even attempt to establish to us what 40% means to you - higher framerate? More accurate AO? Texture filtering? AA? Tessellation? Other buzzwords?
 

Caayn

Member
40% more p

1080 ÷ 100 x 40 = 432

1080 + 432 = 1512

The PS4 should be displaying games at 1512p if the Xbox One version is 1080p.
That's almost a 100% increase in resolution, not a 40% increase ;)
Also TIL that 1080p Xbox games should easily be able to downsample from 1440p on PS4.
It doesn't work that easily. Also, 1440p over 1080p is a 75%+ increase in resolution; the hardware gap is not that big.
 
40% more p

1080 ÷ 100 x 40 = 432

1080 + 432 = 1512

The PS4 should be displaying games at 1512p if the Xbox One version is 1080p.

Who said anything about "40% more p" (= lines)? It's ~40% more pixels. In 16:9, 1080p has 1920*1080 = 2,073,600 pixels; 40% more is 2,903,040 pixels, or a resolution of ~2272x1278.

edit: Just a piece of advice: if you want to know what an increase of 40% is (= 140% of the original), it's easier to just multiply by 1.4.
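
If anyone wants to sanity-check the numbers being thrown around in this thread, the arithmetic is trivial. A tiny illustrative snippet (the 2688x1512 mode is just the 16:9 frame for 1512 lines, included for comparison):

Code:
# Pixel-count comparisons for the 16:9 modes discussed in this thread.
def pixels(w, h):
    return w * h

base = pixels(1920, 1080)                   # 1080p = 2,073,600 pixels
print(base / pixels(1600, 900))             # ~1.44 -> 1080p is ~44% more pixels than 900p
print(pixels(2560, 1440) / base)            # ~1.78 -> 1440p is ~78% more pixels than 1080p
print(pixels(2688, 1512) / base)            # ~1.96 -> 1512p is nearly double 1080p, not 40% more
print(round((base * 1.4 * 16 / 9) ** 0.5))  # ~2272 -> width of a 16:9 frame with exactly 40% more pixels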

Higher fidelity overall



You know what... I wonder the same: how come one of my consoles that is 40% more powerful doesn't translate that into 40% more fidelity? Sure, games look slightly better (in some cases), but not really 40% better.

40% more power should be a substantial improvement in fidelity, I'm guessing... maybe I'm wrong.

You're wrong imo. 40% more GPU power isn't that substantial. But it's enough to boost the resolution from 900p to 1080p for example.
 

KidJr

Member
Mostly because there are no factual tests showing 2 ACEs being a bottleneck for GPGPU computing in games.

Well, that's not quite true. I mean, isn't it just logical that having more than 2 would be beneficial to GPU compute?


I don't know if others would agree or disagree with this analogy, but this is how it was explained to me a while ago. The GPU can be seen as a bed of nails: the person lying on top is the data, and at the base of each nail there is a processor, so the nail is actually an arrow pointing from processor to memory. All the nails are in a regular pattern, like a grid. If the body is well spread out, it feels fine (performance is good); if the body only touches some spots of the nail bed, then the pain is bad (bad performance).
 

Green Yoshi

Member
The Cell is a fucking monster for crunching raw data. The problem is actually using the architecture effectively in a game engine, and working around the PS3's lackluster GPU. PS3/360-gen performance was dictated more by the architecture differences between the two machines than this gen, where the boxes are largely the same configuration outside of bandwidth and GPU differences.
I thought the PS3 had a faster GPU than the 360 (550 MHz vs. 500 MHz).

Still, actual games such as Alien Isolation, The Evil Within and F1 2014 perform better on the 360.
 

Inuhanyou

Believes Dragon Quest is a franchise managed by Sony
Owning both consoles, and as of right now, almost a year after release, I would say there is no game on the PS4 that looks 40% better or has substantially superior fidelity. People keep parroting that 1080p vs. 900p is 40% more pixels... it doesn't translate into 40% better visuals, not even close (diminishing returns).

Of course, this may change later in the generation.

A tag quote has never been more appropriate.
 

Realyn

Member
That's almost a 100% increase in resolution, not a 40% increase ;)
It doesn't work that easily. Also, 1440p over 1080p is a 75%+ increase in resolution; the hardware gap is not that big.

I agree with you that it's terrible thinking on his part. But I think/hope it was tongue in cheek and that he's joking? If he isn't ... well, a statement like that sums up the reasoning of some people perfectly.
 

SpyGuy239

Member
FYI, that lowest common denominator is still higher than the vast majority of Windows PCs. Want to blame the lack of technical progress on something? Blame the consumers who won't spend more on hardware, be it console or PC.

Spot on. Finally someone with some sense.
 

Mugatu

Member
Windows PCs in general are an irrelevant metric when we're talking about PC gaming. Even the Steam hardware survey that some people like to bring up from time to time is massively skewed by the fact that a lot of people who participate in it are from very poor countries. Given the now well proven fact that even Core i3s and 750Tis can give consoles a run for their money I'd say that even low end gaming PCs are at least on par with current gen consoles.

You should get your facts straight. Even AMD's crap PC CPUs are enough to compete with those netbook-grade Jaguar cores.

Yes, but they are also skewed towards gaming PCs, which are higher spec than the general computing population. I doubt it evens out, but it is biased in both the positive and negative directions.

You're right though that those CPUs are very low end.
 

KKRT00

Member
Well, that's not quite true. I mean, isn't it just logical that having more than 2 would be beneficial to GPU compute?


I don't know if others would agree or disagree with this analogy, but this is how it was explained to me a while ago. The GPU can be seen as a bed of nails: the person lying on top is the data, and at the base of each nail there is a processor, so the nail is actually an arrow pointing from processor to memory. All the nails are in a regular pattern, like a grid. If the body is well spread out, it feels fine (performance is good); if the body only touches some spots of the nail bed, then the pain is bad (bad performance).
Sure, more ACEs will benefit GPU compute; that's what they were designed for.
The question, though, is: will more ACEs benefit GPGPU in PS4 games?
Will there even be games that use GPGPU compute in a way that needs more ACEs for wavefront scheduling?
The increased ACE count was mostly designed for very high-end cards that reach 5-6 TFLOPS, not for 1.8 TFLOPS. And still, we don't know whether it's a technology designed more for GPGPU-based research applications than for games.

It can be a boost for the PS4, but it also might not be, and for now we don't have any proof that it helps in real-world game scenarios.

--
You should let AMD know that the extra ACEs in their higher end cards are going to waste then.

Read my answer above.
 
No, the ACE increase was chosen specifically by Sony for the PS4, after which AMD adopted it across their entire product line, from Kaveri to their high-end GPUs.
 
Well, that's not quite true. I mean, isn't it just logical that having more than 2 would be beneficial to GPU compute?


I don't know if others would agree or disagree with this analogy, but this is how it was explained to me a while ago. The GPU can be seen as a bed of nails: the person lying on top is the data, and at the base of each nail there is a processor, so the nail is actually an arrow pointing from processor to memory. All the nails are in a regular pattern, like a grid. If the body is well spread out, it feels fine (performance is good); if the body only touches some spots of the nail bed, then the pain is bad (bad performance).

I'm not sure this analogy is helpful. ACEs are for scheduling and dispatching work to the GPU's multiprocessors (= compute units = CUs). This AnandTech article explains it a bit.

Anandtech said:
Meanwhile on the compute side, AMD’s new Asynchronous Compute Engines serve as the command processors for compute operations on GCN. The principal purpose of ACEs will be to accept work and to dispatch it off to the CUs for processing. As GCN is designed to concurrently work on several tasks, there can be multiple ACEs on a GPU, with the ACEs deciding on resource allocation, context switching, and task priority. AMD has not established an immediate relationship between ACEs and the number of tasks that can be worked on concurrently, so we’re not sure whether there’s a fixed 1:X relationship or whether it’s simply more efficient for the purposes of working on many tasks in parallel to have more ACEs.[...]

The way I see it is the more different tasks you want to do on the GPU, possibly in parallel, the more ACEs you need if you don't want them to become a bottleneck. If there's no bottleneck for a specific application on the other hand, more ACEs don't do anything for performance.
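
As a toy illustration of that point - and this is purely a made-up model, nothing like how GCN actually schedules work - imagine each ACE can hand at most one compute job per tick to the CUs:

Code:
# Toy model only: each ACE dispatches at most one compute job per tick.
# Real GCN scheduling (priorities, context switching, wavefront granularity) is far more involved.
def ticks_to_dispatch(num_jobs, num_aces):
    ticks, remaining = 0, num_jobs
    while remaining > 0:
        remaining -= min(num_aces, remaining)
        ticks += 1
    return ticks

for jobs in (2, 64):                 # a frame with few vs. many independent GPGPU tasks
    for aces in (2, 8):
        print(jobs, "jobs,", aces, "ACEs ->", ticks_to_dispatch(jobs, aces), "ticks")

With only a couple of independent jobs queued per frame, the extra ACEs change nothing; they only start to matter once you actually throw a lot of separate compute tasks at the GPU, which is the "only a bottleneck if you saturate it" point above.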

Have i5 processors caught up with the Cell yet?

In terms of theoretical peak performance? Yes. Taking a look at this, a current Haswell CPU core can do up to 32 FLOPs per cycle. The Core i5-4430 (the slowest i5) clocks at 3 GHz and has 4 cores: 3*4*32 = 384 GFLOP/s. The PS3's Cell (with one SPU deactivated) has 204.8 GFLOP/s.
Real-world performance in most applications favours the i5 by even more, of course.
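
Spelled out as a back-of-the-envelope calculation (the per-cycle figures are the commonly quoted theoretical peaks, not measured numbers):

Code:
# Theoretical peak single-precision throughput, back-of-the-envelope only.
def peak_gflops(cores, ghz, flops_per_cycle):
    return cores * ghz * flops_per_cycle

i5_4430 = peak_gflops(cores=4, ghz=3.0, flops_per_cycle=32)   # Haswell: two 256-bit FMA units per core
cell    = peak_gflops(cores=1, ghz=3.2, flops_per_cycle=8) \
        + peak_gflops(cores=7, ghz=3.2, flops_per_cycle=8)    # PPE + 7 SPEs, 8 FLOPs/cycle each
print(i5_4430, cell)                                          # 384.0 vs ~204.8 GFLOP/s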


edit:

I know, but they were from the same generation. Nvidia built quite good GPUs back then.

The PS3 had an Nvidia GPU, the Xbox 360 an AMD one. Even if they were the same architecture, though, clock speed doesn't mean anything without the number of shader units.
 

Inuhanyou

Believes Dragon Quest is a franchise managed by Sony
I know, but they were from the same generation. Nvidia built quite good GPUs back then.

The PS3's RSX was notoriously weak in comparison to Xenos and its unified shader architecture back then. Much of the optimizing for Cell in multiplatform games that gen was just to get them up to par, because of the split RAM and the weak GPU.
 
I don't even understand the whole pc debate. The consoles are as they are, and games will be designed with them in mind. Live with it.
 
I don't even understand the whole pc debate. The consoles are as they are, and games will be designed with them in mind. Live with it.

PC gamers are the only ones that don't have to "live with it", though; they have the means to power through any limitations. If you feel this discussion is something that doesn't interest you or that you don't understand, you can just ignore it. No one is forcing you to participate.
 

goonergaz

Member
It is simple. You're the one making it subjective and complicated.

But sure, go ahead and complain about "I see no 40%" when you don't even attempt to establish to us what 40% means to you - higher framerate? More accurate AO? Texture filtering? AA? Tessellation? Other buzzwords?

My eyesight was fine until I went to the optician's; all that time I spent thinking my eyes were fine, it turns out they were good enough but not 'perfect'. Having finally got my glasses, I now see in 'superHD'.

I wonder if people suffer from the same thing, their eyesight seems fine but in reality it's just not good enough to see the differences!?
 