EDGE: "Power struggle: the real differences between PS4 and Xbox One performance"

Developers likely won't deliberately make the PS4 version run worse. HOWEVER, if it is running acceptably when they port it over and the Xbone version isn't, they will be spending their time improving the Xbox version rather than optimizing both. Obviously, this only applies if one team is doing both versions.

Another thing to consider is that if the PS4 is as easy to develop for as reported, there may be many more exclusives produced than in the previous generation. More and better exclusives could force third party developers to do a better job so that they don't look bad in comparison.
 
Why don't you tell us how that turned out? Because last time I checked, the PS3 had some of the most graphically acclaimed games of any console this generation.

Also, maybe you didn't get the memo, but these aren't hypothetical projections based on theoretical performance. These are actual performance numbers based on running code. So, yeah, the PS3 didn't make the 360 look like an Xbox 1.5, but thanks to Microsoft, the PS4 is making the Xbox One look like the 360.5.

Wouldn't it be Xbox 540, if you're only adding 50% of the previous generation's number to the value in an ironic way?
 
The thing is, the PS4 Eye is more capable than Kinect 1 and can do a lot of the same things Kinect 2 can, though Kinect 2 is indeed superior to it. So casual titles like fitness and dance games will be playable on the PS4 without the Move this time around. They also said they have the software ready for the voice commands and facial recognition that worked on PS3, so they can expand on that for PS4. Sony won't be without its Kinect alternative this time around.

Dance games, sure, but Kinect 2 is on another level when it comes to fitness applications. I hate to post this twice in the span of a few hours, but
[image]

I don't see any way that they could extract exertion or pulse information from the 3d PlayStation Camera. Whether that turns out to be useful/meaningful, who knows

There was a tech demo demonstrating this as well
 
It's just 18 CUs; there are no hardware differences. The "14+4" is just an example of how you could schedule graphics and compute, if you want to.

Does that Cerny quote refute the leak, which seems to be correct on most things? He vaguely mentions it has more ALU than you would have for "just graphics".

I had thought there was a programmable split in the hardware; clearly not.
 
You can't say how big that gap will be because we don't know. But once the drivers get accustomed to the track and their cars, there's a good chance we're gonna see that gap grow.

The devs quoted in this article have called the difference "significant" and "obvious." You don't say that unless there's a fairly decent difference.

You are right. No one knows how wide the gap is going to be. There WILL BE a power difference, make no mistake about that. I'm arguing that because both cars were up and running at about the same speed at lap 1, the distance between the two cars will not be that big. I'm arguing that when car A is finishing lap 9, car B will be somewhere in the middle of lap 9, as opposed to the impression that I get from some of you, where it seems like car A will be running laps around car B.
 
Dance games, sure, but Kinect 2 is on another level when it comes to fitness applications. I hate to post this twice in 2 pages, but whatever:


I don't see any way that they could extract exertion or pulse information from the 3d PlayStation Camera.

If there's anything gamers are known for, it's all the strenuous activity they like to do while gaming. Or while not gaming. Or period.

You are right. No one knows how wide the gap is going to be. There WILL BE a power difference, make no mistake about that. I'm arguing that because both cars were up and running at about the same speed at lap 1, the distance between the two cars will not be that big. I'm arguing that when car A is finishing lap 9, car B will be somewhere in the middle of lap 9, as opposed to the impression that I get from some of you, where it seems like car A will be running laps around car B.

I don't disagree, though it's hard to agree either. We simply don't know.
 
If there's anything gamers are known for, it's all the strenuous activity they like to do while gaming. Or while not gaming. Or period.



I don't disagree, though it's hard to agree either. We simply don't know.

Because all gamers are the same and not one of us has different likes or dislikes.
 
If there's anything gamers are known for, it's all the strenuous activity they like to do while gaming. Or while not gaming. Or period.

The post I was replying to was about the PlayStation Camera's effectiveness versus Kinect in fitness and dance games. I don't think I've ever defended Kinect as a core gaming device.
 
I'm actually a bit surprised about the "It's a pain to use the ESRAM" bit though, given that even back with the 360 you had the EDRAM, and I don't recall any devs having issues with that.

Why would anyone be surprised it's a pain in the ass? Developers need to constantly micromanage it. It's just one more task that they could do without. With the PS4 none of this is necessary, thanks to the GDDR5 super highway.
 
Because all gamers are the same and not one of us has different likes or dislikes.

I was speaking of the majority. But it doesn't really matter, since I was joking.

The post I was replying to was about the PlayStation Camera's effectiveness versus Kinect in fitness and dance games. I don't think I've ever defended Kinect as a core gaming device.

My apologies. I lost the thread of your discussion.
 
Sure, if the VGLeaks article on the PS4 architecture is accurate, the GPU in there is a 14 CU GPU with 4 extra CUs tacked on.

There's very little information available about those 4 extra CUs. Cerny was asked about this by DF and responded that they added a little more ALU than you normally would to encourage studios to use GPGPU for things like audio raycasting, etc.

If the 14+4 GPU is correct, it may not have extra texture cache, L1 cache, L2 cache or TLB cache for those CUs. The PS4 works around that by supplying the Onion+ bus, which gives the GPU a 20 GB/s channel to memory that doesn't use the L1 or L2 cache. Bypassing the caches means that GPGPU performance would be slower on CUs using Onion+, as they have to wait for the writes to complete before carrying on. Memory caches are used to buffer memory into fast on-chip RAM.

The Xbox One doesn't have these extra CUs, but it does have some fixed-function co-processors to do jobs like audio raycasting and moving memory, and it has a fast on-chip scratch pad which will have much less latency than writing to external memory (if the PS4's ROPs suffer roughly twice the latency of the Xbox One's, the two will have similar ROP throughput when the Xbox One is writing to ESRAM). The leaked PS4 GPU clock is 800 MHz and the Xbox One's is 853 MHz, so you've got 12 @ 853 vs 14+4 @ 800. I doubt anyone in this thread knows exactly how that will play out in the long term.

This is where the balanced argument comes from.
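For what it's worth, here's a rough back-of-the-envelope sketch of the theoretical shader throughput those two configurations imply. It assumes standard GCN compute units (64 ALUs each, counting a fused multiply-add as two FLOPs per clock) and uses the leaked clocks quoted above, so treat it as paper numbers rather than measured performance:

```python
# Peak single-precision throughput for GCN-style GPUs, using the leaked
# CU counts and clock speeds discussed above (assumptions, not official specs).
ALUS_PER_CU = 64          # shader lanes per GCN compute unit
FLOPS_PER_CLOCK = 2       # one fused multiply-add = a multiply and an add

def peak_gflops(cu_count, clock_mhz):
    """Theoretical peak GFLOPS = CUs x ALUs x FLOPs/clock x clock."""
    return cu_count * ALUS_PER_CU * FLOPS_PER_CLOCK * clock_mhz / 1000.0

ps4 = peak_gflops(18, 800)    # 18 CUs at the leaked 800 MHz
xb1 = peak_gflops(12, 853)    # 12 CUs at 853 MHz

print(f"PS4:      {ps4:7.1f} GFLOPS")   # ~1843.2
print(f"Xbox One: {xb1:7.1f} GFLOPS")   # ~1310.2
print(f"Ratio:    {ps4 / xb1:.2f}x")    # ~1.41x on paper
```

Of course, paper FLOPS aren't the whole story; ROPs, bandwidth, and how well the ESRAM gets used all factor in, which is exactly why nobody can say how this plays out long term.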

lol, welcome to january? There is no 14+4... there is only 18.

As to rest of the "argument", hah.
 
Proof is in the pudding. PS3 was supposed to make 360 look like Xbox 1.5.

We all know how that turned out

PS3 had a strange architecture that was difficult to program for, so the power wasn't utilized until far into its life. The PS4 was developed with an entirely different mindset and is basically as easy to program for as a PC. I don't think it is valid to compare it to the PS3 vs 360 debate.
 
It's not about stupidity, it's about burning bridges. Microsoft is a huge company in gaming and in many other fields. You really want to be on good terms with them, not to mention that they might also aggressively money hat devs for parity and more exclusives.
You typed this and yet don't know what you're talking about.
 
Still have not seen any games or comparisons, and these consoles launch in 2 months.

Are you also wondering what the deal is with "Assassin's Creed" and "Watch_Dogs"? I'm curious to see these two games running on Xbox One. I've heard nothing about it so far. This has kept me from reserving either one. Words are one thing... give me something "I" can judge if it's about the games. -Adam
 
Yet you spend the rest of your post disagreeing...

And the analogy was obviously a bad idea because you're not able to look at anything outside of the perfectly literal sense. So I'll talk literally, I guess, and see if any of that gets through:

The devs that are working on launch games for these consoles right now don't have time to figure out what makes these machines special. They're working around the clock just trying to get these games ready to ship in November. They were developing software towards a moving target until just months ago. They have bugs to squash, networks to stress test, and in many cases, other platforms to develop and optimize for. If there were large differences between games on the two consoles at this point, that might be revealing. The fact that there aren't large differences is not revealing at all.

Disagree if you like. I can't sum up the reality of a hardware launch any better than that. I've got to go take a dump anyway.

I do understand what you are trying to say. It is "getting through" (lol, again with the superiority complex); I just disagree with it.

Thank you for acknowledging that your analogy sucked. Your analogy was obviously a bad idea because it was poorly constructed, not because I was being "perfectly literal". You are just trying to downplay the practical parity shown by both consoles' launch games because it doesn't show any evidence for a large disparity in real-world performance. The fact that there aren't large differences is not revealing of how far the PS4 will go vs the XB1 (this is basically what you are saying), but it does show that the XB1 is capable of generally matching the PS4.

If I had said this:

"because the launch games are similar then the performance will be similar"

then whatever argument you are trying to make here would be applicable. But that's not what I am saying. The PS4 WILL outperform the XB1, just not by a fuckton like you are suggesting.
 
Are you also wondering what the deal is with "Assassin's Creed" and "Watch_Dogs"? I'm curious to see these two games running on Xbox One. I've heard nothing about it so far. This has kept me from reserving either one. Words are one thing... give me something "I" can judge if it's about the games. -Adam
Yeah, I am; Watch_Dogs because I'm interested, Assassin's Creed just because. I know we keep hearing things, but really there hasn't been anything shown.
 
Yet you spend the rest of your post disagreeing...

And the analogy was obviously a bad idea because you're not able to look at anything outside of the perfectly literal sense. So I'll talk literally, I guess, and see if any of that gets through:

The devs that are working on launch games for these consoles right now don't have time to figure out what makes these machines special. They're working around the clock just trying to get these games ready to ship in November. They were developing software towards a moving target until just months ago. They have bugs to squash, networks to stress test, and in many cases, other platforms to develop and optimize for. If there were large differences between games on the two consoles at this point, that might be revealing. The fact that there aren't large differences is not revealing at all.

Disagree if you like. I can't sum up the reality of a hardware launch any better than that. I've got to go take a dump anyway.

In addition to all that, all devs working on next gen titles have been working on non-final hardware for the longest time ever. It's really hard to code for something when your final target keeps changing specs for the 2+ years of development.
 
In addition to all that, all devs working on next gen titles have been working on non-final hardware for the longest time ever. It's really hard to code for something when your final target keeps changing specs for the 2+ years of development.

But that is the case for both consoles! How is it that because this is happening, only Sony gets to benefit from Dat Inexperience while MS stays stagnant? Heck, MS just now somehow figured out that its drivers were shit and so optimized them, which basically made Dead Rising 3 not run like crap. Both console developers will gain experience and we don't know how that will play out, though we do know that Sony has the most theoretical room for growth. The only evidence we have of an actual difference - the launch games - suggests a relative non-inferiority. Even one of the people who said that the PS4 was roughly 50% more powerful said that those were the stats RIGHT NOW, and that it would be stupid to suggest that there will be some kind of massive difference at this point. This isn't that difficult to understand.
 
But that is the case for both consoles! How is it that because this is happening, only Sony gets to benefit from Dat Inexperience while MS stays stagnant? Heck, MS just now somehow figured out that its drivers were shit and so optimized them. Both console developers will gain experience and we don't know how that will play out, other than the fact that Sony has the most theoretical room for growth. The only evidence we have of an actual difference - the launch games - suggests a relative non-inferiority.

Launch games never take advantage of a console's power. Not even close. Compare Uncharted to Uncharted 3.

As you said, PS4 has the most room to grow. So the gulf between what we see now and what we will see down the road is much wider for PS4 than for Xbone.
 
I don't see any differentiation between CUs there either. It explicitly says "Unified Array of Compute Units".

That image certainly does say that.

Was VGLeaks wrong on the 14+4 when everything else looked about right, and what did Cerny mean by the hardware not being 100% round? Just the extra compute queues?
 
At page 43 I think back to...
[image]

I see you've toned down your rhetoric since tasting the hammer. Now you just post pictures instead.

With that said, is anyone surprised by Edge? If you've been following along this entire time and aren't a misterxmedia diehard Xbone fan, you already know this. I enjoy reading others bring it up though (i.e. Edge).
 
It comes down to how they're connected to the ACEs and caches. I'd assumed there would be differences there which haven't been disclosed.

Because there is no difference; this has been discussed to death. The 14+4 was just an example of doing compute plus your normal fixed-function graphics at the same time. The split is neither fixed, nor does it have to exist at all; the CUs are all identical, and so is how they connect to everything else. Also, you are off about latency: GDDR5 latency won't decrease the amount of fillrate the PS4 has, because GPUs are designed to handle high latency.
 
I don't watch those shows. How about if I describe them in vegetables?

PS4: Corn

Xbone: Rutabaga

WiiU: Celery
FALSE! The PS4 is equivalent to corn+beets, otherwise known as corned beet. This is the closest vegetables can come to meat used in deli sandwiches. PCs are, of course, similar to Boar's head products.
 
I see you've toned down your rhetoric since tasting the hammer. Now you just post pictures instead.

With that said, is anyone surprised by Edge? If you've been following along this entire time and aren't a misterxmedia diehard Xbone fan, you already know this. I enjoy reading others bring it up though (i.e. Edge).

Any positive Xbone post tends to stick out. It's all good fun though.
 
But that is the case for both consoles! How is it that because this is happening, only Sony gets to benefit from Dat Inexperience while MS stays stagnant? Heck, MS just now somehow figured out that its drivers were shit and so optimized them, which basically made Dead Rising 3 not run like crap. Both console developers will gain experience and we don't know how that will play out, though we do know that Sony has the most theoretical room for growth. The only evidence we have of an actual difference - the launch games - suggests a relative non-inferiority. Even one of the people who said that the PS4 was roughly 50% more powerful said that those were the stats RIGHT NOW, and that it would be stupid to suggest that there will be some kind of massive difference at this point. This isn't that difficult to understand.

If you would re-read my post, I mentioned ALL devs. I did not only mention PS4 devs. *Facepalm*

Launch games will not use much of the power of either console, and we will not see that power until two years later, when 343i, ND, and SM come into the equation.
 
practical parity shown by both consoles' launch games because it doesn't show any evidence for a large disparity in real-world performance.

Did we actually see any of the XB1 games running on XB1? Only Forza 5?

Just because nobody has released any proper footage doesn't mean that there won't be a difference between games.

If the hardware power difference is 50%, then there will be quite a difference in games - like that rumor about COD running much better on PS4 shows.

KZ looks stunning; I don't see anything in the XB1 camp looking anywhere close.
 
Because there is no difference; this has been discussed to death. The 14+4 was just an example of doing compute plus your normal fixed-function graphics at the same time. The split is neither fixed, nor does it have to exist at all; the CUs are all identical, and so is how they connect to everything else. Also, you are off about latency: GDDR5 latency won't decrease the amount of fillrate the PS4 has, because GPUs are designed to handle high latency.

GPUs hide latency by switching between threads in the shaders, doing other work whilst waiting on things like texture reads elsewhere. Instruction-level parallelism and the scalar cores help with this too.

I'm not sure what the ROPs can do to combat write latency.
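To put a rough number on the latency-hiding point, here's a minimal sketch using Little's law: keeping a memory bus saturated only requires enough requests in flight to cover the round-trip latency, and a GPU with thousands of resident threads can generate that easily. The bandwidth figure is the PS4's quoted 176 GB/s; the latencies are illustrative guesses, not measured numbers:

```python
# Little's law sketch: bytes that must be in flight to sustain a given
# bandwidth at a given memory latency. Latencies below are illustrative
# guesses for GDDR5 round trips, not measured console figures.

def bytes_in_flight(bandwidth_gb_s: float, latency_ns: float) -> float:
    """Outstanding bytes needed to keep the bus busy (bandwidth x latency)."""
    return bandwidth_gb_s * latency_ns   # GB/s * ns conveniently equals bytes

BANDWIDTH = 176.0                        # PS4's quoted GDDR5 bandwidth, GB/s
for latency_ns in (100, 200, 400):
    b = bytes_in_flight(BANDWIDTH, latency_ns)
    print(f"{latency_ns:>3} ns -> ~{b / 1024:.0f} KiB in flight "
          f"(~{b / 64:.0f} outstanding 64-byte requests)")
```

With dozens of wavefronts resident across the CUs, that many outstanding requests is routine for a GPU, which is the sense in which "GPUs are designed to handle high latency."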
 
It's going to be pretty disappointing if devs target the Xbox One first.

I'm hoping that PS4 software sales will be so much larger than Xbox One software sales that more attention gets placed on targeting the PS4, and then developers will simply downport Xbox One as necessary.
 
Launch games never take advantage of a console's power. Not even close. Compare Uncharted to Uncharted 3.


Question:

A runner has 1000 calories of energy. Another has 1500 calories. This is 50% more calories, but we don't know what that translates into in a real race. At the start of the race, both runners are within 5% of each other's speed. We know theoretically that the 1500 calorie runner will go further, but by how much is unknown. If both runners are able to achieve similar speeds at the start of the race, does it make more sense to say that the 1500 calorie runner will run several laps around the 1000 calorie runner, or does it make more sense to say that the 1000 calorie runner may lag by half a lap to one lap behind?
 
GPUs hide latency by switching between threads in the shaders, doing other work whilst waiting on things like texture reads elsewhere. Instruction-level parallelism and the scalar cores help with this too.

I'm not sure what the ROPs can do to combat write latency.

ROPs access large amounts of memory less often, which decreases the impact of the latency.
 
Question:

A runner has 1000 calories of energy. Another has 1500 calories. This is 50% more calories, but we don't know what that translates into in a real race. At the start of the race, both runners are within 5% of each other's speed. We know theoretically that the 1500 calorie runner will go further, but by how much is unknown. If both runners are able to achieve similar speeds at the start of the race, does it make more sense to say that the 1500 calorie runner will run several laps around the 1000 calorie runner, or does it make more sense to say that the 1000 calorie runner may lag by half a lap to one lap behind?

It's more like one car has 500 hp and the other 750 hp... the 750 hp car grabs the pole and wins the race by the first corner.

Your analogy makes it seem like XB1 will overheat or something.
 
No way will they gimp third-party titles on PS4. They might not go out of their way to use the extra power for better effects, but if they can run at a higher resolution or frame rate, they will do it.
 
Dance games, sure, but Kinect 2 is on another level when it comes to fitness applications. I hate to post this twice in the span of a few hours, but


I don't see any way that they could extract exertion or pulse information from the 3d PlayStation Camera. Whether that turns out to be useful/meaningful, who knows
The people that discovered the technique in the first place used a single video camera. It's not a super Kinect feature. The camera just measures how flushed your skin is getting.
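For anyone curious, the underlying trick (remote photoplethysmography) really is just tracking how skin colour fluctuates with each heartbeat in ordinary video. Here's a toy sketch of the idea using synthetic frames; nothing here is the actual Kinect or PlayStation Camera code, just an illustration of the signal processing involved:

```python
# Toy sketch of camera-based pulse detection (remote photoplethysmography):
# average a skin region's green channel per frame, then find the dominant
# frequency in the plausible heart-rate band. Synthetic data only.
import numpy as np

FPS = 30.0  # camera frame rate

def estimate_bpm(frames):
    """frames: (n_frames, h, w, 3) RGB array of a skin region. Returns BPM."""
    green = frames[..., 1].mean(axis=(1, 2))     # mean green value per frame
    green = green - green.mean()                 # drop the DC component
    spectrum = np.abs(np.fft.rfft(green))
    freqs = np.fft.rfftfreq(len(green), d=1.0 / FPS)
    band = (freqs > 0.7) & (freqs < 4.0)         # 42-240 BPM plausible range
    return freqs[band][np.argmax(spectrum[band])] * 60.0

# Synthetic test: a 10-second clip whose skin tone "flushes" at 72 BPM.
t = np.arange(int(10 * FPS)) / FPS
flush = 0.5 * np.sin(2 * np.pi * (72 / 60.0) * t)
frames = 120 + flush[:, None, None, None] + np.random.randn(len(t), 8, 8, 3)
print(f"Estimated pulse: {estimate_bpm(frames):.0f} BPM")   # ~72
```

Whether a console camera gets a clean enough signal in a dim living room is another question entirely.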
 
It's going to be pretty disappointing if devs target the Xbox One first.

I'm hoping that PS4 software sales will be so much larger than Xbox One software sales that more attention gets placed on targeting the PS4, and then developers will simply downport Xbox One as necessary.

I don't think that's going to be an excuse at all. If the PS4 is more powerful by such a large amount, then devs will use all of that to create better-looking games than other devs.

So if BF comes out and looks a lot better than COD on PS4, then that's bad on the COD devs... same goes for KZ, or AC, or any other game.

Basically, it is competition between games that will drive the graphics, not the competition between consoles.

So if there is true 50% advantage, I expect it to show with first games.
 
There will be plenty of people purchasing GTA 5 shortly on systems with massively compromised performance/IQ in comparison to the PC build that Rockstar is showing in the commercials. And that's cool! It's totally fine. I've played plenty of subpar ports this generation, on and off the PC. But I don't pretend that I'm getting the best version when I play them, and I don't fool myself into thinking I'm playing the best version over small differences from one platform to another when I myself have chosen to play it on a console, where compromise is in its very nature. When I play on the consoles, I accept things like tearing and subpar framerates and awful IQ, and I don't even see it after I get used to it. It's always going to be in second (or third, or fourth, or whatever) place, and I don't proclaim that Call of Duty looks SO MUCH BETTER on the 360 than on the PS3 because of a slightly less sub-HD render. If I cared SO MUCH about how much better it looks from one minor difference to another, why would I limit myself to a choice between two subpar versions in comparison to the one that runs at 1080p/60 (or whichever resolution I happen to choose on my own)? I hope that sort of helps you understand my disconnect here.
I may be wrong, but hasn't Rockstar gone on the record and said the GTA V footage being shown in commercials is from the PS3 version?
 