Ubisoft issues totally convincing response to Assassin's Creed Unity's resolution

Mostly cause of this:

There's a difference between output resolution and internal/native resolution.

You do know they are using the same engine (from PS3 UC) for PS4 next-gen right?



http://www.digitalspy.com/gaming/ne...ast-of-us-engine-for-ps4.html#~oSilShxtmeWuSJ

-M

Wait, are people really comparing what is likely a very linear game to an open world game? ND has a chance to hit 60fps because they are using a more restricted design. I'd love to see them have a scene in an open world with 2000+ NPCs and still hit 60fps.

I know you didn't make any comparison, I'm just commenting on the poster you quoted.

My TV is 1080p. I run my PC games through my 60" TV at 1080p. I run my PS4 games through my TV at 1080p. What further understanding of numbers do I need to know that running 900p through my TV is less than ideal?

(Before the "get it on PC" crowd chimes in: I can run games like BL2, the Pre-Sequel, and Batman: AA through my PC at 1080p, but certainly not this new crop of current-gen games.)

I said people sometimes don't understand =P

I was also referring to numbers beyond just resolutions.

I understand running under the native resolution of your display is not ideal, I'm not trying to dispute that. I'm just saying that if he didn't want a debate, he should have stayed away from commenting on the technical aspects of the game. Your average gamer may understand a higher resolution is better, but I doubt most of them care about native resolutions like the average gaffer does.
 
So Ubisoft claims that they're CPU-bound on both consoles? But as far as I know, the resolution of a game is generally GPU-bound. Assuming I am right, there is no reason why the PS4 version shouldn't have a higher resolution considering it has 30-40% more GPU power than the XBO.
Anyway, we'll eventually see it when the first PC benchmarks appear. These should show if the resolution/fps scales with the CPU...
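For rough context on the numbers in that argument, here is a back-of-envelope sketch (the TFLOPS values are the commonly cited peak GPU figures, not something stated in this thread):

```python
# Back-of-envelope: extra pixels at 1080p vs extra GPU compute on PS4 (illustrative only).
pixels_1080p = 1920 * 1080           # 2,073,600 pixels
pixels_900p  = 1600 * 900            # 1,440,000 pixels

ps4_tflops, xbo_tflops = 1.84, 1.31  # commonly cited peak figures (assumption, not from the thread)

print(f"1080p has {pixels_1080p / pixels_900p:.2f}x the pixels of 900p")     # ~1.44x
print(f"PS4 GPU has {ps4_tflops / xbo_tflops:.2f}x the raw compute of XBO")  # ~1.40x
```

On paper the extra pixels and the extra GPU compute are in the same ballpark, which is why a purely GPU-bound workload that holds 900p on Xbox One is usually expected to reach roughly 1080p on PS4.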
 
So Ubisoft claims that they're CPU-bound on both consoles? But as far as I know, the resolution of a game is generally GPU-bound. Assuming I am right, there is no reason why the PS4 version shouldn't have a higher resolution considering it has 30-40% more GPU power than the XBO.
Anyway, we'll eventually see it when the first PC benchmarks appear. These should show if the resolution/fps scales with the CPU...

Resolution is largely a function of the GPU. If the frame rate is limited by thousands of AI characters (a CPU-side load), increasing the resolution should have little effect on it.

For example, a CPU-bound benchmark on PC (something designed to keep the CPU pegged at maximum usage and temps) performs basically the same at 1080p as in a small 640x480 window.

Then again, with a GPGPU setup maybe they had to steal resources from the GPU to do those calculations? Not sure how that works.

I know that on PC, when I'm trying to get 120+ fps in Battlefield 4, lowering the resolution or texture quality has the least effect, which I think is because those are tied to video memory.
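To make the CPU-bound vs GPU-bound distinction concrete, here is a toy frame-time model (purely illustrative; the numbers are made up):

```python
# Toy model: a frame is ready only when both the CPU work and the GPU work are done.
def frame_rate(cpu_ms_per_frame, gpu_ms_per_megapixel, width, height):
    gpu_ms = gpu_ms_per_megapixel * (width * height) / 1e6
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms)   # the slower side sets the pace

# CPU-bound case (e.g. heavy AI/crowd simulation): resolution barely matters.
print(frame_rate(33.0, 10.0, 1600, 900))    # ~30 fps at 900p
print(frame_rate(33.0, 10.0, 1920, 1080))   # still ~30 fps at 1080p

# GPU-bound case: frame rate drops with the pixel count instead.
print(frame_rate(10.0, 20.0, 1600, 900))    # ~35 fps at 900p
print(frame_rate(10.0, 20.0, 1920, 1080))   # ~24 fps at 1080p
```

In this toy model, a CPU-limited game can raise its resolution essentially for free until the GPU becomes the slower side, which is the crux of the "if it's CPU-bound, why not 1080p on PS4?" question above.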
 
OK, here's the thing.

I cancelled my preorder, and it's not because of "forced parity". It's not because of "avoiding debates". (Just to be clear, I'm still fucking livid about all this). I cancelled for the same reason I told the Sony Customer Service Rep (surprised with the quality of customer support, btw):

I preordered Assassin's Creed: Unity from the Sony Entertainment Network Store.

On Unity's Game Page, it says:

1 player
Network Players 2-4
22MB minimum save size
DUALSHOCK®4
Remote Play
1080p HD Video Output
Online Play (Optional)

I put money down based on a claim that turned out to be false. I took my money back when I found out they lied.

Fuck you Ubisoft.


While it may be misleading, it is not really incorrect. The game will output at 1080p, it's just not rendering at 1080p.
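To put numbers on the output-vs-native distinction (just an illustration of the upscale involved):

```python
# The game renders internally at 900p; the console's scaler stretches it to the 1080p output.
render_w, render_h = 1600, 900     # internal (native) resolution
output_w, output_h = 1920, 1080    # what the HDMI signal / store listing says

print(output_w / render_w, output_h / render_h)   # 1.2 1.2 -> stretched 20% on each axis
print(f"{render_w * render_h / (output_w * output_h):.0%} of the output pixels are actually rendered")  # 69%
```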
 
We don't know whether or not they are taking advantage of the PS4 in other ways. So why jump on the hate train yet?
I'm still trying to figure out why everyone is going berserk when we don't have all of the details. We have a resolution for both, we have the target framerate for both, and we know from previews that the 900p XB1 build was running like crap. There could be many effects that differ on the two versions, as well as a large performance gap, but people seem to be of the opinion that every single effect is locked to parity, and that performance will be identical for both.

I am personally of the opinion that "specs" refer to the framerate and resolution when talking about individual games. If someone were to ask me the "specs" of a game I would tell them those two things. However, it would seem that everyone here is of the opinion that "specs" refers to every single graphical effect the game has to offer. Even if that is the case, and it could be, I still don't see the reason for getting worked up about it until we have seen technical analysis of both versions of the game. If someone has an issue with the possibility of parity then I would absolutely expect them to cancel their preorders and wait for final details, but to be getting this worked up about it when we simply don't have all of the details is rather premature. I have absolutely no problem with people going nuts if there is indeed full parity, but only once all of the facts are out there.

The one point I will agree on though is that Ubisoft has handled this whole mess horribly. The original quote is absurd, and the way they have tried to answer the questions about it has been awful. I feel like they are probably caught in a tight spot though. If there is complete parity then there is no good way to respond and any justification they try to give it would get them in even more trouble, and rightfully so. If there are other graphical or performance differences they still can't say anything or they would diminish the worth of one version of their game. Even more dire is the fact that they would be speaking out against the version that belongs to their marketing partner for this game. Saying that they are developing the game to the strengths of each platform is about all they can say without potentially pissing off MS or discouraging their XB1 customers, which is unfortunate because it doesn't provide the answer that people want.
 
Oh I know, but hypothetically, if I was just your regular consumer that doesn't follow industry news but still cares about the best graphical experience, I'd still be mighty pissed off. Not at Ubi, not at Sony, just pissed off that someone lied to me and took my money based on a false claim.
But the game will technically output at 1080p thanks to upscaling, so it's not really a lie if you really think about it. Now if it had said it would be native 1080p, then you'd have every right to be pissed about that slide.
 
No one can diss The Order's developers anymore because they knew when to stop arguing their point. Unity's developer could learn when to shut up.
Agreed, there isn't anything they can say without pissing off one of their platform partners and a segment of their customers. They should have made the one statement and then left it alone. All they did was rekindle the discussion with their blog post.
 
Ha

 
I'm still trying to figure out why everyone is going berserk when we don't have all of the details. We have a resolution for both, we have the target framerate for both, and we know from previews that the 900p XB1 build was running like crap. There could be many effects that differ on the two versions, as well as a large performance gap, but people seem to be of the opinion that every single effect is locked to parity, and that performance will be identical for both.

[...]

What if?

There are a number of these posts using pure speculation to explain why people shouldn't overreact to the comment. The comment is pretty straightforward, though: they locked the game at the same specs to avoid debates. That is what people are reacting to. It's not anyone's job to explain on Ubisoft's behalf what was really meant by that comment, if what appeared to be clearly stated wasn't in fact what he meant.

Ubisoft has given a number of PR statements, and none of them have given a satisfactory answer to that specific question. They seem to be dancing around that question and answering a strawman instead.

Maybe the PS4 version has slightly better texture detail for Arno's boots. It's just pure speculation that we don't yet have an answer to.
 
My guess is that they'll soak up the extra GPU power in the PS4 version with more expensive motion blur precision, and maybe slightly better AA and call it a day. Then they can comfortably say "See? We took advantage of all the things".
 
I originally posted this in the Bioware thread regarding DA:I but I also think it applies here and probably the Witcher 3 thread too.

I had a look at some PC benchmarks at TPU (http://www.techpowerup.com/reviews/S..._Dual-X/6.html) and, using the R7-265 and the R7-260X as surrogates for the PS4 and Xbox One, you can see what FPS the R7-265 averages at 1080p vs the R7-260X at 900p. Now, I should mention that technically the R7-260X is faster than the Xbox One GPU. It has more TFLOPS than the R7-265 and the PS4, but it has fewer ROPs and less memory bandwidth, so it is a lower-performing card overall. In certain shader-heavy games it gets closer to the R7-265 than the Xbox One can get to the PS4, but in certain bandwidth-limited games the gap is likely to be larger. The reason I used this card, though, is that it has 2GB of VRAM; with some games, 1GB cards that are closer in GPU performance to the Xbox One are limited by the VRAM rather than the GPU, and on Xbox One that would not be the case. The R7-265, on the other hand, is practically identical to the PS4 GPU in every performance metric. So with that in mind, what do we find?

Below is a table generated from the link above showing the FPS of the 260X @ 900p compared to the 265 @ 1080p. Also bear in mind that the 260X is generally closer in performance to the 265 than the Xbox One GPU is to the PS4 GPU, although there are two outliers in Batman: AO and Diablo 3. I expect these are more memory-bandwidth-bound, and if someone were to investigate the reason for the performance difference in Batman it might help explain the differences you see in Fox Engine games.

Game               260X @ 900p   265 @ 1080p
ACIV                      25.7          27.7
Batman: AO                32.2          66.5
Battlefield 3             53.4          52.6
Battlefield 4             35.7          35.2
BS: Infinite              68.0          61.1
COD:G                     64.3          60.8
COJ: Gunslinger          128.3         127.8
Crysis                    43.3          41.8
Crysis 3                  20.8          20.8
Diablo 3                  88.2         104.9
Far Cry 3                 26.7          25.4
Metro: LL                 38.3          35.2
Splinter Cell             27.1          29.6
Tomb Raider               27.0          23.6
WoW                       66.9          74.9

As you can see, increasing the resolution does not suddenly cause much of a performance deficit versus the slower GPU at 900p, and in a few cases performance at 1080p on the 265 is actually greater than at 900p on the 260X. Again, keep in mind that the 260X is closer to the 265 than the Xbox One is to the PS4.
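One way to sanity-check that table is to normalize each card's result by the number of pixels it is pushing. This is a rough sketch using a few rows copied from the table above; it ignores memory-bandwidth effects:

```python
# Pixel throughput (megapixels/second) for a few rows of the table above.
mp_900p  = 1600 * 900  / 1e6    # 1.44 MP per frame
mp_1080p = 1920 * 1080 / 1e6    # ~2.07 MP per frame

rows = {                             # FPS values copied from the table above
    "Battlefield 4": (35.7, 35.2),   # (260X @ 900p, 265 @ 1080p)
    "Crysis 3":      (20.8, 20.8),
    "Tomb Raider":   (27.0, 23.6),
}
for game, (fps_260x, fps_265) in rows.items():
    print(game, round(fps_260x * mp_900p, 1), "vs", round(fps_265 * mp_1080p, 1), "MP/s")
# Battlefield 4: 51.4 vs 73.0, Crysis 3: 30.0 vs 43.1, Tomb Raider: 38.9 vs 48.9
```

Even where the 265's raw FPS dips at 1080p, its pixel throughput is much higher, which is the post's underlying point: the stronger GPU absorbs the roughly 44% pixel increase without falling behind.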
 
But the dude straight up said they locked it to avoid debates... how is that wording it incorrectly? He implied that one version was limited.
Yep. The OP here doesn't do anything to make it seem like a case of bad wording.

In fact, I come away just as put off, because he's missing that the controversy wasn't about resolution; it was about parity.
 
"Ubisoft Next-gen Starts Here" should be Next-gen is 900p@30fps
Someone should gif that.

Now, what they said about 900p is not exactly wrong: different games need different specifications, and in some cases a more demanding game can only run at 30fps or below 1080p on consoles as more graphically intensive games are released.

That doesn't say anything about Ubisoft's competency, their PR team, the hardware differences between XBO and PS4, or the optimizations developers can do on closed systems, but "next gen" can be a lot of things, and that may be a discussion worth having.
 
My guess is that they'll soak up the extra GPU power in the PS4 version with more expensive motion blur precision, and maybe slightly better AA and call it a day. Then they can comfortably say "See? We took advantage of all the things".

Ubisoft is backed into a corner anyway until the game is out; the best strategy for them at this point is silence.
Even if they forced 900p parity because of a marketing deal, they aren't allowed to talk about other differences: they can't make comparisons, and they can't answer where the extra power of the PS4 is going, because that would go against the terms of the deal as well. They can't even show the PS4 version until launch!
They just fucked up, and we'll see how things unfold at launch: either the PS4 version has been totally locked to XB1 specs with a frame rate cap (in which case the two versions will be totally identical, with a more stable frame rate on PS4), or it's just a case of resolution parity asked for by MS for marketing purposes while other aspects can show differences (although at this point the deal backfired anyway, since they got a lot of negative backlash and what came out of it is that they have to pay if they want parity with PS4). We won't know until launch. By the way, it would be interesting to read Sony's stance on the matter, although they can't say much publicly without hurting their relationship with Ubisoft.
 
The one point I will agree on though is that Ubisoft has handled this whole mess horribly. The original quote is absurd, and the way they have tried to answer the questions about it has been awful. I feel like they are probably caught in a tight spot though. If there is complete parity then there is no good way to respond and any justification they try to give it would get them in even more trouble, and rightfully so. If there are other graphical or performance differences they still can't say anything or they would diminish the worth of one version of their game. Even more dire is the fact that they would be speaking out against the version that belongs to their marketing partner for this game. Saying that they are developing the game to the strengths of each platform is about all they can say without potentially pissing off MS or discouraging their XB1 customers, which is unfortunate because it doesn't provide the answer that people want...

absolutely. the ms deal completely compromised their ability to answer any/all questions in a simple, straight-forward manner. meaning it will continue to be 24/7 gobbledygook from ubi up till launch, at which point the versions will speak for themselves...

meaning they really should, at this point, simply shut up, weather the storm, & let the chips fall. anything they say, at this point, will only make the situation worse...
 
I had a look at some PC benchmarks at TPU (http://www.techpowerup.com/reviews/S..._Dual-X/6.html) and, using the R7-265 and the R7-260X as surrogates for the PS4 and Xbox One, you can see what FPS the R7-265 averages at 1080p vs the R7-260X at 900p. [...]

As you can see, increasing the resolution does not suddenly cause much of a performance deficit versus the slower GPU at 900p, and in a few cases performance at 1080p on the 265 is actually greater than at 900p on the 260X.

You could also say the memory setup is different enough to make the comparison invalid. PCs have separate system memory and video memory; the consoles share a single pool (which can cause numerous other issues if not utilized correctly).
 
The Facebook post from Ubisoft with this response is depressing. So many "HERPDERP DA GAMEZ STILL FUN DOE" responses. Of course it's still fun (potentially) but that's not the point :-/
 
You could also say the memory setup is different enough to make the comparison invalid. PCs have separate system memory and video memory; the consoles share a single pool (which can cause numerous other issues if not utilized correctly).

More so on Xbox One than PS4, due to the low bandwidth of its DDR3. It requires correct use of ESRAM to avoid being bandwidth-starved on Xbox One, but that is not a concern on the PS4 because its entire memory pool is fast.

The comparison is perfectly valid to illustrate the point that if the Xbox One can handle 900p, the PS4 can handle 1080p with the same in-game settings, since the biggest difference there is the GPU, which is the same as in the PC example.
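For context on that bandwidth point, these are the commonly published figures for the two machines; they are not stated anywhere in this thread, so treat them as approximate:

```python
# Commonly published memory bandwidth figures (approximate; assumption, not from this thread).
bandwidth_gbs = {
    "Xbox One DDR3 (8 GB main pool)": 68,    # ~68 GB/s
    "Xbox One ESRAM (32 MB scratch)": 109,   # ~109 GB/s quoted minimum; higher combined peaks claimed
    "PS4 GDDR5 (8 GB unified pool)":  176,   # ~176 GB/s across the whole pool
}
for pool, gbs in bandwidth_gbs.items():
    print(f"{pool}: ~{gbs} GB/s")
```

That is why render targets generally need to live in the small ESRAM on Xbox One to avoid being bandwidth-starved, while on PS4 placement matters far less because the whole pool is fast.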
 
More so on Xbox One than PS4, due to the low bandwidth of its DDR3. It requires correct use of ESRAM to avoid being bandwidth-starved on Xbox One, but that is not a concern on the PS4 because its entire memory pool is fast.

The comparison is perfectly valid to illustrate the point that if the Xbox One can handle 900p, the PS4 can handle 1080p with the same in-game settings, since the biggest difference there is the GPU, which is the same as in the PC example.

Look at it this way: ignore the ESRAM setup on Xbox One. You don't have to use it. That makes the memory setup similar on Xbox One and PS4 despite the different memory speeds, which is the point I'm making. You have different performance concerns on consoles that are not applicable on PC; there's less potential for stalls when accessing video memory or system memory on PC compared to the consoles.
 
We don't know whether or not they are taking advantage of the PS4 in other ways. So why jump on the hate train yet?

Oh really? What will they be "taking advantage of"? Touchpad support?

The only thing Ubi will continue to "take advantage of" is customers until people put their foot down and say, "Fuck Off".
 
Oh really? What will they be "taking advantage of"? Touchpad support?

The only thing Ubi will continue to "take advantage of" is customers until people put their foot down and say, "Fuck Off".

This is the mentality on NeoGAF of late: why is it all or nothing? You, I, and everyone else here clearly don't know if they are doing anything else that would eat up the performance on PS4 (thus making 1080p not a viable option). That is a fact.
 
This is the mentality on NeoGAF of late: why is it all or nothing? You, I, and everyone else here clearly don't know if they are doing anything else that would eat up the performance on PS4 (thus making 1080p not a viable option). That is a fact.
That's not what their original statement was about though.

They clearly stated that they kept both games at parity to avoid upsetting certain fans.

People are more annoyed at that than the resolution.
 
[...]

The one point I will agree on though is that Ubisoft has handled this whole mess horribly. The original quote is absurd, and the way they have tried to answer the questions about it has been awful. I feel like they are probably caught in a tight spot though. If there is complete parity then there is no good way to respond and any justification they try to give it would get them in even more trouble, and rightfully so. If there are other graphical or performance differences they still can't say anything or they would diminish the worth of one version of their game. Even more dire is the fact that they would be speaking out against the version that belongs to their marketing partner for this game. Saying that they are developing the game to the strengths of each platform is about all they can say without potentially pissing off MS or discouraging their XB1 customers, which is unfortunate because it doesn't provide the answer that people want.

Ubisoft should have just told the truth: that they're forcing parity because they didn't want to upset MS, with whom they have a co-marketing deal, and that the PS4 is clearly more powerful and they are holding the PS4 version back. Just lay out the facts and deal with the backlash. Xbone owners should simply be informed that they bought a weaker console; there's no point obfuscating things.
 
Look at it this way: ignore the ESRAM setup on Xbox One. You don't have to use it. That makes the memory setup similar on Xbox One and PS4 despite the different memory speeds, which is the point I'm making. You have different performance concerns on consoles that are not applicable on PC; there's less potential for stalls when accessing video memory or system memory on PC compared to the consoles.

In that scenario the Xbox One is basically a PC with a 7770 GPU and more VRAM than normal. The PS4 is the same PC with an R7 265 GPU and more VRAM than normal. That kind of performance concern is such a tiny fraction of the overall performance profile that unless something is going very wrong, in which case it will affect all resolutions, it is not really worth thinking about.
 
I'm still trying to figure out why everyone is going berserk when we don't have all of the details. We have a resolution for both, we have the target framerate for both, and we know from previews that the 900p XB1 build was running like crap. There could be many effects that differ on the two versions, as well as a large performance gap, but people seem to be of the opinion that every single effect is locked to parity, and that performance will be identical for both.

[...]


Then there is no need to state the "parity" aspect. If each console's resources were actually being used, they could've stated that 900p/30fps was the target but that the game pushes each console's available strengths. See, easy as that. No room for controversy other than lofty expectations (which isn't an issue for an Ubisoft game).

But the moment "parity" was highlighted, it's only natural how it would be interpreted. This, on top of the "avoid debates" line, seems deliberately intended to divert attention from the subject entirely, and it ended up leaving egg on their faces.

And yet, despite this, their continuous denial and side-stepping only vindicated our suspicions about them. The fact that they CANNOT address this only affirms our suspicions all along: The PS4 is compromised and it's NOT just a simple graphical upgrade.
 
In that scenario the Xbox One is basically a PC with a 7770 GPU and more VRAM than normal. The PS4 is the same PC with an R7 265 GPU and more VRAM than normal. That kind of performance concern is such a tiny fraction of the overall performance profile that unless something is going very wrong, in which case it will affect all resolutions, it is not really worth thinking about.

That's not right. The consoles don't have 4-16 GB of system memory like a PC. On a PC the CPU wouldn't often have to write to video memory (if I recall correctly), but on a console the CPU and GPU write to the same memory pool. This in itself makes the test case a less accurate example.
 
The fact that they CANNOT address this only affirms our suspicions all along: The PS4 is compromised and it's NOT just a simple graphical upgrade.


That's total BS and typical internet forum behavior. The "If you can't give us all the information, we are right and you are wrong!" mentality, specifically. In reality they don't have to give you ANY information about how their engine works to prove you right or wrong. Why should they? You've already made up your minds.

The only fact here is that people don't have enough information to claim anything more than a guess or give an opinion on the matter. There are no FACTS that the PS4 is compromised.
 
To be fair, resolution, etc says nothing as to whether the PS4 rev actually supports higher effects/settings ... and if you notice, they were quite careful in NOT addressing that.


There really are only 3 potential reasons why this still hasn't been addressed:


1) Of course the PS4 version does have some advantages over the XBone version. They simply decided that they could do more keeping the resolution where it's at. For this particular instance, using the GPU for better effects, etc yielded what they consider better results than just upping the rez. While I prefer the choice ala PC (and some console games like GT), it could be a reasonable decision.

However, they don't want to discuss those advantages given the massive spotlight shining on them right now. While MS may not be forcing parity, given the current spectacle of it all ... avoiding that discussion is likely seen as the best move by their PR team in terms of relations with MS.


2) There are no differences in effects/settings because they are in fact artificially gimping the PS4 rev. With the PS4's GPU and bandwidth advantages, it simply can perform better. So if the game doesn't, something is afoot.


3) There are no differences in effects/settings because while the PS4 version actually does maintain 30fps essentially throughout ... the XBone version actually does not.





For Ubisoft's sake ... it better be #1 :p
 
Thank you Jason! It's nice to see a member of the press not being dismissive of 1080p.

http://kotaku.com/why-everyone-cares-about-all-this-1080p-stuff-1644908894

GAF's own Jason Schreier to the rescue, as per usual!

Edit: I loved this passage:

I've seen some pundits and journalists theorize that this is all about console wars—puerile battles over slavish company loyalty—but I really don't think that's true. I think it's fairly reasonable for PS4 owners to get mad when a company appears to be limiting a game's capabilities.

I've hated how dismissive some of the press has been in response to this when it's so abundantly clear that this is not about the game being 900p instead of 1080p. It's about the implication of forced parity. You can say that a million times and people will still try to color it as something else.
 
That's total BS and typical internet forum behavior. The "If you can't give us all the information, we are right and you are wrong!" mentality, specifically. In reality they don't have to give you ANY information about how their engine works to prove you right or wrong. Why should they? You've already made up your minds.

The only fact here is that people don't have enough information to claim anything more than a guess or give an opinion on the matter. There are no FACTS that the PS4 is compromised.

No it does not.

Oh really? Tell me otherwise without being so reductive. Why is it that Ubisoft is unable to clear the fog regarding this particular situation? By your very logic, we could also chalk the PC version being locked to the same "parity specs" up to this magical "gating" engine that seems to withhold potential features from more technically capable hardware.
 
Then again, with a GPGPU setup maybe they had to steal resources from the GPU to do those calculations? Not sure how that works.
First, they talked about the CPU, not the GPU, so that doesn't work.

And if they use GPGPU, it's even worse: at the same frame rate, the GPGPU usage (for AI, etc.) is the same on both consoles.

Imagine that they use 50% of a given GPU for GPGPU computation and 50% for display. Now take a 25% more powerful GPU. For the same task, the GPGPU computations now take only 40% of the GPU, so the actual display power has increased by 50%.

In short: if the GPU is used for GPGPU work, it's even easier to achieve a higher resolution on PS4 than on XBOne... Take it as you want; their explanation doesn't make sense.

Not that it's new, anyway; forced parity has probably been done in reverse last generation. The REALLY bad idea from Ubisoft was to let this "information" slip...
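Spelling out the arithmetic in that GPGPU point (a toy calculation that just follows the post's own numbers):

```python
# Fixed GPGPU workload (AI, simulation, etc.) on two GPUs of different raw power.
weak_gpu   = 1.00    # baseline GPU (arbitrary units of compute)
strong_gpu = 1.25    # a GPU with 25% more raw power
gpgpu_cost = 0.50    # the same absolute compute work on both machines

weak_render   = weak_gpu   - gpgpu_cost    # 0.50 left for rendering
strong_render = strong_gpu - gpgpu_cost    # 0.75 left for rendering

print(gpgpu_cost / strong_gpu)       # 0.4 -> the same work is only 40% of the stronger GPU
print(strong_render / weak_render)   # 1.5 -> 50% more power left over for rendering
```

So a fixed compute budget actually widens the gap in leftover rendering power, which is why the poster argues GPGPU makes the parity explanation weaker, not stronger.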
 
To be fair, resolution, etc says nothing as to whether the PS4 rev actually supports higher effects/settings ... and if you notice, they were quite careful in NOT addressing that. [...]

For Ubisoft's sake ... it better be #1 :p
From their PR wording it sounds like number 2. Hope not, for their sake too. We shall see when it comes out.
 
From their PR wording it sounds like number 2. Hope not, for their sake too. We shall see when it comes out.

Yeah, that's the worry. We want number 1 if they don't wind up at 1080p; we'd live with number 3; but a DF analysis saying it is number 2 is going to be trouble for them.
 
That's total BS and typical internet forum behavior. The "If you can't give us all the information, we are right and you are wrong!" mentality, specifically. In reality they don't have to give you ANY information about how their engine works to prove you right or wrong. Why should they? You've already made up your minds.

The only fact here is that people don't have enough information to claim anything more than a guess or give an opinion on the matter. There are no FACTS that the PS4 is compromised.

Finally someone here who speaks the truth! Well said oldergamer!
 
That's total BS and typical internet forum behavior. The "If you can't give us all the information, we are right and you are wrong!" mentality, specifically. In reality they don't have to give you ANY information about how their engine works to prove you right or wrong. Why should they? You've already made up your minds.

The only fact here is that people don't have enough information to claim anything more than a guess or give an opinion on the matter. There are no FACTS that the PS4 is compromised.

They have issued literally 4 different statements at this point and the only time they have said anything that actually explains why the PS4 is not running at a higher resolution was the first time when it was described as a choice to embrace parity. They have had ample opportunity to tell us about other differences that could explain sticking at 900p on PS4 but all we have gotten is evasive and obfuscatory responses.
 
They have issued literally 4 different statements at this point and the only time they have said anything that actually explains why the PS4 is not running at a higher resolution was the first time when it was described as a choice to embrace parity. They have had ample opportunity to tell us about other differences that could explain sticking at 900p on PS4 but all we have gotten is evasive and obfuscatory responses.

Please explain why there was parity with Destiny, then? If the Xbone is as weak as people believe, how did it manage 1080p? How did it manage to match the resolution and frame rate of the PS4 version? According to the net, the PS4, being 50% more powerful in certain areas, should've run Destiny at 1080p whilst the Xbone stuck with 900p...
Does it occur to anyone that Unity may be highly demanding on PS4 and just can't go above 900p? But much like Destiny, if the Xbone could match the PS4 for 1080p, then maybe it can do the same here...
As for the resolution difference in Dragon Age, that's a tricky one. Funny how some devs can match PS4 and Xbone yet others can't...
 
Please explain why there was parity with Destiny, then? If the Xbone is as weak as people believe, how did it manage 1080p? How did it manage to match the resolution and frame rate of the PS4 version? According to the net, the PS4, being 50% more powerful in certain areas, should've run Destiny at 1080p whilst the Xbone stuck with 900p...
Does it occur to anyone that Unity may be highly demanding on PS4 and just can't go above 900p? But much like Destiny, if the Xbone could match the PS4 for 1080p, then maybe it can do the same here...
As for the resolution difference in Dragon Age, that's a tricky one. Funny how some devs can match PS4 and Xbone yet others can't...

Dude, Destiny is a cross-gen title....
 