Assassin's Creed "Parity": Unity is 900p/30fps on both PS4 & Xbox One

How does reaching resolution parity bring it up to PS4's level? This is my entire point: we have no idea what is going on beyond resolution. There is no way in hell they got the XB1 version to 900p without making sacrifices to performance, graphical settings, or both. The recent preview from Videogamer, where they got hands-on with the XB1 900p build, talks about how poorly the game was running, with dips into the low twenties. That alone is a clear sign that sacrifices were made to reach that resolution, and further sacrifices may need to be made in order to smooth out the framerate. Even if all other things are equal, is it still parity if the two versions are at 900p, but one of them performs significantly worse than the other?

Dude, fixing framerate issues is almost always done at the latest stages of development. They target resolution and IQ, then stabilize the framerate later. So this doesn't mean that they suddenly upped the resolution.

And because Ubi said the game is CPU-bound, this likely means that the PS4 version has the same framerate issues as the Xbone version.
 
Dude, fixing framerate issues is almost always done at the latest stages of development. They target resolution and IQ, then stabilize the framerate later. So this doesn't mean that they suddenly upped the resolution.

And because Ubi said the game is CPU-bound, this likely means that the PS4 version has the same framerate issues as the Xbone version.

I haven't seen Ubi say that it's still CPU-bound on either console (after capping at 30fps, and especially not after 900p).
 
I haven't seen Ubi say that it's still CPU-bound on either console (after capping at 30fps, and especially not after 900p).

His quote was "Technically we're CPU-bound." He didn't say "we were," so I'm assuming they're still CPU-bound.
 
I haven't seen Ubi say that it's still CPU-bound on either console (after capping at 30fps, and especially not after 900p).

Dude, it's in the article quoted in the OP.

I don't believe for one minute that the game is CPU-limited to that extent, but Ubi have certainly made the claim.
 
I never said that exclusives shouldn't exist.

I said BOUGHT exclusives for franchises that are usually multiplatform shouldn't exist.
I also said that "premium editions" that cut out gameplay and give it to only one console shouldn't exist.

Quite obviously, things made and funded exclusively by one company or another, especially new IP, are the lifeblood of the industry.



I know this is from a while ago, but I agree completely with this statement... oh, Tomb Raider...
 
So I see that everyone is still assuming the PS4 version was brought down instead of asking what was sacrificed on the XB1 version to reach resolution parity. Looking at this thread, you would think that resolution is the only visual effect used in putting a game on screen.

Of course, no matter how close versions of a game are to one another, there will always be differences. But nobody is assuming anything. The PS4 is significantly more powerful than Xbox One, and they specifically said - and I quote:

"We decided to lock them at the same specs to avoid all the debates and stuff," senior producer Vincent Pontbriand...

Then they went on to talk about how they're CPU-bound (even though PS4 wins here too, but I digress), when that should have no bearing whatsoever on whether the PS4's significant extra GPU power is utilized to enhance the resolution.

In any event, the assumptions went out the window the second Ubisoft's senior producer on AC: Unity straight up admitted they're fucking their PS4 customers in the name of coddling Microsoft and their legion of sad fanboys.

So you assume. When people talk about "specs" on a console game they are talking about resolution and framerate. This has always been the case. Why people assume that they are talking about anything beyond that is beyond me. And even beyond that, not all 30fps games are equal in performance. We have no idea how this game is going to perform on each console, nor do we know the overall visual quality on each platform. We have specific numbers for resolution and target framerate, nothing else.

What the heck? "Specs" is not even remotely a synonym for "resolution and framerate." A given resolution and framerate are necessary components that may result from your specs being at a certain level, but only the most ill-informed dunce would try to suggest that "specs" and "framerate and resolution" are equivalent.

In fact, I don't even know how you're going to begin to qualify this argument. Where is your proof that people are talking about resolution and framerate when they talk about specs? Exactly how did you conclude that nothing else is being talked about, even though other aspects of a game's technical prowess are discussed and debated literally every day on game websites, forums, and all sorts of social media? Things such as AA, bokeh, motion blur, v-sync. And all aspects of specs are discussed too, and I should know, since this forum debated every last flop and triangle these systems could push down to the letter in the lead-up to the consoles' releases.

And on top of all that, you're not even using the slightest hint of basic logic. If the SENIOR PRODUCER said they locked the game "at the same specs" to "avoid debate," do you believe that means they have a game performing significantly better on the console that should, by definition, allow the game to perform significantly better? Clearly, if there were any major differences, debate would still arise. QED: there is little difference, because Ubisoft is a bunch of embarrassing fucking assholes.
 
Just a reminder, this is the second update from Ubi
Final specs for Assassin's Creed Unity aren't cemented yet,

It's like Ubi is actually admitting that they forced parity, since now they're going to change "specs" (for PS4 most likely) just like that.
 
If you are CPU-bound, it means the GPU is waiting on the CPU. But once you do solve your CPU issues, you become GPU-bound. It's one or the other. So if you are CPU-bound at one stage of production, additional graphical optimizations will do nothing to improve framerate, but that doesn't give you a free pass to put more work on the GPU, because either you'll end up GPU-bound instead, or you will be once the CPU is fine, potentially ending up with a sub-30 framerate again.

The whole parity comment thing, whatever, but saying "we're currently CPU-bound and that's our bottleneck" means that's what you'll want your programmers to work on if your FPS is below 30, not boosting graphics quality, because you'll still be CPU-bound and still not run at 30.

So I'm pretty sure that in that interview, the CPU part was a way of saying "we're working on CPU-related optimizations because otherwise we won't run at 30fps, so there's no point in putting programmers to work on a higher resolution in that context anyway."
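The bottleneck logic above can be sketched as a toy model. To be clear, every number below is made up for illustration; nothing here comes from Ubisoft. With the CPU and GPU working on frames in parallel, a frame takes roughly as long as the slower of the two stages, so shrinking the GPU's share (say, by lowering resolution) buys nothing while the CPU dominates:

```python
# Toy frame-time model: a pipelined frame takes as long as the slower
# of the two stages. All timings are invented illustration numbers.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Effective frame rate when CPU and GPU stages run in parallel."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# CPU-bound: 40 ms of CPU work per frame dominates 25 ms of GPU work.
print(fps(40.0, 25.0))   # 25.0 fps - below a 30 fps target

# Lowering resolution cuts GPU time, but the framerate doesn't budge:
print(fps(40.0, 18.0))   # 25.0 fps - still CPU-bound

# Only CPU optimization moves the needle; now the GPU is the bottleneck:
print(fps(30.0, 25.0))   # ~33.3 fps - which a 30 fps cap would then lock
```

This is also why "we could run at 100fps if only the GPU mattered" and "we dip into the low twenties" can both be uttered about the same build: they describe different stages of the same pipeline.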
 
It's like Ubi is actually admitting that they forced parity, since now they're going to change "specs" (for PS4 most likely) just like that.
The nasty (fun?) thing about these kinds of situations is that when you screw up your first line you cannot recover. Any and all differences between the two versions will be scrutinized to death. Allegations of cover-ups, bribes, sabotage, all of that. The uproar over a 1080p bump on the PS4 would be epic, as would a lack of a bump! After screwing up this bad Ubi completely deserves it, so whatever.
Timed parity. M. Knight twist jpeg.
Meh, not really much of a twist. It would explain last year, actually.
 
I'm sorry. You're cancelling a pre-order for a watch... because of parity.

A watch.

I can't GAF. Your politics are too funny. Let's all tweet at Ubisoft about how rustled your jimmies are, and watch them respond to us with laughs while swimming in that MS money.

Catching up on the thread and I noticed your comment, bit late since it was made on the 6th and you're banned, but no, I am not cancelling my pre-order of a watch.

I cancelled my pre-order of the Amazon special offer of the game that comes with a watch, thanks for the comment though.

http://i1.minus.com/ibbxluZiTMk3dr.jpg
 
What the heck? "Specs" is not even remotely a synonym for "resolution and framerate." A given resolution and framerate are necessary components that may result from your specs being at a certain level, but only the most ill-informed dunce would try to suggest that "specs" and "framerate and resolution" are equivalent.

We don't have an unabridged transcript to work from, but if the guy was just asked a question about res and framerate, and he goes "yeah, we kept the specs the same", it seems very likely he was referring to res and framerate. He's not quoting a fact sheet verbatim, he's having a conversation, and so taking a literal interpretation to mean that everything that could be considered a "spec" was set to be identical is, at best, premature (if it turns out to be correct) or at worst it's completely wrong (if it turns out not to be the case).

Like I said earlier, the dude makes a very spurious claim in the interview - that they could run it at 100fps if only it wasn't CPU bottlenecked. This must surely be a vast exaggeration, or possibly what he meant to say is that even if they paired it with a beastly video card it wouldn't be possible to improve the frame-rate, and it just came out of his mouth wrong. Either way though, it's a weird interview all round.
 
Exactly. Everyone is freaking the hell out over complete parity when the only information we have is on resolution and target framerate. There are a hell of a lot more effects that go into making a game look good than just resolution. Even with Watch Dogs there were graphical differences apart from resolution. We have no idea what graphical settings were lowered, or how performance was affected, by getting Unity to 900p on the XB1. Everyone is looking at this issue backwards. It shouldn't be about what the PS4 version isn't getting by being at 900p; it should be about what the XB1 version is losing by being at 900p.

How does reaching resolution parity bring it up to PS4's level? This is my entire point: we have no idea what is going on beyond resolution. There is no way in hell they got the XB1 version to 900p without making sacrifices to performance, graphical settings, or both. The recent preview from Videogamer, where they got hands-on with the XB1 900p build, talks about how poorly the game was running, with dips into the low twenties. That alone is a clear sign that sacrifices were made to reach that resolution, and further sacrifices may need to be made in order to smooth out the framerate. Even if all other things are equal, is it still parity if the two versions are at 900p, but one of them performs significantly worse than the other?

Amazing how you still don't understand, and you continue to spin this strange narrative.

Had the dev said "We were able to achieve (or reach, or whatever euphemism) parity to avoid debates and stuff", it would make me think they worked hard to get the X1 version to match the PS4's. That would be great. Everyone wins.

Had the devs said "We locked the specs at 900p/30 in order to achieve all of this cool stuff. Check it out," that would be fine with me. Resolution isn't the be-all and end-all, though part of the reason we all buy new consoles or upgrade our gaming rigs is to experience amazing new graphics.

But instead Ubi said they locked specs to avoid debates and stuff.

And this was after Ubi said they were targeting 1080p / 60 previously.

"Locked" implies restraint of one, not advancement of the other. That's the 1st red flag.

The reason to "avoid debates and stuff" is obviously the other.


Personally, the game could be 720p and it wouldn't matter to me, so 900p doesn't either. It's the reason that the game is 900p that is off-putting.

But I'm sure by now that you've read similar posts to mine that you're just going to continue to ignore, all so you can push your "omg all this about parity and resolution???" narrative.
 
Just a reminder, this is the second update from Ubi


It's like Ubi is actually admitting that they forced parity, since now they're going to change "specs" (for PS4 most likely) just like that.

I think it's more likely they're trying to satiate the justified anger of a massive portion of their customer base, hoping everyone forgets about it by launch day, after they've all already purchased it, only to find out they changed nothing.

Ubisoft, Microsoft, EA and a bunch more have all been shady as shit this generation. They really fucking think nobody will learn, and they might be right to a degree. I can't believe, for example, how many gamers are actually willing to trust EA with a fucking subscription after all the horrible, hateful, distrustful, anti-consumer shit they pulled last gen and into this gen. But nay, flash a few dollars of savings in front of people's eyes, and they'll forget you fucking stabbed their dog while forcing their entire family to watch your tears of agony.

I guess some people just don't give a fuck how they throw their dollars into the incinerator. But it's a shame, because everyone has to suffer for gamers' lack of self-control.

ThoseDeafMotes said:
We don't have an unabridged transcript to work from, but if the guy was just asked a question about res and framerate, and he goes "yeah, we kept the specs the same", it seems very likely he was referring to res and framerate. He's not quoting a fact sheet verbatim, he's having a conversation, and so taking a literal interpretation to mean that everything that could be considered a "spec" was set to be identical is, at best, premature (if it turns out to be correct) or at worst it's completely wrong (if it turns out not to be the case).

I was more responding to his seeming generalization there which said this:

Yoday: "When people talk about "specs" on a console game they are talking about resolution and framerate. This has always been the case. Why people assume that they are talking about anything beyond that is beyond me."

I mean, in some cases they may be referring to those specific technical aspects, in many other cases they'd be talking about something else. It'd be silly as a rule to just assume specs = resolution/framerate in any discussion, unless there is specific contextual justification, as there is in this case. I'd agree in this case with you on what he meant, but strongly disagree with Yoday's generalization about the subject overall in other words.
 
Add this to the delay of Unity, and now, seeing how good and upgraded GTA V looks with a similar release date, I've lost most of my interest in this game.

It was also clear that Shadow of Mordor has better combat than any AC, and I suspect better than Unity's too.

Watch Dogs was a disappointment as well.
 
I think it's more likely they're trying to satiate the justified anger of a massive portion of their customer base, hoping everyone forgets about it by launch day, after they've all already purchased it, only to find out they changed nothing.

Well yeah, this could just be BS PR from Ubi, and they're not going to change anything. However, because "Final specs for Assassin's Creed Unity aren't cemented," it means that when they said "we're proud to say that we have reached those goals on all SKUs," they were bullshitting. So they didn't reach the goals; they forced the parity.
 
...or the PS4 is simply about 40% more powerful, and can thus render more pixels due to its better GPU. Maybe I'm crazy, though.

It's just not that simple. But forget about the comparison, as that drags on forever... Also forget about the Xbone build of the game. And consider this: Watch Dogs was 900p/30fps on PS4. Both are from the same company, which shares a lot of tech internally (and that is very noticeable in their games). Sure, there are engine optimizations, but ACU looks a lot better, sports a better lighting model, better character models, and far more people on screen at once, all of whom have more advanced AI routines than the ones found in Watch Dogs. Why is it so out of the ordinary that ACU is also 900p, to the point that some other company MUST have paid for parity?
 
Where are you getting this from? Reading the thread and other posters, it seems this game was at 792p or something in a previous build a while ago, and an announcement by Ubi says the game was not downscaled. They probably got the game to 900p and felt it wasn't worth the extra man-hours to get it to 1080p, and maybe felt they had to finish other things, like polish. There is no evidence so far that it was political, or that an agreement was made with Microsoft to keep it at 900p (although I wouldn't be surprised). That's a management decision.

Hold on, where did you get this information? And even if that's true, last I checked, it was the X1 that was having issues bumping resolution. The PS4 was always on standby, waiting for the X1 to catch up.

Now, I've said it a million times: it's Ubi's fault for not spending the additional resources to get it to 1080p. They obviously felt it wasn't worth it. I'm pissed about that too. Like I said, it doesn't surprise me, because Ubi has been going EA lately. However, my opinion is that if I really liked this game, I would play it at 900p on my 50-inch screen. That's my opinion.

But according to you, they've spent additional resources to bump up from 792p, on a version that they're emphasizing "parity" for. And you don't find this hypocritical? Your 900p could've potentially been higher.
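As an aside, the raw pixel arithmetic behind the 792p, 900p, and 1080p figures being argued over is easy to check. This is plain math about pixel counts, not a claim about how either console actually performs:

```python
# Pixel counts for the resolutions discussed in this thread (all 16:9).

def pixels(width: int, height: int) -> int:
    return width * height

p1080 = pixels(1920, 1080)   # 2,073,600 pixels
p900  = pixels(1600, 900)    # 1,440,000 pixels
p792  = pixels(1408, 792)    # 1,115,136 pixels (the earlier build mentioned)

print(p1080 / p900)   # 1.44  -> 1080p is 44% more pixels than 900p
print(p900 / p792)    # ~1.29 -> the 792p-to-900p bump is ~29% more pixels
```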
 
You are approaching this from the perspective of someone who actually knows or cares about graphical properties. This is an interview conducted by a mainstream website for a mainstream audience. All these sites, or the developers they talk to, ever mention when it comes to console games is resolution and framerate. You almost never hear any of the games media talking about any kind of graphical effect beyond those two things when comparing versions of a game. Hell, most of the gaming press couldn't even tell you what AA or AF is. Even in the article, the only two things brought up are framerate and resolution; that's it. He specifically said that they wanted to avoid the debate, and the debate is about resolution and framerate. You don't see differences in other graphical qualities exploding the internet, because largely nobody cares.

You can believe what you want, but there is a very long history of console games using differing qualities of graphical effects that simply aren't brought up in the media (outside of Digital Foundry) unless there is a noticeable difference in performance or resolution. When publishers talk about how a game looks or performs on consoles, they never mention anything having to do with graphical effects; it is always resolution and framerate. Regardless of what effects you talk about, find important, or think of as a game's "specs," the games media and PR have a very different view. It is a huge assumption to believe that he is talking about anything beyond framerate and resolution. This is exactly why I think MS wants to reach resolution parity on as many games as possible, regardless of how it impacts other aspects of the visuals. As long as they can claim parity on resolution and framerate, then nobody cares, because that is all the media reports on.

I've said it before: if it turns out that the two versions are indeed at complete parity, then I will be there waving my anti-parity flag sky-high with a plate full of crow, but I don't believe for a second that there is going to be complete parity beyond resolution and possibly framerate. Even with the resolution difference, Watch Dogs had a number of lower-quality effects on the XB1, along with slightly less stable performance. I can't think of a single game that Ubisoft has ever approached with complete parity, and I don't think they are going to start now.
 
Catching up on the thread and I noticed your comment, bit late since it was made on the 6th and you're banned, but no, I am not cancelling my pre-order of a watch.

I cancelled my pre-order of the Amazon special offer of the game that comes with a watch, thanks for the comment though.

http://i1.minus.com/ibbxluZiTMk3dr.jpg

This reminds me of the Collector's Edition for Lightning Returns.

 
Ubisoft, Microsoft, EA and a bunch more have all been shady as shit this generation.

I don't even know why!

Other than Sony returning to its top-dog status in the console market, what else has changed significantly from last gen? What has warranted all the secrecy and absurd behavior from these devs/pubs?
 
I don't even know why!

Other than Sony returning to its top-dog status in the console market, what else has changed significantly from last gen? What has warranted all the secrecy and absurd behavior from these devs/pubs?

One of them is desperate to sell a console. The other two love checks.
 
There's a lot of debate for nothing here.

This all boils down to ONE THING Ubisoft isn't doing, and it should end all discussion: they should push their game as far as they can on every piece of hardware, regardless of the differences, as long as they give the most they can with each platform.

As long as they aren't doing this, they are bullshit.
 
I don't even know why!

Other than Sony returning to its top-dog status in the console market, what else has changed significantly from last gen? What has warranted all the secrecy and absurd behavior from these devs/pubs?

I think Sony had to be humble this gen, but even they've had a slip-up here and there. I think they'll be back to making arrogant mistakes if they have a lot of gens of top-dog success like this, but we'll see :P

I don't know what has changed. I think there's a lot of bitterness toward consumers over the way we've started taking control of the conversation via social media. I think there is a bit of contempt from some of the major players that they have to change big decisions because sometimes we all collectively decide we're not going to allow ourselves to be stepped on.

But at the same time I think they love to push the boundaries because of all the counter-examples of us just buying their bullshit games anyway despite some controversy. So it's a thin line, and sometimes their contempt slips out into actual PR :P

You are approaching this from the perspective of someone who actually knows or cares about graphical properties. This is an interview conducted by a mainstream website for a mainstream audience. All these sites, or the developers they talk to, ever mention when it comes to console games is resolution and framerate. You almost never hear any of the games media talking about any kind of graphical effect beyond those two things when comparing versions of a game. Hell, most of the gaming press couldn't even tell you what AA or AF is. Even in the article, the only two things brought up are framerate and resolution; that's it. He specifically said that they wanted to avoid the debate, and the debate is about resolution and framerate. You don't see differences in other graphical qualities exploding the internet, because largely nobody cares.

In this case, I would say there is contextual points that would make this a fair assumption vis-a-vis framerate and resolution. What I was objecting to was the generalization that "specs" mean framerate/resolution in all these discussions, and to assume anything else is somehow wrong. Specs in tech interviews across the board have wildly different meanings and interpretations based on the context of the moment, and the same is true on forums like this place.

I've seen tons of interviews that specifically mention AA/AF. I've seen tons of interviews that discuss pretty much any technical effect you can think of, on mainstream sites and hardcore hobbyist sites. The key here is that the generalization was silly, not that in this specific case it is right or wrong (I'd agree in this case resolution/framerate is the smart money).

I've said it before: if it turns out that the two versions are indeed at complete parity, then I will be there waving my anti-parity flag sky-high with a plate full of crow, but I don't believe for a second that there is going to be complete parity beyond resolution and possibly framerate. Even with the resolution difference, Watch Dogs had a number of lower-quality effects on the XB1, along with slightly less stable performance. I can't think of a single game that Ubisoft has ever approached with complete parity, and I don't think they are going to start now.

I mean, no two games are ever exactly alike, so there's some pretty big wiggle room here. But if they're pretty much indistinguishable from one another, with the PS4 having a one- or two-frame advantage here and there, that's still essentially parity.

But yes, you're right that there can be other visual differences. Logically, though, it would not make sense if we're following the fucked-up logic that allowed them to admit this shit in the first place. That logic was explicitly that they wanted to "avoid debate," and you don't do that by having some other portion of the game be hugely distinct visually. Things like significantly better AA, shadows, or texture quality would be obviously noted, and the debate would be on.

Well yeah, this could just be BS PR from Ubi, and they're not going to change anything. However, because "Final specs for Assassin's Creed Unity aren't cemented," it means that when they said "we're proud to say that we have reached those goals on all SKUs," they were bullshitting. So they didn't reach the goals; they forced the parity.

Yeah, of course. I'm just very interested to see where they go from here. It's bullshit they think this is acceptable behavior.
 
As long as they aren't doing this, they are bullshit.

We don't, and have never known whether they are or are not doing this. They both have 30 frame caps and a 900p internal rendering resolution. Yet these are not the only points of comparison between versions. Even if there are no new effects or graphical improvements, if the "dips down to 20fps in some places" still happens on XBO but doesn't on PS4, it would make a lot of this talk seem quite silly retroactively.
 
We don't, and have never known whether they are or are not doing this. They both have 30 frame caps and a 900p internal rendering resolution. Yet these are not the only points of comparison between versions. Even if there are no new effects or graphical improvements, if the "dips down to 20fps in some places" still happens on XBO but doesn't on PS4, it would make a lot of this talk seem quite silly retroactively.

hehe @ an AC game not dipping into the 20s on every console ;)
 
Well, the comment that they ever targeted 1080p60 was bullshit, because it should've been clear that this wasn't possible from the footage we saw. Blaming Ubisoft now for a comment by a level designer is of course justified, but I think no one with a reasonable mind expected the game to hit 1080p60.

So let me get this straight. Your argument is that when a developer states they are targeting 1080p/60fps with their next game, we should immediately know they are lying, and thus not hold them accountable to that statement? Yeah, that makes total sense.

Do you even realize how crazy that sounds? Take a step back, reread your own words, and perhaps rethink things before typing them.

If a developer says they are aiming for 1080p/60, then the assumption should be that they are aiming for 1080p/60, because that's what they said. When the product does not hit that target, that's fine; obviously things did not go as planned. But they can't then go back and claim "well, we hit all our targets for all platforms and never actually scaled back on anything" when they are on record saying otherwise. That's absolutely ludicrous, and they totally deserve to get called on it.
 
His quote was "Technically we're CPU-bound." He didn't say "we were," so I'm assuming they're still CPU-bound.

You're CPU-bound even with high-end PC hardware (like an OC'd 5960X and a GTX 980) in quite a lot of games, depending on the resolution of course, and not at 60 or even 30 fps. Let's use 1080p with no AA. Just let me throw in some: Watch Dogs, Wolfenstein, Battlefield 4 (MP), Star Citizen, Skyrim (vanilla), WoW, The Sims 4, PvZ, Goat Simulator, Project Cars, GRID Autosport, Ry... (oh, can't talk about that yet), etc. The list goes on.

However, by turning down the resolution you will gain not a single fps if CPU-bound. So the reason Ubisoft turned down the resolution to 900p has absolutely nothing to do with being CPU-bound. If it were just that, and the GPU were powerful enough, they could just as well go to 4K; it wouldn't make any difference to the framerate.

So what they said regarding the PS4 was not true. It might not have been completely false, but turning down the resolution has to have a different reason. It can only be the GPU, or trying to hide a reduction of the level of detail (which helps both CPU and GPU) behind a lower resolution. With that in place, you can apply a more aggressive texture LOD (which helps streaming and thus the CPU), fewer NPCs, and a lower viewing distance without people noticing much (because it's all pixelated in the distance anyway). You can also use less detailed shaders for the depth of field, shadow LOD, and other shader-related stuff, which also saves GPU time. It saves GPU memory too, so you can have more detailed textures up front.

Those 30 fps might be because of being CPU-bound. Not the resolution, however. Guess with all the resolution-gate and frame wars going on, you console gamers will have to learn some classic PC stuff as well. Might as well get a decent gaming platform ;-)
 
I played the PC version, but I thought even that had dips here and there?

DF said that it was locked

In fact, on the PS4 we never felt or noticed anything other than a locked 30fps update at all during our four hours or so of capturing - an impressive feat considering the boost to 1080p rendering. The Xbox One version mostly achieves the same solid 30fps performance at 900p, but there are times when the frame-rate is noticeably compromised, resulting in uneven motion and even the odd torn frame (too few in number to be noticeable).

http://www.eurogamer.net/articles/digitalfoundry-assassins-creed-4-next-gen-face-off
 
99% of the time it was 30 fps. If you got into a huge naval battle it'd drop a little but hardly ever noticeable.

It's actually locked at 62.5 fps. Any decent mid-range GPU will get there at 1080p with no AA, or with FXAA or SMAA (those only cost about 2-7% of the GPU's performance).

If you're using vsync with double buffering, however, the framerate will crash to 30, even if you'd get 59.9 without vsync.

These kinds of drops you can fix yourself (by turning vsync off or forcing triple buffering).
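The double-buffering drop described above can be sketched numerically. This is a simplified model (real drivers add frame queues and other wrinkles): with vsync and only two buffers, a finished frame can only be swapped on a vblank and the GPU stalls until then, so at a 60Hz refresh the framerate snaps to 60, 30, 20, 15...

```python
import math

REFRESH_HZ = 60.0
VBLANK_MS = 1000.0 / REFRESH_HZ   # ~16.67 ms between refreshes

def vsync_double_buffer_fps(render_ms: float) -> float:
    """Frame rate under vsync with double buffering: each frame is held
    until the next vblank, and rendering stalls during the wait."""
    vblanks_waited = math.ceil(render_ms / VBLANK_MS)
    return REFRESH_HZ / vblanks_waited

print(vsync_double_buffer_fps(16.0))   # 60.0 - frame ready within one refresh
print(vsync_double_buffer_fps(16.7))   # 30.0 - barely missing 16.67 ms halves fps
print(vsync_double_buffer_fps(34.0))   # 20.0 - only every third vblank is hit
```

Triple buffering avoids the stall by letting the GPU keep rendering into a third buffer, which is why forcing it (or turning vsync off) recovers the ~59.9 fps mentioned above.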
 
I think Sony had to be humble this gen, but even they've had a slip-up here and there. I think they'll be back to making arrogant mistakes if they have a lot of gens of top-dog success like this, but we'll see :P

Nah, Sony's already learned from what happened to them with the PS3, and from what happened to Microsoft with the Xbox One.

I doubt they'll turn back into the Sony of 2005-2007, especially with their current financial situation.
 

Ah, interesting. Well, I'll give them that exception, it being a cross-gen game and a launch title at that, which needed an update to get the resolution right. So I doubt it'd happen for a legit "next-gen console only" title!


To be honest, they'll have to be humble forever. Their company still isn't exactly in top shape financially.

You mean they should be.

Will they be? I hope so :P
 
It's just not that simple. But forget about the comparison, as that drags on forever... Also forget about the Xbone build of the game. And consider this: Watch Dogs was 900p/30fps on PS4. Both are from the same company, which shares a lot of tech internally (and that is very noticeable in their games). Sure, there are engine optimizations, but ACU looks a lot better, sports a better lighting model, better character models, and far more people on screen at once, all of whom have more advanced AI routines than the ones found in Watch Dogs. Why is it so out of the ordinary that ACU is also 900p, to the point that some other company MUST have paid for parity?

If those enhanced specs are limiting the PS4 to 900p then the Xbone spec should be even lower like it was with Watch Dogs, but it's not. They are either deliberately not using all of the PS4's power, or they are using that extra power to enhance the PS4 version in other ways (very slim chance given the "we locked at the same spec" comment). The latter would still be a suspicious move because they know most of their audience cares about resolution first, so why not prioritize that?
 
If those enhanced specs are limiting the PS4 to 900p then the Xbone spec should be even lower like it was with Watch Dogs, but it's not. They are either deliberately not using all of the PS4's power, or they are using that extra power to enhance the PS4 version in other ways (very slim chance given the "we locked at the same spec" comment).

Well, let's be honest: no game at this (still early) point of the gen is using the PS4 to its fullest capabilities. There's a difference between that and intentionally not wanting to do more, and Ubisoft's statements come across as the latter.

The latter would still be a suspicious move because they know most of their audience cares about resolution first, so why not prioritize that?

The audience they targeted the statement to? Yeah. The console's audience overall though? Disagree. The game would probably still do pretty well if it was 720p on the PS4 TBH. The series is too popular.
 
I tell you what. Even if they don't end up bumping up the PS4 version, it's still pretty damn gorgeous. I just saw more footage that left me quite impressed with the visuals, new animations, the traversal and the co-op. It shows off the game fairly well for a bite-sized 5 minutes.

https://www.youtube.com/watch?v=UbiJwqUV_j4

that's just playing into their hand, the ol' "well maybe it's good enough let's settle"

I want 1080p; I've moved past these old resolutions. Sure, if it's not possible, I'll accept it. But only if it's not possible and there's no alternative. But if you're going to come and tell us you gimped a version just to appease pathetic Xbox fanboys? Well, now you've taken the anti-consumer nonsense into serious territory, and I gotta fight back on principle.
 
I tell you what. Even if they don't end up bumping up the PS4 version, it's still pretty damn gorgeous. I just saw more footage that left me quite impressed with the visuals, new animations, the traversal and the co-op. It shows off the game fairly well for a bite-sized 5 minutes.

https://www.youtube.com/watch?v=UbiJwqUV_j4

Damn, that's fuzzy looking. Also, I'm not sure what to blame for the juddery frame rate, YouTube or Ubisoft, but it doesn't look that stable or smooth to me.
 
that's just playing into their hand, the ol' "well maybe it's good enough let's settle"

I want 1080p; I've moved past these old resolutions. Sure, if it's not possible, I'll accept it. But only if it's not possible and there's no alternative. But if you're going to come and tell us you gimped a version just to appease pathetic Xbox fanboys? Well, now you've taken the anti-consumer nonsense into serious territory, and I gotta fight back on principle.

You gotta do what you gotta do. But I am not going to let 900p stop me from enjoying a game I want to play. Of course that does not mean that I have to agree with the stupidity of their statement or reasoning. Regardless, I have the game pre-ordered on PS4 and will play it because I think it looks both beautiful and fun. Not going to deny myself that over this mess.
 
I'm perplexed that people even question the undeniable fact that AAA continues to exist because of consoles. The business model started with consoles and will end with consoles. Sure, there are a few outliers, like Crysis (and Crytek had to go to consoles due to piracy), and of course Star Citizen, but given that it's a pay2win game with ships that cost hundreds of dollars, that's not surprising.

Without consoles, these publishers would've been making mobile games. We would only have MOBAs, MMOs, RTS games, rampant F2P models, and indie games. People who love traditional gaming should be thankful consoles exist.

That is not even remotely true.
 