Assassin's Creed "Parity": Unity is 900p/30fps on both PS4 & Xbox One

For realz? When will people stop pulling stories out of their ass? All the "benchmarks" are Java- and HTML5-based browser benchmarks, which prove nothing. From all we know, the CPUs in both consoles are twins: the PS4 runs at 1.6GHz and the Xbox One at 1.75GHz. Otherwise they are 1:1 copies of each other.

I mean, not only do you assume that both consoles dedicate the same amount of power when running the browser, you also assume that WebKit and Internet Explorer would produce anywhere near the same results.

SMH

Yeah, just Java and HTML5 based tests, sure...

Yes, you can get more out of the PS4's CPU than you can the Xbox's.
 
So if it's a CPU bottleneck (which it mostly isn't), Ubisoft can at least put better filtering and AA on screen... right?
 
For people buying on PC: honestly, has there ever been an optimised AC game before?
Also, how was ACIV on PC?
 
You can get more out of the PS4 CPU because of the better RAM, yes. The CPU itself is exactly the same; they only differ in clock speed.
BTW, there is no Kinect resource usage anymore and dedicated silicon is present in both machines.

We can't really say why the PS4 CPU performs better for games than the Xbox One's. We just have evidence that, despite the Xbox One's clock speed advantage, developers still get better results on PS4. There could be a number of reasons, including differences in memory architecture or differences in compiler optimization. Personally, I suspect Microsoft's multiple virtual machine setup has significant overhead that steals a lot of performance.
 
Are you serious? I can't believe the Xbox One would be less powerful than the Xbox 360 in regards to CPU. You're saying Sony and MS downgraded their CPUs?

Yes, that's why they emphasize "GPGPU"; watch Cerny's note on that topic. Overall those CPUs might be better than the 360's Xenon, but per-core performance is either very similar or worse, and that's against a 7-year-old CPU.

Another reason: they needed an SoC from the get-go to reduce price. So the choice was Nvidia (their GPU plus an ARM CPU on a single SoC), AMD, or Intel. In Nvidia's case, both had bad relationships with Nvidia over the GPUs in the original Xbox and PS3, plus ARM is still nowhere near the performance and flexibility of x86. Intel has excellent CPUs and is far ahead of everyone else on shrinking nodes, but their GPUs are average at best. So they chose AMD.
 
For realz? When will people stop pulling stories out of their ass? All the "benchmarks" are Java- and HTML5-based browser benchmarks, which prove nothing. From all we know, the CPUs in both consoles are twins: the PS4 runs at 1.6GHz and the Xbox One at 1.75GHz. Otherwise they are 1:1 copies of each other.

I mean, not only do you assume that both consoles dedicate the same amount of power when running the browser, you also assume that WebKit and Internet Explorer would produce anywhere near the same results.

SMH

The benchmarks being referenced are not browser tests. They involve real game code from a middleware provider, and we have corroboration from a confirmed developer on this board.
 
For realz? When will people stop pulling stories out of their ass? All the "benchmarks" are Java- and HTML5-based browser benchmarks, which prove nothing. From all we know, the CPUs in both consoles are twins: the PS4 runs at 1.6GHz and the Xbox One at 1.75GHz. Otherwise they are 1:1 copies of each other.

I mean, not only do you assume that both consoles dedicate the same amount of power when running the browser, you also assume that WebKit and Internet Explorer would produce anywhere near the same results.

SMH

Huh? The benchmark was done using Substance Engine (an algorithmic texture generation middleware) and it's not a browser benchmark. Also, are you going to ignore what Matt (developer and insider) said?
Yes, you can get more out of the PS4's CPU than you can the Xbox's.
 
The benchmarks being referenced are not browser tests. They involve real game code from a middleware provider, and we have corroboration from a confirmed developer on this board.

They should rerun those. I'm with you on the Xb1 having more overhead. I wonder if it's been lowered since.
 
Again, without solid evidence it's stories out of someone's ass. So far we've only seen browser benchmarks. Per the AMD keynote, both CPUs are "carbon copies" of each other, from before Microsoft overclocked theirs to 1.75GHz.

My point was simply that it's not solely browser- and Java-based tests but also an insider's take on it. Whether you choose to believe Matt or not wasn't really the point of contention, but as far as insiders on here go, Matt has by far the best track record.

Huh, didn't realize the Substance test was a middleware test.
 
This really didn't have to be such a big deal. They apparently had a difficult time hitting 1080p with Watch Dogs for whatever reason, a title I assume is otherwise less technically ambitious than Unity in pretty much every single way, and it's generally understandable that sometimes a game will more comfortably sit somewhere beneath the ideal 1080p/60fps standard.

And while there's parity here in terms of resolution and FPS, that doesn't mean the X1 version will be as polished overall. I can imagine a development process that goes for something like 900p as a minimum, reaches that in slightly different ways from platform to platform, and is then finished up there without tweaking this or that to comfortably reach 1080p on a specific platform. Going back to the Watch Dogs example: even in terms of graphics, it wasn't being 900p on the PS4 that made it so disappointing. And I have to believe Ubisoft is aiming for something more visually ambitious overall with Unity.

So resolution parity isn't necessarily this horrible, damning thing. Even now, this whole story could seem somewhat different once the game actually releases. It's just amazing how viciously terrible Ubisoft's PR seems to be.
 
For people buying on PC: honestly, has there ever been an optimised AC game before?
Also, how was ACIV on PC?

Played through all of them besides Black Flag and I'd say they were fine. Nothing special, but acceptable.

FPS drops here and there and a few bugs but nothing that hinders the gameplay.

Topic related: How Microsoft wants to ruin console gaming for everyone.
 
The issue is that if you target the lowest common denominator, you automatically cripple the superior one. If you nerf the PS4 version, you force parity; if you set 900p@30fps so that the weaker unit can reach it, once again you limit the other.

I agree with you... I should have been clearer: what I meant was, what is the issue with what he said? He was essentially saying in his video that the developer needs to bring out the strengths of each console... which is what we're all agreeing on here :)
 
I like to ignore evidence that goes against my views, too.

Because devs can't be fanboys, right? I know a few people at Ubisoft Shanghai (not related to Unity), and some of them are clearly fanboys of one platform: they were two gens ago, last gen, and they chose the same platform again this gen (talking about the Xbox family). In their eyes Xbox is just better, even though objectively the Xbox One is the weaker hardware overall this gen.

Checking his post history, I can clearly see which side he leans toward. But that might just be a coincidence.

Derp, it's time to pack it in, babe.

Nice edit.
 
Yeah, just Java and HTML5 based tests, sure...
This was before the Kinect resources were made usable though, wasn't it? And even still, "more" need not equate to the possibility of upping the resolution or the framerate considerably.
The original reason given for parity was "to stop debate". Why then are you assuming that stability or cost is the actual reasoning? Why would a producer on the game lie about that? How does that make sense? How is that a better reason for consumers to accept?
If an impact on stability is a consequence of upping the framerate or resolution, that will lead to discussions. Because then, which version is the one to get?

My point was that MS was not aware of the effect such a large difference in an important title would have on the larger market's perception of their console until the uproar [aka "resolutiongate"]. MS could easily have thought it would just be an issue amongst core gamers and not receive all the exposure it did, thus they would've been in no position at the time to do anything about COD's performance disparity.
It's possible that Microsoft underestimated the impact of the perceived difference between PS4 and Xbox One (not saying there is none, but sometimes it seems as if certain PS4 owners think it's almost night and day), but I'm quite positive that they knew having a worse version would be to their detriment.

Eh, crippled is probably a step too far. I guess there is some debate as to what "barely playable" actually means. I could foresee games that struggle on XB1, and therefore, if forced parity is a wider thing, the PS4 version being turned down quite dramatically, but I suppose outright crippled would be unlikely.
You usually don't aim for "barely playable" when developing a game. You might accept it for a port though.
 
For realz? When will people stop pulling stories out of their ass? All the "benchmarks" are Java- and HTML5-based browser benchmarks, which prove nothing. From all we know, the CPUs in both consoles are twins: the PS4 runs at 1.6GHz and the Xbox One at 1.75GHz. Otherwise they are 1:1 copies of each other.

I mean, not only do you assume that both consoles dedicate the same amount of power when running the browser, you also assume that WebKit and Internet Explorer would produce anywhere near the same results.

SMH

Son, stop derpin' and show me sources.
 
Because devs can't be fanboys, right? I know a few people at Ubisoft Shanghai (not related to Unity); some of them are clearly fanboys of one platform (past gen).

Checking his post history, I can clearly see which side he leans toward. But that might just be a coincidence.

Nice edit.

Edit? I misspelled your name because autocorrect doesn't recognize derp.

Keep pushing against vetted devs, I guess. Sometimes it's far easier just to admit you fucked up.
 
Sources for the bolded pls?

Source? Knowledge of computer architecture :) Nothing is as fast as the CPU and its registers. This is why there is a thing called the "memory pyramid": the bigger the memory, the slower it is. From the CPU's viewpoint you are almost always waiting for data, so you benefit from faster memory.
As both systems feature the same CPU (including caches), it is important to have fast main memory so you can get data to the CPU quickly. As the PS4's memory is a lot faster than the XBone's, the CPU spends less time "idling" and you can thus get more out of it. This doesn't make the CPU itself faster.
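To make the "idling" point concrete, here is a minimal pointer-chasing sketch (plain C, nothing console-specific; the array size and step count are arbitrary). Every load depends on the previous one, so the core stalls for roughly a full memory latency per step, which is exactly the waiting-for-data effect described above.

```c
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N     (1u << 24)   /* 16M entries: far bigger than any cache */
#define STEPS (1u << 26)   /* dependent loads to time */

int main(void) {
    size_t *chain = malloc(N * sizeof *chain);
    if (!chain) return 1;

    /* Sattolo's shuffle links the array into one random cycle, so the
       hardware prefetcher cannot predict the next address. */
    for (size_t i = 0; i < N; i++) chain[i] = i;
    for (size_t i = N - 1; i > 0; i--) {
        size_t j = (size_t)rand() % i;
        size_t tmp = chain[i]; chain[i] = chain[j]; chain[j] = tmp;
    }

    /* Each load needs the previous result, so latency cannot be hidden. */
    clock_t t0 = clock();
    size_t idx = 0;
    for (size_t s = 0; s < STEPS; s++) idx = chain[idx];
    clock_t t1 = clock();

    printf("idx=%zu  ~%.1f ns per dependent load\n", idx,
           1e9 * (double)(t1 - t0) / CLOCKS_PER_SEC / STEPS);
    free(chain);
    return 0;
}
```

Shrink N until the chain fits in cache and the per-load time collapses: that's the memory pyramid in action.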
 
To be honest, it is probably a bit of both: Ubisoft being too lazy to optimize it for both platforms and MS somehow nudging them to keep parity.

Will be interesting to see what comes of this. Better a shitstorm about parity now than lying down and letting it roll over us, encouraging other devs to do the same thing. Considering the X360 consistently had better multiplats last gen, I don't see how not having parity is an issue.
 
Heh, pretty much this all of last-gen:
[image]
 
Edit? I misspelled your name because autocorrect doesn't recognize derp.

Keep pushing against vetted devs, I guess. Sometimes it's far easier just to admit you fucked up.

Vetted? Does that mean he can't be biased or misleading? Cboat is clearly biased against Xbox even though he is either part of Microsoft or a close partner, while ntkrnl is heavily biased towards it, to the point where his posts read like controlled leaks or sneaky PR.
 
Can we expect Unity to run worse on PC than AC IV did? It seems likely, since it's better looking than AC IV and Ubi has never been very good at optimizing for PC. If so, I won't be getting this game; I doubt my aging GTX 680 could handle it well.
 
Source? Knowledge of computer architecture :) Nothing is as fast as the CPU and its registers. This is why there is a thing called the "memory pyramid": the bigger the memory, the slower it is. From the CPU's viewpoint you are almost always waiting for data, so you benefit from faster memory.
As both systems feature the same CPU (including caches), it is important to have fast main memory so you can get data to the CPU quickly. As the PS4's memory is a lot faster than the XBone's, the CPU spends less time "idling" and you can thus get more out of it. This doesn't make the CPU itself faster.

The Substance Engine benchmark which first shed light on the CPU performance difference is purely algorithmic and does not stress memory bandwidth. Based on the evidence, that is the least likely explanation.
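For contrast with the memory-bound case, here is what a purely algorithmic, register-resident workload looks like (a hand-rolled sketch, not Substance's actual code). There is no array traffic at all, so the score tracks clock speed and per-core IPC and is indifferent to whether GDDR5 or DDR3 sits behind the CPU.

```c
#include <stdio.h>
#include <time.h>

int main(void) {
    const long iters = 1L << 28;
    unsigned x = 0x9e3779b9u;   /* arbitrary non-zero seed */

    /* xorshift steps: pure ALU work on a value living in a register,
       so memory bandwidth never enters the picture. */
    clock_t t0 = clock();
    for (long i = 0; i < iters; i++) {
        x ^= x << 13;
        x ^= x >> 17;
        x ^= x << 5;
    }
    clock_t t1 = clock();

    printf("x=%u  %.0f M iters/s\n", x,
           iters / 1e6 / ((double)(t1 - t0) / CLOCKS_PER_SEC));
    return 0;
}
```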
 
Source? Knowledge of computer architecture :) Nothing is as fast as the CPU and its registers. This is why there is a thing called the "memory pyramid": the bigger the memory, the slower it is. From the CPU's viewpoint you are almost always waiting for data, so you benefit from faster memory.
As both systems feature the same CPU (including caches), it is important to have fast main memory so you can get data to the CPU quickly. As the PS4's memory is a lot faster than the XBone's, the CPU spends less time "idling" and you can thus get more out of it. This doesn't make the CPU itself faster.

OK, so it's just an opinion, as I thought. We don't have confirmed info on clocks or the number of cores unlocked for games. That's the point people mostly ignore.
 
Vetted? Does that mean he can't be biased or misleading? Cboat is clearly biased against Xbox even though he is either part of Microsoft or a close partner, while ntkrnl is heavily biased towards it, to the point where his posts read like controlled leaks or sneaky PR.

Yup, you're too far down that tunnel now. Be careful, I guess.
 
No matter how you spin it, it's impossible. You would have to dumb down the AI and remove those massive crowds, which they highlight as one of the pillars of AC: Unity. The new console CPUs are designed for tablets and netbooks, not gaming hardware. I would bet that per-core performance is worse than the Xbox 360's CPU.

Mmmh, perhaps... There are a lot of things to consider here, though.
Jaguar is a far more modern CPU with a lot of instruction set extensions, while of course being clocked way lower. I'd say there is no clear winner in this comparison; both CPUs would have strengths in different (gaming) scenarios.
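A back-of-the-envelope way to frame that comparison, with made-up IPC numbers purely for illustration (neither value is an official or measured spec): per-core throughput is roughly clock × IPC, and Jaguar's out-of-order design buys back much of what it loses in clock against the in-order Xenon.

```c
#include <stdio.h>

/* Per-core throughput ~ clock * IPC. Both IPC values are illustrative
   guesses, not published figures. */
int main(void) {
    double xenon  = 3.2e9 * 0.3;   /* in-order PPC core: low IPC on branchy code */
    double jaguar = 1.6e9 * 0.6;   /* out-of-order x86 core: higher IPC */
    printf("Xenon  core: ~%.2f G instr/s\n", xenon  / 1e9);
    printf("Jaguar core: ~%.2f G instr/s\n", jaguar / 1e9);
    return 0;
}
```

With these particular guesses the two land in the same ballpark, which is the "no clear winner" point; real workloads would swing it either way.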
 
Son, stop derpin' and show me sources.

Nice tactic, flipping the tables like that. So far, all the sources for the PS4's "stronger" CPU have been moot posts by a random insider. The only concrete data I saw was HTML5 and Java benchmarks from around launch, or some texture generator from before the June SDK. Even then, the OS might take a different portion of CPU performance, but overall they are exactly the same, and with the overclock the Xbox One CPU is objectively faster; the June SDK just frees up more cycles for games to work with.

Anyone with common sense who looks at the specs or the AMD keynote will see the CPUs are exactly the same; hell, they even look the same in silicon, same layout of cores, etc. But keep going on about "PS4 CPU is stronk, insider told me so".
 
Vetted? Does that mean he can't be biased or misleading? Cboat is clearly biased against Xbox even though he is either part of Microsoft or a close partner, while ntkrnl is heavily biased towards it, to the point where his posts read like controlled leaks or sneaky PR.

So? We're not only taking his word for it. He's just corroborating information published by a middleware provider.
 
Source? Knowledge of computer architecture :) Nothing is as fast as the CPU and its registers. This is why there is a thing called the "memory pyramid": the bigger the memory, the slower it is. From the CPU's viewpoint you are almost always waiting for data, so you benefit from faster memory.
As both systems feature the same CPU (including caches), it is important to have fast main memory so you can get data to the CPU quickly. As the PS4's memory is a lot faster than the XBone's, the CPU spends less time "idling" and you can thus get more out of it. This doesn't make the CPU itself faster.

But PS4 RAM is not really faster; rather, it has higher bandwidth than the Xbone's. And when talking about the CPU, the PS4's CPU has 20GB/s of bandwidth while the Xbone's CPU has 30GB/s.
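Worth separating the two axes: those 20/30 GB/s figures describe bandwidth, while the pointer-chasing demo above measures latency. A minimal STREAM-style copy (plain C, sizes arbitrary) measures sustained bandwidth, because sequential access lets the prefetcher run flat out:

```c
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N (1u << 25)   /* 32M doubles = 256 MB per array */

int main(void) {
    double *a = malloc(N * sizeof *a);
    double *b = malloc(N * sizeof *b);
    if (!a || !b) return 1;
    for (size_t i = 0; i < N; i++) a[i] = (double)i;

    /* Sequential copy: the prefetcher streams the data, so the limit
       here is bandwidth, not latency. */
    clock_t t0 = clock();
    for (size_t i = 0; i < N; i++) b[i] = a[i];
    clock_t t1 = clock();

    double secs  = (double)(t1 - t0) / CLOCKS_PER_SEC;
    double bytes = 2.0 * N * sizeof(double);   /* one read + one write */
    printf("b[1]=%.0f  ~%.2f GB/s\n", b[1], bytes / secs / 1e9);
    free(a); free(b);
    return 0;
}
```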
 
All these gaming journalists who know nothing about hardware specs and their impact on performance.

If the game is CPU-bound, as Ubi are saying, then increasing the resolution will not affect the framerate, because resolution is wholly a GPU cost. If that is their technical reason, it is total bullshit. If the Xbox One can run the game at 900p30, then the PS4 can do it at 1080p30; it is that simple.
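That argument as a toy model (every number invented for illustration): a frame costs max(cpu_time, gpu_time), CPU time doesn't change with resolution, and GPU time scales roughly with pixel count.

```c
#include <stdio.h>

/* Toy frame-time model; all figures are invented for illustration. */
static double frame_ms(double cpu_ms, double gpu_ms_1080p, double pixel_frac) {
    double gpu_ms = gpu_ms_1080p * pixel_frac;  /* GPU cost ~ pixel count */
    return cpu_ms > gpu_ms ? cpu_ms : gpu_ms;   /* the bottleneck sets the frame */
}

int main(void) {
    double frac_900p = (1600.0 * 900.0) / (1920.0 * 1080.0);  /* ~0.69 */
    /* Suppose the CPU needs 33 ms per frame and the GPU 25 ms at 1080p. */
    printf(" 900p: %.1f ms/frame\n", frame_ms(33.0, 25.0, frac_900p));
    printf("1080p: %.1f ms/frame\n", frame_ms(33.0, 25.0, 1.0));
    /* Both print 33.0 ms: if the CPU really is the wall, dropping to
       900p buys no framerate, which is the point above. */
    return 0;
}
```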

This is a case of Ubisoft gimping a version of the game because they do not want the fallout. Maybe it is a way to get the game in the news, so that when they announce a 1080p patch for the PS4 version, that news hits more potential customers.

To any journalists who may read this: the reason you should talk to Ubi about it is that the technical reasons their PR people are providing do not hold up to scrutiny. They are bogus reasons, and it is your job as journalists to question the line provided by these companies and get to the truth.
 
I understand why people would be mad over this, yet I also understand Ubisoft's view.

I'm more worried about whether the game is actually good and won't bore me after an hour.
 
Nice tactic, flipping the tables like that. So far, all the sources for the PS4's "stronger" CPU have been moot posts by a random insider. The only concrete data I saw was HTML5 and Java benchmarks from around launch, or some texture generator from before the June SDK. Even then, the OS might take a different portion of CPU performance, but overall they are exactly the same, and with the overclock the Xbox One CPU is objectively faster; the June SDK just frees up more cycles for games to work with.

Anyone with common sense who looks at the specs or the AMD keynote will see the CPUs are exactly the same; hell, they even look the same in silicon, same layout of cores, etc. But keep going on about "PS4 CPU is stronk, insider told me so".

OK, I see you're just out trolling; carry on then.
 
Cancelled as soon as I heard about it. Took my son to school, walked to EB Games, and transferred the funds from Unity to GTAV.

[image]
 
This was before the Kinect resources were made usable though, wasn't it? And even still, "more" need not equate to the possibility of upping the resolution or the framerate considerably.

Wait, what? The Kinect resources consisted of something close to 10% of the GPU being made somewhat available for game usage. Nothing to do with CPU usage/resources.

The PS4 GPU is roughly 50% more powerful spec-wise than the XB1 GPU, even before taking into account the 10% Kinect reserve that they have somewhat gotten back.
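The "roughly 50%" is the compute-unit count (18 CUs vs 12); fold in the Xbox One's higher GPU clock and the theoretical FLOPS gap comes out nearer 40%. Quick arithmetic from the public GCN specs:

```c
#include <stdio.h>

/* Theoretical GCN throughput: CUs * 64 lanes * 2 ops/cycle (FMA) * clock. */
int main(void) {
    double ps4 = 18 * 64 * 2 * 800e6;   /* 18 CUs @ 800 MHz -> ~1.84 TFLOPS */
    double xb1 = 12 * 64 * 2 * 853e6;   /* 12 CUs @ 853 MHz -> ~1.31 TFLOPS */
    printf("PS4 %.2f TFLOPS vs XB1 %.2f TFLOPS (+%.0f%%)\n",
           ps4 / 1e12, xb1 / 1e12, 100.0 * (ps4 / xb1 - 1.0));
    return 0;
}
```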

If an impact on stability is a consequence of upping the framerate or resolution, that will lead to discussions. Because then, which version is the one to get?

The "debates" that the producer was clearly referring to involves graphics differences between PS4 and XB1 games. What PS4 game has a higher resolution but worse stability? Ghosts maybe, kind of? And the best comparison we have for AC Unity is ACIV which had no issues in the PS4 version being a higher resolution to stabilty or framerate.

Honestly, I think you're reading way too much into what seems to be a very straightforward quote.

It's possible that Microsoft underestimated the impact of the perceived difference between PS4 and Xbox One (not saying there is none, but sometimes it seems as if certain PS4 owners think it's almost night and day), but I'm quite positive that they knew having a worse version would be to their detriment.

Right... and again, my point in that last post was that MS likely thought it would be to their detriment among a small minority, not a whole thing with Twitter hashtags and all that. MS were likely fully aware beforehand that they'd get a lesser version of the game, but thought it wouldn't blow up the way it did.

You usually don't aim for "barely playable" when developing a game. You might accept it for a port though.

True enough, although some developers aren't as technically proficient as others, so poorly optimized games are sure to be a reality on both systems at some point, if not already.
 