The Witcher 3: 1080p/30fps on PS4, 900p/30fps on Xbox One

Not really. Not for everyone, at least. It depends on how far you sit from the TV. I sit about 8 ft away, I'd guess, and the difference is negligible. If you require side-by-side comparisons to see a difference, it's not a huge deal imho. I think many use their PC monitors to look at side-by-sides from a few inches away, having been told the numbers beforehand, and come away with the predisposition that one is dramatically different from the other. In reality we are often shown upscaled 1080p imagery and nobody ever notices until they are told.

For instance, IGN's video comparison for CoD: Ghosts last year. It was 720p in both versions, yet many people were 'certain' the PS4 version was dramatically more crisp. Similarly with BF4: the first batch of DF images had people shocked it looked 'better' on X1, due to many preferring the darker blacks and warmer hues, without the res difference being all that visible. Yet once we got detailed metrics, suddenly the PS4 version looked worlds ahead...somehow. Ryse has some incredibly nice IQ at 900p, yet we didn't notice it was 900p until we were told so. Nobody actually sits 8 ft or more from their 1080p TV and plays these consoles side by side simultaneously. Even if we did, we wouldn't notice anything in most cases. The exception might be thin geometry like BF4's power lines, etc., where the aliasing gives the lower res away.

I think the expectation/assumption that the PS4 version is superior, based on metrics that aren't as useful for judging the impact visuals have on the user as they used to be, has a strong psychological effect on folks. Especially ones here at GAF, surrounded by a veritable echo chamber of forumers who have spent the past 24 months making assertions about how these consoles would perform relative to one another.

720p.jpg
 
let him believe.

Poor guy. In case he could not understand: he/she can play on whatever they please, but their facts are all wrong, I'm afraid.
The recommended specs will certainly provide a better experience than either console, provided the 770 is the 4 GB version: the game will run better and/or with more effects.

Even so, one should not expect 60fps at high settings with that setup.
 
...and more peak memory bandwidth when used properly, and a much better chip to handle audio tasks, and, as you noted, a CPU with more cores and a higher clock. Seeing as the CPU is what tends to govern open-world game framerates, and since audio is traditionally done on the CPU, I'd say those add up to notable advantages. It was certainly relevant in GTA and Unity, where it helped those games effectively reach parity (or even favor X1's build) even at the same resolution as the PS4 build.

Question: What happens when they begin optimizing graphics in the coming months (almost always the last thing to be optimized in open-world games, since bugs are the top priority) and tap into the GPU boosts from the past several SDK updates? What happens when they can enjoy a more stable framerate once they utilize the November update that opens up the seventh core?

My impression is many of you guys haven't considered the actual process of developing a game like this or how the timelines tend to play out. It's very, very likely that the eventual DF article discussing the release build will be 1080p on both. It's very, very *possible* that we end up 1080p on both with X1 having an ever so slightly more stable framerate a la Unity.
Anyone else hear that old-fashioned tap-dance music while reading this post?

It can't just be me, can it?
 
At the end of the day, I'm sure the difference would only be really noticeable if you were playing the exact same scenes at the same time on two consoles on two TVs right next to each other.

I'd love to see an actual controlled test where someone has to watch someone play on one TV on one console, never seeing anything that might give away which console is being used, like the controller or the options menu.

The next day, they would see the exact same area played on the other console.

They would then need to guess which is which.

This should be repeated 12 to 20 times, with a correct-guess rate of around 50% being considered pure guessing.

This eliminates any sense of console bias. If one looked demonstrably better, the rate of correct identification would be much higher.
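For anyone curious how such a test would actually be scored: under pure guessing, the number of correct calls out of N viewings follows a Binomial(N, 0.5) distribution, so you can check whether the observed hit rate is meaningfully above chance. A minimal sketch in Python (the 16-viewing figures are purely illustrative, not from the post above):

    from math import comb

    def p_at_least(correct, trials, p=0.5):
        # One-sided binomial tail: probability of getting >= `correct` hits
        # out of `trials` viewings if the viewer is purely guessing (p = 0.5)
        return sum(comb(trials, k) * p**k * (1 - p)**(trials - k)
                   for k in range(correct, trials + 1))

    print(p_at_least(14, 16))  # ~0.002 -> very unlikely under pure guessing
    print(p_at_least(9, 16))   # ~0.40  -> entirely consistent with guessing

So with 16 viewings, 9 correct calls tells you nothing, while 14 correct calls would be strong evidence one version really is identifiably better.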
 
...no, they aren't, actually. One only does compression/decompression and some DMA work (PS4); the other does that for more than twice as many voices, along with a bunch of other stuff. Stop.

Why is it so difficult for you guys to admit that X1 has hardware advantages in some areas over PS4?

The sound of 1080p.
 
Man, you're good. Thirty or so posts into your GAF career, all guns blazing: secret sauce, resolutions not noticeable, infinite power of the cloud, talking about bans and rules.

Found that secret chip yet?

Oh here we go....

I guess actual facts aren't welcome here.




plasmawave, read the second paragraph of what you quoted from my previous post there. You guys seem to have a hard time acknowledging the way people reacted even here on these very forums. Hell, some of those reactions I described may well have been from some of you guys!



cakely, see Pathfinder's post. Also, we have a dev on this very forum who said without compromise it was very real and he was making a game to exploit that fact. We also have devs (lead on Watch_Dogs, guy from Avalanche, and others) who said it was absolutely interesting and might be the real deal.



I get the feeling people here care more about clinging to some ignorant narrative than actually engaging in factual discussions. The narrative which has been adopted as consensus here on GAF is very far from informed. That's the reason many of the folks here must resort to vast corporate conspiracy theories when things don't play out as they assumed they would.
 
Not really. Not for everyone, at least. It depends on how far you sit from the TV. I sit about 8 ft away, I'd guess, and the difference is negligible. If you require side-by-side comparisons to see a difference, it's not a huge deal imho. I think many use their PC monitors to look at side-by-sides from a few inches away, having been told the numbers beforehand, and come away with the predisposition that one is dramatically different from the other. In reality we are often shown upscaled 1080p imagery and nobody ever notices until they are told.

For instance, IGN's video comparison for CoD: Ghosts last year. It was 720p in both versions, yet many people were 'certain' the PS4 version was dramatically more crisp. Similarly with BF4: the first batch of DF images had people shocked it looked 'better' on X1, due to many preferring the darker blacks and warmer hues, without the res difference being all that visible. Yet once we got detailed metrics, suddenly the PS4 version looked worlds ahead...somehow. Ryse has some incredibly nice IQ at 900p, yet we didn't notice it was 900p until we were told so. Nobody actually sits 8 ft or more from their 1080p TV and plays these consoles side by side simultaneously. Even if we did, we wouldn't notice anything in most cases. The exception might be thin geometry like BF4's power lines, etc., where the aliasing gives the lower res away.

I think the expectation/assumption that the PS4 version is superior, based on metrics that aren't as useful for judging the impact visuals have on the user as they used to be, has a strong psychological effect on folks. Especially ones here at GAF, surrounded by a veritable echo chamber of forumers who have spent the past 24 months making assertions about how these consoles would perform relative to one another.

This includes the feedback of verified 3rd party game developers working on both consoles. Spare us the condescending stuff. Thanks.
 
900p to 1080p is not much of a difference, so it does not matter anyway. But lower-res textures on top of that would be sad...

You do realize that lower resolution degrades not only the appearance of the textures but the entire image, yes? It wouldn't surprise me if texture resolution is the same on both and people are just seeing a lower-resolution image overall, not lower-res textures.

There really is a big difference between 900p and 1080p. Having roughly 44% more pixels rendered provides a cleaner, clearer image... especially since there's no upscaling required. If you can't see it, then fine. But PC gamers have been seeing the difference clearly for years.
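For reference, the raw pixel arithmetic behind that comparison, as a quick illustrative Python sketch (1920x1080 vs the usual 1600x900 render target; the exact percentage isn't something the post above states):

    # Pixel counts for the two render resolutions being compared
    pixels_1080p = 1920 * 1080   # 2,073,600
    pixels_900p = 1600 * 900     # 1,440,000

    ratio = pixels_1080p / pixels_900p
    print(f"1080p renders {ratio:.2f}x the pixels of 900p "
          f"({(ratio - 1) * 100:.0f}% more)")   # 1.44x, i.e. ~44% more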
 
Not sure what you mean?
I've just checked the recommended specs for The Witcher 3 and priced them online: the motherboard, graphics card, processor and RAM alone cost £759 sterling, or €1,000, and that's without a PC case, cooling, power supply, etc., so it would easily cost €1,500 to play this at the recommended settings.

Man, this is silly. You don't need RAM, a motherboard, or a graphics card that expensive. Also, that CPU doesn't go with that motherboard.
 
Yep, really.

Think it through ... is anyone actually going to design a game with AI or physics that simply stop working when your internet connection goes down?

Developers already are working on games with AI and physics that take advantage of Azure.

I tend to lean toward actual facts rather than what people THINK will or won't happen, but thanks for your input.
 
Not sure what you mean?
I've just checked the recommended specs for The Witcher 3 and priced them online: the motherboard, graphics card, processor and RAM alone cost £759 sterling, or €1,000, and that's without a PC case, cooling, power supply, etc., so it would easily cost €1,500 to play this at the recommended settings.

The 770 is what's recommended, not a 970.
 
Expected.

Then what happens when somebody wants to play offline and can't access it?

They could turn down or turn off the advanced physics effects. Or they could just make it always-online. Destiny has proven that people will pay for singleplayer games that require an internet connection.
 
At the end of the day, I'm sure the difference would only be really noticeable if you were playing the exact same scenes at the same time on two consoles on two TVs right next to each other.

I'd love to see an actual controlled test where someone has to watch someone play on one TV on one console, never seeing anything that might give away which console is being used, like the controller or the options menu.

The next day, they would see the exact same area played on the other console.

They would then need to guess which is which.

This should be repeated 12 to 20 times, with a correct-guess rate of around 50% being considered pure guessing.

This eliminates any sense of console bias. If one looked demonstrably better, the rate of correct identification would be much higher.

I'd love to see something like this done. I think the results would be rationalized away by many here though. I can easily imagine folks chanting on and on about how the test dummies weren't "real" gamers or something along those lines.
 
Let me take it one step further. As an avid PC gamer, what if the PC parts actually DID cost a very high amount of money? So what?

No seriously, so what?

If 120+ fps, better textures, higher resolutions, etc. matter so much to me that I am willing to pay a very high premium (which isn't even the case in modern PC gaming), then let me pay what I want to pay to play how I want to play.

I just really don't get the salt. I don't look at people with more expensive setups than mine and insult them. I say, "Hey, that's awesome."
That's the point: we all know the two consoles aren't as powerful, so why so salty that the PS4 version is 1080p?
Look, who cares what anybody plays it on? We should all support such a great dev as CDPR. This game is going to be amazing on all platforms, with free DLC for everybody.
 
Yep, really.

Think it through ... is anyone actually going to design a game with AI or physics that simply stop working when your internet connection goes down?

I don't think they will; it would be an awful decision. Not to mention lots of people's internet couldn't handle it in the first place.
 
Depends on the context of the comparison. If the intention is to get an idea of relative performance for the game, I can agree that too poor a framerate makes either version unplayable, though what's "unplayable" to you might not be the same threshold for others. Your second sentence is just silly though. There hasn't been one single case of forced parity in terms of game performance that I'm aware of stemming from either Sony or MS. A dev at B3D straight up said such deals simply don't exist in the real world. I trust him. It doesn't make sense anyhow. If the version being pushed by marketing is the X1 version, then it doesn't matter what the PS4 version looks like, as it won't get the exposure until just prior to launch anyhow.

If people here hadn't bothered looking at GTA, which had a marketing deal with Sony, we wouldn't know about grass-gate. I personally care more about that kind of stuff in graphics than pixel counts, so it would annoy me if that had never been discovered. Nor would we have known about the real-world difference X1's CPU adjustments could make without studying differences in performance in Unity. Nor would we know about the horizontal upscale approach in Far Cry 4 and how nice it can look.



I wish mods here held claims about 'forced parity' to the same standard as claims about insider info. After all, that is exactly what it is...a claim made about the legal contracts between these companies. We would have a lot less nonsense in tech discussions around here if folks got banned for making claims about confidential legal agreements they have no access to, nor any evidence supporting their existence.

Please stop. You and your posts will be the reason people keep destroying any valuable discussion in future threads, taking your words as the general attitude of Xbox One fans.
 
It just reminds me of the Edge thread.

We went through all of it already: the SHAPE chip, the eSRAM bandwidth, the display planes, the secret sauce... I thought we were past this, but apparently it's a never-ending cycle.
You don't get it, man... We have to keep looking.

On topic: deliver a stable frame rate, CDP, and we are golden.
 
Man, I feel like I was just time-warped back to the beginning of this gen with these posts talking about offloading to the cloud, haha. 1080p is better than 900p; nothing anyone here says will change that fact.
 
Are you joking?

you know stand me correctly? You think joke no Sony pay N4G GAF mods to tell lies about MS and Xbox One this Sony know they have no chance agaist big MS brand that's why they announce PS4 first they scared they become irrelevant and then they start to tell lies about having more powerful console GDDR5 18 compute units MS have more powerful console but can't say nothing until AMD reveal their GPU first why you think MS 3 billion on deal AMD? Huh? You think Xbox One GPU is it? No silly ponies they have hidden power under 4th layer of the stereo drivers once the hyper viser gets updated Xbox One GPU is really 4.2 TF MONSTER

eldery semens correct everything said
 
Oh here we go....

I guess actual facts aren't welcome here.




plasmawave, read the second paragraph of what you quoted from my previous post there. You guys seem to have a hard time acknowledging the way people reacted even here on these very forums. Hell, some of those reactions I described may well have been from some of you guys!



cakely, see Pathfinder's post. Also, we have a dev on this very forum who said without compromise it was very real and he was making a game to exploit that fact. We also have devs (lead on Watch_Dogs, guy from Avalanche, and others) who said it was absolutely interesting and might be the real deal.



I get the feeling people here care more about clinging to some ignorant narrative than actually engaging in factual discussions. The narrative which has been adopted as consensus here on GAF is very far from informed. That's the reason many of the folks here must resort to vast corporate conspiracy theories when things don't play out as they assumed they would.
Nope. You're just months and months behind on these discussions.

These topics have been discussed to death and the best research and consensus achieved don't exactly resonate with what you are claiming.
 
You should hold off on making that joke until we actually see what is possible with cloud compute. Presumably this E3 will have lots for us to look at in that regard, at least in terms of first party stuff (Crackdown, etc).

If it really were that promising and powerful, I think they wouldn't waste their time scraping out 5% here and 10% there on their hardware to begin with.
But yes, we shall see.
 
No company is going to spend money on servers to improve physics in a game. Companies are too cheap to give us servers for multiplayer games.

And what happens when they drop the servers?

Not worth it
 
They could turn down or turn off the advanced physics effects. Or they could just make it always-online. Destiny has proven that people will pay for singleplayer games that require an internet connection.

This. I don't see why people assume being online-only would severely limit a game's sales or acceptance. CoD is predominantly an MP game, requiring a connection at all times to enjoy its most prominent feature. Destiny, WoW...clearly we have at least one publisher (#1 in the world, last I recall) who is all about online-only games.
 
Uhh... who here is arguing against that point? Care to quote it?

You're right, my mistake, I thought I saw a few posts like that, but they were just saying that it's hard to see the difference depending on circumstances. Even though I don't agree with them, no one actually said 900p is better.
 
Awesome, can't wait to play it!

I'm not even gonna read through all the pages here; the '1080p vs 900p makes no difference' narrative is annoying.
 
It just reminds me of the Edge thread.

We went through all of it already: the SHAPE chip, the eSRAM bandwidth, the display planes, the secret sauce... I thought we were past this, but apparently it's a never-ending cycle.

People are trying reeeaaalllly hard to justify their purchase by focusing on the hardware (aka fighting a losing battle) rather than focusing on the games and features that their system of choice offers exclusively.

These systems offer incredibly similar architectures and are VERY easy to understand on a technical level, so it's hilarious when people who don't know shit about the hardware throw around marketing buzzwords and other PR bullshit as facts and then try to extrapolate from that. "DX12!" "Teh balance!" In reality, differences in most games line up with what you would expect, and this game is no exception.

If it stays like this, everyone is fine. If they end up bringing Xbox One to 1080p, all hell will break loose. lol oh man.

Not going to happen.
 
Crackdown is coming, and it has been confirmed to be using that technology.

Maybe so, but it will provide nothing that a few extra hundred gigaflops of local hUMA-driven compute wouldn't, and at a huge cost.

Ergo, it's bollocks and it will never get used outside of MS trying to prove a point.
 
I'm starting to get major launch-window vibes in this thread. The resolution difference really isn't a big deal; that's basically what's expected at this point. What I'm more confused about is the apparent downgrade and such. They claimed they had the E3 2014 demo running at 900p on XB1 hardware. The game now doesn't look as good, yet it's still only running at 900p on the same hardware. Am I missing something?
 
If it really were that promising and powerful, I think they wouldn't waste their time scraping out 5% here and 10% there on their hardware to begin with.
But yes, we shall see.

Who are you referring to? You think MS is wasting their time updating their SDK (which affects ALL devs, not just those who want to design a game to be built around a connection)?
 
Maybe so, but it will provide nothing that a few extra hundred gigaflops of local hUMA-driven compute wouldn't, and at a huge cost.

Ergo, it's bollocks and it will never get used outside of MS trying to prove a point.

Are you willing to bet your avatar on that? :)
 
Yep, really.

Think it through ... is anyone actually going to design a game with AI or physics that simply stop working when your internet connection goes down?

Considering they've already released AAA games that completely stop working when your internet connection is down, yes. Destiny, for instance.
 
They are using SMAA T1x, but it's called TAA in the options.

Was a post-AA component confirmed for The Order after all? I know they were experimenting with it, but I don't know what was finalized; it's quite possible they are using some temporal technique too. Still, I think most of the blurriness comes from other sources, like CA, grain filters, etc.

I don't think there is any solid confirmation, no. Looking at the screenshots, I'm finding it hard to see any strong aliasing at all on things other than geometry, suggesting that they might be using something.
 