The Witcher 3: 1080p/30fps on PS4, 900p/30fps on Xbox One

I think the xb1 is great and all, it's my only current gen console, but seriously, this is some misterxmedia level of spin you got going.

Wut? o.0

Do you have something more specific to criticize in my comment? Simple math isn't "spin". According to Nick Baker it isn't unheard of to have an X1 game/app topping 200 GB/s in total bandwidth. Or are you unaware of the SHAPE audio chip or...?
 
Hmm, some parts were better on PS4 than XB1 in GTA; in other situations the opposite was true. PS4's framerate also improved in Unity with patches to more closely match XB1's framerate. Maybe the added clock speed and access to the seventh core helped that, but given how broken the game was (and still is), it's not really a great example to highlight an advantage.

Better audio chip in what way exactly?
That's a debatable assumption by a DF writer, to be honest, not necessarily a fact.
 
Wut? o.0

Do you have something more specific to criticize in my comment? Simple math isn't "spin". According to Nick Baker it isn't unheard of to have an X1 game/app topping 200 GB/s in total bandwidth. Or are you unaware of the SHAPE audio chip or...?

Is this 2013 all over again?
 
Wut? o.0

Do you have something more specific to criticize in my comment? Simple math isn't "spin". According to Nick Baker it isn't unheard of to have an X1 game/app topping 200 GB/s in total bandwidth. Or are you unaware of the SHAPE audio chip or...?

Are you going to address all the posters who did offer specific criticism of your post?
 
Wut? o.0

Do you have something more specific to criticize in my comment? Simple math isn't "spin". According to Nick Baker it isn't unheard of to have an X1 game/app topping 200 GB/s in total bandwidth. Or are you unaware of the SHAPE audio chip or...?
This 200 GB/s thing is pure PR spin. It's a weird bit of addition from MS, but it's not exactly the true full bandwidth available.
 
A bit disappointing, but it's not confirmed it'll be the final resolution. Either way, won't stop me from making a purchase.
 
Wut? o.0

Do you have something more specific to criticize in my comment? Simple math isn't "spin". According to Nick Baker it isn't unheard of to have an X1 game/app topping 200 GB/s in total bandwidth. Or are you unaware of the SHAPE audio chip or...?

So Xbox Games can have 5GB RAM @ 200gb/s? HAHA... no. Nick Baker is full of shit if he said that. The eSRAM rarely hits 109 gb/s and that's just for the 32MBs. We've gone over this here at GAF back in 2013 -- these MS guys were wanting to double the bandwidth back when it was 102 gb/s to make it 204 gb/s. Well, the improvements before the Xbox launched pushed the theoretical bandwidth to 109 gb/s and they were still sticking to that 204.
 
It's a myth. Granted normal eyesight, there is always a very noticeable difference between scaled and native images at any sitting distance that still makes sense. Sure, you could stand across the room and maybe not see it anymore, but no one plays like that.
Even tested this out with FFX Remaster. Difference with my 39" TV from the distance I sit from it was pretty small.
 
Unfortunately, it really doesn't matter when the "stable" framerate is in the mid-20s. Best to not even compare games that had marketing deals with a specific manufacturer, either.

Depends on the context of the comparison. If the intention is to get an idea of relative performance for the game, I can agree that too poor a framerate makes either version unplayable, though what's "unplayable" to you might not be the same threshold for others. Your second sentence is just silly, though. There hasn't been one single case of forced parity in terms of game performance that I'm aware of stemming from either Sony or MS. A dev at B3D straight up said such deals simply don't exist in the real world. I trust him. It doesn't make sense anyhow: if the version being pushed by marketing is the X1 version, then it doesn't matter what the PS4 version looks like, as it won't get the exposure until just prior to launch anyhow.

If people here hadn't bothered looking at GTA, which had a marketing deal with Sony, we wouldn't know about grass-gate. I personally care more about that kind of stuff in terms of graphics than pixel counts, so it would annoy me if that had gone undiscovered. Nor would we have known about the real-world difference X1's CPU adjustments could make without studying the performance differences in Unity. Nor would we know about the horizontal upscale approach in Far Cry 4 and how nice it can look.



I wish mods here held claims about 'forced parity' to the same standard as claims about insider info. After all, that is exactly what it is... a claim about the legal contracts between these companies. We'd have a lot less nonsense in tech discussions around here if folks got banned for making claims about confidential legal agreements they have no access to, nor any evidence supporting their existence.
 
McCaffrey?

This got me good.
IGN.gif
 
The question is, will there be a performance hit in other areas on PS4 due to pushing 1080p? There have been other games that have performed better on XONE in that way.

I'd wait for the reviews to see which is the better performing version overall before buying.

We don't pre-order these days anyway, right?
Considering these performance differences have never been all that huge I would still opt for the better looking console version. Preordering soon for PS4.
 
Uh... It was shown in the framerate video they showed. It's a fact.
It's not that simple. The video shows fps, but not the impact of this scenario on the CPU. They presume the CPU is the cause of the fps difference, but that's not necessarily true.
 
As someone who is getting the game on PS4, I'm extremely happy to hear they have gotten that version to 1080p. Last I remember reading, both consoles were going to be 900p.
 
Makes sense. The PS4's GPU was built for gaming at 1080p.

With that in mind, I really don't mind a reduction in graphics to reach a native presentation with decent AA and a stable framerate for many games, and I hope devs continue to go that route. Give me a mix of high and medium settings if you must, or even just medium. We know the texture resolution is going to be a higher setting than anything else graphics-related anyway, just because of all that RAM.
Whenever the PS4 doesn't have the highest resolution textures available, I sigh; with all that GDDR5 RAM, textures should never be muddy on the PS4. Looking at that Saints Row port, or even some of the assets of Remake against PC, I just shake my head. If the graphics engine is really pushing the envelope, I could understand some effects having to be dialed back to maintain framerate, but textures should never be one of those. Still, the PS4 is a pretty good machine as far as bandwidth is concerned, built for high-res particle effects, larger worlds in RPGs and native 1080p imagery.
 
The statement is crystal clear: "ambas versiones no alcanzaban los 1080p", i.e. "both versions did not reach 1080p". Where is the misreading? Who is lying here, 3DJuegos or GameStar?

Well, according to an interview with the Witcher devs today from PlayStation Access, they confirm the PS4 version is 1080p and 30fps... so I think we know who's telling the truth :)

http://youtu.be/D4LHxsvknog

End of the video.
 
But we already knew the PC version will look better if you have the machine for it. We always know that. It's nothing new, it's not unexpected. But still people feel the need to come into every thread like this and tell us again and again, as if to make sure we don't forget how inferior our gaming hardware of choice really is.

If you were really choosing between one and the other, and this information made you decide that the PC version is a better choice for you, that's fine. I have no problem with that. It's the fast drive-by posts that only exist to demonstrate the PC's superiority that annoy me.


jeremiah_johnson_nodding-Robert-Redford.gif


I think they just need to reassure themselves that their €1,500 PC is going to play the game better than the PS4 and Xbone.
 
The PS4 version will be the better looking version on console? I'm shocked...

Still buying the PC version, but I'm impressed with 1080p@30 at roughly High settings on the PS4. Good job on the console versions, CD Projekt.

unnamed-111.gif
 
I bet it did, even though it's 100% correct (what I said, not McCaffrey).

Of course it's an uncomfortable truth for some when it's pointed out that the difference between resolutions is dependent on certain factors.

The truth is you can perceive whatever you want to perceive to be the truth. It's all perception.

Disagreeing with you is not failing to see the truth - don't kid yourself.
 
60fps version for me. Don't even know/care if my card can handle ultra. Just glad i have a choice.
 
This got me good.
IGN.gif

It's so fucking unbelievable that it hurts just watching it. Imagine political journalists sitting together going "well, I heard the president of the US is named Obama or something like that". How the fuck in the world... why do they earn money again? Hilarious. And based on these people's opinions, other people make their decisions. Wow.
 
Whenever the PS4 doesn't have the highest resolution textures available, I sigh; with all that GDDR5 RAM, textures should never be muddy on the PS4. Looking at that Saints Row port, or even some of the assets of Remake against PC, I just shake my head. If the graphics engine is really pushing the envelope, I could understand some effects having to be dialed back to maintain framerate, but textures should never be one of those. Still, the PS4 is a pretty good machine as far as bandwidth is concerned, built for high-res particle effects, larger worlds in RPGs and native 1080p imagery.

Well, there are still limits. Eventually 6GB is going to be common in PC games, and consoles will fall behind in sheer amount again. Shadow of Mordor requires 6GB of VRAM just for the highest possible textures, so getting that on console is rather dubious. However, I do expect the texture resolution of games to be the thing that holds up the longest this gen. I really do like how Sony prioritized high-speed RAM. Hopefully they can give some of that back to devs, as we know the saturation point hasn't been hit yet (I think it was around 5.79GB according to the PS4's peak bandwidth).

FFX is 1080p.

No AA though... :( The 720p setting goes for a light FXAA, though.

I hope the PS4 version will be 1440p with SMAA or MSAA. There should be the headroom.
 
...and more peak memory bandwidth when used properly, and a much better chip to handle audio tasks, and the CPU which, as you noted, has more cores available and is faster. Seeing as the CPU is what tends to govern open-world game framerates, and since audio is traditionally done on the CPU, I'd say those add up to notable advantages. It was certainly relevant in GTA and Unity, where it got those games to effective parity (or even favoring X1's build) at the same resolution as the PS4 build.

Question: What happens when they begin optimizing graphics in the coming months (almost always the last thing to be optimized in open world games due to bugs being top priority) and they tap into the GPU boosts from the past several SDK updates? What happens when they can enjoy a more stable framerate once utilizing the November update that opens up the 7th core?

My impression is many of you guys haven't considered the actual process of developing a game like this or how the timelines tend to play out. It's very, very likely that the eventual DF article on the release build will show 1080p on both. It's very, very *possible* that we end up at 1080p on both with X1 having an ever-so-slightly more stable framerate, a la Unity.
SHAPE chip? Trying to spin 32MB of RAM at 133GB/s as better than 5GB at 176GB/s? Is this the VGLeaks days again?
 
But we already knew the PC version will look better if you have the machine for it. We always know that. It's nothing new, it's not unexpected. But still people feel the need to come into every thread like this and tell us again and again, as if to make sure we don't forget how inferior our gaming hardware of choice really is.

If you were really choosing between one and the other, and this information made you decide that the PC version is a better choice for you, that's fine. I have no problem with that. It's the fast drive-by posts that only exist to demonstrate the PC's superiority that annoy me.

Yeah, it's a good thing people who own consoles never do drive by posts claiming superiority over a competitor. That'd be pretty annoying.
 
So Xbox Games can have 5GB RAM @ 200gb/s? HAHA... no.

Did I claim that or did you just make that claim up and pretend I said it?

*crickets* :/

Nick Baker is full of shit if he said that.

Right... random guy on the internet knows better. Well then, good to know. :p

Here is what he said, for reference:

"And then if you say what can you achieve out of an application - we've measured about 140-150GB/s for ESRAM. That's real code running. That's not some diagnostic or some simulation case or something like that. That is real code that is running at that bandwidth. You can add that to the external memory and say that that probably achieves in similar conditions 50-55GB/s and add those two together you're getting in the order of 200GB/s across the main memory and internally."

http://www.eurogamer.net/articles/digitalfoundry-the-complete-xbox-one-interview

The eSRAM rarely hits 109 gb/s and that's just for the 32MBs. We've gone over this here at GAF back in 2013

...and there's your problem. You let the ignorant chorus determine what you "know". Tell me more about how it 'rarely hits' the minimum guaranteed bandwidth. 109GB/s isn't the peak. It's the min for reads and writes separately, not combined.

-- these MS guys were wanting to double the bandwidth back when it was 102 gb/s to make it 204 gb/s. Well, the improvements before the Xbox launched pushed the theoretical bandwidth to 109 gb/s and they were still sticking to that 204.

No.

MS manufactured the chips under the baseline assumption that they couldn't do reads and writes simultaneously on each cycle. Turned out the yields were good enough for the final product to be able to *almost* do both on every cycle. One out of every eight cycles can't quite pull it off, though, so instead of doubling the peak you get 1 + 7/8 = 1.875 times the baseline peak of 109 GB/s. That's the 204 GB/s, and it is ONLY for eSRAM. The 200 GB/s figure I quoted is total bandwidth (~150 GB/s from eSRAM, ~50 GB/s from DDR3 RAM).

See the quote above for more clarification. People on GAF also paraded around the notion that it was somehow disingenuous to add the two bandwidths together, when in fact that is precisely what's appropriate given they are set up in parallel. Don't believe everything you read on GAF, particularly regarding tech issues, and especially ones involving X1 tech.
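
For anyone who wants to sanity-check the arithmetic, here's a rough back-of-the-envelope sketch in Python. It only re-derives the figures quoted above (the 109 GB/s baseline, the 7/8-cycle overlap, and the measured 140-150 + 50-55 GB/s ranges from the interview); it isn't a claim about what any actual game sustains.

```python
# Back-of-the-envelope check of the X1 bandwidth figures discussed above.
# Inputs are the numbers quoted in this thread, not my own measurements.
esram_baseline = 109.0    # GB/s, one-directional eSRAM peak (the "minimum" figure)
esram_measured = 145.0    # GB/s, midpoint of the 140-150 GB/s "real code" figure
ddr3_measured = 52.5      # GB/s, midpoint of the 50-55 GB/s figure

# Peak eSRAM: reads and writes overlap on 7 of every 8 cycles,
# so the combined peak is (1 + 7/8) times the one-directional baseline.
esram_peak = esram_baseline * (1 + 7 / 8)
print(f"eSRAM combined peak: {esram_peak:.0f} GB/s")                       # ~204 GB/s

# The eSRAM and DDR3 pools sit in parallel, so the measured figures simply add.
print(f"Measured total:      {esram_measured + ddr3_measured:.0f} GB/s")   # ~198 GB/s
```

Whether those peak and measured numbers are sustainable in real workloads is a separate argument, but the addition itself is exactly what you'd expect for two memory pools running in parallel.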
 
Even tested this out with FFX Remaster. Difference with my 39" TV from the distance I sit from it was pretty small.

Well, don't get me wrong, but a) are you sure your eyesight is 100%, and b) are you sure that a game with PS2 assets is the best way to test that...? Also, I don't know your TV settings. If you add additional sharpening (which, for many TVs, means Sharpness set to anything above 0 or 50), then the whole thing is skewed again. I'd agree in an instant that the difference between 720p and 900p is probably not too visible. But scaled vs. native is normally always very visible, because native comes with a huge detail and sharpness bonus.
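
Since this argument keeps coming down to eyesight and sitting distance, here's a rough sketch of the usual rule-of-thumb calculation. It assumes a 16:9 panel and the commonly cited ~1 arcminute limit for 20/20 vision; the 39" size matches the post above, but the distances are just illustrative values, and real perception of scaling artifacts involves more than raw pixel size.

```python
import math

# Rough rule-of-thumb check: how big does one pixel look, in arcminutes,
# from a given distance? ~1 arcminute is often cited as the limit of 20/20 vision.
def pixel_angle_arcmin(diagonal_in, vertical_pixels, distance_in):
    height_in = diagonal_in * 9 / math.hypot(16, 9)   # panel height for a 16:9 screen
    pixel_in = height_in / vertical_pixels            # physical size of one pixel
    angle_rad = 2 * math.atan(pixel_in / (2 * distance_in))
    return math.degrees(angle_rad) * 60               # convert to arcminutes

# Illustrative values only: a 39" 1080p set at a few hypothetical couch distances.
for dist_ft in (5, 8, 11):
    a = pixel_angle_arcmin(39, 1080, dist_ft * 12)
    status = "individual pixels resolvable" if a > 1 else "below the ~1 arcmin threshold"
    print(f"{dist_ft} ft: {a:.2f} arcmin per pixel ({status})")
```

On those rough numbers, a 39" 1080p panel drops below the threshold somewhere past about five feet, which is consistent with both posts: the difference exists, but how visible it is depends heavily on screen size and where you actually sit.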
 