The Witcher 3: 1080p/30fps on PS4, 900p/30fps on Xbox One

Dis gon be good

It's not 2013. People have come to grips with the differences between the two.

...and more peak memory bandwidth when used properly, and a much better chip to handle audio tasks, and the CPU, as you noted, which has more cores and is faster. Seeing as the CPU is what tends to govern open-world game framerates, and since audio is traditionally done on the CPU, I'd say those add up to notable advantages. They certainly proved relevant in GTA and Unity, which effectively reached parity (or even favored X1's build) despite running at the same resolution as the PS4 version.

Question: What happens when they begin optimizing graphics in the coming months (almost always the last thing to be optimized in open world games due to bugs being top priority) and they tap into the GPU boosts from the past several SDK updates? What happens when they can enjoy a more stable framerate once utilizing the November update that opens up the 7th core?

My impression is many of you guys haven't considered the actual process of developing a game like this or how the timelines tend to play out. It's very, very likely that the eventual DF article discussing the release build will be 1080p on both. It's very, very *possible* that we end up 1080p on both with X1 having an ever so slightly more stable framerate a la Unity.

Well, some people have.
 
What he's saying is that resolution doesn't matter... unless it does, but even when it does, it only matters a little bit... except for those rare occasions when it matters a lot, but even then it's not that important.

I don't see what's so hard to understand here??
I don't either, which is why it's strange you've gotten what I said so wrong.
 
As a PS4 and PC owner, I do believe I'm going to end up getting the console version. Is everyone OK with that?

[reaction gif: nyTjQ0K.gif]
 
UK sales tax is 20% so that's probably why things are more expensive here. lol
I priced them out in Austria, same sales tax. I think you just don't know where to shop for PC components :P
(Don't worry, few people do)

Let me take it one step further. As an avid PC gamer, what if the PC parts actually DID cost a very high amount of money? So what?

No seriously, so what?

If 120+fps and better textures, higher resolutions etc etc etc matter so much to me that I am willing to pay a very high premium (which isn't even the case in modern PC gaming), then let me pay what I want to pay to play how I want to play.

I just really don't get the salt. I don't look at people with more expensive setups than mine and insult them. I say, "Hey, that's awesome."
Also a good point.
 
None of that really matters anyway because it's peanuts compared to the infinite power of the cloud.

You should hold off on making that joke until we actually see what is possible with cloud compute. Presumably this E3 will have lots for us to look at in that regard, at least in terms of first party stuff (Crackdown, etc).
 
There was a thread at Beyond3D back in the day about it I think, but I don't have a link offhand. One of the posters there did some work on the audio architecture for X1. SHAPE was specifically designed to take audio loads off the CPU (apparently games like Forza commonly used 50% of the CPU power just for audio on 360... which is kinda nuts). SHAPE also has partitions that help with Kinect voice inputs, which are separate from the component handling game voices that I referenced earlier. More info on that here: http://www.vgleaks.com/durango-sound-of-tomorrow

For PS4, some details here: http://www.vgleaks.com/playstation-4-audio-processor-acp

Dunno how much it matters in these games, but it is an advantage nonetheless. It doesn't necessarily have much to do with the thread topic, though.

That lacks any specifics on the PS4's audio chip, though. The articles that do talk about it make it seem rather specific. I don't see anything to back up your claim that the Xbox One has a much better audio chip.

http://www.anandtech.com/show/7513/ps4-spec-update-audio-dsp-is-based-on-amds-trueaudio
 
Let me take it one step further. As an avid PC gamer, what if the PC parts actually DID cost a very high amount of money? So what?

No seriously, so what?

If 120+fps and better textures, higher resolutions etc etc etc matter so much to me that I am willing to pay a very high premium (which isn't even the case in modern PC gaming), then let me pay what I want to pay to play how I want to play.

I just really don't get the salt. I don't look at people with more expensive setups than mine and insult them. I say, "Hey that's awesome"

It's just a way for haters to mock PC and the people who game on PC, because they can't find any other way to attack the platform in a negative light. It has no basis in common sense.

It's the same for those certain PC players who find reasons to rag on the console crowd just because the hardware is not as good or the selection of games is not as extensive. It's all bullshit platform wars in the end, no matter the allegiance.

Game on what you want to game on.
 
You should hold off on making that joke until we actually see what is possible with cloud compute. Presumably this E3 will have lots for us to look at in that regard, at least in terms of first party stuff (Crackdown, etc).

I don't really think any game can rely on "cloud compute" unless it's designed to be always online. Which (I'm strongly presuming) Crackdown isn't.

"The power of the cloud making your Xbox One run better" was just marketing mate.
 
Sincere question, how easily can the naked eye discern 900p vs. 1080p? How far away do you have to be from the screen for it to matter? Just curious as someone who is cool with 720p for most releases.
I play a lot of Battlefield 4 on my PS4 which is 900p. All of the other titles I have on that console are 1080p. I have to look much harder at the screen and even squint at times to pick out the details when playing the 900p game. Everything is fuzzier. There's a difference. Although I sit close to the screen when I play.
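For a rough sense of scale, here is the raw pixel arithmetic behind the 900p vs. 1080p comparisons in this thread (a back-of-the-envelope sketch in Python, assuming "900p" means 1600x900):

```python
# Back-of-the-envelope pixel counts (assumes "900p" = 1600x900, "1080p" = 1920x1080).
pixels_900p = 1600 * 900      # 1,440,000 pixels rendered
pixels_1080p = 1920 * 1080    # 2,073,600 pixels rendered

print(f"900p:  {pixels_900p:,} pixels")
print(f"1080p: {pixels_1080p:,} pixels")
print(f"1080p renders {pixels_1080p / pixels_900p:.2f}x as many pixels")         # ~1.44x
print(f"900p renders {100 * pixels_900p / pixels_1080p:.0f}% of a 1080p frame")  # ~69%
```

Whether that ~44% gap is obvious in motion still depends on display size, viewing distance, and the upscaler and AA in use, which is what the posts below argue about.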
 
As a PS4 and PC owner, I do believe I'm going to end up getting the console version. Is everyone OK with that?

As someone who was going to either upgrade their PC or buy a PS4, and ended up getting a PS4, I can understand your position.
 
...so the main RAM just... doesn't exist anymore, or what? Where did it go? See my reply to the other guy. Actual games/apps have run at 150 GB/s just for the eSRAM. No idea where you got the 133 GB/s figure. And that 150 GB/s figure is from actual applications, not peak (your 176 GB/s for PS4 isn't an actual game/app number... it's theoretical peak).

The SHAPE chip is what handles audio tasks on X1, which would otherwise need to use up CPU cores. The SHAPE chip in X1 handles significantly more voices than PS4's audio hardware does.
You can't add main RAM and eSRAM. 133 is just the last peak bandwidth I heard quoted. The numbers you're quoting are theoretical peaks as well, and a while back devs were saying it was much easier to hit that peak on PS4.

The SHAPE chip and TrueAudio on PS4 are very similar.
 
Is this the final resolution for Xbox One, or will there be an update later on?

It's the current build... which almost certainly has had no actual graphics optimization done to it at all since around E3 of last year or earlier. Typically, devs on open-world games have to iron out a heap of nasty bugs before focusing on optimizing visuals. The game is out in late May, for reference. They explicitly noted the resolution wasn't necessarily final.
 
It's just a way for haters to mock PC and the people who game on PC, because they can't find any other way to attack the platform in a negative light. It has no basis in common sense.

It's the same for those certain PC players who find reasons to rag on the console crowd just because the hardware is not as good or the selection of games is not as extensive. It's all bullshit platform wars in the end, no matter the allegiance.

Game on what you want to game on.

Agreed agreed agreed
 
I seriously question that. You offered absolutely nothing of worth that I did not know - so the truth thing? Not sure where to find it.

Here is what you did: you replied to a post of mine with some passive-aggressive "truth search" as bait, and now you're copping out. Yes, spare me from having to waste my time.
Copping out? I've kept up with this, explained myself, and you conveniently dodge admitting the truth of what I'm saying. Which is entirely predictable, sadly.
 
I find it interesting that some people relish stating what should be obvious and expected (the PS4 version being visually superior to the XOne version by most measures) while simultaneously getting sensitive and salty at the stating of what should also be obvious and expected (the PC version being better than the PS4 version).

You can dish it out without catching feelings. You should be able to take it too.
 
That lacks any specifics on the PS4's audio chip, though. The articles that do talk about it make it seem rather specific. I don't see anything to back up your claim that the Xbox One has a much better audio chip.

http://www.anandtech.com/show/7513/ps4-spec-update-audio-dsp-is-based-on-amds-trueaudio

Well, he probably hadn't heard about it when he linked an old vgleaks article.
 
That lacks any specifics on the PS4's audio chip, though. The articles that do talk about it make it seem rather specific. I don't see anything to back up your claim that the Xbox One has a much better audio chip.

http://www.anandtech.com/show/7513/ps4-spec-update-audio-dsp-is-based-on-amds-trueaudio

It isn't. The audio on both systems is handled by a DSP on the GPU, IIRC; SHAPE was done specifically to deal with voice commands and Kinect in general.

Also, people tend to remember every chip in the XBone and forget the 256 MB of DDR3 and the ARM chip in the PS4 that handle background tasks.
 
I play a lot of Battlefield 4 on my PS4 which is 900p. All of the other titles I have on that console are 1080p. I have to look much harder at the screen and even squint at times to pick out the details when playing the 900p game. Everything is fuzzier. There's a difference. Although I sit close to the screen when I play.

I think in BF's case the fuzziness is explained by both the lower resolution and the bad FXAA implementation they used, which is very prone to blurring.
 
It's the current build... which almost certainly has had no actual graphics optimization done to it at all since around E3 of last year or earlier. Typically, devs on open-world games have to iron out a heap of nasty bugs before focusing on optimizing visuals. The game is out in late May, for reference. They explicitly noted the resolution wasn't necessarily final.
Regardless of the specific title here and the 1080 vs. 900 debate, I just want to say that this is not even remotely true; this is not how a dev cycle works, man.
 
Copping out? I've kept up with this, explained myself, and you conveniently dodge admitting the truth of what I'm saying. Which is entirely predictable, sadly.

What have I failed to admit as truth? Start from your first reply to my post. Work your way down. Make assumptions about my opinions, knowledge and such. Maybe I'll get another reality I'm not seeing.

Edit: Since you obviously got worked up over a gif I posted in reply to a different poster mocking your post, I'll break it down:

I don't need to admit what I already know (the concept, as you call it), of which you simply stated a part and clung to it to death; you labeled this as the truth. "I was in denial of the truth from the start" - without me ever stating any such disagreement (my first post, your first post). How you got to that conclusion in the first place is unknown to me (you never asked, nor do I recall stating such), but that's part of what makes bait trolling a practice you often undertake. I wonder if you'll admit you made a bait post.

And yes, you're right: I will not admit to only a part of what I believe is correct. You assumed I did not know the concept (part of it, that is) simply because I failed to embrace it in agreement. Objectively, native resolution > non-native resolution (1080p > 900p > 720p). Yes, viewing distance and TV size do affect the perceived difference. In normal viewing scenarios, however, it is rather easy to spot the difference (around 2 ft for desktop displays of 16-26 inches; 5-15 ft for TVs of 32-75 inches). Now, the "easy" part is subjective. Someone can claim it's rather hard to perceive the difference, not "huge", an untrained eye, however it's labeled. There is no objective truth in normal viewing distance scenarios, and definitely no objective truth in the words used to describe the differences. In extreme scenarios (viewing distances greater than 15 ft) the perceived difference does become dramatically smaller. So if you say that in your specific situation the difference is rather small, I can choose to take your word at face value and nothing more. I would have to be there to gauge it myself, and I could either agree or disagree on the difference. That's subjective, and I am free to make that claim.

Your post did not deserve my response. Certainly not when you start right off the bat making assumptions about what I perceive to "be the truth" and what I deny.
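To put some rough numbers on the viewing-distance point above, here is a small sketch that estimates how many rendered pixels land in each degree of your field of view; the 50" screen size, the distances, and the ~60 pixels-per-degree acuity ballpark are example assumptions of mine, not figures from anyone in this thread:

```python
import math

def pixels_per_degree(h_pixels, diagonal_in, distance_ft):
    """Horizontal rendered pixels per degree of visual angle on a 16:9 screen."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)               # screen width in inches
    distance_in = distance_ft * 12
    h_fov_deg = 2 * math.degrees(math.atan(width_in / 2 / distance_in))
    return h_pixels / h_fov_deg

# ~60 pixels/degree is a commonly cited ballpark for 20/20 visual acuity.
for distance_ft in (5, 8, 12):
    ppd_900 = pixels_per_degree(1600, 50, distance_ft)
    ppd_1080 = pixels_per_degree(1920, 50, distance_ft)
    print(f'50" TV at {distance_ft} ft: 900p ~{ppd_900:.0f} px/deg, 1080p ~{ppd_1080:.0f} px/deg')
```

On these example numbers, both resolutions fall below the ~60 px/deg ballpark up close (so the gap is resolvable) and both rise above it further back (so it is much harder to pick out), which is roughly why the same difference reads as "blatant" to one poster and "squint to see it" to another.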
 
Or watch some people make up absurd prices as a coping mechanism. It's certainly entertaining.

What? See my post earlier. It would easily cost €1,500 to build a PC just to play at the recommended settings. Maybe the PC parts I'm picking aren't correct, I don't know.

The point is (without getting all emotive about it) that there is no way in hell €400 consoles will be able to beat a PC that costs twice as much (happy now?).
 
It depends on the AA technique, but in Witcher 3's case the difference will probably be quite visible even on a TV.

Putting those shots on a TV and sitting at a normal distance won't show a noticeable difference, but on the other hand not every game uses this kind of AA or post-processing:
http://i5.minus.com/is1ucxNIWOaoj.png
http://i6.minus.com/ibhMNoZj0BdN70.png
One is 1536x864 with TAA and the other is 1080p with TAA.

The Order: 1886 has the same problem: they are using a 4x MSAA'ed image without scaling, but their game is still as blurry as, or sometimes even more blurry than, 900p with proper scaling and sharpening.

What is TAA? I thought Ryse used SMAA T2X?

AFAIK the Order uses MSAA 4x and some form of postproc AA on top of that.
 
CD Projekt RED called playing on consoles a compromise anyway--as in getting a great experience and not really having to pay for a high level PC.

“Yeah, so it’s always a compromise and I can go back to the time of [The Witcher 2: Assassins of Kings – Enhanced Edition] on 360 and it looked like the PC version running on medium,” said Iwiński.

“So, the PC, that's the nature of the format – it's scalable up, but let's ask ourselves: how many people have the rig that is like six times the price of a console? Not that many. And I think you will have lots of fun with our game. Having said that, all the console gamers will just enjoy it on the consoles and I think the game still looks amazing on the consoles.”

Iwiński continued: “On the PC, one thing to add, it will be scalable all the time, so if you cannot afford the rig today maybe in a year or two years you will be playing Witcher: Wild Hunt on the uber settings, yes? Which is probably the most expensive rig ever made, as of today.

via Metro from GamerCenterOnline
 
You can't add main RAM and eSRAM.

Stop listening to the random people here. You absolutely can add them, as they are set up in parallel.

133 is just the last peak bandwidth I heard quoted.

Quoted by whom? Some know-nothing posting here? Did you see my other post with Nick Baker's quote in it? Actual real-world games/apps have been measured using 150 GB/s with just the eSRAM alone. Did you read my post at all?

The numbers you're quoting are theoretical peaks as well and a while back devs were saying it was much easier to hit that peak on PS4.

Are you even reading what I typed? 176 GB/s is the theoretical peak on PS4. 204 GB/s + 68 GB/s (272 GB/s) is the theoretical combined peak on X1. Yes, it is much easier to hit the peak on PS4, but no game ever will. That isn't my point; if it were, I would have quoted 272 GB/s as the bandwidth for X1, which I didn't. My point is that real-world, actual games/apps have been measured hitting 150 GB/s using ONLY the eSRAM, and using that as an efficiency estimate for the main RAM nets us around 200 GB/s of total bandwidth. Straight from the guy who designed it. He knows more than you do.

Even if we severely low-ball the DDR3 RAM's real-world usage efficiency, it STILL easily eclipses the real-world usage of the PS4's GDDR5. You could have 38% usage efficiency for the DDR3 in X1 and 100% perfect efficiency on PS4 and they'd still be equal.
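For what it's worth, here is the arithmetic behind that claim laid out step by step (a sketch using only the figures quoted in this thread; whether summing the two memory pools like this is legitimate is exactly what's being disputed here):

```python
# Figures as quoted in this thread (GB/s), not independently verified.
esram_peak, ddr3_peak = 204.0, 68.0     # X1 theoretical peaks (eSRAM, DDR3)
gddr5_peak = 176.0                       # PS4 theoretical peak (GDDR5)
esram_measured = 150.0                   # the "real world" eSRAM figure cited above

# Apply the measured eSRAM efficiency to the DDR3 pool and sum both pools.
esram_efficiency = esram_measured / esram_peak                      # ~74%
x1_estimate = esram_measured + ddr3_peak * esram_efficiency
print(f"X1 estimate: {x1_estimate:.0f} GB/s")                       # ~200 GB/s

# The "38%" low-ball: DDR3 efficiency needed (on top of 150 GB/s eSRAM)
# for the sum to match PS4's 176 GB/s theoretical peak.
ddr3_needed = (gddr5_peak - esram_measured) / ddr3_peak
print(f"DDR3 efficiency needed to match 176 GB/s: {ddr3_needed:.0%}")  # ~38%
```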

The SHAPE chip and TrueAudio on PS4 are very similar.

...no, they aren't, actually. One only does compression/decompression and some DMA work (PS4); the other does that for more than twice as many voices, along with a bunch of other stuff. Stop.

Why is it so difficult for you guys to admit that X1 has hardware advantages in some areas over PS4?
 
What is TAA? I thought Ryse used SMAA T2X?

AFAIK the Order uses MSAA 4x and some form of postproc AA on top of that.

They are using SMAA T1x, but it's called TAA in the options.

Was a post-AA component confirmed for The Order after all? I know they were experimenting with it, but I don't know what was finalized; it's quite possible they are using some temporal technique too. Still, I think most of their blurriness comes from other sources, like CA, grain filters, etc.

--
Temporal Anti Aliasing. It blends two frames together to create the illusion of smoothness. Very prone to ghosting though, as seen in Reach, probably the worst implementation ever
We are so past that. Actually, SMAA T1x blends 4 frames and is designed to eliminate ghosting even compared to SMAA T2x; it's also the most effective AA technique to date, at a cost of some sharpness.
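For anyone wondering what "blending frames" actually means here, below is a minimal sketch of the accumulation step behind temporal AA. It is illustrative only: real implementations (SMAA T1x/T2x included) also jitter the camera, reproject the history buffer with motion vectors, and clamp the history sample to suppress ghosting, and the weights and buffer sizes here are not what Ryse or The Order actually use.

```python
import numpy as np

def temporal_blend(history, current, alpha=0.25):
    """Exponentially blend the current frame into an accumulated history buffer.

    Illustrative only: production temporal AA also reprojects `history` with
    motion vectors and clamps it against the current frame's neighborhood,
    which is what keeps moving objects from ghosting.
    """
    return alpha * current + (1.0 - alpha) * history

# Toy usage: accumulate a few jittered, noisy "frames" of the same scene.
rng = np.random.default_rng(0)
scene = rng.random((90, 160, 3)).astype(np.float32)    # stand-in for a rendered frame
history = scene.copy()
for _ in range(4):
    frame = scene + rng.normal(0.0, 0.02, scene.shape).astype(np.float32)
    history = temporal_blend(history, frame)            # aliasing/noise averages out over frames
```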
 
It's the current build... which almost certainly has had no actual graphics optimization done to it at all since around E3 of last year or earlier. Typically, devs on open-world games have to iron out a heap of nasty bugs before focusing on optimizing visuals. The game is out in late May, for reference. They explicitly noted the resolution wasn't necessarily final.

Man, you're good. 30 or so posts into your GAF career, all guns blazing: secret sauce, resolutions not noticeable, infinite power of the cloud, talking about bans and rules.

Found that secret chip yet?
 
You should hold off on making that joke until we actually see what is possible with cloud compute. Presumably this E3 will have lots for us to look at in that regard, at least in terms of first party stuff (Crackdown, etc).

You do understand that "The Power of the Cloud", as in offloading Xbox One game calculations to Azure, is simply never going to happen, right?

It was all buzzword bullshit, and it was debunked in 2013.
 
What? See my post below. It would easily cost €1,500 to build a PC just to play at the recommended settings. Maybe the PC parts I'm picking aren't correct, I don't know.
Intel CPU: Core i7-3770 3.4 GHz
AMD CPU: FX-8350 4 GHz
Nvidia GPU: GeForce GTX 770
AMD GPU: Radeon R9 290
RAM: 8 GB
OS: 64-bit Windows 7 or 64-bit Windows 8 (8.1)
DirectX: 11
HDD space: 40 GB
Those recommended specs are worth $1,500?
 
You should hold off on making that joke until we actually see what is possible with cloud compute. Presumably this E3 will have lots for us to look at in that regard, at least in terms of first party stuff (Crackdown, etc).

Dude, stop.

They will be carting you off in a white van if you carry on rambling aimless gibberish.
 
--

We are so past that. Actually, SMAA T1x blends 4 frames and is designed to eliminate ghosting even compared to SMAA T2x.

I would hope the technology has gotten better! It's been 5 years, after all... it's hard to believe that back then even FXAA wasn't common yet. Reach would have looked tons better even with that pitiable solution compared to the abomination they came up with.
 
Is this the final resolution for Xbox One, or will there be an update later on?

Well, the game doesn't come out for about another 3 1/2 months. CDPR has said the game is basically done; it's just bug fixing and polish now. And who knows if the latest build the press got to see was even using the December XDK (the one that got Dying Light up to 1080p), because these press builds are usually a few months old by the time they're shown.

It remains to be seen what the final resolution on Xbox One will be.
 
You can't add main RAM and eSRAM. 133 is just the last peak bandwidth I heard quoted. The numbers you're quoting are theoretical peaks as well, and a while back devs were saying it was much easier to hit that peak on PS4.

The SHAPE chip and TrueAudio on PS4 are very similar.

This. The recently leaked X1 SDK said that devs could depend on 102 GB/s (real world) for the eSRAM, with spikes up to around 133 GB/s on occasion. PS4's peak number is 176 GB/s, and it is much easier to hit near-peak numbers on PS4 thanks to a single unified pool of RAM all with the same bandwidth. There is also less contention for bandwidth from copy overhead between CPU and GPU, since PS4 essentially has a near-complete HSA/hUMA memory subsystem that can mark data with the volatile flag so it can be worked on by both CPU and GPU. This saves CPU and GPU cycles, making the system as a whole more efficient with its use of memory bandwidth.
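As a toy illustration of that copy-overhead point (nothing more; the 15% copy share below is a made-up example figure, not anything from the leaked SDK or from Sony/Microsoft):

```python
def usable_bandwidth(peak_gbs, copy_fraction):
    """Bandwidth left for actual rendering/game work when a share of the
    traffic is spent staging copies between CPU- and GPU-visible memory."""
    return peak_gbs * (1.0 - copy_fraction)

# Coherent unified pool (volatile-flagged data, no staging copies needed)...
print(usable_bandwidth(176.0, 0.00))   # 176.0 GB/s usable
# ...versus hypothetically losing 15% of traffic to CPU<->GPU copies.
print(usable_bandwidth(176.0, 0.15))   # 149.6 GB/s usable
```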
 
PC, PS4 and Xbox One owner here.

I'll get it on Xbox One... because I pre-ordered the Wild Hunt edition on Amazon UK for only £120.

I might double dip if I can find a cheap PC version.
 
Where would anyone get the idea that there are variables in play for 900p versus 1080p? Regardless of size and distance it's pretty obvious. Take a screenshot at 900p and one at 1080p, put them on two 23" monitors side by side, blow the 900p one up to 1080p, and even at around 10 feet the differences would still be obvious.

Replace that 23" monitor with a 50" living-room TV and it goes from obvious to blatant.

Wondering if I will be able to hit 30FPS on ultra at 5760x1080...
 