The Witcher 3: 1080p/30fps on PS4, 900p/30fps on Xbox One

Whoa...someone who actually understands how resolution works! :)

You are definitely correct. As resolutions get higher and higher, the difference in perceived sharpness dramatically decreases. It is absolutely an area of strongly diminishing returns.
There are diminishing returns going from 8K to 16K on a screen, certainly.

On the other hand, going from 900p to 1080p (and from there onward to 4K) is still a massive difference.
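The raw numbers make that point concrete. A quick sketch (the resolution names and their standard 16:9 dimensions are common knowledge, not from this thread):

```python
# Pixel counts for the common 16:9 render resolutions discussed here.
resolutions = {
    "720p": (1280, 720),
    "900p": (1600, 900),
    "1080p": (1920, 1080),
    "4K": (3840, 2160),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}

# 1080p renders 44% more pixels than 900p...
print(pixels["1080p"] / pixels["900p"])   # 1.44
# ...and 4K renders four times as many pixels as 1080p.
print(pixels["4K"] / pixels["1080p"])     # 4.0
```

So even though both steps "look small" written as names, each one is a large jump in the number of pixels actually rendered.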
 
What the hell is that supposed to mean? Are you upset that people are getting the PS4 version?

Or he just likes posting unrelated gif responses

cVxKpKe.gif
 
What about the TrueAudio DSP? We never got any info on it "improving" anything; that was just how AMD branded it. It didn't improve the audio capabilities beyond what Cerny had previously described (which is what I was referring to). And "voices" doesn't refer to spoken dialogue in games... you know that, right?

The audio chip in the PS4 was a mystery before the TrueAudio DSP reveal, and things have changed since then. Many people thought that SHAPE in the Xbone was superior. If you have clues, show us.

Anyway, with "about 512 vs 256 maybe" you're probably referring to an old vg247 link.
 
Or watch some people make up absurd prices as a coping mechanism. It's certainly entertaining.

Well, to be fair, the PC they were running it on was an i7-4790 with a GTX 980 and 8 GB of RAM. Did a quick search online and just those three components are £923 (€1,234). =P
 
Are you trying to suggest that anyone who believes, or "sees" and opines, that the difference between 1080p and 720p is huge is incorrect and/or factually wrong? I'm getting that vibe.
Never said anything like that. I'm saying the difference isn't a constant and may not be 'huge' in certain situations. It's really a pretty simple concept.
 
...and more peak memory bandwidth when used properly, and a much better chip to handle audio tasks, and the CPU as you noted which has more cores and is faster. Seeing as the CPU is what tends to govern open world game framerates, and since audio is traditionally done on the CPU, I'd say those add up to notable advantages. It was certainly relevant in GTA and Unity to get those games to effectively be at parity (or even favor X1's build) even at the same resolution as the PS4 build.

Question: What happens when they begin optimizing graphics in the coming months (almost always the last thing to be optimized in open world games due to bugs being top priority) and they tap into the GPU boosts from the past several SDK updates? What happens when they can enjoy a more stable framerate once utilizing the November update that opens up the 7th core?

My impression is many of you guys haven't considered the actual process of developing a game like this or how the timelines tend to play out. It's very, very likely that the eventual DF article discussing the release build will be 1080p on both. It's very, very *possible* that we end up 1080p on both with X1 having an ever so slightly more stable framerate a la Unity.

You forget hyper visor under 3 OS layer dGPU hidden reveal soon sony ponies laugh now MS last laugh DX12 boost performance mark cerny lies PS4 2010 GPU MS have cloud 2018 GPU 3.2TF sony ship sinking sony ponies tell lies about XB1 N4G GAF mods paid ponies spread FUD scared of MS all reveal soon watch table turning XB1 10X powerful PS4
 
I have my choice of picking this up on PS4/Xbone/PC, but I think PC will be the way to go for me. I want to flex my new rig.
 
Never said anything like that. I'm saying the difference isn't a constant and may not be 'huge' in certain situations. It's really a pretty simple concept.

The question now is rather which "truth" I need to accept, as you referenced earlier. Somehow you seemed to suggest there was a certain truth I was in denial about, as were others (not named).
 
If the controls are Vita friendly I'll be so happy :)

Left Stick - Movement, Steer Horse, Steer Boat, [Press] Call Horse
Right Stick - Camera Movement, [Press] Change Quest Objective, [Press] Lock On Target

R1 - Quick Access item
L1 - Quick Access Menu
L2 - Witcher Senses, Parry/Riposte
R2 - Cast Sign

Triangle - Strong Attack
Square - Fast Attack, Dive, Stop/Reverse Boat
Circle - Surface / Climb / Dodge / Jump, [Hold] Roll / Dismount
X - Interact, Swim Fast, Run/Sprint, Canter/Gallop, Accelerate Boat


D-Pad - [Up] Potion 1, [Down] Potion 2, Draw/Sheathe Steel Sword, Draw/Sheathe Silver Sword

Touchpad - Game Menu
Options - Pause Menu​


The controls. If they switch the L2/R2 mapping to L and R, it should work flawlessly, as the Quick Access wheels slow down time.
 
Well, to be fair, the PC they were running it on was an i7-4790 with a GTX 980 and 8 GB of RAM. Did a quick search online and just those three components are £923 (€1,234). =P
a) Even those components are less than 900€
b) You don't remotely need a 980 or a 4790 to outperform any console in this or any other game
 
You forget hyper visor under 3 OS layer dGPU hidden reveal soon sony ponies laugh now MS last laugh DX12 boost performance mark cerny lies PS4 2010 GPU MS have cloud 2018 GPU 3.2TF sony ship sinking sony ponies tell lies about XB1 N4G GAF mods paid ponies spread FUD scared of MS all reveal soon watch table turning XB1 10X powerful PS4
Oh lawd.
Did you copy/paste that from MisterX?
 
I'm thinking about getting the PS4 version if it performs and looks that good.

Current rig: 3770k and 680 GTX 2GB
 
Never said anything like that. I'm saying the difference isn't a constant and may not be 'huge' in certain situations. It's really a pretty simple concept.

Clear as mud. Which resolution do you prefer to play FFX at in your "certain situation"? I'm going to take a wild guess that you prefer 1080p.
 
I'm outperforming the consoles with a 3770k and 660ti.

I'm told that I didn't even need to go that far with the CPU, but the pseudo-octocore thing felt more future-proofy.

And you know, once you have the case, a power supply, ram, a mobo and a hard drive, you don't really have to spend much money upgrading -- unless you're really looking to splurge. I've spent maybe 400-500 in the past 6 years upgrading because I could sell older parts (without the loss of backwards compatibility).
 
The question now is rather which "truth" I need to accept, as you referenced earlier.
I wasn't offering options. Anyway, you either get it or you don't. I'm sure you understand, but you're clearly going to keep trying to avoid admitting it. I'll stop wasting my time now.
 
Well, to be fair, the PC they were running it on was an i7-4790 with a GTX 980 and 8 GB of RAM. Did a quick search online and just those three components are £923 (€1,234). =P

You don't actually need those specific components to play the game at the "high" preset, considering that the "ultra" preset apparently runs on that same system. Since the "high" preset outperforms the consoles, and we have no idea what system is required just to match them, all we can say is that it would assuredly cost significantly less than £923. Hence the previous point stands.
 
I've never heard that the audio chip in the X1 is better than the PS4's. Do you have a link?

There was a thread at Beyond3d back in the day about it, I think, but I don't have a link offhand. One of the posters there did some work on the audio architecture for the X1. SHAPE was specifically designed to take audio loads off the CPU (apparently games like Forza commonly used 50% of the CPU power just for audio on the 360... which is kind of nuts). SHAPE also has partitions that help with Kinect voice inputs, which are separate from the component handling game voices that I referenced earlier. More info on that here: http://www.vgleaks.com/durango-sound-of-tomorrow

For PS4, some details here: http://www.vgleaks.com/playstation-4-audio-processor-acp

Dunno how much it matters in these games, but it is an advantage nonetheless. It doesn't necessarily have much in common with the thread topic though.
 
I think the "PC parts cost 9 million bucks" rhetoric needs to stop in this day and age of off-the-shelf parts and easy accessibility. It's probably the easiest time in history to be a PC gamer.

But it's also a good thing that both consoles are being earnestly optimized for, especially in the case of this game, which is clearly very impressive from a technical standpoint. So... I think everyone should just calm down a bit with the platform warring.
 
You forget hyper visor under 3 OS layer dGPU hidden reveal soon sony ponies laugh now MS last laugh DX12 boost performance mark cerny lies PS4 2010 GPU MS have cloud 2018 GPU 3.2TF sony ship sinking sony ponies tell lies about XB1 N4G GAF mods paid ponies spread FUD scared of MS all reveal soon watch table turning XB1 10X powerful PS4

Are you joking?
 
Clear as mud. Which resolution do you prefer to play FFX at in your "certain situation"? I'm going to take a wild guess that you prefer 1080p.

What he's saying is that resolution doesn't matter... unless it does, but even when it does, it only matters a little bit... except for those rare occasions when it matters a lot, but even then it's not that important.

I don't see what's so hard to understand here?
 
The game itself looks great. I was planning on buying it at some point in time, but what CD Projekt has said about DLC will make me a Day 1 buyer again (something that has not happened in a long time). The Witcher 3 will be receiving 16 DLC packages, free of charge across all platforms without pre-order or special edition purchase requirements.

Outstanding! If developers are worried about people trading in their games, they can give us an incentive to keep them. This is definitely a good incentive. Thanks CD Projekt!

Also, props to them for making a game that scales based on hardware and not some bullshit parity clause.

The resolution argument going on in this thread is ridiculous. If all that is available to you is 900p, then you should be okay, you can enjoy the game. If you have the option between 900p and 1080p, that is a no-brainer, you go with 1080p. Anyone that says it doesn't make a difference is delusional.
 
Will there be a system like the Dragon Age keep for people who are new to the series since this will be the first game on a Playstation console? Or maybe a way to send your PC saves to console? This is really the deciding factor for me.

On PC it sounds like you can, but for PS4 it won't be a straight-up transfer. They hinted at some type of prologue story selector that can simulate choices made in the previous game.

http://www.youtube.com/watch?v=D4LHxsvknog&t=1m35s
 
Man, how I remember people saying that PS4 games would quickly drop resolution to sub 1080p, yet here we are, and there's only a handful of titles.
 
On the other hand, going from 900p to 1080p (and from there onward to 4K) is still a massive difference.

It depends on the AA technique, though; in The Witcher 3's case the difference will probably be quite visible even on a TV.

Putting those shots on a TV and sitting at a normal distance won't show a noticeable difference, but on the other hand not every game uses this kind of AA or post-processing:
http://i5.minus.com/is1ucxNIWOaoj.png
http://i6.minus.com/ibhMNoZj0BdN70.png
One is 1536x864 with TAA and the other is 1080p with TAA.

The Order 1886 has the same problem: they are using a 4x MSAA'd image without scaling, but their game is still as blurry as, or sometimes even blurrier than, 900p with proper scaling and sharpening.
 
...and more peak memory bandwidth when used properly, and a much better chip to handle audio tasks, and the CPU as you noted which has more cores and is faster. Seeing as the CPU is what tends to govern open world game framerates, and since audio is traditionally done on the CPU, I'd say those add up to notable advantages. It was certainly relevant in GTA and Unity to get those games to effectively be at parity (or even favor X1's build) even at the same resolution as the PS4 build.

Question: What happens when they begin optimizing graphics in the coming months (almost always the last thing to be optimized in open world games due to bugs being top priority) and they tap into the GPU boosts from the past several SDK updates? What happens when they can enjoy a more stable framerate once utilizing the November update that opens up the 7th core?

My impression is many of you guys haven't considered the actual process of developing a game like this or how the timelines tend to play out. It's very, very likely that the eventual DF article discussing the release build will be 1080p on both. It's very, very *possible* that we end up 1080p on both with X1 having an ever so slightly more stable framerate a la Unity.

None of that really matters anyway because it's peanuts compared to the infinite power of the cloud.
 
Good news for PS4 owners. I remember when they said they might not hit 1080p

Might go with PS4 version if the framerate holds up. Lord knows it's gonna be an unoptimized mess on PC like TW2.
 
a) Even those components are less than 900€
b) You don't remotely need a 980 or a 4790 to outperform any console in this or any other game

UK sales tax is 20% so that's probably why things are more expensive here. lol

Anyway I was being facetious. ;)
 
You forget hyper visor under 3 OS layer dGPU hidden reveal soon sony ponies laugh now MS last laugh DX12 boost performance mark cerny lies PS4 2010 GPU MS have cloud 2018 GPU 3.2TF sony ship sinking sony ponies tell lies about XB1 N4G GAF mods paid ponies spread FUD scared of MS all reveal soon watch table turning XB1 10X powerful PS4

Why is sniping like this allowed here? You didn't respond to anything I said. You simply made up some bullshit, tried to attribute it to me, and then walked off...

Everything I claimed is factual and cited in my posts. If you want me to explain things clearer or if you would like links or whatnot, just ask.
 
0_0....at that price it better do way more than that lol

Not sure what you mean?
I've just checked the recommended specs for The Witcher 3 and priced them online; the motherboard, graphics card, processor and RAM alone cost £759 sterling (about €1,000), and that's without a PC case, cooling, power supply, etc., so it would easily cost €1,500 to play this at the recommended settings.
 
I wasn't offering options. Anyway, you either get it or you don't. I'm sure you understand, but you're clearly going to keep trying to avoid admitting it. I'll stop wasting my time now.

I seriously question that. You offered absolutely nothing of worth that I did not already know - so the truth thing? Not sure where to find it.

Here is what you did: you replied to a post of mine with some passive-aggressive "truth search" as bait, and now you're copping out. Yes, spare me from having to waste my time.
 
Clear as mud. Which resolution do you prefer to play FFX at in your "certain situation"? I'm going to take a wild guess that you prefer 1080p.

He is saying that the difference you see between resolutions varies with your distance from the screen. He is right. If you are a foot away from a screen, you will notice a change from, say, 1080p to 720p a whole lot more than if you are sat ten feet away. There is a difference, but that difference shrinks the further you are from the screen.
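The distance point can be made concrete: perceived sharpness tracks the angular size of one pixel against the eye's roughly one-arcminute resolving limit. A rough sketch, assuming a hypothetical 50-inch 16:9 TV (about 1.11 m of screen width); the screen size and distances are illustrative, not from the thread:

```python
import math

def pixel_angle_arcmin(screen_width_m: float, horizontal_pixels: int, distance_m: float) -> float:
    """Angular size of one pixel, in arcminutes, for a viewer at distance_m."""
    pixel_width = screen_width_m / horizontal_pixels
    return math.degrees(2 * math.atan(pixel_width / (2 * distance_m))) * 60

SCREEN_WIDTH = 1.107  # hypothetical 50-inch 16:9 TV, metres of visible width

for distance in (0.3, 3.0):  # roughly one foot vs. ten feet
    a720 = pixel_angle_arcmin(SCREEN_WIDTH, 1280, distance)
    a1080 = pixel_angle_arcmin(SCREEN_WIDTH, 1920, distance)
    print(f"{distance} m: 720p pixel = {a720:.2f} arcmin, 1080p pixel = {a1080:.2f} arcmin")
```

At ten feet, a 720p pixel on this screen subtends about one arcminute, right at the acuity limit, so the jump to 1080p is hard to see; at one foot, both are far above the limit and the difference is obvious.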
 
I think the "PC parts cost 9 million bucks" rhetoric needs to stop in this day and age of off-the-shelf parts and easy accessibility. It's probably the easiest time in history to be a PC gamer.

Let me take it one step further. As an avid PC gamer, what if the PC parts actually DID cost a very high amount of money? So what?

No seriously, so what?

If 120+ fps, better textures, higher resolutions, etc. matter so much to me that I am willing to pay a very high premium (which isn't even the case in modern PC gaming), then let me pay what I want to pay to play how I want to play.

I just really don't get the salt. I don't look at people with more expensive setups than mine and insult them. I say, "Hey, that's awesome."
 