The Witcher 3 runs at 1080p ULTRA ~60 fps on a 980

I hope what they said is actually true; my i7 4770K and GTX 980, both overclocked, should run it on Ultra at 1080p/60fps
 
Slightly off-topic question, but am I crazy for thinking there is a difference between console 30fps and PC 30fps? Like, when I play GTA V with 30fps locked on PC (max settings), it looks and feels much more noticeable than playing a 30fps-locked game on consoles. Why is that?
 
Slightly off-topic question, but am I crazy for thinking there is a difference between console 30fps and PC 30fps? Like, when I play GTA V with 30fps locked on PC (max settings), it looks and feels much more noticeable than playing a 30fps-locked game on consoles. Why is that?

You could be locking it incorrectly. They should be qualitatively the same when done correctly.
 
Slightly off-topic question, but am I crazy for thinking there is a difference between console 30fps and PC 30fps? Like, when I play GTA V with 30fps locked on PC (max settings), it looks and feels much more noticeable than playing a 30fps-locked game on consoles. Why is that?

I agree. A locked 30 feels pretty good on consoles, but it looks like a choppy mess on PC.
 
Slightly off-topic question, but am I crazy for thinking there is a difference between console 30fps and PC 30fps? Like, when I play GTA V with 30fps locked on PC (max settings), it looks and feels much more noticeable than playing a 30fps-locked game on consoles. Why is that?

30fps on PC always feels worse than it does on console for some reason. I think it has something to do with the frame limit implementation and a mismatched refresh rate.
 
I hope what they said is actually true; my i7 4770K and GTX 980, both overclocked, should run it on Ultra at 1080p/60fps

It's true. Well after this thread was made, at the YouTube event, all the videos we saw were running Ultra @ 60fps on a single 980.

Also, there is a day one patch, and CDPR said yesterday they have improved performance on all platforms in the past 2 weeks, not to mention the build YouTubers played was a couple months old already.



A 4770K and a 980 will destroy this game on Ultra @ 1080p. The only settings that will give it pause are Ubersampling, which is not available at launch, and Hairworks, which was still being optimized but seems to have no drastic effect on framerate in the Angry Joe or Jesse Cox videos, both of which have Hairworks enabled on a 980 on Ultra, running at 60fps.
 
Slightly off-topic question, but am I crazy for thinking there is a difference between console 30fps and PC 30fps? Like, when I play GTA V with 30fps locked on PC (max settings), it looks and feels much more noticeable than playing a 30fps-locked game on consoles. Why is that?
Console games tend to have more even frame pacing because developers lock to 30fps.

For PC, you may have to jump through some hoops, such as forcing double-buffered VSync through the driver/Afterburner, and even then it's not perfect. A GAF poster recommended layering D3DO and Afterburner to smooth it out further.
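For what it's worth, the gap between a clean cap and a sloppy one is easy to see in a sketch. This is a toy frame limiter (purely hypothetical, not how any specific game or tool does it), showing why sleep-based capping alone doesn't guarantee even pacing:

```python
import time

TARGET_DT = 1.0 / 30.0  # ~33.3 ms per frame for a 30fps cap

def run_capped(render_frame, num_frames):
    """Naive external frame limiter: render, then sleep off the leftover time.

    Scheduling against a fixed deadline avoids drift, but sleep()
    granularity and render-time jitter are not aligned to the display's
    60Hz refresh, so individual frames can still land on 1, 2, or 3
    refresh intervals -- the uneven pacing people complain about.
    """
    next_deadline = time.perf_counter() + TARGET_DT
    for _ in range(num_frames):
        render_frame()                       # stand-in for the game's frame work
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)
        next_deadline += TARGET_DT           # fixed schedule, no accumulated drift

run_capped(lambda: None, 10)  # ~0.33s of "game time" at a 30fps cap
```

Tools like the driver-level cap or a vsync-based limiter do better because they align presentation to the refresh itself instead of to wall-clock sleeps.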
 
Slightly off-topic question, but am I crazy for thinking there is a difference between console 30fps and PC 30fps? Like, when I play GTA V with 30fps locked on PC (max settings), it looks and feels much more noticeable than playing a 30fps-locked game on consoles. Why is that?

They should be the same. Are you using mouse and keyboard or a gamepad?

Latency is much more noticeable with a mouse and keyboard than with a gamepad. Also, if you were just playing the game at 50+ FPS and you lock it, you WILL feel the sluggishness of 30 FPS.

I felt the exact same way, and it was because of the two reasons above. I finally tried a blind test on my Xbone and my PC locked at 30 (with Ryse). Couldn't tell the difference with a gamepad (it wouldn't be a blind test with the mouse ;) ).
 
It's true. Well after this thread was made, at the YouTube event, all the videos we saw were running Ultra @ 60fps on a single 980.

Also, there is a day one patch, and CDPR said yesterday they have improved performance on all platforms in the past 2 weeks, not to mention the build YouTubers played was a couple months old already.



A 4770K and a 980 will destroy this game on Ultra @ 1080p. The only settings that will give it pause are Ubersampling, which is not available at launch, and Hairworks, which was still being optimized but seems to have no drastic effect on framerate in the Angry Joe or Jesse Cox videos, both of which have Hairworks enabled on a 980 on Ultra, running at 60fps.

Not to mention they played on like a 1 or 2 month old build of the game.
 
Yes, the Xbox One and PlayStation 4 versions had previously been confirmed to run at the PC equivalent of High. Though that remains to be seen when the game comes out.

To be honest, I find it hard to believe the PS4 version is running at medium/high, considering it takes a high-end CPU & GPU to run it at Ultra.
The PS4 version must be extremely well optimised then. That raises the question of how well optimised the PC version is, if the PS4 can run it @ medium/high.
 
To be honest, I find it hard to believe the PS4 version is running at medium/high, considering it takes a high-end CPU & GPU to run it at Ultra.
The PS4 version must be extremely well optimised then. That raises the question of how well optimised the PC version is, if the PS4 can run it @ medium/high.

I find it hard to believe because EVERY SINGLE developer who has said the same thing was lying. BF4 was also supposedly running at "Very High" settings, and so was Dying Light, and so was just about every other multiplat... until we got the game and realized it was maybe a setting or two at High, the rest at Medium, and some even at Low.
 
I find it hard to believe because EVERY SINGLE developer who has said the same thing was lying. BF4 was also supposedly running at "Very High" settings, and so was Dying Light, and so was just about every other multiplat... until we got the game and realized it was maybe a setting or two at High, the rest at Medium, and some even at Low.

Unrelated to your post, but have you seen the Jesse Cox preview? It looks so good even with low-bitrate Shadowplay recording and YouTube compression, much better than any footage we've gotten. It's spoiler-free too, except for maybe like 2 seconds.

http://www.gfycat.com/HonoredWhimsicalJerboa so good
 
To be honest, I find it hard to believe the PS4 version is running at medium/high, considering it takes a high-end CPU & GPU to run it at Ultra.
The PS4 version must be extremely well optimised then. That raises the question of how well optimised the PC version is, if the PS4 can run it @ medium/high.
It will likely be a mixture of medium/low/high depending on the specific limitation. Something that is hard on the CPU will be low/medium at best.

The upshot is that consoles will be excluded from the painful downgrade debate, since that's reserved for high-end PCs.
 
They should be the same. Are you using mouse and keyboard or a gamepad?

Latency is much more noticeable with a mouse and keyboard than with a gamepad. Also, if you were just playing the game at 50+ FPS and you lock it, you WILL feel the sluggishness of 30 FPS.

I felt the exact same way, and it was because of the two reasons above. I finally tried a blind test on my Xbone and my PC locked at 30 (with Ryse). Couldn't tell the difference with a gamepad (it wouldn't be a blind test with the mouse ;) ).

Yeah, that might be it. I rarely play with a gamepad on PC.
 
Using the Nvidia control panel and setting vsync to half refresh rate is just as good as consoles. I played GTA V and AC Unity recently at 3200x1800 capped at 30fps.

And that's considering many console games dip below 30, whereas I can change settings so it never dips below 30fps and frames stay consistent, unlike on a console.

The only thing I can think of is motion blur missing in some titles. Otherwise it seems more like the old myth that consoles have smoother chips and other magic.

Devs on console even resort to screen tearing, like in Alan Wake on 360, as frames go out of sync when a lot is going on, and there's nothing you can do about it.
 
Is this something that can really be considered legitimate, though, and not just a way to influence purchases of new cards?

You make a good point. Of course you can run the game at WQHD with a 980 and at 4K with a Titan X, but at low fps, I think. Surely not at 60.
This is the first game that makes me really consider the switch to WQHD. But then I'd need another 980 (60 fps, yo) and another monitor. My wallet says no.
 
Slightly off-topic question, but am I crazy for thinking there is a difference between console 30fps and PC 30fps? Like, when I play GTA V with 30fps locked on PC (max settings), it looks and feels much more noticeable than playing a 30fps-locked game on consoles. Why is that?

Frame rate vs. frame pacing. On a 60Hz screen, 30fps will appear smooth when each frame is displayed for exactly two refreshes. If the frame pacing is off, some frames might be displayed for three refreshes or only one. Still technically 30fps, but it'll appear a lot less smooth.
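To make that concrete, here's a toy simulation (made-up presentation timestamps, nothing measured from a real game) that counts how many 60Hz refreshes each frame stays on screen:

```python
REFRESH = 1.0 / 60.0  # one refresh every ~16.7 ms on a 60Hz display

def refreshes_per_frame(present_times, total_refreshes):
    """For each refresh tick, find which frame is on screen, then count
    how many consecutive refreshes each frame occupied."""
    counts = []
    current = -1
    for r in range(total_refreshes):
        tick = r * REFRESH
        # latest frame presented at or before this refresh tick
        frame = max((i for i, t in enumerate(present_times) if t <= tick),
                    default=0)
        if frame != current:
            counts.append(1)
            current = frame
        else:
            counts[-1] += 1
    return counts

# Even pacing: a new frame exactly every 2 refreshes -> smooth 30fps
even = [i * 2 * REFRESH for i in range(10)]
print(refreshes_per_frame(even, 20))    # every frame holds 2 refreshes

# Uneven pacing: same 30fps average, but every other frame is 12 ms late
uneven = [i * 2 * REFRESH + (0.012 if i % 2 else 0.0) for i in range(10)]
print(refreshes_per_frame(uneven, 20))  # frames alternate 3 and 1 refreshes
```

Both sequences average 30fps over the run, but the second one holds a frame for three refreshes and then the next for only one, which is exactly the judder being described.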
 
my post from the other thread, because I misposted :)

I just want to know how this game will perform on an AMD card (MSI 290), because it is a heavily marketed Nvidia game.

Looking at the recommended requirements, 770 vs 290 (wtf? the 290 is more powerful), and the interview where a dev said we should expect 30fps with the rec specs, I am very concerned.

If my i5 4560k and my 290 will only run the game as well as the PS4, I might buy the PS4 version... That would be terrible optimization, btw.
 
Frame rate vs. frame pacing. On a 60Hz screen, 30fps will appear smooth when each frame is displayed for exactly two refreshes. If the frame pacing is off, some frames might be displayed for three refreshes or only one. Still technically 30fps, but it'll appear a lot less smooth.

Frame pacing is mostly an issue when the frame rate varies, not when it's steady. There is judder with vsync, but only if the frame rate isn't being capped to 30.
 
my post from the other thread, because I misposted :)

I just want to know how this game will perform on an AMD card (MSI 290), because it is a heavily marketed Nvidia game.

Looking at the recommended requirements, 770 vs 290 (wtf? the 290 is more powerful), and the interview where a dev said we should expect 30fps with the rec specs, I am very concerned.

If my i5 4560k and my 290 will only run the game as well as the PS4, I might buy the PS4 version... That would be terrible optimization, btw.

Your 290 will run it better than the PS4. Why would you even doubt that?
 
my post from the other thread, because I misposted :)

I just want to know how this game will perform on an AMD card (MSI 290), because it is a heavily marketed Nvidia game.

Looking at the recommended requirements, 770 vs 290 (wtf? the 290 is more powerful), and the interview where a dev said we should expect 30fps with the rec specs, I am very concerned.

If my i5 4560k and my 290 will only run the game as well as the PS4, I might buy the PS4 version... That would be terrible optimization, btw.

AMD's drivers, even though improved, still aren't as good at DX11 as Nvidia's.

Let's hope they improve performance over what they're saying in the rec specs.

Still, even if you're only getting PS4 levels of performance (though I doubt that!), on PC you still have options; not so on consoles. YMMV, but for me, I'll take options over none any day of the week.
 
I'm going to be unhappy if my 7970 Crossfire setup can't run this at a solid 60fps with hairworks enabled. I really want to play it with hairworks. But that whole "go slow on AMD cards" worries me.

I'm hoping that slight bit of extra brute force my setup has over a single 980 will be enough to make the difference.
 
Using the Nvidia control panel and setting the vsync to half refresh rate is just as good as consoles. I play GTAV and AC Unity recently at 3200x1800p capped at 30fps.

New to using this Adaptive Vsync Half Refresh option, but interestingly enough, while playing the original Crysis I get 55-75FPS at 4K maxed, but with horrible screen tearing. The same scene with Adaptive Vsync Half Refresh gives me 23fps... In fact, all forms of Vsync seem to absolutely decimate performance to below 30fps.

I only have this problem in Crysis 1. Crysis 3 runs without issue.

3770k
16GB
SLI 970 FTWs

I only ask here because I am sort of interested in locking Wild Hunt at 30fps @ 4K if possible, as I know 60FPS is not even close to the realm of possibility (Ultra).
 
I'm going to be unhappy if my 7970 Crossfire setup can't run this at a solid 60fps with hairworks enabled. I really want to play it with hairworks. But that whole "go slow on AMD cards" worries me.

I'm hoping that slight bit of extra brute force my setup has over a single 980 will be enough to make the difference.

Unless crossfire support is fucked out of the gate, I would think you'd definitely be able to pull it off. Even if it is fucked out of the gate, I'm sure an x-fire profile will be out soon after release, and a single 7970 should be enough for most bells and whistles until the profile hits.

Let's see how fast AMD moves to optimize the driver for this game.

On an unrelated note, I'm pretty sure the GPU manufacturers must be happy about DX12. It should mean a lot fewer necessary driver optimizations on their end after every game.
 
New to using this Adaptive Vsync Half Refresh option, but interestingly enough, while playing the original Crysis I get 55-75FPS at 4K maxed, but with horrible screen tearing. The same scene with Adaptive Vsync Half Refresh gives me 23fps... In fact, all forms of Vsync seem to absolutely decimate performance to below 30fps.

I only have this problem in Crysis 1. Crysis 3 runs without issue.

3770k
16GB
SLI 970 FTWs

I only ask here because I am sort of interested in locking Wild Hunt at 30fps @ 4K if possible, as I know 60FPS is not even close to the realm of possibility (Ultra).
24Hz bug in the original Crysis! Playing through HDMI, I assume?
 
Your 290 will run it better than the PS4. Why would you even doubt that?

Because of that interview:
http://www.gamepressure.com/e.asp?ID=51

And that part especially:

Will we get 30fps or 60fps on the recommended requirements for PC?

I think we'll stick with 30fps. Until the very end we intend to work on efficiency – it is crucial, as the machines are quite varied. Our programmers strongly urged us not to resort to demagoguery, confuse anyone, or bring down the announced requirements to accommodate lower specs. It's as I've said, and it can only be better.

He is saying I'll only get 30fps with the rec specs (on High), which are an i5 3570K and an AMD 290. The PS4 is presumably running on High at a locked 30fps.
If this is true, it would be just horrible optimization, imo.
 
So for this beast, an Intel i7-4790 with 16GB of RAM and an Nvidia GTX 980, you pay more than $/€ 1,000, right? Even at that price point you aren't able to play it at a locked 60fps!

Price vs. quality is completely out of proportion.

Is the Ultra setting the only setting?
 
Because of that interview:
http://www.gamepressure.com/e.asp?ID=51
And that part especially:
He is saying I'll only get 30fps with the rec specs (on High), which are an i5 3570K and an AMD 290. The PS4 is presumably running on High at a locked 30fps.
If this is true, it would be just horrible optimization, imo.

Considering a 960 runs the game on High according to Nvidia... you have no reason to worry. Recommended and min specs are just guesstimates and should not be interpreted as biblical truth. Your card has 2x the Flops of the PS4, on top of way more bandwidth. It is an impossibility that it will run worse, I guarantee it.

Also, I am quite positive that the 770 is going to run the game at Ultra cvars (no Hairworks or MSAA) at 30fps.
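Rough numbers behind the Flops comparison, using the usual marketing-peak convention (shader count × 2 FMA ops per cycle × clock). The shader counts and clocks below come from public spec sheets, so treat the exact figures as approximate:

```python
def peak_gflops(shaders, clock_ghz):
    """Theoretical peak: shaders x 2 ops/cycle (fused multiply-add) x clock."""
    return shaders * 2 * clock_ghz

ps4 = peak_gflops(1152, 0.800)     # PS4 GPU: 18 CUs x 64 ALUs @ 800 MHz
r9_290 = peak_gflops(2560, 0.947)  # R9 290 @ up to 947 MHz

print(f"PS4:    {ps4:.0f} GFLOPS")     # ~1843
print(f"R9 290: {r9_290:.0f} GFLOPS")  # ~4849
print(f"Ratio:  {r9_290 / ps4:.1f}x")  # ~2.6x
```

Peak Flops ignores drivers, API overhead, and memory behavior, so it's a ceiling rather than a benchmark, but it does show the 290 has well over twice the raw compute of the PS4.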
 
I have a 3570K/MSI Gaming 970.
I should only have to dial down one or two settings to get a steady 60fps, isn't that right?

Probably yes.


Is the 970 OC'd at all? Many of the OC'ed Editions of the 970 come very close to or equal to stock 980 performance.

Also, the builds that were getting 60FPS on a Single 980 were months old. Performance has improved since then, so that may have closed the gap even further.
 