Quantum Break PC performance thread

Still busted then (unless you have an AMD card).

Pretty much, but even an R9 Fury X won't run it at native 1080p at 60fps without dips. But yeah, overall performance is better on AMD.

Also, the game likes to crash the 970's drivers, which I've experienced depending on my settings, along with the weird frame distribution of the animations, which they showcase towards the end of the video. It really does feel sluggish; it's weird.
 
Digital Foundry on Patch 1.7 Analysed

TL;DW: It's still pretty unplayable on a 970.


Ugh... I might give up on this game and just refund it. No way I want to upscale from a lower resolution and downres it when I can run more taxing games on my rig.

Yeah, I just started playing, and on a mix of medium, high and ultra I was getting 55-60fps, then random jerks down to 25-30. Really random and off-putting.

Not getting any driver crashes though.
 
Hate to say it, but if we can't lock 60 on a 980 Ti, a 950 is really going to struggle, even on low.

The game has to be brute-forced if you want true 1080p and a good framerate. I'd recommend a lower res, maybe. Keep the upscaling on. :(

Wow!! So this game wasn't fully optimized, then?? Why does it require so much power?
 
What do you mean by "not running correctly"? Are you seeing framerate issues, or graphical corruption?

Also a few more questions:
- Have you updated to the latest Nvidia Drivers?
- What i5 do you actually have? If you have a really old i5, the fact that it runs at 3.5GHz probably means nothing; it could actually be worse than the X1's processor.

I actually haven't updated my Nvidia drivers in quite some time...

Well, that might be it... my processor is almost 5 years old :-(

But what puzzles me is that it runs other games great, yet in this one I can't even get decent quality...

I mean, it runs slowly even on low quality...
 
On my rig (980ti, i7-3770k, 8GB RAM) at 1080p and most settings at medium I get an extremely smooth 60fps experience. I don't have a framerate counter, obviously, but I'm naturally very perceptive of framerate drops and I rarely notice them during gameplay.

I'm happy with the game's performance and image quality now, it's a marked improvement over Xbox One.
 
Global Illumination, Lighting overall, and reflections seemed to be the biggest hogs.

There's a LOT of all that going on too. I'm not sure I've seen so much in any game ever. I know nothing I have does it across the board.

There seems to be more going on than in games like Rise of the Tomb Raider or Witcher 3.
 
Also, is there any other game where 390 is 50% faster than 970?

Pretty much

[benchmark chart: LRG3P7l.png]


[benchmark chart: hSmrn73.png]


960/970/980 getting beaten by 30-45% in another DX12 game.
 
I played on a 980Ti and did not experience a single crash, I wonder what's up for DF that's causing that.

I'm wondering if it hasn't got something to do with the fact that they're using Ultra textures on a card with a weird VRAM setup (I believe they're only getting crashes on the 970). I know that on a 980 (which has 4GB of VRAM, but with all of it at the same speed, unlike the 970) I've had to turn down the textures from Ultra to High because they were causing problems. Did wonders for my performance.
 
I'm wondering if it hasn't got something to do with the fact that they're using Ultra textures on a card with a weird VRAM setup (I believe they're only getting crashes on the 970). I know that on a 980 (which has 4GB of VRAM, but with all of it at the same speed, unlike the 970) I've had to turn down the textures from Ultra to High because they were causing problems. Did wonders for my performance.

This is what I imagine is happening: they'll be hitting the 3.5GB limit, and the game is freaking out over it since it's not expecting that.
 
I'm beginning to suspect that we are getting a glimpse at a post DX9/11 world. Instead of Nvidia and AMD correcting game problems through driver workarounds we are completely at the mercy of game developers to solve the problems themselves. No more "optimized" drivers that give us big boosts.
 
I'm beginning to suspect that we are getting a glimpse at a post DX9/11 world. Instead of Nvidia and AMD correcting game problems through driver workarounds we are completely at the mercy of game developers to solve the problems themselves. No more "optimized" drivers that give us big boosts.

That is what low-level APIs are all about: responsibility.
 
So it looks like the reconstruction is still happening, just the resolution it's doing it from has changed with Upscaling off. Given that's the case, wouldn't you be better off setting the game to 720p and letting your display upscale the image instead? Also, what about doing the same, but with 900p instead?
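For context on those numbers, a quick sketch of what each output target would render at internally. The 2/3-per-axis ratio is an assumption taken from coverage of the game (1080p output reconstructed from a 720p render), not something verified in-engine:

```python
# Quantum Break's temporal reconstruction reportedly renders at 2/3 of
# the output resolution on each axis, so a 1080p frame is rebuilt from
# a 720p internal render. The 2/3 ratio is an assumption from coverage
# of the game, not a verified engine detail.

def internal_res(width, height, scale=2 / 3):
    """Internal render resolution implied by a given output resolution."""
    return round(width * scale), round(height * scale)

for out in [(1920, 1080), (1600, 900), (1280, 720)]:
    print(out, "->", internal_res(*out))
```

By this reasoning, setting the game to 720p output would have it reconstructing from roughly 853x480, which is why the display-side upscaling question matters.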
 
Still busted then (unless you have an AMD card).
Yeah, seems like they did nothing for NV's h/w in this patch, unfortunately.

Ashes and FC: Primal maybe?

Nowhere near as bad.
http://www.legitreviews.com/wp-content/uploads/2016/03/ashes-1080p.jpg
https://tpucdn.com/reviews/Performance_Analysis/Far_Cry_Primal/images/2560_1440.png

Pretty much

960/970/980 getting beaten by 30-45% in another DX12 game.

They are getting beaten by almost the same margin in DX11 in this game as well, so it's not a DX12 issue.
 
I'm beginning to suspect that we are getting a glimpse at a post DX9/11 world. Instead of Nvidia and AMD correcting game problems through driver workarounds we are completely at the mercy of game developers to solve the problems themselves. No more "optimized" drivers that give us big boosts.

If that's the case, at least it's way easier to boycott lazy devs than a hardware vendor.
 
After finally being able to get the update, the game is now playable on my i5 4690k and 7870 system instead of a stuttering mess. I'm enjoying it, but in-engine cutscenes are sometimes out of sync.
 
For anyone wanting a rock solid 30FPS on a 980/i5-4690k machine, see below:

This is at 1440p without upscaling or AA, and with the 30FPS lock. It feels fine now, and it never drops below that either; I've been measuring performance with DXtory. The game looks incredible at times. It should be performing at 60FPS considering the hardware, but I'll take an experience that's smoother than the Xbone and doesn't have the INCREDIBLY BAD input lag of that version (0.5 seconds, measured myself; absent on PC). Also, absolutely zero crashes for me on the latest Nvidia drivers. No idea what the issue is for others there.

[settings screenshot: mMzoadT.png]
 
There's a LOT of all that going on too. I'm not sure I've seen so much in any game ever. I know nothing I have does it across the board.

There seems to be more going on than in games like Rise of the Tomb Raider or Witcher 3.

That doesn't explain why an Xbox One can run it reliably and infinitely more powerful PCs still can't.
 
Yes, that is true. How well is DX11 supported in this game, though? You can have it be compatible yet still fundamentally a DX12 game.

What does that mean? A game working under DX11 isn't a DX12 game, as pretty much everything a DX12 game does itself is handled by the drivers in DX11. The shader code isn't different between DX11 and DX12, or at least it shouldn't be, and that's the main reason for such performance differences everywhere we've seen them. Some games just aren't optimized for NV's h/w these days, for a bunch of reasons; DX12 has nothing to do with it.
 
That doesn't explain why an Xbox One can run it reliably and infinitely more powerful PCs still can't.

Well, it's 720p on medium there, and more optimized since it's a single platform. Also, 1080p is more than 2x as many pixels, and people are aiming for 60fps compared to 30fps on the Xbox. Keeping it simple, that requires a video card roughly 4x more powerful than an Xbox One (based on the Xbox One having a 1.3TF GPU), so something along the lines of a 290X/290. It's not that simple in reality, but you know, for argument's sake.
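The back-of-envelope math in that post can be written out explicitly. This is a rough sketch assuming performance scales linearly with pixels per second, which real hardware doesn't quite do:

```python
# Rough estimate of the GPU needed for native 1080p60 versus the
# Xbox One's 720p30, assuming linear scaling with pixel throughput
# (a simplification, as the post itself admits).

XBOX_TFLOPS = 1.3  # Xbox One GPU, per the post above

pixel_ratio = (1920 * 1080) / (1280 * 720)  # 2.25x the pixels
fps_ratio = 60 / 30                          # twice the framerate

throughput_ratio = pixel_ratio * fps_ratio   # 4.5x overall
required_tflops = XBOX_TFLOPS * throughput_ratio

print(f"{throughput_ratio}x the Xbox One -> ~{required_tflops:.2f} TFLOPS")
```

That lands at roughly 5.85 TFLOPS, which is indeed in the ballpark of an R9 290/290X class card.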
 
Yeah, you didn't have to be like that about it. I was just curious. Wasn't aware that you could do that with Win10-Store games, so I was going to thank you for the pointer.

I can confirm it works. You will see the difference.

Quantum Break has no accessible .exe, but the game still has to go through the driver.
 
I have decided I'll wait until I get a GTX1070 to play this game. It's just not worth it to play it on my 660ti. I can play it either at 720p with upscaling on at 60 FPS (horrible image quality), or 720p with upscaling off at 30 FPS, or 1080p with upscaling on at 30 FPS with dips. Of course, this is with graphics set at minimum quality. And I have a 1440p monitor so it's even worse to play at those low resolutions.
 
I went through the Nvidia hub to set up the game's settings and see what they recommend; now the game does not even start. Haha. This game.
 
I have decided I'll wait until I get a GTX1070 to play this game. It's just not worth it to play it on my 660ti. I can play it either at 720p with upscaling on at 60 FPS (horrible image quality), or 720p with upscaling off at 30 FPS, or 1080p with upscaling on at 30 FPS with dips. Of course, this is with graphics set at minimum quality. And I have a 1440p monitor so it's even worse to play at those low resolutions.

Honestly, it's surprisingly smooth at 30FPS, and I'm a stickler for 60+ normally. However, dips would mar that, so I'd recommend you wait.

I have a 980 and at 1440p, I can run a mix of high/med settings and get a locked 30FPS without upscaling. Game looks phenomenal at times. Still should perform better but, then again, no game uses actual Global Illumination like this.
 
Does anyone know if there's a way to trick the game into thinking it has focus? I'd like to use my computer for other things while the TV episodes are playing out, but the moment I click on anything else it pauses.
 
What reasons are there that games aren't optimized for Nvidia's 960, 970 and 980? Don't mention Kepler, please.

Publisher not giving a fuck about PC version? AMD paying the devs to not optimize for NV's h/w? Devs themselves thinking that it doesn't matter? Take your pick. And why shouldn't I mention Kepler? Kepler needs some specific optimizations as well, being basically the only h/w on the market with FL11_0 feature level.
 
Publisher not giving a fuck about PC version? AMD paying the devs to not optimize for NV's h/w? Devs themselves thinking that it doesn't matter? Take your pick. And why shouldn't I mention Kepler? Kepler needs some specific optimizations as well, being basically the only h/w on the market with FL11_0 feature level.

It's quite obviously not that one.
 
There's a LOT of all that going on too. I'm not sure I've seen so much in any game ever. I know nothing I have does it across the board.

There seems to be more going on than in games like Rise of the Tomb Raider or Witcher 3.

Yep. Unmatched lighting right now.
 
Picked up a code from BST for $25. Just ran through Act 1 on an overclocked 970 and everything seems fine. No crashes or anything, which is surprising; I thought I'd at least have to turn off the overclock, but so far so good. I'm running with the 30fps cap, 1080p with no reconstruction, AA off, and most settings on high. I can't say for sure I'm getting a locked 30fps all the time; I'm sure I could turn a couple more settings down for that. If anyone is interested, I'll run through with an fps counter to get a more objective reading, but performance is fine for me at least. Wish it had launched in this state, but this configuration seems fine now. I'm not going to bother trying for 60fps; it seems out of reach for a 970 anyway.

Edit: CPU is a last-gen i5 overclocked to 4.6GHz, the game is installed on an SSD, and I have 16GB RAM. These probably make a difference too.
 
I'm also upset that the game can't be captured with any recording programs.

The Windows capture method itself is alright (Windows key + G). If you can set your capture software to record the screen or the active program, that works (I did this with Mirillis Action).
 
Unfortunately, the live show just plays about 5 seconds, then buffers, then 5 seconds, then buffers... Not really watchable.

Edit: Would it really be so hard to just let me set it manually to 720p or something? I can stream video from basically any service just fine but this just isn't working.

Edit2: Guess I'll wait for another patch or something. This just doesn't work. Ugh
 