
Watch_Dogs PC Performance Thread

I've finally gotten the game to run well on my GTX 770 2GB, but I'm not happy about it. I turned the textures to Medium with VSync on, and everything else to Ultra with HBAO Low and SMAA. The stuttering is finally gone. Textures look horrendous, as expected, but I don't get a migraine while playing this now.
 

Dr Dogg

Member
You are never going to believe this, but disable your audio driver and hey presto, no stutter. I'm only using the on-board Realtek rubbish at the moment, but disabling it sorted that out. Bloody typical, as I took my Xonar out the other day. Will test a bit more thoroughly tomorrow, but if you're having stuttering issues, disable whatever audio devices you have, either via Device Manager or via the options in the volume controls, and see if it's better for you. Granted, you won't have any sound though :/

Edit: Scratch that. Call it reduced but still bloody there.
 
i5 4670k
GTX 670 4GB
8GB RAM

You guys are gonna hate me, but I turned everything down except shaders/textures (which are on highest).

VSync on via the Nvidia panel.

Anywhere from 30-60 fps.
 

Megasoum

Banned
This game is starting to piss me off. I can live with the fact that I'm not getting a constant 60 fps (well... that sucks, but anyway)... But no matter what VSync option I use in-game, it will still always try to lock it at 60. I thought one of those options was supposed to lock it at 30?

i5 4670 with 16GB of RAM and a 770 (2GB), at 1080p.
 

Skyzard

Banned
This game is starting to piss me off. I can live with the fact that I'm not getting a constant 60 fps (well... that sucks, but anyway)... But no matter what VSync option I use in-game, it will still always try to lock it at 60. I thought one of those options was supposed to lock it at 30?

i5 4670 with 16GB of RAM and a 770 (2GB), at 1080p.



Select Borderless Window in the in-game display options (for less tearing).
 

Conezays

Member
Bah, I tried to get into the game 3 times tonight to show a friend. Kept freezing on the loading screen. WTF? I'm at least 10 hours in and haven't had this problem yet.
 
So, I've got an i7 4770K and a GTX 780 clocked at 1250MHz and this game runs like dog shit.

What's the consensus here?
I have that exact setup and it runs fine for me maxed out on Ultra, minus textures at High; it generally runs at 60 fps. I have about 20 hours or so played, too.
 

Vossler

Member
Going to add to the misery here:

i5 2500k @ 4.3GHz
2x GTX 670 Asus DCU TOP
8GB RAM
Win7

Game is great for a while, set at 1080p/30 forced with SMAA, then it shits its pants... C'MON UBI, patch it up for us wannabe l337 haXX0rs!
 

GHG

Member
Game crashes whenever I hack a billboard when driving, anybody else getting this issue? Like it literally instantly crashes the game.

This has got to be one of Ubi's worst PC ports yet. Not only are the stuttering and performance issues ruining the game, it also has major game-breaking bugs involving the main gameplay mechanic... GREAT JOB UBI!

Same issue here. Instant CTD.
 

riflen

Member
Hahaha, yeah I saw them, which is kind of what prompted me to have a look, as I felt I was getting nowhere near the same experience. Don't know how Afterburner/RTSS polls its results, but I assume it's at the driver level, so no idea how accurate they are; will try to get FRAPS working tomorrow. I've tried with both the Steam and Uplay overlays disabled and FRAPS still hangs on me. It only does it with DX11 titles and isn't exclusive to Watch_Dogs. Going to fiddle around on my second BIOS tomorrow and do a clean install and see what I get.

I have the Uplay release, but I doubt Steam is the cause. FRAPS has been known to misbehave with DX11 and it was crashing the game for me too. I was simultaneously running Afterburner to grab the other metrics. I found I had to start the game with Afterburner running (disable RTSS overlay), then alt-tab out and run FRAPS (overlay disabled). My Uplay overlay is also disabled. Good luck!
 

riflen

Member
Are GPU buffered frames good for anything other than horrible input latency?

Not really, no. If your system is creating wildly varying frame times, a longer buffer can help keep the frame rate consistent. Don't use more than 3, generally. It also costs VRAM to use the buffer.
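To put rough numbers on that latency cost (a back-of-the-envelope sketch, on the common assumption that each queued frame adds about one frame time of input delay):

```python
# Approximate input latency added by the pre-rendered frame queue.
# Assumption: each buffered frame delays input by ~1 frame time.
def added_latency_ms(fps: float, buffered_frames: int) -> float:
    frame_time_ms = 1000.0 / fps
    return frame_time_ms * buffered_frames

for frames in (1, 2, 3):
    print(f"{frames} buffered frame(s) at 30 fps: "
          f"+{added_latency_ms(30, frames):.0f} ms")
# 1 -> +33 ms, 2 -> +67 ms, 3 -> +100 ms
```

At 30 fps, a 3-frame queue is already around 100ms of extra input lag, which is why going beyond 3 is rarely worth it.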
 

xXBaconXx

Banned
That kind of stuttering is way too extreme for running out of VRAM; a 16x PCIe 3.0 bus should take less than 130ms to transfer enough data to fill the VRAM with assets.

I don't think I'm running out of VRAM. According to Afterburner I still have 100 MB to spare, and I even tried lowering the resolution once the stuttering started, which didn't help.

The game takes up a crazy amount of pagefile space though. I'm using the "pagefile hack", not sure if that even does anything.

Granted, it never stuttered this hard for me, but anyway have you tried borderless window? The game is completely stutter-free for me, on ultra with textures high.

I'll try that out.
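For what it's worth, the 130ms figure quoted above checks out as a theoretical best case (a quick sketch assuming ~15.75 GB/s peak for PCIe 3.0 x16 and a 2GB card; real-world transfers are slower):

```python
# Theoretical time to refill 2GB of VRAM over PCIe 3.0 x16.
PCIE3_X16_GB_PER_S = 15.75  # ~985 MB/s per lane x 16 lanes, peak
vram_gb = 2.0

transfer_ms = vram_gb / PCIE3_X16_GB_PER_S * 1000
print(f"~{transfer_ms:.0f} ms to move {vram_gb:.0f} GB")  # ~127 ms
```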
 

noomi

Member
Disabling my SLI got rid of my stuttering completely. Thought I was running out of VRAM, but my usage is normally 2900-3050MB.
 

MaLDo

Member
I think the SLI configuration is using a workaround to avoid flickering that conflicts with the game's asset-loading routines.

With the 0x0E0932F5 SLI bits, there is flickering in vegetation and HDR variations, but it eliminates stuttering in a tri-SLI setup.

Only for testing purposes (back up your config file and click 'restore default profile' in Nvidia Inspector after testing):

Inspector profile
http://yourfiledrive.com/0e6e2633e4b44dca6e81b34028a75ee3/watch_dogs_flicker.nip

Game config file
http://yourfiledrive.com/120fe8f094bb26d90840edaa48574870/GamerProfile.xml
 

Authority

Banned
Reposting,

Ubisoft Working On PC Patch For Watch Dogs, Offers Advice To Boost Performance

If you’re one of many disheartened gamers playing Watch Dogs on PC, there’s hope on the horizon. Ubisoft‘s Sebastien Viard, the game’s Graphics Technical Director, took to Twitter this morning to address concerns, update players on an impending patch, and dish out some technical advice for getting the best performance.

Regardless of your preferred graphics card maker, Watch Dogs isn’t living up to performance expectations on PC. Benchmarks across the internet have varied dramatically, and my sources at Nvidia suggest that even they can’t get consistent back-to-back benchmark runs. Nvidia explains that Ubisoft’s Disrupt engine is quite complex, and that the way the game streams in data may be a potential bottleneck. If you think about the open world variables and the metadata that must be randomly assigned to every individual roaming the streets of Chicago (for the purpose of Aiden’s profiler tool), it stands to reason that’s at least one factor impacting performance.

Nvidia says that AMD was free to approach Ubisoft at any time to suggest tweaks and improvements to the game for their hardware.

Ubisoft’s Sebastien Viard explains that “Making an open world run on [next-generation] and [current-generation] consoles plus supporting PC is an incredibly complex task.” He goes on to say that Watch Dogs can use 3GB or more of RAM on next-gen consoles for graphics, and that “your PC GPU needs enough Video Ram for Ultra options due to the lack of unified memory.” Indeed, Video RAM requirements are hefty on PC, especially when cranking up the resolution beyond 1080p. (This is why I knew Sony was on the right track using unified GDDR5 memory for the PlayStation 4.)

While recent driver updates from Nvidia and AMD should improve general performance, Viard promises that their PC programmers are “currently working on a patch to improve your experience,” and thanks both his team and users for diligently reporting performance issues. The most perplexing thing about the PC version of Watch Dogs is how divided user experiences actually are. I saw at most 30% scaling when using Nvidia GPUs in SLI, while others with near-identical hardware see upwards of 75%. Some users see stutter and artifacting while others report no problems using even older generations of AMD and Nvidia cards.

In the meantime, Ubisoft’s Viard offers some technical advice for boosting performance, especially if you’re experiencing stuttering or drastic framerate drops: Reduce your texture quality, level of Anti-Aliasing (FXAA is the least taxing in my experience), and/or resolution. Unless you have 4GB of Video RAM, playing at 1440p or 4K will be problematic, at least for now.

Watch Dogs is an important release for Ubisoft, and it’s reassuring to see their team working on improving the game for PC users. Nvidia is also continuing to consult with Ubisoft engineers on optimizing drivers for the game.

Credits to Forbes
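To illustrate the article's point about resolution and Video RAM (a simplified sketch counting only a handful of screen-sized render targets at 4 bytes per pixel; a real deferred renderer keeps many more, plus all its texture data):

```python
# Approximate render-target memory at common resolutions.
def render_targets_mb(width, height, bytes_per_pixel=4, target_count=3):
    # target_count stands in for color/depth/post-process buffers
    return width * height * bytes_per_pixel * target_count / 1024**2

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K": (3840, 2160)}.items():
    print(f"{name}: ~{render_targets_mb(w, h):.0f} MB")
# 1080p ~24 MB, 1440p ~42 MB, 4K ~95 MB
```

The render targets alone roughly quadruple from 1080p to 4K, and fat G-buffers or MSAA multiply those numbers further, which is why cards under 4GB struggle above 1080p.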
 
I had a problem with a weird flickering effect, mainly on trees and grass. Turned HBAO to Low and that fixed it. Also went from Ultra to High textures and now there's virtually no stutter.
 

Seanspeed

Banned
Sounds like there won't be any miracle fixes then. The game engine just has certain limitations, and the nature of the consoles (not having super powerful GPUs, but having lots of VRAM available) means they've developed the game with next-gen consoles and their limitations in mind to get the most out of them.

If Watch Dogs is any indication, it does seem as though it's going to take 4GB cards becoming the baseline standard for PC GPUs to really get the most out of the power available, at least for console ports like this. But when that does happen, they should make mincemeat of these games.

Then again, a better game engine may prove Watch Dogs to be just a bad example, and things will progress more smoothly with time.
 

fade_

Member
Sounds like there won't be any miracle fixes then. The game engine just has certain limitations, and the nature of the consoles (not having super powerful GPUs, but having lots of VRAM available) means they've developed the game with next-gen consoles and their limitations in mind to get the most out of them.

If Watch Dogs is any indication, it does seem as though it's going to take 4GB cards becoming the baseline standard for PC GPUs to really get the most out of the power available, at least for console ports like this. But when that does happen, they should make mincemeat of these games.

Then again, a better game engine may prove Watch Dogs to be just a bad example, and things will progress more smoothly with time.

I wouldn't fret too much...Ubisoft is notorious for shoddy PC optimization.
 

MaLDo

Member
Sounds like there won't be any miracle fixes then. The game engine just has certain limitations, and the nature of the consoles (not having super powerful GPUs, but having lots of VRAM available) means they've developed the game with next-gen consoles and their limitations in mind to get the most out of them.

If Watch Dogs is any indication, it does seem as though it's going to take 4GB cards becoming the baseline standard for PC GPUs to really get the most out of the power available, at least for console ports like this. But when that does happen, they should make mincemeat of these games.

Then again, a better game engine may prove Watch Dogs to be just a bad example, and things will progress more smoothly with time.


I would bet that the streaming routines are identical in Watch Dogs, Far Cry 3, and Assassin's Creed III and IV. A good streaming process must take a fixed time and must be asynchronous to the renderer.

Looking at a single-GPU setup, the variable with the biggest effect on stutter is the combo of MaxPrerenderedFrames and MaxDriverPrerenderedFrames.
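As a rough picture of what "asynchronous to the renderer" means (a minimal sketch, not Ubisoft's actual code: a worker thread does the slow loading while the render loop never blocks on I/O):

```python
import queue
import threading
import time

load_requests = queue.Queue()
ready_assets = queue.Queue()

def streaming_worker():
    # Slow work (disk read, decompress) happens off the render thread.
    while True:
        name = load_requests.get()
        time.sleep(0.05)  # stand-in for a 50 ms disk read
        ready_assets.put(name)

threading.Thread(target=streaming_worker, daemon=True).start()

for frame in range(120):
    if frame % 30 == 0:
        load_requests.put(f"tile_{frame}")  # request ahead of need
    while not ready_assets.empty():
        print("streamed in:", ready_assets.get_nowait())
    time.sleep(1 / 60)  # the ~16 ms render budget stays untouched
```

If that 50ms load ran on the render thread instead, it would blow the ~16ms frame budget three times over; in other words, a stutter.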
 

cripterion

Member
Game crashes whenever I hack a billboard when driving, anybody else getting this issue? Like it literally instantly crashes the game.

This has got to be one of Ubi's worst PC ports yet. Not only are the stuttering and performance issues ruining the game, it also has major game-breaking bugs involving the main gameplay mechanic... GREAT JOB UBI!

Yes, it always happens for me.

It's funny, I was looking at the optimal settings for my rig in GeForce Experience and everything is at the highest settings, even textures on Ultra for my 2GB cards.
 

Authority

Banned
Sounds like there won't be any miracle fixes then. The game engine just has certain limitations, and the nature of the consoles (not having super powerful GPUs, but having lots of VRAM available) means they've developed the game with next-gen consoles and their limitations in mind to get the most out of them.

If Watch Dogs is any indication, it does seem as though it's going to take 4GB cards becoming the baseline standard for PC GPUs to really get the most out of the power available, at least for console ports like this. But when that does happen, they should make mincemeat of these games.

Then again, a better game engine may prove Watch Dogs to be just a bad example, and things will progress more smoothly with time.

From what I understand after reading, they knew PC gamers would experience this mess and still deliberately pushed for a release on PC instead of delaying it to optimize it more.

What the fuck?

Edit: Unless I am reading it wrong.
 

Robert7lee

Neo Member
GTX Titan
i7 4770k
32GB

Like Digital Foundry reported, High textures still produce FPS hitches even on the Low graphics preset.

Trying the 50Hz/50fps cap, I still get hitches and frame drops to 46fps.

With VSync off, High textures, Ultra graphics, and temporal FX and AO off, it rarely goes below 60 except when driving in built-up areas.

Rig sucks
 

riflen

Member
I think the SLI configuration is using a workaround to avoid flickering that conflicts with the game's asset-loading routines.

With the 0x0E0932F5 SLI bits, there is flickering in vegetation and HDR variations, but it eliminates stuttering in a tri-SLI setup.

Only for testing purposes (back up your config file and click 'restore default profile' in Nvidia Inspector after testing):

Inspector profile
http://yourfiledrive.com/0e6e2633e4b44dca6e81b34028a75ee3/watch_dogs_flicker.nip

Game config file
http://yourfiledrive.com/120fe8f094bb26d90840edaa48574870/GamerProfile.xml

Nice. Thanks for posting this.
 

Amey

Member
If devs have built their next-gen engines around unified memory architectures, then I'm afraid these stutter-fests are going to become common in PC ports this generation, so better buckle up for that. PC gamers may once again have to go for the brute-force approach.

I still remember how, last gen, weaker dual-core CPUs suffered as engines were designed around the consoles' multi-core architectures.
 

oneils

Member
Anyone try this with a GTX 460 with 768MB VRAM?

I wonder if it is worth trying at all with that card. I'm guessing I will have to play on medium settings? Maybe that's not too bad.
 

b0bbyJ03

Member
I really hope this is not an indication of what is going to happen now that devs are working with unified memory on consoles, which I'm assuming totally changes the way they handle data distribution. Titanfall had issues with its insane textures and now Watch Dogs does as well, and the thing is, it doesn't seem the issue is rooted in how much VRAM the game needs (6GB Titans and 780 Tis are stuttering) but more in how assets are being streamed. I guess we'll have to wait and see.
 

b0bbyJ03

Member
If devs have built their next-gen engines around unified memory architectures, then I'm afraid these stutter-fests are going to become common in PC ports this generation, so better buckle up for that. PC gamers may once again have to go for the brute-force approach.

I still remember how, last gen, weaker dual-core CPUs suffered as engines were designed around the consoles' multi-core architectures.

Haha, I didn't see this post, but I basically just echoed it!
 

Qassim

Member
I really hope this is not an indication of what is going to happen now that devs are working with unified memory on consoles, which I'm assuming totally changes the way they handle data distribution. Titanfall had issues with its insane textures and now Watch Dogs does as well, and the thing is, it doesn't seem the issue is rooted in how much VRAM the game needs (6GB Titans and 780 Tis are stuttering) but more in how assets are being streamed. I guess we'll have to wait and see.

But the consoles have less than 6GB (approx. 5-5.5GB) of unified memory accessible to developers. Everything they're doing in that single pool of memory on the consoles could fit on a 6GB card. So I don't think the lack of unified memory is actually the problem here; there are issues with the engine that need fixing.

Not to mention Watch Dogs is not the first 'next-gen' game we have played on PC - there have been plenty of others that have had no such problem.
 

riflen

Member
I really hope this is not an indication of what is going to happen now that devs are working with unified memory on consoles, which I'm assuming totally changes the way they handle data distribution. Titanfall had issues with its insane textures and now Watch Dogs does as well, and the thing is, it doesn't seem the issue is rooted in how much VRAM the game needs (6GB Titans and 780 Tis are stuttering) but more in how assets are being streamed. I guess we'll have to wait and see.

It's not. My frame times are much, much more in line with what I'd expect from this game with the SLI bits MaLDo just posted. Using them, the rendering is broken on most lighting and associated effects, but it shows that Watch_Dogs just has bugs. Extrapolating based on a single game is silly. Extrapolating based on a Ubisoft open-world game is insanity.
 

scitek

Member
A lot of devs give the "you're a PC gamer, you're used to it by now" vibe when addressing things like this via Twitter and whatnot. I'd bet most of them just make the conscious decision to prioritize the consoles first, and deal with the PC later since it's a smaller audience in their eyes.
 

Dr Dogg

Member
I think the SLI configuration is using a workaround to avoid flickering that conflicts with the game's asset-loading routines.

With the 0x0E0932F5 SLI bits, there is flickering in vegetation and HDR variations, but it eliminates stuttering in a tri-SLI setup.

Only for testing purposes (back up your config file and click 'restore default profile' in Nvidia Inspector after testing):

Inspector profile
http://yourfiledrive.com/0e6e2633e4b44dca6e81b34028a75ee3/watch_dogs_flicker.nip

Game config file
http://yourfiledrive.com/120fe8f094bb26d90840edaa48574870/GamerProfile.xml

Cheers for posting, MaLDo. I'll try these out later.

A lot of devs give the "you're a PC gamer, you're used to it by now" vibe when addressing things like this via Twitter and whatnot. I'd bet most of them just make the conscious decision to prioritize the consoles first, and deal with the PC later since it's a smaller audience in their eyes.

Seeing some of the Twitter comments by Ubisoft employees gives me the impression these people are more management than engineering.
 

Amey

Member
But the consoles have less than 6GB (approx. 5-5.5GB) of unified memory accessible to developers. Everything they're doing in that single pool of memory on the consoles could fit on a 6GB card. So I don't think the lack of unified memory is actually the problem here; there are issues with the engine that need fixing.

Not to mention Watch Dogs is not the first 'next-gen' game we have played on PC - there have been plenty of others that have had no such problem.

Yes, but how well can your CPU access data in your VRAM, if at all, while the GPU is accessing the same memory? It's not unified, remember.
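For a sense of scale (a rough comparison using published peak figures; actual CPU reads from VRAM across PCIe are slower still, since they're uncached):

```python
# Peak bandwidth: PS4 unified GDDR5 vs the PC CPU<->VRAM link.
PS4_UNIFIED_GB_PER_S = 176.0  # PS4 GDDR5 peak, shared by CPU and GPU
PCIE3_X16_GB_PER_S = 15.75    # what a PC CPU crosses to touch VRAM

ratio = PS4_UNIFIED_GB_PER_S / PCIE3_X16_GB_PER_S
print(f"unified pool has ~{ratio:.0f}x the bandwidth of the PCIe hop")
# ~11x
```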
 

Wag

Member
I find it hard to believe they got the PC version to run smoothly on more than a few machines. It's just amazing to me that they released it in this state. Last time I pre-order a Ubi game.
 

Blizzje

Member
So, is it true that a GPU comparable to the PS4's is actually capable of running this at 1080p on High at 30fps? Very disappointing PS4 version if so...
 

30IR

Banned
This game is a stuttering mess, both in 4K and 4K Surround.

With everything maxed out, including HBAO+ High, I get around 25 FPS in 4K Surround.

With the same settings in 4K, I get around 40 FPS w/ a LOT of stuttering.

It is almost unplayable.

This is w/ a 3970X @ 4.8GHz and 4x GTX Titan Black SCs @ 1215MHz. :rolleyes:
 