
Watch_Dogs PC Performance Thread

I feel like the only person on Gaf who's enjoying this.
[screenshot]

[screenshot]

Once the performance issues have been sorted it will be a happier GAF. I have no idea how this passed quality control on PC in this state. I've been lucky in that the game runs smooth as butter for me, locked at 50fps with no lag or stutter. What I am annoyed at is that creative director Jonathan Morin said a Core i7 would be a huge jump in performance over an i5 due to the extra cores being used, so I went and sold my i5 and got myself an i7. Looking at the performance charts, there is hardly any difference, so thanks for that, Ubisoft! Technical issues aside, in my opinion it's a great game and I am loving every minute of it.
 

Linius

Member
So Uplay experts, is there a way to delete save files from the cloud? I feel like I can't get rid of the bug haunting me because the game picks up info from the cloud every time.
 

Evo X

Member
Going to enable 16x anisotropic filtering, but can anyone tell me what performance hit it takes in this game?

AF has had an almost non-existent performance hit in every game of the last 15 years if you've got half-decent hardware. I don't know why it isn't standard at this point.
 

BenouKat

Banned
Just got a copy. My specs:

i5 2500k @4.5 GHz
2GB GTX 670
8GB RAM

I've settled on all High, except Water on Ultra, with in-game vsync on 1, a frame buffer of 2, and DoF and motion blur off. I picked the "weakest" AO; I haven't tried the others, but maybe they'd run fine.

I also changed my refresh rate to 50Hz and am using MSI Afterburner to limit the game to 50fps. No lie, I'd never tried it before, though I'd seen people mention doing it sometimes, but perfectly vsync'd 50Hz looks like 60. I doubt I'd be able to tell the difference if put on the spot. I had no idea.

Pretty much the same as me.

I've got:

i5 2500K (not overclocked; yes, I know, it's silly, etc.)
GTX 560 Ti 2GB
8GB RAM

The difference: I run with no vsync because it locks my game to 20, but the game runs between 25-30 FPS in the city and 45-50 in closed environments.

After all the "performance drama" I've read, I'm pretty proud that my "old buddy" 560 Ti Phantom is still kicking some ass. :) I don't have textures on Ultra or a megaton of AO, but the game looks very clean and nice.

I have a question: I enabled DoF, but I only see it occasionally, in cutscenes for example. How do you get the kind of screens higher up this post, with no UI and with DoF? Is there a photo mode or something?
 

Seanspeed

Banned
Same dev that said the stuttering was down to VRAM and that people should turn down textures (disregarding the fact that people with Titans are also getting the same issues).

I won't hold out any hope... considering it shipped in this state in the first place, and the fact that the guy who tweeted said they did a great job with the port tells me there is serious delusion going around at Ubi.
It also goes to show that this dev isn't one of the PC programmers and probably doesn't know the full story.

Give them a chance man.

Hooo eeeee!

That's a fine looking screenshot.

Well, one reason I bought it for PS4
For like the 20th time, it's unoptimized on the consoles as well (why do you think it's only 900p?). A decent PC will run this at console levels just fine. It's just that people want more than that, and it'll hopefully get there soon with a bit of patching.
 

parabolee

Member
Well, one reason I bought it for PS4... it's because Ubi is always bad at optimization.

I understand what you are saying. Especially for anyone with a medium or low end PC. But even with medium textures I can easily make the game look much nicer than the PS4 version and get 60 fps. So that is why I chose the PC version over PS4.

I'm OK with 900p but 30 fps and an inferior image quality does not cut it for me. As much as these issues are a pain, it's worth it for what is still a superior version.

Hell, it's worth it just so I can use TXAA, because DAMN it makes a HUGE difference! I'd probably choose TXAA over Ultra textures anyway, to be honest.
 

b0bbyJ03

Member
If Uplay is stuck on looking for updates, does that mean I can't play Watch Dogs?

Yes, if Uplay doesn't launch you can't play. The good news is that there's a little cog in the top right corner of the login screen with the option to launch Uplay in offline mode. Do that and you should have no issues.
 

rashbeep

Banned
I understand what you are saying. Especially for anyone with a medium or low end PC. But even with medium textures I can easily make the game look much nicer than the PS4 version and get 60 fps. So that is why I chose the PC version over PS4.

I'm OK with 900p but 30 fps and an inferior image quality does not cut it for me. As much as these issues are a pain, it's worth it for what is still a superior version.

Hell, it's worth it just so I can use TXAA, because DAMN it makes a HUGE difference! I'd probably choose TXAA over Ultra textures anyway, to be honest.

Personally I don't like TXAA. Huge performance hit and it just looks like it blurs the screen.
¯\_(ツ)_/¯
 

dr_rus

Member
Same dev that said the stuttering was down to VRAM and that people should turn down textures (disregarding the fact that people with Titans are also getting the same issues).

I seriously doubt that people with Titans are having _the same_ issue. I see almost no stuttering on a GTX 770 with 4GB at 2560x1600.
 

dr_rus

Member
Plenty of people on the Ubi forums with Titans, single and SLI, are reporting stuttering down into single digits, and people in this very thread with Titans are reporting stuttering issues too... but you know, yours works, so I guess it can't be happening then.

It sure can. But for a different reason than running out of VRAM on 2-3 GB videocards, you know.
 

Ganzor

Member
4770K and an R9 290 clocked at 1040MHz, with the game on an SSD.
At the moment I've got the game locked at 30fps, everything Ultra, 1080p with TSMAA. The framerate was way too unstable with 60Hz vsync :/ and even now I get dips below 30fps here and there.
Dunno if I'm expecting too much.
 

Wurstsemmel

Neo Member
i5-2500k (non oc)
770gtx 4GB
8GB RAM

Everything on high except water on ultra, motion blur off (I hate it).
In-game vsync with triple buffering lags like hell, so I forced vsync through Nvidia Inspector and set max buffered frames to 2.
The game runs okay at 40-60fps but stutters like hell when driving. Setting textures to the lowest possible option made it better; setting world detail to Medium as well made it playable.
TXAA 2x is easily possible and I'm using it when playing on my TV.
 

Buburibon

Member
It sure can. But for a different reason than running out of VRAM on 2-3 GB videocards, you know.

I can't speak for other Titan owners, but the stuttering I'm witnessing is very much related to texture-streaming. And it's pretty easy to test it too, because all I have to do is knock down textures to medium for a close-to-perfectly smooth experience. But you're right, these Titans aren't slowing down due to a lack of VRAM. It's definitely something else.
 

frontieruk

Member
http://www.extremetech.com/gaming/183095-watch-dogs-analyzing-the-impact-of-nvidias-gameworks-integration-and-amd-performance

Today, Ubisoft is launching its long-awaited Watch Dogs, a 3D open-world game in which the goal is to hack computer networks and take control of data nets to solve puzzles and track down story objectives instead of simply gunning down everyone you meet. Watch Dogs has been held up as a visually stunning title that takes full advantage of modern GPU capabilities — and, notably, it makes prominent use of Nvidia’s GameWorks technology. For those of you that haven’t followed the GW saga, GameWorks is a set of proprietary libraries, distributed by Nvidia, used to implement common DirectX 11 functions. Developers that use these libraries cannot share the code with AMD for optimization and, in some cases, cannot see the source code at all. The result has been a competitive landscape that has often been significantly tilted against AMD.

Over at Forbes, reviewer Jason Evangelho has tested the R9 290X against the Nvidia GTX 770 with full results in a range of configurations coming today. His preliminary results show exactly the kind of skewed pattern I’d previously been concerned about, with the GTX 770 pulling ahead of the more expensive R9 290X at lower detail levels and only slipping barely behind at the highest 1080p settings.

A quick check of the other Watch Dogs reviews popping up across the net shows a different (and more conventional) performance distribution, with the R9 290X outperforming the GTX 770 by 18-20% at High detail in 1080p. That's a bit lower than the typical margin of roughly 25% in a non-GameWorks title, but it's not ridiculously low the way we've seen in some other GameWorks games like Arkham Origins. The other thing they reveal — and something I can vouch for myself in the small amount of time I had to run my own tests — is that this game is in pretty wretched shape.

Even on a GTX 770 with "High" texture quality (recommended for GPUs with a 2GB frame buffer), the game tends to skip and stutter with unexplained performance drops. Rotate the camera quickly, and you'll see the game engine stutter as it tries to keep up. This occurs on both graphics cards, but the problem honestly seems worse on the Nvidia side of the fence. Meanwhile, in the absence of an official timedemo, reviewers were free to create their own test runs — and many, including Guru3D, complained that the game's performance was so erratic that they had to kill their attempts to test multiple cards due to high run variation.
 

Seanspeed

Banned
I know this is a different dev team, but this is the fourth AAA ubi game in a row on PC with performance issues at launch, I would say they have had a fair chance of doing quality control correctly by this point.

Also it doesn't run at console levels just fine, the stuttering still happens for people even with settings lowered and we shouldn't have to wait for patches to get a playable product.
I got the impression that most people can get it to run OK locked at 30fps; it's just when trying to push the framerate or settings that the stuttering becomes an issue?

Could be wrong on that, obviously.

With phone DOF

[screenshot]



Without phone DOF

[screenshot]
I thought you hated DoF, man? What happened?
 

Hawkian

The Cryptarch's Bane
With phone DOF

[screenshot]



Without phone DOF

[screenshot]
Is this the kind of thing that could be injected into the game GeDoSaTo-style with a custom solution? Looks so very nice.

I have a couple questions for the knowledgeable:

What are the pros/cons (or features, I guess) of:
Temporal SMAA vs TXAA
NVIDIA: "On" vs. "Adaptive" V-sync


I've been using Temporal SMAA so far... I haven't had a lot of experience with games offering it, but I think it looks pretty excellent.
 

Jado

Banned
Poor reading on my part, this makes more sense now, thanks.



Screw the in-game v-sync.

I think people who want less tearing are running it in a borderless window.

Some clarification on this because I'm not sure -- is it better in all/most cases to ignore a game's v-sync setting and leave it off, and instead use the v-sync setting in the AMD/Nvidia video driver software?
 

cripterion

Member
Personally I don't like TXAA. Huge performance hit and it just looks like it blurs the screen.
¯\_(ツ)_/¯

I don't know why, but TXAA is the least taxing AA option for me in this game.
Changing to FXAA I get lots of stuttering while driving.

TXAA doesn't look half as bad as people make it out to be, but I understand some people prefer a "cleaner" look to their games. To me everything looks "smooth" with it.
 

Hawkian

The Cryptarch's Bane
Some clarification on this because I'm not sure -- is it better in all/most cases to ignore a game's v-sync setting and leave it off, and instead use the v-sync setting in the AMD/Nvidia video driver software?
Not necessarily in all cases, but DEFINITELY in this case. The game's vsync option introduces a virtual ton of input lag.
 

Skyzard

Banned
Those with stuttering issues: have you tried lowering the resolution and then editing the gamerprofile.xml file (C:\Users\User\Documents\My Games\Watchdogs\someweirdfoldername) to set SuperSampling="1", to possibly make it look better?
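If you'd rather script the change than hand-edit the file, here's a minimal Python sketch. Fair warning on the assumptions: the SuperSampling attribute is the one mentioned above, but the <RenderProfile> element name and the per-account folder layout are my guesses, so check your own file and back it up first.

Code:
# Minimal sketch of the gamerprofile.xml supersampling tweak.
# Assumption: the render settings live as attributes on a
# <RenderProfile> element -- check your own file, names may differ.
import xml.etree.ElementTree as ET
from pathlib import Path

# The folder under Watchdogs is a per-account ID, so glob for it.
profile = next(Path.home().glob(
    "Documents/My Games/Watchdogs/*/gamerprofile.xml"))

tree = ET.parse(profile)
render = tree.getroot().find(".//RenderProfile")  # assumed element name
if render is not None:
    render.set("SuperSampling", "1")
    tree.write(profile, encoding="utf-8", xml_declaration=True)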

Some clarification on this because I'm not sure -- is it better in all/most cases to ignore a game's v-sync setting and leave it off, and instead use the v-sync setting in the AMD/Nvidia video driver software?

Not as far as I know, but with Watch Dogs specifically I'm finding the in-game v-sync to be awful for performance; I've read quite a few posts saying the same thing too.
 

lmbotiva

Junior Member
I seriously doubt that people with Titans are having _the same_ issue. I see almost no stuttering on a GTX 770 with 4GB at 2560x1600.
Then you're one of the lucky few. I have an MSI 770 4GB and had to put the textures on Medium to get a stutter-free locked 60 frames in the game.
 

riflen

Member
How did you hide the HUD and get into first-person view?

You can hide the hud by editing your GamerProfile.xml file. He's not in first-person, he probably just moved the character close to an object and rotated the camera until the character was out of shot.

Is this the kind of thing that could be injected into the game GeDoSaTo-style with a custom solution? Looks so very nice.

I have a couple questions for the knowledgeable:

What are the pros/cons (or features, I guess) of:
Temporal SMAA vs TXAA
NVIDIA: "On" vs. "Adaptive" V-sync


I've been using Temporal SMAA so far... I haven't had a lot of experience with games offering it, but I think it looks pretty excellent.

Read the guide on geforce.com for Watch_Dogs.
If you have a 600 series or better Nvidia card, TXAA has hardware support, meaning better performance than certain other methods. TXAA is based on MSAA with a temporal aspect that targets the crawling or shimmering that you get in games during motion. So, 2xTXAA is 2xMSAA with Nvidia's temporal component. 4xTXAA is based on 4xMSAA.

Unlike MSAA, SMAA is a post-process AA solution, which means it operates on the final completed image, rather than on certain elements that comprise the image as it's constructed by the GPU. This can make it less effective, but means that it's simpler to add support for. It's really rather good performance-wise and quality-wise, and definitely superior to FXAA, which is another post-process solution. Temporal SMAA adds a temporal component that's again designed to target artefacts that appear where motion occurs (crawling etc). It's not as effective as TXAA and does not have hardware support, but it's less expensive performance-wise.
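To give a rough idea of what that temporal component does, here's a toy Python sketch. It's my own simplification, not either vendor's actual implementation; real versions also reproject the history buffer with motion vectors and clamp it to limit ghosting.

Code:
# Toy illustration of a temporal AA resolve: blend each new frame
# with an accumulated history buffer so crawling/shimmering edges
# settle down over a few frames.
def temporal_resolve(current, history, alpha=0.1):
    # alpha is an assumed blend weight: lower = smoother, more ghosting
    return [(1.0 - alpha) * h + alpha * c for c, h in zip(current, history)]

# An edge pixel that strobes between 0 and 1 every frame converges
# toward a stable grey instead of flickering:
history = [0.0]
for frame in range(60):
    history = temporal_resolve([float(frame % 2)], history)
print(history)  # about 0.5: settled near grey instead of strobing 0/1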

Adaptive Vsync will disable synchronisation when your frame rate drops below the requisite rate (60fps for a 60Hz display, for example). This will cause tearing of the image, but will not force you down to the next valid sync rate (30fps) as double-buffered Vsync will do. Adaptive is designed to create a more consistent experience by syncing when ideal (60fps) and releasing sync when the system needs to drop frames. Sync is re-enabled when 60fps is achieved once more.
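Put another way, adaptive mode is just a per-frame decision. A toy sketch of the logic, not Nvidia's actual driver code:

Code:
# Toy sketch of the adaptive vsync decision described above.
REFRESH_HZ = 60.0

def present_mode(frame_time_s):
    # Frame finished within one refresh interval: wait for vblank,
    # no tearing and no forced drop to 30fps.
    if frame_time_s <= 1.0 / REFRESH_HZ:
        return "synced to vblank"
    # Fell behind: present immediately. May tear, but avoids the hard
    # halving to 30fps that double-buffered vsync would force.
    return "immediate (tearing possible)"

print(present_mode(1 / 72))  # fast frame -> synced to vblank
print(present_mode(1 / 48))  # slow frame -> immediate (tearing possible)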
 

nbthedude

Member
Man I love playing "looking for Uplay updates"

Fuck this utter bullshit.

I literally can't play on my TV because this bullshit stops Steam Big Picture mode from properly loading the game.
I have literally spent an hour trying to boot on my TV.
 

Hawkian

The Cryptarch's Bane
You can hide the hud by editing your GamerProfile.xml file. He's not in first-person, he probably just moved the character close to an object and rotated the camera until the character was out of shot.
I figured he was in a parked car in first person.

For new page, if anybody can school me:
What are the pros/cons (or features, I guess) of:
Temporal SMAA vs TXAA
NVIDIA: "On" vs. "Adaptive" V-sync
 

Denton

Member
Man I love playing "looking for Uplay updates"

Fuck this utter bullshit.

I literally can't play on my TV because this bullshit stops Steam Big Picture mode from properly loading the game.
I have literally spent an hour trying to boot on my TV.

I play on my TV all the time, though I'm not using Big Picture; it's not necessary thanks to my Logitech K400.
 

Hawkian

The Cryptarch's Bane
Read the guide on geforce.com for Watch_Dogs.
If you have a 600 series or better Nvidia card, TXAA has hardware support, meaning better performance than certain other methods. TXAA is based on MSAA with a temporal aspect that targets the crawling or shimmering that you get in games during motion. So, 2xTXAA is 2xMSAA with Nvidia's temporal component. 4xTXAA is based on 4xMSAA.

Unlike MSAA, SMAA is a post-process AA solution, which means it operates on the final completed image, rather than on certain elements that comprise the image as it's constructed by the GPU. This can make it less effective, but means that it's simpler to add support for. It's really rather good performance-wise and quality-wise, and definitely superior to FXAA, which is another post-process solution. Temporal SMAA adds a temporal component that's again designed to target artefacts that appear where motion occurs (crawling etc). It's not as effective as TXAA and does not have hardware support, but it's less expensive performance-wise.
Thanks.

I'm honestly surprised Temporal SMAA is a post-process effect. It looks light years better than FXAA to me.

I'm happy to stick with it for the performance savings over TXAA, but I'm glad to have the option for other games.
 

Denton

Member
Can anyone explain to me how exactly this borderless window thing works, technically?

How come the picture is completely synced, there is not a single tear, EVER, and yet the framerate is completely unlocked and easily goes over 60? Huh?
I'm playing on a Panny plasma, btw.
 

Hawkian

The Cryptarch's Bane
Can anyone explain to me how exactly this borderless window thing works, technically?

How come the picture is completely synced, there is not a single tear, EVER, and yet the framerate is completely unlocked and easily goes over 60? Huh?
I'm playing on a Panny plasma, btw.
It's just a window. Like playing any game in windowed mode on PC, your monitor is operating at its native resolution and refresh rate and the game is just occupying space on it. The borderless feature thing that makes it "feel" fullscreen when you render at your monitor resolution is just an illusion (try changing the resolution in borderless windowed mode to see what I mean).

The advantages you have described hold true for any game; however, it's usually balanced out by average performance being considerably worse in windowed mode than fullscreen. I lose 10-15 fps.
Drop the res and turn on supersampling through config files, guys.
What does this achieve exactly? Do you use less VRAM this way? And what do you set the res to in the file to sample back up to 1080p?
 

Buburibon

Member
It's just a window. Like playing any game in windowed mode on PC, your monitor is operating at its native resolution and refresh rate and the game is just occupying space on it. The borderless feature thing that makes it "feel" fullscreen when you render at your monitor resolution is just an illusion (try changing the resolution in borderless windowed mode to see what I mean).

The advantages you have described hold true for any game; however, it's usually balanced out by average performance being considerably worse in windowed mode than fullscreen. I lose 10-15 fps.

What's that in percentage points if you don't mind me asking? I have never noticed a drop in performance because of borderless window gaming.
 

Nokterian

Member
It also goes to show that this dev isn't one of the PC programmers and probably doesn't know the full story.

Give them a chance man.


Hooo eeeee!

That's a fine looking screenshot.


For like the 20th time, its unoptimized on the consoles as well(why do you think its only 900p?). A decent PC will run this at console levels just fine. Its just people want more than that, and it'll hopefully get there soon with a bit of patching.

And yet again it's more playable to me. It shows once again how bad Ubisoft is at PC optimization with every release. I love PC gaming, always have, but this is one of those cases that says get it on console just to play it without any hassle.
 

Hawkian

The Cryptarch's Bane
What's that in percentage points if you don't mind me asking? I have never noticed a drop in performance because of borderless window gaming.
Performance drops from running in a window vary a lot from game to game. In Watch_Dogs I'm talking roughly, I guess, a 15% reduction in average fps.

If you were averaging above 60 fps in fullscreen it's totally possible for it to feel like there is no performance loss going to windowed.
 

riflen

Member
Can anyone explain to me how exactly this borderless window thing works, technically?

How come the picture is completely synced, there is not a single tear, EVER, and yet the framerate is completely unlocked and easily goes over 60? Huh?
I'm playing on a Panny plasma, btw.

I'm just guessing at the cause, but it could be explained by the point in the render process at which the software you're using takes its frame-rate reading.
For example, in a true triple buffering design, the GPU is permitted to render as fast as it can and if you add Vsync, the output will be synchronised to the display. The GPU renders to two back buffers, either of which can be flipped to the front buffer for display. If a completed frame is not needed by the display, then it's dropped from one of the back buffers. This way the display gets its 60Hz update and the GPU can churn out frames as fast as possible.

If windowed borderless mode uses a technique like this, and if your frame-rate counter is taking its reading from the back buffer, then it'll display >60fps at times while the output is actually synced to 60Hz. For this theory to be correct, it would mean the Aero desktop display uses true triple buffering + vsync, so who knows? :)
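If it helps, here's a toy Python simulation of that idea. Pure illustration under the assumption of a GPU pacing at 90fps against a 60Hz display; it's not how the desktop compositor is actually written.

Code:
# Toy model of true triple buffering + vsync: the GPU finishes frames
# as fast as it can, the display takes only the newest finished frame
# at each 60Hz vblank, and stale frames are silently dropped.
GPU_FPS, REFRESH_HZ = 90.0, 60.0

def simulate(seconds=1.0):
    t_render = t_vblank = 0.0
    rendered = displayed = 0
    newest = None
    while min(t_render, t_vblank) < seconds:
        if t_render <= t_vblank:
            rendered += 1            # GPU completes a back-buffer frame
            newest = rendered
            t_render += 1.0 / GPU_FPS
        else:
            if newest is not None:
                displayed += 1       # flip newest completed frame to front
            t_vblank += 1.0 / REFRESH_HZ
    # A counter hooked to render completions reads ~90fps even though
    # only ~60 tear-free frames per second ever reach the screen.
    print("rendered:", rendered, "displayed:", displayed)

simulate()  # roughly 90 rendered vs 60 displayed per second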
 