
Watch_Dogs PC Performance Thread

Why are the majority of peeps choosing temporal SMAA? Is this just herd mentality? I don't get it; even on most benchmarking sites, this is what they pick to go with their ultra settings. Can someone enlighten me as to why? Even this morning Kotaku posted their benchmark article, and this is their only AA setting...

http://kotaku.com/watch-dogs-benchmarked-how-does-your-pc-stack-up-1581917595
 
Can't you guys just cap the framerate to 30 and be happy with it... until Ubi/AMD/Nvidia releases something?

I would, but capped to 30fps the game feels like it's stuttering; we had a thread about that one or two days back. Sadly, none of the tweaks in it gave me a smooth 30fps experience.
Motion blur, maybe?
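For anyone curious why a 30fps cap can still feel juddery: most external limiters just sleep after each frame, and OS sleeps overshoot by a few milliseconds, so the timing drifts. Here's a rough Python sketch (purely illustrative, not from any actual limiter) of pacing frames against absolute deadlines instead:

```python
import time

TARGET_FPS = 30
FRAME_TIME = 1.0 / TARGET_FPS  # ~33.3 ms budget per frame

def run_capped(render_frame, num_frames):
    """Cap a render loop to TARGET_FPS against absolute deadlines.

    Sleeping toward a fixed schedule (rather than sleeping a constant
    delay after each frame) stops timing error from accumulating;
    limiters that drift produce uneven frame delivery, which reads as
    stutter even when the average rate is a steady 30fps.
    """
    start = time.perf_counter()
    next_deadline = start
    for _ in range(num_frames):
        render_frame()
        next_deadline += FRAME_TIME
        # Sleep only for whatever is left of this frame's budget.
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)
    return time.perf_counter() - start
```

Even with deadline scheduling, a cap can't fix frames that simply take longer than 33ms to render, which is likely what the in-game stutter actually is.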
 
FX-8350 at ~4.4GHz, R9 270X 2GB at 1170MHz, with the beta drivers from Guru3D. Running at 1080p with textures on High and everything else maxed out, and it seems more than happy locked at 30fps so far. I actually preferred the feel of it unlocked, running between 35 and 50fps outdoors, but the tearing was awful, and I couldn't get triple buffering going at all (I tried D3DO and RadeonPro).

No weird stutter or judder. I'm pretty happy with the performance, although I've only done the first few story missions so I guess it's possible there are huge crowd scenes or something coming up. I'm happy to drop shadows and level of detail down to high if that happens, though.

As of about ten minutes ago I switched to 2x MSAA and I really like the sharper look compared to temporal SMAA, but we'll see if the framerate holds up.

Doesn't really seem to be thrashing the computer much so far (2x MSAA):

[performance overlay screenshot]
 

mkenyon

Banned
I suspect your i7 is giving you an edge in any case; there's probably some reason they insisted on including 8 cores/hyperthreading in the recommended specs. It's just a bit hard to swallow when the CPU load doesn't even go above 80%.
There aren't any decent benchmarks with frame times for CPU performance, but the game seems to gain only 2-5% going from an i5 to an i7 at similar frequencies. That one Russian site seems to be an outlier, but it always is; something is funky about their testing methods, methinks.
 

Dennis

Banned
Why are the majority of peeps choosing temporal SMAA? Is this just herd mentality? I don't get it; even on most benchmarking sites, this is what they pick to go with their ultra settings. Can someone enlighten me as to why? Even this morning Kotaku posted their benchmark article, and this is their only AA setting...

http://kotaku.com/watch-dogs-benchmarked-how-does-your-pc-stack-up-1581917595

I know, right?

We pros use the game's 2x2 SSAA setting.
 

Pjsprojects

Member
Latest Nvidia drivers and day-one patch.

FX-8350 CPU @ 4.2GHz
GTX 580 3GB, overclocked
8GB Corsair RAM

Game fps is all over the place when driving, with settings on Ultra or Low at 1080p.
Gave up and purchased the PS4 version, which plays great.
 

Arkanius

Member
Why are the majority of peeps choosing temporal SMAA? Is this just herd mentality? I don't get it; even on most benchmarking sites, this is what they pick to go with their ultra settings. Can someone enlighten me as to why? Even this morning Kotaku posted their benchmark article, and this is their only AA setting...

http://kotaku.com/watch-dogs-benchmarked-how-does-your-pc-stack-up-1581917595

Because SMAA is awesome.
https://www.youtube.com/watch?v=75HwcS3iARY
 

TronLight

Everybody is Mikkelsexual
Why are the majority of peeps choosing temporal SMAA? Is this just herd mentality? I don't get it; even on most benchmarking sites, this is what they pick to go with their ultra settings. Can someone enlighten me as to why? Even this morning Kotaku posted their benchmark article, and this is their only AA setting...

http://kotaku.com/watch-dogs-benchmarked-how-does-your-pc-stack-up-1581917595

Because it's fast enough not to cripple the framerate like MSAA or TXAA do, it does a very good job of removing "normal" aliasing like MSAA does and temporal aliasing like TXAA does, all without being an ugly blur filter like FXAA?
 

AndyBNV

Nvidia
Why are the majority of peeps choosing temporal SMAA? Is this just herd mentality? I don't get it; even on most benchmarking sites, this is what they pick to go with their ultra settings. Can someone enlighten me as to why? Even this morning Kotaku posted their benchmark article, and this is their only AA setting...

http://kotaku.com/watch-dogs-benchmarked-how-does-your-pc-stack-up-1581917595

Because it's the best post-process option in the game, and is better than 2xMSAA. http://www.geforce.com/whats-new/gu...-performance-and-tweaking-guide#anti-aliasing

Moving up to 4xMSAA from Temporal SMAA has a massive performance impact, and I imagine many folks would rather use those frames on other graphical effects.

watch-dogs-anti-aliasing-performance-chart.png
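A back-of-the-envelope way to see why higher MSAA levels get expensive: every stored sample multiplies the render-target footprint. A rough Python sketch with assumed 32-bit color and depth formats (real drivers add compression and resolve buffers, so these are ballpark lower bounds, not the game's actual numbers):

```python
def msaa_target_mb(width, height, samples, color_bytes=4, depth_bytes=4):
    """Rough VRAM cost in MiB of one multisampled color + depth target.

    Each stored sample carries its own color and depth value, so the
    footprint scales linearly with the sample count.
    """
    pixels = width * height
    return pixels * samples * (color_bytes + depth_bytes) / 2**20

# At 1920x1080 (assumed 32-bit color and depth):
#   no MSAA -> ~16 MiB, 4x -> ~63 MiB, 8x -> ~127 MiB per target,
# and a deferred renderer keeps several such targets alive at once.
for s in (1, 4, 8):
    print(f"{s}x: {msaa_target_mb(1920, 1080, s):.0f} MiB")
```

The shading cost of running the resolve and extra geometry samples comes on top of the memory, which is why the chart shows the frame-time hit growing faster than the VRAM alone would suggest.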
 
Remember when people laughed at 3GB VRAM for GPUs?

Good times.

Which is why I was laughing at people in the PC thread saying 2GB was fine, and why I was only cautioning for more because I play modded Skyrim. Modded Skyrim teaches you the meaning of VRAM and how integral it will be in the future. In short, if you bought a 2GB card in the last year, prepare to upgrade. I'll be upgrading my 780 to the 6GB 780 Ti as soon as it's released.

That being said, I'm at a locked 60fps at 1200p on my 780. Hard to complain, but I do get weird lags every now and then. It isn't often, but it does happen.
 

Vuze

Member
Can't you guys just cap the framerate to 30 and be happy with it... until Ubi/AMD/Nvidia releases something?

*expects everyone to 100% agree*

I would, but capped to 30fps the game feels like it's stuttering; we had a thread about that one or two days back. Sadly, none of the tweaks in it gave me a smooth 30fps experience.
 

Netboi

Banned
Why are the majority of peeps choosing temporal SMAA? Is this just herd mentality? I don't get it; even on most benchmarking sites, this is what they pick to go with their ultra settings. Can someone enlighten me as to why? Even this morning Kotaku posted their benchmark article, and this is their only AA setting...

http://kotaku.com/watch-dogs-benchmarked-how-does-your-pc-stack-up-1581917595


I really don't see a huge difference between MSAA and SMAA/temporal SMAA. With TXAA there is a huge difference, but performance drops. FXAA looks more blurred. Temporal SMAA makes it look nice without the jaggies getting annoying.
 
Because it's lightweight enough not to cripple the framerate like MSAA or TXAA do, it does a very good job of removing "normal" aliasing like MSAA does and temporal aliasing like TXAA does, all without being an ugly blur filter like FXAA?

Have you guys tried TXAA 2x in this game? I found it to be no more demanding than most settings except TXAA 4x and MSAA 8x. If your GPU supports it, I've found it to be the best option so far without really any major dips in performance vs. other, inferior choices (which gave me at best a few extra frames). I don't agree with Nvidia's chart; I really don't see that sharp a dip between TXAA 2x and the lesser settings.
 
Maaaaaan, this game is a mess... Crysis 1/2/3 were demanding, but they were never this complicated to get running the way you wanted. How come High textures require 2GB of memory but my card is only using 1.5GB? 660 Ti, memory at 6700MHz (700MHz overclock), boost clock at 1246MHz (160MHz overclock). Is this normal?

My 660 Ti has 166GB/s of memory bandwidth; a stock 670 has 192GB/s.
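Bandwidth figures like these follow from effective memory clock times bus width. A quick sketch (the clocks and bus widths are the ones quoted in this thread; the formula gives ~161GB/s for the overclocked 660 Ti, a little under the 166GB/s cited above):

```python
def mem_bandwidth_gbs(effective_mhz, bus_width_bits):
    """Peak memory bandwidth in GB/s: effective data rate x bus width.

    GDDR5 is normally quoted at its effective (quad-pumped) rate, so
    the effective clock already includes the 4x multiplier.
    """
    return effective_mhz * 1e6 * (bus_width_bits / 8) / 1e9

# Stock GTX 670: 6008 MHz effective on a 256-bit bus -> ~192 GB/s
# 660 Ti at the 6700 MHz overclock above, 192-bit bus -> ~161 GB/s
stock_670 = mem_bandwidth_gbs(6008, 256)
oc_660ti = mem_bandwidth_gbs(6700, 192)
```

This is peak theoretical throughput; sustained rates in a real workload are lower, which is why two cards with similar peak numbers can still behave differently.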
 
D

Deleted member 17706

Unconfirmed Member
Well, this may be the first game I pick up on PS4 over PC.

I'm running the following:

i5 2500K @ 4.2GHz
16GB DDR3 1600MHz RAM
Radeon 7970 3GB @ 1GHz
Windows 8

And it sounds like I can't even expect a stable 30 fps at high settings.

Luckily, I'm in no rush to get Watch Dogs, so I'll probably wait a few weeks or months and see if PC performance improves with patches.
 

Dr Dogg

Member
Hmmm, yes, well, 3840x2160 with TXAA on 3GB of VRAM isn't exactly playable (i.e. 3-4fps, though good for screenshots with another AA solution, I suppose), but with no AA at all it's surprisingly not too bad, just the occasional stutter as it swaps things around. There should be a happy medium between downsampling and AA in there somewhere.
 

mkenyon

Banned
Which is why I was laughing at people in the PC thread saying 2GB was fine, and why I was only cautioning for more because I play modded Skyrim. Modded Skyrim teaches you the meaning of VRAM and how integral it will be in the future. In short, if you bought a 2GB card in the last year, prepare to upgrade. I'll be upgrading my 780 to the 6GB 780 Ti as soon as it's released.
Watch Dogs is a huge outlier, as is modded Skyrim. Guru3D have gone as far as to say that there are some major programming issues in the game that need to be patched.

You can mark my words, Ubisoft is going to release a bunch of zero-day patches, as Watch Dogs does not seem to be behaving the way it should with better-than-High image quality settings. It is very hard to pinpoint exactly what the problem with the game is, but increased frame buffers do make a big difference, indicating there's a lot going on in graphics memory. Interestingly enough, GTA, as an open-world game, has always had the very same issue and really liked huge frame buffers. But why the game needs this much of it is a bit of a riddle. Obviously a large chunk is eaten away by high-quality textures and ultra quality settings.

Well, this may be the first game I pick up on PS4 over PC.

I'm running the following:

i5 2500K @ 4.2GHz
16GB DDR3 1600MHz RAM
Radeon 7970 3GB @ 1GHz
Windows 8

And it sounds like I can't even expect a stable 30 fps at high settings.

Luckily, I'm in no rush to get Watch Dogs, so I'll probably wait a few weeks or months and see if PC performance improves with patches.
Are you being hyperbolic?

[benchmark chart]
 
So with an i7 3770K @ 4.5GHz, 8GB of 2400MHz RAM, and a stock 3GB 780, I can play on Ultra with temporal SMAA and get 45-60+fps. However, if I go to 4x MSAA it drops to 30-55, which isn't too bad to be honest. 2xTSAA is a little better but seems blurry to me.

The real issue was v-sync, or the lack of it. Adaptive didn't work; D3DOverrider didn't work. Then, while looking for ways to stop the stuttering, I found that going borderless windowed fixed it. I have D3DOverrider on and I get no tearing but 60+fps (now I remember that this is how I solved it in ACIV). The stuttering was improved too. However, there are still times when it will drop to 5fps for a second or two before going back to normal, so there is still an issue. But generally I'm pretty happy all in all. I think with a patch here and a bit of tinkering there it could get better still. And if I could get that fucking 337.88 driver to install, maybe better again.
 
Which is why I was laughing at people in the PC thread saying 2GB was fine, and why I was only cautioning for more because I play modded Skyrim. Modded Skyrim teaches you the meaning of VRAM and how integral it will be in the future. In short, if you bought a 2GB card in the last year, prepare to upgrade. I'll be upgrading my 780 to the 6GB 780 Ti as soon as it's released.

That being said, I'm at a locked 60fps at 1200p on my 780. Hard to complain, but I do get weird lags every now and then. It isn't often, but it does happen.

But Skyrim is a 32-bit program... it can only address 4GB of RAM total. That means VRAM and system memory combined. If you're pushing beyond 2GB of VRAM in Skyrim, you're probably crashing every 30-60 minutes as the game hits its overall limit.
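The 32-bit ceiling mentioned here is simple arithmetic. A tiny sketch (a simplification: drivers don't map all of VRAM into the process at once, but a large working set still eats address space, and the split shown is hypothetical):

```python
# A 32-bit process can address at most 2**32 bytes; mapped VRAM
# resources, the executable, heap, and stacks all share that space.
ADDRESS_SPACE_GIB = 2**32 / 2**30  # 4.0 GiB total

def remaining_gib(vram_mapped_gib):
    """Address space left after the driver maps video resources.

    Push much past ~2 GiB of mapped video memory and a 32-bit game
    has little headroom left before allocations start failing.
    """
    return ADDRESS_SPACE_GIB - vram_mapped_gib
```

This is also why a 64-bit executable (or, on Windows, the large-address-aware flag) matters more than raw VRAM once modded asset loads get heavy.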
 

Sinatar

Official GAF Bottom Feeder
Hmmm, yes, well, 3840x2160 with TXAA on 3GB of VRAM isn't exactly playable (i.e. 3-4fps, though good for screenshots with another AA solution, I suppose), but with no AA at all it's surprisingly not too bad, just the occasional stutter as it swaps things around. There should be a happy medium between downsampling and AA in there somewhere.

TXAA combines MSAA with a post processing filter and hits performance pretty hard. If you're downsampling, you're better off just doing SMAA.
 

arcoN

Neo Member
Hey,

Does anybody else have this motion blur bug?

[screenshot of the bug]

I'm not moving, but the motion blur acts as if I'm turning.

Rig:

3770K @ 4.2GHz
16GB GeIL 1600MHz RAM
Asus ROG Matrix Platinum 7970 3GB
Driver: AMD Catalyst 14.6 Beta

Settings:
Textures: High (stuttering on Ultra)
Res: 1920x1200
Everything else on Ultra
MSAA 4x
MHBAO

If I set all settings to low and only activate motion blur, the error is still there...

Does anybody have a solution?
 

Mononoke

Banned
Watch Dogs is a huge outlier, as is modded Skyrim. Guru3D have gone as far as to say that there are some major programming issues in the game that need to be patched.

Titanfall was also an outlier, pushing over 2.5GB at times. Damn, thanks for the heads up. Unfortunately, I'm still on 2GB 680s in SLI. Might hold off until this game gets patched, if it's pushing over that.
 
D

Deleted member 17706

Unconfirmed Member
Are you being hyperbolic?

Well, nevermind, then!

I just skimmed this thread and saw awful results being reported with similarly powerful hardware. I guess VRAM amount is hugely important this time around?
 

mkenyon

Banned
Well, nevermind, then!

I just skimmed this thread and saw awful results being reported with similarly powerful hardware. I guess VRAM amount is hugely important this time around?
Initially, but there's no reason for it outside of a lack of time to properly code it. It'll be patched.

As it is, 2GB is enough for 2XMSAA/SMAA @ 1080p.
 

Nivash

Member
There aren't any decent benchmarks with frame times for CPU performance, but the game seems to gain only 2-5% going from an i5 to an i7 at similar frequencies. That one Russian site seems to be an outlier, but it always is; something is funky about their testing methods, methinks.

That's the way it should be, right? Hyperthreading is not supposed to do that much for gaming.

And in that case, I genuinely have no idea why some people who also have 780 Tis are getting better performance than me and some other people are.
 

sirap

Member
SLI is ass in this game; I'm getting annoying slowdowns, especially when driving. Dropping down to a single Titan does the trick :/
 

VashTS

Member
Hey,

Does anybody else have this motion blur bug?

If I set all settings to low and only activate motion blur, the error is still there...

Does anybody have a solution?

This bug happens due to having MSAA active along with motion blur. You could disable/change one or the other setting and it will be fixed.

Also, I'm pretty sure the newest ATI drivers completely fix this bug too.
 
Hey,

Does anybody else have this motion blur bug?

[screenshot of the bug]

I'm not moving, but the motion blur acts as if I'm turning.

Rig:

3770K @ 4.2GHz
16GB GeIL 1600MHz RAM
Asus ROG Matrix Platinum 7970 3GB
Driver: AMD Catalyst 14.6 Beta

Settings:
Textures: High (stuttering on Ultra)
Res: 1920x1200
Everything else on Ultra
MSAA 4x
MHBAO

If I set all settings to low and only activate motion blur, the error is still there...

Does anybody have a solution?

It's because you're using MSAA. Use another AA type and it won't happen.

edit - beaten
 

lefantome

Member
What's the point of the Techspot benchmarks? They're useful only for comparing different GPUs and CPUs in a very limited scenario.

Read their methodology:
For testing, we settled at the first save point once Aiden Pearce reaches his hide out and takes a well-deserved nap. Wakened from his slumber, he surveys the area around the hideout for criminals. We started the benchmark when we opened the door to leave.
The first half of the benchmark is a rendered scene where the perimeter of the hideout is viewed from fixed cameras and then we take a 45-second walk before the test ends at 90 seconds.

A 45-second walk? This is why their average fps results are so high.
 

J.Edge

Neo Member
Hey everyone,

Was considering using Watch_Dogs as an excuse to upgrade from my 560 Ti to a 6GB 780. Given the rest of my setup below, is Ultra a pipe dream at 1080p (I'm not bothered about 30fps as long as it's stable)?

i5 2500k @ 4.4GHz
8GB DDR3 2000 MHz
Windows 8.1
 

Sajjaja

Member
This bug happens due to having MSAA active along with motion blur. You could disable/change one or the other setting and it will be fixed.

Also, I'm pretty sure the newest ATI drivers completely fix this bug too.

Any idea when these drivers will come out?
 

hitoshi

Member
Okay my Uplay is doing one of two things: Crashing or not going online. What do?

Praise Gaben.

Also remind Ubisoft that Uplay is still the worst piece of software out there, and that launching your biggest new IP since the dawn of time without a proper server farm always was, and still is, a bad idea. It's like every major publisher thinks they don't need to spend money on online infrastructure until the machines break, despite the servers being absolutely necessary to even start their biggest game to date.

Long story short:
Uplay is totally down; even the website and forums are not working.
 
I'm playing the game on Ultra (AA off due to my 1440p monitor) and I'm getting 30-55fps with some micro-stuttering when I turn the camera. Playable to me.

I have:

i7 2600k
GTX 780
16 GB RAM
1440p monitor
 

dawid

Member
Anyone getting triple buffering to work?
I don't seem to be able to force it, and the game's built-in vsync settings are all terrible. :(
 
I'm now playing on high textures with everything at ultra save for AA (FXAA) because it has always tanked my card. I've turned off the game's V-sync and have gone through RadeonPro instead. Seems like there's less stuttering now even though I'm fluctuating between 32-49fps depending on where I am.
 

Buburibon

Member
Praise Gaben.

Also remind Ubisoft that Uplay is still the worst piece of software out there, and that launching your biggest new IP since the dawn of time without a proper server farm always was, and still is, a bad idea. It's like every major publisher thinks they don't need to spend money on online infrastructure until the machines break, despite the servers being absolutely necessary to even start their biggest game to date.

Long story short:
Uplay is totally down; even the website and forums are not working.

It's as though they've underestimated the current state of the industry, and PC gamers' willingness to pay for legitimate copies of WD. That being said, I wonder how well it's sold on PC so far. A couple hundred thousand perhaps?

Anyway, I'll try and post performance-related impressions once I'm able to activate my copy of WD.
 
This game (along with Wolfenstein New Order) is seriously getting me to reconsider my gaming platform.

My computer has a good cpu (i7 3770), and 12 gigs of ram. The only thing lacking is my gtx 660 (oem 192 bit) which was kinda underwhelming me before this. I'm not going to spend $600+ on a video card. At $400 the card offerings don't seem that much better that m current card. I'm thinking wouldn't getting a ps4 be better? Like others had said, if Nvidia had a decent mid range I would go with that. I could go with an AMD card but since both of these games aren't AMD friendly, there goes that idea. Plus being PC, Uplay is there to screw things up. Ugh.
 