Watch_Dogs PC performance thread [Read post #1215 before posting]


Diablos

Member
Here we go.
I'm running the game at 30-50fps on a Core i7-3820 @ 4.2GHz, 32GB of DDR3-1866MHz RAM and a GTX 680 4GB. OS: Windows 7 64-bit. All settings maxed out, 1920x1080. However, I'm only using temporal SMAA; my computer can't handle proper anti-aliasing.

The image shows the game's not-so-great daytime lighting, but it's a worst-case scenario. It looks pretty damn good at night. There are a few ugly textures (the grass, for example), but overall it looks decent and plays decently.

http://i4.minus.com/ibqZcxmnYANN9j.png
Wait, this is max settings on an i7 rig and you are getting <60fps for such washed out shite (at 1080p)? Are you kidding? The engine is screaming for optimization.
 

Tiberius

Member
Wait, this is max settings on an i7 rig and you are getting <60fps for such washed out shite (at 1080p)? Are you kidding? The engine is screaming for optimization.

Remember when there were reports from journalists telling us that the differences between PS4 and PC were minimal?
 

GavinUK86

Member
Have a quick question about limiting the frame rate in Watch Dogs, or any game for that matter... I noticed people usually post about how they limit the frame rate with 3rd-party apps (I saw RivaTuner mentioned, for example); just wondering why people don't just use the Nvidia control panel. I usually use that for adaptive V-sync and it works great. I'm assuming there's a setting for 30 fps as well, no?

Yeah, you can halve the frame rate, but it never works great, for me at least. I usually just use Dxtory.
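For anyone curious what these limiters actually do under the hood: conceptually, a frame cap just measures how long each frame took and sleeps away the rest of a fixed time budget. Below is a minimal Python sketch of the idea only; it's an illustration, not how RTSS, Dxtory or the driver actually implement it (those hook the game's present call and use higher-precision timing).

Code:
import time

TARGET_FPS = 30
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~33.3 ms per frame at a 30 fps cap

def render_frame():
    # Stand-in for the game's real per-frame work; just burn a few milliseconds.
    time.sleep(0.005)

next_deadline = time.perf_counter() + FRAME_BUDGET
for _ in range(90):  # simulate ~3 seconds of capped output
    render_frame()
    # Sleep off whatever is left of this frame's budget, so frames come out at an
    # even cadence instead of as fast as the hardware allows.
    remaining = next_deadline - time.perf_counter()
    if remaining > 0:
        time.sleep(remaining)
    else:
        # Missed the budget (a slow frame); re-anchor instead of bursting to catch up.
        next_deadline = time.perf_counter()
    next_deadline += FRAME_BUDGET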
 

b0bbyJ03

Member
Yeah, you can halve the frame rate, but it never works great, for me at least. I usually just use Dxtory.

Thanks for the answer... I'm still trying to decide how I'm going to play this game... I have a 780 and a 3770K. I'm new to PC gaming, but the idea that I have full control over what I want to do is very exciting... not sure if I want to lower the resolution and shoot for everything at ultra 60fps (has anyone tried this? maybe match the PS4 res and see what you get) or go for 1080p and lock it at 30 fps... one thing I do know is that variable frame rates drive me crazy, so I think I'll try and stick to a locked frame rate.
 
Thanks for the answer... I'm still trying to decide how I'm going to play this game... I have a 780 and a 3770K. I'm new to PC gaming, but the idea that I have full control over what I want to do is very exciting... not sure if I want to lower the resolution and shoot for everything at ultra 60fps (has anyone tried this? maybe match the PS4 res and see what you get) or go for 1080p and lock it at 30 fps... one thing I do know is that variable frame rates drive me crazy, so I think I'll try and stick to a locked frame rate.

If you're like me and you want Ultra, I'd say go Ultra, go TXAA x4 or whichever anti-aliasing method you like at maximum, and lock it to 30. The game is still unoptimized as hell; it should NOT need beefy parts for the image quality it produces. The only impressive thing about it is the water. And on the exact opposite end are the textures, which are a joke at "Ultra" levels.
 

Smokey

Member
Are people having better luck with those drivers than me?

Here's what a guy at OCN posted in their WD thread:

Well, I got SLI to work; apparently the solution is running beyond 1080p. There must be some kind of bottleneck, because at 1080p SLI results in worse performance, but at 1440p it's double the FPS.

I then put 3-4 hours into the game at 1440p and 2x MSAA, driving around fast and free roaming. I even put an hour into the spider tank mission and was above 60FPS almost the entire time. Overall, I average ~60FPS, with dips into the 50s. Sometimes I get the same chops I get at 1080p and drop to low FPS for a couple of frames. I really don't think it's caused by VRAM, though; otherwise 1440p simply wouldn't be playable. The graphics are insane in my eyes, and IMO the best we've ever seen in an open-world game by a long shot.

I further tested whether it was VRAM by going up to 1440p 4x MSAA, and the chops remained the same, but average FPS was about 45. Just to test how far I could actually go at 1440p before hitting 3GB of non-cached VRAM... the answer was 8x MSAA. That's where my average FPS goes from about 45 to about 10, and it almost starts locking up. This is what I expect when hitting the VRAM cap, not the chops many people are experiencing. I feel like drivers or patches will fix that.

By the way, the police AI is nuts in this game. It's really, really good... probably the best I've ever seen. A single cop can be a challenge to get away from at times.


But you're playing at 2560x1600 and still having issues... so I'm not too sure. I'm sure Nvidia will release an official driver for the US in the next few days, though. He has 2x 780 Tis.
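His "slower at 1080p, double at 1440p" result fits the usual CPU-bottleneck explanation. As a rough back-of-envelope model (purely illustrative numbers, assuming alternate-frame rendering and ignoring SLI overhead), the frame rate is bounded by whichever is slower: the CPU preparing each frame, or each GPU's share of the rendering.

Code:
# Rough AFR model: the CPU still has to prepare every frame, but each GPU only has
# to render every n-th frame. Numbers are illustrative guesses, not measurements.

def estimated_fps(cpu_ms, gpu_ms, n_gpus=1):
    effective_gpu_ms = gpu_ms / n_gpus
    return 1000.0 / max(cpu_ms, effective_gpu_ms)

# Hypothetical 1080p case: CPU-bound, so a second GPU adds nothing
# (and in practice SLI overhead can make it slightly worse).
print(estimated_fps(cpu_ms=12, gpu_ms=10, n_gpus=1))  # ~83 fps
print(estimated_fps(cpu_ms=12, gpu_ms=10, n_gpus=2))  # still ~83 fps

# Hypothetical 1440p case: GPU-bound, so the second GPU roughly doubles the frame rate.
print(estimated_fps(cpu_ms=12, gpu_ms=28, n_gpus=1))  # ~36 fps
print(estimated_fps(cpu_ms=12, gpu_ms=28, n_gpus=2))  # ~71 fps, until the CPU becomes the wall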
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
I'd be interested to hear more comments on the "i5 2500K is dead" issue. As an owner of a 2500K, this is kind of upsetting. So, more opinions please?

Going to buy this if it runs well, thanks for the demo Ubi... ugh :(

i5 2500K CPU
4GB RAM
6950 2GB GPU

trying to run at 1080

What do you reckon? Not fussed with 30fps, not fussed with lower settings.


Okay, okay, all my fellow 2500K owners.
I finally did a few tests and can pretty much confirm the 2500K still lives. CPU usage was pretty high, but nothing I'm not used to.
Running at 4.5GHz, even a GTX 570 can enjoy this game decently.

A 670 at 1080p shouldn't really be suffering; I don't see how people at 1080p are failing to achieve high playability with OC'd i7s.

Everyone with a 2500K: just OC it to 4.5GHz or beyond and we're more than sweet, depending on the graphics card.
Matched to a 2500K, a GTX 570 can play this game on high at 30-50fps.
A GTX 670 can have a couple of settings higher and enjoy the same or higher frame rates.

If you want a locked 60 on a 570 you will have to sacrifice some features... if you don't mind playing at 30, then I'd say a 2500K is much more than sufficient.

I can't really go too hectic on a graphics card comparison as I only have a 570, a 580 and a 670 at my disposal, but I will call over a couple of friends with other "common" cards if people want more tests.

http://i.imgur.com/n1APHbC.jpg

The 2500K lives!
 
I'm using beta driver 337.50; I've not tested with other drivers, too lazy.

My rig with a Titan and an i7 4770K @ 4.2GHz can achieve 60fps on ultra settings without AA or AO, with V-sync, at 1920x1080, but it's not constant: on foot the frame rate dips to the low 50s, there's an area early in the game with a crowd where the frame rate dipped into the 40s, and while driving it can go as low as 30 in some areas.

Setting textures to high and everything else to ultra improves the frame rate; I still get dips below 60, but it isn't as bad.

Best results for me so far: textures on high, level of detail on high, and everything else on medium with AA and AO off.

It's not a constant 60fps throughout, but it's the closest I think I can get without sacrificing too much visual quality.

Why is it that 30fps on PC, even on ultra, looks so bad but looks OK on consoles?

Have you tried Riva Tuner?

The fact that a Titan has such trouble with the game is a joke, though. I really hope Ubisoft will patch the performance, but who am I kidding...

Have a quick question about limiting the frame rate in Watch Dogs, or any game for that matter... I noticed people usually post about how they limit the frame rate with 3rd-party apps (I saw RivaTuner mentioned, for example); just wondering why people don't just use the Nvidia control panel. I usually use that for adaptive V-sync and it works great. I'm assuming there's a setting for 30 fps as well, no?

Using the Nvidia control panel usually introduces a lot of input lag on my rig. MSI Afterburner + RivaTuner is just perfect. I have a Gigabyte card, but I'll never go back to their crappy software.
 

dark10x

Digital Foundry pixel pusher
Well, seems to be just what I expected.

Not great.

That said, it seems to me that simply locking the frame-rate to 30 fps would be optimal for most people here.
 

velociraptor

Junior Member
Why is it that 30fps on PC, even on ultra, looks so bad but looks OK on consoles?
This is something I have noticed myself.

Games look and feel "fine" when playing on consoles despite running at 30FPS, yet on PC they look pretty horrendous. Dark Souls I was the only game that didn't feel bad at 30FPS, though perhaps that was because the entire game was built and designed around 30FPS.
 
Man, the game is beautiful and seems to be running at 1080p at 40-50 FPS (with some slowdowns in intense car chases) at high-ultra settings, V-sync on.

Got an i5 4770K, a 7950, and 8GB of RAM. Only got FXAA on though, as performance takes a massive hit if you go with fancier AA.
 

wowzors

Member
Okay, okay, all my fellow 2500K owners.
I finally did a few tests and can pretty much confirm the 2500K still lives. CPU usage was pretty high, but nothing I'm not used to.
Running at 4.5GHz, even a GTX 570 can enjoy this game decently.

A 670 at 1080p shouldn't really be suffering; I don't see how people at 1080p are failing to achieve high playability with OC'd i7s.

Everyone with a 2500K: just OC it to 4.5GHz or beyond and we're more than sweet, depending on the graphics card.
Matched to a 2500K, a GTX 570 can play this game on high at 30-50fps.
A GTX 670 can have a couple of settings higher and enjoy the same or higher frame rates.

If you want a locked 60 on a 570 you will have to sacrifice some features... if you don't mind playing at 30, then I'd say a 2500K is much more than sufficient.

I can't really go too hectic on a graphics card comparison as I only have a 570, a 580 and a 670 at my disposal, but I will call over a couple of friends with other "common" cards if people want more tests.
The 2500K lives!


Sounds good, this is encouraging news for the 4GB 770 I have right now. My 2500K is at 4.5GHz.
 
Okay, okay, all my fellow 2500K owners.
I finally did a few tests and can pretty much confirm the 2500K still lives. CPU usage was pretty high, but nothing I'm not used to.
Running at 4.5GHz, even a GTX 570 can enjoy this game decently.

A 670 at 1080p shouldn't really be suffering; I don't see how people at 1080p are failing to achieve high playability with OC'd i7s.

Everyone with a 2500K: just OC it to 4.5GHz or beyond and we're more than sweet, depending on the graphics card.
Matched to a 2500K, a GTX 570 can play this game on high at 30-50fps.
A GTX 670 can have a couple of settings higher and enjoy the same or higher frame rates.

If you want a locked 60 on a 570 you will have to sacrifice some features... if you don't mind playing at 30, then I'd say a 2500K is much more than sufficient.

I can't really go too hectic on a graphics card comparison as I only have a 570, a 580 and a 670 at my disposal, but I will call over a couple of friends with other "common" cards if people want more tests.

http://i.imgur.com/n1APHbC.jpg
The 2500K lives!

The question is whether it's the CPU holding you back or the GTX 670.
 

DSN2K

Member
You do have to wonder why this game is hitting i5s and i7s so hard... seems like Ubi's engines are not optimised well at all. Assassin's Creed showed that as well.

Visuals don't match the performance; there are better-looking games out there doing more with fewer resources.
 
From my latest experiment, the two things that really cripple the framerate are "Water" and "Shadows"; those are the only two settings that give me horrible drops into the 10s when everything else is set to High/Ultra. If I put them both on Medium, I'm fine. Then I guess it's just up to everyone's available power to scale up the rest.

I can run it locked and stable like that on a 670, but every time I put Shadows or Water on High, everything falls apart. I tried Ultra textures but I'm obviously limited by the 2GB of VRAM.

I'm suspecting "Water" does some funky far-away tessellation or something, and shadows have always been a problem when set too high in this kind of game.

My settings

Detail : Ultra
Shadows : Medium
Reflection : Ultra
AO : HBAO+ High
Water : Medium
Shader : High
Texture : High
 

Detective

Member
Guys, I just bought a gaming PC and need your input on its specs.

4th-gen i7, GTX 860 with 2GB GDDR, 12GB memory.

What settings can I play at with these specs?
 

DSN2K

Member
Guys, I just bought a gaming PC and need your input on its specs.

4th-gen i7, GTX 860 with 2GB GDDR, 12GB memory.

What settings can I play at with these specs?

I presume it's a laptop with the 860M; that 2GB is going to hurt you, going by what people have said about VRAM usage.
 
There is a massive difference between Low and Ultra. When I played both, Low not only looked like crap, but pop-in became more common.

Yeah, but the specific setting for that is "Detail", which is basically how aggressive your LOD will be, and also the geometry in general (low/high poly).
 

Xyber

Member
I'm guessing there's no way to force better anisotropic filtering? I tried with the Nvidia control panel, but that didn't do anything.

The game could really use a bump in that department.
 
What's worrying is this trend of good machines capping the framerate at 30fps.

This is bad. I mean, we're not on consoles, right?

Personally I'm on a locked 60fps, as I reported, but with a 3770K at 4.6GHz and an OC'd 780 Ti.

But it's insane that machines a little below mine are forced to run it capped at such a low fps. Such a shame. Ubi and their wonderful optimizations; they never learn.
 

Donrule01

Banned
AMD FX-8350 @ 4GHz here with a GTX 770. I have dual 21-inch monitors running at 1600x900, so I should be safe. My screen res always lets me add a few extra bells and whistles, lol.
 

Havel

Member
It is on Nvidia's website, but only the international version. There is also a Chinese version on Nvidia's website.

Considering neither of the two SLI profiles added in 337.81 works (Watch Dogs and Wolfenstein), I'm guessing this driver was a quick update for the 4K issues people have been having.

"We posted an international driver that is not a wide driver release, but it does contain the fix for the monitor display blank screen issue."
 
Considering neither of the two SLI profiles added in 337.81 works (Watch Dogs and Wolfenstein), I'm guessing this driver was a quick update for the 4K issues people have been having.

"We posted an international driver that is not a wide driver release, but it does contain the fix for the monitor display blank screen issue."

That driver was 337.61

From what I've read, SLI does seem to work, at least on the 7XX series with the 337.81 drivers.
 
Okay, okay, all my fellow 2500K owners.
I finally did a few tests and can pretty much confirm the 2500K still lives. CPU usage was pretty high, but nothing I'm not used to.
Running at 4.5GHz, even a GTX 570 can enjoy this game decently.

A 670 at 1080p shouldn't really be suffering; I don't see how people at 1080p are failing to achieve high playability with OC'd i7s.

Everyone with a 2500K: just OC it to 4.5GHz or beyond and we're more than sweet, depending on the graphics card.
Matched to a 2500K, a GTX 570 can play this game on high at 30-50fps.
A GTX 670 can have a couple of settings higher and enjoy the same or higher frame rates.

If you want a locked 60 on a 570 you will have to sacrifice some features... if you don't mind playing at 30, then I'd say a 2500K is much more than sufficient.

I can't really go too hectic on a graphics card comparison as I only have a 570, a 580 and a 670 at my disposal, but I will call over a couple of friends with other "common" cards if people want more tests.

http://i.imgur.com/n1APHbC.jpg

The 2500K lives!

This is good to know. I'll try to get my 2500K OC'd to the 4.3-4.5GHz range tonight, and with my 770 it should hopefully run decently.

We just had Wolfenstein release, so what's the next big PC/cross-platform release? It should be interesting to see whether any other upcoming titles require as much power, or whether this is just very bad optimization by Ubisoft.
 

Herne

Member
I don't see what all the hubbub is about the 2500K, or why Black_Stride's post was necessary. My non-OC'd 2500 is giving me great performance.
 
That driver was 337.61

FRAPS benchmark results from Watch Dogs:

SLI disabled -

2014-05-25 12:34:31 - watch_dogs
Frames: 3211 - Time: 79891ms - Avg: 40.192 - Min: 22 - Max: 52


SLI enabled -

2014-05-25 12:37:50 - watch_dogs
Frames: 4638 - Time: 74719ms - Avg: 62.073 - Min: 15 - Max: 88

seems to be working on 337.81 w/ GTX 780s

Oh nice. What res you running at?
 
I don't see what all the hubbub is about the 2500K, or why Black_Stride's post was necessary. My non-OC'd 2500 is giving me great performance.

When the specs were announced, they talked about a Passmark score of 9000. The 2500K, unless overclocked pretty well, did not hit that.

That caused a lot of commotion and got a lot of people worried.
 

GHG

Gold Member
That driver was 337.61

FRAPS benchmark results from Watch Dogs:

SLI disabled -

2014-05-25 12:34:31 - watch_dogs
Frames: 3211 - Time: 79891ms - Avg: 40.192 - Min: 22 - Max: 52


SLI enabled -

2014-05-25 12:37:50 - watch_dogs
Frames: 4638 - Time: 74719ms - Avg: 62.073 - Min: 15 - Max: 88

seems to be working on 337.81 w/ GTX 780s

This is what I was hoping to see. Hopefully SLI gets optimised further with the next driver release.
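For anyone double-checking those FRAPS summaries: the Avg column is just Frames divided by Time, and the quoted numbers hold up.

Code:
# Sanity check of the FRAPS summary lines quoted above: Avg = Frames / (Time in seconds).

def average_fps(frames, time_ms):
    return frames / (time_ms / 1000.0)

print(round(average_fps(3211, 79891), 3))  # 40.192 -> the SLI-disabled run
print(round(average_fps(4638, 74719), 3))  # 62.073 -> the SLI-enabled run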
 
May I ask what CPU you have?

i7 3770K @ 4.4GHz

The CPU that was running the benchmarks was a 4770K, I think... but I can't imagine it being much, if any, faster than a 3770K.

They weren't my results. I should have clarified when posting; I just saw some evidence that SLI does work with the 337.81 drivers and thought I'd share it.
 

Herne

Member
When the specs were announced, they talked about a Passmark score of 9000. The 2500K, unless overclocked pretty well, did not hit that.

That caused a lot of commotion and got a lot of people worried.

Boo to Ubisoft for referring to Passmark. People were told in that thread to ignore Passmark results, that they're useless for measuring real-world game performance, but I guess people understand a numbers game better than explanations like "it's more complicated than that".
 