Watch_Dogs PC performance thread [Read post #1215 before posting]

Would really appreciate someone running the game on a similar setup to mine (GTX 680 4GB, i7 3770K, 8GB RAM) posting how the game runs on their system.

I'd usually be confident getting the game for PC over holding out until I buy an Xbox One/PS4. However, with Ubisoft's track record on PC, I'm not so sure.
 

TSM

Member
Getting about 14-20 fps at 4K maxed, dropping below/above that at times.
Dual 680's
16GB DDR3 2133
3930K at 4.6

Also, anyone else running this at 4K: can you tell me what settings you're at (along with VRAM) and how much system RAM is being used?

WD seems to have a lot of "ghost" RAM (I'd assume this is system video memory, but Process Explorer shows nothing). Killing the process frees up about 11GB of RAM :<

It doesn't do this at 1080p. Kind of hard to believe you'd need that much RAM (VRAM) for 4K.


The game is using more than 3GB of RAM at 1080p on some people's computers:

http://www.neogaf.com/forum/showthread.php?p=113227294#post113227294
 

HRose

Banned
WD seems to have a lot of "ghost" RAM (I'd assume this is system video memory, but Process Explorer shows nothing). Killing the process frees up about 11GB of RAM :<

If this happens it's usually a sign of a memory leak.

Memory leaks could also explain high CPU usage (and are also very hard to patch if the engine wasn't built around a decent memory manager).
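The failure mode HRose describes can be sketched in a few lines. This is purely illustrative (nothing here is from Watch_Dogs' actual engine; the streamer class and sizes are made up): a resource cache whose key accidentally includes the frame number never reuses or frees old entries, so resident memory grows every frame until the process dies.

```python
# Purely illustrative -- not Watch_Dogs code. A resource streamer
# whose cache key accidentally includes the frame number never
# reuses (or frees) old entries, so memory grows every frame.
# That is the pattern behind "killing the process frees ~11GB".

class LeakyStreamer:
    def __init__(self):
        self._cache = {}  # (texture_id, frame) -> fake texture bytes

    def fetch(self, texture_id, frame):
        # Bug: keying on the frame makes every frame allocate a
        # fresh copy that is never evicted.
        key = (texture_id, frame)
        if key not in self._cache:
            self._cache[key] = bytearray(1024)  # stand-in for a texture
        return self._cache[key]

    @property
    def resident_bytes(self):
        return sum(len(b) for b in self._cache.values())

streamer = LeakyStreamer()
for frame in range(100):
    for tex in range(10):
        streamer.fetch(tex, frame)

# 10 textures "should" cost 10 KiB; the leak makes it 100x that.
print(streamer.resident_bytes)
```

The fix, as HRose implies, needs a proper memory manager: key on the texture alone and evict what's no longer referenced.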
 

Nydius

Member
One game doesn't indicate a trend in VRAM requirements.

You're right, but it isn't just one game. This year we've had Titanfall, Wolfenstein, and now Watch_Dogs, all of which eat 2GB of VRAM like a snack-size candy bar and ask for more. The hilarity of that list is that Wolfenstein uses MegaTexture, whose specific design was stated to be a way to limit the amount of VRAM needed for higher-resolution textures...
 

Karmum

Banned
i5-2500k with GTX 460 SLI (only running one 460 with the game at the moment)... ~30-40 FPS on mostly medium and some high settings. Occasional dips, usually when I first load the game and also when I'm fully accelerated in a vehicle and maneuvering around stuff. 1080p.

Curious to see how it may run with both 460s, even if my second 460 is a bit iffy. Haven't done much research yet on it (sorry), but have SLI drivers/profiles been released? Thought I read somewhere about them not being released, along with some YT videos of people having SLI setups and not getting any performance increases at all.

From what I've played so far... I'm "ehhh" on the visuals. Not awful, but not impressed. Obviously I haven't maxed it out, but I'm not sure it gets that much better at high/ultra settings. Maybe I'm wrong, but I don't think it's going to look *that* much different if/when I SLI and bump the settings up a bit. I need a new video card anyway, and I was thinking about picking one up around the time this was released.
 

Xyber

Member
Would really appreciate someone running the game on a similar setup to mine (GTX 680 4GB, i7 3770K, 8GB RAM) posting how the game runs on their system.

I'd usually be confident getting the game for PC over holding out until I buy an Xbox One/PS4. However, with Ubisoft's track record on PC, I'm not so sure.

680 2GB @ 1200MHz, i7 4770 @ 4.2GHz and 8GB RAM.

Everything maxed out apart from textures (high) and using SMAA at 1080p got me about 45-60FPS outside depending on what's going on. Some minor stuttering.
 

Senaxx

Banned
i5 2500K with a GeForce GTX 780 Ti and 8GB. Hopefully I can finally put this beast to use and my processor isn't too much of a constraint.
 
BF4, Watch_Dogs and Call of Duty: Ghosts all have issues with 2GB of VRAM. We are currently 7 months into the "next gen". You don't think there's a trend going on here? These are the big publishers. Wait until the little guys start having small teams port their PS4/XB1 games over to PC.



I don't think you understand how video cards work.

This is basically the "512MB of VRAM is more than enough" that we heard from people at the start of last generation all over again. It didn't take long before that wasn't true.

Don't save a little money and get the 2GB instead of the 4GB version of a card. Especially if you are not going to upgrade for a while.

I almost regret getting a 2GB 680 DCII two years ago, but I got it for almost half the retail price of a stock 680 so I can't complain. But I like to downsample from higher resolutions, so 2GB has been a big bottleneck in some games.



But why limit yourself to presets? Use ultra textures and turn down the other stuff to high if needed to maintain 60fps.

I am just saying you shouldn't rely on one game to draw the conclusion that 2GB is not enough. I don't think it is enough. But especially with an Ubisoft game, which can be quite hit and miss when it comes to PC ports, you need to look at other titles too.
 

TSM

Member
I am just saying you shouldn't rely on one game to draw the conclusion that 2GB is not enough. I don't think it is enough. But especially with an Ubisoft game, which can be quite hit and miss when it comes to PC ports, you need to look at other titles too.

Except it isn't one game:

You're right, but it isn't just one game. This year we've had Titanfall, Wolfenstein, and now Watch_Dogs, all of which eat 2GB of VRAM like a snack-size candy bar and ask for more. The hilarity of that list is that Wolfenstein uses MegaTexture, whose specific design was stated to be a way to limit the amount of VRAM needed for higher-resolution textures...

We reached a point where 512MB wasn't enough. We've reached the point where it's obvious 2GB isn't going to be enough.
 

Trejik

Member
Having some issues. Running an i7 4770k, GTX 770 2GB OC and 8GB of RAM.

Running textures on high (2GB VRAM) and everything else on ultra. For starters the game doesn't look too phenomenal, but on top of that I'm getting 60 FPS in closed spaces and 30-45 everywhere else, and sometimes, when it's going from 45 fps to 30 and stabilizing, the stutter looks like frames are dropping into the single digits. Is it impossible to run at a solid 60 with this rig? Because I'd assume it should be able to, no problem. Am I doing something wrong? I'm reading about people with lower-end PCs getting a solid 60 on the same settings.

Edit: And MSAA x2. Dropping it had no impact on performance, but made the game look terrible (IMO).
 

TSM

Member
Having some issues. Running an i7 4770k, GTX 770 2GB OC and 8GB of RAM.

Running textures on high (2GB VRAM) and everything else on ultra. For starters the game doesn't look too phenomenal, but on top of that I'm getting 60 FPS in closed spaces and 30-45 everywhere else, and sometimes, when it's going from 45 fps to 30 and stabilizing, the stutter looks like frames are dropping into the single digits. Is it impossible to run at a solid 60 with this rig? Because I'd assume it should be able to, no problem. Am I doing something wrong? I'm reading about people with lower-end PCs getting a solid 60 on the same settings.

Check your VRAM when you are having issues. Multiple people have reported stuttering when their video card runs out of available VRAM.
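On NVIDIA cards, that check can be scripted. A sketch that shells out to nvidia-smi; the `--query-gpu`/`--format` flags are standard nvidia-smi options, but the CSV parsing below assumes the "used, total" column order those flags request, so treat the format as an assumption on your driver version:

```python
# Query VRAM usage via nvidia-smi (NVIDIA cards only).
import subprocess

def parse_vram_csv(line):
    """Parse one 'used, total' line (MiB) from nvidia-smi's CSV output."""
    used, total = (int(x.strip()) for x in line.split(","))
    return {"used_mib": used, "total_mib": total,
            "pct": round(100.0 * used / total, 1)}

def query_vram():
    """One dict per GPU; requires an NVIDIA driver to be installed."""
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True)
    return [parse_vram_csv(line) for line in out.strip().splitlines()]

# Offline demo of the parsing -- e.g. a 2GB card near its limit:
print(parse_vram_csv("1843, 2048"))
```

Run `query_vram()` while the game stutters; if `used_mib` is pinned at `total_mib`, you're in the swap scenario TSM describes.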
 
Except it isn't one game:



We reached a point where 512MB wasn't enough. We've reached the point where it's obvious 2GB isn't going to be enough.

I answered someone who asked whether Watch Dogs alone wasn't proof enough. It isn't. I know there are other games with the same issues, but that's beside the point here.
 

Redmoon

Member
If this happens it's usually a sign of a memory leak.

Memory leaks could also explain high CPU usage (and are also very hard to patch if the engine wasn't built around a decent memory manager).

If it was a leak, wouldn't it show up in Task Manager/Process Explorer? It almost always stays under 3GB of RAM usage.

I just think it's due to video memory. I'm using ~4GB at 1080p, so I'd imagine 4K being a lot more, but I'm only seeing about 700MB of system VRAM being used.
 
So a single game isn't proof enough. Also multiple games aren't proof enough...

I am not saying that.

I am not discussing whether 2GB is too little nowadays. Yes, I think that is the case, and multiple games are the proof of that.

What that guy asked was whether Watch Dogs was proof enough. No, it isn't. Together with other titles, yeah, you've got a case, but that wasn't what he said.
 

blastprocessor

The Amiga Brotherhood
Another conclusion:

A GTX 770 is roughly twice as fast as a PS4. Watch Dogs seems in line with previous observations: a 770 should be able to run at full HD, locked 60 FPS with everything on high, which is what the PS4 runs at 900p/30 FPS.

The GTX 770 is around 3.2 TFLOPS; that is not twice as fast as the PS4's 1.8 TFLOPS, so I doubt a consistent 60 fps from this card. A GTX 780 should do it with a good CPU.
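The figures being thrown around come from the standard peak-throughput formula: shader cores × clock × 2 (one fused multiply-add per core per cycle). A quick check using the commonly cited spec-sheet numbers (1536 CUDA cores at a 1046 MHz base clock for the 770, 1152 shaders at 800 MHz for the PS4 — treat the exact clocks as assumptions):

```python
# Peak single-precision FLOPS = shader cores x clock x 2 (FMA).
# Core counts and clocks are the commonly cited spec-sheet figures.

def tflops(cores, clock_mhz):
    return cores * clock_mhz * 1e6 * 2 / 1e12

gtx770 = tflops(1536, 1046)   # ~3.21 TFLOPS at base clock
ps4    = tflops(1152, 800)    # ~1.84 TFLOPS

print(f"GTX 770: {gtx770:.2f} TFLOPS")
print(f"PS4:     {ps4:.2f} TFLOPS")
print(f"ratio:   {gtx770 / ps4:.2f}x")  # ~1.74x, short of 2x
```

On paper the ratio lands around 1.7x, which supports the "not twice as fast" point; and as Serandur notes further down, FLOPS alone is a poor predictor of game performance anyway.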
 

TSM

Member
I'm interested in how much of this will be cleared up by driver updates from Nvidia and AMD. We are still well ahead of the game's launch date.
 

HRose

Banned
But why limit yourself to presets? Use ultra textures and turn down the other stuff to high if needed to maintain 60fps.

Because what affects video memory the most is resolution and what kind of AA you run. Meaning you can't just flip the texture setting and keep the same FPS. Texture quality has no effect on FPS, but AA does.

And 4GB on the video card means twice as much loading/unloading of textures, meaning that if you use those 4GB you'll also see twice as much stuttering as with a 2GB model (on high).

In the end the difference between "high" and "ultra" is minimal, while the impact on all the other aspects is far greater.
 

TSM

Member
Because what affects video memory the most is resolution and what kind of AA you run. Meaning you can't just flip the texture setting and keep the same FPS. Texture quality has no effect on FPS, but AA does.

And 4GB on the video card means twice as much loading/unloading of textures, meaning that if you use those 4GB you'll also see twice as much stuttering as with a 2GB model (on high).

In the end the difference between "high" and "ultra" is minimal, while the impact on all the other aspects is far greater.

Wow, so much wrong in one post. 4GB actually means less swapping of memory as it can retain more. The 2GB would have to trade assets in and out of memory if they don't fit in the 2GB memory cap. This is what is happening when people are getting stuttering. Their 2GB card has to swap assets in and out of memory and it's limited by system bus speed.
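TSM's point can be illustrated with a toy model: treat VRAM as an LRU cache of textures and replay the same access pattern against two capacities. The texture sizes and working set below are made up for illustration; the behavior is the standard cache-eviction argument, not measured Watch_Dogs data.

```python
# Toy model of VRAM as an LRU cache of textures. A working set
# that fits in 4GB but not in 2GB forces the 2GB "card" to keep
# swapping assets over the system bus -- TSM's stuttering scenario.
from collections import OrderedDict

def count_evictions(capacity_mb, accesses, texture_mb=128):
    cache, evictions = OrderedDict(), 0
    for tex in accesses:
        if tex in cache:
            cache.move_to_end(tex)        # mark as recently used
            continue
        while (len(cache) + 1) * texture_mb > capacity_mb:
            cache.popitem(last=False)     # evict least recently used
            evictions += 1
        cache[tex] = True                 # "upload" the texture
    return evictions

# 24 distinct 128MB textures = a 3GB working set, streamed repeatedly.
pattern = list(range(24)) * 10

print(count_evictions(2048, pattern))  # 2GB card: constant swapping
print(count_evictions(4096, pattern))  # 4GB card: everything stays resident
```

With the 2GB capacity, every pass through the scene misses and swaps; with 4GB the whole set stays resident after the first load, which is why the extra memory reduces stuttering rather than increasing it.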
 

HRose

Banned
The GTX 770 is around 3.2 TFLOPS; that is not twice as fast as the PS4's 1.8 TFLOPS, so I doubt a consistent 60 fps from this card. A GTX 780 should do it with a good CPU.

Yes, people are slowly getting to the point ;)

My factory-OC 770 2GB is indeed 3.5 TFLOPS, which is really close to the 2x thing. From what I've read, if not CPU-capped, the 770 can indeed run at 60 fps on PS4-equal settings (so "high", not ultra).

With a 780 you should expect "ultra". Or at least this is what Ubisoft has always guaranteed, again if not CPU-capped.
 

thematic

Member
And here we are, debating whether 2GB vs 4GB is still enough for 1080p... while the proof is already here.

Texture: Ultra (requires 3GB VRAM)

So no, 2GB won't be enough for "next-gen textures". You can call it a bad port all you want, but unless it's a PC-only title, it will probably have similar VRAM requirements for ultra.

Also, most of the 2GB vs 4GB comparison tests were run using "old" games without "ultra next-gen textures", so of course they won't show any difference.
 

Dennis

Banned
The NVIDIA 337.81 beta drivers do NOT work well with SLI for me.

I get extreme amounts of stuttering, texture errors and my fans sound like a jet engine.
 

TSM

Member
This dude is talking in circles and has no idea what he's saying

He has a 770 2GB, so he apparently has a vested interest in it being sufficient. Combine that with his lack of understanding of how things actually work and you get this.

To be fair though, a driver update could do a lot to improve the current situation.
 

HRose

Banned
Wow, so much wrong in one post. 4GB actually means less swapping of memory as it can retain more. The 2GB would have to trade assets in and out of memory if they don't fit in the 2GB memory cap. This is what is happening when people are getting stuttering. Their 2GB card has to swap assets in and out of memory and it's limited by system bus speed.

Don't read HALF of what someone writes.

Of course 4GB can hold twice as much as 2GB, so less swapping. But I'm comparing 4GB/ultra and 2GB/high.

Meaning that if they scale in a similar way, the same texture that is being pushed in one case is twice the size in the other.

We are NOT comparing 2GB versus 4GB at the *same* texture size. So if you run 2GB/high, you're still going to see much less stuttering than 4GB/ultra. The bandwidth is the same; the texture size is not.

Obviously this assumes that 4GB fits "ultra" about as well as 2GB fits "high". Meaning the frequency of the swapping would be the same, but the size would not, so a much greater performance hit on 4GB/ultra.
 

thematic

Member
Because what affects video memory the most is resolution and what kind of AA you run. Meaning you can't just flip the texture setting and keep the same FPS. Texture quality has no effect on FPS, but AA does.

And 4GB on the video card means twice as much loading/unloading of textures, meaning that if you use those 4GB you'll also see twice as much stuttering as with a 2GB model (on high).

In the end the difference between "high" and "ultra" is minimal, while the impact on all the other aspects is far greater.

LOL.
4GB means it stores more data and less stuttering WHILE GAMING, because the required data is sufficiently preloaded on the FIRST LOAD.

The stuttering everyone experiences is because 2GB worth of data isn't enough for ultra, and the rest has to load WHILE GAMING.

In the end, however "minimal" you say it is, it's there. Ultra = better = needs > 2GB VRAM.
 
I ended up with a mix of medium/low, motion blur and DOF on, with temporal SMAA at 1600x1050 to get it running at a locked 30 on my laptop. Then I left it paused for 20 minutes and the laptop almost melted.
 

TSM

Member
Don't read HALF of what someone writes.

Of course 4GB can hold twice as much as 2GB, so less swapping. But I'm comparing 4GB/ultra and 2GB/high.

Meaning that if they scale in a similar way, the same texture that is being pushed in one case is twice the size in the other.

We are NOT comparing 2GB versus 4GB at the *same* texture size. So if you run 2GB/high, you're still going to see much less stuttering than 4GB/ultra. The bandwidth is the same; the texture size is not.

Obviously this assumes that 4GB fits "ultra" about as well as 2GB fits "high". Meaning the frequency of the swapping would be the same, but the size would not, so a much greater performance hit on 4GB/ultra.

Except some people are having stuttering using high textures on a 2GB card, so I'm not sure what you are trying to get at.
 

HRose

Banned
Texture : Ultra (Requires 3GB VRAM)

So no, 2GB won't be enough for "next gen texture".

Why must you people contradict yourselves so plainly?

"Ultra" textures are NOT what the PS4 runs. We are looking at high. If they wanted, they could easily push insane texture resolutions to the point that not even 10GB would be enough. So what? Not even 3 or 4GB will be enough in a few years. It's not like a higher-resolution texture requires much more work from an artist.

We have to agree on what "next gen" means, because if we consider the Xbox One and PS4 the canon of next-gen, then we still have to prove that the PC needs more than 2GB (at the same settings).
 

knitoe

Member
Ubisoft just needs to update the game so it auto-selects the best settings for your VRAM while ignoring whatever the user selected. That way everyone can select "ultra", get 45-60 fps and be happy. Ignorance is bliss.
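knitoe's tongue-in-cheek suggestion amounts to clamping the texture preset to whatever the card can hold. A sketch using the numbers quoted in this thread (ultra needs ~3GB per the requirements thematic posted, high ~2GB per Trejik); the exact thresholds and preset names are assumptions, not Ubisoft's actual logic:

```python
# Clamp the requested texture preset to available VRAM.
# Thresholds are rough figures quoted in-thread (ultra ~3GB,
# high ~2GB), not anything from the actual game.

PRESETS = [          # (minimum VRAM in MB, preset name)
    (3072, "ultra"),
    (2048, "high"),
    (1024, "medium"),
    (0,    "low"),
]

def clamp_textures(requested, vram_mb):
    names = [name for _, name in PRESETS]
    for min_vram, name in PRESETS:
        if vram_mb >= min_vram:
            # Never exceed what fits, but honor a lower request.
            if names.index(requested) < names.index(name):
                return name
            return requested
    return "low"

print(clamp_textures("ultra", 2048))   # 2GB card gets "high"
print(clamp_textures("ultra", 4096))   # 4GB card keeps "ultra"
print(clamp_textures("medium", 4096))  # lower requests are honored
```

Whether silently overriding the user is good UX is exactly the joke.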
 

HRose

Banned
Except some people are having stuttering using high textures on a 2GB card so I'm not sure what you are trying to get at.

Oh, everyone is stuttering. Even those on 4GB models. If you look, even people with 6GB Titans are filling up all their memory.

Give me proof that 4GB/ultra runs better than 2GB/high, because that's the point. If both are running short, there isn't much of a difference.
 

Serandur

Member
The GTX 770 is around 3.2 TFLOPS; that is not twice as fast as the PS4's 1.8 TFLOPS, so I doubt a consistent 60 fps from this card. A GTX 780 should do it with a good CPU.
Just a note: FLOPS isn't exactly a good measure of performance in anything but, well, floating-point operations. It's only one type of calculation; shader performance, texel and pixel fillrate, and memory bandwidth, as well as numerous more specific architectural details, are all factors in overall game performance. You shouldn't go by FLOPS (in this case, only single-precision FLOPS) as a predictor of game performance. AMD cards, for example, tend to have stronger floating-point performance than their game-equivalent Nvidia cards, the equivalent of the 770 being the ~4.1 TFLOPS 280X.
 

HRose

Banned
All you want from a 770 is parity with a PS4?

Yes, exactly!

Parity at 1080p/60FPS versus the PS4's 900p/30FPS.

That was what was being discussed.

You want "ultra"? Fine, a 4GB 770 won't be enough. You need more memory AND a faster card.

You want the PS4 at the same settings, FPS and resolution? Fine, a 7850 should be enough.
 

Gambit61

Member
The PC version leaked since Uplay didn't do a release-date check. Early copies apparently work, and review copies as well.


but yeah, so much whining about graphics imo

Phenom II x955 BE
4GB RAM
Radeon 6850 1GB
GIGABYTE GA-880GMA-UD2H

Runs great at that 1777-or-whatever resolution just below 1920x1080, at medium (with one setting at high, if I remember correctly). I've been using motion blur too, but I'll see how that affects performance over time.

EDIT: I'm sure a patch will be released by Ubisoft and then by the graphics card makers. The game is pretty awesome... made me shove my whole backlog even further back... should tide me over until EA's UFC game.

What kind of frame rates are you getting, and did you overclock the 955? I have a 955BE and a 7870XT and was wondering if I should get WD.
 

Camp Lo

Banned
680 2GB
3570K @ 4.7
16GB RAM

I managed to get 60 fps on high with TXAA 2x.

My frames dip to 30 while driving, and it's completely horrible on anything other than TXAA.

Edit: Feels like Black Flag all over again.
 

knitoe

Member
What kind of frame rates are you getting? I have a 955BE and 7870XT and was wondering if I should get WD.
These days, I'm not sure why people don't take and post screenshots displaying CPU & GPU usage, system & GPU RAM usage, and fps. Saying a game "runs fine" could mean 144fps, 60fps, 45fps, 30fps, 20fps, etc. Once, someone said this game ran "fine", and when he posted a screenshot it was 15fps.
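This is why frame-time numbers beat adjectives. A sketch of the two summary stats worth posting, computed from a list of per-frame times such as any frame-time logger can export (the math is standard; the hitch values in the demo log are made up):

```python
# Summarize a frame-time log (milliseconds per frame) into the
# numbers worth posting: average fps and the 1% low, which is
# what captures the stutter people describe in this thread.

def fps_summary(frametimes_ms):
    avg_fps = 1000.0 * len(frametimes_ms) / sum(frametimes_ms)
    worst = sorted(frametimes_ms, reverse=True)
    one_pct = worst[:max(1, len(worst) // 100)]   # slowest 1% of frames
    low_1pct_fps = 1000.0 * len(one_pct) / sum(one_pct)
    return round(avg_fps, 1), round(low_1pct_fps, 1)

# A "steady 60fps" run with a couple of 100ms hitches: the average
# barely moves, but the 1% low exposes the stutter.
log = [16.7] * 198 + [100.0] * 2
print(fps_summary(log))
```

An average of ~57 fps sounds "fine", while the 1% low of 10 fps is exactly the single-digit stutter Trejik described.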
 

dmr87

Member
Everyone, the game isn't officially out yet, so there's hope for a day-one patch (if there are bugs/perf issues), and I would expect both AMD and Nvidia to have performance guides/new drivers out on Monday/Tuesday.
 

HRose

Banned
You don't need a faster card to run Ultra textures though. You just need at least 3GB vram.

Still need actual benchmarks. I only said 4GB models are slightly slower (no factory OC until recently) and that it could increase stuttering because of bigger textures being swapped.

I really would like to see a 770 4GB model running everything on high, textures on ultra, at a steady 60 FPS.

My point was only that 2GB is enough for parity with the PS4, not that there won't be any game, on PC, in the future potentially using more. Do you think "ultra" settings are really hugely better than "high" at a standard 1080p? I don't.
 

Reishiki

Banned
I'm running:

i7-4930K (OC'd to 4.3GHz right now)
GTX Titan
16GB RAM
Planning on running at 1920x1080, but I can go to 2560x1440.

Am I good? I remember some issues with god rays in AC4 until a later patch fixed them on Nvidia cards.

I'm thinking there are some driver updates I might need; I'm still running drivers from March this year.
 

Xyber

Member
Still need actual benchmarks. I only said 4GB models are slightly slower (no factory OC until recently) and that it could increase stuttering because of bigger textures being swapped.

I really would like to see a 770 4GB model running everything on high, textures on ultra, at a steady 60 FPS.

My point was only that 2GB is enough for parity with the PS4, not that there won't be any game in the future potentially using more. Do you think "ultra" settings are really hugely better than "high" at a standard 1080p? I don't.

Unless you hit the VRAM limit, a 2GB and a 4GB 770 at the same clock speed will perform the same if the only difference is texture quality.

And while you might not think the small differences between high and ultra are worth it, plenty of others do.
 