Watch_Dogs PC performance thread [Read post #1215 before posting]

Except that PS4 and XB1 have a minimum of 5GB of unified memory available to developers. Multiplatform AAA games will routinely use more than 2GB of vram going forward.

Flawed logic being flawed.

5GB of *unified* memory isn't very much. It means you divide memory between system and GPU. By comparison, a PC has 8GB of system RAM plus a minimum of 2GB of vram. I'd say on average 10 > 5: "only" twice as much.

So if Watch Dogs ran with a 1GB system RAM requirement, then great, I would understand it requiring more than 2GB on the video card. But it doesn't. It burns quite a bit of system memory.

All things being equal, if Watch Dogs eats a bare minimum of 2GB of system RAM, then on a PS4 a maximum of only 3GB could be used for graphics.

Hence, going forward, you really shouldn't expect PS4 quality on PC to require more than 2-3GB of vram.
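
To make that budget math concrete, here's a back-of-envelope sketch in Python. Every figure is taken from the post above or is a stated assumption, not a measurement:

```python
# Back-of-envelope memory budgets using the figures quoted above.
# All numbers are assumptions from the discussion, not measurements.

ps4_unified_gb = 5.0       # unified memory reportedly available to devs
game_system_ram_gb = 2.0   # what Watch Dogs reportedly eats outside graphics

# On a unified pool, game-logic memory comes out of the same 5GB.
ps4_graphics_budget_gb = ps4_unified_gb - game_system_ram_gb
print(f"PS4 graphics budget: ~{ps4_graphics_budget_gb:.0f}GB")  # ~3GB

# A typical PC splits the pools instead.
pc_system_ram_gb = 8.0
pc_vram_gb = 2.0
print(f"PC total: ~{pc_system_ram_gb + pc_vram_gb:.0f}GB")      # ~10GB
```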
 
If I do I'll let you know. My mobo is ITX, BTW; just bought it back in Feb when I went from mATX to ITX. I kept the 2500K and reused it for the ITX build, though I probably should have gone with Haswell and a 4770K at that point. Oh well, LOL!

It seems like you are getting decent performance with your setup and probably better than what it would be on consoles.

Cheers, man. Not sure about the ITX board but the processor is what I'm looking for anyway - I appreciate you keeping me in mind :D

Good luck with your experience with the game!
 
I'm getting about 60% SLI usage with the 337.50 drivers on SLI 780s, although the game runs at 30fps. I don't think the SLI driver is utilising the cards properly, as it should be maxing them out when I'm only getting 30fps. I'm playing at 4K, maxed, by the way.
 
Any AMD Crossfire folks are going to want to disable Crossfire unless AMD magically gets a new driver out before launch.

Spent a few hours this morning messing with the PC version at my bro-in-law's house; he's running Crossfired 290Xs. With both cards enabled, it was hitting 29-37 FPS outside while driving on ultra. Once I disabled the second card, I never saw the FPS counter drop under 50 outside, and it stayed close to 60 the majority of the time. That's with a 4770K OC'd to 4.5GHz.


Still have hope that I can pull a solid 30 on my 5870 on medium settings!
 
I'm getting about 60% SLI usage with the 337.50 drivers on SLI 780s, although the game runs at 30fps. I don't think the SLI driver is utilising the cards properly, as it should be maxing them out when I'm only getting 30fps. I'm playing at 4K, maxed, by the way.

What do you get with vsync off? (I'm assuming it's on)

Also, try the drivers linked above.
 
Flawed logic ahead.

5GB of *unified* memory isn't very much. It means you divide memory between system and GPU. By comparison, a PC has 8GB of system RAM plus a minimum of 2GB of vram. I'd say on average 10 > 5: "only" twice as much.

So if Watch Dogs ran with a 1GB system RAM requirement, then great, I would understand it requiring more than 2GB on the video card. But it doesn't. It burns quite a bit of system memory.

All things being equal, if Watch Dogs eats a bare minimum of 2GB of system RAM, then on a PS4 a maximum of only 3GB could be used for graphics.

Hence, going forward, you really shouldn't expect PS4 quality on PC to require more than 2-3GB of vram.

Fixed that for you. It's unified memory, so developers could dedicate 3+ GB of it to things that would be handled by vram on a PC video card if they wanted to. That figure probably bloats even further in the PC version if they add "ultra" textures and such. The beauty of unified memory is that developers can divide it up however they want.
 
I'm getting about 60% SLI usage with the 337.50 drivers on SLI 780s, although the game runs at 30fps. I don't think the SLI driver is utilising the cards properly, as it should be maxing them out when I'm only getting 30fps. I'm playing at 4K, maxed, by the way.

What type of monitor setup are you running? Can you post a screenshot at 4K?
 
Setting vsync to 2 frames drops the fps down to 20, which is definitely not an option. I don't mind 30, so I'm okay with that, but if you're willing to deal with some screen tearing and your PC is similar to mine, you should be hitting 60 or close to it.

That is a problem. On plasma, tearing is extremely noticeable and unplayable. So is 30fps for me. So I guess I will have to drop to medium-high to get 60fps. God dammit.
 
I coulda sworn Toms and Anand did articles about PC hardware myths such as VRAM. It was concluded 2GB was enough for 1080p with good post-processing.

Still is.

In many cases, for example the GTX 770, the 2GB model can come with a substantial factory overclock, while the 4GB models, branded the same, run at stock clocks.

So in the end the 2GB models can actually run faster.

At 1080p, and I bet in Watch Dogs too, you aren't going to run textures on ultra ON TOP of good AA and everything else. Diminishing returns. Texture memory requirements are the kind of thing that doesn't grow linearly; each resolution step multiplies the footprint.
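
To put a rough number on that scaling claim: texture memory grows with the square of a texture's resolution, so each doubling quadruples the footprint. A quick sketch, assuming uncompressed 4-byte RGBA for simplicity (real games use compressed formats, but the scaling is the same):

```python
# Texture memory scales with the square of resolution: each doubling
# of texture size quadruples the footprint. Uncompressed 4-byte RGBA
# assumed for simplicity; real games compress, but scale the same way.
for size in (1024, 2048, 4096):
    mb = size * size * 4 / 1024**2
    print(f"{size}x{size}: {mb:>4.0f} MB")
# 1024x1024:    4 MB
# 2048x2048:   16 MB
# 4096x4096:   64 MB
```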

It's very likely that you can achieve much better overall image quality by turning textures down to "high" and then tweaking the other settings.

The bottom line is: more than 2GB is only needed if you aim above 1080p, or on faster cards. On a GTX 770 (or 670) class card, raw performance hits you before the vram limit does.

Once again, NEVER buy future-proof. Instead buy cheap, and upgrade often. By the time you need a new video card, new models will be out that wipe the floor with the current ones.
 
Okay, so been playing for about an hour.

Humble rig:
i5 4570 @ 3.2GHz
HD 7870 GHz 2GB @ stock
16GB DDR3
installed on SSD

Everything on Medium, MHBAO, DoF on, Motion Blur on, FXAA, High textures @ 1080p. Getting about 45-60fps outside, steady 60 inside.

The only problem I have is tearing in the upper half of the screen; I can't get it fixed without enabling vsync in-game, which totally destroys the performance I'm getting.

this makes me sad
 
The bottom line is: more than 2GB is only needed if you aim above 1080p, or on faster cards. On a GTX 770 (or 670) class card, raw performance hits you before the vram limit does.

Once again, NEVER buy future-proof. Instead buy cheap, and upgrade often. By the time you need a new video card, new models will be out that wipe the floor with the current ones.

There are already games that need more than 2GB, or you get large performance issues at 1080p. BF4 and Watch_Dogs are two that I know of. Neither is unplayable, but there is a large performance loss between the 2GB and 4GB versions of the same video card.

Someone also posted this image of their video card running Call of Duty: Ghosts maxed at 1080p with no AA:

[image: XDoz3vN.png]


The start of this generation seems to signal that the days of 3GB of vram being sufficient are over.
 
If somebody is good with Cheat Engine, you could try to see if there is anything useful in the game code that can improve graphics, etc.

Yeah, probably a way to force the game to stay at night/late afternoon throughout, lol. Shouldn't be too hard to achieve, since the game lets you sleep x hours anyway.
 
So I see there is a beta driver available that adds an SLI profile for WD, but will it affect current performance on single cards? I know they typically release updated drivers for big titles. Has anyone heard anything? Sorry if this question has already been asked.
 
Has anyone played this on a FX 6300?

I am hoping to get 30fps on my rig (3.8GHz, 8GB DDR3, GTX 660 Non-Ti). If not high then medium settings.

Is there going to be a demo?
 
There are already games that need more than 2GB, or you get large performance issues at 1080p. BF4 and Watch_Dogs are two that I know of. Neither is unplayable, but there is a large performance loss between the 2GB and 4GB versions of the same video card.

Can I see some evidence to back your claim? Not arguing against you. I'd just like to see proof.

My rig handles it smoothly @1440p with SMAA (temporal) and everything maxed out.

i7-3770K @ 4.4GHz
8GB DDR3 @ 2133MHz
GTX 770 4GB Windforce OC

Runs around 40-60 FPS, with occasional drops to the low 30s.

At 1680x1050, runs over 55FPS, averaging around 57-62. Sometimes drops down to 40s, but picks back up.

I have noticed some issues with regards to performance when moving into or out of an area (like going outside, etc.). It's not all the time, either. But it'll drop down to like 17FPS for a few seconds before jumping back up again. Streaming issues?
 
Another conclusion:

A GTX 770 is roughly twice as fast as a PS4, and Watch Dogs seems in line with previous observations: a 770 should be able to run at 1080p, locked 60 FPS, everything on high, what the PS4 runs at 900p/30 FPS.

If this holds up against everything we've already seen up to this point, then the "novelty" of Watch Dogs is really all about what's happening on the CPU side. It seems the "video" side of the new consoles is performing about the same as equivalent PC hardware, whereas the rest of the engine can still bog down a good PC in comparison.

Now it would be interesting to figure out what's taxing the CPU so much (besides this just being bad code).

It's also fairly odd, because we were expecting the opposite: the video side of the consoles punching above their hardware numbers, while being held back by their rather weak CPU side.
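
As a sanity check on the "roughly twice as fast" figure, here's a crude pixels-per-second comparison of the two render targets mentioned above. It ignores the per-pixel cost difference between "high" and console settings, so treat it as ballpark only:

```python
# Crude pixel-throughput comparison of the two render targets above.
# Ignores per-pixel cost differences between settings; ballpark only.
ps4_pixels_per_sec = 1600 * 900 * 30    # 900p at 30 FPS
pc_pixels_per_sec = 1920 * 1080 * 60    # 1080p at 60 FPS

ratio = pc_pixels_per_sec / ps4_pixels_per_sec
print(f"1080p60 pushes {ratio:.1f}x the pixels of 900p30")  # ~2.9x
```

It comes out closer to 3x than 2x, which is why the settings comparison (high vs. whatever the console actually runs) matters for the claim.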
 
Can I see some evidence to back your claim? Not arguing against you. I'd just like to see proof.

From this thread:

http://www.neogaf.com/forum/showthread.php?p=113227294#post113227294

And this happens to him with the textures at high. Ultra textures he says use another 200MB of vram.


Anecdotal, same for the BF4 stuff I read.

Oh man, with my GTX 670 @1254MHz and i5-3570K @4.1GHz I can hit 40-50 most of the time with shadows, reflections, and level of detail on high. But it stutters a lot in some scenes, especially while driving. It froze a couple of times on me too. Christ, this game is terribly optimized.

Check your vram usage while playing. Other people have found that their video card was out of memory when that was happening. If you are out of vram, it's not an optimization issue.
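
For Nvidia owners, one way to watch vram while playing is to poll the nvidia-smi tool that ships with the driver. A minimal sketch; run it in the background and check the output after a stutter:

```python
# Minimal VRAM poller for Nvidia cards via the nvidia-smi CLI
# (installed with the driver). Watch for memory.used approaching
# memory.total while the game runs.
import subprocess
import time

while True:
    out = subprocess.check_output([
        "nvidia-smi",
        "--query-gpu=memory.used,memory.total",
        "--format=csv,noheader,nounits",
    ]).decode().strip()
    # One line per GPU, so SLI rigs are covered too.
    for i, line in enumerate(out.splitlines()):
        used, total = (int(x) for x in line.split(", "))
        print(f"GPU{i} VRAM: {used} / {total} MiB")
    time.sleep(1.0)
```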
 
Who plays these games with KB/M? I could understand wanting to play FPS that way (I do) but for 3rd person action games, controllers are a must.

People (like me) play with kb/m because that's what they are most comfortable with. Why is that so weird?
 
Can I see some evidence to back your claim? Not arguing against you. I'd just like to see proof.


Well isn't Watch Dogs proof enough? Run ultra textures on a 2GB card and the performance is complete garbage. It stutters like crazy. Turn it down to high and it's actually playable.

And that's at 1080p with SMAA.

2GB isn't enough anymore if you want to play with the best textures in these big new games. Sure 2GB might still work in plenty of games for quite some time, but no one who has plans on buying a high-end GPU today should get 2GB of VRAM.
 
I was tinkering with all the available graphics options and discovered the Water quality setting has a substantial impact on my framerate. I lowered it from High to Medium and watched the fps jump from 25 to 90, and I was indoors at the time. Maybe a bug? Or perhaps my GTX 480 just isn't up to the task, lol.

(PC) i7 920 @ 3.6GHz / 16GB / GTX 480 1.5GB / Win 7
 
Well isn't Watch Dogs proof enough? Run ultra textures on a 2GB card and the performance is complete garbage. It stutters like crazy. Turn it down to high and it's actually playable.

And that's at 1080p with SMAA.

2GB isn't enough anymore if you want to play with the best textures in these big new games. Sure 2GB might still work in plenty of games for quite some time, but no one who has plans on buying a high-end GPU today should get 2GB of VRAM.

Because it is one game. There are always plenty of exceptions; some games just run worse with certain configurations than others.
 
There are already games that need more than 2GB, or you get large performance issues at 1080p.

I didn't say there aren't. I said that on a 770 level card performance hits you before vram limits do.

So, usually, the moment the 4GB becomes useful tends to coincide with FPS dropping under 60. Meaning you'd be much better off dialing down the texture setting for smoother performance than keeping "ultra" at lower overall FPS.

Texture size does not affect FPS. It affects the amount of loading/unloading which causes stuttering and FPS dips. But it won't cause changes in sustained FPS.

In practice: if you have a 770 4GB, you probably won't be able to play Watch Dogs all on "ultra" using the full 4GB, because the raw power of the card isn't enough. So a 770 2GB is likely to run better if you dial textures back to "high" (a minimal difference quality-wise, unless you're staring directly at a wall), stay within your hardware limit, and get overall higher FPS than a 770 4GB pushed to its limit.
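
For a sense of why blowing past vram shows up as stutter rather than just a lower average FPS, here's a toy model. The bandwidth and overflow numbers are illustrative assumptions, not measurements:

```python
# Toy model: when the texture working set exceeds VRAM, the overflow
# streams over the PCIe bus, and the transfer time lands as frame-time
# spikes (stutter) rather than a uniformly lower FPS.
# All numbers below are illustrative assumptions.

vram_gb = 2.0
working_set_gb = 2.5      # e.g. ultra textures at 1080p (assumed)
pcie_gb_per_s = 8.0       # rough effective PCIe 3.0 x16 figure (assumed)

overflow_gb = max(0.0, working_set_gb - vram_gb)
stall_ms = overflow_gb / pcie_gb_per_s * 1000
print(f"~{stall_ms:.0f} ms of bus transfer per full reload")

# A 60 FPS frame budget is ~16.7 ms, so even a slice of that transfer
# landing inside one frame blows the budget and reads as a hitch.
```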
 
I was tinkering with all the available graphics options and discovered the Water quality setting has a substantial impact on my framerate. I lowered it from High to Medium and watched the fps jump from 25 to 90, and I was indoors at the time. Maybe a bug? Or perhaps my GTX 480 just isn't up to the task, lol.

(PC) i7 920 @ 3.6GHz / 16GB / GTX 480 1.5GB / Win 7

The game calculates the moisture level inside every person's body with the water quality setting.
 
With vsync turned off I get a steady 45-60fps, typically hovering around 50fps. However, I can't stand screen tearing, so I have vsync set to on (1 frame), and it's mostly locked at 30fps. I say mostly because Fraps tells me it occasionally jumps up to about 45fps, but for the vast majority of the time it's locked at 30.

Have you tried forcing triple buffering in RadeonPro and then switching between windowed and full-screen modes in game? I had a similar problem with another game, might've been Assassin's Creed, when I was first getting to grips with tweaking Catalyst drivers after using Nvidia cards for the longest time.
 
Because it is one game. There are always plenty of exceptions; some games just run worse with certain configurations than others.

BF4, Watch_Dogs, and Call of Duty: Ghosts all have issues with 2GB of vram, and we are currently 7 months into the "next gen". You don't think there's a trend going on here? These are the big publishers. Wait until the little guys start having small teams port their PS4/XB1 games over to PC.

I didn't say there aren't. I said that on a 770 level card performance hits you before vram limits do.

I don't think you understand how video cards work.
 
Man, this game's performance is weird. I was messing around a bit, and no matter the resolution or graphics quality, I couldn't get that damn thing above 40fps when it was raining at daytime, lol?

As stated in a prior post, it ran pretty okay at 45-60fps @ 1080p before... super CPU bottleneck?
It's also strange that my four cores only sit at ~70% usage each...

E: Proof thingy for mods http://i.imgur.com/hBAs3UU.png
 
I was tinkering with all the available graphics options and discovered the Water quality setting has a substantial impact on my framerate. I lowered it from High to Medium and watched the fps jump from 25 to 90, and I was indoors at the time. Maybe a bug? Or perhaps my GTX 480 just isn't up to the task, lol.

Nope. If this is confirmed, it could explain why the game is so CPU-intensive. It might turn out that something is done entirely on the CPU side that slows everything else down.
 
Hey guys, I was wondering if I could run maxed ultra settings and get around 45 fps with my rig. 60 fps is not a requirement; anything between 40-45 will be golden. Will I get that with:

i5 2500K @ 3.3GHz (NOT overclocked)
8GB RAM
GTX 770 2GB
Playing at 1920x1080

Thanks!
 
Well isn't Watch Dogs proof enough? Run ultra textures on a 2GB card and the performance is complete garbage. It stutters like crazy. Turn it down to high and it's actually playable.

Once again, if we are talking about a 770-level card, 4GB may let you run the game at "ultra" compared to a 2GB model. But I doubt the card is powerful enough to give you 60 FPS there.

So I argue that a 770 2GB running a smooth 60 FPS on high beats a 770 4GB running ultra at 40-50 FPS.

I'm simply saying that a smooth 60 FPS is far more important than the difference between "high" and "ultra" textures when performance drops below 60.
 
Because it is one game. There are always plenty of exceptions; some games just run worse with certain configurations than others.

This is basically the "512MB of VRAM is more than enough" that we heard from people at the start of last generation all over again. It didn't take long before that wasn't true.

Don't save a little money and get the 2GB instead of the 4GB version of a card. Especially if you are not going to upgrade for a while.

I almost regret getting a 2GB 680 DCII two years ago, but I got it for almost half the retail price of a stock 680 so I can't complain. But I like to downsample from higher resolutions, so 2GB has been a big bottleneck in some games.

Once again, if we are talking about a 770-level card, 4GB may let you run the game at "ultra" compared to a 2GB model. But I doubt the card is powerful enough to give you 60 FPS there.

So I argue that a 770 2GB running a smooth 60 FPS on high beats a 770 4GB running ultra at 40-50 FPS.

I'm simply saying that a smooth 60 FPS is far more important than the difference between "high" and "ultra" textures when performance drops below 60.

But why limit yourself to presets? Use ultra textures and turn down the other stuff to high if needed to maintain 60fps.
 
Hey guys, I was wondering if I could run maxed ultra settings and get around 45 fps with my rig. 60 fps is not a requirement; anything between 40-45 will be golden. Will I get that with:

i5 2500K @ 3.3GHz (NOT overclocked)
8GB RAM
GTX 770 2GB
Playing at 1920x1080

Thanks!

I have the same system, and if Watch Dogs runs like AC Black Flag (in cities) I will be fine.
 
Once again, if we are talking about a 770-level card, 4GB may let you run the game at "ultra" compared to a 2GB model. But I doubt the card is powerful enough to give you 60 FPS there.

So I argue that a 770 2GB running a smooth 60 FPS on high beats a 770 4GB running ultra at 40-50 FPS.

I'm simply saying that a smooth 60 FPS is far more important than the difference between "high" and "ultra" textures when performance drops below 60.

Except that the 770 4GB may run at 60 FPS with ultra textures, and the only reason the 770 2GB can't is that it lacks enough vram. We have finally reached the point where 2GB of vram isn't enough, and it's only going to get worse moving forward.
 
But why limit yourself to presets? Use ultra textures and turn down the other stuff to high if needed to maintain 60fps.

Because, as I explained, most 2GB models come with some factory overclock, whereas similarly branded 4GB models run at stock clocks.

So 2GB models actually run faster, and in these games that performance matters more than having 4GB when you have to turn details down anyway.

Also, on same-class video cards, running textures at ultra on a 4GB card means constant streaming that hits your FPS MUCH harder than high on a 2GB card. It's twice the data moved at the same speed, so twice the stuttering.
 
Because, as I explained, most 2GB models come with some factory overclock, whereas similarly branded 4GB models run at stock clocks.

So 2GB models actually run faster, and in these games that performance matters more than having 4GB when you have to turn details down anyway.

The hell...

Those "overclocks" are usually minimal unless you go for the more expensive versions, like the DCII from Asus.

You can achieve the exact same overclock in a matter of seconds in Catalyst Control Center for AMD cards, and in minutes on an Nvidia card with something like GPU Tweak or EVGA Precision.
 
Because, as I explained, most 2GB models come with some factory overclock, whereas similarly branded 4GB models run at stock clocks.

So 2GB models actually run faster, and in these games that performance matters more than having 4GB when you have to turn details down anyway.

This is wrong. All manufacturers sell factory-overclocked 4GB GTX 770 video cards. In fact, I have a factory-overclocked 770 4GB.
 
Getting about 14-20 fps at 4K maxed, dropping below/above at times (with a lot of loading/stuttering caused by insufficient vram/system ram).
Dual 680s 4GB
16GB DDR3 2133
3930K at 4.6GHz

Also, for anyone else running this at 4K: can you tell me what settings you're at (along with vram usage) and how much system ram is being used?

WD seems to hold a lot of "ghost" ram (I'd assume this is system video memory, but Process Explorer shows nothing). Killing the process frees up about 11GB of ram :<

It doesn't do this at 1080p. Kinda hard to believe you'd need that much ram (vram) for 4K.
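
If anyone wants to check what the game process itself is actually holding (versus memory the OS attributes elsewhere), here's a quick sketch using the third-party psutil package. The process name is a guess; check Task Manager for the real one:

```python
# Report the resident memory of the Watch_Dogs process.
# Requires: pip install psutil
import psutil

TARGET = "watch_dogs.exe"  # hypothetical name; verify in Task Manager

for proc in psutil.process_iter():
    try:
        if proc.name().lower() == TARGET:
            rss_gb = proc.memory_info().rss / 1024**3
            print(f"{TARGET}: {rss_gb:.1f} GB resident")
    except (psutil.NoSuchProcess, psutil.AccessDenied):
        continue
```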
 