Watch_Dogs PC performance thread [Read post #1215 before posting]


Dr Dogg

Member
Anyone else glad they own a GTX TITAN with 6GB of VRAM?

Anyone?

Someone give me a high five here

No, but I'm pissed that I pulled the trigger on a second 780 and my Step-Up ran out just before EVGA offered the 6GB Tis. I'd been trying to offload one to a few mates who were interested in a cheapish 780 a few months back, but now they're looking at the offer like I'm trying to sell them a leper, in light of Wolfenstein and now Watch_Dogs.
 
16GB of RAM is nuts; sure, it will happen eventually, but I can't imagine it happening very soon. We've had octo-cores from AMD on the PC for a while now, and Intel's quad-cores consistently outperform them. Every game benefits from higher clock speeds, while far fewer games scale well with more cores.

Using Daylight as an example is about the worst thing you can do. The game is atrociously optimized. It's built on Unreal Engine 4, which I can use on my own PC, and the things I get running there look and run a whole lot better than Daylight does.

And it runs like shit on the PS4 anyway.

We are getting new engines, and developers are getting a lot more to work with. Brute-forcing graphics with much better hardware on the PC can't happen anymore until the hardware gap widens again. Some developers don't seem to care and put no effort into optimizing their PC builds, causing situations like this. But that doesn't fly anymore, as the hardware difference is a bit smaller now.


Who trumps whom between 8 cores and hyper-threading was not the point I was making. The point was that the technology existed on PC for many years but wasn't being taken advantage of, because games never needed more than 4 cores until now; the target hardware (PS3/360) didn't support that technology. The conclusion is that next-gen PC games are going to benefit from having more physical/hyper-threaded cores, due to the octa-cores in the consoles now.


Daylight is about the only next-gen-only example one can give. It might be badly optimized, but would it take that many resources if it were that badly optimized on UE3? Would it look that good on UE3? Games are going to start taking more resources from now on, whatever optimization they undergo, because the hardware requirement is going up. Just wait for the next-gen-only games coming out in the fall.
 

slapnuts

Junior Member
Not sure what the word on it is, but I've read here and there that even those get filled up when playing on Ultra. If there's any truth to that, then the game might be designed to always use whatever VRAM is available. Anybody know more?

The same has been said at Guru3D, I believe, and a few other sites: no matter how much VRAM you have, somehow it gets eaten up by this specific game.

I really wouldn't worry if you have a 3GB card or higher, especially if you're sticking to 1080p like me. My main gaming rig is hooked up to a 55-inch plasma, and I can't go any higher than 1080p on that TV anyway, so I'm fine with a 3GB card until the true next-gen cards come out. I'd honestly stick with what you have now if you're at 3GB or more and 1080p is your resolution, at least until the 20nm cards arrive.

Going any higher than 1080p usually means you're the type of enthusiast who wants the highest resolutions and the best AA that can be had. Most of us are not that way and are fine with 1080p, average AA and 60fps. My current card does exactly that with everything. I'm not sure what I'll get with this game, but it's not the best-looking game out, and from what I'm reading about its performance, it's not really optimized all that well, IMHO.

People with 4GB and 6GB cards are getting all of it used up by this game, from what I've read. To me that makes no sense, because, like I said, 90% of what I can see in pics and videos doesn't really look next-gen.
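As a rough aside on why ultra-quality textures chew through VRAM so quickly, here is a back-of-envelope sketch. The resolutions and compression ratio are illustrative assumptions, not Watch_Dogs' actual asset sizes:

```python
# Approximate VRAM footprint of a single 2D texture. A full mip chain
# adds roughly one third on top of the base level (1 + 1/4 + 1/16 + ... = 4/3).
def texture_vram_bytes(width, height, bytes_per_pixel, mipmapped=True):
    base = width * height * bytes_per_pixel
    return base * 4 // 3 if mipmapped else base

# 2048x2048 RGBA8 (4 bytes per pixel), uncompressed, with mips:
uncompressed = texture_vram_bytes(2048, 2048, 4)
print(uncompressed / 2**20)  # ~21.3 MiB

# The same texture block-compressed at an 8:1 ratio (e.g. BC1/DXT1):
compressed = uncompressed // 8
print(compressed / 2**20)    # ~2.7 MiB
```

At those rates, a few hundred unique high-resolution textures streamed in at once could plausibly account for multiple gigabytes of VRAM, which would fit the reports above of the game filling whatever VRAM it finds.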
 

Netboi

Banned
I got SweetFX to work with Watch_Dogs and it looks way better.

http://i.imgur.com/4TWzXbt.png


You can find more screenshots here http://imgur.com/a/U3D3i#RqVENLY
 

scitek

Member
Well, it's been confirmed by Jonathan Morin (Creative Director of Watch Dogs) that yes, 3 GBs of VRAM is limiting 780s/Tis (perhaps also 79xx/280 cards) in this game.

See here.

As a new 780 owner, I'm very upset at Nvidia.

Well, we have a Titan owner earlier in this thread that said he couldn't even set the game to Ultra textures without it stuttering, and he has 6GB of VRAM. Hopefully a patch will help iron things out.
 
Who trumps whom between 8 cores and hyper-threading was not the point I was making. The point was that the technology existed on PC for many years but wasn't being taken advantage of, because games never needed more than 4 cores until now; the target hardware (PS3/360) didn't support that technology. The conclusion is that next-gen PC games are going to benefit from having more physical/hyper-threaded cores, due to the octa-cores in the consoles now.


Daylight is about the only next-gen-only example one can give. It might be badly optimized, but would it take that many resources if it were that badly optimized on UE3? Would it look that good on UE3? Games are going to start taking more resources from now on, whatever optimization they undergo, because the hardware requirement is going up. Just wait for the next-gen-only games coming out in the fall.

Oh, I'm not even talking about hyper-threading; I'm just talking about normal quad-cores. Even without hyper-threading they outperform octo-cores. And every title is going to support quad-cores, whereas fewer titles will fully utilize octo-cores and hyper-threading.

Daylight runs badly and looks bad on the PS4 too. It would be a better comparison if it were a good version on the PS4 and much worse on the PC. Sure, hardware requirements are going up, but Intel's hyper-threaded and octo-core parts are their most expensive CPUs. If you need the most expensive CPUs to run your game normally, something is wrong there.
 

HelloMeow

Member
Well, we have a Titan owner earlier in this thread that said he couldn't even set the game to Ultra textures without it stuttering, and he has 6GB of VRAM. Hopefully a patch will help iron things out.

High textures run fine on my 2gb 770, but when I select ultra I get some annoying stuttering.

Even though the game warns me that I need 3GB for Ultra, it runs fairly well, except for the stuttering, which a lot of people with 3GB+ seem to have as well.
 
Oh, I'm not even talking about hyper-threading; I'm just talking about normal quad-cores. Even without hyper-threading they outperform octo-cores. And every title is going to support quad-cores, whereas fewer titles will fully utilize octo-cores and hyper-threading.

Daylight runs badly and looks bad on the PS4 too. It would be a better comparison if it were a good version on the PS4 and much worse on the PC. Sure, hardware requirements are going up, but Intel's hyper-threaded and octo-core parts are their most expensive CPUs. If you need the most expensive CPUs to run your game normally, something is wrong there.

Yeah, that's just down to AMD's cores not being as good as Intel's. Quad-cores will definitely be supported for quite some time after octa-core/i7 becomes the norm for the high end, just like dual-cores were still supported when games started utilizing quad-cores.
 

Netboi

Banned
Getting serious Alan Wake vibes here. Looks good although I dislike the outfits in this game. Are there other appearances than Coats and long jackets?

Nope, just preset suits you unlock. They wanted to keep the same tone in the game. But who knows, maybe DLC.
 
I just purchased Watch_dogs on GMG for $37.50. They have a 25% voucher currently if you're pre-ordering. I think that's the best price atm for PC.


I haven't posted my performance in this thread yet, but tomorrow, when I install and set up the game, I'll post my thoughts on the leaked version versus the actual release version. (The keys are sent out on release.) I'm currently using the 337.50 beta driver. I'll also post screenshots of my purchase information as well as a pic of my Uplay games list, to quell the backseat moderators twiddling their thumbs with nothing better to do.

Unfortunately, all of my experiences will be based on High (Texture) settings, as my 2x 680's are 2GB.
 

pahamrick

Member
I've watched as much footage as I could find, of the intrusion multiplayer mode at least, and I find the game looks okay. It has its moments at night, but it doesn't match the original 2012 reveal.

Everything else, like the little touches that are missing, I'm hoping are just growing pains for the team doing their first modern open world game. If they can do the same growth and evolution for Watch_Dogs 2 that happened from Assassin's Creed -> Assassin's Creed II, it should be something real special.

Judging by what little I've seen of the opening story moments, and the couple hours of multiplayer footage, I don't regret getting this day 1. I'm not super hyped for it, but I don't regret taking advantage of the Target Trade In from last week to snag it either.
 

EmpReb

Banned
You need to disable Adblock for NeoGAF. Them's the rules.
How about stopping this BS? The TOS says nothing about this, and everyone keeps whining like it's the end of the world that adblock is used. It's what a lot of smart people use to stop the internet from becoming billboard central, and sometimes they whitelist sites. But I would assume anyone worth their salt just turns on ad-blocking software and leaves it at that.

Anyway, these pics and this performance info are really Ubisoft being Ubisoft. I hope the drivers and patches help with the VRAM issues. I have 4GB on each 760 in an SLI setup. I hope that works, but hearing that people with 6GB are having trouble with VRAM?! That console-port feel... it's almost like they forgot PC optimization again.
 

KHlover

Banned
Coming from a completely different direction, which settings could I run with the following setup?

ATI Radeon HD4850 1GB VRAM
Intel Core2Duo E8500 overclocked to 4GHz
8GB Ram

OS is Windows 8.1, 1080p monitor.

I don't have particularly high demands for IQ; 30fps and no AA are fine by me.

I just finished Sleeping Dogs and got an average above 30fps at high details without AA and with reduced shadow detail.

Can I expect similar performance from Watch Dogs, or do I have to go down to medium settings or (shudder) even lower to reach 30fps@1080p?
 

Grinch

Banned
Coming from a completely different direction, which settings could I run with the following setup?

ATI Radeon HD4850 1GB VRAM
Intel Core2Duo E8500 overclocked to 4GHz
8GB Ram

OS is Windows 8.1, 1080p monitor.

I don't have particularly high demands for IQ; 30fps and no AA are fine by me.

I just finished Sleeping Dogs and got an average above 30fps at high details without AA and with reduced shadow detail.

Can I expect similar performance from Watch Dogs, or do I have to go down to medium settings or (shudder) even lower to reach 30fps@1080p?

Low settings if even those.
 

Parsnip

Member
So reading some of the posts, am I to understand that an overclocked i5-4670K is fine? I run stock, maybe I need to do some overclocking.

That seems a bit nuts though, I wonder what exactly is so cpu intensive about it. Or is it just your typical Ubiport nonsense? Either way the previous generation versions must be truly gimped then.
 

JBourne

maybe tomorrow it rains
How about stopping this BS? The TOS says nothing about this, and everyone keeps whining like it's the end of the world that adblock is used. It's what a lot of smart people use to stop the internet from becoming billboard central, and sometimes they whitelist sites. But I would assume anyone worth their salt just turns on ad-blocking software and leaves it at that.

People have gotten banned for using adblock on gaf.
 

TheD

The Detective
All next-gen games going forward will benefit from hyper-threading/6/8 cores, as the games are made with the octa-cores in the next-gen consoles in mind. It will be a necessity for decent performance within a year or two. It doesn't matter that the consoles' CPUs have clock speeds of 1.6-2.0GHz; that's the point. Technology on the high end only becomes a standard once it's supported on the hardware these games are designed for: consoles. It's no coincidence we're seeing talk of the i7 becoming beneficial just now, when that tech has existed for 10 years.

Hardware requirements will soon skyrocket when next-gen-only games like AC Unity and The Witcher 3 come out, and not only in the CPU and GPU departments: PC games rarely used more than 2-3GB of RAM on high settings for last-gen console games, but 16GB of RAM will become essential for high settings. The huge boom in hardware requirements is already happening; Daylight (UE4) shocked people with its RAM/VRAM usage: http://www.neogaf.com/forum/showthread.php?t=810472



Who trumps whom between 8 cores and hyper-threading was not the point I was making. The point was that the technology existed on PC for many years but wasn't being taken advantage of, because games never needed more than 4 cores until now; the target hardware (PS3/360) didn't support that technology. The conclusion is that next-gen PC games are going to benefit from having more physical/hyper-threaded cores, due to the octa-cores in the consoles now.

Complete bullshit.

Xenon had 3 cores (6 "logical cores"); Cell had 1 PPE core (2 "logical cores") and 7 SPE cores.
That did not force PCs last gen to have that many cores, and these new consoles having 6 usable, slow cores is not going to obsolete quad-cores, especially when you take into account the slowing of Moore's Law, which means that PC versions of games later in this gen cannot just rely on the CPUs of the day being massively faster than they were at the start of the gen!

To claim that you will need a CPU with 6/8 cores or hyperthreading in a year to be able to run games decently is just absurd!
 

Yibby

Member
I got SweetFX to work with Watch_Dogs and it looks way better.

http://i.imgur.com/4TWzXbt.png
You can find more screenshots here http://imgur.com/a/U3D3i#RqVENLY

You should post a comparison shot, with and without sweetfx.
 
Well, it's been confirmed by Jonathan Morin (Creative Director of Watch Dogs) that yes, 3 GBs of VRAM is limiting 780s/Tis (perhaps also 79xx/280 cards) in this game.

See here.

As a new 780 owner, I'm very upset at Nvidia.

Well, with consoles having 8GB of RAM, it was bound to happen. As more games designed from scratch for the new consoles appear, there will be a need for GPUs with ~4500MB of video RAM.
 
Yeah, that's just down to AMD's cores not being as good as Intel's. Quad-cores will definitely be supported for quite some time after octa-core/i7 becomes the norm for the high end, just like dual-cores were still supported when games started utilizing quad-cores.

For high end? Yeah, maybe. But I really doubt they'll be standard in a year or two for most games.
 

gossi

Member
Coming from a completely different direction, which settings could I run with the following setup?

ATI Radeon HD4850 1GB VRAM
Intel Core2Duo E8500 overclocked to 4GHz
8GB Ram

I think you will struggle. The game happily uses 4 cores, although officially it will scale down to two cores at the minimum requirements. That graphics card doesn't meet the minimum spec.
 

ElyrionX

Member
What's the general consensus from the early impressions so far? Is this going to bring my i5 2500K 560Ti 8GB RAM rig to its knees?
 

Gibbo

Member
Had a feeling this was going to be similar to the Assassin's Creed III PC release. FYI, the frame rate in that game was never fixed.
 

ISee

Member
What's the general consensus from the early impressions so far? Is this going to bring my i5 2500K 560Ti 8GB RAM rig to its knees?

Always hard to tell, but you should be fine with a mix of high/medium settings without demanding AA options.

I'm running this on an i5 3570K / GTX 660 Ti (2GB) / 8GB system RAM at mixed high/ultra settings.

33-45 fps.

1080p
Level of Detail: Ultra
Shadows: High
Reflections: High
Ambient Occlusion: HBAO+ (High)
Motion Blur: On
DoF: On
Water: High
Shader: High
V-Sync: On (1 frame)
AA: TXAA 2x


4DID1fR.jpg
 

wowzors

Member
What would you say a 650Ti will do? I've got 16GB Ram and an I-3570K clocked at ~4.4GHz.

That 650 Ti is severely holding you back. I'm not saying go for a 780 or anything too crazy, but maybe look into a 670, 760 or 770. A 650 Ti isn't really better than a 560 Ti.
 

MJLord

Member
That 650 Ti is severely holding you back. I'm not saying go for a 780 or anything too crazy, but maybe look into a 670, 760 or 770. A 650 Ti isn't really better than a 560 Ti.

Yeah I've been thinking about upgrading it for a while. I got it last year and had to skimp on it because other debts came up.

Thanks for the suggestions I'll take a look around now.
 
You should post a comparison shot, with and without sweetfx.


Since WD already supports SMAA and temporal SMAA, I only see one case where you could get better results: combining 2x-4x MSAA with SweetFX's SMAA, since you can't do that in WD. But MSAA is so hungry in this game that even 2x kills my framerate, and that seems to be the case for a lot of people, even with better GPUs and more VRAM.
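For a sense of why MSAA gets so expensive, here is a crude sketch of just the multisampled color + depth storage at 1080p. The numbers are illustrative only; a real engine keeps additional render targets, so the true cost is higher:

```python
# Color + depth storage for a multisampled render target, in MiB.
# Assumes 4-byte color and 4-byte depth per sample (an illustrative choice).
def msaa_framebuffer_mib(width, height, samples, color_bytes=4, depth_bytes=4):
    per_pixel = (color_bytes + depth_bytes) * samples
    return width * height * per_pixel / 2**20

for samples in (1, 2, 4, 8):
    print(f"{samples}x: {msaa_framebuffer_mib(1920, 1080, samples):.1f} MiB")
```

Memory (and bandwidth) scales linearly with the sample count, which is part of why post-process AA like SMAA is the cheap alternative.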
 

levious

That throwing stick stunt of yours has boomeranged on us.
How about stopping this BS? The TOS says nothing about this, and everyone keeps whining like it's the end of the world that adblock is used. It's what a lot of smart people use to stop the internet from becoming billboard central, and sometimes they whitelist sites. But I would assume anyone worth their salt just turns on ad-blocking software and leaves it at that.

Anyway, these pics and this performance info are really Ubisoft being Ubisoft. I hope the drivers and patches help with the VRAM issues. I have 4GB on each 760 in an SLI setup. I hope that works, but hearing that people with 6GB are having trouble with VRAM?! That console-port feel... it's almost like they forgot PC optimization again.


Admitting to using adblock on this site is a likely ban. The guy was just giving him a heads-up.
 