Watch_Dogs PC performance thread [Read post #1215 before posting]

Status
Not open for further replies.
Actually, this is supported by solid evidence.

Do you have any evidence that a 4GB 670/760/770 performs noticeably better than its 2GB counterpart? You claim it's fact, but nobody has provided any evidence of this 'fact' whatsoever.

No matter what we need to have an open mind when it comes to these things. Every game is different. It's a new generation and developers will be using new engines that utilize these cards differently. The consoles have lots of ram so maybe game development will change in ways where more graphics memory will be required. Truth is we don't know how things will turn out. I'm sure if there are developers here who can shed some light on this issue it would very much be appreciated by everyone.
 

sk3tch

Member
I'm in "budget mode" with my rigs until next gen PC GFX cards drop...3GB or lower all around...doh! I'm still only doing 5760x1080/120hz so maybe I'll be fine. No 4K, yet.

Glad Ubi is pushing graphics like this...everything I have seen so far has been impressive so I'm not surprised that 4K equals 5-6GB VRAM. I'm also surprised people are upset...4K hasn't been in the gaming vernacular for that long. 760s/780s/290s/290xs/etc. still kill games at 1080p/1440p/etc. - if you really want to max a game like Watch_Dogs then maybe it's time to upgrade to a bigger VRAM card. You'd be making a mistake though. I'm playing the waiting game until next gen.
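For a rough sense of scale, here's a back-of-envelope sketch (not anything from Ubisoft; the 4 bytes per pixel and triple-buffered swap chain are illustrative assumptions) showing why raw render-target cost alone can't explain a 5-6GB figure at 4K:

```python
# Back-of-envelope render-target math. Assumptions (not from the game):
# 4 bytes per pixel (RGBA8) and a triple-buffered swap chain.
def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    return width * height * bytes_per_pixel * buffers / (1024 ** 2)

mb_1080p = framebuffer_mb(1920, 1080)  # ~23.7 MB
mb_4k = framebuffer_mb(3840, 2160)     # ~94.9 MB, exactly 4x 1080p

print(f"1080p: {mb_1080p:.1f} MB, 4K: {mb_4k:.1f} MB")
```

So the swap chain itself is well under 100MB even at 4K; a multi-gigabyte VRAM footprint is dominated by texture quality and intermediate buffers, which is why the texture setting is what actually separates 2GB and 4GB cards.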
 

LiquidMetal14

hide your water-based mammals
I'm in "budget mode" with my rigs until next gen PC GFX cards drop...3GB or lower all around...doh! I'm still only doing 5760x1080/120hz so maybe I'll be fine. No 4K, yet.

Glad Ubi is pushing graphics like this...everything I have seen so far has been impressive so I'm not surprised that 4K equals 5-6GB VRAM. I'm also surprised people are upset...4K hasn't been in the gaming vernacular for that long. 760s/780s/290s/290xs/etc. still kill games at 1080p/1440p/etc. - if you really want to max a game like Watch_Dogs then maybe it's time to upgrade to a bigger VRAM card. You'd be making a mistake though. I'm playing the waiting game until next gen.

Pushing in the wrong context though. It just seems very badly optimized if it will eat up so much VRAM because it simply can.
 

Netboi

Banned
Downloaded the new Nvidia drivers for Watch Dogs, seems a lot smoother. Didn't experience any micro stutters.

AVG FPS: 45-60
Blowing up more than 5 cars in the gas station = 25fps

My setup:

Gigabyte Windforce GTX 770 4GB (OC 1200/1800)
FX-8350 (Not overclocked)
8GB RAM

I used the Nvidia settings chart for Watch Dogs here
http://www.geforce.com/whats-new/gu...ng-guide#watch-dogs-optimal-playable-settings

I also changed the FX Quality to PC instead of console, and used Sweet FX 1.5.1

 
What the hell are you talking about?

Fever dream maybe? Dementia? I'm wondering why there's a complete meltdown over the beginning of a game. Not even a chance to let things develop. Plus, spoiling the opening, maybe people wanted to go in fresh.

Anyway, I beg to differ, a lot.
 

Dr Dogg

Member
I wonder what Batman AO's vram usage was at 4K with TXAA High. Probably pretty high.

Let's have a look-see. Been meaning to give the new WHQL drivers a bit more of a test.

Edit: Oh, and just in case this gets misconstrued: this is not anything from Watch_Dogs but from Batman: Arkham Origins, which uses similar Nvidia technology relevant to both titles.

2.7GB minimum, but I hit 3GB pretty quickly. I'd assume it would climb higher if you had more. Performance is good, though I'd expect it to take a dive in the Deadshot room and a couple of the crime scenes, as they're quite busy.

Does anybody know if Watch_Dogs has a built in benchmark tool? Far Cry 2 had one of the better ones available.
 
Downloaded the new Nvidia drivers for Watch Dogs, seems a lot smoother. Didn't experience any micro stutters.

AVG FPS: 45-60
Blowing up more than 5 cars in the gas station = 25fps

My setup:

Gigabyte Windforce GTX 770 4GB (OC 1200/1800)
FX-8350 (Not overclocked)
8GB RAM

I used the Nvidia settings chart for Watch Dogs here
http://www.geforce.com/whats-new/gu...ng-guide#watch-dogs-optimal-playable-settings

I also changed the FX Quality to PC instead of console, and used Sweet FX 1.5.1

Performance? Resolution? AA? Memory usage? Settings?
 

Nokterian

Member
Let's have a look-see. Been meaning to give the new WHQL drivers a bit more of a test.



2.7GB minimum, but I hit 3GB pretty quickly. I'd assume it would climb higher if you had more. Performance is good, though I'd expect it to take a dive in the Deadshot room and a couple of the crime scenes, as they're quite busy.

Does anybody know if Watch_Dogs has a built in benchmark tool? Far Cry 2 had one of the better ones available.

With Batman: Arkham Origins I never use v-sync; I use adaptive v-sync and get around 80-100fps.

Also, I'm torn: I'm thinking of getting Watch Dogs on PS4 instead of PC, since Ubisoft games are as unoptimized as usual.
 

LowParry

Member
Hmmm. I'm kind of interested in the game.

i5 3570k (4.2)
ATI 7950 3GIG RAM
8GIG RAM (1866)
SSD
Will be running 1080p

I figure I'll just be going with High settings since I don't see too much of a difference between High and Very High/Ultra settings with other games.

I should be alright? And controller support is in yes?
 

sk3tch

Member
Pushing in the wrong context though. It just seems very badly optimized if it will eat up so much VRAM because it simply can.

4K resolution, though? C'mon. 2x 1080p...

Yeah, it may be a bit unoptimized...but give it some time.

Rather see it push (even if a bit unoptimized) than be like Wolfenstein (which is a great game) that is very console-ized.
 

LiquidMetal14

hide your water-based mammals
4K resolution, though? C'mon. 2x 1080p...

Yeah, it may be a bit unoptimized...but give it some time.

Rather see it push (even if a bit unoptimized) than be like Wolfenstein (which is a great game) that is very console-ized.

I agree with you on the latter point.
 
Let's have a look-see. Been meaning to give the new WHQL drivers a bit more of a test.



2.7GB minimum, but I hit 3GB pretty quickly. I'd assume it would climb higher if you had more. Performance is good, though I'd expect it to take a dive in the Deadshot room and a couple of the crime scenes, as they're quite busy.

Does anybody know if Watch_Dogs has a built in benchmark tool? Far Cry 2 had one of the better ones available.

I usually disable vsync when running benchmarks
 

Netboi

Banned
performance? resolution? aa? memory usage? settings?

I used the Nvidia settings chart that I linked in the post.
The AA was temporal SMAA, shadows High, everything else maxed out.
No micro stutters.

Avg. FPS: 42-60
1080p
System RAM usage was 2GB.
CPU usage: 29 percent.
Not sure about VRAM; my card only has 4GB total.

It can dip down to 25fps when putting 5 or more cars in the gas station and blowing it up.
 

pahamrick

Member
Just come over to the green side, grass is greener as you know ;)

Going nVidia for me is a matter of when, not if, so no problems there. Ever since Catalyst 13.something, I'm constantly having crashes because my display drivers crash/stop responding amongst other driver related issues. Would have a 770 already, or a pair of them, if it wasn't for my finances.
 

Jtrizzy

Member
Should this have shown up in my Uplay account by now for a preload? I bought it with an Nvidia code redeemed on the Ubisoft website. Help! Are people using VPNs?
 

Seanspeed

Banned
No matter what we need to have an open mind when it comes to these things. Every game is different. It's a new generation and developers will be using new engines that utilize these cards differently. The consoles have lots of ram so maybe game development will change in ways where more graphics memory will be required. Truth is we don't know how things will turn out. I'm sure if there are developers here who can shed some light on this issue it would very much be appreciated by everyone.
There is no need for an open mind in an area where clear performance tests can verify claims one way or the other.

I'm not saying it's absolutely untrue that a 4GB 670/760/770 was a smarter decision than a 2GB one, but so far this hasn't been the case, even at very high resolutions where you'd expect them to be VRAM-limited. The guy claimed that the 'memory bus width' limitation was a bunk theory, but made absolutely no attempt to provide evidence to support this claim. I'm sorry if that isn't good enough for me.
 

TSM

Member
There is no need for an open mind in an area where clear performance tests can verify claims one way or the other.

I'm not saying it's absolutely untrue that a 4GB 670/760/770 was a smarter decision than a 2GB one, but so far this hasn't been the case, even at very high resolutions where you'd expect them to be VRAM-limited. The guy claimed that the 'memory bus width' limitation was a bunk theory, but made absolutely no attempt to provide evidence to support this claim. I'm sorry if that isn't good enough for me.

AndyBNV from nvidia already posted in this thread that 770 4GB gets you ultra textures. 2GB limits you to high textures or you get stuttering. So there's at least one game where the 2GB means lower quality visuals.
 

cripterion

Member
Going nVidia for me is a matter of when, not if, so no problems there. Ever since Catalyst 13.something, I'm constantly having crashes because my display drivers crash/stop responding amongst other driver related issues. Would have a 770 already, or a pair of them, if it wasn't for my finances.

I feel ya. For me, ATI peaked with the Radeon 9800 Pro (my last red card was the X850 XT).

I was actually trying to resell my two Asus 670s and buy a 780 Ti, but I'm keeping them for sure until Nvidia releases their new lineup. Even though my cards are 2GB, buying them was the best video card investment I've made so far.
 

Aeana

Member
As a courtesy, I have PMed everyone who has yet to provide evidence of ownership who still needs to. You have about 7 hours.

This thread will be closed and a new one will be created this evening.
 
Going nVidia for me is a matter of when, not if, so no problems there. Ever since Catalyst 13.something, I'm constantly having crashes because my display drivers crash/stop responding amongst other driver related issues. Would have a 770 already, or a pair of them, if it wasn't for my finances.

I built a gaming PC for the first time in 2012 and put an AMD 7870 in it. Not only did certain games suffer from terrible, terrible performance (due to the publisher only optimizing for Nvidia cards), but my damn Catalyst Control Center stopped opening. I tried pretty much everything, but eventually I had to reinstall Windows. This happened again a few weeks later.

I'm not sure if it was actually a problem with AMD's shit, but I know that since I got an Nvidia card my PC has been a-okay. And Shadowplay is awesome; once I get upload speeds that don't suck ass, I'll start a YouTube channel and try to get the big bucks.
 

TheContact

Member
I don't think my 760 is good enough for this; I'll most likely buy the PS4 version when I finally get a PS4 in a month.
 

Caayn

Member
I don't know what ass Nvidia has its head up, because I can play with my 750 Ti with everything on default high settings and get 30-40fps just fine :|
It's stated in that article that he aimed to get 35fps as a minimum.
 

pahamrick

Member
I built a gaming PC for the first time in 2012 and put an AMD 7870 in it. Not only did certain games suffer from terrible, terrible performance (due to the publisher only optimizing for Nvidia cards), but my damn Catalyst Control Center stopped opening. I tried pretty much everything, but eventually I had to reinstall Windows. This happened again a few weeks later.

I'm not sure if it was actually a problem with AMD's shit, but I know that since I got an Nvidia card my PC has been a-okay. And Shadowplay is awesome; once I get upload speeds that don't suck ass, I'll start a YouTube channel and try to get the big bucks.

Had that same problem a couple times, causing me to do a complete reinstall several times. I only went with the 5870s when I built my system because it was before the GTX 400 series came out and the 5870s were the card to beat at the time. I figured I had a Radeon 9800Pro and it treated me well, so I'll go red this time.

In the end, I don't completely regret going red, but I definitely won't stray from nVidia in the future.
 

knitoe

Member
I don't know what ass Nvidia has its head up, because I can play with my 750 Ti with everything on default high settings and get 30-40fps just fine :|
How about?

1) Post your spec.

2) Post game settings.

3) Run MSI Afterburner or EVGA Precision. Enable in-game monitoring. Select CPU & GPU usage, system & video card RAM usage, and FPS.

4) Take a screenshot with the tool built into Afterburner/Precision and post it. Even better if you show a video.

Then we might take you seriously...
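If you'd rather log those numbers than eyeball an overlay, a small sketch along these lines works on Nvidia cards. It builds on the real `nvidia-smi` query flags; the CSV sample in the docstring is made up for illustration, and `nvidia-smi` is assumed to be on the PATH:

```python
# Minimal GPU/VRAM sampler using nvidia-smi's CSV query output.
# Assumes an Nvidia GPU with nvidia-smi available on the PATH.
import subprocess

QUERY = [
    "nvidia-smi",
    "--query-gpu=utilization.gpu,memory.used,memory.total",
    "--format=csv,noheader,nounits",
]

def parse_gpu_line(line):
    """Parse one CSV line, e.g. '42, 2048, 4096' (hypothetical sample)."""
    util, used, total = (int(field.strip()) for field in line.split(","))
    return {"gpu_util_pct": util, "vram_used_mb": used, "vram_total_mb": total}

def sample_gpu():
    # One sample for the first GPU; call in a loop to log over time.
    out = subprocess.check_output(QUERY, text=True)
    return parse_gpu_line(out.splitlines()[0])
```

Calling `sample_gpu()` once per second while playing gives a VRAM-over-time log you can post alongside settings, which settles "how much does it actually use" arguments better than a single screenshot.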
 