Watch_Dogs PC performance thread [Read post #1215 before posting]


Xyber

Member
Engine changes between the last WHQL and the new one make any direct comparison between the two pointless, in either sGPU or SLI. As such, all I can say is 'download today's driver for the best experience'.

So there will be a new WHQL out today? Or do you mean "just use the latest beta driver for the best experience"?
 

cripterion

Member
Well, it's been confirmed by Jonathan Morin (Creative Director of Watch Dogs) that yes, 3 GB of VRAM is limiting 780s/Tis (perhaps also 79xx/280 cards) in this game.

See here.

As a new 780 owner, I'm very upset at Nvidia.

That guy is just full of crap though, seriously. He's the one who talked about the CPU & RAM requirements and running ultra on a single GTX 670, or how the PC version would look better than the 2012 footage.
 
I feel like this game was made by people who don't go outside much during the day. I mean, the graphical discrepancy between shitty weather/night time and daytime is almost generational.
 
Proof:
[screenshot comparing daytime and night/bad-weather visuals]
 

Kinthalis

Banned
Can't wait to see whether performance improves once the game hits and AMD and Nvidia, hopefully, update their drivers.

As for the discussion about multiple cores vs single core performance - It's really a non-issue for modern processors with 4 cores or more.

Let's say you're breaking up a complex computation into 4-millisecond chunks, one chunk per core at a time. On a PS4, with six usable cores, you complete 6 such chunks every 4 milliseconds. On a PC, say a last-gen i5, each of those chunks takes only 2 milliseconds, so its four cores complete 8 chunks in the same window, and a modern i7 completes even more (see the sketch below).
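
A minimal sketch of that arithmetic in Python, with the core counts and per-chunk times above treated as illustrative assumptions rather than benchmarks:

```python
# Back-of-the-envelope: chunks of work finished in a fixed time window,
# assuming one chunk runs per core at a time. Core counts and per-chunk
# durations are illustrative assumptions, not measurements.

def chunks_per_window(cores: int, chunk_ms: float, window_ms: float = 4.0) -> int:
    return int(cores * (window_ms / chunk_ms))

print(chunks_per_window(cores=6, chunk_ms=4.0))  # PS4-like, 6 usable cores -> 6
print(chunks_per_window(cores=4, chunk_ms=2.0))  # last-gen i5, faster cores -> 8
print(chunks_per_window(cores=8, chunk_ms=2.0))  # i7 counting HT threads  -> 16
```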

It's just a matter of devs taking the time to properly optimize CPU performance, at least for now. New APIs on PC like DX12 and Mantle will make this a lot easier, and will free up a lot of CPU time that is currently spent just babysitting the GPU.
 

Brofist

Member
That guy is just full of crap though, seriously. He's the one who talked about the CPU & RAM requirements and running ultra on a single GTX 670, or how the PC version would look better than the 2012 footage.

Still, he has a point. Why is 3 GB still standard across most of the 780/Ti series, especially when even the 770 is available in a 4 GB model?
 

cripterion

Member
Still, he has a point. Why is 3 GB still standard across most of the 780/Ti series, especially when even the 770 is available in a 4 GB model?

Nvidia is kinda greedy on RAM, but honestly I've had no trouble playing any game so far on my rig with my 670s. I'm 100% positive I can take them well into next year, when Nvidia's new cards are out. Even Titanfall, which apparently doesn't use SLI, runs flawlessly. As long as Nvidia continues to optimize their drivers, I think it's all good.

PS: I play at 1080p.
 

Zafir

Member
I'm hoping the 800 series doesn't take too long to come out. I want to upgrade from a 660 Ti, but I don't really see much point in getting a 7xx card if the 8xx series is coming this year.
 

Xyber

Member
Still he has a point. Why is 3GB still standard in most of the 780/Ti series, especially when even the 770 is available in a 4GB model.

I believe it's because of the memory bus. The 770 uses a 256-bit bus, and for that 2 or 4 GB of VRAM works best. The 780 uses a 384-bit bus, where 3 or 6 GB works best.

I'm not all that great with this stuff though, so this might not be completely accurate. :p
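
For what it's worth, the pairing follows from the memory layout: one GDDR5 chip sits on each 32-bit channel of the bus, so total VRAM is (bus width / 32) × per-chip capacity. A minimal sketch, assuming the common chip densities of the era (256 MB and 512 MB); that density assumption is mine, not from the thread:

```python
# VRAM sizes implied by bus width: one memory chip per 32-bit channel, all
# chips the same density. The 256 MB / 512 MB chip sizes are an assumption
# (common GDDR5 densities of the era), used purely for illustration.

def vram_options_mb(bus_width_bits: int, chip_sizes_mb=(256, 512)):
    channels = bus_width_bits // 32            # 32-bit channels on the bus
    return [channels * size for size in chip_sizes_mb]

print(vram_options_mb(256))  # [2048, 4096] -> 2 GB or 4 GB (GTX 770-style bus)
print(vram_options_mb(384))  # [3072, 6144] -> 3 GB or 6 GB (GTX 780-style bus)
```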
 
Complete bullshit.

Xenon had 3 cores (6 "logical cores"); Cell had 1 PPE core (2 "logical cores") and 7 SPE cores.
That did not force PCs last gen to have that many cores, and these new consoles having 6 usable, slow cores is not going to obsolete quad cores (especially when you take into account the slowing of Moore's Law, which means PC versions of games later in this gen cannot just rely on the CPUs in PCs of the day being massively faster than they were at the start of the gen)!

To claim that you will need a CPU with 6/8 cores or hyperthreading in a year to be able to run games decently is just absurd!

The CPU tech in the consoles was actually advanced for its time in being multi-core. PCs got dual-core CPUs with the Pentium D and Core 2 Duo; the hunt for higher clock frequencies stopped, and gains came instead from improved CPU features. There were no consumer 6/8-core parts available when the last-gen consoles launched, but we did see an evolution toward more cores. AMD can't compete with Intel on per-core strength, and because of that weak competition, Intel's CPUs have made only small performance jumps in the last few years. The new console CPUs are also weak per core. So with all these factors in mind, I think devs will shift to utilizing more cores, and there is untapped power in what's already available: octa-cores and Intel's Hyper-Threading technology.
 

Megasoum

Banned
Also... I just want to say, the port might be terrible but can we at least recognize the fact that we're getting a PC version of an Ubi game on day 1? That's almost unheard of...
 
I believe it's because of the memory bus. The 770 uses a 256-bit bus, and for that 2 or 4 GB of VRAM works best. The 780 uses a 384-bit bus, where 3 or 6 GB works best.

I'm not all that great with this stuff though, so this might not be completely accurate. :p

Anything beyond 2 GB on a 256-bit bus has no benefit; past that it's just marketing PR to sell more cards.
 

Kinthalis

Banned
PS4 has 8 GB on a 256-bit bus.

Only 5 GB of that is available to games. AND those 5 gigs need to hold not just the things required by the renderer, but EVERYTHING that makes up the game.

On PC, only the things required by the renderer need to be in the GPU's buffer (with a few exceptions); the rest is stored in system RAM.
 
Also... I just want to say, the port might be terrible but can we at least recognize the fact that we're getting a PC version of an Ubi game on day 1? That's almost unheard of...

Assassin's Creed games are the only Ubisoft games that don't get a PC release on day one.
 

SoundLad

Member
Thinking about picking this up, but first I'd like to know: does anyone here have a setup similar to mine, and if so, how is it running?

My specs:
i7 3770 OC'd slightly to 3.8 GHz (non-"K" chip)
Palit Jetstream GTX 680 2GB
8 GB RAM
WD Black 7200rpm 1TB

I'm not expecting much (high/ultra at 1080p, no AA, at ~40-60 fps?), going by the impressions of some of the posters here.
 

cripterion

Member
SLI optimizations

Nvidia said:
Battlefield 4 – updated profile to support test client
Bound by Flame – updated profile
Call of Duty: Online – added profile
Dark Souls II – added profile, added NVCPL anti-aliasing support
Daylight - added profile
Diablo III – updated profile
Everquest: Landmark – updated profile
Icarus – added profile
Smite – added profile
Sniper Elite 3 – added profile
Total War: Rome 2 – added profile
War Thunder – updated profile
Watch Dogs – updated profile
Wildstar – added profile
Windborne – added profile
World of Tanks – updated profile

I never thought I'd see Rome 2 in that list. Maybe it's time to get back into it!
 

Crossing Eden

Hello, my name is Yves Guillemot, Vivendi S.A.'s Employee of the Month!
[three official 4K Watch Dogs screenshots]

Those 4K screenshots look phenomenal. O.O I'm getting the PS4 version, but seriously, the PC version looks great on really good PCs.
 

Serandur

Member
You're upset at Nvidia because Ubisoft made a game that needs more VRAM than the graphics card you bought has? Wat.

I'm upset that a game clearly seems able to actually utilize more VRAM than the 780 has (not just plain caching), while the 780 is still very much performing well enough to justify having that extra VRAM, and meanwhile Nvidia, for the longest time, forbade 6 GB models from AIBs, all to protect their precious and pointless Titan line and to force people's hand at premature upgrades. So yes.

780s/Tis are premium cards, Nvidia's best short of the ridiculously priced Titans. They cost a fortune; they should not be so clearly mismatched, and restricted from being otherwise, and only now, too late for still-recent buyers like myself, is Nvidia allowing 6 GB regular 780s (not even the Ti). The best part is that these issues are coming mostly in Nvidia-sponsored games like Watch Dogs and Daylight. People like me don't buy these $500+ monsters to join the "it's just unoptimized" train when games demand more than we've got; we buy them, either by themselves or in SLI, to play games at their best. Clearly that's not happening now.

A developer's ability to easily use more VRAM than the card has is not the problem; the problem is that the card was intentionally crippled so that it never had, even as an option, the VRAM its position and price tag demanded in the first place. It's disgusting, and I hope to swear off Nvidia entirely with my next 4K-targeted upgrade (when I won't need their exclusive support for driver downsampling anymore).

Nvidia have really gone off the deep end this hardware generation, in my opinion. First the GK104 misleadingly released as their flagship, then the Titan crap getting first crack at the real flagship, then the iterative Titan Black and 780 Ti, and then the Titan Z with shady market segmentation all around. I've never been more shocked at Nvidia than I am at the moment: a $3,000 dual-GPU card when their competitor is selling something of the same caliber for half the price?

My two-month-old 780 is already exhibiting troubling limitations it should not (when the GPU itself is so powerful). AMD are offering much better hardware sense and value at every price point, while Nvidia are selling people $700 cards with no more VRAM than AMD's 2.5-year-old, now half-price, former flagship in the middle of a console generation shift. Upset is probably not a strong enough word; outraged is better. They were in the best position to know this would happen, and they probably wanted it to.
No, but I'm pissed I pulled the trigger on a second 780 and my Step-Up window ran out just before EVGA offered the 6 GB Tis. I've been trying to offload one to a few mates who were interested in a cheapish 780 a few months back, but now they're looking at the offer like I'm trying to sell them a leper, in light of Wolf14 and now Watch_Dogs.
That's harsh, sorry to hear that.
 

GHG

Gold Member
Liking what I see in that article. Looks like my 660 SLI setup will have some legs since I got the 3 GB versions. Hopefully I can enable TXAA x2 over SMAA.

So glad I got these puppies over the more expensive single 2 GB 680 that everyone was suggesting at the time.
 

popo

Member
I understand that the mods have a policy, but this thread turned nasty.

Industry insiders breaking the terms of their employment contracts are lauded as heroes, but someone who posts performance info for the good of others and doesn't provide a receipt is pounced on by the lynch mob.

Ask the publishers which of the two they regard as the bigger crime.
 

ymmv

Banned
I understand that the mods have a policy, but this thread turned nasty.

Industry insiders breaking the terms of their employment contracts are lauded as heroes, but someone who posts performance info for the good of others and doesn't provide a receipt is pounced on by the lynch mob.

Ask the publishers which of the two they regard as the bigger crime.

Why are you sticking up for pirates?
 

Pandemic

Member
New GeForce update for Watch Dogs!

GeForce 337.88 WHQL, Game Ready Watch Dogs Drivers Increase Frame Rates By Up To 75%
The new GeForce 337.88 WHQL, Game Ready Watch Dogs Drivers are now available to download from GeForce.com, and automatically through GeForce Experience. For gamers jumping into the world of Watch Dogs tomorrow, the 337.88 WHQL drivers are an essential upgrade, optimizing single GPU and multi-GPU SLI performance in the open-world title. For more about the game, its performance, and how to tweak it for the ultimate experience, check out our Watch Dogs Graphics, Performance, & Tweaking Guide.

In addition to day-of-launch Watch Dogs enhancements, 337.88 WHQL also includes a wealth of upgrades and optimizations that boost system performance and improve existing features. Namely, system-wide DirectX 11 and SLI performance optimizations of up to 75%, new technology that reduces game load times, new and updated SLI profiles, and 3D Vision enhancements that optimize DirectX 10 and DirectX 11 3D titles. To learn more, read on.
SOURCE

Edit: Already posted, my bad!
 

Dr Dogg

Member
Loving some of the config file tweaks. By the sounds of it there's a fair bit of customisation that can be done too.

That's harsh, sorry to hear that.

Nah, it's alright. It's just a combination of bad timing and me being impatient. 1920x1080 is still very viable with 3 GB, and I can see it staying that way for a fair while yet. Though I do love a higher resolution and more advanced AA implementations, and there 3 GB is becoming the limiting factor, more than GPU power, at the really high end.
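
For anyone who'd rather script those tweaks than hand-edit, a minimal sketch; it assumes the profile sits at the usual Documents\My Games\Watch_Dogs\GamerProfile.xml location, and the attribute name "ExampleQualitySetting" is a hypothetical placeholder, so check your own file for the real setting names:

```python
# Minimal sketch: back up, then edit one setting in Watch Dogs' GamerProfile.xml.
# Assumptions: the path below matches your install, and "ExampleQualitySetting"
# is a hypothetical placeholder -- swap in the real attribute names from your file.
import shutil
import xml.etree.ElementTree as ET
from pathlib import Path

profile = Path.home() / "Documents" / "My Games" / "Watch_Dogs" / "GamerProfile.xml"
shutil.copy2(profile, profile.with_name(profile.name + ".bak"))  # keep a backup

tree = ET.parse(profile)
for elem in tree.getroot().iter():
    if "ExampleQualitySetting" in elem.attrib:   # hypothetical attribute name
        elem.set("ExampleQualitySetting", "high")
tree.write(profile, encoding="utf-8", xml_declaration=True)
```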
 