Watch_Dogs PC performance thread [Read post #1215 before posting]

Maybe a long shot, but is anybody playing at 4K with a 3GB card? I'm curious what performance is like. In most games I'm fine on VRAM at 4K if I don't use AA... and really it's almost not needed at that res.
 
What were you expecting? You're running at 2.5x the resolution of the PS4 version at twice the framerate and with higher image quality settings and that's unpatched without any optimised drivers. That doesn't sound unoptimised to me. I personally wasn't expecting anyone with an i5 or anyone running at 2560x1440 to hit 60fps yet here we are.
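Rough back-of-envelope on that, assuming the PS4 version targets 900p/30 (the commonly reported figure) and the PC setups in question are 1440p/60; the numbers are only illustrative:

# Pixel and frame-throughput comparison (assumed targets: PS4 1600x900 @ 30fps, PC 2560x1440 @ 60fps)
ps4_pixels = 1600 * 900      # 1,440,000 pixels per frame
pc_pixels = 2560 * 1440      # 3,686,400 pixels per frame

print(pc_pixels / ps4_pixels)                    # ~2.56x the resolution
print((pc_pixels * 60) / (ps4_pixels * 30))      # ~5.1x the pixels pushed per second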

Wow someone else that thinks like me.

Small world.
 
i7-4770t

8GB RAM

R9 290

2560x1440 at ULTRA settings

game looks horrible

rAaACs2.jpg

Part of the point I'm making. Even at ULTRA settings, regardless of playable framerate or not, this game looks terrible. I'm not sure WHAT exactly is using up so much GPU/CPU power.
 
Do you even understand what "needed" and "necessary" mean?

Hell, that video quoted by the person you quoted wasn't proof of anything. It only showed that the Windows CPU scheduler was putting some threads on cores whose other logical core/thread already had something running, which is standard practice.

It does not show 1) what speedup the game was getting from it, or 2) whether the game was taking notable advantage of it (and not, say, the program recording the footage or other programs running in the background)!
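If anyone actually wants to check this themselves instead of eyeballing a video, a rough sketch like the one below (using the third-party psutil package; the 60-second sample window is arbitrary) at least shows how load is spread across logical cores while the game runs. It still can't tell you how much speedup the game gets from HT; that would need an HT-on vs HT-off benchmark.

import psutil  # third-party package: pip install psutil

physical = psutil.cpu_count(logical=False)
logical = psutil.cpu_count(logical=True)
print(f"{physical} physical cores, {logical} logical cores")

# Sample per-logical-core utilisation once a second for 60 seconds while the game runs.
samples = [psutil.cpu_percent(interval=1.0, percpu=True) for _ in range(60)]

# Average load on each logical core over the run.
for i, loads in enumerate(zip(*samples)):
    print(f"core {i}: {sum(loads) / len(loads):.1f}%")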

Why are you fighting this so hard, man? It is pretty damn obvious that unless something changes, an i7 is going to be standard, at least from Ubisoft, for this gen on PC. Sure, DX12 will most likely help, but it is still at least a year away, if not more.
 
The same thing was said about Crysis 3, which gained a whopping 4 fps with a 3770K vs. a 3470.

Yeah, speaking of which: those Crysis 3 benchmarks someone was posting around clearly showed that the game could take advantage of more than 4 cores (the hex-core Intels had a large performance advantage over the 2600K and 2500K), yet there was very little difference between a 2500K and a 2600K.

Why are you fighting this so hard, man? It is pretty damn obvious that unless something changes, an i7 is going to be standard, at least from Ubisoft, for this gen on PC. Sure, DX12 will most likely help, but it is still at least a year away, if not more.

But it is not "damn obvious"!

The only thing that is obvious is that a lot of people who have very little knowledge of CPUs are claiming that i7s will be "needed" for games, with no solid evidence.
 
I was referring to people running the game on i5s maxed out, paired with 780s, and neither having full utilization. You think this is a good port? A game where people with AMD cards can't seem to play on ultra with cards that match the Nvidia counterpart? A game that has built-in mouse acceleration? A game that uses ~5GB of VRAM at 1440p? Your biases were made clear a while ago, but how you can call this a good port is beyond me.

The issue is that the writing has been on the wall for a while now, with games like Crysis 3 and even Titanfall needing more than 2GB to run maxed at higher resolutions. I've been saying for the last 12 months that 2GB cards would not be enough and that 3GB will be pushing it pretty soon. As for the quad-core issue, as soon as the next-gen consoles were announced people should have seen this coming. Consoles are always the baseline for multiplatform games, so as a PC gamer you need to mirror them from an architectural standpoint as much as you can. That means having 8 cores/threads or more and having as much VRAM as possible.

As for it being a decent port, I stand by this statement because the game appears to be scaling well to older hardware, provided you are happy to tweak settings. The Twitch stream that surfaced yesterday was on a GTX 570 and it was running well. Then there are further examples from gaffers in the official thread with modest hardware who say it's running well for them, even on high/ultra settings.
 
When time and again AMD have botched game releases with their drivers, I'd be hesitant to lay all the blame for that one at Ubisoft's door.
Both GPU vendors, not just AMD.

It seems like every new AAA game will run like pants (god bless SLI owners) until the second patch and game-specific beta drivers arrive.
 
Part of the point I'm making. Even at ULTRA settings, regardless of playable framerate or not, this game looks terrible. I'm not sure WHAT exactly is using up so much GPU/CPU power.

Yeah, it's clearly unoptimized considering how poor some of these textures look. It's as though they rushed through a lot of the game's development and probably contracted out a lot of the work to third-rate development teams.

This one doesn't look too bad, but the animation isn't fluid to me.

RqxQKez.jpg
 
Yeah, speaking of which: those Crysis 3 benchmarks someone was posting around clearly showed that the game could take advantage of more than 4 cores (the hex-core Intels had a large performance advantage over the 2600K and 2500K), yet there was very little difference between a 2500K and a 2600K.

They were using something like a 3960X, which is 6 cores plus 6 HT threads, lol. Not to mention it was around $1,100.
 
The same thing was said about Crysis 3, which gained a whopping 4 fps with a 3770K vs. a 3470.
With DX12 coming, that's even less so. The i7s with true extra cores might move ahead, but HT will not, and even that's extremely unlikely.

Erm, one of the major motivations behind DX12 is better multithreading in games. DX12 will only increase the gap between an i5 and i7.

We now have Crysis 3, Battlefield 4, Wolfenstein and Watch Dogs all offering good scaling up to 8+ threads, and that's without DX12. The tangible benefit in the here and now is minimal, I agree, but if you want to buy a CPU that will last you through this generation of games, then an i7 is becoming an increasingly sound investment.
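To put some hedged numbers on "scaling up to 8+ threads": a quick Amdahl's law sketch (the 70-90% parallel fractions below are illustrative guesses, not measurements from any of these games) shows why the i5-to-i7 gap stays small unless a game is very heavily threaded.

# Amdahl's law: speedup(n) = 1 / ((1 - p) + p / n), where p is the parallel fraction of the work.
def speedup(p, threads):
    return 1.0 / ((1.0 - p) + p / threads)

for p in (0.70, 0.80, 0.90):  # illustrative parallel fractions
    s4 = speedup(p, 4)  # i5-like: 4 hardware threads
    s8 = speedup(p, 8)  # i7-like: 8 hardware threads (treating HT threads as full cores, which flatters HT)
    print(f"p={p:.0%}: 4 threads = {s4:.2f}x, 8 threads = {s8:.2f}x, i7 advantage = {(s8 / s4 - 1) * 100:.0f}%")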
 
The issue is that the writing has been on the wall for a while now, with games like Crysis 3 and even Titanfall needing more than 2GB to run maxed at higher resolutions. I've been saying for the last 12 months that 2GB cards would not be enough and that 3GB will be pushing it pretty soon. As for the quad-core issue, as soon as the next-gen consoles were announced people should have seen this coming. Consoles are always the baseline for multiplatform games, so as a PC gamer you need to mirror them from an architectural standpoint as much as you can. That means having 8 cores/threads or more and having as much VRAM as possible.

As for it being a decent port, I stand by this statement because the game appears to be scaling well to older hardware, provided you are happy to tweak settings. The Twitch stream that surfaced yesterday was on a GTX 570 and it was running well. Then there are further examples from gaffers in the official thread with modest hardware who say it's running well for them, even on high/ultra settings.
Dual cores did well for the first half of last gen, especially overclocked. I have no argument over VRAM, 2GB will not be good enough, but this game uses more VRAM than anything short of a Titan or 780 Ti has, for textures that aren't great and that don't seem to drop that much when you disable AA.
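Quick back-of-envelope on why turning AA off doesn't free up that much VRAM (the buffer sizes below are simplified assumptions, not numbers from the game; a real deferred renderer has several G-buffer targets):

# Rough render-target cost at 1440p, assuming 4 bytes per pixel for colour and 4 for depth.
width, height = 2560, 1440
bytes_per_pixel = 4

colour = width * height * bytes_per_pixel
depth = width * height * bytes_per_pixel
msaa_4x_extra = 3 * (colour + depth)  # 4x MSAA roughly quadruples those buffers

print(f"colour + depth:        {(colour + depth) / 2**20:.0f} MiB")   # ~28 MiB
print(f"extra cost of 4x MSAA: {msaa_4x_extra / 2**20:.0f} MiB")      # ~84 MiB
# Both are small next to the multi-GB texture/streaming pools, which is why dropping AA barely moves the total.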
 
At those who are already playing: Are you enjoying the game?

I still have my India Origin 20€ digital deluxe pre-order, but I'm thinking about cancelling it.
The technical side seems pretty shoddy, and if the gameplay is also not so great then there is no reason for me to hold on to the pre-order.
 
Erm, one of the major motivations behind DX12 is better multithreading in games. DX12 will only increase the gap between an i5 and i7.

We now have Crysis 3, Battlefield 4, Wolfenstein and Watch Dogs all offering good scaling up to 8+ threads, and that's without DX12. The tangible benefit in the here and now is minimal, I agree, but if you want to buy a CPU that will last you through this generation of games, then an i7 is becoming an increasingly sound investment.

It also lowers draw-call overhead, lol. Buying an i7 with 6 real cores, maybe, but the benefits of buying an i7 with HT are low. Look at the BF4 benchmarks post-Nvidia patch: the framerates for the different CPUs flatlined, with a 3-7 fps difference between i5s and higher-end i7s. These consoles can only use 6 cores for games as well.
 
The daytime lighting in this game is bizarrely flat looking. Assassin's Creed has a flat look too, but as it is a different engine, I'm wondering if this is related to the teams in the Ubisoft studio network instead. The Division looks like an improvement over both other Ubi engines.

Overall a pretty nice set of visuals in Watch Dogs though.


It just looks a bit dated in places, with how dull and lacking in certain small effects the game seems to be.
 
Yeah, it's clearly unoptimized considering how poor some of these textures look. It's as though they rushed through a lot of the game's development and probably contracted out a lot of the work to third-rate development teams.

This one doesn't look too bad, but the animation isn't fluid to me.

RqxQKez.jpg

Jesus... some of those textures (the yellow/black portion of the cement pillar) just look laughable. How can this game possibly be maxing out current-gen i5s and a 780 Ti? It does not look the part.

I can't imagine that Nvidia won't release highly optimized drivers for this, or hopefully Ubi has a day-one patch. There's no excuse for it; the game does NOT look how it performs. Holy shit.

With that being said, it does have its moments in art direction. I particularly enjoyed the color palette/scenery in the part where
Aiden goes to his nephew's birthday party and he and his sister are talking. That part really popped for some reason.

Otherwise, mediocre at best. Sleeping Dogs is more impressive, IMO, and that's a 2012 game. Runs better, too.
 
Dual cores did well for the first half of last gen, especially overclocked. I have no argument over VRAM, 2GB will not be good enough, but this game uses more VRAM than anything short of a Titan or 780 Ti has, for textures that aren't great and that don't seem to drop that much when you disable AA.
C2D faded really fast before 2009, and eventually the C2Q Q6600 lasted longer than the top C2D despite being older/cheaper.

Having said that, there's no telling whether future PC games will go wider. Games only seem to benefit from 3-4 strong cores.
 
The issue is that the writing has been on the wall for a while now, with games like Crysis 3 and even Titanfall needing more than 2GB to run maxed at higher resolutions. I've been saying for the last 12 months that 2GB cards would not be enough and that 3GB will be pushing it pretty soon. As for the quad-core issue, as soon as the next-gen consoles were announced people should have seen this coming. Consoles are always the baseline for multiplatform games, so as a PC gamer you need to mirror them from an architectural standpoint as much as you can. That means having 8 cores/threads or more and having as much VRAM as possible.

The VRAM issue is why I refuse to upgrade my GPU while the available options are so poor. The GTX 770 has about the power I feel I need for comfortable 1080p gaming for a good few years, but it is crippled by 2GB of VRAM, and the price premium for 4GB is completely ridiculous. I've tried AMD before and it's just not worth the hassle, especially since their power consumption has gone off the deep end now as well. I honestly can't remember a period in the last ten years when we were in such dire need of a true industry-wide GPU refresh, and no, I don't mean yet another rebadge. We've had enough rebadges in the last few years to last a lifetime.
 
The daytime lighting in this game is bizarrely flat looking. Assassin's Creed has a flat look too, but as it is a different engine, I'm wondering if this is related to the teams in the Ubisoft studio network instead. The Division looks like an improvement over both other Ubi engines.

Overall a pretty nice set of visuals in Watch Dogs though.


It just looks a bit dated in places, with how dull and lacking in certain small effects the game seems to be.

Not that different, actually; Ubisoft's engines are modular, so to speak. They develop using different tech from different proprietary engines.
 
C2D faded really fast before 2009, and eventually the C2Q Q6600 lasted longer than the top C2D despite being older/cheaper.

Having said that, there's no telling whether future PC games will go wider. Games only seem to benefit from 3-4 strong cores.

Pretty sure Skylake i5s/i7s are moving to 6 base cores as well.
 
Dual cores did well for the first half of last gen, especially overclocked. I have no argument over VRAM, 2GB will not be good enough, but this game uses more VRAM than anything short of a Titan or 780 Ti has, for textures that aren't great and that don't seem to drop that much when you disable AA.

Dual cores started to fade fast at the start of last gen, from what I remember. The main source of bitching at the time was around VRAM requirements at HD resolutions. Sound familiar?

People are playing the game at over twice the resolution of the PS4 version and at twice the framerate, all with increased visual quality, and yet they are still moaning about performance. Welcome to next gen, folks.

This is a game that is actually multithreaded and spreads work well across 8 cores, and yet we have people saying it's unoptimised?

PC gamers who are happy to toss their egos to one side and play at a locked 30fps, or at high settings rather than ultra, will be happy enough.
 
No ragdoll in the gunplay really pissed me off. I'm tired of Ubisoft not using physics-based deaths (note: grenades, driving and explosions do trigger ragdolls in this), but there is something satisfying about a headshot that sends someone flying back, not some jank-ass, dog-shit animation Ubisoft keeps using in their games.
 
Dual cores started to fade fast at the start of last gen, from what I remember. The main source of bitching at the time was around VRAM requirements at HD resolutions. Sound familiar?
No. The dual-core Pentium D only came out a few months earlier than the 360, and the C2D only came out the following year.

The C2D lasted well into last generation; only in the last 2-3 years have you really needed to upgrade to an i3 or higher.
 
game looks horrible

rAaACs2.jpg

I'm really not sure what some of you guys were expecting at this point. While it's certainly nowhere close to the original reveal (which has been acknowledged by essentially everyone at this point), it's a decent-looking game. Artistically, not so much, but from a technical standpoint it does a few things right. The lighting is fantastic, the water looks great, the weather effects are exceptionally pretty, et cetera. Easily the best-looking game I've played this year, and if it weren't for Arkham Knight and Isolation this probably wouldn't change for a while.
 
Yeah, speaking of which: those Crysis 3 benchmarks someone was posting around clearly showed that the game could take advantage of more than 4 cores (the hex-core Intels had a large performance advantage over the 2600K and 2500K), yet there was very little difference between a 2500K and a 2600K.



But it is not "damn obvious"!

The only thing that is obvious is that a lot of people who have very little knowledge of CPUs are claiming that i7s will be "needed" for games, with no solid evidence.

The situation you describe with Crysis 3 is likely down to a shortcoming of Hyper-threading. When the CPU is heavily loaded, applications are unable to take advantage of Hyper-threading to its maximum potential. You're basically too close to the edge of the performance envelope for the CPU.
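A crude way to put numbers on that (the 25% figure below is a commonly quoted ballpark for HT's throughput gain, not something measured for this game): treat HT as a fractional extra core rather than a doubling, and the small i5-vs-i7 deltas people keep seeing look about right.

# Crude model: 4 physical cores, with Hyper-Threading adding ~25% extra throughput per core
# when the game can keep every thread fed (assumed ballpark; varies a lot by workload).
physical_cores = 4
ht_gain = 0.25

i5_throughput = physical_cores                  # 4.0 "core equivalents"
i7_throughput = physical_cores * (1 + ht_gain)  # 5.0 "core equivalents"

print(f"best-case i7 advantage: {(i7_throughput / i5_throughput - 1) * 100:.0f}%")
# ~25% at best, and only when the game scales past 4 threads and isn't GPU-bound,
# which is why real benchmark deltas usually end up far smaller.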
 
Part of the point I'm making. Even at ULTRA settings, regardless of playable framerate or not, this game looks terrible. I'm not sure WHAT exactly is using up so much GPU/CPU power.

I'm betting it's the things it's running behind the scenes: NPCs with simulated lives, maybe rendering off-screen stuff, minimal use of streaming in case you want to black out the city, and so on. I don't know, maybe this game was too ambitious.
 
No. The dual-core Pentium D only came out a few months earlier than the 360, and the C2D only came out the following year.

The C2D lasted well into last generation; only in the last 2-3 years have you really needed to upgrade to an i3 or higher.

The Pentium Ds did not have any legs once the next-gen games started coming out. The C2Ds fared better, but if you wanted to max games out and play at higher resolutions, then the quad-core Phenoms were the way to go for a similar price, or the C2Q if you had more money to spare. To avoid running into CPU bottlenecks you needed a quad pretty early into that generation. But yeah, you could "get by" on a dual core, just don't expect to max everything out with clean IQ.

If we are to do an apples-to-apples comparison, then the i5 is what the Pentium D was back then. I'm not even sure why people are surprised by what they are seeing when a lot of us have been here before.
 
You know, outside of the shitshow that is this game's graphics, I think I'd enjoy it. But after hearing that even the better rigs out there are struggling, albeit at ultra settings, I think I would've been happier with the PS4 version. I'm on a 4670K @ 4.5GHz with a 780 Ti, too.
 
Can't believe all the people saying the game looks "awful". Completely fucking ridiculous. Talk about some ridiculously high and unattainable standards.
 
You know, outside of the shitshow that is this game's graphics, I think I'd enjoy it. But after hearing that even the better rigs out there are struggling, albeit at ultra settings, I think I would've been happier with the PS4 version. I'm on a 4670K @ 4.5GHz with a 780 Ti, too.

Looks like we have similar setups, except my i5 can't reach 4.5. I'd take it on PC any day, if just for 60fps. The game feels atrocious at 30fps.
 
I have no regrets about cancelling my PC pre-order. This looks nothing like the 2012 footage, and Ubisoft said it would exceed that on PC. I'll still wait for the reviews, and even then buy it cheap.
 
You know, outside of the shitshow that is this game's graphics, I think I'd enjoy it. But after hearing that even the better rigs out there are struggling, albeit at ultra settings, I think I would've been happier with the PS4 version. I'm on a 4670K @ 4.5GHz with a 780 Ti, too.

I agree. I can't play PC games without V-sync, and I've heard that V-sync drops you to 30fps if you can't hold a solid 60fps, which my rig probably couldn't do on high settings. So if I'm going to play at 30fps, it might as well be on PS4. I'll wait for the reviews to make sure it's a stable 30fps, though, and that there are no major issues.
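For what it's worth, that 60-to-30 drop is just how plain double-buffered V-sync behaves: a frame that misses the 16.7ms refresh window has to wait for the next one. Rough sketch below (double buffering assumed; triple buffering and adaptive V-sync behave differently):

import math

def vsynced_fps(render_ms, refresh_hz=60.0):
    # With double-buffered V-sync, each frame is held until the next refresh after it finishes rendering.
    interval_ms = 1000.0 / refresh_hz                      # 16.67ms at 60Hz
    intervals_needed = math.ceil(render_ms / interval_ms)  # whole refresh intervals per frame
    return refresh_hz / intervals_needed

print(vsynced_fps(15.0))  # fits in one interval   -> 60.0 fps
print(vsynced_fps(18.0))  # just misses it         -> 30.0 fps
print(vsynced_fps(35.0))  # misses two intervals   -> 20.0 fps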
 
The Pentium Ds did not have any legs once the next-gen games started coming out. The C2Ds fared better, but if you wanted to max games out and play at higher resolutions, then the quad-core Phenoms were the way to go for a similar price.

If we are to do an apples-to-apples comparison, then the i5 is what the Pentium D was back then. I'm not even sure why people are surprised by what they are seeing when a lot of us have been here before.

No, the i5 today is not the fucking same as a P4D was back then!

The reason the P4D died off was its much lower IPC compared to Core 2-based CPUs; it was close to half the speed clock for clock, which meant the C2D could be scaled up in performance much more easily (a P4D would need some really insane clockspeeds to match something like a 3GHz C2D).
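Rough numbers on that (the ~0.55 clock-for-clock ratio below is just an illustrative assumption in line with "close to half the speed", not a benchmark figure):

# If a Pentium D managed roughly 55% of Core 2's per-clock performance (assumed),
# this is the clock it would have needed to match a 3GHz C2D.
c2d_clock_ghz = 3.0
p4d_relative_ipc = 0.55  # illustrative assumption

print(f"{c2d_clock_ghz / p4d_relative_ipc:.1f} GHz needed")  # ~5.5GHz, far beyond what NetBurst could sustain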

If anything, the i5s are today's C2Ds.

They are much faster relative to the CPUs in the new consoles than the high-clocked P4Ds and low-clocked C2Ds were relative to Xenon and Cell.
Hell, some people forget that both Xenon and Cell had over 3x the number of cores (or logical cores/threads) of the dual cores that people had no problem gaming on for a large part of last gen!
 
For all of those complaining about the graphics of the title: you should always expect a downgrade, especially with Ubisoft titles.

Anyway, is the game shaping up to be fun? I might get it for cheap in a few years time.
 
That they showed off to a huge audience during a press conference 2 years ago :P
They missed their goal. As much as that sucks, it doesn't make what we've got "awful" in any reality I'm familiar with. Apart from the performance issues, which I understand people being pissed about, where are all the better-looking open-world games out there that make this one look "awful" and a "shitshow" by comparison? It's hyperbole to the nth degree. It's almost as bad as the constant "looks like a PS2 game" comments on the console side.
 
Not that different, actually; Ubisoft's engines are modular, so to speak. They develop using different tech from different proprietary engines.

So SnowDrop, Anvil Next, and whatever this Watch Dogs engine was called, are sort of sister engines?
 
After an hour or two of tweaking, this is my sweet spot for a constant 55-65 FPS, bar insane situations where it might dip down to 45-50. I am pretty happy with it, but the game is clearly very unoptimized, and at the very least a new Nvidia driver should drop before release, considering how much they've promoted it.

I am using:

2560x1440 @ 100Hz
"High Preset" except HBAO+ High
MSAA 2X
High Textures

My specs:

i5-4670K @ 4.2GHz
EVGA GTX 780 Ti @ 1500/7800MHz
16GB Kingston HyperX Blu DDR3-1600

And this is what my game looks like:



I'm pretty underwhelmed. This would be acceptable if the game looked phenomenal, but it does not.
That's what I figured.

And I was all excited after the OP posted all those good-looking pics, but it's clear they are just cherry-picked and downsampled, and not at all representative of what the game normally looks like.

Oh well. Maybe when the game is like $5-10 or something later this year.

Boss★Moogle said:
I agree. I can't play PC games without V-sync, and I've heard that V-sync drops you to 30fps if you can't hold a solid 60fps, which my rig probably couldn't do on high settings. So if I'm going to play at 30fps, it might as well be on PS4. I'll wait for the reviews to make sure it's a stable 30fps, though, and that there are no major issues.
You can probably find it cheaper on PC, though.
 
Honestly, if it's true that this game effectively utilizes 8 threads, I am impressed with Ubisoft. The VRAM issues, however, are upsetting as a new 780 owner. While I believe Ubisoft is partially responsible for adapting the game to today's high-end hardware so that High and Ultra textures don't cause the game to stutter like crazy on top-end GeForces (especially with the Nvidia partnership), I am particularly upset at Nvidia for limiting the VRAM on their 780s to this degree. They should have been 4.5 or 6GB cards from day one; they're clearly capable of utilizing that to full effect (see: GTX Titan).
 
I've got a great rig, playing at a lower resolution and medium settings; framerate is around 50-60ish but I get some serious stutters :-/ It seems like the game is extremely poorly optimized for PC and rushed for the next-gen consoles.

Also, motion blur and DOF cause huge amounts of stutter, so I turned those off.
 
So SnowDrop, Anvil Next, and whatever this Watch Dogs engine was called, are sort of sister engines?

Snowdrop is not being developed by Ubisoft. Regarding the others, well, kind of. :)
It would be unproductive to start from scratch; the know-how gets shared from one team to another.
 