Watch_Dogs PC performance thread [Read post #1215 before posting]

Based on what I'm reading, will this game struggle to hit 60fps even on 1080p with ultra on a 780?

So I guess we need to be content with high?
For now pretty much. We'll see how it runs after the inevitable driver updates. They should really be ready before a game releases but we all know they always lag behind.
 
Honestly, if this is true about it effectively utilizing 8 threads, I am impressed with Ubisoft. The VRAM issues, however, are upsetting as a new 780 owner. While I believe Ubisoft are partly responsible for adapting the game to today's high-end stuff so High and Ultra textures don't cause the game to stutter like crazy on top-end GeForces (especially with the Nvidia partnership), I am particularly upset at Nvidia for limiting VRAM amounts on their 780s to this degree. They should have been 4.5 or 6 GB cards from day one; they're clearly capable of utilizing that to full effect (see: GTX Titan).
I'd be impressed if there was a nice payoff for the high requirements.

But there's not. It's not a great-looking game, it's just a resource hog. Don't know what's impressive about that.
 
For now pretty much. We'll see how it runs after the inevitable driver updates. They should really be ready before a game releases but we all know they always lag behind.
I honestly can't tell the difference between high and ultra though.

High on PC should be much better than the PS4 at least.
 
Honestly, if this is true about it effectively utilizing 8 threads, I am impressed with Ubisoft. The VRAM issues, however, are upsetting as a new 780 owner. While I believe Ubisoft are partly responsible for adapting the game to today's high-end stuff so Ultra doesn't stutter like crazy on top-end GeForces (especially with the Nvidia partnership), I am particularly upset at Nvidia for limiting VRAM amounts on their 780s to this degree. They should have been 4.5 or 6 GB cards from day one; they're clearly capable of utilizing that to full effect (see: GTX Titan).

Yeah, Nvidia's history of being cheap with VRAM and its recent habit of holding back its actual flagship cards (beginning with the 6xx series) is immensely frustrating. If the 870 isn't at least 4GB by default I'll likely ditch my two 670s for two R390s instead.
 
Yeah, Nvidia's history of being cheap with VRAM and its recent habit of holding back its actual flagship cards (beginning with the 6xx series) is immensely frustrating. If the 870 isn't at least 4GB by default I'll likely ditch my two 670s for two R390s instead.

I was originally going to go with an R9 290 this time around, but I really, really like downsampling, unfortunately for me. AMD, please start supporting it... please, for sanity's sake, please with $100 on top. :l

I'd be impressed if there was a nice payoff for the high requirements.

But there's not. It's not a great-looking game, it's just a resource hog. Don't know what's impressive about that.

Only impressed by the thread utilization (if true), not the performance, and more from the programming side. Whatever's causing that terrible stuttering, for example, is inexcusable for a game that was supposedly developed in collaboration with a major GPU manufacturer and must have run perfectly fine on weaker hardware than what's available now, back when they were showing it off a couple of years ago. Actually, it's inexcusable, period.

They knew what they were developing for, and if there isn't enough VRAM, for example, they should have developed and tested a more thorough streaming solution. The game does look good, I really think it does, but I agree there's no way even top-end hardware should be struggling like it is (though I can't speak on the CPU performance issues, as it's harder to determine the cause and whether it's justified or not). I'm still impressed by any game that actually utilizes multiple threads, however, more out of an appreciation of programming than gaming performance.
 
I'm getting sick of these ludicrously high VRAM requirements since the next-gen consoles came out. I could understand it if the texture quality were worthy of the high VRAM usage, but nothing in recent games has shown me remarkable improvements in that area.

Games that have come out on PC over the last 3 months have mostly been pretty terrible ports, completely unoptimized, with ridiculous VRAM requirements (see Titanfall, COD: Ghosts, etc.), not to mention the poor implementation (or complete lack) of SLI support in most titles.

My GTX 690, which is still a rather expensive card and, as effectively two GPUs in SLI, still considered one of the fastest on the market, is getting hammered by VRAM requirements that aren't even justified.

I am eventually going to upgrade once the high-end 800 series comes out, but developers should be optimizing their games better, OR putting textures in their games that ACTUALLY justify the VRAM usage.

If the textures looked so damn good that they justified the requirements, I wouldn't be so salty about it and would probably just bite the bullet and upgrade now, but the fact is they don't, and it's just a bullshit way to get enthusiasts to upgrade for a tiny performance gain and a few more GB of VRAM.

My feeling exactly. I have a 780 Ti with 3GB VRAM, and even though the game states that's enough for Ultra textures, it causes terrible stuttering.

What makes it worse is that, as you said, these textures look like total shit.
 
From watching streams on Twitch, the game looks like shit and has some of the worst pop-in I can remember. Cars just pop up out of nowhere. It's awful.
 
That's a pretty good video. It shows that the game is using all threads and the GPU to its fullest, so he is only GPU-limited in this video.

No, it does not.
It just shows that Windows is putting something on those logical cores (possibly whatever is encoding the video).
The only way to tell how many threads the game is using, and how much CPU time each of them is using, is to use something like Process Explorer.
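If anyone wants to check that sort of thing without Process Explorer, here's a rough sketch using Python and psutil (the executable name is just a guess on my part, swap in whatever process you're looking at):

# Rough illustration: list per-thread CPU time for a running process,
# similar to what Process Explorer's Threads tab shows.
import psutil

TARGET = "watch_dogs.exe"  # assumed name, replace with the actual process

for proc in psutil.process_iter(["name"]):
    if (proc.info["name"] or "").lower() == TARGET:
        threads = proc.threads()  # namedtuples of (id, user_time, system_time)
        print(f"{TARGET}: {len(threads)} threads")
        for t in sorted(threads, key=lambda t: t.user_time + t.system_time, reverse=True):
            print(f"  thread {t.id}: {t.user_time + t.system_time:.1f}s of CPU time")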
 

WTF at those CPU temps! I mean, is the TIM on the 4770K that shitty that he's hitting 80C CPU temps without even being at 100% use and only at 4.0GHz? Either that or he didn't install his HSF/thermal compound properly (gonna assume he's on air with those temps).

For reference, my 2600K @ 4.6GHz on air never goes above 67C at 100%, and during gaming I've never seen it go above 61C. And my ambient temperature is pretty high most of the year (around 26C).
 
Any benchmarks from big sites with HT on and off at 1080p?
Or from any users here?
Boss★Moogle;113245099 said:
WTF at those CPU temps! I mean is the TIM on the 4770K that shitty that he's hitting 80C CPU temps without even being at 100% use
Yes, the gap the TIM has to fill, and the TIM itself, can be that bad thermally.
 
How are people posting screenshots of the PC version? It's not out yet...
(image of a pirate ship)
Argh
 
Regarding multicore usage - expect a ton of console ports that effectively use many cores now. Why? Xbox One and PS4 both have low-power AMD Jaguar processors, commonly seen in mobile devices: a 1.6GHz clock on PS4, but 8 cores.
 
Forget 60fps, I'm going to be choosing the highest settings I can while staying locked at 30. I need this game to look as good as possible. AA will have a pretty high priority.
 
Oh my god, those VRAM usages.
I'll just sit in a corner and cry with my 2GB.

I know those feels.

If I could go back 6 months and slap myself for not spending the extra on the 4GB GTX 770, I would. But no, my dumbass had one of those "2GB should be good enough" moments. Having to tweak a bunch of settings to compensate for the lower VRAM has become an all too common refrain this year.
 
I know those feels.

If I could go back 6 months and slap myself for not spending the extra on the 4GB GTX 770, I would. But no, my dumbass had one of those "2GB should be good enough" moments. Having to tweak a bunch of settings to compensate for the lower VRAM has become an all too common refrain this year.

Someone should tell the guy that runs the "I need a New PC!" thread to stop recommending 2GB video cards altogether.
 
I know those feels.

If I could go back 6 months and slap myself for not spending the extra on the 4GB GTX 770, I would. But no, my dumbass had one of those "2GB should be good enough" moments. Having to tweak a bunch of settings to compensate for the lower VRAM has become an all too common refrain this year.

Not quite the same thing, but I bought my 780 a couple months ago and literally about two days later, 6GB 780s finally came around. I hate my luck.
 
What video card/how much VRAM do you have? Ultra textures are the number one thing that instigates stutter for me on a 780 Ti w/ 3GB VRAM.

I have an HD 7970 Lightning 3GB. It runs the game at around 50-60 FPS at 1080p on Ultra-High-ish settings. My copy arrived early at the shop, so I picked it up, though you need to force Uplay into offline mode for your retail copy to work.

I was considering buying a 670 2GB last year because it was cheaper than the Lightning, but I figured the extra VRAM might come in handy one day. Lucky I did.
 
Someone should tell the guy that runs the "I need a New PC!" thread to stop recommending 2GB video cards altogether.

A card with less than 4GB wasn't even on my radar when I recently upgraded. I knew VRAM requirements would go through the roof with the new consoles coming out and to be honest I would've liked to get more than 4GB if it was available for a reasonable price.
 
They missed their goal. As much as that sucks it doesn't make what we've got "awful" in any reality I'm familiar with. Apart from the performance issues, which I understand if people are pissed about, where are all the better looking open world games out there that make this one look "awful" and a "shitshow" by comparison? It's hyperbole to the nth degree. It's almost as bad as the constant "looks like a PS2 game" comments on the console side.

Oh, I definitely agree with you about the overreactions to how the final product ended up looking. A lot can change over 2 years of development; the explosions, effects, and models in that demo were all rendered using insanely high GPU processing power. I think you're right that they used that 10-minute demo to sort of lay out how the game would look and some of its gameplay aspects. I'm also confident that being a game worked on by over 1000 people from across the globe, with different teams focusing on different aspects, fucked them over with a mess of a game on the conceptual level. I'm sure the team that built that initial demo was much more focused on showing off what kind of new experience could draw you in, with the visual effects they could pull off using DX11 and their own way of handling a modern open-world game. The fact that they're also building around the previous console hardware severely limits what they can pull off with their design, since they basically have to make sure every animation, piece of dialogue, character model, and object model can fit into a piece of hardware with 512MB of RAM.

I mean, sure, they could support higher resolutions, HBAO, and whatever visual effects all the current-gen games use now. But you're limiting yourself so much when all those visual features are still built into an engine designed to be efficient on 8-year-old hardware, and the market that can actually run those features is a much smaller piece of the pie. Not that those are the only reasons; the other is how Ubisoft uses their development teams less and less as creative forces to innovate in ways that are only possible with this new hardware, which I think is why they just throw different chunks of their AAA games around so that a bunch of different project teams and leads end up literally designing different parts of the game independently from one another.

That ends up as a clusterfuck when they finally finish the aspect of the game they were assigned to, and all they can do is mash all these different mechanics and designs into the game once the legwork is done. That causes a lack of vision and direction: all these different teams working on a portion of a game with no real identity to the end product. It's just basic teamwork, having an understanding of what the game is so they can turn their limitations to their advantage; that's how the most clever design mechanics in history were developed. These AAA yearly-franchise entertainment giants treat design and creation like a manufacturing process, and it ends up falling short of the kind of game they initially conceived, even though all the pieces they wanted are still there.

Two years ago that same trailer made it seem like this would be a hybrid MMO, kind of like Destiny, with the camera focusing on another player in that player's game and then zooming out to show all these other similar shapes identifying other players running around in the same Chicago. It's a combination of Ubisoft trying to stamp their own brand and style on all their games while also treating the design and conception stage of an artistic entertainment product like a game factory.
 
The Pentium Ds did not have any legs once the next-gen games started coming out. The C2Ds fared better, but if you wanted to max games out and play at higher resolutions, the quad-core Phenoms were the way to go for a similar price, or the C2Qs if you had more money to spare. To avoid running into CPU bottlenecks you needed a quad pretty early into that generation. But yeah, you could "get by" on a dual core, just not expect to max everything out with clean IQ.

If we are to do an apples-to-apples comparison, then the i5 is what the Pentium D was back then. I'm not even sure why people are surprised by what they're seeing when a lot of us have been here before.
No, not even remotely close. The i5 is still viable if you OC it, but the Pentium D sucked ass the moment it was released. Plus it runs at like 90C.
 
Is it possible to select refresh rate in the options?

I might play this at 50fps instead to take some pressure off the CPU.

Assassin's Creed 4 would not let you select a refresh rate and always seemed to default to 60Hz.
 
Is it possible to select refresh rate in the options?

I might play this at 50fps instead to take some pressure off the CPU.

Assassin's Creed 4 would not let you select a refresh rate and always seemed to default to 60Hz.

You are thinking of an FPS cap; use EVGA Precision X if you have Nvidia or Afterburner if you have AMD. Monitors don't have 50Hz refresh rates; most are 60Hz, so 30 and 60fps are even factors and work well, as does 72 for 144Hz.
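To illustrate what a cap actually does (just a bare-bones sketch of the idea in Python, not how Precision X or Afterburner implement it under the hood):

# Bare-bones idea of an FPS cap: render a frame, then sleep off whatever
# is left of the frame budget so frames come out at a steady rate.
import time

def run_capped(render_frame, cap_fps=60):
    budget = 1.0 / cap_fps                # seconds per frame (16.7 ms at 60fps)
    while True:
        start = time.perf_counter()
        render_frame()                    # stand-in for the game's frame
        elapsed = time.perf_counter() - start
        if elapsed < budget:
            time.sleep(budget - elapsed)  # wait out the remainder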
 
You are thinking of an FPS cap; use EVGA Precision X if you have Nvidia or Afterburner if you have AMD. Monitors don't have 50Hz refresh rates; most are 60Hz, so 30 and 60fps are even factors and work well, as does 72 for 144Hz.

You can create custom resolutions with a custom refresh rate, as long as it falls within the refresh rate range your monitor supports. I played a few games at 50Hz resolutions (for example, NFS Rivals before the Nvidia 337.50 drivers) to avoid framerate fluctuations. Deadpool can only be played with a smooth framerate using a 59Hz resolution.

If a game runs at a framerate between 50 and 60, the best solution is to use a 50Hz resolution so framerate = refresh rate and it's completely smooth. 50fps at 50Hz is very close to 60fps at 60Hz, but 50fps at 60Hz is a judder festival.
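To put rough numbers on the judder point (my own back-of-the-envelope sketch, nothing from the game itself):

# Count how many display refreshes pass before each new frame is ready
# when a game runs at a steady 50fps with vsync on.
def refreshes_between_frames(fps, hz, frames=10):
    frame_time = 1000.0 / fps      # ms per rendered frame (20 ms at 50fps)
    refresh = 1000.0 / hz          # ms per display refresh
    next_vblank, pattern = 0.0, []
    for i in range(1, frames + 1):
        ready = i * frame_time     # when frame i finishes rendering
        count = 0
        while next_vblank < ready: # vblanks spent showing the previous frame
            next_vblank += refresh
            count += 1
        pattern.append(count)
    return pattern

print(refreshes_between_frames(50, 50))  # [1, 1, 1, ...] every frame held one refresh: smooth
print(refreshes_between_frames(50, 60))  # [2, 1, 1, 1, 1, 2, ...] uneven hold times: judder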
 
You are thinking of an FPS cap; use EVGA Precision X if you have Nvidia or Afterburner if you have AMD. Monitors don't have 50Hz refresh rates; most are 60Hz, so 30 and 60fps are even factors and work well, as does 72 for 144Hz.


Using a plasma, so 50Hz is an option for me, thankfully.

Sometimes I use 50fps in 3D for the reduced ghosting that comes with that refresh rate.
 
I have a GTX 580 1.5GB, 16GB RAM, and an i7 2600K at default speed. With everything on Ultra except shadows and reflections, and no vsync, I get 44-60+ fps.
 
Finally, a game that proves "2GB is enough for 1080p on Ultra!!!" wrong... I've been busting my a$$ trying to explain 2GB vs 4GB @ 1080p to another guy in another thread... and finally here we are...

I have a GTX 760 2GB; looks like I'm stuck with High textures :(
 