What were you expecting? You're running at 2.5x the resolution of the PS4 version at twice the framerate and with higher image quality settings and that's unpatched without any optimised drivers. That doesn't sound unoptimised to me. I personally wasn't expecting anyone with an i5 or anyone running at 2560x1440 to hit 60fps yet here we are.
i7-4770T
8GB RAM
R9 290
2560x1440 at ULTRA settings
game looks horrible
Do you even understand what "needed" and "necessary" mean?
Hell, that video quoted by the person you quoted wasn't proof of anything. It only showed that the Windows CPU scheduler was putting some threads on cores that already had something running on their other logical core (thread, whatever), which is standard practice.
It does not show 1) what speed-up the game was getting from it, or 2) whether the game was actually taking notable advantage of it (and that it wasn't something like the recording software or other programs that were running)!
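For anyone who wants to check this on their own machine, here's a minimal sketch, assuming the `psutil` package is installed. The sibling pairing used (logical core i paired with i + physical core count) is a common Intel layout but not guaranteed, so treat it as an assumption:

```python
# Sketch: sample per-logical-core load to see whether both siblings of a
# physical core are busy. Assumes the `psutil` package; the sibling
# pairing below (logical core i <-> i + n_physical) is a common Intel
# layout but NOT guaranteed -- check your own topology first.
import psutil

n_logical = psutil.cpu_count(logical=True)
n_physical = psutil.cpu_count(logical=False)

# Average per-core utilisation over a 5 second window.
loads = psutil.cpu_percent(interval=5, percpu=True)

for phys in range(n_physical):
    sibling = phys + n_physical  # assumed HT sibling, see note above
    if sibling < n_logical:
        print(f"core {phys}: {loads[phys]:5.1f}%  "
              f"HT sibling {sibling}: {loads[sibling]:5.1f}%")
```

Even if both siblings of a core show load, that only tells you where the scheduler placed threads, not that the game gained anything from it — which is exactly the point.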
Same thing was said about Crysis 3, which gained a whopping 4 fps with a 3770K vs a 3470.
Why are you fighting this so hard, man? It's pretty damn obvious that, unless something changes, an i7 is going to be the standard for at least Ubisoft games this gen on PC. Sure, DX12 will most likely help, but that's still at least a year away, if not more.
I was referring to people running the game on i5s, maxed out, with 780s, and neither the CPU nor the GPU at full utilization. You think this is a good port? A game where people with AMD cards can't seem to play on ultra with cards that match their NVIDIA counterparts? A game that has built-in mouse acceleration? A game that uses ~5GB of VRAM at 1440p? Your biases were made clear a while ago, but how you can call this a good port is beyond me.
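For what it's worth, a quick back-of-envelope calculation shows why ~5GB at 1440p raises eyebrows: render targets at that resolution only account for a small fraction, so the rest has to be streamed assets. Every buffer count and texture size below is an illustrative assumption, not Watch Dogs' actual numbers:

```python
# Rough VRAM arithmetic at 2560x1440. Every count/size here is an
# illustrative assumption, not pulled from the actual game.
W, H, BPP = 2560, 1440, 4          # RGBA8 = 4 bytes per pixel

def mb(nbytes):
    return nbytes / (1024 ** 2)

backbuffer = W * H * BPP           # ~14 MB
gbuffer    = 4 * W * H * BPP       # assume 4 deferred render targets
msaa_2x    = 2 * W * H * BPP       # one 2x MSAA colour target
print(f"render targets: ~{mb(backbuffer + gbuffer + msaa_2x):.0f} MB")

# A 2048x2048 RGBA8 texture with a full mip chain is ~4/3 the base size.
tex = 2048 * 2048 * BPP * 4 // 3   # ~21 MB
print(f"one 2K texture: ~{mb(tex):.0f} MB")
print(f"textures needed to fill 5 GB: ~{int(5 * 1024 / mb(tex))}")
```

So resolution alone gets you on the order of 100 MB; the other ~5GB is a couple hundred uncompressed-2K-textures' worth of streamed data sitting resident.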
Both GPU vendors, not just AMD. When time and again AMD have botched game releases with their drivers, I'd be hesitant to lay all the blame for that one at Ubisoft's door.
That's part of the point I'm making. Even at ULTRA settings, playable framerate or not, this game looks terrible. I'm not sure WHAT exactly is using up so much GPU/CPU power.
So how long before we get ENB and driving mods?
Yeah, speaking of which: those Crysis 3 benchmarks someone was posting around clearly showed that the game could take advantage of more than 4 cores (hex core Intels had a large performance advantage over the 2600K and 2500K), yet there was very little difference between a 2500K and a 2600K.
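That pattern actually fits a simple Amdahl's-law toy model: HT only adds a fraction of a core's throughput, while extra physical cores add whole ones. The 80% parallel fraction and 25% HT bonus below are assumptions picked for illustration, not measured values:

```python
# Toy Amdahl's-law model of why six real cores beat 4C/8T Hyper-Threading.
# The 25% HT throughput bonus and 80% parallel fraction are assumptions.
def speedup(parallel_fraction, effective_cores):
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / effective_cores)

P = 0.80                            # assumed parallelisable fraction of a frame
configs = {
    "2500K (4C/4T)":      4.0,
    "2600K (4C/8T, HT)":  4.0 * 1.25,   # HT modelled as ~25% extra throughput
    "hex core (6C/12T)":  6.0 * 1.25,
}
for name, cores in configs.items():
    print(f"{name}: {speedup(P, cores):.2f}x vs a single core")
```

With those numbers the 2600K only gains ~11% over the 2500K, while the hex core gains ~30% — roughly the shape of those benchmarks.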
With DX12 coming, that's even less so. If i7s with true extra cores move ahead, HT still will not, and even that's extremely unlikely.
Dual cores did well for the first half of last gen, especially OCed. I have no debate over VRAM: 2GB will not be good enough, and this game already uses more VRAM than anything besides a Titan or 780 Ti can offer, for textures that aren't great, and usage doesn't seem to drop that much when you disable AA. The issue is that the writing has been on the wall for a while now, with games like Crysis 3 and even Titanfall needing more than 2GB to run maxed at higher resolutions. I've been saying for the last 12 months that 2GB cards would not be enough and that 3GB will be pushing it pretty soon. As for the quad core issue: as soon as the next gen consoles were announced, people should have seen this coming. Consoles are always the baseline for multiplatform games, so as a PC gamer you need to mirror them from an architectural standpoint as much as you can. That means having 8 cores/threads or more and having as much VRAM as possible.
As for it being a decent port - I stand by that, because the game appears to scale well to older hardware provided you're happy to tweak settings. The Twitch stream that surfaced yesterday was on a GTX 570 and it was running well. There are further examples from GAFfers with modest hardware in the official thread who say it's running well for them, even on high/ultra settings.
Erm, one of the major motivations behind DX12 is better multithreading in games. DX12 will only increase the gap between an i5 and an i7.
We now have Crysis 3, Battlefield 4, Wolfenstein, and Watch Dogs, all of which scale well up to 8+ threads, and that's without DX12. The tangible benefit in the here and now is minimal, I agree, but if you want a CPU that will last you through this generation of games, an i7 is becoming an increasingly sound investment.
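If you want to eyeball how your own CPU scales with worker count, here's a crude probe. It uses multiprocessing rather than threads because CPython threads won't parallelise CPU-bound work, and a synthetic loop is obviously nothing like a real game workload:

```python
# Rough probe of how a CPU-bound task scales with worker count, in the
# spirit of the 8+ thread scaling claim. Numbers are only indicative;
# a real game frame behaves very differently.
import time
from multiprocessing import Pool

def burn(n):
    # Arbitrary CPU-bound busywork.
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    jobs = [2_000_000] * 16
    for workers in (1, 2, 4, 8):
        start = time.perf_counter()
        with Pool(workers) as pool:
            pool.map(burn, jobs)
        print(f"{workers} workers: {time.perf_counter() - start:.2f}s")
```

On a 4C/4T i5 you'd expect the gains to flatten past 4 workers; on a 4C/8T i7 there's usually a modest extra step from HT.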
Yeah, it's clearly unoptimized considering how poor some of these textures look. It's as though they rushed through a lot of the development of the game and probably contracted a lot of the work out to third-rate development teams.
This one doesn't look too bad, though the animation isn't fluid to me.
The daytime lighting in this game is bizarrely flat looking. Assassin's Creed has a flat look too, but since that's a different engine, I'm wondering if this is down to the teams in the Ubisoft studio network instead. The Division looks like an improvement over both of the other Ubi engines.
Overall a pretty nice set of visuals in Watch Dogs though.
It just looks a bit dated in ways, with how dull the game seems and how it's lacking certain small effects.
C2D faded really fast before 2009, and eventually the C2Q Q6600 lasted longer than the top C2D despite being older/cheaper.
Having said that, there's no visibility into whether future PC games will go wider. So far, games only seem to benefit from 3-4 strong cores.
Dual cores started to fade fast at the start of last gen, from what I remember. The main source of bitching at the time was around VRAM requirements at HD resolutions. Sound familiar?
But it is not "damn obvious"!
The only thing that is obvious is that a lot of people who have very little knowledge of CPUs are claiming that i7s will be "needed" for games, with no solid evidence.
Pretty sure Skylake i5s/i7s are moving to 6 base cores as well.
No, the dual core Pentium D only came out a few months earlier than the 360, and the C2D only came out the next year.
The C2D lasted well into last generation, only in the last 2-3 years have you really needed to upgrade to an i3 or higher.
You know, outside of the shitshow that is this game's graphics, I think I'd enjoy it. But after hearing that even the better rigs out there are struggling, albeit at ultra settings, I think I would've been happier with the PS4 version. And I'm on a 4670K @ 4.5GHz with a 780 Ti, too.
Can't believe all the people saying the game looks "awful". Completely fucking ridiculous. Talk about some ridiculously high and unattainable standards.
The Pentium Ds did not have any legs once the next gen games started coming out. The C2Ds fared better, but if you wanted to max games out and play at higher resolutions, the quad core Phenoms were the way to go at a similar price.
If we're doing an apples-to-apples comparison, then the i5 is what the Pentium D was back then. I'm not even sure why people are surprised by what they're seeing when a lot of us have been here before.
That they showed off to a huge audience during a press conference 2 years ago.
They missed their goal. As much as that sucks, it doesn't make what we've got "awful" in any reality I'm familiar with. Apart from the performance issues, which I understand if people are pissed about, where are all the better looking open world games that make this one look "awful" and a "shitshow" by comparison? It's hyperbole to the nth degree. It's almost as bad as the constant "looks like a PS2 game" comments on the console side.
Not that different, actually. Ubisoft's engines are modular, so to speak; they develop using different tech from their various proprietary engines.
That's what I figured.
After an hour or two of tweaking, this is my sweet spot for a constant 55-65 FPS, bar insane situations where it might dip down to 45-50. I'm pretty happy with it, but the game is clearly very unoptimized, and at the very least a new NVIDIA driver should drop before release, considering how much they've promoted it.
I am using:
2560x1440 @ 100hz
"High Preset" except HBAO+ High
MSAA 2X
High Textures
My specs:
i5-4670K @ 4.2GHz
EVGA GTX 780 Ti @ 1500/7800MHz
16GB Kingston HyperX Blu DDR3-1600
And this is what my game looks like:
I'm pretty underwhelmed. This would be acceptable if the game looked phenomenal, but it does not.
Boss★Moogle;113242747 said: I agree. I can't play PC games without V-Sync, and I've heard that V-Sync limits you to 30fps if you can't hold a solid 60fps, which my rig probably couldn't do on High settings. So if I'm gonna play at 30fps I might as well be on PS4. I'll wait for the reviews to make sure it's a stable 30fps, though, and that there are no major issues.
You can probably find it cheaper on PC, though.
So SnowDrop, Anvil Next, and whatever this Watch Dogs engine was called, are sort of sister engines?