Thanks, this seemed to improve input lag somewhat. Sadly, I am testing it in Versailles instead of Paris. The game just deleted my single-player progress for no reason. One part of me wants to look if there is still something left that can actually be salvaged, but the much more logical thing seems to be to leave the game behind at this point.
Where is the Unity save game located usually?
Edit: I now get the following message when starting the game: "Data corrupt! Your save has become corrupted. Do you wish to overwrite it and start a new game? A - YES, B - NO"
My computer was not forcefully reset or powered down the last time I played the game. I guess this is some Uplay bug. Did syncing with Uplay corrupt my save somehow?
I had this happen on day of release; I disabled cloud sync in the Uplay settings and haven't had another problem since. I almost gave up when I had to play the beginning of this game four times.
After 20 hours of Unity, I'm still torn on whether I prefer 4x MSAA at a locked 30fps or 60fps with dips to 45-55fps with no AA (I prefer both to FXAA, which is too blurry). 60fps feels nice, but every time it dips it's really noticeable. 30fps at least is consistent, and goddamn, with 4x MSAA the game looks incredible.
Weirdly, cloud sync seems to be disabled already in Uplay. When I logged in to play the game the first time, it was as if there was no save file at all. I restarted Uplay + Steam and was asked to enter my Uplay email and password. Now when I started the game, I got the "corrupted" message. So either this is not related to cloud syncing, or my Uplay profile reconfigured/reset itself somehow.
I purchased a GTX 980. These fucking Ubisoft games. Seriously, what gives? Assassin's Creed: Unity will stay locked at 60fps with everything maxed out except for AA, which I have set to 2x MSAA. When it gets to a cutscene, it instantly drops to the 45-30 range. I just don't understand why this game does this. Watch Dogs will stay in the 52-60fps range with everything maxed out except AA at 4x MSAA, but if I enable TXAA it drops to the lower 40s.
Is TXAA so demanding that a single 980 can't handle it, or are all Ubisoft ports just fucking shit? Or could it be my CPU (i5 3570K)?
Reference point: Call of Duty: Advanced Warfare, Shadow of Mordor, and Borderlands: The Pre-Sequel, all settings maxed out, give me a smooth and stutter-free 1080p/60fps experience.
Shit, you could probably downsample from 4K and get 60fps in those games. But still, Unity is doing so much more. All the NPCs and stuff might not be necessary, but the lighting, detail, and scale really shine; Mordor is really plain and simple in comparison.
Man, I can't wait till I get my hands on a future graphics card. Thing is, I recently got my rig with the 760 around April, so I'll wait until next summer and see if another card gets released. But I've got my eye on that 970/980.
One thing I'll give Ubisoft credit for is controller functionality. You can plug in a PS4 controller via micro USB and play, with vibration and PlayStation button prompts. Neat stuff for sure!
That's unfortunate. Yes, it definitely sounds like yet another uPlay bug. In any case, save files can generally be found at the following location:
"C:\Program Files (x86)\Ubisoft\Ubisoft Game Launcher\savegames\41290769-ecea-4acf-8a7d-8d56dfb47fc3\720"
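If you want to poke around in those files before letting the game overwrite anything, copy the whole folder somewhere safe first. A minimal sketch of that backup step, assuming nothing about your install (the temp directory and dummy file below just stand in for the real Uplay folder and save file):

```shell
#!/bin/sh
# Sketch: back up the Uplay savegames folder BEFORE answering "YES" to the
# overwrite prompt. SAVEDIR here is a stand-in; on a real install it would
# be the Ubisoft Game Launcher\savegames\<id>\720 path quoted above.
set -e
SAVEDIR="$(mktemp -d)/savegames"            # stand-in for the real folder
mkdir -p "$SAVEDIR"
printf 'dummy' > "$SAVEDIR/1.save"          # fake save file for the demo
BACKUP="${SAVEDIR}_backup_$(date +%Y%m%d)"
cp -r "$SAVEDIR" "$BACKUP"                  # the actual backup step
ls "$BACKUP"                                # prints: 1.save
```

Once the copy exists, you can safely let the game start a new save and still diff or restore the old files later.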
The new patch seems to have broken dual-GPU performance. I have two HD 7970s in CrossFire; performance was good before the patch, but now everything is a stuttering, hitching mess. In fact, I'm even getting negative scaling now.
Disabling the second GPU gives me better performance at the moment.
Oh shit, where does this game keep its saves? I just booted the game and got hit with the intro.
Here's what Ubi said and WCCF's response to it. I don't have much to add to what WCCF said, as they hit the nail on the head, really.
But there is one thing I will say at the end of all this.
Ubi
We are aware that the graphics performance of Assassins Creed Unity on PC may be adversely affected by certain AMD CPU and GPU configurations. This should not affect the vast majority of PC players, but rest assured that AMD and Ubisoft are continuing to work together closely to resolve the issue, and will provide more information as soon as it is available.
WCCF
It goes without saying that I had serious trouble believing that the entirety of the glitches present in Assassins Creed Unity are the cause of Catalyst Drivers (AMD). While modern drivers can be the cause of low frame rates in certain cases, they are not usually behind texture popping and entity glitches. One of the primary selling points of Assassins Creed Unity (from Ubisoft's marketing) was the fact that the game supported thousands of NPCs on screen. Well, they were right about that, but it looks like they conveniently forgot to mention the performance hit that would ensue from using so many dynamic objects. We sent some emails and found out what is really happening:
Ubi
The game (in its current state) is issuing approximately 50,000 draw calls on the DirectX 11 API. Problem is, DX11 is only equipped to handle ~10,000 peak draw calls. What happens after that is a severe bottleneck, with most draw calls culled or incorrectly rendered, resulting in textures/NPCs popping all over the place. On the other hand, consoles have to-the-metal access and almost non-existent API overhead, but significantly underpowered hardware which is not able to cope with the stress of the multitude of polygons. Simply put, it's a very, very bad port for the PC platform and an unoptimized (some would even go as far as saying unfinished) title on the consoles.
WCCF
Games should be created with the target hardware in mind. And from what I have seen so far, high-end rigs built with the likes of Titans (Nvidia) and R9 295Xs are glitching as well. So unless the Titan GPU was secretly made by AMD, I am not really sure what Ubisoft PR is on about. The game appears to be barely functional, something that would automatically merit low scores. The post-launch embargo on reviews seems to have foreshadowed the condition of the title. I really enjoyed Assassins Creed Black Flag, but to me, Ubisoft has been making bad call after bad call lately, and their PR is heading towards a colossal train wreck. Alienating PC users is one thing, but at this rate, pretty soon even console users will be wary of their games. Still, Far Cry 4 has yet to be released, so maybe not all hope is lost yet (fingers crossed).
http://wccftech.com/ubisoft-points-finger-amd-technical-bugs-assassins-creed-unity/
This is what I want to address:
The game (in its current state) is issuing approximately 50,000 draw calls on the DirectX 11 API. Problem is, DX11 is only equipped to handle ~10,000 peak draw calls.
This is absolutely true. If they are pushing 50K draw calls, they have a problem with DX11 on both AMD and Nvidia; it's no good saying it's only an AMD problem, because the internet is full of Nvidia users with the same performance problems. And there would be: DirectX has some serious efficiency problems affecting draw call performance, among other things.
That is exactly why AMD got together with 'other' developers to create Mantle, and it's why Apple, Khronos (OpenGL), and Microsoft are now following AMD's lead.
Existing APIs like DX11 are too high-level and bloated.
What Ubisoft needs is Mantle. Their partner Nvidia might not like it; be that as it may, hold AMD to their word. They said Mantle will be open, so get AMD and Nvidia together to make it work for both.
http://www.amd.com/en-us/innovations/software-technologies/technologies-gaming/mantle#overview
I played for 3 hours with high textures on 2GB of VRAM without a single dip.
I know Unity is underoptimized and all, but I'm starting to regret my 290X purchase. It doesn't seem to be that future-proof compared to the 970 and 980...
So yea, they just tried to do too much. What we all pretty much suspected.
Shame, cuz if they'd just pushed the development timeline back a year, they might have had time to implement DX12. Wouldn't help consoles much, but at least the PC version could have realized their vision.
That's impressive, but how is that possible unless Ubi have been very conservative with their targets? Usually when I set textures to a level my GPU can't handle, the game stutters like mad, pausing every 10 seconds or so.
I tried ultra textures, and 3GB doesn't seem to be enough for smooth gameplay.
This was done with the internet disabled; activating my net connection after Alt+Tabbing out of the game brings back stutters every three seconds.
I'm getting 55-60 free roaming through Paris.
It falls to ~45 (rarely) when those huge NPC crowds appear (can't believe this was their main selling point, ugghh).
I have the same setup minus the 970, which I'm getting soon. Do you personally get any boost in performance if you run it at 1080p instead?
I would have been completely happy playing Rogue on PC this year and Unity next year.
Every 10-15 minutes (ballpark) the game just hangs (draw call limit/issue?) for a couple of seconds, only to resume afterwards.
Just looking at the textures. They look the same to me. The shadow resolution is different.
Haven't tried it. I don't play games at any resolution other than native (1920x1200).
Don't. The 970/980 are actually a monumentally small jump in performance from the 780/290s, and they came out quite a bit later than the 290X as well. You've got similar performance, the same amount of VRAM, and Mantle (for games that support it) well before DX12 will be a viable alternative for other GPUs. In some scenarios, the 290X actually performs better. It's an excellent GPU that you got for a good price. Newer, better stuff will always come out in the future, but for the first time in a very long time I would frankly hesitate to even call a new series' debut (the 980/970) "better" in any significant way, unless power consumption is your thing. I'd say you got quite lucky going with a 290X; it's still top-notch. I own two 970s too, ergo no personal motive to sugarcoat things.
1) The performance difference between a 290X and a 970 isn't that big.
2) The Nvidia GPUs were released 10 months later than the 290X, so of course they are better: a 2013 GPU vs a 2014 GPU. Later, AMD will launch an improved GPU in 2015 that beats the 9x0, then Nvidia will launch an even better GPU in 2016, and the hardware race will go on.
So, I turned textures to low, restarted the game, and am still having severe freezing. I've noticed that the hair physics in the game spaz out every time it freezes, and objects in the distance render when it freezes too. I'm just going to assume this is an engine issue, but does anyone have any ideas as to what might be wrong?
That happened to me without the latest drivers. Worth a shot.
I also wanted to report that changing shadows to low fixed the 5-10 second lock-ups. I played for hours and it never happened with low shadows, which visually aren't much of a downgrade. My hip-fire guess is that something's wrong with the PCSS implementation; however, I didn't test high shadows long enough to substantiate this.
I wish I could record a video with FRAPS, but it destroys my FPS, and I don't like GeForce Experience at all.
Even in crowded areas the framerate is high enough for me; the recent patch had a positive impact for sure.