I don't game on a laptop, so maybe I've missed this, but is this whole "it may work but it isn't supported" thing new? I've seen it in two or three games now, and it seems odd.
Processor
Minimum
Intel Core i5-2500K @ 3.3 GHz or AMD FX-8350 @ 4.0 GHz or AMD Phenom II X4 940 @ 3.0 GHz
Recommended
Intel Core i7-3770 @ 3.4 GHz or AMD FX-8350 @ 4.0 GHz or better
http://blog.ubi.com/assassins-creed-unity-pc-specs/
They are trolling you guys and you bite.
The 940 isn't even in the same universe as the 2500K, and even the FX chips have a hard time keeping up.
Yet here they are: the 940 listed alongside the 8350 and 2500K in the minimum, and the 8350 and 3770 in the recommended, but not the 2500K, which performs just as well in games, and even better compared to the FX.
So they don't support anything below a 680, but they support everything from the 700 series, including the chips that are identical to those in the 600 series (for instance, 670 = 760). Same goes for AMD.

Additional Notes: Supported video cards at the time of release: NVIDIA GeForce GTX 680 or better, GeForce GTX 700 series; AMD Radeon HD 7970 or better, Radeon R9 200 series. Note: Laptop versions of these cards may work but are NOT officially supported.
In a way, it's good for minimum specs to be an i7-4790K at 4.5GHz and triple-SLI GTX 980s, because that means video game technology is progressing forward.
And we can separate the wannabes from the true gamer.
So you are indeed trolling. Also, even though the AMD CPUs are weaker, maybe Ubisoft's developers coded pedal to the metal to squeeze out better optimization and efficiency, so the weaker AMD CPUs end up performing better than the Intel CPUs.
He's just saying that, for him, he can't really justify the cost of keeping his PC updated to meet the increasing requirements. Nothing wrong with that.
Being a PC gamer myself, I admit that it's very expensive and not for everyone. I spent about $1,000 US on my build in Oct 2011, and then another $350 in 2013 upgrading my GPU to a GTX 670.
I could definitely see how a lot of people would be turned off by that, and the difference between 900p and 1080p is not enough to persuade them.
As PC gamers we accept the added cost to stay relevant, though. But that doesn't mean I don't get annoyed at ridiculous specs like these when I've spent so much to stay current.
Still waiting for these specs to be debunked though.
AC3 is an atrocious PC port.
Did you install that Worse Mod thing? It improved the performance significantly for me. That game looks great in places, but the urban areas look rubbish during the day. This is how Watch Dogs runs on my new rig [snip]
I genuinely thought you were being sarcastic about that whole 2x performance thing.
If the performance pinnacle of a console is 2x that of a Radeon HD 7850, why would it not deliver 2x the framerate when it's fully utilised? Why would it not be 2x faster?
I guess we have to just wait for developers to get good with the hardware and to unlock the potential of that 2x then.
They are lying and trolling and lots of people are biting.
I'm not in the slightest bit worried about how my 970, or the 780 Ti I'm eyeing to replace it with, will handle this game at 2560x1440; there is nothing, absolutely nothing, this level of GPU performance cannot handle well at this resolution or below (lol, a waste of such performance if you ask me), including Ubisoft games, especially given adjustable settings. Therefore I find it sad when everyone loses their shit over some meaningless and context-less "requirements". I am, however, worried that Unity will make very limited use of multiple threads and will therefore hit only one CPU core pretty hard, limiting my framerate consistency with my 3770K.
TLDR: Why the hell is anyone with a powerful machine even half-seriously looking at "required specs" when it's not at all obvious what they mean? The "minimum" here will run 1920x1080 with respectable settings and framerate, I bet.
Why is Evil Within being used as reassurance? That game actually ended up being hard to run. What kind of PC does it even take to maintain 60fps on high settings in Evil Within?
Yeah, id Tech 5 is probably largely to blame, but my point is, sometimes when a game sounds like it's going to be hard to run, it might actually end up being hard to run. Or, in the case of Ubisoft, another lackluster port, but at next-gen levels.
I mean things like AI/physics calculations can be much faster on the closed platform, but this may not translate to a linear gain in FPS depending on where the bottleneck is.
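That bottleneck point can be sketched with a toy model (hypothetical frame times, purely illustrative): if CPU and GPU work overlap, the frame isn't done until the slower side finishes, so doubling CPU speed only helps when the CPU is the limit.

```python
# Toy model: frame time is governed by the slower of CPU and GPU work
# (the bottleneck), assuming the two fully overlap per frame.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Approximate framerate given per-frame CPU and GPU work in ms."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# GPU-bound case: 20 ms of AI/physics on the CPU, 25 ms on the GPU.
print(fps(20.0, 25.0))   # 40 fps
# Make the CPU twice as fast -- still 40 fps, the GPU is the limit.
print(fps(10.0, 25.0))   # 40 fps
# Only in a CPU-bound case does the CPU speedup show up in framerate.
print(fps(20.0, 15.0))   # 50 fps (CPU-bound)
```

So a 2x CPU gain translates to 2x FPS only until the GPU becomes the bottleneck, which is the non-linear behaviour described above.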
Such a ridiculous post. Not everyone wants to pump $1000 into just their GPU. Go wave your massive epenis around somewhere else, that's not the point of this thread.
Of course, as mentioned, even pretty low-end PC CPUs are about twice as fast per core (or more) than the consoles' — this is exactly why, when we look at most multiplats, we can ignore the CPU (assuming a baseline) and just concentrate on GPU performance. This is borne out by the data, which shows that on a modern system a GPU with similar specs to a PS4's performs similarly to a PS4.
I think an R9 265/GTX 750 Ti should do as well as the consoles; that's not saying much (a pitiful 900p/30fps), but at least it should be easy to outshine both with even stronger hardware.
A 680 as the minimum is very strange; I wonder what targets they must have set to conclude a 680 is a good fit for minimum specs.
Perhaps console-like settings at 60fps?
A 780 Ti is NOT $1,000. More like $400, and you can get equivalent power for about $330.
So this time round crappier GPUs will outperform the better ones that are in the consoles?
A 265 is better than the Xbox One GPU, same goes for the 750ti.
Given that there is parity between the two console versions a 750ti should do "as well" as both, taking into account PC overhead.
A 750ti can run Watch Dogs at a very stable 30fps, 1080p using high-ultra settings.
It beats the mighty PS4 in other games too.
Lol that's all I needed to know.
Why is Evil Within being used as reassurance? That game actually ended up being hard to run. What kind of pc does it even take to maintain 60 fps on high settings in Evil Within?
Yeah, idtech5 is probably largely to blame but my point is, sometimes when a game sounds like it's going to be hard to run, it might actually end up being hard to run. Or in the case of Ubisoft, another lackluster port but on next-gen levels.
Don't take my word for it; do a little research. If both can run demanding games as well as or better than the consoles, why would Unity be any different?
We shall see.
Where did I say I didn't trust you?
Your post implied that you didn't believe a word of what I was saying.
That's fine, look around for yourself; look at some 265 or 750 Ti benchmarks and you'll see that they manage to perform as well as or better than either console in recent high-profile games (Battlefield 4, Watch Dogs, AC4, COD Ghosts, Thief, Shadow of Mordor, Alien Isolation, Ryse: Son of Rome).
I never ask people to take my word for anything, see for yourself.
I asked a question, you answered, and I replied that I found the bit about the consoles amusing. You sure read a lot from so few words.
Really? I played it mostly maxed out (sans AA) with a Radeon 5870 1GB and an i5-2500K at 4.2GHz and it ran well enough, probably around 30-50fps most of the time. It was a damn nice-looking game, too. Never ran into any bugs or anything.
I guess our definitions of "atrocious" differ.
What did you find "amusing" in my post ?
Nope. It's a PS3/360 game released in 2012, and that was a very high-end computer in 2012.
Thanks for clearing that up a little; I really don't know what he's getting at. I'm guessing he never really cared about what you were going to answer, only whether you would answer or not. He may find it funny that you are being so helpful while he's being facetious.
I'm going to need a better source. That just sounds like complete bullshit.
WTF, my PC is a Core i5 2500 with 8GB RAM and an R9 270X and I don't even make the cut to play the game at min settings?? WTFFF!!!!! This is BS. Ubi always makes their PC games run "fine" by brute-forcing with the highest-spec rigs available. This is unacceptable.
These fucks better support SLI.