And there is no way to play this game on a 32-bit OS.
So if you're a 32-bit OS user, it's time to change your OS.
Don't be silly.

I have a 780, but boy am I terrified.
Yeah, scalability toward older hardware/OSes can only go so far. Next-gen-only games are obviously going to have higher minimum requirements to some degree, and we have to leave these dated things behind in order to make progress.

If you're running the minimum specs listed and are on a 32-bit OS, you're doing things wrong. Hell, it's 2014; if you're running a 32-bit OS on any piece of hardware made in the last five years, you're doing things wrong.
I also have a 1GB 560 Ti. I was told I wouldn't be able to run Watch_Dogs properly, but that turned out to be bullshit. The 2500K turned out to be more than fine as well.

RIP 1GB 560 Ti. I kind of want to wait until next year to upgrade, though, so I guess I'll just get myself addicted to WoW in the meantime.
There's no way the recommended and minimum processor for AMD are the same.
Yeah, scalability toward older hardware/OSes can only go so far. Next-gen-only games are obviously going to have higher minimum requirements to some degree, and we have to leave these dated things behind in order to make progress.
Although having a 2500K/GTX 680 or 7970 as the minimum is a bigger leap than I think we should expect at this stage. A game had better be downright groundbreaking to require this kind of hardware just to run at the lowest settings.
Don't be silly.
Your GPU is fine.
Hahaha I assume you're trolling, nobody can be that stupid.
- CPU:
Minimum - Intel Core® i5-2500K @ 3.3 GHz or AMD FX-8350 @ 4.0 GHz or above
None of those 'require' an i7.
So could we stop this "people who thought their old hardware was enough were wrong" talk until at least one game really needs more CPU power to run smoothly? Maybe Unity will be it, but so far the official requirements of the latest AAA productions haven't been very reliable; The Evil Within's VRAM recommendations, for example, ended up being so wrong it was almost funny.
This is kind of telling, really. AC3 and AC4 are two of the worst ports I've ever seen; I remember the frame rate literally cutting in half the minute a few trees appeared in AC4. And this was with my 4770K @ 3.8 GHz and SLI'd 780 Tis.
I can't wait to see how bad it performs this time.
I have seen this game running on a VERY powerful PC and it was nowhere near a locked 60; it peaked and troughed like a rollercoaster. I think, like most Ubisoft games, they work on quantity over quality and would need the two-week delay to get any optimising into the day-one patch.

Your 2500K will still be good enough to match consoles for the whole generation, perhaps even longer with DX12.
PS4/XBO have not inflated PC requirements much at all when I think about it; so much for PC having a hard time "adapting" to new consoles.
An i7 was listed as recommended as early as 2011 (Dirt 3).
http://www.codemasters.com/uk/dirt_3/pc/faq/95/
Not sure what you're even laughing at.
In the OP, did you not read it?
I expect ACU to be that game, considering it's one of the first non-cross-gen games and given the performance of past AC/Ubi games.
I have seen this game running on a VERY powerful PC and it was nowhere near a locked 60; it peaked and troughed like a rollercoaster. I think, like most Ubisoft games, they work on quantity over quality and would need the two-week delay to get any optimising into the day-one patch.
In fact, after seeing it I wondered whether Rogue was set for all gens this year and Unity came a year sooner due to the (relatively) poor last-gen sales of Watch_Dogs; hence the delayed PC version and a subsequent X1/PS4 release in a retro package next year.
I think this will have performance issues even after a hulking day-one patch on ALL formats; just expect the PC version to be the worst (again, relatively).
So as long as I'm not forced to play at 900p/30fps it will be just fine.
50fps, 1080p, near maximum settings should be easily achievable on high-end systems.
That should be enough to blow the PS4/XBO SKUs out of the water.
Yeah, I would not bank on anything yet!
Because... we already had games "requiring" i7 CPUs... and working just as well on i3 CPUs?
Why are they recommending an i7 if an i3 is good enough? Is this still true at 1080p and higher? I've only seen one image, for 720p.
Why are they recommending an i7 if an i3 is good enough? Is this still true at 1080p and higher? I've only seen one image, for 720p.
The Evil Within has been shown to have absolutely terrible multi-threading that doesn't even take advantage of i7s (with an i3 near the top, as if it were Dolphin or something), and I've been playing Mordor on my 2500K at 60 FPS with a 760 just fine... (at PS4 settings as well).

Shadow of Mordor, Watch Dogs, and The Evil Within all come with i7 recommendations. System reqs will only get higher, and old hardware like the 2500K will only be enough for the low end.
Why are they recommending an i7 if an i3 is good enough? Is this still true at 1080p and higher? I've only seen one image, for 720p.
Shadow of Mordor, Watch Dogs, and The Evil Within all come with i7 recommendations. System reqs will only get higher, and old hardware like the 2500K will only be enough for the low end.
That's weird. I played it just fine on my laptop (at 1440x900, mind you) at mid settings with an i7, 4 GB of RAM and a 460M. My desktop computer had a GTX 660 and an Athlon II X4 630 at the time, and it ran great with everything at maximum except one particular setting which I just can't remember right now.
I've always been in the "AC games run fine on my computer" group, though.
In any case, these system requirements are super weird. There's no way the recommended and minimum AMD processors are the same, and the gap between graphics cards doesn't make sense either, because no matter how bad Ubi ports are, you can always disable whichever effects you don't need.
The way I see it, either the minimum or the recommended requirements are absolutely made up. Keep in mind this comes from a distributor for Ubisoft, not Ubisoft themselves. I'd take it with a grain of salt, and definitely wait until we can see it in a reliable official source (such as the Ubi shop).
I don't think the problem was with my PC so much as with ACIV's terrible optimization. I had to cap the game at 30 FPS to get a reliable result; otherwise it'd constantly alternate between 30 and 60. There was no in-between. :lol
Given the fact that all previous cases of huge minimum requirements turned out to be hogwash, I see no reason at all not to panic in light of this news.
I don't think the problem was with my PC so much as with ACIV's terrible optimization. I had to cap the game at 30 FPS to get a reliable result; otherwise it'd constantly alternate between 30 and 60. There was no in-between. :lol
That's because of messed-up vsync. Just disable it and use triple buffering with Nvidia Inspector or AMD CCC.
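For what it's worth, the 30/60 alternation follows directly from how double-buffered vsync works: a frame that misses the ~16.7 ms refresh deadline has to wait for the next vblank, so the framerate snaps to 60, 30, 20, and so on with nothing in between. A minimal Python sketch of that quantization (a simplified model; real drivers and frame queues add nuance):

```python
# Sketch: why double-buffered vsync snaps the framerate to 60 or 30.
# A frame that misses the refresh deadline waits for the next vblank,
# so the effective framerate is always refresh_rate / n for integer n.
REFRESH_HZ = 60

def effective_fps(render_ms: float, refresh_hz: int = REFRESH_HZ) -> float:
    """Framerate after vsync quantization, given raw frame render time."""
    interval_ms = 1000.0 / refresh_hz
    # Number of whole refresh intervals the frame occupies (ceiling division).
    intervals = max(1, -(-render_ms // interval_ms))
    return refresh_hz / intervals

# A frame that takes 15 ms holds 60 fps, but 17 ms (barely slower) drops to 30:
print(effective_fps(15.0))  # 60.0
print(effective_fps(17.0))  # 30.0
print(effective_fps(35.0))  # 20.0
```

Triple buffering removes that snapping because the GPU can keep rendering into a third buffer instead of stalling on the vblank.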
There is no way these system requirements are legit. A 2500K and 660 minimum?
Ask yourself whether you think this game is going to be more of a graphical bear than Crysis 3. No? Because a 2600K and a 660 meet the High Performance requirements for Crysis 3.
There is no way AC Unity is pushing Crysis 3 max settings type of quality.
I'm going to assume they are asking for these beefy specs because the port is optimized horribly and will require significant overhead.
Welcome to double buffering. You could try to force triple buffering with D3DOverrider and find that your experience is much better.
I'm getting 50-60 fps at maximum settings (except PhysX on low) at 1080p.
It's not enough to call the port a magnificent effort but my experience has been very positive regardless.
I see no reason why having a superior experience to that of consoles will be difficult on PC, even on low end GPUs (GTX 750ti, R9 265). Obviously, 60fps will always be a hard target to reach.
Seriously... they just want people to be scared so they buy the more expensive console version. All the recommended specs have been bullshit lately.
The problem with triple buffering is that it adds up to another 33 ms of latency.
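That 33 ms figure presumably comes from one extra buffered frame at 30 fps; a quick back-of-the-envelope sketch (assuming one additional frame of queue depth, which is the worst case):

```python
# Sketch: where the "extra 33 ms" figure for triple buffering comes from.
# Each extra queued frame adds up to one frame-time of display latency.
def added_latency_ms(fps: float, extra_buffers: int = 1) -> float:
    """Worst-case latency added by extra_buffers queued frames at a given fps."""
    return extra_buffers * 1000.0 / fps

print(round(added_latency_ms(30)))  # 33
print(round(added_latency_ms(60)))  # 17
```

At 60 fps the same extra buffer only costs about 17 ms, which is why the penalty is felt most at low framerates.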
The CPU has nothing to do with resolution, so an i3 is "good enough" for 30 fps at 1080p in CPU-limited scenarios.
But as the generation goes along, more cores (physical or otherwise) will make a difference or become necessary for high framerates.
Ah, if only DX12 had released much earlier.
Because... they don't own multiple PCs to test and make sure it works? Because minimum and recommended specs mean nothing and change depending on the publisher?
And yes, it's still true at 1080p, because, you know, resolution has very little to do with the CPU...
The power gap between minimum and recommended is way too small to draw any conclusions from. If these requirements mean Low vs. High/Max settings (which they likely don't), there's nowhere near enough power difference to show it (or there are barely any configurable settings). Hell, even the AMD CPU is the same for both.
It's worth keeping in mind the Nvidia collaboration, so I wouldn't be surprised if the specs were based around enabling some of their features as well.
The Evil Within has been shown to have absolutely terrible multi-threading that doesn't even take advantage of i7s (with an i3 near the top, as if it were Dolphin or something), and I've been playing Mordor on my 2500K at 60 FPS with a 670 just fine... (at PS4 settings as well).
Recommended requirements are generally hyper-inflated for liability reasons, especially as of late.
As I said earlier, it's essentially impossible to form an argument using recommended system requirements, as they are steeped in bullshit.
I played Mordor at almost-maxed settings at 60 fps (pretty stable, too) with a non-OC'd (3.3 GHz) 2500K.
What are you talking about? They are both 60 dollars.
Now clock that i5-2500K at 4.5 GHz+, which is a very simple task, and a four-year-old product can play at max. Mine is clocked at 4.5 GHz, and Shadow of Mordor, Watch Dogs and The Evil Within didn't even slightly rattle it.

Yes, the 2500K is listed as a minimum requirement, so you should be okay on the lower settings.
Shadow of Mordor, Watch Dogs, and The Evil Within all come with i7 recommendations. System reqs will only get higher, and old hardware like the 2500K will only be enough for the low end.
Freaky stuff!
Ah, that's cool. I was only looking at the system reqs, so my 3570K will be enough for some time.
Still, I think that from next year on, current-gen-only games will really start using more cores and hyper-threading.
No idea what you are trying to prove with this benchmark. A Core i3 does very well here.
Freaky stuff!
That was the case in the previous console generation as well.

Still, I think that from next year on, current-gen-only games will really start using more cores and hyper-threading.
What are you talking about? They are both 60 dollars.
Here in Europe console games are way more expensive: €70, while the PC version is €50 (or even €30 if you search key resellers). So it's not far-fetched to say they exaggerate on purpose. Recent games have shown that the official specs are not correct.
So the consensus is that the specs are bullshit? If so, why would Ubisoft want to scare off potential customers?