Gears of War Ultimate Edition PC Specs Leaked

What's with these Microsoft PC titles and their seemingly hyper-inflated system requirements? 16GB of RAM for a 360 remake? What a joke.

Have you even seen the game? It's beyond just a simple re-release or port. It's a full-on remaster, including the multiplayer. I believe the PC version brings 4K support as well, which is why you could see inflated spec requirements.
 
I would like to play this in 4k... I probably need to get rid of my sealed OG GoW... but I want to see some side by side comparisons of both at the same res maxed out. So how much is it and will it have cross buy?
 
So how much is it and will it have cross buy?

No idea on price; most likely the current Xbox store tag ($39.99?). Most likely yes on cross-buy, in the same vein as Quantum Break. Maybe all Xbox One users who have the game digitally will get a free PC copy too. They could go out of their way and reward the physical copies too, with a cutoff date, like the GeOW back compat they did last year, but it may get cumbersome.
 
Believe it or not, you'll also need a ton of RAM for a PS1 remake when FFVII launches.

Wait you mean we will need 24GB+ ram + whatever VRAM for FF remake and 32GB for new games? Since people are saying 16GB is now suddenly common for a remake.

Poster below is right, FF7 isn't a remake, it's being done entirely different, you aren't playing the same game.
 
From a technical perspective they kind of are.

gears ultimate edition looks like an xbox 360 game with some higher res assets. ff7 remake looks like a high budget modern game on cutting edge tech. it would be completely reasonable for ff7 remake to recommend top end pc hardware
 

So they plan on offering DX12 specific MSAA support, better textures, and better geometry quality for the game models so it still looks good @ 4k and not just upressed (kinda like how gears 1 PC had higher res textures than the 360 release even). Furthermore, they will make use of Async compute.

It does not sound like a straight port when you put it this way. MSAA support, if done correctly, could be absolutely awesome and quite performant since it is forward rendered and DX12.

I hope they offer effects parity with the cutscenes or make the cutscenes real time on PC, otherwise there will be a horrible contrast with gameplay. OBMB and bokeh whilst in game would be nice.
 
So they plan on offering DX12 specific MSAA support, better textures, and better geometry quality for the game models so it still looks good @ 4k and not just upressed. Furthermore, they will make use of Async compute.

It does not sound like a straight port when you put it this way. MSAA support, if done correctly, could be absolutely awesome and quite performant since it is forward rendered and DX12.

what changes in dx12 make msaa more performant? also isnt UE3 mostly deferred?
 
what changes in dx12 make msaa more performant? also isnt UE3 mostly deferred?

UE3 eventually went fully deferred (samaritan), but since they started with that GeOW 1 base of the engine for this, they kept it forward rendered according to the interview. And DX12 offers enhanced MSAA support through baseline programmable MSAA sample locations and stuff for coverage maps. AKA, you can have smarter MSAA in simple terms.

In the interview they mention:
"On the GPU side, we've converted SSAO to make use of async compute and are exploring the same for other features, like MSAA."
 
From a technical perspective they kind of are.

They really aren't. Gears of War Ultimate Edition is closer to Halo 2 Anniversary or even The Legend of Zelda: Twilight Princess HD. Lots of work has been done to change the textures and modernise the game, but it certainly isn't a full-fledged remake like Final Fantasy 7 will be.
 
UE3 eventually went fully deferred (samaritan), but since they started with that GeOW 1 base of the engine for this, they kept it forward rendered according to the interview. And DX12 offers enhanced MSAA support through baseline programmable MSAA sample locations and stuff for coverage maps. AKA, you can have smarter MSAA in simple terms.

In the interview they mention

i thought programmable sample locations were for improved quality without incurring further performance degradation. i also thought even the first version of UE3 was mostly deferred, hence msaa always being a huge problem for it. msaa via async compute sounds good, if not extremely difficult. would certainly be a huge win for amd gpus
 
650Ti minimum and 970 at 1080p(60fps?) are considered high requirements for a 2016 game?

I don't get it either. Due to the inflated prices since we've been stuck on 28nm (from 2011 onwards), the 970 is just an overpriced (and gimped) performance card, not a high-end one. It just isn't priced appropriately.
 
They really aren't. Gears of War Ultimate Edition is closer to Halo 2 Anniversary or even The Legend of Zelda: Twilight Princess HD. Lots of work has been done to change the textures and modernise the game, but it certainly isn't a full-fledged remake like Final Fantasy 7 will be.
As they've said, they didn't start again because they wanted to keep the core gameplay elements. That would have changed with a rewrite.

Come on PC guys. Every thread around requirements is either "shitty console port and they've done nothing for PC" or "Unoptimised crap, look at those specs".

A stick of 8GB DDR4 is 30 quid, for Christ's sake.
 
i thought programmable sample locations were for improved quality without incurring further performance degradation.
Exactly, smarter MSAA. Same or better quality with less sampling? Sounds more performant to me.

i also thought even the first version of UE3 was mostly deferred
If I remember correctly, and if the Samaritan upgrade PDF (see slide 15 and below) is a good indication, UE3 was always primarily forward rendered. I think the way cubemaps and shadows were done was deferred, though.
As per the PDF, and as the UDK docs put it post-Samaritan: "Deferred shading is a technique which allows dynamic lights to be rendered much more efficiently, but with a restricted feature set. Traditional UE3 lighting is called forward shading, because the dynamic lighting calculations are done while rendering the meshes of the scene."
I am pretty sure DX10 in GeOW 1 PC and on Xbox 360 had MSAA due to enhanced support for MSAA in MDR/HDR render targets (see slide 17). Although, the Xbox 360 MSAA was just terribly broken, as it came before the HDR pass. On PC, I believe - I'd have to reinstall to check - it has better coverage due to DX10.
, hence msaa always being a huge problem for it.
I think that depends on the game actually - whether it used the DX10 path. SGSSAA uses MSAA samples, right? And that has great support in UE3 titles even under DX9.
msaa via async compute sounds good, if not extremely difficult. would certainly be a huge win for amd gpus
I am not sure why that sounds difficult, as I have no idea about the technical backend behind it, but since they are professional devs, I think they know what they are talking about.

edit: If I recall, nooblet has a good memory of all the technical details regarding Gears of War 1 on Xbox 360 and PC. Perhaps he could chime in.
 
As they've said, they didn't start again because they wanted to keep the core gameplay elements. That would have changed with a rewrite.

Come on PC guys. Every thread around requirements is either "shitty console port and they've done nothing for PC" or "Unoptimised crap, look at those specs".

A stick of 8GB DDR4 is 30 quid, for Christ's sake.

Yeah I agree, just saying it's not a remake on par with Final Fantasy 7.
 
As they've said, they didn't start again because they wanted to keep the core gameplay elements. That would have changed with a rewrite.

Come on PC guys. Every thread around requirements is either "shitty console port and they've done nothing for PC" or "Unoptimised crap, look at those specs".

A stick of 8GB DDR4 is 30 quid, for Christ's sake.

There is no reason that what is essentially a 360 game with slightly improved assets should require anywhere near 16gb of ram and a graphics card with another 4 gb for 1080p.
 
Wait you mean we will need 24GB+ ram + whatever VRAM for FF remake and 32GB for new games? Since people are saying 16GB is now suddenly common for a remake.

Poster below is right, FF7 isn't a remake, it's being done entirely different, you aren't playing the same game.
Why does it matter if a game plays the same or not?
System requirements are related to the rendering, and one has to be blind not to see that Gears UE has totally new and modern rendering that has absolutely nothing to do with the original game... it does not matter if the gameplay engine is the same.

They really aren't, Gears of War Ultimate Edition is closer to Halo 2 Anniversary or even The Legend of Zelda: Twilight Princess HD. Lots of work has been done to change the textures and modernise the game but it certainly isn't a full fledged remake like Final Fantasy 7 will be.

1) Twilight Princess HD is not the same as Halo 2 Anniversary.
Revamping textures of what is essentially the same engine is not the same as making the same game in a new engine (and believe me, the Gears 1 engine and the engine used in Gears UE are at least 2 generations apart).

2) Absolutely NOWHERE does it say that a remake has to use a different gameplay engine to be called a remake, or that a game with completely remade rendering/cutscenes isn't a remake because it uses the same gameplay engine. If you are aware of such a distinction, then I'd like you to show it to me.

3) See above response to the other person
 
There is no reason that what is essentially a 360 game with slightly improved assets should require anywhere near 16gb of ram and a graphics card with another 4 gb for 1080p.
"Slightly improved" assets. That's just untrue.

There is a world of difference between Ultimate and the original game in terms of detail.
 
There is no reason that what is essentially a 360 game with slightly improved assets should require anywhere near 16gb of ram and a graphics card with another 4 gb for 1080p.
What your referring to there is the PC version of GoW which released in '07.

This is a completely different kettle of fish.
 
Why does it matter if a game plays the same or not?
System requirements are related to the rendering, and one has to be blind not to see that Gears UE has totally new and modern rendering that has absolutely nothing to do with the original game... it does not matter if the gameplay engine is the same.



1) Twilight Princess HD is not the same as Halo 2 Anniversary.
Revamping textures of what is essentially the same engine is not the same as making the same game in a new engine (and believe me, the Gears 1 engine and the engine used in Gears UE are at least 2 generations apart).

2) Absolutely NOWHERE does it say that a remake has to use a different gameplay engine to be called a remake, or that a game with completely remade rendering/cutscenes isn't a remake because it uses the same gameplay engine. If you are aware of such a distinction, then I'd like you to show it to me.

3) See above response to the other person
Nooooooobbbbbbleettttttt

Demonstrate your knowledge of Gears 1 and its original rendering. :D
 
Why does it matter if a game plays the same or not?
System requirements are related to the rendering, and one has to be blind not to see that Gears UE has totally new and modern rendering that has absolutely nothing to do with the original game... it does not matter if the gameplay engine is the same.



1) Twilight Princess HD is not the same as Halo 2 Anniversary.
Revamping textures of what is essentially the same engine is not the same as making the same game in a new engine (and believe me, the Gears 1 engine and the engine used in Gears UE are at least 2 generations apart).

2) Absolutely NOWHERE does it say that a remake has to use a different gameplay engine to be called a remake, or that a game with completely remade rendering/cutscenes isn't a remake because it uses the same gameplay engine. If you are aware of such a distinction, then I'd like you to show it to me.

3) See above response to the other person

what in the world are you talking about?
 
There is no reason that what is essentially a 360 game with slightly improved assets should require anywhere near 16gb of ram and a graphics card with another 4 gb for 1080p.
You mean completely redone, right? Because there is nothing the same in this remake as in the original or PC versions. Everything was remade for the Xbox One.
 
If I remember correctly, and if the Samaritan upgrade PDF (see slide 15 and below) is a good indication, UE3 was always primarily forward rendered. I think the way cubemaps and shadows were done was deferred, though.
As per the PDF, and as the UDK docs put it post-Samaritan: "Deferred shading is a technique which allows dynamic lights to be rendered much more efficiently, but with a restricted feature set. Traditional UE3 lighting is called forward shading, because the dynamic lighting calculations are done while rendering the meshes of the scene."
I am pretty sure DX10 in GeOW 1 PC and on Xbox 360 had MSAA due to enhanced support for MSAA in MDR/HDR render targets (see slide 17). Although, the Xbox 360 MSAA was just terribly broken, as it came before the HDR pass. On PC, I believe - I'd have to reinstall to check - it has better coverage due to DX10.

I think that depends on the game actually - whether it used the DX10 path. SGSSAA uses MSAA samples, right? And that has great support in UE3 titles even under DX9.

I am not sure why that sounds difficult, as I have no idea about the technical backend behind it, but since they are professional devs, I think they know what they are talking about.

edit: If I recall, nooblet has a good memory of all the technical details regarding Gears of War 1 on Xbox 360 and PC. Perhaps he could chime in.
yea, Gears 1 uses what you could describe as a partially deferred renderer.
As far as I know, Gears 1 PC suffered from the same issue as the 360 version, wherein MSAA was applied before the HDR pass and only to static objects; but on PC it needed DX10 due to its ability to perform a custom AA resolve in the shader hardware.

The whole thing about MSAA and deferred rendering incompatibility was mostly a software issue related to the API, as far as I know; this was a good post on why consoles, with their DX9-level hardware, could sometimes do things that on PC would require DX10.
 
Metro 2033 had 8GB of DDR3 recommended even though it couldn't use past 2GB. The Evil Within required 4GB of VRAM, but in reality it didn't get anywhere near that. I could go on forever about how silly recommended specs have been.

Games now can use more than 4GB of system RAM. I play Rise of the Tomb Raider with no problem whatsoever on my i7 930 rig that has 6GB of DDR3. Games these days will just use up what you have. Sadly, someone with 16GB of RAM will report "the game is using up all my RAM, shrreeeek, heavens above, oh the horror" and set off a bunch of people thinking they need to upgrade. The Witcher 3 and GTA V just use up what you have; I play them both fine with no issue.

Let's wait till the game comes out.
 
You mean completely redone, right? Because there is nothing the same in this remake as in the original or PC versions. Everything was remade for the Xbox One.



It looks better, but there is nothing about the screens I've seen of this remake that screams it needs 16GB of RAM and a brand-new 4GB graphics card.

https://www.youtube.com/watch?v=35Sk3-TWmsI
 
what in the world are you talking about?

I am talking about the fact that the engine used in Gears 1 was first-gen UE3, and the engine used in Gears 2 was second-generation UE3; Epic has covered the improvements made in enough detail to verify this claim (it was informally referred to as UE3.5 even back then). When they added Lightmass along with other features for Gears 3, that's when you saw the third generation of UE3. In fact, you don't even need to go through Epic's coverage on this, since you can tell the difference between these three iterations just by looking at them.

Just because all these iterations are called Unreal Engine 3 does not mean they are the exact same engine. Since engine development is a constant process, a build made a week after v1.0 can be called v1.1 or v1.2... naming an engine is a completely arbitrary process.
 
The 16GB of RAM recommended at 1080p seems pretty crazy to me. I highly doubt the game will use that much.
Not that it's necessarily expensive to get that much RAM.
 

It looks better, but there is nothing about the screens I've seen of this remake that screams it needs 16GB of RAM and a brand-new 4GB graphics card.

https://www.youtube.com/watch?v=35Sk3-TWmsI
How about you don't pick a shot that shows an unlit character taking up 40% of the screen, aiming at nothing? Also, that's from MP, which is visually simpler than SP (which I assume is the basis for the recommended specs).

https://www.youtube.com/watch?v=eF-Gc5wWsik

 
Already played and enjoyed it on Xbone - a good remaster, but not that polished, with a few raw edges here and there. As for these PC requirements, they look strange to say the least. "OK" for Quantum Break, but this shouldn't be that demanding. I'm really curious to see how it will actually run on PC when released. I could also buy it myself to try it out ahead of Quantum Break's release, since they have almost the same requirements, especially the RAM ones; I've never had a problem so far, even in the most demanding games, with 8GB.
 
Didn't the previous PC version have a map editor? It would be nice for this version to have one as well; not expecting anything, though.
 
How about you don't pick a shot that shows an unlit character taking up 40% of the screen, aiming at nothing?

https://www.youtube.com/watch?v=eF-Gc5wWsik

We know it was remade and all - I played the original on 360 and this Ultimate Edition on Xbox One last year - but these graphics don't justify a GTX 980 Ti and 16GB of RAM for recommended 1080p settings. There are plenty of PC games out there which look way better, with open worlds and dynamic weather, but are way less demanding, especially on RAM.
But I bet this game will easily be maxed out at 1080p 30fps with a 970 and 8GB of RAM in the end.

PS: Alright, I read it wrong the first time. It does indeed say that a GTX 980 Ti is recommended for 4K, and a 970 for 1080p... so everything looks fine to me, except for that system RAM requirement; 16GB is not right at all for 1080p.
 
We know it was remade and all - I played the original on 360 and this Ultimate Edition on Xbox One last year - but these graphics don't justify a GTX 980 Ti and 16GB of RAM for recommended 1080p settings. There are plenty of PC games out there which look way better, with open worlds and dynamic weather, but are way less demanding, especially on RAM.
But I bet this game will easily be maxed out at 1080p 30fps with a 970 and 8GB of RAM in the end.
read the specs again :)
 
16GB of RAM? Gotta be a joke.

Why? It's 2016. The PS4 has 8GB of VRAM - as a shared pool, sure, but still. If anything, it's time for memory requirements to take a step up. We're on the verge of getting insane amounts of VRAM on our single GPUs, in smaller form factors as well. In terms of RAM, you can get 16GB of DDR4 for less than I paid for my 16GB of DDR3 six months ago.
We've been sitting on 8GB of RAM as the standard for a long time now. It's time for progress.

EDIT: Any word on its release? Dying to get hold of a DX12 game, not to mention the new features and performance it brings.
 