Nvidia's GameWorks causing issues for AMD users in Gears of War: Ultimate Edition


We've just learned that no one at AMD was informed by Microsoft that review codes were being handed out to journalists for performance testing earlier this week; AMD only found out after the fact.

We have also tested the game again with the newly released patch. This patch was intended to fix the visual corruption issue with ambient occlusion turned on. However, it appears that all this patch does to address the issue is forcibly disable Ambient Occlusion, even if enabled through the menu.

In the case of Gears of War: Ultimate Edition, it was Nvidia's proprietary HBAO+ GameWorks feature listed under a generic name. We only learned that the ambient occlusion implementation in Gears of War: Ultimate Edition was in fact HBAO+ when Nvidia's Andrew Burnes announced the availability of Game Ready drivers on the GeForce.com blog and named it as one of the game's features.

Simply turning off ambient occlusion in the settings will completely resolve the artifacting that we witnessed while testing the AMD Radeon R9 Nano and R9 380 graphics cards.

As it turns out, HBAO+ wasn't the only thing hidden behind a veil. Whilst digging into the game's files, a friend of WCCF – YouTuber Blindrun – spotted a very interesting hidden game file. The file is BaseEngine.ini and it can be found in the following folder path on Windows 10: C:\Program Files\WindowsApps\Microsoft.DeltaPC_1.6.0.0_x64__8wekyb3d8bbwe\Engine\Config

The WindowsApps folder is Windows-protected, which means you'll have to jump through some hoops to actually access it.

In BaseEngine.ini we spotted a very peculiar entry: “bDisablePhysXHardwareSupport=False”. More peculiar is the fact that this file cannot be edited in any way. If it is, the game will simply overwrite any changes once it's booted up and connected to Microsoft's servers.
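For context, this is roughly where that line sits in a stock Unreal Engine 3 configuration; the [Engine.Engine] section header and the comments below are an assumption based on other UE3 titles' ini files, not taken from Gears' own BaseEngine.ini:

[Engine.Engine]
; Stock UE3 ships this flag. "False" leaves hardware PhysX support enabled;
; setting it to "True" is the usual way to force CPU-only PhysX in UE3 games,
; which is exactly the edit the game refuses to keep.
bDisablePhysXHardwareSupport=False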

The entry means that hardware-accelerated PhysX is enabled by default in the game, and because any changes to the file are automatically overwritten on start-up, it can't be disabled. For those of you unaware, PhysX is Nvidia's proprietary game physics engine and can only be accelerated by Nvidia's own graphics cards. If a competing GPU is detected by the PhysX driver, the work will instead be automatically offloaded onto the CPU. This in turn means that the feature will run disproportionately slower on systems without an Nvidia GPU.

Additionally, because hardware-accelerated PhysX features are purely visual and aren't part of the game's core mechanics, disabling them would not affect the game's behavior in any way. Sadly, because we are unable to turn off hardware acceleration in Gears of War: Ultimate Edition, we don't know what it's actually doing, or whether it could account for some of the performance disparity we're seeing between Nvidia and AMD graphics cards. Right now we simply don't know what effect it has, because we can't test it.

tl;dr ambient occlusion is actually nvidias hbao+ despite being unlabeled in game (AMD users either play with a broken game or get no ambient occlusion), and physx is forced on at all times with no way to turn it off or know what it does for performance thanks to microsoft's total control over windows store apps

http://wccftech.com/nvidia-gameworks-visual-corruption-gears-war-ultimate-edition/

old? searched nvidia in thread titles but didn't find anything about this, and i figure it's big enough news to be separate from the PC performance thread

fucking christ. piss poor job by both MS and Nvidia here.

reminds me of the tessellation in crysis 2

I'm worried we're gonna see more of this, not less, as time goes on and Nvidia grows even stronger

definitely feeling like replacing my 970 with whatever flagship AMD launches at 13 inches or under this year


or apparently it's only MS's fault? my bad
 
I don't see how any of this can be blamed on Nvidia. They didn't walk up to Microsoft and say "hey, we're going to force you to use our AO and nothing else, and BTW we're going to force you to use PhysX and not be able to toggle it off too". It's a Microsoft fuck up, nothing more and nothing less, don't try to push an agenda.
 
tl;dr ambient occlusion is actually nvidias hbao+ despite being unlabeled in game (AMD users either play with a broken game or get no ambient occlusion), and physx is forced on at all times with no way to turn it off or know what it does for performance thanks to microsoft's total control over windows store apps

http://wccftech.com/nvidia-gameworks-visual-corruption-gears-war-ultimate-edition/

old? searched nvidia in thread titles but didn't find anything about this, and i figure it's big enough news to be separate from the PC performance thread

fucking christ. piss poor job by both MS and Nvidia here.

reminds me of the tessellation in crysis 2

I'm worried we're gonna see more of this, not less, as time goes on and Nvidia grows even stronger

definitely feeling like replacing my 970 with whatever flagship AMD launches at 13 inches or under this year

I fail to see why this is nVidia's problem.

I also like how your response to this is to go and buy one of the cards that's having the issue...out of principle XD
 
We've just learned that no one at AMD was informed by Microsoft that review codes were being handed out to journalists for performance testing earlier this week; AMD only found out after the fact.

This, assuming for the moment Nvidia played a part in how the game's launch build fares on AMD hardware, is similar to what happened with Tomb Raider 2013. Squeenix stopped updating Nvidia's branch some weeks before launch and didn't push through a patch until just days remained; Nvidia found out the hard way that most of its optimisation work had been undone in the interim.

Neither side has a moral high horse. It's a merry-go-round of politics that'll continue to spin for the foreseeable future.
 
I fail to see why this is nVidia's problem.

I also like how your response to this is to go and buy one of the cards that's having the issue...out of principle XD

If AMD's market share wasn't so small the port would have needed to accommodate their hardware better.

But yeah, this appears to be on the developer and Microsoft's shitty store, not Nvidia.
 
This Windows 10 store is looking more and more like a worse GFWL, and we all know how that ended up. It's PR disaster after PR disaster.

Get your shit together and stop treating PC gamers like shit whilst pretending you care.
 
These entries are standard in the Unreal Engine .ini. Is there anything more (a benchmark with/without a modified .ini?) aside from speculation and conspiracy theories?
 
And the rather stupid misinformation about what PhysX is and what it does still continues...

As a developer it's rather infuriating, because a lot of games actually use PhysX as the base physics engine (instead of Havok for example). PhysX is much, much more often used as the rigid body solver of the engine, and run only on the CPU (because you wouldn't offload base object logic onto the GPU). Unity and Unreal 3/4 both use PhysX that runs solely on the CPU. So no, PhysX is not the goddamn reason here and is not run on the GPU.

Yes, you can still have some GPU-run particle effects, but that's not what PhysX stands for in most cases these days (a lot of the particle fanciness has been moved under the GameWorks umbrella).

These entries are standard in the Unreal Engine .ini. Is there anything more (a benchmark with/without a modified .ini?) aside from speculation and conspiracy theories?
Yes, and Unreal Engine uses PhysX as its core physics system, and it is not run on the GPU unless you adopt some of the particle physics effects (which I don't think are generally even available for UE at the moment).

EDIT: Please read up on what PhysX is used for in UE4. It has nothing to do with rendering or vendor-specific performance.
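To make that concrete, here's a minimal sketch of how an engine typically drives PhysX 3.x as a plain CPU rigid-body solver. It's illustrative only (standard PhysX 3.3-era SDK calls, not code from UE or from Gears), but note that nothing in it ever touches the GPU:

#include <PxPhysicsAPI.h>

using namespace physx;

static PxDefaultAllocator     gAllocator;
static PxDefaultErrorCallback gErrorCallback;

int main()
{
    // Standard PhysX SDK setup boilerplate.
    PxFoundation* foundation = PxCreateFoundation(PX_PHYSICS_VERSION, gAllocator, gErrorCallback);
    PxPhysics*    physics    = PxCreatePhysics(PX_PHYSICS_VERSION, *foundation, PxTolerancesScale());

    // The scene is driven by a CPU dispatcher with two worker threads. No
    // PxCudaContextManager / GPU dispatcher is ever created, so GPU rigid
    // bodies or particles would be a separate, explicit opt-in.
    PxSceneDesc sceneDesc(physics->getTolerancesScale());
    sceneDesc.gravity       = PxVec3(0.0f, -9.81f, 0.0f);
    sceneDesc.cpuDispatcher = PxDefaultCpuDispatcherCreate(2);
    sceneDesc.filterShader  = PxDefaultSimulationFilterShader;
    PxScene* scene = physics->createScene(sceneDesc);

    // One dynamic box: the kind of gameplay collision/rigid-body work engines
    // actually use PhysX for, simulated entirely on the CPU regardless of GPU vendor.
    PxMaterial*     material = physics->createMaterial(0.5f, 0.5f, 0.1f);
    PxRigidDynamic* box      = PxCreateDynamic(*physics, PxTransform(PxVec3(0.0f, 10.0f, 0.0f)),
                                               PxBoxGeometry(1.0f, 1.0f, 1.0f), *material, 1.0f);
    scene->addActor(*box);

    // Step one second of simulation at 60 Hz.
    for (int i = 0; i < 60; ++i)
    {
        scene->simulate(1.0f / 60.0f);
        scene->fetchResults(true);
    }

    scene->release();
    physics->release();
    foundation->release();
    return 0;
}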
 
These entry are standard in the unreal engine .ini, is there more (A benchmark with/without modified .ini?) aside speculation and conspiration theory?

It would be really useful to test this. Unfortunately, it sounds like actually modifying the .ini is not possible thanks to how UWA/the Windows Store works.
 
I'm actually loving this. Durante took a load of shit in one of the previous UWA threads for being opposed to the whole platform due to restrictions on mods. People were telling him that it was rare that mods are needed to fix a game and they tend to be for niche titles or Japanese developed games.

http://www.neogaf.com/forum/showpost.php?p=195507497&postcount=2384

And now look, we are on what, just our 2nd major release on the Windows Store, and it has issues that could likely be fixed with a simple mod that changes the configuration of the PhysX and AO settings for AMD users to allow them to play while we wait for a patch.
 
I'm actually loving this. Durante took a load of shit in one of the previous UWA threads for being opposed to the whole platform due to restrictions on mods. People were telling him that it was rare that mods are needed to fix a game and they tend to be for niche titles or Japanese developed games.

http://www.neogaf.com/forum/showpost.php?p=195507497&postcount=2384

And now look, we are on what, just our 2nd major release on the Windows Store, and it has issues that could likely be fixed with a simple mod that changes the configuration of the PhysX and AO settings for AMD users to allow them to play while we wait for a patch.

I will never understand people that are ok with closed platform nonsense on PC. And stuff like this is exactly why.
 
crysis 2 was built around the strength of nvidia cards and had TONS of unneeded tessellation solely to cripple AMD cards

This is a myth. And has been one since forever.

Maldo and Crytek have written about this in detail.
---
On topic - HBAO+ should not, by design, cripple performance, as it runs really well on AMD in tons of other titles. It's the Coalition's problem that their implementation is messed up, not NV's.
 
Sorry, what does this have to do with Nvidia? Did they make the game?

Well if it's true that the unlabeled AO in the game is actually HBAO+ by Nvidia, I doubt it just went past them without their knowledge. While I haven't seen HBAO+ being too big of a problem on AMD GPUs, it's still not something that should be obscured behind a generic AO label. That said, I don't get why Nvidia would approve the inclusion of HBAO+ and not have it explicitly stated. After all GameWorks effects are about marketing.
 
But, but I thought, since it's UWA now and they can't depend on modders to fix their games, everything would be great now? No more broken games....

We were told that. Right here on GAF.
 
I edited my opinions out with a strikethrough in the OP.



crysis 2 was built around the strength of nvidia cards and had TONS of unneeded tessellation solely to cripple AMD cards

It was confirmed that not everything you see in that famous wireframe screenshot was actually being rendered, because in wireframe mode there is no culling.
 
And the rather stupid misinformation about what PhysX is and what it does still continues...

As a developer it's rather infuriating, because a lot of games actually use PhysX as the base physics engine (instead of Havok for example). PhysX is much, much more often used as the rigid body solver of the engine, and run only on the CPU (because you wouldn't offload base object logic onto the GPU). Unity and Unreal 3/4 both use PhysX that runs solely on the CPU. So no, PhysX is not the goddamn reason here and is not run on the GPU.

Yeah, PhysX is a general-purpose physics system. Its original makers (Ageia) even tried to sell dedicated physics accelerator cards prior to Nvidia purchasing them.
 
Isn't Physx just part of UE3? Yup, it is. This is kind of bullshit, guys.
Now, maybe the AO part is true. And I can't tell, since HBAO+ runs better on AMD, lol.
 
But, but I thought, since it's UWA now and they can't depend on modders to fix their games, everything would be great now? No more broken games....

We were told that. Right here on GAF.



It's even funnier because it's a 1st-party game from the owner of the OS, owner of the so-called great ecosystem, maker of the so-called great app format, maker of the API it's using.

Another great commitment, and here we'll have people telling us to wait for action, that previous mistakes are a thing of the past.
 
And the rather stupid misinformation about what PhysX is and what it does still continues...

As a developer it's rather infuriating, because a lot of games actually use PhysX as the base physics engine (instead of Havok for example). PhysX is much, much more often used as the rigid body solver of the engine, and run only on the CPU (because you wouldn't offload base object logic onto the GPU). Unity and Unreal 3/4 both use PhysX that runs solely on the CPU. So no, PhysX is not the goddamn reason here and is not run on the GPU.

Yes, you can still have some GPU-run particle effects, but that's not what PhysX stands for in most cases these days (a lot of the particle fanciness has been moved under the GameWorks umbrella).


Yes, and Unreal Engine uses PhysX as its core physics system, and it is not run on the GPU unless you adopt some of the particle physics effects (which I don't think are generally even available for UE at the moment).

EDIT: Please read up on what PhysX is used for in UE4. It has nothing to do with rendering or vendor-specific performance.

But it should be done in software when using an AMD card. I understand the use of PhysX, although I don't like it, but games like Borderlands 2 let you change the effects used if you didn't have an Nvidia card.
 
Isn't Physx just part of UE3? Yup, it is. This is kind of bullshit, guys.
Now, maybe the AO part is true. And I can't tell, since HBAO+ runs better on AMD, lol.

Exactly. Also, leaving hardware-based PhysX support enabled doesn't mean it's actually going to be used, especially if the hardware it would run on isn't fit for it.
 
Yeah.

I wouldn't state this is Nvidia's problem.

There are three issues here:

  • The HBAO+ implementation may be buggy
  • Microsoft didn't inform AMD of the game's release
  • The AO setting doesn't label the implementation (and no alternative is provided).
The PhysX thing may or may not be a problem. The issue here is that the hardware PhysX flag is active in the config .ini, but we don't know whether the system gracefully reverts to software when it doesn't detect support, or even whether the flag does anything at all.
 
I can't believe Microsoft started their "let's put our games on PC now shall we" initiative with a shit port like this. Nice going, guys, you're off to a great start.
 
But it should be done in software when using an AMD card. I understand the use of PhysX, although I don't like it, but games like Borderlands 2 let you change the effects used if you didn't have an Nvidia card.

Read the post; Gears UE is software PhysX for everyone. No hardware-accelerated PhysX.
 
Yeah.

I wouldn't state this is Nvidia's problem.

There are three issues here:

  • The HBAO+ implementation may be buggy
  • Microsoft didn't inform AMD of the game's release
  • The AO setting doesn't label the implementation (and no alternative is provided).

I might add:
- There's no way for modders to fix it, due to it being a Windows Store game.

Read the post; Gears UE is software PhysX for everyone. No hardware-accelerated PhysX.

Sorry. I thought the “bDisablePhysXHardwareSupport=False” meant it used Hardware PhysX.
 
at least with GFWL you had microsoft peddling that shit on other services as well. now they're peddling their shit on their own shit.
 
But it should be done in software when using an AMD card. I understand the use of PhysX, although I don't like it, but games like Borderlands 2 let you change the effects used if you didn't have an Nvidia card.
You misunderstood; the PhysX in Unreal 3/4 is run in software on the CPU and has absolutely nothing to do with anything GPU-related, nor could it possibly have any implication on vendor-specific performance. It's never run on GPU hardware.

The effects in BL2 or similar have nothing to do with the PhysX used for collision physics.
 
This is definitely a Microsoft problem. I have to imagine Nvidia isn't happy with their technology not being explicitly referred to, either.

The game is a total fucking dog of a port and it's a disgrace that Microsoft is getting away with GFWL: Electric Boogaloo
 
This is definitely a Microsoft problem. I have to imagine Nvidia isn't happy with their technology not being explicitly referred to, either.

The game is a total fucking dog of a port and it's a disgrace that Microsoft is getting away with GFWL: Electric Boogaloo

They refer to it on their website. So they did know it was being used.
 