A GameWorks game that performs worse on AMD HW than on Nvidia HW? Color me shocked.
It would be interesting if somebody compiled a list of GameWorks vs non-GameWorks titles where the developer issued warnings about poor performance on AMD cards.
Can't think of any in recent history. Anybody willing to chime in?
I think it's quite nice of Rocksteady to at least say out loud that GameWorks games may, and do, run worse on AMD cards than on Nvidia. After all, GameWorks is all about catering to Nvidia's ecosystem and [future] consumers.
If you are interested in seeing how AMD GPUs perform in games that use blackbox GameWorks code/features, you just need to read benchmarks; there's usually at least one GW game in there.
Have you ever used PhysX in an Arkham game? It's incredibly badass, and I hope to see more uses of it in games in the future. It's the only thing around that seems to be doing something new on the PC.
I can understand people being unhappy with PhysX features (and they absolutely should be optional), but they are really awesome and make every game I've used them in much better for being there.
How many games still use PhysX's GPU-accelerated features? Also, should we really celebrate the fact that things like improved physics engines are developed as proprietary tech by companies like Nvidia instead of as open source by game developers themselves?
I'm talking about the 390X they just refreshed. The Fury X looks great and hopefully will beat the 980 Ti. Maybe the 390X isn't the card that AMD wants to compare to the 980 Ti, but it's what's available, and people will make the comparisons.
Why would anyone compare the 390X and the 980 Ti? Those aren't even comparable cards in price or performance; the 390X isn't positioned against the 980 Ti. Anyone looking at benchmarks should realize that.
Having large amounts of tessellation, a technique used in tons of games nowadays, is sabotage?
HairWorks in The Witcher 3 uses a 64x tessellation factor and 8x MSAA for hair, which cripples performance on AMD and deals a rather hard blow to NV performance too. An AMD user can then force tessellation down to e.g. 16x through the drivers and gain a significant performance increase with next to no visible downgrade in quality.
How is forcing such insane tessellation factors into a game and/or its effects not intentionally sabotaging GPU performance, when the image quality gain just isn't there to justify it?
Also worth noting: CDPR can't and isn't even allowed to touch HairWorks code and settings, because it's blackbox GameWorks code. Only Nvidia is allowed to tweak it.
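To put some rough numbers on the tessellation point above, here's a quick back-of-the-envelope sketch. It assumes, purely for illustration, that the hair geometry the tessellator emits scales roughly with the product of the tessellation factors, and that a driver-side cap like Catalyst's "Maximum Tessellation Level" clamps those factors; it is not a description of HairWorks internals.

```python
# Back-of-the-envelope estimate of how much tessellation workload a
# driver-side factor cap removes. Purely illustrative: assumes the
# geometry emitted per hair patch scales with the product of the two
# tessellation factors (a simplification, not actual HairWorks code).

def segments_per_patch(density_factor: float, detail_factor: float) -> float:
    """Approximate primitives emitted by the tessellator for one patch."""
    return density_factor * detail_factor

def remaining_workload(app_factor: float, cap: float) -> float:
    """Fraction of the original tessellation workload left after the
    driver clamps both factors to 'cap' (e.g. Catalyst's
    'Maximum Tessellation Level' override)."""
    clamped = min(app_factor, cap)
    return segments_per_patch(clamped, clamped) / segments_per_patch(app_factor, app_factor)

if __name__ == "__main__":
    for cap in (64, 32, 16, 8):
        print(f"cap {cap:>2}x -> ~{remaining_workload(64, cap):.1%} of the 64x workload")
    # Under these assumptions a 16x cap keeps only ~6% of the geometry
    # the 64x default generates, which is why the override helps so much.
```

The actual in-game gain depends on how much of the frame HairWorks occupies, so treat these ratios as an upper bound on the geometry savings rather than a predicted FPS delta.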
People do realize that Nvidia isn't literally throwing money at devs to write better code for them, right? What they do is share their knowledge base and provide engineering assistance to the devs. AMD is just as capable of doing the same thing. Let's face it: at minspec, this is not a GameWorks issue.
Nvidia providing devs with blackbox GameWorks code/features, and then engineers to help implement it into the game while improving the game's performance on NV cards, isn't happening for free out of NV's goodwill. In exchange NV gets to use the game, in this case the latest Batman, to market its latest HW, to create HW + game bundles and to do "Best on Nvidia!" brand marketing. Nvidia doesn't need to transfer 10,000,000 USD to WB's/Rocksteady's bank account or the other way around; there are other forms of "payment".
Also, I would argue that AMD isn't capable of doing the same thing, as their comparable techs are open source rather than proprietary and blackboxed like NV's. One could argue that AMD should then start doing the same stuff as NV, but would it do any good for them or for consumers? I don't think it would.
Edit:
I'm just going to place this here.
Catalyst 15.6, but yeah, AMD sucks.
Why does AMD suck? Because they have to work with code they can't see and still have to optimise for it?