
nasty rumours about Nvidia/ATI on Far Cry...

DSN2K

Member
http://www.driverheaven.net/showthread.php?s=&threadid=51427

So what about the juicy gossip? Well, to say today has been a dramatic day is an understatement. During our testing period over the last 12 hours or so, we have spent a lot of time on the phone and in email with many people in the industry; one such person even stated this:

"Nvidia encouraged Crytek to use partial precision in the shaders wherever possible. This means that many of the shaders will run in 16-bit precision, not the 32-bit precision (a requirement of SM3.0) that they are touting to the press. Ironically, while promoting the 32-bit precision of SM3.0 as a "must have", Nvidia is asking developers to use less precision than was available in SM1.0 - that is, all the way back in DirectX8! Even more ironically, ATI hardware will run most of the FarCry shaders more accurately (ATI hardware runs all shaders in 24-bit precision). Microsoft, the owner of DirectX, defines 24-bit and greater as "full precision" and 16-bit as "partial precision", Nvidia has claimed that ATI paid Crytek to delay the patch and include ATI features (the figure mentioned was $500k!)."

So there you have it: some of the facts, and some rumours. Nonetheless, Crytek has committed to delivering 3Dc support in patch v1.3, which will provide higher-detail images, or free up memory, with better compression than traditional normal map compression options. Far Cry is a great example of a next-generation title that uses normal maps extensively. When the "final" patch arrives, we will be giving it a good going-over and posting in-depth results.
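For reference, the 3Dc trick itself is simple: the format stores only the X and Y components of each unit-length tangent-space normal (two well-compressed channels) and lets the pixel shader rebuild Z, which is why it beats squeezing all three channels through DXT. A rough C++ sketch of the reconstruction step (reconstructNormal is my own hypothetical name, not anything from Crytek's code):

#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

// Rebuild a tangent-space normal from the two channels a 3Dc texture stores.
// Tangent-space normals always face outward, so z can be taken as >= 0.
Vec3 reconstructNormal(float x, float y) {
    float z = std::sqrt(std::fmax(0.0f, 1.0f - x * x - y * y));
    return {x, y, z};
}

int main() {
    Vec3 n = reconstructNormal(0.30f, -0.45f);
    std::printf("n = (%.3f, %.3f, %.3f)\n", n.x, n.y, n.z);
}

Spending the whole bit budget on two channels instead of three is where the "higher detail or less memory" claim comes from.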

...
 
So basically Nvidia is cheating and falsely advertising AGAIN, and ATI is finally getting a developer to do what Nvidia's been doing for years (adding in card-specific features), and Nvidia doesn't like it.

Why does Nvidia come across as an over-the-hill prom queen that can't take the fact that they're not hot anymore and people won't lick their ass like they used to?
 

Shompola

Banned
"Nvidia encouraged Crytek to use partial precision in the shaders wherever possible. This means that many of the shaders will run in 16-bit precision, not the 32-bit precision (a requirement of SM3.0) that they are touting to the press."

Was this before the 6800 cards were released, or after? If before, then it's nothing new, really. Nvidia tried everything to speed up the 5xxx cards.
 

Mrbob

Member
Awesome. The 9800 is capable of geometry instancing, which means the 9800 (9500 and up) should at least get a small boost with the Far Cry 1.2 patch. Also, if I'm reading the thread right, 9700 and 9800 owners are going to be able to enable the SM2.0b path to get an extra boost.
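For anyone wondering what geometry instancing actually looks like on the programming side, here is a rough Direct3D 9 sketch (DrawInstanced is my own hypothetical helper; it assumes the device, buffers, vertex declaration, and a particular vertex layout already exist, and on SM2.0 Radeons the driver has to expose instancing support before any of this works):

#include <d3d9.h>

// Draw instanceCount copies of one mesh with a single draw call.
// Assumed layout: stream 0 = shared mesh vertices (8 floats each),
// stream 1 = one 4x4 transform (16 floats) per instance.
void DrawInstanced(IDirect3DDevice9* dev,
                   IDirect3DVertexBuffer9* mesh,
                   IDirect3DVertexBuffer9* perInstance,
                   IDirect3DIndexBuffer9* indices,
                   UINT vertexCount, UINT triCount, UINT instanceCount)
{
    // Stream 0 carries the geometry, re-read once per instance.
    dev->SetStreamSource(0, mesh, 0, sizeof(float) * 8);
    dev->SetStreamSourceFreq(0, D3DSTREAMSOURCE_INDEXEDDATA | instanceCount);

    // Stream 1 advances once per instance instead of once per vertex.
    dev->SetStreamSource(1, perInstance, 0, sizeof(float) * 16);
    dev->SetStreamSourceFreq(1, D3DSTREAMSOURCE_INSTANCEDATA | 1u);

    dev->SetIndices(indices);
    // One call submits every instance; this is the whole point.
    dev->DrawIndexedPrimitive(D3DPT_TRIANGLELIST, 0, 0, vertexCount, 0, triCount);

    // Restore the default (non-instanced) stream frequencies.
    dev->SetStreamSourceFreq(0, 1);
    dev->SetStreamSourceFreq(1, 1);
}

Instead of one draw call per tree or crate, every copy goes out in a single DrawIndexedPrimitive, which is where the boost in a foliage-heavy game like Far Cry would come from.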
 

marsomega

Member
I'm pretty sure this wouldn't surprise Mrbob or me. Nvidia is pretty much brutal, and this generation they really don't care about saving face. The "leaked" slides of what they are brainwashing their sales people and the retail people with alone pretty much tell you they are coming out with all claws and teeth.

Comments such as "they did not implement Shader Model 3.0 because they don't know how" are something you should be prepared to hear from someone trained at those camps. As any knowledgeable person would know, this is the lowest form of fanboy FUD. It's like the 16-bit days: "Mortal Kombat is not coming to SNES because there is too much blood and they signed a contract with Sega" or "SNES can't handle Street Fighter Champion Edition; only the Genesis is powerful enough". Tee hee hee.
 

mashoutposse

Ante Up
Driver Heaven seems to be on a mission. They've gone as far as to admit that they bench 6800 cards with optimizations disabled, while leaving all X800 cheats enabled:

"*NOTE: All optimisations were disabled using the Nvidia drivers; this included tri-linear optimisations. With the ATI drivers it is not possible to disable some optimisations and therefore they are enabled. Whilst this does not provide a true apples to apples comparison between cards it is our opinion that every review product should be tested at the best quality it can provide. It is up to the end user when they have the product to decide on optimisations which suit them. If ATI add an option to their drivers in future revisions we would also choose to disable the optimisations. Therefore it can be assumed that the performance on the 6800 Ultra can be improved over that shown in this review should you require it however this would mean reducing IQ. We suggest you read up on the optimisations which each manufacturer uses before deciding on your final purchase. "

Either match them up as evenly as possible or put an asterisk next to every X800 result.
 

Culex

Banned
mashoutposse said:
Driver Heaven seems to be on a mission. They've gone as far as to admit that they bench 6800 cards with optimizations disabled, while leaving all X800 cheats enabled:

"*NOTE: All optimisations were disabled using the Nvidia drivers; this included tri-linear optimisations. With the ATI drivers it is not possible to disable some optimisations and therefore they are enabled. Whilst this does not provide a true apples to apples comparison between cards it is our opinion that every review product should be tested at the best quality it can provide. It is up to the end user when they have the product to decide on optimisations which suit them. If ATI add an option to their drivers in future revisions we would also choose to disable the optimisations. Therefore it can be assumed that the performance on the 6800 Ultra can be improved over that shown in this review should you require it however this would mean reducing IQ. We suggest you read up on the optimisations which each manufacturer uses before deciding on your final purchase. "

Either match them up as evenly as possible or put an asterisk next to every X800 result.

The difference is that the optimizations on the Nvidia side decrease IQ to speed up the game, while the optimizations on the ATI side are unnoticeable.
 

Bregor

Member
The fuss isn't nearly as objectionable to me this time, seeing as both companies currently have excellent products out. It is no longer the case that one of them is trying to cover up the fact that they fielded a flawed video card line.
 

SKluck

Banned
Ubisoft has taken the decision to withdraw this patch due to issues experienced by a number of users. An official statement follows:
"Far Cry patch 1.2 has shown unexpected behaviour on specific hardware configurations. These matters are mainly due to incompatibilities with several optimisations brought lately to the code, with the intent to please a large number of users.

We’re currently asking CRYTEK to work on delivering a new patch as soon as possible. Until then we have decided to remove the patch 1.2 from the official UbiSoft websites."

Hmm...
 

mashoutposse

Ante Up
Culex said:
The difference is that the optimizations on the Nvidia side decrease IQ to speed up the game, while the optimizations on the ATI side are unnoticeable.

??? Wrong. The ATi cheats decrease IQ, as well.

Anyway, check this review if you want an idea of how the X800s perform with optimizations off:

Benchmarks

In some settings, performance drops by as much as 30%.
 
mashoutposse said:
"*NOTE: All optimisations were disabled using the Nvidia drivers; this included tri-linear optimisations. With the ATI drivers it is not possible to disable some optimisations and therefore they are enabled. Whilst this does not provide a true apples to apples comparison between cards it is our opinion that every review product should be tested at the best quality it can provide. It is up to the end user when they have the product to decide on optimisations which suit them. If ATI add an option to their drivers in future revisions we would also choose to disable the optimisations. Therefore it can be assumed that the performance on the 6800 Ultra can be improved over that shown in this review should you require it however this would mean reducing IQ. We suggest you read up on the optimisations which each manufacturer uses before deciding on your final purchase. "

I'll admit I didn't spot the benchmarks when I hit the above link, but doesn't that qualify for the asterisk you wanted?
 

Stryder

Member
Who cares? Far Cry is old news; it's all about Doom 3, HL2, S.T.A.L.K.E.R., etc. now.

And if the point of the rumour is that one video card company (Nvidia) is dirtier than the other (ATI), then they each have their flaws:

Here is what Carmack said about the ATI drivers in the [H] benchies:
The benchmarking was conducted on-site, and the hardware vendors did not have access to the demo beforehand, so we are confident that there is no egregious cheating going on, but it should be noted that some of the ATI cards did show a performance drop when colored mip levels were enabled, implying some fudging of the texture filtering. This has been a chronic issue for years, and almost all vendors have been guilty of it at one time or another. I hate the idea of drivers analyzing texture data and changing parameters, but it doesn't visibly impact the quality of the game unless you know exactly what to look for on a specific texture.
 

Tenguman

Member
$500k to delay a fucking patch

excuse me while I take my grain of salt


Far Cry runs shitty anyway. What they NEED to do is fix their CPU optimizations.
 

tenchir

Member
The optimization that Carmack has been talking about (the performance drop when mip map colouring is enabled) has been known for a while. A lot of people were discussing it on the Beyond3D forum, and they said that it is a GLOBAL optimization, meaning every game behaves this way. The consensus was also that the change in image quality is so insignificant that you would have to do a lot of things to both the original image and the optimized image to even see the small differences (even Carmack says you would only see it if you know what you are looking for... in a specific texture). All of ATI's optimizations have been at a global level, unlike Nvidia, who optimizes for specific games (sometimes benchmarks).
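For anyone curious what the filtering fudge actually amounts to: full trilinear blends between two mip levels across the whole transition, while the optimized path (often called "brilinear") only blends inside a narrow band and uses plain bilinear from one mip everywhere else. A purely illustrative C++ sketch follows; the band width and placement are made up, since neither vendor publishes its real heuristic:

#include <algorithm>
#include <cmath>
#include <cstdio>

// Full trilinear: the blend weight between mip N and N+1 is just the
// fractional part of the level-of-detail value.
float trilinearWeight(float lod) { return lod - std::floor(lod); }

// "Brilinear": blend only inside a narrow band around the mip transition
// and sample a single mip level everywhere else (cheaper, slightly lower IQ).
float brilinearWeight(float lod, float band = 0.25f) {
    float f = lod - std::floor(lod);
    return std::clamp((f - (0.5f - band * 0.5f)) / band, 0.0f, 1.0f);
}

int main() {
    for (float lod = 2.0f; lod < 3.0f; lod += 0.125f)
        std::printf("lod %.3f  trilinear %.3f  brilinear %.3f\n",
                    lod, trilinearWeight(lod), brilinearWeight(lod));
}

Colouring the mip levels defeats the shortcut (the driver has to fall back to full blending once the levels no longer look like standard filtered mips), which would explain the performance drop Carmack saw with coloured mips enabled.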
 

mashoutposse

Ante Up
tenchir said:
The optimization that Carmack has been talking about (the performance drop when mip map colouring is enabled) has been known for a while. A lot of people were discussing it on the Beyond3D forum, and they said that it is a GLOBAL optimization, meaning every game behaves this way. The consensus was also that the change in image quality is so insignificant that you would have to do a lot of things to both the original image and the optimized image to even see the small differences (even Carmack says you would only see it if you know what you are looking for... in a specific texture). All of ATI's optimizations have been at a global level, unlike Nvidia, who optimizes for specific games (sometimes benchmarks).

So, your defense for ATi's behavior is, "They cheat in all games equally"...?

Come on, now.
 

tenchir

Member
mashoutposse said:
So, your defense for ATi's behavior is, "They cheat in all games equally"...?

Come on, now.

What part of "global optimization" don't you understand? Do you even know what the optimization does or how it works before labeling it a cheat? Do you have something against a performance improvement that applies to all applications, as opposed to optimizations targeted at specific games (or benchmarks) like the other video card company does?
 