
AMD Radeon RX6800/RX6800XT Reviews/Benchmarks Thread |OT|

Yeah, Minecraft RTX is kind of a worst-case scenario for these cards right now. Ampere has better RT capabilities in a general sense, but the architecture also seems to really shine in fully path-traced games, which Minecraft is.

In addition to that the game is optimised for Nvidia cards/RT solution so it will perform even worse on AMD.

We already saw this earlier in the thread in the initial reviews where a few reviewers benchmarked with Minecraft to show RT performance.

Luckily for AMD, there are only two path-traced games right now (Minecraft RTX and Quake II RTX). RT hardware just isn't performant enough in this generation of GPUs to allow path tracing with modern graphical quality/complexity; maybe three generations from now the hardware will be powerful enough to make it somewhat possible.

In the meantime we will pretty much only be seeing hybrid rendered titles.
 

Mister Wolf

Gold Member

What happens with a hybrid game like Cyberpunk 2077 that will use ray tracing for at least five different things?
 

We won't know for a few weeks/months until CDPR is allowed by Nvidia to enable RT support for AMD cards 🤷‍♂️

But common sense would dictate that, given Cyberpunk 2077 is a heavily Nvidia-sponsored title, it will be optimised for Nvidia and their RT solution, and thus will likely perform similarly to Control on AMD cards when they eventually flip the switch to enable RT there. Seems pretty obvious.
 

Ascend

Member
Just leaving this here... Too bad we don't have info about its resolution and framerate. But these cards should be better than the XSX at Minecraft.



It might fall in line though, because if we assume the XSX was running at 1080p/30fps (which DF claims was not the case), the 30 fps at 1440p kind of makes sense. I guess it's quite clear that getting the AMD cards for RT is not the best move at this moment.
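Rough back-of-the-envelope math on that comparison (assuming performance scales roughly linearly with pixel count, which is only an approximation):

\[
\frac{2560 \times 1440}{1920 \times 1080} = \frac{3{,}686{,}400}{2{,}073{,}600} \approx 1.78
\]

So matching the assumed XSX 30 fps while pushing about 1.78x the pixels would imply roughly 1.8x its RT throughput, under that assumption.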

In any case... some AIB card comparisons:
 

Bolivar687

Banned
This was probably posted at some point, but AMD is still miles behind in some DX11 CPU-heavy games. This was the reason why I switched from a 290 to a 970: GPU performance was similar, but in some games the AMD card was just killing my 2600K:


[benchmark screenshots]


This is not relevant for new games, but there are probably more titles like this (I wonder how FC4 runs in the first village, a massive CPU hog...)

Those results seem a little hard to believe - it's getting less than half the frames in Watch Dogs 2 that it gets in its sequel? I assume Rattay is the most demanding part of the game, because the 6800 XT gets beyond those 1080p numbers at 4K. If this is true, which I can't tell because no one else is benchmarking these games, it seems like a really limited scenario.

Also, the 290 aged significantly better than the 970 in the DX11 generation; I'm similarly having a hard time believing your 2600K was causing significant enough issues to reverse that.

edit: wouldn't Total War be the gold standard for CPU intensive games? It's within the margin of error of the 3080 in Warhammer II in DX 11:

 

Rikkori

Member

I addressed this before, here:


The crux of the matter is this: there are settings that can overload the CPU (sometimes with no visual benefit at all), and because of the way AMD's drivers handle DX11 differently than Nvidia's, this then knee-caps performance in certain games. The easy fix here is simply not to use those settings, but admittedly that's a large ask these days.

This is a variation of the "tessellation problem", when Nvidia used to kill their own GPUs' performance by getting devs to use insane levels of tessellation that hit AMD even harder. So even though their own performance was shit, that was OK because at least the other guy lost harder. In this case with CPU performance and drivers, though, Nvidia deserves props because it's just down to good work they did (and AMD didn't). Still dumb to use those settings, though.
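If anyone wants to see the driver-side difference for themselves, D3D11 exposes it through a standard feature query. A rough sketch (assuming you already have an ID3D11Device; what a given driver reports here varies by vendor and driver version, so treat it as illustrative):

```cpp
// Rough sketch: ask the D3D11 driver whether it supports concurrent resource
// creation and driver-built command lists. Nvidia's DX11 driver has long reported
// driver command lists, which is one reason its CPU overhead is lower in
// CPU-heavy DX11 games; what AMD reports has varied by driver version.
#include <d3d11.h>
#include <cstdio>

void ReportD3D11ThreadingSupport(ID3D11Device* device)
{
    D3D11_FEATURE_DATA_THREADING threading = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D11_FEATURE_THREADING,
                                              &threading, sizeof(threading))))
    {
        std::printf("Driver concurrent creates: %s\n",
                    threading.DriverConcurrentCreates ? "yes" : "no");
        std::printf("Driver command lists:      %s\n",
                    threading.DriverCommandLists ? "yes" : "no");
    }
}
```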
 

BluRayHiDef

Banned
Yeah, Minecraft RTX is kind of a worst-case scenario for these cards right now. Ampere has better RT capabilities in a general sense, but the architecture also seems to really shine in fully path-traced games, which Minecraft is.

In addition to that the game is optimised for Nvidia cards/RT solution so it will perform even worse on AMD.

We already saw this earlier in the thread in the initial reviews where a few reviewers benchmarked with Minecraft to show RT performance.

Luckily for AMD, there are only two path-traced games right now (Minecraft RTX and Quake II RTX). RT hardware just isn't performant enough in this generation of GPUs to allow path tracing with modern graphical quality/complexity; maybe three generations from now the hardware will be powerful enough to make it somewhat possible.

In the meantime we will pretty much only be seeing hybrid rendered titles.
Excuses, excuses, excuses. Ampere is simply better for ray tracing than RDNA2.
 

BluRayHiDef

Banned
In pure RT/PT it definitely is:



Ampere is better in ray tracing combined with rasterization as well.

Also, in regard to Dirt 5 - the one game in which RDNA 2 performs better in ray tracing - it's been proven that RDNA 2 cards render the game with worse image quality than Ampere cards when ray tracing is on and when it's off; it's also been proven that Ampere cards render the game with more effects that hamper performance (e.g. more shadows).

So, RDNA 2 cards cheat in order to perform better in Dirt 5 than Ampere.

Go to 3:30 and watch until 5:56.

 

rofif

Can’t Git Gud
Ouch, the graphs are painful to watch... But I guess once the games start to exceed 10GB VRAM the Radeons will close the gap ;)
That will take two or three years, and by then both cards will be too slow.
What always matters is the here and now. And right now the 3080 wins for sure, if you can get it for 700.
 
Excuses, excuses, excuses. Ampere is simply better for ray tracing than RDNA2.

What on earth are you talking about? I think you have spent a little too long staring at your Nvidia 3080/3090 desktop wallpaper today, and it might have affected your reading comprehension.

I've never stated that RDNA2 has better RT performance than Ampere. I've repeatedly stated that Ampere simply has a more performant RT solution and that if you need the best RT performance you should buy the 3000 series this generation. Again, I've never stated otherwise, nor has any reasonable person in this thread, so I have no idea where this take is coming from.

As for my quoted post you are replying to, I pretty much explain it all in there. You may want to re-read it without the Nvidia goggles, as I clearly state that the huge improvement in path tracing performance is due to how the Ampere architecture handles RT, and that when not hybrid rendering the cards are really able to stretch their wings.

You see this also with Turing vs Ampere, where the biggest performance gains for Ampere over Turing with RT on are in path-traced games such as Minecraft and Quake.

I mentioned that this title was a worst-case scenario for RDNA2 vs Ampere, given that Ampere really shines with fully path-traced games; that is where we see the biggest gains with their RT solution. Amplifying the already wide performance delta between the two in path-traced games, Minecraft RTX is only optimized for Nvidia's cards/RT solution, so it will perform even worse on RDNA2-based cards than it otherwise would. These are simply facts.

The perfectly reasonable point I'm making is that while Minecraft is certainly a bad performer on RDNA2 cards with RT enabled, the performance gap we see there is not representative of most real-world gaming scenarios, which use hybrid rendering instead. While the 3000 series is still better at RT and will be ahead in most cases when RT is enabled, the gap is nowhere near as big as it is in Minecraft. The takeaway for any reasonable person is that Nvidia has the better RT solution, but not 100%+ better in real-world scenarios.
 

BluRayHiDef

Banned

I'm not going to waste my time reading this dissertation. You made excuses as to why RDNA2 performs worse than Ampere in ray tracing, such as games being optimized for Ampere. However, now you're backtracking.
 

I'm not backtracking on anything, nor did I make any excuses. If you are not even going to bother reading reasoned replies and posts on a discussion forum, then why are you here? Please stop putting words in my mouth or fantasizing about what you would like me to be saying because it's easier to argue against than what I'm actually saying; it is very tiresome. Your hot take was handily slapped down by my reply and now you are retreating with your tail between your legs.

Speaking of making excuses, aren't you the guy who bought both a 3080 and then also a 3090, and has a picture of them as your actual real-life desktop wallpaper? And didn't you, after being triggered about RDNA2 being the more efficient architecture, make an entire thread trying to downplay its advantages and claim, "No guys, for realz, Ampere is actually secretly the more efficient architecture!"?


Didn't you try to give that architectural overview while knowing almost nothing about how GPU architectures work, or even being particularly proficient with computers/technology? You made a thread asking for help because you didn't even know which slots on your motherboard to put your RAM in. While that in and of itself is not a crime or something to be ridiculed over, it is only one in a long line of posts from you asking beginner desktop PC technical questions, which makes it hilarious that you would then try to act as a GPU architecture expert or someone with enough knowledge to give any kind of comparison or breakdown.

It was simply a fanboy thread designed to put in a lot of:

Excuses, excuses, excuses. RDNA2 is simply better for efficiency than Ampere.
 

BluRayHiDef

Banned

I'm too smooth to engage in petty arguments based on ad hominems. This is for the rough jacks; I'm a smooth cat, baby!
 
Ampere is better in ray tracing combined with rasterization as well.

I never stated otherwise. Again, once you've licked your wounds, re-read my post.

Also, in regard to Dirt 5 - the one game in which RDNA 2 performs better in ray tracing

Actually, RDNA2 also performs better than Ampere with RT in World of Warcraft, with the 6800 XT even beating out a 3090. Godfall will likely be the same once RT is enabled on Nvidia cards.



Of course, I don't necessarily believe that this is because RDNA2 is secretly more powerful at RT than Ampere; that would be silly. Nor am I saying that all AMD-sponsored or AMD-optimized RT titles will perform better on RDNA2 than Ampere.

It just goes to show that optimization really plays a huge part in all of this. I don't believe RDNA2 will ever get close to Ampere in path-traced games even with better optimizations, and Ampere will also generally be ahead in most hybrid-rendered scenarios. But my main point was that the performance gap shown in Minecraft is not the gap we should expect in 99% of titles. Obviously Ampere will still be ahead in most titles, but not by that kind of delta, which is again a pretty reasonable take.

In fact, if we look at The Riftbreaker we can see that Ampere performs better with RT enabled, and that is an AMD-sponsored/optimized title. But the performance difference is nowhere near the Minecraft example.

It's been proven that RDNA 2 cards render the game with worse image quality than Ampere cards when ray tracing is on and when it's off; it's also been proven that Ampere cards render the game with more effects that hamper performance (e.g. more shadows).

So, RDNA 2 cards cheat in order to perform better in Dirt 5 than Ampere.

Has it? This is the first I've heard of this. I have no idea who this YouTuber is, but I think I'll wait for more reliable sources before taking this as confirmed. It could be true, obviously; I just haven't seen it reported anywhere until now. If it is true, then this could be what the recent driver update fixed that Ascend quoted a page or two back.
 

BluRayHiDef

Banned
Sure John:

[computerbase.de benchmark graph]

Those are bullshit benchmark "results." Anyone can create a graph with "results."

Here's actual footage of the RTX 3080 vs the RX 6800 XT in ray tracing across six games.

Go to 13:03 to see both cards running Watch Dogs: Legion at 1080p and 14:34 to see them running the game at 1440p.

 

Armorian

Banned

It's AMD OPTIMIZED clearly, just look at quality of it:

[Watch Dogs: Legion screenshot]
 

My understanding for this case was that the driver was bugging out on AMD cards, causing effects not to render properly. I believe this was called out in AMD's review guidelines/press deck, if I'm remembering correctly. Presumably this will be fixed with a driver update (was it already fixed with the recent one? I can't remember). It is possible performance may drop on AMD cards once the driver update hits; we won't really know until then.
 

BluRayHiDef

Banned
It's AMD OPTIMIZED clearly, just look at quality of it:

[Watch Dogs: Legion screenshot]

Yea, I mentioned a week or so ago that the reflections on the trash bins are missing in the RDNA 2 rendering of the game.

Ironically, RDNA 2 cards still run the game worse with ray tracing enabled despite producing inferior image quality.

Unbelievable.
 

Armorian

Banned
My understanding for this case was that the driver was bugging out on AMD cards, causing effects not to render properly. I believe this was called out in AMD's review guidelines/press deck, if I'm remembering correctly. Presumably this will be fixed with a driver update (was it already fixed with the recent one? I can't remember). It is possible performance may drop on AMD cards once the driver update hits; we won't really know until then.

I wonder if it's this, or the same thing they did on consoles, where many materials (like some metal objects) just don't reflect in order to save performance.

Yea, I mentioned a week or so ago that the reflections on the trash bins are missing in the RDNA 2 rendering of the game.

Ironically, RDNA 2 cards still run the game worse with ray tracing enabled despite producing inferior image quality.

Unbelievable.

Yeah, WD:L can't be used for any benchmarking with RT on.

Now moving to Godfall RT, this looks like some screenspace bullshit and not RT :messenger_tears_of_joy:

 

BluRayHiDef

Banned



I guess RDNA 2 has to sacrifice image quality in order to produce good performance.
 
I wonder if it's this, or the same thing they did on consoles, where many materials (like some metal objects) just don't reflect in order to save performance.

It is certainly possible; it could be a way of saving performance on AMD cards, as their RT performance is lower than Ampere's. But this seems like a weird thing to do in the PC space, because if they were going to do that kind of thing they would probably also detect when you were using, for example, a weaker Nvidia card like a 2060 and drop effects there too to save performance.

Generally on PC they just give you all the effects as-is, and it is up to the user whether they want to lower effects or quality settings for better performance. So for this I'm almost 100% convinced it is related to the driver rather than purposely rendering fewer or lower-quality effects, but I suppose we will find out soon: once an updated driver sorts out the issue, I'm sure people will re-benchmark to see whether it is fixed or not.

EDIT: Oh I found the "known issues" in the press drivers which calls out the lack of reflections:




[screenshot of AMD's known-issues note]


So, like I mentioned earlier, it looks like a driver issue that AMD is aware of. It would be pretty unprecedented for RT effects to be disabled or lowered in quality specifically on AMD cards; that type of thing doesn't really happen in the PC space. If your current hardware is not capable enough, developers kind of say "tough luck, we are doing it anyway! Performance may tank with your current card, but hopefully your future card runs it better!"
 

llien

Member
Dirt 5 is a DXR 1.1 game, I don't know if NV even supports it.
If not, it's not even an apples-to-apples comparison (and good job, Huang).


Those are bullshit benchmark "results." Anyone can create a graph with "results."
2020: the year Lisa did such things to Huang that some people feel the urge to claim computerbase is not a well-known German source with a hell of a strong reputation, but just "anyone" who can "create a graph with 'results'".
Yikes.
 

BluRayHiDef

Banned

Every other source's benchmark results contradict those "results."
 

Rikkori

Member
Shit is in stock on mindfactory, to whom it may concern:

6800 - 769€+ (claimed $579)

For reference (also all in stock)
3070 - 689€+ (claimed $499)
3080 - 1019€+ (claimed $699)
3090 - 1779€+ (claimed $1499)

Interesting. I paid ~660€ for my RX 6800, but at the time 3070s were ~780€ (and about ~720€ recently), and 3080s were about 990€ at launch and pretty much the same since then on the rare occasions they do drop. 3060 Tis are also about ~530€.
That's pretty much why I was happy with it, even though ideally I'd have paid 600€ at most. But I don't see much relief in pricing until late spring, so eh, I'd rather have the GPU for these few months and use it than just wait and wait.
 
Yeah, prices are pretty crazy across the board right now. I'm hoping that by late February/March prices start to become sane, along with steadier supply.

Right now I'm in no rush to upgrade; I'm in the process of saving for a house/applying for a mortgage, so I'm happy to wait until I have all of that sorted out, and then I can look at the market again in maybe March.

I'll first be looking to upgrade to a 3440x1440 ultrawide monitor, which will probably cost around €900-€1000. Along with that I'll either get a 6800 XT AIB model or a 3080 AIB model; it really depends on what is available at the time and at what kind of pricing.
 
Dirt 5 is a DXR 1.1 game, I don't know if NV even supports it.

Just to be clear, I'm pretty sure Nvidia supports the DXR 1.1 spec/features. The new features added by the AMD/MS collaboration should also benefit Nvidia cards and allow further optimization there; they just allow much better optimization on AMD cards compared to their DXR 1.0 performance, so DXR 1.1 potentially benefits AMD much more than Nvidia, but it should still benefit Nvidia too.

Also, because DXR 1.1 is an extension of the DXR 1.0 spec, you could technically target DXR 1.1 but only use the DXR 1.0 function calls/features, and the game would still be "DXR 1.1" because that is the version of the API you are building against.
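For what it's worth, the tier a driver exposes is something anyone can query through the public D3D12 API. A minimal sketch (assuming an already-created ID3D12Device; this only tells you what the driver advertises, not how well it actually runs):

```cpp
// Rough sketch: query which DXR tier the installed driver reports.
// A device reporting D3D12_RAYTRACING_TIER_1_1 advertises the DXR 1.1 additions
// (inline RayQuery, ExecuteIndirect ray dispatch, etc.) on top of the 1.0 set.
#include <d3d12.h>
#include <cstdio>

bool ReportRaytracingTier(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5))))
        return false;

    switch (options5.RaytracingTier)
    {
    case D3D12_RAYTRACING_TIER_1_1:
        std::printf("DXR 1.1 supported (inline RayQuery etc. available)\n");
        return true;
    case D3D12_RAYTRACING_TIER_1_0:
        std::printf("DXR 1.0 only\n");
        return true;
    default:
        std::printf("No DXR support reported\n");
        return false;
    }
}
```

My understanding is that current Turing/Ampere and RDNA2 drivers all report tier 1.1 here, though I haven't verified every driver version myself.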
 
As I wrote pages before, WD:L is rendering exactly the same quality RT on my 3080 and 6800, as long as it does not bug out, which it does occasionally (but definitely not constantly).


Those are bullshit benchmark "results." Anyone can create a graph with "results."

computerbase.de is one of the oldest and still one of the best benchmarking sites around. Please stop calling their numbers "bullshit benchmarks" when others don't come even close to putting out their quality.

edit: and they have been plenty critical of AMD/Radeon in the past, so there's no reason to suspect any sort of bias here.


Also, in regard to Dirt 5 - the one game in which RDNA 2 performs better in ray tracing - it's been proven that RDNA 2 cards render the game with worse image quality than Ampere cards when ray tracing is on and when it's off; it's also been proven that Ampere cards render the game with more effects that hamper performance (e.g. more shadows).

So, RDNA 2 cards cheat in order to perform better in Dirt 5 than Ampere.

Go to 3:30 and watch until 5:56.



You are putting a lot of faith in a "proof" that might as well just be a bug in the time-of-day rendering or a difference in cloud coverage. If you look closely at the video you will find that the RX 6800 bench is in bright daylight conditions while the RTX 3080 seems to be mostly in overcast. That might be a bug rendering out the daylight on the Nvidia bench run and not the other way around (I've actually played the game quite a bit on both cards now and am pretty sure at this point that the RX 6800 video shows how the benchmark should actually look).

For further information: ray tracing in Dirt 5 is still not public as of now and is only available as a preview build to some outlets such as computerbase.de. Bryan did not use the aforementioned preview build.
 
Great results in mGPU. It is a pity that GPU manufacturers/developers didn't pursue that avenue a bit more, but I suppose the kind of person who would buy two high-end GPUs to use together is only a tiny segment of the market, so for developers who are already in crazy crunch and pushed to the limit on budgets and manpower it just didn't make enough financial sense.

Similarly, I know there are some technical hurdles with VRAM and micro-stuttering. From a GPU manufacturer's point of view, maybe it would remove the incentive for users to upgrade to the latest GPU if two older ones can give similar or better performance.

Still, it would have been interesting to see an alternate timeline where manufacturers/developers went all in on multi GPU setups for gaming.
 

ZywyPL

Banned


I'm still salty about that DX12 promise to natively see multiple GPUs as one big one, even stacking the VRAM. But then again, seeing how far single-GPU performance has come in recent years, I can't really blame the devs nor NV/AMD for dropping mGPU tech.
 
I'm still salty about that DX12 promise to naively see multiple GPUs as a single big one, with stacking the VRAM even. But then again, seeing how far a single GPU performance has gone in the recent years, I can't really blame the devs nor NV/AMD for dropping mGPU tech.

I think for it to become really meaningful it would need to be something that happens automatically at the driver level and requires little to no work on the part of developers. Who knows, with MCM/chiplet designs coming with RDNA3/Hopper, a lot of the work needed to make those function would probably also benefit mGPU setups. Maybe there will be a push towards multi-GPU support somewhere in the future?
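To illustrate why it never became automatic: even DX12's linked-node path only tells an engine how many GPUs are present; everything after that (node masks on heaps and queues, cross-GPU copies, AFR/SFR scheduling) is explicit developer work. A rough sketch, assuming an existing ID3D12Device and nothing vendor-specific:

```cpp
// Rough sketch: DX12 reports how many linked GPU "nodes" the device spans,
// but pooling work or memory across them is entirely up to the application.
#include <d3d12.h>
#include <cstdio>

void ReportGpuNodes(ID3D12Device* device)
{
    const UINT nodeCount = device->GetNodeCount();
    std::printf("Linked GPU nodes visible to this device: %u\n", nodeCount);

    // With nodeCount > 1, every heap, queue and command list must be created
    // with an explicit node mask (e.g. CreationNodeMask = 1u << nodeIndex),
    // and all cross-node copies and frame scheduling are the engine's job.
}
```

Which is exactly why "the driver just sees one big GPU" never materialised.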

At least we can dream about what could be...
 
That is far from obvious and I'm talking about "support" as "actually do" and not formally accept the call but execute it 1.0 way.

I get what you mean, but then it becomes an impossible standard for us as outside observers to confirm one way or the other, as we don't have access to the Nvidia driver source code to see how it is handled, and Nvidia is certainly not going to do an interview or whitepaper explaining it.

I guess game developers might be able to shed some light on it, as they are working directly with the API. But if they make, for example, a DXR 1.1 feature's function call and the driver on Nvidia cards recognizes and accepts that call, then unless there is a visual difference to compare (like a feature on/off), it would be hard even for developers to tell whether a fallback DXR 1.0 path was run in place of the DXR 1.1 call.

But then again, seeing as DXR 1.1 offers very specific new features not present in DXR 1.0, I think it would be pretty obvious from the end result whether those features are working or not. So if the Nvidia driver accepts the DXR 1.1 feature call, we can only assume it is running the feature as intended, as we don't really have any evidence to the contrary.

In addition to that, DXR is a Microsoft spec/API, so I doubt Nvidia would fail to support something so mainstream. A different question might be how worthwhile the new DXR 1.1 features are for Nvidia cards: how much they actually benefit from those features, and whether they offer more optimization options for Nvidia versus DXR 1.0.

It could be a case where a game is "DXR 1.1" because that is the version of the API library being used, but when running on Nvidia cards the developers only exercise the DXR 1.0 features/functions and ignore the new DXR 1.1 ones. Anything is possible, and we unfortunately don't know enough about how this is all implemented to make any reasoned assumption.
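To make that scenario concrete, here's a rough C++ sketch of what I mean (the useInlinePath flag is a made-up engine toggle, not anything from a real title): a renderer can build against the DXR 1.1-era interfaces and still only ever issue the original DispatchRays() path, so "the game is DXR 1.1" doesn't by itself tell you which features actually run on which vendor.

```cpp
// Rough illustration only: targeting the DXR 1.1-era interfaces while exercising
// nothing beyond the DXR 1.0 entry point. Which branch a title takes, and on
// which vendor, is an engine decision the API version alone doesn't reveal.
#include <d3d12.h>

void TraceFrame(ID3D12GraphicsCommandList4* cmdList,
                ID3D12StateObject* rtPipeline,
                const D3D12_DISPATCH_RAYS_DESC& rays,
                bool useInlinePath) // hypothetical engine toggle
{
    if (!useInlinePath)
    {
        // DXR 1.0-style path: state object + shader tables + DispatchRays.
        cmdList->SetPipelineState1(rtPipeline);
        cmdList->DispatchRays(&rays);
    }
    else
    {
        // A DXR 1.1-style path would instead bind a regular compute PSO whose
        // shaders do inline ray tracing via RayQuery<>; omitted here.
    }
}
```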

All we can do is assume the best until there is some evidence to the contrary. With that mindset we can only assume that DXR 1.1 features are supported on Nvidia cards. 🤷‍♂️
 

Armorian

Banned
It is certainly possible; it could be a way of saving performance on AMD cards, as their RT performance is lower than Ampere's. But this seems like a weird thing to do in the PC space, because if they were going to do that kind of thing they would probably also detect when you were using, for example, a weaker Nvidia card like a 2060 and drop effects there too to save performance.

Generally on PC they just give you all the effects as-is, and it is up to the user whether they want to lower effects or quality settings for better performance. So for this I'm almost 100% convinced it is related to the driver rather than purposely rendering fewer or lower-quality effects, but I suppose we will find out soon: once an updated driver sorts out the issue, I'm sure people will re-benchmark to see whether it is fixed or not.

EDIT: Oh I found the "known issues" in the press drivers which calls out the lack of reflections:

[screenshot of AMD's known-issues note]

So, like I mentioned earlier, it looks like a driver issue that AMD is aware of. It would be pretty unprecedented for RT effects to be disabled or lowered in quality specifically on AMD cards; that type of thing doesn't really happen in the PC space. If your current hardware is not capable enough, developers kind of say "tough luck, we are doing it anyway! Performance may tank with your current card, but hopefully your future card runs it better!"


Yeah, on PC it was a brute-force approach most of the time, with solutions like the tessellation quality scaler in AMD drivers back in The Witcher 3 days (is it still in?). But devs made cuts specifically for RDNA2 GPUs on consoles, so maybe it was intentional.

As I wrote pages before, WD:L is rendering exactly the same quality RT on my 3080 and 6800, as long as it does not bug out, which it does occasionally (but definitely not constantly).

Can you provide screenshots for that?
 

Sun Blaze

Banned
As I wrote pages before, WD:L is rendering exactly the same quality RT on my 3080 and 6800, as long as it does not bug out, which it does occasionally (but definitely not constantly).
AMD's very own patch notes state that the RT effects ARE not rendering as expected and MAY be missing altogether. They don't state "may be rendering incorrectly".
It's not an isolated incident, they aren't rendering properly. Can you provide evidence to the contrary?
 