
AMD Radeon RX6800/RX6800XT Reviews/Benchmarks Thread |OT|

SantaC

Member
But it's not slightly ahead at 1440p. It's below the 3080 at 1440p and 4K. Hardware Unboxed more than likely ran the tests and chose Nvidia in the worst light and AMD in the best he could. You can be sure a 3080 will net you better performance at ultrawide, better features, better ray tracing, productivity, DLSS.
Sweclockers, which is very unbiased, had the 6800 XT ahead at 1440p.
 

MadYarpen

Member
Welp boys, I just decided I'm going to wait till 2022-23 before I either build a new rig or upgrade what I got; by then these cards will be refreshed, and all the new hotness will have died down.

Right now I have a Ryzen 7 3700X with a Sapphire 5700 XT Nitro+ and 32 gigs of DDR4 that I built last year, and for the games I play right now I'm just happy with 1440p at 120 Hz with Freesync enabled.

I also don't want to spend like a madman to get either an Nvidia 3080 or Radeon 6800 XT right now with Christmas coming up and other bills to pay.
To be honest this is also something I am thinking about... only I'd be buying a 5700XT instead of a 6800. It seems to make no sense, but I need a card for a UWQHD monitor and I can't afford one that would let me game at this resolution with RT on, above 60 FPS. Not sure if the 5700XT is enough, though.
 
Sweclockers, which is very unbiased, had the 6800 XT ahead at 1440p.





All of them have the 3080 winning in both 1440p and 4K. Thing is, like I said, these guys are all human, and just like we have biases and like one company over another, so do they. Hardware Unboxed is especially awful in his review today. If one guy leans more towards AMD, he will put a mild overclock on the AMD benches, turn up the AC in the studio so the card is cooler and boosts more, stuff like that. He runs the tests 5 times and chooses the best one. Then he chooses the worst for Nvidia and turns off the AC. He didn't lie per se, but you can see how it looks.
 

llien

Member
But it's not slightly ahead at 1440p. It's below the 3080 at 1440p and 4K. Hardware Unboxed more than likely ran the tests and chose Nvidia in the worst light and AMD in the best he could

On computerbase.de the 3080 is 6% ahead at both 4K and 1440p (curious), but RDNA2 fares much better in newer games.

HU tested with Zen 3, TPU with a 9900K; I couldn't figure out what computerbase was using.



As far as RT goes, Dirt 5 shows that which manufacturer the RT was optimized for does matter:

[Dirt 5 RT benchmark chart]



Again, not for reflections, which is what MM and WD:L are using it for. Dynamic GI and shadowing have always been possible with other solutions.
I think we have already discussed that a ton of older games have reflections.
 

Dampf

Member
On computerbase.de the 3080 is 6% ahead at both 4K and 1440p (curious), but RDNA2 fares much better in newer games.

HU tested with Zen 3, TPU with a 9900K; I couldn't figure out what computerbase was using.



As far as RT goes, Dirt 5 shows that which manufacturer the RT was optimized for does matter:

[Dirt 5 RT benchmark chart]




I think we have already discussed that a ton of older games have reflections.
Dirt 5 doesn't even have Raytracing on Nvidia yet.
 

Dampf

Member


They probably used a hack to enable that. Anyway, it is not an official patch, and thus no performance tests should be done as of now. Use Google and check for patches; there are none that add Raytracing. Dirt 5 was shipped without DXR support on PC.
 

Ascend

Member
Hardware Unboxed is especially awful in his review today. If one guy leans more towards AMD, he will put a mild overclock on the AMD benches, turn up the AC in the studio so the card is cooler and boosts more, stuff like that. He runs the tests 5 times and chooses the best one. Then he chooses the worst for Nvidia and turns off the AC. He didn't lie per se, but you can see how it looks.
The fact that you actually believe this says a lot about you.

Hardware Unboxed is one of the more straightforward and neutral ones out there. In my opinion they still unjustly slammed the R9 390, promoting the 3.5 GB GTX 970 over it. And they are supposed to be AMD biased? COME ON. Lemme guess, you think they are AMD biased because they thought the 5700XT was the best card to buy for the last year?
 

Dampf

Member
It is a DXR-based (Microsoft API) title, NV supports DXR, so why would anyone need to hack anything to run it???

There is no Raytracing for PC as of now, it will be added later. https://www.dsogaming.com/pc-performance-analyses/dirt-5-pc-performance-analysis/ The latest patch for Dirt 5 on PC is 1.04 and there is no mention of DXR support in there.

Not only that, but there aren’t any Ray Tracing options right now. AMD and Codemasters have already showcased the game’s Ray Tracing effects, so I don’t really know why there isn’t any support for them yet. Perhaps Codemasters will add them via a post-launch update once the RX 6000 series is out?

Godfall will also support RT later on Nvidia, via a patch. https://forums.gearboxsoftware.com/t/november-18-2020-godfall-quick-update-pc-2-0-95/4548737
Same thing here. Godfall and Dirt 5 benchmarks with Raytracing on Nvidia hardware are fishy, to say the least.

AMD biased because they thought the 5700XT was the best card to buy for the last year?

Honestly, yes. That card will age terribly: it has no DX12 Ultimate, no HW-RT, poor AI performance and no DirectStorage support; it has no future. Anyone recommending such an outdated architecture should not be allowed to make tech videos, period.
 
The fact that you actually believe this says a lot about you.

Hardware Unboxed is one of the more straightforward and neutral ones out there. In my opinion they still unjustly slammed the R9 390, promoting the 3.5 GB GTX 970 over it. And they are supposed to be AMD biased? COME ON. Lemme guess, you think they are AMD biased because they thought the 5700XT was the best card to buy for the last year?


You can see how his scores tell one thing and 10 others say the opposite. You can hear his bias in how he talks about this. If you're making a review of a product, you review the product. You don't shit on and dismiss features of the product because you personally don't like them. Just listen to the review. Are you so gullible as to think that what I said doesn't happen? Seriously?
 

Rickyiez

Member



All of them have the 3080 winning in both 1440p and 4K. Thing is, like I said, these guys are all human, and just like we have biases and like one company over another, so do they. Hardware Unboxed is especially awful in his review today. If one guy leans more towards AMD, he will put a mild overclock on the AMD benches, turn up the AC in the studio so the card is cooler and boosts more, stuff like that. He runs the tests 5 times and chooses the best one. Then he chooses the worst for Nvidia and turns off the AC. He didn't lie per se, but you can see how it looks.

I've been going through KitGuru's review, and it seems like they had the best one so far in terms of game diversity. The 3080 is indeed ahead in most 1440p and 4K benchmarks, confirmed by TechPowerUp's too.

Oh, and isn't it ironic that for all the extra VRAM the 6800 has, it's clearly slower at 4K because of the lower bandwidth? Either way, neither is perfect at this point.
 

llien

Member
You can see how his scores tell one thing and 10 others say the opposite
Show me "10 others" who tested with Zen 3.

Tom's shows a tie at 1440p (the 3080 not "scaling" well at that resolution is a known fact; computerbase's 6% diff looks strange, TPU has it at 4%):

[Tom's Hardware 1440p benchmark chart]
 

00_Zer0

Member
To be honest this is also something I am thinking about... only I'd be buying a 5700XT instead of a 6800. It seems to make no sense, but I need a card for a UWQHD monitor and I can't afford one that would let me game at this resolution with RT on, above 60 FPS. Not sure if the 5700XT is enough, though.
I run my PC through a 48-inch 4K LG CX OLED, and I can run some less demanding games at 4K/60; others will run at 1440p/60, and some at 1440p/120.

If you get a Sapphire 5700 XT Nitro+, many games will run near 120 fps. Sapphire also has its own software layer that works in conjunction with the Radeon drivers and all the Adrenalin software features. This is called Trixx Boost. You can take a native 4K or 1440p image and drop it down by a certain percentage to get an extra frame rate boost.

When you add in AMD's image sharpening, some results look pretty amazing. I run Jedi: Fallen Order at 85% of 1440p with a 90 FPS lock, all while having Freesync enabled, and I don't notice any severe drops or stutter.
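For reference, here is a rough sketch of the arithmetic behind that kind of resolution scaling, assuming the percentage is applied to each axis (only the 85% figure and the 2560x1440 base come from the post; the function and variable names are just illustrative):

```python
# Rough sketch of Trixx-Boost-style resolution scaling, assuming the
# percentage is applied per axis. The 85% figure and the 2560x1440 base
# come from the post above; everything else here is illustrative.

def scaled_resolution(width: int, height: int, scale: float):
    """Return the render resolution after applying a per-axis scale factor."""
    return round(width * scale), round(height * scale)

base_w, base_h = 2560, 1440   # standard 1440p (UWQHD would be 3440x1440 instead)
scale = 0.85                  # "85% of 1440p"

w, h = scaled_resolution(base_w, base_h, scale)
pixel_ratio = (w * h) / (base_w * base_h)

print(f"Render resolution: {w}x{h}")                      # 2176x1224
print(f"Pixels rendered:   {pixel_ratio:.0%} of native")  # ~72%
```

So the GPU only shades roughly 72% of the native pixel count, which is where the extra frame rate headroom comes from before sharpening is applied on top.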

A lot of people are also running their 48 inch LG CX Oled in 21:9 mode and are getting great results even though it's a 16:9 TV. I am not a resolution snob or a graphics snob so dropping my resolution down to 1440p or using image sharpening doesn't bother me.

By the time I get into some of the ray trace heavy games in a year or two Nvidia and AMD will have improved on what they have now. By then I will jump in and grab a card that is even better than what is on offer today.
 

FireFly

Member
I think we have already discussed that a ton of older games have reflections.
Sure, primarily using SSR and cube maps, which either don't capture off-screen objects or are not dynamic. Planar reflections, meanwhile, require you to render the world again each time, so they are used very sparingly, pretty much exclusively for water and mirrors in bathrooms.

Just compare the difference in MM with RT enabled and disabled. Given how much ray tracing costs, especially on AMD GPUs, if Insomniac knew of a more performant technique, you would think they would have used it. So we're just waiting for the Lumen of reflections. Which may exist, but certainly isn't demonstrated in the UE5 video.

There are more shades of grey than just "ray tracing is useless" and "ray tracing is perfect for everything". It's a powerful tool that is very expensive, but can yield large visual benefits when used intelligently as MM shows.
 

Hojaho

Member
There is no Raytracing for PC as of now, it will be added later. https://www.dsogaming.com/pc-performance-analyses/dirt-5-pc-performance-analysis/ The latest patch for Dirt 5 on PC is 1.04 and there is no mention of DXR support in there.





Honestly, yes. That card will age terribly: it has no DX12 Ultimate, no HW-RT, poor AI performance and no DirectStorage support; it has no future. Anyone recommending such an outdated architecture should not be allowed to make tech videos, period.

Didn't Steve say it was available on a beta branch of Dirt 5?
I'm too lazy to check the video again. Maybe it was Wendell on Level1techs.
 

llien

Member
He didn't test with Zen 3. He tested with Zen 2.
True.
Still, TH has a tie at 4K and is using only 9 games.
Which games are used matters too; Dirt 5, for instance, has quite an impact.

If he pulls that shit, it should be visible on a per-game basis.

PS
Hold on, so he has 153 fps vs 157 fps; that's a difference of less than 3%.

His 4k results show 5% in favor of 3080, well in line with others.
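As a quick sanity check of how that percentage falls out of the raw averages (only the 153 and 157 fps figures are from the post above; the helper function is just illustrative):

```python
# Back-of-the-envelope check of the percentage gap quoted above.
# Only the 153 vs 157 fps averages come from the post; the helper is illustrative.

def lead_percent(slower_fps: float, faster_fps: float) -> float:
    """How far ahead the faster result is, relative to the slower one, in percent."""
    return (faster_fps - slower_fps) / slower_fps * 100

print(f"1440p gap: {lead_percent(153, 157):.1f}%")  # ~2.6%, i.e. well under 3%
```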
 

MadYarpen

Member
I run my PC through a 48-inch 4K LG CX OLED, and I can run some less demanding games at 4K/60; others will run at 1440p/60, and some at 1440p/120.


If you get a Sapphire 5700 XT Nitro+, many games will run near 120 fps. Sapphire also has its own software layer that works in conjunction with the Radeon drivers and all the Adrenalin software features. This is called Trixx Boost. You can take a native 4K or 1440p image and drop it down by a certain percentage to get an extra frame rate boost.

When you add in AMD's image sharpening, some results look pretty amazing. I run Jedi: Fallen Order at 85% of 1440p with a 90 FPS lock, all while having Freesync enabled, and I don't notice any severe drops or stutter.

A lot of people are also running their 48 inch LG CX Oled in 21:9 mode and are getting great results even though it's a 16:9 TV. I am not a resolution snob or a graphics snob so dropping my resolution down to 1440p or using image sharpening doesn't bother me.

By the time I get into some of the ray trace heavy games in a year or two Nvidia and AMD will have improved on what they have now. By then I will jump in and grab a card that is even better than what is on offer today.
Yeah, I'd probably consider Sapphire. Or Asus/XFX.

I just don't know if those new cards are really worth that much money for what I need. Although I do want to play CP2077 with high settings.

Decisions, decisions...
 

Ascend

Member
Honestly, yes. That card will age terribly: it has no DX12 Ultimate, no HW-RT, poor AI performance and no DirectStorage support; it has no future. Anyone recommending such an outdated architecture should not be allowed to make tech videos, period.
$400:
[Assassin's Creed Valhalla benchmark chart]


You can see how his scores tell one thing and 10 others say the opposite.
Did you check the differences in settings and system configuration before you started throwing baseless accusations around?

You can hear his bias in how he talks about this.
You're projecting.

If you're making a review of a product, you review the product. You don't shit on and dismiss features of the product because you personally don't like them. Just listen to the review. Are you so gullible as to think that what I said doesn't happen? Seriously?
It is his review. He has the right to discuss and exclude whatever he wants. If you don't like it, that's fine. Go to others if you want your biases confirmed. He reviewed the product just like he reviews every other product. And it's not like he completely omitted RT or refrained from mentioning it at all.
 

Dampf

Member

This says absolutely nothing. Valhalla is not using any of the DX12 Ultimate features; it doesn't use DirectStorage, and certainly nothing AI-related.

In next gen games, the 5700XT will struggle so much it's not even funny.
 
You were always arguing for both RT and for 16GB of RAM.

Key question:
Would you prefer the RTX 3080 for its RT, or the 6800XT for its 16GB?

16 GB for VR future-proofing. But I don't like to settle. That's why the 3090 is my current target for acquisition.
 

Dampf

Member
It is one of the latest games released.
It's a cross-gen game that is not even using Raytracing.

If a game uses Sampler Feedback alone, the 5700XT will get crushed and will have to use low-resolution textures to even remotely keep up. Not to mention Mesh Shaders, which render tons of geometry far more efficiently. And if a game requires DXR, that next-gen game straight up won't boot on a 5700XT anymore, because AMD apparently does not care about adding DXR support for these cards.

Let's just say the seemingly good price-to-performance ratio in current-generation games comes with the big caveat that this card won't be able to render next-generation graphics, unlike the Nvidia Turing generation, which has full feature-set parity with the next-gen consoles and even better Raytracing performance.
 

Durask

Member
Well, it looks like it is competitive except for raytracing and of course Nvidia has DLSS.
However, if you are looking for a next-gen card, I think it makes sense to buy whatever is available, given that we will probably have shortages of both Nvidia and AMD cards for the next 6 months at least, if not a year.
 

MadAnon

Member
So AMD is more efficient, as expected due to node advantage.

Price is all over the place so hard to say if it's better value.

Nvidia with GDDR6X is showing its bandwidth advantage at 4K. As I said, 10GB will be a non-issue this gen. The AMD-sponsored Godfall shows basically equal performance to the 3080. So much for that 10GB+ RAM requirement.

RT not even a contest, AMD is far behind.

So, a very good raster card at lower resolutions, but it falls behind Nvidia at enthusiast levels.
 

Ascend

Member
It's a cross-gen game that is not even using Raytracing.
And it's still one of the latest game releases. Maybe that also says something about ray tracing support and its importance at this point in time.

If a game uses Sampler Feedback alone, the 5700XT will get crushed and will have to use low-resolution textures to even remotely keep up. Not to mention Mesh Shaders.
And it will continue to beat the 2060S in performance and the 2070S in value. The 5700XT has been the best-value card since it was released. Live with it.

It's always the same story. The goal post is always shifted to fit nVidia and to dismiss AMD.

AMD has 4GB of RAM? Meh, too little. We need 6GB at least, otherwise it won't age.
AMD has 16GB of RAM? Meh, 10GB is enough, it's about RT and DLSS!
AMD has async compute? Meh, not important since barely any game uses it. It's about power consumption!
AMD has better power consumption? Meh. Not important. It's about RT and DLSS, despite only a handful of games using it!
 

00_Zer0

Member
Yeah, I'd probably consider Sapphire. Or Asus/XFX.

I just don't know if those new cards are really worth that much money for what I need. Although I do want to play CP2077 with high settings.

Decisions, decisions...
If you had a 5700 XT Nitro+ I bet you could run 2077 on a mixture of high/med settings along with Trixx Boost and AMD sharpening at around 1440p/60 fps with Freesync enabled. YMMV on what is acceptable to you, but to me that wouldn't be too bad.

I think I will wait to purchase any major ray traced games until I get a new card in a year or two. I think I would rather get my money's worth out of what I have now and get an even better card with better ray tracing later. I use my PC for emulation and my Steam backlog is ridiculous anyway.

As far as getting a Nitro+ on a good deal, well I bought this card new last year for around the $430+ mark. Maybe you can find one around the $250-300 mark somewhere?

I was going to sell it to a friend for $250 after purchasing a 3080 or 6800XT, but now that I decided against it I'll have to tell him the bad news.
 

MadYarpen

Member
If you had a 5700 XT Nitro+ I bet you could run 2077 on a mixture of high/med settings along with Trixx Boost and AMD sharpening at around 1440p/60 fps with Freesync enabled. YMMV on what is acceptable to you, but to me that wouldn't be too bad.

I think I will wait to purchase any major ray traced games until I get a new card in a year or two. I think I would rather get my money's worth out of what I have now and get an even better card with better ray tracing later. I use my PC for emulation and my Steam backlog is ridiculous anyway.

As far as getting a Nitro+ on a good deal, well I bought this card new last year for around the $430+ mark. Maybe you can find one around the $250-300 mark somewhere?

I was going to sell it to a friend for $250 after purchasing a 3080 or 6800XT, but now that I decided against it I'll have to tell him the bad news.
I can deduct VAT and the cost, so I get a better price overall for a new one, tbh. We'll see. I can wait a little more.
 

Rikkori

Member
My thoughts on the RT performance so far:

It's clear that in games with multiple effects, like Minecraft and Control, it tanks a LOT compared to Ampere. Curiously, those are also implementations where Nvidia had several engineers working on them full time. In fact, for Control, Nvidia almost half-built their tech, going by some of their talks. Nonetheless, it's only dire in Minecraft; in Control the difference is significant but not as disastrous (50%).
In games where you have a single RT effect, the performance differential is actually more in line with a tier jump (so 20-30%). Metro Exodus was a big surprise; the performance difference was actually small. And in fact it depends on who makes the game, because with Dirt 5 we can see AMD even pulls ahead.

So all in all, it's not bad; there's obvious room for optimisation here, and let's not forget Nvidia has a 2-year head start on drivers & game integration, but that will all vanish as we get the console versions soon (which are RDNA 2 optimised). And indeed in such games we can see AMD pull significantly ahead, like in Dirt 5 & Godfall.

The disappointment here will be CP 2077, which will 100% be a repeat of Control imo, so it's going to be a 50% performance difference, which is huge. And since they add DLSS but not VRS, the gap is going to be even bigger. Since this is why I'm upgrading, it's hard to stomach; there's no two ways about it. Of course, if you only enable 1 RT effect, or none at all, then you'll be fine.

I think in the long run the RDNA 2 cards will end up as better buys, esp. as we see more VRS added to combat DLSS (which imo is just a crutch), as the 16 GB VRAM buffer stretches its legs, and as we see further game and driver optimizations.

I was half-tempted to just cancel my card orders but considering the 3070 is even more expensive than the 6800 XT in my region, and everything else is sold out or priced even worse, I think I'll keep 'em. If the 3080 weren't a spring launch in reality (for stock at/near msrp) maybe it would be different, who knows.

I'm actually positively surprised that it doesn't lose by much at 4K at all; I was expecting a bigger difference. The only result that disappointed me (more than the RT performance, even) was Fallen Order being so low, but I guess that's down to other factors and I don't expect it will change.
 

llien

Member
So AMD is more efficient, as expected due to node advantage.
It was expected the other way around, because AMD was "years behind".
AMD only touching the 3070 was another thing that was expected, because, you've guessed it, AMD was "years behind".
On the RT front, it was expected that AMD would not even be on Turing's level, because, you've guessed it, AMD was "years behind". (Note how it beats NV in Dirt 5.)


Price is all over the place so hard to say if it's better value.
No, it's better perf/$ all around, with bonuses like more VRAM as icing.

10GB will be a non-issue this gen.
Godfall already doesn't fit in 10GB and we are just transitioning into "this gen".

RT not even a contest, AMD is far behind.
AMD has much better than expected performance in NV sponsored RT games, and actually beats NV in Dirt 5.
 
It is a DXR-based (Microsoft API) title, NV supports DXR, so why would anyone need to hack anything to run it???


Notable that they are also on an Intel CPU.

Smart access gives 1-2% gains :D

 

Andodalf

Banned
AMD has 4GB of RAM? Meh, too little. We need 6GB at least, otherwise it won't age.
AMD has 16GB of RAM? Meh, 10GB is enough, it's about RT and DLSS!
AMD has async compute? Meh, not important since barely any game uses it. It's about power consumption!
AMD has better power consumption? Meh. Not important. It's about RT and DLSS, despite only a handful of games using it!

I mean, through all of that Nvidia performed significantly better. They made the right choices in those instances, so far at least. That's not a gotcha as much as a list of how things haven't worked out for AMD.
 

Ascend

Member
I'm glad I'm not desperate to get a card right now. Most likely I'll be waiting till early next year and get a Sapphire Nitro card. I can also see what the inevitable big driver overhaul will do in December.

Not that it's a primary concern of mine, but AMD also maybe gets a chance to stretch its legs with RT. Because as of right now, all the games with RT were created with nVidia in mind. They work on AMD, but I doubt they are optimized for it. I don't expect it to suddenly beat out the 3000 series, but I do expect it to improve in image quality and performance.
 

Dampf

Member
And it's still one of the latest game releases. Maybe that also says something about ray tracing support and its importance at this point in time.


And it will continue to beat the 2060S in performance and the 2070S in value. The 5700XT has been the best-value card since it was released. Live with it.

It's always the same story. The goal post is always shifted to fit nVidia and to dismiss AMD.

AMD has 4GB of RAM? Meh, too little. We need 6GB at least, otherwise it won't age.
AMD has 16GB of RAM? Meh, 10GB is enough, it's about RT and DLSS!
AMD has async compute? Meh, not important since barely any game uses it. It's about power consumption!
AMD has better power consumption? Meh. Not important. It's about RT and DLSS, despite only a handful of games using it!

My friend, I am sure you have a 5700XT judging by how much you are defending it, and I'm sorry I have to be the one to tell you this: this card won't keep up, at all. Especially from a developer perspective. I highly recommend selling the card, if you can.

With Sampler Feedback Streaming, the mentioned RTX 2060 Super (but really this also applies to the next-gen consoles, Ampere and RDNA2 as well) has around 2.5-3.5x the effective VRAM amount compared to cards without SFS (basically, the 5700XT). (Inform yourself about what this technology does here: https://microsoft.github.io/DirectX-Specs/d3d/SamplerFeedback.html ) Simply speaking, it allows for much finer control of texture MIP levels, meaning your VRAM can be used far more efficiently than before. That will result in much higher-resolution textures for DX12 Ultimate compatible cards without the need to increase physical VRAM, and it eliminates stuttering and pop-in. Basically, a 2060 Super has around 20 GB or more of effective VRAM compared to your 5700XT. How do you plan to compensate for that?
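To make the arithmetic behind that number explicit, here is a minimal sketch that simply takes the post's own 2.5-3.5x multiplier at face value (the multiplier is the post's claim, not a measured figure):

```python
# Back-of-the-envelope version of the "effective VRAM" claim above.
# The 2.5-3.5x multiplier is the post's own figure, taken at face value.

physical_vram_gb = 8              # both the RTX 2060 Super and the RX 5700 XT ship with 8 GB
sfs_low, sfs_high = 2.5, 3.5      # claimed effective-capacity gain from Sampler Feedback Streaming

low = physical_vram_gb * sfs_low
high = physical_vram_gb * sfs_high

print(f"Claimed effective VRAM with SFS: {low:.0f}-{high:.0f} GB")  # 20-28 GB
print(f"Effective VRAM without SFS:      {physical_vram_gb} GB")
```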

Next, we have mesh shading. As I said, that is a key technology, as it replaces current vertex shaders and allows for much, much finer LOD and higher geometry density, similar to what you saw in the PS5 Unreal Engine 5 demo. On PS5, it uses Sony's customized next-generation Geometry Engine, which is far beyond the standard GE from RDNA1. On PC, however, Nanite will most likely be using mesh shaders, which the 5700XT does not support, meaning it is either incapable of handling that much geometry or it will have huge performance issues when trying to emulate it in software.

VRS can give you a 10-30% or even higher performance boost at almost no image quality cost.

Next, there is Raytracing. Raytracing is not a gimmick. Raytracing saves so much time for devs, and it gets more efficient each day; it's going to be an integral part of next-gen games in the future. Even the consoles support it in decent fashion, and it is the future of rendering, don't let anyone tell you otherwise. Nvidia recently released their RTXGI SDK, which allows for dynamic GI using light probes updated via Raytracing, and it doesn't destroy performance; it is extremely efficient. This means developers don't have to pre-bake lighting anymore, which saves so much time and cost when developing games. RTXGI will work on any DXR-capable GPU, including consoles and RDNA2. The 5700XT, however, is not even capable of emulating it in software (well, it could be if AMD enabled DXR support), meaning a game using RTXGI as its GI solution won't even boot up on a 5700XT. And given that the RT-capable userbase is growing each day with Turing, Ampere, Pascal, RDNA2 and the consoles, this is likely a non-issue for devs. If AMD decides to suddenly implement DXR support for the 5700XT, you could still play the game, but with much worse performance and visual quality than on the DX12U-capable GPUs due to the lack of hardware acceleration for Raytracing.

Remember, everything I talked about also applies to AMD's new RDNA2 architecture. It fully supports these features as well. Basically, RDNA2 is a much, much bigger jump than you might realize.

Your card might be fine for a while with cross generation games, but once next gen kicks off, the 5700XT is dead in the water.

Hope that clears it up.
 