
AMD Radeon RX6800/RX6800XT Reviews/Benchmarks Thread |OT|

It was expected to be the other way around, because AMD was "years behind".
AMD only touching the 3070 was another thing that was expected, because, you've guessed it, AMD was "years behind".
On the RT front, it was expected that AMD would not even be on Turing's level, because, you've guessed it, AMD was "years behind". (Note how it beats NV in Dirt 5.)



No, it's better perf/$ all around, with bonuses like more VRAM as icing.


Godfall already doesn't fit in 10GB and we are just transitioning into "this gen".


AMD has much better than expected performance in NV sponsored RT games, and actually beats NV in Dirt 5.


Godfall works fine at 4K maxed out on a 3070. The 8 gigs of that card don't hamper the performance one bit, according to the Hardware Unboxed video where they unbox the 6800XT.

As for having better than expected raytracing, it's below Nvidia's first attempt at it. Below the 2080 Ti. How is that better than expected? They put out worse raytracing performance than Nvidia did in 2018, and they had two years to reverse engineer what Nvidia did and see how it all works.
 
Last edited:

SantaC

Member
Newest games only, from ComputerBase (they have the 3080 6% ahead of the 6800XT at both 4K and 1440p).

6900XT will basically trounce 3080.
 

Ascend

Member
My friend, I am sure you have a 5700XT judging by how much you are defending it, and I'm sorry I have to be the one who tells you this: this card won't keep up, at all. Especially from a developer perspective. I highly recommend selling the card, if you can.
Ha. I don't have one :) I am still using a card from 2015ish. I normally upgrade every 5 years or so. But thanks for the advice. I wouldn't have sold it if I had it either; I would have used it for some casual crypto mining.

With Sampler Feedback Streaming, the mentioned RTX 2060 Super (but really it also applies to the next-gen consoles, Ampere and RDNA2 as well) has around 2.5-3.5x the effective VRAM amount compared to cards without SFS (basically, the 5700XT). Read up on what this technology does here: https://microsoft.github.io/DirectX-Specs/d3d/SamplerFeedback.html . Simply speaking, it allows for much finer control of texture MIP levels, meaning your VRAM can be used far more efficiently than before. That will result in much higher resolution textures for DX12 Ultimate compatible cards without the need to increase physical VRAM, and it eliminates stuttering and pop-in. Basically, a 2060 Super has around 20 GB or more of effective VRAM compared to your 5700XT. How do you plan to compensate for that?
Time will tell, won't it? I know quite a bit about sampler feedback, by the way. It's more about improving bandwidth efficiency than VRAM size, but it does help with both.
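For readers wondering what that support actually looks like on the PC side, here is a minimal C++/D3D12 sketch of the capability check a renderer would run before enabling sampler-feedback-based streaming. The function name and the bare device pointer are illustrative only; device creation, descriptors, and the feedback-writing shaders are omitted.

#include <windows.h>
#include <d3d12.h>

// Returns true when the GPU/driver expose DX12 Ultimate sampler feedback,
// the prerequisite for Sampler Feedback Streaming style texture streaming.
bool SupportsSamplerFeedback(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 options7 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7,
                                           &options7, sizeof(options7))))
        return false; // older runtime/driver: no OPTIONS7 data at all
    return options7.SamplerFeedbackTier != D3D12_SAMPLER_FEEDBACK_TIER_NOT_SUPPORTED;
}

A card like the 5700XT would return false here and the engine would fall back to conventional mip streaming; that is the "effective VRAM" argument above in code form.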

Next, we have mesh shading. As I said, that is a key technology, as it replaces current vertex shaders and allows for much, much finer LOD and higher geometry density, similar to what you saw in the PS5 Unreal Engine 5 demo. On PS5, it is using Sony's customized next-generation Geometry Engine, which is far beyond the standard GE from RDNA1. On the PC, however, Nanite will most likely be using mesh shaders, which the 5700XT does not support, meaning it is incapable of handling that much geometry.

VRS can give you a 10-30% or even higher performance boost at almost no image quality cost.
Despite this all being true, most people will upgrade their graphics card before it becomes a concern. It's not as if games will not run without these features. And someone who was going to buy a $400 card likely wouldn't mind lowering a few settings to keep the performance up. The 5700XT will still age fine.
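Since mesh shaders and VRS came up together, here is a small, hedged C++/D3D12 sketch of how a game could detect them and opt into coarser shading. The function and variable names are made up for illustration, and real code would wire the results into its pipeline selection and error handling.

#include <windows.h>
#include <d3d12.h>

// Query DX12 Ultimate tiers and, if VRS is available, ask for 2x2 coarse shading.
void QueryDx12UltimateFeatures(ID3D12Device* device, ID3D12GraphicsCommandList5* cmdList)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 opts7 = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7, &opts7, sizeof(opts7));
    const bool hasMeshShaders = opts7.MeshShaderTier != D3D12_MESH_SHADER_TIER_NOT_SUPPORTED;

    D3D12_FEATURE_DATA_D3D12_OPTIONS6 opts6 = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6, &opts6, sizeof(opts6));

    if (opts6.VariableShadingRateTier != D3D12_VARIABLE_SHADING_RATE_TIER_NOT_SUPPORTED)
    {
        // Shade subsequent draws at 2x2 pixel granularity; combiners pass the rate through.
        const D3D12_SHADING_RATE_COMBINER combiners[2] =
            { D3D12_SHADING_RATE_COMBINER_PASSTHROUGH, D3D12_SHADING_RATE_COMBINER_PASSTHROUGH };
        cmdList->RSSetShadingRate(D3D12_SHADING_RATE_2X2, combiners);
    }

    // A renderer would use hasMeshShaders to pick mesh-shader vs. vertex-shader pipelines.
    (void)hasMeshShaders;
}

This is the mechanism behind the "10-30% boost" claim: where the tier is supported the hardware simply shades fewer pixels, and GPUs without the tier (like the 5700XT) take the normal path.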

Next, there is raytracing. Raytracing is not a gimmick. Raytracing saves devs so much time, it gets more efficient each day, and it's going to be an integral part of next-gen games in the future. Even the consoles support it in decent fashion, and it is the future of rendering; don't let anyone tell you otherwise. Nvidia recently released their RTXGI SDK, which allows for dynamic GI using light probes updated via raytracing, and it doesn't destroy performance, it is extremely efficient. This means developers don't have to pre-bake lighting anymore, saving a lot of time and cost when developing games. RTXGI will work on any DXR-capable GPU, including consoles and RDNA2. However, the 5700XT is not even capable (well, it could be if AMD enabled DXR support) of emulating it in software, meaning a game using RTXGI as its GI solution won't even boot on a 5700XT anymore. However, given that the RT-capable userbase is growing each day with Turing, Ampere, Pascal, RDNA2 and the consoles, this is likely a non-issue for devs. If AMD decides to suddenly implement DXR support for the 5700XT you could still play the game, but with much worse performance and visual quality than the DX12U-capable GPUs.
All true. But I don't think that future is as near as you think.
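To make the "won't even boot" point concrete, here is a minimal, hypothetical C++/D3D12 sketch of the DXR check a raytracing-only title could perform at startup; the function name is illustrative, and a real game would show a proper error message instead of just returning.

#include <windows.h>
#include <d3d12.h>

// A game that requires hardware ray tracing can refuse to start when DXR is absent.
bool SupportsDxr(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts5, sizeof(opts5))))
        return false;
    // TIER_1_0 covers Turing/RDNA2-class DXR; TIER_1_1 additionally allows inline ray tracing.
    return opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}

On a 5700XT with current drivers this returns false, which is exactly why an RTXGI-only GI path would lock the card out unless AMD ships a DXR fallback.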

Remember, everything I talked about also applies to AMD's new RDNA2 architecture. It fully supports these features as well. Basically, RDNA2 is a much, much bigger jump than you might realize.

Your card might be fine for a while with cross generation games, but once next gen kicks off, the 5700XT is dead in the water.

Hope that clears it up.
The lack of VRS was one of the reasons I didn't upgrade to the 5700XT. The lack of RT was not one of them. I was waiting on RDNA2, because nVidia's RTX 2000 series was overpriced, and honestly, I go to nVidia as a last resort, because I'm kind of an idealist and I don't like their shady business practices. And I have a long list of memories of nVidia's shenanigans.
 

FireFly

Member
Godfall works fine at 4K maxed out on a 3070. The 8 gigs of that card don't hamper the performance one bit, according to the Hardware Unboxed video where they unbox the 6800XT.

As for having better than expected raytracing, it's below Nvidia's first attempt at it. Below the 2080 Ti. How is that better than expected? They put out worse raytracing performance than Nvidia did in 2018, and they had two years to reverse engineer what Nvidia did and see how it all works.
We need to see what performance looks like in titles developed for the next-generation consoles that are designed for AMD's architecture, and we need to see how drivers improve. In a year's time, we will likely have a much better picture of how everything stacks up. For example, AMD's performance in Control is terrible even without ray tracing, and the Minecraft RTX beta results look to be on par with what Microsoft demo'd on the XSX!
 

Dampf

Member
Oh good, I'm glad!

Yes, it might be a little while until all these features are used, and for true next-gen games for that matter, but partial support could also come in cross-generation games next year, like a high-res texture setting only for DX12U GPUs.

Anyway, I do expect faster adoption, given that the consoles have all these features too and can share code via DX12U easily. I give it two years. But you are right, for people upgrading every 2 years this might not be an issue.
 
Last edited:

Ascend

Member
just trying to gauge when that might be.

interested in bookmarking this thread as us oldtimers have seen some pretty large crows and hats eaten in the past.
It will definitely take a while before the 5700XT is obsolete... No game is going to make VRS or RT mandatory anytime soon because it alienates too much of the player base. It's the same reason games will still support hard drives. SSDs are awesome, but they will not be mandatory anytime soon.
The majority of gamers still play at 1080p. The shift to 4K hasn't even really happened yet, and we think RT is going to be mainstream this year or next?
The 5700XT has already proven that it can perform extremely well in recently released games, and it punches higher above its weight than anybody initially thought possible.

nVidia has a way of getting people to buy into FOMO. They have employed that same strategy forever with TWIMTBP and GameWorks. Now, it's RTX.

They make some people think something like:
"No RTX? OH MY GOD I'M GOING TO DIE IF I DON'T HAVE THAT"...

Yeah... Some of us are more level-headed than that.
 

MadAnon

Member
It was expected to be the other way around, because AMD was "years behind".
AMD only touching the 3070 was another thing that was expected, because, you've guessed it, AMD was "years behind".
On the RT front, it was expected that AMD would not even be on Turing's level, because, you've guessed it, AMD was "years behind". (Note how it beats NV in Dirt 5.)

Maybe you expected it. Don't project your dumb expectations on me.
The 5700XT already showed massive improvements. The RDNA2 arch plus an improved 7nm node pointed towards another great jump.


No, it's better perf/$ all around, with bonuses like more VRAM as icing.

As I said, the prices are all over the place. Show me where I can get a 6800XT for $649 in Europe.

Godfall already doesn't fit in 10GB and we are just transitioning into "this gen".

Show me where Godfall doesn't fit in 10GB. The 3080 runs it at 4K, Epic settings, over 60 fps without any problems. And VRAM usage is nowhere near 10GB; it's less than 7GB. Watch some benchmarks and get a grip.

AMD has much better than expected performance in NV sponsored RT games, and actually beats NV in Dirt 5.

What benchmarks did you watch? RTX trounces AMD in Control and Minecraft by 50-100% (without DLSS), according to Gamers Nexus benchmarks. The more RT effects you apply, the bigger the lead gets.
 
Last edited:

mitchman

Gold Member
Insomniac's ray tracing is per their engine, not DXR-based. It seems almost all first-party games going forward, if Insomniac is any indication, are doing their own kind of ray-traced reflections and lighting in their own internal engines.

So it's not comparable to what some of the third-party titles are using, which is either RTX- or DXR-based.
You seem to be mixing up HW RT with DXR API-based RT, aka the hardware vs. an API. DirectX is an MS API; GNM is the similar Sony API. Both support enabling ray tracing in hardware. Insomniac obviously uses GNM to enable HW ray tracing.
 
Last edited:

Turk1993

GAFs #1 source for car graphic comparisons
I knew something was fishy when they didn't show any ray-traced games in their benchmarks. Really hoped they surpassed Nvidia, but nope, not even close. And it's funny that the same guys that had an argument with me that a 5700XT was a better buy than any RTX GPU last year are still in denial and going wild over AMD lol. I'm gonna wait for the 6900XT benchmarks for RT; if they are above RTX 3080 performance-wise I might get one. Otherwise I need to wait on the 3080 Ti or go wild and get a 3090 FFS.
 

wachie

Member
After reading more reviews, I still can't believe AMD has caught up to Nvidia at the top end in rasterization. I mean, at one point in time, some select people were insisting that AMD "cannot" and "will not" be much faster than a 2080 Ti. Well, we might very well have an AMD GPU that is touching a supposed "Titan" from Nvidia, at least for now.

Only hoping that the supply situation improves for both AMD and Nvidia so gamers of all kinds benefit from this. Lovely times :)
 

Andodalf

Banned
After reading more reviews, I still can't believe AMD has caught up to Nvidia at the top end in rasterization. I mean, at one point in time, some select people were insisting that AMD "cannot" and "will not" be much faster than a 2080 Ti. Well, we might very well have an AMD GPU that is touching a supposed "Titan" from Nvidia, at least for now.

Only hoping that the supply situation improves for both AMD and Nvidia so gamers of all kinds benefit from this. Lovely times :)

The issue is that AMD now has an RT and ML deficit as big as their rasterization one was, but now they're competing at similar price points.
 

Rentahamster

Rodent Whores
It's not hard to see that Hardware Unboxed is trying their hardest to make Radeon look good. First exaggerating the POSCAP situation and now their benchmarks. Everywhere else I look, including GN, TechPowerUp, and Guru3D, has numbers where the 3080 is very close to, if not faster than, the 6800XT at 1440p.
What are you talking about? Hardware Unboxed was one of the few sites to withhold judgment, investigate more, and then state that the situation was overblown.
 

Ascend

Member
What benchmarks did you watch? RTX trounces AMD in Control and Minecraft by 50-100% (without DLSS), according to Gamers Nexus benchmarks. The more RT effects you apply, the bigger the lead gets.
I normally like Gamers Nexus. And I can't argue with their benchmark results. But I can't help but feel that they are slightly biased against the 6800XT.

Why, you ask?

Because they are the only ones who felt the need to mention that the cooler basically sucks, despite pretty much every other review saying that the cooler is fine. It sounds like they mentioned that just because all previous AMD reference coolers weren't good, rather than acknowledging the improvements.

-------------

On another note, sometimes we forget to look at practicality in benchmarks. People love to talk about the practicality of DLSS at "4K" and RT. But if we look at the raw benchmarks, say, Flight Simulator 2020, the RTX 3080 is 20% faster at 4K. But that's 40 fps vs 33 fps. Who's going to willingly run the game at those framerates?

Then there's the other side, where framerates are either too high, or within a certain range where the difference doesn't really matter. Shadow of the Tomb Raider at 4K is 87 fps for the 3080 and 79 fps for the 6800XT. Both are fine for a 75 Hz monitor and too slow for a 90 Hz or 120 Hz monitor. Where is the difference?
Rainbow Six Siege gives you 174 fps for the 3080 and 168 for the 6800XT. Both have minimums above 144 fps, so where's the difference?

We are too used to simply going for the higher number while in reality it might not mean anything at all. Add in variable refresh rates and the average framerate alone becomes even less important.
 
One other thing to note is how false AMD's slides about its performance from last month were. Basically, the 3080 wins at 4K in 95% of the cases. AMD had slides with numbers that haven't materialized in the reviews today. In fact, what AMD showed is almost 20% off in the case of Gears 5. They showed, I think, almost 80 frames on their card, higher than Nvidia. And reviews today put it in the 60s, with Nvidia close to 80. Those were some doctored slides and performance metrics.
 

DonkeyPunchJr

World’s Biggest Weeb
AMD fans: “who cares about ray tracing? Only a few games support it right now.”

Also AMD fans: “16GB is more future proof!”

In all seriousness I’m glad to see AMD back in the high-end game. Although personally I couldn’t see myself getting a GPU that lags this badly in ray tracing performance. Definitely interested in how their DLSS competitor will perform.
 

Ascend

Member
AMD fans: “who cares about ray tracing? Only a few games support it right now.”

Also AMD fans: “16GB is more future proof!”
RAM is used in all games. RT is not.

One other thing to note is how false AMD's slides about its performance from last month were. Basically, the 3080 wins at 4K in 95% of the cases. AMD had slides with numbers that haven't materialized in the reviews today. In fact, what AMD showed is almost 20% off in the case of Gears 5. They showed, I think, almost 80 frames on their card, higher than Nvidia. And reviews today put it in the 60s, with Nvidia close to 80. Those were some doctored slides and performance metrics.
The selection of games and the number of games matter. One of the reasons I like Hardware Unboxed is that they include many different types of games. They try to vary the benchmarks with nVidia-sponsored titles, AMD-sponsored titles, different engines, different APIs, and a mix of old and new titles. The majority of sites seem to randomly pick a handful of games, and that can skew the overall result.
 
Last edited:

Turk1993

GAFs #1 source for car graphic comparisons
RAM is used in all games. RT is not.


The selection of games and the number of games matter. One of the reasons I like Hardware Unboxed is that they include many different types of games. They try to vary the benchmarks with nVidia-sponsored titles, AMD-sponsored titles, different engines, different APIs, and a mix of old and new titles. The majority of sites seem to randomly pick a handful of games, and that can skew the overall result.
There are more games that use RT than games that use 16GB of VRAM, and that's a fact ;). Both are important, stop downplaying RT; it's nice to have both.
 

fermcr

Member
AMD should have released the 6800 at the same price as the Nvidia 3070 and destroyed the competition... that extra 80-100€ is a bit too much. It gives the Nvidia 3070 some breathing room.
 
Last edited:

Papacheeks

Banned
You seem to be mixing up HW RT with DXR API-based RT, aka the hardware vs. an API. DirectX is an MS API; GNM is the similar Sony API. Both support enabling ray tracing in hardware. Insomniac obviously uses GNM to enable HW ray tracing.

Correct. But it's specific to Sony, and specific to internal use for the console itself. You're not going to see that translate to the PC, which will use DXR.

That's my understanding anyway. Insomniac was literally boasting that it was best in class compared to what they were seeing from others.
 
RAM is used in all games. RT is not.


The selection of games and the number of games matter. One of the reasons I like Hardware Unboxed is that they include many different types of games. They try to vary the benchmarks with nVidia-sponsored titles, AMD-sponsored titles, different engines, different APIs, and a mix of old and new titles. The majority of sites seem to randomly pick a handful of games, and that can skew the overall result.


Most sites pick a good selection of games. That was not what I said. AMD showed slides where they compared the 6800XT against the 3080 and had some bizarre results that weren't even consistent with what they themselves showed earlier, at the CPU conference. Remember back when they presented Ryzen: at the end of the show they showed 3 games - Gears 5, COD and Borderlands 3. Now, three weeks later, when they had the GPU conference, these 3 games were all presented with higher framerates than what they showed 3 weeks earlier. Weird.

Then they showed a number of games, comparing them with the performance of the 3080. Pretty much every game they showed on there as winning over the 3080 that I could find reviewed today is untrue. It's the other way around. Gears 5 in particular stands out, and Battlefield 5. Gears 5 is almost 20% faster on a 3080 than on a 6800XT at 4K, yet they had their card winning at the conference. Battlefield 5, I think, was where they showed the biggest jump for their card, yet reality today shows the 3080 faster there too.

It's like they made up higher numbers out of thin air for their presentation.
 

Ascend

Member
There are more games that use RT than games that use 16GB of VRAM, and that's a fact ;). Both are important, stop downplaying RT; it's nice to have both.
Stop downplaying 16GB of RAM.

See how that works?

Most sites pick a good selection of games. That was not what I said. AMD showed slides where they compared the 6800XT against the 3080 and had some bizarre results that weren't even consistent with what they themselves showed earlier, at the CPU conference. Remember back when they presented Ryzen: at the end of the show they showed 3 games - Gears 5, COD and Borderlands 3. Now, three weeks later, when they had the GPU conference, these 3 games were all presented with higher framerates than what they showed 3 weeks earlier. Weird.

Then they showed a number of games, comparing them with the performance of the 3080. Pretty much every game they showed on there as winning over the 3080 that I could find reviewed today is untrue. It's the other way around. Gears 5 in particular stands out, and Battlefield 5. Gears 5 is almost 20% faster on a 3080 than on a 6800XT at 4K, yet they had their card winning at the conference. Battlefield 5, I think, was where they showed the biggest jump for their card, yet reality today shows the 3080 faster there too.

It's like they made up higher numbers out of thin air for their presentation.
It might be due to the 5950X. How many reviewers test with that?
 
Last edited:

wachie

Member
The issue is that AMD now has an RT and ML deficit as big as their rasterization one was, but now they're competing at similar price points.
I'm a big believer in DLSS, so I acknowledge the gap there. With AMD essentially catching up to Nvidia's first implementation of RT, their focus will be different with RDNA3. Seems like catching up to Nvidia in pure raster performance was their primary goal, along with DX12 Ultimate conformance and other features that are likely to define the "next gen".
 

SantaC

Member
RT is here and now. 16GB of VRAM won't be necessary for a long time. Long enough for another generation of GPU to drop.
RT has been irrelevant for PC gaming for a couple of years now, and suddenly it's the only thing that matters? Someone should pull out the Steam hardware survey results.
 

Ascend

Member
I'm getting Ryzen 1000-series vibes. The Intel fans all came out of the woodwork. It seems like RDNA4 will be the Zen 3 equivalent, if nVidia rests on their laurels.
 
Stop downplaying 16GB of RAM.

See how that works?


It might be due to the 5950X. How many reviewers test with that?



I was thinking about it, but the 4K results show no difference between CPUs. Certainly not higher than 10%.

 
RT has been irrelevant for PC gaming for a couple of years now, and suddenly it's the only thing that matters? Someone should pull out the Steam hardware survey results.
I don't know about Steam or the general public. Neither do I care. I've been enjoying RT for 2 years. And with the next-gen consoles releasing now, more games than ever are adopting RT. It's the future.
 

Ascend

Member
Let's hope that Nvidia gives us this rumored 3080 Ti. That's all AMD is good for still, getting Nvidia to lower their prices.

Why would you pick these instead of the Nvidia cards? There is no obvious reason unless you're an AMD fanboy.
By your logic, because if they are not supported, they will not be around to lower nVidia prices anymore.
 
Last edited:

Ascend

Member
I don't know about Steam or the general public. Neither do I care. I've been enjoying RT for 2 years. And with the next-gen consoles releasing now, more games than ever are adopting RT. It's the future.
Ok so... Is 2080 Ti RT performance good enough or not?
 

Ascend

Member
Don't see 2080 Ti RT performance as good enough or not. See it as worse than the 3000 series.

:messenger_sunglasses:
You were the one saying that you have been enjoying it for 2 years, so it is good enough.
But... Touché.

Question to everyone: Do you believe VR is also the future? Maybe that USB-C connector comes in quite handy, wouldn't you agree?
 
Last edited:
By your logic, because if they are not supported, they will not be around to lower nVidia prices anymore.
Nah, Apple has arrived and their GPU is seriously powerful as iGPUs go. If they take it seriously, they could release an actual dGPU that is worth something, instead of Intel's trash.
 

SantaC

Member
I don't know about Steam or the general public. Neither do I care. I've been enjoying RT for 2 years. And with the next-gen consoles releasing now, more games than ever are adopting RT. It's the future.
Native 4K with 60 fps @ ultra settings is imo the priority over RT.
 
Native 4K with 60 fps @ ultra settings is imo the priority over RT.
Yeah, I used to hold the same opinion. 4K or death. But reconstruction methods like DLSS, or whatever Sony is doing in games like Demon's Souls, honestly are good enough, and I'd rather have ray tracing effects (ambient occlusion, shadows, reflections) than pure native 4K. Bear in mind that I game on a 65'' CX from about 2 meters away.
 

Turk1993

GAFs #1 source for car graphic comparisons
Stop downplaying 16GB of RAM.

See how that works?


It might be due to the 5950X. How many reviewers test with that?
Yup, you are doing the same shit as last year, blindly picking things up without reading everything I wrote. You didn't read the part where I said "Both are important, stop downplaying RT; it's nice to have both." I can show you more games with ray tracing than games that use 16GB of VRAM; you, on the other hand, can't even show a handful, if any, of games that use 16GB of VRAM.
 

Ascend

Member
Yup, you are doing the same shit as last year, blindly picking things up without reading everything I wrote. You didn't read the part where I said "Both are important, stop downplaying RT; it's nice to have both." I can show you more games with ray tracing than games that use 16GB of VRAM; you, on the other hand, can't even show a handful, if any, of games that use 16GB of VRAM.
It is indeed nice to have both, which the 6800XT also has, in case you conveniently forgot.

I would pick 16GB with medium speed RT over 10GB with slightly higher speed RT.

And oh, when the 3080 becomes RAM limited, you can expect me to come rub it in your face, and I then expect an apology too. You have been warned.
 

Silver Wattle

Gold Member
The number of games that support RT or DLSS is minuscule, and it doesn't apply to older games etc., yet suddenly it's the only thing that matters. Seems like the usual way the press stan for Nvidia and downplay AMD.
I've been using an RTX 2060 for over a year and can't recall a single time I have used either of those features.
 

CrustyBritches

Gold Member
I'm late to the party. Man, I love when the embargo finally lifts and you get to digest all this new info. I kind of expected the 6800XT to come in at or a little above the 3080 in general performance, but it was close. Those average clocks with OC + max power are crazy!


Perf/$ is slightly higher on the 6800XT, but it's so close it's basically a wash. Nvidia really nailed their price points with Ampere. AMD will hopefully have more stock available so people can upgrade for Cyberpunk, which will totally get released on Dec. 10th >_>

Concerning RT, it looks like they do OK at RT in Metro, but really struggle in Control. Synthetic hybrid-RT performance seems to be decent; as rumors indicated, slightly higher than the 2080 Ti. Nvidia is very strong at path tracing. DLSS cannot be ignored either. It's in all sorts of games now, with many to come. AMD will need to bring their AI-upscaling solution to market quickly to counter.

Overall, I'm impressed with AMD and TSMC. RT is in its infancy for AMD on RDNA 2. However, as the consoles get more titles optimized for RDNA 2 RT, I'm sure the cards will have decent long-term performance. As far as what card I'm leaning towards, I think I'll be looking at a 3060 Ti or 3070 for the DLSS and RT support for Cyberpunk, then I can get a 6800 XL next year and have a couple of decent mid-range setups like I did with the RX 480 and GTX 1060. This works best for me for the next year. I buy, sell, and trade on Craigslist and FB Marketplace all the time, too. I'm not worried about flipping and splurging.:messenger_halo:
 