
AMD Radeon RX6800/RX6800XT Reviews/Benchmarks Thread |OT|

wachie

Member
Delicious, delicious minimums...


Sweet jeez
 

JohnnyFootball

GerAlt-Right. Ciriously.
That's because you see a lot of bias from reviewers, depending on who is doing the article. A lot of them you see mocking nvidia somewhat and championing AMD. Guru3d, for example, has around 15 games tested and nvidia wins 11, I think, sometimes by as much as 13% at 4K in Red Dead 2 or 12% at 4K in Watch Dogs Legion.

Their conclusion is that the 6800 XT is roughly equal to a 3080, which is demonstrably false by their own numbers.

Hardware Unboxed has an extra bias in that it completely disregards ray tracing and DLSS. He parrots the "there aren't enough games" line even though there actually are enough games by now.
Define "enough"
 
After reading a bunch of reviews, the main takeaway for me is that the next-gen consoles are not going to implement RT in a really meaningful way for the next 5-7 years, and I'm not sure how that's going to shake out on the PC landscape.

I'm already blown away by Spider-Man's RT, and this generation is only a few days old. Ratchet and Clank also looks really good.
 

mitchman

Gold Member
I pre-ordered the MSI 6800 XT card for delivery on Nov 27th; we'll see how that pans out. If it's delayed well into December, I will be away, so I'll cancel it and wait for the ASUS TUF 3080 instead, with delivery supposedly in January 2021, or my other pre-ordered 3080 with unknown delivery. So basically I have 3 pre-orders for cards out now; whichever ships first wins.
 
Last edited:

JohnnyFootball

GerAlt-Right. Ciriously.
I think that's the con: "with DLSS". Without DLSS, most new games at those resolutions are almost unplayable. Ray tracing hasn't been shown off in a big, meaningful way yet. On top of that, the DXR rewrite isn't 100% done, and neither is FidelityFX, which is open source and isn't using AI-rescaled images. It rescales effects, i.e. the rendered resolutions of those effects, not the entire rendered image.

If you are telling me I can tone down ray-traced reflections to 1080p, like Sony is doing in GT7, while my final rendered image is native 4K, that's more compelling to me. And everything AMD has shown is not proprietary outside of Infinity Cache. Smart Access Memory can be used by other companies if they program for it.

Radeon SEEMS more focused on overall good gaming performance, which is what matters most.

Not saying people are not going to go buy nvidia cards and be super impressed playing ray-traced games; I mean, obviously people like yourself are sold on the idea.

But to downplay a compelling product that practically everyone who reviewed it is talking about, after 4-5 years of AMD not competing in the high-end sector, is disingenuous.
I won't argue that DLSS isn't impressive, but "4K with DLSS" isn't actual 4K.

What's also weird is that for the longest time I've heard that 1440p is what matters, so it's interesting to see the 4K horn being tooted now that nvidia is better there.

However, I am just happy that there is very legit competition for nvidia.

To quote JayzTwoCents:
"If I were nvidia, I wouldn't make the same mistake Intel made by .... letting arrogance drive your business practice. You have to acknowledge that the 6000 series is not only on your radar, but up your ass"
 

Deleted member 17706

Unconfirmed Member
Looks promising. I'm still planning to hold out for a 6900 XT and see what those benchmarks are like. Not like I would have a chance at buying a 6800 XT anyway... I'm guessing I won't be able to get any GPU (or PS5 for that matter) until next spring at the earliest.
 

Kenpachii

Member
Great to see that all the reviewers don't even check if the raytracing quality is comparable. Kinda doubt they'll even correct their reviews. Professionals, ladies and gentlemen.

(Though I understand that they had to rush these reviews, shitty situation all around)

What I don't get is: they are pretty big on YouTube and make a good living. Get some other kid to test ray tracing quality on those cards while you do the benchmarks for the games.
 
Define "enough"


There are about 30 games now, or a bit more I think, with ray tracing or DLSS. Keep in mind DLSS 2.0 is only from April of this year. So, half a year only.

Let's look at the second half of 2020. Let's name the big games that have come out, or will come out by the end of the year, starting from July:

Death Stranding
Horizon
Avengers
Watch Dogs
Valhalla
Cyberpunk
Call of Duty
World of Warcraft

Besides ACreed and Horizon, they all employ ray tracing and/or DLSS. And there are half a dozen smaller games in this period with RTX/DLSS as well. Point is, the "low number of games" complaint was accurate back in 2018. The 2000 series literally launched with no games supporting the new tech. But as time went on, games started to pile up. Months go by, the number increases. You can't really say there aren't enough games when almost all of the big games coming out support the tech. At some point you actually need to stop and observe that in the meantime they kept coming out. And we have more on the horizon: Bloodlines 2, Witcher 3, the next COD, Doom Eternal and so on saying they will use ray tracing.

You can't have a gigantic number of games with ray tracing if a large number of games don't come out at all. How many AAA games come out in a year? In a month? Not many. But if most of those tentpole AAA games support ray tracing/DLSS, then it's not correct to keep repeating that there aren't thousands of games with ray tracing.
 

JohnnyFootball

GerAlt-Right. Ciriously.
Great to see that all the reviewers don't even check if the raytracing quality is comparable. Kinda doubt they'll even correct their reviews. Professionals, ladies and gentlemen.

(Though I understand that they had to rush these reviews, shitty situation all around)
They also have data on what viewers care about. Apparently ray tracing isn't it.
 

Andodalf

Banned
So if you also have the newest AMD CPU, these GPUs are a bit better than what Nvidia has if you hate RT, only game in 1440p, don't want to make use of any bonus ML features, and haven't heard of DLSS. Nice.
 

wachie

Member
To quote JayzTwoCents:
"If I were nvidia, I wouldn't make the same mistake Intel made by .... letting arrogance drive your business practice. You have to acknowledge that the 6000 series is not only on your radar, but up your ass"
Basically.

And yet people are upset at this situation. No matter which brand you prefer, people should be happy that the landscape is more competitive than it has been since... the 290X?
 

Pagusas

Elden Member
Absolutely great review from Linus,

Basically says he's "not disappointed" but the price point is wrong due to lack of features, but A+ for effort. He calls out the missing or badly implemented features, like AMD's horrible media encoder quality, lack of comparable streaming tools, horrible RT performance, and subpar professional tools performance. But it's still great to see AMD back in line with Nvidia in most things, and it gives us great hope for RDNA3. I feel like he said in a political way what the rest of us have said: AMD is asking too high a price for what they are giving due to lack of features; they should be $100-$150 cheaper than Nvidia's offering. Which is why I said: why would you buy this card at this price when for $50 more you'd get so much more?

 
Last edited:

Papacheeks

Banned
So if you also have the newest AMD CPU, these GPUs are a bit better than what Nvidia has if you hate RT, only game in 1440p, don't want to make use of any bonus ML features, and haven't heard of DLSS. Nice.

Smart Access Memory is not exclusive to AMD CPUs, that's what we are hearing now. So if Intel does an update for their newer chipset boards to support it, we could see this being a thing going forward. Nothing AMD has besides Infinity Cache is proprietary, unlike DLSS, which is nvidia-only because of their AI data centers. FidelityFX is open source, and the consoles are using something similar, if not the same thing.
 

Ascend

Member
But the PS warriors will say we are just at the start of a generation and developers will find ways to make the next Spider-Man MM do even more RT and higher-res textures despite the hardware limitations that we see. This is the main issue I have with people on these boards. They need to ALWAYS consider what the hardware can actually do, what its true limit is, and then they can have realistic expectations. Trust me when I say we are already seeing the max of what the consoles can do with RT right now at launch.
You were always arguing for both RT and for 16GB of RAM.

Key question:
Would you prefer the RTX 3080 for its RT, or the 6800 XT for its 16GB?
 

JohnnyFootball

GerAlt-Right. Ciriously.
Basically.

And yet people are upset at this situation. No matter which brand you prefer, people should be happy that the landscape is more competitive than it has been since... the 290X?
If the 3080 had 16GB, it would be the card I would prefer. Not because of ray tracing, but because I trust their drivers a bit more. As I said earlier, nvidia does not have a perfect track record with drivers either, but it's still pretty good. Bad drivers supposedly really hurt the 5700 XT in ways that didn't show up in reviews. As someone who was dealing with software issues over the summer, with crashes and bluescreens, I can understand that frustration. I assumed the culprit was the AMD CPU. However, it turned out that Avast and Malwarebytes weren't playing nice with my system. Once I uninstalled them and switched to ESET, the issues went away.
 

M1chl

Currently Gif and Meme Champion
You mean PlayStation 5, right? Since every compute unit on RDNA 2 has a ray accelerator that's used for ray tracing. And the PS5 has 36 of those while the Xbox has 52.
Well, we have yet to see some die shots and what kind of additional stuff was done to the APU. And for both, the RT is downright embarrassing, if you look at Linus Tech Tips.
 

JohnnyFootball

GerAlt-Right. Ciriously.
Doubt it, seeing how people are going nuts for Spider-Man's ray tracing.
Who's going nuts? I haven't seen anything to suggest anyone really gives that much of a shit about it. I also doubt many are giving up the performance benefits when it is SOOOO much better in performance mode.
 

Papacheeks

Banned
Doubt it, seeing how people are going nuts for Spider-Man's ray tracing.

Insomniac's ray tracing is per their engine, not DXR-based. Looking at Insomniac, it seems almost all first-party games going forward are doing their own kind of ray-traced reflections and lighting in their own internal engines.

So it's not comparable to what some of the third-party titles are using, which is either RTX- or DXR-based.
 

Pagusas

Elden Member
If the 3080 had 16GB, it would be the card I would prefer. Not because of ray tracing, but because I trust their drivers a bit more. As I said earlier, nvidia does not have a perfect track record with drivers either, but it's still pretty good. Bad drivers supposedly really hurt the 5700 XT in ways that didn't show up in reviews. As someone who was dealing with software issues over the summer, with crashes and bluescreens, I can understand that frustration. I assumed the culprit was the AMD CPU. However, it turned out that Avast and Malwarebytes weren't playing nice with my system. Once I uninstalled them and switched to ESET, the issues went away.

Maybe the competition from AMD will force the 3090 price down, or a 20GB 3080 Ti out sooner? I can fully understand the heartache about 10GB; I avoided the 3080 and went with a 3090 for that exact reason.
 

evanft

Member
Absolutely great review from Linus,

Basically says he's "not disappointed" but the price point is wrong due to lack of features, but A+ for effort. He calls out the missing or badly implemented features, like AMD's horrible media encoder quality, lack of comparable streaming tools, horrible RT performance, and subpar professional tools performance. But it's still great to see AMD back in line with Nvidia in most things, and it gives us great hope for RDNA3. I feel like he said in a political way what the rest of us have said: AMD is asking too high a price for what they are giving due to lack of features; they should be $100-$150 cheaper than Nvidia's offering. Which is why I said: why would you buy this card at this price when for $50 more you'd get so much more?



Maybe RDNA3 will be Radeon's Zen3 moment.

Of course they're competing with nVidia, not Intel, so the game is a bit different.

Maybe the competition from AMD will force the 3090 price down, or a 20GB 3080 Ti out sooner? I can fully understand the heartache about 10GB; I avoided the 3080 and went with a 3090 for that exact reason.

I am almost certain that nVidia will release a card with 20GB of VRAM for $1000 or less before the end of 1Q21.
Probably a 3090 chip with 20GB, dubbed the 3080 Ti. Maybe a clock speed increase as well, but I don't think they have much room on Ampere for that.
 
Last edited:

tusharngf

Member
From reddit


Waited in line outside my local Microcenter. There were over a hundred people in line (some there since yesterday). Was then told the store had 2 6800 XTs available, but another truck was coming as soon as they opened. The following truck arrived with 5.

 
So to sum up the general consensus and performance for RX6800XT vs RTX3080:

General Info From Reviews:
  • 1080p/1440p - 6800XT has 3-5% better performance on average than 3080
  • 4K - 3080 has 3-5% better performance on average than 6800XT
  • Reference Fan Design - Both seem to run cool and quiet, essentially a draw?
  • Performance per watt - 6800XT pulls ahead of 3080
  • Performance per $ - 6800XT pulls ahead of 3080
  • Power Draw - 6800XT draws less power than 3080
  • Ray Tracing (Hybrid Rendering) - 3080 pulls significantly ahead in most titles.
  • Ray Tracing (Path Tracing) - 3080 pulls way ahead. (Minecraft and Quake II RTX are the only PT titles at the moment)
  • Ray Tracing DLSS - 3080 Pulls even further ahead, but AMD's Super Resolution is not available yet to compare.
  • Productivity Performance (Blender etc..) - 3080 still maintains its general lead in most cases due to the massive CUDA advantage.
In addition to the above, the 6800 XT is $50 cheaper at $649 and has 16GB of VRAM vs 10GB for the 3080.
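The performance-per-dollar bullet above can be sanity-checked with quick arithmetic. A minimal sketch, using the launch MSRPs from this thread but made-up placeholder FPS numbers (not review data):

```python
def perf_per_dollar(avg_fps: float, price: float) -> float:
    """Frames per second per dollar of MSRP."""
    return avg_fps / price

# Launch MSRPs; the avg_fps values are equal placeholders, purely
# to isolate the effect of the $50 price gap.
cards = {
    "RX 6800 XT": {"price": 649, "avg_fps": 100.0},
    "RTX 3080":   {"price": 699, "avg_fps": 100.0},
}

for name, c in cards.items():
    print(f"{name}: {perf_per_dollar(c['avg_fps'], c['price']):.4f} fps/$")
```

At equal average FPS, the $50 lower MSRP alone gives the 6800 XT roughly a 7-8% perf-per-dollar edge (699/649 ≈ 1.077); any rasterization lead at 1080p/1440p widens it further.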

All in all, I would say a pretty good showing from AMD, aside from the low RT performance vs the 3080 and the lack of FidelityFX Super Resolution being ready. I personally expected the RT performance to be maybe 5% higher, but overall it seems to mostly match or exceed the 2080 Ti (without DLSS).

On the topic of Ray Tracing on 6000 series cards the performance could potentially improve a bit (still won't match Ampere though) due to the following:
  • Almost all current RT enabled games are optimized for Nvidia's cards. (AMD didn't even have RT capable cards till now)
  • Nvidia worked closely with MS to design the DXR 1.0 API spec/functionality. This was designed towards Nvidia's RT hardware/solution. AMD worked closely with MS for DXR 1.1 which offers new features which can be optimized towards AMD's RT hardware solution. I believe DIRT and Godfall are the only DXR 1.1 capable games so it is still early days yet.
  • Console ports will be optimized towards AMD's RT solution 90% of the time. (See Watch Dogs: Legion RT perf)
  • Improved driver maturity (FineWine tm) /developer familiarity with AMD's RT hardware/DXR 1.1

Having said all of that, they will still likely fall far behind Ampere's RT solution this generation so I'm not making excuses for them or jumping on the "wait for X!" bandwagon. If RT is important to you and if you want the best RT performance available then you should definitely go with Nvidia this generation, simple as that. I just figured it would be interesting to take the above bullet points into account as a possible area where AMD might gain a little ground this gen regarding RT performance.
 

Pagusas

Elden Member
Maybe RDNA3 will be Radeon's Zen3 moment.

Of course they're competing with nVidia, not Intel, so the game is a bit different.

That's what I'm thinking. It took them 3 tries with Zen to get it completely right; maybe it'll be the same with RDNA. Regardless, this is NOT the crummy AMD of the past decade. This is a new AMD that's starting to kick ass, it just takes a while to get its engine revved.
 

JohnnyFootball

GerAlt-Right. Ciriously.
Maybe the competition from AMD will force the 3090 price down, or a 20GB 3080 Ti out sooner? I can fully understand the heartache about 10GB; I avoided the 3080 and went with a 3090 for that exact reason.
nvidia is many times more competent than Intel, so I fully expect them to respond. nvidia, as best I can tell, still has a hunger to innovate, while Intel seems to have laid off all their best engineers.
 

M1chl

Currently Gif and Meme Champion
I mean, I would be interested in where the hell the XSX chip has the ML capabilities that were touted in the Hot Chips presentation, because it seems it's not in this type of GPU. 9:50 in the Linus review...
 
Something that would be interesting to see for both AMD and Nvidia is performance at 3440x1440 ultrawide resolution.

Seeing as AMD is slightly ahead at 1440p and Nvidia is slightly ahead at 4K, I wonder: would they be roughly equal at the somewhat in-between 3440x1440?

Ultrawide master race, assemble and let us know if you find benchmarks!
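For what it's worth, raw pixel counts show where ultrawide sits between the two; a quick sketch (resolution arithmetic only, not benchmark data):

```python
# Raw pixel counts for the three resolutions discussed above.
resolutions = {
    "1440p (2560x1440)":     2560 * 1440,
    "ultrawide (3440x1440)": 3440 * 1440,
    "4K (3840x2160)":        3840 * 2160,
}

base = resolutions["1440p (2560x1440)"]
for name, px in resolutions.items():
    print(f"{name}: {px / 1e6:.2f} MP ({px / base:.2f}x 1440p)")
```

By pixel count, 3440x1440 is about 1.34x 1440p but only about 0.60x of 4K, so its GPU load sits closer to 1440p than to 4K, though performance doesn't scale perfectly linearly with pixels.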
 
Last edited:

dave_d

Member
Because either is available right now? If anything, AMD had more of a paper launch than Nvidia. B&H just announced they got NO stock at all.


Right, neither is available, so if you were like me and had the attitude "I'll just get whatever's available," you're pretty much getting nothing and waiting a couple more months. Who knows what will be available in February/March. (I am not paying scalper prices, and if anything I should have scalped the XSX I did get.)

Anyway, DAMN I knew stock was low but I didn't know it was that low.
 

llien

Member
Why is everyone praising the 6800XT when its failing in nearly every category compared to the RTX 3080 with Ray tracing enabled? lol
Maybe because there are a whopping 20 RT games, including ones not yet released, and ones in which people don't enable it because it barely matters visually but greatly impacts performance, no matter what GPU they have?

Or maybe because UE5 demo has shown hardware RT isn't even needed:
 
Last edited:

Andodalf

Banned
Maybe because there are a whopping 20 RT games, including ones not yet released, and ones in which people don't enable it because it barely matters visually but greatly impacts performance, no matter what GPU they have?

Or maybe because UE5 demo has shown hardware RT isn't even needed:


Yeah, why have dedicated hardware ray tracing when you can spend half your rasterization budget on something that's almost as good as one type of ray tracing and won't be out until SOON
 

llien

Member
ready to be disappointed at the lack of 6700/xt
Why on earth would you expect them to be revealed today?
Q1 2021.

Yeah, why have dedicated hardware ray tracing when you can spend half your rasterization budget on something that's almost as good as one type of ray tracing and won't be out until SOON
Why stop there and call it "half the rasterization budget"? Pulling numbers out of the place you pulled those from, you can be more generous! :messenger_beaming:
And an actual answer is:

1) Because it runs on a wider range of GPUs and hence is more likely to get any traction at all in game development
2) Because it is more flexible, too
 
Last edited:

Andodalf

Banned
Why on earth would you expect them to be revealed today?
Q1 2021.


Why stop there and call it "half the rasterization budget"? Pulling numbers out of the place you pulled those from, you can be more generous! :messenger_beaming:

What about it looked like it needed to run at 1440p 30 when compared to a real game like Demon's Souls?
 

00_Zer0

Member
Welp boys, I just decided I'm going to wait till 2022-23 to either build a new rig or upgrade what I've got. By then these cards will be refreshed and all the new hotness will have died down.

Right now I have a Ryzen 7 3700X with a Sapphire 5700 XT Nitro+ and 32 gigs of DDR4 that I built last year, and for the games I play right now I'm just happy with 1440p at 120 Hz with freesync enabled.

I also don't want to spend like a madman to get either an Nvidia 3080 or Radeon 6800 XT right now with Christmas coming up and other bills to pay.
 
Something that would be interesting to see for both AMD and Nvidia is performance at 3440x1440 ultrawide resolution.

Seeing as AMD is slightly ahead at 1440p and Nvidia is slightly ahead at 4K, I wonder: would they be roughly equal at the somewhat in-between 3440x1440?

Ultrawide master race, assemble and let us know if you find benchmarks!


But it's not slightly ahead at 1440p. It's below the 3080 at 1440p and 4K. Hardware Unboxed more than likely ran tests that show nvidia in the worst light and AMD in the best he could. You can be sure a 3080 will net you better performance at ultrawide, plus better features, better ray tracing, productivity, and DLSS.
 