
AMD RX 6000 'Big Navi' GPU Event | 10/28/2020 @ 12PM EST/9AM PST/4PM UK

Could you please stop this bollocks about not having RT cores.

AMD is doing the exact same RT pipeline steps Nvidia does in its RT cores, in specialized hardware, as seen in the fucking AMD ray tracing patent.

VFXVeteran has been spreading this FUD for over half a year now, and GAF is starting to believe it.
 
I'm pretty sure they do have hardware-based ray tracing built into their TMUs, similar to the XSX, I would imagine.

They likely didn't emphasize RT much beyond a tick box because they know they're behind in RT performance this gen, and their DX12 Super Resolution stuff isn't ready yet.
 

FireFly

Member
No RT-specific hardware, no Tensor-core equivalent either. Without those, AMD needed to blow Nvidia out of the water for it to really be close.
They have dedicated BVH acceleration built into the TMUs. The leaked Port Royal benchmark showed them beating a 2080 Ti. And Microsoft has shown the XSX running Minecraft RTX at 2060+ level performance.
 
AMD showed that in very specific situations they can match the power, but have done nothing to address RT or DLSS. When those are added in, results could be a blowout in favor of Nvidia.
^ I too think AMD blew the pricing with the 6800XT. At $500 they could have taken a bite out of the market, but not at roughly the same price as the 3080 without DLSS and with much inferior RT perf.

 
In the presentation overall, they put more emphasis on FPS at ultra/best settings.

They didn't care too much about ray tracing and DLSS.
 

BluRayHiDef

Banned
I wonder how well these cards run Control - the poster child of ray tracing - at 4K with ray tracing enabled. This game alone suffices as a means of determining whether the RX 6000 Series or the RTX 30 Series is the better choice for modern/future gaming. Given the lack of a DLSS equivalent, I'm pretty sure AMD's cards run Control very poorly at 4K with ray tracing enabled (or even without it, because the game is demanding even without ray tracing).
 

In the presentation overall, they put more emphasis on FPS at ultra/best settings.

They didn't care too much about ray tracing and DLSS.

They have slightly weaker RT hardware than the 3000 series, and the Super Resolution stuff from DX12 is not ready yet, so if they showed games with heavy RT without the Super Resolution stuff, the performance would suffer a lot, especially if people were comparing to the 3000 series with DLSS turned on.

Give it some time for AMD/MS to finish up the Super Resolution stuff and release it as a driver update.
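Rough napkin math on why that matters (the resolutions below are just illustrative assumptions, not anything AMD or Nvidia have stated): an upscaler only shades and traces rays at the internal resolution and reconstructs the rest, so the heavy per-frame work shrinks roughly with the pixel count.

```python
# Back-of-the-envelope sketch of why an upscaler (DLSS today, AMD's Super
# Resolution later) changes RT comparisons so much. The resolutions are
# assumptions for illustration; real gains depend on the game and the upscaler.
native_4k = 3840 * 2160        # pixels shaded / rays launched per frame at native 4K
internal_1440p = 2560 * 1440   # pixels when rendering internally at 1440p and upscaling to 4K

print(f"native 4K pixels:      {native_4k:,}")
print(f"1440p internal pixels: {internal_1440p:,}")
print(f"rough per-frame workload ratio: {native_4k / internal_1440p:.2f}x")
```

So a card running heavy RT at native resolution is doing roughly 2.25x the per-frame ray and shading work of one upscaling from 1440p, which is presumably why AMD kept the RT demos light until Super Resolution ships.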
 

MiguelItUp

Member
You're calling people illiterate while focusing on the two points that matter significantly less? A $50 haircut and 20 watts of total board power? Ok, LMFAO.

Seriously dude, you're the dumb one if you think that's the argument.

It's the lack of ray tracing performance and the uncertainty about a true DLSS 2.0 competitor, versus the uncertainty of 10GB of VRAM being enough, that's the actual discussion. AMD's pricing was only going to tilt that discussion heavily in their favor if it was much more aggressive. Now they're just competing on features. Do you want more forward-thinking and advanced features/performance now? Choose the 3080. Do you want five years out of your card for SURE, without compromising on VRAM-intensive settings? Choose AMD.
 
It is not surprising that RDNA2 pwns Ampere; one could see it coming, and we have talked about it many times.

Not really surprising, I think.
RDNA1 was already matching Turing. With Ampere, Nvidia chose to cram in more cores instead of really improving them, which leads to difficulties with efficiency and utilization. With RDNA, AMD started an effort to improve efficiency and utilization, so it was predictable that RDNA2 cores would have "better IPC" and surpass Ampere.
The doubt was whether this would be enough to beat Nvidia's cards, and how much the Infinity Cache would push performance forward.

Nvidia's saving grace is DLSS, and they'll push it harder than ever, but with the new DirectX 12 Ultimate standards it seems these cards can also do a similar form of upscaling.
 
Can the DLSS warriors and their 20 titles that support it get help and stop downplaying how good these cards are? I think the 3070, for the price, is borderline OK for the time being, though, and a great card for 1440p.

Only idiots are saying these are bad cards. But $50 off for the same rasterization performance, while losing in other areas and making up for it with possibly better longevity, ain't "raping Nvidia".

People are dumb everywhere. The 6900XT is alright, I guess, if you want to pay $300-350 more for an extra 10-15 percent jump. Comparing it to the 3090 is disingenuous though; it's not a creator card.
 

Xyphie

Member
Predictions:

Very good performance in 1080p/1440p but lags behind Ampere in 4K because of bandwidth limitations.✅
Raytracing demos shown but no performance metrics.✅
No Navi 22 (RX 6700XT/6700/6600XT) information. Won't launch this year.✅
Bundles with Ryzen 5000 processors.❌

6900XT:
$749-799❌
~105% RTX 3080 (Will be compared to RTX 3090 by AMD✅, "same perf, half price" kind of messaging❌ )
~350W reference❌

6800XT:
$649✅
~95% RTX 3080 in 3rd party reviews (will be shown by AMD with similar or better performance in games✅)
~300✅-325W reference, ~350W board power for fancy AIB OC models (ASUS STRIX et al)

6800:
$579✅
~85% 6800XT perf
275W reference❌

A bit better on power than I expected, but the benchmarks were run with the "Rage Mode" auto-overclocking enabled, so actual draw should be a bit higher for the promised performance. Otherwise not horribly off.
 

Dr.D00p

Gold Member
Good presentation, and at least AMD are back in the game, for which we should all be grateful; Nvidia needs a good kicking.

Only thing I didn't like was the sneaky 6900XT performance figures being based on those Ryzen 5000 CPU memory boosts (Smart Access Memory), which not everyone will be able to take advantage of.
 
I thought the XSX was using DirectX ray tracing?

The PS5 is punching WAY ABOVE its weight class with the AAA ray tracing games they've shown.

The XSX does use DirectX 12 Ultimate, but the compute units of the XSX have been customized at the hardware level to accelerate ray tracing.

The question is, do these new cards have hardware-level ray tracing via:

1) modified compute units
2) dedicated RT cores

I don't know the answer.
 
AMD is behaving the same as Sony this year, being all secretive and releasing information little by little.
I'm expecting a lift on the Zen 3 NDA today. Some changes in its architecture are related to working with RDNA2.
As with the last event, today AMD only showed results; they just glossed over the architecture without giving details. Some people will get a few surprises in the coming weeks as the details are revealed.
 

onesvenus

Member
I'm not greedy.
I just want my 2.23GHz sustainable game clocks! :messenger_clapping:

Geordieboy takes a dump
Oh, you sweet summer child, I hope you have Valerian nearby.
We already saw from 3DMark log files that it can run over 2.5GHz.
Weren't rumors talking about game clocks of 2.4GHz?

And looking at the PS5's speed of 2.23GHz, and seeing that there's no RDNA2 6000 GPU with game clocks that high, does that suggest those clocks won't really be sustained? The highest game clock is 2.04GHz on the 6900XT.
 

Orta

Banned
Okay, I'm impressed. I was waiting to see how the 6800 performed, but there's so little price difference with the XT that I'd plump for that instead. I want to see benchmarks where these cards aren't working in tandem with AMD CPUs, though.
 
The XSX does use DirectX 12 Ultimate, but the compute units of the XSX have been customized at the hardware level to accelerate ray tracing.

The question is, do these new cards have hardware-level ray tracing via:

1) modified compute units
2) dedicated RT cores

I don't know the answer.

Ugh... again.
We already know how RT works on RDNA2.
Why are people still questioning this?
 
Lol, get fucked Jensen.

Just as I thought.

1) No RT cores, so basically the shader cores will do all the RT.
2) No hardware Tensor Cores for DLSS.

These cards will not last long this generation.

They'll last longer than the 3070 and 3080 with their gimped memory arrangement.

It tells you everything about the confidence Nvidia has in its product that they sharted out a failed, botched launch with Ampere because they knew a deep dicking was on its way.
 
So now we know the answer as to why Nvidia pushed the 3080 and 3090 way past the optimum efficiency curve to gain a tiny bit of extra performance for huge power draw.

We know why they rushed out their paper launch without enough stock.

We know why the 3080 and 3070 are priced the way they are, and we know why the 3090 exists at all, as a Hail Mary card to maintain the halo crown.

We know why the 3070ti based on GA104 was cancelled.

We know why a cut down GA102 is taking its place.

We know why the 20GB 3080 and 16GB 3070 were cancelled.

We know why a 3080 Ti is launching at slightly lower performance than the 3090, with 12GB of RAM and presumably a lower price.

Very well played AMD, very well played indeed.
 

Bluntman

Member
"Ray tracing itself does require additional functional hardware blocks, and AMD has confirmed for the first time that RDNA2 includes this hardware. Using what they are terming a ray accelerator, there is an accelerator in each CU. The ray accelerator in turn will be leaning on the Infinity Cache in order to improve its performance, by allowing the cache to help hold and manage the large amount of data that ray tracing requires, exploiting the cache’s high bandwidth while reducing the amount of data that goes to VRAM."

https://www.anandtech.com/show/1620...-starts-at-the-highend-coming-november-18th/2

Can we stop with the "no RT hardware OLOLOLO" bullshit sometime soon?
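For anyone who still wants a concrete picture of what that per-CU "ray accelerator" actually does, here's a minimal sketch (illustrative Python with a made-up node layout, not AMD's implementation): the fixed-function unit handles the ray/box and ray/triangle intersection tests at each BVH node, while the traversal loop itself stays in shader code. As I understand it, Nvidia's RT cores put the traversal loop in hardware too, which is the real difference being argued about, not "RT hardware vs. no RT hardware".

```python
# Minimal sketch of BVH traversal, assuming a simple dict-based node layout.
# The slab test below is the kind of ray/box check RDNA2's ray accelerator
# does in fixed-function hardware; the while-loop traversal is what stays
# in shader code on RDNA2 (and moves into the RT core on Turing/Ampere).

def ray_aabb_hit(origin, inv_dir, box_min, box_max):
    """Slab test: does the ray hit this node's bounding box?"""
    tmin, tmax = 0.0, float("inf")
    for axis in range(3):
        t1 = (box_min[axis] - origin[axis]) * inv_dir[axis]
        t2 = (box_max[axis] - origin[axis]) * inv_dir[axis]
        tmin = max(tmin, min(t1, t2))
        tmax = min(tmax, max(t1, t2))
    return tmin <= tmax

def traverse_bvh(root, origin, inv_dir):
    """Shader-style traversal loop; every node test is offloaded to the accelerator."""
    stack, candidate_tris = [root], []
    while stack:
        node = stack.pop()
        if not ray_aabb_hit(origin, inv_dir, node["min"], node["max"]):
            continue                              # ray misses this box, skip the subtree
        if "triangles" in node:                   # leaf: triangles go to the intersection unit
            candidate_tris.extend(node["triangles"])
        else:                                     # interior node: keep descending
            stack.extend(node["children"])
    return candidate_tris
```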
 
How is that 3090 that you paid 700 dollars over market price for working out? I definitely expected you to be super positive on AMD in this thread. Can't think of a single reason why you wouldn't be! No correlation at all!

Keep in mind VFXVeteran is a graphics artist; for graphics artists more VRAM and RT features are probably preferred, for their visual effects suites. IIRC the 3090 is mainly targeted at graphics production suites, not gaming per se.

Overall I'm really happy with these AMD announcements. Gotta give RedTechGaming some props, he was pretty accurate WRT the 6000 line spec leaks without saying too much to get obliterated by AMD ninjas. From what I saw, one of the GPUs is basically 2x PS5s (6900?), and the other is basically a Series X (6800 XT). So it definitely seems they leveraged the hell out of those partnerships.

Seems they are going with adopting DirectStorage for data I/O solutions, similar to Nvidia, though I think they're waiting on MS's DirectStorage instead of spinning their own. Also, something that surprised me here was just how focused they were on specifying DX12U support with the GPUs. Any mentions of Vulkan stuff? A good amount of the DX12U feature set seems strongly ingrained into the actual silicon design of RDNA2 as well, which kind of matches some of the earlier speculation about the degree to which MS has been involved in the process (like bringing mesh shaders to the spec).

I didn't see any mention of cache scrubbers here, but that could maybe be the feature Sony said would be the type of thing that is brought forward in a "future" RDNA standard, something like RDNA3 most likely. So in that sense yeah, they would "have" an RDNA3 feature present, but more as something from their end adopted forward, not pulled from a forward RDNA3 roadmap...which fits the language Mark Cerny used back in March.

There might be a chance the consoles have Infinity Cache after all; definitely not 128 MB's worth, but maybe say, 32 MB minimum. That's at least about 1 MB (minus a few hundred KB) L1$ per dual CU, but it could go up a bit depending on what Sony and MS would've wanted. Maybe a chance they'd have it implemented slightly differently, too.

Power consumption looks like it falls in line with what I was expecting, very good figures there. I think for visual suite productivity Nvidia might still have the advantage at the highest end, but for gaming AMD is at least legitimately competitive with them, and beating them in a lot of cases for less money, too. That could change somewhat factoring in DLSS and RT for Nvidia, but the former is a game-by-game case and the latter is only really applicable in measured doses without tanking game performance (i.e. you still need to take a lot of artistic liberties).
 
The 6900XT is cheap, so it will sell out, what with a pandemic and an unemployment crisis happening. People need to spend their money on something (except rent).
 

Rentahamster

Rodent Whores
what's even the point of AMD cards if they don't have DLSS? I mean, NVIDIA is a much better deal because of it, no?
DLSS is awesome, but until it gets supported in more games, it only really matters if your favorite games are DLSS-capable. Once Nvidia gets DLSS more widely adopted, it'll become a much larger selling point. AMD will probably have an alternative by then, however.
 

SantaC

Member
I am getting a 6800XT; that card looks awesome.

The price point is good. People need to realize that these cards will sell out no matter what. It's COVID times.
 