Let's be honest: ray tracing has been the biggest move in gaming IQ for the last couple of years, and DLSS is what will keep pushing gaming forward.
Talking about historic stuff going back to ATI even isn't what today is about.
And FidelityFX contains VRS, which came out back in 2018 on Nvidia cards, and Microsoft had a big hand in developing FFX with AMD.
People should really thank Nvidia for doing what they have over the last few years.
Sure, they were overpriced, but they didn't sit on their hands like Intel did back when AMD wasn't offering any competition. They pushed ahead and introduced tech that we all now take for granted.
Bullshit.
We've had shadows, reflections and whatnot since ages ago.
That is why it is, in fact, so hard to impress with "RT on". On top of that, RT is essentially noise before intensive post-processing, even in a trivial game like Quake; this is a "no post-processing" frame (the game is open source, if you're wondering how):
But let's move the goalposts from "AMD doesn't innovate" to "let me cherry-pick the innovation I like from the stinky green camp", shall we.
Complaining that RT is noisy "bEfOrE pOsT pRoCeSSiNg" just demonstrates how little you know about it. RT is done in concert with de-noising. And this is true even for non-realtime applications like Blender.
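For what it's worth, the noise both sides are arguing about is just Monte Carlo variance: at the one or two samples per pixel a real-time path tracer can afford, the raw estimate is mostly noise, and the denoiser (plus temporal accumulation) is what turns it into a clean frame. A toy sketch of the statistics, nothing engine-specific:

```python
import random
import statistics

def render_pixel(spp):
    # Toy stand-in for one path-traced pixel: each "path" returns a random
    # radiance value, and the pixel is the average over spp paths.
    return sum(random.random() for _ in range(spp)) / spp

for spp in (1, 2, 16, 256):
    # Render the "same" pixel many times and measure how much it flickers.
    estimates = [render_pixel(spp) for _ in range(5000)]
    print(f"{spp:4d} spp -> noise (std dev) ~ {statistics.stdev(estimates):.3f}")
```

The noise only falls off as roughly 1/sqrt(samples), so brute-forcing it away in real time isn't practical; hence the heavy reliance on denoising, TAA-style accumulation and upscalers.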
Realtime Raytracing is a significant leap forward in IQ and we have Nvidia to thank for that. Not AMD.
The truth is that for the past 2 years Nvidia has been the only game in town for raytracing, so all the progress made so far has been through Nvidia's solution. And right now they have a combination of superior Raytracing cores AND DLSS which now allows Raytracing AND high resolutions. You don't have to choose one or the other anymore.
Even though this is true, that does not mean one can make the statement that AMD is always two years behind. But people's true colors always unintentionally shine through in their comments.
New tech is required to support a new API... Look at what happened with DX10.1, which AMD was also ahead in...
A couple of years ago AMD was ahead with hardware features. But nobody cared about its hardware advantages then... It's only logical that AMD has inverted the game, by adopting features later. Because when they had additional features in their hardware, nobody cared. It seems most people don't even know about it, but obviously they all have an opinion...
Nonsense.
The typical reaction of gamers to that pic (which I'm well aware of) is "no way, it can't be real".
It shows just how things are at the moment.
The most expensive card available on the market, right?
How could people spout this level of nonsense in all seriousness is beyond me.
But I have something rather upsetting for ya:
Nonsense.
The typical reaction of gamers to that pic (which I'm well aware of) is "no way, it can't be real".
It shows just how things are at the moment.
It also shows why NV is so in love with TAA.
Best value across the board, unless you are into the less than a handful of Nvidia-sponsored RT games that you want to run with RT on.
If consoles aim at 4K/30 fps, there is no GPU on the market that can guarantee you 4K/60 fps.
Your comments don't seem to have anything to do with anything.
AMD has not had a better GPU than Nvidia in many, many years. If the 6900 is better, then people will buy it. What's so complicated about that? But it hasn't actually launched yet, now has it.
For many many YEARS, if you wanted a high end GPU, Nvidia was the ONLY option. Why is this so difficult for you to come to terms with? When AMD owns the high end, you will see damn near everyone who was buying Nvidia high end cards, switch to AMD. The actual fanboys are in the extreme minority.
This EXACT situation has already happened for CPUs. People used to buy Intel, not because they were Intel fanboys but because they offered superior CPUs. Then when AMD started putting out quality Zen CPUs, people started flocking to them.
Your comments about Realtime Raytracing are nonsensical. Like you hate it because it was Nvidia who pioneered it, not AMD, so you are in some way obligated to hate it now. Posting a noisy image from Quake like it means something? Seriously, what are you expecting by posting that pic? All it does is show you don't understand raytracing. Using denoising is not some weakness or proof that it's not ready yet. They do the exact same thing for CGI.
I think it's Lisa Su's son/daughter posting. There's no way a "normal" person could ever be this infatuated with a company or item. It just can't be healthy. The person even hates the color green for Christ sakes lol.
Perhaps you love it and that's why you feel hurt when someone criticizes it?
Why do you feel butthurt about someone posting an image actually rendered by rays (with optimizations already applied, by the way)?
Who the hell told you that it means NOTHING, pretty please?
Why are most users SURPRISED and outright DO NOT BELIEVE IT IS TRUE?
You want to claim "Blender uses the same kind of source to render images"? Well, make an argument, then compare what Blender is "denoising" to what has been posted above and think again: do you even have a point?
This EXACT situation has already happened for CPUs. People used to buy Intel, not because they were Intel fanboys but because they offered superior CPUs. Then when AMD started putting out quality Zen CPUs, people started flocking to them.
It is amazing how many chronic diseases of AMD have been addressed under her rule (and Raja's departure).
1) Dated-looking GPU control software
2) Drivers (the 5700 series seems to have had hardware issues, which is a flaw on their part, and I hope they do better QA next time; I also wonder if power-supply issues could be handled more gracefully)
3) Embarrassing over-hyping of products
4) Blower coolers on ref cards
Ironically, the FUD is on the pro-AMD side in the CPU world.
Intel's 14nm is much closer to TSMC 7nm than naming suggests.
It's pretty good, no doubt. It's pretty much even with (or better than) the 3080 at 1080p and 1440p, but is, on average, ~7% behind the 3080 at 4K. And since it's ~7% cheaper (at MSRP anyway, which is useless these days), they both offer the same price/perf at 4K.
Of course, it has 16GB compared to the 3080's 10GB, but then the 3080 has far better raytracing and also has DLSS. So in the end, it depends on what you value more. I am sure no one will be disappointed with either of them IF they can get one. Just get whichever one fits your budget and I am sure the experience will be stellar.
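For what it's worth, the price/perf claim checks out on launch MSRPs ($699 for the 3080, $649 for the 6800 XT); a quick back-of-the-envelope sketch, with the ~7% 4K deficit taken from the post above:

```python
# Launch MSRPs and the ~7% average 4K deficit quoted above.
msrp_3080, msrp_6800xt = 699.0, 649.0
perf_6800xt_vs_3080 = 0.93   # assumed ~7% behind the 3080 at 4K

price_ratio = msrp_6800xt / msrp_3080
print(f"6800 XT is {1 - price_ratio:.1%} cheaper at MSRP")                          # ~7.2%
print(f"4K perf per dollar vs the 3080: {perf_6800xt_vs_3080 / price_ratio:.2f}x")  # ~1.00x
```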
AMD Radeon RX 6800 XT sets a new world record. Recently we saw that the Radeon RX 6800 XT Red Devil was easily overclockable to 2.65 GHz on air and out of the box. This is the upcoming custom model of Big Navi that will debut on November 25th. A few days earlier a […]
I doubt there is 1 person on Earth who actually uses an AMD card for VR considering how superior Nvidia has always been there, but in case anyone cares, that hasn't changed this gen.
Video Index:
00:00 - Welcome Back to Hardware Unboxed
01:22 - Has Nvidia's Poor Launch Helped AMD?
05:07 - How Long Until RDNA2 Optimized Ray Tracing?
07:45 - Is RX 6800 Really Worth Buying Over RTX 3070?
20:37 - Is a 650W PSU Enough?
21:23 - Will RX 6800 Be Highly Overclockable?
22:20 - Will AIBs Use Different VRAM/Cache Amounts?
26:16 - Will Non-Nvidia-RTX Games Perform Better?
31:11 - Driver Stability?
34:03 - Will AIB Launch Lead to More Supply?
38:36 - Why Did AMD Switch Away From Blower Design?
41:59 - Will These Cards See FineWine? (importance of VRAM size)
52:11 - Outro
I doubt there is 1 person on Earth who actually uses an AMD card for VR considering how superior Nvidia has always been there, but in case anyone cares, that hasn't changed this gen.
"Don't say always because in 15 tests, it wins in 4 of them".
Come the fuck on.
Just can't admit defeat. Nvidia won this time around. AMD cards, while having pretty good raster performance, just aren't a good value. Unfortunately, $50 less for something that offers either equal or worse performance while also not offering viable RT solutions or a DLSS alternative just isn't very attractive. Sorry that hurts your feelings, or that you take that personally for some reason.
That's called precision of language and not perpetuating lies. It's like a team losing 3-1 in football and telling the losing team that they never score goals, when they did score one in that very game. Or are you going to disagree that that is a blatant lie?
That's your opinion. Considering how everyone was saying that AMD wouldn't get past the RTX 3070, if that, the fact that they are competing at all is a win for AMD, even if, according to those who think only RT and DLSS exist, AMD lost.
Good luck with your 8GB on the 3070. I'd rather pay $80 for double the VRAM, thanks, especially since multiple games are already surpassing the 8GB usage limit.
As for the 6800 XT, it's only not a good value if you don't care about SAM, see higher RT performance in current games as a must-have, don't care about power consumption, and see a DLSS-like feature as mandatory right now.
Want an unbiased perspective? Watch the video about future proofing in my previous post;
The mental gymnastics you are willing to go through for AMD is quite remarkable. Most people just want the best GPU they can get. So they choose Nvidia. You, on the other hand, are PERSONALLY INVESTED in AMD. Like its products are a part of your self-identity.
Unfortunately, $50 less for something that offers either equal or worse performance while also not offering viable RT solutions or a DLSS alternative just isn't very attractive.
Who says its RT is not viable? If the PS5 can do it with 36 CUs, do you really think that the PC version with twice the CUs has a non-viable RT solution? Not as fast as Nvidia, fine, but to call it "not viable" is, once again, a blatant lie.
And even IF it was not viable, weren't you the one that said this...?;
You do realize PC already has games that have ray tracing as an option? And that PCs without RTX cards can still play them? It's literally a toggle. I don't understand how this is an issue or a concern.
And weren't you using an RTX 2060...? Why did you buy that? If the RT on the 6800 cards is not viable, neither is the RT on the RTX 2060, yet you bought it over the clearly superior 5700 XT. What's the deal?
I dislike dishonesty, to put it mildly. Apathy and carelessness are damaging. You didn't even think AMD could come close to the 3080 in anything. Well, they did so in multiple respects. And now you want to criticize me for calling out people who exaggerate things and perpetuate lies.
Interesting; it definitely seems that optimisation for each vendor's RT implementation plays a major role in the final performance.
This game appears to be another example of that, along with Dirt 5. I wonder whether this trend will continue with AMD-sponsored titles and non-sponsored console ports/cross-platform releases? We may end up with a case where "neutral" non-sponsored titles need separate code paths/branches for RT depending on the detected GPU, if the developers are looking to optimise well for both platforms.
Outside of the AMD vs Nvidia RT performance stuff, an interesting point seems to be that the 3090 performs worse with RT enabled than even the 2080 Ti. I can only really think of two possibilities:
The architectural changes from Turing to Ampere, specifically relating to RT, seem (at least in this title) to lead to better performance on Turing than on Ampere, which does seem a little odd, but I suppose it's not totally impossible.
A more likely scenario is a driver issue with the 3090 for this game? Although it is strange that, without RT turned on, the 3090 is well ahead of the 2080 Ti.
Either way, it looks like the RT landscape/final performance story between the cards may not be fully fleshed out right now, seeing as all of the games tested so far to showcase comparisons are optimised for Nvidia's RT solution/cards.
That benchmark makes no sense; OC-ing the 6800 by 10% brings exactly zero perf boost. That is crazy stuff...
I'd say maybe CPU limited (and running on different CPUs) but I doubt CPU load changes much by resolution.
Yeah I'm not really sure myself, it does seem a little strange but could be partially CPU limited at 1440p? Or could be game engine limits or other weird resource usage?
Regarding OC performance for the 6000 series, we know that they clock high, but we don't know exactly how that translates into performance just yet. Let's wait until we see a few more examples once the AIB custom models release; then we should get some better data on how they perform.
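One rough way to read OC results like that (hypothetical numbers below, not from any review): if the GPU is actually the bottleneck, fps should move more or less in line with core clock; if it barely moves, something else (CPU, engine or power limit) is capping the result.

```python
# Hypothetical numbers for illustration only.
stock_fps, oc_fps = 120.0, 120.5   # assumed results before/after the overclock
clock_gain = 0.10                  # the ~10% core overclock in question

scaling = (oc_fps / stock_fps - 1) / clock_gain
print(f"{scaling:.0%} of the clock gain showed up as fps")
# ~100% -> GPU- and clock-bound; ~0% -> CPU-, engine- or power-limited instead
```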
No compromise: RTX 3090, but it's terrible value
RT at the cost of likely having to lower settings in the future due to VRAM: 3080
Large amount of VRAM at the cost of potentially having to lower RT settings in the future: 6800XT
Worst option: RTX 3070 due to only having 8GB, so get 6800 instead.
The most obvious conclusion is that RT needs to be optimized for the specific architecture, otherwise it simply doesn't work properly.
Another one might be that AMD's RT is specifically very good with shadows, although that seems very weird, because RT GI is not as heavy as RT shadows. But all of AMD's RT implementations seem to focus primarily on shadows. There must be something more behind that.
This is why I am actually happy that, for example, RT will not be working on AMD cards at launch. It sucks in some way, but on the other hand it gives hope that it will actually work better than in Control, for example, once it's here. And I don't mean it will suddenly be better than NV, but maybe RT turns out to be a playable setting to some extent. It will be interesting to see how much time it takes them.
I think the fact that we are starting the generation with consoles that have some RT capacity *might* have something to do with that. And the fact that, for example, Cyberpunk, which is about to be released, looks crazy good with RT.
Sooooooo, what are the benchmark comparisons telling us? I wasn't following the news for a while. Is there some kind of average across many games from multiple sources that compares AMD and Nvidia cards?
Go to page 1, post 1 of this thread. I have a bunch of embedded videos for reviews which contain benchmarks and also a link to a review roundup with links to all of the reviews.
After thinking about it a bit more, maybe the reason AMD's RT is focused on shadows is that shadows don't need color data. They basically only darken what is already there (extremely simplified, of course). So maybe RT effects that need to 'hold' the color data, like reflections, are more demanding on AMD's architecture.
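That lines up with how the two ray types differ in general: a shadow ray is a pure visibility query (in DXR terms it can accept the first hit and stop), while a reflection ray needs the closest hit, its material, and then another round of shading. A minimal, scene-agnostic sketch; the `TraceFn` callback and the names here are made up for illustration, not any engine's actual API:

```python
from typing import Callable, NamedTuple, Optional, Tuple

Vec3 = Tuple[float, float, float]

class Hit(NamedTuple):
    distance: float
    albedo: Vec3       # surface colour -- only ever needed for the reflection case

# Abstract "trace a ray" callback so the sketch stays scene-agnostic.
TraceFn = Callable[[Vec3, Vec3], Optional[Hit]]

def shadow_term(trace_any: TraceFn, point: Vec3, dir_to_light: Vec3) -> float:
    # Shadow ray: a pure visibility query. Any hit at all means "in shadow";
    # no closest-hit sorting, no material fetch, no colour data to carry around.
    return 0.0 if trace_any(point, dir_to_light) is not None else 1.0

def reflection_term(trace_closest: TraceFn, point: Vec3, mirror_dir: Vec3,
                    sky: Vec3 = (0.6, 0.7, 1.0)) -> Vec3:
    # Reflection ray: needs the *closest* hit plus its surface colour, which a
    # real renderer would then shade (and often bounce) again -- much more work.
    hit = trace_closest(point, mirror_dir)
    return hit.albedo if hit is not None else sky
```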
That is certainly possible and might end up being the case. It is odd that most AMD-sponsored games seem to only be using RT shadows; however, it might be that they know their Super Resolution tech is not ready yet, and to get playable framerates at something like 4K with many RT effects enabled you need an upscaling tech like DLSS. Otherwise they would be directly compared to Nvidia, and their own sponsored games would be running better on Nvidia cards. At least that is another possibility and something to take into account.
For example, on consoles with only 36 CUs we see good RT reflections in Spider-Man and many RT effects in Watch Dogs (although I know those are running at lower quality/settings vs PC).
We should hopefully know more as the months pass and more games optimized for AMD release, such as console ports etc. It will also be interesting to see how Watch Dogs performs once the driver issue is sorted out. I wonder whether performance will drop or stay roughly the same for that title on AMD cards?
That's called precision of language and not perpetuating lies. It's like a team losing 3-1 in football and telling the losing team that they never score goals, when they did score one in that very game. Or are you going to disagree that that is a blatant lie?
That's your opinion. Considering how everyone was saying that AMD wouldn't get past the RTX 3070, if that, the fact that they are competing at all is a win for AMD, even if, according to those who think only RT and DLSS exist, AMD lost.
Yes, their basic raster performance is solid. Nobody is taking that away from AMD. But at the end of the day, being on par or slightly worse while also missing out on many useful features isn't really a win.