Sweet jeez. Delicious, delicious minimums...
Define "enough"Thats because you see a lot of bias from reviewers, depending on who is doing the article. A lot of them you see mocking nvidia somewhat and championing for amd. Guru3d for example, has around 15 games tested and nvidia wins 11 i think. Sometimes by as much as 13% at 4k in Red Dead 2 or 12% at 4k in Watch Dogs Legion.
Yet the conclusion is that the 6800 XT is kinda equal to a 3080, which is demonstrably false by their own numbers.
Hardware Unboxed has an extra bias in that it completely disregards ray tracing and DLSS. He parrots the "there aren't enough games" line even though there actually are enough games as of now.
After reading a bunch of reviews, the main takeaway for me is that the next-gen consoles are not going to implement RT in a really meaningful way, and for the next 5-7 years I'm not sure how that's going to shake out on the PC landscape.
Now the goalpost move is, like, to try to ignore RT and 4K gaming.
I keep seeing this, yet the 6800 XT/6800 are quite competitive even at 4K.
I won't argue that DLSS isn't impressive, but saying "4K with DLSS" isn't actual 4K.
I think that's the catch: "with DLSS." Without DLSS, most new games at those resolutions are almost unplayable. Ray tracing hasn't been shown off in a big, meaningful way yet. On top of that, the DXR rewrite isn't 100% done, and neither is FidelityFX, which to me is open source and isn't using AI-rescaled images. It's rescaling effects, and the rendered resolutions of those effects, not the entire rendered image.
If you're telling me I can tone down ray-traced reflections to 1080p, like Sony is doing in GT7, while my final rendered image is native 4K, to me that's more compelling. And there's the fact that everything AMD has shown isn't proprietary outside of Infinity Cache; Smart Access Memory can be used by other companies if they program for it.
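To illustrate the distinction being drawn here, a minimal sketch (render_scene, render_reflections, and upscale are invented stand-ins, not any real engine or driver API): DLSS-style reconstruction renders the whole frame at a lower resolution and upscales everything, while effect-resolution scaling keeps the frame native and only renders the expensive reflection pass at reduced resolution.

```python
import numpy as np

NATIVE = (2160, 3840)   # 4K target resolution
LOW    = (1080, 1920)   # reduced internal resolution

def render_scene(res):
    """Stand-in for the rasterized scene at a given resolution."""
    return np.zeros((*res, 3), dtype=np.float32)

def render_reflections(res):
    """Stand-in for a ray-traced reflection pass at a given resolution."""
    return np.zeros((*res, 3), dtype=np.float32)

def upscale(img, res):
    """Naive nearest-neighbour upscale; DLSS would use a learned reconstruction instead."""
    ys = np.arange(res[0]) * img.shape[0] // res[0]
    xs = np.arange(res[1]) * img.shape[1] // res[1]
    return img[ys][:, xs]

# DLSS-style: everything is rendered low-res, then the whole frame is reconstructed/upscaled.
dlss_like_frame = upscale(render_scene(LOW) + render_reflections(LOW), NATIVE)

# Effect-resolution scaling: the frame stays native 4K; only the reflection pass is low-res.
native_frame = render_scene(NATIVE) + upscale(render_reflections(LOW), NATIVE)
```

Either way you're trading resolution somewhere; the argument above is just about which part of the image takes that trade.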
Radeon SEEMS more focused on just overall good gaming performance, which is what matters the most.
I'm not saying people aren't going to go buy Nvidia cards and be super impressed playing ray-traced games; I MEAN, obviously people like yourself are sold on the idea.
But to downplay a compelling product that everyone who reviewed it is talking about, after 4-5 years of AMD not competing in the high-end sector, is disingenuous.
LOLOLOLOL
Great, isn't it?
But look here...
Console RT quality on AMD
Great to see that all the reviewers don't even check if the raytracing quality is comparable. Kinda doubt they'll even correct their reviews. Professionals, ladies and gentlemen.
(Though I understand that they had to rush these reviews, shitty situation all around)
Define "enough"
They also have data on what viewers care about. Apparently ray tracing isn't it.
Basically. To quote JayzTwoCents:
"If I were nvidia, I wouldn't make the same mistake Intel made by .... letting arrogance drive your business practice. You have to acknowledge that the 6000 series is not only on your radar, but up your ass"
This does not bode well for Consoles, especially for Xbox...
So if you also have the newest AMD CPU, these GPUs are a bit better than what Nvidia has if you hate RT, only game in 1440p, don't want to make use of any bonus ML features, and haven't heard of DLSS. Nice.
You were always arguing for both RT and for 16GB of RAM.
But the PS warriors will say we are just at the start of a generation and the developers will find ways to make the next Spider-Man MM do even more RT and higher-res textures despite the hardware limitations we see. This is the main issue I have with people on these boards. They need to ALWAYS consider what the hardware can actually do, what its true limit is, and then they can have realistic expectations. Trust me when I say we are already seeing the max of what the consoles can do with RT right now at release.
And yet people are upset at this situation. No matter which brand you prefer, people should be happy that the landscape is more competitive than it has been since... the 290X?
You mean PlayStation 5, right? Since every compute unit on RDNA 2 has a Ray Accelerator that's used for ray tracing, and the PS5 has 36 of those while the Xbox has 52.
Well, we've yet to see die shots and what kind of additional stuff was done to the APUs. And for both of them, the RT performance is downright embarrassing if you look at Linus Tech Tips.
Doubt it, seeing how people are going nuts for Spider-Man's ray tracing.
Who's going nuts? I haven't seen anything to suggest anyone really gives that much of a shit about it. I also doubt many are giving up the performance benefits when it is SOOOO much better in performance mode.
If the 3080 had 16GB, it would be the card I would prefer. Not because of ray tracing, but because I trust Nvidia's drivers a bit better. As I said earlier, Nvidia doesn't have a perfect track record with drivers either, but it's still pretty good. Bad drivers supposedly really hurt the 5700 XT in ways that didn't show up in reviews. As someone who dealt with software issues, crashes, and bluescreens over the summer, I can understand that frustration. I assumed the culprit was the AMD CPU. However, it turned out that Avast and Malwarebytes weren't playing nice with my system. Once I uninstalled them and switched to ESET, the issues went away.
Absolutely great review from Linus,
Basically says he's "not disappointed," but that the price point is wrong due to the lack of features, but A+ for effort. He calls out the missing or badly implemented features, like AMD's horrible media encoder quality, the lack of comparable streaming tools, the horrible RT performance, and the sub-par professional tools performance. But it's still great to see AMD back in line with Nvidia on most things, and it gives us great hope for RDNA3. I feel like he said in a political way what the rest of us have said: AMD is asking too much for what they are giving due to the lack of features; they should be $100-$150 cheaper than Nvidia's offering. Which is why I said, why would you buy this card at this price when for $50 more you'd get so much more?
Maybe the competition from AMD will force the 3090 price down, or a 20GB 3080 Ti out sooner? I can fully understand the heartache about 10GB; I avoided the 3080 and went with a 3090 for that exact reason.
Waited in line outside my local Microcenter. There were over a hundred people in line (some there since yesterday). Was then told the store had 2 6800 XTs available, but that another truck was coming as soon as they opened. That truck arrived with 5.
Maybe RDNA3 will be Radeon's Zen3 moment.
Of course they're competing with nVidia, not Intel, so the game is a bit different.
Nvidia is many times more competent than Intel, so I fully expect them to respond. Nvidia, as best I can tell, still has a hunger to innovate, while Intel seems to have laid off all their best engineers.
It's a known issue. Frankly, reviewers shouldn't have included the result at all, because it reflects badly on them first of all.
Because either is available right now? If anything, AMD had more of a paper launch than Nvidia. B&H just announced they got NO stock at all.
From reddit
Why is everyone praising the 6800 XT when it's failing in nearly every category compared to the RTX 3080 with ray tracing enabled? lol
Maybe because there are a whopping 20 RT games, including ones not yet released, and ones in which people don't enable it because it barely matters visually but greatly impacts performance, no matter what GPU they have?
Or maybe because the UE5 demo has shown hardware RT isn't even needed:
Ready to be disappointed at the lack of a 6700/6700 XT.
Why on earth would you expect them to be revealed today? Q1 2021.
Yeah, why have dedicated hardware ray tracing when you can spend half your rasterization budget on something that's almost as good as one type of ray tracing and won't be out until SOON?
Why stop there and call it "half your rasterization budget"? Pulling numbers out of the place you pulled these from, you can be more generous!
What about the fact that it looked like it needed to run at 1440p/30 when compared to a real game like Demon's Souls?
What about running that visually stunning demo on a 36 CU GPU?
Something that would be interesting to see, for both AMD and Nvidia, is performance at 3440 x 1440 ultrawide resolution.
Seeing as AMD is slightly ahead at 1440p and Nvidia is slightly ahead at 4K, I wonder whether they would be roughly equal at the somewhat in-between 3440 x 1440?
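As a rough sanity check on the "in-between" part (simple pixel arithmetic, not benchmark data), 3440 x 1440 does sit between the two, though closer to regular 1440p than to 4K:

```python
# Rough pixel-count comparison of the three resolutions (arithmetic only, no benchmark data).
resolutions = {
    "2560x1440 (QHD)":       2560 * 1440,  # ~3.7 MP
    "3440x1440 (ultrawide)": 3440 * 1440,  # ~5.0 MP
    "3840x2160 (4K)":        3840 * 2160,  # ~8.3 MP
}
for name, pixels in resolutions.items():
    print(f"{name}: {pixels / 1e6:.1f} megapixels")
```

That's about 34% more pixels than 2560 x 1440 but only about 60% of 4K, so you'd expect results to land somewhere between the 1440p and 4K charts, probably nearer the 1440p side.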
Ultrawide master race, assemble and let us know if you find benchmarks!