> Okay so I want to get a new GPU now that I can, for under a thousand dollars. What should I get, a 6900 XT or a 3080 12GB?
Do you care about ray tracing?
Well, isn't that the question? lol. I've been looking at recent comparisons and the 6900 beats the 3080 in a lot of non-RT games by a pretty good margin, but then there is ray tracing. The thing I see is that the 6900 is always about 20 fps behind in RT games, but the games that tank it below 60 usually run badly on the 3080 as well. There are only a few cases where the 3080 is right at 60 with the 6900 at 40.
I don't think the current implementation of ray tracing is that meaningful, so I'd go with the 6900 (especially for the VRAM). I'm pretty sure that by the time ray tracing is a real game changer, our GPUs will be useless. For now it's really just fancy reflections and nothing more, worth it in maybe 1 or 2 games... naturally IMHO. Other than that, do you need NVENC?
3080 12GB. The 6900 XT is a great fucking card, but ray tracing is going to be pretty much standard in next-gen games when they do arrive. The UE5 Matrix demo with ray tracing seems to do fine on AMD cards, but we don't know if Avatar and Star Wars Fallen Order 2 will behave the same way.
That is a tough question these days. I convinced a friend of mine to buy the 3080 12GB (which he did) because it was on sale for $770. At that price, it is hard to justify a 6900 XT, even if the 6900 XT does beat it in rasterization by quite a bit. Price per frame, the 3080 12GB is, I think, hands down the best value aside from maybe a 6600 XT at the moment. That is, of course, if you are dead set on buying a card right now. If this were a few months ago? 6900 XT, since it was the first card to hit MSRP and even go under for the most part. Rasterization is still king, and the 6900 XT > 3080 12GB in that regard.
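For anyone wondering, "price per frame" is just the card's price divided by its average FPS across a benchmark suite. A minimal sketch of the arithmetic; the prices and FPS numbers below are hypothetical placeholders, not real benchmark results:

```python
# "Price per frame" = price / average FPS; lower is better value.
# Prices and FPS figures here are made-up placeholders, not real benchmarks.
cards = {
    "RTX 3080 12GB": {"price": 770, "avg_fps": 100},
    "RX 6900 XT": {"price": 950, "avg_fps": 110},
}

for name, c in cards.items():
    dollars_per_frame = c["price"] / c["avg_fps"]
    print(f"{name}: ${dollars_per_frame:.2f} per frame")
```

With these placeholder numbers the 3080 12GB comes out at $7.70 per frame versus $8.64 for the 6900 XT, which is the kind of comparison being made above.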
> I'm of the opinion: grab the 6900 XT. It's the same money (or cheaper) as an RTX 3080 10/12GB and better. Sure, Nvidia is winning the ray tracing race as of now, but the 6900 XT just straight brute-forces it, and it appears FSR 2.0 can stand up to DLSS, so you'll have gains there (once there's more universal support, of course). You'll be in good hands if you get either card, but again, why not get the more powerful one, AND it's usually cheaper?
I would also imagine more FSR 2.0 titles with RT in the future, and you'll get roughly the same perf or better.
This is exactly what I did, for all the reasons you mentioned.
Planning on this if I'm underwhelmed with the RDNA3 cards.
Love my 6900 XT.
I would sit out RDNA3 and the next CPU gen. No point in upgrading every gen.
Definitely wait. I have a 12600K, a 165Hz monitor, and a 6800 XT. I can USUALLY get that little bit extra from OCing with a little undervolt. I haven't been playing Warzone, but I'm definitely gonna be playing MW2 2022, and RDNA3 likely won't be out by then anyway. The 5800X3D would still require waiting to see if it is worth it.
Plus AMD has been killing it with their drivers.... albeit the situation is weird since their best drivers are OPTIONAL and have to be manually downloaded.
Both CPUs can get 180+ FPS in warzone
So I imagine any limitations you would have would be your GPU and settings. I'd wait for it to come out and run your own benchmarks and compare it to others.
AMD GPUs are not great in Warzone. They can get higher highs and a higher average than an equivalent Nvidia card, but the 1% lows are much worse and overall the frametimes are not as smooth, so despite a higher average it just doesn't feel as smooth. Just check the RTSS frametime graph in that 2nd 6800 XT video: a lot of spikes. For comparison, on my 12900KS and 3090 the frametime graph is pretty much smooth; I can take a video tonight. Their CPUs also suffer from lower lows compared to their Intel equivalents, but not to the extent of their GPUs.
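For context, the "1% lows" being discussed are usually derived from per-frame times: take the slowest 1% of frames and convert their average frame time back to FPS. Tools like RTSS differ in exact methodology, so treat this as a rough sketch with made-up frame-time data:

```python
# Compute average FPS and an approximate "1% low" FPS from frame times (ms).
# The frame-time data is invented: a mostly smooth run with a few spikes.
frame_times_ms = [8.3] * 990 + [25.0] * 10

def avg_fps(times_ms):
    # Total frames divided by total time (converted from ms to seconds).
    return 1000 * len(times_ms) / sum(times_ms)

def one_percent_low_fps(times_ms):
    # Average the slowest 1% of frames, then convert that back to FPS.
    worst = sorted(times_ms, reverse=True)[: max(1, len(times_ms) // 100)]
    return 1000 * len(worst) / sum(worst)

print(f"average: {avg_fps(frame_times_ms):.0f} fps")
print(f"1% low:  {one_percent_low_fps(frame_times_ms):.0f} fps")
```

With this data the average comes out around 118 fps while the 1% low is 40 fps, which is exactly the "high average but spiky" pattern described above.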
The x3D is great but since you already have a zen 3 chip, maybe not. It will be a big jump in many games but 5600x is still very good.
If you had an older Ryzen and hadn't already jumped to 5600x I would say for sure get the x3D. But just wait for the next Intel socket and for am5 to mature. I agree with Kenpachii .
I upgraded from a 1600af to an x3D which is huuuuge.
Stay with the 5600X unless you're certain that you're CPU-bound. As for the GPU, I'd wait for some real benchmarks, or maybe preorder on Amazon or somewhere with an easy refund policy. I won't go with DDR5 for at least another 24 months... top DDR4 still beats DDR5, without even considering the price/perf advantage. You have a very high-end build; I'd wait for some real evidence that any upgrade would be useful.
why would you need anything above a 5600X? do you use it as a workstation? because unless you are an absolute pro CS:GO or Valorant player you don't need anything better than a 5600X
> Get a 3080 for DLSS.
I have gotten some bad ghosting on DLSS version 2.4.
It differs from version to version. It's usually not an issue at all.
> Now there are rumors saying the 4070 will only have 10GB.
Saw that rumor yesterday. I imagine it is just a rumor despite coming from a trusted leaker, especially since 10GB isn't enough for even a handful of titles, which is 1 too many titles for such a premium card. Far Cry 6 is an example of this with the HD textures and maxed graphics; the regular 3080 chugs along. Agreed, I would rather go with RDNA3 over Lovelace (I think that is what it is called) if that were to be true.
If true it's out of the question for me. I'm not going from 12gb vram to 10. Nvidia deserves to get rocked by amd if true.
Not gonna lie if there is no 12gb card from Nvidia on 4070/60/60 ti I am probably going to get a sapphire rdna3 gpu.
If they are the same price as each other and you don't care about ray tracing (most people realistically don't; I am one of those), then the 6900 XT would make more sense than the 3080 12GB.
> I would like to think Nvidia aren't THAT stupid to make a 10GB 4070 but, hey, they did make an 8GB 3070 :/
They might do that to milk people with the higher-VRAM high-end models.
This time I think it's going to bite them. RDNA3 rumors are looking strong.
Although it's not a problem yet with the 3080. I had one crash with Resident Evil 2 remake RT when changing settings too much.
RDNA, as powerful and nice as it is, does not have DLSS and a few other Nvidia features.
Crypto is going down again so they won't be able to sell anything instantly no matter how much of a turd it is.
If it weren't for crypto/shortages they never would have released 3000 series with so little vram.
DLSS is overrated in my experience. I do need to fiddle with the different versions between 2.3 and 2.4, though. If I can't get rid of the ghosting, I will start shitting on DLSS all the time here lol.
Sadly, the amd counterpart is very bad looking in motion
> I got a 6650 XT. I would have gotten a regular 6600 XT, but they were at the same price, so I figured why not.
What brand/model card did you get?
I wonder if 8GB is enough for games in the future at 1440p, though.
I can't do native IQ anymore. I must have something. TAA is usually pretty good, but it can shimmer weirdly.
Especially if you want the crispest image, you don't want DLSS, or TAA. For example, native 4K with MSAA/SMAA 1x is what you want for the sharpest image.
We are exact opposites here as well. I always prefer sharpness over a lack of jaggies.
For example, Death Stranding on PC has this bad foliage TAA shimmer. You rotate the camera and the bushes go crazy... and it does not happen on PS5.
But DLSS is amazing for the most part, especially when you put in the right version.
I have tried DLSS and FSR. I really don't like either temporal solution; there is always something off about them. Native is the only way I can play games. MAYBE this experience changes at 4K, but I doubt it. Either way, using powerful GPUs like the 6800 XT/3080 should not really require these solutions. They are better suited to the less powerful GPUs, but they can surely be used to push frames if you so wish, of course.
> Is it worth it to buy a 1080p 144Hz monitor in 2022, or would 1440p 144Hz be better? For an RTX 3060.
TBH, for a 3060 I think 1080p 144Hz would make more sense.
I've not tried THAT many DLSS games, but Death Stranding is the best looking I've seen.
Think of getting better specs on a monitor as a form of future-proofing yourself. That monitor will likely be fine, but the industry is trying really hard to move towards 4K. It is struggling with the consoles lacking the power, which of course means squeezing more life out of 1080p, but you will likely get more value out of a 1440p 144Hz monitor, especially when upgrading down the road. Plus 1440p looks much better than 1080p imo.
> Rumors about the RTX 3080 12GB having production halted.
Yeah, the 3080 Ti's price tanked, and being so close to the 12GB variant in price and perf, it seems they stopped making it.
Makes sense. Crypto is on life support and the 3080 Ti dropped to $1K. Nobody (unless uninformed) would buy the 12GB when the 3080 Ti is the same price or cheaper.
> When can we expect some kind of Nvidia presentation of the new cards? I'm hyped for the 4000 series. I think I will look for a new power supply later today just to get my rig ready.
July/August, with a September release hopefully.
At this rate 5000 series will most likely come with its own external power supply and wall plug.
> My RTX 2080 MSI Ventus V2 has just died playing Plague Tale; I was really enjoying it. Now it's either wait for the RTX 4090 or just bite on an entirely new system with an RTX 3090.
Sorry to hear your 2080 died. Please don't spend over a thousand on a GPU that will be superseded in like 3 months. Wait for that 4090, and if you need a right-now GPU, grab one of those hold-me-overs.