> I was thinking straight perf. Say a game runs at 4K 30 on PS5, you might get 150 fps on this at the same settings.

If the PC version is good, yeah.
> If the PC version is good, yeah.

I think by 2024 we will have 4K 144.
> If the PC version is good, yeah. I think by 2024 we will have 4K 144.

Why 24? 4K 120 in Q4 '22, I believe!
> Never buy AMD GPUs.

Great argument.
> Don't really need to plead my case. History does it for me.

I am sure before Zen 1 there were plenty of people saying the same thing about AMD CPUs.
So ray tracing performance is bad on AMD, therefore it will always be bad? AMD couldn't equal Nvidia for power efficiency for over six years, so what chance did they have of doing it in one generation with RDNA 2? Why should we expect AMD could overtake Intel in gaming when the 1800X was comprehensively beaten by an i5 processor?
> Why would you as a consumer not want more options?

Because fanboys and corporate shills.
> Why 24? 4K 120 in Q4 '22, I believe!

I think because we haven't seen the UE5 stuff really come out yet. For UE4, probably, but I think next year will change some things.
Isn't it funny that whenever you see benchmark comparisons from the big tech sites and YouTubers, RT games and settings are for some reason always excluded?
> 100 Tflops.... Jesus H Christ.

The only thing that will take advantage of the 100 titty flops is Star Citizen. But only because that game will never be finished.
While I'm not a PC gamer, I love to see the GPU makers pushing performance as much as they can.
But nothing will take advantage of it.
Because Intel is greedy and Raja is not trustworthy.
> I thought I wasn't supposed to care about tflops anymore.

You compare against similar architectures, so comparing AMD vs NVIDIA is pointless, but comparing RDNA 3 to RDNA 2 has merit. Until we get more information regarding RDNA 3, though, the raw numbers should be used as a vague guide and not as a direct percentage-increase comparison.
The big question is whether RDNA 3 and Ada Lovelace will reanimate the GPU mining market.

If they do, the remaining consumers are f*****.
> Right... Nvidia is stuck on a node at their foundry and has been sleeping on tech for over a decade, not bringing anything new, just like Intel did. This stupid comparison keeps popping up without any foundation for a valid comparison. Intel is already closing the gap with a few investments; it's not looking bright for Ryzen in the coming years, they poked a dragon. And this is coming from a gen 1 Ryzen buyer and now a 5600X owner.

What are you talking about? The 5800X3D is possibly the best gaming CPU, and it literally came out last week.
All this to still get wiped by Nvidia.
> Never buy AMD GPUs.
No idea why this is still a thing after RDNA 2.0. AMD GPUs are on par with, if not slightly ahead of, Nvidia GPUs in all games except those with ray tracing. Even then, Metro Exodus and Cyberpunk show the biggest differences. UE5 has ray tracing and seems to favor the AMD cards; see the Matrix benchmarks below. Only by a little, but considering that virtually everyone and their mothers are moving to UE5 next gen, it's fair to say we won't be seeing Metro- and Cyberpunk-like differences going forward, even if AMD fails to improve their RT performance, which is highly unlikely.

Here is a comparison of a 3080 12 GB and a 6900 XT, both $999 cards at MSRP. The only difference is that the 3080 12 GB consumes 100-140 watts more than the 6900 XT for 1% better average performance, even when including the Metro results.
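Just to put a number on that trade-off, here is the perf-per-watt arithmetic those figures imply (a rough sketch; the absolute wattages are illustrative assumptions built around the 100-140 W gap claimed above, not measurements from a specific review):

```python
# Rough perf-per-watt comparison implied by the claim above:
# ~1% higher average fps on the 3080 12 GB for ~120 W more draw.
# The absolute wattages are illustrative assumptions, not review data.
perf_6900xt, watts_6900xt = 1.00, 300
perf_3080, watts_3080 = 1.01, 300 + 120

eff_amd = perf_6900xt / watts_6900xt
eff_nvidia = perf_3080 / watts_3080
print(f"6900 XT perf-per-watt advantage: {eff_amd / eff_nvidia - 1:.0%}")  # ~39%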
> See the pic I posted in the thread.

I don't know what game that is, but I'd take the average of 50 games over 1 or 2 games with really bad results.
> Always wondered if 4K is worth it on PC right now, or with RTX 4xxx. Seems that 1440p 144 Hz would be better? I don't know if there's much difference between 1440p and 4K on a 27" monitor.

Big-screen gaming is getting more and more popular, especially with the 42" or 48" OLED TVs that people like me use on their desks. So a 4K 60 fps capable GPU is the absolute minimum.
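For what it's worth, the 1440p-vs-4K part of that question has a concrete answer in pixel density; a quick sketch using the standard PPI formula (nothing assumed beyond the two resolutions and the 27" diagonal):

```python
# Pixel density (PPI) of 1440p vs 4K on a 27-inch diagonal.
from math import hypot

def ppi(width_px: int, height_px: int, diagonal_inches: float) -> float:
    # PPI = diagonal pixel count / diagonal length in inches
    return hypot(width_px, height_px) / diagonal_inches

print(f'1440p @ 27": {ppi(2560, 1440, 27):.0f} PPI')  # ~109 PPI
print(f'4K    @ 27": {ppi(3840, 2160, 27):.0f} PPI')  # ~163 PPI
```

Whether that roughly 50% jump in density is visible depends mostly on how far you sit, which is exactly the sticking point with a 42-48" panel on a desk.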
> Big-screen gaming is getting more and more popular, especially with the 42" or 48" OLED TVs that people like me use on their desks.

How far do you sit from that?
> I am sure before Zen 1 there were plenty of people saying the same thing about AMD CPUs.

As someone who owned an RDNA2 card, I'm never doing that again. Probably not buying another AMD CPU either, after the USB issues with Zen 3.
> I don't know what game that is, but I'd take the average of 50 games over 1 or 2 games with really bad results.

If UE5 CPU usage doesn't get a big improvement, these new flagship cards are going to be lucky to hit 50% load because of being CPU-bound anyway. The 6900 XT already drops to under 90% load regularly in the Matrix demo (so does the 3080 Ti; this is not a shot at AMD).
Again, UE5 is going to be the standard next gen, and the RDNA 2.0 cards seem to be on par with the 30 series cards.
> If UE5 CPU usage doesn't get a big improvement, these new flagship cards are going to be lucky to hit 50% load because of being CPU-bound anyway. The 6900 XT already drops to under 90% load regularly in the Matrix demo.

Nah, I've seen several RDNA benchmarks that go over 97% and stay there. It's only when they're flying around that they drop to 70%. My 3080 remains at 98% almost all the time unless I start flying around.
> I am sure before Zen 1 there were plenty of people saying the same thing about AMD CPUs.

Yeah, sure, because AMD hadn't released a good CPU in a while, but lots of people were eagerly awaiting the Zen chips. AMD was really competitive up to about 2012.
> Right... Nvidia is stuck on a node at their foundry and has been sleeping on tech for over a decade, not bringing anything new, just like Intel did.

What happened with Intel is exactly my point. Based on their near-perfect execution up until Skylake, and AMD's horrendous execution and likely path towards bankruptcy, it would have been crazy to suggest the roles would be reversed over the coming generations. Even with Intel struggling, AMD still needed an 80% increase in IPC to draw even in productivity applications, and more in gaming, after Bulldozer went backwards.
> Yeah, sure, because AMD hadn't released a good CPU in a while, but lots of people were eagerly awaiting the Zen chips. AMD was really competitive up to about 2012.

At the low end, sure, but Phenom couldn't stand up to Nehalem/Sandy Bridge.
AD102, 144 SM (RTX 4090 Ti): 92 TF @ 2.5 GHz, 110.5 TF @ 3 GHz
AD103, 84 SM (RTX 4080): 53.7 TF @ 2.5 GHz, 64.5 TF @ 3 GHz
AD104, 60 SM (RTX 4070): 38.4 TF @ 2.5 GHz, 46 TF @ 3 GHz
AD106, 36 SM (RTX 4060): 23 TF @ 2.5 GHz, 27.6 TF @ 3 GHz
AD107, 24 SM (RTX 4050 Ti): 15.4 TF @ 2.5 GHz, 18.4 TF @ 3 GHz
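For anyone checking the math, those figures come straight from the standard peak-FP32 formula: SM count × 128 FP32 lanes per SM × 2 FLOPs per lane per clock (FMA) × clock speed. A quick sketch that matches the table to rounding (the SM counts, clocks, and product names are the rumoured figures above, not confirmed specs):

```python
# Peak FP32 throughput for the rumoured Ada lineup above.
# TFLOPS = SMs * 128 FP32 lanes/SM * 2 FLOPs per lane per clock (FMA) * GHz / 1000
# SM counts and product names are thread rumours, not confirmed specs.
def peak_tflops(sms: int, ghz: float) -> float:
    return sms * 128 * 2 * ghz / 1000

lineup = {
    "AD102 (RTX 4090 Ti)": 144,
    "AD103 (RTX 4080)": 84,
    "AD104 (RTX 4070)": 60,
    "AD106 (RTX 4060)": 36,
    "AD107 (RTX 4050 Ti)": 24,
}

for name, sms in lineup.items():
    print(f"{name}: {peak_tflops(sms, 2.5):.1f} TF @ 2.5 GHz, "
          f"{peak_tflops(sms, 3.0):.1f} TF @ 3.0 GHz")
```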
If Nvidia fixed FP32 utilisation in the Ada GPUs, then the FP32 numbers would be fake no more, and AMD would be done.
Ampere has FP32 + FP32/INT32 (half of the FP32 lanes are shared with INT32; this is where those fake teraflops come from).
If Ada Lovelace is FP32 + FP32 + INT32, then AMD is cooked.
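To make the "fake teraflops" point concrete: on Ampere, each SM pairs 64 dedicated FP32 lanes with 64 lanes shared between FP32 and INT32, so every INT32 instruction displaces potential FP32 work. Here is a toy model of that effect; the 25% INT32 instruction share is an illustrative assumption, not a measured number:

```python
# Toy model of Ampere's SM datapath: 64 dedicated FP32 lanes plus
# 64 lanes shared between FP32 and INT32. INT32 instructions occupy
# shared lanes that could otherwise have done FP32 work.
def effective_fp32(peak_tflops: float, int32_fraction: float) -> float:
    # Only valid while the INT32 work fits in the shared half of the lanes.
    assert 0.0 <= int32_fraction <= 0.5
    return peak_tflops * (1.0 - int32_fraction)

peak = 92.0  # rumoured AD102 @ 2.5 GHz from the table above
print(f"{effective_fp32(peak, 0.25):.0f} TF usable")  # ~69 TF, not 92
```

Under this model, a hypothetical FP32 + FP32 + INT32 layout (a dedicated INT32 pipe) would stop integer work from eating into the FP32 peak, which is exactly the "AMD is cooked" scenario described above.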
> How far do you sit from that?

I have a 25" gaming monitor flanking the TV, so basically it's only for movies and single-player games where I can sit further back and recline. Really the perfect setup.
> Big-screen gaming is getting more and more popular, especially with the 42" or 48" OLED TVs that people like me use on their desks. So a 4K 60 fps capable GPU is the absolute minimum.

Laughable notion that 4K and 60 fps are the absolute minimum.
> These numbers sound insane! I'll halt my upgrade to the next Mac or next PC depending on what's happening now.

Mac?! WTF bro?
> #NoFucksGivenToPCgaming

OK, fair enough. I thought there might be some MS games you have a slight interest in.
> Mac?! WTF bro?

I only use my PC for general stuff, but most importantly for video editing. The Mac Studio is compact as fuck, cool as fuck, powerful as fuck. I've had a Radeon VII PC since 2019 and never played on it; it cost me around $3400 to build.