RX, XT, X: the naming is all over the place.
Is my 3700X still relevant, or did it become obsolete crap today?
That looks very impressive. I can't wait to see the new Threadrippers.
Intel's "look at me" blog post from yesterday about Rocket Lake feels lame by comparison. The review embargoes for Ryzen 5000 can't be lifted fast enough.
sexy
Thank you to Microcenter for always making CPUs their loss leader.
The one thing I will say for Intel is that they don't have that much of a performance delta to make up.
While true, I am glad AMD is finally giving them some competition. I have felt no reason to upgrade my 2600K until now.
my favorite part of living in Dallas is finally having a Micro Center nearby
So you guys reckon the 3800X or 3900X will go down in price now?
Also, will the 3800X be good enough for gaming for around 3-4 years?
Seems slower than the RTX 3080? I checked TechPowerUp's 4K benchmarks of Gears 5 and Borderlands 3 with the allegedly slowest 3080 (the Zotac Trinity). Price will be the deciding factor, I guess.
How the hell did Zotac manage to be SLOWER than the FE???
> TPU tests Borderlands 3 at Ultra DX11, which gets higher frame rates than Bad Ass DX12, so it is not comparable.

Indeed. Assuming AMD used the in-game benchmark for BL3, we can use Eurogamer's 3080 review as an acceptable-ish comparison: DX12/Bad Ass/TAA.
> Indeed. Assuming AMD used the in-game benchmark for BL3, we can use Eurogamer's 3080 review as an acceptable-ish comparison: DX12/Bad Ass/TAA.

Also, are those benchmarks using a 5950X CPU?
They mention 65 above and to the right of the line chart but 60.9 as the mean (I think?) for the bar graph.
Honestly that is pretty good considering how comparatively strong the 3080's performance is at 4K vs 1080p & 1440p.
If Big Navi's CUs aren't similarly underutilized at lower resolutions this could turn out closer than rumours would suggest.
Way too early to take this with anything but a mine's worth of salt. It's just one game. It's much better to get apples-to-apples comparisons across a variety of games from one outlet (and then check against other outlets to avoid outliers).
Wait, isn't this the Zen 3 thread anyway?
> How the hell did Zotac manage to be SLOWER than the FE???

You're not familiar with Zotac, are you? This is normal for Zotac. Their stuff is cheap, sure, but never try to figure out why; you won't like the answer.
> I'll say it again, just imagine Intel on the 10nm process if they're keeping up with AMD on 7nm. AMD at 5nm will get destroyed, as they are only now finally faster than Intel at 14nm.

Yeah. Keep imagining. Because that's the only place where that's happening.
> So you guys reckon the 3800X or 3900X will go down in price now? Also, will the 3800X be good enough for gaming for around 3-4 years?

It will be good. Whether it will have long legs like Intel processors remains to be seen. Of course, Intel could blow the doors off if they have another Core 2 moment.
> Also, are those benchmarks using a 5950X CPU?

There should be an absolutely minimal CPU bottleneck at 4K 60-70 fps, regardless of whether a 10900K or 5950X is used.
> Yeah. Keep imagining. Because that's the only place where that's happening.

Just like how Big Navi was supposed to overtake the 3080? That is where my imagination is at.
> Just like how Big Navi was supposed to overtake the 3080? That is where my imagination is at.

The performance shown is on par with an RTX 3080, and we don't know if it's the full die with zero CUs disabled...
Anyone in the know care to provide some insight into the possible benefits of the Ryzen 7 5800X (8 cores/16 threads) vs the Ryzen 5 5600X (6 cores/12 threads), specifically for gaming? They both have the same 32 MB cache pool. The 5800X will have 100 MHz higher base and boost clocks. Are the extra cores worth the extra money and the need to purchase a third-party cooler?
Also, don't forget they promise an FPS uplift in games because of Zen 3, so you have to add that to the 3080 numbers too, as those will also get an uplift with these CPUs. So it seems the GPU slots in somewhere between the 3070 and 3080, as expected. I wonder how ray-tracing performance will be, though, and whether they have any kind of DLSS competitor.
The Xbox Series X and PS5 both have 8-core CPUs, so as the next generation matures, I would imagine most multiplatform games will be designed with that as the norm.
I don't see why you wouldn't go there unless you're really on a budget. In the short term, 6 cores is perfectly fine, but if you want to future-proof, 8 cores are safer.
Like others have said, just save a bit more and get at least an 8-core CPU.
AMD is using a superior process, but it's not as far ahead as the naming implies (7 vs 14 nm; and yes, the 10th-gen parts are 14 nm).
As examination of the chips revealed, AMD's transistors are 22x22 nm, while Intel's are 24x24 nm.
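Running the numbers on those figures (which are the poster's claim, not official foundry specs) shows how small the real gap is; a quick back-of-the-envelope sketch:

```python
# Back-of-the-envelope density check using the transistor footprints
# claimed above: 22x22 nm for AMD/TSMC "7nm", 24x24 nm for Intel "14nm".
# Note: these dimensions are the poster's claim, not verified specs.
amd_area_nm2 = 22 * 22      # 484 nm^2 per transistor footprint
intel_area_nm2 = 24 * 24    # 576 nm^2 per transistor footprint
ratio = intel_area_nm2 / amd_area_nm2
print(f"AMD footprint advantage: {ratio:.2f}x")  # ~1.19x
```

So by this measure the density advantage is roughly 19%, nowhere near the 2x that "7 vs 14" would suggest.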
Doesn't that Zen 3 CPU uplift generally only apply at lower resolutions like 1080p?
If they're showing 4K, then I'm expecting there won't be as much of a difference (hence why they chose to show games running at 1080p for the CPU and 4K for the GPU).
Depends on the game.
AFAIK, AC: Odyssey, for example, shows about a 10 FPS difference between an i9-10900K and a 3900X on an RTX 3080.
So it really comes down to how the rendering pipeline is built. Many games will show no difference because, as you said, you should be GPU-bound, but unfortunately there are just as many games out there that are to this day not very well made and will sadly benefit from a faster CPU even at 4K.
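The CPU-bound vs GPU-bound argument can be sketched with a toy frame-time model (purely illustrative; the millisecond figures below are invented, not benchmark data):

```python
def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Toy model: each frame must clear both the CPU and GPU stage,
    so the slower stage sets the frame rate."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# GPU-bound case (typical at 4K): a faster CPU changes nothing.
print(fps(cpu_ms=8.0, gpu_ms=16.0))   # 62.5 fps
print(fps(cpu_ms=6.0, gpu_ms=16.0))   # still 62.5 fps

# CPU-bound case (a poorly threaded engine): the CPU caps fps even at 4K.
print(fps(cpu_ms=12.0, gpu_ms=10.0))  # ~83.3 fps
print(fps(cpu_ms=9.0, gpu_ms=10.0))   # 100.0 fps
```

In the first pair the GPU dominates, so the CPU upgrade is invisible; in the second, shaving CPU time raises the frame rate directly, which is why some games see Zen 3 gains even at 4K.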
> So it looks like the 5000 series is only slightly better than a 10900K in gaming? That's pretty disappointing.

Disappointing? Look at how the 3000 series compared to the 10900K in gaming, and then look at the 5000 series. The 5000 series is a massive increase in performance and overtakes Intel in the only area where they were leading, as AMD already leads in pretty much everything else. Now Intel doesn't even have "best gaming performance" to fall back on.
> I'll say it again, just imagine Intel on the 10nm process if they're keeping up with AMD on 7nm. AMD at 5nm will get destroyed, as they are only now finally faster than Intel at 14nm.

That's not how it works, buddy. There's a reason Intel has been stuck on 14nm for so long.
Yeah, that's a good point. Of course, AMD, Intel, and NVIDIA will all cherry-pick games and applications that show them in a good light.
What's interesting is that AMD essentially had to refine 7nm to beat Intel's 14nm(++++++). If you think of it that way, then when Intel finally moves to a smaller node, they may start to rip Ryzen to pieces. AMD might pull some more rabbits out of the hat in the meantime, but I would probably respect AMD more if they beat NVIDIA rather than Intel.
Beating Intel at this point is like beating a has-been one-legged guy in a 100m sprint, whereas beating NVIDIA is basically beating Usain Bolt.
> Doesn't that Zen 3 CPU uplift generally only apply at lower resolutions like 1080p? If they're showing 4K, then I'm expecting there won't be as much of a difference (hence why they chose to show games running at 1080p for the CPU and 4K for the GPU).

Some games are very CPU-bound even at higher resolutions, e.g. Flight Simulator 2020. In general I agree with you, but it's not guaranteed to apply to all games.
> Yeah, that's a good point. Of course, AMD, Intel, and NVIDIA will all cherry-pick games and applications that show them in a good light.

AMD has been pretty decent on that side.
> Disappointing? Look at how the 3000 series compared to the 10900K in gaming, and then look at the 5000 series. The 5000 series is a massive increase in performance and overtakes Intel in the only area where they were leading, as AMD already leads in pretty much everything else. Now Intel doesn't even have "best gaming performance" to fall back on.

Did our unbiased Leo make an "AMD, New King in Gaming" thread then? No? But why?
The price increase is a bit rough, but Zen 3 is better than Intel in every other metric.