
AMD Ryzen 7000: 5 nm desktop CPUs coming September 27th

Rickyiez

Member
Yes, this is correct but this is not a fair benchmark either.

The highest number they were able to get at full HD is a 35% uplift. This is a joke, because when this goes to 4K we are down to a 20% uplift if we are lucky. This is the highest, mind you.

For a generational leap? This is nothing but a joke. Honestly, the 5800X3D is probably as fast, or close to it, in gaming benchmarks as this so-called high-end processor.

Seriously, what a joke. Not impressed one bit. They just wanted to go DDR5 with new motherboards and CPUs.

They can barely beat the 12900K. When the 13900K is here, this will get destroyed and have the floor wiped with it by Intel. Which is sad, really.

$299 beats a $569 chip, yeah, not impressive 🤷‍♂️

You're going to be GPU bound at 1440p?
Not really. Even in 4K you want every bit of CPU speed available, because higher-end TVs nowadays can all do 120Hz.

EDIT
I'm talking about the leap in performance from my perspective; probably my bad for not including it. Going from a 3700X to a 7700X would easily yield me a 15 FPS increase or more, even at 4K, provided I'm not totally GPU bound.
 

lachesis

Member
I'm up for upgrade later this year - so closely looking at it between Zen4 and Raptor Lake.
If the performance is similar, I'm thinking about Zen 4, because it sounds like I can at least upgrade to Zen 5 on this new motherboard in a couple of years, while I don't think the Meteor Lake socket would be compatible with Raptor Lake...

I've never built one with an AMD chipset, so I do have one question: how much of a performance difference was there between early AM4 boards and late AM4 boards? (And would I be able to expect a similar result for AM5?)
I'm looking at some charts on X570 vs X470 vs X370 AM4 chipsets, assuming the X370 came out early and the X570 came out later in the lifecycle... I see some PCIe differences and how many USB/SATA ports there are, but will an older board somehow hinder the performance of the newest chip?
 

Crayon

Member
Waiting for their GPU Zen moment.

I hope. I'd rather buy AMD, and they sorta kinda caught up with RDNA2, but that ray tracing performance... Fine for people who don't care, but the gap is huge.

RDNA 3 has to have a huge jump in RT performance. There was some rumor about matrix accumulate mentioned somewhere, so maybe they've got some dedicated hardware bits in there. I'm guessing the 4000 series will still be ahead, but AMD has to close that gap. I'm not THAT loyal.
 

draliko

Member
If you're on Zen 3 it's best to wait for Zen 5 and second-gen mobos for sure; from DDR costs to real gaming perf, nothing really needs more than a 5600/5800X3D. This is a good price/perf CPU, and we'll need to see the complete package (DDR, mobo, CPU) price vs Intel's new gen to really draw conclusions. If I hadn't recently upgraded to a 5600X I could consider a new build for Christmas... anyway, go team red
 

Rubytux

Neo Member
If you're on Zen 3 it's best to wait for Zen 5 and second-gen mobos for sure; from DDR costs to real gaming perf, nothing really needs more than a 5600/5800X3D. This is a good price/perf CPU, and we'll need to see the complete package (DDR, mobo, CPU) price vs Intel's new gen to really draw conclusions. If I hadn't recently upgraded to a 5600X I could consider a new build for Christmas... anyway, go team red

I just built a 5600X PC last year, paired with an RX 6600 XT, 16GB of RAM and a 1TB M.2.

I am going to play mostly at 1080p, but I am expecting more of an upgrade in the GPU area. Hopefully the 4070 is under $500.


Nice job by AMD, but I am not going to upgrade the CPU/MB and RAM at the moment.
 

supernova8

Banned
I hope. I'd rather buy AMD, and they sorta kinda caught up with RDNA2, but that ray tracing performance... Fine for people who don't care, but the gap is huge.

RDNA 3 has to have a huge jump in RT performance. There was some rumor about matrix accumulate mentioned somewhere, so maybe they've got some dedicated hardware bits in there. I'm guessing the 4000 series will still be ahead, but AMD has to close that gap. I'm not THAT loyal.

Yeah, I was going to make the argument that ray tracing is still not quite mainstream enough for me to care about, but even so, assuming the GPUs all end up being similarly priced, you would want RDNA3 to absolutely bury RTX 40 in general (i.e. non-ray-traced) GPU performance to compensate for the (probably) comparatively bad ray tracing; otherwise it's picking the worse option just to "stick it to the man".
 

draliko

Member
I just built a 5600X PC last year, paired with an RX 6600 XT, 16GB of RAM and a 1TB M.2.

I am going to play mostly at 1080p, but I am expecting more of an upgrade in the GPU area. Hopefully the 4070 is under $500.


Nice job by AMD, but I am not going to upgrade the CPU/MB and RAM at the moment.
I got a good deal on a 3080 12GB, but honestly I think I'll return it and wait for the new gen; having both consoles covers my gaming urge, but I don't know why, I really miss PC gaming and always feel the urge to get back to it 😅😁
 

//DEVIL//

Member
$299 beats a $569 chip, yeah, not impressive 🤷‍♂️


Not really. Even in 4K you want every bit of CPU speed available, because higher-end TVs nowadays can all do 120Hz.
Are you new to this?

Please do some research before you post... you should know that even between the 5600X and the 5900X, when it comes to gaming, there is no performance difference worth the $300 price-tag difference. Not even 10 frames at 4K.

The same goes for Intel. For example, the i5-12600 and the i7-12700 in gaming: the performance is about the same, yet there is a big price difference.

The difference between these CPUs is bigger in content creation. But that is not our topic here, since we are focused on gaming.

When you look at the slide in the OP about the performance gain between both high-end CPUs, the 12900K and the 7950X, in gaming... then refer back to my previous post. It's not impressive or even close to being good in terms of gaming. They are not even close to 13th-gen Intel at this rate.
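To illustrate the GPU-bound point with a toy model (made-up numbers for illustration, not benchmarks): the frame rate is set by whichever of the CPU or GPU takes longer per frame, so once the GPU dominates at 4K, a faster CPU barely moves the average.

```python
# Toy frame-time model: the slower of CPU and GPU work sets the frame rate.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000.0 / max(cpu_ms, gpu_ms)

# Hypothetical per-frame costs in milliseconds (illustrative only):
old_cpu, new_cpu = 8.0, 6.0        # newer CPU does its work ~33% faster
gpu_1080p, gpu_4k = 5.0, 14.0      # GPU cost balloons with resolution

print(fps(old_cpu, gpu_1080p), "->", fps(new_cpu, gpu_1080p))  # CPU-bound: big gain
print(fps(old_cpu, gpu_4k), "->", fps(new_cpu, gpu_4k))        # GPU-bound: no gain
```

Real games overlap CPU and GPU work, so this is only the shape of the argument, not a simulation.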
 

AndrewRyan

Member
Been waiting for a good price on the 5800X3D. $400 at Micro Center. Guess I'll wait a little longer for the benchmarks to see if it's worth upgrading the motherboard.
 
Looks like I'll be building a completely new rig before Starfield comes out, but these total platform costs are racking up pretty fast. I'm currently on my old 7700K + 2070S combo, but for the new setup I'll have to ditch all of my current parts, which is a significant leap over just replacing the mobo/CPU/cooler combo I was contemplating before. I'm thinking I'll go for a decent board with future-proofing and a 7600X, so there'll be a good upgrade path eventually. Unless Intel comes up with a better offer. If they keep supporting DDR4 with their new platform, that just might keep me in their camp, if it still offers similar future-proofing and price. I should probably just wait until SF actually comes out, but the upgrade itch is real. :messenger_savoring:
 
Are you new to this?

Please do some research before you post... you should know that even between the 5600X and the 5900X, when it comes to gaming, there is no performance difference worth the $300 price-tag difference. Not even 10 frames at 4K.

The same goes for Intel. For example, the i5-12600 and the i7-12700 in gaming: the performance is about the same, yet there is a big price difference.

The difference between these CPUs is bigger in content creation. But that is not our topic here, since we are focused on gaming.

When you look at the slide in the OP about the performance gain between both high-end CPUs, the 12900K and the 7950X, in gaming... then refer back to my previous post. It's not impressive or even close to being good in terms of gaming. They are not even close to 13th-gen Intel at this rate.

This!

I'm always surprised that people fall for such marketing ways of presenting performance... They ALL do it, and every time it's the same "story"... The only important thing we really need to take into account from this graph is the fact that Zen 4 has closed the gap with Intel's 12th gen, which is a really good thing, nothing more.

https://www.guru3d.com/articles_pages/amd_ryzen_7_5800x3d_review,22.html

In that review, you can see what //DEVIL// is talking about.
 

Panajev2001a

GAF's Pleasant Genius
They said they would support it at least till 2025+, and based on some already-known info, Zen 5 will be on AM5 as well.

Turin? That’s my birth city, wonderful :).
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Need to wait for the X3D variants to really get excited.
All the RPL leaks soured this reveal.
Hopefully the reason there is no 7800X is because it's been reserved for the 7800X3D.

$300 for a 6 core CPU? Isn't that a bit pricey, especially if I look at prices for their 8 core models?
The 5600X basically set the precedent for that pricing.
The 3600X and 3600 were AMD's golden children, perfectly priced with perfect performance.
Intel still has that in their $200-and-under 12400; assuming the 13400 is similarly priced, it will be the generation's bang-for-buck gaming CPU.

This is insane. Wow.
I can't see myself upgrading from my 5800X any time soon, but that's absolutely insane. $299 for the same gaming performance as the best non-3D-cache CPU currently available.
AMD isn't letting Intel breathe.
Waiting for their GPU Zen moment.
Raptor Lake is in a few weeks.

It was all about who revealed first; whoever revealed first could show off how much better they are than the competition's old chip.
If Raptor Lake came first, AMD's slides would be very different.
AMD needed to show us their 7000 series vs the 5800X3D in gaming if they wanted to impress us.

What? Barely beating the 12900K? You know they are comparing it to the 7600X, the cheapest and slowest of their new CPUs
Intel will do the same.
They will compare the 13600K to the 12900K.
And from the leaks it will be faster.
If Intel has time before the reveal, they could prepare slides showing the 13600K beating the 7950X.


You wanna know how petty Intel is?
They are doing their reveal event the same day Ryzen 7000 is supposed to be released, likely when the embargo is up, so the news cycle is nothing but 13th Gen Core coverage. Reviewers are gonna be working overtime getting the Ryzen reviews ready and the 13th Gen reviews ready... all while sweating to also do the inevitable VS articles.
 

GymWolf

Gold Member
I usually prefer going with Intel, but if they don't show something before the 4000 series is out, I'm forced to go with AMD this time.
 

FireFly

Member
Yes, this is correct but this is not a fair benchmark either.

The highest number they were able to get at full HD is a 35% uplift. This is a joke, because when this goes to 4K we are down to a 20% uplift if we are lucky. This is the highest, mind you.

For a generational leap? This is nothing but a joke. Honestly, the 5800X3D is probably as fast, or close to it, in gaming benchmarks as this so-called high-end processor.

Seriously, what a joke. Not impressed one bit. They just wanted to go DDR5 with new motherboards and CPUs.

They can barely beat the 12900K. When the 13900K is here, this will get destroyed and have the floor wiped with it by Intel. Which is sad, really.
On average, the 12900K was around 12% faster than the 11900K in 1080p gaming. The 11900K was around 1% (!) faster than the 10900K. If Raptor Lake manages even a 10% boost it will be impressive, given that it is on the same process with no major architectural changes.
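Those gen-on-gen numbers compound multiplicatively, so here's a quick back-of-the-envelope using the figures above plus a hypothetical 10% Raptor Lake gain (an assumption, not a leak):

```python
# Chain gen-on-gen gaming uplifts multiplicatively.
gains = {
    "11900K over 10900K": 0.01,  # ~1%, per the post above
    "12900K over 11900K": 0.12,  # ~12%
    "13900K over 12900K": 0.10,  # hypothetical Raptor Lake boost
}

total = 1.0
for label, g in gains.items():
    total *= 1.0 + g

print(f"Stacked uplift over the 10900K: {(total - 1) * 100:.1f}%")
```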

Then the 7800X3D is going to arrive...
 

GymWolf

Gold Member
Intel's event isn't until the 27th, when the 7000 series officially launches. I'd personally still tell you to wait and compare both before just jumping out there.
I'm not gonna buy anything until I have a 4080 at vanilla price in my cart, so yeah, waiting is not a problem. I thought the Intel event was still far off, but the 27th is pretty close.
 

GreatnessRD

Member
I'm not gonna buy anything until I have a 4080 at vanilla price in my cart, so yeah, waiting is not a problem. I thought the Intel event was still far off, but the 27th is pretty close.
I feel you. Nvidia and AMD would have to blow me completely away before I move on from the 6800 XT. As it stands, I won't be touching anything until '25 at the earliest. But it's always exciting for me to see new tech. The AMD event was cool, but the real fun will be the V-Cache chips early next year.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
I'm not gonna buy anything until I have a 4080 at vanilla price in my cart, so yeah, waiting is not a problem. I thought the Intel event was still far off, but the 27th is pretty close.
It's the usual pettiness between these companies.

AMD/Intel does their event first; they show off how much better their NEW chips are than the competition's OLD chip to bolster their numbers.

The competition retaliates and does an event around the time embargoes are up for whoever went first. This way the news cycle is all about whoever went second, even if reviews for the first chip should be in the news.



Intel will do their event the day reviewers are allowed to talk about Ryzen CPUs.
If they have the time, they could likely even use "speculated" Ryzen 7000 scores in their slides just to fuck with people's minds.


The whole 7600X vs 12900K thing is AMD being smart with their wording/presentation; we all know games rarely, if ever, care about more than 6 cores. The 12900K is a heavily binned chip, which is why it ends up scoring higher sometimes; sample for sample, a 12600K and a 12900K will usually be within spitting distance, but using a 12600K in their presentation wouldn't look as cool.

Intel can come at AMD with a 13600K vs whatever AMD has and make themselves look good too.



Check out how, in this very thread, people are super impressed that AMD's "cheapest" chip is matching or beating Intel's old chip.
What AMD won't tell you is that the 7600X and the 7900X likely score exactly the same in gaming, but again, it wouldn't look as cool to say our new ~$500 chip matches and sometimes beats Intel's old ~$500 chip.
 

GymWolf

Gold Member
I feel you. Nvidia and AMD would have to blow me completely away before I move on from the 6800 XT. As it stands, I won't be touching anything until '25 at the earliest. But it's always exciting for me to see new tech. The AMD event was cool, but the real fun will be the V-Cache chips early next year.
My situation is different: I NEED a new GPU, my 2070 Super is not good enough anymore.
 

GymWolf

Gold Member
It's not? You trying to push 4K120? lol

But I still understand the upgrade. Going from the 2070 Super to the 4080 will be a big-time flex.
???

Dude, it is not even optimal for 1440p60 max details with any heavy or broken game, let alone 4K 120 frames... people forget that very few games have DLSS, and even then it's not like you get 50 extra frames... a 2070 Super was NEVER considered a 4K GPU, like not even in the slightest... it's a decent 1440p GPU at best that can do 4K30 with DLSS and some settings toned down.

I forget the last time I didn't need to tinker with settings to get stable 1440p60, or with old-ass stuff like Skyrim or fucking Batman Arkham Origins to get semi-stable 4K60.

I aim at rock-solid 4K60 with modern games; I can play with some stupid settings that look the same at ultra or high, but I don't wanna lose 30 min before playing something... I miss brute-forcing broken games :lollipop_grinning_sweat:
 

GreatnessRD

Member
???

Dude, it is not even optimal for 1440p60 max details with any heavy or broken game, let alone 4K 120 frames... people forget that very few games have DLSS, and even then it's not like you get 50 extra frames... a 2070 Super was NEVER considered a 4K GPU, like not even in the slightest... it's a decent 1440p GPU at best that can do 4K30 with DLSS and some settings toned down.

I forget the last time I didn't need to tinker with settings to get stable 1440p60, or with old-ass stuff like Skyrim or fucking Batman Arkham Origins to get semi-stable 4K60.

I aim at rock-solid 4K60 with modern games; I can play with some stupid settings that look the same at ultra or high, but I don't wanna lose 30 min before playing something... I miss brute-forcing broken games :lollipop_grinning_sweat:
I didn't know it was no longer hitting 1440p60/high. That's one of the reasons I said that. I'm super surprised by that. But yeah, for your use case, the 4080 is gonna be a fun time.
 

GymWolf

Gold Member
I didn't know it was no longer hitting 1440p60/high. That's one of the reasons I said that. I'm super surprised by that. But yeah, for your use case, the 4080 is gonna be a fun time.
Sorry if I sounded like I was schooling you, not my intention. It usually hits 1440p with a mix of ultra, high and medium and something off (like, my TV has a soft image, so I never use AA unless I have headroom).

But I have a 4K TV, and everything below 4K60 is disappointing for me.
 

GreatnessRD

Member
Sorry if I sounded like I was schooling you, not my intention. It usually hits 1440p with a mix of ultra, high and medium and something off (like, my TV has a soft image, so I never use AA unless I have headroom).

But I have a 4K TV, and everything below 4K60 is disappointing for me.
No worries, pimp, you good! As I continue to gather new information on your use case, everything makes sense, lol. What 4K TV are you rockin'? Trying to talk myself out of an LG C2 this holiday season.
 

nightmare-slain

Gold Member
What in the actual fuck. These chips can boost to 5.7GHz?!? I haven't been following AMD much and just checked the specs lol.

Is that on a single core? Even if it could stay above 5.0/5.1GHz across 2, 3, or 4 cores, that would be impressive.
 

nightmare-slain

Gold Member
Yeah, that was single-core. I can't wait for Steve from Hardware Unboxed to show us the real goods on review day.
Still insane to see 5.7GHz. I remember being amazed when we hit 5GHz. 6GHz might be here sooner than I thought. I assume these CPUs are being pushed to their limit and there will not be much overclocking potential? I don't know about the recent releases, but I remember my 3700X couldn't really OC.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Still insane to see 5.7GHz. I remember being amazed when we hit 5GHz. 6GHz might be here sooner than I thought. I assume these CPUs are being pushed to their limit and there will not be much overclocking potential? I don't know about the recent releases, but I remember my 3700X couldn't really OC.
I had a 2500K in 2011 that could do 5.0GHz on air (51 multiplier, 99MHz BCLK).
With a Hyper 212, no less.
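For anyone who missed the Sandy Bridge days, the effective clock is just multiplier × base clock (BCLK), which is how 51 × 99MHz lands at about 5.0GHz:

```python
# Effective core clock = multiplier x BCLK.
multiplier = 51
bclk_mhz = 99.0

clock_ghz = multiplier * bclk_mhz / 1000.0
print(f"{clock_ghz:.2f} GHz")  # ~5.05 GHz
```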

Shame the "race your PC" thread probably got wiped when GAF reset or whatever.
I used to love that thread, just to see people's overclocks and steal settings.
I think I had the second-fastest (fastest) GTX 570 in 2011.

Ohh the days when overclocks were actually fun.


If all the rumors surrounding Ada GPUs are true, I might actually jump back into overclocking, cuz supposedly Nvidia is purposely underclocking the chips and they have a lot of headroom left inside.
 
A Zen 4/RDNA 3 APU combo for entry-level gaming laptops will be unmatched with DDR5 and PCIe 5. I think they will be the only company to offer 1440p/60fps+ at the entry level. They really emphasized performance/power efficiency. I don't know too much about RDNA 3, but perhaps you can finally get ray tracing, along with machine-learning/DLSS-style super sampling, on an entry-level laptop/PC at 60fps+.

The DirectStorage API with 1-4 sec load times is icing on the cake.
 

nightmare-slain

Gold Member
I had a 2500K in 2011 that could do 5.0GHz on air (51 multiplier, 99MHz BCLK).
With a Hyper 212, no less.

Shame the "race your PC" thread probably got wiped when GAF reset or whatever.
I used to love that thread, just to see people's overclocks and steal settings.
I think I had the second-fastest (fastest) GTX 570 in 2011.

Ohh the days when overclocks were actually fun.


If all the rumors surrounding Ada GPUs are true, I might actually jump back into overclocking, cuz supposedly Nvidia is purposely underclocking the chips and they have a lot of headroom left inside.
I never had one, but the 2500K was an amazing CPU from all the stuff I read about it. It could OC really well and people were using them for a long time.

I have a 9900K and it is overclocked, but I wouldn't say it's a good one. It's set to 5.0GHz with a -2 AVX offset, so yeah, it can hit 5.0 across all cores, but it will often drop to 4.8. I could probably get rid of the AVX offset, but I did it because I was originally running at 5.1GHz, which ended up being unstable, and when this CPU is pushed hard it gets HOT. To be fair, that was during stress tests, where I was seeing 90-100C lol. During gaming it rarely goes above 60C.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
A Zen 4/RDNA 3 APU combo for entry-level gaming laptops will be unmatched with DDR5 and PCIe 5. I think they will be the only company to offer 1440p/60fps+ at the entry level. They really emphasized performance/power efficiency. I don't know too much about RDNA 3, but perhaps you can finally get ray tracing, along with machine-learning/DLSS-style super sampling, on an entry-level laptop/PC at 60fps+.

The DirectStorage API with 1-4 sec load times is icing on the cake.
If they have ML hardware in their new GPUs, they can have a good competitor to XeSS and DLSS.
If not, we have to hope they can keep improving FSR to the point it's good enough to not care about DLSS implementations.
Honestly though, 1440p laptop FSR2 already looks amazing, so if we are talking entry-level 1440p (is that a thing these days?) then yeah, AMD is likely gonna knock it out of the park.

There is a near-zero chance they muck about and make their RT weak; at this point, whether a million games use RT or not, it's a selling point to the upper echelon.
So please, AMD, have good RT hardware.
 

Kuranghi

Member
I don't really want to upgrade everything when the 4000 series comes out, but it's well past due. I have a 3770K + appropriate MB, but I'm aiming for 4K60, so that should be fine for a while in my experience, as I'm always GPU-bound in every game.

So I'll wait for this new stuff from AMD/Intel to drop in price a bit into 2023 and just stick the 4000 series in the old MB.

My question is: will PCIe 3 vs 4 affect the 4000 series much at 4K? Google seems to indicate a small drop in perf, but I'm sure that won't make enough of a difference, since I'll be brute-forcing almost anything that comes out (for a year at least) if graphics settings are optimised and not just maxed, at 4K.
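For a rough sense of the gap: per the PCIe spec, Gen 3 runs 8GT/s per lane and Gen 4 runs 16GT/s, both with 128b/130b encoding, so an x16 slot roughly doubles from ~15.8GB/s to ~31.5GB/s each way. Quick sketch of that arithmetic:

```python
# Approximate usable PCIe bandwidth (one direction) for an x16 slot.
def x16_bandwidth_gbs(gt_per_s: float) -> float:
    # 128b/130b line encoding: 128/130 of the raw bit rate is payload.
    lane_gbs = gt_per_s * (128 / 130) / 8  # GB/s per lane
    return lane_gbs * 16

print(f"Gen3 x16: {x16_bandwidth_gbs(8):.1f} GB/s")
print(f"Gen4 x16: {x16_bandwidth_gbs(16):.1f} GB/s")
```

How much that matters depends on how heavily a game streams over the bus, which is why the measured deltas at 4K tend to be small.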
 