
AMD Ryzen Thread: Affordable Core Act

No building yet as I have to get off to work in 40 mins, but I think work will be done with an extra bit of zest today. :D

 
Nice. Be interesting to see how you get on with overclocking later. Conflicting reports on the 1700 in that regard.

Could only get this motherboard after the Amazon shenanigans yesterday with pre-orders. Will see how it goes and decide if I want/need to get a different one once they're more easily available.
 

sega4ever

Member
Could only get this motherboard after the Amazon shenanigans yesterday with pre-orders. Will see how it goes and decide if I want/need to get a different one once they're more easily available.

What's up with Amazon and Newegg not having motherboards in stock until almost a week after the release date of the processors?
 

Nachtmaer

Member
I usually go with the best chipset and mobo, but now I tend to think a bit more economically.
Looking at the ASRock AB350 one currently - is there anything speaking against the AB350 versus X370 for average workstation/gaming usage?

If you're not going for SLI and don't need the extra IO, then B350 should be plenty. Most X370 motherboards currently seem to have beefier VRMs (though more phases doesn't necessarily mean better power delivery) with more cooling. I'd say just wait for proper motherboard reviews.
 

ISee

Member
Are we expecting the 6-core Ryzen variant to perform significantly worse than the 8-core variant in gaming scenarios?
 

daninthemix

Member
GPU load is at 99% throughout the video. That's a measure of GPU performance, not a CPU test.
You have to use a fast enough GPU setup, reduce the resolution, or reduce graphical options as much as necessary to prevent the GPU from ever hitting 99/100% if you are running a CPU test.

It is incredibly frustrating that so many sites/youtube channels have no idea what they're doing when testing the gaming performance of a CPU - and worse, that AMD encouraged reviewers to set up GPU-bottlenecked tests.

http://www.gamersnexus.net/hwreview...review-premiere-blender-fps-benchmarks/page-7

Yeah, just watched the Linus Tech Tips video and all their games were running at 4K.

How is that an optimal way to test CPUs?
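The problem with 4K CPU testing can be sketched with a toy frame-time model (all the millisecond figures below are invented for illustration, not measured values):

```python
def observed_fps(cpu_ms, gpu_ms):
    # A frame can't finish faster than its slowest stage, so the
    # delivered framerate is limited by whichever of CPU or GPU is slower.
    return 1000.0 / max(cpu_ms, gpu_ms)

# Hypothetical per-frame CPU costs for a faster and a slower CPU:
fast_cpu_ms, slow_cpu_ms = 6.0, 8.0

# At 1080p the GPU needs only ~5 ms per frame, so the CPU is the
# bottleneck and the gap between the two CPUs is fully visible:
print(observed_fps(fast_cpu_ms, 5.0))   # ~166.7 fps
print(observed_fps(slow_cpu_ms, 5.0))   # 125.0 fps

# At 4K the GPU needs ~16 ms per frame, so both CPUs report the
# same framerate and the benchmark measures the GPU instead:
print(observed_fps(fast_cpu_ms, 16.0))  # 62.5 fps
print(observed_fps(slow_cpu_ms, 16.0))  # 62.5 fps
```

Under this model, any test where the GPU term dominates tells you nothing about which CPU is faster.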
 

RenditMan

Banned
yea not even close.

This shit would be called out in a fucking instant if it were any other company. People just love the underdog, even when they use shit tactics like every other company.

I run a G-Sync monitor at 1440p. I now know that this CPU is considerably faster in most work situations and about the same in GPU-bound games optimised for the Intel generation. Seeing as most games are GPU-bound, that makes this CPU a very good buy at their pricing, no?

Some aren't seeing how the lack of competition has created a market tailored for Intel's strengths and none of its weaknesses. Some even appear to be defending this position.

Not in most gaming scenarios, no.

Not in most legacy scenarios. The big question is whether AMD can persuade devs to optimise for more cores going forward instead of sticking to the low-core, high-IPC model they are currently using.

Yeah, just watched the Linus Tech Tips video and all their games were running at 4K.

How is that an optimal way to test CPUs?

It's optimal to use that review along with the others to build an overall picture of whether this CPU will suit your specific usage patterns. It's apparent this CPU sings when all the cores are lit up; the question is, does your usage actually light all the cores up? And do the benefits when it does outweigh the times when it doesn't?
 

Engell

Member
Are we expecting the 6-core Ryzen variant to perform significantly worse than the 8-core variant in gaming scenarios?

The current data presented by AMD themselves has the 6-core clocked lower than the Ryzen 1700, and clock speed is already Ryzen's problem when it comes to gaming, so unless they fix it so you can OC to 4.5GHz+ it will be a bit problematic. Plus Intel will have launched the Kaby Lake refresh (7740K) by that point, and it's an easy fix for them to just spend $1 extra on the thermal interface under the IHS.
 

dr_rus

Member
[benchmark chart from the hardware.fr article linked below]


Really decent improvement by disabling SMT. Does that point to something that can be improved via software?

http://www.hardware.fr/articles/956-17/jeux-3d-project-cars-f1-2016.html

Gains with SMT off are pretty normal in gaming on Intel CPUs too; people just don't benchmark this very often. Games tend to launch several heavyweight threads and a number of additional lightweight ones. When two heavyweight threads end up on one physical core, or when a lightweight one is stealing execution resources from a heavyweight one, you'll see a performance decrease compared to them running on the same core sequentially.

Are we expecting the 6-core Ryzen variant to perform significantly worse than the 8-core variant in gaming scenarios?

I think they will be pretty close to the respective 8-core versions at the same base/boost frequencies. As I've said many times already, games aren't nearly as good at scaling with more CPU cores as some people seem to think.
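The limited core scaling described above is essentially Amdahl's law: if only part of a frame's work parallelizes, extra cores quickly stop helping. A quick sketch (the 60% parallel fraction is an illustrative assumption, not a measured figure for any game):

```python
def amdahl_speedup(parallel_fraction, cores):
    # Upper bound on speedup when only a fraction of the per-frame
    # work can run in parallel (Amdahl's law).
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# Assume, purely for illustration, that 60% of a game's frame work scales:
for cores in (2, 4, 6, 8, 16):
    print(cores, round(amdahl_speedup(0.6, cores), 2))
# The curve flattens fast: going from 6 to 8 cores gains only ~5%,
# which is why a 6-core at the same clocks should game much like an 8-core.
```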
 

Marlenus

Member
The current data presented by AMD themselves has the 6-core clocked lower than the Ryzen 1700, and clock speed is already Ryzen's problem when it comes to gaming, so unless they fix it so you can OC to 4.5GHz+ it will be a bit problematic. Plus Intel will have launched the Kaby Lake refresh (7740K) by that point, and it's an easy fix for them to just spend $1 extra on the thermal interface under the IHS.

No it is not; the 1600X is clocked at 3.6/4.0GHz, just like the 1800X.

As far as gaming is concerned it will likely perform about the same as the 1800X. Of course, by the time it is reviewed I would not be surprised if BIOS/firmware/software tweaks in the coming months result in a performance uplift across all Ryzen CPUs in gaming benchmarks.

It seems pretty obvious that AMD concentrated on productivity apps and were targeting buyers of the 6800K and 6900K with these chips, so once they got the performance in those apps sorted they launched, and were less concerned with gaming.

The R5 line, however, will definitely be aimed more at gamers, so one would expect them to focus on getting that performance up to snuff by the time it launches.
 

Livanh

Member
GPU load is at 99% throughout the video. That's a measure of GPU performance, not a CPU test.
You have to use a fast enough GPU setup, reduce the resolution, or reduce graphical options as much as necessary to prevent the GPU from ever hitting 99/100% if you are running a CPU test.

It is incredibly frustrating that so many sites/youtube channels have no idea what they're doing when testing the gaming performance of a CPU - and worse, that AMD encouraged reviewers to set up GPU-bottlenecked tests.

http://www.gamersnexus.net/hwreview...review-premiere-blender-fps-benchmarks/page-7

If you're referring to Joker's review, you should maybe turn the sound on. He explains why the load is 99% in his 1080p tests: VSync off means GPU load will always show 99%. He discusses it further in his follow-up videos.

It is incredibly frustrating to see people trash reviews without any idea of what's going on.
 

V_Arnold

Member
If you're referring to Joker's review, you should maybe turn the sound on. He explains why the load is 99% in his 1080p tests: VSync off means GPU load will always show 99%. He discusses it further in his follow-up videos.

It is incredibly frustrating to see people trash reviews without any idea of what's going on.

The narrative has been formed by now: this CPU lineup is disappointing, so any positive reviews and benchmarks must be doing something wrong.
 
Here's hoping that Vega knocks it out of the park as well. I'd love for my next build to be AMD-only; sick of Intel and Nvidia's bullshit.
 

JBwB

Member

Renekton

Member
Wonder what the pricing would be on those CPUs. I'd assume they'd be cheaper than what we expect, considering they don't have iGPUs.
They are HEDT parts (basically re-purposed Xeons), so they come with higher premiums than normal KBL-S. The motherboard alone will be much more expensive.
 

Micael

Member
It's optimal to use that review along with the others to build an overall picture of whether this CPU will suit your specific usage patterns. It's apparent this CPU sings when all the cores are lit up; the question is, does your usage actually light all the cores up? And do the benefits when it does outweigh the times when it doesn't?

Any good review should show both cases, 4K and 1080p (with low settings, preferably), because while 4K might give a more realistic idea of how the CPUs fare with current graphics hardware, 1080p gives a better idea of the future, since the graphics card is likely to be replaced before the CPU is, and graphics cards have been improving nicely.
 

napata

Member
If you're referring to Joker's review, you should maybe turn the sound on. He explains why the load is 99% in his 1080p tests: VSync off means GPU load will always show 99%. He discusses it further in his follow-up videos.

It is incredibly frustrating to see people trash reviews without any idea of what's going on.

I doubt he said that, because it's wrong, as any hardware reviewer would know. If you're CPU-limited you will not have 99% GPU usage, because your GPU is waiting on your CPU and thus isn't working at full speed.
 

ISee

Member
The current data presented by AMD themselves has the 6-core clocked lower than the Ryzen 1700, and clock speed is already Ryzen's problem when it comes to gaming, so unless they fix it so you can OC to 4.5GHz+ it will be a bit problematic. Plus Intel will have launched the Kaby Lake refresh (7740K) by that point, and it's an easy fix for them to just spend $1 extra on the thermal interface under the IHS.

Temperature seems to be a problem while overclocking Ryzen at the moment; fewer cores could result in lower temperatures and more stable overclocks, but we'll see. Ryzen OCs seem to vary a lot already.

Isn't the 7740K a 'normal' 7700K with a deactivated iGPU, but for the 2066 platform? Of course this could result in higher overclockability, but I expect 2066 mainboards to be more expensive, so maybe not the best investment for gamers.
 

V_Arnold

Member
I doubt he said that, because it's wrong, as any hardware reviewer would know. If you're CPU-limited you will not have 99% GPU usage, because your GPU is waiting on your CPU and thus isn't working at full speed.

So pick one. Either it's GPU-bound (as claimed) or it's CPU-bound (and then the tests do show CPU-bound results :D)

So which one is it? It can't be both.
 
Ouch, all this doom and gloom and it's down to some board manufacturers having poor BIOSes.

Please tell me this is a hidden joke.

No, it's not the BIOS; it's whether the place you're testing in is GPU-limited or not.

AMD even asked reviewers to use GPU-limited places because it makes their CPUs look closer to Intel's.
 

Profanity

Member
So pick one. Either it's GPU-bound (as claimed) or it's CPU-bound (and then the tests do show CPU-bound results :D)

So which one is it? It can't be both.

He's wrong. VSync off does not mean GPU usage will always be pegged at 99%. I can take some screenshots to show you if you want.
 

V_Arnold

Member
He's wrong. VSync off does not mean GPU usage will always be pegged at 99%. I can take some screenshots to show you if you want.

But the initial critique against him was that he is offering GPU-bound scenarios. If they are not GPU-bound, then by definition they will be CPU-bound scenarios.
 

Durante

Member
If you're referring to Joker's review, you should maybe turn the sound on. He explains why the load is 99% in his 1080p tests: VSync off means GPU load will always show 99%. He discusses it further in his follow-up videos.

It is incredibly frustrating to see people trash reviews without any idea of what's going on.
It's incredibly frustrating that you (and apparently this so-called "reviewer") have no idea how computers work.

If you are CPU-limited in a given gameplay scenario, then your GPU load will not be at 100%. This is completely irrespective of vertical synchronization.

Seriously, these things should be the absolute minimum basic knowledge needed to talk from a position of authority about a CPU, or a GPU, or any hardware component (which is what a review is, or at least should be).

You don't have to believe me on anything; just watch the criticized reviews. There is even a 40-minute talk about what was tested, how, and why it makes sense.
You can talk about bullshit for 40 hours; that doesn't make it smell any less.
 
This looks really disappointing. I hope R5 provides better results in its cheaper category :/ I don't want to spend $234,879 to buy an Intel chip.
 

ISee

Member
Please tell me this is a hidden joke.

No, it's not the BIOS; it's whether the place you're testing in is GPU-limited or not.

AMD even asked reviewers to use GPU-limited places because it makes their CPUs look closer to Intel's.

GamersNexus talked about this in their YT review. AMD suggested doing Ryzen benchmarks at 1440p, which is indeed a bit misleading.
(GamersNexus YT review @ 18:45)

At 1080p the 7700K is ~34% faster than the 1800X (both at stock clocks).

At 1440p the 7700K is only ~10% faster.
 

napata

Member
So pick one. Either it's GPU-bound (as claimed) or it's CPU-bound (and then the tests do show CPU-bound results :D)

So which one is it? It can't be both.

I haven't watched any of Joker's videos. I was just referring to that specific comment.

If his GPU load was 99% then he was GPU-bound. If the GPU load was lower then he was CPU-bound. Preferably you'd want your GPU load as low as possible, because if it's 90%+ you'll probably hit GPU-limited scenarios way too often.

Edit: Just skimmed through some of Joker's videos. Those CPU reviews are completely worthless. No frametimes and almost always GPU-bound. He practically benchmarked his GPU.
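The rule of thumb above (sustained ~99% GPU load means the run is GPU-bound) can be sketched as a simple check over logged per-second utilization samples; the sample values below are invented for illustration:

```python
def classify_run(gpu_util_samples, threshold=97.0):
    # If the GPU sits near 99% load for most of the run, the GPU is the
    # limiter and the framerate says little about the CPU under test.
    pegged = sum(1 for u in gpu_util_samples if u >= threshold)
    if pegged / len(gpu_util_samples) > 0.5:
        return "GPU-bound"
    return "CPU-bound (or framerate-capped)"

print(classify_run([99, 99, 98, 99, 99]))  # GPU-bound
print(classify_run([70, 65, 80, 72, 68]))  # CPU-bound (or framerate-capped)
```

The 97% threshold and 50% share are arbitrary cutoffs chosen for the sketch; the point is only that a pegged GPU invalidates a CPU comparison.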
 

RenditMan

Banned
Any good review should show both cases, 4K and 1080p (with low settings, preferably), because while 4K might give a more realistic idea of how the CPUs fare with current graphics hardware, 1080p gives a better idea of the future, since the graphics card is likely to be replaced before the CPU is, and graphics cards have been improving nicely.

You don't see the flaw in this methodology? What the picture clearly shows is that IPC on the current BIOS is lower, but there are more cores, so overall performance is considerably higher in software that can use all the cores. The downside is that it's lower in software that can't.

The question here is whether we can expect devs to use more cores or whether they'll stick with what they're currently doing. Bit of a chicken-and-egg scenario.
 

Profanity

Member
But the initial critique against him was that he is offering GPU-bound scenarios. If they are not GPU-bound, then by definition they will be CPU-bound scenarios.

But they are GPU-bound. If the GPU usage is fixed at 99%, then they are by definition GPU-bound. I really don't see what's hard to understand about this.
 

V_Arnold

Member
But they are GPU-bound. If the GPU usage is fixed at 99%, then they are by definition GPU-bound. I really don't see what's hard to understand about this.

It's okay, then they *are* GPU-bound. But what is the point of attacking the VSync angle (which WOULD produce, according to your own posts, non-GPU-bound scenarios) when we CAN see the CPU failing to send data to the GPU fast enough? That was my point. I am not arguing that the videos are good; I am arguing that they can't be wrong for TWO opposing reasons at the same time :)
 
GamersNexus talked about this in their YT review. AMD suggested doing Ryzen benchmarks at 1440p, which is indeed a bit misleading.
(GamersNexus YT review @ 18:45)

At 1080p the 7700K is ~34% faster than the 1800X (both at stock clocks).

At 1440p the 7700K is only ~10% faster.

I wouldn't call it "a bit" misleading - it's a full-on attempt to fool consumers with a false picture of performance.
And it completely ignores that today's 1080p performance of GPUs is tomorrow's 1440p, as GPU power grows with each new generation.

Using their method I could prove that an FX is almost as good a gaming CPU as a Skylake i7 :)
 

Nachtmaer

Member
really strange since they are still listed as a dual-channel design... hmm, guess they are abandoning the old socket (or maybe the rumors are just false)

That's because they're using the same KBL chips, just for HEDT and with a higher TDP. It still only has a dual-channel memory controller.
 

Livanh

Member
It's incredibly frustrating that you (and apparently this so-called "reviewer") have no idea how computers work.

If you are CPU-limited in a given gameplay scenario, then your GPU load will not be at 100%. This is completely irrespective of vertical synchronization.

Seriously, these things should be the absolute minimum basic knowledge needed to talk from a position of authority about a CPU, or a GPU, or any hardware component (which is what a review is, or at least should be).

You can talk about bullshit for 40 hours; that doesn't make it smell any less.


It's a talk with GamersNexus about both their testing methods, not only a defense from Joker.

If 99% GPU usage is hit pretty much all the time in any tested game at 1080p, wouldn't that mean any test at 1080p is useless, since it's always GPU-bound?
 

Micael

Member
You don't see the flaw in this methodology? What the picture clearly shows is that IPC on the current BIOS is lower, but there are more cores, so overall performance is considerably higher in software that can use all the cores. The downside is that it's lower in software that can't.

The question here is whether we can expect devs to use more cores or whether they'll stick with what they're currently doing. Bit of a chicken-and-egg scenario.

What does that have to do with the 1080p/4K thing I was talking about? Are you sure you were quoting the right person?
 

ISee

Member
I wouldn't call it "a bit" misleading - it's a full-on attempt to fool consumers with a false picture of performance.
And it completely ignores that today's 1080p performance of GPUs is tomorrow's 1440p, as GPU power grows with each new generation.

Using their method I could prove that an FX is almost as good a gaming CPU as a Skylake i7 :)

That's indeed a very good point.
 

Profanity

Member
It's okay, then they *are* GPU-bound. But what is the point of attacking the VSync angle (which WOULD produce, according to your own posts, non-GPU-bound scenarios) when we CAN see the CPU failing to send data to the GPU fast enough? That was my point. I am not arguing that the videos are good; I am arguing that they can't be wrong for TWO opposing reasons at the same time :)

It's a talk with GamersNexus about both their testing methods, not only a defense from Joker.

If 99% GPU usage is hit pretty much all the time in any tested game at 1080p, wouldn't that mean any test at 1080p is useless, since it's always GPU-bound?

Essentially, yes. I could, for example, bust out a Bulldozer CPU, show you footage of it and a 7700K running a game at the same framerate (because they're both GPU-bound at 99%), and claim that Bulldozer was as good as the 7700K for gaming. Now we all know that's not true, because the methodology was flawed.

Edit: Same point, beaten by michaelius above.
 

Durante

Member
If 99% GPU usage is hit pretty much all the time in any tested game at 1080p, wouldn't that mean any test at 1080p is useless, since it's always GPU-bound?
Yes, when your purpose is testing CPU performance, any game scenario with a 99% GPU load is useless.

It's not the case though that 1080p is always GPU bound, e.g. in Watch Dogs 2.
 
That's not really true though?
I paid around 350€ for the 5820K in 2014. Now the Ryzen 7 1700 (never mind the X or the 1800) is 360€.

I agree as to the significance of the WD2 results, but your remark about the 7700K doesn't hold true in frametimes at Computerbase, at least:
[Computerbase frame-time percentile chart for Watch Dogs 2]


That said, Intel's 6- and 8-core CPUs show much better scaling.

I was looking at Tech Report's frametime analysis, which shows completely different results. I can't recall a CPU launch with such variable performance.
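For reference, the percentile frame-time metric those analyses report can be computed from a raw frame-time log like this; the log values below are invented, and the point is that averages hide the stutter the percentile catches:

```python
import statistics

def percentile_frametime(frame_times_ms, pct=99):
    # 99th-percentile frame time: all but the worst 1% of frames
    # complete at least this fast (higher = more stutter).
    ordered = sorted(frame_times_ms)
    index = min(len(ordered) - 1, round(pct / 100 * len(ordered)) - 1)
    return ordered[max(0, index)]

# A mostly smooth run with a few spikes (invented numbers, ms per frame):
log = [16.7] * 95 + [33.3] * 5

print(round(1000 / statistics.mean(log), 1))  # average fps looks fine (~57)
print(percentile_frametime(log, 99))          # 33.3 - the spikes show up here
```

An average-fps chart would rate this run near 60 fps, while the 99th-percentile frame time reveals the 30 fps-class hitches.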
 

Livanh

Member
Essentially, yes. I could, for example, bust out a Bulldozer CPU, show you footage of it and a 7700K running a game at the same framerate (because they're both GPU-bound at 99%), and claim that Bulldozer was as good as the 7700K for gaming. Now we all know that's not true, because the methodology was flawed.

Edit: Same point, beaten by michaelius above.

Where do the differences come from, then? And why test at 1080p at all (that's what seems to be wanted by most) if any modern game is apparently still GPU-limited pretty hard at 1080p, even with the fastest GPUs?

Also, isn't this then a test of which games are CPU-bound at all? Games that are really CPU-limited would show a drop in GPU usage.
 
I wouldn't call it "a bit" misleading - it's a full-on attempt to fool consumers with a false picture of performance.
And it completely ignores that today's 1080p performance of GPUs is tomorrow's 1440p, as GPU power grows with each new generation.

Using their method I could prove that an FX is almost as good a gaming CPU as a Skylake i7 :)

They addressed it on Reddit. They simply wanted reviewers to test at all resolutions, not just 1080p, to give a full picture of performance across resolutions. It's true that the lower res will accentuate the difference against AMD, but it's also true that at the higher resolutions where most people actually use high-end rigs, the difference won't be the same. You and I might know this, but not everyone, and making a review that only shows the worst-case scenario seems a bit misleading too, don't you think? ;)


As for 99% GPU load being the bottleneck when benching: well, if that's the case with an OC'd 1080 at 1080p, you have to start thinking that maybe the game in question just isn't benefiting from the faster CPU unless you wait until you have that 240Hz monitor and two Titan XPs' worth of GPU power to bring that CPU performance into use. Maybe that's something you'll have in 2020, maybe not; maybe it matters then, or maybe not. Maybe at that point the 16-thread CPU will run laps around the 8-thread one due to better threading optimizations. It's a bit more complicated than just assuming that the 7700K will provide any tangible benefit in the long run. Depending on your setup, the 1700 getting past that 60 FPS limit is all that's needed for a lot of people, and who's to say that in future titles that are even more demanding, but well threaded and optimized for Ryzen, it won't go faster than the 7700K?

I wouldn't just assume things though, and there are cases where single-thread performance is highly important and actually limiting right now, but unfortunately most reviews don't even touch titles like XCOM 2. :( I've spent over 1000 hours on it, and it runs like a dog, and the only thing that helps even slightly is high clocks and IPC. That's an actual reason to opt for a 7700K, not because GTA V runs at 150 vs. 180 FPS on your 60Hz monitor with a Titan X at 1080p.
 

~Cross~

Member
Where do the differences come from, then? And why test at 1080p at all (that's what seems to be wanted by most) if any modern game is apparently still GPU-limited pretty hard at 1080p, even with the fastest GPUs?

Also, isn't this then a test of which games are CPU-bound at all? Games that are really CPU-limited would show a drop in GPU usage.

There are plenty of games that, when you are using a strong enough GPU, will be bottlenecked by memory speed/latency and the CPU. The Witcher 3 in particular is used a lot in tests because of how well it scales with the CPU, particularly in Novigrad. Novigrad is all about the CPU and how well it utilizes memory bandwidth.
 

napata

Member
It's okay, then they *are* GPU-bound. But what is the point of attacking the VSync angle (which WOULD produce, according to your own posts, non-GPU-bound scenarios) when we CAN see the CPU failing to send data to the GPU fast enough? That was my point. I am not arguing that the videos are good; I am arguing that they can't be wrong for TWO opposing reasons at the same time :)

Obviously, if you lock your framerate with VSync you could have <99% GPU load without being CPU-limited. That's just common sense, so people don't specifically mention it.
 

Renekton

Member
I wouldn't just assume things though, and there are cases where single-thread performance is highly important and actually limiting right now, but unfortunately most reviews don't even touch titles like XCOM 2. :( I've spent over 1000 hours on it, and it runs like a dog, and the only thing that helps even slightly is high clocks and IPC. That's an actual reason to opt for a 7700K, not because GTA V runs at 150 vs. 180 FPS on your 60Hz monitor with a Titan X at 1080p.
Slightly OT, but try a clean reinstall of XCOM 2 without the mods once.
 