
AMD Ryzen Thread: Affordable Core Act

ethomaz

Banned
I have not overclocked my CPU (or my 780 Ti for that matter) yet. I am a little afraid of doing it because I fear that throttling it could break it.

Also, really stupid question, would live streaming on YouTube or Twitch be considered a heavy multi-threaded task?
No. Streaming is basically a light single-threaded task (10-20% of one core max, with your CPU) :D
 
I have not overclocked my CPU (or my 780 Ti for that matter) yet. I am a little afraid of doing it because I fear that throttling it could break it.

Also, really stupid question, would live streaming on YouTube or Twitch be considered a heavy multi-threaded task?

There's not much to be worried about when overclocking unless you're pushing high voltages and temperatures through the chip. I have hardware that I've overclocked the snot out of that still runs fine 6+ years later, but only do it if you feel comfortable with doing it.

Hmm, streaming and playing a game at the same time? It can be tough depending on what you're using to record, the quality settings you're using and what you're playing.
 

Vipu

Banned
I have not overclocked my CPU (or my 780 Ti for that matter) yet. I am a little afraid of doing it because I fear that throttling it could break it.

Also, really stupid question, would live streaming on YouTube or Twitch be considered a heavy multi-threaded task?

You really can't do any harm if you don't do something stupid.
If you follow a guide, keep an eye on the temperatures, and don't try to squeeze out every tiny bit, you can't really break anything.
 
Please DO NOT BUY the Asus PRIME B350M-A motherboard. This thing is a piece of shit. It fails to detect the correct XMP memory timings. The BIOS and Asus "AI" software report differing XMP settings and they're both wrong (CPU-Z reports the correct settings). It insists on running my 2400 MHz memory at 2133. And I am using memory specifically listed on the Asus product page as having been tested on this motherboard at its correct speed (2400 MHz 14-16-16-31).

It also reports my CPU temp as being ridiculously high (55-60 idle, ~75 load). I'm using the AM4 edition Noctua tower cooler w/ 120mm fan and the thing is cool to the touch. Even removed it and replaced the thermal paste to make sure it's making good contact, seated properly, etc.

This is all using the latest BIOS (0502).


Anybody else have better luck with any of the other mATX boards? If the next BIOS update doesn't fix this board it's going back to Newegg as defective.

I have the Prime X370 on the 0504 BIOS and have the same idle temperature as you, with the Corsair H60 cooler. Applied the thermal paste 3 times, still the same temperatures. Someone on reddit got 38 degrees with the 0502 BIOS and 58 with the 0504...
However, my Corsair LPX 3000MHz is recognized through D.O.C.P. as 2666MHz, as per their PDF... better than the defaults I guess.
 

Leedogg

Member
Intel's top HEDT is still based on Broadwell, no? It is "5th-generation Core" while desktop is at 7th.

When will the Skylake or Kaby Lake HEDT launch???

Skylake-X (new name for the E series) is supposed to launch in the second half of this year. It will also bring a new chipset (X299) and a new socket (2066). But with Ryzen out, it may get released sooner.
 
Well I would have bought the 1700 but Alza's website is being super weird with card verification. Never dealt with them before so not too impressed, but they do have the lowest price. Will try tomorrow.
 

shark sandwich

tenuously links anime, pedophile and incels
I have the Prime X370 on the 0504 BIOS and have the same idle temperature as you, with the Corsair H60 cooler. Applied the thermal paste 3 times, still the same temperatures. Someone on reddit got 38 degrees with the 0502 BIOS and 58 with the 0504...
However, my Corsair LPX 3000MHz is recognized through D.O.C.P. as 2666MHz, as per their PDF... better than the defaults I guess.
Well that kind of makes me relieved that it's a problem with the temperature readings and I'm not actually cooking my CPU.

I'm really hoping they get this sorted out soon w/ a new BIOS. This is a joke though. Their Prime lineup is advertised as being the most thoroughly validated/tested with the most memory models and other devices.
 
I have not overclocked my CPU (or my 780 Ti for that matter) yet. I am a little afraid of doing it because I fear that throttling it could break it.

Also, really stupid question, would live streaming on YouTube or Twitch be considered a heavy multi-threaded task?

Definitely investigate overclocking, since if you have a good cooling solution then it's trivial to do and has a very low risk of actually damaging anything. Hell, if your cooling solution is meaty enough, you could probably do it with what you already have right now (although it would probably also be wise to consider replacing your thermal compound for even better results).
 

shark sandwich

tenuously links anime, pedophile and incels
I've been looking for a new CPU. Based on reviews and price, is the 7700K the best option right now?
For just gaming, 7700K is king.

Ryzen is better if you're doing some heavy multitasking, video encoding, content creation, etc. and also want something that is pretty good at gaming. It also MAY be more future-proof in that future games might benefit more from more CPU cores, and socket AM4 will most likely support the next couple of generations of AMD CPUs.
 
I've been looking for a new CPU. Based on reviews and price, is the 7700K the best option right now?

Depends on your use. For pure gaming uses, the 7700K is the best CPU right now, but it might be wise to wait for Ryzen 3s in the second half of the year to see if they can provide a chip that's 80% as capable for 50% of the price.

If you have a workload that needs large quantities of CPU threads, a Ryzen 7 1700X paired with a good cooling solution and overclocked to the hilt will eat all the multithreaded tasks for breakfast. Do note, however, that it is noticeably worse at gaming, particularly at 1080p and below.
 
Depends on your use. For pure gaming uses, the 7700K is the best CPU right now, but it might be wise to wait for Ryzen 3s in the second half of the year to see if they can provide a chip that's 80% as capable for 50% of the price.

If you have a workload that needs large quantities of CPU threads, a Ryzen 7 1700X paired with a good cooling solution and overclocked to the hilt will eat all the multithreaded tasks for breakfast. Do note, however, that it is noticeably worse at gaming, particularly at 1080p and below.

No, depends entirely on what card you have.

It's only slightly worse at 1080p gaming, and it takes a 1080 or Titan X to expose its limitations versus the 7700K. If you have a card below that, the difference would not be noticeable at all.

Even with a Titan X, we're looking at say, 80 fps on Ryzen vs 92 fps on the 7700K.
 

Vipu

Banned
No, depends entirely on what card you have.

It's only slightly worse at 1080p gaming, and it takes a 1080 or Titan X to expose its limitations versus the 7700K. If you have a card below that, the difference would not be noticeable at all.

Even with a Titan X, we're looking at say, 80 fps on Ryzen vs 92 fps on the 7700K.

Just stop it already...
 
Do newer emulators take advantage of the additional cores, or at least is there potential to? Like Cemu and RPCS3, will they benefit from Ryzen in the coming years?
 
No. Streaming is basically low single task (10-20% max of one core with your CPU) :D

Are you talking about streaming with shadow play or something, which is done on the GPU? Streaming using a program like OBS running on the CPU is most definitely very CPU demanding. It uses a ton of threads and it also benefits from fast RAM. You can very easily max out an i5 or i7 by streaming in 1080p and 60fps while gaming. I think Ryzen 7 will be very popular for Twitch streamers.
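If you want to see this for yourself while streaming, here's a rough sketch that samples per-core load from /proc/stat (Linux only; on Windows, Task Manager's per-core graph or HWiNFO shows the same thing). It's illustrative only, not OBS's own code: with x264 encoding running, most cores light up, not just one.

```python
import time

def read_core_times():
    """Return {core: (busy, total)} jiffies from /proc/stat (Linux)."""
    cores = {}
    with open("/proc/stat") as f:
        for line in f:
            # Per-core lines look like "cpu0 ...", skip the aggregate "cpu " line.
            if line.startswith("cpu") and line[3].isdigit():
                name, *vals = line.split()
                vals = [int(v) for v in vals]
                idle = vals[3] + vals[4]  # idle + iowait fields
                cores[name] = (sum(vals) - idle, sum(vals))
    return cores

def utilization(sample_s=1.0):
    """Percent busy per core over a short sampling window."""
    a = read_core_times()
    time.sleep(sample_s)
    b = read_core_times()
    return {c: 100.0 * (b[c][0] - a[c][0]) / max(1, b[c][1] - a[c][1])
            for c in a}

if __name__ == "__main__":
    for core, pct in sorted(utilization().items()):
        print(f"{core}: {pct:5.1f}%")
```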
 
Depends on your use. For pure gaming uses, the 7700K is the best CPU right now, but it might be wise to wait for Ryzen 3s in the second half of the year to see if they can provide a chip that's 80% as capable for 50% of the price.

If you have a workload that needs large quantities of CPU threads, a Ryzen 7 1700X paired with a good cooling solution and overclocked to the hilt will eat all the multithreaded tasks for breakfast. Do note, however, that it is noticeably worse at gaming, particularly at 1080p and below.

It's worth watching this if you want to take a look at Ryzen's performance in games. It's honestly pretty adequate in most situations.
 

petran79

Banned
Do newer emulators take advantage of the additional cores, or at least is there potential to? Like Cemu and RPCS3, will they benefit from Ryzen in the coming years?

This should still be valid I guess

http://wiki.pcsx2.net/index.php/PCSX2

Another important factor determining PCSX2 performance is that Intel processors almost always perform better than AMD processors. This is primarily because Intel's are more appropriately designed for running advanced computer software, whereas AMD's are more ideal for PC gaming. PCSX2 is definitely the former, since it is designed to run non-PC-formatted games.

For comparison, the AMD FX series (currently AMD's most prominent line of processors) will not match the performance of the Intel i7 series. An Intel i7-4770K 3.5 GHz will perform better than an AMD FX-9370 4.4 GHz, even though the FX-9370 has a much higher clock-rate. The main reason for this is because in most AMD Micro-architectures, the number of Floating-Point Units is half the number of total CPU cores. This means that each unit is shared between two cores. If one core is currently using the FPU, the other core sharing it must wait to do the same. It is only free to perform integer operations. PCSX2 makes heavy use of floating-point operations.

It is also important to note that having more than two CPU cores does not automatically increase the emulation performance of PCSX2. This is because PCSX2 currently uses only two cores, so having a greater number of cores will not compensate for a lower clock-rate. Emulation can be improved by enabling the MTVU hack in the Speedhacks menu, but not by much.

Below are the minimum and recommended requirements to run PCSX2:
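The shared-FPU claim in the quote above can be probed, very roughly, on your own chip with a sketch like this. It's pure Python, so the absolute numbers mean little; only the scaling factor between one and several workers is interesting, and it is not a PCSX2 benchmark.

```python
import time
from multiprocessing import Pool

ITERS = 500_000

def fp_worker(_):
    # Tight floating-point loop, similar in spirit to the FPU-heavy
    # work the wiki describes (purely illustrative, not emulator code).
    x = 1.0001
    for _ in range(ITERS):
        x = x * 1.0000001 + 0.0001
    return x

def throughput(nprocs):
    # Total FP loop iterations per second across nprocs workers.
    start = time.perf_counter()
    with Pool(nprocs) as pool:
        pool.map(fp_worker, range(nprocs))
    elapsed = time.perf_counter() - start
    return nprocs * ITERS / elapsed

if __name__ == "__main__":
    one = throughput(1)
    four = throughput(4)
    # On a chip with one FPU per core this should scale close to 4x;
    # on designs that share an FPU between two cores it scales worse.
    print(f"1 proc: {one:.3e} it/s, 4 procs: {four:.3e} it/s, "
          f"scaling: {four / one:.2f}x")
```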
 
It's worth watching this if you want to take a look at Ryzen's performance in games. It's honestly pretty adequate in most situations.

It's still the case that the i7 7700K does games better than any of the Ryzen 7s for less money. For builds that are exclusively for gaming, I would not recommend any of the Ryzen 7s. That's why I mentioned the Ryzen 3s, since in theory they should be almost as good in gaming as the Ryzen 7s, but should cost substantially less.
 

Sinistral

Member
I have the Prime X370 on the 0504 BIOS and have the same idle temperature as you, with the Corsair H60 cooler. Applied the thermal paste 3 times, still the same temperatures. Someone on reddit got 38 degrees with the 0502 BIOS and 58 with the 0504...
However, my Corsair LPX 3000MHz is recognized through D.O.C.P. as 2666MHz, as per their PDF... better than the defaults I guess.

Well that kind of makes me relieved that it's a problem with the temperature readings and I'm not actually cooking my CPU.

I'm really hoping they get this sorted out soon w/ a new BIOS. This is a joke though. Their Prime lineup is advertised as being the most thoroughly validated/tested with the most memory models and other devices.

Got it up and running and updated. All stock, with updated bios to 504, no memory tweaks yet. Idle is at 55-60, and load under Cinebench was 75. Using a single fan Noctua NH U9S. Trying memory tweaks now...

And I couldn't get it to post beyond 2133... Ram is G.Skill TridentZ 3200 32GB CL14 2x16gb. It's not even on this list: https://www.asus.com/Motherboards/PRIME-B350M-A/HelpDesk_QVL/

Guess I'll patiently wait for bios upgrades.
 



Win 7 & Win 10, Frametimes + Bell curve —— RYZEN 1700X REVIEW - 4 days worth of testing - ft. i7 7700K [MindBlank Tech]
https://www.youtube.com/watch?v=Nfv5aF_GfWg

I finally got my RYZEN 1700X and ASUS Crosshair VI Hero - time to benchmark the crap out of it and compare it to an i7 7700K! Then overclock the 1700X to 4Ghz and test it against the 5GHz 7700K. Testing is done at 1080p and 720p, complete with frametime analysis. There's also Windows 10 vs. Windows 7 benchmarks and frametime distribution graphs - bell curves!
 

Steel

Banned
·feist·;231904307 said:



Win 7 & Win 10, Frametimes + Bell curve —— RYZEN 1700X REVIEW - 4 days worth of testing - ft. i7 7700K [MindBlank Tech]
https://www.youtube.com/watch?v=Nfv5aF_GfWg

Huh, in battlefield 1 and Crysis 3, the overclocked 1700x beat the overclocked 7700k in bottom 5% framerates. And it outright beats the OC 7700k in GTA V here even at 720p. Overall the frametime curves look worse for the 1700x except for a handful of games, though.
 

Paragon

Member
It's worth watching this if you want to take a look at Ryzen's performance in games. It's honestly pretty adequate in most situations.

I would be upset if I had spent $2000 on video cards (now $800 for better performance with 2x1070) and the CPU was causing them to only be half-utilized.
GPU utilization was dropping below 50% many times in that video.

Unigine Heaven:


GTA V:


DOOM:
I've added 50% markers for the GPU usage and framerate graphs for DOOM.
Looking at the framerate graph, it seems that the sections of gameplay Jay chose to present when zoomed into the framerate counter were not representative of typical performance.

Now I know that the argument many people make is "but the framerate is good" - well it may be, but it could be considerably better with a faster CPU that can fully utilize those GPUs.
80 FPS might be fine if you only intend to game V-Synced on a 60Hz display, but with a 144/165Hz+ VRR monitor, higher framerates are always better, up to the maximum refresh rate.
And that's 80 FPS in GTA V - it's not a guarantee of 80 FPS minimums in all games.
It could also be argued that these CPUs are better to be paired with a slower GPU for more consistent performance if you aren't using a VRR display, since the CPU determines minimum framerates and the GPU largely determines maximum/average framerates.
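The point above about the CPU setting the frame-rate floor can be sketched as a toy model: each frame takes roughly as long as the slower of the CPU and GPU stages. This assumes no pipelining overlap, and all the millisecond figures are made up for illustration.

```python
# Toy bottleneck model: frame time ~= max(CPU stage, GPU stage).
# GPU utilization in this model is gpu_ms / max(cpu_ms, gpu_ms).

def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

# A fast GPU at low resolution: the CPU is the wall.
print(fps(cpu_ms=12.5, gpu_ms=6.0))   # CPU-bound: 80 fps, GPU only ~48% busy
print(fps(cpu_ms=12.5, gpu_ms=11.0))  # slower GPU: still 80 fps, GPU ~88% busy
# A faster CPU with the same fast GPU raises the ceiling:
print(fps(cpu_ms=8.0, gpu_ms=6.0))    # 125 fps
```

This is why pairing these CPUs with a slower GPU (or a higher resolution, which raises gpu_ms) makes utilization look healthier without changing the CPU-set floor.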
 
Now I know that the argument many people make is "but the framerate is good" - well it may be, but it could be considerably better with a faster CPU that can fully utilize those GPUs.
80 FPS might be fine if you only intend to game V-Synced on a 60Hz display, but with a 144/165Hz+ VRR monitor, higher framerates are always better, up to the maximum refresh rate.
And that's 80 FPS in GTA V - it's not a guarantee of 80 FPS minimums in all games.
It could also be argued that these CPUs are better to be paired with a slower GPU for more consistent performance if you aren't using a VRR display, since the CPU determines minimum framerates and the GPU largely determines maximum/average framerates.

Yeah, this is the crux of the matter. It's that little fear that you'll encounter a game that runs especially poorly on your system due to a CPU bottleneck. I am on the fence about this myself. I want to be able to play team games in StarCraft 2 without lagging, but I think that's not going to be possible until 10 years from now considering the rate of IPC improvements. I remember years ago playing that on a Q6600. I would take my hands off the mouse and keyboard and say a Hail Mary before each fight.

Jay is running at 1080p which is the primary factor in the low GPU utilization. Let's not forget that he's running SLI also. This puts his GPU budget way past most people's budgets (even if it were two 1070s). I think that the Ryzen 7 works best with a single card and/or at high resolutions. I have a single GTX 1080 and most AAA games run between 60-110 fps at 1440p at my settings with the card maxed out.

There are a couple of issues that I don't know the answers to, such as whether these frame rate issues will be resolved through developer optimization (AMD says there is nothing architecturally wrong with Ryzen that is causing these low fps) or whether the Windows scheduler is to blame. Anyway, I would like to know more about these fps issues before I make a decision on a new CPU.
 
I'm a little lost right now.

What are now the differences between the three 8 core models in real action? Can all three versions be overclocked to the same levels?
 
Do newer emulators take advantage of the additional cores, or at least is there potential to? Like Cemu and RPCS3, will they benefit from Ryzen in the coming years?

I'm not too sure about Cemu, but it looks like RPCS3 is capable of taking advantage of 4+ threads. I've seen a video where it pushed an i7-6700K above 80% usage, and the Demon's Souls demonstration was performed on a system equipped with a 5820K. Here's more about it: Demon's Souls: Nearly Playable?! and RPCS3 v0.0.2
Performance. The game is very demanding of resources, and while Ryzen @ 4 ghz or 6 cores of Haswell-E can run some areas of the game at 30 fps, it can also drop down to as little as 10 fps in big open areas such as in front of The Boletarian Castle. One really wants a stable 30 fps (native frame rate, maybe someone will mod it to 60 fps), and preferably also on “normal” CPU’s. But there are both short and long term plans to improve performance, for example by making the LLVM recompiler not crash in this game.

Soul Calibur 4 (i7 6700K)
 

ethomaz

Banned
What about the frametimes? Some HW sites say the frametimes FEEL better and the TPU graphs do look kinda decent.
About Win7 you say?

I did not see any frametime tests, but as posted before, Win7 seems not to use SMT in some cases, and disabling SMT on Win10 increases framerates.
 
I'm a little lost right now.

What are now the differences between the three 8 core models in real action? Can all three versions be overclocked to the same levels?
The 1700 has a TDP rating of 65W; the other two are rated for 95W. That's basically the only real difference, since a 1700 overclocked to 4GHz will perform about the same as an 1800X overclocked to 4GHz.
 

~Cross~

Member
I'm a little lost right now.

What are now the differences between the three 8 core models in real action? Can all three versions be overclocked to the same levels?

For the end user you are better off getting a 1700 and overclocking it to near-1800X levels. You are getting 3.8-3.9GHz at most unless you hit the silicon jackpot.

Getting a 1800x just means you are paying a whole lot for better binning mostly.
 

Datschge

Member
I'm a little lost right now.

What are now the differences between the three 8 core models in real action? Can all three versions be overclocked to the same levels?
Without OC the base, XFR, single core and all core turbo frequencies are fixed at different values. With OC nothing of that matters as it's all disabled and the only potential/likely difference between the three is binning.

Perfect load imbalancing it seems. /s
 
The 1700 has a TDP rating of 65W; the other two are rated for 95W. That's basically the only real difference, since a 1700 overclocked to 4GHz will perform about the same as an 1800X overclocked to 4GHz.

They are binned though, even if the difference is small.

Someone posted the stats on another forum. Basically, a small percentage of 1800Xs can get to 4.1GHz. A decent amount will get to 4.0GHz.

You can expect 3.9-4.0GHz out of the 1700X, and about 100MHz less from the 1700.
 

Thraktor

Member
·feist·;231904307 said:



Win 7 & Win 10, Frametimes + Bell curve —— RYZEN 1700X REVIEW - 4 days worth of testing - ft. i7 7700K [MindBlank Tech]
https://www.youtube.com/watch?v=Nfv5aF_GfWg

Yet more evidence of bimodal frame time distributions on Ryzen. (Although I'm in a cafe at the moment so can't listen to his discussion, so I'm not sure what, if anything, he's said about it).

I'm really interested in doing a deeper dive into what's causing this, but it's not really possible to infer much more than I already have from simple histograms, and I don't have any Ryzen hardware myself to test with.

Is anyone who has a Ryzen CPU and a sufficiently powerful GPU (i.e. GTX1070 or greater) willing to help me with this? What I need are frametime outputs (i.e. the raw output files) from FCAT or OCAT/PresentMon for relatively recent games which are run in a CPU-limited scenario (i.e. powerful GPU at low resolution). The more games the better, but Crysis 3, Deus Ex MD, GTA V and Rise of The Tomb Raider have clearly visible bimodal distributions from the graphs I've already seen.

Mainly I'd be looking for Windows 10 tests with all 8 cores and SMT active, although if anyone could test on Windows 7, and/or with 4 cores disabled or SMT disabled that would be very helpful to narrow down the precise cause of the problem.

What I'd be able to do with the data is narrow down precisely when this "bug" which causes the bimodal distribution happens, how often it happens, precisely what impact it has on performance, and potentially even how AMD and Microsoft could solve the problem.
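For anyone capturing data for this, here is a minimal sketch that buckets a PresentMon/OCAT capture into a frametime histogram; a bimodal run shows up as two separate peaks. It assumes the CSV has a `MsBetweenPresents` column (standard in PresentMon output, but check your version), and the bucket width is an arbitrary choice.

```python
import csv
from collections import Counter

def frametime_histogram(path, bucket_ms=1.0):
    """Count MsBetweenPresents values into bucket_ms-wide bins."""
    hist = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            ms = float(row["MsBetweenPresents"])
            hist[round(ms / bucket_ms) * bucket_ms] += 1
    return hist

# Example: crude ASCII view of the distribution (hypothetical path).
# for bucket, count in sorted(frametime_histogram("capture.csv").items()):
#     print(f"{bucket:6.1f} ms  {'#' * count}")
```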
 

Steel

Banned
I'm a little lost right now.

What are now the differences between the three 8 core models in real action? Can all three versions be overclocked to the same levels?

Yeah, they can all be overclocked to about the same level. The X's have a bit more headroom with XFR, but not much.
 

shark sandwich

tenuously links anime, pedophile and incels
Got it up and running and updated. All stock, with updated bios to 504, no memory tweaks yet. Idle is at 55-60, and load under Cinebench was 75. Using a single fan Noctua NH U9S. Trying memory tweaks now...

And I couldn't get it to post beyond 2133... Ram is G.Skill TridentZ 3200 32GB CL14 2x16gb. It's not even on this list: https://www.asus.com/Motherboards/PRIME-B350M-A/HelpDesk_QVL/

Guess I'll patiently wait for bios upgrades.
Sorry to hear. Sounds like this is an extremely common problem so hopefully Asus will respond quickly.
 
Übermatik;231876565 said:
Nothing really, but that also means nothing bad, haha. MSI have really caught up with Gigabyte, I hear. Plus their Mobo design is less, uh, garish.

Yeah, the design really drew me to it. It's nice and neat. Plus, it has their RAM boost, so I'm hoping it can run 3200MHz. The product page had a performance graph showing 3200 working with Ryzen.
 
Yeah, the design really drew me to it. It's nice and neat. Plus, it has their RAM boost, so I'm hoping it can run 3200MHz. The product page had a performance graph showing 3200 working with Ryzen.

Don't all/most of the boards support DDR4-3200? But yeah, the design is tidy, as is the ASRock Pro4's... I think right now it's a tie between those two for me. Need to decide!
Also, on that topic, what's the state with Ryzen and lower-frequency RAM? Am I best buying 3200MHz?
 
Übermatik;231876565 said:
Nothing really, but that also means nothing bad, haha. MSI have really caught up with Gigabyte, I hear. Plus their Mobo design is less, uh, garish.

By sheer coincidence, I used to have only Gigabyte motherboards but recently had upgraded and replaced them all with MSI. I have nothing but good things to say about my MSI motherboards. I really like Click BIOS for anyone who cares about their BIOS screens.
 

Paragon

Member
Übermatik;231928983 said:
Don't all/most of the boards support DDR4-3200? But yeah, the design is tidy, as is the ASRock Pro4's... I think right now it's a tie between those two for me. Need to decide!
Also, on that topic, what's the state with Ryzen and lower-frequency RAM? Am I best buying 3200MHz?
DDR4-3200 is an overclock. Maximum officially supported speed is DDR4-2667.
The issue is that the AM4 platform does not offer control over sub-timings, the timings used are very aggressive, and you have to use a 1T command rate.
DDR4-3200+ is possible, but not with all memory sticks - even if they're rated for 3200+.

You're also going to need a board which allows you to adjust the base clock. Right now, I think that's just the ASUS Crosshair VI, Gigabyte Gaming K7, and ASRock X370 Taichi. Unlike Intel boards, X370 requires an external clock generator to change the base clock.
That's why G.Skill have announced memory kits specifically for Ryzen: https://www.gskill.com/en/press/vie...s-and-fortis-series-ddr4-memory-for-amd-ryzen
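As a back-of-the-envelope illustration of why base-clock control matters: the effective memory speed is the base clock times the memory ratio, so raising BCLK pushes whatever strap you're on up proportionally. The figures below are hypothetical, and `ddr_speed` is my own helper, not a real tool.

```python
# Effective DDR4 speed = base clock (MHz) x memory ratio.
# Stock BCLK on AM4 is 100 MHz; the "ratio" here is the conventional
# DDR multiplier naming (the double data rate is folded into it).

def ddr_speed(bclk_mhz, ratio):
    return bclk_mhz * ratio

print(ddr_speed(100, 32))     # stock BCLK on the 3200 strap -> 3200
print(ddr_speed(106.6, 32))   # raised BCLK pushes the same strap to ~3411
print(ddr_speed(100, 26.66))  # the 2666 strap at stock BCLK -> ~2666
```

Boards without an external clock generator are stuck at the fixed straps, which is why the ability to nudge BCLK is what unlocks speeds between (or above) them.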
 
DDR4-3200 is an overclock. Maximum officially supported speed is DDR4-2667.
The issue is that the AM4 platform does not offer control over sub-timings, the timings used are very aggressive, and you have to use a 1T command rate.
DDR4-3200+ is possible, but not with all memory sticks - even if they're rated for 3200+.

You're also going to need a board which allows you to adjust the base clock. Right now, I think that's just the ASUS Crosshair VI, Gigabyte Gaming K7, and ASRock X370 Taichi. Unlike Intel boards, X370 requires an external clock generator to change the base clock.
That's why G.Skill have announced memory kits specifically for Ryzen: https://www.gskill.com/en/press/vie...s-and-fortis-series-ddr4-memory-for-amd-ryzen
The MSI Carbon is supposed to allow you to adjust it as well, which is why I'm waiting impatiently for a review.
 

Weevilone

Member
What about the frametimes? Some HW sites say the frametimes FEEL better and the TPU graphs do look kinda decent.

You can see from the Tech Report review that's not true.

It spends considerably more time above target frametimes than the 7700K, but "feels better"... OK.

If I didn't have another workstation, and I needed a work box first and gaming was secondary then I'd go Ryzen (after they get it fixed up). But the "feels better" stuff is just people reaching for something they want to be there.
 





Session of mostly light usage on stock 1800X. All CPU/mobo power saving features still on.
Win 10 Pro still in Balanced Profile (have noticed *seemingly* odd thread activity with Balanced as well as Performance) which leads to core parking of 4-7, while 0-3 remain active with SMT being used. In moderate loads under Win 10 Balanced Profile the CPU largely behaves as 4c/8t, occasionally going into 5c, 5c/10t and 6c/12t even with several concurrent apps/tasks.

Normally it only un-parks the remaining cores, entering into 7c/14t and 8c/16t, under "less moderate" loads (less than 50%), or for software which explicitly generates several threads. And yes, I'm aware certain tasks continue to be distributed even if a core appears to be at or near 0%.

RAM is CL14 3200MHz run at 2666MHz with relatively loose timings. Would easily do 3600MHz+.
 