Is there any chance Apple might switch to Ryzen if it's competitive?
Considering that Apple put Thunderbolt into all their products, it seems unlikely.
I bet that's why Intel backtracked and decided to keep Thunderbolt exclusive instead of opening it up and allowing other companies (e.g. AMD) to use it.
This is why I'm really concerned about those leaked benchmarks.
The memory performance seemed really poor, and the specs of the leaked motherboards so far are listing DDR4-2666 and DDR4-3000, compared to the latest Z270 boards which support DDR4-4266.
I never understand these benchmarks, because this site shows virtually no difference between 2133, 2666, and 3200:
http://techbuyersguru.com/gaming-ddr4-memory-2133-vs-26663200mhz-8gb-vs-16gb?page=1
And another (not the newest article):
http://www.anandtech.com/show/8959/...-3200-with-gskill-corsair-adata-and-crucial/7
There were also a couple of other articles showing not much of a performance difference between single- and dual-channel memory.
Many sites don't know how to do proper performance testing in games and often set up their tests to be GPU-limited or only look at average framerates.
When you care about minimum frame rates, memory latency and bandwidth become quite important, just like CPU performance does.
Anandtech's gaming tests are so bad they show practically no difference in 5 years of Intel CPUs.
Techspot and Digital Foundry generally do good performance testing for games.
I always wondered what is THE thing to look for in memory.
Some people say it's the MHz, some say latency, some say a combination of both.
So what is it?
Like in those Digital Foundry videos, they only mention MHz, nothing about latency.
I'm going to upgrade my PC soon, once we know what AMD can do, and I'm still not sure what memory to get.
3200MHz CL14 or something like 4000MHz CL19?
The 3200 CL14 combination sounds better than 4000 CL19 on paper, but is it actually?
Generally, the faster RAM's higher speed makes up for its slower CAS latency, while also providing more bandwidth.
There's a great chart on Wikipedia:
https://en.wikipedia.org/wiki/CAS_latency#Memory_timing_examples
DDR4-3200 CL14 has a 3200MT/s data rate and 8.75ns latency for the first word. By the fourth word, latency is at 9.69ns.
DDR4-4266 CL19 has a 4266MT/s data rate and 8.91ns latency for the first word. By the fourth word it has made up the difference due to its higher speed and has 9.61ns latency.
So it generally offers a 33% higher data rate and similar latency when speed is factored in.
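You can sanity-check these numbers yourself: first-word latency in nanoseconds is just the CAS latency divided by the memory clock, where the clock is half the MT/s data rate (DDR transfers twice per clock). Here's a quick sketch using the kits mentioned in this thread as examples:

```python
def first_word_latency_ns(data_rate_mts, cas_latency):
    """First-word latency: CL cycles at the memory clock (half the DDR data rate)."""
    clock_mhz = data_rate_mts / 2           # DDR transfers twice per clock cycle
    return cas_latency / clock_mhz * 1000   # cycles / MHz -> nanoseconds

for rate, cl in [(3200, 14), (4000, 19), (4266, 19)]:
    print(f"DDR4-{rate} CL{cl}: {first_word_latency_ns(rate, cl):.2f} ns")

# DDR4-3200 CL14: 8.75 ns
# DDR4-4000 CL19: 9.50 ns
# DDR4-4266 CL19: 8.91 ns
```

So by this measure, the 4000 CL19 kit from the question above actually has slightly worse first-word latency than 3200 CL14, though it still wins on bandwidth.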
So, in order to have DRM protected video playback with an AMD system, the only hope at this point is to have a video card that supports that?
UHD drive units have begun trickling into the market. I need to investigate the matter further, as it seems stupid not to be able to use them on the AMD platform.
In theory you should be able to drop in a compatible GPU, but NVIDIA's current GPUs apparently support PlayReady 3.0, while 4K Netflix only works with Kaby Lake iGPUs right now.