Would you buy DDR5 for system RAM on PC if you could?

Let me tell you guys and gals something.

You want to know why people are hyping the PS4's 8GB of GDDR5 RAM? Because PC GAF and all the experts said it was impossible. Nobody said anything about Durango's rumored 8GB of DDR3; that, magically, was fine and dandy. But any talk of Sony using 8GB was like cursing at Jesus. What did you expect was going to happen after being told over and over and over that 8GB of GDDR5 RAM could not be done, and then suddenly... we get it? So everybody who's annoyed at this bragging about GDDR5: it's your own fault.

It's not that it can't be done. It's just really expensive and will take up a ton of room on the board. They're going to have to fit 16 chips on there in addition to everything else; that's why they're not showing any hardware. I feel bad for whoever is in charge of the thermal design for that thing.

So you're pretty much guaranteed to get a large, hot, expensive system. That's not very exciting.
 
CPUs and GPUs have different RAM requirements.

CPUs want RAM with low latency, so they can very quickly access and move small chunks of data around.
GPUs want RAM with high bandwidth, so they can move large chunks of data.

DDR3 is suited for CPUs. It is low latency, but also low bandwidth. It is the de facto RAM found in PCs and servers; spend $10,000 on a server and it will use DDR3.

GDDR5 is suited for GPUs. It is high latency, but also high bandwidth. Graphics cards above entry level will use GDDR5 for VRAM.

The Xbox 360 pioneered using GDDR (in its case, GDDR3) for both system RAM and VRAM. The PS4 is following suit. While this might be fine for dedicated gaming machines, for general-purpose computing and CPU-intensive work you want low latency RAM, which currently means DDR3.

There is a reason the next Xbox has gone for the DDR3 + EDRAM approach. MS have designed the console for more than games, and the non-gaming apps want DDR3; the EDRAM is there to mitigate the low-bandwidth main RAM to a certain degree. Sony seem to have designed the PS4 as a purebred gaming console. Different priorities resulted in different RAM architectures.

TL;DR: you don't want GDDR5 as system RAM in a PC. When DDR5 finally comes to market, it might have the best of both worlds: low latency for CPUs and high bandwidth for GPUs. Only then would you want it as system RAM.
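To put rough numbers on that bandwidth gap, here's a back-of-envelope sketch. The transfer rates and bus widths below are my own ballpark assumptions for typical 2013-era parts, not official specs:

```python
# Peak bandwidth = transfers per second x bus width in bytes.
# All figures below are illustrative assumptions, not official specs.

def peak_bandwidth_gbs(mega_transfers_per_sec, bus_width_bits):
    """Peak bandwidth in GB/s for a memory interface."""
    return mega_transfers_per_sec * (bus_width_bits / 8) / 1000

# Typical dual-channel DDR3-1600 system RAM: 128-bit effective bus
ddr3 = peak_bandwidth_gbs(1600, 128)   # -> 25.6 GB/s

# GDDR5 at 5500 MT/s on an assumed 256-bit bus
gddr5 = peak_bandwidth_gbs(5500, 256)  # -> 176.0 GB/s

print(f"DDR3-1600 dual channel: {ddr3:.1f} GB/s")
print(f"GDDR5 on a 256-bit bus: {gddr5:.1f} GB/s")
```

Roughly a 7x gap in peak bandwidth, which is why GPUs pay the latency and cost penalty for GDDR5.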

Very well explained post. This probably explains exactly why MS is going with DDR3.
 
For faster read/write for general apps on my PC, sure. But for gaming, games pull from the graphics RAM; that's where the textures and resolution data are stored. Still, having zero speed disparity between general RAM and graphics RAM would be great. PCs suffer from the issue that every form of memory has such different read/write speeds that devs have to generalize a bit. SSDs, for example, still have such low market share; the hybrids are high-cache models: better, but not in the same realm.
 
It has become some sort of mantra for console folks; it won't be long until someone claims those 8GB could cure cancer.
I just wanted to point out that the only computer technology proven to cure cancer and rescue kittens from trees is 120Hz displays.

That is all.
 
Did I? Blimey. I have no idea how when I've not really commented on it before today.

I'm interested to see what they do myself.

No, not specifically you. All the people who said it couldn't be done. This new messiah status of GDDR5 is because of that. It's funny because they made it out to be some type of holy grail chip that couldn't be put into a console, and now you have some saying it's not that special. Ha fucking ha. We got the holy grail; now deal with fanboys and people who know nothing about how RAM works singing to the heavens about it.

How do y'all like what you started? Enjoy.
 
I don't think they did. They just said 'fast RAM' IIRC. Although they did also mention the speed - but keep in mind they were talking to developers too.

"Supercharged PC architecture" is a bit misleading in its nature.

Kung Fu Grip said:
It's funny cause they made it out to be some type of holy grail chip that couldn't be put into a console.

I was one of those people that said it was highly unlikely you'd see that level of GDDR in a console. So were folks around the web like Digital Foundry, etc.
The issue wasn't that it was impossible, just that it was unlikely: it would be extremely expensive and would greatly complicate the motherboard (which also impacts future cost reduction, btw). Which it is, and it will. It was also contingent on memory densities increasing in time to mass-produce the box, which was doubtful even as of last year. However, that seems to have happened; otherwise we wouldn't be seeing this 16-chip clamshell.

Crow eating? Sure, absolutely. But just like the BD-ROM in the PS3, it ain't going to be cheap for Sony to do what they did.
 
And that is fine, but what we need to know is which one is better for gaming.

Taking into account the PS4 architecture: would the GDDR5 bandwidth outweigh the higher latency, or is lower latency of DDR3 more important than higher bandwidth?
Uh, it really depends on what kind of operations you're doing with them.

If you are dealing with small amounts of data that need to be processed FAST, then you are going to prefer low latency; on the other hand, if moving data in bulk is what matters, you are going to prefer more bandwidth.
Let's say you are handling instructions that don't come even close to saturating your memory bandwidth; then DDR3 would probably outperform GDDR5. And vice versa.
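A toy model makes that crossover concrete: total access time is roughly fixed latency plus size divided by bandwidth. The latency and bandwidth figures below are illustrative assumptions, not measured specs:

```python
# Toy model: access time = fixed latency + transfer time.
# Since 1 GB/s == 1 byte/ns, size/bandwidth comes out directly in ns.
# All latency/bandwidth numbers are illustrative assumptions only.

def access_time_ns(size_bytes, latency_ns, bandwidth_gbs):
    """Approximate time to fetch size_bytes from memory, in nanoseconds."""
    return latency_ns + size_bytes / bandwidth_gbs

# "DDR3-ish": low latency, low bandwidth; "GDDR5-ish": the opposite.
for size in (64, 1_000_000):  # one cache line vs. a 1MB texture chunk
    ddr3  = access_time_ns(size, latency_ns=50,  bandwidth_gbs=25.6)
    gddr5 = access_time_ns(size, latency_ns=100, bandwidth_gbs=176.0)
    print(f"{size:>9} bytes  DDR3-ish: {ddr3:10.1f} ns  GDDR5-ish: {gddr5:10.1f} ns")
```

For the 64-byte fetch the low-latency pool wins; for the 1MB bulk copy the high-bandwidth pool wins by a wide margin, which is the whole CPU-vs-GPU split in one loop.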

"Supercharged PC architecture" is a bit misleading in its nature.
Let's just say it's marketing buzzwords.
 
GPUs will get more added on as needed. It's a great move for Sony, who struggled hard with RAM last gen, but not really worth worrying about for PCs. Right now precious little even scrapes the mid-tier cards.
 
I was one of those people that said it was highly unlikely you'd see that level of GDDR in a console. So were folks around the web like Digital Foundry, etc.
The issue wasn't that it was impossible, just that it was unlikely: it would be extremely expensive and would greatly complicate the motherboard (which also impacts future cost reduction, btw). Which it is, and it will. It was also contingent on memory densities increasing in time to mass-produce the box, which was doubtful even as of last year. However, that seems to have happened; otherwise we wouldn't be seeing this 16-chip clamshell.

Crow eating? Sure, absolutely. But just like the BD-ROM in the PS3, it ain't going to be cheap for Sony to do what they did.

I understand all that. I'm saying that is the exact reason why these threads about GDDR5 are popping up. People were told "No" repeatedly and it still happened. The excitement is expected.
 
Yeah, I'm stumped on this as well.

Because DDR3 and its lower latency are more suited to operations that involve the CPU and its data. GDDR essentially serves as a gimp to the CPU rather than a boon. However, it's a huge boon to the graphics core, because graphics cores like big fat bandwidth and care less about latency. The PS4 is obviously more focused on its graphics core than on what its CPU can do.
 
As an addendum to my prior post, I was referring to the clock speeds being the same, as I would assume that GDDR5 and DDR5 would have almost the same speeds.

But the CAS latency would be vastly different in that situation. Then again, is the CAS latency between PC DDR3 and GDDR5 really that different?
 
We got the holy grail, now deal with fanboys and people who know nothing about how RAM works singing to the heavens about it.

Aaaand comments like this are exactly why people are laughing at you.

There's nothing wrong with not knowing how something works, but basking in your ignorance and using it as ammo for console warz is generally not something to be proud of.
 
Right real talk time.

GDDR5 is typically utilized as graphics memory, correct? The PS4, much like the Xbox 360, has opted to use its 8GB of GDDR5 for both system and graphics processing.

What downsides does GDDR5 have for system usage, with it being tailor-made for graphics?

On the PS3, Sony decided to go with 256MB of XDR RAM for the system and 256MB of GDDR3 for the VRAM. What was the reason for switching to a multi-functional GDDR5 pool?
 
It doesn't surprise me people are going nuts. We've been obsessed with RAM on GAF for a long time. Plus, all that meeting hype and how unexpected it was.
 
On the PS3, Sony decided to go with 256MB of XDR RAM for the system and 256MB of GDDR3 for the VRAM. What was the reason for switching to a multi-functional GDDR5 pool?
I would guess flexibility?
When you have a unified pool of RAM, it's completely up to you how to use the whole thing and how to partition data between CPU and GPU.
 
My favorite part of new console launches is that all the console folks start trying to talk tech when they haven't the slightest idea as to what they're talking about.

IT HAS A 5 ON THE END! 5 IS BIGGER THEN 3!

only in penis size...

Before the conference I'd never heard of it. I've got 8GB of DDR3 in my 3-year-old PC... so for all the pros in the know, what's the BEST RAM available to buy?

Well, you can't buy GDDR5 anyway.

I think people were surprised based on what was expected going into the "Playstation Meeting". Not sure why PC people are being smug and snarky though. Sharing knowledge is never a bad thing, but lording it over other people surely is!

People are trying to share knowledge and are being shut down by being called "smug PC elitists".

I've been gaming on PCs for over 20 years and I still think it's a pretty big deal, because, well, 8GB of unified GDDR5 RAM for system and GPU is kind of a big deal, no matter how hard you try to downplay it.

Yes, but the 360 did it as well, if I'm not mistaken. It's just a unified memory pool; let's see if it can deliver results.

Yes, I know. Still, people are bursting with joy because GAF said GDDR5 was impossible, and we actually got it. People are excited, and now everyone's frustrated and annoyed. Get out of here with that shit. You created this.

Nobody ever said that, and nobody created this. People are just surprised at the amount.

No, not specifically you. All the people who said it couldn't be done. This new messiah status of GDDR5 is because of that. It's funny because they made it out to be some type of holy grail chip that couldn't be put into a console, and now you have some saying it's not that special. Ha fucking ha. We got the holy grail; now deal with fanboys and people who know nothing about how RAM works singing to the heavens about it.

How do y'all like what you started? Enjoy.

Again, nobody said that. The leaks told us weeks ago that Sony was going with GDDR memory; how much of it we didn't know exactly, but lots of people guessed 4GB. So now everyone is surprised that they're getting double that. Also, a lot of people seem to think all that VRAM is for graphics only; it isn't. Nobody created annoying, shrill fanboys. They were already there, spouting nonsense.
 
Aaaand comments like this are exactly why people are laughing at you.

There's nothing wrong with not knowing how something works, but basking in your ignorance and using it as ammo for console warz is generally not something to be proud of.

Huh? I don't know what you mean. The point I'm making is that there's a reason people are going crazy with this GDDR5 talk.
 
I understand all that. I'm saying that is the exact reason why these threads about GDDR5 are popping up. People were told "No" repeatedly and it still happened. The excitement is expected.

It didn't help that the PS4's original docs said "2GB GDDR5 UMA" either lol.
There are many folks that saw those or were made aware of them quite a long ways back.

I can understand excitement, but I hope you realize WHY exactly many were trying to temper those expectations. Even DigitalFoundry's most recent spec analysis echoes my sentiment.

DF said:
The GDDR5 memory modules - the same used in PC graphics cards - are only available in certain configurations, with the densest option available offering 512MB per module. The startling reality is that unless Sony has somehow got access to a larger chip that isn't yet in mass production and that nobody knows about, it has crammed 16 memory modules onto its PS4 motherboard.

Early rumours suggested that GDDR5 availability could even limit PS4 to just 2GB of memory, with 4GB at one point looking rather optimistic. What changed at Sony and encouraged them to go all out with its final design is not clear, but the chances are it would have been well aware of the RAM advantage offered up by its upcoming Xbox competitor, which - certainly up to its beta hardware at least - features 8GB of more bandwidth-constrained DDR3. What shouldn't be understated is the amount of extra cash this is going to add to PlayStation 4's BOM (bill of materials) - this is an expensive, massive investment for the company.
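DF's module count is easy to sanity-check: at 512MB per module, an 8GB pool takes exactly 16 chips, hence the clamshell layout. A trivial sketch of the arithmetic:

```python
# Sanity check on DF's module math: densest available GDDR5 module
# is 512MB, so an 8GB pool needs 16 of them (mounted clamshell).
TOTAL_GB = 8
MODULE_MB = 512

modules = TOTAL_GB * 1024 // MODULE_MB
print(modules)  # -> 16
```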
 
Right real talk time.

GDDR5 is typically utilized as graphics memory, correct? The PS4, much like the Xbox 360, has opted to use its 8GB of GDDR5 for both system and graphics processing.

What downsides does GDDR5 have for system usage, with it being tailor-made for graphics?

On the PS3, Sony decided to go with 256MB of XDR RAM for the system and 256MB of GDDR3 for the VRAM. What was the reason for switching to a multi-functional GDDR5 pool?

You're comparing the PS3 RAM spec to the PS4, so you're kinda all over the place. But IIRC, the XDR had higher bandwidth (don't know about the latency) compared to the GDDR3 RAM. So that causes issues right there if you're using the whole memory pool for a single game; it has to be taken into account.

As for the PS4 RAM, I don't know anything about the latency, but it should be in the range of DDR3 RAM. Then again, DDR3 is such a wide spectrum to look at. So for the people bringing up the next Xbox, it's really hard to know without the facts.

But the question posed in the OP was: if I could put DDR5 in a PC, would I? Yes, though that would require an all-new generation of PC hardware. If it were available, say, tomorrow and I had the hardware and all, sure. Faster RAM, ideally with lower latency for general apps? Sign me up.
 
It doesn't surprise me people are going nuts. We've been obsessed with RAM on GAF for a long time. Plus, all that meeting hype and how unexpected it was.

RAM has been an obsession on the console side lately, largely because the PS3's RAM config and fairly bloated OS have caused real problems for it, especially in dragging down the tech or outright breaking some games, notably open-world titles.

For Sony this is probably quick and still relatively cheap future-proofing of their environment against that happening again, so you don't wind up with key studios like Bethesda clawing at the wall for every single meg of RAM. What happened with those giant clips and stutters in the Skyrim debacle on PS3 was the machine running out of memory and having to halt everything. Yes, Bethesda sucks for shipping the game like that, but Skyrim was probably a hell of a push for a box with 512MB as is, let alone one missing a fair chunk of it.

It's very easy to spot, especially with some experience. I have a friend I used to play WoW with way back in the day, close to launch, so like 2004-2005. He had a nice CPU, a nice GPU, and I think the bare minimum at best in RAM, and he got that same exact chop in any area with a lot of variety and interaction. :lol Flyin' into Ironforge...! ....chop...chop....chop
 
Huh? I don't know what you mean. The point I'm making is that there's a reason people are going crazy with this GDDR5 talk.

Being excited is fine, it's definitely something that nobody saw coming. But the people treating this as a "holy grail" and using that to declare the death of Microsoft and PC gaming deserve the mockery. Some of the comments in that PC Gamer $600 PC thread were embarrassing to read.
 
So, why is GDDR5 only used on graphics cards instead of replacing DDR3 as system RAM?

Cost.
There's no reason to have such expensive memory sitting by the CPU.
The GPU loves memory bandwidth, so you use a wider bus and blistering speeds, but you pay for it.

A unified memory architecture does NOT make sense for performance PCs.
 
It didn't help that the PS4's original docs said "2GB GDDR5 UMA" either lol.
There are many folks that saw those or were made aware of them quite a long ways back.

I can understand excitement, but I hope you realize WHY exactly many were trying to temper those expectations. Even DigitalFoundry's most recent spec analysis echoes my sentiment.

Yep. I completely understand why many said 8GB couldn't be done. It's because of those reasons, and being told the PS4 would never go past 4GB, that people are so happy it's actually using that much. That's why I don't get those who are annoyed at the sudden GDDR5 praise from everyone.
 
To answer the OP's question: I would pretty much have to, since RAM specs tend to follow CPU sockets. Same as how my 2500K only works with DDR3 and my E6850 only worked with DDR2.

I do suspect this isn't the question the OP was really asking, though.
 
PCs have split memory pools. GDDR5 is not cost-effective for main memory, because CPUs do not need that much bandwidth, while the GPUs on video cards do need it.

Neither Durango nor Orbis is a PC; they do not have split memory pools. You need to examine AMD APU systems, which do share the same pool. Of course, right now they all use DDR3, but you can still look at the latency vs. bandwidth tradeoffs.

[chart: PlatPerPerDollar.png]


We're thrilled, then, that the Trinity design's memory controller supports data rates up to DDR3-2400. Although 19.2 GB/s per channel is quite a ways off from the 288 GB/s you get from a Radeon HD 7970 GHz Edition's 3 GB of GDDR5, every little bit of bandwidth helps, particularly in games. And that's what we'll be testing today.
http://www.tomshardware.com/reviews/memory-bandwidth-scaling-trinity,3419-8.html

As DDR3 gets faster (1600 -> 2400), latencies increase (9-9-9-24 -> 10-12-11-30). It is apparent that APUs are bandwidth-starved and not affected all that much by latency. It would be great to see the same system with various GDDR5 configurations, but this is a low-end APU, so that means low-end memory.
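The 19.2 GB/s per channel figure in the Tom's Hardware quote falls straight out of the math: a standard 64-bit DDR3 channel moves 8 bytes per transfer. A quick sketch:

```python
# Per-channel DDR3 bandwidth: a 64-bit channel moves 8 bytes per transfer,
# so GB/s = (mega-transfers per second x 8 bytes) / 1000.
def channel_bandwidth_gbs(mega_transfers_per_sec, bytes_per_transfer=8):
    return mega_transfers_per_sec * bytes_per_transfer / 1000

print(channel_bandwidth_gbs(2400))  # DDR3-2400 -> 19.2 GB/s per channel
print(channel_bandwidth_gbs(1600))  # DDR3-1600 -> 12.8 GB/s per channel
```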
 
Cost.
There's no reason to have such expensive memory sitting by the CPU.
The GPU loves memory bandwidth, so you use a wider bus and blistering speeds, but you pay for it.

A unified memory architecture does NOT make sense for performance PCs.

It's not just cost. As others have said, DDR3 is lower latency than GDDR5, and that's important because the CPU has to wait for the latency cycle to end before it can read/write to memory again.
 
[chart: PlatPerPerDollar.png]



http://www.tomshardware.com/reviews/memory-bandwidth-scaling-trinity,3419-8.html

As DDR3 gets faster (1600 -> 2400), latencies increase (9-9-9-24 -> 10-12-11-30). It is apparent that APUs are bandwidth-starved and not affected all that much by latency. It would be great to see the same system with various GDDR5 configurations, but this is a low-end APU, so that means low-end memory.

That is not correct, AFAIK.

CAS increases as frequency does, right. But given the correct balance, latencies decrease, not increase.

DDR3-1066 CL7 should have a delay of about 13ns; DDR3-1600 CL9 just about 11ns, because even though it needs more cycles, those cycles are faster.
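That arithmetic in a quick sketch (DDR3's I/O clock is half the transfer rate, since it's double data rate, so CAS delay in ns is cycles divided by clock):

```python
# First-word CAS delay in ns: CL cycles divided by the memory I/O clock.
# For DDR3 the I/O clock is half the transfer rate (double data rate).
def cas_delay_ns(transfer_rate_mts, cas_cycles):
    io_clock_mhz = transfer_rate_mts / 2
    return cas_cycles / io_clock_mhz * 1000

print(round(cas_delay_ns(1066, 7), 1))  # DDR3-1066 CL7 -> ~13.1 ns
print(round(cas_delay_ns(1600, 9), 2))  # DDR3-1600 CL9 -> 11.25 ns
```

So the faster kit with the higher CL number really does respond sooner in absolute time.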
 
Yeah, APUs are bandwidth-starved. Their performance consistently increases with bandwidth increases. Of course, there is a limit to how much it improves; so far the maximum is bounded by the speed of DDR3, and for all we know it could keep scaling significantly with more bandwidth.

Cell was similar. It was a very bandwidth-starved device.
 
My PC CPU already has more bandwidth available to it than it requires in the vast majority of workloads. So no, using GDDR5 for it would not make any sense.


I was one of those people that said it was highly unlikely you'd see that level of GDDR in a console.
You were one of the people who continuously insisted that it would be unlikely to get 4GB of GDDR5 in a console, while ridiculing and questioning the qualifications of anyone who disagreed.
 
Yeah, APUs are bandwidth-starved. Their performance consistently increases with bandwidth increases. Of course, there is a limit to how much it improves; so far the maximum is bounded by the speed of DDR3, and for all we know it could keep scaling significantly with more bandwidth.

Cell was similar. It was a very bandwidth-starved device.

This is the one reason why I think putting a GDDR slot on the motherboard could be beneficial. Even a 1-2GB stick could go a long way toward improving performance on an AMD APU or Intel integrated graphics chip, while the standard CPU would still access the DDR RAM on the motherboard. It might also help with creating smaller-form-factor PCs with really solid graphics performance for systems that use integrated graphics. It could also expand the memory capacity of GPUs without having to buy a new card, if they were capable of recognizing the extra GDDR.
 
DDR4 is something that is only being demoed right now; it should launch with Haswell, or hopefully by this time next year. GDDR5 is best for graphics, but almost anything can see a benefit. Some on PC are already using even DDR3 modules to make RAM disks. Pretty cool.
 