Man, this guy here doesn't understand the distinction between GDDR and DDR. What a silly console pleb.
Seriously guys, building a PC is not hard at all. Why are some of you so smug about it?
Everybody should read this so threads like this stop popping up.
The hype on this has gotten out of hand. 8 GB in a console is the big deal. Some of you are acting like Sony invented a new type of RAM that you'll eventually be able to use on a home PC if you're lucky and you beg Kaz.
My favorite part of new console launches is all the console folks start trying to talk tech when they haven't the slightest idea what they're talking about.
IT HAS A 5 ON THE END! 5 IS BIGGER THEN 3!
HOW CAN I ADD MORE FLOPS TO MY GPU?
I guess people are just afraid of technology or something. I genuinely believe that someone's grandmother could spend a couple of days on Google and figure it out. Everyone is always impressed I can do this. And I always ask them if they can build with Legos, because that is how ridiculously easy it is.
CPUs and GPUs have different RAM requirements.
CPUs want RAM with low latency, so they can very quickly access and move small chunks of data around.
GPUs want RAM with high bandwidth, so they can move large chunks of data.
DDR3 is suited for CPUs. It is low latency, but also low bandwidth. It is the de facto RAM found in PCs and servers. Spend $10,000 on a server, and it will use DDR3.
GDDR5 is suited for GPUs. It is high latency, but also high bandwidth. Graphics cards above entry level will use GDDR5 for VRAM.
The Xbox 360 was the pioneer for using GDDR (in its case, GDDR3) for both system RAM and VRAM. The PS4 is following suit. While this might be fine for dedicated gaming machines, for general purpose computing and CPU-intensive work you want low latency RAM, which is currently DDR3.
There is a reason the next Xbox has gone for the DDR3 + EDRAM approach. MS have designed the console for more than games. The non-gaming apps want DDR3. The EDRAM is there to mitigate the low-bandwidth main RAM to a certain degree. Sony seem to have designed the PS4 as a purebred gaming console. Different priorities resulted in different RAM architectures.
TL;DR: you don't want GDDR5 as system RAM in a PC. When DDR5 finally comes to market, it might have the best of both worlds: low latency for CPUs and high bandwidth for GPUs. Only then would you want it as system RAM.
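The latency-versus-bandwidth tradeoff above can be sketched with a toy transfer-time model. The numbers below are illustrative placeholders, not real DDR3/GDDR5 timings, but they show why small CPU-style accesses favor low latency while big GPU-style streams favor raw bandwidth:

```python
# Toy model: time to service a memory request = fixed access latency + size / bandwidth.
# All figures below are made up for illustration, not real DDR3/GDDR5 specs.

def transfer_time_ns(size_bytes, latency_ns, bandwidth_gbps):
    """Nanoseconds to fetch size_bytes from a memory with the given
    access latency (ns) and sustained bandwidth (GB/s). GB/s == bytes/ns."""
    return latency_ns + size_bytes / bandwidth_gbps

# Hypothetical "DDR3-like" pool: lower latency, lower bandwidth.
ddr3_like = dict(latency_ns=50, bandwidth_gbps=25)
# Hypothetical "GDDR5-like" pool: higher latency, higher bandwidth.
gddr5_like = dict(latency_ns=120, bandwidth_gbps=176)

# CPU-style access: a single 64-byte cache line -> latency dominates.
print(transfer_time_ns(64, **ddr3_like))    # ~52.6 ns
print(transfer_time_ns(64, **gddr5_like))   # ~120.4 ns (the GDDR5-like pool loses)

# GPU-style access: streaming an 8 MB texture -> bandwidth dominates.
print(transfer_time_ns(8 * 2**20, **ddr3_like) / 1e3)   # ~335.6 us
print(transfer_time_ns(8 * 2**20, **gddr5_like) / 1e3)  # ~47.8 us (the GDDR5-like pool wins)
```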
Not sure what both worlds you mean. High latency isn't a "plus" for gaming, it's just less of a minus than it is for PCs.
I think their attitudes are far more annoying than these threads. Thanks to the posters trying to actually educate people on the differences rather than making snide remarks. I actually learned stuff today.
Great explanation. However, I believe you meant to say "DDR4" instead of "DDR5" in your tl;dr. DDR4 just recently got wrapped up as a spec. Work hasn't begun on DDR5.
Some other things that I think are important to note:
1) The 3 and the 5 are the version numbers, but for separate things. DDR5 is not a thing yet (they're still working on DDR4 which should start releasing this year or next). It's very important that you have the "G" on there (which stands for graphics). It pains me when people see GDDR5 and DDR3 and think one is the obviously superior version. They are two separate products (imagine if the X360 was named Xbox 2. This is similar to someone saying PS3 > XB2, even though they're two separate product lines).
2) GDDR5 is actually based on DDR3 (as was GDDR4). They're basically two sides of the same coin. DDR3 is focused on low-latency, but with the tradeoff of lower-bandwidth, and GDDR5 has higher bandwidth, at the cost of higher-latency.
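As a back-of-the-envelope check on the bandwidth side: peak memory bandwidth is just bus width times effective data rate. Plugging in the PS4's announced figures (a 256-bit GDDR5 bus at an effective 5.5 GT/s) reproduces its widely quoted 176 GB/s, versus about 25.6 GB/s for a typical dual-channel DDR3-1600 PC:

```python
def peak_bandwidth_gbps(bus_width_bits, effective_rate_gtps):
    """Peak bandwidth in GB/s: (bus width in bytes) x (effective transfers per ns)."""
    return (bus_width_bits / 8) * effective_rate_gtps

# PS4: 256-bit GDDR5 bus at an effective 5.5 GT/s -> the quoted 176 GB/s.
print(peak_bandwidth_gbps(256, 5.5))   # 176.0

# Typical dual-channel DDR3-1600 PC: 2 x 64-bit channels at 1.6 GT/s -> 25.6 GB/s.
print(peak_bandwidth_gbps(128, 1.6))   # 25.6
```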
You need the gigas.

Nah, we have Teras these days, teras!
Oh yeah, I'm sure it was an actual achievement back in the day. I have a huge amount of respect for the elders back in '94 who did shit like teach themselves C without any kind of IDE or anything.

Yeah... back when I first started to build computers we actually had to try to translate some of that Engrish in the motherboard manual to make sure we didn't set the voltage jumper wrong and thus cause a completely melted mobo... no overheat protection... old school.
Additionally, when I build PCs for friends and family, I tell them I'll waive the fee as long as they help with its construction. Most are amazed with how "square peg only fits in square hole" it all is. Usually the only problem any of them have is setting up the case switches.
I'm not sure why PC people are being smug and snarky, though. Sharing knowledge is never a bad thing, but lording it over other people surely is!
Let me tell you guys and gals something.
You want to know why people are hyping the PS4's 8GB of GDDR5 RAM? Because PC GAF and all the experts said it was impossible. Nobody said anything about Durango's rumored 8GB of DDR3. That, magically, was fine and dandy. But any talk of Sony using 8GB was like cursing at Jesus. What did you expect was going to happen after being told over and over and over and over x10 that 8GB of GDDR5 RAM could not be done, and suddenly... we get it? So everybody who's annoyed at this bragging about GDDR5: it's your own fault.
The real question is, can I code the GDDR5 to the metal?

Hard to get through all the silicon.
lol
I'm not sure if OP is serious or not. I think I've been using GDDR5 for 3-4 years now...
Well, as 8 GB of DDR3 is normal in a PC, they wouldn't; 8 GB of GDDR5 is unheard of and very expensive.
I think the big deal is not only bandwidth (256- and 384-bit interfaces are expensive) but the amount as well.
I highly doubt you even have a 4GB card with GDDR5, and if you did, the game didn't use it.
NEW
GDDR5 RAM
PUT IT IN YOUR BUTT AND GO
They think it's like a Kingston RAM stick you can pick up from Staples, I guess.
Amazing.
So is DDR4 used in PCs? I think I have DDR3 in mine... this is confusing.
Well I mean, they were hardly wrong. 8GB of GDDR5 RAM was totally unexpected and it's hard to fault anyone for saying as much. 8GB of DDR3 is middle of the road shit. It'll be great to finally have that much RAM in consoles, but it's definitely not surprising or anything.
Um
What do people think GDDR5 ram is?
Exactly the reason people said it wouldn't happen. Jack said it yesterday... "I sure hope it's not going to cost $599 for PS4"... well guess what, it probably will.
Did I? Blimey. I have no idea how, when I've not really commented on it before today.
These should become mandatory reading for those people who don't know how computers work.
And that is fine, but what we need to know is whether one or the other is better for gaming. Taking into account the PS4 architecture: would the GDDR5 bandwidth outweigh the higher latency, or is the lower latency of DDR3 more important than higher bandwidth?

The large pool of GDDR5 will be great for the GPU, but because of the high latency the CPU could potentially be starved of data, since it would have to wait more clock cycles to get data from memory.
Yes, I know. Still, people are bursting with joy because GAF said GDDR5 was impossible. And we actually get it, and people are excited, and now everyone's frustrated and annoyed. Get out of here with that shit. You created this.