PS4 has 8 GB OF GDDR5 RAM

Considering the WiiU is using 1GB for its OS, and the PS4 has the impressive-sounding "instant on" (memory intensive) and the sharing stuff, I bet the OS is using more than 1GB
 
Considering the WiiU is using 1GB for its OS, and the PS4 has the impressive-sounding "instant on" (memory intensive) and the sharing stuff, I bet the OS is using over 2GB.

There's a thing called optimization. Just look at what the Vita can do and how much memory that is using.

OS footprint will be 1.5 GB or less.
 
Considering the WiiU is using 1GB for its OS, and the PS4 has the impressive-sounding "instant on" (memory intensive) and the sharing stuff, I bet the OS is using more than 1GB

WiiU? Sorry but Nintendo is way behind in everything including optimization.
 
Developers are loving the fact that PS4 has 8GB of GDDR5 RAM. Now I don't know much about tech, but considering that the people I look up to in the gaming industry love it, I choose to believe them more.

When all else fails, trust the professional, not the herd :p

Obviously you're new to GAF, but to even get an account on this site you have to be a senior game developer and/or engineer with 15+ years of experience in the gaming industry. If I were you, I'd trust the word of the average gaffer than some random dude on stage presenting the technology at the Sony conference.
 
But is that from Nintendo, or some pipeline rumor?

I honestly did not ask. It was a reply I glossed over. If I could find the thread, I would share it but I do not remember.

Considering the WiiU is using 1GB for its OS, and the PS4 has the impressive-sounding "instant on" (memory intensive) and the sharing stuff, I bet the OS is using more than 1GB

EDGE rumour had it pegged at 512MB with those alleged features.

I remember some dev here saying it was less than 1GB.

That is still very possible. It all depends on whether Sony are preparing to counter-attack MS's feature set. If they think they can't match MS's features, then it may go north of 512MB (which, as it is, is 10x the PS3's OS).
 
What could you expect from someone calling himself SPE after the CELL processor?

Actually, numb-nuts, my forum name is my old high-score name. It's my initials, which predate the Cell by a good 30+ years. Nice try, though.

And ad hominem attacks just show the weakness of your argument. You can't argue against the actual facts (because the facts contradict your incorrect point), so you attack the person.
 
Frankly, I don't think we're the people who will be playing this game. I think it's created mainly with kids in mind, same as Crash or Spyro were. It's directed by the dude who was responsible for Crash, Spyro, and a lot of the Sonic games past the first one.

And Jak & Daxter. And Uncharted. And Killzone 3. And...

Seriously, check out the dude's wiki page to see the sheer number of top tier games he's worked on. His softology is incredible.
 
Some more volumetric smoke from Knack:

[GIF: volumetric smoke in Knack]

Knack really makes me wish that Wonderful 101 was on PS4.
 
And Jak & Daxter. And Uncharted. And Killzone 3. And...

Seriously, check out the dude's wiki page to see the sheer number of top tier games he's worked on. His softology is incredible.

Dude is a legend, hands down. Insanely smart guy and uber nice, to boot.
 
Actually, numb-nuts, my forum name is my old high-score name. It's my initials, which predate the Cell by a good 30+ years. Nice try, though.

And ad hominem attacks just show the weakness of your argument. You can't argue against the actual facts (because the facts contradict your incorrect point), so you attack the person.

What a ridiculous assumption that was for him or her to make.
 
So I guess Sony's cost reduction plan is to switch the GDDR5 into stacked DDR3/4 at some point? Does anyone have any idea when that sort of thing might be ready?
 
I'm kinda hoping Knack plays like a cross between R&C and Rampage/War of the Monsters type games. Ageless appeal, imo.
 
So I guess Sony's cost reduction plan is to switch the GDDR5 into stacked DDR3/4 at some point? Does anyone have any idea when that sort of thing might be ready?

I'd imagine their cost reduction plan is to get GDDR5 for less than it currently costs as it becomes more established, older tech with more efficient production lines and higher yields.
 
I know little about computer tech, but honestly, you seem like one of the most level-headed people in this thread, and everyone just insulting you without having any way to counter your points reeks of fanboyism on their part.

Thank you. Very much appreciated.

Ok, then a GTX 680 with 4GB.

Single GPUs with that much VRAM are made for SLI setups. By themselves, they not only don't improve performance, but can actually lose some, because of the looser timings the IMC needs to handle it:

[benchmark charts: EVGA GTX 580 3GB review results at several resolutions]

http://www.hardwarecanucks.com/foru...ws/44390-evga-geforce-gtx-580-3gb-review.html

In strong SLI setups, you can run out of VRAM with the reference configuration at very high resolutions. That is the point of those offerings. On their own, they don't have enough horsepower.

Actually, numb-nuts, my forum name is my old high-score name. It's my initials, which predate the Cell by a good 30+ years. Nice try, though.

And ad hominem attacks just show the weakness of your argument. You can't argue against the actual facts (because the facts contradict your incorrect point), so you attack the person.

It's funny you say that about ad hominem, after it was me who had to introduce it in this thread. But, ok, let's take a second look at your arguments:

This post just reeks of a lack of understanding of how modern, game focused hardware works.

RAM is always a bottleneck in computing. Especially in fixed hardware platforms like consoles. Not just quantity of RAM, but bandwidth too.

Who cares most about RAM? Developers. If you have been following the progress of the PS4, it was initially designed with 2GB of RAM, and after pressure from devs, was upped to 4GB. Then recently, after further pressure from the likes of Epic and internal studios, it was increased again to 8GB.

And there are two big differences between the 8GB of RAM in the next Xbox and the 8GB of RAM in the PS4.

The Xbox is using slow DDR3 RAM, which is better suited for general-purpose computing. It has low bandwidth, meaning you have a bottleneck on how much data can be read/written to RAM each frame. The PS4 is using GDDR5, a much more expensive RAM type designed for graphics (the G in GDDR stands for graphics). It has blazing-fast bandwidth, meaning a fuckload of data can be transferred in/out of it per frame, which has serious benefits in rendering. The next Xbox does also have a small cache of fast EDRAM to mitigate its slow main RAM, but even that has a bandwidth far below the RAM in the PS4.
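The bandwidth gap being described is easy to sanity-check with a back-of-the-envelope calculation (a sketch only; the bus widths and transfer rates below are the rumoured figures floating around this thread, not confirmed specs):

```python
# Back-of-the-envelope peak memory bandwidth: bytes per transfer x effective
# transfer rate. All figures are rumoured, unconfirmed specs.

def peak_bandwidth_gbs(bus_width_bits: int, transfer_rate_gts: float) -> float:
    """Theoretical peak bandwidth in GB/s for a given bus width and rate."""
    return (bus_width_bits / 8) * transfer_rate_gts

# Rumoured PS4: 256-bit bus, GDDR5 at 5.5 GT/s effective
ps4_gbs = peak_bandwidth_gbs(256, 5.5)       # 176.0 GB/s
# Rumoured next Xbox: 256-bit bus, DDR3-2133 at 2.133 GT/s effective
xbox_gbs = peak_bandwidth_gbs(256, 2.133)    # ~68.3 GB/s

print(f"GDDR5: {ps4_gbs:.1f} GB/s vs DDR3: {xbox_gbs:.1f} GB/s")
```

Same bus width, roughly 2.6x the peak bandwidth, purely from the memory type's effective transfer rate.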

The other point is the Xbox is reserving 3GB for the OS, so only 5GB is available for games. The PS4 is rumoured to reserve 0.5GB, so 7.5GB of RAM is available for games.

Whatever way you look at it, the 8GB of unified GDDR5 is a seriously impressive spec to have, and a massive improvement over what is rumoured to be in the Xbox.

So, Xbox bad and slow, PS4 expensive and fast. You made your points very clearly, pitched more at the subconscious level than at where they were supposedly aimed.

Again, you're wrong at a technical level.

There is no point having a beast of a GPU if you don't have the quality and type of RAM to feed it. This has easily provable real-world metrics. Look at the range of AMD cards and their various rendering scores. You see better results with the same card moving from DDR3 to GDDR5 than you do going up to the next card (better GPU) while still using DDR3. And better performance again by increasing the amount of RAM.

That case only happens when a powerful GPU is starving for bandwidth. And that is not the case with such a low-to-mid-tier GPU and a low-end CPU.

The reality is, the PS4 would gain more rendering performance by increasing the RAM from 4GB to 8GB than they would by adding a couple more CUs to the GPU. Likewise, MS would see better performance improving the RAM to GDDR than giving the GPU clock speed a boost or adding a few more CUs.

This is blatantly false at every level.

Now, explain this to me:

The GTX 580 has a 384-bit bus; the GTX 680 has a 256-bit bus, being the mid-tier Kepler chip. Once you set their GDDR5 memory over 6GHz effective speed, the 580 will have an obvious bandwidth advantage. And one extra GB, if we take the 3GB version of it.

Anyone on this forum who has bothered to check some reviews knows the GTX 680 will mop the 580 up anyway. It has better pixel and texel fillrates and a lot of architectural improvements over Fermi.

And this, guy, is incompatible with your walls of nonsensically enthusiastic (on one side) text.

If a GTX 680 isn't crippled at any level with 190GB/s, how can a much worse GPU be afraid of hitting bandwidth bounds?

If the most powerful currently available single PC GPUs can't hit a 2GB limit by themselves, how will a tiny low-power GPU like the Orbis one be able to go over 3GB without dying much earlier?

GDDR5 isn't the 'expensive' solution. In fact, it's the cheapest one, since it reduces bus sizes, and bus sizes cost money both in chip area and PCB complexity (and you can't shrink the latter).

Tell me, boy: the GTX 580 384-bit 3GB card, or the GTX 680 256-bit 2GB card?
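For what it's worth, the bus-width arithmetic behind that question works out like this (a quick sketch, using the 6 GT/s effective rate the post itself cites as illustrative):

```python
# Same GDDR5 effective rate, different bus widths: the GTX 580's wider bus
# gives it more raw bandwidth, yet the GTX 680 still wins in benchmarks,
# which is the poster's point about bandwidth not being the bottleneck here.

def peak_bandwidth_gbs(bus_width_bits: int, transfer_rate_gts: float) -> float:
    """Theoretical peak bandwidth in GB/s."""
    return (bus_width_bits / 8) * transfer_rate_gts

gtx580_gbs = peak_bandwidth_gbs(384, 6.0)  # 288.0 GB/s
gtx680_gbs = peak_bandwidth_gbs(256, 6.0)  # 192.0 GB/s (the ~190GB/s cited)
print(gtx580_gbs, gtx680_gbs)
```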
 
Someone's forgetting that 176GB/s bandwidth must be shared with the CPU. And a higher bandwidth + more ROPs means higher resolutions and higher MSAA, all good for image quality.
 
WALL OF CRAZY TEXT

Guy? Boy?

When you hear people say that GDDR5 results in heat issues, it also applies to people. This chump just had a full-blown RAM-related meltdown. And he still doesn't understand technical facts.

There is a reason the next Xbox has gone for the DDR3 + EDRAM approach. MS have designed the console for more than games. The non-gaming apps want the low-latency DDR3. The EDRAM and DMAs are there to mitigate the low-bandwidth main RAM to a certain degree. It's not because MS are being cheap.

Sony seem to have designed the PS4 as a purebred gaming console. Different priorities resulted in different RAM architectures.
 
Hey doctor, where does the system RAM buffer fit into your benchmarks?

/woosh

Forgot I was responding to apocalipsis; 8GB is definitely overkill. ;)

What?

Test System & Setup

Processor: Intel Core i7 920(ES) @ 4.0Ghz (Turbo Mode Enabled)
Memory: Corsair 3x2GB Dominator DDR3 1600Mhz
Motherboard: Gigabyte EX58-UD5
Cooling: CoolIT Boreas mTEC + Scythe Fan Controller (Off for Power Consumption tests)
Disk Drive: Pioneer DVD Writer
Hard Drive: Western Digital Caviar Black 2TB
Power Supply: Corsair HX1000W
Monitor: Samsung 305T 30" widescreen LCD / 3x Acer GD235HZ 23.5" 1080P LCDs
OS: Windows 7 Ultimate N x64 SP1

Same setup, read the review I linked.

It's starting to get racist in here....
Got members calling each other "BOY"

SPE said:
numb-nuts

Someone's forgetting that 176GB/s bandwidth must be shared with the CPU. And a higher bandwidth + more ROPs means higher resolutions and higher MSAA, all good for image quality.

CPU bandwidth requirements are minimal compared to the GPU's. Going from 4GB to 8GB doesn't add more bandwidth, since the bus stays the same. Same with ROPs.

Don't understand your argument here.
 
Actually, numb-nuts, my forum name is my old high-score name. It's my initials, which predate the Cell by a good 30+ years. Nice try, though.

And ad hominem attacks just show the weakness of your argument. You can't argue against the actual facts (because the facts contradict your incorrect point), so you attack the person.
.....seeing that he assumed you named yourself after a component in Cell shows the state of his mind. You just can't win. Hell, he even brought up your usage of ad hominem like he owns the word.

For my part, I wouldn't worry about the Xbox. MS will have something up their sleeve.
 
.....seeing that he assumed you named yourself after a component in Cell shows the state of his mind. You just can't win. Hell, he even brought up your usage of ad hominem like he owns the word.

For my part, I wouldn't worry about the Xbox. MS will have something up their sleeve.

No, I bolded the part where a poster, after calling me 'numb-nuts', took up the ad hominem argument and tried to use it against me.
 
Yes, but not wildly. Assuming 512MB for the OS, in order to match the 6GB of memory on a Titan it would need to use merely 1.5GB for CPU stuff. That's not impossible, but it is severely imbalanced. Dumping a whole game to RAM would be madness, since disc capacity is so high and games aren't going to be 7GB in total size unless it's a PSN indie game or something.
No PC game is developed with the Titan as a performance target though. They're developed for video cards with 512MB/1GB/2GB memory.
 
CPU bandwidth requirements are minimal compared to the GPU's. Going from 4GB to 8GB doesn't add more bandwidth, since the bus stays the same. Same with ROPs.
Don't understand your argument here.
I thought your argument was that 176GB/s was overkill for 18 CUs, nothing about 4GB vs 8GB.
I just showed you that you could use all that bandwidth up when working with the 8 CPU cores + high levels of MSAA + transparencies.
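As a rough illustration of how MSAA alone inflates framebuffer traffic (all numbers below are hypothetical assumptions, and this counts only colour-buffer writes, ignoring depth, reads, textures, transparencies, and the CPU's share of the bus):

```python
# Toy estimate of colour-buffer write traffic at 1080p with 4x MSAA.
# Assumed values: RGBA8 colour, 3x average overdraw, 60fps target.
width, height = 1920, 1080
bytes_per_pixel = 4      # RGBA8
msaa_samples = 4         # each pixel stores 4 colour samples
overdraw = 3             # assumed average writes per pixel per frame
fps = 60

buffer_mb = width * height * bytes_per_pixel * msaa_samples / 1e6
traffic_gbs = buffer_mb * overdraw * fps / 1000

print(f"4x MSAA colour buffer: {buffer_mb:.1f} MB, "
      f"write traffic alone: ~{traffic_gbs:.1f} GB/s")
```

Stack depth, blending reads, texture fetches, and CPU traffic on top of that and the headroom shrinks fast, which is the point being made about sharing the 176GB/s.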
 
Using PC ports of current gen console games isn't a good example of bandwidth usage (especially with the RSX being the lowest common denominator).
 
Using PC ports of current gen console games isn't a good example of bandwidth usage (especially with the RSX being the lowest common denominator).

This. I would extend that to benchmarks of ANY game on ANY platform running today. If the game already exists, you're missing the point of having 8GB of GDDR5.

RAM is overkill and doesn't matter until you run out. Then nothing else matters, leaving you frantically trying to snatch back 20MB here or even 5MB there. Memory management is a PITA. Call up the audio guys, summon the artists, get those assets smaller, use less of them. I've run out of RAM and my program has ground to a halt!

More CPU cores/threads take careful thought to use, more fillrate and/or texturing requires more art. There are definite diminishing returns to having more of each.

But fast and plentiful RAM is the LAST resource that'll be underused. If it's available, developers will use it. Ask ten developers whether they'd rather have the extra 4GB of RAM or a similar budget put towards more CPU cores or compute units. All of them will go for the RAM. Oh wait, I just did, and they answered.
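The bookkeeping being described, charging every asset against a fixed pool until something has to give, looks roughly like this (a toy sketch; the class, asset names, and 512MB budget are all made up for illustration):

```python
# Toy memory-budget tracker: every asset charges against a fixed pool, and
# going over forces the "snatch back 20MB here, 5MB there" scramble.
class MemoryBudget:
    def __init__(self, total_mb: float):
        self.total_mb = total_mb
        self.used: dict[str, float] = {}

    def charge(self, asset: str, mb: float) -> None:
        """Reserve `mb` megabytes for `asset`, failing if the pool is spent."""
        if sum(self.used.values()) + mb > self.total_mb:
            raise MemoryError(f"out of budget loading {asset!r}: "
                              "time to call the audio guys and the artists")
        self.used[asset] = self.used.get(asset, 0.0) + mb

    def remaining_mb(self) -> float:
        return self.total_mb - sum(self.used.values())

budget = MemoryBudget(total_mb=512)   # hypothetical per-level pool
budget.charge("level_geometry", 300)
budget.charge("textures", 180)
print(budget.remaining_mb())          # 32.0
```

With 8GB instead of a pool like this one, that scramble simply happens much later, if at all, which is why devs always pick more RAM.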
 
That's not an issue. They managed to cool Cell and RSX. A few extra memory chips are child's play in comparison.
No they didn't. Pretty much every launch console (2006 to early 2008) suffered the YLOD. Including my European launch 60GB PS3, bought in mid-July 2007.

Obviously you're new to GAF, but to even get an account on this site you have to be a senior game developer and/or engineer with 15+ years of experience in the gaming industry. If I were you, I'd trust the word of the average gaffer than some random dude on stage presenting the technology at the Sony conference.
Yeah, it's not like they (including the developers) want to sell their products...

For the most part (not including GDDR5), RAM is usually pretty cheap; DDR3 is extremely cheap now.

Though it's not usually what is used, memory is not nearly as expensive as most people make it out to be. CPUs are usually a big cost.
Hey genius, console RAM =/= your average off-the-shelf PC RAM sticks. (doesn't matter which kind)
 
No they didn't. Pretty much every launch console (2006 to early 2008) suffered the YLOD. Including my European launch 60GB PS3, bought in mid-July 2007.

YLOD didn't have anything to do with system cooling. RROD happened because the cooling was bad.
 
No they didn't. Pretty much every launch console (2006 to early 2008) suffered the YLOD. Including my European launch 60GB PS3, bought in mid-July 2007.

Can you actually back this up with a source?


Yeah, it's not like they (including the developers) want to sell their products...

By making them more expensive? Sounds legit. But on the other hand, you are right. Developers want to sell their products, and 8 GB GDDR5 RAM will help them with that. By making games look better.
 
No they didn't. Pretty much every launch console (2006 to early 2008) suffered the YLOD. Including my European launch 60GB PS3, bought in mid-July 2007.
(1) YLOD has nothing to do with cooling
(2) My 2008 console still works perfectly
(3) Unlike MS's early 360 models, I haven't really seen any reliable source indicating PS3 failure rates were higher than the industry standard

Hey genius, console RAM =/= your average off-the-shelf PC RAM sticks. (doesn't matter which kind)
Actually, the RAM in the Wii U and presumably the next Xbox is exactly off-the-shelf PC RAM. Not on sticks, of course, but the same chips.
 
This is a fantastic thing. Every developer I've spoken with here at Ubisoft Red Storm has agreed. Anyone trying to spin otherwise (doing "calculations" about Blu-Ray/RAM transfer) are kidding themselves.

Good to hear feedback from an actual developer on this subject. Thank you for that sir :-)
 
Hey genius, console RAM =/= your average off-the-shelf PC RAM sticks. (doesn't matter which kind)
Yup, they only need the chips and not the sticks + they're bought at wholesale, so they're cheaper than your average PC RAM.
Also, from a production standpoint, GDDR5 isn't really more expensive than DDR3, it's just that the latter is produced in far greater quantities.
 
It won't matter. If Microsoft decides to go with less, then all developers will go with less. That would suck for Sony... again.
 
Good to hear feedback from an actual developer on this subject. Thank you for that sir :-)

Except some gaffers apparently know more than the developers do without actually having worked on the dev kit, simply from an armchair. These are the true Sherlock Holmeses of our time. One look and the mystery is solved. Unfortunately, the programmers who are actively working with the dev kit simply haven't caught up to them yet.

I mean, it was only today that I learned that the PS4 (and even XB3) will perhaps be inferior to the WiiU in terms of lighting and DoF while outdoing it in particles and textures. I think that while he may not be a developer, he certainly sounds smarter than one, to know how things are going to turn out given the specs before even they do.
 
No they didn't. Pretty much every launch console (2006 to early 2008) suffered the YLOD. Including my European launch 60GB PS3, bought in mid-July 2007.


Yeah, it's not like they (including the developers) want to sell their products...


Hey genius, console RAM =/= your average off-the-shelf PC RAM sticks. (doesn't matter which kind)

The good majority of those were the RoHS lead-free solder issues that everyone was having.

Also my launch console still works /touchwood.
 
No they didn't. Pretty much every launch console (2006 to early 2008) suffered the YLOD. Including my European launch 60GB PS3, bought in mid-July 2007.

So you've accounted for pretty much all launch consoles somehow? Yea ok.

My launch 60gb is running just fine still.
 