If the 40 GB RAM rumors for the PS6 are true, PC gamers are in big trouble

I crashed my car. I'm in the hospital.

YO this is the surgeon. He can't type right now.

I hope he lives so I can kill him and the rest of humanity by my own devices.

(Bender devil-laugh GIF)
 
I don't even know where to begin, with all the lies and delusions you've written.
It's like I'm talking to a door.


The best I can do is tell you to read what I wrote again and if you still don't understand, I won't even waste my time.
If that were all true about the PS5, why are first-party games showing up on PC with higher frame rates, better textures, higher native resolutions, etc.? It's not lies unless you're going to cherry-pick a couple of unoptimized titles using UE5. Plenty of games run very well on my 8700K / 3070, and I'm looking to upgrade in the near future. I have nothing against consoles, but the ego some people carry, thinking they're getting high-end hardware, is incredibly naive.
 


Your PC already has problems with games because they require 16GB of VRAM.

If you follow the current pattern, where the PC needs headroom so it doesn't run out of VRAM, and the consoles have 40GB, your PC will need a video card with 32-40GB of VRAM.

And that's not even counting the PS5's SSD, which practically acts as extra RAM, only limited to roughly 5-20 GB/s read speeds. The PS6 will certainly have at least one SSD that saturates the PCIe 5.0 standard, and if it follows the PS5's pattern it will be PCIe 6.0, since the PS5 launched requiring SSDs that didn't even exist on the market yet.

I don't know what the mystery is; a PC has always needed higher specifications than a console to deliver the same result.



It's unbelievable
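For a rough sense of the SSD/interface numbers mentioned above, here's a minimal sketch (the per-lane throughput values are approximate post-encoding figures, and the PS5 compressed-read peak is the ballpark quoted above, not an exact spec):

```python
# Usable bandwidth of a 4-lane NVMe link per PCIe generation vs. the PS5 SSD.
PCIE_PER_LANE_GB_S = {3: 0.985, 4: 1.97, 5: 3.94, 6: 7.88}  # approximate GB/s per lane

def x4_link_bandwidth(gen: int) -> float:
    """Approximate usable bandwidth (GB/s) of a 4-lane link for a given PCIe generation."""
    return 4 * PCIE_PER_LANE_GB_S[gen]

ps5_raw_gb_s = 5.5             # PS5 SSD raw read spec
ps5_compressed_peak_gb_s = 20  # rough best case with hardware decompression

for gen in (4, 5, 6):
    print(f"PCIe {gen}.0 x4: ~{x4_link_bandwidth(gen):.1f} GB/s")
print(f"PS5 SSD: {ps5_raw_gb_s} GB/s raw, up to ~{ps5_compressed_peak_gb_s} GB/s compressed")
```

A PCIe 5.0 x4 drive tops out around 16 GB/s and a PCIe 6.0 x4 drive around 32 GB/s, so on bandwidth alone a PS6 SSD saturating either link is at least plausible.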
 
Compared to what we were used to a couple of generations ago, this is basically nothing. And I'm not even accounting for diminishing returns.

We're gonna get to a point, soon, where the mass audience will just say it's not worth it. At that point we're gonna be in serious trouble. That should be at the top of our concerns right now.
 
My bet is on Kepler over MLiD any day of the week.

Kepler has proven to be a trusted source of GPU leaks for years. MLiD has hyped more AMD cards and PlayStation products than anyone I've seen, and has been wrong on a few occasions.

That's just how I see it from a very basic, what-information-comes-across-my-doorstep kind of view.
But I think this situation is weird in regard to the specs.

Kepler is looking at the changes happening in GFX12.5 (CDNA5) compared to GFX12 (RDNA4) and applying them to GFX13 (RDNA5).

MLiD, on the other hand, is saying that's not the case.

From my understanding, GFX12 is the beginning of UDNA and GFX12.5 is a fork of it, so those changes shouldn't be applied to RDNA5.

For example, GFX12.5 doesn't support ray tracing.

So if you look at it that way, the mapping should go like this:
GCN = GFX12 (RDNA4),
RDNA = GFX13 (RDNA5),
CDNA = GFX12.5 (CDNA5).
 
No more special sauce since the PS4.
As usual, these consoles will be weak because they have to be affordable.
In fact, the "affordable" option is a PC with recycled hardware from China.
Those looking for cheap hardware build a PC with Xeon chips scrapped from old servers and RX 580 cards recycled from mining rigs.

The console is not the cheapest option; it is the best value for money.
 
Previous console generations had up to 16X more RAM than their predecessors. That only changed with the current generation, where we went from 8GB to 16GB, so only 2X.

Think about it: we went from 16X jumps to just 2X. If the pattern had continued, the PS5 should have had up to 128GB of RAM.

PC users were spared this generation. Be glad.
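For reference, here are the rough numbers behind that pattern, as a minimal sketch (memory totals are the commonly cited figures, simplified: PS1/PS2 count main RAM only, PS3 counts both of its 256 MB pools):

```python
# Approximate RAM totals per PlayStation generation, in MB.
ram_mb = [("PS1", 2), ("PS2", 32), ("PS3", 512), ("PS4", 8192), ("PS5", 16384)]

# Generation-to-generation multiplier.
for (prev_name, prev), (name, cur) in zip(ram_mb, ram_mb[1:]):
    print(f"{prev_name} -> {name}: {cur // prev}x")

# If the old 16x cadence had held one more time:
print(f"Hypothetical 16x PS5: {8192 * 16 // 1024} GB")
```

That prints 16x, 16x, 16x, then 2x, and a hypothetical 128 GB PS5, which is where the number above comes from.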

Seriously though, that wasn't the reason; we just hit diminishing returns. PS4-to-PS5 visuals aren't that different other than higher resolutions and some ray tracing added on top. Noticeably better graphics require more money, more work, better engines (lol) and other things we can't have anymore; human game development has reached its technological peak.

So we don't really need more RAM, or better hardware in general, anymore. The only reason we're getting it is that next-gen engines will be even more bloated and worse-performing, and newer devs will be even more used to having no restraints and will care even less about optimizing. From now on, new hardware will mostly be used to brute-force code that gets worse and worse and graphics that are only marginally better at best.
This isn't true. Greed from the higher-ups and suits at the top of the pyramid has gotten bigger, that's all. There are Asian devs using UE5 giving us "next gen" leaps besides Hellblade 2 and The Matrix Awakens demo. Graphics have a long way to go; tool advancement will make the work easier and reduce workloads, costs, etc. Hellblade 2 is an example of a true leap.
 
Did people actually say that before the PS3 came out?
Yeah, there was a pretty huge paper written about Cell, must've been 2004-2005, all about how much ass it was going to kick and how it was going to revolutionize computing, and how compatibility with legacy x86 wouldn't be a problem because it could just emulate x86 faster than native.

Jon Stokes at Ars Technica wrote an article in response, urging people to remain skeptical until we see how Cell performs in the real world.

I'm not having any luck finding it, but I believe the author was named Nicholas Blackmore.
 
You're likely referring to this article that went the rounds back in the day. And here's the Ars article you're referring to.

According to IBM the Cell performs 10x faster than existing CPUs on many applications. This may sound ludicrous but GPUs (Graphical Processors Units) already deliver similar or even higher sustained performance in many non-graphical applications. The technology in the Cell is similar to that in GPUs so such high performance is certainly well within the realm of possibilities. The big difference is though that Cell is a lot more general purpose so can be usable for a wider variety of tasks.
 
Thanks, that's it! Here's Version 1, which made some of the wackier claims: https://www.blachford.info/computer/Cell/archive/Cell0.html

To Intel and AMD's processors Cell presents a completely different kind of competition to what has gone before. The speed difference is so great that nothing short of a complete overhaul of the x86 architecture will be able to bring it even close performance wise. Changes are not unheard of in x86 land but neither Intel or AMD appear to be planning a change even nearly radical enough to catch up. That said Intel recently gained access to many of Nvidia's patents [Intel+Nvidia] and are talking about having dozens of cores per chip so who knows what Santa Clara are brewing.

the vast majority of PCs these don't need the power they provide, Cell will only accentuate this because it will be able to off load most of the intensive stuff to the APUs. What this means is that if you do need to run a specific piece of software you can emulate it. This would have been impossibly slow once but most PC CPUs are already more than enough and with today's advanced JIT based emulators you might not even notice the difference.

Cell is going to be cheap, powerful, run many of the same operating systems and if all else fails it can emulate a PC will little noticeable difference, software and price will not be a problem. Availability will also not be a problem, you can buy playstations anywhere. This time round the traditional advantages the PC has held over other systems will not be present, they will have no advantage in performance, software or price. That is not to say that the Cell will walk in and just take over, it's not that simple.

And here's Ars Technica's response to it: https://arstechnica.com/uncategorized/2005/01/4551-2/

the article is chock full of wild-eyed and completely unsubstantiated claims about exactly how much butt, precisely measured in kilograms and centimeters squared, that the Cell will kick, and how hard, measured in decibels, that the Cell will rock. I'm as excited about the Cell as the next geek, but there's no need to go way over the top like this about hardware that won't even seen the light of day for a year. And it's especially ill-advised to compare it to existing hardware and declare that we have a hands-down winner.
 
I don't even know where to begin, with all the lies and delusions you've written.
It's like I'm talking to a door.


The best I can do is tell you to read what I wrote again and if you still don't understand, I won't even waste my time.
Not entirely true. If you want console-equivalent settings and resolution, then a 12GB GPU does the trick in almost every single game. A 6700 10GB actually matches a PS5 in most games.

Sure, some games do perform better on console hardware, but other games perform better on equivalent PC hardware as well, most recently Black Myth: Wukong. It really depends on how the game was developed and what the developers' priorities are.

Naturally, consoles still lead in price-to-performance.
 
People are forgetting that console memory is unified and shared between the CPU and GPU.

On PC you can get away with a 10GB GPU because you also have DDR system memory, probably around 8GB.
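To put rough numbers on that split, here's a minimal sketch (the ~2.5 GB OS reservation is the commonly reported ballpark for the PS5; the 70/30 GPU/CPU split of the remaining budget is purely an assumption for illustration):

```python
# Illustrative comparison: one 16 GB unified console pool vs. a split PC setup.
console_total_gb = 16.0
console_os_reserved_gb = 2.5                     # approximate OS reservation
game_budget_gb = console_total_gb - console_os_reserved_gb

assumed_gpu_share = 0.7                          # assumption, not a measured figure
print(f"Console GPU-side data: ~{game_budget_gb * assumed_gpu_share:.1f} GB")
print(f"Console CPU-side data: ~{game_budget_gb * (1 - assumed_gpu_share):.1f} GB")

# On PC the same footprint is split across two pools, which is why a 10 GB card
# plus system DDR can cover what the console keeps in one unified pool.
pc_vram_gb, pc_system_ddr_gb = 10.0, 8.0         # figures from the post above
print(f"PC equivalent: {pc_vram_gb} GB VRAM + ~{pc_system_ddr_gb} GB DDR used by the game")
```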
 
If the 40GB of unified RAM were correct, I don't think it would deliver the results the OP expects, purely because Sony is a big Epic investor and has more interest in keeping UE as the main game-development engine than in making games that don't scale down to PCs with inadequate VRAM.

However... 40GB is a peculiar amount, and it got me thinking. At launch, even just 32GB of GDDR7 is likely to cost nearly $150-$200 of a PS6's BoM, and the only real reason for needing more VRAM is that the memory pyramid in a console has a huge difference in bandwidth between the unified RAM and the SSD.

You can't really bridge the SSD bandwidth/latency gap because it is tied to PCIe communication and limited by the costly, tiny ESRAM cache in the I/O complex. But what if you altered the pyramid and invisibly added another unified memory tier, four times the size of the expensive first tier?

Essentially you could black-box memory through the memory controller: 8GB of GDDR7 as tier 1 and, initially for the first SKUs, a second tier of cheaper GDDR5/6/6X at four times the size (32GB), allowing four-channel parallelism to match the GDDR7 characteristics.

That way you could initially buy just 8GB of GDDR7 per BoM at around $35-$50, plus $80-$110 more for 32GB of GDDR5 or GDDR6, to get a virtualized 40GB of GDDR7-like memory in the system for under $175.

It is probably all nonsense, as everyone is saying, and the solution will end up being 24GB of GDDR7 at under $175, given that I suspect an APU able to run PSSR2, etc. fully fused within the GPU carries a bigger register-memory cost, leaving little left over for excess spend on unified RAM. But in theory, a launch model with a complex memory controller and mixed GDDR memory inside could be a way to get 40GB in the system until GDDR7 drops to current GDDR6 prices.
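As a quick back-of-envelope check on those numbers, here's a minimal sketch using only the speculative price ranges from this post (none of these are real BoM figures):

```python
# Hypothetical memory BoM comparison, using the price ranges guessed at above.
tiered_config = {
    "8 GB GDDR7 (tier 1)":          (35, 50),
    "32 GB GDDR5/6-class (tier 2)": (80, 110),
}
flat_config = {
    "32 GB GDDR7": (150, 200),
}

def bom_range(config):
    """Sum the low and high ends of each component's price range."""
    low = sum(lo for lo, _ in config.values())
    high = sum(hi for _, hi in config.values())
    return low, high

print("Tiered 8 + 32 GB: ${}-${}".format(*bom_range(tiered_config)))
print("Flat 32 GB GDDR7: ${}-${}".format(*bom_range(flat_config)))
```

Which lands the tiered idea at roughly $115-$160 versus $150-$200 for a flat 32GB of GDDR7, hence the "40GB for under $175" framing.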
 
Oh quick question - why are PC gamers always in trouble sometime in the future, and never now?

Why is Sony tech always going to be mind-blowing and dangerous in the future, and never now?
Because consoles and computers end up being fundamentally different, no matter how much closer the underlying hardware seems to get.
 
Get real.

Sony's going to release some mid hardware so they can charge another $500 to play third-party games and a couple of exclusives a year, until they too go third-party.

40 GB of RAM. Just stop.
 