If the 40 GB RAM rumors for the PS6 are true, PC gamers are in big trouble

I crashed my car. I'm in the hospital.

YO this is the surgeon. He can't type right now.

I hope he lives so I can kill him and the rest of humanity by my own devices.

 
I don't even know where to begin with all the lies and delusions you've written.
It's like I'm talking to a door.


The best I can do is tell you to read what I wrote again and if you still don't understand, I won't even waste my time.
If that were all true about the PS5, why are first-party games showing up on PC with higher frame rates, better textures, higher native resolutions, etc.? It's not lies unless you're going to cherry-pick a couple of unoptimized titles using UE5. Plenty of games run very well on my 8700K / 3070, and I'm looking to upgrade in the near future. I have nothing against consoles, but the ego some people carry, thinking they're getting high-end hardware, is incredibly naive.
 
If that were all true about the PS5, why are first-party games showing up on PC with higher frame rates, better textures, higher native resolutions, etc.? It's not lies unless you're going to cherry-pick a couple of unoptimized titles using UE5. Plenty of games run very well on my 8700K / 3070, and I'm looking to upgrade in the near future. I have nothing against consoles, but the ego some people carry, thinking they're getting high-end hardware, is incredibly naive.


Your PC already has problems with games, because some of them now want 16 GB of VRAM.

Following the current pattern, if consoles have 40 GB, then to make sure your PC doesn't run out of VRAM you'll need a video card with 32-40 GB of VRAM.
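A rough way to sanity-check that math (the reserve figure here is my assumption, not an official number): take the console's unified memory, subtract what the OS and CPU-side game data plausibly eat, and what's left is roughly the VRAM a matching PC port would target.

```python
# Rough sketch of the VRAM math above. The 8 GB reserve for the OS and
# CPU-side game data is an assumption, not an official figure.
def vram_to_match(console_unified_gb: float, reserved_gb: float = 8.0) -> float:
    """Estimate the GPU VRAM needed to match a unified-memory console."""
    return console_unified_gb - reserved_gb

print(vram_to_match(16))  # current gen (PS5, 16 GB): 8.0
print(vram_to_match(40))  # rumored PS6 (40 GB): 32.0
```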

And that's not even counting the PS5's SSD, which practically counts as RAM too, only limited to 5-20 GB/s read speeds. The PS6 will certainly have an SSD that saturates the PCI-E 5 standard, and if it follows the PS5's pattern it will be PCI-E 6, since the PS5 launched requiring SSD speeds that didn't even exist on the market yet.
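For context on the "saturates PCI-E 5" claim: usable PCIe bandwidth roughly doubles each generation, starting from about 4 GB/s for a Gen 3 x4 NVMe link. A quick sketch under that rule of thumb (the doubling model is an approximation, not exact spec figures):

```python
# Approximate usable bandwidth of a PCIe x4 NVMe link per generation,
# using the rule of thumb that each generation doubles throughput
# (~4 GB/s for Gen 3 x4). Ballpark figures, not spec-exact.
def pcie_x4_gb_s(gen: int) -> float:
    return 4.0 * 2 ** (gen - 3)

for gen in (3, 4, 5, 6):
    print(f"PCIe {gen}.0 x4: ~{pcie_x4_gb_s(gen):.0f} GB/s")
# The PS5's ~5.5 GB/s raw reads already need at least a Gen 4 x4 link.
```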

I don't know what the mystery is; a PC has always needed higher specifications than a console to deliver the same result.



PC has always needed higher specifications than a console to deliver the same result.


PC has always needed higher specifications than a console to deliver the same result.

PC has always needed higher specifications than a console to deliver the same result.

PC has always needed higher specifications than a console to deliver the same result.


It's unbelievable
 
Compared to what we were used to a couple of gens ago, this is like, nothing. And I'm not even accounting for D.R. (diminishing returns).

We are gonna get to a point, soon, where the mass audience will just say it's not worth it. At that point we're gonna be in serious trouble. That should be at the top of our concerns right now.
 
My bet is on Kepler over MLiD any day of the week.

Kepler has proven to be a trusted source of GPU leaks for years. MLiD has hyped more AMD cards and PlayStation products than anyone I've seen and has been wrong on a few occasions.

That's just how I see it from a very basic, what-information-comes-across-my-doorstep kind of view.
But I think this situation is weird in regards to the specs.

Kepler is looking at the changes happening with GFX12.5 (CDNA5) compared to GFX12 (RDNA4) and applying them to GFX13 (RDNA5).

MLiD on the other hand is saying that's not the case.

So from my understanding, GFX12 is the beginning of UDNA and GFX12.5 is a fork of that, so those changes shouldn't be applied to RDNA5.

For example, GFX12.5 doesn't support ray tracing.

So, if you look at it that way, it should map like this:
GCN = GFX12 (RDNA4),
RDNA = GFX13 (RDNA5),
CDNA = GFX12.5 (CDNA5).
 
No more special sauce since the PS4.
As usual, those consoles will be weak because they have to be affordable.
In fact, the "affordable" option is a PC with recycled hardware from China.
Those looking for cheap hardware build a PC with Xeon chips scrapped from old servers and RX 580 cards recycled from mining.

The console isn't the cheapest option; it's the best value for money.
 
Previous console generations had up to 16x more RAM than their predecessors. That only changed now, with the generation we're in, where we went from 8 GB to 16 GB, only a 2x jump.

Think about it: we went from 16x jumps to just 2x. If the old pattern had continued, the PS5 would have had up to 128 GB of RAM.
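The jump arithmetic checks out if you use the commonly cited memory totals (counting the PS3 as 256 MB XDR plus 256 MB GDDR3); a quick sketch:

```python
# Checking the RAM-jump arithmetic with commonly cited console totals.
MB = 1
GB = 1024 * MB

ram = {
    "PS3": 512 * MB,  # 256 MB XDR + 256 MB GDDR3
    "PS4": 8 * GB,    # unified GDDR5
    "PS5": 16 * GB,   # unified GDDR6
}

print(ram["PS4"] // ram["PS3"])  # 16: the old-style 16x jump
print(ram["PS5"] // ram["PS4"])  # 2: this generation's jump
print(ram["PS4"] * 16 // GB)     # 128: GB the PS5 'should' have had
```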

PC users were spared this generation. Be glad.

Seriously though, this wasn't the reason; we just hit diminishing returns. PS4-to-PS5 visuals aren't that different beyond higher resolutions and some ray tracing added on top. Noticeably better graphics require more money, more work, better engines (lol) and other things we can't have anymore; human game development has reached its technological peak.

So we don't really even need more RAM or better hardware in general anymore. The only reason we're getting it is that next-gen engines will be even more bloated and worse-performing, and newer devs, even more used to having no restraints, will care even less about optimizing. From now on, newer hardware will mostly be used to brute-force code that gets worse and worse, and graphics that get only marginally better at best.
This isn't true. The greed of the higher-ups and suits at the top of the pyramid has gotten bigger, that's all. There are Asian devs using UE5 giving us "next gen" leaps besides Hellblade 2 and The Matrix Awakens demo. Graphics have a long way to go; tool advancement will make the work easier and reduce labor, costs, etc. Hellblade 2 is an example of a true leap.
 
It's unbelievable
No matter how many times you quote yourself it won't suddenly make anything you write a fact. You are incorrect and there are plenty of examples as to why you are. It appears that you don't understand what native resolution is.
 
Did people actually say that before the ps3 came out?
Yeah, there was a pretty huge paper written about Cell, must've been 2004-2005, all about how much ass it was going to kick and how it was going to revolutionize computing, and how compatibility with legacy x86 wouldn't be a problem because it could just emulate x86 faster than native.

Jon Stokes at Ars Technica wrote an article in response, urging people to remain skeptical until we see how Cell performs in the real world.

I'm not having any luck finding it but I believe the author was named Nicholas Blackmore.
 