
[MLiD] XBOX Magnus RDNA 5 Finalized

Is Magnus's RAM expandable/replaceable? Because then it won't be GDDR memory, right?
Impossible. GDDR7 has no socketed memory standard.

Because PlayStation will end up the dominant development platform, whatever he deemed non-essential ends up not being used widely. Not because he predicted correctly, but because platform dominance made it "correct"...
This is correct.

To add: Series X and PS5 have different architectural ratios. One has a stronger backend, the other a stronger frontend. In the end, games were built around PS5.
Yes definitely a "lazy devs" problem if only they could've unlocked the magic of series X......
Series X has a 15% slower frontend.

Developers could either underutilize PS5's front end or XSX's compute.

It's obvious what they'd choose in retrospect. Xbox wanted a higher TFLOP figure, but they didn't care as much if the XSX was actually meaningfully faster.
 
Thanks for confirming, full of shit.
Tales from the ass.


Here's BG3. It's a nice example since the numbers are public information.

Before their series S "optimizations" (downgrades) they were using 5.3 GB on CPU and 3.5 GB on GPU.

It's not my fault that you were ignorant on the topic so you just concern trolled.

A PC gamer with enough VRAM to avoid overflow can easily see how much the game process uses. PS5 can save memory because it can stream more easily and there is no need to copy. But you seem to fundamentally misunderstand UMA as the CPU/GPU mystically working on the same data.

PS5 has A TOTAL of 12.5GB RAM exposed to games. Developers can give 7.5-9.5 GB of that to GPU depending on the game.
 
And btw, PS6 isn't full RDNA5 either, for those concerned. It's more that PS6 is an earlier fork of RDNA5 compared with PC/Xbox RDNA5 (like gfx1300 instead of gfx1305, for example).

The features that define RDNA5 (Level4 RT, Neural Arrays, Dense BVH Geometry, etc) are all there in PS6.

The actually meaningful difference we know of is that PS6 has half the L2 per memory controller as the rest of RDNA5.

Saying this before we get dumb "PS6 is RDNA4" takes.
 
Absolutely not.

Nvidia is the Tier 0 customer. They are bigger than Apple at this point. And their gaming GDDR orders are bundled with Rubin CPX orders for massive volume.

And Nvidia has much higher margins and free cash, so they can outbid. The issue with Nvidia is that their AI side is a memory black hole.
I don't believe there is a bidding war going on for RAM manufacturing.

If that were the case, AMD wouldn't be using so many RAM modules in their Helios racks.

There are 18 of these trays in a Helios Rack.
[image]

Each of these trays consists of:
DPU: 64GB - 128GB DDR5
CPU: 16 × DDR5 channels (64GB - 128GB per channel)
GPUs: 48 × HBM4 stacks, 12-Hi, totaling 1,728GB + 96 × LPDDR5 modules (32GB - 64GB per module)

AMD, Nvidia, Micron, etc. are just focusing on producing AI chips and not anything consumer-related, causing retailers to raise prices due to scarcity, like when the PS5 launched.
 
Now the story changes. Like I said, you were pulling an average out of your ass.
It didn't change. You're nuance trolling. 9GB VRAM is on the extreme high end of what developers can give the PS5 GPU given the 12.5GB TOTAL RAM they have to use.

The fact you didn't know this isn't indicative that anything in my thinking has changed.
 
I said 15% off spot. And again, it's illustrative of how cost-prohibitive current prices are.

It's not just memory that increased. Remember that NAND went up a lot.

Sony needs memory to be dirt cheap for PS6 design to make sense at all. And right now that seems like a questionable gamble.
15% off isn't what you had in your image.
 
If that were the case, AMD wouldn't be using so many RAM modules in their Helios racks.
They have to use a lot of RAM because AI requires it.

AMD pays more for Samsung HBM3 (failed Nvidia qualifications six times) than what Nvidia pays for pristine Hynix HBM3e for Blackwell.

[image]


Furthermore, this is a situation where demand exceeds supply. In order to get large quantities of DRAM, you will de facto have to outbid other fabless firms.

RAM negotiations are just that, negotiations. The more that a customer needs the supply / cartel, the more the customer will have to pay up.

15% off isn't what you had in your image.
It was. Read the bottom.
 
It didn't change. You're nuance trolling. 9GB VRAM is on the extreme high end of what developers can give the PS5 GPU given the 12.5GB TOTAL RAM they have to use.

The fact you didn't know this isn't indicative that anything in my thinking has changed.
You're still calling it VRAM; also, you were closer to the truth in your previous post.

Like I said, you were pulling an average or something else from your ass.
 
I'm using the consumer stuff as an example to show that AMD, Nvidia, and Sony aren't going to be paying what we are paying, since they are buying straight from the manufacturers.

But consumers can't buy GDDR7; there is no consumer use for it. Prices are estimated from what large GPU companies are paying for GDDR7.

We can assume that largest consumer for GDDR7 in 2027 will be:

1. Nvidia
.
.
.
2. consoles
3. AMD? (they want to steal some of that data center pie from Nvidia)

I wonder how limited console production will potentially be, given that Nvidia will probably snap up the vast majority of that VRAM.
 
I don't believe there is a bidding war going on for RAM manufacturing.

If that were the case, AMD wouldn't be using so many RAM modules in their Helios racks.

There are 18 of these trays in a Helios Rack.
[image]

Each of these trays consists of:
DPU: 64GB - 128GB DDR5
CPU: 16 × DDR5 channels (64GB - 128GB per channel)
GPUs: 48 × HBM4 stacks, 12-Hi, totaling 1,728GB + 96 × LPDDR5 modules (32GB - 64GB per module)

AMD, Nvidia, Micron, etc. are just focusing on producing AI chips and not anything consumer-related, causing retailers to raise prices due to scarcity, like when the PS5 launched.
That Helios tray looks very clean. I am assuming AMD will build something similar for xCloud.

4 AT0 GPUs, each GPU paired up with 2-4 Magnus CPU SOCs.
 
It's going to be $999 if it's coming out next year. ... with 18-20 gigs of RAM.
It will come with a minimum of 36GB RAM; it's running Windows 11. It will either be 36 or 48.

The appeal of this is a console that you can place under your TV and not have to worry about driver issues, etc. that you do with traditional PC gaming, as it is a closed, one-spec machine. It combines that with the benefit of being able to play all PC games with access to all PC stores. It will also play all Xbox games; hell, it will play Sony titles that are on PC. That's huge. It will sell more units than most people think. Being able to play PC games easily on my living room TV is very attractive. Right now, when I game on my PC, I am stuck in my office where my desktop is, which is lame. I think many people who PC game are in the same situation as me, so I think there definitely is appeal in being able to play PC games easily in the living room.

Look at sales of pricey gaming laptops, which sell like crazy. I'm thinking 10M units sold minimum, which will be fine for MS, as it will serve mainly as a way for people to play Game Pass. MS already sells their titles on all platforms, so they are nowhere near as dependent as Sony on having a large console install base to sell to. They sell to everyone now.
 
They have to use a lot of RAM because AI requires it.

AMD pays more for Samsung HBM3 (failed Nvidia qualifications six times) than what Nvidia pays for pristine Hynix HBM3e for Blackwell.

[image]


Furthermore, this is a situation where demand exceeds supply. In order to get large quantities of DRAM, you will de facto have to outbid other fabless firms.

RAM negotiations are just that, negotiations. The more that a customer needs the supply / cartel, the more the customer will have to pay up.


It was. Read the bottom.
Guess you completely missed my point.
My point is there is no bidding war going on for RAM manufacturing, nor a supply bottleneck.

You'd sooner see bidding for chip manufacturing capacity, which is the real bottleneck.

It's obvious AMD would be paying more, since their bulk orders are obviously smaller.


Spot prices: Short-term/open-market indicator (not what Sony/NVIDIA primarily pay).

Contract prices: The real bulk pricing for big players like GPU/console makers.

Consumer prices: End-user retail, heavily impacted but not direct spot prices.
 
Curious to see how this will all pan out. Nextbox coming out a year+ before PS6 might actually be the beginning of truly next gen games starting to be released. But devs might also consider that with the lower install base and upcoming PS6 and shitty handheld, the juice might not be worth the squeeze.

I'd say first party is a different story, but MS is more likely to cancel a cool looking game after showing it off than actually releasing it.
That's another huge bonus of the approach MS is taking with this next Xbox. Developers are no longer just developing a game for Xbox; they are developing it for PC, which has a huge install base. There isn't the risk of only selling to a small installed Xbox base. Their title, when coded for the next Xbox, can sell to anyone who owns a PC. The next Xbox will have crazy third-party support for this reason, as anything PC is made for it.
 
My point is there is no bidding war going on for RAM manufacturing, nor a supply bottleneck.
This is not true. Consumer DRAM demand exceeds supply by a country mile. How do you think we got to a $250 spot price for 16GB DDR5-4800/5600?

From JPMorgan's meeting with Micron management:

Expect DRAM/HBM demand to outstrip supply beyond 2026, even as new capacity comes online, underpinning further strength in pricing. Mgmt reiterated its previous assertion that it is only able to serve 50% to two-thirds of key customers' medium-term bit demand.

You think iPhone continuing double-digit growth in China while Chinese OEMs get hazed by 15% is natural?

Apple chose to fold to secure the quantities of DRAM they need to keep growing. Smaller players are just getting brutalized and deprioritized.

Sony is not a small fry, but they are not Nvidia either. Their launch isn't just in doubt because of VRAM pricing; it's in doubt whether they can secure enough DRAM for tens of millions of PS6s. GeForce is higher on the priority list than Sony.

Ofc the current memory market is insane. It's not necessarily indicative of 2028's. But it's absolutely false to suggest this is just a pricing thing.
 
30-50% depending on TDP targets for both
I was expecting 35-40%. Now I'm curious, would the console allow eGPUs, so it would be possible to pair an RTX 6080 with it? The Xbox Ally X allows eGPUs with a TB4 port.

Do you know if Magnus devices will come with any TB5 ports?
It will come with a minimum of 36GB RAM; it's running Windows 11. It will either be 36 or 48.

The appeal of this is a console that you can place under your TV and not have to worry about driver issues, etc. that you do with traditional PC gaming, as it is a closed, one-spec machine. It combines that with the benefit of being able to play all PC games with access to all PC stores. That's huge. It will sell more units than most people think. Being able to play PC games easily on my living room TV is very attractive. Right now, when I game on my PC, I am stuck in my office where my desktop is, which is lame.

Look at sales of pricey gaming laptops, which sell like crazy. I'm thinking 10M units sold minimum, which will be fine for MS, as it will serve mainly as a way for people to play Game Pass. MS already sells their titles on all platforms, so they are nowhere near as dependent as Sony on having a large console install base to sell to. They sell to everyone now.
50-55 million pre-built OEM gaming devices are sold every year; AMD wants to take a big chunk of that market by undercutting Intel/Nvidia variants. If they can get 10-20%, that would be 5-10 million devices every year.
 
This is not true. Consumer DRAM demand exceeds supply by a country mile.
There is no Consumer DRAM demand.
All I see are people complaining about a price increase like every other price increase. You can still buy PC components and consoles.

The demand is for HBM because of its poor yields.
 
There is no Consumer DRAM demand.
All I see are people complaining about a price increase like every other price increase. You can still buy PC components and consoles.

The demand is for HBM because of its poor yields.

RAM manufacturers have fixed capacity for memory production. Let's assume they can make 10M chips/year; the vast majority of them will go into data centers in the form of HBM and GDDR7. Not much will be left for DDR5, GDDR6, LPDDRx, etc.
 
There is no Consumer DRAM demand.
Even on a technicality this point is false, because the DIY consumer market is getting flushed down the toilet right now.

It's supply and demand. Price explodes when there is more demand than supply. It keeps rising until the elevated price destroys enough demand.

At current prices a lot of demand is destroyed. For example PS6 can't even be launched in the current environment even if the SoC was ready.

The demand is for HBM because of its poor yields.
Not really. Hynix has great HBM yield. It's more specific than that.

It's that, by construction, modern HBM consumes roughly 3 bits' worth of wafer capacity for every 1 bit of regular DRAM. Micron management called this the "HBM Ratio".

Micron's thesis was that because of this, and the insane DDR demand from servers to augment it, consumer supply would be so constrained that they'd be able to make disgusting profits. That was their "bull scenario".

So when Micron / SEC / SKH shifted 10% of their bits to HBM, that automatically took a large amount of memory out of the consumer market. If it reaches 16.7%, literally half of global wafers will go to HBM. You can see why that's a problem, right?
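As a rough sketch of the wafer math implied here: the 3:1 figure is the post's own "HBM Ratio", and treating wafer share as a simple multiple of bit share is a back-of-envelope simplification, not official Micron arithmetic.

```python
# Back-of-envelope sketch of the "HBM Ratio" arithmetic from the post:
# each HBM bit is assumed to consume ~3x the wafer capacity of a
# standard DRAM bit, so shifting a share of output bits to HBM pulls
# roughly three times that share of wafers out of the commodity pool.

HBM_WAFER_RATIO = 3  # wafer cost of one HBM bit vs. one standard DRAM bit

def wafer_share_for_hbm(hbm_bit_share: float) -> float:
    """Approximate fraction of wafer starts consumed by HBM."""
    return HBM_WAFER_RATIO * hbm_bit_share

# 10% of bits shifted to HBM -> roughly 30% of wafer capacity
print(round(wafer_share_for_hbm(0.10), 2))   # 0.3
# ~16.7% of bits -> roughly half of global wafers, as claimed above
print(round(wafer_share_for_hbm(1 / 6), 2))  # 0.5
```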

Add that server DDR/LPDDR requirements are also exploding, and you can see why consumer-facing businesses are struggling to secure supplies.

Even GeForce, a business with 40% operating margins, had to reduce production 15-20%. Lower-volume, low-operating-margin businesses (think Radeon) are just going to be taking a vacation and selling much lower volumes when their inventory clears.

Sony said they have the minimum amount of GDDR6 that they need and will focus on monetizing existing users. Yes it's very bleak.
 
RAM manufacturers have fixed capacity for memory production. Let's assume they can make 10M chips/year; the vast majority of them will go into data centers in the form of HBM and GDDR7. Not much will be left for DDR5, GDDR6, LPDDRx, etc.
Right, but that doesn't change the fact that there is no consumer demand.

The problem is HBM has poor yields and is time-consuming to manufacture. From the DRAM stacks to the interposer, all have poor yields.

The RAM manufacturers are focusing most of their supply on HBM because there is no demand elsewhere. There are no new cards or CPUs that would increase consumer demand.

Retailers are just using HBM supply as an excuse to raise prices, even on components that have been in their stores for ages.
 
The RAM manufacturers are focusing most of their supply on HBM because there is no demand elsewhere. There are no new cards or CPUs that would increase consumer demand.
Please.

Android OEMs are going to be down 8-15%. PC will be down 10%+. And that's with PC shifting 16GB SKUs to 8GB and 32GB SKUs to 16GB. GeForce is down 15-20%. There is no RTX 50 Super series anymore.

There is real demand destruction going on.
 
Sony first party developers will support PS5 for a very long time. And that has 9GB VRAM. They can add nothing that they can't turn off for PS5.
Wtf is this based on? It has 16GB of GDDR6. How are you uniformly turning this into 9GB of VRAM?
Edit: I see propellerEar already covered this incorrect broad claim.
 
I'm curious to see the price of Magnus, because when you consider that the Steam Machine, with its outdated specs, will probably cost 700/800 for a release this year...

Before everything went crazy with RAM, building a PC with an RTX 5080 would cost you a little over $2k, so if Magnus really does come out next year, I find it hard to believe the price will be just $1k. Probably much higher, especially if things around RAM don't improve.
 
I'm curious to see the price of Magnus, because when you consider that the Steam Machine, with its outdated specs, will probably cost 700/800 for a release this year...

Before everything went crazy with RAM, building a PC with an RTX 5080 would cost you a little over $2k, so if Magnus really does come out next year, I find it hard to believe the price will be just $1k. Probably much higher, especially if things around RAM don't improve.
I think even nuttella himself doesn't know the Magnus price yet, simply because who knows how the RAM/VRAM/SSD shortage/pricing situation is going to look in 1.5 years. It could be much better than now, which could make Magnus cost $1,200, or much worse and push the launch price to $2k. It's crazy hard to predict ;X
 
Wtf is this based on? It has 16GB of GDDR6. How are you uniformly turning this into 9GB of VRAM?
It has 12.5GB accessible to developers; the OS/FW/etc. reserves 3.5GB. (DF)

Of that 12.5GB, a few GBs will be used by the CPU. ~3.5GB for the CPU is a reasonable estimate for the lower bound of PS5 CPU usage (BG3 console is 5.3GB, CP2077 PC is 4.4GB, SotTR PC is 2.9GB, Wukong PC is 5.2GB, etc.).

The CPU and GPU work on different data and get different allocations. Even on PCs with UMA, and on Xbox, you see developers manage each usage separately. You can't game with just GPU memory.

So that leaves us with ~9GB VRAM for the PS5 GPU as a high bound. And it explains why copy-paste PS5 ports struggle on PC, as most PC gamers can only afford 8GB VRAM GPUs. The small discrepancy butchers PC performance if not managed.

But ofc, you can call it 12.5GB RAM instead. In the same way NS2 is 9GB, XSS is 8.8GB, and PS5 Pro is 13.7GB.

Why do I say it like this? Because it takes the false magic out. So much bullshit was said about PS5 when it came out about how you would need 12GB VRAM to match it, when most of the time PC's 8GB VRAM is enough, and when it's short, it's short by MBs, not GBs.

This will become more important with PS6's 30GB. Since the GPU will probably get 20~22GB of it at most. PC's 18-20GB dGPU SKUs will be the closest equivalent.

And lastly, PC is more important than PS5 for gaming. So I use PC conventions as standards. Sony may have beaten Xbox, but they have not beaten Nvidia yet.
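The budget arithmetic in this post can be written out explicitly. The 16GB total and 3.5GB OS reservation are the public figures cited (DF); the ~3.5GB CPU-side number is this post's own lower-bound estimate, not an official spec.

```python
# PS5 memory budget as described in the post above.
TOTAL_GDDR6 = 16.0       # GB, physical unified memory
OS_RESERVED = 3.5        # GB, OS/firmware reservation (per DF)
CPU_SIDE_ESTIMATE = 3.5  # GB, assumed lower bound for CPU-side data

game_budget = TOTAL_GDDR6 - OS_RESERVED           # exposed to games
gpu_high_bound = game_budget - CPU_SIDE_ESTIMATE  # left for the GPU

print(game_budget)     # 12.5
print(gpu_high_bound)  # 9.0
```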
 
.....

This will become more important with PS6's 30GB. Since GPU will probably get ~20GB of it at most.

...
Strongly doubt the bolded part, given the cost of memory to PlayStation and what has been a gradual standardization of console features/social media features since the PS4. Developers will likely get proportionally more memory out of the gate because less will be reserved, and in a unified-memory system the CPU doesn't need so much dedicated memory when data it prepares for the GPU can be left in situ.

Your BG3 example isn't typical IMO. So assuming 2.3GB reserved by the system like the PS5 Pro, and 3-4GB dedicated to an action game's CPU data needs, that should leave 23GB+ for a game's graphics data needs, using the current game design paradigm.
 
It has 12.5GB accessible to developers. (DF)
Sure, but why did you state 9GB out of thin air and concentrate on VRAM? It doesn't even square with your claim that:

"Sony first party developers will support PS5 for a very long time. And that has 9GB VRAM. They can add nothing that they can't turn off for PS5."


They can more easily use more "VRAM for the GPU" with higher settings than they can "RAM for the CPU". I don't understand why you're using these incorrect distinctions for unified memory that don't apply to begin with, but let's go with it. Are you trying to suggest that VRAM cannot be used to add stuff that can be easily pared down/removed for <12.5GB? Because that's just nonsense really. VRAM use is easier to scale in a game than RAM use, even on PC.

Of that 12.5GB, a few GBs will be used by the CPU. ~3.5GB for the CPU is a reasonable estimate (BG3 is 5.3GB, etc).

Is it though? Why would a game on a Series S require less RAM "for the CPU" than on a PS5/XSX?

Why do I say it like this? Because it takes the false magic out. So much bullshit was said about PS5 when it came out about how you would need 12GB VRAM to match it, when most of the time PC's 8-10GB VRAM is more than enough.
I'm sorry, but absolutely not. I have enough knowledge/testing experience with 8GB VRAM cards to know this is nonsense, with actual empirical evidence showing it's not. 8GB cards have performed like trash in a lot of games while performing fine on PS5/XSX.
We may get better RAM optimisation in future games, but 8GB VRAM was not "more than enough" on current PC GPUs, especially against PS5/XSX. 8GB VRAM on PC was "barely enough", and even then you had massive hiccups in a lot of games.
This will become more important with PS6's 30GB. Since GPU will probably get ~20GB of it at most.

And PC is more important than PS5 for gaming. So I use its conventions as standards.
The GPU can get as much as it likes. What conventions or importance? You seem to be creating limitations and constraints that don't exist. Can you also explain what you mean by "SX is 15% slower at frontend"? What exactly do you mean by frontend here?
 
so developers will likely get proportionally more memory out of the gate because less will be reserved
PS5 has 3.5GB reserved.
PS5 Pro has 4.3GB reserved. (2.3GB G6, 2GB D5). More not less.
NS2 has 3GB reserved.

PS6 GPU allocation should top out at 20~23GB.
They can more easily use more "VRAM for the GPU" with higher settings than they can with "RAM for the CPU"
Because 3.5GB is on the low end. If the average game were completely fine with that, 8GB RAM on PC would be stutter-free.

BG3 Console is 5.3GB, CP2077 PC is 4.4GB, SoTR PC is 2.9GB, Wukong PC is 5.2GB, etc.

I am not ignoring the flexibility of UMA. It's already factored in. My response more speaks of 9GB VRAM as being on the high end of what can be allocated for only the GPU.
Is it though? Why would a game on a Series S require less RAM "for the CPU" than a PS5/XSX?
I didn't make that claim. The difference is the OS. XSS allows developers to use 8.8GB total between CPU/GPU (the OS uses 1.2GB). PS5? 12.5GB.

In practice though, XSS memory is too brutal to deal with, so developers have to save every last MB they can.

I already shared the tweet about BG3 on XSS. They only managed to shave CPU usage from 5.3GB to 4.7GB. They were forced to push VRAM use down from 3.4GB to 2.3GB (to leave enough headroom for overflows / not to crash).

As you can see, it's a lot harder to reduce CPU usage than GPU usage. In practice XSS is beneath 6GB dGPUs. You can call XSS a 5-5.5GB VRAM console, I guess.

This speaks to how utterly terrible XSS's memory buffer is relative to what it was supposed to do.
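The Series S numbers quoted here can be sanity-checked the same way. The 10GB total / 1.2GB OS split and the BG3 figures are the ones cited in this post, not independently verified numbers.

```python
# BG3 on Series S: budget headroom before and after the optimization
# pass described in the cited tweet.
XSS_DEV_POOL = 10.0 - 1.2  # GB available to games (10GB total, 1.2GB OS)

cpu_before, gpu_before = 5.3, 3.4  # GB, before the Series S pass
cpu_after,  gpu_after  = 4.7, 2.3  # GB, after it

headroom_before = XSS_DEV_POOL - (cpu_before + gpu_before)
headroom_after = XSS_DEV_POOL - (cpu_after + gpu_after)

print(round(headroom_before, 1))  # 0.1 -> essentially no overflow margin
print(round(headroom_after, 1))   # 1.8
```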
8GB cards have performed like trash in a lot of games while performing fine on PS5/XSX.
I already addressed these unoptimized dogshit games. I've already talked about how PC's GPU bandwidth plummets once you have to use host-mapped memory over PCIe. Don't make me repeat myself.

TLDR: Games that use 8.5-9GB VRAM on consoles often expect to be able to copy-paste that onto PC. They can't. Effective bandwidth plummets once even a few MBs spill over.

(Note: a few of the Sony first-party ports legitimately use over 8GB VRAM with PS5-equivalent settings.)

Fortunately now PC is bigger than just PS5. PS5 has to bend to 8GB now. Games that don't respect PC's 8GB line in the sand flop automatically regardless of how well they do on PS5.

The GPU can get as much as it likes. What conventions or importance? You seem to be creating limitations and constraints that don't exist.
This is so stupid. "The GPU can get as much as it likes" is stupid because obviously the CPU will use memory. The "limitation" that I am "creating" on the PS5 GPU is that the CPU must use a few GBs of memory. 🤣

It goes without saying that it will. You just want to pretend that it doesn't so it looks better vs PC.

Can you also explain what you mean by "SX is 15% slower at frontend". What exactly do you mean by frontend here?
Note: as a PC gamer, I am not that familiar with or interested in RDNA architecture, since Radeon isn't a relevant player on PC (6-8% share). So some of the details below could be wrong. If you want to verify, @ Kepler.

The Geometry engine. The rasterizers. WGP level resources.

Xbox stuffed an additional 16 CUs into the same 4 SEs, then ran them at a 15% slower clock. Meaning its frontend was 15% slower but its backend 15% better.

This should include: vertex assembly, tessellation, geometry shading, culling and rasterization. All should be 15% slower on XSX.

If a game is bound not by compute but by the frontend, XSX straight up loses, as we too often see it do.

Microsoft just wanted a big marketing number. Then they got blindsided by Sony pushing clock speed to 2.15-2.23 GHz. It reduced PS5's deficit in TFLOPs and made PS5's advantage in frontend much more real. There's nothing inherently wrong in the games where PS5 beats XSX.
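A quick sketch of the clock arithmetic behind the 15% figure. CU counts and the XSX clock are the public specs; using 2.15GHz for PS5 (the low end of the variable range mentioned above) and treating frontend rate as proportional to clock alone are simplifying assumptions, not a real performance model.

```python
# Frontend vs. compute comparison, simplified per the post's argument:
# frontend throughput scales with clock, compute with CUs x clock.
XSX_CUS, XSX_CLOCK_GHZ = 52, 1.825
PS5_CUS, PS5_CLOCK_GHZ = 36, 2.15   # variable clock, up to ~2.23

frontend_deficit = 1 - XSX_CLOCK_GHZ / PS5_CLOCK_GHZ
compute_advantage = (XSX_CUS * XSX_CLOCK_GHZ) / (PS5_CUS * PS5_CLOCK_GHZ) - 1

print(f"XSX frontend deficit: {frontend_deficit:.0%}")    # ~15%
print(f"XSX compute advantage: {compute_advantage:.0%}")  # ~23%
```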

Sony just out-designed Microsoft, which they did in the Gen 8 consoles as well.
 
PS6 GPU allocation should top out at 20~23GB.

Absolutely brutal for Magnus. 23GB would leave only 13GB for the entire Windows operating system, background tasks, and copying textures from system memory to video memory. 48GB would be a requirement for games not to be downgraded in some form.
 
Absolutely brutal for Magnus. 23GB would leave only 13GB for the entire Windows operating system, background tasks, and copying textures from system memory to video memory. 48GB would be a requirement
Xbox Full Screen Experience doesn't load most of Windows. That's the point of it. You basically have to boot into Xbox FSE rather than enter it as an app.

Not to mention Magnus should have access to PC's settings windows.
 
Xbox Full Screen Experience doesn't load most of Windows. That's the point of it. You basically have to boot into Xbox FSE rather than enter it as an app.

Not to mention Magnus should have access to PC's settings windows.

It won't matter much - cutting-edge games use a ton of VRAM and system RAM. Marvel's Spider-Man 2 at sufficiently high settings uses nearly 16GB of system RAM alongside 16GB of VRAM. With Spider-Man 3, Insomniac could potentially use the full 20-24GB of VRAM as it would be a PS6 showpiece title. Magnus may not be able to go for PS6-equivalent settings, and the advantage in compute will go to waste.

An allocation of 20GB system RAM and 16GB for VRAM would put Magnus in a strong position, if Sony went for 20GB unified RAM themselves.
 
Absolutely brutal for Magnus. 23GB would leave only 13GB for the entire Windows operating system, background tasks, and copying textures from system memory to video memory. 48GB would be a requirement for games not to be downgraded in some form.

It won't matter much - cutting-edge games use a ton of VRAM and system RAM. Marvel's Spider-Man 2 at sufficiently high settings uses nearly 16GB of system RAM alongside 16GB of VRAM. With Spider-Man 3, Insomniac could potentially use the full 20-24GB of VRAM as it would be a PS6 showpiece title. Magnus may not be able to go for PS6-equivalent settings, and the advantage in compute will go to waste.

An allocation of 20GB system RAM and 16GB for VRAM would put Magnus in a strong position, if Sony went for 20GB unified RAM themselves.
Problem is you're outright assuming the Magnus console is a PC and would function with ONLY PC SKUs. The console is a console, with likely console SKUs for the Xbox ecosystem. So in that scenario, it would be 30GB vs 36GB. For the Steam/Epic SKUs, games have to use the unified memory as system RAM and VRAM separately, so a 16/20 split or a 16/16/4 (OS) split is fine.
 
PS5 has 3.5GB reserved.
PS5 Pro has 4.3GB reserved. (2.3GB G6, 2GB D5). More not less.
NS2 has 3GB reserved.

PS6 GPU allocation should top out at 20~23GB.

Because 3.5GB is on the low end. If the average game were completely fine with that, 8GB RAM on PC would be stutter-free.

BG3 Console is 5.3GB, CP2077 PC is 4.4GB, SoTR PC is 2.9GB, Wukong PC is 5.2GB, etc.

I am not ignoring the flexibility of UMA. It's already factored in. My response more speaks of 9GB VRAM as being on the high end of what can be allocated for only the GPU.
.....
BG3 is from a game-data-heavy genre, so it isn't representative. Game data sizes for specific CPU data have changed significantly since the PS4, and you can still see cross-gen games from PS3 to PS4 like MGSV that had base game data that worked within less than 256MB of XDR. We can then also assume all the PS3 Souls games worked within a small CPU-specific data size of 256MB despite being game-data heavy for action games. So even an allocation of 3-4GB specifically for CPU game data on PS6 is an overestimate.

Also, your PC memory figures are misleading, unless they were definitely taken at the most memory-demanding point with virtual memory disabled, because all PC games shadow some part of VRAM in their own memory use. So when the minimum specs for CP2077 are listed as 12GB RAM and a 6GB/8GB VRAM GPU at basic low settings, without RT, that isn't quite the implied 4.4GB on a PC, where 2GB could easily be just the GPU driver shadowing some VRAM in RAM (and the rest in virtual memory), shadowing that consoles with unified memory don't do.
 
that isn't quite the implied 4.4GB on a PC
No, it's literal process RAM usage with a 4090. There is no spillover since the VRAM buffer is sufficient.

Some games can push to 8GB+ on PC (Unoptimized Monster Hunter Wilds off the top of my head).

still see cross gen games from PS3 to PS4 like MGSV that had a base game data that worked within less than 256MB XDR,
The PS3 GPU was also fine with hundreds of MBs; that doesn't mean the PS5 GPU is fine with that.

You guys are acting like a modern game can run on 500MB on the CPU side and 12GB on the GPU side to cope. Again, if that were possible, PC would have no problem playing games with 4GB RAM as long as GPU VRAM is sufficient.

There is no console magic.
 
@SoloKingRobert mistakenly exposing himself as Florian was one of the all-time highlight events during my time on GAF.
I don't buy that, but if it were true, it would make everything that happened back then even more hilarious, indeed.
Funny thing is that he was completely wrong about everything. @Riky was the one who correctly said the powers that be in UK government would pressure CMA to back off and let the acquisition go through. Florian was the guy who said Microsoft could just "waive" having to get CMA approval. Just proves that corporations will pay shills as long as they can pretend to know what they are talking about.
We don't know where this pressure came from, so he could still have been right in essence. What counts though is that he predicted the deal to go through - which got a lot of warriors riled up, and even more so when it actually happened. This is what I look back to so fondly.

My man, that's not his question.
It's useful information though.
 
No, it's literal process RAM usage with a 4090. There is no spillover since the VRAM buffer is sufficient.

Some games can push to 8GB+ on PC (Unoptimized Monster Hunter Wilds off the top of my head).


The PS3 GPU was also fine with hundreds of MBs; that doesn't mean the PS5 GPU is fine with that.

You guys are acting like a modern game can run on 500MB on the CPU side and 12GB on the GPU side to cope. Again, if that were possible, PC would have no problem playing games with 4GB RAM as long as GPU VRAM is sufficient.

There is no console magic.
Process ram usage tells us nothing in reality.

The Nvidia driver, virtual memory, and RAM usage are all part of that process figure to a large degree.
Without the source code you have no way of knowing what the real CPU-specific data usage is, since most games are middleware at their core; they use memory wastefully on PC and at the start of every console generation, and improve as we progress.
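As a small illustration of why those task-manager numbers are opaque, here's a quick Python sketch (my own, not from the thread) reading the one process-wide counter the OS exposes; `resource` is stdlib but Unix-only, and the units of `ru_maxrss` differ by platform:

```python
# What an OS-level "RAM usage" counter actually measures: the whole
# process. `resource` is in the Python stdlib but Unix-only; ru_maxrss
# is reported in KiB on Linux and bytes on macOS.
import resource

usage = resource.getrusage(resource.RUSAGE_SELF)
peak_rss = usage.ru_maxrss
print(f"peak resident set size: {peak_rss}")
# In a real game this single number lumps engine data, middleware pools,
# and the GPU driver's CPU-side allocations together, which is why it
# says little about the "real" CPU-specific data usage.
```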

UE3.5 clearly had a runtime that fit nicely within the PS3/360 memory limits. UE4 was the same for 8GB, and UE5.7 clearly fits within this generation's. Next gen UE's runtime will be proportionally smaller given the cost of memory; that will probably be the main selling point of UE6. So I still doubt a 30GB PS6 is seeing less than 23GB for its GPU.
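The budget math behind that last sentence, as a sketch; the 30GB total, OS reserve, and CPU-side figures are assumptions for illustration, not announced specs:

```python
# Hypothetical UMA budget split. The PS5 line uses the known 16GB total
# and ~3.5GB OS reserve (12.5GB exposed to games); all PS6 numbers here
# are guesses for illustration only.

def gpu_budget(total_gb, os_reserve_gb, cpu_side_gb):
    """Whatever the OS and CPU-side data don't claim is left for the GPU."""
    return total_gb - os_reserve_gb - cpu_side_gb

print(gpu_budget(16.0, 3.5, 5.0))   # PS5 with ~5GB CPU-side data -> 7.5
print(gpu_budget(30.0, 3.5, 4.0))   # hypothetical PS6, leaner runtime -> 22.5
```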
 
I don't buy that, but if it were true, it would make everything that happened back then even more hilarious, indeed.

It was hilarious and yes, it happened. He outed himself as Florian by taking a screenshot of his own twitter page.

lol...here you go:


We don't know where this pressure came from, so he could still have been right in essence. What counts though is that he predicted the deal to go through - which got a lot of warriors riled up, and even more so when it actually happened. This is what I look back to so fondly.

Most of us predicted the deal would go through. It didn't rile anyone up. Everyone knew it was going to happen.

I mean....there was a poll and the vast majority said it would go through.

[attached: poll results screenshot]


Florian was one of the ones who said the deal would go through without requiring any changes. So he was actually wrong.

[attached: screenshot of Florian's post]
 
Because 3.5GB is on the low end. If the average game were completely fine with that, 8GB of RAM on PC would be stutter-free.
This wasn't my question, and I'm not sure what point it's making. My question was why you were mentioning "9GB VRAM" to begin with on a system with unified memory. I don't get it. Why are you concentrating on separate "VRAM" when somebody asked "You don't think that there will be games that use the PS6's memory to the fullest extent?"

And your response was "Sony first party developers will support PS5 for a very long time. And that has 9GB VRAM. They can add nothing that they can't turn off for PS5."

If they wanted to "add something that they can't turn off", you're talking about "RAM for CPU" from that 12.5GB pool, which they can just use more of. For VRAM there is plenty they can add which uses more than that "9GB VRAM" on PS5 and which you can easily "turn off". Higher-quality textures, for one.


BG3 Console is 5.3GB, CP2077 PC is 4.4GB, SoTR PC is 2.9GB, Wukong PC is 5.2GB, etc.
I am not ignoring the flexibility of UMA. It's already factored in. My response more speaks of 9GB VRAM as being on the high end of what can be allocated for only the GPU.
9GB VRAM is not the high end of what can be allocated for only the GPU, but let's assume it absolutely was. Why even mention PS5's upper VRAM usage when answering a question about the PS6? Are you suggesting that more VRAM is pointless because they'll make cross-gen games? That's nonsense, because you can very easily scale VRAM usage higher with higher settings even in current-gen games. I don't get your point.
I didn't make that claim. The difference is the OS. XSS allows developers to use 8.8GB total between CPU/GPU (the OS uses 1.2GB). PS5? 12.5GB.
You had posted an image of BG3 using 4.7GB "Top RAM" separated from "Top VRAM" (these are only separated by function) in its latest version. I wanted to know whether you believe it uses 4.7GB on XSS, which is different from the 5.5GB you later claimed on PS5, and asked why you think it uses more on the latter. I didn't ask for a list of other games' RAM use on PC. These are clearly not the same on console.
I am not ignoring the flexibility of UMA. It's already factored in. My response more speaks of 9GB VRAM as being on the high end of what can be allocated for only the GPU.
On a PS5, let's say it does (it doesn't strictly, but for the sake of argument let's say it does): how does that answer the question of the PS6 "using its memory to its fullest extent"? You can easily scale settings to use more VRAM.
I already shared the tweet about BG3 on XSS. They only managed to shave CPU usage from 5.3GB down to 4.7GB, but were forced to push VRAM use down from 3.4GB to 2.3GB (to leave enough headroom for overflows / to avoid crashing).
As you can see, it's a lot harder to reduce CPU usage than GPU usage.
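Putting the figures from this exchange side by side (a quick sketch; the budgets and BG3 numbers are the ones quoted in the thread, not mine):

```python
# Headroom arithmetic on the BG3 figures quoted above. PS5 exposes
# ~12.5GB to games; XSS exposes 8.8GB after its 1.2GB OS reserve.
ps5_budget, xss_budget = 12.5, 8.8

bg3_ps5 = {"cpu": 5.3, "gpu": 3.4}   # before the Series S downgrades
bg3_xss = {"cpu": 4.7, "gpu": 2.3}   # after them

def headroom(budget_gb, usage):
    """Spare memory left after CPU-side data and GPU allocations."""
    return round(budget_gb - usage["cpu"] - usage["gpu"], 1)

print(headroom(ps5_budget, bg3_ps5))   # GB spare on PS5
print(headroom(xss_budget, bg3_xss))   # GB spare on XSS
```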
Ok, nobody is arguing against that though.
In practice XSX is beneath 6GB dGPUs. You can call XSX a 5-5.5 GB VRAM console I guess.
It is? I don't think so. For what reason do you believe this?
I already addressed these unoptimized dogshit games. I've already talked about how PC GPU bandwidth plummets once you have to use host-mapped memory over PCIe. Don't make me repeat myself.

TLDR: Games that use 8.5-9GB VRAM on consoles often expect to be able to copy-paste that onto PC. They can't. Effective bandwidth plummets once even a few MBs spill over.

(Note: a few of the Sony first-party ports legitimately use over 8GB VRAM with PS5-equivalent settings.)
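Rough numbers behind the spillover point (approximate public specs for a mid-range card, not measurements):

```python
# Once a buffer spills out of VRAM, the GPU reaches it over PCIe instead
# of the on-board bus. Approximate figures: a 256-bit GDDR6 card vs
# PCIe 4.0 x16 host memory (~32 GB/s per direction).
vram_gb_per_s = 448.0
pcie4_x16_gb_per_s = 32.0

ratio = vram_gb_per_s / pcie4_x16_gb_per_s
print(f"spilled pages are ~{ratio:.0f}x slower to reach")  # ~14x
```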
So 8GB is not "more than enough" even for current-gen games. They don't "expect" anything. They're just designed for that VRAM usage and often have higher recommended requirements. This will only increase further with new consoles.
Fortunately now PC is bigger than just PS5. PS5 has to bend to 8GB now. Games that don't respect PC's 8GB line in the sand flop automatically regardless of how well they do on PS5.
Hard disagree, but again I don't get the relevance between PS6 memory utilisation, PS5 upper bounds for typical VRAM usage, and now this. If you wanted to say PS6 memory will be underutilised due to the install base of PS5/PC, how does that prevent easily utilising the VRAM for higher settings on both higher-end PCs and PS6? Why do you claim they can't add anything to 9GB but can to 12.5GB? The inability to "add anything" for compatibility would apply more to system RAM, not VRAM. So why are you blurting out PS5 VRAM upper bounds as an answer to a PS6 memory utilisation question?
This is so stupid. "The GPU can use as it likes" is so stupid because obviously the CPU will use memory. The "limitation" that I am "creating" on PS5 GPU is that the CPU must use a few GBs of memory. 🤣

It goes without saying that it will. You just want to pretend that it doesn't so it looks better vs PC.
What the fuck are you talking about, man? Make what look better vs PC? PC isn't a set spec, and more VRAM on a PC can be utilised to make it look better than a PS5 too.

Obviously the CPU will use some fraction of the total memory, but the point was that the dev can choose whatever split they like for their game. For example, I could remaster/remake Bloodborne and use the same (or even less) system RAM than the PS4 version on "things that can't be removed", but spend the rest of the unified RAM on improving the visuals: higher-quality textures, larger acceleration structures for ray tracing, any number of things. Again, what has your claimed VRAM upper bound on a PS5 got to do with the usability of more unified RAM that I'm now utilising on a PS6 or a higher-end PC?
Note: as a PC gamer, I am not that familiar with or interested in RDNA architecture, since Radeon isn't a relevant player on PC (6-8% share). So some of the details below could be wrong. If you want to verify, @ Kepler.
What?
The Geometry engine. The rasterizers. WGP level resources.

Xbox stuffed an additional 16 CUs into the same 4 SEs, then ran them at a 15% slower clock. Meaning its frontend was 15% slower but its backend was ~15% better.
This should include: vertex assembly, tessellation, geometry shading, culling, and rasterization. All should be 15% slower on XSX.



If a game is bound not by compute but by the frontend, XSX straight up loses, as we see it too often do.



Microsoft just wanted a big marketing number. Then they got blindsided by Sony pushing the clock speed to 2.15-2.23 GHz. That reduced XSX's deficit in TFLOPs but made PS5's frontend advantage much more real. There's nothing inherently wrong in the games where PS5 beats XSX.
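For what it's worth, the frontend-vs-compute tradeoff can be sketched from the public specs (XSX: 52 CUs @ 1.825 GHz, PS5: 36 CUs @ up to 2.23 GHz, 4 shader engines each); the exact "frontend deficit" depends on whether you compare against 2.15 or 2.23 GHz:

```python
# Back-of-envelope XSX vs PS5 ratios. With the same shader engine count,
# fixed-function frontend work (vertex assembly, culling, rasterization)
# scales with clock alone, while compute scales with CUs x clock.

def tflops(cus, ghz, flops_per_cu_per_clock=128):
    """Peak FP32 TFLOPs: 64 lanes x 2 ops per CU per clock."""
    return cus * flops_per_cu_per_clock * ghz / 1000.0

xsx = {"cus": 52, "ghz": 1.825}
ps5 = {"cus": 36, "ghz": 2.23}

compute_ratio = tflops(**xsx) / tflops(**ps5)   # ~1.18 in XSX's favor
frontend_ratio = xsx["ghz"] / ps5["ghz"]        # ~0.82, XSX ~18% slower at max clock

print(f"XSX: {tflops(**xsx):.2f} TFLOPs, PS5: {tflops(**ps5):.2f} TFLOPs")
print(f"compute ratio {compute_ratio:.2f}, frontend ratio {frontend_ratio:.2f}")
```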



Sony just out designed Microsoft. Which they did in the Gen 8 consoles as well.
I've never heard people call these differences (like a 15% lower clock) "slower at the frontend", but hey, what do I know; maybe in your circles that's a colloquial or even formal term. You seem to keep throwing pretty random information at me, though, when my question isn't related to that info.
 
Have they said anything about VR, whether it will be supported on this? They're calling it a PC, basically, so can you hook up any VR unit to it? That would be another big sell for this thing for some.
 
Have they said anything about VR, whether it will be supported on this? They're calling it a PC, basically, so can you hook up any VR unit to it? That would be another big sell for this thing for some.

I would assume that if Steam works, SteamVR will also work.

although, who knows...
 
VR would be good, but I wanna know about the Thunderbolt 5 ports and whether eGPUs would work, since they work with an Xbox Ally X.

would be cool. I generally hope that they don't cheap out on ports.

for example, IMO it should at least have 1 HDMI and 1 DP port, and at least 2 USB-A ports on the back.

in the front, 2 USB-C (ideally thunderbolt 5/USB4) ports

and maybe an SD card slot somewhere. or even better, one of those new SSD slots, for those new SSDs that look like SD cards... an easily accessible M.2 slot would also work of course.
 