[MLiD] PS6 Early Specs Leak: AMD RDNA 5, Lower Price than PS5 Pro!

At least with a handheld you can handwave away some of the parity discussions by saying it's a handheld and it shouldn't be expected to run full-fledged PS6 games.
Then there is a chance third parties will completely ignore it and it will be more dead than the Vita, especially if it's not marketed sufficiently and is ignored by the masses.

I am interested in this portable, but I would need some reassurance from Sony that it will be properly supported.

Not having a parity clause is fine, but it should still get 90% of games releasing.

Or put Windows/SteamOS on it.
 
Unless the handheld is good and Sony makes first-party games for it, it'll be dead regardless of third-party support. Similar to Sony's VR. Aside from the PSP, for some reason Sony commits to making the hardware (Vita, PSVR1, PSVR2) but barely makes dedicated first-party games for it. They assume third-party ports will carry it alone. But at least some new first-party handheld games would make it worth it a bit more.

You can tell whether Sony will support it, though, since it's obvious from the beginning. If they barely show or don't show any first-party games during their early big promo trailers, you're not getting any, except maybe 1 or 2 down the line.
 
By third-party support, I mean solid support like what a Steam Deck/Windows handheld gets, or something that a release-parity clause with the PS6 would ensure.

Anything less and it won't make sense to invest in.
 
I agree.

Unless Sony is just trying to do what Nintendo does (and what Sony did in the past): get a piece of both the console and handheld markets and hope it's big purely on current trends. Handhelds always work for Nintendo.

Similar to them doing the VR thing twice with hardly any first-party support. They probably thought it'd be big on hype alone. You don't even hear about PSVR anymore in any way.
 
Something is telling me those Xbox "Magnus" devices are going to be a lot cheaper than $1200/$1500, especially if the performance delta between them and the PS6 is only ~20%. Even if some SKUs scale to higher clocks, more enabled shader cores, or more and faster memory, the APU itself presents a ceiling on how far you can take that sort of thing.

I'd expect the most performant Magnus-based devices to be around $1000, but more modest ones (less system RAM, less storage, some shader cores disabled, a more conservative GPU clock, slower VRAM) can probably go for $700-$800 with some type of profit, or at worst at cost. And that doesn't rule out a SKU maybe even aiming for $599.

For a $1200 or $1500 version of that Magnus device I'd expect at least a 384-bit bus and 32-48 GB of VRAM at faster clocks, but unless I've misinterpreted the corrections of the past week or so (granted, I haven't followed all of them), the conclusion is that Magnus is using a 192-bit bus, correct? So yeah, that doesn't sound like a $1500 device.
 
Sounds logical to me.

I don't know enough about tech to try to understand all the specs and configurations people are guessing at, but if the PS5 Pro goes for $700 and the next Xbox is supposed to be jacked up, $1,000-ish tops for a base model.

Then, as you said, any premium models go bigger and pricier.

I don't expect the PS6 to be any less than $700, but even $1,000 for the next Xbox just seems so odd and out there. Then again, if the PS6 is $700-800, maybe $1,000 for a better system isn't that far-fetched.

Regardless, these consoles are in PC pricing territory now. Not saying anything about price/performance ratios, just going on pure price. And way back we'd be buying consoles for around $300.
 
Interestingly, he said Magnus was only 1.5-2x better than the PS5 Pro.

Secret Cerny Sauce confirmed?
Well, at first he said Magnus was the PS6, that it oozed M. Cerny's engineering throughout, and that it was totally impressive...

Then, curiously, he set aside his interest in Magnus and now contradicts himself when assessing potential performance 🤷

It's basically a repeat of what he already did with the PS5 Pro: speculating beyond the data. That's what makes his articles always exude a "take with a grain of salt" quality.
 
Something is telling me those Xbox "Magnus" devices are going to be a lot cheaper than $1200/$1500, especially if the performance delta between them and the PS6 is only ~20%. Even if some SKUs scale to higher clocks, more enabled shader cores, or more and faster memory, the APU itself presents a ceiling on how far you can take that sort of thing.

I'd expect the most performant Magnus-based devices to be around $1000, but more modest ones (less system RAM, less storage, some shader cores disabled, a more conservative GPU clock, slower VRAM) can probably go for $700-$800 with some type of profit, or at worst at cost. And that doesn't rule out a SKU maybe even aiming for $599.

For a $1200 or $1500 version of that Magnus device I'd expect at least a 384-bit bus and 32-48 GB of VRAM at faster clocks, but unless I've misinterpreted the corrections of the past week or so (granted, I haven't followed all of them), the conclusion is that Magnus is using a 192-bit bus, correct? So yeah, that doesn't sound like a $1500 device.
If Kepler is comparing it to a 5080 and it's releasing next year, that's the equivalent of a $999 GPU on the market. AMD of course would be a lot cheaper in comparison, but the total BoM would still justify a > $1000 price tag. I don't think they have the luxury to subsidize this like Sony as it is likely not going to move the kind of volumes that would be needed to make it profitable at lower price tags. Having said that, I wouldn't be surprised to see lower performance variants at lower prices, or higher performance variants at even higher prices down the line.
 
If Sony improves the storage architecture so that the GPU can access more data stored on the SSD directly and only uses the GDDR7 for latency-sensitive data, we may actually see less RAM next gen.
I think the distance between processors and storage is a gulf so wide that large improvements to the SSD and the I/O pipeline are needed just to keep up with RAM bandwidth improvements.

You could rely on the SSD more and still need more RAM to buffer, yes ;).

To think that in the 2030s a system with a maximum of 24 GB of total system memory is enough is myopic. I am not talking about crazy numbers like the M4 Max's RAM amount and bandwidth, but this is a problem their engineering teams will need to struggle with and solve, the same way they had to pull a rabbit out of their hat to get 8 GB of GDDR5 at launch for the PS4.
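For a sense of scale on that gulf using current-gen numbers (PS5's publicly stated 448 GB/s memory bandwidth versus its 5.5 GB/s raw SSD throughput), here's a rough sketch:

```python
# Rough scale of the RAM-vs-SSD bandwidth gap on the current PS5 (public figures).
gddr6_bandwidth_gbs = 448.0  # PS5 unified GDDR6 memory bandwidth
ssd_raw_gbs = 5.5            # PS5 SSD raw throughput
print(f"RAM is ~{gddr6_bandwidth_gbs / ssd_raw_gbs:.0f}x faster than the SSD")  # ~81x
# Even a several-times-faster next-gen SSD leaves a large gap, so RAM still has
# to buffer whatever the GPU touches every frame.
```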
 
He deleted that video lol.
Surprise!!!

With all the forum threads, news stories, tweets, etc. where it was cited and discussed, it's hard to believe that deleting it will achieve an "it never happened / I never said such a thing." 🤷
 
A power difference of the scale discussed in this thread is completely inconsequential. PS3 and Xbox 360, PS4 and Xbox One, and PS5 and Xbox Series X had similar differences and it made no practical difference.

The only time it really made a difference was PS2 vs. Xbox; that was a gap big enough to attract players to jump over.
I would say PS4 vs. Xbox One, and especially Xbox One X vs. PS4 Pro, made a practical difference. IMO the difference here will be around Xbox One X vs. PS4 Pro, so significant.
 
If Kepler is comparing it to a 5080 and it's releasing next year, that's the equivalent of a $999 GPU on the market. AMD of course would be a lot cheaper in comparison, but the total BoM would still justify a > $1000 price tag.

The only major differences between a discrete GPU and a console are the form factor and the fact that the console has an SSD + a game controller. I don't expect the console using the equivalent desktop GPU to be more than $200 more expensive than the discrete GPU.

I think the AT2 chip will be $500 MSRP, just like the RX 9070 XT.
 
I don't expect the console using the equivalent desktop GPU to be more than $200 more expensive than the GPU.
So the desktop equivalent gpu for PS5 pro is $500 now? Not a rhetorical question as I don't know the answer. Just looking for logical consistency
 
The only major differences between a discrete GPU and a console are the form factor and the fact that the console has an SSD + a game controller. I don't expect the console using the equivalent desktop GPU to be more than $200 more expensive than the discrete GPU.

I think the AT2 chip will be $500 MSRP, just like the RX 9070 XT.
The console also needs cpu, ram, storage, power supply, mobo, etc etc etc. The cost of a discrete GPU vs the cost of a complete fully equipped console are apples and oranges conversations. Unless I'm misunderstanding what you're going for here.
 
So the desktop equivalent gpu for PS5 pro is $500 now? Not a rhetorical question as I don't know the answer. Just looking for logical consistency

The 7700 XT, 7800XT have MSRP less than $500. The 9070 GRE should be less than $500 MSRP if released in the US.
 
The console also needs cpu, ram, storage, power supply, mobo, etc etc etc. The cost of a discrete GPU vs the cost of a complete fully equipped console are apples and oranges conversations. Unless I'm misunderstanding what you're going for here.

A GPU has RAM and a board too. The CPU is added to the die on a console, but you also strip out parts you don't need from a discrete GPU, such as the Infinity Fabric/cache and some display engines, so in the end you're not adding much die space at all.

The power supply is one I forgot. TY.
 
Just curious, when it comes to RAM configs, how does, let's say, 2 x 12 = 24 vs. 4 x 6 = 24 affect the gamer or dev?

Is there literally zero effect for gamers and devs, and it's more about the company's COGS or form factor? Or could there be a gaming effect?
There could be a world of difference. Or not much at all.

2 x 12 = 24 GB can come in two forms. You either have 2 GB chips on a 12-channel bus with 32 bits per channel, which translates to a 384-bit bus (no one is going to do this), and this would give you over 1.2 TB/s of bandwidth.

Or you have a clamshell on a 6-channel, 32-bit-per-channel bus, which translates to a 192-bit bus with a pair of memory chips sharing each 32-bit channel, so you effectively get the same bandwidth from two chips as you would from one. That will be around 700 GB/s+.

As you can see, one config offers significantly more bandwidth. The clamshell will have higher latency though.

The dev will not really be affected, though; the system overall will have higher memory access latency in the clamshell design, but it won't be the end of the world.
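To put rough numbers on those two configs (a quick sketch; the per-pin GDDR7 speeds below are example values chosen to land near the figures above, not leaked specs):

```python
# Peak bandwidth = (bus width in bits / 8) * per-pin data rate in Gbps -> GB/s.
def peak_bandwidth_gbs(bus_width_bits: int, gbps_per_pin: float) -> float:
    return bus_width_bits / 8 * gbps_per_pin

# 12 chips, each on its own 32-bit channel: a 384-bit bus.
print(peak_bandwidth_gbs(384, 28))  # 1344.0 GB/s -> the "over 1.2 TB/s" case
# 12 chips in clamshell pairs across 6 channels: still a 192-bit bus.
print(peak_bandwidth_gbs(192, 30))  # 720.0 GB/s -> the "around 700 GB/s+" case
```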
The console also needs cpu, ram, storage, power supply, mobo, etc etc etc. The cost of a discrete GPU vs the cost of a complete fully equipped console are apples and oranges conversations. Unless I'm misunderstanding what you're going for here.
Let's not forget that the BoM of those $1000 GPUs is usually less than half the retail price though. It's not costing AMD/Nvidia $950 to make a $1000 MSRP GPU. The real cost sits somewhere around $400 for a $1000 GPU.
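Purely as an arithmetic illustration of that point (the $400 and $1000 figures are the estimates above, not confirmed numbers):

```python
# Rough margin math using the figures mentioned above (estimates, not confirmed).
msrp = 1000  # retail price of the discrete GPU
bom = 400    # claimed build cost
print(f"BoM share of MSRP: {bom / msrp:.0%}")                       # 40%
print(f"Implied hardware gross margin: {(msrp - bom) / msrp:.0%}")  # 60%
# A console sold at or near cost keeps that margin close to zero, which is why a
# comparable console can retail for far less than a PC built from retail parts.
```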
 
The 7700 XT, 7800XT have MSRP less than $500. The 9070 GRE should be less than $500 MSRP if released in the US.
Got it. So a $700 price for ps6 would make sense. Then if Magnus is 20 to 30% faster (as per Kepler) on the same architecture, wouldn't that justify a price around $900?
 
AT2 mid range should be the replacement for RX 9070 XT, so $700-$800 IMO.

PS6 I expect to be $499/$599 and break-even to slightly profitable.
 
I think the distance between processors and storage is a gulf so wide that large improvements to the SSD and the I/O pipeline are needed just to keep up with RAM bandwidth improvements.

You could rely on the SSD more and still need more RAM to buffer, yes ;).

To think that in the 2030s a system with a maximum of 24 GB of total system memory is enough is myopic. I am not talking about crazy numbers like the M4 Max's RAM amount and bandwidth, but this is a problem their engineering teams will need to struggle with and solve, the same way they had to pull a rabbit out of their hat to get 8 GB of GDDR5 at launch for the PS4.
I always tell myself that if anyone was going to incorporate the SSD controller into the SoC to bypass PCIe's low bandwidth and high latency, it would be Mark Cerny.

The NAND flash would be connected directly to the SoC in the same way as the GDDR.
 
Interestingly, he said Magnus was only 1.5-2x better than the PS5 Pro.

Secret Cerny Sauce confirmed?
There are Cerny ray-tracing patents if you do a search - I am waiting for more info on these but it looks like the usual thing he does (goes off to his mind-palace and comes up with an optimal solution to a problem which others perhaps don't even think about).

There's also this https://patentscope.wipo.int/search/en/detail.jsf?docId=US451711771&_cid=P12-MDZB0G-54452-1: "An asset store controller is configured to control management of assets in the asset store and the delivery of assets throughout the asset aware computing architecture. The asset store controller may be software running on the CPU with an assist from other on-chip units, in one embodiment. The asset store controller may alternatively be an on-chip complex containing co-processors, such as additional CPUs. Both the CPU and GPU may be modified for integration and/or interfacing with the asset store controller" which is highly suggestive of custom hw (like the io complex in ps5) which is designed to select the correct LOD of an asset instead of loading all LODs and then discarding most of the data - again it's an idea which anyone working on perf critical code knows - do as little work as possible, fetch as little as possible (moving data across memory bounds is very expensive in terms of watts).

"The GPU may be modified to be asset aware. For example, when an object is rendered, the GPU recognizes whether the required LODs of the asset have been loaded or not. The GPU is able to issue an "emergency load" request for the required LODs and then stall until the required LODs are loaded. In addition, the GPU recognizes what the highest LOD that has been loaded into memory is for a particular asset, and render a corresponding video frame using up to that highest LOD—but not beyond that highest LOD. Manipulation of page tables, CPU caches and GPU caches may be required as part of the loading and freeing of system memory, and therefore the asset store controller, the CPU and the GPU may be configured to accelerate these operations. In that manner, from the perspective of a game executing on the CPU, the required assets for corresponding draw calls "magically" show up in system memory in time to be used." <- highly suggests changes to a typical GPU, perhaps most of this is captured in RDNA5 etc though as I expect any good/clever ideas are now part of that project amethyst partnership.
 
I'm puzzled why MLiD keeps framing PS6 performance in terms of running PS5 games and getting them to a locked frame rate. That is supposed to be the PS5 Pro's job.

The PS6 is supposed to play more graphically intensive games, not be a PS5 Pro Pro.
 
Let me know when we get hard specifications and less speculation.
I was thinking of buying a high-end console next gen, but I think I'm going to hold out for a second-generation revision. Even if the benefit is competing with a higher-end PC GPU, the closed nature of consoles is a major turn-off, even if I grew up with that.
 
PS6 DE at $499.99 with every Blackwell feature and then some would be bonkers.

The disc drive will probably be separate.

Xbox is irrelevant to even discuss; it will probably fail to reach 10M LT.
 
- UDNA/RDNA5 is a rather large jump efficiency and features wise (looking at KeplerL2's suggestions) over PS5 Pro's hybrid RDNA2.x roots… TFLOPS do not tell the whole story. RT/PT workloads might have a much larger than expected efficiency gains on RDNA5 compared to RDNA2 even when considering the RDNA4 derived RT units (SER, ray reconstruction / AI based denoising, etc… all bring a large efficiency delta… anything improving the performance of incoherent workloads can have massive gains there… I see Cerny investing time there rather than pure CU count to increase FLOPS)
I mean, sure, but it remains to be seen how usable advanced RT will be on the next gen of console h/w. If they're at about the 5070-to-5080 performance level, then I'd argue that PT won't be used often, and when it is, we will likely get it in 30 fps modes only.

Bruh. 18TF is not even 2X.
Yeah, so? It'll be more than 2X in practice because of other improvements - bandwidth, RT h/w, advanced AI upscaling, etc.
It will likely be a similar visual step to what we got between PS4 and PS5, with the main exception this time being the absence of target-framerate doubling (the 30->60 switch).
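For reference on the raw-compute side of that exchange (PS5's commonly cited 10.28 TF figure; 18 TF is the rumoured number being discussed, not a confirmed spec):

```python
# Raw compute ratio behind the "18TF is not even 2X" remark.
ps5_tflops = 10.28        # PS5's commonly cited peak FP32 throughput
rumoured_ps6_tflops = 18  # the rumoured figure quoted above (unconfirmed)
print(f"Raw TFLOPS ratio: {rumoured_ps6_tflops / ps5_tflops:.2f}x")  # ~1.75x
# The point above: bandwidth, RT hardware and AI upscaling can push effective
# performance well past this raw ratio.
```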
 
Can't believe we are back in the hype cycle of leaks.

But I'm super excited to see how these all pan out. If Sony can keep the price lower and offer a nice little spec bump over the PS5 Pro for like 450 to 500 dollars, I'll probably pick one up day one, as long as there are a few awesome Sony single-player games set up for launch.
 
I'm puzzled why MLiD keeps framing PS6 performance in terms of running PS5 games and getting them to a locked frame rate. That is supposed to be the PS5 Pro's job.

The PS6 is supposed to play more graphically intensive games, not be a PS5 Pro Pro.
This was never the target of the Pro. The target was to play at the same performance-mode framerates (good or bad) with quality-mode settings.

On the other hand, the PS5 actually played most PS4 games at a locked framerate, which is probably what the PS6 will also do with PS5 games, though DRS will also enter into the equation.
 
I mean, sure, but it remains to be seen how usable advanced RT will be on the next gen of console h/w. If they're at about the 5070-to-5080 performance level, then I'd argue that PT won't be used often, and when it is, we will likely get it in 30 fps modes only.
I am hopeful that their AI and RT tech will enable PT at dynamic 1440p (FSR4.x/PSSR2-upscaled 1440p dynamic resolution) and 60 FPS in well-optimised titles, given that we can already have DOOM: The Dark Ages at 60 FPS today at this level of quality and complexity, with RTGI and RT required. With PSSR2/FSR4 next year I would not be surprised to see some PS5 titles reaching for PT at 30 FPS, and it looks like RDNA5 is a giant step ahead feature-set-wise.

4K@60/120 Hz (upscaled) PT might be for a PS6 Pro, but the PS6 needs a hook and I think path tracing will be one of them.
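For reference, the frame-time budgets those targets imply:

```python
# Frame-time budget per target framerate.
for fps in (30, 60, 120):
    print(f"{fps} fps -> {1000 / fps:.2f} ms per frame")
# 30 fps -> 33.33 ms, 60 fps -> 16.67 ms, 120 fps -> 8.33 ms
```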
 
MLiD claims he knows a lot more about the PS6 but isn't allowed to say much, and now claims that the ray-tracing performance is between 5 and 10x faster than the PS5 Pro. I suspect this will vary on a game-by-game basis, but the uplift should be huge.

EDIT: I'm not as suspicious of this claim; I think Cerny gave it away during the PS5 Pro seminar when he said he expects several quantum leaps in ray-tracing performance in the future.




MLiD says that PS6 RT performance is 5-10X the base PS5, not the PS5 Pro. And we were told that the PS5 Pro was already 2-4X the PS5.
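Putting those two claimed ranges side by side (both are leak figures, not confirmed specs), the implied uplift over the Pro spans a wide range:

```python
# Implied PS6-over-PS5-Pro RT uplift from the two claimed ranges (leaks, unconfirmed).
ps6_vs_ps5 = (5, 10)  # claimed PS6 RT speedup over base PS5
pro_vs_ps5 = (2, 4)   # claimed PS5 Pro RT speedup over base PS5
low = ps6_vs_ps5[0] / pro_vs_ps5[1]   # conservative PS6 vs generous Pro
high = ps6_vs_ps5[1] / pro_vs_ps5[0]  # generous PS6 vs conservative Pro
print(f"Implied uplift over PS5 Pro: {low:.2f}x to {high:.1f}x")  # 1.25x to 5.0x
```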
 
This was never the target of the Pro. The target was to play at the same performance-mode framerates (good or bad) with quality-mode settings.

On the other hand, the PS5 actually played most PS4 games at a locked framerate, which is probably what the PS6 will also do with PS5 games, though DRS will also enter into the equation.
I'm referring to unpatched games.
 
I am hopeful that their AI and RT tech will enable PT at dynamic 1440p (FSR4.x/PSSR2-upscaled 1440p dynamic resolution) and 60 FPS in well-optimised titles, given that we can already have DOOM: The Dark Ages at 60 FPS today at this level of quality and complexity, with RTGI and RT required. With PSSR2/FSR4 next year I would not be surprised to see some PS5 titles reaching for PT at 30 FPS, and it looks like RDNA5 is a giant step ahead feature-set-wise.

4K@60/120 Hz (upscaled) PT might be for a PS6 Pro, but the PS6 needs a hook and I think path tracing will be one of them.
If the next-gen consoles are really able to match the performance of the RTX 5080 (which I believe is possible, but not with a power budget of just 160W and only 48 CUs), they will easily be able to run PT games at 1440p with the help of FSR 4.

Next-gen consoles should offer a similar quality/experience to my PC. My OC'd RTX 4080S (60 TF) has comparable performance to a stock RTX 5080 (56 TF) and runs PT games at 100-170 fps at 1440p with the help of DLSS Quality and FGx2. At 4K I need to use Performance mode to get 90-120 fps. On a gamepad you really don't need more than that, and even on keyboard + mouse the experience is good enough to enjoy these PT games.
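For context on the modes named in the screenshots that follow, these are the standard DLSS render-scale factors (Quality ≈ 2/3 per axis, Performance = 1/2); the quick calculation below just shows what each mode renders internally:

```python
# Internal render resolution for the DLSS modes used in the screenshots below.
def internal_res(out_w: int, out_h: int, scale: float) -> tuple:
    return int(out_w * scale), int(out_h * scale)

modes = {"Quality": 2 / 3, "Performance": 1 / 2}
for target, (w, h) in {"1440p": (2560, 1440), "4K": (3840, 2160)}.items():
    for mode, scale in modes.items():
        print(target, mode, internal_res(w, h, scale))
# e.g. 4K Performance renders at 1920x1080 internally, matching the "1080p
# internally reconstructed to 4K" note below.
```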

I'm including screenshots from various PT games in the spoiler.

Cyberpunk 2077, 4K DLSS Performance (1080p internal, reconstructed to 4K) with Path Tracing: [3 screenshots]

Cyberpunk 2077, 1440p DLSS Quality + FGx2 with Path Tracing: [3 screenshots]

Doom: The Dark Ages is the most demanding PT game I have played and it's still playable.

Doom: The Dark Ages, 1440p DLSS Quality + FGx2 with PT: [screenshot]
Doom: The Dark Ages, 1440p DLSS Quality (no FG) with PT: [screenshot]
Doom: The Dark Ages, 1800p DLSS Performance + FGx2 with PT: [2 screenshots]
Doom: The Dark Ages, 4K DLSS Performance + FGx2 with PT (50 fps without FG): [screenshot]

Alan Wake 2, 1440p DLSS Quality + FGx2 with PT: [4 screenshots]

Indiana Jones, 1440p DLSS Quality + FGx2 with PT: [2 screenshots]

Black Myth: Wukong, 1440p DLSS Quality + FGx2, 100-120 fps with maxed-out PT and 130-150 fps with medium PT: [screenshot]
Black Myth: Wukong, 4K DLSS Performance, 80-100 fps: [2 screenshots]
Black Myth: Wukong, photo mode without the OSD, also 4K DLSS Performance; it's only 1080p internally, but to my eyes the image looks 4K-like: [screenshot]

Personally I love hybrid RT, but I don't think PT's performance impact is justified. PT is almost twice as demanding as hybrid RT, and the differences aren't that obvious in most games (maybe except for Cyberpunk 2077, but even there you need to know where to look). I think developers on next-gen consoles will still use hybrid RT instead of PT.

149 fps with path tracing: [screenshot]

269 fps with hybrid RT: [screenshot]

136 fps with PT: [screenshot]

258 fps with hybrid RT: [screenshot]


Hybrid RT runs very well on my GPU; PT, however, always destroys the framerate.
 
It'll be a massive failure once again but cope harder.
Idk, I can see it doing well. It just depends on pricing. There are millions of Steam users right now, and I'd imagine some (including myself) probably wouldn't mind getting a console if it could play their games with high fidelity and without all the quirks of PC gaming - shader stutter, shader compilation, or just plain poor optimization. Plus you get things like quick resume, etc., that let you jump in and out of games quickly. That's one of my favorite features on my Steam Deck and consoles that I wish Windows had.
 
Right, if you're getting a PC anyway and Microsoft is offering a box that can play Xbox and PC games, I don't know why you wouldn't consider it. For example, I assume GTA 6 will be playable at least in some kind of backwards-compatibility mode, and if the PC port isn't out, that gives you a way of playing the game.
 
Idk, I can see it doing well. It just depends on pricing. There are millions of Steam users right now, and I'd imagine some (including myself) probably wouldn't mind getting a console if it could play their games with high fidelity and without all the quirks of PC gaming - shader stutter, shader compilation, or just plain poor optimization. Plus you get things like quick resume, etc., that let you jump in and out of games quickly. That's one of my favorite features on my Steam Deck and consoles that I wish Windows had.
I think most developers/manufacturers consider the core gamer demographic pretty important, which is why support hasn't dried up for the Xbox Series, Steam Deck, etc.

Selling 150 million consoles is great, but having higher MAU who buy and play games is more important IMO.

Duping a casual with cool marketing and promotions is not very productive, especially if they use the system for 6 months and then it's collecting dust.
 
Idk, I can see it doing well. It just depends on pricing. There are millions of Steam users right now, and I'd imagine some (including myself) probably wouldn't mind getting a console if it could play their games with high fidelity and without all the quirks of PC gaming - shader stutter, shader compilation, or just plain poor optimization. Plus you get things like quick resume, etc., that let you jump in and out of games quickly. That's one of my favorite features on my Steam Deck and consoles that I wish Windows had.
Why would it get better optimization than PC if it will be an even more niche device than PC, with a very small population that actually buys games for it?

Selling 150 million consoles is great, but having higher MAU who buy and play games is more important IMO.
What does that have to do with Xbox, especially the next Xbox?
 
I think most developers/manufacturers consider the core gamer demographic pretty important, which is why support hasn't dried up for the Xbox Series, Steam Deck, etc.

Selling 150 million consoles is great, but having higher MAU who buy and play games is more important IMO.

Duping a casual with cool marketing and promotions is not very productive, especially if they use the system for 6 months and then it's collecting dust.

Developers care more about ARPU (average revenue per user) than MAU (monthly active users), which can be inflated in the case of Microsoft, whose MAU includes users on PlayStation, Switch, mobile, and PC who may be engaging with their games but not with their storefront, which matters more to the developer/publisher.
 
Why would it get better optimization than PC if it will be an even more niche device than PC, with a very small population that actually buys games for it?

What does that have to do with Xbox, especially the next Xbox?

Sales of a device like the Steam Deck are very important because it's sold to a core player without additional convincing.

The next-gen Xbox will firmly fall into that category.

Developers care more about ARPU (average revenue per user) than MAU (monthly active users), which can be inflated in the case of Microsoft, whose MAU includes users on PlayStation, Switch, mobile, and PC who may be engaging with their games but not with their storefront, which matters more to the developer/publisher.
Yes, that tends to be higher when people are playing the system.
 
Sales of a device like the Steam Deck are very important because it's sold to a core player without additional convincing.

The next-gen Xbox will firmly fall into that category.
The Steam Deck has added value for core gamers.
What is the added value of the next Xbox compared to the alternatives?
 