Ars Technica: "Chips aren't improving like they used to, and it's killing game console price cuts"

Haven't they tried to make their own break-in with Arc? They're taking steps to enter the dGPU market, and their tech is pretty decent.
Yes, but the "GPU applications" I was talking about weren't videogames, and Arc has basically zero presence there AFAIK. Intel is still pushing their CPUs for those applications.
 
Whatever happened to HBM RAM? I remember it being hyped up years ago, and when people were speculating about console specs, gamers were even thinking it'd be in the new consoles.

Haven't heard about HBM since.
 
Let me get this straight... they are blaming the chips when:

Sony & Microsoft both use an old TSMC process;
Slim consoles weigh so much less;
With each revision the PS5 uses less copper;
GDDR6 prices are now a third of what they were three years ago;
NAND memory is now much cheaper.


Pathetic. I wouldn't buy one because it smells like a scam; they're just making us pay for Microsoft's Activision acquisition and for Sony's mistakes with GaaS and Bungie.
 
Apple already runs the biggest game store in the world, and devs can put their games on iOS and Mac with little trouble. I'm not sure what you want them to do, exactly.

I think you know what I mean... they need them big GaaS games, RPGs, etc.
 
Borderlands 4 will still look like a Borderlands game, Doom will still look like Doom, Metal Gear Solid will still look like Metal Gear Solid, Death Stranding will still look like Death Stranding, GTA VI will still look like GTA, Resident Evil will still look like Resident Evil, The Elder Scrolls will still look like The Elder Scrolls, Silent Hill will still look like Silent Hill, Gears will still look like Gears, GOW will still look like GOW. Problem solved; just be real and factual with the tech, bro.
 
Okay @Schmendrick, what's your expected solution?

They'll have 250 watts at their disposal; they'll need double the TOPS and RT of the PS5 Pro, and at least the same or better raster capability for B/C. The Pro also shows their hand that an NPU is out of the question. They need to parallelize ML/AI across the system, or else incur latency or settle for a modest improvement over the PS5 Pro.

They still have the option to delid, use liquid metal, and use expensive cooling to hit high frequencies/thermals at launch again, but the power consumption will be a surprising design choice if it isn't locked below 250 watts. Even taking BoM out of the equation, other than my asymmetric twin-APU suggestion, how would they even double PS5 Pro performance at 250 watts?
Double the PS5 Pro would mean around the performance of a 5080. Going down to 250W for the whole APU would mean around 40% less power draw, which should be possible moving from 5nm/4nm to 2nm. Ultimately the barrier is cost, not technical feasibility.
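A quick back-of-the-envelope check of that claim, as a hedged sketch; the 5080-class board power figure below is an assumption for illustration, not an official spec:

```python
# Rough check of the "~40% less power draw" argument above.
# Both wattage figures are assumptions for illustration only.

gpu_class_board_power_w = 360.0   # assumed board power of a 5080-class discrete GPU
target_apu_power_w = 250.0        # power budget quoted above for the whole APU

# Power reduction needed at iso-performance to fit that work into 250 W
required_reduction = 1.0 - target_apu_power_w / gpu_class_board_power_w
print(f"Required reduction at iso-performance: {required_reduction:.0%}")  # ~31%

# If the 5nm/4nm -> 2nm shrink plus architecture really delivers ~40% lower
# power at the same performance, the 250 W budget even has a little margin left.
assumed_node_arch_gain = 0.40
print(f"Assumed node + architecture gain:      {assumed_node_arch_gain:.0%}")
```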
 
The last great chip was probably salt and vinegar.
 
Double the PS5 Pro would mean around the performance of a 5080. Going down to 250W for the whole APU would mean around 40% less power draw, which should be possible moving from 5nm/4nm to 2nm. Ultimately the barrier is cost, not technical feasibility.
I agree with your thinking, but for one small issue: PSSR2 and how it uses bandwidth differently from DLSS/XeSS/FSR4.
(Edit: and it's less than 250 watts for the APU - that's most likely entire-system power.)

Mark's technical breakdown of the PS5 Pro and PSSR shows exactly why they didn't follow the Nvidia, Intel, and now AMD FSR4 route of providing dedicated ML silicon for the task; instead, the solution needs to be a general-purpose GPU with enormous ML/AI throughput for the ~1 ms it takes at the end of a frame.

So Mark's main objective with PSSR going forward sounds like fusing the neural network to get more throughput in that 1 ms, rather than hiding latency and taking more milliseconds like Nvidia, Intel, and AMD.
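To make the throughput-versus-latency trade-off concrete, a rough sketch; the per-frame op count is a placeholder I've assumed, only the time budgets come from the argument above:

```python
# Same amount of upscaling work, two different time budgets.
# The op count is a made-up placeholder; the point is the ratio.

upscale_ops_per_frame = 4e10      # assumed ML ops needed to upscale one frame

def sustained_tops(budget_ms: float) -> float:
    """Sustained TOPS needed to finish the work inside the given budget."""
    return upscale_ops_per_frame / (budget_ms * 1e-3) / 1e12

print(f"Fused, 1 ms at end of frame: {sustained_tops(1.0):.0f} TOPS")  # 40 TOPS
print(f"Latency-hidden over 3 ms:    {sustained_tops(3.0):.0f} TOPS")  # ~13 TOPS
# Squeezing the same work into a third of the time needs ~3x the sustained throughput.
```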

Fusing the network means the 15 MB of vector register memory in the Pro needs to be closer to 60 MB in a PS6 - maybe even double that - and doubling the WGPs while also doubling the register memory per WGP on 2nm isn't going to get them there within thermals and power draw.
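The capacity math behind that, as a sketch; the scaling factors come from the doubling described above, and the conclusion about power is the post's, not the arithmetic's:

```python
# 15 MB of vector register memory, scaled by the doubling described above.

pro_vector_registers_mb = 15.0   # vector register memory quoted for the PS5 Pro
wgp_scale = 2.0                  # double the WGP count
per_wgp_register_scale = 2.0     # double the register memory per WGP

ps6_vector_registers_mb = pro_vector_registers_mb * wgp_scale * per_wgp_register_scale
print(f"Scaled vector register memory: {ps6_vector_registers_mb:.0f} MB")  # 60 MB

# So the ~60 MB target is reachable on paper; the argument is that the
# resulting silicon area and power on 2 nm is what blows the thermal budget.
```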

My theory is that a redesign with RDNA4/RDNA5 on a PS5 Pro APU and doubling the vector register memory could get them within 100 watts on an APU, provided they don't clock the CPU and GPU high at the same time and take inspiration from the big.LITTLE approach of smartphones. Fusing the neural network across two separate APUs would then either require them to engineer a high-speed northbridge between the two - which is their specialist expertise - or have them process half the image on each APU and treat rendering like their old multi-PS3 GT5 rendering prototype, which only needed 1,000 Mbit/s connections between PS3s.
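A toy sketch of that split-image idea, purely illustrative; the two-APU split and the render_half helper are hypothetical, not anything Sony has described beyond the old GT5 demo:

```python
# Two hypothetical APUs each render half the scanlines of a frame, and the
# halves are stitched together - loosely like the multi-PS3 GT5 prototype.

from concurrent.futures import ThreadPoolExecutor

WIDTH, HEIGHT = 1920, 1080  # illustrative frame size

def render_half(apu_id: int, y_start: int, y_end: int) -> list[list[int]]:
    """Stand-in for one APU rendering scanlines y_start..y_end-1."""
    return [[apu_id] * WIDTH for _ in range(y_start, y_end)]

with ThreadPoolExecutor(max_workers=2) as pool:
    top = pool.submit(render_half, 0, 0, HEIGHT // 2)
    bottom = pool.submit(render_half, 1, HEIGHT // 2, HEIGHT)
    frame = top.result() + bottom.result()  # stitch the two halves back together

print(f"{len(frame)} scanlines assembled from two APUs")
```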
 
Guess what: Taiwan is no longer poor. Do you know why we stopped making clothes in the West? The same reason chips are getting more expensive. The issue is that countries like Bangladesh are making clothes, but not chips. It's proving much more difficult to move production to India. There just isn't another China. The last 30 years were a fluke.
 
*RAM & NAND in PS5 likely down by ~$100*

*APU in PS5 (Slim) anywhere from down a few bucks to up a few bucks*

*Knock-on cost savings/efficiencies (Slim) from a slightly smaller/lower-wattage chip (PSU, VRMs, heatsink, complexity, smaller box/less weight, etc.)*

"The APU is making us raise prices!"
 
The thing is, the average consumer doesn't care about the cost of process nodes going up or there being major engineering challenges; they expect something for their money. If the PS6 is highly iterative and just something like a 3nm monolithic chip with a bump in CPU/GPU architectures and no more than 60% more raw power over the Pro, then, paired with another excessive cross-gen period, it'll end up as little more than a PS5 Pro+. I think a significant number of people will just hang around on PS5; it already has the drastically improved load times and the general responsiveness / quality-of-life improvements.

I think they need to find some way to pull a miracle out of their arse, as there need to be some self-evident leaps on screen to make it worth it this go around, and not someone saying "if you look closely, you can see the ray tracing on this surface here, it's really impressive what we're doing because the calculation is really complicated and it's being accelerated by special units, it's running at sub-1080p and the particles look like TV static, but it's really impressive because of the calculations, honest...".

I feel like they got away with it this gen, but I wonder if they'll get away with it again. I just don't see a significant chunk of people being inclined to switch again to what'll probably be a $600 console (before a disc drive) so they can play what are fundamentally the same games with some extra bells and whistles. I think they really have to find a way to wow people or to differentiate the experience somehow. I know totally exotic setups are out of the window, but I feel like they need to find a way to lean parts of the system in that direction again to punch above their weight, and at the same time make it incredibly easy to utilise; they also need to drop a few experiences that simply cannot be had on PS5 or even PS4.

Many people I know are frustrated with things as they are, and I've explained to them as clearly as possible why things have slowed and how process nodes affect that. To paraphrase their response, it's basically: "OK, I get it, but that's not really my problem. I expect a certain step up for my money and they need to find a way to do it; if not, then I'll wait and buy a PC down the line, just not bother anymore, or stick with my current console for most of the generation."

They [console makers] have to find a way.

The console makers are coping by charging console owners more, putting out more remasters/ports, and releasing their console exclusives on PC at the moment. I won't be surprised if they double down on that.
 
The console makers are coping by charging console owners more, putting out more remasters/ports, and releasing their console exclusives on PC at the moment. I won't be surprised if they double down on that.
Reminds me of the cable TV industry as it keeps jacking up prices on the few subscribers it has left.
 