ars technica "Chips aren’t improving like they used to, and it’s killing game console price cuts"

ArtHands


Moore's Law held true over the next four decades thanks in part to dramatic improvements in the manufacturing processes used to make silicon chips. Chip fabricators like Intel and AMD and Samsung and TSMC—and many others that have come and gone—kept developing more and more advanced ways to cram more transistors into the same amount of physical space, making that continued doubling of transistor counts over time feasible.

Not everyone will declare in so many words that Moore's Law is "dead," and any given tech exec's opinion on that says just as much about that exec's motivations as anything else (Nvidia's Jensen Huang, who can sell GPUs for more money if Moore's Law is "dead," will tell you it's dead; Intel execs who are trying to convince you that Intel is on the road to being a competitive manufacturer of high-end chips will contest that).

But the fact remains that progress in new manufacturing technologies has slowed, and developing new ones has gotten dramatically more time-consuming and expensive. And unless people find a way to make transistors sub-atomic, we'll hit the boundaries of actual laws sooner or later: the laws of physics.



How does this relate to game consoles? We're getting to that.

A side effect of fitting more transistors into a given area is that power usage also goes down, because you need to use less electrical current to switch those transistors on and off. Taking any given silicon chip and making it on a newer, more efficient manufacturing process is commonly called a "die shrink," because you're making a functionally equivalent silicon die that uses less physical space.

For computer processors and GPUs, those power savings are usually "spent" by adding transistors, either to improve performance or add new capabilities—that's why a processor from 2020 can perform dramatically better than one from 2010 and consume about the same amount of power.
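That trade can be made concrete with a toy dynamic-power model (total power roughly equals switching energy per transistor times clock frequency times transistor count). This is only a sketch in normalized units; the halving factor below is invented for illustration, not taken from any real process node:

```python
# Toy dynamic-power model in normalized units: total power is roughly
# (switching energy per transistor) x (clock frequency) x (transistor count).
# Real chips also have leakage power and activity factors; this sketch
# only illustrates the trade described above.

def total_power(energy_per_switch: float, frequency: float, transistors: float) -> float:
    """Dynamic power in arbitrary normalized units."""
    return energy_per_switch * frequency * transistors

# Old process, normalized to 1.0 across the board.
old_power = total_power(energy_per_switch=1.0, frequency=1.0, transistors=1.0)

# Hypothetical die shrink that halves per-transistor switching energy:
# at the same power budget, the transistor count can double instead.
new_power = total_power(energy_per_switch=0.5, frequency=1.0, transistors=2.0)

print(old_power == new_power)  # prints True: the savings were "spent" on transistors
```

Holding frequency fixed and doubling the count at half the energy lands on the same power budget, which is the 2010-vs-2020 comparison in a nutshell.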

But game consoles are meant to be static, stable targets for game developers to aim at, so that their internal testing of a game and how it runs is always indicative of how the game will run on everyone's consoles.

So when you shrink the chips in a console, the benefit comes almost exclusively in the form of smaller physical chips that draw less power and put out less heat. You're also potentially getting more chips out of a single wafer of silicon, which theoretically can lower the price per chip (if the manufacturing process itself doesn't cost a ton more money).
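The chips-per-wafer arithmetic is simple enough to sketch. All figures below (wafer cost, die area, yield) are invented for the example, and the dies-per-wafer estimate ignores edge losses and scribe lines; real yield models are more involved:

```python
# Back-of-the-envelope math for why a die shrink can cut per-chip cost.
# All numbers are hypothetical; this is an illustration, not real pricing.
import math

WAFER_DIAMETER_MM = 300  # standard wafer size

def dies_per_wafer(die_area_mm2: float) -> int:
    """Rough dies-per-wafer estimate, ignoring edge losses and scribe lines."""
    wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2
    return int(wafer_area / die_area_mm2)

def cost_per_die(wafer_cost: float, die_area_mm2: float, yield_rate: float) -> float:
    """Wafer cost spread over the good dies it produces."""
    return wafer_cost / (dies_per_wafer(die_area_mm2) * yield_rate)

# Hypothetical console APU: 300 mm^2 on the launch node...
old = cost_per_die(wafer_cost=6000, die_area_mm2=300, yield_rate=0.8)
# ...shrunk to 60% of its area on a newer node whose wafers cost 50% more.
new = cost_per_die(wafer_cost=9000, die_area_mm2=180, yield_rate=0.8)
print(f"old: ${old:.2f}/die, new: ${new:.2f}/die")
```

With these made-up numbers the shrunk die comes out somewhat cheaper despite the pricier wafer; tilt the wafer-cost increase a little further and the shrink stops paying for itself, which is exactly the caveat in the parenthetical above.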

A slowdown of that progression was already evident when we hit the PlayStation 4/Xbox One/Nintendo Switch generation, but technological improvements and pricing reductions still followed familiar patterns. Both the mid-generation PS4 Slim and Xbox One S used a 16 nm processor instead of the original consoles' 28 nm version, and each also had its price cut by $100 over its lifetime (comparing the Kinect-less Xbox One variant, and excluding the digital-only $249 Xbox One). The Switch's single die shrink, from 20 nm to 16 nm, didn't come with a price cut, but it did improve battery life and help to enable the cheaper Switch Lite variant.
 
Node improvements are surely part of it, but I think it's also fabrication capacity. There's so much demand for chip production that fabricators don't have much reason to reduce prices.
 
We already knew that; nodes aren't getting any cheaper. Somehow they're getting double-digit percentages more expensive instead.
 
It's killing price cuts for everything, including PC.
PC is different because the generations move so fast - instead of price cuts each year you got improved hardware. No one expected a cheaper 2080 Super to still be in production in 2025, but most of us would have expected a substantially cheaper PS5.
Most would have been expecting something more impressive than what we have currently available to buy at the 2080 Super price point though.
 
Are giant SoCs no longer viable for future designs?
Do we need to split up the CPU and GPU and become a PC?
Or tiny chiplets that add up to more power?
 
I don't think we will be getting the performance boosts per generation like the last 25 years either
There are two factors there: hardware performance is no longer advancing at a geometric rate, and overall software complexity now adds enough overhead that performance must be sacrificed for practical development simplicity and speed. The first can be somewhat addressed by better implementation of new techniques, like AI upscaling, to deliver more bang for your buck. I imagine the next consoles will have this stuff baked into the silicon for maximum performance gains. So we'll still see a jump with the next gen, but not a seismic one. The second is unavoidable; games are simply so complicated that we need all the various software and hardware layers of abstraction and interpretation to make development feasible for the majority of developers.
 
We already knew that; nodes aren't getting any cheaper. Somehow they're getting double-digit percentages more expensive instead.
There is a hard limit on how small a node can go, and we're already not far from it. So it's understandable that it gets harder and more expensive as we approach the cap of current technology.
Only switching technologies may help, but it's still too early for that.

Are giant SoCs no longer viable for future designs?
Do we need to split up the CPU and GPU and become a PC?
Or tiny chiplets that add up to more power?
The problem is in shrinking; splitting will not help much.
Basically, the cost per transistor doesn't go down anymore, so any extra power means extra cost.
 
I don't think we will be getting the performance boosts per generation like the last 25 years either
Already reality.

PS5/XSX are the same shit as PS4/XBO, just "more" (not Moore, RIP): x86 APUs from AMD. Same thing with Switch 2: another ARM Tegra chip from Nvidia.

In both (or all three) cases, the marquee features are AI upscaling through FSR/PSSR or DLSS and support for hardware ray tracing. PS6 will be more of the same shit but with frame gen added (HW RT @ 120 'fps'!)

Visually, I still think the biggest jump was PS1/N64 to PS2/GameCube. That said, I think the biggest jump as a generation was 360/PS3.

It changed everything. Wireless controllers standard. Online play, lobbies, matchmaking. Digital storefronts right on the console. Both worked as fantastic media centers. Wii exploded the casual market and introduced a novel control scheme.

Being in college during the GH/Rock Band and COD (MW1+2) boom feels like it was the pinnacle.
 
PC is different because the generations move so fast - instead of price cuts each year you got improved hardware. No one expected a cheaper 2080 Super to still be in production in 2025, but most of us would have expected a substantially cheaper PS5.
Most would have been expecting something more impressive than what we have currently available to buy at the 2080 Super price point though.
Okay, but that doesn't address the core problem: if hardware prices continue to go up, people get priced out of the market - PC, consoles, phones, everything.
 
I prefer the term, crisps.
I love these flavored chips, especially at lunch time with a nice deli sandwich. 😋
 
Okay, but that doesn't address the core problem: if hardware prices continue to go up, people get priced out of the market - PC, consoles, phones, everything.
maybe we can finally curb tech consumerism after tech peaks
 
Already reality.

PS5/XSX are the same shit as PS4/XBO, just "more" (not Moore, RIP): x86 APUs from AMD. Same thing with Switch 2: another ARM Tegra chip from Nvidia.

In both (or all three) cases, the marquee features are AI upscaling through FSR/PSSR or DLSS and support for hardware ray tracing. PS6 will be more of the same shit but with frame gen added (HW RT @ 120 'fps'!)

Visually, I still think the biggest jump was PS1/N64 to PS2/GameCube. That said, I think the biggest jump as a generation was 360/PS3.

It changed everything. Wireless controllers standard. Online play, lobbies, matchmaking. Digital storefronts right on the console. Both worked as fantastic media centers. Wii exploded the casual market and introduced a novel control scheme.

Being in college during the GH/Rock Band and COD (MW1+2) boom feels like it was the pinnacle.
The parts in bold were already standard in 6th gen (since 2002), if you had an OG Xbox with Xbox Live.
 
LOL we know.

Visual effects aren't improving in movies, either - all that matters is quality now. These gens will get longer and longer until they eventually stop and then we'll have to think only about game availability, quality, and value.
 
LOL we know.

Visual effects aren't improving in movies, either - all that matters is quality now. These gens will get longer and longer until they eventually stop and then we'll have to think only about game availability, quality, and value.
I certainly hope the generations start getting longer, considering how expensive the hardware is getting and how long it's taking to develop games. Game development was so much faster and cheaper last generation and the generation before that. Studios were able to put out entire trilogies of games in one generation, whereas the same studios this generation can barely get one game out. Something has to change.
 
Node improvements are surely part of it, but I think it's also fabrication capacity. There's so much demand for chip production that fabricators don't have much reason to reduce prices.

Yup, simple supply and demand, plus one company - TSMC - being the main manufacturer of all chips.

So yeah, a few reasons, but the slowing of tech contributes.
 
There is a hard limit on how small a node can go, and we're already not far from it. So it's understandable that it gets harder and more expensive as we approach the cap of current technology.
Only switching technologies may help, but it's still too early for that.


The problem is in shrinking; splitting will not help much.
Basically, the cost per transistor doesn't go down anymore, so any extra power means extra cost.

We're reaching the point where we can't make these things smaller, and it is scary. It's a new reality of stagnation.
 
Die shrinks are one thing, but their cost-effectiveness is now severely diminished by the virtual monopoly TSMC holds in bleeding-edge, high-volume logic chip fabrication.

Wafer costs have been increasing, not decreasing, because Global Foundries, Samsung and even the formerly almighty Intel just cannot compete with TSMC anymore. So TSMC has been flagrantly price gouging their customers, while also taking their time in developing newer nodes as well as whole new materials (beyond silicon); because the lack of competition has removed all impetus for them to be aggressive on price or R&D investment.
 
I certainly hope the generations start getting longer, considering how expensive the hardware is getting and how long it's taking to develop games. Game development was so much faster and cheaper last generation and the generation before that. Studios were able to put out entire trilogies of games in one generation, whereas the same studios this generation can barely get one game out. Something has to change.
Soon we will have just one long generation; the PS6 generation will probably be the start of that. There will be updates, but they'll update features, not raw power.
Like how cars got marginal increases in engine power and just a reskin and some new features every four years.
 
Wafer costs have been increasing, not decreasing, because Global Foundries, Samsung and even the formerly almighty Intel just cannot compete with TSMC anymore. So TSMC has been flagrantly price gouging their customers, while also taking their time in developing newer nodes as well as whole new materials (beyond silicon); because the lack of competition has removed all impetus for them to be aggressive on price or R&D investment.
Everyone would like to be able to make further shrinks, especially given the current margins.
It's just so complex that we're lucky TSMC is able to do it. Without their expertise, we'd be stuck with the subpar processes other companies provide, and the whole state of electronics would be even worse.
 
Yeah, well… that is why I was expecting, and wanted, longer console cycles and no mid-generation upgrades, while a lot of people still seem to want the "choice" of new hardware coming out more often. It does not make sense. Part of the fight is getting devs to invest in the new generation's HW and SW and optimise for it. Short generations go against that, and it also takes longer and longer for great performance boosts to happen.

Now, I understand the feeling: if someone sold me something that made my games look a bit better, I might get it - and I did - but fundamentally it is turkeys voting for Thanksgiving.
 
These gens will get longer and longer until they eventually stop and then we'll have to think only about game availability, quality, and value.
The last Game Boy platform was officially EOL in 2010. GB games are still improving (without Nintendo), and making one has never been easier or less expensive.
In 2023, McDonald's knocked out Grimace's Birthday for GBC in seven weeks to promote a shake.
Studios having a stable, unchanging development platform for decades would streamline game development, lower dev costs, and lead to better games.
Users never stop buying new games on a platform; platforms stop selling users games.
 
"Breaking: President Trump signed an executive order to withdraw Moore's law and replace it with Donald's law allowing chips to be +50x more powerful each year"
 
We're reaching the point where we can't make these things smaller, and it is scary. It's a new reality of stagnation.
I am not so unhappy about it. If you can't sell your product anymore by adding more visually appealing stuff, you must get better at something that is not easily showable but will translate into sales through good word of mouth, i.e., some new quality in an area that is currently lacking from product pitches at publishers. Maybe even with more risk again, when the tech itself isn't evolving and adding costs with every upgrade.
Tech-wise, quantum computing and ray tracing seem, in theory, to be working better. Or whatever - I have no real idea where it might go. But something fundamentally changing the x86/ARM world might be introduced some day.
 
Node improvements are surely part of it, but I think it's also fabrication capacity. There's so much demand for chip production that fabricators don't have much reason to reduce prices.
That's a bingo. Think back as recently as the 14 nm era.

We had Global Foundries, Intel, and Samsung all doing alright at fabrication, not just TSMC.
 
These days, only architecture improvements can deliver the kind of gains a die shrink used to. Phasing out x86 can't come soon enough.
 
Can someone smarter than me explain if the Apple M chips are doing some voodoo shit compared to other chips?
Apple are not doing "voodoo shit". They are using the same chip fabs as everyone else. I suppose where they do have an advantage is that they control both the software and hardware of their platforms, so they can optimize things - such as moving work onto the chip and coding for very specific hardware - in ways others cannot. But they are not breaking records on raw performance. I suppose the volume and money Apple brings also gives them priority as a customer with the chip fabs.
 
Are giant SoCs no longer viable for future designs?
Do we need to split up the CPU and GPU and become a PC?
Or tiny chiplets that add up to more power?

Without a die and cost shrink we are fucked either way.
Chiplets would have similar yields.
AMD and Intel use multiple nodes in their chiplets/tiles to try to reduce costs and gain efficiency.
Probably not viable for consoles, though.
 
PC is different because the generations move so fast - instead of price cuts each year you got improved hardware. No one expected a cheaper 2080 Super to still be in production in 2025, but most of us would have expected a substantially cheaper PS5.
Most would have been expecting something more impressive than what we have currently available to buy at the 2080 Super price point though.
What generations move how fast?!
50xx is basically close to 40xx which was close to 30xx… small jumps
 
What generations move how fast?!
50xx is basically close to 40xx which was close to 30xx… small jumps
He means there's a new gen every two years or so on PC, and what companies do is just discontinue the "old" HW and put the new one in its place.

And the main problem with console prices isn't what Ars Technica says; it's just greediness, testing the waters, and looking to raise profits hard.

I can understand it's not as easy to discount old hardware as in the past, and that's why GPU manufacturers don't do it either, but the power leap between a 3070 and a 5070 is similar to that between a PS5 and a PS5 Pro - yet the 5070 is 50€ more expensive than the 3070, while the Pro is 400€ more expensive than the PS5.
 
Can someone smarter than me explain if the Apple M chips are doing some voodoo shit compared to other chips?

The only voodoo they are doing is adopting every new manufacturing node as the first customer, often exclusively for a year or more. E.g., the PS6 generation is rumored to use a TSMC 3nm-class node in 2027/2028. Apple has already been shipping 3nm at scale since the Apple A17 in 2023, so when consoles catch up it'll be almost half-a-decade-old tech.
 
I am not so unhappy about it. If you can't sell your product anymore by adding more visually appealing stuff, you must get better at something that is not easily showable but will translate into sales through good word of mouth, i.e., some new quality in an area that is currently lacking from product pitches at publishers. Maybe even with more risk again, when the tech itself isn't evolving and adding costs with every upgrade.
Tech-wise, quantum computing and ray tracing seem, in theory, to be working better. Or whatever - I have no real idea where it might go. But something fundamentally changing the x86/ARM world might be introduced some day.

Quantum computing is not going to happen at scale; the cooling requirements are absurd.
 
The only voodoo they are doing is adopting every new manufacturing node as the first customer, often exclusively for a year or more. E.g., the PS6 generation is rumored to use a TSMC 3nm-class node in 2027/2028. Apple has already been shipping 3nm at scale since the Apple A17 in 2023, so when consoles catch up it'll be almost half-a-decade-old tech.
You are technically right, but those 3 nm chips Apple uses in its phones are low-power mobile parts; a console is going to need around 300 mm² with a 250 W power draw.
Those need a different node than the small mobile chips Apple uses.
https://en.wikichip.org/wiki/3_nm_lithography_process has some in-depth info about it.
 