PS6 with the same RTX 4090 power?

It will paint a prettier picture but won't have the same horsepower as a $2200+ card.
That's still arguable. The 4090 will be two generations removed from the top spot once the PS6 comes out. It's all contingent upon how good the advancements are in the next two nodes for AMD. If in 5 years, their mid-range chip still cannot outperform a 4090, they will be a failure. The 4090 will likely still be far stronger for machine learning but for graphics rendering? I wouldn't be the least bit surprised if the PS6 is stronger and would be a bit disappointed if it's not.

Nope, it will smoke the 4090 in both feature set and performance. For reference, the Titan X was the top PC card 5 years prior to the PS5's release. It runs Horizon Zero Dawn (not Forbidden West) at 40-60 fps at 1440p medium settings.


The conclusion isn't wrong, but you can't exactly use the 5-year argument as a baseline. The Titan X came from a time when a new node and architecture still took only 1 to 1.5 years. We're now at 2+ years. If the PS6 comes out in 2027-2028, we can expect it to use RDNA 6, so two generations ahead, not three like with the Titan X.
 
You're looking at scalper prices, because the 4090 is still in high demand. Yet you were just saying how nobody bought one, lmao. This would be like me saying the PS5 was a $1,000 console after launch, when only scalpers had them. It's dumb.
0.88% of the Steam users active last month had a 4090, according to Steam's hardware survey. That's what, around 1M sold, maybe?
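As a rough sanity check of that estimate, here's the back-of-the-envelope math, assuming roughly 120 million monthly active Steam users (a figure Valve has cited in the past; it isn't stated anywhere in this thread):

```python
# Back-of-the-envelope estimate of 4090s among active Steam users.
# The monthly-active-user count is an assumption, not a figure from this thread.
monthly_active_users = 120_000_000
survey_share = 0.0088  # 0.88% of surveyed users reported an RTX 4090

estimated_4090s = monthly_active_users * survey_share
print(f"~{estimated_4090s / 1e6:.1f}M RTX 4090s among active Steam users")
# -> roughly 1.1M, the same ballpark as the "around 1M" guess above
```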

Raytracing as it exists right now on consoles is basically a marketing bullet point. "LOOK OUT, WE GOT RAYTRACING IN THIS GAME" and then it's just super basic puddles and reflections. Fact is the only people pushing raytracing to the next level with path tracing and RTGI are Nvidia.
Ray tracing and real-time global illumination in games are still in their infancy. Even on a 4090.

No idea who iSize is, but it's going to amount to another software-based upscaling technique. You sound all mixed up; checkerboard rendering on the PS4 Pro was also a software-based upscaling technique, and it wasn't a very good one.
Yes, you have no idea. At the top of their website there are different examples of their tech, with left and right arrows to browse them: https://www.isize.co/

iSize has their own AI-based deep learning image enhancing (and supersampling) technique. Yes, like any software (DLSS is also software), it is software based. And like any software, it can be accelerated by hardware.

And yes, checkerboard rendering was upscaling software too, and it also benefited from hardware acceleration. At the time it was the best option available, the one that made possible something people thought was impossible: putting AAA games at 4K at a decent framerate on a PS4 Pro.
 
If you want something with the power of a 4090... just buy a 4090. The PS games will land on PC eventually.
 
They aren't really ahead in anything.

ACM-G10 is a 400mm² die on N6 which, in normal rasterisation, at best competes with Navi 22 (a ~340mm² die using an old architecture) or Navi 33 (a 200mm² die on N6).
It does better at ray tracing, where it competes with Navi 22, a 340mm² die, again an old architecture.

One is tuned for rasterization and gaming only. Intel, on its first iteration, is laying the foundation for things to come.

There's a reason for the die size

[image: benchmark chart]


Nvidia basically has an unassailable lead in graphics

Yup

But the battle will happen at the mid-range, for the remaining ~12% market share AMD has.

And Intel are worse than AMD in every relevant metric, except price/performance.
And given how much money Intel are currently losing, I really doubt they'd be willing to compete with AMD on price in designing a console APU.

Let us not forget that neither Intel's GPUs nor their CPUs are more power efficient than AMD's offerings, particularly on the CPU side. Unlike on PC, where you can make excuses for poor performance per watt if the price/performance is good, consoles do not have that luxury.

You're thinking about desktop CPUs; consoles are not bound to that.

Intel's Core Ultra tiled APUs are going after Apple-like efficiency; they're chasing the bigger CPU contenders with their new nodes and architecture.

Intel has a roadmap, sure. So does AMD. I'll believe whatever that roadmap sells when I see the benchmarks. Until then I wouldn't get your hopes up about Intel magically outcompeting AMD in GPUs.

Intel's 1st entry is already solid. 1st entry. What year was AMD/ATI's first entry? AMD is trailing behind. Intel already came out with a better upscaling solution than AMD on their first go, when AMD had been part of the DX12 ML & RT consortiums ages ago.

Underestimating Intel, with all they have going on in the coming years, is a huge mistake. AMD poked a dragon. Look, I've owned Ryzen CPUs since the Zen 1 1600, all the way up to a 5800X3D, but it took almost everything AMD had, plus the best foundry on earth, to surpass an Intel that was asleep at the wheel on 14nm+++++++++++. That lead ain't gonna hold long with what Intel has cooking. Having your own foundry in the 2 and 1.8nm range in 2024 is already a huge advantage. TSMC can only supply so much, and Apple always gobbles up the best node; the leftovers are then a battle for resources between Nvidia, AMD, Qualcomm, Broadcom, MediaTek, Sony, Amazon, etc. That makes chips cost more, and they don't get access to the best node.

The rumour around AMD's roadmap is that they aren't even going after the high-end GPU market with RDNA 4. RDNA 3 hurt them that much. There were even rumours in late October of layoffs in the GPU department.


AMD APUs have been stagnating for a while now, with the Phoenix models hampered by low RAM speed.

Intel is ramping up 5 nodes in 4 years and has 3 times AMD's market share in AI chips.

AMD doesn't have the war chest to fend off both Nvidia and Intel, simple as that. They were lucky as hell that Intel was asleep at the wheel for so many years.
 
AMD is so far behind Nvidia at this point that even if they're speccing the PS6 right now as a $1,000 console, it will still be way behind the 4090 at ray tracing.
Bro, I'm an Nvidia shill and even this is just wrong. Sure, Nvidia might have their AI pipeline down, but AMD's RT performance has clearly caught up now that the pipelines are less Nvidia-centric in newer RT titles.
 
I don't know about that; the 4090 is an outlier, nothing today even from Nvidia comes close. The 4080 is so far behind. The 4090 itself has a higher power draw than a PS5 plus the 65" TV it's hooked up to. The PS6 will handily beat a 4080, though.
 
It will be more powerful. Assuming a 2028 launch and 2nm, it should beat the 4090, even at ray tracing.

The PS5 Pro is apparently going to bring significant leaps in AMD's RT capabilities, and that is out next year.
A 2028 launch? The 4090 will seem like my 970 by then. Jesus wept.
 
The PS6 will be streaming only, and they can upgrade the GPUs in their servers whenever they want. If not, it'll be so far behind the PC master race it's not even funny anymore.
 
Have consoles ever used 5-year-old flagship tech? Don't expect anything over 60 fps at mid settings, and that's if Sony and MS want to sell at a loss.
 
How do you come to this conclusion?
Basic logic? It's like 5 years away; are you assuming GPU technology just isn't going to improve? 3nm is already production-ready, and by the time the PS6 is ready to go, 2nm will have been out for several years.

"ps5 pro" LOL there wont be one.
It was leaked by the same guy that leaked the slim (and the detachable drive) and the PS Portal. It's almost certain.
 
Amazon and Google mostly show prices of around $2,300-$2,700, so more or less $2,500.
It's because of the AI push: Nvidia's proper AI hardware is sold out until God knows when, so people are resorting to 4090s because they're the next best thing they can get. AMD, once again, just waited for the future and is now scrambling like a headless chicken; the tree is growing golden fruit and they've only just got sprouts coming up.
 
Tbh, that's not unrealistic, since the PS6 is not going to be released anytime soon. By that time the PC master race will probably be using 6XXX cards.
 
First of all, the 4090's MSRP is $1,600. And plenty of suckers bought them, myself included.

But you're right, that wasn't a fair comparison. Here's a better one.

[image: upscaling image-quality comparison]


This was also before DLSS 3, with the gulf getting even wider. I dunno what you're even suggesting Sony is going to do besides some software-based solution, but unless it has been in the oven for years and years it's going to be far worse than current-day DLSS.
The AMD image is pretty bad, but the DLSS image still looks like shit. DLSS stans are basically very loud advocates bragging about how their turd looks better than another vendor's turd. I say this as someone who owns a 4090 and avoids DLSS like the plague where possible.
 
Alan Wake 2 path tracing puts it slightly behind the 3090 and in between the 4070 and 4070 Ti; it's hard to find other path tracing benchmarks that aren't Cyberpunk. But comparing flagships that are in completely different price brackets is pointless when it comes to console architecture, where everything is on a tight budget. Price-point-wise they are behind, but not even one full gen behind.
Alan Wake 2 path tracing puts it about on par with the 3080 and 4070.

In Quake 2 and Portal 2 path tracing benchmarks it's again on par with a 3080. In Cyberpunk, the most comprehensive RT benchmark, it's about on par with a 3060 Ti. Price brackets are meaningless, since Nvidia is charging insane margins courtesy of their significantly superior product and dominant market position; we look at comparable die sizes to see how a card performs relative to the transistors in it and the associated cost (keeping process nodes in mind).

The 4070/3080 are in the ~300mm² range while the 7900 XTX is in the 500mm²+ monster-die range, and thus it needs to be compared against the competitor's flagships. Now consider that AMD's current flagship, on a massive die consuming massive amounts of power, is only matching a 3080/4070 (far from Nvidia's flagship, and partly from last gen), and it gets further embarrassed as the RT complexity ramps up. So yeah, when it's clearly inferior to even Ampere in ML and RT, I'm sorry, but it's not even close currently.
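As a minimal sketch of the "normalize by die area" comparison described above (the die sizes are rough public figures I'm assuming, and the performance index is purely hypothetical, not data from this thread):

```python
# Sketch of the die-area-normalized comparison. Die sizes are approximate,
# publicly cited figures (assumptions); the RT index is hypothetical and only
# illustrates the arithmetic; plug in real benchmark results to conclude anything.
die_mm2 = {
    "RTX 4070 (AD104)": 295,       # monolithic die on TSMC 4N
    "RX 7900 XTX (Navi 31)": 529,  # ~300 mm2 GCD + 6 x ~37 mm2 MCDs
}
rt_index = {"RTX 4070 (AD104)": 1.0, "RX 7900 XTX (Navi 31)": 1.0}  # placeholder

for name, area in die_mm2.items():
    print(f"{name}: {rt_index[name] / area:.4f} RT index per mm2")
# If two cards land at a similar heavy-RT result, the one doing it on ~295 mm2
# is extracting far more from its silicon than the one doing it on ~529 mm2.
```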

I was a huge fan of AMD and have always been rooting for and supporting them, both for their history of innovation and because of the desperate need for competition in the GPU space. But let's call a spade a spade: they are embarrassingly behind in software features and forward development, to the point that they have basically thrown in the towel on actually competing. Instead of investing in R&D to make up ground, they're basically happy to feed off Nvidia's scraps.
 
The AMD image is pretty bad, but the DLSS image still looks like shit. DLSS stans are basically very loud advocates bragging about how their turd looks better than another vendor's turd. I say this as someone who owns a 4090 and avoids DLSS like the plague where possible.
Yes, DLSS still looks bad, but it's true that it looks better than FSR, and as of now it's the best in-game option for supersampling game images in real time.

These deep learning techniques that upscale and improve frames using a few previous frames are just getting started; in a few years they will be way better and faster than they are now, and they won't be limited to super-expensive high-end cards. Nvidia obviously restricts that to sell GPUs. In the case of iSize, their tech can even run on CPU-only devices (but obviously it benefits from hardware acceleration to run faster and with lower latency).
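For anyone wondering what "using a few previous frames" actually means mechanically, here's a toy sketch of temporal accumulation, the core idea behind these upscalers (my own simplified illustration; real DLSS/FSR/iSize pipelines also use motion vectors, sample jitter and, for DLSS, a neural network):

```python
import numpy as np

def toy_temporal_upscale(low_res, history=None, scale=2, blend=0.9):
    """Upscale the current frame and blend it with accumulated history.

    This only demonstrates the 'accumulate detail over previous frames' idea;
    it is not how any shipping upscaler is implemented.
    """
    # Nearest-neighbour upscale of the current low-resolution frame (H, W, 3).
    up = low_res.repeat(scale, axis=0).repeat(scale, axis=1).astype(np.float32)
    if history is None:
        return up
    # Exponential blend: old detail persists, new information leaks in each frame.
    return blend * history + (1.0 - blend) * up

# Usage: feed frames in order, carrying the previous output forward as history.
frame = np.random.rand(540, 960, 3).astype(np.float32)  # pretend 960x540 render
out = toy_temporal_upscale(frame)                        # first frame, no history
out = toy_temporal_upscale(frame, history=out)           # later frames reuse history
```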
 
Based on PS4 launch:

Consoles launched in 2013; 2 years later the top GPU was the GTX 980 Ti, which was ~3x faster than the 7850 in the PS4:

[image: GTX 980 Ti vs HD 7850 benchmark chart]


The PS5 launched in 2020 and was noticeably faster than the 980 Ti:

[image: PS5 vs GTX 980 Ti comparison]


2 years later the 4090 launched and is ~3x faster than the PS5-equivalent GPU:

[image: RTX 4090 vs PS5-equivalent GPU benchmark chart]


So yeah, if the PS6 launches in 2027 it will be faster than the 4090.
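The extrapolation implied above, written out (the launch year and yearly improvement rate are assumptions for illustration, not facts from the thread):

```python
# Rough extrapolation of the historical pattern described above.
# Both the PS6 launch year and the yearly gain are assumptions, not data.
flagship_year = 2022     # RTX 4090 launch
ps6_year = 2027          # launch year assumed in the post
yearly_gain = 1.30       # assume ~30%/year improvement at the high end

relative = yearly_gain ** (ps6_year - flagship_year)
print(f"High-end silicon in {ps6_year} ~= {relative:.1f}x a {flagship_year} flagship")
# ~3.7x under these assumptions; even a console-class part at a fraction of that
# would land past a 4090 in rasterisation, which is the post's argument.
```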
The 6700 XT should be a closer match for the PS5.
 
Yes, DLSS still looks bad, but it's true that it looks better than FSR, and as of now it's the best in-game option for supersampling game images in real time.

These deep learning techniques that upscale and improve frames using a few previous frames are just getting started; in a few years they will be way better and faster than they are now, and they won't be limited to super-expensive high-end cards. Nvidia obviously restricts that to sell GPUs. In the case of iSize, their tech can even run on CPU-only devices (but obviously it benefits from hardware acceleration to run faster and with lower latency).
Here's the problem: I don't care if it's better than AMD's, I just care whether it's good or not, and it's not good enough to use.
 
That console would cost like 2,000 bucks if Microsoft gets its way with sideloading stores onto other consoles like the PS6.
 
This reminds me of the pre-reveal discussion about PS5 specs. So many people ate crow on the day it was unveiled.
I remember when people were scoffing at the idea of the PS5/Series X matching a 980 Ti/1080, but the problem is AMD has dropped the ball so badly compared to the past that there's legitimately a generational difference between Nvidia and AMD in forward-oriented tech right now. So we're forced to ask whether AMD will even be able to match the current flagship by then, which is pretty depressing.
 
I remember when people were scoffing at the idea of the PS5/Series X matching a 980 Ti/1080, but the problem is AMD has dropped the ball so badly compared to the past that there's legitimately a generational difference between Nvidia and AMD in forward-oriented tech right now. So we're forced to ask whether AMD will even be able to match the current flagship by then, which is pretty depressing.

Microsoft and Sony will put a lot of R&D into closing the gap on forward-looking tech as far as possible. It's hard to say how far they'll get, but it's safe to say they'll have problems marketing next gen if they don't.
 
Graphics shouldn't be our concern. The quality of the talent that keeps dropping and the left people infecting the industry should be a greater concern.
There is still a lot of talent out there; the problem is gamers are giving their money to the games with the most beautiful cinematics. The most recent example is The Muller-Powell Principle, a game that released 2 days ago and is amazing, but it doesn't have top-tier cinematics or an overly dramatic story, so people don't like it.

Baldur's Gate 3 is the exception; it combines a great story/cinematics with even greater gameplay.
 
There is still a lot of talent out there; the problem is gamers are giving their money to the games with the most beautiful cinematics. The most recent example is The Muller-Powell Principle, a game that released 2 days ago and is amazing, but it doesn't have top-tier cinematics or an overly dramatic story, so people don't like it.

Baldur's Gate 3 is the exception; it combines a great story/cinematics with even greater gameplay.

I get the argument for not prioritizing graphics, but what everyone's problem is with people preferring story games is beyond me.
 
Of course a PlayStation 6 in 2028 will be more powerful than a 4090.
But the issue is how the market will evolve until then, and how it will compare to other GPUs from Intel, Apple, Qualcomm and Nvidia.
We can be sure there will be new features in the GPU market: some for AI, some for RT, some for things we can't even imagine now.
 
Microsoft and Sony will put a lot of R&D into closing the gap on forward-looking tech as far as possible. It's hard to say how far they'll get, but it's safe to say they'll have problems marketing next gen if they don't.
The problem is that Sony and Microsoft don't design the chip themselves; they bring their demands and request customizations based on AMD's architecture roadmap. So we can't really expect Sony and Microsoft, who have no real experience in graphics design, to close the gap for AMD. Even with Sony and Microsoft's investment, the GPUs in the consoles are barely different from the retail parts, actually inferior in some areas, and significantly inferior to Nvidia's offering in RT and ML. The gap has only widened since then, as Nvidia has pulled away to the point that there is a generational gap.
 
The problem is that Sony and Microsoft don't design the chip themselves; they bring their demands and request customizations based on AMD's architecture roadmap. So we can't really expect Sony and Microsoft, who have no real experience in graphics design, to close the gap for AMD. Even with Sony and Microsoft's investment, the GPUs in the consoles are barely different from the retail parts, actually inferior in some areas, and significantly inferior to Nvidia's offering in RT and ML. The gap has only widened since then, as Nvidia has pulled away to the point that there is a generational gap.

MS and Sony helped AMD bring their stuff back into their PC lineup. It would look even worse for AMD if consoles didn't exist. You never know; with the right people and budget they might push it. No one imagined AMD's CPU comeback a decade ago.
 
The PS6 will likely have some custom Sony hardware solution for frame gen and upscaling. The raw power of the console will likely be significant, but I wouldn't expect something on a 4090's level.

The 4090 is designed to brute-force its way through games, sucking down around 1k watts of power in the process, and the FE is pretty loud.

Sony's solution for upscaling and frame gen will probably enhance an AMD GPU that has its own versions of those too, but disabled, so I imagine the perf profile will be similar to a 7900 XTX.
 
Anyone ever noticed how most of the threads on this board are about numbers? Sales numbers, subscriber numbers, hardware prices, software prices, release dates, sequel numbers, review scores, clock speeds, teraflops, refresh rates, frame rates, transfer rates, views on trailers...

It's actually pretty rare to have a thread where people just talk about playing the actual games.
 
Yes, the PS6 will have better performance than the RTX 4090. That card will likely be 6 years old when the PS6 launches.

The PS5 came out 5 years after the GTX Titan X and destroyed it in performance. The PS6 is likely going to come 6 years after the RTX 4090. Midrange GPUs in 2027/2028 will beat the 4090.
 
Three years into the PS4 generation, the 1080 Ti was the most powerful GPU. The PS5 gets around that level or slightly better, from what I understand of older GPUs.

It's entirely probable that the PS6 will be on par with, if not exceed, the 4090.
It will easily be better than a 4090. Some people just don't understand how technology works.
 
One is tuned for rasterization and gaming only. Intel, on its first iteration, is laying the foundation for things to come.

There's a reason for the die size

[image: benchmark chart]

Yes, there is a reason for the die size, but let's not pretend you'd need literally twice the area to incorporate some AI/ML and RT units. ACM-G10 should, based on its structure, be competing with (and beating, tbh) Navi 22 in rasterisation, with the extra 60mm² of die area being that boost in ML performance. That would line up. However, it's substantially slower, embarrassingly so. And not to mention that despite all that extra area, it's only just about competitive with N22 in actual ray tracing anyway. It wins more often than not, but not by any insane margin. And then there's the fact that ACM-G10 draws more power than Navi 22 to achieve worse rasterisation performance and only marginally better RT performance.

And all these ML benchmarks are fantastic, but how will that translate to gaming performance? Why would Microsoft and Sony buy into that? They have a strict requirement for power and die area (i.e. cost) to be as low as possible, because the margins on consoles are razor thin.

Yup

But the battle will happen at the mid-range, for the remaining ~12% market share AMD has.

I dunno how to break this to you, but since Alchemist came out, AMD's market share has actually gone up. So it's not exactly a winning formula, even with the steep discounts.
People were unwilling to forgive AMD for much smaller driver issues 14 years ago, so why would they be willing to forgive Intel, whose driver issues are worse and have been bad for decades (integrated graphics DO count)?

You're thinking about desktop CPUs; consoles are not bound to that.

Intel's Core Ultra tiled APUs are going after Apple-like efficiency; they're chasing the bigger CPU contenders with their new nodes and architecture.

Show me the money. Where are these tiled APUs going after Apple-like efficiency? Where's the proof?
Even Apple is struggling to maintain "Apple-like efficiency" due to the slowdown in node improvements. Intel haven't had node leadership since 14nm, which was nearly 6 years ago. Their nodes have been so poor that they're using TSMC themselves. Why should I just accept that Intel will magically make new nodes that bring them to GPU leadership, when they've been failing to do so for over half a decade?

Intel's 1st entry is already solid. 1st entry. What year was AMD/ATI's first entry? AMD is trailing behind. Intel already came out with a better upscaling solution than AMD on their first go, when AMD had been part of the DX12 ML & RT consortiums ages ago.

AMD/ATI have been competing with Nvidia for 20 years, and along the way they've managed to beat Nvidia at least a few times (R300, R580, RV870, Tahiti, Hawaii) and been plenty competitive on numerous occasions. But I fail to see what this has to do with anything. Having a "better upscaling solution" is grand and everything, but who fucking cares when your GPU is so slow that it needs upscaling to compete properly against native rendering from AMD and Nvidia.

We're also ignoring the fact that ACM-G10 taped out in late 2020 and was aiming to launch in early 2021, but in the end launched midway through 2022. Why should I have any faith that BMG will launch on time and be competitive with Blackwell or RDNA 4 in the mid-range? Heck, it could come out when Blackwell Next and RDNA 5 are launching, if their timelines slip as badly as they did with ACM.

Underestimating Intel, with all they have going on in the coming years, is a huge mistake. AMD poked a dragon. Look, I've owned Ryzen CPUs since the Zen 1 1600, all the way up to a 5800X3D, but it took almost everything AMD had, plus the best foundry on earth, to surpass an Intel that was asleep at the wheel on 14nm+++++++++++. That lead ain't gonna hold long with what Intel has cooking. Having your own foundry in the 2 and 1.8nm range in 2024 is already a huge advantage. TSMC can only supply so much, and Apple always gobbles up the best node; the leftovers are then a battle for resources between Nvidia, AMD, Qualcomm, Broadcom, MediaTek, Sony, Amazon, etc. That makes chips cost more, and they don't get access to the best node.

Intel is also one of those TSMC customers. Gee, I wonder why that is.
Show me where their process leadership is right now. Hypotheticals are meaningless. None of this is about underestimating anyway. I don't doubt Intel will put their strongest foot forward in CPU. But on GPUs I have my doubts, because their track record is worse than AMD's. Arc isn't even Intel's first foray into graphics. Remember Larrabee? That went super well, didn't it? And that was back when Intel had unquestioned CPU and process superiority.

The rumour around AMD's roadmap is that they aren't even going after the high-end GPU market with RDNA 4. RDNA 3 hurt them that much. There were even rumours in late October of layoffs in the GPU department.

I thought the battle would be fought in the mid-range? What does the high-end have to do with anything?


AMD APUs have been stagnating for a while now, with the Phoenix models hampered by low RAM speed.

It's really funny that you mention the Core Ultra 7 155H. Really funny. Because the graphics tile for the integrated graphics on that APU is manufactured on TSMC N5.
Xe-LPG is also a GPU that is 33% larger than Phoenix's, so yeah, I would certainly hope it's faster. Let's see where this ends up when we have actual gaming benchmarks and not synthetics.

In any case, you're comparing an unreleased APU against an APU that's been out for nearly a year. You understand that AMD also has newer APUs in the pipeline, right?

Intel is ramping up 5 nodes in 4 years and has 3 times AMD's market share in AI chips.

AMD doesn't have the war chest to fend off both Nvidia and Intel, simple as that. They were lucky as hell that Intel was asleep at the wheel for so many years.
Wake me up when the nodes aren't just on paper, and we actually have some performance characteristics, yields and so on.

Beyond that, all of this posturing about Intel coming and stealing AMD's lunch somehow fails to address the key question: why would Microsoft and Sony throw away a multi-decade chip partnership with AMD for Intel? What does Intel have that AMD cannot provide? I guess more fab capacity. But in terms of microarchitecture? Why would Sony/MS go with Panther Lake and Celestial over Zen 6 and RDNA 6? Moreover, why wouldn't AMD offer every incentive for Sony/MS to maintain the working relationship instead of defecting to Intel?
Unless you think that Intel's next CPU/GPU combination is going to be so much faster than AMD's in every conceivable metric that they'd be left with no choice but to change partners? I mean, it could happen, but I doubt it.
 
If Sony is still going with AMD, then the PS5 Pro matching a 4090 in any way, shape, or form is an impossible dream.

AMD couldn't achieve this with their halo product (the 7900 XTX), so why would they be able to achieve it in a device with far more stringent thermal and pricing limits?
 
If Sony is still going with AMD, then the PS5 Pro matching a 4090 in any way, shape, or form is an impossible dream.

AMD couldn't achieve this with their halo product (the 7900 XTX), so why would they be able to achieve it in a device with far more stringent thermal and pricing limits?
People said the same thing about the 1080 Ti.
5 years is a long time in technology.
 
New tech could make the 4090 look ancient by then. If we get more DLSS/frame-gen stuff, the 4090 will age badly.

For example, a 4060 Ti (a budget GPU) performs like a 3090 Ti just because of frame gen.

We saw a huge leap in performance over the last 3 generations; that could continue over the next 2 generations.
 
The 4090 is designed to brute-force its way through games, sucking down around 1k watts of power in the process, and the FE is pretty loud.
No, it's not "designed" to brute force its way. What kind of stupid statement is this? NVIDIA makes the cards. It's up to the developers to use it properly.

Every time I see you post about PCs, it's to make up some fake shit. Don't you have a 4090 in your rig? Then why do you always outright lie or spread misinformation at every opportunity?

[image: GPU power consumption chart]


1K watts? Not even if you remove the power limiter and push its clocks as high as possible. Ada is much more power efficient than RDNA 3 or 2.
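A quick sanity check of the numbers in this exchange, using typical rated figures (these wattages are my assumptions for illustration, not measurements from the thread):

```python
# Sanity check of the power-draw claims above. All figures are rough, typical
# values assumed for illustration, not measurements from this thread.
rtx_4090_tgp_w = 450   # Nvidia's rated total graphics power for the 4090
ps5_gaming_w = 200     # roughly what a PS5 draws under load
tv_65in_w = 150        # ballpark for a 65-inch LCD TV; OLEDs can draw more

print(f"4090 rated: {rtx_4090_tgp_w} W vs PS5 + TV: {ps5_gaming_w + tv_65in_w} W")
# The 4090 can spike above its rating with limits raised, but it is nowhere
# near 1 kW, which is the point being made in the reply above.
```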
 
I get the argument for not prioritizing graphics, but what everyone's problem is with people preferring story games is beyond me.
Preferring story games is another way of saying you want pretty cinematics; there are 2D games out there with amazing stories and nobody plays them because they don't have pretty graphics.
 
If Sony is still going with AMD, then the PS5 Pro matching a 4090 in any way, shape, or form is an impossible dream.

AMD couldn't achieve this with their halo product (the 7900 XTX), so why would they be able to achieve it in a device with far more stringent thermal and pricing limits?
Umm, people are talking about the PS6... the PS5 Pro would be lucky to match a 3080 Ti.
 