That's still arguable. The 4090 will be two generations removed from the top spot once the PS6 comes out. It's all contingent upon how good the advancements are in the next two nodes for AMD. If in 5 years their mid-range chip still cannot outperform a 4090, they will be a failure. The 4090 will likely still be far stronger for machine learning, but for graphics rendering? I wouldn't be the least bit surprised if the PS6 is stronger, and would be a bit disappointed if it's not.

It will paint a prettier picture but won't have the same horsepower as a $2200+ card.
Nope, it will smoke the 4090 in both feature set and performance. For reference, the Titan X was the top PC card 5 years prior to the PS5's release. It runs Horizon Zero Dawn (not Forbidden West) at 40-60fps at 1440p on medium settings.
0.88% of the Steam users active last month had a 4090 according to Steam's survey. This is what, around 1M sold maybe?

You're looking at scalper prices, because the 4090 is still in high demand. Yet you were just saying how nobody bought one, lmao. This would be like me saying the PS5 is a $1000 console after launch when only scalpers had them. It's dumb.
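A quick back-of-the-envelope check on that "around 1M" guess. The ~120 million monthly active Steam users figure below is an assumption for illustration, not a number from the survey itself:

```python
# Rough sanity check on the "around 1M sold" guess.
# The ~120M monthly active users count is an assumed figure for illustration,
# not something reported by the Steam hardware survey.
steam_monthly_active_users = 120_000_000
share_with_4090 = 0.0088  # 0.88% from the hardware survey
print(f"{steam_monthly_active_users * share_with_4090:,.0f}")  # ~1,056,000
```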
Raytracing and real-time global illumination in games are still in their infancy, even on a 4090.

Raytracing as it exists right now on consoles is basically a marketing bullet point. "LOOK OUT, WE GOT RAYTRACING IN THIS GAME" and then it's just super basic puddles and reflections. Fact is, the only people pushing raytracing to the next level with path tracing and RTGI are Nvidia.
Yes, you have no idea. At the top of their website there are different examples of their tech, with left and right arrows to browse through them: https://www.isize.co/

No idea who iSize is, but it's gonna amount to another software-based upscaling technique. You sound all mixed up; checkerboard rendering on PS4 Pro was also a software-based upscaling technique, and it wasn't a very good one.
hi my little cute beauty!

If you want something with the power of a 4090... just buy a 4090. The PS games will land on pc eventually.
"ps5 pro" LOL there wont be one.It will be more powerful. Assuming 2028 launch and 2nm it should beat the 4090, even at ray tracing.
PS5 Pro is apparently going to have significant leaps in AMD RT capabilities and that is out next year.
Some will land on PC, others won't.

If you want something with the power of a 4090... just buy a 4090. The PS games will land on pc eventually.
They aren't really ahead in anything.
ACM-G10 is a 400mm² die on N6 which, in normal rasterisation, at best competes with Navi 22 (a 250mm² N6 die using an old architecture) or Navi 33 (a 200mm² die on N6).
It does better at raytracing, where it competes with Navi 22 (a 340mm² die, again an old architecture).
Nvidia basically has an unassailable lead in graphics, and Intel are worse than AMD in every relevant metric except price/performance.
And given how much money Intel are currently losing, I really doubt they'd be willing to compete with AMD on price in designing a console APU.
Let us not forget that neither Intel's GPUs nor its CPUs are more power efficient than AMD's offerings, particularly on the CPU side. Unlike on PC, where you can make excuses for poor performance per watt if the price/performance is good, consoles do not have that luxury.
Intel has a roadmap, sure. So does AMD. I'll believe anything that roadmap sells when I see the benchmarks. Until then I wouldn't get your hopes up about Intel magically outcompeting AMD in GPUs.
Bro, I'm an Nvidia shill and this is just wrong. Sure, Nvidia might have their AI pipeline down, but AMD's RT performance has clearly caught up now that the pipelines are less Nvidia-centric in newer RT titles.

AMD is so far behind Nvidia at this point that even if they're speccing the PS6 right now as a $1,000 console it will still be way behind the 4090 at raytracing.
so excited for today's tech in 7 years!
2028 launch? The 4090 will seem like my 970 by then, jesus wept.

It will be more powerful. Assuming 2028 launch and 2nm it should beat the 4090, even at ray tracing.
PS5 Pro is apparently going to have significant leaps in AMD RT capabilities and that is out next year.
How do you come to this conclusion?

It will be more powerful. Assuming 2028 launch and 2nm it should beat the 4090, even at ray tracing.
PS5 Pro is apparently going to have significant leaps in AMD RT capabilities and that is out next year.
Basic logic? It's like 5 years away; are you assuming GPU technology is just not going to improve? 3nm is already production-ready, and by the time the PS6 is ready to go, 2nm will have been out for several years.

How do you come to this conclusion?
It was leaked by the same guy that leaked the slim (and the detachable drive) and the PS Portal. It's almost certain.

"ps5 pro" LOL, there won't be one.
It's because of the AI push: Nvidia's proper AI hardware is sold out till god knows when, so scrubs are resorting to 4090s because it's the next best thing they can get, and AMD once again just waited for the future, then scrambled like headless chickens when the tree is growing golden fruit and they've only just got the sprouts growing.

Amazon and Google show prices mostly from around $2300-$2700, more or less around $2500.
The AMD image is pretty bad, but the DLSS image still looks like shit. DLSS stans are basically very loud advocates bragging about how their turd looks better than another vendor's turd. I say this as someone who owns a 4090 and avoids DLSS like the plague where possible.

First of all, the 4090's MSRP is $1600. And plenty of suckers bought them, myself included.
But you're right, that wasn't a fair comparison. Here's a better one.
This was also before DLSS 3 and the gulf getting even wider. I dunno what you are even suggesting Sony is gonna do besides some software-based solution, but unless it has been in the oven for years and years, it is gonna be far worse than current-day DLSS.
Alan Wake 2 path tracing puts it about on par with the 3080 and 4070.

Alan Wake 2 path tracing puts it slightly behind the 3090 and in between the 4070 and 4070 Ti; it's hard to find other path tracing benchmarks that aren't Cyberpunk. But comparing flagships which are in completely different price brackets is pointless when it comes to console architecture, where everything is on a tight budget. Price-point wise they are behind, but not even one full gen behind.
Yes, DLSS still looks bad, but it's true that it looks better than FSR, and as of now it's the best in-game alternative for supersampling game images in real time.

The AMD image is pretty bad, but the DLSS image still looks like shit. DLSS stans are basically very loud advocates bragging about how their turd looks better than another vendor's turd. I say this as someone who owns a 4090 and avoids DLSS like the plague where possible.
The 6700 XT should be a closer match for the PS5.

Based on the PS4 launch:
In 2013 the consoles launched; 2 years later the top GPU was the GTX 980 Ti, which was 3x faster than the 7850 in the PS4.
The PS5 launched in 2020 and was noticeably faster than the 980 Ti.
2 years later the 4090 launched and is ~3x faster than the PS5's GPU equivalent.
So yeah, if the PS6 launches in 2027 it will be faster than a 4090.
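The same reasoning spelled out as rough arithmetic. The only multiplier not taken from the post above is the 980 Ti-to-PS5 gap, which is assumed at ~20% here since the post only says "noticeably faster":

```python
# Back-of-envelope version of the argument above, with rough multipliers.
ps4_gpu_to_980ti = 3.0  # "GTX 980 Ti ... 3x faster than 7850 in PS4" (from the post)
_980ti_to_ps5 = 1.2     # assumed ~20%: PS5 "noticeably faster" than the 980 Ti
ps5_gpu_to_4090 = 3.0   # "4090 ... ~3x faster" than the PS5's GPU (from the post)

# Implied GPU uplift over one console generation (PS4 GPU -> PS5 GPU):
console_gen_uplift = ps4_gpu_to_980ti * _980ti_to_ps5  # ~3.6x
# If the PS6 repeats that uplift over the PS5, relative to a 4090:
print(console_gen_uplift / ps5_gpu_to_4090)  # ~1.2 -> ahead of a 4090 under these assumptions
```

Halve the assumed console-to-console uplift and the ratio drops below 1.0, which is the whole disagreement in this thread.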
Here's the problem: I don't care if it's better than AMD, I just care if it's good or not, and it's not good enough to use.

Yes, DLSS still looks bad, but it's true that it looks better than FSR, and as of now it's the best in-game alternative for supersampling game images in real time.
These deep learning techniques that upscale and improve frames by considering a few previous frames are just getting started; in a few years they will be way better and faster than now, and they won't be limited to super expensive high-end cards. Nvidia obviously does that to sell GPUs. In the case of iSize, their tech can even run on CPU-only devices (but obviously it can benefit from hardware acceleration to run faster and reduce latency).
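To make the "considering a few previous frames" part concrete, here is a minimal toy sketch of how a learned temporal upscaler is typically wired up. This is not DLSS, FSR, or iSize's actual pipeline; real implementations also feed in motion vectors and depth so the previous output can be re-projected properly before being reused.

```python
# Toy temporal upscaler: the network sees the current low-res frame plus the
# previous high-res output (real pipelines warp it with motion vectors first)
# and predicts a residual on top of a plain bilinear upsample.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyTemporalUpscaler(nn.Module):
    def __init__(self, scale: int = 2):
        super().__init__()
        self.scale = scale
        # 3 channels from the upsampled current frame + 3 from the previous output.
        self.net = nn.Sequential(
            nn.Conv2d(6, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1),
        )

    def forward(self, low_res: torch.Tensor, prev_high_res: torch.Tensor) -> torch.Tensor:
        up = F.interpolate(low_res, scale_factor=self.scale,
                           mode="bilinear", align_corners=False)
        x = torch.cat([up, prev_high_res], dim=1)
        # History supplies detail that a single low-res frame cannot.
        return up + self.net(x)

model = TinyTemporalUpscaler(scale=2)
low_res = torch.rand(1, 3, 270, 480)    # e.g. a quarter-resolution render
prev_out = torch.rand(1, 3, 540, 960)   # last frame's upscaled output
print(model(low_res, prev_out).shape)   # torch.Size([1, 3, 540, 960])
```

A network this small can run on a CPU, which is roughly the claim being made about iSize; the quality and speed gap versus the hardware-accelerated solutions comes from how big the network is and where it runs.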
I remember when people were scoffing at the idea of the PS5/Series X matching a 980 Ti/1080, but the problem is AMD has dropped the ball so badly compared to the past that it's legit a generational difference between Nvidia and AMD currently in forward-oriented tech, so we are forced to consider whether AMD will even be able to match the current flagship by then, which is pretty depressing.

This reminds me of the pre-premiere discussion about PS5 specs. So many people ate crow on the day it was unveiled.
There is still a lot of talent out there; the problem is gamers are giving their money to the games with the most beautiful cinematics. The most recent example is The Muller Powell Principle, a game that released 2 days ago and is amazing, but it doesn't have top-tier cinematics or an overly dramatic story, so people don't like it.

Graphics shouldn't be our concern. The quality of the talent, which keeps dropping, and the left people infecting the industry should be a greater concern.
Baldur's Gate 3 is the exception; it combines great story/cinematics with even greater gameplay.
The problem is that Sony and Microsoft don't design the chip themselves; they bring their demands and request customizations based on AMD's architecture roadmap, so we can't really expect Sony and Microsoft, who have no real experience in graphics manufacturing, to close the gap for AMD. Even with Sony and Microsoft's investment, the GPUs in the consoles, for example, are barely different from the retail products, actually inferior in some areas, and significantly inferior to Nvidia's offerings in RT and ML. The gap has only widened further since then, as Nvidia has pulled away to the point that there is a generational gap.

Microsoft and Sony will put a lot of R&D into closing the gap on forward tech as far as possible. It's hard to say how far they will get, but it's safe to say they'll have problems marketing next gen if they don't.
It will easily be better than a 4090. Some people just don't understand how technology works.

3 years into the PS4 generation, the 1080 Ti was the most powerful GPU. The PS5 gets around that or slightly better, from what I understand about older GPUs.
It's entirely probable that the PS6 will be on par with, if not exceed, the 4090.
One is tuned for rasterization and gaming only. Intel, on its first iteration, is laying the foundation for things to come.
There's a reason for the die size.
Yup
But the battle will happen in the mid-range, over the remaining ~12% market share AMD has.
You're thinking about desktop CPUs; consoles are not bound to that.
Intel's Core Ultra tiled APUs are going after Apple-like efficiency; they're chasing the bigger CPU contenders with their new nodes and architecture.
Intel's 1st entry is already solid. 1st entry. What year was AMD/ATI's first entry? AMD is trailing behind. Intel already came up with a better upscaling solution than AMD on its first go, even though AMD was part of the DX12 ML & RT consortiums ages ago.
Underestimating Intel, with all they have going on in the coming years, is a huge mistake. AMD poked a dragon. Look, I've owned Ryzen CPUs since the Zen 1 1600, all the way to a 5800X3D, but it took almost everything and the best foundry on earth to surpass an Intel sleeping at the wheel on 14nm+++++++++++. That lead ain't gonna hold long with what they have cooking. Having your own foundry in the 2nm and 1.8nm range in 2024 is already a huge advantage. TSMC can only supply so much, and Apple always gobbles up the best node; the leftovers then become a battle of resources between Nvidia, AMD, Qualcomm, Broadcom, MediaTek, Sony, Amazon, etc. That makes chipsets cost more, and they don't have access to the best node.
The AMD roadmap rumour is that they are not even going after the high-end GPU market for RDNA 4; RDNA 3 weakened their knees that much. There were even rumours in late October of layoffs in the GPU department.
Intel Core Ultra 7 155H Arc iGPU surpasses Radeon 780M, new leaked OpenCL result shows - VideoCardz.com
ASUS laptop with Core Ultra 7 155H and Arc iGPU surpasses Radeon 780M in Geekbench OpenCL test. Intel is set to unveil the upcoming Core 100 series in three weeks. The integrated graphics department will receive one of the biggest upgrades. The Core series will soon include Intel Arc GPUs. (videocardz.com)
AMD APUs have been stagnating for a while now, with AMD Phoenix models hampered by low RAM speeds.
Wake me up when the nodes aren't just on paper and we actually have some performance characteristics, yields and so on.

Intel is ramping up 5 nodes in 4 years and has 3 times the AI chipset market share of AMD.
AMD doesn't have the war drums to fend off both Nvidia and Intel, simple as that. They were lucky as hell that Intel was sleeping at the wheel for so many years.
People said the same thing about the 1080 Ti.

If Sony is still going with AMD, then the PS5 Pro matching a 4090 in any way shape or form is an impossible dream.
AMD couldn't achieve this with their Halo product (7900 XTX), why would they be able to achieve it in a device that has far more stringent thermal and pricing limits?
Wait, I misread the thread as PS5 Pro for whatever reason lol.

5 years is a long time in technology.
No, it's not "designed" to brute force its way. What kind of stupid statement is this? NVIDIA makes the cards. It's up to the developers to use it properly.The 4090 is designed to brute force its way through games and in the process suck on a 1k watts of power and the FE is pretty loud.
Preferring story games is another way of saying you want pretty cinematics; there are 2D games out there with amazing stories and nobody is playing them because they don't have pretty graphics.

I get the argument for not prioritizing graphics, but what everyone's problem is with people preferring story games is beyond me.
Umm, people are talking about the PS6... the PS5 Pro would be lucky to match a 3080 Ti.

If Sony is still going with AMD, then the PS5 Pro matching a 4090 in any way shape or form is an impossible dream.
AMD couldn't achieve this with their Halo product (7900 XTX), why would they be able to achieve it in a device that has far more stringent thermal and pricing limits?
Lmao I know. I guess I had the immediate term on my mind when I read it.

Umm, people are talking about the PS6... the PS5 Pro would be lucky to match a 3080 Ti.