PS6 with the same power as an RTX 4090?

Normally I would confidently say it would beat the 4090 by a decent margin based on past gens, but AMD has regressed so badly that I'm honestly not sure. I am sure it will beat the 4090 in rasterization, but in RT AMD is literally two gens behind. AMD has basically thrown in the towel at this point and has given up on real R&D investment to the point that they barely innovate, while even Intel has surpassed them in RT and compute/ML. Honestly, AMD is so far behind that next gen the console makers might be forced to look at alternatives to AMD, such as Intel's offerings.
 
It all depends on the price and the advancement of technology. Performance improvement has slowed quite a bit compared to the past, which makes it hard to guess.

Plus, are we still looking at an APU, or are the CPU/GPU now separate (maybe chiplets on the same package, etc.)? Are we splitting memory to avoid hamstringing the CPU with GDDR, etc.?
 
Yes, they better wake the fuck up; Intel's around the corner, ready to trade blows.

Intel might even be a good contender for an APU next gen if one of the console manufacturers wants an edge against the one picking AMD.
This is premium tier copium.
Intel isn't ready for a goddamn thing.
 
I want to say yes, but the sad truth is that unless there's a big change in the next few years, it might not be feasible to have a console with that kind of power at a reasonable price point.

The way it works is simple: you need to put a certain number of little components (transistors) in a chip. The smaller the chip, the cheaper it is to produce/buy and the easier it is to cool (so you can put it in a small box). If you can shrink those transistors enough, you can fit a lot of them in the small chip that goes into a console.

However, in the last few years, shrinking transistors has become more and more expensive for a number of reasons, which is why I'm not convinced it will necessarily happen. It's definitely possible, but no one wants to buy a base console that costs $599.
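To put rough numbers on that, here's a quick back-of-the-envelope Python sketch of how die size drives cost per chip. The wafer cost and defect density below are purely illustrative assumptions (not real foundry figures); only the shape of the math matters.

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300) -> int:
    # Classic gross-die approximation: wafer area / die area, minus an edge-loss term.
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def cost_per_good_die(wafer_cost_usd: float, die_area_mm2: float,
                      defect_density_per_cm2: float = 0.1) -> float:
    # Simple Poisson yield model: yield = exp(-D0 * A), with A in cm^2.
    die_yield = math.exp(-defect_density_per_cm2 * die_area_mm2 / 100)
    return wafer_cost_usd / (dies_per_wafer(die_area_mm2) * die_yield)

# Hypothetical $17,000 wafer: halving the die area more than halves the cost per good die.
print(round(cost_per_good_die(17_000, 300)))  # ~116 USD
print(round(cost_per_good_die(17_000, 150)))  # ~47 USD
```

Bigger dies also yield worse, so console makers are doubly pushed toward the smallest chip that hits their performance target.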
 
I think it will be released in November 2027, and the 4090's visuals will be a joke when compared to the PS6.

And not because of raw GPU teraflops, which we already saw is a dumb metric for comparing against PC or Xbox, because Mark Cerny did some shit to optimize the I/O so the GPU reaches a higher percentage of its theoretical limit than on XB or PC.

I think it will be because of some kind of Mark Cerny magic to get top-tier ray tracing at a very low cost, combined with Sony's own take on DLSS and some AI shit, or some unpredictable stuff that Cerny may do to achieve more with less.

On top of that, moving from HDD to SSD is going to provide some big benefits that we'll see in future game engines; things like Lumen and Nanite are only the first step in that direction.
 
Normally I would confidently say it would beat the 4090 by a decent margin based on past gens, but AMD has regressed so badly that I'm honestly not sure. I am sure it will beat the 4090 in rasterization, but in RT AMD is literally two gens behind. AMD has basically thrown in the towel at this point and has given up on real R&D investment to the point that they barely innovate, while even Intel has surpassed them in RT and compute/ML. Honestly, AMD is so far behind that next gen the console makers might be forced to look at alternatives to AMD, such as Intel's offerings.
Doesn't a 7900 XTX outperform a 3080 in ray tracing games, except maybe Cyberpunk, and get pretty close to the 4070 before DLSS and FSR enter the picture?
 
I think it will be because of some kind of Mark Cerny magic to get top tier raytracing at a very low cost, combined with Sony's own take on DLSS and some AI shit or some unpredictable stuff that Cerny may do to achieve more with less.
Yeah, right before he becomes president of the United States and simultaneously ends the war in Israel and Ukraine. Still will pale in comparison to how he'll solve the climate crisis in 2 years.

Sony is lucky to have him.
 
I think it will be released in November 2027, and the 4090's visuals will be a joke when compared to the PS6.

And not because of raw GPU teraflops, which we already saw is a dumb metric for comparing against PC or Xbox, because Mark Cerny did some shit to optimize the I/O so the GPU reaches a higher percentage of its theoretical limit than on XB or PC.

I think it will be because of some kind of Mark Cerny magic to get top-tier ray tracing at a very low cost, combined with Sony's own take on DLSS and some AI shit, or some unpredictable stuff that Cerny may do to achieve more with less.

On top of that, moving from HDD to SSD is going to provide some big benefits that we'll see in future game engines; things like Lumen and Nanite are only the first step in that direction.

Do you think the PS5 is running prettier games than a GTX 1080 Ti could ever imagine?
 
I think it will be released in November 2027, and the 4090's visuals will be a joke when compared to the PS6.

And not because of raw GPU teraflops, which we already saw is a dumb metric for comparing against PC or Xbox, because Mark Cerny did some shit to optimize the I/O so the GPU reaches a higher percentage of its theoretical limit than on XB or PC.

I think it will be because of some kind of Mark Cerny magic to get top-tier ray tracing at a very low cost, combined with Sony's own take on DLSS and some AI shit, or some unpredictable stuff that Cerny may do to achieve more with less.

On top of that, moving from HDD to SSD is going to provide some big benefits that we'll see in future game engines; things like Lumen and Nanite are only the first step in that direction.

As long as they are using AMD hardware they are handcuffed to their bad tech. Look at how long FSR has been around and how terrible it still is. Look at how bad their newest GPUs perform in raytracing. They're getting left in the dust. Look at AW2 and Cyberpunk (w / pathtracing) on a 4090.

FSR game at 17 fps on ps5:

[screenshot]


DLSS 3 game at 100+ fps on a 4090

[screenshot]



AMD simply can't hack it. They stink.
 
Do you think the PS5 is running prettier games than a GTX 1080 Ti could ever imagine?
With the exception of Cyberpunk 2077 running on a multi-thousand-dollar PC, yes, it does. First, because its GPU is supposedly equivalent to a 2070 Super, not a 1080 Ti.

Second, because its I/O path for getting data from the SSD (compressed data, decompressed by dedicated hardware) into memory, where both the GPU and CPU can access it, is faster than what current PCs can do, which allows it to perform better than a supposedly equivalent PC in terms of GPU/CPU/memory/SSD.

And third, because of the talent of the people who make high-end exclusive games that aren't available on PC (even if some of them may be in the future).
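As a rough illustration of that I/O point: with a dedicated hardware decompressor, the effective read speed the game sees is roughly the raw SSD bandwidth multiplied by the compression ratio, since the CPU never touches the compressed stream. A minimal sketch using Sony's published PS5 figures (5.5 GB/s raw, "typically" 8-9 GB/s after Kraken decompression):

```python
def effective_bandwidth_gbs(raw_gbs: float, compression_ratio: float) -> float:
    # Effective throughput seen by the game, assuming the hardware
    # decompressor keeps pace with the raw SSD stream.
    return raw_gbs * compression_ratio

# PS5: 5.5 GB/s raw; an average Kraken ratio of ~1.5-1.6x on game data
# lands in Sony's quoted "typical" 8-9 GB/s range.
print(effective_bandwidth_gbs(5.5, 1.6))  # ~8.8 GB/s
```

On PC the same work has traditionally gone through the CPU (or, more recently, DirectStorage with GPU decompression), which is where the claimed gap comes from.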


As long as they are using AMD hardware they are handcuffed to their bad tech. Look at how long FSR has been around and how terrible it still is. Look at how bad their newest GPUs perform in raytracing. They're getting left in the dust. Look at AW2 and Cyberpunk (w / pathtracing) on a 4090.

FSR game at 17 fps on ps5:

[screenshot]


DLSS 3 game at 100+ fps on a 4090

[screenshot]



AMD simply can't hack it. They stink.
Sony can do their own DLSS/FSR tech, pretty likely using the technology from iSize, the company they recently acquired, which doesn't require any specific hardware.

Having said that, the comparison you made is unfair. If anything, you should compare Cyberpunk (or any other game) on PS5 vs. its PC version on equivalent Nvidia hardware (a 2070 Super) to show what DLSS brings (or compare the same game on PC with FSR vs. DLSS on equivalent cards, differences that many people won't notice unless they watch a Digital Foundry or similar comparison).

Comparing a random PS5 game vs. Cyberpunk on a 4090 (a $2,500 card released two years after the console) is as dumb as comparing the best-looking PS5 game vs. a crappy PC game running on a GeForce 256 to say that Nvidia sucks.

Obviously a $2,500 card that almost nobody bought should look better than an older $500 console.
 
As long as they are using AMD hardware they are handcuffed to their bad tech. Look at how long FSR has been around and how terrible it still is. Look at how bad their newest GPUs perform in raytracing. They're getting left in the dust. Look at AW2 and Cyberpunk (w / pathtracing) on a 4090.

FSR game at 17 fps on ps5:

[screenshot]


DLSS 3 game at 100+ fps on a 4090

[screenshot]



AMD simply can't hack it. They stink.
A PS6 with an RX 10600 XT, will it be capable of running native 4K 60fps + ray tracing?

I mean, the current gen games...
And then Sony would probably drop resolution or even use the newer FSR to make true next gen graphics!
 
This is premium tier copium.
Intel isn't ready for a goddamn thing.

Intel is already ahead in RT and ML on their first iteration.

The main culprit behind their initial performance problems is drivers for legacy games outside of DX12 and Vulkan, which isn't a hurdle on consoles.

They have tiled APU designs coming, and their foundry roadmap for 2nm and 1.8nm is ahead of TSMC's on the calendar.

By 2027-28 there's a very good chance that Intel pushes AMD out of the desktop GPU space, as early as their 2nd or 3rd iteration; everyone looking at GPU market share sees this.

So why not consoles?
 
With the exception of Cyberpunk 2077 running on a multi-thousand-dollar PC, yes, it does. First, because its GPU is supposedly equivalent to a 2070 Super, not a 1080 Ti.

Second, because its I/O path for getting data from the SSD (compressed data, decompressed by dedicated hardware) into memory, where both the GPU and CPU can access it, is faster than what current PCs can do, which allows it to perform better than a supposedly equivalent PC in terms of GPU/CPU/memory/SSD.

And third, because of the talent of the people who make high-end exclusive games that aren't available on PC (even if some of them may be in the future).
Well, I've heard the PS5 GPU is equivalent to a 2060~2070, not a 2070 Super or a 3060...

Wouldn't it be wrong to compare AMD's teraflops numbers with Nvidia's teraflops directly, in a literal way?

I mean, the 3090 does around 37 teraflops and performs about the same as the RX 7900 XT with its 52 teraflops.
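For what it's worth, the paper numbers are just ALU count × 2 ops per clock (an FMA counts as two) × clock speed, and part of the gap is that RDNA3's headline figure also counts dual-issue FP32 that games rarely saturate. A minimal sketch with approximate spec figures:

```python
def peak_fp32_tflops(fp32_alus: int, boost_ghz: float, dual_issue: bool = False) -> float:
    # Peak = ALUs * ops per ALU per clock * clock (GHz) / 1000.
    # An FMA counts as 2 ops; RDNA3's marketing number doubles that again via dual-issue.
    ops_per_clock = 4 if dual_issue else 2
    return fp32_alus * ops_per_clock * boost_ghz / 1000

print(peak_fp32_tflops(10496, 1.70))                  # RTX 3090   -> ~35.7 TFLOPS
print(peak_fp32_tflops(5376, 2.40, dual_issue=True))  # RX 7900 XT -> ~51.6 TFLOPS
```

Similar real-world game performance, very different paper numbers, which is why TFLOPS only really works as a comparison within the same architecture.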
 
Doesn't a 7900 XTX outperform a 3080 in ray tracing games, except maybe Cyberpunk, and get pretty close to the 4070 before DLSS and FSR enter the picture?
The 7900 XTX is AMD's flagship, while Nvidia's flagship last gen was the 3090 Ti, so a 3080 is not a fair comparison at all. It is significantly inferior to a 3090 Ti in comprehensive RT benchmarks. When path tracing is enabled it becomes a bloodbath where it's close to a damn 3060.
 
To people thinking the PS6 will be more powerful than the RTX 4090: maybe, but don't get your hopes up.

The fact that the PS5 is on par with an RTX 2070 doesn't help it. It's equivalent to a mid-tier GPU.

If the PS6 follows suit, one would expect it will be equivalent to whatever the '70 series equivalent is at the time.

Considering the RTX 4070 Ti is roughly equivalent to an RTX 3090 (the previous gen's top dog, outside the RTX 3090 Ti, which wasn't really that much better than the RTX 3090), I'd really hope that the RTX 5070 Ti is on par with the RTX 4090. However, Nvidia has been fucking around with the naming schemes (the RTX 4070 Ti should have been an RTX 4070, at a cheaper price).

It's hard to extrapolate past the RTX 5000 series, again since Nvidia isn't really following the trend of "the new '70 series matches the old '80 Ti" anymore (the RTX 4000 series has made it evident how much it sucks relative to the gen-on-gen gains of previous generations' respective models).

One would expect the RTX 6070 would at the very least be comparable to the RTX 5070 Ti (and by association the RTX 4090), but who the fuck knows.
 
With the exception of Cyberpunk 2077 running on a multi-thousand-dollar PC, yes, it does. First, because its GPU is supposedly equivalent to a 2070 Super, not a 1080 Ti.

Second, because its I/O path for getting data from the SSD (compressed data, decompressed by dedicated hardware) into memory, where both the GPU and CPU can access it, is faster than what current PCs can do, which allows it to perform better than a supposedly equivalent PC in terms of GPU/CPU/memory/SSD.

And third, because of the talent of the people who make high-end exclusive games that aren't available on PC (even if some of them may be in the future).


Sony can do their own DLSS/FSR tech, pretty likely using the technology from iSize, the company they recently acquired, which doesn't require any specific hardware.

Having said that, the comparison you made is unfair. If anything, you should compare Cyberpunk (or any other game) on PS5 vs. its PC version on equivalent Nvidia hardware (a 2070 Super) to show what DLSS brings (or compare the same game on PC with FSR vs. DLSS on equivalent cards, differences that many people won't notice unless they watch a Digital Foundry or similar comparison).

Comparing a random PS5 game vs. Cyberpunk on a 4090 (a $2,500 card released two years after the console) is as dumb as comparing the best-looking PS5 game vs. a crappy PC game running on a GeForce 256 to say that Nvidia sucks.

Obviously a $2,500 card that almost nobody bought should look better than an older $500 console.

First of all, the 4090's MSRP is $1,600. And plenty of suckers bought them, myself included.

But you're right, that wasn't a fair comparison. Here's a better one.

[screenshot]


This was also before DLSS 3 widened the gulf even further. I dunno what you're even suggesting Sony is gonna do besides some software-based solution, but unless it has been in the oven for years and years it is gonna be far worse than current-day DLSS.
 
Well, I've heard the PS5 GPU is equivalent to a 2060~2070, not a 2070 Super or a 3060...

Wouldn't it be wrong to compare AMD's teraflops numbers with Nvidia's teraflops directly, in a literal way?

I mean, the 3090 does around 37 teraflops and performs about the same as the RX 7900 XT with its 52 teraflops.
It's entirely game dependent. In Death Stranding, for example, the PS5 beats out a 2080.
 
To people thinking the PS6 will be more powerful than the RTX 4090: maybe, but don't get your hopes up.

The fact that the PS5 is on par with an RTX 2070 doesn't help it. It's equivalent to a mid-tier GPU.

If the PS6 follows suit, one would expect it will be equivalent to whatever the '70 series equivalent is at the time.

Considering the RTX 4070 Ti is roughly equivalent to an RTX 3090 (the previous gen's top dog, outside the RTX 3090 Ti, which wasn't really that much better than the RTX 3090), I'd really hope that the RTX 5070 Ti is on par with the RTX 4090. However, Nvidia has been fucking around with the naming schemes (the RTX 4070 Ti should have been an RTX 4070, at a cheaper price).

It's hard to extrapolate past the RTX 5000 series, again since Nvidia isn't really following the trend of "the new '70 series matches the old '80 Ti" anymore (the RTX 4000 series has made it evident how much it sucks relative to the gen-on-gen gains of previous generations' respective models).

One would expect the RTX 6070 would at the very least be comparable to the RTX 4070 Ti (and by association the RTX 3090), but who the fuck knows.
Sure, comparing the raw power of a PS6 to the 3090 is more realistic...

but yeah, we need to consider all those newer technologies coming in the future, and the optimizations that consoles do pretty well...

Rumors say the PS6 will get its own path-tracing solution made by AMD, or even something better by then...
 
The 4090 was a bigger jump than usual between GPU generations, and even if the next-gen consoles were more powerful than a 4090 (and I'm not certain of that, not in raw horsepower anyway), the 4090 is already starting to struggle to hit 60fps at native 4K in some current-gen games.
Next gen will be very much like this gen so far: a lot of last-gen, or slightly-better-than-last-gen-looking, games but running at higher internal resolutions and possibly framerates, maybe with better RT effects, etc.
 
AMD does fine at raster performance; it's just everywhere else that they're getting blown out. Also, the 4090 is not twice as much as the 7900 XTX.
You're right. Twice as much is too low. The cheapest 4090 is about 2.2x more expensive. Most are around 2.5x the price though.
 
Whatever price or power the PS6 ends up at, we don't even have that many next-gen games as it is, so maybe let's focus on what's going on now before we worry about what will happen in the future.
 
The 7900 XTX is AMD's flagship, while Nvidia's flagship last gen was the 3090 Ti, so a 3080 is not a fair comparison at all. It is significantly inferior to a 3090 Ti in comprehensive RT benchmarks. When path tracing is enabled it becomes a bloodbath where it's close to a damn 3060.
Alan Wake 2's path tracing puts it slightly behind the 3090 and in between the 4070 and 4070 Ti; it's hard to find other path tracing benchmarks that aren't Cyberpunk. But comparing flagships that are in completely different price brackets is pointless when it comes to console architecture, where everything is on a tight budget. Price-point-wise they are behind, but not even one full gen behind.
 
You're right. Twice as much is too low. The cheapest 4090 is about 2.2x more expensive. Most are around 2.5x the price though.

The MSRP of the 4090 is $1,600. The MSRP of the 7900XTX at launch was $999. The 4090 is still hard to find at MSRP because people actually want them but you would be dumb to pay a scalper for it.
 
The MSRP of the 4090 is $1,600. The MSRP of the 7900XTX at launch was $999. The 4090 is still hard to find at MSRP because people actually want them but you would be dumb to pay a scalper for it.
MSRP doesn't matter when you can't buy it at that price point. A 4090 at $2,500 is just not a good buy, no matter how great DLSS is.

Also, the 4090 is not increasing in price due to any gamer demand. Simply looking at Steam survey results can prove that. No, limited stock from Nvidia and the AI boom is responsible for that.
 
MSRP doesn't matter when you can't buy it at that price point. A 4090 at $2,500 is just not a good buy, no matter how great DLSS is.

Also, the 4090 is not increasing in price due to any gamer demand. Simply looking at Steam survey results can prove that. No, limited stock from Nvidia and the AI boom is responsible for that.
For determining what GPU could go in a console the MSRP is the best comparison.
 
Intel is already ahead in RT and ML on their first iteration.

The main culprit behind their initial performance problems is drivers for legacy games outside of DX12 and Vulkan, which isn't a hurdle on consoles.

They have tiled APU designs coming, and their foundry roadmap for 2nm and 1.8nm is ahead of TSMC's on the calendar.

By 2027-28 there's a very good chance that Intel pushes AMD out of the desktop GPU space, as early as their 2nd or 3rd iteration; everyone looking at GPU market share sees this.

So why not consoles?
They aren't really ahead in anything.

ACM-G10 is a ~400mm² die on N6 which, in normal rasterisation, at best competes with Navi 23 (a ~240mm² die on an older architecture) or Navi 33 (a ~200mm² die on N6).
It does better at ray tracing, where it competes with Navi 22 (a ~340mm² die, again an older architecture).
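A rough way to frame that is performance per mm² of silicon, using the die areas above and treating "competes with" as roughly equal performance (the 1.0 values are an assumption for illustration, not benchmark results):

```python
# Relative raster performance of 1.0 is assumed ("competes with"); die areas as quoted above.
dies = {
    "ACM-G10 (Intel, N6)": {"area_mm2": 400, "rel_raster": 1.0},
    "Navi 23 (AMD)":       {"area_mm2": 240, "rel_raster": 1.0},
    "Navi 33 (AMD, N6)":   {"area_mm2": 200, "rel_raster": 1.0},
}
for name, d in dies.items():
    perf_per_area = d["rel_raster"] / d["area_mm2"] * 100
    print(f"{name}: {perf_per_area:.2f} relative perf per 100 mm^2")
```

On those assumptions Intel is spending roughly 1.7-2x the silicon for the same raster performance, which matters a lot when you're bidding for a cost-constrained console APU.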

Nvidia basically has an unassailable lead in graphics, and Intel are worse than AMD in every relevant metric, except price/performance.
And given how much money Intel are currently losing, I really doubt they'd be willing to compete with AMD on price in designing a console APU.

Let's not forget that neither Intel's GPUs nor their CPUs are more power efficient than AMD's offerings, particularly on the CPU side. Unlike on PC, where you can make excuses for poor performance per watt if the price/performance is good, consoles do not have that luxury.

Intel has a roadmap, sure. So does AMD. I'll believe anything that roadmap sells when I see the benchmarks. Until then I wouldn't get your hopes up about Intel magically outcompeting AMD in GPUs.
 
For determining what GPU could go in a console the MSRP is the best comparison.
No, that would be transistor count, die size, and power consumption. Sony won't pay anything near MSRP for AMD chips and Nvidia has a massive profit margin on the 4090.
 
AMD does fine at raster performance; it's just everywhere else that they're getting blown out. Also, the 4090 is not twice as much as the 7900 XTX.

The XTX can be had for around $1,000.




And just to think that you could have all of this power for under $600 with the Sony PlayStation 6... it would be amazing.
 
Well, I've heard the PS5 GPU is equivalent to a 2060~2070, not a 2070 Super or a 3060...

Wouldn't it be wrong to compare AMD's teraflops numbers with Nvidia's teraflops directly, in a literal way?

I mean, the 3090 does around 37 teraflops and performs about the same as the RX 7900 XT with its 52 teraflops.
Maybe what you heard was a fart.

Digital Foundry said in their first test (CoD 2020) that the PS5's PC equivalent is approximately a 2070 Super (and they mentioned the PS5 performed about 20% better), and later with AC Valhalla said it's between a 2070 Super and an RTX 2080 Ti. Later they used a 2070 Super with Ratchet (I think it was the first PS5-only game they analyzed) to find the equivalent. For Spider-Man and Returnal they did use a 2060 Super, but in neither case did they compare PS5 vs. PC performance; they only used it to show how the different PC settings look.

Teraflops aren't a valid metric to measure game performance (see the PS5 beating or matching the Series X in most games, even though the Series X has more teraflops), and Nvidia and AMD teraflops aren't equivalent. In the same way, even trying to use AMD PC equivalents, the results won't be the same due to the better I/O system on PS5, no background apps on console, etc.

And well, this is ignoring PC-specific issues (not only on 'equivalent' hardware) like shader compilation stutter, longer load times, etc.
 
Maybe what you heard was a fart.

Digital Foundry said in their first test (CoD 2020) that the PS5's PC equivalent is approximately a 2070 Super (and they mentioned the PS5 performed about 20% better), and later with AC Valhalla said it's between a 2070 Super and an RTX 2080 Ti. Later they used a 2070 Super with Ratchet (I think it was the first PS5-only game they analyzed) to find the equivalent. For Spider-Man and Returnal they did use a 2060 Super, but in neither case did they compare PS5 vs. PC performance; they only used it to show how the different PC settings look.

Teraflops aren't a valid metric to measure game performance (see the PS5 beating or matching the Series X in most games, even though the Series X has more teraflops), and Nvidia and AMD teraflops aren't equivalent. In the same way, even trying to use AMD PC equivalents, the results won't be the same due to the better I/O system on PS5, no background apps on console, etc.

And well, this is ignoring PC-specific issues (not only on 'equivalent' hardware) like shader compilation stutter, longer load times, etc.
Jesus, the PS5 is at the same performance level as an RTX 2070 Super?

I couldn't even imagine!!!

How strong is that?

I need to find some benchmarks.
 
Maybe what you heard was a fart.

Digital Foundry said in their first test (CoD 2020) that the PS5's PC equivalent is approximately a 2070 Super (and they mentioned the PS5 performed about 20% better), and later with AC Valhalla said it's between a 2070 Super and an RTX 2080 Ti. Later they used a 2070 Super with Ratchet (I think it was the first PS5-only game they analyzed) to find the equivalent. For Spider-Man and Returnal they did use a 2060 Super, but in neither case did they compare PS5 vs. PC performance; they only used it to show how the different PC settings look.

Teraflops aren't a valid metric to measure game performance (see the PS5 beating or matching the Series X in most games, even though the Series X has more teraflops), and Nvidia and AMD teraflops aren't equivalent. In the same way, even trying to use AMD PC equivalents, the results won't be the same due to the better I/O system on PS5, no background apps on console, etc.

And well, this is ignoring PC-specific issues (not only on 'equivalent' hardware) like shader compilation stutter, longer load times, etc.

For bad PC ports the PS5 is (sometimes much) stronger than a 2070 Super; for good PC ports it should be around the same power. With ray tracing it's more like a 2060.

Jesus, the PS5 is at the same performance level as an RTX 2070 Super?

I couldn't even imagine!!!

How strong is that?

I need to find some benchmarks.

In AMD architecture it's like a 6600 XT/6700, but better in some aspects (and worse in others, like the lack of hardware VRS).
 
First of all, the 4090's MSRP is $1,600. And plenty of suckers bought them, myself included.
Amazon and Google show prices mostly from around $2300-$2700, more or less around $2500.

There's one on AliExpress for under $500; maybe that's the one you're talking about.

But you're right, that wasn't a fair comparison. Here's a better one.

[screenshot]


This was also before DLSS 3 widened the gulf even further. I dunno what you're even suggesting Sony is gonna do besides some software-based solution, but unless it has been in the oven for years and years it is gonna be far worse than current-day DLSS.
Yes, but DLSS 3 requires a 4000-series card. DLSS 2.3.4 is a fair comparison; it works on a 2070 Super.

People thought they weren't going to be able to have 4K games on PS4 Pro, and they achieved it with many games, in some cases using checkerboard rendering (see the sketch below).

People thought the PS4 wasn't going to be able to handle VR decently, but they did that reprojection thing to "double" the framerate. On PSVR2 they went further with stuff like foveated rendering + eye tracking to optimize performance even more.

People thought they weren't going to be able to put ray tracing in the PS5, or that the PS5 was going to perform worse than the Series X, and it performs better in many games (same against supposedly equivalent PC specs).

I suggest watching the demos from iSize (the company SIE just bought): they use AI to reconstruct, and optionally upscale, the image from streamed video, achieving better quality even at lower bitrates, in real time, independently of the hardware and the codec, with under 5 ms of latency. The demos they had on their website were really impressive if real, and they mention it can also be used for gaming. So Sony could pretty likely use their tech to make their own DLSS/FSR, maybe without even having to wait for the PS6.
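On the checkerboard rendering point, the basic idea is simple: each frame you only render half the pixels in a checkerboard pattern and fill the other half from the previous frame. Real implementations also use motion vectors and object IDs to avoid ghosting; this toy NumPy sketch only shows the interleave:

```python
import numpy as np

def checkerboard_reconstruct(prev_full: np.ndarray, curr_sparse: np.ndarray, phase: int) -> np.ndarray:
    """Toy checkerboard reconstruction.

    prev_full   -- last reconstructed full-resolution frame (H, W)
    curr_sparse -- this frame's newly rendered pixels, scattered into a
                   full-resolution buffer at the checkerboard positions (H, W)
    phase       -- 0 or 1: which checkerboard set was rendered this frame
    """
    h, w = curr_sparse.shape
    yy, xx = np.mgrid[0:h, 0:w]
    rendered = ((yy + xx) % 2) == phase
    # Keep the freshly rendered pixels; reuse last frame's values for the rest.
    return np.where(rendered, curr_sparse, prev_full)
```

So you pay roughly half the shading cost per frame for something close to full resolution, the same trade-off (smarter reconstruction in exchange for less raw rendering) that DLSS/FSR-style upscalers make.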
 
For bad PC ports the PS5 is (sometimes much) stronger than a 2070 Super; for good PC ports it should be around the same power. With ray tracing it's more like a 2060.


In AMD architecture it's like a 6600 XT/6700, but better in some aspects (and worse in others, like the lack of hardware VRS).
Well, from what I've seen, RE4 Remake runs worse on PS5 than on a 6600 XT.
 
Amazon and Google show prices mostly from around $2300-$2700, more or less around $2500.

There's one on AliExpress for under $500; maybe that's the one you're talking about.


Yes, but DLSS 3 requires a 4000-series card. DLSS 2.3.4 is a fair comparison; it works on a 2070 Super.

People thought they weren't going to be able to have 4K games on PS4 Pro, and they achieved it with many games, in some cases using checkerboard rendering.

People thought the PS4 wasn't going to be able to handle VR decently, but they did that reprojection thing to "double" the framerate. On PSVR2 they went further with stuff like foveated rendering + eye tracking to optimize performance even more.

People thought they weren't going to be able to put ray tracing in the PS5, or that the PS5 was going to perform worse than the Series X, and it performs better in many games (same against supposedly equivalent PC specs).

I suggest watching the demos from iSize (the company SIE just bought): they use AI to reconstruct, and optionally upscale, the image from streamed video, achieving better quality even at lower bitrates, in real time, independently of the hardware and the codec, with under 5 ms of latency. The demos they had on their website were really impressive if real, and they mention it can also be used for gaming. So Sony could pretty likely use their tech to make their own DLSS/FSR, maybe without even having to wait for the PS6.

You're looking at scalper prices, because the 4090 is still in high demand. Yet you were just saying how nobody bought one, lmao. This would be like me saying the ps5 is a $1000 console after launch when only scalpers had them. It's dumb.

Ray tracing as it exists right now on consoles is basically a marketing bullet point: "LOOK OUT, WE GOT RAY TRACING IN THIS GAME," and then it's just super basic puddles and reflections. Fact is, the only ones pushing ray tracing to the next level with path tracing and RTGI are Nvidia.

No idea who iSize is, but it's gonna amount to another software-based upscaling technique. You sound all mixed up; checkerboard rendering on PS4 Pro was also a software-based upscaling technique, and it wasn't a very good one.

DLSS has been around for years at this point and is just getting better. Time will tell what happens with ps6 but AMD has proven over and over that they can't keep up.

 
I don't understand the people who are so pessimistic.
No one knows AMD's plans beyond RDNA4. Yes, RDNA4 will release only as mid-range cards, with no high-end cards, but that doesn't mean anything; what if AMD uses the time to create an incredible card with the RDNA5 series? A PS6 in 2028 would be RDNA6 with RDNA7 features. Don't forget AMD was laughed at constantly for their CPUs, but they ended up making the best CPUs with the lowest power usage. AMD will also make it with their graphics cards; they just need a bit of time, which is exactly what they'll have now. Until the RDNA5 series they're out of the high-end market, so there's lots of time to work things out.
 
Fuck no. Even current-gen consoles haven't tapped the potential of the RTX 2080. Game consoles are meant to rely on low power consumption and use the mobile version of any CPU/GPU. Also, console manufacturers are limited to AMD for BC reasons.

YOU ARE NEVER GOING TO GET A HIGH-END PC OUT OF A GAMING CONSOLE, EVER. YOU WILL NEVER GET A $2,500 PC OUT OF A $500 CONSOLE.

Instead, what you will get is console manufacturers trying to hit 8K and 4K resolutions with dynamic resolution and medium to medium-high graphics settings, which barely even reach 60 FPS. Any expectation beyond this is bullshit and is setting yourself up for disappointment. What also comes with more power is more developers trying to brute-force their games to function, which means even more bugs, freezes, and bullshit.
 