An Xbox 30% more powerful than a PlayStation: what do you think about that?

4070 Ti Super.

You have to remember that graph (I think) is missing all the stuff that, even with the internal geometry resolution being much lower, still has to be rendered at native 4K (or lower, but still higher than the 1080p output).

This cost is missing from the DLSS table, and I'm sure it's also missing for PSSR, because it's game dependent. Every game will have different things: shadow maps, alphas (smoke, fire), HUD, volumetric lights, DOF, etc.

People often forget about this and expect the same performance from 4K DLSS Performance as from native 1080p. This has to be added to the cost of reconstruction.
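To make that concrete, here is a minimal sketch with made-up numbers (the base render, upscale, and output-resolution pass costs below are all assumptions, not measurements) showing why "4K DLSS Performance" can't match native 1080p:

```python
# Rough frametime model (all numbers are assumptions, not measurements).
BASE_1080P_MS = 10.0          # assumed cost of the 1080p internal render
UPSCALE_MS = 1.5              # assumed fixed reconstruction pass (DLSS/PSSR)
OUTPUT_RES_PASSES_MS = 2.0    # assumed cost of passes run at the 4K output res
                              # (HUD, post-processing, DOF, volumetrics...)

native_1080p = BASE_1080P_MS
dlss_perf_4k = BASE_1080P_MS + UPSCALE_MS + OUTPUT_RES_PASSES_MS

print(f"native 1080p:        {native_1080p:.1f} ms ({1000/native_1080p:.0f} fps)")
print(f"4K DLSS Performance: {dlss_perf_4k:.1f} ms ({1000/dlss_perf_4k:.0f} fps)")
# 10.0 ms (100 fps) vs. 13.5 ms (74 fps): the extra 3.5 ms is the part
# people forget when expecting native-1080p performance.
```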

I also focused on latency here because that is what you were talking about; pure frametime cost is a different thing.
The column in the table I was comparing from Nvidia is supposed to take a native 1080p source and produce the DLSS3 4K equivalent, and that is the cost of the inferencing alone, not the calculation of the motion vectors or the TAA-style history blend pass. And assuming you haven't altered any settings, the native render should be the same workload for the TAA pass and the TAA+DLSS pass, if not lighter for the DLSS pass, giving it even more latitude to under-count the post-render latency (processing).

The latency numbers you are looking at don't mean anything outside your setup with your TV's latency. For an unpredicted new frame change, which happens in games all the time, the latency is the time it takes to render, upscale, and present that brand-new frame. The latency I am talking about is the upscale processing latency, after the native frame rendering finishes, and in that situation PSSR's fixed ~1ms is less processing latency for the upscale than DLSS3's, and definitely DLSS4's, on similar TOPs, and even on more TOPs going by your 4070 Ti results.
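A rough sketch of the decomposition being argued here, with placeholder numbers (the render, upscale, and present costs are illustrative assumptions, not measurements):

```python
# Worst-case latency for a brand-new (unpredicted) frame:
# render the native frame, run the upscale, then present it.
# All numbers are placeholders for illustration.

def fresh_frame_latency_ms(render_ms: float, upscale_ms: float,
                           present_ms: float) -> float:
    return render_ms + upscale_ms + present_ms

# 60 fps native render, fixed ~1 ms upscale (the PSSR claim) vs. a
# heavier, scene-dependent upscale pass:
print(fresh_frame_latency_ms(16.7, 1.0, 0.5))  # 18.2 ms
print(fresh_frame_latency_ms(16.7, 2.5, 0.5))  # 19.7 ms
```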
 
I still think that if the PS6 is $650 and the next Xbox is $1k but around 50% faster than the PS6, and it has the Steam catalog (so Sony games too), it will be interesting.
What I find interesting is the specflation that happens every time a delta for Xbox is mentioned.
The rumour says 30%, and in this thread we already have 50% and even 100% expectations from the same spec.
Not saying it can't happen, but the 18% last time got inflated the exact same way leading up to launch.
(I'm pretty sure the extreme takes were up to "4K with RT vs. 1080p without" just months before launch...)

Anyway, Xbox has been playing the spec game for 25 years without a payoff.
8 years ago, 2x faster at a 25% higher price got outsold 2-3:1, and other times it was worse, so I'm not sure where the "but this time it will be different, for real, with a worse price/perf ratio" keeps coming back from.

The only time they were genuinely competing was when they were substantially cheaper than the competing PlayStation. Price parity, let alone a premium, has never served them to date.
 
What I find interesting is the specflation that happens every time a delta for Xbox is mentioned.
The rumour says 30%, and in this thread we already have 50% and even 100% expectations from the same spec.
Not saying it can't happen, but the 18% last time got inflated the exact same way leading up to launch.
(I'm pretty sure the extreme takes were up to "4K with RT vs. 1080p without" just months before launch...)

Anyway, Xbox has been playing the spec game for 25 years without a payoff.
8 years ago, 2x faster at a 25% higher price got outsold 2-3:1, and other times it was worse, so I'm not sure where the "but this time it will be different, for real, with a worse price/perf ratio" keeps coming back from.

The only time they were genuinely competing was when they were substantially cheaper than the competing PlayStation. Price parity, let alone a premium, has never served them to date.
The history of the console market teaches us that to challenge the leading company, you need to launch your console first: Genesis vs. SNES, PS1 vs. N64, Xbox 360 vs. PS3.
But history also teaches us that in two of those situations, the leading company simply overtook the first-mover, primarily because of famous IPs and power.

Only the PS1 managed to take the crown because there was a lot of lobbying; no company wanted to continue using cartridges. Without Square's "betrayal," the N64 would probably have finished the generation in first place.

What Microsoft or any other company needs to do:

1. Launch their console first.
2. The console needs to have the best technology of its time.
3. Have a marketing budget to defend the console from Sony's messaging that buying it is foolish.
4. Have a colossal budget for game production, so that there are as many AAA games as possible to give consumers the desire to take the risk.
5. Continue releasing many games after the PS6 comes out.
6. Their console needs to cost twice as much as a current console at launch, but drop to a price lower than the PS6's when the PS6 is released.

What company is crazy enough to try something like that?
 
But history also teaches us that in two of those situations, the leading company simply overtook the first-mover, primarily because of famous IPs and power.
I think that's an oversimplification.
The 360 won its two primary markets that entire gen; the PS3 only fell further behind as the gen wore on in the US and UK. And the 360 had a record-setting performance in the US (IIRC the only PlayStation that sold better in the US was the PS2).
But the 360 was also at least 25% cheaper the entire time, and up to 50% cheaper during some periods.
 
What I find interesting is the specflation that happens every time a delta for Xbox is mentioned.
The rumour says 30%, and in this thread we already have 50% and even 100% expectations from the same spec.
Not saying it can't happen, but the 18% last time got inflated the exact same way leading up to launch.
(I'm pretty sure the extreme takes were up to "4K with RT vs. 1080p without" just months before launch...)

Anyway, Xbox has been playing the spec game for 25 years without a payoff.
8 years ago, 2x faster at a 25% higher price got outsold 2-3:1, and other times it was worse, so I'm not sure where the "but this time it will be different, for real, with a worse price/perf ratio" keeps coming back from.

The only time they were genuinely competing was when they were substantially cheaper than the competing PlayStation. Price parity, let alone a premium, has never served them to date.
yes last rumours suggest more like 30%
 
I think that's an oversimplification.
The 360 won its two primary markets that entire gen; the PS3 only fell further behind as the gen wore on in the US and UK. And the 360 had a record-setting performance in the US (IIRC the only PlayStation that sold better in the US was the PS2).
But the 360 was also at least 25% cheaper the entire time, and up to 50% cheaper during some periods.
Overall, it was a defeat.

The Xbox 360 is an excellent console, but its strategy was unsustainable.

1. We don't know how many million Xbox 360s were replaced.
2. We don't know if Microsoft counted the exchanged consoles as new sales.
3. The Xbox 360's specs were very good; I believe the subsidy was proportionally higher than the PS3's.
4. The Xbox brand lost traction due to Sony's efforts with the PS3 and its popular IPs, just as I suggested.
5. The Xbox lost market share during the PS4 generation.
6. Perhaps the PS3 was more popular than the Xbox 360 in real terms, assuming that 10 million Xbox 360s were replaced.
This would make the victory over the PS3 small, basically the result of an extra year on the market.
 
The column in the table I was comparing from Nvidia is supposed to take a native 1080p source and produce the DLSS3 4K equivalent, and that is the cost of the inferencing alone, not the calculation of the motion vectors or the TAA-style history blend pass. And assuming you haven't altered any settings, the native render should be the same workload for the TAA pass and the TAA+DLSS pass, if not lighter for the DLSS pass, giving it even more latitude to under-count the post-render latency (processing).

The latency numbers you are looking at don't mean anything outside your setup with your TV's latency. For an unpredicted new frame change, which happens in games all the time, the latency is the time it takes to render, upscale, and present that brand-new frame. The latency I am talking about is the upscale processing latency, after the native frame rendering finishes, and in that situation PSSR's fixed ~1ms is less processing latency for the upscale than DLSS3's, and definitely DLSS4's, on similar TOPs, and even on more TOPs going by your 4070 Ti results.

1ms for PSSR you say? 1ms for DLSS3 as well:

1080p TAA: [frametime overlay screenshot]

4K DLSS3 Performance: [frametime overlay screenshot]

4K DLSS4 Performance: [frametime overlay screenshot]



The cost of PSSR or DLSS is not exactly fixed; it's different for different games. The "on paper" cost of DLSS is not its full frametime cost, because you have to add what the game is also rendering at the 4K OUTPUT resolution vs. the standard 1080p resolution.

Every game has different costs and gains with FSR4 and DLSS.

 
Powerful to play TV?
To play with the power of the cloud?
To play Kinect Star Wars (I'm Solo!)
Or to play on everything, because "It's an Xbox"?

The real problem is not the power of the hardware; it's Microsoft and its marketing team.
 
1ms for PSSR you say? 1ms for DLSS3 as well:

1080p TAA: [frametime overlay screenshot]

4K DLSS3 Performance: [frametime overlay screenshot]

4K DLSS4 Performance: [frametime overlay screenshot]



The cost of PSSR or DLSS is not exactly fixed; it's different for different games. The "on paper" cost of DLSS is not its full frametime cost, because you have to add what the game is also rendering at the 4K OUTPUT resolution vs. the standard 1080p resolution.

Every game has different costs and gains with FSR4 and DLSS.


You aren't normalizing for dynamic resolution. The PSSR pass is deterministic per pixel; per the validation I did of Cerny's numbers in the other thread, for 1080p -> 4K it is just over 1ms on a PS5 Pro. With DLSS, where you aren't normalizing for the hardware TOPs anyway, the cost of the ML AI isn't deterministic: it varies with the phenomena in the scene, and it is even more multiples of TOPs/processing time when claiming superior quality using DLAA or Quality mode. And if using DLSS3/4 with FG or MFG too, which is typical, then the processing latency for DLSS3/4 is nowhere close to PSSR's 1ms in the worst-case scenario of fresh, unpredictable frames.

The biggest cause of variation in PSSR frame runtime compared to native with TAA is the TAA history blend, which is done in fragment shaders and is proportional to resolution; the actual ML AI inference upscaling in PSSR is both lower in processing cost and completely deterministic, hence why the PS5 Pro can use it in GT7 while Nintendo don't use DLSS in the highly competitive, fast-action MK World.
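If the inference cost really is deterministic per output pixel, you can extrapolate it linearly from the ~1ms 1080p -> 4K figure claimed above. A minimal sketch, assuming that claim and pure linear scaling (both are assumptions from the post, not measured specs):

```python
# Deterministic per-pixel cost model, calibrated to the ~1 ms claim for
# a 4K output on PS5 Pro (assumption from the post, not a measured spec).

CAL_PIXELS = 3840 * 2160   # 4K output used for the ~1 ms figure
CAL_COST_MS = 1.0

def pssr_cost_ms(out_w: int, out_h: int) -> float:
    """Inference cost scaling linearly with output pixel count."""
    return CAL_COST_MS * (out_w * out_h) / CAL_PIXELS

print(f"4K output:    {pssr_cost_ms(3840, 2160):.2f} ms")   # 1.00 ms
print(f"1440p output: {pssr_cost_ms(2560, 1440):.2f} ms")   # 0.44 ms
print(f"1080p output: {pssr_cost_ms(1920, 1080):.2f} ms")   # 0.25 ms
```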
 
Xbox has a heavy OS layer which eats into a decent percentage of the performance lead it has over the PS5. That makes the 30% more like 15%, and at the level of pixels being pushed, it's hardly noticeable.
 
The power war between consoles was always fun but pointless all the way, from 900p vs. 1080p to this day. If you care about power, there is PC. The discussion for consoles is exclusives, good games, original IPs, and that's mostly or almost gone from the landscape. Two or three exclusives per console generation that I'm interested in (and that probably end up on PC anyway) is not worth it.
 
Overall, it was a defeat.

The Xbox 360 is an excellent console, but its strategy was unsustainable.

1. We don't know how many million Xbox 360s were replaced.
2. We don't know if Microsoft counted the exchanged consoles as new sales.
3. The Xbox 360's specs were very good; I believe the subsidy was proportionally higher than the PS3's.
4. The Xbox brand lost traction due to Sony's efforts with the PS3 and its popular IPs, just as I suggested.
5. The Xbox lost market share during the PS4 generation.
6. Perhaps the PS3 was more popular than the Xbox 360 in real terms, assuming that 10 million Xbox 360s were replaced.
This would make the victory over the PS3 small, basically the result of an extra year on the market.

A lot of the 360's success was down to the PS3's launch price.
 
Raw power has lost all its value, as barely anyone uses it anyway. We used to get the best-looking games as a generation went on; today it seems like it's the other way around. Demon's Souls is still one of, if not the, best-looking games. What a sad state…
 
The PS6 will make up for it with its new SSD, which will have 20GB/sec-30GB/sec read/write speeds. Am i rite?
I hope it won't throttle, otherwise it will lose the teraflop war.


Lord Cerny, I await thy full technical briefing.
 
1ms for PSSR you say? 1ms for DLSS3 as well:

1080p TAA: [frametime overlay screenshot]

4K DLSS3 Performance: [frametime overlay screenshot]

4K DLSS4 Performance: [frametime overlay screenshot]



The cost of PSSR or DLSS is not exactly fixed; it's different for different games. The "on paper" cost of DLSS is not its full frametime cost, because you have to add what the game is also rendering at the 4K OUTPUT resolution vs. the standard 1080p resolution.

Every game has different costs and gains with FSR4 and DLSS.


For what it's worth, with the recent PSSR update for Control it is very easy to see what the upscaling cost is. Thanks to the modes being uncapped, you can see it's roughly ~1.5ms, which aligns with NxGamer's testing on AC Shadows, but that didn't include the TAA cost. So roughly ~2ms for the whole PSSR solution is probably still right, which aligns with the data from Infinity Ward last year. Also, Alan Wake 2 has an FSR 2/PSSR toggle, and PSSR is ever so slightly heavier than FSR 2.
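As a sanity check of what a fixed ~2ms reconstruction cost means in practice, here is the arithmetic (the 2ms figure is the estimate above; the base framerates are arbitrary examples):

```python
# Effect of adding a fixed reconstruction cost to an uncapped framerate.

def fps_after_cost(base_fps: float, cost_ms: float) -> float:
    return 1000.0 / (1000.0 / base_fps + cost_ms)

for fps in (120, 60, 30):
    print(f"{fps} fps native -> {fps_after_cost(fps, 2.0):.1f} fps with ~2 ms PSSR")
# 120 -> 96.8, 60 -> 53.6, 30 -> 28.3: the same fixed cost hurts
# high framerates proportionally more.
```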
 
Whatever Cerny is making, Microsoft needs to triple it. It's not just about power; it's about how it all comes together and being able to punch above your weight. And listening to developers, too.

I still remember the "Xbox > PS5 TFLOPs" argument at launch.
 
You aren't normalizing for dynamic resolution. The PSSR pass is deterministic per pixel; per the validation I did of Cerny's numbers in the other thread, for 1080p -> 4K it is just over 1ms on a PS5 Pro. With DLSS, where you aren't normalizing for the hardware TOPs anyway, the cost of the ML AI isn't deterministic: it varies with the phenomena in the scene, and it is even more multiples of TOPs/processing time when claiming superior quality using DLAA or Quality mode. And if using DLSS3/4 with FG or MFG too, which is typical, then the processing latency for DLSS3/4 is nowhere close to PSSR's 1ms in the worst-case scenario of fresh, unpredictable frames.

The biggest cause of variation in PSSR frame runtime compared to native with TAA is the TAA history blend, which is done in fragment shaders and is proportional to resolution; the actual ML AI inference upscaling in PSSR is both lower in processing cost and completely deterministic, hence why the PS5 Pro can use it in GT7 while Nintendo don't use DLSS in the highly competitive, fast-action MK World.

I don't use frame generation, and when it's available I also use Reflex.

The latency of PC games is usually lower than on consoles, and that was true even before PSSR and DLSS were invented. PSSR is just another type of reconstruction; I don't get how it can reduce latency vs., for example, FSR2 when its cost is similar.

For what it's worth, with the recent PSSR update for Control it is very easy to see what the upscaling cost is. Thanks to the modes being uncapped, you can see it's roughly ~1.5ms, which aligns with NxGamer's testing on AC Shadows, but that didn't include the TAA cost. So roughly ~2ms for the whole PSSR solution is probably still right, which aligns with the data from Infinity Ward last year. Also, Alan Wake 2 has an FSR 2/PSSR toggle, and PSSR is ever so slightly heavier than FSR 2.

Yep, PSSR is similar to other reconstruction technologies in this respect. It usually produces better results than FSR2/3 (though not in all aspects) but is a bit heavier as well.

I really wonder if PSSR2/FSR4 will have the same cost on PS5 Pro as current PSSR.
 
So let's assume the new Xbox pushes games at full 4K. PS6 is 30% less powerful.

Xbox - Full 4K 3840 x 2160
PS6 - 30% fewer total pixels 3211 × 1809

That difference is completely meaningless and won't benefit Xbox in the slightest.
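For reference, the maths behind those numbers: at a fixed 16:9 aspect ratio, 30% fewer pixels means scaling each axis by sqrt(0.7):

```python
import math

# 30% fewer total pixels at the same aspect ratio means scaling each
# axis by sqrt(0.7), since pixel count is the product of the two axes.
w, h = 3840, 2160
s = math.sqrt(0.7)
print(round(w * s), round(h * s))   # 3213 1807 (the post rounds to 3211x1809)
print(f"{1 - (3211 * 1809) / (w * h):.1%} fewer pixels")  # 30.0% fewer pixels
```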
 
So let's assume the new Xbox pushes games at full 4K. PS6 is 30% less powerful.

Xbox - Full 4K 3840 x 2160
PS6 - 30% fewer total pixels 3211 × 1809

That difference is completely meaningless and won't benefit Xbox in the slightest.

No one will use native resolutions; it's a waste of resources when you have FSR4. Most games will be (my prediction) 1080p -> FSR4 -> 4K output.

Most of the difference will be in framerate. With the PS6 rendering at 60fps, the Xbox will have 78fps; with the PS6 at 100fps, the Xbox will have 130fps...
Big difference? I don't know, to be honest; the PS5 Pro is ~45% faster than the PS5 and sometimes the differences are small.
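Those framerate figures assume performance scales linearly with the rumoured 30% compute advantage, which is a best-case assumption:

```python
# Assumed: performance scales linearly with the rumoured 30% compute
# advantage (a best case; real games rarely scale this cleanly, as the
# PS5 Pro's ~45%-on-paper vs. smaller real-world gaps show).
ADVANTAGE = 1.30

for ps6_fps in (60, 100):
    print(f"PS6 {ps6_fps} fps -> Xbox {ps6_fps * ADVANTAGE:.0f} fps")
# PS6 60 fps -> Xbox 78 fps
# PS6 100 fps -> Xbox 130 fps
```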
 
Even if you built a new Xbox with a 30% increase in power, it's just a machine that looks good running multiplatform games but can't show its true potential, because there is no game that will show it off now that their exclusives are timed exclusives.

Jack of all trades, master of none.
 
I don't use frame generation, and when it's available I also use Reflex.

The latency of PC games is usually lower than on consoles, and that was true even before PSSR and DLSS were invented. PSSR is just another type of reconstruction; I don't get how it can reduce latency vs., for example, FSR2 when its cost is similar.



Yep, PSSR is similar to other reconstruction technologies in this respect. It usually produces better results than FSR2/3 (though not in all aspects) but is a bit heavier as well.

I really wonder if PSSR2/FSR4 will have the same cost on PS5 Pro as current PSSR.
You might not use frame-gen, but Nvidia's opaque implementation of DLSS and the VRAM and cache memory bandwidth requirements don't add up on mid-tier Nvidia hardware at high framerates, IMO, which for me means DLSS3 and 4 by default probably use FG even when it's toggled off on 4xxx and above hardware.

The validation of the Pro's PSSR inferencing cost, and the recent Amethyst presentation by AMD and Mark discussing the bandwidth pressures even compared to the PS5 Pro's, should tell you that Nvidia are committing a lie by omission about when exactly the latency (processing) cost of adding DLSS upscaling to a brand-new frame actually lands, when they aren't able to hide that cost with prior-frame interpolation.

PC gamers have been using a form of frame-gen for multiple decades, where the driver provided the ability to pre-render (process) X number of frames ahead of time.
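The latency trade-off of that old pre-render queue is simple to state: each queued frame adds roughly one frametime of input latency. A minimal illustration (the frametime and queue depth are example values):

```python
# Each pre-rendered frame in the queue adds roughly one frametime of
# input latency before your input can affect what's on screen.

def queue_latency_ms(frametime_ms: float, frames_queued: int) -> float:
    return frametime_ms * frames_queued

# At 60 fps (16.7 ms per frame), a 3-frame queue adds ~50 ms:
print(queue_latency_ms(16.7, 3))  # 50.1 ms
```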

As for Reflex, like frame-gen it only works while everything is predictable, and it gives the illusion of smoother control. You still need to see a real new frame first, and you can't present a valid game input (one that isn't predictive) before the game logic allows for it; that is limited by the native framerate, double buffering, and the tick rate at which the CPU reads user input while staying in lock-step with the GPU (the server doing the rendering, under the CPU client's control).

It works 99.99% of the time because most games, like The Last of Us Part II and HFW, are designed around 30fps as the lowest common denominator, where frame-to-frame and input-to-input behaviour is very predictable. But for a native high-fps game, that 0.01% failed prediction is a variable input latency spike, hence why I just ignore Reflex as a technology, much like I didn't play GT5 at 1080p because the 0.01% screen tearing was a massive input latency spike causing me to crash.

And IIRC I said nothing about PSSR reducing processing latency against non-ML techniques like FSR2/3. AFAIK it doesn't, but that was never part of the discussion, which was about the importance of these ML AI upscalers in relation to next-gen, FSR4 on an Xbox with 30% more power than a PS6, and how the PS6's use of or strategy for Amethyst comes into play by comparison.


I get that PC gamers love to point at higher framerates and nicer settings, but IMO there is a major lack of scrutiny of the legitimacy of real worst-case framerates and real worst-case latency with DLSS3 and DLSS4, and with FSR4 on PC, by comparison to PSSR. Even more so since new info emerged this week on Amethyst suggesting that the register memory solution of the PS5 Pro is being extended to let CUs globally reference the register memory in other CUs.

This would allow them to work on tensor tiles at a much larger scale than those shown in the PS5 Pro seminar video at ~23:54 for Ratchet & Clank, which have high processing redundancy at the borders because they are small. That suggests AMD believe the Pro's hardware solution, at a larger scale and enhanced, is superior to evolving their existing RX 9070 XT solution for FSR4.
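The border-redundancy point is easy to quantify. Assuming a fixed-width halo of neighbouring pixels around each tile (the tile sizes and 4px halo below are illustrative, not taken from the seminar), the wasted fraction grows rapidly as tiles shrink:

```python
# Each tile needs a halo of neighbouring pixels for valid convolutions at
# its borders; the halo width is fixed, so smaller tiles waste more work.

def halo_overhead(tile: int, halo: int) -> float:
    padded = (tile + 2 * halo) ** 2
    return padded / tile**2 - 1.0   # fraction of redundant work

for tile in (16, 32, 64, 128):
    print(f"{tile}x{tile} tile, 4px halo: {halo_overhead(tile, 4):.0%} redundant")
# 16 -> 125%, 32 -> 56%, 64 -> 27%, 128 -> 13%: bigger tiles (enabled by
# CUs sharing register memory) cut the border redundancy sharply.
```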

That supports my view that PSSR at a hardware-use level is better in ways that are being obscured by the backdrop of an RTX 5090 using DLSS4 and MFG.
 
No one will use native resolutions; it's a waste of resources when you have FSR4. Most games will be (my prediction) 1080p -> FSR4 -> 4K output.

Most of the difference will be in framerate. With the PS6 rendering at 60fps, the Xbox will have 78fps; with the PS6 at 100fps, the Xbox will have 130fps...
Big difference? I don't know, to be honest; the PS5 Pro is ~45% faster than the PS5 and sometimes the differences are small.
Native clearly isn't a waste of resources "in all situations" in Nintendo's opinion, if ML AI upscaling as-is isn't used for the likes of MK World; it isn't the free win the media think it is.

I'll be shocked if Nintendo don't announce a MK9 before MK World gets any update to use DLSS.
 
You might not use frame-gen, but Nvidia's opaque implementation of DLSS and the VRAM and cache memory bandwidth requirements don't add up on mid-tier Nvidia hardware at high framerates, IMO, which for me means DLSS3 and 4 by default probably use FG even when it's toggled off on 4xxx and above hardware.

This is nonsense. Off is off; DLSS SR existed before frame generation.

Native clearly isn't a waste of resources "in all situations" in Nintendo's opinion, if ML AI upscaling as-is isn't used for the likes of MK World; it isn't the free win the media think it is.

I'll be shocked if Nintendo don't announce a MK9 before MK World gets any update to use DLSS.

Nintendo and their opinions about technology are irrelevant. Almost all big AAA games will use FSR4/PSSR2 on the PS6 (indies can run native 4K without issue).
 
This is nonsense. Off is off; DLSS SR existed before frame generation.



Nintendo and their opinions about technology are irrelevant. Almost all big AAA games will use FSR4/PSSR2 on the PS6 (indies can run native 4K without issue).
Nintendo are one of the few parties that will have a clear view of the proprietary aspects of DLSS through their Nvidia partnership, and their decision to forgo the so-called free benefit, despite the fact that it would improve battery life by running an even lower native res in handheld mode, should surely seem like an oddity.

I suspect Nintendo will join the ML AI upscaling world, but will do so with a similar end-of-native-frame measured cost (like PSSR/PSSR2) that is portable to other ML hardware.
 
Overall, it was a defeat.

The Xbox 360 is an excellent console, but its strategy was unsustainable.

1. We don't know how many million Xbox 360s were replaced.
2. We don't know if Microsoft counted the exchanged consoles as new sales.
3. The Xbox 360's specs were very good; I believe the subsidy was proportionally higher than the PS3's.
4. The Xbox brand lost traction due to Sony's efforts with the PS3 and its popular IPs, just as I suggested.
5. The Xbox lost market share during the PS4 generation.
6. Perhaps the PS3 was more popular than the Xbox 360 in real terms, assuming that 10 million Xbox 360s were replaced.
This would make the victory over the PS3 small, basically the result of an extra year on the market.
The second half of that generation was a resounding reversal for PlayStation and Xbox, positive for Sony and negative for Xbox, as it was Mattrick and Kinect 360 all day, every day, once the well from the Peter Moore-era Xbox ran dry. The PS3 squeaked out ahead globally, but it had also regained momentum in terms of game output and mindshare.
 