
Next-Gen PS5 & XSX |OT| Console tEch threaD


3liteDragon

Member
Blurrier*

For real though, DLSS can increase the resolution of the final image, but it won't make up for model quality, texture resolution/filtering, lighting, etc.
I'd imagine AI quality will improve significantly with DLSS 3.0. Speaking of which, are we even getting it this year for the Ampere cards?
 

ethomaz

Banned
Yeah, I had over 1,000 hours in Destiny 1 and it broke my L3 button on three different controllers: the launch one, the one they sent as a replacement, and a brand new controller I bought after the replacement had the same issue (warranty on replacements is only 3 months). They had fixed the rubber peeling issue that affected the launch controllers but pretty much nothing else. Played BLOPS 3 for close to 200 hours and got Dark Matter camo (gold and diamond skins for every gun in the game) and didn't have any issues.

After I bought my Pro, I had no controller issues. Played BF1 for almost 500 hours. Destiny 2 for close to 200. GoW for 200. MGS for 200. Warzone for over a month. No drifting issues. No R3 or L3 issues. I don't play my Switch much, if at all, but tbh I don't even know what drifting looks like.

If drifting was just due to wear and tear, they would've fixed it by now. Or it would've affected PS4 controllers far more than PS5 controllers, since the console literally just came out. How can it already have wear and tear 3 months in?

If I had to guess, either they still haven't figured it out or it's related to haptics somehow, since both the Switch and PS5 controllers use them. I also don't understand the need for a class action lawsuit, since we are all in the warranty period and should get it replaced for free.
Weird, I had over 1,500 hours in Destiny 1 and I was using my day-one controller until I migrated to PS5.
To be fair, both my controllers (day one and another bought 3 months later) are still fine (one of them peeled last year); it's just that the battery now lasts 3-4 hours max instead of 6 hours at launch.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
Weird, I had over 1,500 hours in Destiny 1 and I was using my day-one controller until I migrated to PS5.
To be fair, both my controllers (day one and another bought 3 months later) are still fine (one of them peeled last year); it's just that the battery now lasts 3-4 hours max instead of 6 hours at launch.
That's super weird, because it tells me even the L3 issue wasn't related to normal wear and tear.

And again, unless they fucked up the testing of the haptics, I highly doubt any of this is something they knew about and released anyway. Stuff like this usually comes up in testing.
 

Bo_Hazem

Banned
That's super weird, because it tells me even the L3 issue wasn't related to normal wear and tear.

And again, unless they fucked up the testing of the haptics, I highly doubt any of this is something they knew about and released anyway. Stuff like this usually comes up in testing.

It's not weird, really. Different people play in different fashions. I've spent thousands of hours in more than 180 games and the only minor drift I had was from falling hard on the analogs, and that controller was brand new. I was still using the launch 2013 controller because it had soft analogs, which are great for aiming precision.
 

ethomaz

Banned
That's super weird, because it tells me even the L3 issue wasn't related to normal wear and tear.

And again, unless they fucked up the testing of the haptics, I highly doubt any of this is something they knew about and released anyway. Stuff like this usually comes up in testing.
My DualSense is still like new.
I don't feel like I will have any issue with it either.

The last time I bought more than two controllers in a generation was with the PS3... one DualShock broke and I had to buy another to replace it.
 
Last edited:

azertydu91

Hard to Kill
My DualSense is still like new.
I don't feel like I will have any issue with it either.

The last time I bought more than two controllers in a generation was with the PS3... one DualShock broke and I had to buy another to replace it.
The problem I have with the DualSense is that it collects stains too easily. Otherwise it has been pretty sturdy thus far (my gf is quite clumsy and has dropped it a lot of times).
 

Great Hair

Banned
"..cloud powa 2.0..."
mad thinking GIF by South Park
 

Trogdor1123

Gold Member
Once the PS5 gets its M.2 drive support sorted, have they said how it will work? Will it serve as a second drive that you need to manage separately, or will the OS do that for you?
 

ToTTenTranz

Banned
I think that response is very questionable. Cache is about data being closer to the execution units, and thus having much lower latency. Going off-chip involves a much longer wait for the data and much higher power consumption. Internal cache can scale with the clock, whereas memory in consoles stays at a fixed clock.
The PS5's GPU getting Navi 2x clocks was the reason why I thought it could have some form of GPU L3 cache. AMD said Infinity Cache was part of the reason why Navi 21 could clock so high, because they were saving power by reducing off-chip data transactions.
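As a rough back-of-the-envelope sketch of why a big on-die cache helps (all numbers below are my own illustrative assumptions, not AMD or Sony figures), the effective bandwidth is just a blend of cache hits served on-chip and misses that go out to VRAM:

```python
# Rough sketch: effective bandwidth of a GPU with a large last-level cache.
# All figures are illustrative assumptions, not official AMD/Sony numbers.

def effective_bandwidth(hit_rate, cache_bw_gbs, vram_bw_gbs):
    """Blend on-die cache bandwidth with off-chip VRAM bandwidth."""
    return hit_rate * cache_bw_gbs + (1.0 - hit_rate) * vram_bw_gbs

# Hypothetical Navi 21-style setup: 512 GB/s GDDR6, ~1900 GB/s Infinity Cache.
print(effective_bandwidth(hit_rate=0.58, cache_bw_gbs=1900, vram_bw_gbs=512))
# ~1317 GB/s effective, and every hit kept on-chip also saves power,
# which is the headroom that supposedly lets the clocks go higher.
```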



I hate to link stuff from Snowflake Era. But I thought this was worth sharing.


Claiming a GPU with DLSS can show "the same output at 4K" as the PS5 is either misinformed or dishonest IMO.
Developers have been using temporal reconstruction techniques on consoles for the better part of the last decade. The PS5 very rarely renders internally at full 4K. A "DLSS Switch" wouldn't need to render at 1920*1080 where the PS5 renders at 3840*2160, it would need to match the PS5 at 1440p~1800p, and good luck with doing that on a 5-10W TDP budget before 2025.
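Just to put numbers on that (plain pixel arithmetic, nothing more):

```python
# Pixel counts for common internal render resolutions (simple arithmetic).
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "1800p": (3200, 1800),
    "2160p": (3840, 2160),
}
native_4k = 3840 * 2160
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h / 1e6:.1f} MP, {100 * w * h / native_4k:.0f}% of native 4K")
# 1080p is only 25% of a 4K frame; the 1440p~1800p range consoles actually
# reconstruct from is roughly 44-69%, i.e. a much higher bar to match.
```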
 

SlimySnake

Flashless at the Golden Globes
The PS5's GPU getting Navi 2x clocks was the reason why I thought it could have some form of GPU L3 cache. AMD said Infinity Cache was part of the reason why Navi 21 could clock so high, because they were saving power by reducing off-chip data transactions.



Claiming a GPU with DLSS can show "the same output at 4K" as the PS5 is either misinformed or dishonest IMO.
Developers have been using temporal reconstruction techniques on consoles for the better part of the last decade. The PS5 very rarely renders internally at full 4K. A "DLSS Switch" wouldn't need to render at 1920*1080 where the PS5 renders at 3840*2160, it would need to match the PS5 at 1440p~1800p, and good luck with doing that on a 5-10W TDP budget before 2025.
I think their budget is 15 watts in handheld mode, not 5-10.

IIRC, MS already had a 2.0 TFLOPs Vega APU at around 20W a year ago. And DLSS 2.0 is good enough for taking 1080p to 4K. Much better than, say, checkerboarding, which is what the Pro and PS5 use for upscaling to 4K.
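Quick perf-per-watt check on those numbers (very rough, and the handheld budget still has to feed the screen, memory and the rest of the SoC, not just the GPU):

```python
# Back-of-the-envelope perf-per-watt using the figures quoted above.
vega_tflops, vega_watts = 2.0, 20     # the ~20 W Vega APU mentioned above
handheld_budget_watts = 15            # assumed handheld power budget

tflops_per_watt = vega_tflops / vega_watts
print(f"{tflops_per_watt:.2f} TFLOPs/W -> "
      f"~{tflops_per_watt * handheld_budget_watts:.1f} TFLOPs at {handheld_budget_watts} W")
# ~0.10 TFLOPs/W -> ~1.5 TFLOPs, before subtracting CPU, memory and display power.
```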
 

FrankWza

Member
Once the PS5 gets its M.2 drive support sorted, have they said how it will work? Will it serve as a second drive that you need to manage separately, or will the OS do that for you?
If we're waiting on support for PS5 games, I'm gonna assume and hope it's just integrated, and once you're in the UI you can play from the cards and tiles as usual.
 

ToTTenTranz

Banned
I think their budget is 15 watts in handheld mode, not 5-10.

IIRC, MS already had a 2.0 TFLOPs Vega APU at around 20W a year ago. And DLSS 2.0 is good enough for taking 1080p to 4K. Much better than, say, checkerboarding, which is what the Pro and PS5 use for upscaling to 4K.

I was actually being generous with my 15W comment (assuming the Pro would get a larger form factor).
Switch is 11W docked. The SoC itself is probably at less than 10W.


As for the AMD APUs, the best we had a year ago was the Renoir 4800U with 1.79 TFLOPs, but that was only sustainable when working at 25W.
Regardless, even if this iGPU has enough compute throughput to match a PS4, its memory bandwidth and pixel fillrate are too low to consider it similar in performance.
 

SlimySnake

Flashless at the Golden Globes

I was actually being generous with my 15W comment (assuming the Pro would get a larger form factor).
Switch is 11W docked. The SoC itself is probably at less than 10W.


As for the AMD APUs, the best we had a year ago was the Renoir 4800U with 1.79 TFLOPs, but that was only sustainable when working at 25W.
Regardless, even if this iGPU has enough compute throughput to match a PS4, its memory bandwidth and pixel fillrate are too low to consider it similar in performance.
Yeah, 1.79 is probably too much for a Nintendo handheld.

But thinking about this a bit more, they don't need 1.79 TFLOPs to hit 1080p for Switch games that already run at 720p docked on a 0.390 TFLOPs GPU. I think they'll probably go for 1 TFLOP to hit 1080p, then use some tensor cores and DLSS to get a 4K image.
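Quick sanity check on that guess, naively scaling GPU throughput with pixel count (which ignores bandwidth, CPU and clock differences, so treat it as a ballpark):

```python
# Naive TFLOPs-with-pixel-count scaling (ignores bandwidth, CPU, clocks, etc.).
switch_docked_tflops = 0.393          # Tegra X1 GPU in docked mode, roughly
px_720p = 1280 * 720
px_1080p = 1920 * 1080

estimate = switch_docked_tflops * px_1080p / px_720p
print(f"~{estimate:.2f} TFLOPs to take today's 720p docked games to native 1080p")
# ~0.88 TFLOPs, which is why ~1 TFLOP plus DLSS up to 4K sounds plausible.
```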

Or they can just say fuck it and release a docked-only version. They already have the Switch Lite that retails for only $200 and sells millions every month. I think if they are smart, they will know the 4K version is for people who give a shit about this stuff and will probably release it as a docked $400 console.

Not that any of this matters, it will still be running shitty undocked versions of games at 1080p or 4K. That awful-looking Witcher 3 port and the latest Immortals Fenyx Rising port are missing so much detail that you can't just add it back without redoing the whole port and designing a version exclusively for the Switch Pro. Which we all know is never gonna happen. It will basically run that same version at a higher res and that's it.
 

kyliethicc

Member
Yeah, 1.79 is probably too much for a Nintendo handheld.

But thinking about this a bit more, they don't need 1.79 TFLOPs to hit 1080p for Switch games that already run at 720p docked on a 0.390 TFLOPs GPU. I think they'll probably go for 1 TFLOP to hit 1080p, then use some tensor cores and DLSS to get a 4K image.

Or they can just say fuck it and release a docked-only version. They already have the Switch Lite that retails for only $200 and sells millions every month. I think if they are smart, they will know the 4K version is for people who give a shit about this stuff and will probably release it as a docked $400 console.

Not that any of this matters, it will still be running shitty undocked versions of games at 1080p or 4K. That awful-looking Witcher 3 port and the latest Immortals Fenyx Rising port are missing so much detail that you can't just add it back without redoing the whole port and designing a version exclusively for the Switch Pro. Which we all know is never gonna happen. It will basically run that same version at a higher res and that's it.
The problem is that just the tensor cores on a 2060's TU106 die are bigger than the entire Tegra SoC.

The Switch has a chip the size of a fingernail. Could it have AI upscaling? Sure. But not DLSS.

It cannot have Tensor cores like RTX cards, they're too big and too costly for Nintendo's silicon budget. No Tensor cores, no DLSS.

Plus, what's the point of DLSS if the Switch doesn't have RT?
 
Last edited:
Nope, we start stacking dies in 3D. Btw the current "nm" count is not a real metric of gate size, but a marketing term reflecting the density of transistors in the chip.
At single-digit-nm distances the electrons would actually tunnel past the gate, which would render the gate absolutely useless.

Long story short, 7nm is not really 7nm, and in the future we're moving to chiplet designs rather than one gigantic chip with lots of transistors :)

2.5D (chiplets) and 3D ASICs will have a steep cost hurdle to overcome. You're multiplying the number of process steps, so multiplying the opportunities for manufacturing defects as well as extending the lead times for manufacturing even further.

Yes, you can save costs through chopping your die down into smaller chunks but then you take on a huge hit to power efficiency, as you're now spending more of your overall chip TDP budget pushing data across your inter-die fabric/internal data buses, instead of switching transistors.

I do think these will end up amounting to stop-gap technologies before some whole new material like carbon nanotubes can take over.

I'd imagine AI quality will improve significantly with DLSS 3.0. Speaking of which, are we even getting it this year for the Ampere cards?

DLSS is still a resolution scaling technique. I think that was the previous poster's point. It's like putting on a clearer pair of glasses to see the game world, but if the underlying assets are still sub-par, e.g. low-poly geometry, low texel resolution, cheap flat lighting effects, then you're only getting a clearer vision of a low-fidelity scene.

I think their budget is 15 watts in handheld mode, not 5-10.

IIRC, MS already had a 2.0 TFLOPs Vega APU at around 20W a year ago. And DLSS 2.0 is good enough for taking 1080p to 4K. Much better than, say, checkerboarding, which is what the Pro and PS5 use for upscaling to 4K.

The Switch SoC TDP is 7 Watts.... 7 whole Watts.

Don't ask me how I know this :messenger_winking:
 
DLSS is still a resolution scaling technique. I think that was the previous poster's point. It's like putting on a clearer pair of glasses to see the game world, but if the underlying assets are still sub-par, e.g. low-poly geometry, low texel resolution, cheap flat lighting effects, then you're only getting a clearer vision of a low-fidelity scene.
I'm fairly sure that a technique "like DLSS" could somehow generate everything that's missing or broken (lighting, low texel resolution/polygon count, etc.). But I don't think DLSS is there yet... I'm just talking from what we have seen in AI image generation so far, which is not necessarily real-time yet.
 

Panajev2001a

GAF's Pleasant Genius
That's it. I'm getting an Xbox Series S.




To my PS4 Pro:


I would say XSX and PS5 DE if you are on a “budget” and do not care too too much about UHD Blu-Ray playback.

Even for 1080p TVs you get a higher-quality picture (supersampled from 4K for games that support it), better textures, and better effects in games. Also, you get the Xbox One X enhanced versions of Xbox One titles.
 

Garani

Member
The PS5's GPU getting Navi 2x clocks was the reason why I thought it could have some form of GPU L3 cache. AMD said Infinity Cache was part of the reason why Navi 21 could clock so high, because they were saving power by reducing off-chip data transactions.
I didn't follow this point, so I take your word for it. At this point I wonder how much the I/O silicon in the PS5 has helped in getting high clock rates.
 
I'm fairly sure that a technique "like DLSS" could somehow generate everything that's missing or broken (lighting, low texel resolution/polygon count, etc.). But I don't think DLSS is there yet... I'm just talking from what we have seen in AI image generation so far, which is not necessarily real-time yet.

I'm not seeing how it's possible.

You can use AI offline in development to upscale stuff like texture resolution, but that's redundant in games development when you already have higher res source textures anyway.

The input to your ML model when inferencing is still a 2D frame. It contains no information about what has been used to create that 2D image. So I can't see how conceptually an AI model can improve stuff like polygon mesh complexity.
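To make that concrete, the interface of a DLSS-style temporal upscaler looks roughly like the sketch below (my own toy stand-in, not NVIDIA's actual API): every input is a per-pixel 2D buffer, so the model has nothing to say about the meshes, materials or lights that produced those pixels.

```python
import numpy as np

def dlss_like_upscale(low_res_color, motion_vectors, depth, history, scale=2):
    """Toy stand-in for a temporal ML upscaler's interface.

    All inputs are 2D per-pixel buffers; the network never sees geometry,
    materials or light sources, so it can only sharpen what was rasterized.
    Here a nearest-neighbour blow-up stands in for the neural network.
    """
    out = np.repeat(np.repeat(low_res_color, scale, axis=0), scale, axis=1)
    return out  # an (H*scale, W*scale, 3) frame, and nothing more

frame = dlss_like_upscale(
    low_res_color=np.zeros((1080, 1920, 3), dtype=np.float32),
    motion_vectors=np.zeros((1080, 1920, 2), dtype=np.float32),
    depth=np.zeros((1080, 1920), dtype=np.float32),
    history=None,
)
print(frame.shape)  # (2160, 3840, 3)
```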
 

LiquidRex

Member


Interesting tweet, a lot to translate... below is a translated extract from the full tweet.


Infinity Cache is a method that boosts the available effective bandwidth even further at resolutions up to 1440p, and the cache scrubbers are a method that minimizes cache misses, so it sounds like AMD is going with Infinity Cache in RDNA2 and plans to adopt the cache scrubber concept in RDNA3. :unsure:
 

Shmunter

Member


Interesting tweet, a lot to translate... below is a translated extract from the full tweet.


Infinity Cache is a method that boosts the available effective bandwidth even further at resolutions up to 1440p, and the cache scrubbers are a method that minimizes cache misses, so it sounds like AMD is going with Infinity Cache in RDNA2 and plans to adopt the cache scrubber concept in RDNA3. :unsure:

Which does the XSS/XSX have?
 


Interesting tweet, a lot to translate... below is a translated extract from the full tweet.


Infinity Cache is a method that boosts the available effective bandwidth even further at resolutions up to 1440p, and the cache scrubbers are a method that minimizes cache misses, so it sounds like AMD is going with Infinity Cache in RDNA2 and plans to adopt the cache scrubber concept in RDNA3. :unsure:

Apologies for my ignorance, but who is this man?
 

Bo_Hazem

Banned


Interesting tweet, a lot to translate... below is a translated extract from the full tweet.


Infinity Cache is a method that boosts the available effective bandwidth even further at resolutions up to 1440p, and the cache scrubbers are a method that minimizes cache misses, so it sounds like AMD is going with Infinity Cache in RDNA2 and plans to adopt the cache scrubber concept in RDNA3. :unsure:


Thankfully! RDNA3 sounds better than the "RDNA1.5" that "tech savvy" Dealer and some "decorated" members here call the PS5. But I prefer "exclusive" tech instead until we see it on other devices.
 