Radical_3d
Member
Somebody wake up Hicks.
When you say 90ms… you're talking end to end?
Either way, 10ms is significant to me. I work really hard to cut my latency as low as possible, and once your body gets used to that near-instantaneous response, anything else feels off.
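To put rough numbers on the "end to end" question: click-to-photon latency is the sum of several stages. A toy budget sketch (all stage timings below are illustrative assumptions, not measurements of any particular setup):

```python
# Toy end-to-end latency budget (all numbers are illustrative assumptions).
# "End to end" here means click-to-photon: input polling, queued frames,
# GPU render time, and display processing/scanout.

def frame_time_ms(fps: float) -> float:
    """Time to render one frame at a given frame rate."""
    return 1000.0 / fps

def end_to_end_ms(fps: float, input_ms: float = 2.0,
                  queue_frames: float = 1.0, display_ms: float = 10.0) -> float:
    """Naive click-to-photon estimate: input + queued frames + render + display."""
    render = frame_time_ms(fps)
    return input_ms + queue_frames * render + render + display_ms

# At 60 fps, a single extra queued frame already costs ~16.7 ms,
# which is why a 10 ms difference is easy to feel.
print(round(end_to_end_ms(60), 1))   # ~45.3 ms with these assumptions
print(round(end_to_end_ms(120), 1))  # ~28.7 ms
```

The point of the sketch is only that render time enters the total at least twice (queue plus render), so frame rate changes move end-to-end latency by more than one frame time.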
> When will you put out these flames OP? Tomorrow?
Tomorrow at 5:19 PM.
They all look fucking awful. PS5 Pro takes the win again. Number 4 is the only one without artifacting imo.
Isn't the biggest takeaway here though that the reason to play on PC is IQ, and introducing DLSS, frame gen etc. only serves to make it much worse? And no, no GPU is worth $2k to play games with a marginal improvement.
> What are you basing this on?
On my own experience playing the game with framegen. Can't speak for other games but BM:W felt good to me.
OP clearly made a super deceiving and devious comparison where there is zero movement and the pics are relatively small. We don't look at a 4K screen from the distances we looked at HD or Full HD screens; we need to be closer to appreciate it.
Even Nvidia themselves, in their very own marketing video for DLSS/frame gen, used Wukong in motion/during fights to illustrate how it looks.
DF mentioned IQ issues in Wukong too:
> They all look fucking awful. PS5 Pro takes the win again. Number 4 is the only one without artifacting imo.
Dude, that is a very bold statement to make.
> Isn't the biggest takeaway here though that the reason to play on PC is IQ and introducing DLSS, Frame gen etc. only serves to make it much worse?
Where did you take that from?
> Where did you take that from?
It can look better by losing detail, e.g. fine strands of aliased hair. The detail it can't resolve is lost; therefore it's a lossy upscaler that smooths out the lost data with AA or a form of it. I think a lot of people mean "the image is more pleasing to my eye" rather than "the image is higher quality" when speaking of DLSS. I'm just surprised that PC players fawn over what the tech essentially is. From a technical point of view I admit it's impressive, but from a fundamental point of view it's still smoke and mirrors, and that's before even being able to pick out clear artifacts like the images OP posted.
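The "lossy" point above can be illustrated without any ML at all: once a single-pixel detail (like a fine hair strand) is averaged away by downsampling, no upscale can recover it exactly. A minimal numpy sketch, using a plain box-filter downscale and nearest-neighbour upscale (deliberately simple, not how DLSS works):

```python
import numpy as np

# A 4x4 "image" with a one-pixel-wide bright detail (a fine hair strand).
img = np.zeros((4, 4))
img[:, 2] = 1.0  # thin vertical line

# Downscale 2x with a box filter (average each 2x2 block).
small = img.reshape(2, 2, 2, 2).mean(axis=(1, 3))

# Upscale back with nearest-neighbour.
restored = small.repeat(2, axis=0).repeat(2, axis=1)

# The line's brightness survives only on average: the crisp 0/1 edge is
# smeared into 0.5s, and the original cannot be reconstructed exactly.
print(np.array_equal(img, restored))   # False
print(restored.max())                  # 0.5
```

Temporal upscalers mitigate this by accumulating samples across frames (plus, in DLSS's case, a learned model), which is why they can re-resolve some detail a single frame has lost, but the single-frame information loss itself is real.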
DLSS can look even better than native. No need for anti-aliasing anymore, image is crisp and clear.
Also be careful, as the order of the links is the reverse order of the names of the pictures.
> what? I chose screenshot 4 because it was 4 on the slider - is that right? OP you have failed us!!!!
The order of the screenshots in my post is correct, don't look at the filenames.
> Top picture is 1080 DLSS and bottom is 1080 Native.
Are you seriously using a base 720p resolution image to make your point? Why don't you use 4K native vs 4K DLSS? I suspect I know why...
> Are you seriously using a base 720p resolution image to make your point? Why don't you use 4K native vs 4K DLSS? I suspect I know why...
I'm on my phone and they are just to hand tbh, I had to resize on my phone to use the imgur uploader on the site.
> It can look better by losing detail e.g. fine strands of aliased hair. The detail it can't resolve is lost, therefore it's a lossy upscaler that smooths out the lost data with AA or a form of it.
I disagree. More below.
> I disagree. More below.
Both were native 1080 but I'll have to wait till tomorrow to post both in full. It's zoomed in crazy obviously, but the point is basically that softening the image loses detail (as in all scenarios, not just games/PCs).
Honestly at first glance I'd say top looks better. After all, it's 1080p, so you can't ask for crazy level of detail. Bottom one is jagtown.
That said. What's the native resolution of top? Meaning, is it DLSS Quality or DLSS Performance?
Also... is the bottom one without any kind of AA? I doubt anyone plays their games like that, looks like shit, although maybe the non-zoomed in original pic looks better. Maybe a better comparison would be DLSS vs Native with AA?
In any case I still think DLSS Quality for 4K can look better than native 4K, probably even more so once the new model for DLSS releases. And for DLAA I'd say it's better, no question.
Yeah well, I'd say that's a problem with the games and not just PC. Diminishing returns and all that. But hey it's an open-world so ofc it's going to look kinda shit when it's a scene with poor lighting.
Try comparing other, more linear games like Black Myth: Wukong or Space Marine 2. I'd guess you'll see more improvements there.
It's no secret that the more data you give temporal upscalers to work with, the better the results. 1080p native is usually significantly better than 1080p DLSS Q. 1440p is much closer, and 4K is often a wash at standard viewing distances.
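The "more data" point maps directly onto internal render resolution. Assuming the commonly reported per-axis DLSS scale factors (Quality ≈ 2/3, Balanced ≈ 0.58, Performance ≈ 0.5; individual games can override these), a quick sketch of how many input pixels the upscaler actually gets:

```python
# Internal render resolution per DLSS mode, using commonly reported
# per-axis scale factors (assumed defaults; games can override them).
SCALES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

def internal_res(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Resolution the game actually renders at before DLSS upscales it."""
    s = SCALES[mode]
    return round(out_w * s), round(out_h * s)

for out in [(1920, 1080), (2560, 1440), (3840, 2160)]:
    w, h = internal_res(*out, "Quality")
    share = (w * h) / (out[0] * out[1])
    print(f"{out[0]}x{out[1]} Quality -> {w}x{h} ({share:.0%} of output pixels)")
```

At 1080p output, Quality mode renders from roughly 720p, which is why 1080p DLSS Q trails 1080p native much more than 4K DLSS Q (internally ~1440p) trails 4K native.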
> DLSS is better or equal to native is just not right
Let's agree to disagree then.
> And yes (as TintoConCasera alludes) I think ultimately the discussion should be around DLAA v DLSS. One is focused on native IQ and one is focused on performance with lossy (albeit excellent) upscaling. The more complete the source the better the upscale (assuming you don't go to 8k or 16k to see how far you can push)
Native isn't really a thing anymore. In this day and age, an anti-aliasing solution is needed. For instance, I used a hex editor to get rid of Cyberpunk's built-in TAA, and the aliasing is so bad that TAA is preferable. In the case of the ubiquitous TAA, DLSS Q does offer some noticeable advantages over it. Is the overall image quality superior with DLSS? Often it isn't, but the 30% performance increase makes the choice clear. It's also neat that DLSS boasts key advantages such as resolving fine details, text, and other aspects of the image. On the flip side, it does have issues with ghosting, blurring, disocclusion artifacts, and other problems. Granted, TAA isn't very good, and there's even an entire subreddit dedicated to shitting on it.
"DLSS is better or equal to native" is just not right, and I think it's a band-aid for those who really want to run DLAA but not have the power of their hardware questioned. There are a lot of variables: some engines, some games, and some implementations of whatever AA the developer chooses play nicer with DLSS, of course. And DLSS is improving at a fair rate, but I've not seen any image comparison where it isn't obvious that it's not native, as per OP's example (hoping I selected the right one, as the ordering seems messed up).
I would always choose native personally which I guess is the point of the thread.
1.jpg is the best for me (picture 4), no contest. It has a smoother look, but things look fine.
All three others have shit going on (patterns etc...) on textures where it simply shouldn't happen. The rocks just to the left of the main character's arm have some real issues in all the other screenshots. Same for the bamboo to the left, and actually everywhere once you start looking...
We need to go back to native pictures and resolution, and give up on all of this shit...
![]()
> It can look better by losing detail e.g. fine strands of aliased hair. The detail it can't resolve is lost, therefore it's a lossy upscaler that smooths out the lost data with AA or a form of it. […]
To be honest, The Witcher 3 has one of the worst DLSS implementations I have ever seen. Installing DLSS 3.8.1 helps a lot though.
Top picture is 1080 DLSS and bottom is 1080 Native.
![]()
![]()
I'm not here to convince anyone, this is just what I see. I just don't buy into the whole PCMR or the cost/performance/game improvement mantra. No matter how good the graphics card is, with PT and all the bells and whistles when the NPCs in the background of cyberpunk still look like this (and this is coming from someone who was running high end in the days of Source with 200 odd fps)
![]()
- Screenshot 3
- Screenshot 2
- Screenshot 4
- Screenshot 1
In RDR2 it was much easier for me to decide which AA looked the best.
TAA
![]()
DLSS Quality
![]()
DLSS Performance
![]()
> You can use a mod to inject DLSS or DLAA.
I tried doing this on the latest patches and it doesn't work.
> I'd like to change my vote
Mate, people who chose the first screenshot don't need to feel sad, at least you don't need to spend $2000 on an RTX 5090 to enjoy gaming and be happy with the image quality.
> What about people who choose the second screenshot?
Both DLSSP and DLSSB provide an insane performance boost compared to native, hell even DLSSQ offered an 82% boost. However, those who opt for DLAA will need to spend a lot of money to be satisfied with the image quality. The RTX 5090 seems to make sense from their perspective.
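Those boosts track pixel counts fairly directly. A naive ceiling, assuming frame time scales linearly with shaded pixels (real games have fixed per-frame costs plus the upscale pass itself, which is why observed gains like the ~82% quoted above come in below this ceiling):

```python
# Naive fps-uplift ceiling from pixel counts alone. Assumption: frame time
# scales linearly with shaded pixels; fixed per-frame costs and the
# upscaling pass are ignored, so real uplifts land below these numbers.
SCALES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

def naive_uplift(mode: str) -> float:
    """Estimated fps gain over native, as a fraction (0.5 == +50%)."""
    s = SCALES[mode]
    return 1.0 / (s * s) - 1.0  # inverse of the rendered-pixel share

for mode in SCALES:
    print(f"{mode}: ~{naive_uplift(mode):+.0%}")
```

Quality mode's ceiling works out to about +125%, so a measured +82% with upscaling overhead and CPU limits in the mix is in a plausible range.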
![]()
> Show us PS5 Pro version in the same place to compare.
ps5 pro, Balanced mode. I use JPG format because I take hundreds of shots and it adds up on the ssd....
So try to ignore the compression, or ignore these altogether.
The game looks very clean. Default sharpening value (5/10 I think)
![]()
![]()
![]()
![]()
![]()
![]()
![]()
![]()
> Gameplay or photo mode?
Mix. Some are photo mode and some are gameplay.
You and/or OP need to sync up place and sharpness, I'm curious how they compare 1:1. Wukong has the best PSSR implementation in UE5 so far.
> $1000 is even too much to pay, imo. Pretty much why I'm done with the high end of PC gaming. And I mean "high end" with the 4080/5080 tier, much less GPUs that cost twice as much. Next GPU for me will be the 5070 or 9070. I'm completely satisfied with 60fps gaming and that is certainly achievable in the mid-range with minimal difference in image quality.
Hell, at this point those cards are enough to do 4K. I'm just over it, especially with engines like Unreal 5 skyrocketing system requirements pointlessly. Frostpunk 2 is literal insanity for what's on display and what's required to run it.
> ps5 pro, Balanced mode. I use JPG format because I take hundreds of shots and it adds up on the ssd.... […]
Wow - this looks amazing for the PS5.
> Gameplay or photo mode? You or/and OP need to sync up place and sharpness, I'm curious how they compare in 1:1. […]
OK. Found the exact spot. Maybe a different color because I fast traveled here from chapter 4?
> OK. Found the exact spot. Maybe a different color because I fast traveled here from chapter 4?
50% sharpening is too strong, while without a sharpening filter the image is too soft. I would probably change the in-game sharpening to something like 10%, maybe 20%, and use TV sharpening to improve detail, because at 50% the in-game sharpening mask is ruining the image.
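In-game sharpening of this kind is typically some variant of an unsharp mask, where the strength slider scales how much high-frequency detail gets added back. A minimal 1-D numpy sketch of why high strength produces the overdone look (illustrative only, not the game's actual filter):

```python
import numpy as np

def unsharp_1d(signal: np.ndarray, strength: float) -> np.ndarray:
    """Unsharp mask: sharpened = signal + strength * (signal - blurred)."""
    # Simple 3-tap box blur with edge padding stands in for the blur pass.
    padded = np.pad(signal, 1, mode="edge")
    blurred = (padded[:-2] + padded[1:-1] + padded[2:]) / 3.0
    return signal + strength * (signal - blurred)

# A clean step edge, like the boundary of a bright object.
edge = np.array([0.0, 0.0, 0.0, 1.0, 1.0, 1.0])

# Moderate strength lifts edge contrast a little; high strength overshoots
# past the original 0..1 range, producing the halo/ringing that makes
# heavy sharpening look overdone.
mild = unsharp_1d(edge, 0.2)
strong = unsharp_1d(edge, 1.0)
print(mild.max(), strong.max())
```

The overshoot (values above 1.0 next to the edge) is the halo; dialing the strength down to 10-20% keeps the edge lift while bounding the ringing.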
Corporal.Hicks
PS5 PRO
Balanced mode 5/10 sharpening
![]()
Quality 5/10 sharpening
![]()
Balanced 0/10 sharpening
![]()
Quality 0/10 sharpening
![]()
> 50% sharpening is too strong, while without a sharpening filter the image is too soft. […]
50 is too strong, but I am surprised I like the result so much.