> RTX Voice all over again.
LOL, I remember trying to use that with my 980 Ti.
> Just chain 1,000 Voodoo 2 cards together.
I think I saw a picture of that card.
Jesus Christ, read the goddamn article before jizzing all over the keyboard, people:
https://www.pcgamer.com/nvidias-frame-generation-tech-works-with-amd-fsr-and-intel-xess-scaling/
LOL. It looks like the new DLSS 3.0 frame generation feature works with any upscaling tech.
Actual results from Igor's Lab:
https://www.igorslab.de/en/nvidia-g...and-xess-in-benchmark-and-quality-comparison/
> Jesus Christ, read the goddamn article before jizzing all over the keyboard, people
Let me just glance 15 degrees to the right of the monitor I'm using right now. Let's see... a glowing RGB sign that says, uh... EVGA GeForce GTX 1080 Classified. Damn, I hate Nvidia! (Your brand loyalties are showing.)
This DOESN'T ENABLE FRAME GENERATION FOR NON-40XX CARDS! It just means frame generation can be used with FSR and XeSS upscaling. Which is still a nothingburger, 'cause DLSS is the best upscaler for Nvidia cards anyway.
I get hating a company, but hate boners aren't healthy.
> Let me just glance 15 degrees to the right of the monitor I'm using right now. Let's see... a glowing RGB sign that says, uh... EVGA GeForce GTX 1080 Classified. Damn, I hate Nvidia! (Your brand loyalties are showing.)
How's correcting bullshit brand loyalty? I've been very critical of Nvidia in other threads.
> How's correcting bullshit brand loyalty? I've been very critical of Nvidia in other threads.
It actually does run on non-40xx cards. But I didn't make that claim. DLSS 3's bullshit fake frames run on top of DLSS's competitors.
I see an update to this thread making people believe that Nvidia DLSS 3 now runs on non-RTX-40xx cards. That's false, and Nvidia being greedy and putting shitty connectors on their new cards doesn't suddenly make all news true.
> It actually does run on non-40xx cards. But I didn't make that claim. DLSS 3's bullshit fake frames run on top of DLSS's competitors.
DLSS 3.0 and FSR 3.0 will not be good for the industry. Increasing latency to make a game appear to have more frames isn't the way forward.
> DLSS 3.0 and FSR 3.0 will not be good for the industry. Increasing latency to make a game appear to have more frames isn't the way forward.
I agree; I'm not a fan of the technology either. I think it can be good for single-player games to go from 60 to 120, but that's it. Bad for competitive games and awful below 60.
> Yes, again, for high fps. It can do 80 to 160 or 60 to 120 and it's going to be fine. But this kind of technology would be amazing for taking 30 fps to 60 fps, and there it doesn't perform that well, with visible artifacts, and it still gives you crappy input lag. I might be wrong, but I just think it'd be most useful on weaker cards (think 4060 and below), and on those the experience doesn't sound like it'd be that good. I'll test it in Cyberpunk when that patch is finally out.
But that's not the use case. If you have the choice of running a game at 80 fps with the latency of 80 fps, or running it at 144 fps with almost the same latency as 80 fps, what would you choose? You lose basically nothing and gain motion clarity.
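To put rough numbers on the two posts above, here's a toy Python model. The flat 2x multiplier and the "about two native frame times of input latency" figure are illustrative assumptions, not measured DLSS3 behaviour:

```python
# Toy model of the latency trade-off argued above. Assumptions for
# illustration only (not measured DLSS3 figures): frame generation
# doubles the displayed frame rate, and input latency lands around two
# native frame times because the interpolator must buffer a real frame.

def frame_time_ms(fps: float) -> float:
    """Milliseconds between frames at a given frame rate."""
    return 1000.0 / fps

for native_fps in (80, 60, 30):
    displayed = native_fps * 2                   # assumed 2x generation
    latency_ms = 2 * frame_time_ms(native_fps)   # assumed ~2 native frames
    print(f"{native_fps:>3} fps native -> ~{displayed} fps shown, "
          f"~{latency_ms:.0f} ms input latency")
```

On this toy model, 80 -> 160 keeps latency around 25 ms, while 30 -> 60 still sits near 67 ms, which lines up with the complaint that low base frame rates still feel laggy even when the counter says 60.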
> Maybe it's because I'm not used to super high frame rates, but in general, once we're above 90/100 Hz I barely notice a difference up to 144/165. I do get that pushing a screen to its max refresh rate might have its benefits, but at the moment this technology seems to be effective only where it's least noticeable. Happy to be proven wrong!
The use case on PC is mostly to get to your monitor's refresh rate without running natively at that refresh. So if you have a 165 Hz monitor but newer games at high settings only run at maybe 80 fps, you just turn on frame generation and reach your max refresh, or get close to it. I don't think this was ever meant to be used at 60 Hz, since most PC screens these days are at least 120 Hz or 144 Hz. Look where PC screens are headed: with the new DisplayPort version we may soon see 900 Hz monitors. Combine that with DLSS frame generation and you could actually get close to 900 fps on high-end cards. But even with screens that already exist, this could be a big improvement in motion/image clarity. With 1440p 240 Hz monitors already on the market, you could see it being used to get 120 fps games closer to that 240 Hz refresh without sacrificing graphics settings. And if you run 120 native frames + frame generation + Nvidia Reflex, your input latency will still be super low.
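A minimal sketch of that refresh-matching idea, again assuming a flat 2x generation multiplier for illustration rather than a measured figure:

```python
# Sketch of frame generation as refresh-rate matching, per the post above.
# The 2x multiplier is an assumption, not a measured DLSS3 output.

def displayed_fps(native_fps: float, refresh_hz: float,
                  multiplier: float = 2.0) -> float:
    """Frame-generated output, capped at the monitor's refresh rate."""
    return min(native_fps * multiplier, refresh_hz)

for native, refresh in [(80, 165), (120, 240)]:
    print(f"{native} fps native on a {refresh} Hz panel -> "
          f"~{displayed_fps(native, refresh):.0f} fps displayed")
```

Under those assumptions, 80 fps native lands at ~160 fps, just under a 165 Hz panel's cap, and 120 fps native saturates a 240 Hz panel, which is exactly the scenario described above.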
> Playing on a TV a few metres away, you'll never see these smaller issues with artifacting/motion and frame generation. I see it as a big win for PC gamers with TVs. It's finally a good time to aim for 120 Hz and beyond.
The current version of DLSS3 has issues you WILL notice, even from a distance. Flickering, scrambled HUDs, for example.
> The current version of DLSS3 has issues you WILL notice, even from a distance. Flickering, scrambled HUDs, for example.
Well shit, I guess it needs more time in the oven. I was under the impression that from a distance everything was pretty good.
> Well shit, I guess it needs more time in the oven. I was under the impression that from a distance everything was pretty good.
There are several DLSS3 tests out there from the usual suspects. Just search YouTube.
> The current version of DLSS3 has issues you WILL notice, even from a distance. Flickering, scrambled HUDs, for example.
Any videos out there about these 'issues'...?
So was this ever confirmed, btw? It's been quite a while and no one has managed to showcase this actually running on 2000- and 3000-series cards, so I guess it was a whole load of bullshit, right?
> The problem is nobody competes with them past a certain point. If I want blistering 4K performance and have the money to pay for it, I really only have the one option.
I think the biggest issue is people buying into the 4K hype. It's exceptionally expensive and exceptionally useless. I would take 1080p at 144 fps with max settings over 4K at 60 fps and low settings any day.
> I think the biggest issue is people buying into the 4K hype. It's exceptionally expensive and exceptionally useless. I would take 1080p at 144 fps with max settings over 4K at 60 fps and low settings any day.
Exceptionally useless for a 24" or 27" monitor, but exceptionally useful for a 40-48" monitor or an 80" TV.
> Exceptionally useless for a 24" or 27" monitor, but exceptionally useful for a 40-48" monitor or an 80" TV.
Such a big screen is also exceptionally useless.
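For what it's worth, the pixel-density arithmetic both sides are gesturing at is easy to check. A quick Python sketch for the screen sizes mentioned above:

```python
# Pixel-density arithmetic for the screen-size argument above: the same
# 3840x2160 grid that is very dense at 27" is much coarser at 80".

import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch for a given resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

for name, w, h, diag in [
    ('27" 1080p', 1920, 1080, 27),
    ('27" 4K',    3840, 2160, 27),
    ('48" 4K',    3840, 2160, 48),
    ('80" 4K',    3840, 2160, 80),
]:
    print(f"{name}: ~{ppi(w, h, diag):.0f} PPI")
```

That works out to roughly 163 PPI for 4K at 27" versus about 55 PPI at 80"; whether the coarser grid matters then comes down to viewing distance, which is really what this argument is about.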