
NVIDIA DLSS 3 “Frame Generation” Lock Reportedly Bypassed, RTX 2070 Gets Double The Frames In Cyberpunk 2077

The problem is nobody competes with them past a certain point. If I want blistering 4K performance and have the money to pay for it, I really only have the one option.
AMD has been competitive when it comes to rasterization, but they've lagged in ray tracing and ML compared to Nvidia, who introduced hardware acceleration for those workloads in gaming. Hopefully their newer cards this year manage to catch up in those features.
 

Kuranghi

Member
Totally off topic but do you know of a way to play metroid prime 2 on an emulator without stutter? Really wanna finish it but the stuttering kills me.

It's one of the worst for it due to how many shaders and screen-space effects it uses; try reading this:


The newest build of Dolphin should have it mostly fixed via Ubershaders/Asynchronous Shader Compilation, so just make sure you have the latest one and that's enabled. If that doesn't work, then look into this fork that also has async shader compilation but does it slightly differently or something:


I have a feeling that's mostly antiquated now though, and the main branch is what you want. The benefit of that fork used to be that it supported a higher-quality type of texture compression for HD texture packs, but even that is probably in main now, I would expect. I think it may be as simple as checking a few boxes in Dolphin's settings to remove the stutters though; see the sketch below.
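If you'd rather flip the settings outside the GUI, something like this should do it. Treat it as a rough sketch: the GFX.ini location and the key names (ShaderCompilationMode, WaitForShadersBeforeStarting) are from memory and may differ between builds, so double-check them against your own config first.

```python
# Rough sketch: flip the two Dolphin settings that matter for Prime 2's
# shader stutter. Path and key names are from memory, verify them
# against your own GFX.ini before running this.
from configparser import ConfigParser
from pathlib import Path

# Windows default location; Linux builds keep it under ~/.config/dolphin-emu/
gfx_ini = Path.home() / "Documents" / "Dolphin Emulator" / "Config" / "GFX.ini"

cfg = ConfigParser()
cfg.optionxform = str  # keep Dolphin's CamelCase key names intact
cfg.read(gfx_ini)

if not cfg.has_section("Settings"):
    cfg.add_section("Settings")
# 2 = asynchronous (ubershaders) compilation in recent builds
cfg.set("Settings", "ShaderCompilationMode", "2")
# pre-compile whatever is already in the shader cache at boot
cfg.set("Settings", "WaitForShadersBeforeStarting", "True")

with gfx_ini.open("w") as f:
    cfg.write(f)
```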
 

01011001

Banned
Doing this causes some instability and frame drops but I'm getting ~80 FPS on an RTX 2070 with the following settings:

2560x1440 res

HDR off

DLSS Balanced

DLSS Frame Generation ON

Ray Tracing Ultra preset

(I get ~35-40 FPS without Frame Generation and DLSS at Quality)

so why doesn't he post what he gets with DLSS Quality for both, or DLSS Balanced for both?

this is weird to me.

just for example on my part: with the settings I play at, going from DLSS Quality to DLSS Balanced takes me from ~50fps to ~70fps in some scenes (not factoring in CPU-related drops in CPU-limited scenarios). the preset gap alone is significant, as the numbers below show
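For context on how big the Quality-to-Balanced jump is on its own, here's the internal resolution each preset renders before upscaling to 1440p. The per-axis scale factors are the commonly documented DLSS 2.x values; individual games can deviate slightly.

```python
# Internal render resolution behind each DLSS preset at a 1440p output.
# Scale factors are the commonly cited per-axis DLSS 2.x values.
OUTPUT_W, OUTPUT_H = 2560, 1440
PRESETS = {
    "Quality":     0.667,
    "Balanced":    0.580,
    "Performance": 0.500,
}

for name, scale in PRESETS.items():
    w, h = round(OUTPUT_W * scale), round(OUTPUT_H * scale)
    print(f"{name:12s} -> {w}x{h} ({w * h / 1e6:.2f} MPix)")
# Quality renders ~32% more pixels than Balanced, which is roughly
# where a ~50 -> ~70 fps swing can come from in GPU-limited scenes.
```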
 

Barakov

Gold Member
NVIDIA recently introduced its new DLSS 3 technology that adds a new feature known as Frame Generation on supported GPUs. The company officially said that this feature was only available on its new GeForce RTX 40 series cards, but it looks like a Redditor has bypassed the lock & enabled it on an RTX 20 series graphics card.

DLSS 3 Lock Bypassed? Redditor Enables Frame Generation on GeForce RTX 2070 With A Simple Config File


The Redditor, who goes by the username JusDax, posted in the NVIDIA subreddit under a thread talking about new settings and features being added to Cyberpunk 2077. One of the upcoming features within Cyberpunk 2077 will be DLSS 3, which enables AI-assisted frame generation that does wonders in boosting the frame rate.

The user says that he was able to bypass the DLSS 3 lock on older GeForce RTX cards by adding a config file to remove the VRAM overhead in Cyberpunk. He ran the game at 1440p with HDR off & DLSS set to Balanced mode. He was also able to enable the Frame Generation option under the Ultra ray tracing preset.

DLSS Frame Generation doesn't seem to be hardware locked to RTX 40 series. I was able to bypass a software lock by adding a config file to remove the VRAM overhead in Cyberpunk. Doing this causes some instability and frame drops but I'm getting ~80 FPS on an RTX 2070 with the following settings:

2560x1440 res

HDR off

DLSS Balanced

DLSS Frame Generation ON

Ray Tracing Ultra preset

(I get ~35-40 FPS without Frame Generation and DLSS at Quality)



u/JusDax via r/NVIDIA

Without frame generation, his NVIDIA GeForce RTX 2070 gets around 35-40 FPS, but with DLSS 3 frame generation enabled, the user got 80 FPS. That's roughly a 2x performance jump, if not bigger. The user did use a different DLSS preset (Balanced) to test frame generation, but still, switching from the Quality to the Balanced preset even in DLSS 2 won't net you a 2x gain.
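To unpack the numbers: frame generation inserts one AI frame between every pair of rendered frames, so presented fps is about double rendered fps. A back-of-envelope version, where the Quality-to-Balanced uplift is an assumed illustrative figure, not a measurement:

```python
# Back-of-envelope for the claimed numbers. Frame Generation presents
# one generated frame per rendered frame, i.e. presented ~= 2x rendered.
baseline_quality = 37.5   # midpoint of the reported ~35-40 fps (DLSS Quality, no FG)
balanced_uplift = 1.3     # ASSUMED Quality -> Balanced gain; varies by scene/GPU

rendered_fps = baseline_quality * balanced_uplift   # ~49 fps actually rendered
presented_fps = 2 * rendered_fps                    # ~98 fps if FG were free
print(f"rendered ~{rendered_fps:.0f} fps, presented ~{presented_fps:.0f} fps")
# The reported ~80 fps lands below that ceiling, which fits the reports
# of instability and overhead when FG runs on pre-40-series hardware.
```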

[Image: NVIDIA DLSS 3 innovation slide]


Full article : https://wccftech.com/nvidia-dlss-3-...rtedly-bypassed-rtx-2070-gets-double-the-fps/

Very Funny Reaction GIF
 

Filben

Member
I bet a tenner that the bolded is more severe than is being let on, it's like Switch emulation:

"I get 180fps in Metroid Prime"

*watches their recorded video and the frametime graph is like the surface of the ocean during a tropical cyclone*

"Nah but I have a shader pack"
"How did you get it?"
"I performed every action in the whole game until it stopped stuttering"

Just play the whole game in a really annoying way first, then you can enjoy playing the game at a higher framerate guys, it's simple stuff:

Guns Bullet GIF


Over the last few years I've learned that one man's "it's mostly stable" is my "it feels like someone is hitting me in the back of the head with a frying pan every 6 seconds"
I know this topic is about DLSS, but yeah, every time I go about emulating Switch games or PS3 or other stuff, people talk about "4K60fps!" and then you have audio issues, micro stutter from hell, etc etc. "It gets better the longer you play." Yeah, no shit, that's the definition of "experimental" in my book, not "stable" for everyday use.

I'm very sensitive to micro stutter though and I believe some people genuinely wouldn't notice.

On topic: no amount of FPS is worth it when the game crashes (my understanding of 'unstable').

Apart from that, it would be awesome if it's possible. I haven't followed news about Nvidia's nuclear power plant, but I don't know why it would require new hardware. No argument will ever convince me to spend 1000+ on a new GPU just for RT, though (because I don't need DLSS for rasterised graphics anyway).
 

Soltype

Member
I'll wait to see if it's true. Everyone said FreeSync was the same as G-Sync and NVIDIA was selling snake oil. Turned out G-Sync was the better tech.
 

Pejo

Member
Is this "soap opera mode" for graphics cards? There's no way it doesn't introduce lag somewhere for the net increase of FPS.
 

Jackgamer XXXX

Neo Member
Is it good news if one of the leading GPU manufacturers was caught twisting the truth about its hotly anticipated technology not being possible on older hardware?
Nvidia didn't say that.

They said frame insertion will run on older GPUs, but not with the quality and performance Nvidia would like it to have, so they didn't implement it on older GPUs.
 

supernova8

Banned
Technically he hasn't provided any proof, but if true, not surprising at all. It does look like the 4080 16 GB and 12 GB models are hardly any better than the top 30 series offerings (and yet cost more lol) if you remove the DLSS 3 advantage.

Hopefully AMD gives them a bloody nose this time, but I guess they won't. They've "done it" with Ryzen, but for some reason they just cannot get mega mindshare in the GPU space. Radeon seems to be a case of "well, I'll maybe get it if it's cheap and the GeForce option is shit", whereas Ryzen has almost become the default option unless Intel can prove itself again.

The reason I say "hopefully" is because, let's not forget, AMD made FSR work on everything (more or less?) and gave us FreeSync as an open standard while Nvidia is/was being a dick with G-Sync.
 

Buggy Loop

Member
The guy said there’s stuttering like mad. FPS number doesn’t tell the whole story.

I mean sure, Nvidia, let them try frame generation and see all the complaints that it performs like shit on their 2070… but high fps! Ooooohhhhh aaahhhh
 

Hoddi

Member
Nvidia didn't say that.

They said frame insertion will run on older GPUs, but not with the quality and performance Nvidia would like it to have, so they didn't implement it on older GPUs.
Dunno, it feels like a strange statement given that 1080p interpolation shouldn't have the same requirements as 4K interpolation. Never mind the difference between 60fps > 120fps and 120fps > 240fps.

I've used Optical Flow in SVP 4 on my 2080 Ti and, sure, it drops frames when interpolating 24fps 4K footage to 165fps. But 1080p30 to 1080p60 doesn't exactly cripple the GPU.
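Putting rough numbers on that: interpolation cost scales roughly with the pixels you have to generate per second, so those two workloads aren't remotely close. Back-of-envelope only; real cost depends on the flow algorithm:

```python
# Rough workload comparison: optical-flow interpolation cost scales
# approximately with (pixels per frame) x (generated frames per second).
def generated_mpix_per_s(width: int, height: int, generated_fps: float) -> float:
    return width * height * generated_fps / 1e6

cases = {
    "1080p30 -> 60  (30 generated/s)": (1920, 1080, 30),
    "4K24   -> 165 (141 generated/s)": (3840, 2160, 141),
}
for label, (w, h, g) in cases.items():
    print(f"{label}: ~{generated_mpix_per_s(w, h, g):,.0f} MPix/s")
# The 4K/165Hz case is ~19x the work of 1080p30 -> 60, which is why the
# former drops frames on a 2080 Ti while the latter barely registers.
```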
 

Soltype

Member
Not sure, but even if it was marginally better, it didn't justify the silly premium G-Sync monitors used to command. You should be happy that FreeSync came along and put a stop to Nvidia's gouging.
I'm glad there's a cheaper option, but their tech was better and you can't hand-wave facts. Also, no one has made something as good for cheaper, so who can say what the premium should cost? Some people want the best and can afford it.
 

OZ9000

Banned
The problem is nobody competes with them past a certain point. If I want blistering 4K performance and have the money to pay for it, I really only have the one option.
The 6900XT was very close to the RTX 3090.

Why not wait for November until we hear from AMD?
 

Knightime_X

Member
I wonder if this will cause games to crash.
I bet someone will blame Nvidia when their DLSS 3 isn't working on their 20/30 series cards.

I need to see more instances on more games before getting remotely excited.
 
Let's wait and see the proof, because there ain't none yet, and no one can replicate it because it's in the upcoming update to Cyberpunk that he has access to.
"I have shared as much as I could. You will need a specific build of Cyberpunk to replicate this for yourself as the game hasn't been updated thru public channels."
 

THE DUCK

voted poster of the decade by bots
I sense a press release........ after further extensive testing and "optimization" we've decided we can do DLSS 3.0 on older cards after all, due to the amazing dedication of the team here at Nvidia! And since we love gamers, we are releasing it for free! Yay Nvidia!
 

Mister Wolf

Gold Member
If the framerate is already horrible before, interpolation just amplifies it. Trash in, trash out...

Yeah, this isn't some miracle tech. It won't save a weak GPU and was never meant to. I would go as far as saying that if you aren't at least hitting the 50-60fps range without the interpolation, do not bother turning it on. The rough numbers below show why.
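"Trash in, trash out" has a simple geometric reason: the generated frame has to bridge whatever motion happened between two real frames, and that gap grows as the base framerate drops. The panning speed below is an illustrative assumption:

```python
# Why low base framerates hurt interpolation quality: the gap the fake
# frame must bridge grows as the real framerate falls.
PAN_SPEED_PX_S = 1920  # ASSUMED: a camera pan crossing a 1080p-wide screen per second

for base_fps in (30, 45, 60, 90):
    gap_px = PAN_SPEED_PX_S / base_fps  # on-screen motion between real frames
    print(f"{base_fps:2d} fps base -> ~{gap_px:.0f} px of motion to synthesize")
# At 30 fps the interpolator is guessing across ~64 px of movement per
# generated frame; at 90 fps it's ~21 px. Artifacts scale with that gap.
```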
 

ClosBSAS

Member
Doing this generates issues in the game, but that must be because a 2070 is not strong enough. A 3080 might be able to do it. Nvidia never said it was impossible, just that it would take a lot of time to implement. It can probably work on the 30 series, but not without issues.
 

00_Zer0

Member
Scum of the earth Nvidia doing their thing. Only good competition will stop these shady practices. I am unsure about this technology, though. I saw that it does add a bit of input lag, so it would be no good in online games, but for single player it might be interesting.
 

hlm666

Member
This doesn't mean it works on AMD or Intel hardware; it's still using the Nvidia hardware to do it. You're just possibly sacrificing quality and upscaling performance by running FSR/XeSS on standard compute instead of using the tensor cores.
 
Yeah, that lock was complete garbage BS, same for their automatic game-remastering software.

They should at the very least offer the software as a "bundle" with their new cards and offer to sell it to existing card owners (I'd rather it be free for all cards that can do it... We paid for the RTX premium).
 


b0uncyfr0

Member
Realistically speaking - unless there's big backlash, they'll never release this officially to 30 series owners. It's like admitting "you don't really need to pay crazy prices for our 40 series cards". Not that I would anyway - I'm banking on a 7800 XTX :messenger_beaming:
 
Last edited: