NVIDIA GeForce Game Ready 576.02 Driver Addresses Several RTX 50 GPU Issues Including Black Screens, Crashes, Freezes & More

Something interesting happened yesterday...
I went into my local Microcenter and they had several 5070 Tis for $799 from Zotac, so I bought one for use in my HTPC and will be returning my $750 9070XT to Amazon.
Why did I do such a thing?

One simple reason: nvidia GPUs let me pass my HDMI signal through my receiver. AMD GPUs (I have tried several) do not without disabling ALLM and VRR, which are compromises that are 100% unacceptable. No idea why AMD GPUs have this issue.

It's a shame, really, as the 9070XT is an outstanding GPU, and FSR4 (when I could use it) was truly revolutionary compared to FSR3.

Had I bought the 9070XT at its $599 MSRP, would I have made this exchange? No. The 5070 Ti is the better GPU, but not at its $750 price against a $599 9070XT.

The main reason I bought the 9070XT was to use Bazzite/SteamOS, but I discovered after the fact that I could not use HDMI 2.1, and that basically forced me back to Windows, which nullified the biggest reason for buying an AMD GPU, in addition to not wanting to support nvidia. We will see how Linux support improves with nvidia.

On the bright side, the 9070XT I returned should eventually end up in the hands of a gamer.
 
Fixed the wake issue for the 50 series. It was super annoying not having a DP signal sent to the monitor if the monitor turned off or the system went to sleep. At least Nvidia is finally paying attention, or maybe they now have time to focus on gamers since they can't sell chips to China and want to keep busy.
 
Interesting. I am using the cable that comes with the Samsung G8 32" 4K 240 Hz monitor. I figured they would bundle it with a DisplayPort 2.1 cable.
 
Interesting. I am using the cable that comes with the Samsung G8 32" 4K 240 Hz monitor. I figured they would bundle it with a DisplayPort 2.1 cable.
I am pretty sure that monitor is not using DP 2.1.

Very few monitors have DP 2.1 and they usually carry a significant premium.

If you're happy with the 240 Hz performance, you won't see an improvement from full 2.1 versus 1.4 with DSC.

I have a 240 Hz Alienware QD-OLED. I dropped the refresh down to 60 Hz (which doesn't require DSC) and compared it to 240 Hz (with DSC): there is no difference. Literally none.

If you have a 5090, you obviously have a decent amount of income, but please do not spend extra money on a DP 2.1 monitor.

For the record, I have had more issues with Samsung monitors than any other brand by far.
 
Perf increase too. Congrats Nvidia fam
I always found the 5000 series performance to be extremely underwhelming. Any performance increase required a power increase compared to the 4000 series. Given how much more bandwidth GDDR7 has compared to 6X, I felt the performance gains should have been higher than what we got.

Maybe they are now finally unlocking that performance.
 
I always found the 5000 series performance to be extremely underwhelming. Any performance increase required a power increase compared to the 4000 series. Given how much more bandwidth GDDR7 has compared to 6X, I felt the performance gains should have been higher than what we got.

Maybe they are now finally unlocking that performance.
Should have been there day 1
 
I am pretty sure that monitor is not using DP 2.1.

Very few monitors have DP 2.1, and they usually carry a significant premium.

If you're happy with the 240 Hz performance, you won't see an improvement from full 2.1 versus 1.4 with DSC.

I have a 240 Hz Alienware QD-OLED. I dropped the refresh down to 60 Hz (which doesn't require DSC) and compared it to 240 Hz (with DSC): there is no difference. Literally none.

If you have a 5090, you obviously have a decent amount of income, but please do not spend extra money on a DP 2.1 monitor.

For the record, I have had more issues with Samsung monitors than any other brand by far.

Yeah, it says DisplayPort 1.4 on the specs. Not that I really care, but I was just wondering about a fix. So far the last update fixed the issue for me, so that's good.

Don't have any problem with the monitor, to be honest. My brother got one as well and no issues either. All good, thanks.
 
Yeah, it says DisplayPort 1.4 on the specs. Not that I really care, but I was just wondering about a fix. So far the last update fixed the issue for me, so that's good.

Don't have any problem with the monitor, to be honest. My brother got one as well and no issues either. All good, thanks.
DP 1.4 w/ DSC is fine. You're not really gaining much from DP 2.1. If I am being honest, I feel you are more likely to have issues trying to push 80 Gbps through a cable than just dealing with DSC, and I am willing to bet almost nobody could tell full bandwidth from DSC.
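To put rough numbers on that, here is a back-of-envelope link-budget sketch. It assumes 10-bit RGB and ignores blanking overhead; the 25.92 Gbps (DP 1.4 HBR3) and 77.37 Gbps (DP 2.1 UHBR20) payload figures are the commonly published ones.

```python
# Rough check of why 4K/240 needs DSC on DP 1.4 but fits uncompressed on DP 2.1.
# Ignores blanking intervals, so real requirements are slightly higher.

def video_bitrate_gbps(width, height, refresh_hz, bits_per_pixel):
    """Uncompressed pixel-data rate in Gbit/s."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# 4K at 240 Hz with 10-bit RGB (30 bits per pixel)
needed = video_bitrate_gbps(3840, 2160, 240, 30)

DP14_HBR3_PAYLOAD = 25.92    # Gbit/s after 8b/10b encoding overhead
DP21_UHBR20_PAYLOAD = 77.37  # Gbit/s after 128b/132b encoding overhead

print(f"4K/240 10-bit needs ~{needed:.1f} Gbit/s uncompressed")
print(f"fits DP 1.4 without DSC: {needed <= DP14_HBR3_PAYLOAD}")
print(f"fits DP 2.1 without DSC: {needed <= DP21_UHBR20_PAYLOAD}")
```

DSC targets visually lossless compression at roughly 3:1, which is why the ~59.7 Gbit/s stream squeezes comfortably into DP 1.4's ~25.9 Gbit/s payload.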
 
I always found the 5000 series performance to be extremely underwhelming. Any performance increase required a power increase compared to the 4000 series. Given how much more bandwidth GDDR7 has compared to 6X, I felt the performance gains should have been higher than what we got.

Maybe they are now finally unlocking that performance.
I agree, and this was the most disappointing thing about the 5000 series: increased performance came with a corresponding increase in power consumption.

However, I guess this was expected since the process node was the same.

Maybe the 6000 series can give us a similar '4000-like' leap in performance.
 
I agree, and this was the most disappointing thing about the 5000 series: increased performance came with a corresponding increase in power consumption.

However, I guess this was expected since the process node was the same.

Maybe the 6000 series can give us a similar '4000-like' leap in performance.
That was always the most realistic view, but GDDR7 should have provided a bigger boost.
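For context on the bandwidth argument, a quick sketch with the commonly published specs (treat the exact per-pin rates as assumptions; memory configs vary slightly by board):

```python
# Peak memory bandwidth = bus width (bits) x per-pin data rate (Gbit/s) / 8.
def bandwidth_gbs(bus_width_bits, pin_rate_gbps):
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits * pin_rate_gbps / 8

rtx_4090 = bandwidth_gbs(384, 21)  # GDDR6X at ~21 Gbit/s per pin
rtx_5090 = bandwidth_gbs(512, 28)  # GDDR7 at ~28 Gbit/s per pin

print(f"4090: {rtx_4090:.0f} GB/s")
print(f"5090: {rtx_5090:.0f} GB/s ({(rtx_5090 / rtx_4090 - 1) * 100:.0f}% more)")
```

The raw bandwidth jump is far larger than the typical framerate gains reviewers measured, which is the gap the posts above are pointing at.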
 
Something interesting happened yesterday...
I went into my local Microcenter and they had several 5070 Tis for $799 from Zotac, so I bought one for use in my HTPC and will be returning my $750 9070XT to Amazon.
Why did I do such a thing?

One simple reason: nvidia GPUs let me pass my HDMI signal through my receiver. AMD GPUs (I have tried several) do not without disabling ALLM and VRR, which are compromises that are 100% unacceptable. No idea why AMD GPUs have this issue.

It's a shame, really, as the 9070XT is an outstanding GPU, and FSR4 (when I could use it) was truly revolutionary compared to FSR3.

Had I bought the 9070XT at its $599 MSRP, would I have made this exchange? No. The 5070 Ti is the better GPU, but not at its $750 price against a $599 9070XT.

The main reason I bought the 9070XT was to use Bazzite/SteamOS, but I discovered after the fact that I could not use HDMI 2.1, and that basically forced me back to Windows, which nullified the biggest reason for buying an AMD GPU, in addition to not wanting to support nvidia. We will see how Linux support improves with nvidia.

On the bright side, the 9070XT I returned should eventually end up in the hands of a gamer.

This is really disappointing and somewhat holds back my desire to use AMD/Linux. I saw that there is a way around it using a DisplayPort-to-HDMI adapter, but it seems that VRR doesn't work. One of the best features of OLED for me was VRR; it's a shame it doesn't have DisplayPort, as that would solve a lot of problems.

On my office PC this is not a problem because I use a 4K 60 Hz monitor.

In the case of Nvidia/Linux, they just need to fix the performance issue in DX12, and recently a developer said they are looking into it.

 
This is really disappointing and somewhat holds back my desire to use AMD/Linux. I saw that there is a way around it using a DisplayPort-to-HDMI adapter, but it seems that VRR doesn't work. One of the best features of OLED for me was VRR; it's a shame it doesn't have DisplayPort, as that would solve a lot of problems.

On my office PC this is not a problem because I use a 4K 60 Hz monitor.

In the case of Nvidia/Linux, they just need to fix the performance issue in DX12, and recently a developer said they are looking into it.

I did have success at first using a DP2HDMI cable from Cable Matters. It worked and had all the bells and whistles until it stopped working.

Every time I would cold boot the system, it reverted back to 4K/60 and that got old.

The only way to get it working was to boot with the HDMI cable hooked up, enter desktop mode, disable the HDMI port, and plug in the DP2HDMI cable.

That got old fast.
 
I am pretty sure that monitor is not using DP 2.1.

Very few monitors have DP 2.1, and they usually carry a significant premium.

If you're happy with the 240 Hz performance, you won't see an improvement from full 2.1 versus 1.4 with DSC.

I have a 240 Hz Alienware QD-OLED. I dropped the refresh down to 60 Hz (which doesn't require DSC) and compared it to 240 Hz (with DSC): there is no difference. Literally none.

If you have a 5090, you obviously have a decent amount of income, but please do not spend extra money on a DP 2.1 monitor.

For the record, I have had more issues with Samsung monitors than any other brand by far.
I have a Gigabyte Aorus monitor that has DP 2.1 and I can't really say it matters much.
I no longer get those quick black screens when going out of a game in full screen mode (which only lasted about 1 sec anyway).
Great monitor though!
 
I have a Gigabyte Aorus monitor that has DP 2.1 and I can't really say it matters much.
I no longer get those quick black screens when going out of a game in full screen mode (which only lasted about 1 sec anyway).
Great monitor though!
I had those issues at first and always used borderless. In recent months it seems like that issue has been fixed, as I haven't noticed it on my Alienware QD-OLED in quite some time.

Elden Ring was the worst when it came to those quick black screens.
 
My PC used to consistently and violently crash when Frame Generation was enabled in AC Shadows. This issue is now completely gone...
 
By all accounts this driver is indeed a nice improvement. Happy for nvidia owners who've put up with this crap for the past few months.
 
  • Stutter when using VSYNC [5202703][5202474]
  • VSYNC in NVCP + frame generation causes issues in DLSS 4 games [5124816]
Finally! Was driving me crazy that there was a small hitch when playing at extremely high framerates on games like Cyberpunk.
 
I always found the 5000 series performance to be extremely underwhelming. Any performance increase required a power increase compared to the 4000 series. Given how much more bandwidth GDDR7 has compared to 6X, I felt the performance gains should have been higher than what we got.

Maybe they are now finally unlocking that performance.
No die shrink = no performance-per-watt gains. The 60 series will have a die shrink and blow away the 5090.
 
Hopefully this fixes much of the bullshit that has been present as of late.

Here is the LONG list of fixes. Despite what certain shills will tell you, it's nice to see nvidia basically acknowledging that the previous drivers sucked.

I will mention that I personally have not dealt with any issues on my 4090 using the previous drivers, but most of my gaming has been done on my living room PC with a 9070XT.
It's great they've acknowledged something they originally fucked up. I mean seriously, how can these companies keep doing this? Paying money for fuck ups like this is unacceptable.
 
It's great they've acknowledged something they originally fucked up. I mean seriously, how can these companies keep doing this? Paying money for fuck ups like this is unacceptable.
When you're making money hand over fist, your priorities shift. nvidia could get sued for billions, lose, and it would make little difference.

nvidia is doing just enough to keep gamers around to remind everyone that they exist.

I expected Blackwell would be bad, but even I didn't expect it to go this poorly. I expected Jensen's own personal pride to whip Blackwell into shape.

What's interesting is that despite all of these availability issues, my local Microcenter has had fairly reliable stock of 5070s and 5070 Tis, but has been consistently sold out of 9070XTs.
As I said earlier, I replaced my $750 9070XT with an $800 5070 Ti. I only made the switch because the 5070 Ti had enough of a bump in performance and because nvidia GPUs can seamlessly pass through my Onkyo 6050 to my C1. If I had gotten the 9070XT at $599, I would NOT have made the switch. The 9070XT is a great GPU at MSRP.
 
No die shrink = no performance-per-watt gains. The 60 series will have a die shrink and blow away the 5090.
If they're still doing GPUs at that point.

Things could be so drastically different by then.

I don't believe nvidia will ever abandon gamers, but I also have little faith that the AI market will be shaken up enough by then for nvidia to put effort into pleasing us.
 
4080, still black screening at random even with a clean wipe and a new DP cable (which I thought was the problem)
Black screen as in just no picture on the screen while Windows continues to run, or also a system crash?

I had the latter, and it was due to the AMD CPU Curve Optimizer being set too low.
 
I'm on the new hotfix driver with my RTX 3090 and haven't had any issues.
However, I've never seen nvidia release as many hotfix drivers as they have in 2025. Not sure what has changed over there...

OP may want to add the hotfix to his post.

This hotfix addresses the following:
  • [RTX 50 series] Some games may display shadow flicker/corruption after updating to GRD 576.02 [5231537]
  • Lumion 2024 crashes on GeForce RTX 50 series graphics card when entering render mode [5232345]
  • GPU monitoring utilities may stop reporting the GPU temperature after PC wakes from sleep [5231307]
  • [RTX 50 series] Some games may crash while compiling shaders after updating to GRD 576.02 [5230492]
  • [GeForce RTX 50 series notebook] Resume from Modern Standby can result in black screen [5204385]
  • [RTX 50 series] SteamVR may display random V-SYNC micro-stutters when using multiple displays [5152246]
  • [RTX 50 series] Lower idle GPU clock speeds after updating to GRD 576.02 [5232414]
 
I'm on the new hotfix driver with my RTX 3090 and haven't had any issues.
However, I've never seen nvidia release as many hotfix drivers as they have in 2025. Not sure what has changed over there...

OP may want to add the hotfix to his post.

This hotfix addresses the following:
  • [RTX 50 series] Some games may display shadow flicker/corruption after updating to GRD 576.02 [5231537]
  • Lumion 2024 crashes on GeForce RTX 50 series graphics card when entering render mode [5232345]
  • GPU monitoring utilities may stop reporting the GPU temperature after PC wakes from sleep [5231307]
  • [RTX 50 series] Some games may crash while compiling shaders after updating to GRD 576.02 [5230492]
  • [GeForce RTX 50 series notebook] Resume from Modern Standby can result in black screen [5204385]
  • [RTX 50 series] SteamVR may display random V-SYNC micro-stutters when using multiple displays [5152246]
  • [RTX 50 series] Lower idle GPU clock speeds after updating to GRD 576.02 [5232414]
This driver fixed the flickering on my 5090 in Ass Creed when you climb a waypoint and the camera circles around your character.

JohnnyFootball - Can you add the hotfix info to the OP, as per Celcius' suggestion?
 
I have an issue that I have no idea is driver related, but I'll mention it here.

Doom Eternal's performance has absolutely tanked on my 5070 Ti. That game should be getting around 200 fps (no DLSS) with ray tracing enabled, but it's now dropped to around 60-70 fps and I cannot figure out why. I haven't tried reinstalling it. Turning off ray tracing brings it back up to 115 fps, which is likely due to low latency mode being enabled in the control panel and my OLED C1 being limited to 120 Hz.

The only thing that seems to restore in-game performance is enabling in-game V-Sync, which you aren't supposed to do if you want to take full advantage of G-Sync. I have my nvidia control panel set to Ultra Low Latency Mode with V-Sync enabled, and I turn V-Sync off in game. Again, these are the recommended G-Sync settings.

Anybody have any suggestions? Doom Eternal has always been a game that has run GREAT and has never required any special tweaking, but this has me annoyed.

I will mention that Indiana Jones does not run well with low-latency mode enabled and I had to set special instructions in the control panel to disable it for that game.
 
I have an issue that I have no idea is driver related, but I'll mention it here.

Doom Eternal's performance has absolutely tanked on my 5070 Ti. That game should be getting around 200 fps (no DLSS) with ray tracing enabled, but it's now dropped to around 60-70 fps and I cannot figure out why. I haven't tried reinstalling it. Turning off ray tracing brings it back up to 115 fps, which is likely due to low latency mode being enabled in the control panel and my OLED C1 being limited to 120 Hz.

The only thing that seems to restore in-game performance is enabling in-game V-Sync, which you aren't supposed to do if you want to take full advantage of G-Sync. I have my nvidia control panel set to Ultra Low Latency Mode with V-Sync enabled, and I turn V-Sync off in game. Again, these are the recommended G-Sync settings.

Anybody have any suggestions? Doom Eternal has always been a game that has run GREAT and has never required any special tweaking, but this has me annoyed.

I will mention that Indiana Jones does not run well with low-latency mode enabled and I had to set special instructions in the control panel to disable it for that game.

Remember to check global profiles / app profile for doom just in case it was set.
 
Holy fuck.... more bullshit

Download the hotfix NOW.

Nvidia should pull 576.02 and just make 576.15 the official one....assuming of course this hotfix doesn't make things worse.

GPU monitoring utilities may stop reporting the GPU temperature after PC wakes from sleep [5231307]

Edit: Why isn't the driver update available on the normal site? It was released 3 days ago?
 
Remember to check global profiles / app profile for doom just in case it was set.
The game does not have any singular profile settings. It is 100% using global settings.

I shouldn't have to change anything.

I do have a few things to try. Doom Eternal has always been a game that has run great even on my laptop 3060.
 
I have an issue that I have no idea is driver related, but I'll mention it here.

Doom Eternal's performance has absolutely tanked on my 5070 Ti. That game should be getting around 200 fps (no DLSS) with ray tracing enabled, but it's now dropped to around 60-70 fps and I cannot figure out why. I haven't tried reinstalling it. Turning off ray tracing brings it back up to 115 fps, which is likely due to low latency mode being enabled in the control panel and my OLED C1 being limited to 120 Hz.

The only thing that seems to restore in-game performance is enabling in-game V-Sync, which you aren't supposed to do if you want to take full advantage of G-Sync. I have my nvidia control panel set to Ultra Low Latency Mode with V-Sync enabled, and I turn V-Sync off in game. Again, these are the recommended G-Sync settings.

Anybody have any suggestions? Doom Eternal has always been a game that has run GREAT and has never required any special tweaking, but this has me annoyed.

I will mention that Indiana Jones does not run well with low-latency mode enabled and I had to set special instructions in the control panel to disable it for that game.

Try resetting the shader cache.
If that doesn't help, try uninstalling the driver in safe mode with DDU, then reinstall.
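Resetting the shader cache can be scripted. A minimal sketch, assuming the caches live in the usual %LOCALAPPDATA% folders (NVIDIA\DXCache, NVIDIA\GLCache, and Windows' own D3DSCache) — these folder names are assumptions, so verify them on your machine and keep dry_run=True until the output looks right:

```python
# Hedged sketch: clearing NVIDIA/DirectX shader caches on Windows.
# The folder names below are the commonly used ones; confirm they exist
# on your system before deleting anything.
import os
import shutil

def nvidia_cache_dirs(local_appdata):
    """Return candidate shader-cache folders under %LOCALAPPDATA%."""
    names = ["NVIDIA\\DXCache", "NVIDIA\\GLCache", "D3DSCache"]
    return [os.path.join(local_appdata, n) for n in names]

def clear_caches(dry_run=True):
    base = os.environ.get("LOCALAPPDATA", "")
    for d in nvidia_cache_dirs(base):
        if os.path.isdir(d):
            print(("would remove " if dry_run else "removing ") + d)
            if not dry_run:
                shutil.rmtree(d, ignore_errors=True)

if __name__ == "__main__":
    clear_caches(dry_run=True)  # flip to False once the paths look right
```

The caches are rebuilt automatically the next time games compile shaders, so expect brief stutter on first launch afterwards.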
 
Yeah, at least not for me. I have to check MSI Afterburner before I play any game to see if the GPU temp monitor is working, because I use a custom fan curve.
I think it worked for me. I set my sleep time to 1 min to test it, then left it for 10 mins; the temperature seems to be going up and down in Task Manager. I will test Afterburner later.
 