Yup, same boat. I'll stick with 566.36; no issues at all on my end, so I see no reason to update.
Nice, but most of it is for 50-series GPUs, so I will stay on an older driver with my 4070. Since I switched to older drivers I have had no more crashes.
Interesting. I am using the cable that came with the Samsung G8 32" 4K 240 Hz monitor. I figured they would bundle it with a DisplayPort 2.1 cable.
Yes, there are DP 2.1 certified cables. They aren't cheap, but not crazy either.
But any decent DP cable will work; it may just force DSC if the bandwidth isn't available.
If you end up using DSC, you probably won't even notice, and you shouldn't care.
I am pretty sure that monitor is not using DP 2.1.
I always found the 5000 series performance to be extremely underwhelming. Any performance increase required a power increase compared to the 4000 series. Given how much more bandwidth GDDR7 has compared to GDDR6X, I felt that the performance gains should have been higher than what we got.
Perf increase too. Congrats, Nvidia fam.
Should have been there day 1.
I always found the 5000 series performance to be extremely underwhelming. Any performance increase required a power increase compared to the 4000 series. Given how much more bandwidth GDDR7 has compared to GDDR6X, I felt that the performance gains should have been higher than what we got.
Maybe they are now finally unlocking that performance.
RTX 50 owners are seeing significant performance boosts from this latest driver. 4-10%:
RTX 5070 TI Before and after latest driver update (huge increase)
Was Nvidia holding back on 5000 series performance?
I am pretty sure that monitor is not using DP 2.1.
Very few monitors have DP 2.1 and they usually carry a significant premium.
If you're happy with the 240 Hz performance, you won't see an improvement going from 1.4 w/ DSC to full 2.1.
I have a 240 Hz Alienware QD-OLED; I dropped the refresh down to 60 Hz (which doesn't require DSC) and compared it to 240 Hz (with DSC), and there is no difference. Literally none.
If you have a 5090 it's obvious you have a decent amount of income, but please do not spend extra money to get a 2.1 monitor.
For the record, I have had more issues with Samsung monitors than any other brand by far.
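For anyone wondering why 4K 240 Hz needs DSC while 4K 60 Hz doesn't, here is a rough back-of-the-envelope check. It's only a sketch: the 8% blanking allowance, the 10-bit RGB assumption, and the payload figures (~25.92 Gbit/s for DP 1.4 HBR3, ~77.4 Gbit/s for DP 2.1 UHBR20) are approximations, not exact CVT-R2 timing math.

```python
# Rough check: does an uncompressed video mode fit into the link payload?
HBR3_PAYLOAD_GBPS = 25.92    # approx. usable DP 1.4 (HBR3, 8b/10b) payload
UHBR20_PAYLOAD_GBPS = 77.4   # approx. usable DP 2.1 (UHBR20, 128b/132b) payload
BLANKING_OVERHEAD = 1.08     # crude allowance for blanking intervals

def required_gbps(width, height, refresh_hz, bits_per_pixel=30):
    """Approximate uncompressed bandwidth for a 10-bit RGB mode, in Gbit/s."""
    return width * height * refresh_hz * BLANKING_OVERHEAD * bits_per_pixel / 1e9

for hz in (60, 240):
    need = required_gbps(3840, 2160, hz)
    dp14 = "fits" if need <= HBR3_PAYLOAD_GBPS else "needs DSC"
    dp21 = "fits" if need <= UHBR20_PAYLOAD_GBPS else "needs DSC"
    print(f"4K {hz} Hz 10-bit: ~{need:.0f} Gbit/s -> DP 1.4: {dp14}, DP 2.1: {dp21}")
```

That lands at roughly 16 Gbit/s for 4K 60 Hz (no DSC needed over DP 1.4) versus roughly 65 Gbit/s for 4K 240 Hz, which is why 240 Hz forces DSC on DP 1.4 but would fit uncompressed on a full DP 2.1 UHBR20 link.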
DP 1.4 w/ DSC is fine. You're not really gaining much using DP 2.1. If I am being honest, I feel you are more likely to have issues trying to push 80 Gbps through a cable than just dealing with DSC, and I am willing to bet almost nobody could tell which was full bandwidth vs DSC.
Yeah, it says DisplayPort 1.4 on the specs. Not that I care really, but I was just wondering about a fix. So far the last update fixed the issue for me, so that's good.
Don't have any problem with the monitor to be honest. My brother got one as well and no issues either. All good, thanks.
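To put a number on how hard DSC actually has to work in that case, using the rough ~65 Gbit/s estimate above and the ~25.92 Gbit/s DP 1.4 payload (the ~3:1 figure DSC is typically configured for is my own assumption, not something from this thread):

```python
# Minimum compression DSC needs for 4K 240 Hz 10-bit over DP 1.4 (rough numbers)
uncompressed_gbps = 65.0
dp14_payload_gbps = 25.92
print(f"Needed: ~{uncompressed_gbps / dp14_payload_gbps:.1f}:1, vs the ~3:1 DSC is usually run at")
```

So the link is nowhere near the edge of what DSC is designed for, which fits the "almost nobody could tell" point.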
I agree, and this was the most disappointing thing about the 5000 series: increased performance for the same corresponding increase in power consumption.
I always found the 5000 series performance to be extremely underwhelming. Any performance increase required a power increase compared to the 4000 series. Given how much more bandwidth GDDR7 has compared to GDDR6X, I felt that the performance gains should have been higher than what we got.
Maybe they are now finally unlocking that performance.
That was always the most realistic view, but GDDR7 should have provided a bigger boost.
I agree, and this was the most disappointing thing about the 5000 series: increased performance for the same corresponding increase in power consumption.
However, I guess this was expected since the node process was the same.
Maybe the 6000 series can give us a similar '4000 like' leap in performance.
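On the "how much more bandwidth GDDR7 has" point, here's the quick math. The bus widths and per-pin data rates below are the commonly published specs, not figures from this thread, so treat them as assumptions worth double-checking:

```python
# Memory bandwidth (GB/s) = bus width in bits / 8 * per-pin data rate in Gbit/s
cards = {
    "RTX 4090 (GDDR6X)": (384, 21.0),
    "RTX 5090 (GDDR7)":  (512, 28.0),
    "RTX 4080 (GDDR6X)": (256, 22.4),
    "RTX 5080 (GDDR7)":  (256, 30.0),
}
for name, (bus_bits, gbit_per_pin) in cards.items():
    print(f"{name}: {bus_bits // 8 * gbit_per_pin:.0f} GB/s")
```

That works out to roughly +78% for the 5090 over the 4090 and about +34% for the 5080 over the 4080, a much bigger jump than the raster gains most reviews measured on the same node, which is the complaint here.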
Something interesting happened yesterday...
I went into my local Microcenter and they had several Zotac 5070 Tis for $799, so I bought one for use in my HTPC and will be returning my $750 9070XT to Amazon.
Why did I do such a thing?
One simple reason: nvidia GPUs allow me to pass my HDMI signal through my receiver. AMD GPUs (I have tried several) do not without disabling ALLM and VRR, which are compromises that are 100% unacceptable. No idea why AMD GPUs have this issue.
It's a shame, really, as the 9070XT is an outstanding GPU, and FSR4 (when I could use it) was truly revolutionary compared to FSR3.
Had I bought the 9070XT for $599 MSRP, would I have made this exchange? No. The 5070 Ti is a better GPU, but not at the $750 MSRP.
The main reason I bought the 9070XT was to use Bazzite/SteamOS, but I discovered after the fact that I could not use HDMI 2.1, and that basically forced me to go back to Windows, which completely nullified the biggest reason for buying an AMD GPU, in addition to not wanting to support nvidia. We will see how Linux support improves with nvidia.
On the bright side, the 9070XT I returned should eventually end up in the hands of a gamer.
I did have success at first using a DP-to-HDMI cable from Cable Matters. It worked and had all the bells and whistles, until it stopped working.
This is really disappointing and somewhat holds back my desire to use AMD/Linux. I saw that there is a way to get around it using a DisplayPort-to-HDMI adapter, but it seems that VRR doesn't work. One of the best features of OLED for me was VRR; it's a shame that it doesn't have DisplayPort, as that would solve a lot of problems.
On my office PC this is not a problem because I use a 4K 60 Hz monitor.
In the case of Nvidia/Linux, they just need to fix the performance issue in DX12, and recently a developer said they are looking into it.
I have a Gigabyte Aorus monitor that has DP 2.1 and I can't really say it matters much.
I am pretty sure that monitor is not using DP 2.1.
I had those issues at first and always used borderless. In recent months it seems like that issue has been fixed, as I haven't noticed it on my Alienware QD-OLED in quite some time.
I have a Gigabyte Aorus monitor that has DP 2.1 and I can't really say it matters much.
I no longer get those quick black screens when going out of a game in full screen mode (which only lasted about 1 sec anyway).
Great monitor though!
Tell us later if it worked, hopefully it did, bro.
Damn, sorry to hear that.
Nope, just did it again.
Finally! Was driving me crazy that there was a small hitch when playing at extremely high framerates on games like Cyberpunk.
- Stutter when using VSYNC [5202703][5202474]
- VSYNC in NVCP + frame generation causes issues in DLSS 4 games [5124816]
Eh, I think I still have these in Wukong and Shadows.
Finally! Was driving me crazy that there was a small hitch when playing at extremely high framerates on games like Cyberpunk.
No die shrink = no performance per watt gains. 60 series will have a die shrink and blow away the 5090.
I always found the 5000 series performance to be extremely underwhelming. Any performance increase required a power increase compared to the 4000 series. Given how much more bandwidth GDDR7 has compared to GDDR6X, I felt that the performance gains should have been higher than what we got.
Maybe they are now finally unlocking that performance.
It's great they've acknowledged something they originally fucked up. I mean seriously, how can these companies keep doing this? Paying money for fuck ups like this is unacceptable.
Hopefully this fixes much of the bullshit that has been present as of late.
Here is a LONG list of fixes. Despite what certain shills will tell you, it's nice to see nvidia basically acknowledging that the previous drivers sucked.
I will mention that I personally have not dealt with any issues on my 4090 using the previous drivers, but most of my gaming has been done on my living room PC with a 9070XT.
When you're making money hand over fist, your priorities shift. Nvidia could get sued for billions and lose and it would make little difference.
It's great they've acknowledged something they originally fucked up. I mean seriously, how can these companies keep doing this? Paying money for fuck ups like this is unacceptable.
If they're still doing GPUs at that point.
No die shrink = no performance per watt gains. 60 series will have a die shrink and blow away the 5090.
Black screen as in just no picture on the screen while Windows continues to run, or a full system crash?
4080, still black screening at random even with a clean wipe and a new DP cable (which I thought the problem was).
This driver fixed the flickering on my 5090 in Ass Creed when you climb a waypoint and the camera circles around your character.
I'm on the new hotfix driver with my RTX 3090 and haven't had any issues.
However, I've never seen nvidia release so many hotfix drivers as they have in 2025. Not sure what has changed over there...
OP may want to add the hotfix to his post.
NVIDIA Support
nvidia.custhelp.com
This hotfix addresses the following:
- [RTX 50 series] Some games may display shadow flicker/corruption after updating to GRD 576.02 [5231537]
- Lumion 2024 crashes on GeForce RTX 50 series graphics card when entering render mode [5232345]
- GPU monitoring utilities may stop reporting the GPU temperature after PC wakes from sleep [5231307]
- [RTX 50 series] Some games may crash while compiling shaders after updating to GRD 576.02 [5230492]
- [GeForce RTX 50 series notebook] Resume from Modern Standby can result in black screen [5204385]
- [RTX 50 series] SteamVR may display random V-SYNC micro-stutters when using multiple displays [5152246]
- [RTX 50 series] Lower idle GPU clock speeds after updating to GRD 576.02 [5232414]
Well, that explains why my GPU temperatures were so low.
GPU monitoring utilities may stop reporting the GPU temperature after PC wakes from sleep [5231307]
I have an issue that I have no idea is driver related, but I'll mention it here.
Doom Eternal's performance has absolutely tanked on my 5070 Ti. That game should be getting around 200 fps (no DLSS) with ray tracing enabled, but it's now dropped to around 60-70 fps and I cannot figure out why. I haven't tried reinstalling it. Turning off ray tracing brings it back up to 115 fps, which is likely due to low latency mode being enabled in the control panel and my OLED C1 being limited to 120 Hz.
The only thing that seems to restore in-game performance is to enable in-game V-Sync, which you aren't supposed to do if you want to take full advantage of G-Sync. I have my Nvidia control panel set to Ultra Low Latency Mode with V-Sync enabled, and I turn V-Sync off in game. Again, these are the recommended G-Sync settings.
Anybody have any suggestions? Doom Eternal has always been a game that has run GREAT and has never required any special tweaking, but this has me annoyed.
I will mention that Indiana Jones does not run well with low-latency mode enabled and I had to set special instructions in the control panel to disable it for that game.
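A possible explanation for the ~115 fps ceiling: community measurements (Blur Busters and others) suggest that when Ultra Low Latency Mode or Reflex is active together with G-Sync + V-Sync, the driver caps the frame rate slightly below refresh, roughly refresh minus refresh squared over 3600 (about 0.3 ms added per frame). That formula is an observed approximation, not something NVIDIA documents or anything stated in this thread:

```python
# Approximate frame cap applied with ULLM/Reflex + G-Sync + V-Sync (community-measured)
def ullm_cap(refresh_hz: float) -> float:
    return refresh_hz - refresh_hz * refresh_hz / 3600.0

for hz in (120, 144, 240):
    print(f"{hz} Hz -> ~{ullm_cap(hz):.0f} fps")
# 120 Hz -> ~116 fps, which lines up with the ~115 fps seen on the OLED C1,
# so that part at least looks like expected behavior rather than the regression.
```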
Holy fuck.... more bullshit
Download the hotfix NOW.
NVIDIA Pushes Out New GeForce Hotfix Driver 576.15, Addressing Display Crashing & GPU Temperature Sensor Issues
NVIDIA has released a new hotfix for its GeForce driver, attempting to solve the "troublesome" display issues and a GPU temperature bug.
wccftech.com
Nvidia should pull 576.02 and just make 576.15 the official one....assuming of course this hotfix doesn't make things worse.
576.15 didn't fix it?
I have this problem even with the hotfix installed. The only way to fix the GPU temp bug is to restart.
The game does not have any singular profile settings. It is 100% using global settings.
Remember to check the global profile / app profile for Doom, just in case it was set.
576.15 didn't fix it?
They've been shit with this year's updates so far.
I think it worked for me. I set my sleep time to 1 min to test it, then left it for 10 mins, and the temperature seems to be going up and down in Task Manager. Will test Afterburner later.
Yeah, at least not for me. I have to check MSI Afterburner before I play any game to see if the GPU temp monitor is working, because I use a custom fan curve.
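If you'd rather not eyeball Task Manager or Afterburner after every wake from sleep, a small polling script can flag when the sensor stops reporting. A minimal sketch, assuming nvidia-smi is on the PATH; the 5-second interval and the "value hasn't moved" heuristic are my own choices, not anything from the thread:

```python
import subprocess
import time

def gpu_temp_c():
    """Return the GPU temperature in C, or None if the sensor isn't reporting."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=temperature.gpu", "--format=csv,noheader,nounits"],
        capture_output=True, text=True,
    )
    value = out.stdout.strip().splitlines()[0] if out.stdout.strip() else ""
    return int(value) if value.isdigit() else None

last, unchanged = None, 0
while True:
    temp = gpu_temp_c()
    if temp is None:
        print("GPU temperature not reported -- sensor bug likely hit, a restart may be needed")
    else:
        unchanged = unchanged + 1 if temp == last else 0
        note = "  (value hasn't moved in a while, worth double-checking)" if unchanged >= 12 else ""
        print(f"GPU temp: {temp} C{note}")
    last = temp
    time.sleep(5)
```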