nemiroff
Gold Member
> get ready for 6x the price as well.

Upgrading the DLSS version doesn't cost anything.
> And work on an RTX 3060?

Yes, and Switch 2.
> I can't wait for the AI bubble to burst and bring Nvidia back to earth again.

The AI bubble bursting would mean Nvidia only needs to supply a few companies instead of many. The genie's not going back in the bottle on this one.
With them bringing back the 3060, I wouldn't be surprised if they figured out a way to introduce framegen to 30 series cards, especially now with the whole market jacked up.

Dang, I missed that. It's been thought you need a 1000Hz display to match the motion clarity of CRTs. We're about to find out.
No, there's additional hardware on the RTX 4000 cards that the 3000 cards are missing.
> But 6x frame generation is just stupid.

No, it's not "stupid". We have 720Hz and even 1000Hz monitors incoming this year. There's no way to get anywhere close to those max refresh rates without FG. If you're at 100 fps or more, 6x FG would put you into 500+ fps territory with good input lag and very low persistence blur.
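Back-of-envelope numbers for the claim above: displayed fps is just the rendered fps times the frame-gen factor, capped by the panel's refresh rate. The helper name is made up for illustration, not an NVIDIA API.

```python
# Rough sketch of the multiplier math: base (rendered) fps times the
# frame-gen factor, capped by what the display can actually show.
# fg_output_fps is a hypothetical helper, not part of any real SDK.
def fg_output_fps(base_fps, multiplier, refresh_hz):
    return min(base_fps * multiplier, refresh_hz)

print(fg_output_fps(100, 6, 1000))  # 600 -> the "500+ territory" on a 1000Hz panel
print(fg_output_fps(100, 6, 500))   # 500 -> capped by a 500Hz display
```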
When is it coming out? After the announcement, or later this year?
> My Google search yielded:
>
> NVIDIA is expected to provide full demos and video comparisons on Tuesday, January 6, 2026, during their CES presentation.
>
> DLSS 4.5 Super Resolution release date: expected to be available immediately via an update to the NVIDIA App.
>
> Dynamic Multi Frame Generation (6x FG) release date: scheduled for spring.

Holy, so DLSS 4.5 tomorrow?? I'm excited.
> Holy, so DLSS 4.5 tomorrow?? I'm excited.
>
> When is it coming out? After the announcement, or later this year?

It's possible. They've been quiet on DLSS updates since August (so they were presumably working on something new), and they have no real reason to announce anything in advance since they're in the lead anyway.
I wonder if the cost of 4.5 will be another 5-10% depending on the game, though.
> Good for Nintendo.

I don't think it has any effect whatsoever on Nintendo.
> I want to test it with RDR2.

Tomorrow it will be available.
> I don't think it has any effect whatsoever on Nintendo.

Isn't the NSW2 using DLSS?
> Isn't the NSW2 using DLSS?

DLSS 3, not 4, so any transformer model updates won't mean much.
> Yup, anyone thinking that Nvidia is coming back with its tail between its legs when there's an arms race between the USA and China to achieve AGI is completely oblivious to the world. I wish I still had that kind of naivety.
>
> A few AI companies might go out of business by failing to develop competitive models, but any silicon wafer made in the next 10 years is going to have a huge part of the pie going to AI.
>
> This is the Manhattan Project plus the space race wrapped into one, and then some: an arms race to achieve the first AGI. They ain't lifting the foot off the pedal.

They will eventually. You can only spend so much money and time on something before giving up if you don't see results. And AGI anytime soon is still extremely unlikely; the scientific consensus is pretty clear on this. LLM research especially appears more and more likely to be a dead end.
> It's backwards compatible with RTX cards?

6x framegen? Only with the 50 series.
> DLSS 3, not 4, so any transformer model updates won't mean much.

Doesn't it support 20 series cards?
> Yeah, but Switch 2 is the smallest and least powerful Nvidia card with tensor and RT cores. It doesn't have enough ML power for the transformer model.

What I mean is, the DLSS in the NSW2 will always be updated, so it won't be shocking if some of these features come to the NSW2 in the future.
> The CNN model stopped being updated after version 3.8. Nvidia made a "tiny DLSS" for SW2 (lower quality); maybe they will improve it in the future, but I doubt anything close to DLSS 4 will happen on this hardware.

Let's see how things turn out near the end of the NSW2 generation.
6x is dumb, but the thing being able to dynamically slip in frames (if I understand correctly) is pretty cool. That's what I thought FG was going to do from the start.
> 6x is dumb, but the thing being able to dynamically slip in frames (if I understand correctly) is pretty cool. That's what I thought FG was going to do from the start.

Are you thinking of interpolation between two finished frames (like often on traditional TVs)? That's never been DLSS FG.

DLSS FG basically uses pixels + depth buffer + motion vectors, plus Reflex low-latency syncing.

And DLSS 4.5 hopefully uses the second-gen transformer model to neutralize some of the shimmer and ghosting that often result from pushing FG tech too far.

> NVIDIA also says users will be able to set a target FPS in the app, up to the display's maximum refresh rate, with the feature dynamically adjusting how many frames are generated to meet that goal.

That's the only thing that interests me with frame gen. If it can dynamically adjust the frame gen to only add enough generated frames to max my refresh rate and no more, maybe that could keep the input latency hit lower than picking a static amount?
> They will eventually. You can only spend so much money and time on something before giving up if you don't see results. And AGI anytime soon is still extremely unlikely; the scientific consensus is pretty clear on this. LLM research especially appears more and more likely to be a dead end. If a decade passes and nobody can come close to AGI, well, expect countries and companies to lose interest.

Jensen's keynote today at CES made it clear he's all in on this AI revolution. The way it looks, if no one achieves AGI, he'll just end up doing it himself anyway and making it open source.
MFG is the future. I know some may not like it, but there's not much you or we can do about it. Next-gen consoles and games (bigger ones at least) will rely on generated frames instead of raw power. It is what it is; best to just accept it, I guess. I don't think the AI bubble will burst any time soon; it's much different from crypto this time around.
I was hoping LG would show off 240Hz OLED TVs at CES, but they're still 165Hz like the previous gen.

Big TVs with high refresh rates will be so good with FG.
> We are getting 500Hz screens now, so 6x makes sense.

Can a 500Hz screen do something more than a 240Hz monitor can, though?
> Can a 500Hz screen do something more than a 240Hz monitor can, though?

Lower persistence blur, smoother-looking image.
> The higher the refresh rate/framerate, the less noticeable the persistence blur that a sample-and-hold screen inherently has.
>
> You can also, of course, use black frame insertion or backlight strobing to reduce the blur, but that will also reduce your screen brightness.

Yeah, I guess with flat-panel monitors you can't have enough Hz. Old CRT monitors had less blur and lag than today's monitors.
> I mean, a 500Hz monitor has far less lag than any CRT ever produced.
>
> Even at 60Hz we are getting insanely close to CRT latency now; TVs from Samsung and LG are around 2ms away from CRT-level lag. And if you run your Xbox at 120Hz while playing a 60fps game on a modern Samsung TV, you get less lag than if you played it on a 60Hz CRT.
>
> The lag issue is mostly resolved... unless you have a Sony TV, lol.
>
> But yes, we need far higher refresh rates to get motion clarity similar to CRTs. The guy who runs Blur Busters thinks above 1000Hz is needed to exceed CRT clarity without black frame insertion or strobing.
>
> And DLSS framegen is not perfect, but it's preferable to persistence blur.

Aha, interesting. I didn't know that modern monitors are that close to a CRT technically, or even better.
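A rough way to see the motion-clarity numbers in this exchange: on a sample-and-hold display, eye-tracked blur is approximately scroll speed times the time each frame is held on screen. The 4000 px/s pan speed and the ~1ms CRT phosphor hold are my own illustrative assumptions.

```python
# Approximate eye-tracked blur on a sample-and-hold display: the image is
# held for a full refresh interval, so a tracked object smears by
# speed * hold_time. A very rough model, not an industry formula.
def persistence_blur_px(speed_px_per_s, refresh_hz):
    return speed_px_per_s / refresh_hz

# For a fast 4000 px/s pan, blur shrinks linearly with refresh rate:
for hz in (60, 240, 500, 1000):
    print(f"{hz}Hz: ~{persistence_blur_px(4000, hz):.1f} px of blur")
# A CRT holding the image for roughly 1ms smears the same pan by only ~4 px,
# which is why figures like "1000Hz-plus" keep coming up for matching CRTs
# without black frame insertion or strobing.
```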