I would buy the deluxe right now but nobody here wants to save money and buy my discounted standard edition.
So close.
"Fake frames" - this term is used randomly by people who never actually tried FG in practice. It doesn't make sense at all.I didn't. I am using 3080.
It is doubled fake frames with artifacts + added input latency.
It is all measured and factual. I don't have any doubts it feels fine and you can't see artifacts but we are talking performance comparisons here. Real fps and not this
> No DualSense support, or even PS controller support
Bummer. There goes my hope of adaptive trigger support. I really hate it when PC devs don't care, although it's already implemented somewhere in the game thanks to its PS5 release.
Bunch of rich fucks...
The most annoying thing is that voidu doesn't let me cancel my preorder for whatever reason, so I'm stuck with the purchase even though the key hasn't been released yet.
> The most annoying thing is that voidu doesn't let me cancel my preorder for whatever reason, so I'm stuck with the purchase even though the key hasn't been released yet.
> Not even sure if they can legally do that, but I have no option to ask for a refund.
Have you received the full key? If so, that's probably why. They'd be concerned about issuing the refund and then the key being redeemed after the fact. Which makes sense, tbh.
> "Fake frames" - this term is used randomly by people who never actually tried FG in practice. It doesn't make sense at all.
> It's all software, it's not like you're "buying shady frames" from your obscure "local FPS retailer".
> If the perception is of increased smoothness and if the latency increase is negligible (as confirmed by many people who use the feature in its latest version), does it matter? Serious question. It's like saying that DLSS is "fake resolution" and thus shouldn't be used.
I know perfectly well how it works. I've watched the DF video. It's not some black magic.
These are generated additional frames based on the previous and a FUTURE frame. It adds input latency. It has to.
And it introduces artifacts. These are all facts.
Comparing it to DLSS is completely nonsensical. It should not be called DLSS 3. It's inferring new frames, while DLSS 2 is just temporal AA.
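To put rough numbers on the "it has to add latency" point: a back-of-the-envelope sketch, assuming the interpolator has to buffer one real frame before it can blend, and assuming an invented generation cost; none of these values are measurements from this game.

```python
# Toy model of interpolation latency: a real frame can only be displayed
# after its successor has been rendered, so it waits roughly one real frame
# interval, plus the time spent generating the in-between frame.
# Both numbers below are illustrative assumptions, not measurements.

def added_latency_ms(real_fps: float, generation_cost_ms: float = 1.5) -> float:
    frame_time = 1000.0 / real_fps   # interval between real frames
    return frame_time + generation_cost_ms

for fps in (30, 60, 120):
    print(f"{fps:>3} real fps -> ~{added_latency_ms(fps):.1f} ms added latency")
```

Note how the penalty shrinks as the base framerate rises, which is consistent with people reporting that it's hard to feel at high fps.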
> I know perfectly well how it works. I've watched the DF video. It's not some black magic.
> These are generated additional frames based on the previous and a FUTURE frame. It adds input latency. It has to.
> And it introduces artifacts. These are all facts.
> Comparing it to DLSS is completely nonsensical. It should not be called DLSS 3. It's inferring new frames, while DLSS 2 is just temporal AA.
As someone who has used it in several games, I consider the latency penalty not noticeable. I would specifically report it if it were there, because it's important to tell the truth to others with similar hardware. That's why we have these discussion groups: to see what hardware we have and what impact it has on our performance, including which drivers we're using. That doesn't seem to be as much of an issue here as the little engine issues the game has.
And this is while recognizing that the studio doesn't exactly make these kinds of AAA games all the time, and they've done a pretty damn good job considering that fact.
DLSS 3, if you want to be technical, is at a different version number if you look at the dll files. But frame generation altogether is optional. As others have said, even on Digital Foundry's weekly show, it has gotten better, especially since the preview days before consumers had their hands on the release software.
If A Plague Tale: Innocence, Cyberpunk, The Witcher, and this game are any indicators, then we are in for a treat. This is early days for this type of technology, and these techniques are only going to get more iterative as we try to get more out of our hardware.
"Fake frames" or whatever terminology people want to use to joke, or whatever their motive is, there are those of us who genuinely want to know how it impacts our games, or just want to see the progression of a new technology. This is all pretty damn exciting; it hasn't been this exciting since the early days of polygon graphics. Sorry for the long rant.
Btw, using DLSS doesn't improve VRAM usage because it has no effect on the textures.
> If you've got your hardware, then just get it set up correctly, try it, and form your own opinion.
> Too many jealous cunts chatting shit about it when they don't know what they're talking about.
Is it the same as when some PC gamers were screaming how perfect DLSS was when a few people said there were issues and native was still better… but then DLSS 2 came out, showing how horrible DLSS 1's artifacts actually were?
By "few" people do you mean the majority? DLSS 1 was widely reviled by the PC community.Is it the same as when some pc gamers were screaming how perfect dlss was when a few people said there were issues and native was still better …
but then dlss2 came out showing how horrible artifact dlss actually was?
> Is it the same as when some PC gamers were screaming how perfect DLSS was when a few people said there were issues and native was still better…
> but then DLSS 2 came out, showing how horrible DLSS 1's artifacts actually were?
Like anything else, I agree that the first generation is oftentimes not the most refined. I'm glad we are where we are now, though.
> By "a few" people, do you mean the majority? DLSS 1 was widely reviled by the PC community.
I agree, and although there were some benefits, it clearly wasn't ready for prime time and was more of a beta compared to what the second version and beyond has done. So much so that hopefully one day we will be able to do some of our own injecting and implementation in older games. I realize it requires a little more training to let the AI do its thing, but as fast as these things are advancing, I think that possibility isn't out of the question.
> Have you received the full key? If so, that's probably why. They'd be concerned about issuing the refund and then the key being redeemed after the fact. Which makes sense, tbh.
> If you haven't received the key? Then they really should give you a refund.
No key; they are gonna show the key on release date.
I'm pretty sure this guy on Twitter has no idea wtf he's talking about regarding VRAM usage lmao.
You do realize that upscalers use a negative LOD bias for textures?
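A minimal sketch of that LOD-bias point, assuming the commonly cited guideline bias = log2(render_width / display_width) and the usual 4K mode scaling factors; both are assumptions here, not values pulled from this game.

```python
import math

# A negative mip bias makes the sampler pick sharper (higher resolution)
# mips, which is why upscaling does not simply cut texture memory use:
# textures are still sampled at close to their native-res mip levels.

def mip_lod_bias(render_width: int, display_width: int) -> float:
    return math.log2(render_width / display_width)

for mode, render_w in (("Quality", 2560), ("Balanced", 2227), ("Performance", 1920)):
    print(f"{mode:>11} at 4K: {render_w}px wide -> bias {mip_lod_bias(render_w, 3840):+.2f}")
```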
> But since all DLSS 3 games add Reflex, it has less system latency than even native.
Reflex has got its own issues, but never mind. Enjoy the feature.
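A toy latency budget for that Reflex claim. Every number here is invented purely for illustration, and the model (half a frame of input sampling plus a render queue) is a simplification; the only point is that shrinking the render queue can outweigh the frame-generation penalty.

```python
# Made-up latency budget: Reflex's main effect is cutting the render queue.

def system_latency_ms(render_queue_ms: float, frame_time_ms: float,
                      fg_penalty_ms: float = 0.0) -> float:
    input_wait = frame_time_ms / 2          # average wait for the next sim tick
    return input_wait + render_queue_ms + frame_time_ms + fg_penalty_ms

native_no_reflex = system_latency_ms(render_queue_ms=20.0, frame_time_ms=16.7)
fg_plus_reflex   = system_latency_ms(render_queue_ms=5.0,  frame_time_ms=16.7,
                                     fg_penalty_ms=10.0)
print(f"native, no Reflex: ~{native_no_reflex:.0f} ms")
print(f"FG + Reflex:       ~{fg_plus_reflex:.0f} ms")
```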
The artifacts are being worked on. I'm pretty sure a fix is imminent; hell, that 3.1 version might be it. They made a video for The Witcher 3 and Cyberpunk 2077, and the main artifacts on the UI and so on are gone.
> I know perfectly well how it works. I've watched the DF video. It's not some black magic.
> These are generated additional frames based on the previous and a FUTURE frame. It adds input latency. It has to.
> And it introduces artifacts. These are all facts.
> Comparing it to DLSS is completely nonsensical. It should not be called DLSS 3. It's inferring new frames, while DLSS 2 is just temporal AA.
DLSS is not just temporal AA; that's TAA. DLSS uses ML to output higher resolution frames from a lower resolution input, using motion data and feedback from prior frames to reconstruct native quality images.
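A runnable toy of the reconstruction loop described above, with numpy stand-ins: `np.roll` plays the role of motion-vector reprojection and a fixed blend weight stands in for the neural network. Everything here is an illustrative assumption, not NVIDIA's implementation.

```python
import numpy as np

# Temporal upscaling in miniature: warp last frame's high-res output to the
# current frame, then blend it with the upsampled new low-res samples. Over
# several frames the history accumulates detail the single low-res frame lacks.

def upscale_frame(low_res: np.ndarray, motion_px: tuple, history: np.ndarray) -> np.ndarray:
    upsampled = low_res.repeat(2, axis=0).repeat(2, axis=1)   # naive 2x upscale
    warped = np.roll(history, shift=motion_px, axis=(0, 1))   # crude reprojection
    return 0.1 * upsampled + 0.9 * warped                     # history-heavy blend

history = np.zeros((8, 8))                # no accumulated detail yet
target = np.ones((4, 4))                  # the "true" scene, rendered at low res
for _ in range(30):
    history = upscale_frame(target, motion_px=(0, 0), history=history)
print(history.mean().round(3))            # converges toward 1.0 across frames
```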
I'm talking about DLSS not improving VRAM usage. Unless DLSS is somehow broken in the game, that's not true in any case, ever.
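Some rough arithmetic on why both sides of this VRAM argument have a point: internal render targets scale with the render resolution, while textures keep their full-res mips. The 40 bytes-per-pixel figure below is an invented stand-in for a game's combined internal targets, not a measurement from this game.

```python
# Render-target side of the VRAM argument, with illustrative numbers only.

def render_targets_mb(width: int, height: int, bytes_per_px: int = 40) -> float:
    # e.g. color + normals + velocity + depth targets adding up to ~40 B/px
    return width * height * bytes_per_px / 2**20

print(f"internal targets at native 4K: ~{render_targets_mb(3840, 2160):.0f} MB")
print(f"internal targets at 1080p (4K DLSS Performance input): "
      f"~{render_targets_mb(1920, 1080):.0f} MB")
# Render targets shrink with internal resolution, but textures keep their
# full-res mips because of the negative LOD bias mentioned earlier, so the
# total VRAM saving is smaller than this difference suggests.
```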
> DLSS is not just temporal AA; that's TAA. DLSS uses ML to output higher resolution frames from a lower resolution input, using motion data and feedback from prior frames to reconstruct native quality images.
> As for DLSS 3 (or frame generation, since you don't like Nvidia's naming convention, and to be honest neither do I), I'm glad you know how it works. It seems, though, that you'll only believe that the latency impact is imperceptible by trying it out, so give it a go when you can (especially the most recent versions).
I will be happy to try it for sure if I get a 40-series card in the future.
I can't test it since it's not available on the 3080... I feel kinda cheated, ugh.
> Runs fine for me at 1440p Ultra settings but no ray tracing. Hitting it with a 3080, 12700K, 16GB DDR4, and a fast 4.0 NVMe.
> Only issue for me is that despite following a few guides, I cannot get the DualSense features to work, which is really frustrating.
Ugh, that's painful.
> I know perfectly well how it works. I've watched the DF video. It's not some black magic.
> These are generated additional frames based on the previous and a FUTURE frame. It adds input latency. It has to.
> And it introduces artifacts. These are all facts.
> Comparing it to DLSS is completely nonsensical. It should not be called DLSS 3. It's inferring new frames, while DLSS 2 is just temporal AA.
I'm sorry, but you have no idea what you are talking about.
Yesterday I tried FG on/off several times and I could not tell any difference in input lag.
And there are no "artifacts" or anything. It just feels like your fps is boosted by 40% without any negative consequences.
Incredible technology.
> I'm sorry, but you have no idea what you are talking about.
> Yesterday I tried FG on/off several times and I could not tell any difference in input lag.
> And there are no "artifacts" or anything. It just feels like your fps is boosted by 40% without any negative consequences.
> Incredible technology.
It's a fact that it adds latency. But just like wireless controllers adding latency over wired, a lot of people won't notice. They will just wonder why they suck so bad at Super Mario Bros on Switch now compared to the past.
If true, RIP 4070ti
> It's a fact that it adds latency. But just like wireless controllers adding latency over wired, a lot of people won't notice. They will just wonder why they suck so bad at Super Mario Bros on Switch now compared to the past.
Isn't a wireless mouse faster than a wired one in some cases?
> Isn't a wireless mouse faster than a wired one in some cases?
There are some wireless mice which are as fast as or even faster than wired mice, like my Razer Viper Ultimate.
> There are some wireless mice which are as fast as or even faster than wired mice, like my Razer Viper Ultimate.
That's the same one I have! It's a pretty good mouse.
> I'm glad no one is complaining about shader comp stutter. The public lambasting of Callisto Protocol had a positive effect.
I've not seen any shader comp stutter. That's almost certainly down to them being compiled before you get to the first menu. The compilation took about 30 seconds the first time I booted the game, but now it's more like 10 on subsequent launches.
> I've not seen any shader comp stutter. That's almost certainly down to them being compiled before you get to the first menu. The compilation took about 30 seconds the first time I booted the game, but now it's more like 10 on subsequent launches.
The only stuttering I get seems to be after/before a cutscene, at a loading screen (the first time you fast travel), and when entering a new area. I was running the game with the MSI Afterburner overlay and looking out for frame drops. Once I'm playing and focusing on the game I won't give a shit, because I'm having too much damn fun.
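A toy of the compile-once-then-cache pattern being described: pay the full shader compilation cost up front on first boot, then reuse cached results on later boots. In-memory memoization here is just a stand-in for a real on-disk pipeline cache; the shader names and timings are made up.

```python
import functools
import time

# First boot pays the full cost before the menu; later boots hit the cache,
# which is why the 30-second compile drops to ~10 seconds on relaunch.

@functools.lru_cache(maxsize=None)
def compile_shader(name: str) -> str:
    time.sleep(0.005)                 # stand-in for real compilation work
    return f"binary({name})"

def boot(shader_names: list) -> float:
    start = time.perf_counter()
    for name in shader_names:         # done up front, not mid-gameplay
        compile_shader(name)
    return time.perf_counter() - start

shaders = [f"material_{i}" for i in range(200)]
print(f"first boot:  {boot(shaders):.2f}s")
print(f"second boot: {boot(shaders):.2f}s (cache hits)")
```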
> There are some wireless mice which are as fast as or even faster than wired mice, like my Razer Viper Ultimate.
It's not the same as an NES controller directly attached to the board.
> Hopefully nvidia inspector updates with the game profile in the incoming new drivers so we can enable Resizable BAR on the game's profile. Might bump the frames up like many other games.
Can't you just create a bespoke profile for it? Or just use any other game profile you don't have installed and add the .exe to the profile, then enable the ReBar bits.
> Hmm, my 0.1% lows are not good (1080p, low, 6600 XT). I am running a 3770K though.
> Also, VSR resolutions don't show up - wtf.
You have a 3770K and a 6600 XT?
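For anyone comparing numbers: a sketch of how "1% low" and "0.1% low" FPS are typically derived from a frametime log. Exact formulas vary by tool (MSI Afterburner, CapFrameX, etc.), so treat this as the general percentile idea rather than any specific overlay's method.

```python
# Average the slowest slice of frames, then convert back to FPS. A CPU
# bottleneck (e.g. an old 3770K) shows up here long before it hurts the average.

def low_fps(frametimes_ms: list, fraction: float) -> float:
    ordered = sorted(frametimes_ms)                  # slowest frames at the end
    worst = ordered[int(len(ordered) * (1 - fraction)):]
    return 1000.0 / (sum(worst) / len(worst))

frametimes = [16.7] * 990 + [50.0] * 10              # steady 60 fps plus spikes
print(f"average:  {1000.0 / (sum(frametimes) / len(frametimes)):.1f} fps")
print(f"1% low:   {low_fps(frametimes, 0.01):.1f} fps")
print(f"0.1% low: {low_fps(frametimes, 0.001):.1f} fps")
```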
> Is there any difference between 16GB and 32GB RAM performance? It seems to be using 16GB of RAM in general.
I have 32GB of RAM and the game has used up to 16-17GB. I'm playing at 1440p, though. In some 4K gameplay I've seen 19-20GB of RAM used.