
New Lossless Scaling 2.9 update introduces 3x Frame Interpolation

I actually got this on Steam out of curiosity, even though I don't need it since I game on a 4090, and it works pretty well.

My main issue with it as with framegen technology in general is input lag.


I used it on Star Citizen, which can be a very laggy, unoptimized mess depending on your location, and it made everything look smooth, but the latency was horrendous. I had to go back to rendering everything natively because I couldn't stand the lag. I can see how this software would be good for those with very weak hardware though. Wish I had this tech back in the day when I used to game on potato PCs.
 
Last edited:
The most annoying thing is people casually equating generated frames to real frames. Generated frames mean nothing. They have no impact on the game's input, physics, simulation, etc. It's just annoying, and I wish this trend were dead and buried. It's like AI-assisted interpolation. Just trash.
Because most people (me included) don't give a shit about "ultra low latency". For me 60fps is perfect in responsiveness; even 30fps might be responsive enough if well implemented (I beat Sekiro, a timing-based game, on PS4 at 30fps). But I care a LOT about smoothness, and in that regard these new frame generation technologies are just magic.

You don't like them? Don't use them, simple as that.
 

hussar16

Member
I want to go back to actually good picture clarity; if the display were a CRT, this would never be a problem. All this adds is soap opera effect and smoothing. A good CRT at 20fps looks better in motion than this at 60.
 
Last edited:

Dorfdad

Gold Member
Do you have to manually tweak this every time you game, or can you just have it auto-start with the PC? Thinking this might be a solution for a headless PC!?
 

T4keD0wN

Member
The most annoying thing is people casually equating generated frames to real frames. Generated frames mean nothing. They have no impact on the game's input, physics, simulation, etc. It's just annoying, and I wish this trend were dead and buried. It's like AI-assisted interpolation. Just trash.
You're acting like AI interpolation such as DLSS and the average TV interpolation that was the trend a decade ago are equal; they aren't.

Obviously generated frames shouldn't be considered the same as real frames, and you need enough real ones as a prerequisite to generate the fake ones, but I actually prefer having fake frames in the mix, since it's impossible to tell FG is on: in real time you're looking at real and generated frames in conjunction, not capturing a video and stopping on one specific AI-generated frame.

There are plenty of benefits to using FG. Even with enough power to max out the monitor's refresh rate without frame gen, I will still take:
180W, low temps, low CPU usage and passive cooling for 70 real frames plus 70 fake ones with better 1% lows
over
300W, high thermals, medium CPU usage and active cooling for 140 real frames with worse 1% lows and possible CPU bottlenecks.

(Made-up numbers, to be clear, but not far from real scenarios with DLSS; not sure about Lossless Scaling since I barely use it.)
 
Last edited:
Holy cow, I can use DLSS FG x2 and then Lossless Scaling FG x4 on top of that... and it works 😃👌, just don't ask me about input latency 😂😂.

Thanks to DLSS FGx2, performance went from around 60fps to 106fps with Path Tracing.

20250109-192659.jpg


And x4 Lossless Scaling on top of that. My base fps went from 106fps to 65fps (almost the same as if I hadn't enabled DLSS FG x2 🤣), but Lossless Scaling still generated 261fps 😃😂👌 (65/261 in the upper left corner).

20250109-192505.jpg
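For what it's worth, the stacking arithmetic in this post checks out. A tiny sketch (hypothetical helper in Python, not part of either tool) of how chained frame generators multiply whatever framerate they actually receive:

```python
# Sanity-checking the stacked frame-gen numbers from the post above
# (hypothetical helper): each generator multiplies the framerate it
# receives; the generator's own GPU cost shows up as a lower measured
# base, which the caller passes in directly.

def stacked_output(base_fps: float, multipliers: list[float]) -> float:
    """Final displayed fps after chaining frame generators."""
    fps = base_fps
    for m in multipliers:
        fps *= m
    return fps

# The post's measurements: enabling Lossless Scaling x4 on top of DLSS FG
# dropped the real base to 65 fps, and 65 x 4 = 260, within rounding of
# the 261 fps shown in the overlay.
print(stacked_output(65, [4]))   # 260
```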
 
Last edited:

Three

Member
You're acting like AI interpolation such as DLSS and the average TV interpolation that was the trend a decade ago are equal; they aren't.

Obviously generated frames shouldn't be considered the same as real frames, and you need enough real ones as a prerequisite to generate the fake ones, but I actually prefer having fake frames in the mix, since it's impossible to tell FG is on: in real time you're looking at real and generated frames in conjunction, not capturing a video and stopping on one specific AI-generated frame.
Motion interpolation in TVs and framegen are near identical bar where the processing is done. Framegen is like MotionFlow 120Hz and multi-frame gen is like MotionFlow 240Hz. Sure, the algorithms have improved from a decade ago, but they're the same thing and move on near-identical timelines.
 

T4keD0wN

Member
Motion interpolation in TVs and framegen are near identical bar where the processing is done. Framegen is like MotionFlow 120Hz and multi-frame gen is like MotionFlow 240Hz. Sure, the algorithms have improved from a decade ago, but they're the same thing and move on near-identical timelines.
I have no idea how either works on a technical level; I just know that one looks good and has many upsides with some small downsides, whereas the other one looks just plain bad (and "bad" might be putting it lightly).

The tech/principles might be similar, but the quality of what's being output is nowhere near comparable.

Easy to spot interpolation on a TV immediately, because it looks and feels terrible, not to mention the input is just 24fps. I can't tell at all with DLSS, or even FSR 3.1 for that matter, which start and end at much higher framerates.

Don't care how they get there; quality-wise it's different. It's like switching from a broken-down base model of a 20-year-old Fiat (no hate to Fiat) to a brand new Porsche. Nvidia and all these other companies should sod off with trying to pass FG off as real performance; it's super dishonest, especially towards less informed buyers.
 
Last edited:

YeulEmeralda

Linux User
I want to go back to actually good picture clarity; if the display were a CRT, this would never be a problem. All this adds is soap opera effect and smoothing. A good CRT at 20fps looks better in motion than this at 60.
Playing videogames on a tiny screen never looks good.
 

Three

Member
Easy to spot interpolation on a TV immediately, because it looks and feels terrible, not to mention the input is just 24fps. I can't tell at all with DLSS, or even FSR 3.1 for that matter, which start and end at much higher framerates.
The input isn't just 24fps, I'm not sure where you're getting that from. It's up to 60fps with MotionFlow 120Hz and 240Hz, meaning it adds one frame in between each pair for 120fps, and three frames in between for 240fps, from a 60fps source. The quality is comparable, with the same artifacts as framegen. It's very, very similar tech; only where the processing is done changes. It's all "fake frames".
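The inserted-frame counts in this post follow directly from the ratio of target refresh to source framerate. A minimal sketch (hypothetical helper, not from any vendor SDK):

```python
# Rough arithmetic behind the MotionFlow comparison above (illustrative
# only): interpolation inserts (target_hz / source_fps - 1) generated
# frames between each pair of real frames.

def inserted_frames(source_fps: int, target_hz: int) -> int:
    """Generated frames inserted between consecutive real frames."""
    if target_hz % source_fps != 0:
        raise ValueError("target refresh must be a multiple of the source fps")
    return target_hz // source_fps - 1

print(inserted_frames(60, 120))  # 1 — one generated frame per real pair
print(inserted_frames(60, 240))  # 3 — three generated frames per real pair
```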
 
Last edited:

CrustyBritches

Gold Member
Looking forward to testing out the new update tomorrow. Gotta be careful using x3 or x4, it can cause motion sickness in some games.

Sometimes I use it on anime. Wouldn't say it's "better", but it's cool to fuck around with.
 

Holammer

Member
What does that do to emulators?

8/16-bit games run mostly at 60fps, bumping them to a 120fps interpolated mode, that's interesting!
Tried it with 16-bit games and arcades. It looks cool, but it's not very useful: one frame of lag and you start missing jumps in Mario games.
It's best used for PS2-era games and onward, where controls and timing got more forgiving, with less pixel-perfect timing required.

I recently played through Mario Sunshine to completion with a 60fps patch, x2 LS and HD textures. Looked and played superbly.
 

T4keD0wN

Member
the input isn't just 24fps, I'm not sure where you're getting that from
The input has a variable range, but when all the content is 24fps it doesn't matter; movies and TV shows are pretty much all 24fps.

TruMotion, or whatever each manufacturer calls it, is a no-go when the TV is in "game mode" (for response time/latency reasons), so effectively the only time you'll be able to make use of the TV's native interpolation is for movies/TV shows, which are at 24fps (the one exception being sports).

When you render and interpolate games on a GPU, you get the benefits of Reflex/Anti-Lag and UI masking. TVs just interpolate "playback" in a sense, and when it comes to games they have no way to access TAA data.
 
Last edited:

Hugare

Member
Motion interpolation in TVs and framegen are near identical bar where the processing is done. Framegen is like MotionFlow 120Hz and multi-frame gen is like MotionFlow 240Hz. Sure, the algorithms have improved from a decade ago, but they're the same thing and move on near-identical timelines.
I can tell right away that you haven't experienced framegen using this app if you are really comparing it to TV interpolation techniques.

What matters most in framegen is latency. And no TV will offer latency as good as this app using your GPU.
 

Three

Member
I can tell right away that you haven't experienced framegen using this app if you are really comparing it to TV interpolation techniques.

What matters most in framegen is latency. And no TV will offer latency as good as this app using your GPU.
This Lossless Scaling app, no, but framegen in general, of course I have.
 
This Lossless Scaling app, no, but framegen in general, of course I have.
It's pretty damn good, especially in games where the movement is relatively smooth. I would never recommend it for competitive FPS, but for flight simulators or more story-heavy games it's mostly upsides and very few downsides. I'm fucking around with it in Cyberpunk, comparing it to DLSS frame gen on a laptop. I'm keeping it at no more than 3x, as 4x already causes too much lag and too many artifacts, but it's pretty good. Even better on Windows handhelds, where you need to compensate for a low-powered GPU anyway.
 

Bulletbrain

Member
So does this thing affect GPU performance? VRAM requirements? Surely you can't 2x the fps without some catch.
GPU cost and VRAM requirements are broadly in line with DLSS FG. The catch is that there is more artifacting here than with DLSS FG. Can't speak for FSR as I have an Nvidia card. On the other hand, it's nice to have an FG solution that can be universally applied (even to movies!).

Hope the new updates bring some tangible quality improvements!
 

mansoor1980

Gold Member
Happy anniversary, LSFG!

LSFG 3 is built on a new, efficient architecture that introduces significant improvements in quality, performance, and latency.

Key improvements include:
  • Reduced flickering and border artifacts, with noticeable enhancements in motion clarity and overall smoothness.

  • Lower GPU load:
    A 40% reduction for X2 mode compared to LSFG 2 (non-performance mode). Over 45% reduction for multipliers above X2 compared to LSFG 2 (non-performance mode).
    The "Resolution Scale" feature remains an excellent way to further reduce GPU load. For instance, setting it to 90% roughly aligns with the LSFG 2 "Performance" mode.

  • Better latency:
    Latency testing with the OSLTT tool (at 40 base FPS, X2) shows approximately 24% better end-to-end latency compared to LSFG 2.

  • LSFG 3 also introduces an unlocked multiplier, now capped at X20. While this offers greater flexibility, the following recommendations apply for optimal results:

    Base framerate: A minimum of 30 FPS is required (40 FPS or higher is preferred, with 60 FPS being ideal) at 1080p.
    For best overall experience, locking the game framerate is recommended. This helps to avoid 100% GPU load (reducing its impact on latency) and ensures smoother framepacing.
For higher resolutions, use higher than the recommended framerates, or use the "Resolution Scale" option to downscale the input to 1080p:
    For 1440p, set it to 75%.
    For 4K, set it to 50%.

    Higher multipliers (e.g., X5 or above) are best suited for high refresh rate setups, such as:

    48 FPS × X5 for 240Hz.
    60 FPS × X6 for 360Hz.
    60 FPS × X8 for 480Hz.
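The resolution-scale and multiplier recommendations above reduce to simple ratios: the suggested scale just downsamples the input to 1080p, and the multiplier fills the display refresh from the base framerate. A small sketch (hypothetical helpers, not part of the app):

```python
# Making the LSFG 3 guidance above explicit (illustrative helpers only).

def resolution_scale(native_height: int, target_height: int = 1080) -> int:
    """Percentage 'Resolution Scale' that downsamples input to target_height."""
    return round(100 * target_height / native_height)

def multiplier_for(base_fps: int, refresh_hz: int) -> int:
    """Whole-number FG multiplier that fills the display refresh rate."""
    return refresh_hz // base_fps

print(resolution_scale(1440))   # 75 — matches the 1440p recommendation
print(resolution_scale(2160))   # 50 — matches the 4K recommendation
print(multiplier_for(48, 240))  # 5  — 48 FPS x X5 for 240Hz
print(multiplier_for(60, 360))  # 6  — 60 FPS x X6 for 360Hz
print(multiplier_for(60, 480))  # 8  — 60 FPS x X8 for 480Hz
```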

By CptTombstone:
d23a7588ca453e6a64bfda8f27e756bb7e12af84.png


----------------------------------------------

We’re also happy to announce the Lossless Scaling 3.0 UI Update Beta! To join the beta testing, follow these steps:

Open Steam.
Navigate to Lossless Scaling in your library.
Right-click and select Properties.
Go to the Betas tab and select 3.0 from the dropdown.
Your feedback on the new UI is invaluable as we continue to improve!

2daaaec0201dacec2ea348c903174a3a394dd8aa.png


----------------------------------------------

On a side note, the DXGI Capture API is working again in Windows 11 24H2, thanks to the KB5046617 update. Much appreciated, Microsoft.

----------------------------------------------

Have fun!
 

Bulletbrain

Member
x20 frame gen lol. Eat your heart out, Jensen, and your puny 4x!

I kid ofc and expect an artifact mess at anything over 4-5x. Excited to try it out after work nonetheless!
 
x20 frame gen lol. Eat your heart out, Jensen, and your puny 4x!

I kid ofc and expect an artifact mess at anything over 4-5x. Excited to try it out after work nonetheless!

Been doing x4 frame gen for months, and then Jensen drops x4 MFG for RTX 5000, only for Lossless Scaling to unleash x20 for everything a few days later! Nvidia always playing catch-up on frame gen :messenger_grinning_sweat:
 

Holammer

Member
3.0 is not working correctly for me. When I try Ridge Racer 2 (PSP) with 4x, it locks at 137 and becomes a judder-fest barf-o-rama.
 

DirtInUrEye

Member
It has a GPU/CPU cost, so you don't go from 60fps to 120fps. For example, you go from 60fps to 105fps with added latency, because your true fps is now 52.5.

This was always my chief complaint with this software. Who on earth wants to play with 105 laggy fps over 60 real and responsive ones? I think it's best suited to very high baseline performance, but then again, at that point it's just superior to stick to real frames.

I can't figure out why this app exists.
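The quoted 60-to-105 numbers work out as follows: the displayed fps is the multiplier times the reduced real base, and input latency tracks the base, not the output. A quick sketch (hypothetical helpers):

```python
# Checking the arithmetic in the quote above (illustrative only): frame
# generation lowers the real rendered framerate, and latency follows
# that real base framerate rather than the displayed one.

def effective_base(displayed_fps: float, multiplier: int) -> float:
    """Real rendered framerate behind a frame-generated output."""
    return displayed_fps / multiplier

def frame_time_ms(fps: float) -> float:
    """Milliseconds per frame at a given framerate."""
    return 1000.0 / fps

print(effective_base(105, 2))         # 52.5 — the "true fps" in the quote
print(round(frame_time_ms(52.5), 1))  # 19.0 ms per real frame
print(round(frame_time_ms(60), 1))    # 16.7 ms at native 60 fps
```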
 
This app is incredible, at least the frame gen portion. Been using it on Kingdom Come: Deliverance to get a pretty consistent 120-144 fps. The key is to have enough overhead above 60/72 fps natively, so you can lock your game to that fps and then x2 frame gen up to 120/144 fps. Because your fps is locked, you have spare GPU overhead for the frame gen and thus get a true x2. KCD is incredibly smooth with this on, with virtually no artifacting and very little input lag.
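The "lock, then x2" recipe above only works if the GPU can natively exceed the lock by enough to pay for the generator itself. A back-of-the-envelope check (hypothetical helper; the cost fraction is an assumption, real cost varies by GPU and resolution):

```python
# Back-of-the-envelope check for the lock-then-double recipe described
# above (illustrative only).

def can_lock_and_double(native_fps: float, lock_fps: float,
                        fg_cost_fraction: float = 0.15) -> bool:
    """True if locking at lock_fps leaves headroom for x2 frame gen.

    fg_cost_fraction is an ASSUMED share of GPU time consumed by the
    frame generator; the real figure depends on hardware and settings.
    """
    effective = native_fps * (1 - fg_cost_fraction)
    return effective >= lock_fps

print(can_lock_and_double(85, 72))  # True — 85 native comfortably feeds a 72 lock
print(can_lock_and_double(62, 60))  # False — barely above the lock, x2 will stutter
```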
 
Last edited:
This was always my chief complaint with this software. Who on earth wants to play with 105 laggy fps over 60 real and responsive ones? I think it's best suited to very high baseline performance, but then again, at that point it's just superior to stick to real frames.

I can't figure out why this app exists.
Have you used it? In my experience the input lag is very minimal. The difference between 60 and 105 frames is massive.
 

proandrad

Member
This app is really cool. It doesn't come close to Nvidia's framegen, but funnily enough I find it better than AMD's framegen.
 

nemiroff

Gold Member
I don't use it a lot, but I find it useful for games bizarrely locked at 60fps, like Motorfest, for example.
 
Last edited:

DirtInUrEye

Member
Have you used it? In my experience the input lag is very minimal. The difference between 60 and 105 frames is massive.

Yeah, I bought it a while ago for my kid's PC and tinkered with it fairly exhaustively. I think mileage varies a heck of a lot depending on the hardware configuration. I was trying to get his Helldivers 2 running at a better clip on his 1070 Ti, but his baseline performance nosedived when enabling LS. Without it he was getting a comfortable 60fps with the right settings, so obviously I thought 120 was possible with this software, but it killed his baseline by something like 34%. It was bad. I ended up keeping him on his native performance profile and uninstalling LS. Since then I've upgraded his card anyway.
 

mansoor1980

Gold Member
I'm sorry, but reduced lag is... impossible.
I'm using x2 mode to bump fps from a locked 30fps to 60 in games like SnowRunner.
Can't see any artifacts, and controls are more responsive (I use a gamepad).
Early versions had clear artifacts at the bottom of the screen and around UI elements.
 
Last edited:

ChoosableOne

ChoosableAll
I'm sorry, but reduced lag is... impossible.

Only "real frames" matter to lag, and real FPS suffers a bit with this enabled.
He probably compared it to earlier versions. There is of course a little bit of lag, but it's not that bad. Link's head glitch while rotating the camera is less noticeable now (I'm using it on Twilight Princess for a 60fps experience).
 

Gp1

Member
Tell that to all the switch and steamdeck users. Screensize rarely affects immersion.
Try playing on a small-ass monitor before saying this.
Compromising on screen size for mobility is one thing; the screen by itself is another :)
----

That said, did they fix the distortion and the hair-glitching effects in LS 3.0?
 
Tested it out, and it feels even better now. Was running Cyberpunk on it yesterday to have a good (fresh) point of reference, and somehow the inputs feel much faster. Need to test a few more games to look at the baseline performance, but so far it's very decent, with no noticeable artifacts at 3x (baseline around 40fps).
 

Tarin02543

Member
Just played a few shmups in X6 mode; I can confirm lag is reduced even when not using a reduced frame resolution. Truly revolutionary.

My CRT still looks better though, even setting aside its superior colours.
 

Bulletbrain

Member
Cool, that's great to hear.
Especially the reduced lag.
Artifacts are definitely lower now, but still visible in my (very limited) testing with AC: Odyssey. The main artifacting is the fizzle around the player character model when turning the camera.

At 60-to-120 framegen, input lag feels the same as non-framegen'd 60, but of course with much better visual smoothness. Certainly a win there.
 

DirtInUrEye

Member
I can confirm lag is reduced even when not using a reduced frame resolution. Truly revolutionary.

Not to cast doubt, but what is your base fps with it enabled?

I just can't imagine how what is, at the end of the day, an interesting little indie app can apparently overcome the challenges posed by the interpolation techniques implemented by the big GPU manufacturers.
 