Am I misreading the patch notes or this latest one doesn't talk about X2? Only about X3 improvements "X3 mode has been updated to further reduce artifacts on patterned textures and in dark scenes."
If you don't like the latency, you can try the "Allow Tearing" option under Sync mode. It significantly reduces latency at the cost of potential screen tearing. However, after 60ish hours of using this option, I haven't experienced screen tearing at all. Obviously, it might not work as well on EVERY game and EVERY pc but its a pretty huge improvement, so its worth a try for anyone who has LS.
Lossless scaling at 4x is great for 2D games with lots of scrolling. These games are often limited to 60 fps (such as Factorio and Terraria). These games combined with my 240 Hz monitor looks fantastic. All motion blur induced by tracking objects with your eye is almost eliminated.
I play Super Smash flash 2. This game is locked on 30fps. This game needs this app to achieve 60fps. Ain't no other option to fix this but to use Lossless Scaling. 😅
Never in my whole life do I expect a pixelated Duck software on Steam could double,triple or QUADRUPLE my FPS in games. TFS is legend for upgrading this for free constantly. Lossless Scaling is truly the GOAT when you wanted to get more FPS on games that didn't have the expensive exclusive DLSS or AMD's whatever FSR or FMF, this Duck just say "Let me allow every single game be blessed with tons of Frames and Resolution!" and then the frames actually did appear. It really saved my precious GTX 1060 6GB from early retirement and I believe it will benefit people who had crazy high Hz monitors or uses a high frame rate TV madlads to use this just for the sake of playing anything on 4k or 8k.
Unfortunately lossless scaling is snake oil. The upscaling is literally just worse dlss/fsr. The frame gen reduces ur native frames to put more fake frames. Fake frames are not real frames, they just create the illusion of smoothness. Ur input lag is still based on ur real native frames, which are now lower thanks to lossless scaling consuming gpu power and vram. The 3000 and 4000 series have special hardware in them that is used for dlss and frame gen. This hardware makes the performance cost of those nvidia features practically zero, which lossless scaling doesn't have because it's a software implementation rather than a hardware one. U're better off just saving money, buying a used 3060 if on a budget and using dlss, that will give u much better quality at a much smaller performance cost. The frame gen feature could be useful for games that dont have native frame gen, but unfortunately its gsync support is lackluster. And gsync/freesync aren't even that good. Special K's latent sync blows both of them out of the water. Even then u need at least a 144hz monitor to make it worth using. If u have a decent gpu like a 4070 or above, u shouldnt struggle getting 144 fps on non-demanding games, and the demanding ones usually have native frame gen included anyway. So pretty much the only time this program is worth using is if u have an older gpu with a 144hz monitor where u get around 75-80 fps stable. Lock the framerate to 72 and double it up to 144. And even then u miss out on latent sync and u have to use a shitty gsync implementation. Aint worth it unless ure really destitude.
@@kerkertrandov459 FPS unlockers aren't a option for most older games on a emulator where making the frame rate higher breaks the physics or makes the game run too fast or nerfs jumping etc.
I used the program for a few months now already and from my experience as long as you aren't specifically looking for the artifacts and those little jittering pixels I have no issues and em very greatfull for this to exist
@@G0A7 maybe try with the G-sync now available turned on? It could be because sometimes they add the jittering on the top but with g sync this then gets rid of the problem
@@risingg3169 i could try but tbh 60fps is more than enough for chrono trigger, more often than not i just forget that i have It installed and just plain without it
I tested it today on youtube and a movie. And - ITS WORKS! You can boost 4k 25 fps video to 100fps or 24fps movie to 96 fps even in fullscreen without graphics problems.
Hadn't gotten around to trying it like that but was hoping it might work. I tend to stream at 720p but have a 1440p 165 Hz monitor. I've only tried it on a few games. Didn't work with LA Noire but worked very well on FH4.
@@xShikari no, what should fix it is internal motion pacing Same in games The game could be running at 60fps then see a random item or character moving at 30fps or less Sometimes it can be cool Like into the spiderverse where Miles is moving at 12fps but the movie is 24fps But running a movie at 60fps is awful, the movement is very video game looking It's like those people who were upscaling JJK episodes from 24fps to 48fps and upscaled to 4K Way too much ghosting and flames looked very liquid You could see this with a satellite tv running on a Samsung or LG 4K60 display It upscales everything to run at 60fps Every single movie no matter how old it's got that motion smoothing feature on that basically doubles the frame rate and everything you watch is at 60fps
In case no one's mentioned it here yet or tried it, you can also use Lossless Scaling for videos and it works surprisingly well. However, you do need to be on fullscreen and there should be no UI elements present such as captions or the scrub bar otherwise you will get FPS fluctuations. Also works alongside RTX VSR.
@@papiertoilette-pb9lf movies are made to be 24fps for reason. give it more than 60fps give its a cheap "soap effect" look. looks like robotic , or game like. but if u like that thats fine.
I just bought and tested it. X2 is the best option. No need to use X4. 120/144 FPS is enough. X2 has a very good quality too. I didnt even notice that there are any artifactes. This shit is better than Frame Generation and FSMR LOL,
@@Jeannemarre yea, this app is great for capping to your max refresh rate too. Also for games which are hard capped at 60, like Tekken 8. I play Tekken 8 with smooth 120 fps and you do not notice any more input lag.
The thing is that 120/144 means you have Sync in Default and not in OFF (allow tearing), which means you have more input lag. Best option is 2x performance mode with allow tearing
@@ghostlu5462This Honestly Using the LS1 Upscaler + the X2, both on performance, have minimum impact on the image while feeling smooth. Each update silently boosts X2 Frame rate to the point is the best choice for a lot of gamers.
I downloaded more SSD space yesterday. The cool thing is i did not have enough space to download the extra space but the extra space gave me the option to download the extra space on the extra space itself. The problem is the download was so massive that at the end of the day i have the same amount of SSD space as before.
I downloaded more internet speed, but unfortunately the download of the speed itself takes up so much bandwidth that it eats into itself. Still, it's essentially free, so I think I'll keep downloading it just so I can tell people I have a faster connection.
@@DragonOfTheMortalKombat Yes you can stack in game Fg and Lsfg, Tried it on Cyberpunk with Fsr fg mod resulted in 660fps with 1440p low preset Fsr balanced *And Spiderman Miles Morales's Fsr 3 Fg around the same fps due to Lsfg capped to monitor refresh rate for base Fps. I have 165hz monitor so x4 from Lsfg is 660fps.
@@TheStrengthScholar And what kind of FPS jump did you obtain in it? Also what are the best settings? I'm seeing a lot of people say x2 and allow tearing?
Considering X2 is Quality and X4 is performance. I actually use the X2 to save power on my 3090. I limited my AW3423DWF refresh rate to 120 so I can use 10-bit. Then I cap the refresh rate in games to 60/90 and just let the X2 generate the rest.
yeh 2x is good enough to play with good image quality, also I'm using 2x and i have 88 fps on cyberpunk 2077, before 42 fps without using lossless scaling. i'm using gtx 1080 8gb GPU
Turn the Performance toggle to On for better latency! Theres a bit more artifacting, but its less noticeable the more frames you feed it, and its less noticable than the input lag.
I've used it with 2x scaling and 2x FG to play FH4 at 1440p60 with a GTX 670 SC. Both on performance mode. Things got a bit wobbly when moving through my garage quickly but in play, latency was low and the image good. Good enough to be normally competitive in ranked racing. I've played the game at least 120 hours like that.
You probably won't notice the artifacting while actively engaged in a fast paced game. It's when you go out of your way to find them using flick tests that they become more apparent. Something to note is that lowering the Mode to X2 and enabling Performance Mode (and setting Sync Mode to Disabled) significantly helps latency. While X4 is cool, it's unoptimized and without the performance settings will result in a bad time latency wise
Just like "lo-fi" music can "work" but it can also sound low-effort and cheap. If you want to make something good, the "they won't notice" excuse will only get in your way.
Have been using this for quite some time now and I am really happy with it. I personally do not really care about hud elements having errors and the artifacts are also barely noticeable, at least for me. I don't really pay too much attention to finer details. I have a 165hz monitor for almost 2 years now but I couldn't really utilize it. I was mostly stuck on 120-144hz because of performance, even with x2 frame gen of Nvidia (I have a 4080). Since x3 mode I can play pretty much every game at 165fps, so I can finally use the 165hz of my monitor. And yea like you said, with games like Mordhau (competitive game) I do not use it because of the latency. But in single player games that aren't crazy fast paced, 60fps baseline is perfectly fine latency wise. Only downside is: you can't really go back to something like 60fps. My god it feels like lag 🤣
I remember long ago i used to skip frames on ps2 emulator and up to 2x it worked more or less playable. And now instead of skipping we're adding frames. What a plot twist. Btw in Valheim even AFMF2 feels perfect, with L.S. it would be probably even better.
I experimented with this on my 6600XT with a 170hz 1440p monitor with vrr. It works great and I don’t notice any ghosting or artifacts pop up, but I do want to avoid having them whenever possible, same story with the latency since I play with a controller most of the time. Works best when you maintain a consistent 60fps to begin with
Should specify that this application is the star of the show for emulation. Emulation, you're looking for a fixed frame rate 100% of the time. That goes great with this application that works best while at a stable FPS as well. VRR doesn't do anything for emulation and can cause slow motion at lower frame rates. Games that lock at 30 FPS aren't great but games that lock at 60FPS like many Mario 3D titles, feel much better with it on.
When games close at 30 frames, it seems as if they are at 60 frames, but this is a shame. The more the frames are cut, the more the control becomes heavier and heavier, and this is annoying.
@@thebigcj517 unless you can do run ahead frames like retroarch and combine that with the frame smoothing, since computing power isn't the limitation here
First DLSS 2.0 dropped and I couldn't believe it wasn't magic. Then Nvidia released Frame Gen and it's almost as impressive. I can't wait for what they are cooking up next. They should definitely be criticized for their anti consumer behaviour but you can't deny the genius of their engineers.
It's just like an evolution of injectors back in the day, they just fiddle with settings to make games look better than they actually are. It isn't a replacement for native, and it certainly isn't magic. More like copium for devs inability to properly optimize PC games
Using words like magic and genius bring back bad memories of Apple's language. I dont agree with glorifying these big corporations. Unfortunately Nvidia is more and more Apple like.
Daniel, congrats! You have now become an educator and a youtuber. So that answers your dilemma that you were facing in one of your past videos. You offer a great easy to understand explanation that many people may not understand, the difference in latency, and the explanation of the values being presented. Appreciate it!
@@krishnaprasad1570 No he has a point, a lot of people will say that LS doesnt work and it make the performance worse when on. This is due to them having a weaker gpu that is already at max utilization and max vram usage. So when they enable LS they lose performance, while the LS and the game are fighting for resource like vram causing even more performance issues.
EDF6 is another great option for LossLess scaling, since it's limited to 60fps, and the frame generation does help make the game a hell of a lot smoother.
this thing is the best thing ever made in gaming history, it helps me with helldivers, where I have a 4060 ti gpu and a weak cpu for the game amd 5600x that causing me only reached around 40-55 fps max in that game (while wukong can grant me around 70-80 fps in cinematic) , was thought to upgrade my whole pc except the gpu, and then I found your video, and try it, it works just like magic, as you said frame gen doenst increase response time, but highers fps, makes my own natural response time better XD
Really interesting video Daniel! However when stacking frame gens, you should absolutely use the performance mode, in your frame gen stack example your base FPS post DLSS FG dropped significantly, showing a big GPU bottleneck, please retry this (for yourself, not on video) and tell me if it improves things, it should! Also please lock your base FPS with the in game frame limiter to 59 FPS, so 59 -> 118 with DLSS FG + Reflex, then x4 performance 118 to 472 FPS to be in VRR range for minimal input lag :) With this it should feel, and look, amazing :)
In absolutely any game, input lag increases very much if your GPU is loaded at 95-100%, it doesn't matter how much fps you have, Nvidia reflex PARTIALLY solves this problem, but the best solution today is still lock fps. Cyberpunk and Elden Ring showed you this very clearly, it's not about the gamepad and mouse. LSFG works best if your GPU is never loaded at 100% and your base fps never drops below 60.
Been using it for about 2 months now, and I'm halfway through replaying God of War since it first came out on PS4, works flawlessly. Lossless scaling is a must buy gem. Definitely a must have for the chill non competitive games.
I love this application and use it daily. Yes there are some artifacts, often small or dismissable for me. In shooters I won't use it and will perfer any built in game options. Reguardless, this is amazing. It's best used in non-esport type games like for me: Witcher 3, Spiderman, Path of Exile mapping, and best of all youtube/google chrome. Since many videos and movies are uploaded with 32 fps, this makes the videos really feel like 120, and makes me feel like I'm getting real use out of my monitor outside of just gaming. Really appreaciate the work they put into this and making it so simple to use.
Been using this on WoW. And omg...it is amazing. Set base fps to 60 in the game as that is the lowest i go down to in valdrakken. Then set to x3 with allow tearing. Perfect 180fps matches my monitor refresh rate, and doesn't skip a beat. So good.
I recently started using LSFG in Helldivers 2. Works pretty darn well! For better latency, make sure to turn on ULLM/Reflex in your Nvidia control panel. With ULLM off the delay is pretty noticeable. With ULLM on, delay is barely there and easy to forget about. Baseline FPS on my rig is ~100 or so. With LSFG on it makes it 80-90 baseline with 160+ output.
It only works when you put --use-d3d11 in the steam launch parameter as driver level ULLM does nothing in DX12 (The reason why they came up with reflex and put it into games)
@@RyanKristoffJohnsonLatest version of RTSS also features Nvidia reflex as the frame limiter. I tried it and it works for games with no in-game reflex option. Try this instead of ULLM.
Haven't played an AAA game from the past several years and my laptop is a budget one from 2019 with 1080p60hz display and a gtx1650, so this stuff is all very new and cool to me. You upgrading this year to a laptop with a 4060 and 3.2k 165hz display, so im stoked to see what it looks like.
@@danielowentech While it is the best way, you can always record at the highest FPS your phone or camera can do and slow it down. It will not be the most accurate in terms of milliseconds but I think most people just want to know which one has the lowest latency. If they can see one is visually faster than the other, it should be enough. Also maybe an image quality comparison would be nice as in try to target a stable FPS (either 60 or 120FPS or both!) and record it. The reason for stable FPS is to guarantee that the recording is good and hopefully get a clean capture. I'm not sure the built in recording feature like from AMD Adrenalin can capture FG, but if it can, you can use that since we are not concerned about the absolute performance. If not, then obviously you can use capture card. Whatever works best for you.
I've been using tool this for a year, and it really does work well! Wherever there's not support for AMD RSR or FSR, I'll use it. (Primarily Minecraft Java, tbh. Lol) As well as when streaming videos on youtube or elsewhere to bring them to 4K. (Just use a pop out picture in picture for your video, then use the shortcut key to make it scale it up. I use brave browser that has good support for PIP) Haven't tried frame generation, but I'm not a fan of fake frames tbh, and don't play games that would really require it where it wouldn't hurt me mechanically. Lol.
Daniel, it would be great if you could try LSC with a second GPU for frame generation. From the LSC panel itself, you can select which GPU you want to use. This way, there is no frame loss on the GPU that is rendering the game. I haven't seen any channel doing that.
As someone that loves this software and uses it in most games I do want to advise that it it does drop your base frame rate by about 10-20 fps so if your too far under 60fps it will induce fairly significant input latency, anywhere near or above 60fps and its a significant improvement to smoothness
I randomly found Lossless Scaling by looking for an app that let me scale my old games using integer scaling, which it does beautifully for most games. Slowly they added more features and it's probably the best few dollars I ever spent on any windows app. The frame generation works so well, yes it's not perfect and requires certain settings like windowed mode, etc. But when it's all in action and activated it's almost like downloading more GPU. Feels good.
I wish ths focused on quality of frame generation instead of quantity of frames. The x2 is amazing but still has flickering issues (especially with 3rd person games)
That is an odd statement. Increasing the number of frame inserted is by far the easy part. The entire task with frame Gen is to accurately predict the frame while not increasing latency. AMD could easily make an upscaler that look BETTER than dlss.. know why they don't no it? Because turning it on would DEcrease your frame rate instead of increasing it. Same here.. they could easily make the 2x be Far better quality but the hit to base performance would be so much that it is not worth using. There is a limited amount of resources that these programs have to work with as well as limited data to work with.. only the pixels on your screen. That is it and they do an excellent job with that limited info. You're saying they should be working on the really hard issue instead of offering the really easy stuff like 3x 4x. Doesn't really make sense. Everyone likes the option to do 4x even if just for the fun of it.
@@waltuhputurdawayThe only way to get better frames than Lossless is to use raw data that comes directly from the game engine even before the frame is rendered completely. Motion vectors is the big one people talk about but there is other useful data as well. Sure, Lossless with get better but it will only get a bit better. By time gpus are strong enough to be able to use a significant more resources to generation frame, there will be ai driven framegen in every game for every GPU. With Lossless.. there is a limit to how good it can get.
2x is for 120hz gaming.. 3x is for 180hz and 4x is for 240hz. I have been enjoying dlss frame gen so so much on cyberpunk and the witcher. Going from 65 to 110 is absolutely worth the increase in latency that i cant seem to notice on a controller. I hope fsr and dlss offer a 3x or 4x soon. Amd needs to take some notes from lossless to better afmf. I lost my 6950 xt before i used the second version but i can say with condidence that the first iteration of afmf was absolute TRASH. Glad amd is making some moves.
didnt watch the whole vid but if you want to demonstrate the smoothness cant you record at 240fps, and play it at 0.25 speed on youtube to display each frame. Obviously people who have 240hz know its smoother, its just seeing the quality of each frame and we can picture it back together at full speed.
Agreed. I think he should record it at 120hz and then slow it down to 1/2 and upload it as that. And then those with a 120hz screen can run it at 2x and they will get the real 120hz video.
@@spencer4hire81 Everyone can not watch his video in 2x and get the benefits from seeing it in 120hz. Only those with 120hz screens. I pretty much always watch in 2x anyway. Sure he gets a less watch time. But so what. The whole point is he is showing of high frame rate video. And this is a technique he could use to improve, what his audience will see compared to what he sees and not be limited to 60fps TH-cam videos.
It's worth mentioning that frame generation will use extra gpu power, and this will cause you to lose "real frames." So if you were getting 60 real fps, you might get 50 when FG is on. Personally I wouldn't bother with FG if you can't keep more than 60 real fps because the input delay will just be too much
One place Lossless scaling has been amazing has been emulation where games had fixed framerates. For example FF8 had 15fps battles (vomit) but since they are menu driven applying lossless scaling makes it look so much better. FG will never fix how laggy it feels but if you are getting a headache from how it LOOKS then it can help immensely.
Regarding 10:10 when Daniel enables LSFG x4 on top of DLSS3 FG. He's attempting base 40, DLSS3 FG doubled to 80 and then quadrupled to 320. But he's not locking a perfect 80/320 at all times, he's doing 74/305 or smth. That's due to the GPU being maxed out. If he'd drop some settings so the fps would be 100% locked to 80 / 320, the input lag would be significantly better. Talking as someone with a 7900 XTX that played Cyberpunk Path Traced using base 40, FSR3 FG to 80 and then LSFG x3 to 240 fps.
@@chacharealsmooth941 well ya theres trade offs, otherwise no one would ever buy a new gpu, although maybe in the future fg will be so good that it wont matter anymore
I think, you mentioned in earlier videos, that capping the base frame rate can limit latency, which you did not mention in this video. I used this with a RTX2060 to play Black Myth Wukong with 120 fps on 1440p high settings with dlss on 50% (40% in chapter 3) to get 30fps capped base frame rate and it played wonderfully. Not capping the framerat introduced latency spikes.
Daniel can you please try using Lossless Scaling with the RX 7600? Its crashing allot for me and im not the only one. There's a whole thread on the official discord but so far nothing has changed with each update :(
@@MiGujack3 Its RX 7600 and RX 580 issue. My PC blacks out and shuts down while using stock settings without any overclocking or undervolting. The GPU works fine for gaming and there's no issue but when I use Lossless the PC crashes and shuts down. I'm not the only one as i've found posts relating to this issue on the steam page, their discord and reddit. Just hoping to see if Daniel can figure it out because i wanna use this thing.
For those that don't know Frame Generation Version 1.07 is the best version of FG. You can manually download it and put it in your game files. That version is the most stable one.
I'm surprised Nvidia and AMD didn't do this first. Maybe they were saving 3- and 4-frame interpolation to pretend their _next_ generation is worth what they'll be asking for it.
A more granular 1.25 or 1.5 mode would be nice here. A perceived jump from 30 - 40fps is significant enough honestly. I'm thinking this would be a good application for power-limited devices or lower spec hardware... Just enough of a bump to meet that good playability range; but not as much of a hit in terms of visual anomalies since the frame gen would only occur every 4th or 3rd frame.
I just tried this on Helldivers 2 and finally hit 240fps on my 240hz 1080p monitor! Felt super smooth but the latency was definitely noticeable. I prefer x2 and x3 mode with perceived latency practically non existent with x2 mode as long as you have right settings switched on.
@@damara2268 I have Lossless Scaling. And I have used FSR 3.0 mod on Cyberpunk 2077 for example. Being easier does not matter to me. What matters is there a mod or not. And the image quality, how much the base frame rate drops etc.
@@mikalappalainen2041FSR 3 much better, tested in TLOU . Lossless Scaling in this game works like shit and burn gpu a lot, Fsr decrease gpu usage, temp, give more stable fps and decrease latency much better. Only one reason to use lossless, game without Fsr3 generation
it works very well now even when frame rate is not an even 60 tried it on the FF XVI demo and it made it still very excellent on my rtx 3060 12gb impressive software and tech this is. To add everything maxed and DLSS 3 with quality preset and lossless framegen.
I think you're wrong about something, x4 right now does have image quality issues but x2 LFSG completely destroys FSR3 framegen in term of image quality, there's no comparison, the only advantage FSR3 has over LSFG x2 is that FSR3 skips the HUD entirely but that leads to other issues as well. FSR3 barely create an actually convincing/correct/stable interpolated image right now.
I tried Lossless Scaling X4 earlier on Trepang2 a super fast game and noticed no input latacy at all and i found i died less and played better with the fluidity absulutly amazing i would rarther use this than Dlss frame generation the end of the gun gargled slightly but on a faced paced game not notale what so ever.
@@cajampa it is not the same at all , you can test it even without numbers the diff is so obvious , also x4 will eats alot of base fps so you got that too
@@christophermullins7163 there is no coordination between real and generated frames if you use both of them at the same time it will be bloody mess my man it is not just math x2 x2 =4 , enjoy the mess lol
I am loving Lossless Scaling, and yes a controller is the way to go. With less jerky movements, regardless of the Frame Generation technology you use, a controller will allow the software to present a far better and more consistent looking picture currently possible over a kb&m. Try using a controller and you'll notice a big difference when bumping up frames. I've yet to see how the new X4 and G-Sync update feels, but latency is just one of those things that will not feel good for most, depending of course on the game you are playing. I recommend sticking to 2X frame generation, if you can, to reduce latency. I just recently made the switch from 30+ years of console gaming to pc gaming about 8 months ago, and witnessing frame rates almost double that of a console, without the use of scaling of any kind, is still an incredible experience to me. Knocking a games frame rate up any higher is kinda just a bonus for me. Plus, I still prefer to play my games on my 55" LG C1 OLED TV which only supports 120Hrz, so all I'm ever going to see is 120fps, and am completely content with it. There is no way I can't just go from a large TV to a smaller pc monitor. It ain't gunna happen. Maybe until new LG TV's hit the market that support 240Hrz, is when I'll upgrade. For people chasing FPS gains, the way I see it is this... When it comes to the vast majority of gamers out there, 90% to 95% are playing on consoles, at much lower frames, who are still enjoying their games at 30fps to 60fps. Rarely a 120fps on console pops ups, and when it does, is a heck of a treat. My point? Rushing to the store for more frames, when what you've got right now is better than what most console gamers are experiencing. When you look at these new GPU's, you also have to look at a new monitor, which together is outrageously expensive. On top of that, even the most incredible pc games released today are still struggling to hit a Native 4K at 60fps to 120fps, and that is where DLSS, FSR, and now Lossless Scaling comes into play, to save your wallets from minimal gains to fps. The problem with new hardware-based frame gen technology, in these new GPUs, is that it only supports very few new games currently. I think like 95% of them are sponsored games, and who wants to upgrade a GPU to play 2 or 3 new games every year, that supports this frame gen technology, while also hoping to that those games are good? Go for it if you have the money, but if you don't, just know that you are already sitting pretty, as a PC gamer as is, over the vast majority of gamers. Don't let these GPU companies try to fool you into upgrading, and use Lossless Scaling for now, until we start to get more games that include frame generation. What's funny, is that Lossless Scaling outperforms FSR 3 frame gen technology. I've seen side-by-side comparisons of FSR 3 frame gen versus Lossless Scaling, and while FSR 3 frame gen has its perks, Lossless Scaling still looks better. Ai frame gen software is exactly what people need right now, who just do not financially have the means to purchase the latest and greatest in graphics technology every year, when Lossless Frames is a hair away from the competition. A software based GPU upgrade for only $7 bucks. I don't know if Lossless Scaling has this feature or not, but if the devs read this, I'd like for them to add an option to turn off the fps counter if possible, and replace it with like a small X2, X3, or X4 watermark to indicate that the software is doing it's thing. While looking at FPS counters all day can be fun, I'm just not personally a fan of fps counters in my games, because I find it a bit distracting. I don't really care how many frames its generating, when I can clearly see that it's working based on how I set it. The only time I care about the FPS counter, is when I'm dialing in my settings initially to cap the frame rate to a consistency that the software requires, in order to for Lossless Scaling to produce stable additional frames. From that point forward, using the Lossless Scaling's fps counter doesn't really mean much to me. If that makes any sense. The program does what it needs to do, and it does it exceptionally well! Just a thought I'd throw out there, but no big deal. Great video as always Daniel! Cheers y'all!🍻
Losless scaling is legitimately great, but I've found the most use out of it on TH-cam and with emulators. Real frames are always preferable, interpolating past frame caps is where it's at
I have a rx 570 4gb paired with i5 2400... It runs elden ring at 1080p high at 30 - 40 fps (cpu bottleneck) most of the time, if i cap at 25 frames and use 3x mode of lossless i get 75 frames (my monitor refresh rate), it has artifacting and the input lag is noticible but its waaaaaaaaaaay more stable and enjoyable for me, i can even play it with a higher resolution! This program gave so much life to my pc! :)
@@MiGujack3you will get tearing otherwise. Enable Vsync from NVCP. I had no severe latency issues with vsync + loseless for games that are not competitive. But I am using a controller which is recommend
I think the endgame use of this is using the highest quality/least artefacting modes stacked on top. So DLDSR stacked with Quality DLSS stacked with x2 frame generation will have more input lag sure, but you could take a game from unplayable to solid and banish aliasing to boot. I think combining with Special K framecapping there is some potential to reduce felt input lag and improve stability even further, and in a few key situations it really gets going. I think that once these tools are mature the next step is developing a system that dynamically uses tools like these together to pull out a close-to-ideal configuration automatically, which makes fiddling with individual settings even easier as you're closer to the desirable endpoint when you start messing with stuff. Frame-generation with frame-capped games benefits especially, as you pointed out - it's a way of increasing smoothness and providing high refresh rate compatibility without an egregious downside. Guess I gotta pick up lossless scaling.
@@MrSuvvri but what's the point if the mouse isn't smooth? Idk, when I'd want to use it is when it sucks the most so this quite literally seems no good
Yeah I understand where you are coming from. Because I don't really find it useful in shooting games. Almost unplayable. I use it for livestream and videos tho, almost all the time. Movies, discord streaming. All this makes huge difference.
This app is now my default method in playing AAA games. Mostly because my 2060 is pretty much outdated now with all these new games having intensive graphics. This can also help me achieve higher graphics settings and favorable performance at the same time too. For example, normally i wouldn't be able to achieve a stable 60fps on High settings in games like Black Myth Wukong. But with this app, i can just cap the frame rate at 30, turn all the graphics settings to High, set LS on 4x and casually enjoy the free 120fps/High settings. Sure, the input lag will be there and can get a bit annoying at times but that's way better than playing at an unstable 60fps that keep dropping to the 50s or even low 40s in some area. Watching movies with this is a blast too, if you are fine with anything more than 30fps in movies and it works really good with movies that heavily use CGI like MCU movies. Im having so much value from this 7$ app than i ever had from spending hundreds or even thousands of dollars upgrading my hardware.
u must pay lootboxes and get the legendary X5 mode with a drop rate of 0,000000000x10-^432149x=z²+x/69, after that you need the battlepass (only 1000$ right now it's almost free ngl) and only then you must have 100 000 hours on loseless scaling, and make a 2000$ dollars payment to finally get X6, but it's only avalaible to those who own a rtx 10090 ti super ultra (which cost around 15k$ in 2050)
cool story bruh. consoles struggle to get 60 fps in 2024. tell me how they will become insane in next gen? by finally getting 60 fps in 2027-2028? 🤣🤣🤣🤣🤣🤣🤣🤣🤣🤣Final Fantasy 16 and Forspoken drop resolution to 720p and upscale to 1080p and still can't hit 60 fps. consoles will always be inferior to PC.
I tried Lossless Scaling but refunded because it just made every game screen tear like crazy. I played with the settings a lot (VSYNC, Framerate limits) etc but no luck.
Some games gave me more issues than others with LS. I wonder if by tearing, you're actually seeing artifacting. The X2 mode is least likely to artifact and tear and make sure you don't have in-game settings and LS settings fighting each other. I had to play with in-game settings, LS settings, and my driver settings to get an optimal experience but once I solved it, it works great.
What was your fps? Have you enabled vsync only from Nvidia panel? Are you using RTSS properly? What monitor refresh rate have you set? Or maybe the game you are trying has problems on its own that only the developers can fix or check the internet for stuttering fixes? I can help you with this. For me loseless scaling is working perfectly with all older games, emulators.
As a teacher you shouldn't compare Apples with Oranges, you know the scientific method. So if want to compare latency and image artifacts, you compare DLSS FG to LS 2x and not the 4x. You should only compare de LS 4x with the LS 3x and 2x, because it's only one changed variable the number of frames inserted on the same upscaler. So I have to vote negatively for your video, I could voted positively if you had compared Apples with Apples.
Been using it lately with some emulators and it's goddamn magic. I've also been using with iRacing to get more consistent framerate and there's a slight delay but completely playable.
@@BourbonBiscuit. what he is saying is that no matter if you are generating 1 or 2 or 3 frames the latency is only as long as the time it has to wait for a the second in game frame. so all 3 should have the exact same latency but from some reviews Ive seen the 3x has less latency than 2x (something having to do with frame pacing or idk I aint no tech wizard)
@@BourbonBiscuit. yeah but why would you do that if you can already hit the target frame-rate from 60 to 120 going 30 to 120 would be both very stupid and unnecessary but going from 60 to 240 should feel the exact same as 60 to 120 when it comes to latency
Agreed. Ignore the other comments. People acting like frame interpolation is “performance” are either lying to themselves or don’t understand the technology at all. This is a cool way to get “smoother” gameplay, but it introduces input lag and artificial frames to make it happen. You can’t call that performance. Imaging if your car’s speedometer doubled the speed it showed you, so while driving at 60MPH it showed you 120MPH. Is that performance? Of course not. This isn’t anything like adding a turbo to an engine- overclocking would be that equivalent.
@@nintendoconvert4045 But this is actually giving you more frames per second, it's not just SHOWING more FPS like your speedometer comparison. If it said 240fps while actually getting 60 fps this app would be getting entirely shit on by everyone, but it doesn't do that.
If the delay latency is too much for you, you can try set the "Low latency mode" to ultra. This setting can be found in Nvidia control panel and can be pretty useful for some game.
i am currently playing a plague tale: requiem, and let's just say with my vega 64 i would be cooked! BUT using lossless scaling i can play at 1440p high preset at 60/70 fps, with the x2 mode and i don't even see much arctifacts, only when there are low light occlused ambients i see a bit of a shimmering effect on some walls. I love this app, the creator is THE GOAT.
I am playing an bit older single player game BattleTech. It's stratagy turn based. I am using Lossless Scaling to scale from 1080p to the native 1800p panel as well as 2x frames, while using the weak APU. Goes from like 26 fps 1080p to 40fps panel resolution looking noticably sharper and a bit smoother.
Don't forget that the 2x mode still exists and is getting better in terms of efficiency and image quality with each update
Yeah that's what i'm using in Burnout Paradise Remastered.
The 2X mode has best quality of course
Am I misreading the patch notes or this latest one doesn't talk about X2? Only about X3 improvements
"X3 mode has been updated to further reduce artifacts on patterned textures and in dark scenes."
x2 mode is the best in term of latency and visual quality. best use in fps capped games ofc like elden ring
Nvidia saved the industry from SLI and CROSSFIRE. Artificially doubled or tripled the GPUs you own.
You all owe Nvidia an apology.
If you don't like the latency, you can try the "Allow Tearing" option under Sync mode. It significantly reduces latency at the cost of potential screen tearing. However, after 60ish hours of using this option, I haven't experienced screen tearing at all. Obviously, it might not work as well on EVERY game and EVERY pc but its a pretty huge improvement, so its worth a try for anyone who has LS.
I find the Performance toggle does more for input lag for me
Yes, VRR technologies often handle the tearing by themselves, making VRR and Allow Tearing LSFG a win-win.
@@ashishgoyal8655lossless
@@briando8150 u mean it helps with latency or hurts it?
Woah this is huge help. Thank you
highly technical and accurate diagram is all we ask for :D
Lossless scaling at 4x is great for 2D games with lots of scrolling. These games are often limited to 60 fps (such as Factorio and Terraria). These games combined with my 240 Hz monitor looks fantastic. All motion blur induced by tracking objects with your eye is almost eliminated.
the problem is 60fps is not enough base framerate
@@sudd3660 For what? For 1x 60 is perfectly fine. 2x too depending on game. For 3x i agree that base framerate should be closer to 90 or so.
I play Super Smash flash 2. This game is locked on 30fps. This game needs this app to achieve 60fps. Ain't no other option to fix this but to use Lossless Scaling. 😅
I figured loss of fine detail during motion would mostly negate any benefit of higher fps. Sounds like this works better with 2D games.
@@sudd3660 Many of these games utilize a hardware cursor which is unaffected by the frame rate or latency. This makes input lag way less noticeable.
Never in my whole life do I expect a pixelated Duck software on Steam could double,triple or QUADRUPLE my FPS in games. TFS is legend for upgrading this for free constantly. Lossless Scaling is truly the GOAT when you wanted to get more FPS on games that didn't have the expensive exclusive DLSS or AMD's whatever FSR or FMF, this Duck just say "Let me allow every single game be blessed with tons of Frames and Resolution!" and then the frames actually did appear. It really saved my precious GTX 1060 6GB from early retirement and I believe it will benefit people who had crazy high Hz monitors or uses a high frame rate TV madlads to use this just for the sake of playing anything on 4k or 8k.
Unfortunately lossless scaling is snake oil. The upscaling is literally just worse dlss/fsr. The frame gen reduces ur native frames to put more fake frames. Fake frames are not real frames, they just create the illusion of smoothness. Ur input lag is still based on ur real native frames, which are now lower thanks to lossless scaling consuming gpu power and vram. The 3000 and 4000 series have special hardware in them that is used for dlss and frame gen. This hardware makes the performance cost of those nvidia features practically zero, which lossless scaling doesn't have because it's a software implementation rather than a hardware one. U're better off just saving money, buying a used 3060 if on a budget and using dlss, that will give u much better quality at a much smaller performance cost. The frame gen feature could be useful for games that dont have native frame gen, but unfortunately its gsync support is lackluster. And gsync/freesync aren't even that good. Special K's latent sync blows both of them out of the water. Even then u need at least a 144hz monitor to make it worth using. If u have a decent gpu like a 4070 or above, u shouldnt struggle getting 144 fps on non-demanding games, and the demanding ones usually have native frame gen included anyway. So pretty much the only time this program is worth using is if u have an older gpu with a 144hz monitor where u get around 75-80 fps stable. Lock the framerate to 72 and double it up to 144. And even then u miss out on latent sync and u have to use a shitty gsync implementation. Aint worth it unless ure really destitude.
@@kerkertrandov459 It's not snake oil for emulators and games that are locked to 30fps or 60 fps
@@Yolo_Swaggins just use a fps unlocker then
@@kerkertrandov459 FPS unlockers aren't a option for most older games on a emulator where making the frame rate higher breaks the physics or makes the game run too fast or nerfs jumping etc.
@@Yolo_Swaggins bboomer gaming
I used the program for a few months now already and from my experience as long as you aren't specifically looking for the artifacts and those little jittering pixels I have no issues and em very greatfull for this to exist
I used it on chrono trigger, the only problem i saw was the text showing up (more times if it was on top Idk why), pretty neat feature
@@G0A7 maybe try with the G-sync now available turned on? It could be because sometimes they add the jittering on the top but with g sync this then gets rid of the problem
@@risingg3169 i could try but tbh 60fps is more than enough for chrono trigger, more often than not i just forget that i have It installed and just plain without it
@@risingg3169can i use this with a weak cpu and a strong gpu ? 12400f and 7800xt.
🤮
Congrats on the shoutout in the latest Hardware Unbox video Daniel, you're getting some well deserved recognition unc
can't believe i'm seeing the word 'unc' in a youtube comment
What is it about?
My goat made it
While you have a fair point as far as Daniels recognition goes, you are steppin out of line junior. Sit tf down boi.
derbauer was also there. :)
I tested it today on youtube and a movie. And - ITS WORKS! You can boost 4k 25 fps video to 100fps or 24fps movie to 96 fps even in fullscreen without graphics problems.
Hadn't gotten around to trying it like that but was hoping it might work. I tend to stream at 720p but have a 1440p 165 Hz monitor. I've only tried it on a few games. Didn't work with LA Noire but worked very well on FH4.
That's not advised, running movies at 60fps is already bad, the motion is broken and image quality is also broken
It looks smooth yes but it's a movie
Everything under solid 60 fps is trash for fg. 24fps video is already so jelly when you push it to 58 let alone 96
@@melxvee6850It depends on the movie. Panning shots look awful on most screens nowadays, this should fix that shouldn't it?
@@xShikari no, what should fix it is internal motion pacing
Same in games
The game could be running at 60fps then see a random item or character moving at 30fps or less
Sometimes it can be cool
Like into the spiderverse where Miles is moving at 12fps but the movie is 24fps
But running a movie at 60fps is awful, the movement is very video game looking
It's like those people who were upscaling JJK episodes from 24fps to 48fps and upscaled to 4K
Way too much ghosting and flames looked very liquid
You could see this with a satellite tv running on a Samsung or LG 4K60 display
It upscales everything to run at 60fps
Every single movie no matter how old it's got that motion smoothing feature on that basically doubles the frame rate and everything you watch is at 60fps
In case no one's mentioned it here yet or tried it, you can also use Lossless Scaling for videos and it works surprisingly well. However, you do need to be on fullscreen and there should be no UI elements present such as captions or the scrub bar otherwise you will get FPS fluctuations. Also works alongside RTX VSR.
@@raven3696 wait... That is actually lit! I'll have to try this on TH-cam videos and maybe even movies and see how they look
Videos don't work like games, 24fps will look better than 60fps
does it work for online games like cod or tarkov
@@prequall it won't for some people, i now watch film at around 90fps and it's much better than the laggy 24fps, can't stand it anymore...
@@papiertoilette-pb9lf movies are made to be 24fps for reason. give it more than 60fps give its a cheap "soap effect" look. looks like robotic , or game like. but if u like that thats fine.
I just bought and tested it. X2 is the best option. No need to use X4. 120/144 FPS is enough. X2 has a very good quality too. I didnt even notice that there are any artifactes. This shit is better than Frame Generation and FSMR LOL,
This app just saved my 3090 in black myth wukong lol.
@@Jeannemarre yea, this app is great for capping to your max refresh rate too. Also for games which are hard capped at 60, like Tekken 8. I play Tekken 8 with smooth 120 fps and you do not notice any more input lag.
@@Yakoo86 that’s dope !!
The thing is that 120/144 means you have Sync in Default and not in OFF (allow tearing), which means you have more input lag.
Best option is 2x performance mode with allow tearing
@@ghostlu5462This Honestly
Using the LS1 Upscaler + the X2, both on performance, have minimum impact on the image while feeling smooth.
Each update silently boosts X2 Frame rate to the point is the best choice for a lot of gamers.
Hardware Unboxed shouted you out in their last video, great work!
Additionally, the image quality of 2x mode has improved a lot since the latest update, espacially in some darker scenes.
I downloaded more SSD space yesterday. The cool thing is i did not have enough space to download the extra space but the extra space gave me the option to download the extra space on the extra space itself. The problem is the download was so massive that at the end of the day i have the same amount of SSD space as before.
What 😭
I downloaded more internet speed, but unfortunately the download of the speed itself takes up so much bandwidth that it eats into itself. Still, it's essentially free, so I think I'll keep downloading it just so I can tell people I have a faster connection.
You guys creating paradox now?😭
(Nice analogy lmao)
underrated
The SSD got bigger but it has the same memory space inside of it
Now I'm waiting for someone to try AMD AFMF + FSR FG + LS. Complete madness
Tried it, unfortunately no.. Afmf and Lsfg will turned each other off lol.
Similar software conflicting.
@@WutTheHekk but fsr fg and lsfg can work right ?
@@DragonOfTheMortalKombat Yes you can stack in game Fg and Lsfg,
Tried it on Cyberpunk with Fsr fg mod resulted in 660fps with 1440p low preset Fsr balanced
*And Spiderman Miles Morales's Fsr 3 Fg around the same fps due to Lsfg capped to monitor refresh rate for base Fps.
I have 165hz monitor so x4 from Lsfg is 660fps.
나는 이미 그것을 시도했다. 특정 게임에서는 매우 잘 작동합니다.
I did that... the game rendered at negative frames and scaled them up to 144, as a neat little side effect my GPU has its own singularity
I’m definitely downloading lossless for Elden Ring to get a better gaming experience!!!! Thank you!🤩
I literally cannot play Elden Ring without lossless scaling enabled. I have probably put 100 hours into lossless scale and through Elden ring alone
@@TheStrengthScholar And what kind of FPS jump did you obtain in it? Also what are the best settings? I'm seeing a lot of people say x2 and allow tearing?
Great breakdown. I love the highly accurate and technical breakdown graphs.
Yuge update for high refresh rate panel enjoyers
Considering X2 is Quality and X4 is performance. I actually use the X2 to save power on my 3090. I limited my AW3423DWF refresh rate to 120 so I can use 10-bit. Then I cap the refresh rate in games to 60/90 and just let the X2 generate the rest.
I have the same monitor, do you have a link to setup 10bit color?
Is 10 bit that much worth it to drop from 165hz?
yeh 2x is good enough to play with good image quality, also I'm using 2x and i have 88 fps on cyberpunk 2077,
before 42 fps without using lossless scaling. i'm using gtx 1080 8gb GPU
@@SLModernJukebox 60fps is the minimum tu use 2x
@@silvio3d Nope. It is 30fps. 60fps is the recommended.
Turn the Performance toggle to On for better latency! Theres a bit more artifacting, but its less noticeable the more frames you feed it, and its less noticable than the input lag.
I've used it with 2x scaling and 2x FG to play FH4 at 1440p60 with a GTX 670 SC. Both on performance mode. Things got a bit wobbly when moving through my garage quickly but in play, latency was low and the image good. Good enough to be normally competitive in ranked racing. I've played the game at least 120 hours like that.
@@Lurch-Botupgrade your GPU
@GewelReal my guy, if we could, would we be using lossless scaling?
@@GewelReal Yeah right why dont you donate me an RTX 6090 Ti?
@@dingickso4098 it hasn't released yet so no
You probably won't notice the artifacting while actively engaged in a fast paced game. It's when you go out of your way to find them using flick tests that they become more apparent. Something to note is that lowering the Mode to X2 and enabling Performance Mode (and setting Sync Mode to Disabled) significantly helps latency. While X4 is cool, it's unoptimized and without the performance settings will result in a bad time latency wise
Just like "lo-fi" music can "work" but it can also sound low-effort and cheap. If you want to make something good, the "they won't notice" excuse will only get in your way.
nah, its all i can see, cant even stand dlss, shits fuzzy as fuck, even the dlaa is questionable
Have been using this for quite some time now and I am really happy with it. I personally do not really care about hud elements having errors and the artifacts are also barely noticeable, at least for me. I don't really pay too much attention to finer details. I have a 165hz monitor for almost 2 years now but I couldn't really utilize it. I was mostly stuck on 120-144hz because of performance, even with x2 frame gen of Nvidia (I have a 4080). Since x3 mode I can play pretty much every game at 165fps, so I can finally use the 165hz of my monitor. And yea like you said, with games like Mordhau (competitive game) I do not use it because of the latency. But in single player games that aren't crazy fast paced, 60fps baseline is perfectly fine latency wise.
Only downside is: you can't really go back to something like 60fps. My god it feels like lag 🤣
I remember long ago i used to skip frames on ps2 emulator and up to 2x it worked more or less playable. And now instead of skipping we're adding frames. What a plot twist. Btw in Valheim even AFMF2 feels perfect, with L.S. it would be probably even better.
I experimented with this on my 6600XT with a 170hz 1440p monitor with vrr. It works great and I don’t notice any ghosting or artifacts pop up, but I do want to avoid having them whenever possible, same story with the latency since I play with a controller most of the time. Works best when you maintain a consistent 60fps to begin with
9:44 - bro casually kicking out 3 headshots on pedestrians is wild 😂
This can make your VRAM run extremely hot, like 90C+, so be careful with it. Monitor your VRAM temps on your GPU.
Well it's obvious the higher the frame the hotter the GPU
This has nothing to do with LS, and everything to do with the gpu you have.
@@haelkerzael7589 This "can" make your VRAM run extremely hot.
Should specify that this application is the star of the show for emulation. Emulation, you're looking for a fixed frame rate 100% of the time. That goes great with this application that works best while at a stable FPS as well. VRR doesn't do anything for emulation and can cause slow motion at lower frame rates. Games that lock at 30 FPS aren't great but games that lock at 60FPS like many Mario 3D titles, feel much better with it on.
Imagine wind waker at 120fps
When games close at 30 frames, it seems as if they are at 60 frames, but this is a shame. The more the frames are cut, the more the control becomes heavier and heavier, and this is annoying.
@@thebigcj517 unless you can do run ahead frames like retroarch and combine that with the frame smoothing, since computing power isn't the limitation here
Oh I gotta try this
Guys if your monitor has crosshair overlay in OSD you can use it instead and disable crosshair in the games 🙂
The monitor crosshairs are usually so bad though lol
@@afroize Yeah you are right they usually don't look great, but I think there is probably some software overlay available, like Crosshair V2 on steam.
Only use them for snipers
First DLSS 2.0 dropped and I couldn't believe it wasn't magic. Then Nvidia released Frame Gen and it's almost as impressive. I can't wait for what they are cooking up next. They should definitely be criticized for their anti consumer behaviour but you can't deny the genius of their engineers.
No one's talking about Nvidia and their walled garden here.
Also they've already released a new "magic", it's called ray reconstruction.
It's just like an evolution of injectors back in the day, they just fiddle with settings to make games look better than they actually are. It isn't a replacement for native, and it certainly isn't magic. More like copium for devs inability to properly optimize PC games
@@JABelms bingo.
@@JABelms agreed, the way people are labelling it as magic is insane
Using words like magic and genius bring back bad memories of Apple's language. I dont agree with glorifying these big corporations.
Unfortunately Nvidia is more and more Apple like.
Daniel, congrats! You have now become an educator and a youtuber. So that answers your dilemma that you were facing in one of your past videos. You offer a great easy to understand explanation that many people may not understand, the difference in latency, and the explanation of the values being presented. Appreciate it!
In the end, it depends on your GPU.
If you're GPU is too weak, then it worsening your fps and input lag instead.
U r good if u have anything abovr a GTX 1660s
Eh, this allows me to get close to 60 fps on Hogwarts Legacy and Jedi Survivor, using a 1060....
@@krishnaprasad1570 No he has a point, a lot of people will say that LS doesnt work and it make the performance worse when on. This is due to them having a weaker gpu that is already at max utilization and max vram usage. So when they enable LS they lose performance, while the LS and the game are fighting for resource like vram causing even more performance issues.
EDF6 is another great option for LossLess scaling, since it's limited to 60fps, and the frame generation does help make the game a hell of a lot smoother.
Bro had beef with that rabbit
this thing is the best thing ever made in gaming history, it helps me with helldivers, where I have a 4060 ti gpu and a weak cpu for the game amd 5600x that causing me only reached around 40-55 fps max in that game (while wukong can grant me around 70-80 fps in cinematic) , was thought to upgrade my whole pc except the gpu, and then I found your video, and try it, it works just like magic, as you said frame gen doenst increase response time, but highers fps, makes my own natural response time better XD
Really interesting video Daniel! However when stacking frame gens, you should absolutely use the performance mode, in your frame gen stack example your base FPS post DLSS FG dropped significantly, showing a big GPU bottleneck, please retry this (for yourself, not on video) and tell me if it improves things, it should! Also please lock your base FPS with the in game frame limiter to 59 FPS, so 59 -> 118 with DLSS FG + Reflex, then x4 performance 118 to 472 FPS to be in VRR range for minimal input lag :)
With this it should feel, and look, amazing :)
oh my god.I never knew about this..got it straight away for my emulators and older pc games thank you so much x
In absolutely any game, input lag increases very much if your GPU is loaded at 95-100%, it doesn't matter how much fps you have, Nvidia reflex PARTIALLY solves this problem, but the best solution today is still lock fps. Cyberpunk and Elden Ring showed you this very clearly, it's not about the gamepad and mouse. LSFG works best if your GPU is never loaded at 100% and your base fps never drops below 60.
Been using it for about 2 months now, and I'm halfway through replaying God of War since it first came out on PS4, works flawlessly. Lossless scaling is a must buy gem. Definitely a must have for the chill non competitive games.
I love this application and use it daily. Yes there are some artifacts, often small or dismissable for me. In shooters I won't use it and will perfer any built in game options. Reguardless, this is amazing. It's best used in non-esport type games like for me: Witcher 3, Spiderman, Path of Exile mapping, and best of all youtube/google chrome. Since many videos and movies are uploaded with 32 fps, this makes the videos really feel like 120, and makes me feel like I'm getting real use out of my monitor outside of just gaming. Really appreaciate the work they put into this and making it so simple to use.
Been using this on WoW. And omg...it is amazing.
Set base fps to 60 in the game as that is the lowest i go down to in valdrakken.
Then set to x3 with allow tearing.
Perfect 180fps matches my monitor refresh rate, and doesn't skip a beat. So good.
I recently started using LSFG in Helldivers 2. Works pretty darn well! For better latency, make sure to turn on ULLM/Reflex in your Nvidia control panel. With ULLM off the delay is pretty noticeable. With ULLM on, delay is barely there and easy to forget about.
Baseline FPS on my rig is ~100 or so. With LSFG on it makes it 80-90 baseline with 160+ output.
It only works when you put --use-d3d11 in the steam launch parameter as driver level ULLM does nothing in DX12 (The reason why they came up with reflex and put it into games)
@@bb5307 Interesting. I'm pretty sure I noticed a decent difference between Off vs Ultra. I can double check it though
@@RyanKristoffJohnsonLatest version of RTSS also features Nvidia reflex as the frame limiter. I tried it and it works for games with no in-game reflex option. Try this instead of ULLM.
Quality was great man, especially after TH-cam compression and loss. Subbed
Finally, I can download frames. When can I download more ram?
Theoretically you could use Onedrive as ram.
proberly to young for that meme xd that goes back to the late 90´s early 2000´s
@@aeppikx i know the meme, i just wanted to be a smartass and say people have managed to use onedrive as "ram"
do you know why this program shows 130/250 but in fact fps 10 and a slideshow?
@@dizzy7376 cause your pc is shit.
Haven't played an AAA game from the past several years and my laptop is a budget one from 2019 with 1080p60hz display and a gtx1650, so this stuff is all very new and cool to me. You upgrading this year to a laptop with a 4060 and 3.2k 165hz display, so im stoked to see what it looks like.
Can you compare AFMF latency vs the latency in Lossless Scaling? Never seen that comparison.
The best way to test that would be with a click-to-photon measurement, which I don't own the hardware to do.
@@danielowentech While it is the best way, you can always record at the highest FPS your phone or camera can do and slow it down. It will not be the most accurate in terms of milliseconds but I think most people just want to know which one has the lowest latency. If they can see one is visually faster than the other, it should be enough. Also maybe an image quality comparison would be nice as in try to target a stable FPS (either 60 or 120FPS or both!) and record it. The reason for stable FPS is to guarantee that the recording is good and hopefully get a clean capture. I'm not sure the built in recording feature like from AMD Adrenalin can capture FG, but if it can, you can use that since we are not concerned about the absolute performance. If not, then obviously you can use capture card. Whatever works best for you.
Afmf2 has less latency lossless scaling has more quality
@@Nicx343xAFMF doesn't have less quality. FSR 2.1 has less quality. If you combine both. Yeah. The quality is slightly worse.
Unless you set manually your software + game. Adrenalin isn't newbie friendly.
I've been using tool this for a year, and it really does work well! Wherever there's not support for AMD RSR or FSR, I'll use it. (Primarily Minecraft Java, tbh. Lol) As well as when streaming videos on youtube or elsewhere to bring them to 4K. (Just use a pop out picture in picture for your video, then use the shortcut key to make it scale it up. I use brave browser that has good support for PIP)
Haven't tried frame generation, but I'm not a fan of fake frames tbh, and don't play games that would really require it where it wouldn't hurt me mechanically. Lol.
Mozilla pop out windows are pure garbage and won't work. I will try Brave though.
Daniel, it would be great if you could try LSC with a second GPU for frame generation. From the LSC panel itself, you can select which GPU you want to use. This way, there is no frame loss on the GPU that is rendering the game. I haven't seen any channel doing that.
As someone that loves this software and uses it in most games I do want to advise that it it does drop your base frame rate by about 10-20 fps so if your too far under 60fps it will induce fairly significant input latency, anywhere near or above 60fps and its a significant improvement to smoothness
I randomly found Lossless Scaling by looking for an app that let me scale my old games using integer scaling, which it does beautifully for most games. Slowly they added more features and it's probably the best few dollars I ever spent on any windows app. The frame generation works so well, yes it's not perfect and requires certain settings like windowed mode, etc. But when it's all in action and activated it's almost like downloading more GPU. Feels good.
Excellent video and explanations of how it all works, good job Daniel.
I wish ths focused on quality of frame generation instead of quantity of frames. The x2 is amazing but still has flickering issues (especially with 3rd person games)
That is an odd statement. Increasing the number of frame inserted is by far the easy part. The entire task with frame Gen is to accurately predict the frame while not increasing latency.
AMD could easily make an upscaler that look BETTER than dlss.. know why they don't no it? Because turning it on would DEcrease your frame rate instead of increasing it.
Same here.. they could easily make the 2x be Far better quality but the hit to base performance would be so much that it is not worth using. There is a limited amount of resources that these programs have to work with as well as limited data to work with.. only the pixels on your screen. That is it and they do an excellent job with that limited info. You're saying they should be working on the really hard issue instead of offering the really easy stuff like 3x 4x. Doesn't really make sense. Everyone likes the option to do 4x even if just for the fun of it.
@@ryujigames2509 that’s what Nvidia frame generation is
@@BourbonBiscuit.explain that?
@@waltuhputurdawayThe only way to get better frames than Lossless is to use raw data that comes directly from the game engine even before the frame is rendered completely. Motion vectors is the big one people talk about but there is other useful data as well. Sure, Lossless with get better but it will only get a bit better. By time gpus are strong enough to be able to use a significant more resources to generation frame, there will be ai driven framegen in every game for every GPU. With Lossless.. there is a limit to how good it can get.
@@waltuhputurdaway Nvidia FG image quality has come a long way, instead of just multiplying frames more
Wow. I have used this in other places like Age of Empires and TH-cam, this is a very powerful program. Great recommendation.
2x is for 120hz gaming.. 3x is for 180hz and 4x is for 240hz.
I have been enjoying dlss frame gen so so much on cyberpunk and the witcher. Going from 65 to 110 is absolutely worth the increase in latency that i cant seem to notice on a controller. I hope fsr and dlss offer a 3x or 4x soon. Amd needs to take some notes from lossless to better afmf. I lost my 6950 xt before i used the second version but i can say with condidence that the first iteration of afmf was absolute TRASH. Glad amd is making some moves.
This is a game changer for pascal series gpus. I have an old pc with a 1070 and this will give it new life.
didnt watch the whole vid but if you want to demonstrate the smoothness cant you record at 240fps, and play it at 0.25 speed on youtube to display each frame. Obviously people who have 240hz know its smoother, its just seeing the quality of each frame and we can picture it back together at full speed.
Agreed. I think he should record it at 120hz and then slow it down to 1/2 and upload it as that. And then those with a 120hz screen can run it at 2x and they will get the real 120hz video.
Wouldn’t the algorithm crush his stats if everyone watched a large section of his video at 4x speed though?
@@spencer4hire81 Everyone can not watch his video in 2x and get the benefits from seeing it in 120hz. Only those with 120hz screens. I pretty much always watch in 2x anyway.
Sure he gets a less watch time.
But so what. The whole point is he is showing of high frame rate video.
And this is a technique he could use to improve, what his audience will see compared to what he sees and not be limited to 60fps TH-cam videos.
It's worth mentioning that frame generation will use extra gpu power, and this will cause you to lose "real frames." So if you were getting 60 real fps, you might get 50 when FG is on. Personally I wouldn't bother with FG if you can't keep more than 60 real fps because the input delay will just be too much
If only this worked in VR
One place Lossless scaling has been amazing has been emulation where games had fixed framerates. For example FF8 had 15fps battles (vomit) but since they are menu driven applying lossless scaling makes it look so much better.
FG will never fix how laggy it feels but if you are getting a headache from how it LOOKS then it can help immensely.
Regarding 10:10 when Daniel enables LSFG x4 on top of DLSS3 FG.
He's attempting base 40, DLSS3 FG doubled to 80 and then quadrupled to 320. But he's not locking a perfect 80/320 at all times, he's doing 74/305 or smth. That's due to the GPU being maxed out. If he'd drop some settings so the fps would be 100% locked to 80 / 320, the input lag would be significantly better.
Talking as someone with a 7900 XTX that played Cyberpunk Path Traced using base 40, FSR3 FG to 80 and then LSFG x3 to 240 fps.
How can you use FG in Cyberpunk?
Do you mean afmf?
@@chacharealsmooth941 Either LukeFZ mod (Patreon) or DLSS Enabler (Nexus). I mean FSR3 FG, not AFMF.
@@raresmacovei8382 Gotcha. Used it myself and it works, though the quality is not perfect.
@@chacharealsmooth941 well ya theres trade offs, otherwise no one would ever buy a new gpu, although maybe in the future fg will be so good that it wont matter anymore
I think, you mentioned in earlier videos, that capping the base frame rate can limit latency, which you did not mention in this video.
I used this with a RTX2060 to play Black Myth Wukong with 120 fps on 1440p high settings with dlss on 50% (40% in chapter 3) to get 30fps capped base frame rate and it played wonderfully. Not capping the framerat introduced latency spikes.
Daniel can you please try using Lossless Scaling with the RX 7600? Its crashing allot for me and im not the only one. There's a whole thread on the official discord but so far nothing has changed with each update :(
i wish nvidia will make 2x 3x 4x framegen for 50-series. also enjoy amd as always 🗿
Its crashing on my 1660S as well
I run it with 6600 XT no issues.
@@MiGujack3 Its RX 7600 and RX 580 issue. My PC blacks out and shuts down while using stock settings without any overclocking or undervolting. The GPU works fine for gaming and there's no issue but when I use Lossless the PC crashes and shuts down. I'm not the only one as i've found posts relating to this issue on the steam page, their discord and reddit. Just hoping to see if Daniel can figure it out because i wanna use this thing.
@@amzgamingxtry upscale from 1440p to 4K this will fix your crashing maybe
For those that don't know Frame Generation Version 1.07 is the best version of FG. You can manually download it and put it in your game files. That version is the most stable one.
I'm surprised Nvidia and AMD didn't do this first. Maybe they were saving 3- and 4-frame interpolation to pretend their _next_ generation is worth what they'll be asking for it.
3x or 4x in shit tecnology
Best video looking for info like this for a long time
I can finally play my latest DooM at 1840 fps
But is that enough? 🤔
Do you also have a 1840hz monitor?
@@JanM2he is the first person to get the new neuro-vision from neurolink. There is a display port is his brain so yeah. He fidna own.
🤖 I'm singing the Doom Song!
🎵 Doom doom doom Doom! 🎶 Doom Doom Doom! 🎵
@@JanM2 I personally have a 1841 HZ monitor, but I'm locking it down to 1840 cause my eyes can't see past 1840fps.
A more granular 1.25 or 1.5 mode would be nice here.
A perceived jump from 30 - 40fps is significant enough honestly.
I'm thinking this would be a good application for power-limited devices or lower spec hardware... Just enough of a bump to meet that good playability range; but not as much of a hit in terms of visual anomalies since the frame gen would only occur every 4th or 3rd frame.
I just tried this on Helldivers 2 and finally hit 240fps on my 240hz 1080p monitor! Felt super smooth but the latency was definitely noticeable. I prefer x2 and x3 mode with perceived latency practically non existent with x2 mode as long as you have right settings switched on.
Can't believe he made a video on LS update that dropped yesterday but still no AFMF 2
With my 3070, it's either this or FSR 3.0 modded frame generation. And at the moment, this seems better.
This is better cause it's much easier to use. Just keyboard shortcut and no need to install any mods that might as well glitch the game out
@@damara2268 I have Lossless Scaling. And I have used FSR 3.0 mod on Cyberpunk 2077 for example. Being easier does not matter to me. What matters is there a mod or not. And the image quality, how much the base frame rate drops etc.
@@mikalappalainen2041FSR 3 much better, tested in TLOU . Lossless Scaling in this game works like shit and burn gpu a lot, Fsr decrease gpu usage, temp, give more stable fps and decrease latency much better. Only one reason to use lossless, game without Fsr3 generation
@@mikalappalainen2041 lossless scaling is better than the FSR 3.0 mod? is it cuz of more fps or image quality?
it works very well now even when frame rate is not an even 60 tried it on the FF XVI demo and it made it still very excellent on my rtx 3060 12gb impressive software and tech this is. To add everything maxed and DLSS 3 with quality preset and lossless framegen.
I think you're wrong about something, x4 right now does have image quality issues but x2 LFSG completely destroys FSR3 framegen in term of image quality, there's no comparison, the only advantage FSR3 has over LSFG x2 is that FSR3 skips the HUD entirely but that leads to other issues as well. FSR3 barely create an actually convincing/correct/stable interpolated image right now.
Used this for Dragons Dogma 2, it made it such an amazing experience, highly recommend it for some titles.
I tried Lossless Scaling X4 earlier on Trepang2 a super fast game and noticed no input latacy at all and i found i died less and played better with the fluidity absulutly amazing i would rarther use this than Dlss frame generation the end of the gun gargled slightly but on a faced paced game not notale what so ever.
Thanks for the info its something I heard about a few months ago but never looked into it
You should have tried dlss 3 frame gen with LS x2 rather than x4 to see if the latency penalty is as heavy
bruuh it is not that simple , you cant use 2 fg together or it will give you bad frame pacing and quality and also bad latency
LS x2 and x4 latency should be the same. It is still a frame behind.
What I was thinking. I'ma try it in cyberpunk when I get home from work lol
@@cajampa it is not the same at all , you can test it even without numbers the diff is so obvious , also x4 will eats alot of base fps so you got that too
@@christophermullins7163 there is no coordination between real and generated frames if you use both of them at the same time it will be bloody mess my man it is not just math x2 x2 =4 , enjoy the mess lol
I am loving Lossless Scaling, and yes a controller is the way to go. With less jerky movements, regardless of the Frame Generation technology you use, a controller will allow the software to present a far better and more consistent looking picture currently possible over a kb&m. Try using a controller and you'll notice a big difference when bumping up frames. I've yet to see how the new X4 and G-Sync update feels, but latency is just one of those things that will not feel good for most, depending of course on the game you are playing. I recommend sticking to 2X frame generation, if you can, to reduce latency. I just recently made the switch from 30+ years of console gaming to pc gaming about 8 months ago, and witnessing frame rates almost double that of a console, without the use of scaling of any kind, is still an incredible experience to me. Knocking a games frame rate up any higher is kinda just a bonus for me. Plus, I still prefer to play my games on my 55" LG C1 OLED TV which only supports 120Hrz, so all I'm ever going to see is 120fps, and am completely content with it. There is no way I can't just go from a large TV to a smaller pc monitor. It ain't gunna happen. Maybe until new LG TV's hit the market that support 240Hrz, is when I'll upgrade. For people chasing FPS gains, the way I see it is this... When it comes to the vast majority of gamers out there, 90% to 95% are playing on consoles, at much lower frames, who are still enjoying their games at 30fps to 60fps. Rarely a 120fps on console pops ups, and when it does, is a heck of a treat. My point? Rushing to the store for more frames, when what you've got right now is better than what most console gamers are experiencing. When you look at these new GPU's, you also have to look at a new monitor, which together is outrageously expensive. On top of that, even the most incredible pc games released today are still struggling to hit a Native 4K at 60fps to 120fps, and that is where DLSS, FSR, and now Lossless Scaling comes into play, to save your wallets from minimal gains to fps. The problem with new hardware-based frame gen technology, in these new GPUs, is that it only supports very few new games currently. I think like 95% of them are sponsored games, and who wants to upgrade a GPU to play 2 or 3 new games every year, that supports this frame gen technology, while also hoping to that those games are good? Go for it if you have the money, but if you don't, just know that you are already sitting pretty, as a PC gamer as is, over the vast majority of gamers. Don't let these GPU companies try to fool you into upgrading, and use Lossless Scaling for now, until we start to get more games that include frame generation. What's funny, is that Lossless Scaling outperforms FSR 3 frame gen technology. I've seen side-by-side comparisons of FSR 3 frame gen versus Lossless Scaling, and while FSR 3 frame gen has its perks, Lossless Scaling still looks better. Ai frame gen software is exactly what people need right now, who just do not financially have the means to purchase the latest and greatest in graphics technology every year, when Lossless Frames is a hair away from the competition. A software based GPU upgrade for only $7 bucks. I don't know if Lossless Scaling has this feature or not, but if the devs read this, I'd like for them to add an option to turn off the fps counter if possible, and replace it with like a small X2, X3, or X4 watermark to indicate that the software is doing it's thing. While looking at FPS counters all day can be fun, I'm just not personally a fan of fps counters in my games, because I find it a bit distracting. I don't really care how many frames its generating, when I can clearly see that it's working based on how I set it. The only time I care about the FPS counter, is when I'm dialing in my settings initially to cap the frame rate to a consistency that the software requires, in order to for Lossless Scaling to produce stable additional frames. From that point forward, using the Lossless Scaling's fps counter doesn't really mean much to me. If that makes any sense. The program does what it needs to do, and it does it exceptionally well! Just a thought I'd throw out there, but no big deal. Great video as always Daniel! Cheers y'all!🍻
Losless scaling is legitimately great, but I've found the most use out of it on TH-cam and with emulators. Real frames are always preferable, interpolating past frame caps is where it's at
I have a rx 570 4gb paired with i5 2400... It runs elden ring at 1080p high at 30 - 40 fps (cpu bottleneck) most of the time, if i cap at 25 frames and use 3x mode of lossless i get 75 frames (my monitor refresh rate), it has artifacting and the input lag is noticible but its waaaaaaaaaaay more stable and enjoyable for me, i can even play it with a higher resolution! This program gave so much life to my pc! :)
Cyberpunk citizens: Ooh sh!t it's frame generation tests time! 😅
Been using AMD's frame generation on quite a few titles lately and it's also very good. Double the fps with a click of a button is crazy cool.
Does it work with Vsync ?
It didnt for me, id recommend turning it off
Do not use vsync with lossless, limit fps manually
@@MiGujack3you will get tearing otherwise. Enable Vsync from NVCP. I had no severe latency issues with vsync + loseless for games that are not competitive. But I am using a controller which is recommend
I think the endgame use of this is using the highest quality/least artefacting modes stacked on top. So DLDSR stacked with Quality DLSS stacked with x2 frame generation will have more input lag sure, but you could take a game from unplayable to solid and banish aliasing to boot. I think combining with Special K framecapping there is some potential to reduce felt input lag and improve stability even further, and in a few key situations it really gets going. I think that once these tools are mature the next step is developing a system that dynamically uses tools like these together to pull out a close-to-ideal configuration automatically, which makes fiddling with individual settings even easier as you're closer to the desirable endpoint when you start messing with stuff. Frame-generation with frame-capped games benefits especially, as you pointed out - it's a way of increasing smoothness and providing high refresh rate compatibility without an egregious downside.
Guess I gotta pick up lossless scaling.
This tech seems a bit pointless to me. Considering the main reason why I want more frames is for my mouse to feel smoother (lighter).
In fps games sure but for stuff like Elden ring, skyrim, basically anything that isn't a fast paced game it's fine
@@MrSuvvri but what's the point if the mouse isn't smooth?
Idk, when I'd want to use it is when it sucks the most so this quite literally seems no good
Yeah I understand where you are coming from. Because I don't really find it useful in shooting games. Almost unplayable.
I use it for livestream and videos tho, almost all the time. Movies, discord streaming. All this makes huge difference.
It will feel smoother.
it makes sense for games like cyberpunk that are hard to run but you want them to look nice and aiming doesn't matter too much
This app is now my default method in playing AAA games. Mostly because my 2060 is pretty much outdated now with all these new games having intensive graphics.
This can also help me achieve higher graphics settings and favorable performance at the same time too. For example, normally i wouldn't be able to achieve a stable 60fps on High settings in games like Black Myth Wukong. But with this app, i can just cap the frame rate at 30, turn all the graphics settings to High, set LS on 4x and casually enjoy the free 120fps/High settings.
Sure, the input lag will be there and can get a bit annoying at times but that's way better than playing at an unstable 60fps that keep dropping to the 50s or even low 40s in some area. Watching movies with this is a blast too, if you are fine with anything more than 30fps in movies and it works really good with movies that heavily use CGI like MCU movies.
Im having so much value from this 7$ app than i ever had from spending hundreds or even thousands of dollars upgrading my hardware.
X6 avaible but only on rtx 😀
*Rtx 5000 series
@@Games_and_Tech best RTX 6090 😛
What a dumb comment. 😂 Love it
u must pay lootboxes and get the legendary X5 mode with a drop rate of 0,000000000x10-^432149x=z²+x/69, after that you need the battlepass (only 1000$ right now it's almost free ngl) and only then you must have 100 000 hours on loseless scaling, and make a 2000$ dollars payment to finally get X6, but it's only avalaible to those who own a rtx 10090 ti super ultra (which cost around 15k$ in 2050)
@@nawabifaissal9625 nah 3080 and good IQ :)
next gen consoles will be insane.
cool story bruh. consoles struggle to get 60 fps in 2024. tell me how they will become insane in next gen? by finally getting 60 fps in 2027-2028? 🤣🤣🤣🤣🤣🤣🤣🤣🤣🤣Final Fantasy 16 and Forspoken drop resolution to 720p and upscale to 1080p and still can't hit 60 fps. consoles will always be inferior to PC.
I tried Lossless Scaling but refunded because it just made every game screen tear like crazy. I played with the settings a lot (VSYNC, Framerate limits) etc but no luck.
Some games gave me more issues than others with LS. I wonder if by tearing, you're actually seeing artifacting. The X2 mode is least likely to artifact and tear and make sure you don't have in-game settings and LS settings fighting each other. I had to play with in-game settings, LS settings, and my driver settings to get an optimal experience but once I solved it, it works great.
What was your fps? Have you enabled vsync only from Nvidia panel? Are you using RTSS properly? What monitor refresh rate have you set?
Or maybe the game you are trying has problems on its own that only the developers can fix or check the internet for stuttering fixes?
I can help you with this. For me loseless scaling is working perfectly with all older games, emulators.
really liking this channel
As a teacher you shouldn't compare Apples with Oranges, you know the scientific method. So if want to compare latency and image artifacts, you compare DLSS FG to LS 2x and not the 4x. You should only compare de LS 4x with the LS 3x and 2x, because it's only one changed variable the number of frames inserted on the same upscaler. So I have to vote negatively for your video, I could voted positively if you had compared Apples with Apples.
Been using it lately with some emulators and it's goddamn magic.
I've also been using with iRacing to get more consistent framerate and there's a slight delay but completely playable.
Seen a lot of lossless scaling videos lately i wonder if its an ad
No it is not it is fantastic
Thanks for video.wouldnt known about this😊
Nobody would use x3 because of artifacting+latency, doubt this will be different. G-sync support is massive though
Afaik latency on x3/x4 should be the similar to the x2 latency because it's still only running 1 frame behind
@@Game_Crunch but have you not got 3 injected frames for every 1 that that you do not have control of, therefore latency?
@@Game_Crunch and if your capping your frame rate at 4th or your target rather than half will that not introduce more latency?
@@BourbonBiscuit. what he is saying is that no matter if you are generating 1 or 2 or 3 frames the latency is only as long as the time it has to wait for a the second in game frame. so all 3 should have the exact same latency but from some reviews Ive seen the 3x has less latency than 2x (something having to do with frame pacing or idk I aint no tech wizard)
@@BourbonBiscuit. yeah but why would you do that if you can already hit the target frame-rate from 60 to 120
going 30 to 120 would be both very stupid and unnecessary
but going from 60 to 240 should feel the exact same as 60 to 120 when it comes to latency
Dude hell yeah i bought this like 6 years ago awesome program
i did not like it at all
Works wonders for pre NVIDIA and AMD frame gen games, older titles. Doom Eternal at 240 fps was amazing haha
I miss raw Performance
This is awesome it's like the introduction of turbochargers in engines
This is the future now old man
Go back to Bronze Era
Agreed. Ignore the other comments. People acting like frame interpolation is “performance” are either lying to themselves or don’t understand the technology at all. This is a cool way to get “smoother” gameplay, but it introduces input lag and artificial frames to make it happen. You can’t call that performance. Imaging if your car’s speedometer doubled the speed it showed you, so while driving at 60MPH it showed you 120MPH. Is that performance? Of course not. This isn’t anything like adding a turbo to an engine- overclocking would be that equivalent.
@@nintendoconvert4045 But this is actually giving you more frames per second, it's not just SHOWING more FPS like your speedometer comparison. If it said 240fps while actually getting 60 fps this app would be getting entirely shit on by everyone, but it doesn't do that.
If the delay latency is too much for you, you can try set the "Low latency mode" to ultra. This setting can be found in Nvidia control panel and can be pretty useful for some game.
So basically it's a gimmick.
No.
If you consider DLSS a gimmick too then yes it is.
i am currently playing a plague tale: requiem, and let's just say with my vega 64 i would be cooked! BUT using lossless scaling i can play at 1440p high preset at 60/70 fps, with the x2 mode and i don't even see much arctifacts, only when there are low light occlused ambients i see a bit of a shimmering effect on some walls. I love this app, the creator is THE GOAT.
I am playing an bit older single player game BattleTech. It's stratagy turn based. I am using Lossless Scaling to scale from 1080p to the native 1800p panel as well as 2x frames, while using the weak APU. Goes from like 26 fps 1080p to 40fps panel resolution looking noticably sharper and a bit smoother.
I'm living in the future. I'm watching this video with 4x scaling enabled.