Way to overhype ML... which is also a man-made algorithm. The main difference is that, since it uses HW acceleration via tensor cores, you can run a higher quality setting for the same or a slightly lower performance hit. XeSS differs from DLSS in that XeSS does not _require_ fixed-function HW accelerators.
Which is why XeSS performs so much worse on non-Intel GPUs. FSR is literally the only upscaler which doesn't rely on ML at this point, and it's phenomenal how many people can hate on it, when clearly all 3 upscalers have their own pros and cons.
Intel not only started shredding the low-end price-to-performance segment but also helped their competitor: AMD GPUs now get fidelity and speed near DLSS. Really makes me interested in Battlemage and just what they've got cooking.
wonder how these upscaling technologies would look on Intel's new Arc integrated graphics. Considering buying one, so hopefully someone covers that as well!
If only AMD did what Nvidia does, they'd probably be more competitive than ever... It's not that I think they can't do it; it just feels like they're not interested and want to do everything on their own.
I don't use upscaling because I don't like what each frame looks like when things are moving, such as when panning. They are not as sharp. That is because temporal upscaling loses its accumulated detail while you are moving, so what is displayed is closer to the lower base resolution. Though the frame rate is higher, it is softer because of that lower resolution. This is probably not a problem for those who use motion blur. I don't. I prefer to see sharp frames even if it looks like a slideshow. But to be fair, my frame rate at native is above 90 fps at 4K.
nowadays upscaling gives the same image quality or even better in games, at least if you use the DLSS and XeSS quality modes, just because of how bad TAA and TSR AA look in modern games, especially at 2K and 4K.
Intel is really surprising me lately. I might just get one of their cards next. I wish they'd get their CPU game together though. The value just isn't there for gamers atm.
Did you take into account that Intel is changing their quality levels? They are adding more options and changing the resolution that each option renders from, so a given quality level won't be the same resolution going from 1.2 to 1.3. Quality in 1.2 will be something like Ultra Quality in 1.3, or maybe it's the other way around, I can't remember which way it goes, but I do know they are 100% adding more upscale resolutions in 1.3.
Vex, what do you use to capture the image/video? What's the codec and bitrate of the captured video? I'm asking because, when you make such a comparison, it's important to be as close to the source as possible, but those captured videos look kinda low quality, to be precise low bitrate. The video up close is blocky, showing compression artifacts and general loss of quality :(
I'm a lil confused. So do you have to set your resolution lower? Or does XESS lower it for you?? Like if I have my PC at 1440p do I have to lower that before using XeSs? Or leave it the same and Xess will do it. Cheers
As for games that don't have upscaling at all, you could try Lossless Scaling, as they made their own machine-learning (AI) upscaler and frame gen in their app. I couldn't test it much as I have an i5 4570 paired with an RX 5600 XT (ik, a terrible combo, Bosnia still had the high prices from mining). I recommend you maybe try it out in one of your videos (the app does cost 7 dollars). Also, Intel be out here delivering on what they promise and beating AMD to implementing AI lol.
14:10 Your stats are just as valid as the official ones. I'd argue Intel is actually doing a better job of communicating the difference. I can intuitively tell that the difference between 1.3x and 1.5x is less than 20%. Probably more like 15% if I had to guess. But for 77% -> 67%? I am fooled into thinking that it is actually only a 10% increase. And it really isn't. And wouldn't you know it, it is 14.9%. I think Intel isn't just marketing; it is simply the better way to show these stats.
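(A quick sanity check of that math, as a rough Python sketch — it assumes the 77%/67% figures are the per-axis base resolutions implied by the 1.3x and 1.5x factors, i.e. roughly 1/1.3 and 1/1.5; that reading is an assumption, not something stated above:)
ratio_gap = 1.5 / 1.3 - 1    # ~0.154 -> ~15% more base resolution per axis going from a 1.5x to a 1.3x factor
percent_gap = 77 / 67 - 1    # ~0.149 -> ~15% as well, matching the 14.9% figure mentioned above
print(f"{ratio_gap:.1%} vs {percent_gap:.1%}")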
I feel the real benefit is upscaling from 1440p to 4k, as dlss quality makes for a great image but fsr 2 does not. xess 1.2 didn't offer a decent enough uplift of performance but with 1.3 you can now run 4k with great performance.
And yes, every game I play regardless of intensity or fast pace will be played zoomed all the way in and at 1 frame per minute so I can see how good my upscaler is. Vex and Tim from HUB are speaking the same language. It is also my belief that if all these games are played at normal speed without being zoomed in it would be near impossible to distinguish the difference in quality using any of the three available upscalers. I will never purchase an nGreedia GPU.
Not actually too sure about Nvidia deliberately creating ICAT for marketing DLSS. They made LDAT for comparing latency figures, so making something to easily compare image quality in general (not just for upscaling) seems pretty logical for capturing and syncing up graphical settings comparisons (which I think is really what they were pushing this for, especially for real-time ray tracing).
12:11 no... It is. There are a lot of GPUs that can only use AMD's FSR and not even XeSS. Remember the Shader Model 6.4 requirement. The oldest Nvidia architecture to support those shaders in hardware is Maxwell; you need a GTX 750 Ti or 980 to access XeSS. A GTS 450 can at least use FSR. XeSS is just the middle ground in terms of hardware support and quality. And last time I checked... I'M STILL ON TESLA 2.0 GRAPHICS SO I DON'T GET $H!T.
As an Intel Arc A770 16 GB owner I can say it's awesome! I went from an old GTX card using FSR to the Intel one using XeSS, and the image quality is a lot better. Now they just need to make better Linux drivers.
as a person that lived his entire life with a potato and now recently built a pc with a gtx 1650 on it, i can confidently say that i dont see a fcking difference
I think you should have used XeSS 1.2 and 1.3 at settings that upscale from the same resolution too; comparing them the way you did, you can't really make a fair comparison of the image quality improvements.
Most non-gaming ML algorithms run on (GP)GPUs, and before that, CPUs were widely used (and still are for some stuff). So it shouldn't be shocking that DLSS's special sauce is actually software, not tensor cores. Those ASIC blocks make it faster and more energy efficient, that's all.
I will be happy if they implement their own DXR too. Comparing 1.3 and 1.2, I noticed better contrast between light and dark surfaces. It's better, but not noticeable to everyone, because it's only a shade or two of difference.
I was already using XeSS with my RDNA2 GPU in Cyberpunk: better 1% lows and visually identical to native, compared to blurry FSR. Now it's gotten even better, but people on other tech channels still don't believe me about XeSS > FSR on a red team card... Open your eyes, gents!
Feel like using the Performance setting for upscaling isn't the best comparison, as it's such an aggressive option. It's only really used by people who can't get the FPS they need. I think Quality is the gold standard, and a lot of people will use it even if they have enough FPS, just to get the extra smoothness of a higher frame rate.
Can you compare the visual differences between XeSS's DP4a and XMX instruction paths? By using an AMD/Nvidia card and an Intel card running XeSS at the same resolution.
23:00- small correction. Meant to say you’re getting about the same quality of upscaling at a lower base resolution. Which basically leads to free performance (in most cases).
XeSS reviving potato GPUs?? You gonna be using it???
Using vendor-specific tools to do image comparisons between competing vendors is questionable. We already know that, for example, FSR and XeSS look worse on Nvidia RTX cards than on AMD (FSR) and Intel (XeSS). There are hardware-vendor-agnostic image quality tools used in the movie industry that remove subjective comparisons.
@@jonothonlaycock5456 FSR looks the same on any card as it has no special version... There is literally nothing AMD cards do extra. Unlike XeSS however
There's one small detail. XeSS isn't open source or cross-platform, as per one GitHub issue in the XeSS repository, meaning it is no different from DLSS and Intel didn't "give DLSS to everyone" as the video title claims. Also, I question the amount of overhead this feature will add to cards without AI accelerators, because there are no free lunches. Any type of real-time resolution scaling adds overhead, which can only be made imperceptible if your hardware is capable of doing 4-6 times the same amount of computing, in which case there's no need to use scaling in the first place.
@lilpain1997 I am well aware it has no special version; it and XeSS just look worse on Nvidia cards due to Nvidia's long history, going all the way back to the original Riva TNT, of making other vendors' features that have been integrated into games look worse or perform worse. (Other GPU vendors have done similar things on occasion, but not as systematically as Nvidia has throughout its history.)
Can we use it in integrated graphics???
Thank you intel for making amd gpus even more competitive
Fr
i dont get what this means ?
@@randomsaloom7238 Meaning that FSR sucks compared to DLSS and was kind of a reason why some people still preferred Nvidia cards, but now that XeSS is getting this good and is kind of on the level of DLSS, it gives Radeon cards a fair chance to compete with DLSS.
@@randomsaloom7238 XeSS blows FSR out of the water. AMD's own upscaling tech pales in comparison to their competitor's.
AMD: Makes intel CPUs more competitive
intel: Makes AMD GPUs more competitive
intel 🤝 AMD
Fun fact: FSR and XeSS both share a large codebase, and both AMD and Intel contribute to each other's open-source projects. We can thank AMD and Intel for _both_ FSR 3.1 and XeSS 1.3
That's how it should be. It's starting to look like the Apple vs Android war, but in GPUs.
XeSS isn't open source yet though... Intel said open source was the target when XeSS launched, but they haven't said anything about it since that I've read. What's on GitHub isn't the source code, just header files and some binaries.
@@Vantrakter XeSS 1.2/1.3 is not open yet, but 0.4-1.0 are open-source. Just like with AMD technologies, Intel's open-source projects are released once a version is finalized. FSR 3 only opened 2 months ago (FSR 3.1 is still closed atm), where previously only FSR 1-2.2 were open.
FYI, GitHub is only one of 100 places FOSS lives and is hosted, and is also the second-smallest repository for FOSS, as most of us in the software industry don't like Microsoft (which owns GitHub).
How do you know? Only one of them is open source...
Another fun fact: AMD and Intel tend to work this way in general. Due to some VERY complicated history, AMD and Intel have both ended up signing a lifetime agreement allowing each company to freely use the other's patents. How far these rights extend, only the highest-ups at the two companies know, but it means that if AMD makes an innovation, Intel can just copy it, and vice versa.
As for a quick TLDR of the history: basically Intel invented 32-bit x86 and AMD invented 64-bit x86. At the time, both companies NEEDED the other's technology: Intel needed 64-bit for new flagship CPUs so as not to fall behind, and AMD needed 32-bit to maintain most of their mid-range CPUs. Both companies came to an agreement to let each other use each other's technologies. Not entirely freely, mind you; Intel has many things they don't share with AMD, and AMD won't share everything with Intel, but if one company copies the other under this agreement, they can't get sued.
I really hope Intel smashes the gpu market. This looks promising
Only problem is that Intel has backtracked on its promise to open-source XeSS. I suspect they would go the proprietary route like Nvidia if they gained a larger market share. Only AMD is the open-source champion, so I hope FSR 3.1 and future revisions deliver significant improvements.
@@Dark-qx8rk what does it matter when game makers are still using Crapdows 11
@@Dark-qx8rk what's the benefit if intel open source their XeSS?
When they do, they will charge more than Nvidia, because Intel is far, far greedier than Nvidia; history proves that hands down. When AMD fell behind, this is a company that held back innovation for 7 years so they could sell us chips with 3-5% performance increases, all while raising the price by 15% with every new chip, just because they could.
@@arenzricodexd4409 It may allow AMD/Nvidia to optimize their cards for XeSS, or even let them use the AI-accelerated version of XeSS for better quality/performance.
As it stands Intel has kept the best version of XeSS exclusively for their own gpu's which is basically like Nvidia and DLSS.
The other advantage is that it allows modders to add XeSS into multiple games just like they have added FSR3 frame generation that allows any gpu to experience it.
The performance is better because XeSS uses a lower base resolution in 1.3 compared to 1.2.
Not in this game/video. The new ratios are only used if a game natively implements 1.3 or if you manually tweak ratios.
EDIT: Talked with LukeFZ (one of the FSR3 FG mod authors). Apparently just using the new dll is enough to get the new ratios
They really made it confusing in their latest revision. 🙁
@@raresmacovei8382 To get the Ultra Quality mode you would have to have an official implementation. Essentially, XeSS 1.3 Quality mode is the old XeSS 1.2 Balanced mode. Vex did not do a fair frame-rate comparison, which would be XeSS 1.2 Balanced vs XeSS 1.3 Quality.
@@Dark-qx8rk No no, that's what I also thought too. But apparently just the new dll will force the new ratios internally.
@@raresmacovei8382 yes. Only the ultra performance mode and the native xess taa mode need official implementation. The rest are already good to go without official implementation.
I hope we'll eventually find a way to loophole all of Nvidia's features for any type of GPU.
Instead we should be hoping that AMD and Intel can improve their features to be competitive with Nvidia.
@@Goober89 Your statement is actually the same as his.
Why copy NVIDIA when you can do better than NVIDIA?
@@BrunodeSouzaLino Same concept as Apple: they perfect things, and then people go back and copy what Apple did. I'm selling my AMD card off because it's a power hog and can't beat the power consumption of the 4000 series.
@@ChiBrianXIII Tried undervolting? Because for me it worked like magic :D
I just realized something.
If we use Uniscaler to put XeSS 1.3 in any game and customize the settings so that the "quality" setting is actually at 100% resolution... does that mean we can have an equivalent to DLAA/Native AA in every game in the world and remove aliasing perfectly thanks to the AI in XeSS?
I mean, that would make TAA obsolete; every older game without DLAA/Native AA would have a new, near-perfect way to remove all jagged edges.
Test it out and report back
@@NexY92 You can do it in cyberpunk. Download XeSS and switch the files with the original XeSS ones. Then set upscaling quality to DRS, set min and max res to 100, and target fps to 10. I checked via CET, it stays at native res if you do it like this. As for results, it's generally a lot sharper and more stable than TAA, but it also has weird artifacts (especially with rain and thin particles)
Unfortunately not in all the games in the world; unless you use Lossless Scaling (which IMO isn't that good of an upscaling implementation), the game needs to actually have an upscaling option for you to swap in XeSS.
Hell, I tried Uniscaler on RE2 Remake and FSR 3 framegen didn't work.
This would likely work in games that already use temporal solutions. However games that don't will either break completely or you'll need to generate the required motion vectors & masks on the fly, which significantly lowers performance.
What's with all that trickery? Just select XeSS Native AA
Imagine not supporting games that don't perform well because the developers refuse to optimize their games. But no, you'd rather give them your money and deal with upscalers instead... By giving greedy companies your money, you're saying it's ok to do what they're doing. If you stop doing that, they'll be forced to make better products.
The new generation of the PC Master Race don't understand that this should not be a crutch for $500+ GPUs
It all reduces the cost of software development. Does it matter where the savings come from? Maybe not.
XeSS 1.3 still has issues if you compare it to DLSS (as a reference point). In some games the difference between the new XeSS version and the latest DLSS version is still pretty noticeable. But I have to admit, XeSS is very close to DLSS in terms of quality now. One more step (like version 1.4) and it will probably match DLSS, unless Nvidia improves DLSS as well, which is likely.
AMD really needs to act fast, release FSR 3.1 asap, and update as many games as possible with it.
Even if FSR is going to lose, AMD still wins if XeSS wins, so FSR 3.1 doesn't really matter.
@@kPyGJIbIu Well, if we are talking about the performance/quality ratio of XeSS (especially in the case of AMD cards), it still loses to DLSS. So even if you manage to match visual quality, you still lose in terms of performance, and vice versa. In some games the difference is pretty significant.
Yeah, I tested XeSS 1.3 on Horizon FW and it was way worse than DLSS. Test setup:
48" LG OLED TV as PC monitor, 4080 Super, 4K resolution, DLSS 3.7.0 with preset E and XeSS 1.3.
Both on Quality mode have the same FPS, but DLSS looks much better. I could clearly see XeSS using a way lower resolution than DLSS, and the whole image looks way worse.
Compared to FSR 2.2, XeSS has better water, but very-long-distance objects like foliage shimmer like crazy and look worse than FSR 2.2.
@@stangamer1151 There is a caveat to this. If you are buying into the RTX series because of the better upscaling, and the promise of it getting even better, you are more than likely going to be left behind in a generation or two because of nVidia's tendency to lock older hardware from their new software. So, absolutely, comparing DLSS to FSR to XeSS right now, DLSS is on top, IF you have a GPU that supports it. But in 2-3 years, when the "next big thing" in generative tech like this comes along, and you're not ready to spend another 6-7-800 bucks on a new card, the (currently) inferior tech might be the best you can get after it itself gets updated.
So your statement is entirely valid. *For now*, but we all win, even if XeSS and FSR are always a step behind DLSS as long as they keep improving.
@@kPyGJIbIu Everyone wins with this... even those on Nvidia GPUs, as you get more options to choose from.
Really funny how AMD and Intel care more about old Nvidia cards than Nvidia themselves.
It's a free way to get data and improve the training of their models.
I think it is more about developer support. Most developers won't bother adding software that supports 5% of the market, but if it supports 90%, there is more of a chance for your technology to reach wide adoption.
Intel does not care about you. When AMD fell behind, this is a company that held back innovation for 7 years so they could sell us chips with 3-5% performance increases, all while raising the price by 15% with every new chip, just because they could. It's worse than Ngreedia.
Interesting, can a 4090 use DLSS + FSR together?
Glad to see other solutions like xess getting improvements. Considering how young xess is, this is honestly really promising! My only issue is that devs are using this tech as a crutch for "playable"
Ok so we can use Xess from Intel and FSR 3.1 Frame generation at the same time
God I love the competition
interesting how intel and amd unintentionally work together to beat the nvidia's dlss and fg xd
The reason for the perf increase is as they wrote: both are using XeSS Performance, but Performance in 1.3 renders at a lower res than Performance in 1.2, so the increase in fps must definitely come from that.
Well no shit, he said that like 10 times in the video
@@_..-...--.-.-.-..- lil bros IQ lower than room temp. 🤡. ion gonna watch the whole dogshit video i just clicked on left the comment then left again. too much yap. And if he really did say that then good it means he has atleast 3 working brain cells. but anyways thank you so much for your reply i will proceed to print it out and wipe my shit with it. 😇😇
I've been using XeSS in Cyberpunk on my 6650 XT, and I was shocked to find that the Performance mode of Intel's upscaler looks significantly better to me than "balanced" in FSR. Everything looks more defined and vibrant with XeSS, whereas it feels to me like FSR adds a blurry filter over everything.
AMD's driver level frame gen is fantastic btw - with my framerate locked to 71 fps, I've got Cyberpunk running at PS5 equivalent settings plus all ray tracing turned on except lighting, and I get 100-142 fps with consistent 7-9 ms frame time. Most of the time it hovers between 120-130 fps, but I still decided to cap it near 144 fps cuz AMD FMF produces way better results with far lower latency when the framerate is around 70.
AFMF is great but the input lag is a bit much on some games. I was getting well over 100+ FPS on Alan Wake 2 on max settings with it with my 6750 XT.
Tried out XeSS 1.3 in Horizon Forbidden West and Remnant 2 and it's really good. When FSR 3.1 arrives, using the frame generation with XeSS will be great.
@@iitzfizz It's weird - with me, I don't notice any increase in input lag while using frame gen. There's usually like 20-30ms of frame gen lag listed on my Adrenalin overlay with the feature enabled, but then my overall frame time is usually way better cuz I'm coming way closer to my 144 hz refresh rate. Depending on the game and its framerate, I typically have 26-30 ms of total lag, which is weird cuz it feels very responsive to me. Typically when I've turned off frame gen and experimented with input lag by limiting a game to 30 fps or whatever, it resulted in a much lower frame time than frame gen's 25-30 ms yet felt WAY worse. I dunno what the deal is with that, but it feels counterintuitive to me. Weird.
I may be wrong, but AMD was working on making frame gen compatible with other upscalers, right? So we may be able to use XESS with AMD frame gen, that would be amazing
No, you're correct - it should be coming with FSR 3.1
Yup, decoupled FG so can use it with even DLSS/XeSS!
I've done Framegen on my 7900XTX. Works super well in SUPPORTED games. Can be issues in unsupported games
Realised how good XeSS is recently... With a 6700xt, XeSS is the only upscaling tech that gets me 60fps in cyberpunk without the disgusting look of AMD FSR 3.
If you push the upscaling further, it boosts to 100+ FPS.
Yeah, FSR 3 looks pretty awful in Cyberpunk, but in some games it looks pretty good.
I don’t know about this title. I think I get your point but XeSS doesn’t run the same in agnostic mode as it does on Intel.
The point is that it looks better than FSR even on non-Intel cards
sure, but now AMD card owners (like myself) can use an upscaler that it is not shitty. I'm sorry but sometimes even at 4k quality FSR generates distracting artifacts, I find myself preferring to drop down the resolution to 1440p rather than using FSR at quality. (not for all games, but it is definitely a problem in various titles)
Well it seems like 1.3 might be worth using even if you’re not Intel now because of some massive improvement and the SDK is on GitHub right now. FSR 3.1 was announced right before this. I assume both are worth using now but obviously only one has frame gen at the moment. It’s always good news when any of these gets better.
Looks like XESS 1.3 is a worthwhile mod for those games that haven't updated yet. Nice.
Not really.
In Robocop I compared XeSS 1.2 Balanced vs XeSS 1.3 Quality to ensure they have the same base resolution: 1080p. They both deliver the same fps, but 1.2 has slightly less shimmering. If I set XeSS 1.3 to Balanced, it is a pain for the eyes versus 1.2 Balanced.
So you either put 1.3 on a higher quality level and get the same performance, or you get a worse picture with more artifacts at the same preset name.
The only benefit is that "Quality" in the settings looks better for your ego than "Balanced" xD
@@electrotrashmailbox In the end I guess it's important to have an official release from the devs with proper optimization and so on, but this 1.3 should be better in every case.
Thank you so much for this video! I love these intricate gaming tech comparisons!
Imagine these:
°Moore Threads becomes globally competitive
°Apple somehow joins the CPU/GPU market
°Qualcomm & MediaTek enter the desktop CPU/GPU market
But That's Just A Theory....
"Apple somehow Join the CPU/GPU market" all is good except this.
apple is shit n overpriced, we need more CPU/GPU competition tho
@@Sol4rOnYt we need competition, not monopoly
Keep apple out of this
@@w-lilypad it's not like they'll get monopoly, they'll just sell overpriced shit as always
Thanks for this detailed review. Looks to me like I should run my 7900 XTX on XeSS 1.3, if it's available, instead of FSR.
... until FSR 3.1 comes out ;-)
@igorthelight kinda like still waiting for fsr 3 in cyberpunk?
@@igorthelightby then RX 9000 series is going to be out lmao
@@JustSkram Yep!
I'm going to use this to upscale and use AMD frame gen to get it up to 120fps.
12:08 they beat amd in the encoding as well... shame amd, shame.
This really goes to show that nvidia used DLSS as a marketing product to upsell their cards, dont blame em but this is a BIG W for intel and all gamers
FSR is based on an image upscaling algorithm called Lanczos but it has many significant changes.
That name is pretty familiar to me. I used emulators, and they had upscalers that used that algorithm.
I tried this on Remnant 2 too, but with an RX 580, and my result is way different from what you showed. It runs worse than FSR in both visual quality and rendering performance. FSR on Performance looks better and runs faster than XeSS Balanced for whatever reason. Maybe because I'm using a 768p TV? But that doesn't explain how FSR performs better.
The show and tell editing was really good on this. The highlighting of key text points and comparisons zooms really hammered the points home. Excellent.
Very exciting stuff and optimistic for Intel’s future in the GPU space.
"Intel gave DLSS to everyone"
Proceeds to mostly compare Xess with FSR
Vex is the biggest AMD glazer and Nvidia hater, his titles are just dishonest way too often.
@@Cptraktorn Good to know, I was here to see an honest comparison of XeSS with the other upscalers...
I'll blacklist this channel, thank you
This is not true @@Cptraktorn
@@Cptraktorn This is best for algorithm.
I kinda hope future GPUs will focus on price over performance and make these upscalers as good as they can be, so we could finally play 2K 144fps without breaking the bank and without hiccups.
can't wait for fsr 3.1 and XeSS 1.3
exactly my thoughts.
I have a 4080, but I’m very excited for the competition this brings to the table.
The only thing missing is Ray Reconstruction, but I‘m sure they are already working on it
Maybe in 2027-28, knowing AMD's speed....
You can clearly see that XeSS 1.3 is softer since it's using a lower render resolution and I suspect most people would not want to use a lower base resolution. The only real improvement is the reduction of ghosting and moire shimmer. A proper comparison of any quality improvements would have been to compare the same render resolution which is XeSS 1.2 B vs XeSS 1.3 Q.
the ghosting reduction is crazy, remnant 2 started to look way better once Vex changed it to 1.3. it's clear when you look at floating particles, on 1.2 they were leaving gigantic trails
softer is fine when reshade exists. Getting rid of TAA ghosting is a feat by itself.
also, the chart at 5:00 exists in the video
If it looks better im using it
The blurry ground at oblique angles in the AMD tests is from the fact that they can't do any kind of texture filtering. You're stuck on bilinear unless it's a DX9 game, in which case anisotropic filtering works.
Damn If this came earlier, I would have got an AMD.
2:06 That also means that it can sometimes hallucinate stuff (which could also explain why it sometimes looks "better" than native).
Already tested it on Steam Deck and it looks great. In Spider-Man Remastered i would prefer image scaling with TAA previously bcs of shimmering of any upscaler in the game, but updated XeSS seems to be the best solution rn and with greater performance. (Tbh hair still does not look great, but this time it looks closest to normal on XeSS.)
Excited to see games implement this officially, as I am excited to see games update with FSR 3.1 in the future. I really wish all games that have upscalers would have all three big options (and also TSR in Unreal Engine games too, I like TSR more than FSR2.2).
Glad Intel joined the fray with their GPUs. Still work to be done on their drivers, but between Presentmod and XeSS they're doing a lot of good for all GPU owners.
If I remember correctly, AMD announced AI upscaling technology before Intel. Additionally, Sony is also entering the AI upscaling arena. Therefore, I anticipate that the upcoming AMD FSR will be significant, especially if it operates solely with their NPU cores. They might utilize these NPU cores from the CPU to avoid overburdening the GPU. Will see.
I've been going back and forth in my mind between deciding on getting a 4070 super or a 7900 GRE. It's seriously one of the hardest choices I've had to make in hardware yet in all my years of pc gaming. This video definitely swings me further to the 7900 GRE.
Both cards will last you the entire PS5 and PS5 pro generation. Don’t think about longevity and future proof. The value is much lower for higher end.
Matching console at the cheapest price is all that matters.
18:59 When using XeSS Performance at 1440p, I think the render resolution should be 1440 / sqrt(2.0) ≈ 1018p, and for XeSS 1.3 it's about 953p. I don't think you could count fur strands if the base resolution were really 620p.
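(A rough Python sketch of the two possible readings of those scale factors at 1440p; the 2.0x and 2.3x Performance factors are assumed from Intel's 1.3 release notes rather than taken from the video:)
import math
for name, factor in [("XeSS 1.2 Performance", 2.0), ("XeSS 1.3 Performance", 2.3)]:
    per_axis = 1440 / factor             # factor applied per axis, as DLSS/FSR quote it: 720p and ~626p
    per_area = 1440 / math.sqrt(factor)  # factor applied to total pixel count: ~1018p and ~950p
    print(f"{name}: per-axis {per_axis:.0f}p, per-area {per_area:.0f}p")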
I noticed how the smoke looks pixelated in FSR Quality, while in XeSS it looks perfect.
Imagine using an upscaler to reach a base framerate of 120FPS, then using Async Reprojection along with that. Insane performance improvements.
You are also using the current, older FSR version. You should definitely note they have a new version coming out soon that looks amazing from what we've seen.
I just tried this in Ratchet and Clank on Steam Deck. Hooooo boy that was fun. Night and day different from FSR. No smearing or ghosting really at all. Awesome to see this get more competitive! I just hope XeSS starts making its way into more games.
My main issue with the new XeSS is that while they fixed temporal stability quite a bit, they introduced quite a bit of instability when it comes to the anti-aliasing. The image feels a lot less stable on edges. It might have to do with me putting XeSS 1.31 into Cyberpunk 2077, which natively uses the 1.2 dll, but the image with 1.2 looked rock solid and very stable; apart from the ghosting on fast-moving objects, the edges of objects looked amazingly stable.
I hope they can still tweak it to find the best of both worlds, but it is looking pretty decent already.
It has just dawned on me that my 1080Ti that our 6y/o uses has actually just jumped a building toward being even better than it was for 6 years already. DAFUQ! So I have nothing to do with Ngreedia anymore, still have the GOAT Nvidia GPU that was ever made great by accident and get more out of my preferred AMD GPU? Man / this is the shit
Just wanted to say really good video. Thanks for trying this out and showing everyone
Just why can't AMD do what Nvidia does? AI just needs to be programmed and trained on their GPUs... or is it not that simple?
I'm actually glad that they started to work on upscaling that's not limited to AMD GPUs, and I have an AMD 7800 XT. FSR 3.1 looks promising; I guess it will take a few months until we get games that run it. One big plus will be that frame generation will run without having to use FSR.
I can't help but notice the cat in the background using XeSS 1.3 to reach the depths of a delicious can of cat food. 10:22
I have never attempted to mod a game, but I'm tempted to see if this will work on my 1650 Super....I mean, I hope to upgrade this year, but poverty often has other plans for any money I manage to save...
Very impressive presentation! You knocked this one out of the park!
I just did this swap with Spider-Man Remastered on my steam deck and I'm seeing performance gains over FSR 2.1, thanks for sharing this!!!
Just a couple of notes on XeSS
1) There are 2 code paths (Intel-native XMX and DP4a). The 2 paths can produce quite different results.
2) AMD graphics cards from Vega and below do not support DP4a and thus can't run XeSS; the exception is the Radeon VII, which does support XeSS.
The newer version of XeSS isn't just faster because they improved the algorithm; it gives more framerate because they lowered the base resolution of the pre-existing presets. So it's not a fair comparison to put the 1.2 Quality preset against the 1.3 Quality preset; a fair comparison would be 1.2 Quality against the 1.3 preset that upscales from the same base resolution (see the sketch below).
They didn't do that to be sly; they did it because they improved the fidelity of upscaling from lower base resolutions. It makes sense, but it's still a little dodgy to market it that way. I like that there are more options to pick from, though.
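If it helps, here's a small sketch of why matching presets by name across versions is misleading. The scale factors below are the commonly reported per-axis ratios for XeSS 1.2 and 1.3; treat them as my assumption rather than numbers confirmed from Intel's documentation:

```python
# Rough illustration of why comparing presets by *name* across XeSS versions is unfair.
# Scale factors are assumed per-axis ratios, not values pulled from Intel's docs.

XESS_12 = {"Ultra Quality": 1.3, "Quality": 1.5, "Balanced": 1.7, "Performance": 2.0}
XESS_13 = {"Ultra Quality Plus": 1.3, "Ultra Quality": 1.5, "Quality": 1.7,
           "Balanced": 2.0, "Performance": 2.3, "Ultra Performance": 3.0}

def base_res(height: int, factor: float) -> int:
    return round(height / factor)

# Same preset name, different base resolution at 1440p:
print(base_res(1440, XESS_12["Quality"]), base_res(1440, XESS_13["Quality"]))  # 960 vs 847

# A fairer comparison matches the scale factor, not the label:
matches = [(n12, n13) for n12, f12 in XESS_12.items()
           for n13, f13 in XESS_13.items() if f12 == f13]
print(matches)  # e.g. 1.2 "Quality" (1.5x) lines up with 1.3 "Ultra Quality" (1.5x)
```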
I find it kind of misleading to compare almost-still frames with not much going on. If you test XeSS 1.2 in Cyberpunk during rain at any quality level at 1080p on the DP4a path, the rain turns into lasers: straight lines coming from the sky with little definition of individual raindrops. FSR doesn't exhibit this temporal smearing/ghosting (but of course it lacks in other ways).
Why didn't you mention it's a very old FSR version? Now people will think it's bad 😢
Is it possible to upscale to the same resolution, like from 1080p to 1080p, just to increase FPS? I imagine if someone has a really bad sub-30 FPS PC they could benefit from that to make games playable at least. If not, they should make it possible. Or combine native and upscaling in a new technology.
Wtf?
I have a lot of experience with upscaling. You can't upscale 1080p to 1080p; you upscale from a lower resolution. I recommend buying the Lossless Scaling application to see what it does first-hand. If you have a 4K monitor, you would leave the Windows display resolution on 3840x2160, but the game settings will be completely different from your monitor resolution: the game would run at 1280x720 in windowed mode only. When you turn on Lossless Scaling, that 720p gets turned into 4K. Or if you have a 1080p monitor, your Windows display would be on 1920x1080 and your game would be at 1280x720. I don't do this anymore; I bought a 4K Gamer Pro. It's basically the same thing as Lossless Scaling, but hardware instead of software. If you're interested in a 4K Gamer Pro, you need a 4K monitor, and all your games must be in 1080p for it to work.
the word "upscaling" is pretty self explanatory
@@handzze7341 It is, but I just meant: what if you could replace the native 1080p with a reconstructed 1080p image instead of going from 720p to 1080p, just so there's less load on the GPU?
@@i3l4ckskillzz79 What's not clicking?
off topic but your bgm choice is always great
Imagine XeSS with frame generation.
Something slightly related: I recently bought a 7900 XTX with a 1440p ultrawide monitor, and ever since, I've had this bug in Cyberpunk that happens if I don't use FSR 3. Basically it leaves a spot on my screen that looks like flashing, dying pixels, only in a small area above V's left hand.
I have a question, man. Did you check what percent resolution scale AMD and Nvidia are using before making these comparisons?
Maybe FSR Performance doesn't use the same base resolution as DLSS Performance. If we're comparing visuals, you have to check that to make it fair.
Who zooms in on paused frames during gameplay? Is it really a thing if you have to look for it?
1:11 Pretty bad example, given it's moving on native and looks great, as you would expect, plus it's moving on FSR, so a lower-resolution upscaled image in motion means aliasing/flickering.
Then with the DLSS shot you hailed, it's not moving at all; of course it won't be as aliased or flicker as much.
I tried it in Cyberpunk. With this I can use a higher sharpness setting without making the game look weird; even at the max value of 1 with a low resolution it looks very good.
I'm not motion sensitive, but when you zoom in on videos with added camera-motion effects inside windows or browsers, it's really jarring. I zoomed out of the video to make it watchable for me :(
That's my only complaint. Love the video so far
*Just imagine they include some sort of hardware acceleration aswell*
Way to overhype ML... which is also a man-made algorithm. The main difference is that, since it uses HW acceleration via tensor cores, you can get a higher quality setting for the same or a slightly lower performance hit. XeSS differs from DLSS in that XeSS does not _require_ fixed-function HW accelerators.
Which is why XeSS performs so much worse on non-Intel GPUs. FSR is literally the only upscaler that doesn't rely on ML at this point, and it's phenomenal how many people can hate on it when clearly all three upscalers have their own pros and cons.
XeSS has been working for everyone in Darktide for a while now.
Intel not only started shredding the low-end PtP but also helped their competitor, letting AMD GPUs get fidelity and speed near DLSS. It really makes me interested in Battlemage and just what they've got cooking.
Remnant 2 now officially supports XeSS 1.3, btw.
I wonder how these upscaling technologies would look on Intel's new Arc integrated graphics. Considering buying one, so hopefully someone covers that as well!
If only AMD did what Nvidia does, they'd probably be more competitive than ever... and it's not that I think they can't do it; it just feels like they're not interested and want to do everything their own way.
I don't use upscaling because I don't like what each frame looks like when things are moving, such as when panning; they are not as sharp. That is because upscaling basically breaks down when you are moving, so what is displayed is closer to the lower resolution. Though it is a higher frame rate, it is softer because of the lower resolution. This is probably not a problem for those who use motion blur. I don't; I prefer to see sharp frames even if it looks like a slide show. But to be fair, my frame rate at native is above 90 FPS at 4K.
Nowadays upscaling gives the same image quality or even better in games, at least if you use the DLSS and XeSS quality modes, just because of how shitty TAA and TSR anti-aliasing look in modern games, especially at 2K and 4K.
@@SimplCup I like it better turned off.
Intel is really surprising me lately. I might just get one of their cards next. I wish they'd get their CPU game together, though; the value just isn't there for gamers atm.
Did you take into account that Intel is changing their quality levels? They are adding more options and changing the resolution each option renders from, so a given quality level from 1.2 to 1.3 won't be the same resolution anymore. The Quality level in 1.2 will be something like Ultra Quality in 1.3, or maybe Performance; I can't remember which way it goes, but I do know they are 100% adding more upscale resolutions in 1.3.
Vex, what do you use to capture the image/video? What's the codec and bitrate of the captured video? I'm asking because when you make such comparisons it's important to be as close to the source as possible, but those captured videos look kind of low quality, to be precise low bitrate. The video up close is blocky, showing compression artifacts and general loss of quality :(
I'm a little confused. So do you have to set your resolution lower, or does XeSS lower it for you? Like, if I have my PC at 1440p, do I have to lower that before using XeSS, or leave it the same and XeSS will do it? Cheers.
As for games that don't have upscaling at all, you could try Lossless Scaling, as they made their own machine-learning (AI) upscaler and frame gen in their app. I couldn't test it much as I have an i5 4570 paired with an RX 5600 XT (I know, a terrible combo; Bosnia still had high prices from mining). I recommend you try it out in one of your videos (the app does cost 7 dollars). Also, Intel is out here delivering on what they promise and beating AMD to implementing AI, lol.
14:10 Your stats are just as valid as the official ones. I'd argue Intel is actually doing a better job of communicating the difference.
I can intuitively tell that the difference between 1.3x and 1.5x is less than 20%, probably more like 15% if I had to guess.
But for 77% -> 67%? I'm fooled into thinking it's only about a 10% increase, and it really isn't.
And wouldn't you know it, it is 14.9%. I think Intel isn't just marketing; it is simply the better way to show these stats.
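For anyone who wants to double-check that arithmetic, here's the quick calculation (my own check, nothing from the video):

```python
# Relative step between the two ways the presets can be presented.
# 1.3x -> 1.5x scale factor:
print((1.5 / 1.3 - 1) * 100)   # ~15.4% larger scale factor

# 77% -> 67% of native resolution (per axis):
print((77 / 67 - 1) * 100)     # ~14.9% - the same step, but it looks smaller as a percentage
```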
I feel the real benefit is upscaling from 1440p to 4K, as DLSS Quality makes for a great image but FSR 2 does not. XeSS 1.2 didn't offer a decent enough performance uplift, but with 1.3 you can now run 4K with great performance.
XeSS is a messy temporal artifacts machine.
FSR 3.1 brings temporal stability without AI shenanigans.
Uh, look at GoT's FSR 3, or any game with grass.
Man everything that gets me more fps is welcome.
What's happening to YouTube video quality? At 1:30 you can't tell that it's grass... Jesus.
And yes, every game I play, regardless of intensity or fast pace, will be played zoomed all the way in and at 1 frame per minute so I can see how good my upscaler is. Vex and Tim from HUB are speaking the same language. It is also my belief that if all these games were played at normal speed without being zoomed in, it would be near impossible to distinguish the difference in quality between any of the three available upscalers. I will never purchase an nGreedia GPU.
You can't really judge it by still images; a lot of the time shimmering will only show up when moving the camera around and in motion.
I'm not actually too sure Nvidia deliberately created ICAT for marketing DLSS. They made LDAT for comparing latency figures, so making something to easily compare image quality in general (not just for upscaling) seems pretty logical for capturing and syncing up graphics-settings comparisons (which I think is really what they were pushing it for, especially for real-time ray tracing).
12:11 No... it is. There are a lot of GPUs that can only use AMD's FSR and not even XeSS. Remember the shader model 6.4 requirement: the oldest Nvidia GPUs to support it in hardware are Maxwell, so you need a GTX 750 Ti or a 980 to access XeSS, while a GTS 450 can at least use FSR. XeSS is just the middle ground in terms of hardware support and quality. And last time I checked... I'M STILL ON TESLA 2.0 GRAPHICS SO I DON'T GET $H!T.
As an Intel Arc A770 16GB owner I can say it's awesome!
I went from an old GTX card using FSR to the Intel one using XeSS, and the image quality is a lot better.
Now they just need to make better Linux drivers.
As a person who lived his entire life with a potato and now recently built a PC with a GTX 1650 in it, I can confidently say that I don't see a fcking difference.
As far as I know, XeSS needs an Intel GPU for better upscaling; is that right? Or can I get this upscaling with my 3080?
XeSS works best on Intel's own cards because those use dedicated AI accelerators like Nvidia's, but it also runs on AMD and Nvidia cards through the generic DP4a shader path.
Happened to pause it at 1:57... lol, it says "LEAROING" on the loading screen instead of "Learning".
I think you should have used XeSS 1.2 and 1.3 at settings that upscale from the same resolution too; comparing them the way you did, you can't really make a fair comparison of the image-quality improvements.
What about lag when upscaling in online gaming, where every ms of latency counts?
Definitely don't use upscaling in that scenario.
Most non-gaming ML algorithms run on (GP)GPUs, and before that CPUs were widely used (and still are for some things). So it shouldn't be shocking that DLSS' special sauce is actually the software, not the tensor cores. Those ASIC blocks make it faster and more energy efficient, that's all.
Isn't there a part missing about the GTX cards? The 4th chapter is called testing on AMD & GTX GPUs.
We're watching pixel upscaling comparison through the compressed youtube video. Crazy times.
I will be happy if they implement their own DXR. Also, comparing 1.3 and 1.2, I noticed better contrast between light and dark surfaces; it's better, though not noticeable to everyone because it's only a shade or two, but it's better.
I was already using XeSS with my RDNA 2 GPU in Cyberpunk: better 1% lows and visually identical to native, compared to blurry FSR.
Now it's gotten even better, but people on other tech channels still don't believe me about XeSS > FSR on a red-team card... Open your eyes, gents!
What a great time to be alive
I feel like using the Performance setting in an upscaler isn't the best comparison, as it's such an aggressive upscaling option; it's only really used by people who can't get the FPS they need. I think Quality is the gold standard, and a lot of people will use it even if they already have enough FPS, just to get the extra smoothness of a higher frame rate.
Can you compare the visual differences of XeSS between the DP4a and XMX instruction paths, by using an AMD/Nvidia card and an Intel card running XeSS at the same resolution?