Wait, Intel just gave DLSS to Everyone! - HUGE XeSS 1.3 Update

  • Published Feb 5, 2025

Comments • 790

  • @vextakes  10 months ago  +192

    23:00 - Small correction: I meant to say you're getting about the same quality of upscaling at a lower base resolution, which basically leads to free performance (in most cases).
    XeSS reviving potato GPUs?? You gonna be using it???

    • @jonothonlaycock5456  10 months ago  +3

      Using vendor-specific tools to do image comparisons between competing vendors is questionable. We already know that, for example, FSR and XeSS look worse on Nvidia RTX cards than on AMD (FSR) and Intel (XeSS) hardware. There are vendor-agnostic image quality tools used in the movie industry that remove subjective comparisons.

    • @lilpain1997  10 months ago  +6

      @@jonothonlaycock5456 FSR looks the same on any card, as it has no special version... There is literally nothing AMD cards do extra. Unlike XeSS, however.

    • @BrunodeSouzaLino  10 months ago  +8

      There's one small detail. XeSS isn't open source or cross-platform, as per one GitHub issue in the XeSS repository, meaning it is no different from DLSS. Meaning Intel didn't "give DLSS to everyone" as the video title claims. Also, I question the amount of overhead this feature will add to cards without AI accelerators, because there are no free lunches. Any type of real-time resolution scaling adds overhead, which can only be made imperceptible if your hardware is capable of doing 4-6 times the same amount of computing, in which case there's no need to use scaling in the first place.

    • @jonothonlaycock5456  10 months ago

      @lilpain1997 I am well aware it has no special version; it and XeSS just look worse on Nvidia cards due to Nvidia's long history, going all the way back to the original Riva TNT, of making other vendors' features that have been integrated into games look worse or perform worse. (Other GPU vendors have done similar things on occasion, but not as systematically as Nvidia has throughout its history.)

    • @Murilo1324m  10 months ago  +3

      Can we use it on integrated graphics???

  • @Efsaaneh  10 months ago  +1309

    Thank you, Intel, for making AMD GPUs even more competitive

    • @Just_An_Ignacio  10 months ago  +27

      Fr

    • @randomsaloom7238  10 months ago  +10

      I don't get what this means?

    • @earthboundonquest  10 months ago  +185

      @@randomsaloom7238 Meaning that FSR sucks compared to DLSS, which was kind of a reason why some people still preferred Nvidia cards. But now that XeSS is getting this good and is nearly on the level of DLSS, it gives Radeon cards a fair chance to compete with DLSS.

    • @riven4121  10 months ago  +69

      @@randomsaloom7238 XeSS blows FSR out of the water. AMD's own upscaling tech pales in comparison to their competitor's.

    • @auritro3903  10 months ago  +158

      AMD: Makes Intel CPUs more competitive
      Intel: Makes AMD GPUs more competitive
      Intel 🤝 AMD

  • @thatzaliasguy  10 months ago  +339

    Fun fact: FSR and XeSS share a large codebase, and both AMD and Intel contribute to each other's open-source projects. We can thank AMD and Intel for _both_ FSR 3.1 and XeSS 1.3

    • @czeszym  10 months ago  +28

      That's how it should be. It's starting to look like the Apple vs Android war, but in GPUs

    • @Vantrakter  9 months ago  +29

      XeSS isn't open source yet, though. Intel said open source was the target when XeSS was launched, but they haven't said anything about it since, that I've read. What's on GitHub isn't the source code, just header files and some binaries.

    • @thatzaliasguy  9 months ago  +1

      @@Vantrakter XeSS 1.2/1.3 is not open yet, but 0.4-1.0 are open source. Just like with AMD technologies, Intel's open-source projects are released once a version is finalized. FSR 3 only opened 2 months ago (FSR 3.1 is still closed atm), whereas previously only FSR 1-2.2 were open.
      FYI, GitHub is only 1 of 100 places FOSS lives and is hosted, and it is also the second-smallest repository for FOSS, as most of us in the software industry don't like Microsoft (which owns GitHub).

    • @etaashmathamsetty7399  9 months ago  +5

      How do you know? Only one of them is open source...

    • @flamingscar5263  9 months ago  +19

      Another fun fact: AMD and Intel tend to work this way in general. Due to some VERY complicated history, AMD and Intel have ended up signing a lifetime agreement allowing each company to freely use the other's patents. How far these rights extend, only the highest-ups at these two companies know, but it means that if AMD makes an innovation, Intel can just copy and paste it, and vice versa.
      As for a TLDR of the history real quick: basically, Intel invented 32-bit x86 and AMD invented 64-bit x86. At the time, both companies NEEDED the other's technology; Intel needed 64-bit for new flagship CPUs so as not to fall behind, and AMD needed 32-bit to maintain most of their mid-range CPUs. Both companies came to an agreement to allow each other to use each other's technologies. Not entirely freely, mind you; Intel has many things they don't share with AMD, and AMD won't share with Intel, but if one company copies the other, they can't get sued, due to this agreement.

  • @FilthEffect  10 months ago  +452

    I really hope Intel smashes the GPU market. This looks promising

    • @Dark-qx8rk  10 months ago  +42

      The only problem is that Intel has backtracked on its promise to open-source XeSS. I suspect they would go the proprietary route like Nvidia if they gained a larger market share. Only AMD is the open-source champion, so I hope FSR 3.1 and future revisions deliver significant improvements.

    • @Blox117  10 months ago  +13

      @@Dark-qx8rk what does it matter when game makers are still using Crapdows 11

    • @arenzricodexd4409  10 months ago  +2

      @@Dark-qx8rk what's the benefit if Intel open-sources their XeSS?

    • @highcountrygrower9984  10 months ago  +22

      When they do, they will charge more than Nvidia, because Intel is far, far greedier than Nvidia; history proves that hands down. When AMD fell behind, this is a company that for 7 years held back innovation so they could sell us chips with 3-5% performance increases, all while raising the price by 15% with every new chip, just because they could.

    • @Dark-qx8rk  10 months ago  +9

      @@arenzricodexd4409 It may allow AMD/Nvidia to optimize their cards for XeSS, or even let them use the AI-enabled version of XeSS for better quality/performance.
      As it stands, Intel has kept the best version of XeSS exclusive to their own GPUs, which is basically like Nvidia and DLSS.
      The other advantage is that it would allow modders to add XeSS to more games, just like they have added FSR 3 frame generation so that any GPU can experience it.

  • @Laundsallyn  10 months ago  +346

    The performance is better because XeSS uses a lower base resolution in 1.3 compared to 1.2.

    • @raresmacovei8382  10 months ago  +71

      Not in this game/video. The new ratios are only used if a game natively implements 1.3 or if you manually tweak the ratios.
      EDIT: Talked with LukeFZ (one of the FSR3 FG mod authors). Apparently just using the new DLL is enough to get the new ratios.

    • @saphyre-l4w  10 months ago  +2

      They really made it confusing in their latest revision. 🙁

    • @Dark-qx8rk  10 months ago  +22

      @@raresmacovei8382 To get the Ultra Quality mode you would have to have an official implementation. Essentially, XeSS 1.3 Quality mode is the old XeSS 1.2 Balanced mode. Vex did not do a fair comparison of the frame rate, which would be XeSS 1.2 B vs XeSS 1.3 Q.

    • @raresmacovei8382  10 months ago  +4

      @@Dark-qx8rk No no, that's what I thought too. But apparently just the new DLL will force the new ratios internally.

    • @RX7800XTBenchmarks  10 months ago

      @@raresmacovei8382 Yes. Only the Ultra Performance mode and the native XeSS AA mode need an official implementation. The rest are already good to go without one.

  • @coolbeans007  10 months ago  +222

    I hope we'll eventually find a way to loophole all of Nvidia's features onto any type of GPU.

    • @Goober89  10 months ago  +55

      Instead, we should be hoping that AMD and Intel can improve their features to be competitive with Nvidia.

    • @Panacea_archive  10 months ago  +30

      @@Goober89 Your statement is actually the same as his.

    • @BrunodeSouzaLino  10 months ago  +28

      Why copy NVIDIA when you can do better than NVIDIA?

    • @ChiBrianXIII  10 months ago  +7

      @@BrunodeSouzaLino Same concept as Apple: they perfect things, and people go back and copy what Apple came up with. I'm selling my AMD card off because it's a power hog and can't beat the power consumption of the 4000 series.

    • @Equleth  10 months ago  +4

      @@ChiBrianXIII Tried undervolting? Because for me it worked like magic :D

  • @MisterKrakens  10 months ago  +136

    I just realized something.
    If we use Uniscaler to put XeSS 1.3 into any game, and customize the settings to specify that the "quality" setting is actually at 100% resolution... does that mean we can have an equivalent to DLAA/Native AA in every game in the world and remove aliasing perfectly thanks to the AI in XeSS?
    I mean, that would make TAA obsolete; every older game without DLAA/Native AA would have a new, perfect way to remove all jagged edges.

    • @NexY92  10 months ago  +22

      Test it out and report back

    • @l3monguy  10 months ago  +31

      @@NexY92 You can do it in Cyberpunk. Download XeSS and swap the files with the original XeSS ones. Then set upscaling quality to DRS, set min and max res to 100, and target FPS to 10. I checked via CET: it stays at native res if you do it like this. As for results, it's generally a lot sharper and more stable than TAA, but it also has weird artifacts (especially with rain and thin particles).

    • @thegamerfe8751  10 months ago  +2

      Unfortunately not in all the games in the world. Unless you use Lossless Scaling (which IMO isn't that good of an upscaling implementation), the game needs to actually have an upscaling option so that you can implement XeSS.
      Hell, I tried Uniscaler on the RE2 remake and FSR 3 frame gen didn't work.

    • @MLWJ1993  10 months ago  +3

      This would likely work in games that already use temporal solutions. However, games that don't will either break completely, or you'll need to generate the required motion vectors & masks on the fly, which significantly lowers performance.

    • @adamn7125  10 months ago  +1

      What's with all that trickery? Just select XeSS Native AA

  • @Kmcornell23  10 months ago  +17

    Imagine not supporting games that don't perform well because the developers refuse to optimize them. But no, you'd rather give them your money and deal with upscalers instead... By giving greedy companies your money, you're saying it's OK to do what they're doing. If you stop doing that, they'll be forced to make better products.

    • @slimal1  10 months ago  +4

      The new generation of the PC Master Race doesn't understand that this should not be a crutch for $500+ GPUs

    • @Awesomes007  9 months ago

      It all reduces the cost of software development. Does it matter where the savings come from? Maybe not.

  • @stangamer1151  10 months ago  +84

    XeSS 1.3 still has issues if you compare it to DLSS (as a reference point). In some games the difference between the new XeSS version and the latest DLSS version is still pretty noticeable. But, I have to admit, XeSS is very close to DLSS in terms of quality now. One more step (like version 1.4) and it will probably match DLSS, unless Nvidia improves DLSS as well, which is likely.
    AMD really needs to act fast, release FSR 3.1 ASAP, and update as many games as possible with it.

    • @kPyGJIbIu  10 months ago  +35

      Even if FSR is going to lose, AMD is going to win if XeSS wins. So FSR 3.1 doesn't really matter.

    • @stangamer1151  10 months ago  +1

      @@kPyGJIbIu Well, if we are talking about the performance/quality ratio of XeSS (especially in the case of AMD cards), it still loses to DLSS. So even if you manage to match the visual quality, you still lose in terms of performance, and vice versa. In some games the difference is pretty significant.

    • @HEAD123456  10 months ago

      Yeah, I tested XeSS 1.3 in Horizon Forbidden West and it was way worse than DLSS. Test setup:
      48" LG OLED TV as PC monitor, 4080 Super, 4K resolution, DLSS 3.7.0 with preset E, and XeSS 1.3.
      Both on Quality mode have the same FPS, but DLSS looks much better. I could clearly see XeSS using a way lower resolution than DLSS, and the whole image looks way worse.
      Compared to FSR 2.2, XeSS has better water, but very distant objects like foliage shimmer like crazy and look worse than FSR 2.2.

    • @oneeasterneuropean9299  10 months ago  +12

      @@stangamer1151 There is a caveat to this. If you are buying into the RTX series because of the better upscaling, and the promise of it getting even better, you are more than likely going to be left behind in a generation or two because of Nvidia's tendency to lock older hardware out of their new software. So, absolutely, comparing DLSS to FSR to XeSS right now, DLSS is on top, IF you have a GPU that supports it. But in 2-3 years, when the "next big thing" in generative tech like this comes along and you're not ready to spend another 6-7-800 bucks on a new card, the (currently) inferior tech might be the best you can get after it itself gets updated.
      So your statement is entirely valid. *For now.* But we all win, even if XeSS and FSR are always a step behind DLSS, as long as they keep improving.

    • @lilpain1997  10 months ago  +4

      @@kPyGJIbIu Everyone wins with this... Even those on Nvidia GPUs, as you get more options to choose from

  • @seamuspink9098  10 months ago  +64

    Really funny how AMD and Intel care more about old Nvidia cards than Nvidia themselves

    • @ewoggerts  10 months ago  +11

      It's a free way to get data and improve the training of their models

    • @B16B0SS  10 months ago  +12

      I think it is more about developer support. Most developers won't bother adding software that supports 5% of the market, but if it supports 90%, then there is more of a chance of your technology reaching wide adoption.

    • @Ehren1337  10 months ago

      Intel does not care about you. When AMD fell behind, this is a company that for 7 years held back innovation so they could sell us chips with 3-5% performance increases, all while raising the price by 15% with every new chip, just because they could. It's worse than Ngreedia.

    • @Under-jt7ln  9 months ago

      Interesting, can a 4090 use DLSS + FSR together?

  • @buddybleeyes  10 months ago  +23

    Glad to see other solutions like XeSS getting improvements. Considering how young XeSS is, this is honestly really promising! My only issue is that devs are using this tech as a crutch for "playable"

  • @zmotionfx5819  10 months ago  +16

    OK, so we can use XeSS from Intel and FSR 3.1 frame generation at the same time
    God, I love the competition

    • @SimplCup  10 months ago  +10

      Interesting how Intel and AMD unintentionally work together to beat Nvidia's DLSS and FG xd

  • @Kyzerii  10 months ago  +19

    The reason for the perf increase is, as they wrote, that both are using XeSS Performance, but XeSS Performance in 1.3 renders at a lower res than 1.2's Performance mode, so the increase in FPS must definitely come from that.

    • @_..-...--.-.-.-..-  10 months ago  +12

      Well no shit, he said that like 10 times in the video

    • @Kyzerii  10 months ago

      @@_..-...--.-.-.-..- lil bros IQ lower than room temp. 🤡. ion gonna watch the whole dogshit video i just clicked on left the comment then left again. too much yap. And if he really did say that then good it means he has atleast 3 working brain cells. but anyways thank you so much for your reply i will proceed to print it out and wipe my shit with it. 😇😇

  • @steve9094  10 months ago  +11

    I've been using XeSS in Cyberpunk on my 6650 XT, and I was shocked to find that the Performance mode of Intel's upscaler looks significantly better to me than "Balanced" in FSR. Everything looks more defined and vibrant with XeSS, whereas it feels to me like FSR adds a blurry filter over everything.
    AMD's driver-level frame gen is fantastic, btw. With my framerate locked to 71 fps, I've got Cyberpunk running at PS5-equivalent settings plus all ray tracing turned on except lighting, and I get 100-142 fps with a consistent 7-9 ms frame time. Most of the time it hovers between 120-130 fps, but I still decided to cap it near 144 fps cuz AMD FMF produces way better results with far lower latency when the base framerate is around 70.

    • @iitzfizz  10 months ago  +2

      AFMF is great, but the input lag is a bit much in some games. I was getting well over 100+ FPS in Alan Wake 2 on max settings with it on my 6750 XT.
      Tried out XeSS 1.3 in Horizon Forbidden West and Remnant 2 and it's really good. When FSR 3.1 arrives, using the frame generation with XeSS will be great.

    • @steve9094  9 months ago  +1

      @@iitzfizz It's weird - with me, I don't notice any increase in input lag while using frame gen. There's usually like 20-30 ms of frame gen lag listed on my Adrenalin overlay with the feature enabled, but then my overall frame time is usually way better cuz I'm coming way closer to my 144 Hz refresh rate. Depending on the game and its framerate, I typically have 26-30 ms of total lag, which is weird cuz it feels very responsive to me. Typically when I've turned off frame gen and experimented with input lag by limiting a game to 30 fps or whatever, it resulted in a much lower frame time than frame gen's 25-30 ms yet felt WAY worse. I dunno what the deal is with that, but it feels counterintuitive to me. Weird.
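
A quick sanity check on the frame-time figures in this thread. This is plain arithmetic (frame time in milliseconds is 1000 divided by the framerate), not anything specific to AFMF or the Adrenalin overlay:

```python
# Frame time in milliseconds for a given framerate.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (30, 71, 100, 120, 142):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):.1f} ms per frame")

# 142 fps is ~7.0 ms and 100 fps is 10.0 ms per frame, roughly the 7-9 ms range
# quoted above, while a 30 fps cap means 33.3 ms between frames -- longer than
# the 25-30 ms of frame-gen latency mentioned, which may be part of why the
# capped case feels worse despite the lower reported lag.
```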

  • @JuanSoloWing  10 months ago  +21

    I may be wrong, but AMD was working on making their frame gen compatible with other upscalers, right? So we may be able to use XeSS with AMD frame gen; that would be amazing

    • @iitzfizz  10 months ago  +10

      No, you're correct - it should be coming with FSR 3.1

    • @kamikaze_twist  9 months ago  +1

      Yup, decoupled FG, so you can use it even with DLSS/XeSS!

    • @classic_jam  9 months ago

      I've done frame gen on my 7900 XTX. Works super well in SUPPORTED games. There can be issues in unsupported games

  • @JLMilano  1 month ago  +1

    Realised how good XeSS is recently... With a 6700 XT, XeSS is the only upscaling tech that gets me 60 fps in Cyberpunk without the disgusting look of AMD FSR 3.
    If you push the upscaling further, it boosts to 100+ FPS.

    • @Qelyn  1 month ago

      Yeah, FSR 3 looks pretty awful in Cyberpunk, but in some games it looks pretty good

  • @quintrapnell3605  10 months ago  +62

    I don't know about this title. I think I get your point, but XeSS doesn't run the same in agnostic mode as it does on Intel.

    • @xXRealXx  10 months ago  +27

      The point is that it looks better than FSR even on non-Intel cards

    • @mastroitek  10 months ago  +8

      Sure, but now AMD card owners (like myself) can use an upscaler that is not shitty. I'm sorry, but sometimes even at 4K Quality, FSR generates distracting artifacts; I find myself preferring to drop the resolution down to 1440p rather than use FSR at Quality. (Not in all games, but it is definitely a problem in various titles.)

    • @quintrapnell3605  10 months ago  +1

      Well, it seems like 1.3 might be worth using now even if you're not on Intel, because of some massive improvements, and the SDK is on GitHub right now. FSR 3.1 was announced right before this. I assume both are worth using now, but obviously only one has frame gen at the moment. It's always good news when any of these gets better.

  • @AndyViant  10 months ago  +6

    Looks like XeSS 1.3 is a worthwhile mod for those games that haven't updated yet. Nice. (A sketch of the DLL swap follows this thread.)

    • @electrotrashmailbox  10 months ago

      Not really.
      In RoboCop I compared XeSS 1.2 Balanced vs XeSS 1.3 Quality to ensure they have the same base resolution: 1080p. They both deliver the same FPS, but 1.2 has slightly less shimmering. If I set XeSS 1.3 to Balanced, it is a pain for the eyes versus 1.2 Balanced.
      So you need to put 1.3 on a higher quality level and you will get the same performance, or you can have a worse picture with more artifacts at the same level.
      The only benefit is that "Quality" in the settings looks better for your ego than "Balanced" xD

    • @nicane-9966  10 months ago

      @@electrotrashmailbox In the end, I guess it's important to have a release by the devs with the proper optimization and everything, but 1.3 should be better in every case.
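
For readers who want to try the swap this thread and the video describe, here is a minimal sketch of the idea: back up the game's bundled libxess.dll and copy a newer build over it. The directory paths below are placeholders invented for illustration, and whether a given game tolerates the swap is not guaranteed, so treat this as a hedged example rather than a supported procedure.

```python
# Minimal sketch of the XeSS DLL swap discussed above.
# GAME_DIR and NEW_DLL are hypothetical paths -- point them at your own install
# and at the libxess.dll taken from a newer XeSS release.
from pathlib import Path
import shutil

GAME_DIR = Path(r"C:\Games\SomeGame")                  # hypothetical game folder
NEW_DLL = Path(r"C:\Downloads\xess-1.3\libxess.dll")   # hypothetical new DLL

def swap_xess_dll(game_dir: Path, new_dll: Path) -> None:
    old = next(game_dir.rglob("libxess.dll"), None)    # locate the game's copy
    if old is None:
        raise FileNotFoundError("No libxess.dll found; this game may not ship XeSS.")
    backup = old.with_name(old.name + ".bak")
    if not backup.exists():
        shutil.copy2(old, backup)                      # keep the original DLL
    shutil.copy2(new_dll, old)                         # drop in the newer one
    print(f"Replaced {old} (backup kept at {backup})")

if __name__ == "__main__":
    swap_xess_dll(GAME_DIR, NEW_DLL)
```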

  • @razorhanny  9 months ago  +2

    Thank you so much for this video! I love these intricate gaming tech comparisons!

  • @TanGoku-22  10 months ago  +78

    Imagine these:
    °Moore Threads becomes globally competitive
    °Apple somehow joins the CPU/GPU market
    °Qualcomm & MediaTek enter the desktop CPU/GPU market
    But that's just a theory....

    • @l3monguy  10 months ago  +44

      "Apple somehow joins the CPU/GPU market" - all is good except this.

    • @Sol4rOnYt  10 months ago  +24

      Apple is shit n overpriced, we need more CPU/GPU competition tho

    • @w-lilypad  10 months ago  +8

      @@Sol4rOnYt we need competition, not monopoly

    • @younasqureshi9179  10 months ago  +22

      Keep Apple out of this

    • @Gass0208  10 months ago  +5

      @@w-lilypad It's not like they'll get a monopoly; they'll just sell overpriced shit as always

  • @shintsu01  10 months ago  +10

    Thanks for this detailed review. Looks to me like I should run my 7900 XTX on XeSS 1.3, if it's available, instead of FSR

    • @igorthelight  10 months ago  +1

      ...until FSR 3.1 comes out ;-)

    • @JustSkram  10 months ago

      @igorthelight Kinda like still waiting for FSR 3 in Cyberpunk?

    • @ishiddddd4783  10 months ago

      @@igorthelight By then the RX 9000 series is going to be out lmao

    • @igorthelight  10 months ago

      @@JustSkram Yep!

    • @Pysnpai  9 months ago

      I'm going to use this to upscale and use AMD frame gen to get it up to 120 fps.

  • @J0ttaD  10 months ago  +24

    12:08 They beat AMD in encoding as well... shame, AMD, shame.

  • @djmalachite  10 months ago

    This really goes to show that Nvidia used DLSS as a marketing product to upsell their cards. Don't blame 'em, but this is a BIG W for Intel and all gamers

  • @eye776  10 months ago  +2

    FSR is based on an image upscaling algorithm called Lanczos, but with many significant changes. (A sketch of the Lanczos kernel follows this thread.)

    • @aziskgarion378  10 months ago

      That name is pretty familiar to me. I used emulators, and they had upscalers that use that algorithm.
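
For the curious, Lanczos resampling weights nearby samples with a windowed sinc function. This is only the textbook kernel, as a minimal sketch; AMD's actual FSR code layers many changes on top of it, as the comment says:

```python
import numpy as np

def lanczos_kernel(x, a: int = 3):
    """Standard Lanczos window: sinc(x) * sinc(x/a) for |x| < a, else 0.
    np.sinc is the normalised sinc, sin(pi*x) / (pi*x)."""
    x = np.asarray(x, dtype=float)
    return np.where(np.abs(x) < a, np.sinc(x) * np.sinc(x / a), 0.0)

# Filter taps a Lanczos-3 resampler would apply to the six nearest samples
# when the target pixel sits 0.4 of the way between two source samples.
offsets = np.arange(-2, 4) - 0.4
weights = lanczos_kernel(offsets)
print(weights / weights.sum())   # normalised weights summing to 1
```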

  • @shirufin5928  10 months ago  +1

    I also tried this in Remnant 2, but with an RX 580, and my result is way different from what you showed. It runs worse than FSR in both visual quality and rendering performance. FSR on Performance looks better and runs faster than XeSS Balanced, for whatever reason. Maybe because I'm using a 768p TV? But that doesn't explain why FSR performs better.

  • @Tigermania  10 months ago

    The show-and-tell editing was really good on this. The highlighting of key text points and the comparison zooms really hammered the points home. Excellent.

  • @Kazzman90  10 months ago  +4

    Very exciting stuff; I'm optimistic about Intel's future in the GPU space.

  • @shadowofthesupremo7898  10 months ago  +37

    "Intel gave DLSS to everyone"
    Proceeds to mostly compare XeSS with FSR

    • @Cptraktorn  9 months ago  +9

      Vex is the biggest AMD glazer and Nvidia hater; his titles are just dishonest way too often.

    • @shadowofthesupremo7898  9 months ago  +8

      @@Cptraktorn Good to know. I was here to see an honest comparison of XeSS with the other upscalers...
      I'll blacklist this channel, thank you

    • @h1ghken  9 months ago  +3

      This is not true @@Cptraktorn

    • @Jaejgaren  9 months ago  +1

      @@Cptraktorn This is best for the algorithm.

  • @Mr.Wiksila  10 months ago  +1

    I kinda hope future GPUs will focus on price over performance and make these upscalers as good as they can be, so we can finally play 2K at 144 fps without breaking the bank and without hiccups

  • @hamzaskzix  10 months ago  +20

    Can't wait for FSR 3.1 and XeSS 1.3

    • @onedriftyboy  10 months ago  +5

      Exactly my thoughts.
      I have a 4080, but I'm very excited for the competition this brings to the table.
      The only thing missing is Ray Reconstruction, but I'm sure they are already working on it

    • @NostalgicMem0ries  9 months ago

      Maybe in 2027-28, knowing AMD's speed....

  • @Dark-qx8rk  10 months ago  +15

    You can clearly see that XeSS 1.3 is softer, since it's using a lower render resolution, and I suspect most people would not want to use a lower base resolution. The only real improvement is the reduction of ghosting and moiré shimmer. A proper comparison of any quality improvements would have been to compare the same render resolution, which is XeSS 1.2 B vs XeSS 1.3 Q. (The render resolutions behind the renamed presets are sketched out after this thread.)

    • @SimplCup  10 months ago  +4

      The ghosting reduction is crazy; Remnant 2 started to look way better once Vex changed it to 1.3. It's clear when you look at floating particles: on 1.2 they were leaving gigantic trails

    • @M_CFV  10 months ago  +1

      Softer is fine when ReShade exists. Getting rid of TAA ghosting is a feat by itself.

    • @M_CFV  10 months ago  +1

      Also, the chart at 5:00 exists in the video

    • @Jolfm123  5 months ago

      If it looks better, I'm using it
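
To make the preset renaming discussed in these threads concrete, here is a minimal sketch. The per-axis scale factors are quoted from memory of Intel's published XeSS 1.2 and 1.3 preset tables, so treat them as an assumption and check the SDK notes before relying on them:

```python
# Render resolution implied by XeSS per-axis scale factors at a 1440p output.
# The factor tables are assumed from Intel's published presets (verify against
# the official XeSS SDK documentation).
OUTPUT_HEIGHT = 1440

XESS_12 = {"Ultra Quality": 1.3, "Quality": 1.5, "Balanced": 1.7, "Performance": 2.0}
XESS_13 = {"Ultra Quality Plus": 1.3, "Ultra Quality": 1.5, "Quality": 1.7,
           "Balanced": 2.0, "Performance": 2.3, "Ultra Performance": 3.0}

def render_height(output_h: int, factor: float) -> int:
    """Per-axis scaling: render height = output height / scale factor."""
    return round(output_h / factor)

for version, table in (("XeSS 1.2", XESS_12), ("XeSS 1.3", XESS_13)):
    for preset, factor in table.items():
        print(f"{version} {preset:>18}: {render_height(OUTPUT_HEIGHT, factor)}p")

# With these numbers, XeSS 1.3 Quality lands at the same ~847p that XeSS 1.2
# Balanced used, which is why several comments insist on comparing 1.2 B to 1.3 Q.
```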

  • @GeneralLee131  10 months ago  +1

    The blurry ground at oblique angles in the AMD tests is from the fact that they can't do any kind of texture filtering. You're stuck on bilinear unless it's a DX9 game, in which case anisotropic filtering works.

  • @Gergo049  10 months ago

    Damn, if this had come out earlier, I would have got an AMD card.

  • @kuhluhOG  8 months ago

    2:06 That also means that it can sometimes hallucinate stuff (which could also explain why it sometimes looks "better" than native).

  • @wodar2741  10 months ago  +1

    Already tested it on the Steam Deck and it looks great. In Spider-Man Remastered I previously preferred image scaling with TAA because of the shimmering of every upscaler in that game, but the updated XeSS seems to be the best solution right now, and with greater performance. (Tbh hair still does not look great, but this time it looks closest to normal with XeSS.)

  • @Torso6131  10 months ago  +1

    Excited to see games implement this officially, just as I am excited to see games update to FSR 3.1 in the future. I really wish all games that have upscalers would offer all three big options (and also TSR in Unreal Engine games; I like TSR more than FSR 2.2).
    Glad Intel joined the fray with their GPUs. There is still work to be done on their drivers, but between PresentMon and XeSS they're doing a lot of good for all GPU owners.

  • @hatobeats  10 months ago

    If I remember correctly, AMD announced AI upscaling technology before Intel. Additionally, Sony is also entering the AI upscaling arena. Therefore, I anticipate that the upcoming AMD FSR will be significant, especially if it operates solely with their NPU cores. They might utilize these NPU cores on the CPU to avoid overburdening the GPU. We'll see.

  • @genki831  10 months ago  +1

    I've been going back and forth in my mind deciding between a 4070 Super and a 7900 GRE. It's seriously one of the hardest hardware choices I've had to make in all my years of PC gaming. This video definitely swings me further toward the 7900 GRE.

    • @happybuggy1582  10 months ago

      Both cards will last you the entire PS5 and PS5 Pro generation. Don't think about longevity and future-proofing; the value is much lower at the higher end.
      Matching the consoles at the cheapest price is all that matters.

  • @yihanzhang2094  10 months ago

    18:59 When using XeSS Performance at 1440p, I think the render resolution should be 1440 / sqrt(2.0) ≈ 1018p, and for XeSS 1.3 it's about 953p. I don't think you can count fur if the native resolution is 620p.
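
The disagreement here comes down to whether the 2.0x/2.3x Performance factors divide each axis or the total pixel count; a quick sketch of both readings (which one Intel intends is worth checking against the XeSS SDK documentation):

```python
import math

OUTPUT_H = 1440  # output height in pixels

def per_axis(factor: float) -> float:
    """Read the factor as dividing each axis."""
    return OUTPUT_H / factor

def per_area(factor: float) -> float:
    """Read the factor as dividing the pixel count."""
    return OUTPUT_H / math.sqrt(factor)

for label, factor in (("XeSS 1.2 Performance", 2.0), ("XeSS 1.3 Performance", 2.3)):
    print(f"{label}: per-axis {per_axis(factor):.0f}p, per-area {per_area(factor):.0f}p")

# Per-axis gives 720p and ~626p; per-area gives ~1018p and ~950p. The ~620p
# figure quoted from the video matches the per-axis reading.
```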

  • @koroda765  2 months ago

    I noticed how the smoke looks pixelated with FSR Quality, while with XeSS it looks perfect

  • @dexgaming6394  10 months ago

    Imagine using an upscaler to reach a base framerate of 120 FPS, then using async reprojection along with that. Insane performance improvements.

  • @DavidBoggs-pk8nr  10 months ago  +1

    You are also using the current, old FSR. You should definitely note that they have a new version coming out soon that looks amazing from what we've seen.

  • @chasealewis  9 months ago

    I just tried this in Ratchet and Clank on the Steam Deck. Hooooo boy, that was fun. Night-and-day difference from FSR. No smearing or ghosting really at all. Awesome to see this get more competitive! I just hope XeSS starts making its way into more games.

  • @MrSheduur  5 months ago

    My main issue with the new XeSS is that while they fixed temporal stability quite a bit, they introduced quite some instability when it comes to the anti-aliasing. The image feels a lot less stable on edges. It might have to do with me putting XeSS 1.3.1 into Cyberpunk 2077, which natively uses the 1.2 DLL, but the image with 1.2 looked rock solid and very stable; apart from the ghosting it had on fast-moving objects, the edges of objects looked amazingly stable.
    I hope they can still tweak it to find the best of both worlds, but it is looking pretty decent already.

  • @fuzzylumpkins6034  10 months ago  +1

    It has just dawned on me that the 1080 Ti our 6-year-old uses has actually just taken a big leap toward being even better than it already was for 6 years. DAFUQ! So I have nothing to do with Ngreedia anymore, I still have the GOAT Nvidia GPU that was accidentally made great, and I get more out of my preferred AMD GPU? Man, this is the shit

  • @drewsnider8893  10 months ago

    Just wanted to say, really good video. Thanks for trying this out and showing everyone

  • @noimnotakpoppfpsheacy2526  10 months ago  +1

    Just why can't AMD do what Nvidia does? AI just requires being programmed and trained on their GPUs... or is it not that simple?

    • @GerritTV187  9 months ago  +1

      I'm actually glad that they started to work on upscaling that's not limited to AMD GPUs, and I have an AMD 7800 XT. FSR 3.1 looks promising. I guess it will take a few months until we get games that run it. One big plus will be that frame generation will run without having to use FSR.

  • @TheSpaseDestroyeR  9 months ago

    I can't help but notice the cat in the background using XeSS 1.3 to reach the depths of a delicious can of cat food. 10:22

  • @mr.guillotine1312  10 months ago

    I have never attempted to mod a game, but I'm tempted to see if this will work on my 1650 Super... I mean, I hope to upgrade this year, but poverty often has other plans for any money I manage to save...

  • @techsamurai11  10 months ago

    Very impressive presentation! You knocked this one out of the park!

  • @mikeodell511  10 months ago

    I just did this swap with Spider-Man Remastered on my Steam Deck and I'm seeing performance gains over FSR 2.1. Thanks for sharing this!!!

  • @brucethen  9 months ago

    Just a couple of notes on XeSS:
    1) There are two code paths (Intel-native XMX and DP4A). The two paths can produce quite different results.
    2) AMD graphics cards from Vega and below do not support DP4A and thus can't run XeSS; the exception is the Radeon VII, which does support XeSS.
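
For context on the DP4A path mentioned here: DP4A is a single instruction that multiplies two packed vectors of four 8-bit integers element-wise and adds the products into a 32-bit accumulator, which is what lets XeSS run its network on ordinary shaders without XMX units. A minimal emulation of what the instruction computes (signed variant, for illustration only):

```python
def dp4a(a: list[int], b: list[int], acc: int = 0) -> int:
    """Emulate DP4A: acc + dot product of two 4-lane vectors of 8-bit integers."""
    assert len(a) == len(b) == 4, "DP4A operates on exactly four lanes"
    for v in a + b:
        assert -128 <= v <= 127, "each lane is a signed 8-bit value"
    return acc + sum(x * y for x, y in zip(a, b))

print(dp4a([1, -2, 3, 4], [5, 6, -7, 8]))  # 1*5 - 2*6 - 3*7 + 4*8 = 4
```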

  • @gabrieli6008  3 months ago

    The newer version of XeSS isn't faster just because they improved their algorithm; it gives a higher framerate because they lowered the base resolution of the pre-existing presets, so it's not a fair comparison to put the 1.2 Quality preset against the 1.3 Quality preset. A fair comparison would be the 1.2 Quality preset against the 1.3 Quality Plus preset.
    They didn't do that to be sly; they did it because they improved the fidelity of upscaling from lower base resolutions. It makes sense, but it's still a little dodgy to market it that way. I like that there are more options to pick from, though.

  • @Accuaro  10 months ago

    I find it kind of misleading to compare almost-still frames with not much going on. If you test XeSS 1.2 in Cyberpunk during the rain, at any quality, at 1080p on DP4a, the rain turns into lasers: straight lines coming from the sky with little definition of individual raindrops. FSR doesn't exhibit this temporal smearing/ghosting (but of course it lacks in other ways).

  • @KianFloppa  10 months ago  +12

    Why didn't you mention it's a very old FSR version? Now people think it's bad 😢

  • @noimnotakpoppfpsheacy2526  10 months ago  +1

    Is it possible to upscale to the same resolution, like from 1080p to 1080p, just to increase FPS? I imagine if someone has a really bad sub-30 fps PC, they could benefit from that to make games playable at least. If not, they should make it possible. Or combine native and upscaling in a new technology

    • @i3l4ckskillzz79  10 months ago

      Wtf?

    • @100Bucks  10 months ago

      I have a lot of experience with upscaling. You can't upscale 1080p to 1080p; you upscale from a lower resolution. I recommend buying the Lossless Scaling application to see what it does first-hand. If you have a 4K monitor, you would leave the Windows display resolution at 3840x2160. The game's settings will be completely different from your monitor resolution: your game would run in a 1280x720 window. When you turn on Lossless Scaling, 720p gets scaled up to 4K. Or if you've got a 1080p monitor, your Windows display would be at 1920x1080 and your game at 1280x720. I don't do this anymore; I bought a 4K Gamer Pro. It's basically the same thing as Lossless Scaling, but as hardware instead of software. If you're interested in a 4K Gamer Pro, you need a 4K monitor, and all your games must be at 1080p for it to work. (The scale factors involved are worked out in the sketch after this thread.)

    • @handzze7341  10 months ago  +2

      The word "upscaling" is pretty self-explanatory

    • @noimnotakpoppfpsheacy2526  10 months ago

      @@handzze7341 It is, but I just meant: what if you could replace native 1080p with a reconstructed 1080p image, instead of going from 720p to 1080p, just so there's less load on the GPU?

    • @noimnotakpoppfpsheacy2526  10 months ago

      @@i3l4ckskillzz79 What's not clicking?
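
The resolutions quoted in this thread imply the following scale factors. This is plain arithmetic about resolutions, not a claim about how Lossless Scaling or the 4K Gamer Pro work internally:

```python
def scale_factors(src: tuple[int, int], dst: tuple[int, int]) -> tuple[float, float]:
    """Return (per-axis scale, pixel-count scale) from src to dst resolution,
    assuming both share the same aspect ratio."""
    per_axis = dst[0] / src[0]
    per_area = (dst[0] * dst[1]) / (src[0] * src[1])
    return per_axis, per_area

for src in ((1280, 720), (1920, 1080)):
    axis, area = scale_factors(src, (3840, 2160))
    print(f"{src[0]}x{src[1]} -> 3840x2160: {axis:.1f}x per axis, {area:.0f}x the pixels")

# 720p -> 4K is 3x per axis (9x the pixels); 1080p -> 4K is 2x per axis (4x the
# pixels), which is why the upscaler has to invent so much detail.
```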

  • @jonas314ano  10 months ago

    Off topic, but your BGM choice is always great

  • @aliosanlou4425  9 months ago  +1

    Imagine XeSS with frame generation

  • @golddace1129  10 months ago

    So, something slightly related: I recently bought a 7900 XTX with a 1440p ultrawide monitor, and ever since, I've had this bug in Cyberpunk that happens if I don't use FSR 3. Basically it leaves a spot on my screen that looks like flashing, dying pixels, only in a small area above V's left hand.

  • @vortraz2054  10 months ago  +1

    I have a question, man. Did you check what percentage resolution scale AMD and Nvidia are using before making these comparisons?
    Maybe FSR Performance doesn't use the same base res as DLSS Performance. If we're comparing visuals, you have to check that to make it fair.

  • @larryfleming7295  10 months ago  +1

    Who zooms in on paused frames during gameplay? So is it really a thing if you have to look for it?

  • @socks2441  10 months ago  +3

    1:11 Pretty bad example, given it's moving on native and looks great, as you would expect, plus it's moving on FSR, so a lower-resolution upscaled thing in motion = aliasing/flickering.
    Then with the DLSS you hailed, it's not moving at all. Of course it won't be as aliased or flicker as much.

  • @arrtea  10 months ago  +1

    I tried it in Cyberpunk... with this I can use a higher sharpness setting without making the game look weird. Even at the max value of 1 with a low resolution it looks very good

  • @AtlasUrbex  10 months ago

    I'm not motion sensitive, but when you zoom in on videos with added camera motion effects inside windows or browsers, it's really jarring. I zoomed out on the video in order to make it watchable for me :(
    That's my only complaint. Love the video so far

  • @AlexHusTech  9 months ago  +1

    *Just imagine if they included some sort of hardware acceleration as well*

  • @andraskovacs8959  10 months ago  +1

    Way to overhype ML... which is also a man-made algorithm. The main difference is that since it uses HW acceleration via tensor cores, you can get a better quality setting for the same or a slightly lower performance hit. XeSS differs from DLSS in that XeSS does not _require_ fixed-function HW accelerators.

    • @EricPlayZ132  10 months ago  +1

      Which is why XeSS performs so much worse on non-Intel GPUs. FSR is literally the only upscaler which doesn't rely on ML at this point, and it's phenomenal how many people can hate on it, when clearly all 3 upscalers have their own pros and cons.

  • @doltBmB  10 months ago

    XeSS has been working for everyone in Darktide for a while now

  • @socialfreak6900  9 months ago

    Intel not only started shredding the low-end PtP but also aided their competitor, letting AMD GPUs get fidelity and speed near DLSS. Really makes me interested in Battlemage and just what they've got cooking

  • @GerritTV187  9 months ago  +1

    Remnant 2 now officially supports XeSS 1.3, btw

  • @saito272  9 months ago

    Wonder what these upscaling technologies would look like on Intel's new Arc integrated graphics. Considering buying one, so hopefully someone covers that as well!

  • @scarfaceReaper  10 months ago  +1

    If only AMD did what Nvidia does, they'd probably be more competitive than ever... but why not? I don't think they can't do it; it just feels like they're not interested and want to do everything on their own

  • @0x8badbeef  10 months ago  +3

    I don't use upscaling because I don't like what each frame looks like when things are moving, such as when panning. They are not as sharp. That is because upscaling basically turns off when you are moving, so what is displayed is the lower resolution. Though it is a higher frame rate, it is softer because of the lower resolution. This is probably not a problem for those that use motion blur; I don't. I prefer to see sharp frames even if it looks like a slide show. But to be fair, my frame rate at native is above 90 fps at 4K.

    • @SimplCup  10 months ago

      Nowadays upscaling gives the same or even better image quality in games, at least if you use the DLSS and XeSS Quality modes, just because of how shitty TAA and TSR AA look in modern games, especially at 2K and 4K.

    • @0x8badbeef  10 months ago

      @@SimplCup I like it better turned off.

  • @ClamChowder95  10 months ago  +1

    Intel is really surprising me lately. I might just get one of their cards next. I wish they'd get their CPU game together, though. The value just isn't there for gamers atm.

  • @Booth73  10 months ago

    Did you take into account that Intel is changing their quality levels? They are adding more options and changing the resolution that each option renders from, so a given quality level won't be the same resolution from 1.2 to 1.3 anymore. The Quality level in 1.2 will be something like Ultra Quality in 1.3, or maybe Performance; I can't remember which way it goes, but I do know they are 100% adding more upscale resolutions in 1.3

  • @Q36BN  10 months ago

    Vex, what do you use to capture the image/video? What's the codec and bitrate of the captured video? I'm asking because, when you make such comparisons, it's important to be as close to the source as possible, but those captured videos look kinda low quality, to be precise low bitrate. The video up close is blocky, showing compression artifacts and a general loss of quality :(

  • @HereAfterNow  8 months ago

    I'm a lil confused. So do you have to set your resolution lower? Or does XeSS lower it for you?? Like, if I have my PC at 1440p, do I have to lower that before using XeSS? Or leave it the same and XeSS will do it. Cheers

  • @brgir  10 months ago  +1

    As for games that don't have upscaling at all, you could try Lossless Scaling, as they made their own machine-learning (AI) upscaler and frame gen in their app. I couldn't test it much, as I have an i5 4570 paired with an RX 5600 XT (I know, a terrible combo; Bosnia still had the high prices from mining). I recommend you maybe try it out in one of your videos (the app does cost 7 dollars). Also, Intel is out here delivering on what they promise and beating AMD to implementing AI, lol.

  • @hundvd_7  10 months ago

    14:10 Your stats are just as valid as the official ones. I'd argue Intel is actually doing a better job of communicating the difference.
    I can intuitively tell that the difference between 1.3x and 1.5x is less than 20%, probably more like 15% if I had to guess.
    But for 77% -> 67%? I am fooled into thinking that it is actually only a 10% increase, and it really isn't.
    And wouldn't you know it, it is 14.9%. I think Intel isn't just marketing; it is simply the better way to show these stats.
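
The arithmetic behind that comment, for anyone checking it. The 1.3x/1.5x scale factors and the 77%/67% figures come from the video; the rest is just division:

```python
old_factor, new_factor = 1.3, 1.5        # per-axis scale factors shown at 14:10
old_pct, new_pct = 1 / old_factor, 1 / new_factor

print(f"{old_pct:.0%} vs {new_pct:.0%} of output resolution per axis")
print(f"1.3x renders {old_pct / new_pct - 1:.1%} more per-axis resolution than 1.5x")

# 1/1.3 ~= 77% and 1/1.5 ~= 67%; the exact ratio 1.5/1.3 is about 15.4% (14.9%
# if you start from the rounded 77% and 67%), so the "a bit under 20%" intuition
# the scale-factor notation gives is roughly right.
```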

  • @alvarg  10 months ago

    I feel the real benefit is upscaling from 1440p to 4K, as DLSS Quality makes for a great image but FSR 2 does not. XeSS 1.2 didn't offer a decent enough performance uplift, but with 1.3 you can now run 4K with great performance.

  • @sharktooh76  10 months ago  +2

    XeSS is a messy temporal-artifacts machine.
    FSR 3.1 brings temporal stability without AI shenanigans.

    • @Jolfm123  5 months ago

      Uh, look at GoT's FSR 3, or any game with grass

  • @ForgottenShinobi11  10 months ago  +2

    Man, everything that gets me more FPS is welcome.

  • @Cloderick  10 months ago

    What's happening to YouTube video quality? At 1:30 you can't tell that's grass... Jesus

  • @johnvandeven2188  9 months ago

    And yes, every game I play, regardless of intensity or fast pace, will be played zoomed all the way in and at 1 frame per minute so I can see how good my upscaler is. Vex and Tim from HUB are speaking the same language. It is also my belief that if all these games are played at normal speed without being zoomed in, it would be near impossible to distinguish the difference in quality between any of the three available upscalers. I will never purchase an nGreedia GPU.

  • @johnpen269  10 months ago  +1

    You can't really judge it by still images; a lot of the time, shimmering only shows up when moving the camera around and in motion

  • @MLWJ1993  10 months ago

    Not actually too sure about Nvidia deliberately creating ICAT for marketing DLSS. They made LDAT for comparing latency figures, so making something to easily compare image quality in general (not just for upscaling) seems pretty logical for capturing and syncing up graphics settings comparisons (which I think is really what they were pushing this for, especially for real-time ray tracing).

  • @HedgehogY2K  8 months ago  +1

    12:11 No... it is. There are a lot of GPUs that can only use AMD's FSR and not even XeSS. Remember the Shader Model 6.4 requirement: the oldest Nvidia GPUs to support those shaders in hardware are Maxwell-based. You need a GTX 750 Ti or 980 to access XeSS. A GTS 450 can at least use FSR. XeSS is just the middle ground in terms of hardware support and quality. And last time I checked... I'M STILL ON TESLA 2.0 GRAPHICS SO I DON'T GET $H!T.

  • @sajukagentoo51  10 months ago

    As an Intel Arc A770 16 GB owner, I can say it's awesome!
    I went from an old GTX card using FSR to the Intel one using XeSS, and the image quality is a lot better.
    Now they just need to make better Linux drivers.

  • @rusty1253  10 months ago  +1

    As a person that lived his entire life with a potato and recently built a PC with a GTX 1650 in it, I can confidently say that I don't see a fcking difference

  • @ugurinanc5177  10 months ago  +1

    As far as I know, XeSS needs an Intel GPU for better upscaling; is that right? Or can I have this upscaling with my 3080?

    • @vladvah77  10 months ago  +2

      XeSS works better on Intel's own cards because those use AI accelerators like Nvidia's, but it works on AMD and Nvidia cards using a fallback path.

  • @TheySleepAmongUs  10 months ago

    Happened to pause it at 1:57.... lol, it says "LEAROING" on the loading screen instead of "Learning"

  • @Daniel_Z35  10 months ago  +2

    I think you should also have tested XeSS 1.2 and 1.3 at settings that upscale from the same resolution; comparing them the way you did, you can't really make a fair comparison of the image quality improvements.

  • @moxie_ST  10 months ago  +1

    What about lag with upscaling in online gaming, where every ms of lag counts?

    • @slimal1  10 months ago  +1

      Definitely don't use upscaling in that scenario.

  • @fuxseb  10 months ago

    Most non-gaming ML algorithms run on (GP)GPUs, and before that, CPUs were widely used (and still are for some stuff). So it shouldn't be shocking that DLSS's special sauce is actually software, not tensor cores. Those ASIC blocks make it faster and more energy efficient, that's all.

  • @Buraak_87  9 months ago

    Isn't there a part missing about the GTX? The 4th chapter is called "testing on AMD & GTX GPUs".

  • @mahpell7173  7 months ago

    We're watching a pixel upscaling comparison through a compressed YouTube video. Crazy times.

  • @theanglerfish  9 months ago

    I will be happy if they implement their own DXR. And comparing 1.3 and 1.2, I noticed better contrast between light and dark surfaces. It's better, but not noticeable for everyone because it's only like 1-2 color shades, but it's better.

  • @Bsc8  10 months ago

    I was already using XeSS with my RDNA 2 GPU in Cyberpunk: better 1% lows and visually identical to native, compared to blurry FSR.
    Now it's gotten even better, but people on other tech channels still don't trust me about XeSS > FSR on a red-team card... Open your eyes, gents!

  • @Circleglide  10 months ago  +1

    What a great time to be alive

  • @Plague_Doc22  10 months ago

    I feel like using the Performance setting when upscaling isn't the best comparison, as it's such an aggressive upscaling option. It's only really used by people that can't get the FPS they need. I think Quality is the gold standard, and a lot of people will use it even if they have enough FPS, just to get the extra smoothness of a higher framerate.

  • @Superdazzu2  10 months ago  +1

    Can you compare the visual differences in XeSS between the DP4A and XMX instructions, by using an AMD/Nvidia card and an Intel card running XeSS at the same resolution?