Real DLSS tech is sorely needed in games. What a lot of people don't understand is that DLSS also helps with anti-aliasing. It's better than TAA, FXAA and SMAA. Putting DLSS on the quality setting doesn't do games any harm; it usually helps things look better. That's the experience I've had in every game I've played that had it. The built-in DLAA is great. AI works.
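The anti-aliasing point is easier to see with a toy model. This is a hedged sketch, not how DLSS actually works internally; it only shows the temporal accumulation idea (jittered samples averaged over frames) that TAA-family techniques share. The pixel-coverage function and sample counts are invented for illustration.

```python
def coverage(x):
    # Hypothetical scene: an edge covers the left 30% of the pixel;
    # a single point sample is either fully inside (1.0) or outside (0.0).
    return 1.0 if x < 0.3 else 0.0

# One centered sample per pixel: a hard 0-or-1 edge, i.e. jaggies.
single = coverage(0.5)

# Temporal accumulation: jitter the sample position a little each frame
# and average the results, like TAA/DLAA do with a history buffer.
frames = [coverage(i / 64.0) for i in range(64)]
accumulated = sum(frames) / len(frames)

print(single)       # prints 0.0
print(accumulated)  # prints 0.3125, close to the true 30% coverage
```

A real implementation blends each frame exponentially into a history buffer and reprojects it with motion vectors, which is where ghosting artifacts come from.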
I like Bethesda games, but let's compare the 3D world detail in Starfield and The Witcher 3. Do you have the same feeling as me, that the older game, The Witcher 3, is more detailed than Starfield and also less demanding, even after the next-gen update? What's going on?
I'm currently stuck with my old RX 480 because my power supply is broken, so I have to use my old one (it's a Dell OptiPlex power supply) and I can't use my 1660 Super. I'm planning on upgrading the GPU and power supply soon
I tried with an Nvidia GTX 1070 8GB, 32GB DDR4, i7 8700K, and a Samsung 980 NVMe. It works perfectly and runs with no issues. Stop pushing gamers to spend more on upgrades.
One tip for you: when killing Bob, take a picture with the game's photo mode, and that photo will be used as a loading screen in the game
Hahahahaha I need to do this 😂
@@zWORMzGaming I think you should not have killed Bob. He was a little weird but mostly okay and kept to himself and didn't bother anybody. Don't worry though because I won't tell anyone that you killed Bob.
@@zWORMzGamingbetrayal 🥹
@@pf100andahalf always kill bob hes evil
@@zWORMzGaming😂
To eliminate some of the shimmering you see, you need to bump Indirect Lighting from Low to Medium. It costs roughly 3% performance but nearly eliminates the shimmering. They need to fix the Low setting, as it doesn't provide enough of a performance boost to justify looking 10x worse.
this, indirect lighting is SUPER important at low resolution.
Came to post this, what a dumb setting
The flickering in the MAST district happens regardless of your settings. The only way I could get rid of it was by using the DLSS mod with preset D (which improves image stability).
@scruples yeah at 1440p I can't get rid of the shimmering in some places even with max settings and 100% resolution scale.
I don't have access to dlss with my 7900XT though.
@@scruples What's the best DLSS preset anyways?
Have always been a fan of the Maxwell architecture
nvidia is still using maxwell basically in rtx 40 series. just more cuda cores, higher clockspeeds and better memory lol
why the clown emoji? it's the truth, all nvidia did was wait for TSMC node shrinks and add more features to their existing architecture. why do you think nvidia drivers are mostly stable? cause they haven't changed jacksh1t in a decade @@saltmaker5353
Why? It's a super dated architecture. Very weak FP32 throughput, non-existent async compute support (essential in modern games), an old cache hierarchy, and it lacks tons of newer hardware features and accelerators...
@@SweatyFeetGirl Wrong. In terms of power-to-die-size scaling, the 40 series broke off from the previous trends completely.
@@niebuhr6197 8 years ago it was amazing. They didn't say that it is a great architecture today.
I find it funny how graphically intensive this game is while looking like a game from 2014
totally agree. Assassin's Creed Unity looks much better than this shit.
Looks like ark survival evolved
It looks great, runs like balls, but idk what people are talking about when they say it looks bad. Does it look good enough to justify the fps? No, but surely anyone saying it looks that bad just isn't running it at a decent quality. The lighting is really nice, and the textures and models are all highly detailed. What exactly looks bad? Except the water, I guess, that's kinda bad.
@@danvez5656 cause it does look bad compared to other games and runs terribly. I have a 7800X3D and a 2070 Super and it does not look good even on high settings
@@danvez5656 As you're saying... it looks bad relative to its graphical requirements. Almost no hardware can run it efficiently, as if it were some next-gen Crysis, yet it looks like an average 2013-2014 game. And yes, some games from that era looked pretty damn impressive.
The reason this game is extremely intensive is that LODs are basically non-existent. They could be added with mods, which should make things much, much better
But LOD handling nowadays hits the CPU/RAM/VRAM more than the raw performance of the GPU /:
@@sanji663 exactly. The game is cpu bound, on most pcs, it would fix the performance issue. Not in his case though
@@sanji663 sanji pls explain your reply, I'm studying for a graphics interview
How did you know that?
@@g2amarketplace506 LODs nowadays are heavily CPU dependent, like sanji said.
Yesterday I had this brilliant idea to change the normal LOD of GTA 5 to 0.5 of the normal value, and I literally got double the FPS.
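For anyone curious why a LOD scale tweak like that helps so much, here is a minimal sketch of distance-based LOD selection. All thresholds and triangle counts are invented for illustration; nothing here comes from GTA 5 or Starfield.

```python
# Toy LOD selection: each object picks a detail level from its camera
# distance. Thresholds and costs are illustrative, not from any game.
LOD_THRESHOLDS = [20.0, 60.0, 150.0]   # metres: LOD0 / LOD1 / LOD2 limits
TRIANGLES = [10000, 2500, 500, 0]      # cost per level; past the last: culled

def lod_level(distance, lod_scale=1.0):
    # lod_scale < 1.0 (like the "0.5" tweak above) shrinks every
    # threshold, so cheaper models are chosen at shorter distances.
    for level, limit in enumerate(LOD_THRESHOLDS):
        if distance < limit * lod_scale:
            return level
    return len(LOD_THRESHOLDS)  # beyond the last threshold: culled

def scene_cost(distances, lod_scale=1.0):
    return sum(TRIANGLES[lod_level(d, lod_scale)] for d in distances)

objects = [5, 15, 40, 80, 120, 200, 300]  # object distances in metres
print(scene_cost(objects, 1.0))   # prints 23500 (full detail)
print(scene_cost(objects, 0.5))   # prints 13000 (halved LOD scale)
```

Fewer high-detail models also means fewer and smaller draw calls for the CPU to submit, which is why the tweak helps most when the game is CPU bound.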
So the new question will be Can it Run Starfield ?
I don't think Starfield deserves that meme. Crysis's visuals were 2-3 gens ahead of their time, so the performance was quite good for that. Starfield only has a nice and fresh art design, but the graphics are more like 2-3 gens behind. In my opinion it looks even worse than Dying Light 1 from Jan 2015.
@@Kyuss-Unidai completely agree with you. Starfield doesn't look like a next gen title at all. Graphics in games aren't evolving a lot, yet the hardware requirements are getting horrendously high.
FSR 2 vs DLSS 3 makes a hell of a difference. FSR 2 looks very soft and very pixelated. DLSS not being in the game was a damn mistake.
Well I don't care because I can't use it anyways 😂 playing it with a 5500xt @40fps
Yep. FSR is a bunch of garbage. One day after I bought the game i applied the 3.5 mod. Double the FPS and it basically looks like a different game.
From what I've heard, the rumours say that AMD paid Bethesda to make this game run well only on AMD machines and also to not include DLSS. And it makes sense, as it literally took modders less than 24 hours to make a DLSS mod..
Frame generation is a hoax, you just see the fps numbers but the responsiveness is the same as the 30-40ish fps you were getting without it.@@Mp57navy
@@Mp57navy Agreed. Even a DLSS mod, not optimized and implemented by the developers, looks and runs SO much better than the garbage that is FSR. With DLSS at 100% it even looks better than native.
I had two GTX 980 TI in SLI in 2015. You felt like a king. 😆
Bro those were the days! 😢
Ahhh, how much your channel has grown my friend. I remember years ago when I commented on your video when you did benchmarks on CS:GO with integrated graphics
Hey Kryzzp, love your videos so much!
Always a smile on your face and, as always, a very good video!
I'd love to see a comparison of the Titan X Maxwell and the Titan X Pascal; as far as I know, these cards are very different! :D
Hey another great video! Greetings from Brazil
My GTX 970 got absolutely terrified while playing this video and seeing it's strongest brother suffer. 🤣
I thought of making the 970 video today, but curiosity got to me and tested the titan instead 😅
@@zWORMzGaming Thank you 😂 omw to watch your gta 5 benchmark video on 970 to make it feel better 🤩
@@AbhishekAdhikari-oi8qq the 900 series is my favorite, it hurts to see it slowly die
Yeah, but only Starfield will be optimized later ❤
ran starfield on a 960
Hi my friend, another amazing test with a very OLD GPU, very funny, and this game kills every GPU, even powerful ones!! It needs optimisation, but I think it's too late for the developers to put in the effort and optimise the game!! In 2023 they've become lazy and just use FSR and DLSS, and that's it!! Always happy to see your videos!! Have a great day
The GT1030 test in the other video had me laughing lol I still have that card and never realized it could run something so new 🤣 Thank You for the videos 👍
"Run" might be an overstatement 💀
Thank you and your wife for all the videos 😁 They are a joy to watch 😁
I'm getting more and more impressed by the fact that the Steam Deck runs this game at all
12:54 - Bob said “Not today kid, I’m going to be MUCH harder to kill than before. Let me show you my TRUE POWER!”
It's a joke that the modding community made the game playable for Nvidia within a few days with DLSS mods and sharpening filters and Bethesda doesn't do anything :D Bethesda also says Starfield runs clean and well, and that the people who have performance problems have old hardware :D
even though I have a 1060 6gb I watch all your vids. Love seeing other cards and you always have the best benchmarks and videos 👍
That 4K was VERY optimistic lol
I like how you give the old card’s their glory. It shows how much the cost per pixel has crazily increased. Hooray for shadows and minor details?
180w for the most powerful GPU of its time, while today we have an entry level RX 7600 consuming the same amount of power, lol
To be fair, it's supposed to consume 250W, it's just this game that can't take advantage of all of it!
Bethesda screwed Team Green. My friend is playing Starfield on an R9 Fury X in Crossfire (he's actually using the Fallout 4 Crossfire profile and, wouldn't you know, it works) and he's getting 1080p medium with FSR at 80 percent resolution scale at 68 fps, with lows of 49... Basically a 60 fps experience for him.
R9 Fury X came out in summer 2015 lol.
Did he use the NimeZ drivers or the official version?
Damn! Gotta check out my Vega 56, it should run well then
@@zWORMzGaming the game runs pretty well on amd gpus
This is 100% an optimization issue on Bethesda's part. I have a Quadro M2200, which is basically a 965M with more VRAM, and at 4K with the same settings with dynamic resolution enabled I got almost the same framerate. They quote the minimum spec as an RX 5700 or GTX 1070 Ti; if you have a card with an older architecture, I can only assume they did zero optimization, especially since their response to bugs on Intel Arc cards is that they don't "meet the requirements" even though the A750 and A770 are a lot faster.
Hello sir , i love your content. Good to see that you are being consistent. 🎉
When will you test the AMD RX 7700/7800 xt ?
AMD don't send him cards, so he doesn't have them
What happened to the Intel Arc GPUs by the way? I was hearing a lot about them 1-2 years ago but it seems nobody talks anything about them nowadays
The drivers are still iffy and there are better options for the money!
@@zWORMzGaming I wonder, is the ARC A770 faster than the Titan X in this game? You can actually launch Starfield on ARC GPUs with the latest driver, so I’m curious
Below 30fps the adaptive scaling (dynamic resolution) kicks in, so the FSR setting you chose was not what you got at 4K.
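A dynamic resolution system like the one described can be sketched as a simple feedback loop on frame time. This is an illustrative guess at the behaviour, assuming a 30 fps target and GPU cost proportional to pixel count; it is not Starfield's actual implementation, and all numbers are invented.

```python
# Minimal sketch of a dynamic-resolution feedback loop with a 33.3 ms
# (30 fps) target, matching the "kicks in below 30 fps" behaviour.
TARGET_MS = 1000.0 / 30.0

def gpu_time_ms(scale, full_res_cost_ms):
    # Assume GPU cost is roughly proportional to rendered pixel count,
    # i.e. to the resolution scale squared.
    return full_res_cost_ms * scale * scale

def step(scale, full_res_cost_ms, min_scale=0.5, max_scale=1.0):
    measured = gpu_time_ms(scale, full_res_cost_ms)
    # Nudge the scale toward the target frame time, within the clamp.
    error = (TARGET_MS - measured) / TARGET_MS
    scale += 0.1 * error
    return max(min_scale, min(max_scale, scale))

scale = 1.0
for _ in range(100):  # let the controller settle
    scale = step(scale, full_res_cost_ms=60.0)  # 60 ms at 100%: too slow
print(round(scale, 2))  # settles near 0.75, where 60 ms * scale^2 ~ 33.3 ms
```

The point is that once the GPU can't hold the target, the scale quietly drifts down to whatever hits the frame-time budget, regardless of the FSR percentage shown in the menu.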
I will say this: that 170 watt power draw is lovely... we are really going the wrong way with GPUs now asking for double that.
if I were to play at 720p with a Titan X maxwell I might as well use a mClassic with it to upscale the 720p image to 1080p to stop the image from looking blurry. (I am actually viewing the 720p gameplay with an mclassic to upscale it, it definitely looks okay upscaled that way especially when combined with FSR 2, better than if I let my Monitor upscale it, or upscaled by the usual methods)
NeXT video Titan SLi
Considering that when a journalist asked if they're going to optimise the game, they responded that it's already optimised and people just have bad hardware, I doubt they're doing much to optimise it.
The in-game FSR2 sharpness slider doesn't work under the DLSS mod, so you have to use ReShade with CAS.fx or enable sharpening in the Nvidia driver control panel
This doesn't support DLSS, the mod isn't even installed in this PC 👍
awesome gpu sir!
Need to use DLSS! Oh wait, no RT cores :P Btw, do you have the R9 Fury? I've seen one video claim it hits 60 fps in Cyberpunk 2077. The only problem was that it was on Linux, since the driver is still supported there but not on Windows; not sure how Starfield will do.
I don't have the fury, no.
The fury has like 1060 levels of performance right? Maybe 980?
It will hit 60 at 1080p low in some areas, add fsr and it'll be at 60 avg probably!
@@zWORMzGaming It's been a bit, but the R9 Fury and the Vega cards have, I think, AMD's coolest card names and tech, even if they didn't perform too well compared to Nvidia for the price. The only problem, if you got one, is that AMD stopped supporting them, so you'd be relying on Linux and Proton to get Starfield working without tons of issues, but that's just my guess atm.
And that folder named "coisas" ("things"). Keep up the good work, brother.
This game has become the next "Can it run Crysis" kind of game.
The GPU Power draw is only at 60 to 80%. There will still be some potential for optimization.
Unfortunately they actually believe that the game is optimized lol
This, Nvidia gotta release an optimized driver for this.
@@zWORMzGaming "You may want to upgrade your PCs" - Todd Howard
How can you upgrade past a 4090 again?
@@hayax Exactly. This is just another garbage AMD title like Jedi Survivor. Luckily there is a DLSS mod to save the day, but that doesn't excuse the shit optimization.
.....I still remember dreaming of having a Titanx......then I got a 1080ti soon after......it still works......
@zWORMz Gaming Hi, I have a question about Resizable BAR: if you enable it, is your BIOS super laggy too? In Windows it works fine. And btw, do you force Resizable BAR for all games in Nvidia Inspector?
I can understand people who think games should run well on GPUs they once paid over a thousand dollars for; after all, I myself once invested in 2x Titan SLI. But it is also important to understand that those cards, the Titan X, Titan XP, 980 Ti and the like, are over 7 years old.
The reason being: we had two honestly sucky GPU generations (RTX 2000, which had barely any competition at the highest end even though only the 2080 Ti was much faster than the previous top card, and RTX 4000, which can only boast the RTX 4090 in terms of raw performance, but the price...) and one decent but not great generation (RTX 3000), diminished by the cryptocurrency craze that made it extremely overpriced for over 1.5 years.
The games moved on, mostly in terms of poopy optimizations unfortunately, and the GPUs stayed behind. It didn't help that the generational lifespan of an architecture nearly doubled.
Today, we're a full generation behind in graphics card evolution, and only dirty hacks like upscaling / framegen keep the cards at playable frame output.
That doesn't mean I support developers' lazy soft spots; a decent game should have a basic "I'm fine with what I have, even if it's not that good anymore" preset for people running moderately outdated hardware. But the problem is that what the majority (on Steam) have are flavors of the 1660, or even the 1650 and the like, which are the mids and upper-lows of *three generations ago*. As someone who tried to run Crysis on a 7600GS, a mid-range card one generation old at the time, I can say people severely overestimate the lifespans of their cards. Unfortunately, AMD and nGreedia severely overestimate the thickness of the average gamer's wallet in their stead.
" I'm about to " *Crash*
You scare me man lol.
Bob scares me!
How do you feel about Todd saying that the game is optimized, and that "You might need a better rig"?
Perfect for my 8 year old son to play on the hand me down rig next to me. 1080 is fine for him.
Indirect Lighting is causing the blurriness... changing it to Medium fixes it. It's 1 fps less when set to Medium with my Titan X, but the game looks so much better even when you lower the resolution scale
I miss my old 980 matrix, god that shroud looked so good. Wish I hadn't sold it 😅
Long live Kryzzp and Digimoon much love
I also noticed that less and less of the GPU is being utilized at lower resolutions, relative to the frame rate increase. That might be an engine issue with the game itself.
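That utilization pattern is what you'd expect from a CPU limit. Below is a tiny model, with invented numbers, of frame time as the max of a resolution-independent CPU cost and a pixel-proportional GPU cost; it shows the same effect: fps plateaus while GPU busy-time falls.

```python
# Simple frame-time model: the CPU prepares each frame at a roughly
# resolution-independent cost, the GPU cost scales with pixel count,
# and the two overlap, so frame time ~ max of the two. Numbers invented.
CPU_MS = 12.0  # per-frame CPU cost, independent of resolution

def frame_stats(gpu_ms_at_native, scale):
    gpu_ms = gpu_ms_at_native * scale * scale
    frame_ms = max(CPU_MS, gpu_ms)
    fps = round(1000.0 / frame_ms, 1)
    gpu_util = round(100 * gpu_ms / frame_ms)  # % of frame the GPU is busy
    return fps, gpu_util

for s in (1.0, 0.75, 0.5):
    fps, util = frame_stats(20.0, s)
    print(f"{int(s * 100)}% scale: {fps} fps, GPU ~{util}% busy")
```

Once the CPU's 12 ms floor is reached, lowering the resolution further cannot raise the frame rate; it only leaves the GPU idle for more of each frame.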
Bro which benchmarking app are you using?
At the beginning you have dynamic resolution turned on and you are getting less than 30 fps, that's why it is soft.
Yeah I thought so as well when editing! But then at 1080p it was still terrible and DRS wasn't doing anything 😭
@@zWORMzGaming it was indirect lighting, gotta keep it at medium or it renders the lighting at super low resolutions.
Hey Kryzzp, can you make a video with a Ryzen 7 5700X and an RTX 3070 at 1080p? I want to upgrade my CPU from an R3 3300X, so I need to see how it performs in many titles to see how well the CPU handles the RTX 3070, please?
Can't believe that a game with visuals like that can be so demanding.
you will be suprised what optimization or in this case, the lack of optimization can do
Bob has the power of Karma 🤣
The game clearly needs some performance optimisation. However, the game's low settings are like high from 5 years ago. In addition, as with other new titles, if we can lower or disable certain shadows etc., I'm sure we'll get another ~15-20% fps boost.
I think FSR might be broken with the GTX 900 series cards. I wasn't seeing much of any performance increase at all. The only thing that might help is to just change the raw resolution and leave FSR alone.
I am playing on an AMD FX 8350 / GTX 970 - 720p @ 40 FPS (no FSR). I've tried 1080p, but with FSR enemies are too blurry to see and shoot.
Nvidia GT 710 next !!! Let this beast show who is the boss of graphic cards 💪
Grande vídeo, ajuda muito na escolha de uma gráfica ou processador, força aí irmão 💪
Just started playing yesterday on my RX 6800, high settings, 1440p with FSR set to 80%. It gets me over 60 all the time, and even when it dips below 60, with FreeSync I can't even tell. Overall a good experience.
I would actually say that is primarily the fault of FSR and DLSS, as you said yourself “Even with FSR off some things are still loading in at a much lower resolution”, almost like it’s literally baked into the assets that “They’re going to have FSR on anyways so who cares”
It's doing pretty well, considering. My water-cooled 3090 Ti does awfully at 4K
Whelp I was gonna buy this game with this card. Funny you read my mind.
Sorry to disagree, but I think you'll find the *best* Maxwell GPU is the GTX 750 Ti
maybe the K620?
How? I thought it was the GTX 980 Ti
Where did you get your background pic?
This game is so intensive
Can you do benchmarks for the RX 7800xt for minecraft shaders?
i play 1080p 80% upscaling with a 3080 12gb and it still crashes all the time :D i wish i could just hate the game but unfortunately i like it quite a lot, so hoping for some patches and drivers to come :D
oh shit its slowfield again. lets see
liked before watching 🥰
2:03 you forgot to turn dynamic resolution scale off. It's not 90% scale, it's 50% scale automatically. FSR at 4K looks fine, and not that soft with Quality. But yeah, it looks bad at 4K at 50% scale.
You can see at the 2 minute mark that it is turned on. Then for 1440p it was off again. FSR2 doesn't look any better or worse on any card; it just performs worse on some cards.
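For reference, here is what those scale percentages mean in actual pixels at 4K. The ~67% figure is FSR 2's usual Quality-mode scale; treat the mapping as approximate, since Starfield exposes a free slider and dynamic resolution can drop it further on its own.

```python
# What a resolution-scale percentage means in rendered pixels.
def render_resolution(out_w, out_h, scale_pct):
    s = scale_pct / 100.0
    return int(out_w * s), int(out_h * s)

print(render_resolution(3840, 2160, 90))  # (3456, 1944): the menu setting
print(render_resolution(3840, 2160, 67))  # (2572, 1447): ~FSR 2 Quality
print(render_resolution(3840, 2160, 50))  # (1920, 1080): what DRS forced here
```

So a "4K" shot at an automatic 50% scale is really a 1080p image stretched to 2160p, which is why it looks so soft.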
I remember when I was looking at the Titan X as the dream card... Now in this game it runs the same as, or even worse than, my RTX 2070 Super, and I consider my card a pretty slow choice ;/ And now it doesn't even cost 200 euro used lol.
FSR2 is not really optimized for the GTX 1070; it is optimized for the 1070 Ti. The GTX 1070 and the OC models are optimized for the CAS setting. FSR3, which is not included in Starfield, is optimised for the 10-series GPUs.
Swap to medium settings.
There’s no texture resolution setting, it seems to change with the global setting.
I had the same problem on my 1070. No matter what I did, on low settings, it was just a blurry mess. Switching to medium settings didn’t drop the fps much, but the game suddenly looked sharp.
I wonder how it runs on the RTX Titan, and also Voodoo 2 (in SLI!) and NV20 chipsets
Ay my guy Kryzzp, I need some help, buddy. I currently own a Ryzen 5 5600G CPU (yes, I bought an iGPU in 2022 because GPU prices were way too high in my country and I couldn't afford one), and I'm planning to buy a used GPU for it. Can you suggest the best GPU I can pair with my CPU without any bottleneck?
I plan to play on 1080p...
Makes me laugh all the time when ppl say the graphics are not good enough. I grew up on a Voodoo II. Graphics look great to me ATM.🤣😁
Please do the A770 next, Kryzzp!
It might be worth trying the XeSS mod on cards which don't support DLSS. I've been playing with XeSS on my 1080 Ti via the mod and it looks MUCH better than the game's FSR 2 in most situations. Distant detail in particular is way more stable. I've noticed it does cause some moire patterns on clothing in certain dialogue sequences though.
XeSS crashes in any game I've used it in on my Titan X Maxwell
Also you will own nothing and be happy
For those who have an RTX 3060 or below, it's time to upgrade 😂 If this continues, prices of GPUs will go down.
Bold of you to assume people would want to upgrade for this game..
Please could anyone help me out? My graphics card, an Nvidia Quadro K2100M, always shows a TGP of 0.0 watts while gaming. I don't think this is realistically possible at all, so I've been wondering exactly what the problem could be. I like monitoring the TGP while gaming along with all the other stats like VRAM, GPU usage, etc., and I haven't found anything about this online at all. Thanks
Kryzpp wherever you go I will always be there for you (to cause stutters )😶🌫️
Below 30 fps it's using dynamic resolution scaling, so who knows what actual resolution it was running at first in the video.
how the mighty titan has fallen
My guess is 1080P, Low settings, without any upscale.
Real DLSS tech is sorely needed in games. What a lot of people don't understand is that DLSS also helps with anti-aliasing; it's better than TAA, FXAA and SMAA.
Putting DLSS on the Quality setting doesn't do games any harm. It usually helps things look better. That's been my experience in every game I've played that had it.
The built-in DLAA is great. Ai works.
I like Bethesda games, but let's try to compare 3D world details in Starfield and The Witcher 3. Do you have the same feeling as me, that the older game, The Witcher 3, is more detailed than Starfield, and also less demanding even after the next-gen update? What's going on?
Why did you disable resizable bar?
maybe it doesnt work with cards from 2014
Yeah it doesn't
Low settings look bad, medium is great.
FSR2 or DLSS (mod) do a great job;
I couldn't tell much difference between no scaling and 50% scaling (at 4K).
Ah, a $1000 card back in the day that now costs less than a PS4.
At least the whole "GTX 980 TI"~GTX 1070~GTX 1660 TI~RTX 3050 is still kinda sorta true? Except now the 3050 aged better (as it should... I hope?).
I’m currently stuck with my old RX 480 because my power supply is broken, so I have to use my old one (it’s a Dell OptiPlex power supply) and I can’t use my 1660 Super. I’m planning on upgrading my GPU and power supply soon.
I tried with an Nvidia GTX 1070 8GB, 32GB DDR4, i7 8700K, Samsung 980 NVMe. It works perfectly and runs with no issues. Stop pushing gamers to spend more on upgrades.
can you do rx 6800 xt in the games you usually benchmark?
The 1000 and 900 series are architecturally very similar; the difference is that the 1000 series has more L2 cache, a smaller manufacturing node, and higher frequencies.
I wonder who BoB is in his Real Life and what he did to deserve the brutality
Can you do SLI / CrossFire systems? DX12 supports combining different GPUs.
bob made the camera have a seizure... as well
now you have to test the 1080 Ti; it only makes sense to step up one generation of best GPU after this one
What's the difference between CAS and FSR 2?
This gpu should perform much better in this game...
Can you test the TITAN RTX
I don't have it!
I believe depth of field is only enabled when you engage in conversations with NPCs.