I'm using the redmagic 6r right now, looking to upgrade at some point. These phones have all of the performance you need, without the extra markup like other "so called" gaming phones.
@@NotaFollower11 i5-12400F and 3050 (got it for cheap don't diss me) Probably gonna get B580. Not really into kingdom com it's a lame game for such a requirement.
I don't see enough ppl prasising the devs actually just giving us straight facts and telling us "hey this is what you need to run the game at NATIVE" no DLSS or required framegen.
@@resltessmindtwr Unfortunately that's the current landscape, so gotta take the small wins were possible and then use them to make better changes in the gaming industry when it comes to optimisations and transparency over the basic spec list.
For people new into the series: Yes, the original KCD struggled with CPU usage and continues to struggle even to this day, there's a lot of real-time simulation as well as dozens of NPCs with actual schedules, personalities, traits and jobs and they all exist in real time in the world while going on with their lives. KCD2 looks like its going to be even bigger and better, so expect similar issues but adapted to current CPUs.
one of the coolest unexpected mechanic is when i cleaned some bandits but didn't loot their gears, i accidentally found that their gears has been transported to cemetery caretaker / gravedigger chest when i was thieving in Rattay near the church
You should definitely cover Kingdom Come: Deliverance 2 performance. It's a pretty major release for next year so it would be helpful for people on the fence.
"oh, what people said has been true for years by now, but when the very first example of a game that asks for more than 16gb comes up which by the way is a game known for very heavy cpu and thus ram utilization, and even though we are not sure if It's over exaggerated by the devs to be safe, or a case of bad optimization, I'll still try to be a smart ass about it", said you
I have a Ryzen 5700x which should be about 30% slower than a Ryzen 7800x3d. This means I‘ll probably get 42FPS 😭 I‘m used to get at least 80, so I can Max out the Monitor fps when framegeneration is switched on.
@@bilbobaggins8794you‘re right man. We all just have to tweak settings a bit. Modern requirements are just ridiculus and it’s never enough. It’s better to make a view compromises here and there.
3080 is basically on par with a 4070. But holy fuck, now I need a whole new motherboard, cpu, ram, and power supply just to run this game native resolution at 60 fps
CPUs are so behind it's a joke. I got my new 7800X3D recently and it can't even push a 4090 in a lot of games, and the ones it does, it's not got any headroom for a future GPU upgrade and this CPU is a year newer than the card. The 9800X3D doesn't even fully solve this problem because in some games it doesn't run any better.
I don´t think so, some modder will tweak some hidden settings and your will gain 30 fps or something, it´s always the same... those requirements seem like a joke.
Most of KCD1's "ultra" settings were for future hardware, the game even warns you it is experimental and not meant for current (at the time of release) hardware. Im sure it will be similar in KCD2, where many ultra settings are there as experimental options not meant to actually be used untill there is hardware capable of using it. Im glad they do this personally, makes replaying it later that much better.
Which is a good thing from devs, they realize that HW is not ready for it and you can still lower settings and make it run properly because it's CryEngine and when you lower details, you have massive FPS boost, that doesn't work in Unreal nor id tech engine where FPS is still the same and only thing you can do is to use DLSS/FSR or lower resolution. I still believe in CryEngine.
@@redblue2358 Game was optimized well, I played it for the first time on Radeon R9 380 and Core i5 4670K and it was completely fine in 1080p, just in very populed areas it was dropping, but nothing that bad. People complaining about KCD optimalisation just had potato PC, it's comletely different situation than now with stupid unreal engine games where it doesn't run properly neither on top hardware.
Pretty crazy that with a lot of these new releases you actually need a high-end "overkill" PC with the best CPU and GPU available to hit max settings, whereas last generation even the midrange would have been overkill for most games. The fact that a 4080 or sometimes even 4090 shows up in these system requirements charts at all is insane, I did not expect "halo products" to show up as a requirement for games.
@vintatsh im sorry but the fastest computer hardware you could buy ran metro 2033 at about 30fps at 1080p back in 2010. 4090 gets a lot more than 30fps at 1080p in this game.
I had a 1080Ti when RDR2 released. The only setting I played on "ultra" was textures everything else was a mix of medium, low, and high. So yeah this isn't new this has been a thing for a long time.
@@WeencieRants Also keep in mind that a 1080 ti was much cheaper in 2019 than the equivalent sku in the 40 series lineup even accounting for inflation, which would probably be a 4080 super and is over 1000 bucks.
@ well thankfully I didn’t upgrade to a 4080, I upgraded to a 7900XTX which adjusted for inflation was the same price I paid for my 1080Ti. I have an EVGA FTW3 1080Ti and I upgraded to a Sapphire Nitro + 7900XTX, both are top SKUs and from the premium board partner. Adjusted for inflation I paid $50 more for the 7900XTX.
Very steep CPU and RAM requirements, but quite moderate on GPU front. I don't expect this to run well at launch, but hopefully it's not as CPU bottlenecked as it looks on paper.
KCD 1 was also very cpu heavy and is to this day. This, kcd2 has real-time npcs, alot of them that have active lives outside of the player and schedules so they have a constant cpu load all the time. If u kill bandits and leave their loots, some npcs might pick it up and leave it in my chest in the graveyard etc all in real-time
KCD1 had been made with Ultra settings designed for future hardware, and it still tells you this upon selecting them on modern hardware. Hopefully KCD2 is an improvement here with CPUs because the first one runs like garbage on a 7800X3D at max settings. Even if they are poorly optimized or made for future hardware, they should actually run on CPUs released 5-6 years after the game
Once again TH-cam is taking forever to process the higher resolution version of the video so I'm starting with Early Access for channel members now that the standard definition version is ready. I'll post on the main feed once 4K processes.
I expected even higher CPU requirements, as the first game rekts your CPU even today. I guess it's a CryEngine thing. And the second game will be featuring a large city with who knows how many NPCs rendered with complex behaviours.
Yeah. The Engine + their very deep persistent NPC life simulation (NPCs have a home, they eat, sleep, work and engage in leisure activities, they have inventories etc). That’s what really taxes the CPU.
@@Neuroszima We'll see when the game releases. They pushed the release date forward one week, which likely means they are satisfied with polish. The game might be well optimized, at least better than the first game which is abysmal in that regard.
Seeing the specs for Indiana Jones made me think this one would be monstrous but it looks decent. Luckly by the time it releases i already have a 5700x3d.
@@mttrashcan-bg1ro 120 fps at 4K max with 4090, but it's without path tracing. Early access has RT, but not PT. Path traycing will be added in day 1 patch
The original was a massive cpu hog you could only get a locked 60 fps experience with a 8700k which was the fastest gaming cpu at the time so it won't suprise me.
@@marcelosoares7148 I actually think the Switch version is one of the best ways to play the game. Otherwise I have to lock the game to 40fps in order to have a consistent experience.
Yes, this one will be a CPU hog as well (cry engine + a world full of simulated/ persistent NPCs with their life cycles). Though hey did say that they rewrote a lot of code to make the game more stable and that it is much smoother than KCD1. So I would at least expect a more stable framerate.
@@raskolnikov6443 The fact this game uses a fork of Cryengine means jack, HL1 was made on a quake 1 engine fork but looked better and was faster, despite being bigger and more complex than quake 1. The engine used, has nothing to do with the game being slower or faster, not if the developers using it, have the ability and the knowhow to change the source code that the KCD team has.
Im charging in with my 8700k IDGAF. Ive played so many game that recommend 12900k+ and my cpu honestly does fine even at max 4k/1440p. Only exception was Jedi survivor with RT on before they removed Denuvo.
The first KCD game was also quite demanding on the cpu in Rattay. And also had some forward-looking ultra-high settings, meant for future, more powerful gpus. Fortunately, there were some settings that could be lowered for massive gains and quite small visual impact. I did play it on my R5 2600x & Vega 56 at the time, but having replayed it recently on a 7700 & 4070 ti, it is night and day.
@@userblame632 A 5800x3d should not be aging out of 60 FPS. I'm not being defensive about my CPU either (for what it's worth I have a 7800X3D) but that is ridiculous standard for PC users to have to keep pace with.
I think having so much stronger CPUs than the last Gen of consoles relative to release is really affecting the market. It seems like the roughly R7 3700X of the consoles is being treated like the minimum to hit.
Consoles are not any were near a Ryzen 7 3700x. They have no we’re near as much L3 cache and are clocked significantly lower with lower thermal envelope. They are on par with a Ryzen 7 2700x or Ryzen 5 3600 and Digital Foundry has proven this.
@@FlxKomp Probably runs smooth on a 3050 cause he's running with an i9-14900k lmao, no way with these requirements his 3050 runs smooth unless he's playing at low settings
I bought a 9800x3D recently and was surprised how CPU limited I was in a ton of games despite playing in 4k (DLLS quality usually). Modern engines take a lot of shortcuts to save time and money but its very taxing on the CPU
Yeah, I've said so many times that the 7800X3D wasn't enough for a 4090 and people said I was stupid or my PC had something wrong with it. The 9800X3D definitely helps, but it's still not got the headroom to what the 5090 will be. The high-end 50 series cards will be useless in a lot of these games that are releasing.
@@mttrashcan-bg1ro It's not that the cpu isn't enough it's that the engines and game developers aren't doing fuck all for optimization, so even a cpu that IS enough can be a bottleneck because the engine/game is just that poorly optimized.
@@mattblyther5426 True, sometimes you'd be surprised even in true 4k the bottlenecks that go on. It's not in most game but Cyberpunk and Dragons Dogma 2 were both CPU bottlenecked at times. DD 2 is just unoptimized hell and Cyberpunk its more I was noticing an extreme performance increase in the 1% lows. A lot of Unreal Engine 5 games nowadays have CPU bottlenecks too (or at least did when I was on my 5800x).
I've pre-ordered mine (Germany), but who knows when they're back. Bit salty that this seems to be a non-issue over in the US, meanwhile in Europe we're out of luck currently.
@@123Suffering456 They were completely sold out everywhere after black Friday in the US too, and the only seller on the US Amazon that has one in stock right now wants $834....
Why don't comment the diffrence in VRAM when going from fom 1080P to 1440P as both GPUS have atleast 12G of VRAM and when going to 4K both cards have atleast 16G of VRAM . To me it seems like the higher texures needs more VRAM as well as more GPU power OItherwise good intressting content as ussual keep up the good and interessting content :) And excuse my lack of propper english
is it me or all of a sudden games are getting a bump in system requirements right before the 50 series comes out. Indiana Jones game is pretty high too
Not really, it has been this way for every game that targets the current gen consoles only because they have way better hardware compared to the past gen ones.
I never finished the 1st game, got close to it but just never got there. Recently reinstalled it again for the 3rd time, and the performance drop is quite big in the bigger cities due to heavy CPU loads, so given it's still Cryengine and looks somewhat better from the footages I've seen than the first game. Plus the NPCs seemingly have dynamic interactions now with the Player in the new one. it's likely gonna be really nasty on the CPU.
@GodKitty677 I have a prebuilt Omen with a trashy AIO cooler.(was cheaper than buying like half the components separately back when I bought it) Definitely building everything myself when I inevitably have to upgrade everything other than maybe the GPU and storage...
@@sankro With prebuilt PC you have to be careful. Sometimes the PSU's are just about enough and the motherboard has the bare minimum vrms. Crappy AIO cooler is a sign of less than good things. 100% make the PC yourself next time. At least you can get higher quality for a better price.
I have an 11900k and a 3080ti and I'm thinking the same, my cpu is what's holding me back from a stable 60 in stalker 2. I guess I'm going to have to try to overclock, it's that or upgrade...
@@COMMANDandConquer199 You wont get 60fps stable with a RTX 3080 ti at 1440p maximum settings. Minimum fps at 1440p epic with a 14900ks or 9800x3d and a rtx 3090 is 50.4fps test system. - techpowerup 1% low TPU custom scene. Stalker 2. This game is buggy. Basically you need a rtx 3090 for 1440p60 epic settings and you wont be 100% stable fps wise. It takes around two hours before there's very little stuttering. Its probably because all the additional shaders are now compiled at that point. Shader compilation at startup was much faster with the 14900K. This is compared to the 9800x3d. Basically use dlss it helps with cpu bottlenecked games.
Honestly,I greatly prefer the Cryteck engine over URL engine 5. Optimization on that engine seems to be a big, huge pain in the ass for a lot of developers.
I’m really excited for the game! They mentioned that the game is CPU-intensive, so I would love to see some benchmarks for Kingdom Come: Deliverance 2. I want to build a PC specifically for this game. I’m currently playing the first one and I’m loving it so far. I appreciate you making this video. Keep up the great work!
2600 -> 5800X is quite a jump for just changing settings from low to High and still only have 30FPS. It's a shame we don't get 1080p Ultra, nor 1440p low, because that would tell a lot more.
Its actually no change at all since FPS is reduced to 30 in high settings, while at medium its 60, so i dont understand what are these requirements. What a mess they made.
You need money to upgrade hardware and if most people can't you don't have buyers. I think all of us want the best hardware on the maket but this is the sad truth
@@FastGPU Not everyone buys a brand new car every 1-2 years. What kinda rich ass parents you have? You make butt loads of cash a year? I make around $30 a hour and still think it's stupid you gotta upgrade this often.
I don't own a car or have rich parents, but I can see how the world works before my eyes I only ran games that were several years old for years Currently I'm running a modern rig i7-14700K & 4070 Ti Super most my games are still a few years old because their affordable $5 on sale, look fantastic on my 165Hz 1440p monitor at native 165fps at high to ultra settings Several years from now I may own a 10,000 series GPU & ? CPU I don't give a tinkers damn about the newest titles
I remember buying a 1070 at MSRP of $380 and realizing I could play most of the games at 1080p well over 100 FPS on whatever settings I wanted. That was the whole reason I got a 144Hz monitor. If I use the techpowerup database as a reference, I'd have to spend ~$500-550 dollars to do the same with this game (3060/6600XT 60 FPS gets theoretically doubled by a 4070/7900GRE). According to the first inflation calculator I could find, ~$380 2016 dollars is ~$500 2024 dollars, so at least there's that. But then I remember that plenty of the newest games need $500-600 to barely get 60 FPS at any settings. Oh, well.
I never even finished the first one because it played so bad with a 5900x + 6700XT 12GB. And it's an OLD game, now. Now THAT'S a terribly optimized game.
@@MrBaKalu Well, it came out in 2018 and apparently took until 2024 for a CPU to brute force it. It's been so long since I played it... I mean, no offense, just saying "1440p 60fps" doesn't tell me anything. We both know the 6700XT is more powerful than the 2080 Ti. I run my games at 1440p all max. I seem to recall getting 35-40fps. So, you're running the game on med-high, this is a game that just runs far better on Nvidia, or you're full of it.
@@JustAGuy85 before I had ryzen 7 1700 at it was worse, but not that worse. Only in towns it dropped that low with the same GPU. Dunno why you have such low FPS, something is wrong for sure
@@MrBaKalu Lemme refresh my memory and run the game from the beginning. I know nothing is wrong with my rig. My memory, however, does have issues. Do you play it at 2560x1440 all max? Im gonna run it like that.
I would wager that if you tweek the graphics settings and not just use presets, you'll be able to get better quality with less performance cost. It happens a lot with modern graphics where higher and higher settings give diminishing returns.
I'll be honest I don't notice a difference on KCD itself from High to Ultra... Maybe I'm not looking at the right spot and my vision is not what it was..
Same, and comparing my CPU to the ones listed it seems like it would run at like ~45fps. I really hope the fps values they listed are measured in the city, then I'd be perfectly fine with them.
@@Osprey850shit, and I thought my 13900X3D from AMD/Intel collab will suffice... I guess it is these pesky Vcache cores again not holding voltage, since Pcache cores are fine!
All the modern games talk about is high resolution textures packs and options, but what most of us really need is an option for lower resolution textures. Funny how modern games even on the lowest settings look great, thus still require a strong PC.
@@aquanex07 We have to be kind to them! Mine has the tiny stock cooler as well. I have a much stronger one from an old LGA1155 build, but it's missing the mounting hardware for AM4. Maybe I'll motivate myself to find some.
@@AleksiJoensuu I play 1440p with a 4070 Ti Super and 32 GB RAM but the cpu is going to bottleneck my system hard, I predict this is going to happen mainly in Kutná Hora. System req says 5800x for High settings 1080p at 30 fps, 7600x is around +12% faster than a 5800x... and they are asking that for medium 1440p, my gpu is just going to be useless running this game since I won't be able to use it properly. I expected this game to be cpu intensive as I played many hours to the OG but I didn't expected this to be that intensive. I hope someone does an optimized mod like in the first one, the game was running much better and with better render distance.
If these requirements are real, apparently I'm going to finish this game on TH-cam, buying games is already expensive enough these days, now on top of that, to be able to run a game you need to spend a fortune on the hardware too, absurd.
The price to play at native resolution has reached a point where it's no longer really viable. Now you get to see two kinds of specs: either specs with relatively reasonable GPU, but there are asterisks saying they use upscaling, or you have the specs similar to Indiana Jones and this one, with absurdly high tier GPU to reach native. In both cases I think that the jump in fidelity is not worth what's required in terms of hardware...
Personal opinion - every "ultra mode" should be aimed for current best GPU, becuase ultra setting should be for future proofing game. Not everyone plays games at the time when they release, so having mode for "quite late players" sounds like cool idea to me.
What's wrong with that though? People pay for the highest end graphics card because they want to be able to play at the absolute highest settings. If the highest settings were playable on a 4060 then there'd be no reason to buy a top of the line card. Ultra settings are made to fully take advantage of the best technology on the market, not to be accessible to people who have 8 year old hardware. As long as it is playable and still looks good on weaker hardware there is nothing wrong with the best being saved for those who bought the best
This is what "ultra" means. It unlocks all the power of the current gen GPUs for the best visuals possible at the moment. I'm more worried about the quality of midrange GPUs because they are far from good deals and it's not going to get better very likely.
Built my first gaming PC few months ago and boy am i happy i went with a 7800x3d/ 7900xtx combo as 4K 60 or at worst 1440p set to max were my main performance goals
It's not that it's unoptimized, it's just a very difficult game to run. This isn't some shit game with spaghetti code that they're just relying on hardware to offset, it's a game that is just genuinely filled with features that require a lot of power to run regardless of how much they optimize
Why developers think its a good idea to full send the engine so it takes a $1500 rig to run 1440/60 is beyond me. Just because x3d chips and 4090s exist, doesnt mean everyone has one. Its actually crazy that im getting almost half the frames with a 7900xtx that i was a year ago when i bought it. RIP 6000/3000 series enjoyers
I haven't seen a game yet that stated 7800x3d or the 9800x3d or likewise where my Ryzen 9 5900x and 4080 super hasn't maxed out the game at decent framerates especially with upscaling and FG. So I ignore the CPU requirements tbh because I can.
This is getting silly. Games are getting slower much quicker than PCs are getting faster. I don't want to play games at 60fps. It defeats the object of having a PC!
You forgot that it was always this choice: console with 30fps on a TV with TrueMotion interpolation vs PC with true 60fps and above. That's for the story-focused games, competitive are a different thing.
I wouldn't trust the CPU requirements. As Daniel said, these just happen to be the specs the devs used when testing GPUs. For once, Warhorse is still a pretty small studio, compared to your usual AAA dev (Ubisoft, Rockstar, Bethesda etc.), so I don't belive they had that many rigs to benchmark on. Secondly, Tobi himself said that he has an rtx 3050 at home and it ran smooth even on the older builds which were probably not very well optimized. Lastly, about a week ago there was an event in Czech republic where everyone could try out the game on a PC and the rigs there were some older Alienware systems that definitely didn't have a 7800x3d and it still ran perfectly fine.
it's very interesting the requirements here. just go look up games released around 2016-2018. they list recommended requirements of CPUs from several generations past, like the i5-2600k, or the i7-3770, a 2012 processor, a 6 year old processor (when KCD came out), and the GTX 970, 4 years old at the time, and only a $300 gpu. Now? they're recommending pretty much some of the better CPUs on offer. the 7800x3d is one of the best you can buy for gaming, and the intel chips newer than the 13700k aren't much better for gaming. the GPUs listed are much more expensive in the recommended list, the 4080 super being a $1000 GPU. Indiana jones had similar requirements, it's ultra listed the 4090, a $2000 GPU. More importantly, they're the current generation, meaning anyone with older cards are going to struggle HARD. when KCD came out, i played it at 1080p around 60fps med-high settings on an i3-6100 and GTX 960 with only 8GB DDR4. so, i was somewhere in the middle of the min and rec specs. my CPU had fewer cores than the listed ones, but did alright on the benchmarks compared to the min spec AMD CPU at the time, and my GPU was about on par with the ones they listed with the benefit of 4GB VRAM rather than 2GB. my system cost $600, i'm still using it to type this, and i'm in the market to upgrade now... but if i built a system today, i could go build a $1800 system with a 9700X (or similar), the new 5070 or 9070XT (which is what i'm waiting on), 64GB RAM and still struggle to play this on ultra. This sucks, i can almost no longer afford to be a PC gamer. and i own a home. i'm going to build it, but this may be the last one i build for a long time, i may just buy a PS6 when they come out and say f&ck it next time. especially since GTA 6 still isn't announced for PC. ugh.
It looks like this might be one of those games that actually will be fun, not restricted by political correctness. All the trailers definitely looked fun.
and people will still say that we dont need faster cpus. For us with 144hz+ monitors, none of the cpus on the market are able to even get close to those fps in triple A games.
@@megamanx1291 even in singleplayer games the difference between 60fps and 144hz or more in terms of motion clarity is massive. Unless you are one of those that are braindead and cant notice a difference, for those you have to be born again, no fix for a slow brain.
@@gersongaete1574 I love high FPS but the issue with trying to do high refresh rates with AAA games is the game engine is just not designed with high refresh rate in mind. So you end up getting more stutters which ruins the whole point of having high FPS. So I'd rather game at 80-100 FPS with butter-smooth frame times in AAA titles and leave the high refresh for FPS, Racing games stuff like that. But that's what is awesome about PC you can do whatever you want (so long as your PC is up to the task).
The fact that u need a 7600x for 1440p medium, but a 7800x3d for 1440p high? That doesnt even make any sense since lower settings is more cpu intensive, devs are clueless when it comes to system requirements
I guess you didn't listen to the video. higher settings can include things like higher crowd density, longer render distance, etc, which will increase cpu load more than just having more fps will
@@cire420siuol When you lower the graphics settings, your GPU is able to render more FPS. To compensate for this, the CPU has to send more data to the GPU per second which increases its load. If you were to increase your graphics settings, your frames will drop and your CPU usage will decrease. It's a balancing act.
@@Jasontvnd9 Why do u think cpu's are tested at 1080p instead of 1440p/2160p, same thing applies to game settings, lower settings is less gpu bound which makes it more cpu bound
This is just weird. First the list 1080p @ 60 with a 13600k and then 1080p HIGH with a WORSE CPU. I had to make a double take and realize it was HALF the FPS. But why would they show it like that? They're basically saying; you can have higher settings with worse FPS with a worse CPU. Like really?
There is a chance 8400 gets more than 30, like getting 30-50fps but 30 is a stable lock . Same with the higher end cpus. Would have to wait til release to tell tho
whats with the supposed parity between the 13700k and the 7800x3d? theres a massive diff betwen those two. Could it be that this game needs more than 8 cores?
Really makes me feel the age of my 5900x looking at these recommendations. It was bound to happen, only hope I can get my hands on the 9800X3D until then because I don't doubt it'll be CPU-heavy like the first KCD was and these requirements suggest.
First AD that I have seen that shows and actually valuable product, the redmagic is absolutely one of the best value proposals as a gaming phone in every aspect.
You can see the game stutters at every camera change even in the recent IGN gameplay, this is going to be SO good lol Can't wait for ppl to defend the game with some bs like "It has real time NPC with their own routines bla bla bla" like in the recent 20+ years no other game had similar or better systems lmao
7:43 I would like to emphesize this point alot more by saying. These specs are "kinda deceiving" but I think they weren't done on purpose with ill intent. If you take a good look at that chart everything here is just contradicting each other. The requirements for 1080p medium and 1440p medium would wipe 1080p high specs. The i5-13600k or r5 7600x would wipe core i5 12600k and ryzen 7 5800x. Same for the guy listed in these the 6600xt, rtx 3060, 3060ti, 6700xt would also wipe the floor with 2060 super and rx 5700. The only reason I think this is how it is is because of the ram amount in each that make 32gb > 24gb but in cpu and gpu both of these are better than 1080p high. So dont let the gpu and cpu fool you.
30fps with a 4080/7800x3d at 4k is dumb. Pass instantly for me. I find older games so much better because these new high demanding graphics do not improve the games for me. Games looked great a bit ago and played much better. I want great stories with character relationships, and good gameplay and features. Stop with the bad optimized games that really don't look much better. Keep your shadows and reflections. Indie games rule!
"I can't play this game at 200 fps 20k graphics. Pass. Older games were so much better. I just want good gameplay and story. That's why I won't play this game because I can't play this at ultimate max settings at 200 fps" Oxymoronic idiot.
You'll still get all that in this game, mate. Only have to settle for more modest graphical settings, which is something you should be comfortable doing judging by your comment, and you'll still get great visuals nevertheless.
I feel ok with it. We should expect modern games to be pushing hardware requirements. If it doesn’t it means they aren’t pushing to make games more immersive and rich. PC gaming is supposed to be about upgrading regularly. Your system needs to have a few changes every three years or possibly more often. I currently have: i7 14700K, at 5.7Ghz on pcores, 4.5Ghz ecores 360mm AIO (Deepcore Mystique, it’s actually amazing) 65GB DDR5 ram Several gigs of SSD TUF RTX 3080 34 inch ultra wide 4K - it runs KSP1 at between 80-110 fps in 4K, so hopefully will do ok on KSP2.
Hmm...my xmas gift to myself was a platform upgrade from my old 6600k 16GB ram to AM5 7600x3D 32GB ram while keeping my old 5700xt for now. In theory I should be in decent shape for this if I upscale 1080p to 1440p. Will be interesting to see.
I wouldn't jump ahead. First the premiere, then bugs/lags, then patches and various fixes. Looking at other titles on the market, I have the impression that the requirements will ultimately be lower, although I hope that the optimization will be at the level, not like in some of the latest items...
I really hope my 3800x with 32gigs ram will be good enough. I was severely GPU bottlenecked in the first game and ordered a new GPU the other day, still waiting for the package to arrive. I'll check how hard I'm CPU bottlenecked when it arrives. Fingers crossed I won't want to upgrade to a 5800x3d or something like that straight away. My wallet is scared.
I don't believe these recommended PC specs. First part of KCD is still not playable with comfort after 6 years. Even maximum draw distance makes chicken appear and disappear in ~40-50 meters, and lowest native fps at those settings in Rattay is 30 fps on ryzen 7500f, RTX 4070 super, 32Gb DDR5 6000MHz, SSD, 1440p.
Just upgraded my Ram along with my CPU.. all the comments on forums and reddit about 32GB BEING PLENTY for a long time are hard to believe after seeing this. Luckilly i went with 64 in the end.
Did one of devs say he was running the game on a 3050 with a 3600x at home and it ran fine? Why this sudden leap on some of this stuff. STALKER 2 as well had 2070 and a 3600x up till the day of release as recommended and they ghost switched in the new requirements.
I wonder how consoles, with their meager CPUs are going to manage 60 Fps (especially the ps5 pro @2160p upscaled when native 1440p medium recommends a 7600x on PC...).
@@iBaZiic now that I think about it, they specified 60 _unlocked with VRR_ . Probably because the console can't sustain 60. Without a VRR screen that 60 is probably going to feel abysmal.
@@etienne1062 60 unlocked with vrr means that it can go above 60 and it needs vrr to do it, like other games. Or else they would have said up to 60 locked with vrr. Why would you remove the fps cap if it can't go above it ? But tbh i do expect some dips here and there and the games to go into the mid 40 or 50 fps in some areas. But if it's steady 60 fps most of the time i would be happy with that.
@@iBaZiic why cap to 60 when you allow frame rate to fluctuate anyway ? Sure it may go above, but given the PC requirements, it's most certainly because it will spend a lot of time _below_ 60.
@danielowentech just hope someone will test and analyse the current state of KCD 1 more or less deeply with 9800X3D and some range of CPUs/GPUs like in this recommendation, to give a perspective for people who want to play KCD 2, but have an idea about buying some pc parts on sales / before the tarriffs.
trying to figure out exactly what the game is doing, to require such high sys requirements for 2k. I tend to playin 4k. I think I will pass on this game for the moment until I see some benchmarks.
I'm at the medium 1440p 60 FPS range but I still don't think it will hit a smooth 60 always. I'm currently playing the first game and it runs like a dream at 1440p very high settings 120+ FPS. Was really hoping for something similar but that now seems like a pipe dream. And yes please do testing for this game! I'm not buying it until I see your results.
The original game dips below 60 in certain instances on my new PC, which has a 7800X3D. But my average is still well over 90fps for the most part. None of this is surprising to anyone who has played this game or a Cryengine game.
Check out the amazing REDMAGIC 10 Pro gaming phone! bit.ly/3Ozg22H #redmagic; #redmagic10pro; #technology; #smartphone
I'm using the redmagic 6r right now, looking to upgrade at some point.
These phones have all of the performance you need, without the extra markup like other "so called" gaming phones.
Peak
@@RongDMemer confident in my rig for this I712700k RTX3070 32gbs ram going to be playing 1080p so i'll probably be able to run it at 60 in ultra
Is the Red magic phone unlocked so it can be used with any carrier?
@@NotaFollower11 i5-12400F and 3050 (got it for cheap don't diss me)
Probably gonna get B580.
Not really into kingdom com it's a lame game for such a requirement.
I don't see enough ppl prasising the devs actually just giving us straight facts and telling us "hey this is what you need to run the game at NATIVE" no DLSS or required framegen.
So true
imagine we need to start praising game companies for doing the most basic ethical move because all others don't dare do it.
@@resltessmindtwr Unfortunately that's the current landscape, so gotta take the small wins were possible and then use them to make better changes in the gaming industry when it comes to optimisations and transparency over the basic spec list.
if the game looks 10% better than kcd1 but requires x10 times the hardware it's hard to praise
@@armandosillones2643can blame this one on ue5 as well...
For people new into the series: Yes, the original KCD struggled with CPU usage and continues to struggle even to this day, there's a lot of real-time simulation as well as dozens of NPCs with actual schedules, personalities, traits and jobs and they all exist in real time in the world while going on with their lives. KCD2 looks like its going to be even bigger and better, so expect similar issues but adapted to current CPUs.
But the fact it requires a 7800x in modern day for even just 1440
@@gon720Good. Making progress is hard sometimes... It's also 7600X recommended medium, not required.
Morrowind had it too
@@meqdadfnsure it did buddy
one of the coolest unexpected mechanic is when i cleaned some bandits but didn't loot their gears, i accidentally found that their gears has been transported to cemetery caretaker / gravedigger chest when i was thieving in Rattay near the church
You should definitely cover Kingdom Come: Deliverance 2 performance. It's a pretty major release for next year so it would be helpful for people on the fence.
Requiring a 7800x3d which was the best cpu up until a month ago for 1440p high is wild.
Cyberpunk also requires the ryzen 7800x3d since the 2.0 update that was like 2 years ago
@@emanuelriquelme1133still ran great at 1440p on my ryzen 3600x
@ considering I was playing 2.0 1440p high on an i7 9700k, that’s not true at all.
@@Zer0Hour17 Cap brother, that game runs fantastic on my rig and my CPU is nowhere near a 7800x3D
UE5 at its finest.
“Oh, you don’t need more than 16 gig ram unless you’re multitasking”, said everybody.
Lol
That was quite a while ago. Now you have single sticks of Ram that are 16GB, so a kit of 2 is 32GB, and that is on the low end.
"oh, what people said has been true for years by now, but when the very first example of a game that asks for more than 16gb comes up which by the way is a game known for very heavy cpu and thus ram utilization, and even though we are not sure if It's over exaggerated by the devs to be safe, or a case of bad optimization, I'll still try to be a smart ass about it", said you
"The 7800X3D/9800X3D is only to play at 1080p 480FPS with a 4090, there's no real use case for those CPUs",
Why said that?
Team 360p over here.
Will still need a beefy CPU though.
@Tien1million Well, my comment was meant for something else but this works as well :) Anyway I have a 13700kf so I should be fine.. I hope..
I have a Ryzen 5700x which should be about 30% slower than a Ryzen 7800x3d.
This means I‘ll probably get 42FPS 😭 I‘m used to get at least 80, so I can Max out the Monitor fps when framegeneration is switched on.
@@K.R.98 There will probably be a few settings we can dial down for decent performance, besides FG works wonders when CPU limited.
@@bilbobaggins8794you‘re right man. We all just have to tweak settings a bit. Modern requirements are just ridiculus and it’s never enough. It’s better to make a view compromises here and there.
3080 is basically on par with a 4070. But holy fuck, now I need a whole new motherboard, cpu, ram, and power supply just to run this game native resolution at 60 fps
Just like new stalker
CPUs are so behind it's a joke. I got my new 7800X3D recently and it can't even push a 4090 in a lot of games, and the ones it does, it's not got any headroom for a future GPU upgrade and this CPU is a year newer than the card. The 9800X3D doesn't even fully solve this problem because in some games it doesn't run any better.
I don´t think so, some modder will tweak some hidden settings and your will gain 30 fps or something, it´s always the same... those requirements seem like a joke.
3080ti is nearer, 4070 is clearly stronger than 3080
@@mttrashcan-bg1ro i was considering a 7800x3d when i will upgrade...
Most of KCD1's "ultra" settings were for future hardware, the game even warns you it is experimental and not meant for current (at the time of release) hardware. Im sure it will be similar in KCD2, where many ultra settings are there as experimental options not meant to actually be used untill there is hardware capable of using it. Im glad they do this personally, makes replaying it later that much better.
Which is a good thing from devs, they realize that HW is not ready for it and you can still lower settings and make it run properly because it's CryEngine and when you lower details, you have massive FPS boost, that doesn't work in Unreal nor id tech engine where FPS is still the same and only thing you can do is to use DLSS/FSR or lower resolution. I still believe in CryEngine.
Yep, I played it recently and had 60 fps with occasional drops to mid 50 in Rattay with all ultra settings in 4k on 4080 7800x3d rig
More like they were too lazy to optimize it so they just thought “time will create the illusion of optimization”
@@redblue2358 Game was optimized well, I played it for the first time on Radeon R9 380 and Core i5 4670K and it was completely fine in 1080p, just in very populed areas it was dropping, but nothing that bad.
People complaining about KCD optimalisation just had potato PC, it's comletely different situation than now with stupid unreal engine games where it doesn't run properly neither on top hardware.
@@Pidalin probably true. Screw unreal engine
Pretty crazy that with a lot of these new releases you actually need a high-end "overkill" PC with the best CPU and GPU available to hit max settings, whereas last generation even the midrange would have been overkill for most games.
The fact that a 4080 or sometimes even 4090 shows up in these system requirements charts at all is insane, I did not expect "halo products" to show up as a requirement for games.
pretty normal to require a 4080+ for ultra settings without upscaling, most games show with upscaling, this was shown with no upscaling
@vintatsh im sorry but the fastest computer hardware you could buy ran metro 2033 at about 30fps at 1080p back in 2010. 4090 gets a lot more than 30fps at 1080p in this game.
I had a 1080Ti when RDR2 released. The only setting I played on "ultra" was textures everything else was a mix of medium, low, and high. So yeah this isn't new this has been a thing for a long time.
@@WeencieRants Also keep in mind that a 1080 ti was much cheaper in 2019 than the equivalent sku in the 40 series lineup even accounting for inflation, which would probably be a 4080 super and is over 1000 bucks.
@ well thankfully I didn’t upgrade to a 4080, I upgraded to a 7900XTX which adjusted for inflation was the same price I paid for my 1080Ti. I have an EVGA FTW3 1080Ti and I upgraded to a Sapphire Nitro + 7900XTX, both are top SKUs and from the premium board partner. Adjusted for inflation I paid $50 more for the 7900XTX.
Very steep CPU and RAM requirements, but quite moderate on GPU front. I don't expect this to run well at launch, but hopefully it's not as CPU bottlenecked as it looks on paper.
4080 for 30fps in 4K or just 60 in 2K ? yeah nah let's stop the cap
KCD 1 was also very cpu heavy and is to this day.
This, kcd2 has real-time npcs, alot of them that have active lives outside of the player and schedules so they have a constant cpu load all the time.
If u kill bandits and leave their loots, some npcs might pick it up and leave it in my chest in the graveyard etc all in real-time
@@rasengan367tna They don't want their game to look dated in 4 years. It doesn't hurt you to select High or Medium.
KCD1 had been made with Ultra settings designed for future hardware, and it still tells you this upon selecting them on modern hardware. Hopefully KCD2 is an improvement here with CPUs because the first one runs like garbage on a 7800X3D at max settings. Even if they are poorly optimized or made for future hardware, they should actually run on CPUs released 5-6 years after the game
if its cryengine again cpus will get destroyed. i5-8400 in the first game is still pretty stuttery in towns
Once again TH-cam is taking forever to process the higher resolution version of the video so I'm starting with Early Access for channel members now that the standard definition version is ready. I'll post on the main feed once 4K processes.
I expected even higher CPU requirements, as the first game rekts your CPU even today. I guess it's a CryEngine thing. And the second game will be featuring a large city with who knows how many NPCs rendered with complex behaviours.
Yeah. The Engine + their very deep persistent NPC life simulation (NPCs have a home, they eat, sleep, work and engage in leisure activities, they have inventories etc). That’s what really taxes the CPU.
isnt 7800x3d like top 3 cpu in the world lmao?
To be fair, i expect these specs to not match reality
@@splasherrrr I thought more of the minimum spec CPUs. Expected at least mid Ryzen 3xxx and Intel 9xxx series.
@@Neuroszima We'll see when the game releases. They pushed the release date forward one week, which likely means they are satisfied with polish. The game might be well optimized, at least better than the first game which is abysmal in that regard.
Seeing the specs for Indiana Jones made me think this one would be monstrous but it looks decent. Luckly by the time it releases i already have a 5700x3d.
Yeah and they were a gross exaggeration, it said a 4090 was good for 4k 60fps when it gets 120fps without DLSS.
@@mttrashcan-bg1ro what game are u talking about KCD2 or indiana jones? and how do u know how well for example KCD2 will run?
I knew Indie wouldn't be that demanding. Most games on Idtech engines run really well even with ray tracing.
@@mttrashcan-bg1ro 120 fps at 4K max with 4090, but it's without path tracing. Early access has RT, but not PT. Path traycing will be added in day 1 patch
I'm afraid 5700X3D won't be enough with RT. W/o RT should be fine tho.
The original was a massive cpu hog you could only get a locked 60 fps experience with a 8700k which was the fastest gaming cpu at the time so it won't suprise me.
The game is still a mess, i had drops into 40s on my 5900X and that was with a config mod, which made the game run a bit better.
A miracle that they got it running on the Switch somehow.
@@marcelosoares7148 I actually think the Switch version is one of the best ways to play the game. Otherwise I have to lock the game to 40fps in order to have a consistent experience.
Yes, this one will be a CPU hog as well (cry engine + a world full of simulated/ persistent NPCs with their life cycles). Though hey did say that they rewrote a lot of code to make the game more stable and that it is much smoother than KCD1. So I would at least expect a more stable framerate.
@@raskolnikov6443 The fact this game uses a fork of Cryengine means jack, HL1 was made on a quake 1 engine fork but looked better and was faster, despite being bigger and more complex than quake 1.
The engine used, has nothing to do with the game being slower or faster, not if the developers using it, have the ability and the knowhow to change the source code that the KCD team has.
Im charging in with my 8700k IDGAF. Ive played so many game that recommend 12900k+ and my cpu honestly does fine even at max 4k/1440p. Only exception was Jedi survivor with RT on before they removed Denuvo.
ya I'm doing the same with my 5yo build of 9700k, 2080 and 16GB ram. hoping to achieve 1440p medium or at least 1080p medium.
Your 8700k does fine at max 4k settings? Tf are you playing? Cuphead? 😂
The first KCD game was also quite demanding on the cpu in Rattay. And also had some forward-looking ultra-high settings, meant for future, more powerful gpus. Fortunately, there were some settings that could be lowered for massive gains and quite small visual impact. I did play it on my R5 2600x & Vega 56 at the time, but having replayed it recently on a 7700 & 4070 ti, it is night and day.
Hello, I am Lazaro Dias from a small town on Brazil called Britania, and I love all your videos, thank you for your work.
Nobody cares
Hello Lazaro Dias, it's great to see someone else enjoying these videos! Don't listen to Herman, he's a scumbag!
@@hermannhand4557Don’t be a douche.
Hi Lazaro
@@hermannhand4557 I bet you're a really fun person at gatherings.
People crank up all settings to max without considering that some are more taxing than others and that you won't always notice a difference
I don't even notice a difference in KCD from High to Ultra. .. IF KCD2 is the same, it won't matter again.
Honestly I dont even know where to get time from to even play or the money to buy the hardware. For real
Just get a console bud
@@marshal7969lol no. Consoles won't run it well
Get a sugar daddy
@@MaxHardcore-p7t not even the PS5 pro?
@@Mike7064 PS5 pro is overrated.
So according to these recommendations, my 5800X 3D and 4080 combo can’t hit 60 FPS at all times at native resolution. Unreal.
According to them, you are CPU-limited at 60 FPS so upscaling won't help. It is obviously fake.
Yeah man... new game devs are completely shitting on the PC community.. oof.
I mean the 5800x3d is aging at this point, if youre gonna get a 4080 id want something newer for sure.
@@userblame632 A 5800x3d should not be aging out of 60 FPS. I'm not being defensive about my CPU either (for what it's worth I have a 7800X3D) but that is ridiculous standard for PC users to have to keep pace with.
You'll be able to get more than 60 fps with the 5800x3d. It's performance is the same or better than the 7600x in games
I think having so much stronger CPUs than the last Gen of consoles relative to release is really affecting the market. It seems like the roughly R7 3700X of the consoles is being treated like the minimum to hit.
Consoles are not any were near a Ryzen 7 3700x. They have no we’re near as much L3 cache and are clocked significantly lower with lower thermal envelope. They are on par with a Ryzen 7 2700x or Ryzen 5 3600 and Digital Foundry has proven this.
They are always a lie though. You never actually have to get anywhere near what they claim for CPU's in these lists.
This is why when It comes to CPU's, I buy more power/performance than I actually need.
@@ZackSNetworkeveryone knows the cpu is equivalent to the ryzen 7 3700x. Why does people always need to downplay the ps5 specs and performances
3700x is a great cpu still, I paired my kids computer with a 6700xt and it crushes anything they want to play
I have a feeling these requirements are on the overstated side
Cant wait KCD1 was my favorite game of last gen era. Glad I got my 7800x3d a few months ago.
Didn't the dev say he is playing this game at high settings 60fps on his "old home machine" ?
Yeah his old home machine with an antique 7800x3D/4090 combo ROFL
He said it runs smooth on his 3050
@@FlxKomp Probably runs smooth on a 3050 cause he's running with an i9-14900k lmao, no way with these requirements his 3050 runs smooth unless he's playing at low settings
Just as Henry is hungry for food, so too is Bohemia Simulator hungry for power and technology.
I bought a 9800x3D recently and was surprised how CPU limited I was in a ton of games despite playing in 4k (DLLS quality usually). Modern engines take a lot of shortcuts to save time and money but its very taxing on the CPU
Yeah, I've said so many times that the 7800X3D wasn't enough for a 4090 and people said I was stupid or my PC had something wrong with it. The 9800X3D definitely helps, but it's still not got the headroom to what the 5090 will be. The high-end 50 series cards will be useless in a lot of these games that are releasing.
@@mttrashcan-bg1ro It's not that the cpu isn't enough it's that the engines and game developers aren't doing fuck all for optimization, so even a cpu that IS enough can be a bottleneck because the engine/game is just that poorly optimized.
Turn off dlss and watch that cpu usage drop.
@@mattblyther5426 True, sometimes you'd be surprised even in true 4k the bottlenecks that go on. It's not in most game but Cyberpunk and Dragons Dogma 2 were both CPU bottlenecked at times. DD 2 is just unoptimized hell and Cyberpunk its more I was noticing an extreme performance increase in the 1% lows. A lot of Unreal Engine 5 games nowadays have CPU bottlenecks too (or at least did when I was on my 5800x).
@@mttrashcan-bg1ro Yup the super CPU intensive (or unoptimized games) are still bottlenecking in some cases on 4K even native. Pretty crazy.
looks like a 9800X3D is in my future when they come back in stock
Got mine two days ago ordered from Amazon.
@@ZackSNetwork i am in europe and i am not going to pay scalper prices
I've pre-ordered mine (Germany), but who knows when they're back. Bit salty that this seems to be a non-issue over in the US, meanwhile in Europe we're out of luck currently.
@@123Suffering456 They were completely sold out everywhere after black Friday in the US too, and the only seller on the US Amazon that has one in stock right now wants $834....
I'll have to save up for now, got a 5800X3D 2 years ago. I guess I'm getting the 11800X3D?
Why don't comment the diffrence in VRAM when going from fom 1080P to 1440P as both GPUS have atleast 12G of VRAM and when going to 4K both cards have atleast 16G of VRAM .
To me it seems like the higher texures needs more VRAM as well as more GPU power
OItherwise good intressting content as ussual keep up the good and interessting content :)
And excuse my lack of propper english
is it me or all of a sudden games are getting a bump in system requirements right before the 50 series comes out. Indiana Jones game is pretty high too
Not really, it has been this way for every game that targets the current gen consoles only because they have way better hardware compared to the past gen ones.
@ the current gen consoles are equal to a 2080ti
@@mikeramos91 Yeah, but that's way better than a PS4/XB1 so games don't scale as low as they used too anymore.
which means it is like what, targeting an "average of 30fps quality setting on PS5"? That is so laughable omg
@@mikeramos91 ps5 pro maybe
I never finished the 1st game, got close to it but just never got there. Recently reinstalled it again for the 3rd time, and the performance drop is quite big in the bigger cities due to heavy CPU loads, so given it's still Cryengine and looks somewhat better from the footages I've seen than the first game. Plus the NPCs seemingly have dynamic interactions now with the Player in the new one. it's likely gonna be really nasty on the CPU.
My rtx 3080 will definitely handle it, but I think my poor i7-11700k will certainly act as a bottleneck
Not if you overclock it and tune the memory. With a 10900k I will beat a 5800x no problems. Fully overclocked and tuned.
@GodKitty677 I have a prebuilt Omen with a trashy AIO cooler.(was cheaper than buying like half the components separately back when I bought it) Definitely building everything myself when I inevitably have to upgrade everything other than maybe the GPU and storage...
@@sankro With prebuilt PC you have to be careful. Sometimes the PSU's are just about enough and the motherboard has the bare minimum vrms. Crappy AIO cooler is a sign of less than good things. 100% make the PC yourself next time. At least you can get higher quality for a better price.
I have an 11900k and a 3080ti and I'm thinking the same, my cpu is what's holding me back from a stable 60 in stalker 2.
I guess I'm going to have to try to overclock, it's that or upgrade...
@@COMMANDandConquer199 You wont get 60fps stable with a RTX 3080 ti at 1440p maximum settings. Minimum fps at 1440p epic with a 14900ks or 9800x3d and a rtx 3090 is 50.4fps test system. - techpowerup 1% low TPU custom scene. Stalker 2. This game is buggy. Basically you need a rtx 3090 for 1440p60 epic settings and you wont be 100% stable fps wise. It takes around two hours before there's very little stuttering. Its probably because all the additional shaders are now compiled at that point. Shader compilation at startup was much faster with the 14900K. This is compared to the 9800x3d.
Basically use dlss it helps with cpu bottlenecked games.
Cryengine, not nanites or lumen, it'll be smoother than Stalker 2 for starters.
Honestly,I greatly prefer the Cryteck engine over URL engine 5. Optimization on that engine seems to be a big, huge pain in the ass for a lot of developers.
And yet performance in Rattay is meh even with GeForce 4090 and Ryzen 7950X in 4K on ultra settings.
I’m really excited for the game! They mentioned that the game is CPU-intensive, so I would love to see some benchmarks for Kingdom Come: Deliverance 2. I want to build a PC specifically for this game. I’m currently playing the first one and I’m loving it so far. I appreciate you making this video. Keep up the great work!
Warhorse collaborated with Alza CZ (a Czech Amazon type company) and made a PC specifically to run KCD2 smoothly. The specs are online.
@afiiik1 Sounds great, thank you 🙏🏼
The search begins!
16 GB for LOW? What are these newer games using all that memory for? It can't be textures since those should be potato-quality on low.
Ram literally costs nothing... Go buy more and stop crying
Consoles have 16gb, it was obvious since these consoles came out that pc is going 32 for mainstream...
most games nowadays don't look potato-quality on low though
spoiler: its textures
Its 2024 get more RAM clown
2600 -> 5800X is quite a jump for just changing settings from low to High and still only have 30FPS.
It's a shame we don't get 1080p Ultra, nor 1440p low, because that would tell a lot more.
Its actually no change at all since FPS is reduced to 30 in high settings, while at medium its 60, so i dont understand what are these requirements. What a mess they made.
KCD2 will also feature major city (Kuttenberg) so as with BG3, expect CPU usage will be heavy especially there
Whats insane is thinking video game quality is going to stagnate because its upsets those with older systems / lower spec setups
You need money to upgrade hardware and if most people can't you don't have buyers. I think all of us want the best hardware on the maket but this is the sad truth
New Cars keep increasing in price and features, its life and affordability has little to do with it New and improved sells, you don't have to like it
@@FastGPU yeah... even the car market is also in crisis for the same reasons...
@@FastGPU Not everyone buys a brand new car every 1-2 years. What kinda rich ass parents you have? You make butt loads of cash a year? I make around $30 a hour and still think it's stupid you gotta upgrade this often.
I don't own a car or have rich parents, but I can see how the world works before my eyes I only ran games that were several years old for years Currently I'm running a modern rig i7-14700K & 4070 Ti Super most my games are still a few years old because their affordable $5 on sale, look fantastic on my 165Hz 1440p monitor at native 165fps at high to ultra settings Several years from now I may own a 10,000 series GPU & ? CPU I don't give a tinkers damn about the newest titles
I remember buying a 1070 at MSRP of $380 and realizing I could play most of the games at 1080p well over 100 FPS on whatever settings I wanted. That was the whole reason I got a 144Hz monitor. If I use the techpowerup database as a reference, I'd have to spend ~$500-550 dollars to do the same with this game (3060/6600XT 60 FPS gets theoretically doubled by a 4070/7900GRE). According to the first inflation calculator I could find, ~$380 2016 dollars is ~$500 2024 dollars, so at least there's that. But then I remember that plenty of the newest games need $500-600 to barely get 60 FPS at any settings. Oh, well.
I never even finished the first one because it played so bad with a 5900x + 6700XT 12GB. And it's an OLD game, now. Now THAT'S a terribly optimized game.
I have ryzen 7 9800X3D and RTX 2080 and it runs 60+ FPS on ultra in 2k, how is it bad?
@@MrBaKalu Well, it came out in 2018 and apparently took until 2024 for a CPU to brute force it.
It's been so long since I played it... I mean, no offense, just saying "1440p 60fps" doesn't tell me anything. We both know the 6700XT is more powerful than the 2080 Ti. I run my games at 1440p all max. I seem to recall getting 35-40fps.
So, you're running the game on med-high, this is a game that just runs far better on Nvidia, or you're full of it.
@@JustAGuy85 before I had ryzen 7 1700 at it was worse, but not that worse. Only in towns it dropped that low with the same GPU.
Dunno why you have such low FPS, something is wrong for sure
@@MrBaKalu Lemme refresh my memory and run the game from the beginning. I know nothing is wrong with my rig.
My memory, however, does have issues. Do you play it at 2560x1440 all max? Im gonna run it like that.
@@JustAGuy85 yes all max and with HD textures
I would wager that if you tweek the graphics settings and not just use presets, you'll be able to get better quality with less performance cost. It happens a lot with modern graphics where higher and higher settings give diminishing returns.
I'll be honest I don't notice a difference on KCD itself from High to Ultra... Maybe I'm not looking at the right spot and my vision is not what it was..
This is the only modern release I'm interested in. Looks like my current system will be Low-Medium.
Same. I don't really mind - even the first game looked quite nice on lower settings!
Low these days + High textures is already very pretty.
Same, and comparing my CPU to the ones listed it seems like it would run at like ~45fps.
I really hope the fps values they listed are measured in the city, then I'd be perfectly fine with them.
mine will be ultra, but its KCD.. i expect it to be rough still. luckily there is AFMF2
As much as good the game might be (TBD), these guys have never been known for optimization skills...the first KCD performance was all over the place
Cant wait to play it with my recently bought 17800x3D CPU!
@@Plague_Doc22 exactly. Shits out of hand.
@@PneumaticTire It is kind of insane.
this comment will age well
I have bad news for you. To get a good experience, you'll need the 19800X3D.
@@Osprey850shit, and I thought my 13900X3D from AMD/Intel collab will suffice... I guess it is these pesky Vcache cores again not holding voltage, since Pcache cores are fine!
All the modern games talk about is high resolution textures packs and options, but what most of us really need is an option for lower resolution textures. Funny how modern games even on the lowest settings look great, thus still require a strong PC.
My 5600X will handle it. I believe in you little buddy!
Also hoping a 2080 will be enough for 4K somehow if I crank up the DLSS.
I have a 5600x too! We are in this together brother!
@@aquanex07 We have to be kind to them!
Mine has the tiny stock cooler as well. I have a much stronger one from an old LGA1155 build, but it's missing the mounting hardware for AM4. Maybe I'll motivate myself to find some.
5800x here, is going to be wild…
@@Hispanvs You should be just fine. 7600X recommended for 60fps.
@@AleksiJoensuu I play 1440p with a 4070 Ti Super and 32 GB RAM but the cpu is going to bottleneck my system hard, I predict this is going to happen mainly in Kutná Hora. System req says 5800x for High settings 1080p at 30 fps, 7600x is around +12% faster than a 5800x... and they are asking that for medium 1440p, my gpu is just going to be useless running this game since I won't be able to use it properly. I expected this game to be cpu intensive as I played many hours to the OG but I didn't expected this to be that intensive. I hope someone does an optimized mod like in the first one, the game was running much better and with better render distance.
If these requirements are real, apparently I'm going to finish this game on TH-cam, buying games is already expensive enough these days, now on top of that, to be able to run a game you need to spend a fortune on the hardware too, absurd.
what's the deal with Devs setting their 4KUltra60fps "mode" only achievable by the 4090?
The price to play at native resolution has reached a point where it's no longer really viable.
Now you get to see two kinds of specs: either specs with relatively reasonable GPU, but there are asterisks saying they use upscaling, or you have the specs similar to Indiana Jones and this one, with absurdly high tier GPU to reach native. In both cases I think that the jump in fidelity is not worth what's required in terms of hardware...
Personal opinion - every "ultra mode" should be aimed for current best GPU, becuase ultra setting should be for future proofing game. Not everyone plays games at the time when they release, so having mode for "quite late players" sounds like cool idea to me.
What's wrong with that though? People pay for the highest end graphics card because they want to be able to play at the absolute highest settings. If the highest settings were playable on a 4060 then there'd be no reason to buy a top of the line card. Ultra settings are made to fully take advantage of the best technology on the market, not to be accessible to people who have 8 year old hardware. As long as it is playable and still looks good on weaker hardware there is nothing wrong with the best being saved for those who bought the best
you must be new to PC gaming.
This is what "ultra" means. It unlocks all the power of the current gen GPUs for the best visuals possible at the moment. I'm more worried about the quality of midrange GPUs because they are far from good deals and it's not going to get better very likely.
Built my first gaming PC few months ago and boy am i happy i went with a 7800x3d/ 7900xtx combo as 4K 60 or at worst 1440p set to max were my main performance goals
Guess it's starting. "Optimize our game? Nah, just put 9800X3D and 5090Ti as the requirememts with 128GB of RAM to cover the memory leaks!"
Depends how much the devs depends on Temporal stuff as it tanks and destroys performance.
It's not that it's unoptimized, it's just a very difficult game to run. This isn't some shit game with spaghetti code that they're just relying on hardware to offset, it's a game that is just genuinely filled with features that require a lot of power to run regardless of how much they optimize
@@HatsunePeeku th-cam.com/video/lJu_DgCHfx4/w-d-xo.html
Why developers think its a good idea to full send the engine so it takes a $1500 rig to run 1440/60 is beyond me. Just because x3d chips and 4090s exist, doesnt mean everyone has one. Its actually crazy that im getting almost half the frames with a 7900xtx that i was a year ago when i bought it. RIP 6000/3000 series enjoyers
Medium settings 60 FPS is a 7600x but high settings 60 FPS is a 7800X3D. It makes no sense at all.
The difference lies in the GPU. My guess is "high" settings include ray tracing which is CPU heavy
@@cks2020693 Wy would they do that? This is just misleading and a bad ad. It seems system requirement are hardly ever accurate.
draw distance
@@cks2020693 there wont be rtx in kcd 2
@@ravingcrab8405 16% more for draw distance? Holy, the distant mountains are gonna look good.
😅
I haven't seen a game yet that stated 7800x3d or the 9800x3d or likewise where my Ryzen 9 5900x and 4080 super hasn't maxed out the game at decent framerates especially with upscaling and FG. So I ignore the CPU requirements tbh because I can.
Cyberpunk phantom liberty requires 7800x3d for 1080p lmao
This is getting silly. Games are getting slower much quicker than PCs are getting faster. I don't want to play games at 60fps. It defeats the object of having a PC!
True
Ok. Then you are not interested in playing the good story this game has. Go play Warzone at 200 fps.
@verigumetin4291 Nope, not even slightly. I'll play it in a few years time.
You forgot that it was always this choice: console with 30fps on a TV with TrueMotion interpolation vs PC with true 60fps and above.
That's for the story-focused games, competitive are a different thing.
Deep world and NPC simulation are worth it IMO. So few games do this. But yeah, it’s super taxing, especially with this already CPU taxing engine.
Ya give us benchmarks. I'm super looking forward to this game. Glad I have a 7800x3D
Are you going to benchmark the Indiana Jones game? Especially since it directly references the RX 6600, looking forward to hearing your thoughts.
it seem to run fine even on lower GPU's
I wouldn't trust the CPU requirements. As Daniel said, these just happen to be the specs the devs used when testing GPUs. For once, Warhorse is still a pretty small studio, compared to your usual AAA dev (Ubisoft, Rockstar, Bethesda etc.), so I don't belive they had that many rigs to benchmark on. Secondly, Tobi himself said that he has an rtx 3050 at home and it ran smooth even on the older builds which were probably not very well optimized. Lastly, about a week ago there was an event in Czech republic where everyone could try out the game on a PC and the rigs there were some older Alienware systems that definitely didn't have a 7800x3d and it still ran perfectly fine.
can't you get one of those chrome night mode extensions so you dont flashbang people lol
yes
it's very interesting the requirements here.
just go look up games released around 2016-2018. they list recommended requirements of CPUs from several generations past, like the i5-2600k, or the i7-3770, a 2012 processor, a 6 year old processor (when KCD came out), and the GTX 970, 4 years old at the time, and only a $300 gpu.
Now? they're recommending pretty much some of the better CPUs on offer. the 7800x3d is one of the best you can buy for gaming, and the intel chips newer than the 13700k aren't much better for gaming. the GPUs listed are much more expensive in the recommended list, the 4080 super being a $1000 GPU. Indiana jones had similar requirements, it's ultra listed the 4090, a $2000 GPU. More importantly, they're the current generation, meaning anyone with older cards are going to struggle HARD.
when KCD came out, i played it at 1080p around 60fps med-high settings on an i3-6100 and GTX 960 with only 8GB DDR4. so, i was somewhere in the middle of the min and rec specs. my CPU had fewer cores than the listed ones, but did alright on the benchmarks compared to the min spec AMD CPU at the time, and my GPU was about on par with the ones they listed with the benefit of 4GB VRAM rather than 2GB. my system cost $600, i'm still using it to type this, and i'm in the market to upgrade now...
but if i built a system today,
i could go build a $1800 system with a 9700X (or similar), the new 5070 or 9070XT (which is what i'm waiting on), 64GB RAM and still struggle to play this on ultra. This sucks, i can almost no longer afford to be a PC gamer. and i own a home. i'm going to build it, but this may be the last one i build for a long time, i may just buy a PS6 when they come out and say f&ck it next time. especially since GTA 6 still isn't announced for PC. ugh.
The whole point of gaming was to have fun it isn't fun anymore
It looks like this might be one of those games that actually will be fun, not restricted by political correctness. All the trailers definitely looked fun.
Gaming’s best days are behind it. People know it. It’s why retro consoles and CRTs are so popular. That being said, this game will be awesome.
and people will still say that we dont need faster cpus. For us with 144hz+ monitors, none of the cpus on the market are able to even get close to those fps in triple A games.
competitive multiplayer games don't have such hardware req anyway. there is no good reason for such a slow game to be played with 100+ fps.
I don't think those high refreshrates are there for recent single player AAA games bro
So you wasted money buying an expensive monitor then.
@@megamanx1291 even in singleplayer games the difference between 60fps and 144hz or more in terms of motion clarity is massive. Unless you are one of those that are braindead and cant notice a difference, for those you have to be born again, no fix for a slow brain.
@@gersongaete1574 I love high FPS but the issue with trying to do high refresh rates with AAA games is the game engine is just not designed with high refresh rate in mind. So you end up getting more stutters which ruins the whole point of having high FPS. So I'd rather game at 80-100 FPS with butter-smooth frame times in AAA titles and leave the high refresh for FPS, Racing games stuff like that. But that's what is awesome about PC you can do whatever you want (so long as your PC is up to the task).
The fact that u need a 7600x for 1440p medium, but a 7800x3d for 1440p high? That doesnt even make any sense since lower settings is more cpu intensive, devs are clueless when it comes to system requirements
Lower settings aren't more CPU-intensive, they're just less GPU-intensive so allow your CPU to hit higher clocks/usage (it's called a bottleneck)...
Lower settings isnt more CPU intensive where did you get that rubbish from?
I guess you didn't listen to the video. higher settings can include things like higher crowd density, longer render distance, etc, which will increase cpu load more than just having more fps will
@@cire420siuol When you lower the graphics settings, your GPU is able to render more FPS. To compensate for this, the CPU has to send more data to the GPU per second which increases its load. If you were to increase your graphics settings, your frames will drop and your CPU usage will decrease. It's a balancing act.
@@Jasontvnd9 Why do u think cpu's are tested at 1080p instead of 1440p/2160p, same thing applies to game settings, lower settings is less gpu bound which makes it more cpu bound
Good thing I game at mostly 4k now and cpu almost doesn’t matter anymore. 7800x3D should be fine at 4k for another 3-4 years for me.
This is just weird. First the list 1080p @ 60 with a 13600k and then 1080p HIGH with a WORSE CPU. I had to make a double take and realize it was HALF the FPS. But why would they show it like that? They're basically saying; you can have higher settings with worse FPS with a worse CPU. Like really?
@paaaatrika 30 fps
Because they want to show you a range of specs.
There is a chance 8400 gets more than 30, like getting 30-50fps but 30 is a stable lock . Same with the higher end cpus. Would have to wait til release to tell tho
whats with the supposed parity between the 13700k and the 7800x3d? theres a massive diff betwen those two. Could it be that this game needs more than 8 cores?
Massive diff only at 1080p, 1440p 5-10 fps difference
"massive difference" delusional amd poojeets
Really makes me feel the age of my 5900x looking at these recommendations. It was bound to happen, only hope I can get my hands on the 9800X3D until then because I don't doubt it'll be CPU-heavy like the first KCD was and these requirements suggest.
i remember when a 1050 could hit high around 60fps in newer games, crazy
Stop lying... 1050 was trash
Delusional lmao
Bro when did 1050 gave you good performance 😭
Wen?
around 2016 it was the 1050ti tho so not what i said before
First AD that I have seen that shows and actually valuable product, the redmagic is absolutely one of the best value proposals as a gaming phone in every aspect.
You can see the game stutters at every camera change even in the recent IGN gameplay, this is going to be SO good lol
Can't wait for ppl to defend the game with some bs like "It has real time NPC with their own routines bla bla bla" like in the recent 20+ years no other game had similar or better systems lmao
ryzen 5 7600x for medium 1080p setting is insane
Don't preorder boys. stay strong 💪
Ugh really hoping this game runs well on launch. We need a performance W for once. Stalker 2 ran like butt cheese and I returned it.
Im not buying this shit, simple as that. 7800x3d shouldnt be in any recommended specs. Thats like putting the 4090 as recommended for 1440p.
At this pace, next year's games will have 9800X3D and 4090 as minimum specs for 720p 30fps.
7:43 I would like to emphesize this point alot more by saying. These specs are "kinda deceiving" but I think they weren't done on purpose with ill intent. If you take a good look at that chart everything here is just contradicting each other. The requirements for 1080p medium and 1440p medium would wipe 1080p high specs. The i5-13600k or r5 7600x would wipe core i5 12600k and ryzen 7 5800x. Same for the guy listed in these the 6600xt, rtx 3060, 3060ti, 6700xt would also wipe the floor with 2060 super and rx 5700. The only reason I think this is how it is is because of the ram amount in each that make 32gb > 24gb but in cpu and gpu both of these are better than 1080p high. So dont let the gpu and cpu fool you.
30fps with a 4080/7800x3d at 4k is dumb. Pass instantly for me. I find older games so much better because these new high demanding graphics do not improve the games for me. Games looked great a bit ago and played much better. I want great stories with character relationships, and good gameplay and features. Stop with the bad optimized games that really don't look much better. Keep your shadows and reflections. Indie games rule!
Have you played Kingdom Come 1?
What an NPC take. You must have not played KDC 1
"I can't play this game at 200 fps 20k graphics. Pass. Older games were so much better. I just want good gameplay and story. That's why I won't play this game because I can't play this at ultimate max settings at 200 fps"
Oxymoronic idiot.
@@verigumetin4291 The real idoits are the people buying the latest overpriced hardware to play unoptimized games.
You'll still get all that in this game, mate. Only have to settle for more modest graphical settings, which is something you should be comfortable doing judging by your comment, and you'll still get great visuals nevertheless.
I feel ok with it. We should expect modern games to be pushing hardware requirements. If it doesn’t it means they aren’t pushing to make games more immersive and rich. PC gaming is supposed to be about upgrading regularly. Your system needs to have a few changes every three years or possibly more often.
I currently have:
i7 14700K, at 5.7Ghz on pcores, 4.5Ghz ecores
360mm AIO (Deepcore Mystique, it’s actually amazing)
65GB DDR5 ram
Several gigs of SSD
TUF RTX 3080
34 inch ultra wide 4K
- it runs KSP1 at between 80-110 fps in 4K, so hopefully will do ok on KSP2.
It doesn’t look much better than the first one.
Sure is a lot more demanding though😂
So with upscaling in mind, it's very close to STALKER 2. on a 5900x and 4070TI I can expect something in the neighborhood of 100fps+-
32GB 💀
Dude, 32GB of RAM in 2024 is very standard. They're not that expensive anymore either.
Since the PS4/XB1, you always want at least double what the current gen consoles have in terms of RAM, so 32GB isn't that far-fetched really.
Consoles have 16gb, PCs should always have more.
Cannot wait to see how this will perform on a 5090/9800X3D system.
Probably good :)
That does it, I’m getting a 14700k.
And before you tell me to get a 9800x3d, don’t bother.
ok, get a 7800x3d.
Wait for a 10800x3d ultra xtx
@@userblame632 😕
Just make sure you get a beefy cooler. I have a 420mm AIO on my 13700K because even that gets toasty.
@@Krenisphia Thanks for the heads up. Does running it at Intel spec help at all with that?
Can you checkout Marvel Rivals, The minimum specs seem a little too high for a game like that
Hmm...my xmas gift to myself was a platform upgrade from my old 6600k 16GB ram to AM5 7600x3D 32GB ram while keeping my old 5700xt for now. In theory I should be in decent shape for this if I upscale 1080p to 1440p. Will be interesting to see.
GTX 1060 is an 8 years old videocard that was mid-range on release. Just think about it. Can you run a 2016 game using GTX260 from 2008?
Off topic: can you make a video showing your guitar/guitars? That Les Paul is lurking in the background with its sexy paint job.
I wouldn't jump ahead. First the premiere, then bugs/lags, then patches and various fixes. Looking at other titles on the market, I have the impression that the requirements will ultimately be lower, although I hope that the optimization will be at the level, not like in some of the latest items...
I really hope my 3800x with 32gigs ram will be good enough. I was severely GPU bottlenecked in the first game and ordered a new GPU the other day, still waiting for the package to arrive. I'll check how hard I'm CPU bottlenecked when it arrives. Fingers crossed I won't want to upgrade to a 5800x3d or something like that straight away. My wallet is scared.
I don't believe these recommended PC specs. First part of KCD is still not playable with comfort after 6 years. Even maximum draw distance makes chicken appear and disappear in ~40-50 meters, and lowest native fps at those settings in Rattay is 30 fps on ryzen 7500f, RTX 4070 super, 32Gb DDR5 6000MHz, SSD, 1440p.
Just upgraded my Ram along with my CPU.. all the comments on forums and reddit about 32GB BEING PLENTY for a long time are hard to believe after seeing this.
Luckilly i went with 64 in the end.
Did one of devs say he was running the game on a 3050 with a 3600x at home and it ran fine? Why this sudden leap on some of this stuff. STALKER 2 as well had 2070 and a 3600x up till the day of release as recommended and they ghost switched in the new requirements.
I wonder how consoles, with their meager CPUs are going to manage 60 Fps (especially the ps5 pro @2160p upscaled when native 1440p medium recommends a 7600x on PC...).
They won't lool
@@Crimsongzthey will. Its already been announced
@@iBaZiic now that I think about it, they specified 60 _unlocked with VRR_ . Probably because the console can't sustain 60. Without a VRR screen that 60 is probably going to feel abysmal.
@@etienne1062 60 unlocked with vrr means that it can go above 60 and it needs vrr to do it, like other games. Or else they would have said up to 60 locked with vrr. Why would you remove the fps cap if it can't go above it ?
But tbh i do expect some dips here and there and the games to go into the mid 40 or 50 fps in some areas. But if it's steady 60 fps most of the time i would be happy with that.
@@iBaZiic why cap to 60 when you allow frame rate to fluctuate anyway ? Sure it may go above, but given the PC requirements, it's most certainly because it will spend a lot of time _below_ 60.
@danielowentech just hope someone will test and analyse the current state of KCD 1 more or less deeply with 9800X3D and some range of CPUs/GPUs like in this recommendation, to give a perspective for people who want to play KCD 2, but have an idea about buying some pc parts on sales / before the tarriffs.
trying to figure out exactly what the game is doing, to require such high sys requirements for 2k. I tend to playin 4k. I think I will pass on this game for the moment until I see some benchmarks.
I'm at the medium 1440p 60 FPS range but I still don't think it will hit a smooth 60 always. I'm currently playing the first game and it runs like a dream at 1440p very high settings 120+ FPS. Was really hoping for something similar but that now seems like a pipe dream. And yes please do testing for this game! I'm not buying it until I see your results.
The original game dips below 60 in certain instances on my new PC, which has a 7800X3D. But my average is still well over 90fps for the most part. None of this is surprising to anyone who has played this game or a Cryengine game.
The CPU requirement it's most likely due to NPCs amount and routines on towns, than to some hidden Ray Tracing setting.
Editing is funky in the beginning
These CPU requirements are getting out of hand. Soon enough, people will be prioritizing upgrading their CPU's over their GPU's
OMYGOD Red Magic?! Dan is finally making it big!