Quite bad timing with this video, because they just put out a patch today which directly addresses CPU performance with 4-core and 6-core CPUs.
Yeah, I always had that suspicion and it's now confirmed indirectly I guess. There's no way we should encounter stutters even on 4 core CPUs when you realize that the 4 core CPUs people have nowadays are much faster than even the 8 core Jaguar CPUs in the consoles. It is absolutely bad optimization; having to match the consoles in core count just reeks of the same shit as Saints Row 2, which needed your CPU clock speed to match the consoles' to run well. Not as drastic as that, but still bad.
@@TexelGuy Consoles have even lower level APIs than DX12 or Vulkan AND they have a unified memory pool.
That's like comparing apples to oranges.
Basically a crappy Jaguar optimization mess on 4-6 core PCs... Damnit.
Lol, this video became very funny now lol.
Rob Taylor must be such a headache for devs to optimise for shit pcs with 4 core cpus hahaha
Big oof on the timing of this due to the new update and their supplied launch argument.
My 3570k @4.4ghz led to an absolute stutterfest on Vulkan and was smooth at DX12 but with significantly reduced CPU performance as well as a small to moderate loss in GPU performance. Playing at all low but Ultra textures on a 970.
Before the fix on DX12 (Using Process Lasso CPU Limiter to prevent lock ups) I got 28-33 fps in busy towns with a moderate but playable amount of stutter and a smooth 45-55 everywhere else other than a few drops on select GPU intensive forests. After the fix on Vulkan I get 45-53 in busy towns and mostly 55-65 everywhere else other than the select locations. Only stuttering I get now is very minor and only in big towns.
The issue previously known to be plaguing 4/6 threaded CPUs seems to be fixed and runs perfectly fine even on my aged rig. Sucks but you might already need to follow up with an update. Lol
yay.. now i no longer need process lasso.. :D
My i7-4770k @ 4.2 GHz and 2400 MHz CL10 DDR3 RAM with a GF 1080 Ti @ 1.9GHz ran this game pretty well before the patch at 1440p high, no stutter at all, and it kept a stable 60 fps 98% of the time when I did missions around that first town and gang hideout.
@Liquid Sunshine you read his comment saw the fastest DDR3 available but didn't see 1440p??
@Liquid Sunshine 1440p is typically 2560x1440 and 3440x1440 is 1440p UW ;)
Are you serious, you got that much of an fps increase? If that's so, I would probably consider buying the game right now because my gtx 1060 is finally able to run this game on console settings at 60fps lol ...
you need to do another one of these videos to compare with the new patch that came out today and see what your view is now versus then
Well, my 2500K had a really nice run at least. What was that? 8 years? More? I bought it a couple of months after release.
EDIT: There's a patch? Crisis averted!
Good here, 2600k, 1080p, 60fps on big tv.
@@Nohahio What card you got?
Hell you could switch to i7 2600K in a year or two and you’ll be getting a major upgrade
old 2500k- *gulp...Guess this is the end* 😵, You- "Hey" 2500k- Huh? You- "We're not done yet bud, not by a long shot!" 2500- Really? 😱 You- yeah... You've still got it. 🤜💥🤛
I'm still using i5-3330
Video already out-of-date, performance has been notably increased after the patch.
Really?
Damn, I didn't even know it already released. Looks like it's something to do with load balancing according to the patch. With the community coming up with a "fix" to stutters with a CPU usage cap, that seems like the exact thing that was needed.
All of their data is for 4K so it's irrelevant to most people anyway.
@@Motleyguts Even so, game fps is getting a boost, i've a 1080ti/ ryzen 3800x and got about 10+ fps after patch.
@@altermatt Oh shit! That's awesome news!
Well, after the patch I only experience drops in the middle of towns with my i5 6600.
I keep the game on ultra and it runs at 60 everywhere else.
Considering I've spent about 2 hours in towns, out of 90ish hours. It's good enough for my old cpu.
Very pleased with the patch
whats your gpu?
Yeah what’s your GPU?
Running an i5 6600k over here with a 1070ti.
@@XA--nj5nv 2060. I know it's overkill for the system, but I found an offer I could not miss. Now I just need to upgrade the rest around it :p
Also, it's a custom "ultra": textures, lighting, illumination, shadows, tessellation, soft shadows, geometry detail all ultra. The rest are custom based on whether I see any visual difference.
Volumetrics seems to be using more cpu for me on high, but I see no diff from medium (honestly the low looks more realistic)
One more thing; I only use a 1080p monitor so that also gives a big fps boost
Victor Hategan you just sold me on getting this lol
I got a i5 8400 and a 2070, I was really worried about running this like crap.
Ultra is sooo unnecessary, you're basically throwing performance away for no real visual gains.
4 cores/8 threads is still sufficient, so it's only game over if you don't have hyper-threading.
Yeah, the title is misleading. If they'd asked if it was over for 4 _thread_ cpus, that would've been much more accurate.
That's what I have and my game runs great after the last patch
Considering my friend's gameplay with a 4th gen i7 (4c/8t), I totally agree with you. 4 threads are not enough anymore. 8 threads can still hold on a little bit.
Nah, not really
Please add some Haswell CPUs from the 4000 series, a lot of us out there still use them!
and fx 8350 too
My 4770k has been hanging on in there, paired with a 2080ti, you're making me worry here!
Yeah, I got an i7-4960X and I am always worried about new games.
@@cinerir8203 looks like it's time to switch to 10th gen or newer Ryzen
And 3rd gen CPUs: I have an i5 3570k and I'm wondering how RDR2 performs🤣🤣
I always watch to the end because Richard's heartfelt thank you's make it worthwhile.
Would love to see more 1440p results.
At 1440p with all the settings maxed I get between 40-50fps with an i7-6850k and 1080ti. My cpu sits at 59C though, with a corsair h100i. This is fine for the winter, but I wouldn't want to play in the height of summer.
3:54 “6 core chips from Amd and Nvidia”???
@@iamnid Nvidia doesn't make CPUs
slip of tongue
Well, Nvidia does make "CPUs", just with hundreds of times more than 6 cores.
@@timothybryant2898 I mean... the Tegra chips are technically APUs.. which have CPUs in them that they made.. so.. Nvidia DOES make CPUs. Just not... desktop kinds..
Tegra: am I a joke to you?
I really appreciated the inclusion of the R5 2600 and the RX 580. That is my current setup and, I believe, is more representative of the mid-range setups present in PC gaming. Please keep them coming.
Another cpu comparison that fails to include the recommended and minimum hardware as a baseline. As an i7-4790k user this is as useful to me as a chocolate teapot.
Fyi - Runs great here, averages over 80fps with mostly Ultra settings on my RTX2080ti with many advanced settings enabled. The minimum is around the mid 60s in the towns and never drops below that... Only using a minor overclock of 4.6GHz. Not bad for a 5 year old cpu.
Shocky,
What did they misreport then?
It's still important as an RTX2080Ti user to know if you're losing performance due to your CPU which you are. The fact that you can get 60FPS may be sufficient to you but others may want to know this information. And many people use below the recommended hardware so they too want this information... why even come to a video about a game that performs great for YOU then complain that the video is useless to you?
Also, not sure if you have the new update so it's not clear if your experience matches the video which likely does not.
You are using a 2080 Ti. If you throw powerful hardware at a poorly optimized game, of course you will see decent performance. This game looks good, but it doesn't look good enough to justify the shitty performance people are seeing. I am using an i7 6700 and a GTX 1070 Ti and I get pretty awful performance when I try to run the game on all high settings at 1080p. I have more than the recommended system requirements, so I should be getting a steady 60 fps on all high settings at 1080p, but I don't. That's how poorly optimized this game is. It's far too GPU/hardware hungry. This is a sign of poor optimization. The recommended system requirements call for a 1060. I can't imagine anyone trying to run this game at 1080p on some ultra and high settings with that. It must be a mess.
1:20
Interesting how big the spread between min and max fps is on Turing cards.
For ex. 2060 Super vs 1080 Ti. Here further optimisation can still be expected, mainly on the driver side I guess.
gotta be drivers. No way the 580 is keeping up with a 1070 in any other game.
@@snozbaries7652 it is a faster card, now used properly...
Even the GTX 1080 being beat by both RX Vegas and the base RTX 2060 is totally wrong.
@@oropher1234 no it's not...
@@illusionlb boooooooooooooring person alert
should have tested a 30FPS cap with the 4c/4t, even the Xbox One X runs it at 30FPS, so if an old i5 can hold 30FPS without those freezes it's not a disaster IMO.
30fps on a pc? blasphemy... u sir do not deserve a pc.
But muh 60 fps. You're absolutely correct. In a slow paced game like this, 30 fps suffices
Defcat I literally locked it at 30fps (Half Vsync) because the fps is all over the place. I get consistent 60’s mind you, but whenever I go to town it drops and it annoys me. Playing at locked 30 while plugging my pc to an HDTV, is one of the best things I experienced. Game plays like a damn movie....
On quad core CPUs you're better off locking it at 30fps than suffering that rollercoaster FPS.
Richard, always the best and most detailed video analysis, pity that you'll need to re-do some of the work because of the update from Rockstar.
Thanks Richard as always!
I haven't yet tried Alex's optimization settings in my game but I did try the hardware unboxed optimization settings and using a 5820k with a GTX 1080 at 1080p I get average FPS of 63. It's pretty brutal that I'm just barely over 60 even using a GTX 1080!
Same problem. Vulkan gives better performance and stability but everything looks extra blurry to me because Vulkan can't use Nvidia's sharpening from the control panel.
But yeah, same issues here... However, frame pacing at 30 is flawless in this game, so I just crank the settings up REALLY high and play it at 30 in 2K. Kinda sad my GPU is twice the power of the Xbox One X and I can't even get the performance of it....
@@rage8010 Try to put a sharpness filter with alt + f3
That’s odd because my RX480 averaged about 50-60 FPS (had to lower a couple more settings to get 55-70) at 1080p. Maybe you indeed are CPU bottlenecked?
@@Bestgameplayer10 I'm using a 6 core 5820k. Using optimization settings discussed in hardware unboxed optimization video part 1 and 2
Zipzeolocke - Thats really odd. I guess Intel CPUs are faring too well with this game.
It’s not too late to cannnnnnn this video and re release it.
what?
@@PSXuploads Never seen such an innocent reply😍
@@PSXuploads R* just rolled out an update specifically for 4c CPUs so this video is already out of date.
I run a gtx 1060 and an i3-8100 and i agree that 60fps is out of the question without tuning settings way down. However, you can easily attain 50fps and keep it locked at that without any stutter or drops
RDR2 killed the nVidia 10XX series, quad core processors and the 8GB RAM standard...
3 birds with 1 stone
Bad news: I got 58fps avg in the rdr2 bench on Vulkan with a mix of ultra and normal (ultra textures, high models, shadows, trees, lighting, tessellation, water reflection, everything else on normal/low with FXAA on) at 3440x1935 on a GTX 1080 and 6600k. Time to throw my PC in the trash. BTW the latest patch with new drivers completely fixed the stutters. :)
Time to push the boundaries again!
It's not the game in itself but rather the use of low level APIs like Vulkan and DX12 which Pascal GPUs were not optimized for.
So next : memory speeds 2400-3600Mt/s with ryzen and intel cpu's
That's an interesting suggestion.
I am also getting 8.5sec stutters on my i5 6600k after every 1 or 2 mins.
P.S: It still happens even after the update they just released today for 4core cpus.
Have you added the command line argument as well? It's a simple thing, but I just want to make sure you've tried everything.
Add the command line argument, it removed those gigantic stutters on my i5 6500.
@@MarioManTV Command line argument? Can u explain pls..how to do it?
@@savoz894 If you're using Rockstar Games Launcher:
Click on Settings
Click on RDR2 under My installed games on the left side
Scroll down a bit and you have "Launch arguments" there
Type: -cpuLoadRebalancing in the field and hit enter
Please do tell us whether it removes the stutter or not, I'm looking to purchase it on Steam once this whole fiasco ends.
Great vid but bad timing, as they just fixed this yesterday. It's as easy as putting this in your launch arguments: "-cpuLoadRebalancing"
FYI the latest patch appears to have broken G-Sync for the fullscreen mode selected in Red Dead 2's graphics menu. It causes the framerate to tank and the game to crash when loading story mode. Changing to windowed mode in the Red Dead graphics menu (with G-Sync set to windowed and fullscreen in the Nvidia menu) appears to fix this and allows the game to successfully load story mode and run with G-Sync again. The other solution is to disable G-Sync if you want to run exclusive fullscreen (note: if you set G-Sync to fullscreen only in the Nvidia menu and then select fullscreen mode in the Red Dead graphics menu, G-Sync will not work and the monitor will run at its normal refresh rate).
When CPU limited, it is often recommended to limit the max framerate. If you get 75 fps on average and enable a 60 fps limit, you gain about 3 ms of free CPU time every frame on average. This free time can be used to run secondary, background threads like those associated with asset streaming and object instantiation. That way you will improve the 1% low FPS, because such drops and stutters occur when background thread tasks pile up to the point where everything has to freeze.
The same technique is used on consoles. FPS is locked to 30, even though games would run at 35-45 fps, and that extra power is utilized by background tasks. It is good practice to use about 50% more threads than the hardware has. To fully utilize a 4 core CPU you should use 6 threads, because threads are often stalled waiting for RAM access or simply waiting for another thread to finish. This leaves a lot of free CPU time that can be used by additional threads. Hyper-Threading is a hardware extension of that idea: a single core manages two threads, so while one task is stalled the core's arithmetic units can be kept busy with the other. The usual result is that HT CPUs are about 50% faster than a CPU with the same core count without HT.
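To put rough numbers on that headroom, here's a quick Python sketch; it just reuses the 75 fps / 60 fps example figures from above, nothing measured:

def frame_time_ms(fps):
    return 1000.0 / fps

uncapped_fps = 75   # average the CPU can actually deliver
cap_fps = 60        # chosen frame-rate limit

# Capping hands the CPU the difference between the two frame budgets.
headroom = frame_time_ms(cap_fps) - frame_time_ms(uncapped_fps)
print(f"~{headroom:.1f} ms of CPU time freed per frame")  # ~3.3 ms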
Wow- incredible amount of work put in this. Thank you!
You are the best digital foundry
AMD 3600 vs 9700K data please.
Great video, must have taken a long time to run the tests. Unfortunate timing. Looking forward to the update.
Keep up the great work Rich, and team.
No, it's not game over for 4 core CPUs, only shittily optimised games have problems with them.
No. More cores are the future. This game had a problem with optimization, 100%. But 6 core plus is the new standard.
Low level APIs like DX12 and Vulkan love lots of cores. And btw next gen consoles will set 8 cores 16 threads as the new standard.
It was written for 8-core CPUs.
So, yeah.
This was always going to happen, this console generation caused engine designers to optimise for many threads and offload physics to the GPU.
is that why the 9700k got such horrible benchmark results? lmao
It's 2 quad-core modules that make up the console CPU.
The game runs more than fine on 4 core cpus with the recent patch (certainly enough for 60 FPS now with the official fix), of course you can't slap a 2080 Ti in there on such weak CPUs and expect 140 FPS, but then again you don't want 140+ FPS on a Rockstar game either no matter your CPU, because the engine craps out at such high frame-rates anyway, lol.
@@buzzworddujour If you hit 144fps or higher the game engine has a meltdown and stutters heavily when you have 8 threads or less, the same thing happened with GTA V on pc but it was a higher fps that caused it, closer to 180fps before it stuttered.
So super fast cores that can reach 144fps but 8 or less threads. There is only 1 CPU on the market that fits that description the 9700K
Sounds like it's _very_ optimised for a fixed target render.
You know this is a _good_ thing right?
Engines were dying, CPU development was and probably still is dead, single threaded engines that lived and died on Moore's law were a dead end.
So hurray, AMD, Microsoft and Sony saved gaming.
Should be a fuckin Christmas movie.
With the latest patch and command line argument -cpuLoadRebalancing it runs perfectly smooth on my i5-3570k, RTX 2060 super, 16GB RAM machine. I use ultra extreme settings on 1080p. No stuttering anymore with ca. 30fps. The game looks gorgeous. It is a masterpiece! My deepest respect for the developers! I am pretty sure the developers are working hard in order to fix all errors. Give them time!
i5 6600k ~ 4.3GHz, GTX 970 ~ 1545MHz
1440p @ 35-50fps, Ultra textures with medium settings and additional "pc bling"
Windows 7, Vulkan (asyncComputeEnabled) Slight stuttering in towns (And CPU heavy areas)
Good nuff
Async compute in system settings changed it for me completely, even after latest update
Turn it to True, hope it helps
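In case it helps anyone find it, here's a rough Python sketch of toggling that flag in the settings file. The file path and the value-attribute layout are assumptions from memory, so verify them on your own machine; editing the file by hand works just as well.

from pathlib import Path
import xml.etree.ElementTree as ET

# Assumed location of RDR2's settings file -- double-check this path on your system.
settings = Path.home() / "Documents/Rockstar Games/Red Dead Redemption 2/Settings/system.xml"

tree = ET.parse(settings)
node = tree.getroot().find(".//asyncComputeEnabled")  # the flag mentioned above
if node is not None:
    node.set("value", "true")  # assumes the flag is stored as a value attribute
    tree.write(settings)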
Would have liked to see RAM speed scaling on both Intel and Ryzen CPUs (also more CPUs compared, I could literally watch hours of DF's CPU related content in one sitting lol).
This is dumb, there are not a lot of people trying to game on new gen CPUs with 4 threads. You should have tested CPUs like the 2500K and scaled up from there so people who haven't upgraded in a while can see what to expect.
Here's the thing, I'm running Red Dead Redemption 2 at 1440p on a 27 inch Asus monitor, all settings on ultra apart from soft shadows which is set to high.
I've locked it to 60 frames, getting butter smooth frames and no issues.
Running it on an old i7 4770k at 4.5ghz and a 1080ti.
Gotta hand it to Rockstar, it's an absolutely visually stunning looking game.
Similar. I'm running a 4770k at 4ghz because I never got around to buying a better cpu cooler. Paired with a 2070 Super (1440p monitor @144hz) I "can" hit 110 fps average. I ended up letting the Nvidia settings choose for me and getting much better visuals at 65 fps; I'm unsure of the lows but I don't feel any dips or the pc struggling. I've just bought my wife a Ryzen 3600 as her old i5 650 just couldn't do what she needed it to do, and I can't say I'm not a little jealous. Maybe next year for me
@@andrewboult848 Hi mate, yeah I just built myself a new system, AMD 3900X... 32gig Corsair Vengeance 3600MHz, Asus Strix 2080TI OC... the Gigabyte Aorus X570 Master... need a new higher refresh monitor now though as I'm still running it off my old 27 inch 1440p monitor.
I'm looking at the 34inch LG ultra wide, 144hz adaptive sync.
Right now I'm still running Red Dead 2 locked at 60hz ultra exactly as I did with the old system, there's absolutely 0 difference, still the same frames and butter smooth, ie no stutters no tearing.
Just shows if you can be happy at 60 frames ultra settings butter smooth and astonishing visuals you don't need top end systems. Yes when I get an 144hz ultra wide I'm sure I will be able to use ultra settings on red dead 2 say at 120hz but will I really notice that much??...I mean it was already silky smooth at 60hz ultra.
How much smoother gameplay can you get beyond silky smooth:) it's a funny one.
I think if you get 60 frames using ultra settings butter smooth there's not much point in upgrading.
Looking at these ultra wides, they're pretty expensive too, and reviews between VA panels and IPS panels are confusing too, some say VA panels for gaming due to much higher contrast ratio etc, but some say IPS.
No, because they have 8 threads, and my 3770k with a 2080ti gets the same fps as AMD's 3900x!
Glad I just built a new Gaming PC using Ryzen 7 3700X Processor with RTX 2070 Super and 32 Gig of Memory. So should be able to get 1440P with High Settings across the board.
My 6700K does very well with RDR2 and doesn't bottleneck my 1080 Ti at all. Smooth as silk. Utilization can get over 80% at times, but the GPU's utilization is always at or north of 95%.
What's up with the Intel captures looking glitchy?
I'm curious to see what GStadia's performance will be like.
I also wonder if we will have any control over settings.
Probably max settings at 60 FPS? Until it heads to the Google graveyard, anyway. I'd like to think Stadia will work out despite the little interest in streaming, but I know better.
@@vincentvermilya1365 If it can pull off a native 2160p at max graphics settings and 60FPS we can see whether we need that level of fidelity.
With the new patch, I can play RDR2 without any issue (no stutter, freeze).. I'm using i5 7600K (4 cores)
LOL, this game runs on a Jaguar notebook CPU on consoles........
The game is just unoptimized for PC
Nope. Your elitist hardware is miles better than consoles only on paper. Result in games paints a different picture. Bottom line, high-end PC gaming is nothing but a waste of money.
Krešimir Mandić what absolute twaddle my friend. With the right optimisation a midrange PC will murder a console in framerates and graphical capabilities. Rockstar have always looked at PC as an unworthy platform; it's nothing to do with consoles being better, consoles will always be inferior to PC. 👈🏻
The game runs at 30fps or less on consoles.....
Most new games for AMD recommend a Ryzen 5 1600 or the X edition so I completely agree, quad core gaming is coming to an end. My next step is to buy a new Ryzen CPU after Christmas anyway; I presently own a 1300x and am planning to move up to a 2600 or higher.
Next gen consoles will run on 8 cores 16 threads CPUs, so I recommend you to pick something like a Ryzen 7 3800X or higher.
Jesus christ, there's no way in hell this port is well optimized with frametimes like the one at 10:30.
My i5 6600k@4.828/4.627cache@1.456v+2x4gb ddr4&3018CL12@1.465 runs fine after i added the startup parameter -cpuLoadRebalancing
Before adding that i was getting frequent 3 to 5 second freezes.
The Radeon VII performance is kind of interesting. The question is, does the engine make better use of the Radeon VII than is typical, or is it just making poor use of some other architectures?
Please do a follow up about the new patch addressing quad core issues
I ring the bell, for instant gratification.... ^^ the grin was badly held back Richard... very good. :-D ,
now plz investigate the pascal problem in red dead 2.
I'm waiting for the next video about the 1.14 cpu balance argument
I think that the AMD cpu optimization that is built up around the Red Dead 2 engine is for the old AMD cpus that are like the one in the ps4 console and have yet to be updated for the next generation of Ryzen cpu architecture. This may be all patchable, but that’s speculation
Man your vids are sharp as a razor
wtf is this clickbait title? When the 7700k is still getting 70-110 FPS most of the time you cannot say it's game over for quad cores or a bad experience. Here I am at over 90-100 FPS most of the time on my 4.5Ghz 6700k wondering what the hell you guys are smoking. I haven't even tried the new patch which fixes CPU performance either.
He means quad-cores without smt.
avg fps doesn't tell the whole story my friend, the 1% lows are bad. Same with the 8400: the avg fps is better, but when you visit a busy area with tons of stuff going on, your quad core cpu will struggle to maintain the higher numbers
Yeah, but the 7700K has 8 threads, plenty of cache, high IPC and very high all-core clock-speeds are possible. So it will last a while yet. Thankfully... as that's what I still have!
@@DJ_Dopamine the 7700k will struggle in games like AC Odyssey. My Ryzen 1600 at 4ghz averages 70-80% usage and sometimes hits 94%, and whenever my cpu goes over 90% usage those micro stutters drive me crazy. Same goes for BFV: if you are targeting 120 fps or 150 fps for a better multiplayer experience with a high end gpu, sadly you can't do it, those cpu spikes will drive you nuts
the question mark implies that it's a question and not a statement
The new patch killed my performance (9700k, 2080ti). 60 fps before the patch with the GPU at 67 C; after the patch about 40 fps with the same settings and only 48 C at 100% GPU load.
Would love to see a Series X and PS5 analysis and the like that carries on from these RDR2 benchmarks and similar videos
I have an i7-4770k. I bought it back when people recommended i5 CPUs for gaming, saying that an i7 is a waste of money. Looking back I am glad about my choice. Hyperthreading can make a ton of difference now.
Steve from GN determined that those stutters come from an engine bug present since GTA V, and those drops can be eliminated by upping graphical settings.
Isn't that the classic method of raising the strain on your GPU to such a level that you're making yourself GPU limited rather than CPU limited? Because in that case, instead of having dips to lower FPS, you're essentially framecapping yourself to the same low FPS of these dips.
I fixed the issue with my i5-6600k by limiting my CPU power by 10%. Absolutely zero stutter now.
2:23 me at gym class
A 4 core 8 thread i7 will outlast a 6 core 6 thread i5. It started with Far Cry, now with RDR2
Yeah, threads matter. Can't wait for a Ryzen CPU with 4 threads per core.
@@klanas40 You already have a 6 c/12 t cpu for very cheap from AMD!
Not every 4c 8t i7. I had to upgrade from an i7 4790k to a 9600k to get rid of the bottlenecking at high framerates with an RTX 2080; most likely the ddr3 1886mhz was at fault here though. Now I have ddr4 4266mhz and performance is way better.
@@FinneousPJ1 I just upgraded from a i5 6500 to a ryzen 3600 and there is a big difference
Good to see them working out the remaining issues, what a game!
10:55 I like the design of these graphs
You should have also shown the difference between running Vulkan vs. DX12... I wonder if Pascal cards run better under DX12 than Vulkan in RDR2? It would have been an interesting addition to this video...
Kristofer Stoll my 1080ti has been performing better with Vulkan. I haven’t tried DX12 since the most recent patch though
@@sazzyjaxaphone7581 Interesting...
so, i'm buying a new pc. this time around with cutting edge parts. what do i get as a graphical showcase? this one? resident evil2 remake? metro exodus? shadow of the tomb raider? anything else? and yeah. minecraft of course! :D
Assassins creed odyssey is a good one, battlefield v another, forza horizon 4, shadow of the tomb raider, metro, this, etc...
I am running a 4770K paired with an RTX 2080 and 16GB RAM @ 1600Mhz. It's time to say goodbye soon. Next to the core count, I feel like slow DDR3 RAM speeds are also starting to become a major bottleneck in modern game engines.
use some 2400mhz ddr3s on that 4770k and see the difference
Please test on the latest patch and I would also love to see how overclocked 6600K for example performs considering you can easily get 500-700 MHz higher boost clocks out of it.
Everything was going great until my motherboard started smoking around chapter 3
@Digital Foundry @12:32 you said 4K but the top left says 1440p, just a typo I think.
Look closer. And listen closer. The truth is there
RDR2 at 1440p and max settings makes my 1080ti sing and warms my i7-6850k to 59C (with a corsair h100i). I get between 40 and 50fps, which I'm fine with for story mode.
4 cores stopped demonstrating fantastic performance a long time ago (BFV and counting)
*Digital Foundry*
If you've not done so, it would be interesting to look at console vs PC in terms of CPU PROCESSING vs FPS in several games, with the goal of guessing what CPU you'll need to hit 60FPS on PC at similar quality to what a PS5 is likely to do... I know you've at least talked about this in general... I'm guessing the minimum you'd want would be something like the 8-core R7-1700, 6-core R5-3600 or a similar Intel CPU... we can roughly assume games have access to something comparable to the R5-3600 running at 3.2GHz (if we assume initially only 6/8 cores allowed for game usage).. that might mean the DESKTOP R5-3600 is 30% faster due to its higher frequency, but consoles run more efficiently, plus for newer titles on console there should be the DRAW CALL HARDWARE that further increases CPU efficiency, so it gets very difficult to compare even in guessing. Could the draw call hardware reduce CPU load by more than 10%? Last point, I guess, is that DX12/Vulkan software should slowly close the gap in terms of using more of a CPU's threads, but again we'll still need more cores and/or higher frequencies to offset the console advantages of coding efficiency and hardware optimizations.
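Just to make my own guesswork explicit, here's the back-of-the-envelope maths as a tiny Python sketch; every number in it is an assumption (the desktop boost clock and the console efficiency factor are placeholders, not measurements):

console_clock = 3.2        # GHz assumed available to games on the console
desktop_clock = 4.2        # GHz, rough boost clock of a desktop R5 3600 (assumption)
console_efficiency = 1.10  # assumed ~10% saving from console APIs / draw-call hardware

raw = desktop_clock / console_clock    # where the "30% faster" figure comes from
effective = raw / console_efficiency   # shrinks once console efficiency is counted
print(f"raw clock advantage: {raw - 1:.0%}, after the efficiency handicap: {effective - 1:.0%}")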
Great video, you just confirmed my decision to upgrade from my old i5 3570k to a Ryzen R9 3900x :D
Makes me wonder how 2 cores / 4 threads CPUs perform
Superb work Richard
Use Windows High Performance profile under power settings and set CPU scaling to 99% min/max.
Where is the CPU scaling option? Is that in the game or in Windows?
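It's in Windows: Power Options -> advanced settings -> Processor power management -> Maximum/Minimum processor state. If you'd rather script it, here's a rough sketch (Python on Windows, run from an elevated prompt); the powercfg aliases used here should map to those same settings, but treat it as an illustration rather than gospel:

import subprocess

def set_processor_state(percent: int) -> None:
    # PROCTHROTTLEMAX / PROCTHROTTLEMIN are powercfg's aliases for the
    # Maximum / Minimum processor state values in the active power plan.
    for alias in ("PROCTHROTTLEMAX", "PROCTHROTTLEMIN"):
        subprocess.run(
            ["powercfg", "/setacvalueindex", "scheme_current",
             "sub_processor", alias, str(percent)],
            check=True,
        )
    # Re-apply the current scheme so the change takes effect immediately.
    subprocess.run(["powercfg", "/setactive", "scheme_current"], check=True)

set_processor_state(99)  # 99% is the value people use to keep the CPU off its boost clocks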
4770k here. Looks like it's time for an upgrade!
My Ryzen 7 1700 is sub-par on most games, but Vulkan changes that entirely. Makes use of all 8 cores - 16 threads.
Why do you think Pascal performance is so lackluster? Hoping a new driver gives a boost but knowing nvidia they wanna separate their current cards as much as possible.
My i7 4770 still going strong with my 2060 even in this heavy but beautiful game
That's not too bad, I was thinking my 6600K @4.4Ghz would barely break the 60FPS mark like on Kingdom Come:Deliverance but this seems like it would just be some stutters and not game breaking drops. I can deal with that, but a Ryzen is in my future.
I have absolutely no issue running this game and I feel for everyone who has the hardware but still has issues. I list my settings and specs in my videos and showcase a mix of Ultra textures, with high settings. Great game, super ahead of our time. It'll age like fine wine!
It's fascinating Wichawd.
My question is: "Instead of releasing Read Dead Redemption 2 ,why didn't Rockstar released the first part of the game both able to run on high end low specs PC? Something that they have done with GTAV?
I could not get stable frames when the game first launched; my CPU (2600x, stock clocks) was getting annihilated, 100% usage followed by a crash to desktop every 5 minutes. Now that the new patch has been installed and I added the launch argument for CPU rebalancing, I get a crisp 75fps (locked due to refresh rate), my CPU bounces around in the 60-70% range, and I have not had a crash since... makes you wonder why this wasn't implemented at launch.
I'm getting 1080p60 on Xeon X5650 (6 cores 12 threads X58 chipset) @ 4.0ghz, 16gb DDR3 1600, 980ti. Pretty stable frame times at 16.7ms. Frame capped at 59.92 on RTSS, vsync on in game. all in game settings is set to default. Not bad for an x58!
Shit, I thought I could keep my i5-7600k until early 2021... might need to upgrade earlier...
I already noticed this with GTA-V more than 4 years ago, it made me upgrade my i5 3570K to a i7 6700K.. Looks like I have to upgrade again coming year..
I'm in the same boat, just glad I didn't buy the 6600k like everyone was saying I should have.
Looks like the new 16 core Ryzen chip may be more future proof than anything Intel has had to offer for the last 4-6 years. Hell, if I were to switch/buy I would go for an upgradeable AM4 platform and a 12-16 core cpu, with these new games seemingly more core demanding now.
my i5 3570 is running GTA V easily
@@AdaaDK yep, this is what I'm planning to do, I will get either a Ryzen 5 3600 or the R7 2700 as I currently have an i5 3570k with a gtx 1060 and 16 gb of ram and it stutters in Assassins Creed Odyssey, The Outer Worlds and a little bit in Borderlands, but just at the start.
@@pumkinfamely4963 are u running something like a gtx 970 or higher? Because if you are, have you ever cranked all the settings up @1080p?
Considering I am using an i5 6700k at 4.6Ghz I'd say no, it isn't game over for quad core processors. This game has run great for me except for the occasional stutter since day one. Since the latest patch the game runs flawlessly. By the way, I am running the game on Ultra quality for 75% of the settings, high on the rest. My GPU is an RTX 2080.
this is not very surprising considering the game was written for and optimised for Consoles sporting AMD chips. it's a shame they didn't optimise more for the nvidia / intel side before launch but no doubt it will get resolved. I have played it on PS4 Pro so i can wait and i suggest that every NVidia / intel owner does the same so that they pull their fingers out and get it sorted.
I've noticed the game started to get a lower cpu usage when switched from 8 to 16 GB of ram.
This game is shockingly running FANTASTIC on my rig. And I have an FX 8350, 8GB RAM, and an RX 580 8GB. I'm thinking about upgrading my mobo, getting more ram, and getting a nicer cpu cooler so I can OC it to 5GHz, but I don't know if that'll be very smart.
FX 8350 still destroying it i see.
Game runs ok for my FX-8350 and GTX 1050Ti, 40-50fps 1080p ultra textures everything else low.
Lord, Comment Section is awful.
Thumbs up. Sadly it always is. Kids bitching whats better xbox,PS4, or PC. Kids bitching about what games are better,cpus/gpus, etc. Really its just kids arguing and not realizing since they are young and stupid that people like different things and that opinions are like assholes, we all have one.
You can't expect much from the YouTube commentary. It's still the Internet, after all.
It's like life, lots of variety.
Council estate trash probably, tho i feel sorry for the few genuinely nice people who live in those places.
This game will utilize 12 core CPUs fully, crazy.
Once they used to say the GPU is all that matters for gaming. Now, for marketing, they added RAM, CPU cores, everything the game needs. I just switched from console to a 4 core budget gaming build, and they've started the same issue as when I switched from pc to console.. the hardware hassle is always there in pc gaming. This is happening because reviewers over-believe in next gen hardware and rich gamers always prefer fancy overpriced pc hardware. But most gamers are still playing on a budget; developers have to understand this and stop moving to bullshit high core usage, keep 4 cores best for gaming, and let the gpu take over the remaining part
imagine , how this is going to run on Gtx 960 gb , Nvidia , 16 GB Ram , i7 :)
that is my system & it plays all modern games at around 40 (after slight graphical tweaking )
Testing this game will be a tough one ...
I was able to run it at 4k, 30fps with an i5 4690k and a GTX 1080 Xtreme at ultra/high settings today!! Still get some stuttering but it's getting better
3:50 Nvidia make CPU's now??
lol
Well... they have been making Tegra CPU's for years actually.
:p
Project Denver!
They have been making cpus for so long
Me: *pats Radeon 7* you’re all right boy
I have a 7700k and it runs this game just fine
Care to qualify that?
@@putinstea I have a 7700k and it runs this game just fine.
@@ShaneDanger42069 I have a 3570k and it also runs this game just fine. (so they must be equally just fine)
@@putinstea Yes, I have a 1080 paired with the 7700k. I'm playing at 1080p with most settings on high, and a few on medium and ultra. Getting a 70fps average. Lowest frame I've seen is 57, but it's rarely ever under 60fps. Overall a significantly better experience than when I played on the X. The draw distance is amazing. Never heard of the 3570k or whatever you said.
I also have a 7700k with an rtx 2080ti running everything on ultra at 1440p, never had a stutter in game, not even before the patch