🛑 9950X vs 7800X3D vs 14900KS MAX OC BENCHMARKS 😱👑
- Published Sep 18, 2024
- ►►►► framechasers.o... ►►►► - OPTIMIZATION COURSE
► framechasers.org/ - Discord and Support
► framechasers.o... - Consulting
► framechasers.o... - Max OC Bundles
► / chasersframe - Twitter
► kick.com/frame... - Stream
► ... - Instagram
Join the community over on Kick and discord to talk tech and play some games with fellow frame chasers Saturdays 12PM PST.
#DDR5 #AMD #Warzone
This is the biggest bullshit video ever 🤣 I've done so much testing myself and I have nothing to do with tech YouTube, but I have a 14900KS and the 9950X. Running a custom 23H2 and a 24H2, gaming is significantly better on 24H2 on AMD, and I'm a 14900KS fanboy lol. I got a 28% boost from 24H2.
Man when amd said this was for professionals, they were not kidding.
The Blender performance is really really impressive. I keep having to remind myself that this chip is substantially less wattage than my 14900k is.
@@jabronilifestyle yes, and the 7800X3D draws substantially less power than the 9950X, and a lot less than the 14900KS, in gaming. Remember that it can often draw like 75 watts; I've seen it draw like 50 watts and still be the fastest. Don't just believe one video: watch all the benchmarks and look at as many games as possible, and the picture will be pretty clear. Also look at the 1% lows, yes, and at price, and at power efficiency. People saying power draw doesn't matter are just coping their brains out.
If someone uses productivity apps, the 7800X3D isn't the right chip. But for someone who plays games and only does normal desktop usage besides that, it's by far the best option: not only is it also the cheapest (like 40% the price of the other two CPUs here), it's also the most efficient chip out there, by a mile. The performance it brings is absolutely ridiculous just by itself, and it gets onto another level when you factor in power draw. It's insanely efficient.
Got my 9950X today and it’s super fast overall. Quite happy with it.
Worth the upgrade from a 5950X? I think I've been mostly gaming instead of heavy editing like when I first got my chip.
@@The_Guth I would say so. In Cinebench R23 at 200 watts it overtakes a Threadripper 3970X with 32C and 64T. It's blisteringly fast, and in gaming it's excellent. I paired it with a 4090 at 4K 120 Hz on my OLED and it feels like it has Raptor Lake IPC. It's insane!
@@davidwilson5668 I use a 240 Hz OLED. But same specs as you.
I notice my idle temps sit in the 50-60 range. Is this normal?
@@davidwilson5668 dang! Thanks for the feedback! I'm considering it along with the X3D. 🤔
@@The_Guth I upgraded to the 9950X from the 5950X. Night and day performance difference.
I miss the old cold openings
Thanks for the actual review and not bs from the mainstream
I wish I had time to do them 🥲. One day when I’m full time
@@FrameChasers hey jufus im a member in the discord but it wont let me type in the general degrade anxiety chat for some reason. could you help me out? please
Read your welcome email
Long-time Intel user here: i7-960, 4670K, 4790K, 10700K, and now 14900K. A cheap box and a higher price tag is sounding like a pretty good tradeoff in exchange for not needing expert-level knowledge and extreme cooling to overclock the damn thing.
I don't have anything AMD but it's kinda nice to hear you say something positive about them for once🤣
I think the small performance increase comes from a security feature being turned off (Core Isolation -> Memory Integrity)! The Windows default is normally 'on', but for some reason the 24H2 Win 11 developer preview has Memory Integrity defaulted to off, giving a boost to the CPU. I haven't done any in-depth FPS testing myself, but I've read around this and it might actually be a thing.
Yep you nailed it, we confirmed this on stream
That was the thing I found in the first 10 minutes of installing that Windows version so far I didnt find anything else since then.
You got it, this was also my first suspicion. They posted Gears 5 at around 180 fps, 1080p ultra, and after the update it was up to 230 fps. I tried downloading Gears 5 and mine was already doing 260 fps. I did the update and nothing happened; in fact it slowed down a bit and now runs at 250 fps. I pointed this out on one of their videos, but I was just ignored.
So I've just done a quick FPS test with the Fire Strike benchmark (3DMark), turning Memory Integrity off, and I got a 10.7% FPS increase @ 1080p: 136.91 FPS vs 151.61 FPS (the most CPU-bound test I have).
Just a thought: if you don't use virtualisation on your system, just turn it off in your BIOS (VMX) and Memory Integrity won't load, since it's there to protect against VM vulnerabilities. Oh, FYI, I'm running a 14900K & 4070 Ti.
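For what it's worth, the uplift those Fire Strike numbers imply can be sanity-checked with a quick bit of arithmetic (the FPS figures are the ones quoted above; the script itself is just an illustrative sketch):

```python
# Fire Strike (3DMark) CPU-bound pass, per the numbers quoted above.
fps_integrity_on = 136.91   # Memory Integrity enabled
fps_integrity_off = 151.61  # Memory Integrity disabled

# Percentage uplift from turning the feature off.
gain_pct = (fps_integrity_off - fps_integrity_on) / fps_integrity_on * 100
print(f"{gain_pct:.1f}% FPS increase")  # → 10.7% FPS increase
```

Which matches the 10.7% the commenter reports; note this is one benchmark on one system, not a general Windows-wide figure.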
they want the security features turned off for voting season in the US, so they can fuuck the world over. True shit. Whether you are a gamer or anyone else, your shiit will just shut down. These measures were put in place for people with backup power. We will all have to defend our way of life, PERIOD. They want to keep you gaming bros gaming, so you don't notice everything else has disappeared!
You look so much better without hair. I saw your vids 1 or 2 years ago when your hair was barely hanging on and it did not look right; now you are looking badass.
I appreciate you as always @Frame Chasers
Said exactly like a cat.
The CPU is passable but it's a fail as can't sit in the box... 😂
Heya Jufes 🥰💪👍 Just ordered your Optimisation Course and looking forward to it and to doing some serious overclocking 👍💪. Thanks for making this course available, my good sir! Also, thanks for the awesome content 💪🥳👍
@@michaelthompson9798 i agree😀❤️👍
The 35% came from Memory Integrity being turned off after the update. With it turned back on, performance was 2-5% faster after the update.
This is why I wait 3-6 months after a new gen comes out, same with games, they sell beta products.
I didn't use the preview version of 11, I just updated with the optional 23H2 update and got around 10% in Rust. Nothing else changed, Memory Integrity was still on. I was wondering how they got 35% when even AMD said it was 5-10%, but for Rust it was a nice healthy boost, and the lows seemed much better too.
@@worldwar208 they used the insider preview 24H2 and it turned off Memory Integrity... I wonder if I should turn it off if it gives that much performance... Does that really need to be turned on?
@@bl4d3runn3rX as long as you don't do sketchy shit or visit sketchy websites you should be good to turn it off. Windows 10 has it off by default, if I remember correctly.
Left the discord a few months back due to financial reasons; was a supporter for almost 2 years. I'm glad I can still get good information via your youtube videos, thank you.
You didn’t have to leave man
@@FrameChasers oh, I didn't know you could stay and interact etc. if you stopped supporting. Sorry man
@@alexburke4569 yeah why not, I don’t care about the money, it’s just there to firewall idiots. Oh well next time
@@FrameChasers >it’s just there to firewall idiots.
That's highly antisemitic and your discord is clearly in need of diversity, equity, and inclusion. We'll be sending 10,000 non-english speaking indians and a few somali for good measure. Good luck! - ADL
I would like to see a 14700K vs 14900K locked-cores "safe values" benchmark. I feel like a hostage to the 5800X3D because of all the issues the current generation has. I wonder if Intel is going to release the rumored 12 P-core / 0 E-core CPU for LGA1700.
Pretty much exactly the same as all benchmark videos out there. All you need to do is lock your cores and undervolt a bit and that won't really change anything. You might even get better performance due to more of a power/thermal headroom.
14700k/kf is the best intel cpu i think.
I'm an idiot for giving mine away to buy a 14900KS 😮😢
You can't run it without a custom loop!!
Not even with max undervolting!
Not even with underclocking and undervolting!
It's hopeless....
The problem with this monster 14900K/KS is temps over 80 degrees, which makes your system fans and AIO blow so loud.
It's a horror 😂
No YouTuber says this true thing.
I'm warning you all: don't buy a 14900K/KS without a custom loop!
€600+ and ~€800 for a CPU and cooling is crazy 😂
You benched my two favorite games, Riftbreaker and Civ 6. Very fair and rational review, and one of the few to point out that total usage is more important than shrinking differences as resolution is increased. Kudos for that. I can't help believing that 1080p benching is losing its predictive value. While arguably GPU-limited at higher resolutions, FG and RT add their own computational demands that would show up in minimums and 1% lows with the data demands of higher resolutions. Thanks.
And on Intel you will pay every single month more on the electricity bill.
The constant crashes will lower that a bit though... This channel is a joke. Everyone else covers their setup and settings very closely, eliminates all the variables, and tests 20-40 games, then this guy sh!ts all over them. HUB released a video today actually diving into the issues and explaining the variability in their Ryzen 5 numbers... this guy tests a couple of games, doesn't show his testing on the various Windows builds but claims he does, and only tests a handful of cherry-picked games with a tuned 14900K. It is what it is.
@@WaspMedia3D Well said.
@@WaspMedia3D Also, I noticed the 7800X3D had 64GB of RAM while the other 2 had 32GB. Was the RAM even at its EXPO/XMP profile? Sounds more like he hobbled the 7800X3D platform so it wouldn't shine as well.
Also noticed he made it a point to bring up 1% and 0.1% lows ONLY when the 7800X3D was worse, and in the benchmarks that were higher on the 7800X3D, didn't mention it.
This guy is so biased it's bizarre.
@@AluminumHaste Yeah, he probably purposely stuffed four 16GB sticks into the 7800X3D system to try to hobble it; every enthusiast knows two sticks is better for gaming with that CPU, every reputable one, that is. The really weird part is that he tries to hide his bias when it's right out in the open for all to see, as though no one notices... there's a reason his channel only has 30k subs... all Intel fanbois.
Also look at his "dip" testing that he keeps going on about at 14:27... he smoothly and slowly scrolls the 14900K back and forth, then frantically thrashes the mouse around for the 7800X3D test and complains the 0.1% lows are lower.... SMH.
Yeah, Intel consumes double the power of an AMD CPU, if not more.
I love squirrels.
Her name is Bolt 🥹
@@FrameChasers Sounds like she's FAST 😂😂😂
Yes, workstation tests video is a must!
I would love to see these CPUs tested in emulation. My God. Probably the fastest examples of these CPUs on the internet, and we can't see them running intensive RPCS3 games like God of War 3 and Metal Gear Solid 4. Those would be great tests but nobody does them. Sucks, really.
I appreciate you bringing up user experience. Not sure why thats so uncommon 😅
Have to agree. When you are dropping big money on parts, the unboxing experience is supposed to be a bit special.
Threadripper was a cool experience.
This.... Shameful!
Nice results. No joke, the 9950x windows experience is excellent. Even with the cpu pegged at 99% doing some heavy workload, the pc is still usable.
What are your memory speeds/timings like? Better memory and AGESA 1.2.x is out there, and I want to know how much more you can squeeze out of AMD
Wooo.. video came out just in time for morning coffee. Clicked on it so fast! 👍
Your findings are pretty much in line with what I saw when tuning a 7700X.
It is basically the 7800x3d with more consistent 1% lows, and snappier in windows.
Speaking of that, I would say for gaming tune the 9700X, don't buy the 9950X
His path to becoming a chechen autonomist warrior is clear in front of him.
Very insane video! You are not a follower but an original and knowledgeable thinker
I had a 14900K and sold it for a 7800X3D. The difference is that instead of 90 degrees it's now 60, and instead of 250-360W it's now 60W. Performance-wise it's at least the same, maybe better in some games.
Colby vs Belal is the fight to make
colby washed
@@vladsta3833 I totally agree with you. Colby vs Belal is the fight to make, though.
Colby just lost his 3rd title fight... He does not deserve
@@joantonio6331 You're 100% correct and that's my #1 reason why he doesn't deserve a title shot. Colby vs Belal is the fight to make, though.
Quality video. I wish you would have included the mw3 in-game benchmark at 1080p very low settings. Regardless, love this video!
It’s in the previous 9950x deep dive video
@@FrameChasers Thank you!
The 9950X is surely impressive and personally I do appreciate your take/opinion on it, as it's quite a different look than most other reviewers have given! That being said, when you talk about cost per frame, at $600+ I would think it would still be a tough sell to a primary gamer.
I thought I was the only one that couldn't find any improvements with that Windows update. LOL. I saw exactly ZERO fps difference in the games I tested which all have been shown by other sites to benefit from the update. Installed it on 2 different machines even.
I don't know what they are smoking either, because I just installed my 9950X and tuned it just a bit, and immediately blew away all the previous benchmarks. 42,625 in Cinebench R23 first run! You do have to have the latest 2308 BIOS, but it works great with it installed. Blows my 7950X3D out of the water in the same machine.
Glad I went Intel this generation. My 14500 i5 is really decent 👌 it's capable of high-refresh-rate gaming, but I do 4K 60fps for now.
A little memory tuning makes a massive difference on AM5. Did a 7700X build for a buddy, and just tuning tRFC and tREFI gained like 40 fps on minimums in Warzone. Didn't touch anything else yet at that point lol.
How to do that bro
@@football4.069 Usually in the BIOS you can drill down into ALL the memory timings (there are a lot), but you don't want to play with them if you don't know what you're doing; you're almost guaranteed to make things worse.
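To put rough numbers on why tRFC tuning matters: the timing is set in memory-clock cycles, so the same cycle count means a different absolute refresh stall depending on DRAM speed. A minimal sketch of the conversion (the cycle counts here are illustrative examples, not values from the video):

```python
def timing_to_ns(cycles: int, mt_per_s: int) -> float:
    """Convert a DDR timing given in clock cycles to nanoseconds.
    DDR transfers twice per clock, so the memory clock in MHz is MT/s / 2."""
    clock_mhz = mt_per_s / 2
    # cycles / (cycles per microsecond) = microseconds; * 1000 -> nanoseconds
    return cycles / clock_mhz * 1000

# Illustrative DDR5-6000 values: a loose XMP/EXPO tRFC vs. a tightened one.
print(round(timing_to_ns(884, 6000), 1))  # loose tRFC: 294.7 ns
print(round(timing_to_ns(500, 6000), 1))  # tightened tRFC: 166.7 ns
```

Cutting tRFC shortens how long the DRAM stalls on each refresh, and raising tREFI makes those stalls less frequent, which is why the combination helps minimums; stability testing is mandatory after either change.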
100% agree with this review. I have a 9950X now, and I replaced a garbage 7950X3D... but it's a long story.
Zen 4's problem was the L1 cache.... Zen 5 just doubles it... I guess Zen 5 3D will be 25 to 30% faster than the 9950X in gaming... because now Zen 5 needs more L3 cache.
It doesn't; they didn't increase L1 cache size, just doubled L1 bandwidth.
Keeping mainstream Tech Tubers real, like a boss.
Your channel is awesome bro thanks for getting to the tech without the biased bs. At 30k subscribers, you'll be 10x that easy soon! Keep it going!
480hz feels smoother tho and reviewers said they can tell a difference between 240 and 480. Plus that monitor has insanely low input lag.
I went from 240hz to 360hz and yeah I can tell the difference. Not in every situation but a lot still.
The 240hz display went to my wife and I use her computer daily. When the kids go to bed and I have time to play on my computer, that 360hz feels so nice.
Not going to bother going to 480hz though, not until 1000hz probably.
Idk why but the squirrel took me out 😂
LOL suggesting a secondhand 13900K is like suggesting a secondhand Volkswagen with no warranty left LOL
I want to comment on the 1080p tests :3. If your game is CPU-limited, like VRChat (does not matter, VR or desktop), you will hit a CPU bottleneck the second you join a huge lobby (more than 40 people, even with shields, since you have to process and unpack people's data anyway). You can run 720p or 4K; the max fps you will get will be 50. I have a 7800X3D and 4090, using a headset and high presets, and as you said, "who will play 1080p on a 4090": you still get 50, and there is a huge difference between CPUs. I would recommend anyone upgrade the CPU first if their GPU is more or less alive and they are planning to join events or big instances. And it does not matter, AMD or Intel, just whatever is faster within the given budget :3
So, different games have different load distribution between CPU and GPU. I know Tarkov is also very heavy on the CPU. For CPU benchmarks it just makes 100% sense to isolate the CPU load by keeping resolution to a minimum, to see what the best CPU performance would be.
I also feel like the people talking about 4K and "no difference between CPUs" mostly want content about optimal CPU-GPU pairs so they can spend as little as possible, and that's great, especially if there is more info for your set of games and resolution, but CPU benchmarks are not about that. It would still be interesting to see a GPU table with the lowest CPU you can pair with it and keep "100% utilization", as people usually comment, but I can only imagine how long building such a table would take.
The 7800X3D is the clear winner, not just in performance but in power consumption; it draws a fraction of the power.
Great video. I was waiting for AIDA latency numbers and RAM frequencies; hoping for those in another video.
nothing new, just doubled l1 cache bandwidth
What I want to see is benchmarks with the CPUs used the way someone who streams and games from the same system would use them.
The i9-13900K and i9-13900KS second hand are such fantastic options right now 👏
Oxidation issue?
Whenever I watch these videos, all I want to do is fix that hole in the drywall. 😂
There it is !!!
So all I'm seeing from this is that the 14th gen CPU is the best, and people who have a 7800X3D should just ignore the 9950X entirely.
Went 7800X3D after a 14700KF; playing mostly PUBG, I get no more bottleneck and no more fps drops.
AMD just solved the bottleneck with X3D; go benchmark it for yourself, mostly on Unreal Engine games, since they are CPU-bound.
Thank God he showed us CIV 6 benchmark! Would be nice if we could get Stellaris 1 month timings on some late game save!
What would we do without Framechasers, keep the vids coming ❤
The reason Zen 5 is snappier and better in the lows than Zen 4 is the bigger L1 cache, Mr. Framechaser!!!
19:53 is the squirrel overclocked too
So what I learned from this is I’m still happy with my 7800x3d and 13900k
AMD saved $0.10 on that box 😅😅😅
Great review.
Looking forward to the workstation follow up.
All this teaches me is CPUs have this low hanging fruit of a bottleneck for gaming and no one is making a gaming specific CPU. When AMD can come along and whack some extra cache and it yields massive performance increases far beyond what that CPU should be capable of..... You know there is a fundamental problem with the architecture. Why is everyone still making general CPUs and not gaming CPUs? It's not like you couldn't release a motherboard with two sockets, having one CPU for Windows and one CPU for gaming. I expect at this rate Nvidia will put the CPU directly on their graphics card and leapfrog the lot of them.
I'll just keep rocking my now 3 gen old cpu for a while, hasn't let me down yet.
Got 368 fps avg in SOTTR with my 4090 at stock clocks & 14900KF @ 5.9 GHz (HT off), using DDR4 @ 4133 MHz CL 15-15-15-30 Gear 1, the same game settings you used, 1080p low.
Is this with ReBAR on or off? Especially for Warzone and AMD CPUs: when I turn off Smart Access Memory, the lows on my 5800X3D are much higher in the in-game benchmark.
I know you dog on all previous-gen Ryzens and the 5950X in particular, but I reached a 336 average in the SOTTR bench on a daily tune, and I've had that tune pretty much since day one, six months before the 11th-gen Intels were released. Glad to see the new gen hitting hard though.
Relatively unbiased in this one, kudos.
Box is so thin the postman can post it through your letterbox 🤣🤣👏
Performance is not impressive compared to intel but efficiency IS!
You don't mention how power hungry the 14900KS is. I see you are intel biased. That's why Hardware Unboxed is a bigger channel. They are more objective.
"techtubers" don't know how to tune Windows properly for gaming, their numbers are useless
Ah yes, we live in a world where everyone wastes their time on tuning Windows. 🤡
They are just representing most people's experience. 98% of people are not going to bother with anything, just want to fire up their PC and play, not spend time on tuning Windows.
Thanks, great breakdown with actual evidence. I will keep my 14900K, which you saved with the information you provided; no issues whatsoever. I have a 4090 Suprim X. I will have to sell my car and get a 5090; that's just the way it is, no matter the cost.
Ok, but what happened to the IMC on the Ryzen 9000 series? Are we still stuck at DDR5-6000?
Now I see 7800x3d is the fastest option, in contrast to your opinion😂
I've been waiting so impatiently 😅
Forteen Forty Peeeeeee!
Premium boxes what's next RGB leds inside a computer case ?
Insanity 😂😂
What kept me away from AMD in the past was my bad experience with the Athlons, where I had two builds die on me within a year between them. Modern AMD platforms are way too expensive for the performance they offer: you pay the same money for less performance, fewer M.2 slots, slower memory support and so on. I'm still building an AMD PC every now and then, though, just to get first-hand experience of how they compare, but Intel is always better value in every price range.
Got a 7800X3D and it was so laggy; everything felt bad and Windows felt like it was corrupted. Sent it back and got a 9950X. Also, the 7800X3D was at 100% usage permanently in BF2042 and Star Citizen, which caused even more lag.
Can't wait for AMD fanboy comments to roll in
I will pretend...... REEEEEEEE!!!!!!! Lol
Fan boys waiting for other fan boys to respond is a permanent loop the human mind can't break from.
Intel noobs 😅
@@dubya85 t r u e 😂
Intel fanboy trying to stir shit up.
How your Intel CPU holding up?
good stuff, i'm too chicken for overclockin' but i have awesome respect for people who do overclockin', i'm mostly just tryin' to get things to work and for my gaming PC i was already happy with a Ryzen 7 5800X3D from havin' it over a year now after an upgrade from an i7 4790k
however my dedicated livestreaming PC is using a Ryzen 7 5700X because it's fast enough and efficient for my needs and DDR 4 is like way cheaper than DDR5 especially how 64gb DDR4 kits are goin' pricewise now
i always enjoy ur vids despite me being too chicken to overclock, and if i would start anywhere, i'd probably start with Undervolting if needed
That cup on the back
Kind of interested in a 9950X build. I'd be willing to go back to 10. The 5900X had a nice box, really fancy looking. I've been on Nobara and I don't really miss 11 much (still have a flash drive with it). I'm on 8 P-cores with a 12900KF and 4090; 8 or more cores for the next build.
Do a Warzone video please about ReBAR and render worker count on Intel CPUs: how using the correct render worker count affects performance, and how disabling or enabling ReBAR changes fps a lot.
What's the fastest memory a 9950x can use? Faster than 6000?
1:1 6400 MT/s is the fastest I have seen, running the memory at 3x the fabric clock, so 2133 FCLK, but it's hit and miss with current boards. Above that you have to go to 2:1 and should go to 8000. I have seen both done successfully.
AMD memory controllers aren't that good. Even at 8000, AMD sees transfer rates around 30 GB/s slower than Intel at the same speed. That's the part AMD should be improving.
@@killerrf current motherboards.
@@edwinwebber5776 it can use 7200+ but due to the IF it's pretty pointless going much over 6000
@@robertmyers6488 well, some AMD chips are running 8000 MT/s on many motherboards, but read/copy/write speeds and latency suck compared to Intel at the same speeds.
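For readers following the ratio talk in this thread, here is a hedged sketch of the AM5 clock-domain arithmetic (simplified; exact behavior depends on the board and AGESA version):

```python
def am5_clocks(mt_per_s: int, uclk_ratio: int = 1) -> tuple[int, int]:
    """Simplified AM5 clock domains.
    MCLK (memory clock, MHz) = MT/s / 2, since DDR is double data rate.
    UCLK (memory-controller clock) = MCLK in 1:1 mode, MCLK/2 in 2:1 mode.
    FCLK (Infinity Fabric) is set independently, e.g. 2133 MHz for
    DDR5-6400 (6400 / 3), as mentioned above."""
    mclk = mt_per_s // 2
    return mclk, mclk // uclk_ratio

print(am5_clocks(6400))     # (3200, 3200): 1:1, the hit-and-miss case above
print(am5_clocks(8000, 2))  # (4000, 2000): 2:1, typical for 8000 MT/s kits
```

The 2:1 mode is why very high MT/s kits on AM5 gain less than the raw number suggests: the memory controller runs at half the memory clock, adding latency that offsets the bandwidth.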
Really appreciate the review. I always had doubts about the "other" mainstream tech media.
My struggle is which mobo to choose for my 9950X: X670E, or X870, which as far as I can tell is just a B650E with a single PROM21 chip.
"What's the goal?" is my favorite question now.
At this point, better to wait and see the 285K, the 40MB L2 cache boy.
Masterclass Preordered👍
Thx. Still seems like Intel wins in 1% lows quite often which is what I suspected.
My 10900k is not going anywhere. Although I like the sound of 16 POWER cores.
Well, if you're shopping for a new CPU now, getting the 14900KS doesn't make sense; it's a dead-end platform. On AM5 you can swap in a 10950X or 11950X as a drop-in replacement.
I'm currently on the 14700K myself and I kinda am saddened that I'm now stuck here unless I want to rip out the motherboard and change platforms.
appreciate your work as always Jufes
interesting that you dont show power usage 🙂
how much power does the 7800x3D draw for achieving these results? how much power do the other CPUs draw?
and yes, power efficiency is a massive benchmark of actual performance. The 7800x3D can do all this while drawing WAY less power, like WAY less power, which means its architecture is a lot more impressive than the other two. I will buy a 5090, it will draw like 400-600 watts, depending on the workload, the 4090 draws like 350-450watts+. I will absolutely pair it with an 9800x3D, to increase my power draw as little as possible while getting the best gaming performance overall. I dont want two heaters in my pc, I dont want to double my power bill just bc of a CPU, that isnt even better in gaming, lol
HBU tests like 50 games, GN tests like what, 20 games or so, those tests say a lot more than testing only a few. the 7800x3D is still the best gaming CPU overall. and yes, GN and HBU do show 1% lows, so this isnt an argument. the more games you test, the clearer the 7800x3D wins. Also the 7800x3D could be had for like 330-350$ for months now, which is another massive positive for it.
I hope the 9800X3D is not just 5% faster than the 7800X3D; I'd be happy with 15%, or even 10%. And yes, I'm talking about stock. I don't want to direct-die cool my CPU, it's not needed. I like content about it, but I wouldn't do it myself, it doesn't do much. It already only draws like 75-90 watts in real-world usage, so no need for direct-die cooling. And it seems like AMD improved the cache heat "problem" with the 9000X3D series, at least that's what's rumored, so it will be even less of a problem. But even now, a 360 AIO is more than enough for a 7800X3D lol
Oh, and yes, 1080p benchmarks are actually still relevant with a 4090, because in AAA games with extreme ray tracing features, especially path tracing and other intensive effects, you will use DLSS upscaling, and if you play at 4K, like me, you will use DLSS Performance, i.e. an internal rendering resolution of about 1080p.
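The 4K-with-DLSS-Performance point above is easy to verify: DLSS Performance uses a half-resolution scale per axis (the commonly documented factor; the helper below is just an illustrative sketch):

```python
def dlss_internal_res(width: int, height: int, scale: float) -> tuple[int, int]:
    """Internal render resolution for a given DLSS per-axis scale.
    Commonly documented scales: Quality ~0.667, Balanced ~0.58, Performance 0.5."""
    return round(width * scale), round(height * scale)

# 4K output with DLSS Performance renders internally at 1080p,
# which is why 1080p CPU benchmarks still predict this scenario.
print(dlss_internal_res(3840, 2160, 0.5))  # (1920, 1080)
```

So a 4K/DLSS Performance player is, from the CPU and upscaler-input side, effectively running the 1080p workload the benchmarks measure.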
Now if only chipsets and mobos became as solid as Intel's... My Zen 4 platform is just buggy; I'll blame MSI. I find chipsets and mobos more problematic and less reliable overall on AMD, and would love to see that change.
I wonder if Steve from bozos unboxed watches your videos jufes, but doesn't tell anyone. Lol.
He does yup
Hey Mr. Jufus, mind a question? I'm one of those cod bros and a cs2 player too, and right now I have a 5800X3D (AMFUCKINGDIP) + RX 6800 XT (which is actually quite nice), and I can't decide whether to go i7 14700KF with tuned RAM or 9900X; I'm looking to remove the AMDip from Warzone. From the video it seems like your 9950X doesn't dip in Warzone, but does that apply to actual Urzikstan online gameplay? Oh, the dilemma: AM5 Zen 5, 14th gen, or even potentially a 12900KF (so cheap here) and end up on a dead platform, or wait for Arrow Lake? Would you, Mr. Jufus, help me decide, or you guys from the community? Would greatly appreciate it, as there is no valid information on the internet yet. Keep it up!
14:30 terrible screen tearing, is that normal?
Is this not a similar situation to the 7700X matching the 5800X3D? Also, in Warzone that 20+ fps difference in 1% lows is 10%, not 5%, which is significant, no?
So what do you think about the 9800X3D, if it runs almost the same frequency as the 9700X with OC potential? Could this be the game changer we are waiting for? High avg fps, and with the better IPC and higher frequency challenging the 1% lows? Also only one die. What do you think about it, be honest?
Also one thing: I guess AMD noticed they fucked up the launch of Zen 5, and they may be trying to fix it with the 3Ds now?
If I get the course and access to the discord, are there any more fees for the discord afterwards, or is it a lifetime thing as long as I don't spread misinformation in there or "some bullshit"?
yay my intel is still king :)