Intel got back to us! "For CPU launch claims, Intel generally tests within parameters specified by the manufacturer. We conducted the performance testing on the AMD parts within specification based on AMD's public guidance regarding max memory speeds. In regards to the 1% lows - the 5800X3D not shown in the 1% lows/99th percentile gap to keep the message simplified. Our goal was to introduce a new way of measuring gaming experience beyond average FPS and focus on explaining this parameter and its importance to gamers. When we add additional SKUs to the slide, it would make the story more complicated to explain, which is why we didn’t include the ADL-S 12900K as well."
I'll tell my grandchildren when things got hard, the whole family had to gather round the rtx 4090 and the i9 13900k to keep warm and we all had to work 23 hours a day to pay the electricity.
@@kaktusgoreng who needs a fireplace when you could get a good heater. Hell, maybe you could even cook if it gets more ridiculously after this( rtx 50 series and oc'ed i9/ryzen 9)
I actually cannot get over where CPUs are at now. It wasn’t long ago that 5Ghz was a *HUGE* benchmark for serious overclocking and that 8 cores was insane! I cannot believe we are just chilling with consumer 16 and 24 core CPUs at over 5Ghz. Wtaf. Incredible progression.
I built my 1st pc back in the Core 2 days and remember watching TH-cam vids of people needing liquid nitrogen to hit 5ghz and I think only on Duo chips not Quad
Well, intel 24 cores are like pretty much all e-cores (basically, a bit better versions of those fucking useless intel atoms) so....it just sucks my friend.
8:40 don't forget that in Task Manager and in Linux you can limit specific processes to specific cores already (even while they're running) or decrease execution priority, it's just not done automatically.
@@ThomasWinget That's one way to do this, but another would be to adjust the nice levels to what you need. That's gonna make your CPU keep going at maximum while giving you ability to do simple tasks, although it'll still feel a bit slow...
@@ThomasWinget It annoys me when developers use oddball build environments or shell scripts that automatically select every core when using -j. It also annoys me when they single task the build. I tend to devote 4 to builds that I do in the background, sometimes 8.
If you care care about the efficiency, x86 platform would never beat the arm platform especially in laptop area. So that’s why Apple is dominated in laptop battery life
5:00 I had the same thought seeing these CPU benchmarks. For most games its obvious the biggest bottle neck is on the side of the GPU, exciting to see how the graphs change when they launch
Gaming is split between multiplayer competitive and single player that's GPU limited. 5950x was always memory starved and the socket PPT limit kind of enforced an Eco mode unless you overclocked, but using it with 3200 RAM as gaming CPU is silly. HUB tested 12 games and showed Ryzen 7 7700x did get more fps in games like CS:GO, DDR4 win was 5800x3D. The value winner was also the Ryzen 5 5600 based on fps/$ I don't get why running stuff in background is suddenly impressing anyone, I did that on with a chess engine using 8t set to low priority on a 4 core so I could do other stuff.
I think this would be a prime opportunity to see direct comparisons of DDR4 to DDR5 with the same hardware. Linus taking 5 full seconds to react to "charge they phone, eat hot chip, and lie" was the best thing I've seen in a while though.
@@ludvigbjorck7133 i hope so too. Im still on am4, but if ddr5 gives good result, i may jump.the gun, else ill pass on this gen since im still playing the same games
Remember when the 7th gen refresh was literally like 100-200mhz and nothing else? It's crazy how much horsepower cpus are putting out each generation, and given the rumors heard about Meteor Lake, it's shaping up to be an exciting 2023 for PC Building.
2023 is going to a great year to build a computer and RTX 3000 will probably still be available by the end of next year for more budget conscious builders.
@@kyletan4063 Welcome, you must have lost the part in the early 2000 until 2006 there was pure competition. Sadly intel played bad and you know how its finished Intel had to pay 2 billion fine.
I feel like quad core was standard a few years ago and now the I5 has 14 cores, it's insane how fast things evolved after quad cores were standard for so long
you can thank AMD for that, in 2017 while Intel was pushing out quad-cores, AMD brought 6, 8 and 16 core CPUs to the mainstream. This really got Intel moving for the first time in a long time, and ever since, there's been mostly great competition between the two companies.
well yeah. but most of those cores aren't really "cores". if you get what i mean. performance-wise, is still a 6 core. that's like saying a 6 core with HT is a 12 core cpu. is it tho? depends on your notion of what a cpu core should be, i guess.
@@dresnyd clock speed is all that matters when working. I video edit on a 2600x, going from 4/6 at 4.0ghz cut my 5-10 minutes sessions in half when rendering. Believe it or not, most of us editors don’t have thread rippers that do it as a side gig. Anyone rendering art though probably would need it. Every 10 minute clip, is around 4-8 minutes of render.
If only we can get this kind of competition in the GPU space. Nvidia just outright telling people they're being shafted and that we should be grateful for it.
@@ratboi9770 you forgot DLSS which massively improves performance. That is the main reason why people go with nvidia, oh and good drivers from day one, nvenc, better software. Oh and before you start, no, FSR 2.0 is not comparable to DLSS. Also DLSS has bigger performance gains than FSR 2.0. AMD is really great in CPU game since ryzen, but GPUs are just miss after miss. They are decent but never enough to tip the scale. We need AMD to get good in GPU sector so we actually get some competition and better prices. Also ray tracing is here to stay. And no, its solid for other games also.
I would love to see the update comparing these exact charts with "how they should have looked" when you bring all the test-benches to parity with the same processors they tested against.
The new P-core functionality could be huge for multi-tasking, though after both of these presentations, I honestly just wanna see what a Ryzen 7000 with 3D V-Cache would look like because I had no idea the 5800x3D was already such a beast. A Ryzen 7000x3D cpu sounds like it would put literally everything else on the market to shame and I hope we get to see it
Funny, I have had linux switching cores for me that way for ages. But having it on hardware has some definate advantages as long as you can control the behavior. Imagine your cpu accidentally swapping the wrong game processes to ecores mid-run.
I'd stop being intrigued as by all accounts Intel will only do the cheapest style graphic cards only going forward as they can't get their drivers right or have hardware to compete with beyond that.
Linus said that the power efficiency could mean i3 could be really fast. Unfortunately Intel said that the i3 and lower end i5 will still use Alder Lake. Not Raptor Lake. So the difference is probably minimal. Just the old chips with a small mhz boost and maybe some extra Efficiency cores.
I've been hypertasking since Core Duo and Core Quad. My old 4770K was a beast of a multitasking workhorse. It's always about choosing the right hardware for your tasks.
Im still using 4700k to this day. Im cooling it with a corsair h80i. One thing I noticed when I tried to change my OS yesterday from windows 7 ultimate to windows 11 pro is whenever I play shadow of the tomb raider, the display would just turn off on my asus gtx1060 6gb. Maybe there is a bug on the compatibility of cpu and gpu to the windows 11. Haha 😄
@@sanghuloom6943 Definitely look into Windows 10. I've been running it for years on all my Windows machines and I still run it with my new hardware. I won't be moving to Win11 until Win10 starts losing support from the software makers I use.
@@ragtop63 i just tried windows 🪟 11 because the design is closer to windows 7 with its glassy and aero style. Only for aesthetics and some programs support. But other than that, windows 7 is still a good for me as well. But definitely ill look to windows 10 next week. Maybe I'll like it even more
It feels like yesterday when intel announced 12th gen, and now it's 13th gen. But damn, 65w for a similar performance to 12900K OC? Makes me imagine what 14th gen will be like
@@Tugela60 is it possible that 65w power draw of intel and amd is different? how is it actually calculated? which would be better to compare- power draw of cpu itself or from psu?
AMD has a strange, roundabout way of calculating "TDP" that isn't actually based on power consumption. See Gamers Nexus' 7950X review to see what I'm talking about.
Without DDR4 support a lot of people would've been fairly angry with intel. Afterall, they bought their alder lake boards thinking they could re-use it at least for one more gen. And those boards came in DDR4 variants. Imagine if intel would flip on them by removing DDR4 support on the only other generation that would be compatible with those boards.
The space heater thing is legit. When I upgraded my CPU and GPU, I had to lower the AC or it would get 3-5 degrees C hotter in the room. I also undervolt, limit the max Ghz, and lower my GPU power limit-whenever possible. It hardly helps-baseline power consumption is just THAT much higher.
I think my favorite thing about the key note was that Intel is also talking about 3D stacking so we might see exclusive 3D stacked chips from both in a couple generations. I’m also hoping this Arc GPU does well enough to get another generation we need more competition in the GPU market… maybe Nvidia gets salty and jumps into the cpu market on desktop 😅
Watching the keynote is like hearing a car salesman talk about how much better his cars are then everyone else’s so he leaves out certain details and glosses over others.
@@washgaming5825 it’s my phone, idk what it is about the 12 but it has always done this to me, even if the word is spelled correctly it will swap it for something else if I type too fast, never had the issue with the 8 or 6, even my budget androids never had the issue, must be a setting somewhere that is causing it.
@@Tugela60 Except when you make a pillar chart with 3 products you compare, but only male pillars for 2 products and show the thrid as a single red line at the top. That IS creative graphmaking, as it literally flies in the face of standard graphmaking rules.
@@Tugela60 You kinda missed the point of the comment and probably didn't even look at the graph properly. they intentionally made the graph for the 5800X3D hard to notice as even though it is a previous gen CPU, it competes with and in some cases beats their new RaptorLake CPUs. Putting the 5800X3Ds results as a small red line above the result for the 5950X makes it confusing, because the only reason you would do it that way is because you hope people won't notice while at the same time you can say that you were brave enough to put those results in there even though they weren't flattering. So yes, it is creative graph making.
@@reappermen Even within such comparison those percentages are false when comparing to a 5800x3D, so it's more than just creative graphmaking, it's pure bad faith deceiving.
8:49 I'm using an i7 620m and encoded select Samurai Jack episodes on VirtualDub2 (custom occasional widescreen by removing black bar when possible) at 676p VP9 for my 2300x1080p phone. I waited at least 6 hours per episode but I could still watch TH-cam in 720p60. Normally it takes everything the cpu's got to play 1080p60 consistently. Not bad for a Tecra A11, but still a mediocre gaming machine, basically Wii graphics.
I appreciate that AMD is going for two markets at once and now Intel's going to do it too. Like, now they're both going to be making CPUs AND GPUs... that's awesome.
I'd say they are both going for 1.5 markets next generation. The new AMD CPU's make no sense for budget gamers, and the Intel GPUs won't be able to compete anywhere else than the budget segment.
lol, congrats. I'm gonna assume you're young based on the slang, but that feeling i had when I got my first PC running at age 14 will always stick with me, and hopefully with you too
5:46 The tiny bars a good way to: - not make the competition advantages obvious in a marketing slide - but still make the information visible for anyone who is looking. I would hope more companies would be that transparent in their marketing materials
Watching this on my 2012 gaming rig with trusty old i7-4790 and gtx 960 (in fact finally thinking about building a new one) Linus: “Maybe it doesn't matter to you because you only upgrade every 5 years anyway” I'm glad you mentioned undervolting with current energy prices rising (and space heater CPUs/ GPUs) would love to see an LTT take on energy efficient gaming/ high performing machines, maybe even in different price brackets
Welcome in da club, mate. I still have my 3770K and my 32GB of 1866 MHz DDR3 (and a RTX 3060). Still on Win 7. All games run perfectly at max settings in 1080p XD
I’m really impressed with your rig from 2012. To be able to get a 2014 CPU and 2015 GPU in 2012 really makes me feel hopeful for the commercial future of time machines. I wish you the best of luck in your endeavors to produce a potent time machine, and am looking forward to your exploits in the pre-history era.
I Just built one for the wife a 1080p mmo build with a 5600x and a 6700 pulse 16gb of 3600 CL16 corsair ram 1TB M.2 PCIE 4.0. it can run FF14 and ESO at max setting all day UPS loaded in 3DMark 297 watt including 2 monitors. with a Time Spy score of 11294. very good bang for buck on a system you can build for under $1k US if you price shop.
Considering how the performance difference in most games is no longer cpu bound, unless you are still playing at 1080p I'm glad for the competition keeping AMD and Intel trading blows but I'll keep my 5900x for a few more generations.
Well, tbh, it doesn’t really make much sense to upgrade your PC every year just cuz new CPU was released, it’s better to wait 3-4 years to actually see massive performance gains
"Considering how the performance difference in most games is no longer cpu bound" How incorrect could you be......... Spiderman, Cyberpunk2077, God of War, any upcomin UE5 game using any combination of the new tech in the engine etc etc.,........
It depends on the title. Eye candy games are GPU bound and those are usually used for benchmarks since they are very resource heavy. On the other side, games with heavy AI and lot of pathfinding (mostly strategy games and RPGs) are heavy CPU bound and even old titles eg. DwarfFortress will be CPU bound with current tier.
It scares me how quickly GPU and CPU power is advancing. Like how 5 years from now our games are gonna look arguably the same, but our hardware is going to be insanely powerful.
I feel like in 5 years most gamers or games will be marketed for 4k high fps gaming but the hardware like a i5-18th gen or 6060 will easily be able to run them
@@AponTechy Looking at the Steam hardware survey it seems like most gamers don't even have rigs as powerful as current gen consoles and only a little better than last gen consoles, so if anyone is a bottleneck it's PC players.
Maybe he's talking about the rumored i5 13400(f) having 6 Pcores and 4 Ecores, being a kind of locked 12600k(f). The only thing is the non-k SKUs seem to be using AlderLake's cores, but for a budget will be nice.
I can see undervolting becoming more and more popular. With power price going up and 5.8ghz not really being necessary for a lot of things it makes a lot of sense. I wish we could overclock and underclock on the fly with an windows utility tho. Would be an easy "Green" move for intel if they made it easy for millions to undervolt in the os for when you are just web browsing and stuff
The new Intel top of the line cpu might actually be a decent upgrade for those stuck on Threadripper 2950x without economical upgrade. How is the PCI lanes and bifurcation support on those new intel boards?
Hehehe, the switch between e-cores and p-cores is something I have done manually for a long time. If I do a long background-task like PNG-crunch, rendering a video etc. while working on other things. I just assign that application to a number of threads, maxing those out, while leaving the rest of the performance for my primary task. It doesn't happen that often, so the manual approach works, but it is neat to see this being automated by the CPU manufacturer.
10:20 Honestly though, if I were running a 1700x I'd rather upgrade to a 5800X 3D (mostly) without needing a mobo or RAM upgrade, instead of spending money to get a new chip+mobo that's EOL anyways.
I’m looking to upgrade from my i5-6400 on a $200 CPU budget so I’m impatiently waiting to hear more 13th non-k info. I’m very interested in seeing more info on their GPU as well.
I'm glad they included UE5 in the tests. I am a game dev and I use Blender, Adobe and UE5 (+ compiling) and I also play games. Would be cool if LTT would also include some UE5 or Unity test when testing hardware.
Yeah I'd be in love with them if they'd include at least basic compiling tests when benchmarking. If I have to set a 15 minute timer so I can let some Rust build a few more times, I might lose it
That statement from Linus was a bit odd to me: "Hit the build button and walk away for a while because the pc becomes unusable"... working in UE5 with my 3900X, I can't really confirm this. Yes it takes quite a bit of time but task prioritization means that anything else I then pull into focus will run just fine. So I just don't really see the point here that LTT and intel are trying to make
@@taku1101 He maybe has no clue bc most probably they don't use UE at LTT. But ofc, I have no issue doing something else while compiling. Having a Ryzen 9 5900X + a 3080. Thinking of jumping ship to Raptor Lake + 4090 maybe next year. The new AMD seems too darn hot tbf - and not in a good way lol.
@@GoatOfTheWoods yeah idk I'm not very exited for this generation in general.. I'll prob wait for two years or so - maybe skipping this socket entirely as I don't really need any extra performance with my 3900X and 5700XT
@@joebarthram596on a 4790k myself and since I'm on ddr3 my plan is to completely skip the cross over nonsens with ddr4 and wait for ddr5 to be the norm. Guess I'm waiting on 14th gen.
@@joebarthram596 I went from a 3570k to a cheap 2700x in early 2020, then dropped in a 5800x, crazy difference between the 3570k and 5800x it's worth the upgrade even to 5th gen.
@@realforest I run a lot of cpu heavy strategy games so honestly I'm looking at a 5800x3d once they drop a little more... Then hopefully a 3060/3070 to replace my 1060...
I'm ok with a 3700x, i will wait to see if the 7800x3d or something along the lines is worth the full investment of CPU+mobo+mems, if not i get a 5800x3d instead with a 6900xt
Intel just tested with the highest speed memory officially supported by each CPU. 5950x is 3200, 12900K is 4800, 13900K is 5600. Anything more than these speeds is technically overclocking. You can argue about them not wanting to use higher speeds, but it's not entirely without logic.
I'm pretty sure, and when I say pretty sure I mean certain you can use faster ram with the 5950x, not that it would make too huge a difference but if you're gonna compare the two at least run them both on the same ram, they have the ability
@@alloftheabove8522 Not just a suggestion I have systems in which running 4 dimms of memory prevents running beyond speeds of 3200mhz because the memory controller can't handle it. If they could run every CPU beyond 3200mhz they would have. There is instability beyond that point with certain CPUs and that's why they list it there.
You must be too young to remember the '90s. Things were ridiculous, by the mid to late '90s there were something like 5 manufacturers of chips in the Windows space and additional ones handling just graphical acceleration. It was nuts, every time you'd turn around there'd seemingly be a product that made the one from a couple months earlier look like crap and most of the time it wasn't just marketing bluster either. Technology was advancing just that fast. Obviously, it wasn't just the additional companies, the performance gains and need for them were also much more substantial
@@Toma-621 Nah not to this extent, the battle between Intel and AMD since the 286 Design up to AMD 64 was epic. The jumps are much more narrow today due to physical limitations.
Efficiency is what I want to start coming to the forefront. I have a 3080ti & 11850k and don't plan to upgrade this generation, or possibly next. When I see some efficient CPU/GPUs that can keep my room a bit cooler while still being powerful then I'll pull the trigger!
Excited to see how these releases compare, dubious marketing practices aside. I've been stuck on 6 year old hardware and let me tell ya, nothing gets your blood pressure going like trying to edit 4k videos with that. It's about time for an upgrade, once I have saved enough of course :D
Out here on a 9 year old Xeon I feel this. I am just holding out for as long as possible as while it is old, my xeon still has 8c/16t and is OC'd to nearly match a 9900k. I am really looking forward to the performance per watt improvement the most though.
@@DanKaschel Yes, I just upgraded to a Ryzen 5 based system after having the previous computer for like 7 years and it's pretty amazing. It was unthinkable keeping a computer that long when I was a teen, but technology of the '90s was such that a computer wouldn't even be usable that many years. The patches would stop and it probably wouldn't even run new software.
@@harryyoseminy7623 Oh yeah, and intermediate files too. Canon R6 footage is unplayable from camera but works well transcoded to ProRes even at full resolution, but sometimes it would be nice to edit straight off the camera and not have to deal with transcoding. After Effects also tends to bog down with heavier sequences and the only solution there is more horsepower.
Yes, I just wonder what things would be like if Cyrix and PowerVR had managed to hang around until now. They had some pretty impressive tech going on 20 years ago, it would be amazing to have had them stick around to build on that.
Aright that multitasking mega tasking tech might just be the reason I'll get a new laptop with a 13th gen CPU. I mean something like that is essential when doing something as simple as livestreaming, I've had an insane amount of issues in the past while livestreaming that I never had in other scenarios.
Just curious what CPU had that issue. My 3700x handles live streaming just fine while gaming, doing compiles, encoding, etc. Did you have an older CPU or just a crunchier workload?
Linus missed something @ 10:20. If you're running a 1700X and at least a decent B350 or X370 board, there's a good chance the cheapest short-term upgrade would be to drop in a 5800X3D without swapping anything else besides a BIOS update...
5 years? lol, I am still rocking a 6600K. I did decide now is the time and here is the place and I have a 7950X on the way as we speak. Pretty excited to finally be upgrading. My current setup is getting pretty long in the tooth.
@@nayan.punekar 3D chips are massively overrated. 5800x3D was only good for gaming, and only in CPU bound scenarios. In nearly everything else it was beat by the much cheaper 12600K. In GPU bound scenarios for gaming the 5800x3D is really no better than the much cheaper 5600x.
@@MistyKathrine It depends on the game, for most games the 3D cache is irrelevant and the lower clocks result in lower performance than the 5800x, but in some games (like Factorio, MS Flight Sim, and Assetto Corsa) the 3D cache results in noticeable performance uplifts even at higher resolutions, and even compared to the 12900k and 7950x.
Please note that the $590 pricing is not what customers are going to pay for. That's for every 1000 units sold to retailers. Right now, Newegg has the 13900k for pre-order at $659.99.
Just upgraded from my 6700k and got a 12400f I’m really impressed with it. It’s no 12700k but 50% more performance means I barely see my cpu above 25% utilization. I also see my GPU hit 99% utilization all the time and have butter smooth frame rate. Only down side is now I have coil whine to deal with and have to enable sync for certain games or else I’ll hit 4000 fps in the menus and listen to my 1070 beg to be put out of its misery.
I love this. A CPU Generation where both Intel AND AMD have compelling choices that I can make a use case for. I'm cautiously optimistic that no matter which brand you choose, it'll be a great choice (Gotta see that testing from all the trusted channels!)
Intel's support for DDR4 will make it alot easier for people to upgrade and I am happy about it. DDR4 is more than enough for most of us and they are dirt cheap at the moment.
Well its even harder to choose now. Do you stick to your old good DDR4 or do you go for promise of long socket support? I stopped considering Intel even a option since Ryzen came out with soldered IHS. Still kinda butthurt they made us delid their expensive CPUs. Price drop of DDR5 will be game changer.
especially for gaming. DDR4 will still be widely used for at least 2 more cpu generation change. we might even see a Mobo with both DDR4 and DDR5 slots. like back in the old DDR2/DDR3 days.
yeah being able to drop in my current kit of ram would save a signficant amount of money and with first gen ryzen I am considering an upgrade. However our exchange rate is awful right now so this generation is even more expensive than it looks. Look at the nzd to usd graph truly tragic
@@Pub4si Well, intel wise,AMD take the lead on Server/workstation,but still competing on Gaming Nvidia took over workstation/server ,but still fighting AMD from gaming
Let's hope they don't actually fight to the death, we don't want another monopoly situation on our hands again with very little advancement over the next 10 years.
That makes me wonder how the 12900ks was the prev gen 'gaming king' while in a lot of reviews from different channels the 5800x3d was shown as a relevant competitor for this new generation...
@@p_serdiuk at launch vs the 5800x3d, yes. But check this graphs and the ones from the ryzen 7000 review and you'll ser the 5800 one on one with the 12900.
@9:12 If there's one thing that would make me hesitate upgrading any time soon (with $$ not being a concern), it's the high power consumption (TDP rating) of the new and upcoming CPUs and GPUs.
In the "Mega Tasking" section, I think they are overlooking the fact that people prefer doing their builds/renders on all cores to finish it as soon as possible. I have never in my career in software engineering, seen people use less cores for their compilation/build just so they can do something else on their computer
It's still 24 cores, and if you don't want to reduce the load so you can do something else...you don't have to, it's user optional. Which is the point, for you you could let it go all in.
The biggest thing that both AMD and Intel left out was the question of why you'd even bother upgrading to the newest gen if you're only gaming? As many reviewers pointed out, it's difficult to get a CPU bottleneck for the high-end, and it's often at a framerate that doesn't even matter. Why buy this gen when you can pick up a 12900k or a 5800x3D for a much cheaper price?
I think if you're making the DDR5 jump, cry once buy once. If you're staying DDR4, unless the prices are comparable, a 12700 for gaming is gonna get it done.
@@spaceli0n I upgraded from an 1800x to the 5800x3d, probably the biggest jump I've made since i switched from the old family pentium system to an fx 8350,
I've been on a Ryzen 1600 since release, more or less. It has served me well, but I've been looking at upgrades... and neither Ryzen 7000 nor Intel 13th gen look promising enough to warrant such a massive hit to my wallet, considering I could just slot a R5 5600 into my existing mobo, a Ryzen 7000 would require a whole set of expensive motherboard and DDR5 RAM, and in the best case scenario for a 13th gen intel set I'd need a mid-tier motherboard (Which is still not cheap) and a CPU. I think whether it's worth it as an upgrade is very case by case basis, because anything Ryzen 1000 or 2000 could get a big boost from 5000 for far less money, and anything newer is modern enough that it doesn't warrant an upgrade unless you were also jumping product classes (like from an i3 10100 to a 13th gen i5 or i7)
@@VividFlash A far better performer than the 5600, but also more expensive and it needs a decent cooler so additional expense. At that kind of budget I'd look at 13th gen for longevity, if the i5 13th gen turns out to be a solid value once LTT reviews it
I mean if you're so strapped for cash then why not get an even older CPU? Unless you actually need a better CPU I think even lower end modern CPUs are more than enough.
@@Katoptrys if you want longevity, amd is the choice right? Intel change socket every 2 gen right. So based on that, 14th gen probably next year will require new socket that requires new mobo. While am5 will be supported until 2025+
I'm really looking forward to your testing. I haven't upgraded since the GTX1080 was brand new, and now that I'm out of uni and earning money, I am probably gonna upgrade this year, hopefully to something that might do me as long as my current pc has done.
With the RAM speeds, could it be that Intel only picked the default speeds (at least for 12 and 13 gen)? As far as I know intel bumped the default speed from 4800 to 5600mhz on their new cpu.
Well, LTT in thier AMD Ryzen 7000 series review used 5200 MT/s RAM instead of the 6000 MT/s as the sweet spot. A couple other reviews did that too as they follow the use of the official JEDEC speeds supported not enabling AMD EXPO or XMP. So the 12th and 13th gen testing using the slower RAM seems fair but the Ryzen testing doesn't so far.
the Ram speeds used are the official NON-OC supported speeds, reviewers seem to ignore this and act like drama queens when marketing benchmarks aren't running overclocked memory.
I really thought intel was making a comeback with last years release. Hopefully intel keeps it up and 13th gen can be just as massive of an improvement
8:46 I don't know about all that, it may not be as intense as their demo, but personally? For my uses? My Ryzen 7 5700X can transcode an entire queue of HD videos while playing the Master Chief Collection with friends online while chatting on Discord and recording screen capture, which personally feels like having the power of NASA or greater in my own PC, I've never had a computer so powerful in my life so far, this is the best experience I've ever had thus far. I love seeing all this new tech and watching the competition between Intel and AMD as though it was some kind of sport with a smile on my face, but having come in when I did at the point I did, I'm very happy with what I have at the moment and feel no real need to upgrade anything at the moment, so I'll just sit back and enjoy the show, cheering for no side in particular, just eager to see what either "team" can bring to the table. Now is a very exciting time for tech, I'm glad to see this sort of thing come back around and be in it this time.
It really is nice to have the choices we have right now. Yes, prices have gone a bit crazy, but I have a suspicion that we'll see earlier price cuts this gen, or at least better bundle offers than usual.
The 3200mhz, 4800mhz, and 5600mhz are the highest speeds for their respective platforms without it being an overclock. They went with the numbers printed on the box, that's why they chose those ones.
@@Bayonet1809 because Intel reliably makes whatever choice will cast their products in a favorable light. In this case z defensible or not, their benchmarks were rendered largely irrelevant except against its own previous generation chips.
Intel is finally starting to actually deliver without just Pentium 4 Extreme edition deliver like last gen. Will wait to see what the number looks like an will instead purchase a 5800x3D because performance right now, but in another five years might be back on Intel, if they can stop pushing more power draw. Competition might back in full swing for the first time since 2017.
@@groenevinger3893 well i already have an Am4 motherboard good ram and a decent GPU. Im just saying if the 5800X3D can trade blows with zen4 and 13th gen its a cheap but effective upgrade compared to a whole new system
Seems like a decent generational leap, personally I'm glad that it didn't blow previous gen out of the water like 12th did to 11th gen. Makes me feel better about buying my 12900k.
Those E cores are really starting to pull their weight. This generation might finally see my 8700K retire. I would also love to see a 0P+16E chip. Something like that would be perfect for a nice little render box provided it has enough PCIE lanes for the GPU and 5 gigabit LAN.
@@LifeOfToyz I just bought in myself. I have a 13700K on order and 32GB of DDR5 with a Z690 board in my hands. Just waiting on my A770 16GB and PSU. I just packed up the 8700K system for a friend to use. He's dropping in his own GPU, a 3060ti, so the sale of one very tired 1660ti has funded the build a little.
The Ryzen 9 7950X3D cache is rumoured to not be released until CES 2023 (Jan 5th-8th 2023). Having seen the HUGE boost in performance that the 3D cache gave to the 5800X3D, to the point where that last generation chip holds its own against the just released 7950X and the not yet released intel 13th gen flag ship, there is no way I would get a CPU before the 3D cache version, so I am praying the Ryzen 9 7950X3D is released sooner, just after intel 13th gen, so they crush it, and I don't have to wait even more months before building a brand new from scratch rig. Fingers crossed AMD have the sense to release the Ryzen 9 7950X3D straight after intel 13th gen so that AMD can corner the market before the Christmas purchasing rush! In my opinion it's not worth the extra money to go all in for a complete platform upgrade to AMD given the AMD 7000 vs Intel 13th gen (where no upgrade needed) are too close in performance and Intel is CHEAPER, so the temptation is just upgrade to intel 13th gen now and leave it a few years. BUT, if AMD pull out the Ryzen 9 7950X3D cache BEFORE black Friday this year (so I can buy EVERYTHING in the sale), then I'm ready to go all in, full AMD because the performance of the Ryzen 9 7950X3D cache will leave everything else miles behind!
lets go, I am so excited to see Intel make another massive performance jump, and if the efficiency is as they claim I'll no longer have to worry about thermals on my custom loop!
For my 13900k and 14900k I have done the following: MCE off, PL1 and PL2 limit to 225, limit P-core boost to 5.5 GHz and E-core boost to 4.3GHz, and use balanced power profile in Windows (although I do disable core parking to keep system highly responsive). Oh and just XMP on the RAM. I didn’t change LLC value. I have set voltage offset at a modest -0.010v. I have disabled the C6&C7 C states and EIST. Lastly I have locked AVX at 0 offset. I have tested on P95, CB R23 and CB R15. All great and in a mid 20 degree room, no workload exceeds 80c on package or cores just using an NH-D15 with an airflow case. Very happy and benchmarks are very close to where they were before taming these beasts.
I still use an 8700k. Thing is easy to overclock, takes abuse and doesn't die, and has handled all the virtualization I needed when I needed to run multiple VMs in school. Barely have to do anything for it to hit 5Ghz. I'll have to replace it some time, but I love the thing.
I literally ordered exactly the upgrades he's talking about at 10:15 yesterday. I'm doing 6700k->13700K, going with z690, keeping my cooler by adding a bracket, and will try to keep my current ram too. Great minds.
That was a very good video Linus. Brought up some points most reviewers wouldn't talk about and that got me thinking. Can't wait until the full review and comparison to ryzen 7000 comes out
I'm running a 6600K with 3600Mhz DDR4 and a 3070 and I've been wondering for a while how and when to modernize. The idea of a 13th gen chip with a new board, but keeping my 700watt PSU, RAM and AOI is really tempting.
I mean, I don't really think that it's fair to expect that Intel would've had access to AMD's newest Ryzen lineup the day the embargo was lifted and had a chance to test it against Raptor Lake... I don't really think that AMD is going to send Intel some QS samples so they can test against their latest gen against AMD's latest...
My last Intel CPU was the legendary 2700k and switched to Ryzen CPU since then. I'm currently using a Ryzen 5950X CPU and won't be upgrading for a while. I'm looking forward more to what AMD can offer on the GPU side of things to compete with NVIDIA.
Intel's choice of memory for its comparison builds are the correct ones, as far as they are the "maximum" JEDEC speeds for each platform without XMP/AMP. DDR5 4800Mhz 12th gen, DDR5 5600Mhz13th gen and 3200Mhz for Ryzen 5950x. So, Intel is playing fair in that respect. As for the 5800x3D (at least they included it) I agree they should have made the graph clearer. Plus Intel would not have had access to the new AMD 7000 series CPUs for comparison, nor would AMD have had 13th gen to compare their CPUs with.
For gamers like me I'm waiting for AMDs new x3d chips. I think it's amazing the 5800x3d is still competitive in Intels own graphs for the 13 series so it stands to reason that the new x3d chips amd will be releasing will be that much more OP.
@@directlinkrexx4409 we'll see I suppose. I wouldn't doubt AMD is prepared to drop prices if sales are bad. The x3d chips though were always going to be expensive, I mean the 5800x3d was 449 at launch so I can't imagine it being much more expensive than that for a similar chip.
@@directlinkrexx4409 Intel's pricing is so much better than AMD's for Ryzen 7000 that AMD will be forced to lower prices by the time 3D cache 7000 series chips are released.
I still think I'm leaning AMD, not ready to jump on the E-core train yet, but I am SUPER happy that both teams have really great performing chips, with any luck, Intel's chips will be about as good as they are saying, and AMD may have to cut their prices a little to stay competetive.
I'm honestly probably gonna wind up switching to AMD for my next tower. I figure that by the time my current one isn't good for much more than being a home server, the "who's on top" cycle will *probably* be back to having AMD as the overlords of price to performance. Plus, I'm probably never going Intel for a laptop ever again... I had two Intel laptops during the era where they were the only ones in the laptop game and *MAN* they got complacent on the laptop market because the first one turned into a potato when faced with Windows 10 and the second one overheated constantly. And now I have a Ryzen laptop that makes butter look like sandpaper.
We finally came to a point when we have enough cores to shut hyper-threading off and have no more productivity performance drops because stuff ran on weak hyper-threads(25% performance of physical core). And Intel comes out with their e-cores, which are 25% performance of p-cores and are basically hyper-threading with extra steps, which you can't shut off and should entrust operating system to utilize efficiently(which will never happen, since the system is not a Nostradamus and can't predict the thread will be a heavy workload).
@@grumpy_cat1337 Look, I don't really want E-Cores for how I use my system, but they are clearly of value at times. Also neither the E-Cores or Virtual cores of Hyper threading operate at 25% the performance of a "real" core, maybe in the early days, but some workloads that truly benefit from parralellization are faster with hyper-threading running than not. Why turn it off? (outside of some VERY specific workloads/ scenarios)
Intel got back to us!
"For CPU launch claims, Intel generally tests within parameters specified by the manufacturer. We conducted the performance testing on the AMD parts within specification based on AMD's public guidance regarding max memory speeds.
In regards to the 1% lows - the 5800X3D not shown in the 1% lows/99th percentile gap to keep the message simplified. Our goal was to introduce a new way of measuring gaming experience beyond average FPS and focus on explaining this parameter and its importance to gamers. When we add additional SKUs to the slide, it would make the story more complicated to explain, which is why we didn’t include the ADL-S 12900K as well."
Hi
Lyte gaming PCs 🤡🤡🤡
Also the intro had the critical thinking skills of a 🗿
Hi Linus
🤡
ok
I'll tell my grandchildren when things got hard, the whole family had to gather round the rtx 4090 and the i9 13900k to keep warm and we all had to work 23 hours a day to pay the electricity.
Unironically, i want to see if this is doable. maybe make a radiator shaped case with lots of fins and just run with it.
ong
Don't forget the part where the whole family sold one of their kidneys to pay for the 4090
@@kaktusgoreng who needs a fireplace when you could get a good heater. Hell, maybe you could even cook if it gets more ridiculously after this( rtx 50 series and oc'ed i9/ryzen 9)
We will see builds with loops connected to the heater soon
I actually cannot get over where CPUs are at now. It wasn’t long ago that 5Ghz was a *HUGE* benchmark for serious overclocking and that 8 cores was insane! I cannot believe we are just chilling with consumer 16 and 24 core CPUs at over 5Ghz. Wtaf. Incredible progression.
I built my 1st pc back in the Core 2 days and remember watching TH-cam vids of people needing liquid nitrogen to hit 5ghz and I think only on Duo chips not Quad
That’s what happens when there’s no competition with those crappy fx processors
but at a huge cost of consumption and heat, this is just competition to max level, amd and intel beating each other to death
people are commenting on comments like its a school project and you have to comment for full credit
Well, intel 24 cores are like pretty much all e-cores (basically, a bit better versions of those fucking useless intel atoms) so....it just sucks my friend.
8:40 don't forget that in Task Manager and in Linux you can limit specific processes to specific cores already (even while they're running) or decrease execution priority, it's just not done automatically.
@@ThomasWinget That's one way to do this, but another would be to adjust the nice levels to what you need. That's gonna make your CPU keep going at maximum while giving you ability to do simple tasks, although it'll still feel a bit slow...
@@Astra3yt switching them to "idle" priority (with chrt) will deprioritise them even further
@@ThomasWinget It annoys me when developers use oddball build environments or shell scripts that automatically select every core when using -j. It also annoys me when they single task the build. I tend to devote 4 to builds that I do in the background, sometimes 8.
The performance per watt is the most impressive thing about these new chips. Laptops are gonna be amazing with both Intel and AMD’s new CPUs.
laptops will always feel terrible to me, but i guess they will always have a place in the market.
If you care care about the efficiency, x86 platform would never beat the arm platform especially in laptop area. So that’s why Apple is dominated in laptop battery life
Yes, adding 8 e-cores will lower power consumption.
@@heflar yes, laptops have a place in the market.
atleast is portable 💀
5:00 I had the same thought seeing these CPU benchmarks. For most games its obvious the biggest bottle neck is on the side of the GPU, exciting to see how the graphs change when they launch
There definitely needs to be two different tests with top of the line old gen GPUs and new Gen
Gaming is split between multiplayer competitive and single player that's GPU limited.
5950x was always memory starved and the socket PPT limit kind of enforced an Eco mode unless you overclocked, but using it with 3200 RAM as gaming CPU is silly.
HUB tested 12 games and showed Ryzen 7 7700x did get more fps in games like CS:GO, DDR4 win was 5800x3D. The value winner was also the Ryzen 5 5600 based on fps/$
I don't get why running stuff in background is suddenly impressing anyone, I did that on with a chess engine using 8t set to low priority on a 4 core so I could do other stuff.
true
The fact that league of legends had the biggest performance jump is hilarious. The game is almost 13 years old
If rumours are true then raptor lake won't really up the average fps that much but drastically improve low fps.
I'd love to see a series of videos comparing 13th gen and 7000 series at a variety of price levels
I'd much rather see benchmarking with a much wider variation of software instead of most of the benchmarks focusing on what content creators do.
Subscribe to Hardware Unboxed.
sounds like something you would need to visit GN for, linus and jay are so pro AMD it's just disgusting
@@joshb6470 Absolute clown take
@@splicedbread 1 hour ago LMG Clips posted a video titled "We're hopeful for AMD".... enjoy being labelled a sheep by me
I think this would be a prime opportunity to see direct comparisons of DDR4 to DDR5 with the same hardware.
Linus taking 5 full seconds to react to "charge they phone, eat hot chip, and lie" was the best thing I've seen in a while though.
Initial benchmarks show very little difference while ddr5 are priced at least twice
HUB did that. th-cam.com/video/IstA56IAeVA/w-d-xo.html
Budget DDR5 was comparable to decent-tier DDR4.
@@wiizardx Yes but now we have AM5 which solely runs DDR5 so new data is bound to show here.
@@ludvigbjorck7133 i hope so too. Im still on am4, but if ddr5 gives good result, i may jump.the gun, else ill pass on this gen since im still playing the same games
Actually, the best thing I did see was that his sister looked much cuter then him....
Remember when the 7th gen refresh was literally like 100-200mhz and nothing else? It's crazy how much horsepower cpus are putting out each generation, and given the rumors heard about Meteor Lake, it's shaping up to be an exciting 2023 for PC Building.
2023 is going to a great year to build a computer and RTX 3000 will probably still be available by the end of next year for more budget conscious builders.
Also crazy how much electricity power their using. :(
Also 7000x3d early 2023
Intel was stagnant and did 4 cores for years because there was no alternative.
Meteor lake is supposed to be a new socket I believe.
Holy crap, finally some actual competition between the 2 CPU companies
This is the first time I get to witness this phenomenal, and it is bloody beautiful.
Edit: Phonemail --> Phenomenal, bloody voice input
Finally indeed. And I hope it stays that way.
@@kyletan4063 Welcome, you must have lost the part in the early 2000 until 2006 there was pure competition. Sadly intel played bad and you know how its finished Intel had to pay 2 billion fine.
@@parabelluminvicta8380 hold up what? What did they do to get a $2 billion fine? How can I search this up. Sounds interesting.
The i9-13900K's massive core count of 24 reminds me of AMD Bulldozer...I hope it's not headed in the same direction
I feel like quad core was standard a few years ago and now the I5 has 14 cores, it's insane how fast things evolved after quad cores were standard for so long
you can thank AMD for that, in 2017 while Intel was pushing out quad-cores, AMD brought 6, 8 and 16 core CPUs to the mainstream. This really got Intel moving for the first time in a long time, and ever since, there's been mostly great competition between the two companies.
well yeah. but most of those cores aren't really "cores". if you get what i mean. performance-wise, is still a 6 core. that's like saying a 6 core with HT is a 12 core cpu. is it tho? depends on your notion of what a cpu core should be, i guess.
@@dresnyd clock speed is all that matters when working. I video edit on a 2600x, going from 4/6 at 4.0ghz cut my 5-10 minutes sessions in half when rendering.
Believe it or not, most of us editors don’t have thread rippers that do it as a side gig. Anyone rendering art though probably would need it.
Every 10 minute clip, is around 4-8 minutes of render.
one word that can describe the innovation... competition
But ryzen 5 7600x still has only 6 core 🤣🤣🤣🤣🤣🤣🤣🤣🤣🤣🤣🤣🤣🤣🤣🤣
If only we can get this kind of competition in the GPU space. Nvidia just outright telling people they're being shafted and that we should be grateful for it.
Well, RAy Tracing is practically useless for anything other than minecraft so… amd aint a bad option
@@ratboi9770 you forgot DLSS which massively improves performance. That is the main reason why people go with nvidia, oh and good drivers from day one, nvenc, better software. Oh and before you start, no, FSR 2.0 is not comparable to DLSS. Also DLSS has bigger performance gains than FSR 2.0. AMD is really great in CPU game since ryzen, but GPUs are just miss after miss. They are decent but never enough to tip the scale. We need AMD to get good in GPU sector so we actually get some competition and better prices. Also ray tracing is here to stay. And no, its solid for other games also.
@@ratboi9770 nvidia also still has the dlss advantage tho. Hope that changes
@@ratboi9770 AMD GPUs are also virtually unusable for professional use unfortunately
@@wqite yup. CUDA is essential
I would love to see the update comparing these exact charts with "how they should have looked" when you bring all the test-benches to parity with the same processors they tested against.
The new P-core functionality could be huge for multi-tasking, though after both of these presentations, I honestly just wanna see what a Ryzen 7000 with 3D V-Cache would look like because I had no idea the 5800x3D was already such a beast. A Ryzen 7000x3D cpu sounds like it would put literally everything else on the market to shame and I hope we get to see it
and you can find it for 300-400$ XD
7000X3D will be Q1 ‘23 :)
It will. Just wait.
Funny, I have had linux switching cores for me that way for ages. But having it on hardware has some definate advantages as long as you can control the behavior. Imagine your cpu accidentally swapping the wrong game processes to ecores mid-run.
you don't care about the E-core which is also the main feature of intel processor now?
I'm pretty intrigued by Intel Arc. A third competitor in the gpu space is going to spice things up a bit, and can only benefit the consumer.
I'd stop being intrigued as by all accounts Intel will only do the cheapest style graphic cards only going forward as they can't get their drivers right or have hardware to compete with beyond that.
@@Nody78 So? Nvidia and AMD are both in a desperate race to abandon anything short of $700.
I think Intel still has ways to go before being considered a serious competitor but still competition is great especially needed in GPU space
intel will have a good graphics card on the year $CURRENT_YEAR + 3.
for two decades and running. the pr's even the same.
@@lasskinn474 i mean, they got physical products now tho
Linus said that the power efficiency could mean i3 could be really fast. Unfortunately Intel said that the i3 and lower end i5 will still use Alder Lake. Not Raptor Lake. So the difference is probably minimal. Just the old chips with a small mhz boost and maybe some extra Efficiency cores.
I see they used the same playbook as Nvidia
@@DanKaschel hm it's also like AMD with their previous APU Lineup and Laptop CPUs
Source? 🤔
I've been hypertasking since Core Duo and Core Quad. My old 4770K was a beast of a multitasking workhorse. It's always about choosing the right hardware for your tasks.
Im still using 4700k to this day. Im cooling it with a corsair h80i. One thing I noticed when I tried to change my OS yesterday from windows 7 ultimate to windows 11 pro is whenever I play shadow of the tomb raider, the display would just turn off on my asus gtx1060 6gb. Maybe there is a bug on the compatibility of cpu and gpu to the windows 11. Haha 😄
@@sanghuloom6943 lol yeah bro windows 11 is a mess, haven’t had any problems on 10, and not upgrading any time soon
@@EstaDePiPi thats what i notice too. I'll be reverting back to windows 7 in a few days. 😅
@@sanghuloom6943 Definitely look into Windows 10. I've been running it for years on all my Windows machines and I still run it with my new hardware. I won't be moving to Win11 until Win10 starts losing support from the software makers I use.
@@ragtop63 i just tried windows 🪟 11 because the design is closer to windows 7 with its glassy and aero style. Only for aesthetics and some programs support. But other than that, windows 7 is still a good for me as well. But definitely ill look to windows 10 next week. Maybe I'll like it even more
It feels like yesterday when intel announced 12th gen, and now it's 13th gen. But damn, 65w for a similar performance to 12900K OC? Makes me imagine what 14th gen will be like
@@Tugela60 is it possible that 65w power draw of intel and amd is different? how is it actually calculated? which would be better to compare- power draw of cpu itself or from psu?
AMD has a strange, roundabout way of calculating "TDP" that isn't actually based on power consumption. See Gamers Nexus' 7950X review to see what I'm talking about.
@@MaraAmaraaaa yeah it's different
@@rdmz135 not really no. Different calculations, so we don't actually know yet.
@@rdmz135 for the money the new ones cost at the top you may as well just buy a 5800x3d
That parity of compute with last gen at 65W is amazing! I think that and the DDR 4 support were really pleasant surprises.
Without DDR4 support a lot of people would've been fairly angry with intel. Afterall, they bought their alder lake boards thinking they could re-use it at least for one more gen. And those boards came in DDR4 variants. Imagine if intel would flip on them by removing DDR4 support on the only other generation that would be compatible with those boards.
I want to see a review purely focused on 65W testing.
@@kodan582 that's why I went DDR5 and Z690. Could really just buy a Raptor Lake cpu and call it a day
@@elmhurstenglish5938 same. Especially seeing as the 7950x at 65watts beats a 12900k at 125 watts! I want to see all the new CPUs in “eco mode”!
The DDR4 support is pointless since the 12th gen already gets hurt a lot in most cases while using DDR4 and that will only get worse with faster CPUs
The space heater thing is legit. When I upgraded my CPU and GPU, I had to lower the AC or it would get 3-5 degrees C hotter in the room. I also undervolt, limit the max Ghz, and lower my GPU power limit-whenever possible. It hardly helps-baseline power consumption is just THAT much higher.
I think my favorite thing about the key note was that Intel is also talking about 3D stacking so we might see exclusive 3D stacked chips from both in a couple generations. I’m also hoping this Arc GPU does well enough to get another generation we need more competition in the GPU market… maybe Nvidia gets salty and jumps into the cpu market on desktop 😅
Theyve been talking about FOVEROS for like 5-6 years now.
I have concerns about heat management in 3D stacking.
Right now it might not be a problem but eventually it will become an issue.
Completion? I think you meant competition.
Watching the keynote is like hearing a car salesman talk about how much better his cars are then everyone else’s so he leaves out certain details and glosses over others.
@@washgaming5825 it’s my phone, idk what it is about the 12 but it has always done this to me, even if the word is spelled correctly it will swap it for something else if I type too fast, never had the issue with the 8 or 6, even my budget androids never had the issue, must be a setting somewhere that is causing it.
That's some really creative graphmaking right there, especially for the 5800X3D
Well we all know who to thank for that...
Someone whose name rhymes with Brian Stout
@@Tugela60 Except when you make a pillar chart with 3 products you compare, but only male pillars for 2 products and show the thrid as a single red line at the top.
That IS creative graphmaking, as it literally flies in the face of standard graphmaking rules.
@@Tugela60 it is creative graphmaking. you showing 3 or 4 products 1 of them being a freaking line on top of other product
@@Tugela60 You kinda missed the point of the comment and probably didn't even look at the graph properly. they intentionally made the graph for the 5800X3D hard to notice as even though it is a previous gen CPU, it competes with and in some cases beats their new RaptorLake CPUs. Putting the 5800X3Ds results as a small red line above the result for the 5950X makes it confusing, because the only reason you would do it that way is because you hope people won't notice while at the same time you can say that you were brave enough to put those results in there even though they weren't flattering. So yes, it is creative graph making.
@@reappermen Even within such comparison those percentages are false when comparing to a 5800x3D, so it's more than just creative graphmaking, it's pure bad faith deceiving.
8:49 I'm using an i7 620m and encoded select Samurai Jack episodes on VirtualDub2 (custom occasional widescreen by removing black bar when possible) at 676p VP9 for my 2300x1080p phone. I waited at least 6 hours per episode but I could still watch TH-cam in 720p60. Normally it takes everything the cpu's got to play 1080p60 consistently. Not bad for a Tecra A11, but still a mediocre gaming machine, basically Wii graphics.
I appreciate that AMD is going for two markets at once and now Intel's going to do it too. Like, now they're both going to be making CPUs AND GPUs... that's awesome.
I'd say they are both going for 1.5 markets next generation. The new AMD CPU's make no sense for budget gamers, and the Intel GPUs won't be able to compete anywhere else than the budget segment.
Linus because of you I’ve built my own beast computer by me self. I’ve been through a lot and this is one accomplishment I’ll always remember
Good flipping job! Once you finally get into it, it’s very very hard to just stop tinkering. Also very hard on the wallet 😅
lol, congrats. I'm gonna assume you're young based on the slang, but that feeling i had when I got my first PC running at age 14 will always stick with me, and hopefully with you too
Yarrr!
Congrats and welcome to the glorious PCMR
Did you refer the verge pc build? That build guide is meta.
5:46 The tiny bars a good way to:
- not make the competition advantages obvious in a marketing slide
- but still make the information visible for anyone who is looking.
I would hope more companies would be that transparent in their marketing materials
Watching this on my 2012 gaming rig with trusty old i7-4790 and gtx 960 (in fact finally thinking about building a new one)
Linus: “Maybe it doesn't matter to you because you only upgrade every 5 years anyway”
I'm glad you mentioned undervolting with current energy prices rising (and space heater CPUs/ GPUs) would love to see an LTT take on energy efficient gaming/ high performing machines, maybe even in different price brackets
Welcome in da club, mate. I still have my 3770K and my 32GB of 1866 MHz DDR3 (and a RTX 3060). Still on Win 7. All games run perfectly at max settings in 1080p XD
I’m really impressed with your rig from 2012. To be able to get a 2014 CPU and 2015 GPU in 2012 really makes me feel hopeful for the commercial future of time machines. I wish you the best of luck in your endeavors to produce a potent time machine, and am looking forward to your exploits in the pre-history era.
My build might have actually been in 2012, 3770k and 660ti. Never pushed it hard even when new, but yeah, it's time for build 2.0
I Just built one for the wife a 1080p mmo build with a 5600x and a 6700 pulse 16gb of 3600 CL16 corsair ram 1TB M.2 PCIE 4.0. it can run FF14 and ESO at max setting all day UPS loaded in 3DMark 297 watt including 2 monitors. with a Time Spy score of 11294. very good bang for buck on a system you can build for under $1k US if you price shop.
Im with i5-4670k and a GTX 760, yes 760...
Considering how the performance difference in most games is no longer cpu bound, unless you are still playing at 1080p I'm glad for the competition keeping AMD and Intel trading blows but I'll keep my 5900x for a few more generations.
Well, tbh, it doesn’t really make much sense to upgrade your PC every year just cuz new CPU was released, it’s better to wait 3-4 years to actually see massive performance gains
"Considering how the performance difference in most games is no longer cpu bound" How incorrect could you be......... Spiderman, Cyberpunk2077, God of War, any upcomin UE5 game using any combination of the new tech in the engine etc etc.,........
@@lorenzovanoordt7894 isnt limited to cpu
It depends on the title. Eye candy games are GPU bound and those are usually used for benchmarks since they are very resource heavy. On the other side, games with heavy AI and lot of pathfinding (mostly strategy games and RPGs) are heavy CPU bound and even old titles eg. DwarfFortress will be CPU bound with current tier.
@@hubertnnn there are a few that continue to benefit, if you play a lot of civilization turn times decrease with faster cpu's.
It’s interesting to see Intel and AMD going in different directions with cpu design. I love it. It reminds me of laser disc vs dvd.
It scares me how quickly GPU and CPU power is advancing. Like how 5 years from now our games are gonna look arguably the same, but our hardware is going to be insanely powerful.
Bottleneck by Game 🎮 console
I guess you haven't heard of the numerous UE5 demos that make shit look life like.
@@VexxedSR We've had game engine demos like that for the past decade without it ever actually being implemented in any games.
I feel like in 5 years most gamers or games will be marketed for 4k high fps gaming but the hardware like a i5-18th gen or 6060 will easily be able to run them
@@AponTechy Looking at the Steam hardware survey it seems like most gamers don't even have rigs as powerful as current gen consoles and only a little better than last gen consoles, so if anyone is a bottleneck it's PC players.
I think for the gaming budget the intel i5 13th is a big deal
Idk if you would call the 13600k a budget gaming cpu anymore though. Lol
Its basicaly just a 12th gen i7 with slightly higher single core performance...
13400 will be amazing too.
Maybe he's talking about the rumored i5 13400(f) having 6 Pcores and 4 Ecores, being a kind of locked 12600k(f). The only thing is the non-k SKUs seem to be using AlderLake's cores, but for a budget will be nice.
@@jakobmax3299 yeah for 80$ less... and its an uplift in single and multicore...
I can see undervolting becoming more and more popular. With power price going up and 5.8ghz not really being necessary for a lot of things it makes a lot of sense. I wish we could overclock and underclock on the fly with an windows utility tho. Would be an easy "Green" move for intel if they made it easy for millions to undervolt in the os for when you are just web browsing and stuff
The new Intel top of the line cpu might actually be a decent upgrade for those stuck on Threadripper 2950x without economical upgrade. How is the PCI lanes and bifurcation support on those new intel boards?
bifurcation I have no idea, but PCIE lanes on LGA 1700 and AM5 have jumped massivelly
Bad
Most of the motherboard manufactures made their z790 announcements today. Should be able to see the specs on their respective websites.
Seems like the 7950x would be a better upgrade over a 2950x
Hehehe, the switch between e-cores and p-cores is something I have done manually for a long time. If I do a long background-task like PNG-crunch, rendering a video etc. while working on other things. I just assign that application to a number of threads, maxing those out, while leaving the rest of the performance for my primary task. It doesn't happen that often, so the manual approach works, but it is neat to see this being automated by the CPU manufacturer.
e cores truly are amazing.
Phones have been doing that with Big-Little since many years ago. Nothing new, Intel only adapted it to x86-64.
10:20 Honestly though, if I were running a 1700x I'd rather upgrade to a 5800X 3D (mostly) without needing a mobo or RAM upgrade, instead of spending money to get a new chip+mobo that's EOL anyways.
I’m looking to upgrade from my i5-6400 on a $200 CPU budget so I’m impatiently waiting to hear more 13th non-k info. I’m very interested in seeing more info on their GPU as well.
I'm glad they included UE5 in the tests. I am a game dev and I use Blender, Adobe and UE5 (+ compiling) and I also play games. Would be cool if LTT would also include some UE5 or Unity test when testing hardware.
Yeah I'd be in love with them if they'd include at least basic compiling tests when benchmarking.
If I have to set a 15 minute timer so I can let some Rust build a few more times, I might lose it
That statement from Linus was a bit odd to me: "Hit the build button and walk away for a while because the pc becomes unusable"... working in UE5 with my 3900X, I can't really confirm this. Yes it takes quite a bit of time but task prioritization means that anything else I then pull into focus will run just fine. So I just don't really see the point here that LTT and intel are trying to make
@@taku1101 He maybe has no clue bc most probably they don't use UE at LTT. But ofc, I have no issue doing something else while compiling. Having a Ryzen 9 5900X + a 3080. Thinking of jumping ship to Raptor Lake + 4090 maybe next year. The new AMD seems too darn hot tbf - and not in a good way lol.
@@GoatOfTheWoods yeah idk I'm not very exited for this generation in general.. I'll prob wait for two years or so - maybe skipping this socket entirely as I don't really need any extra performance with my 3900X and 5700XT
Running an old 7700k. So definitely waiting to see the comparisons between the 13 series and the 7000 series. Keep up the great work!
Yeaaaah I won't lie I'm running a 3770k and waiting for this gen releasing reallllly bad
@@joebarthram596on a 4790k myself and since I'm on ddr3 my plan is to completely skip the cross over nonsens with ddr4 and wait for ddr5 to be the norm. Guess I'm waiting on 14th gen.
@@joebarthram596 I went from a 3570k to a cheap 2700x in early 2020, then dropped in a 5800x, crazy difference between the 3570k and 5800x it's worth the upgrade even to 5th gen.
Though of buying 7700k but oldered used 9700k to z270 last year to get few more cores.
@@realforest I run a lot of cpu heavy strategy games so honestly I'm looking at a 5800x3d once they drop a little more... Then hopefully a 3060/3070 to replace my 1060...
I'm ok with a 3700x, i will wait to see if the 7800x3d or something along the lines is worth the full investment of CPU+mobo+mems, if not i get a 5800x3d instead with a 6900xt
Intel just tested with the highest speed memory officially supported by each CPU. 5950x is 3200, 12900K is 4800, 13900K is 5600. Anything more than these speeds is technically overclocking. You can argue about them not wanting to use higher speeds, but it's not entirely without logic.
I'm pretty sure, and when I say pretty sure I mean certain you can use faster ram with the 5950x, not that it would make too huge a difference but if you're gonna compare the two at least run them both on the same ram, they have the ability
@@nesyboi9421 "System Memory Specification
Up to 3200MHz"
@@Sinaeb bah, that's just a suggestion
@@alloftheabove8522 It is also just a suggestion to run the 13900 non k at 65W maximum then
@@alloftheabove8522 Not just a suggestion I have systems in which running 4 dimms of memory prevents running beyond speeds of 3200mhz because the memory controller can't handle it. If they could run every CPU beyond 3200mhz they would have. There is instability beyond that point with certain CPUs and that's why they list it there.
The competition is insane, I'm impressed with both sides. Hope the GPU market starts to do this as well.
Hopefully AMD 7000 series and Intel ARC gpu’s force nvidia to have a reality check
AMD needs to flip off nvidia and give their 7000 series gpu's cheaper than Nvidia's 40 series
You must be too young to remember the '90s. Things were ridiculous, by the mid to late '90s there were something like 5 manufacturers of chips in the Windows space and additional ones handling just graphical acceleration. It was nuts, every time you'd turn around there'd seemingly be a product that made the one from a couple months earlier look like crap and most of the time it wasn't just marketing bluster either. Technology was advancing just that fast.
Obviously, it wasn't just the additional companies, the performance gains and need for them were also much more substantial
@@SmallSpoonBrigade Why must they be too young to remember the 90’s? This is happening now in the 2020’s sir 🗿
@@Toma-621 Nah not to this extent, the battle between Intel and AMD since the 286 Design up to AMD 64 was epic. The jumps are much more narrow today due to physical limitations.
Efficiency is what I want to start coming to the forefront. I have a 3080ti & 11850k and don't plan to upgrade this generation, or possibly next. When I see some efficient CPU/GPUs that can keep my room a bit cooler while still being powerful then I'll pull the trigger!
Excited to see how these releases compare, dubious marketing practices aside. I've been stuck on 6 year old hardware and let me tell ya, nothing gets your blood pressure going like trying to edit 4k videos with that. It's about time for an upgrade, once I have saved enough of course :D
So fun to upgrade after being patient for so long :)
Out here on a 9 year old Xeon I feel this. I am just holding out for as long as possible as while it is old, my xeon still has 8c/16t and is OC'd to nearly match a 9900k. I am really looking forward to the performance per watt improvement the most though.
@@DanKaschel Yes, I just upgraded to a Ryzen 5 based system after having the previous computer for like 7 years and it's pretty amazing. It was unthinkable keeping a computer that long when I was a teen, but technology of the '90s was such that a computer wouldn't even be usable that many years. The patches would stop and it probably wouldn't even run new software.
You ever try working off of proxy files?
@@harryyoseminy7623 Oh yeah, and intermediate files too. Canon R6 footage is unplayable from camera but works well transcoded to ProRes even at full resolution, but sometimes it would be nice to edit straight off the camera and not have to deal with transcoding. After Effects also tends to bog down with heavier sequences and the only solution there is more horsepower.
I am freaking loving these numbers! Competition is pushing the technology forward like never before, this is god damn amazing.
Yes, I just wonder what things would be like if Cyrix and PowerVR had managed to hang around until now. They had some pretty impressive tech going on 20 years ago, it would be amazing to have had them stick around to build on that.
09:35 That's actually not that hard to believe, with eight extra small cores. The 7950X at a 65W TDP is still faster than the 12900K.
Aright that multitasking mega tasking tech might just be the reason I'll get a new laptop with a 13th gen CPU. I mean something like that is essential when doing something as simple as livestreaming, I've had an insane amount of issues in the past while livestreaming that I never had in other scenarios.
Just curious what CPU had that issue. My 3700x handles live streaming just fine while gaming, doing compiles, encoding, etc. Did you have an older CPU or just a crunchier workload?
The four bar graph with the Linus face made me cry 😂😂😂😂
ME too, I was like, "oh so this is why 5800x3d is a line not a graph." LOL
Linus missed something @ 10:20. If you're running a 1700X and at least a decent B350 or X370 board, there's a good chance the cheapest short-term upgrade would be to drop in a 5800X3D without swapping anything else besides a BIOS update...
5 years? lol, I am still rocking a 6600K. I did decide now is the time and here is the place and I have a 7950X on the way as we speak. Pretty excited to finally be upgrading. My current setup is getting pretty long in the tooth.
tell us if u feel you should have upgraded sooner or it wasn't that big of a deal
I don't think you should be uprading now as raptor lake is just around the corner and the zen 4 3d chips are the real deal on the amd side.
@@nayan.punekar 3D chips are massively overrated. 5800x3D was only good for gaming, and only in CPU bound scenarios. In nearly everything else it was beat by the much cheaper 12600K. In GPU bound scenarios for gaming the 5800x3D is really no better than the much cheaper 5600x.
@@MistyKathrine It depends on the game, for most games the 3D cache is irrelevant and the lower clocks result in lower performance than the 5800x, but in some games (like Factorio, MS Flight Sim, and Assetto Corsa) the 3D cache results in noticeable performance uplifts even at higher resolutions, and even compared to the 12900k and 7950x.
Please note that the $590 pricing is not what customers are going to pay for. That's for every 1000 units sold to retailers. Right now, Newegg has the 13900k for pre-order at $659.99.
Just upgraded from my 6700k and got a 12400f I’m really impressed with it. It’s no 12700k but 50% more performance means I barely see my cpu above 25% utilization. I also see my GPU hit 99% utilization all the time and have butter smooth frame rate. Only down side is now I have coil whine to deal with and have to enable sync for certain games or else I’ll hit 4000 fps in the menus and listen to my 1070 beg to be put out of its misery.
I love this. A CPU Generation where both Intel AND AMD have compelling choices that I can make a use case for. I'm cautiously optimistic that no matter which brand you choose, it'll be a great choice (Gotta see that testing from all the trusted channels!)
Intel's support for DDR4 will make it alot easier for people to upgrade and I am happy about it. DDR4 is more than enough for most of us and they are dirt cheap at the moment.
True. The prices of DDR5 in my country are tear-inducing. lol
100%. AMD is going to seriously struggle in sales because of their DDR5 exclusivity (unless DDR5 prices MAJORLY fall).
Well its even harder to choose now. Do you stick to your old good DDR4 or do you go for promise of long socket support? I stopped considering Intel even a option since Ryzen came out with soldered IHS. Still kinda butthurt they made us delid their expensive CPUs. Price drop of DDR5 will be game changer.
especially for gaming. DDR4 will still be widely used for at least 2 more cpu generation change. we might even see a Mobo with both DDR4 and DDR5 slots. like back in the old DDR2/DDR3 days.
yeah being able to drop in my current kit of ram would save a signficant amount of money and with first gen ryzen I am considering an upgrade. However our exchange rate is awful right now so this generation is even more expensive than it looks.
Look at the nzd to usd graph truly tragic
We're back to a good old fashioned early 2000's CPU arms race and I'm all for it
It seems we’re finally entering the real CPUs war.
Let’s hope the same happens on the GPU front.
AMD taking on 2 companies?
The battles shall be a glorious bloodshed of silicon!
Amd needs evga and let them loose with their creativity like the old days but I know evga says no for now. I hope
@@Pub4si Well, intel wise,AMD take the lead on Server/workstation,but still competing on Gaming
Nvidia took over workstation/server ,but still fighting AMD from gaming
@@Pub4si intel will join GPU market
I’m just happy to see Intel and AMD fighting to the death again. It must be an exciting time in the tech industry. Love your work LMG!
Agreed 😂
Let's hope they don't actually fight to the death, we don't want another monopoly situation on our hands again with very little advancement over the next 10 years.
That makes me wonder how the 12900ks was the prev gen 'gaming king' while in a lot of reviews from different channels the 5800x3d was shown as a relevant competitor for this new generation...
@@denyereduardopuentes4410 12900KS has slightly higher performance and TDP, doesn't it?
@@p_serdiuk at launch vs the 5800x3d, yes. But check this graphs and the ones from the ryzen 7000 review and you'll ser the 5800 one on one with the 12900.
@9:12 If there's one thing that would make me hesitate upgrading any time soon (with $$ not being a concern), it's the high power consumption (TDP rating) of the new and upcoming CPUs and GPUs.
In the "Mega Tasking" section, I think they are overlooking the fact that people prefer doing their builds/renders on all cores to finish it as soon as possible. I have never in my career in software engineering, seen people use less cores for their compilation/build just so they can do something else on their computer
It's still 24 cores, and if you don't want to reduce the load so you can do something else...you don't have to, it's user optional. Which is the point, for you you could let it go all in.
The biggest thing that both AMD and Intel left out was the question of why you'd even bother upgrading to the newest gen if you're only gaming? As many reviewers pointed out, it's difficult to get a CPU bottleneck for the high-end, and it's often at a framerate that doesn't even matter. Why buy this gen when you can pick up a 12900k or a 5800x3D for a much cheaper price?
I think if you're making the DDR5 jump, cry once buy once. If you're staying DDR4, unless the prices are comparable, a 12700 for gaming is gonna get it done.
Yea 5800x3d is crazy
@@spaceli0n I upgraded from an 1800x to the 5800x3d, probably the biggest jump I've made since i switched from the old family pentium system to an fx 8350,
I really like that sponsor spot! Very quick, it was over before i even noticed! Very good!
the 13400f is gonna be a beast. im helping a friend plan a build and i think ill suggest he go with 13400f.
the 400 series have always been, people slept on them hard.
i just hope it comes out soon, planning on getting one for myself and hopefully an amd gpu
13400F is rebadged Alder lake. You need to buy 13600k minimum to get RPL.
@@kognak6640 sure but its supposed to get 4 Ecores, and if its $150-$200 then its great bang for buck.
@@kognak6640 that's not true, 13400 will have more cache than 12600k
I've been on a Ryzen 1600 since release, more or less. It has served me well, but I've been looking at upgrades... and neither Ryzen 7000 nor Intel 13th gen look promising enough to warrant such a massive hit to my wallet, considering I could just slot a R5 5600 into my existing mobo, a Ryzen 7000 would require a whole set of expensive motherboard and DDR5 RAM, and in the best case scenario for a 13th gen intel set I'd need a mid-tier motherboard (Which is still not cheap) and a CPU. I think whether it's worth it as an upgrade is very case by case basis, because anything Ryzen 1000 or 2000 could get a big boost from 5000 for far less money, and anything newer is modern enough that it doesn't warrant an upgrade unless you were also jumping product classes (like from an i3 10100 to a 13th gen i5 or i7)
5800x3d?
@@VividFlash A far better performer than the 5600, but also more expensive and it needs a decent cooler so additional expense. At that kind of budget I'd look at 13th gen for longevity, if the i5 13th gen turns out to be a solid value once LTT reviews it
5600 would be a pretty good upgrade for you on a budget.
I mean if you're so strapped for cash then why not get an even older CPU? Unless you actually need a better CPU I think even lower end modern CPUs are more than enough.
@@Katoptrys if you want longevity, amd is the choice right?
Intel change socket every 2 gen right. So based on that, 14th gen probably next year will require new socket that requires new mobo.
While am5 will be supported until 2025+
I'm really looking forward to your testing. I haven't upgraded since the GTX1080 was brand new, and now that I'm out of uni and earning money, I am probably gonna upgrade this year, hopefully to something that might do me as long as my current pc has done.
Your GPU will be fine for majority of games. So no need to switch yet
i would buy a 7800xt or 7700xt when they come out
With the RAM speeds, could it be that Intel only picked the default speeds (at least for 12 and 13 gen)? As far as I know intel bumped the default speed from 4800 to 5600mhz on their new cpu.
Well, LTT in thier AMD Ryzen 7000 series review used 5200 MT/s RAM instead of the 6000 MT/s as the sweet spot. A couple other reviews did that too as they follow the use of the official JEDEC speeds supported not enabling AMD EXPO or XMP. So the 12th and 13th gen testing using the slower RAM seems fair but the Ryzen testing doesn't so far.
the Ram speeds used are the official NON-OC supported speeds, reviewers seem to ignore this and act like drama queens when marketing benchmarks aren't running overclocked memory.
Would love to see a series of videos with the goal of maximum performance per watt for different price points.
Damn, that stop trick in the intro was smooth, well done.
7:24 BABE WAKE UP HE SAID THE FUNNY WORD
I am not crazy! I know they swapped that memory! I knew it was 4000mhz, as if I could ever make such a mistake!
@@e-cap1239 The spec graphs! Are you telling me that performance just happens to fall like that? No! He orchestrated it! Intel!
I really thought intel was making a comeback with last years release. Hopefully intel keeps it up and 13th gen can be just as massive of an improvement
8:46 I don't know about all that, it may not be as intense as their demo, but personally? For my uses? My Ryzen 7 5700X can transcode an entire queue of HD videos while playing the Master Chief Collection with friends online while chatting on Discord and recording screen capture, which personally feels like having the power of NASA or greater in my own PC, I've never had a computer so powerful in my life so far, this is the best experience I've ever had thus far. I love seeing all this new tech and watching the competition between Intel and AMD as though it was some kind of sport with a smile on my face, but having come in when I did at the point I did, I'm very happy with what I have at the moment and feel no real need to upgrade anything at the moment, so I'll just sit back and enjoy the show, cheering for no side in particular, just eager to see what either "team" can bring to the table. Now is a very exciting time for tech, I'm glad to see this sort of thing come back around and be in it this time.
It really is nice to have the choices we have right now. Yes, prices have gone a bit crazy, but I have a suspicion that we'll see earlier price cuts this gen, or at least better bundle offers than usual.
The 3200mhz, 4800mhz, and 5600mhz are the highest speeds for their respective platforms without it being an overclock. They went with the numbers printed on the box, that's why they chose those ones.
That seems fair to me. Don't know why Linus got his knickers in a knot over it.
@@Bayonet1809 yes its fair, still a little misleading to say the least.
@@Bayonet1809 they all should have used 3200Mhz ram since they all support it, we aren't comparing ram speed, we are comparing the CPUs
@@juhotuho10 But the IMC is part of the CPU, so should it not be compared also?
@@Bayonet1809 because Intel reliably makes whatever choice will cast their products in a favorable light. In this case z defensible or not, their benchmarks were rendered largely irrelevant except against its own previous generation chips.
Intel is finally starting to actually deliver without just Pentium 4 Extreme edition deliver like last gen. Will wait to see what the number looks like an will instead purchase a 5800x3D because performance right now, but in another five years might be back on Intel, if they can stop pushing more power draw. Competition might back in full swing for the first time since 2017.
The new intel and AMD launches have done a good job convincing me i should get a 5800X3D
I should think twice if i was you..
@@groenevinger3893 well i already have an Am4 motherboard good ram and a decent GPU. Im just saying if the 5800X3D can trade blows with zen4 and 13th gen its a cheap but effective upgrade compared to a whole new system
@@DSSteve I that case, that would be a good option!
Wasn’t expecting an LTT video to drop
Seems like a decent generational leap, personally I'm glad that it didn't blow previous gen out of the water like 12th did to 11th gen. Makes me feel better about buying my 12900k.
Those E cores are really starting to pull their weight. This generation might finally see my 8700K retire. I would also love to see a 0P+16E chip. Something like that would be perfect for a nice little render box provided it has enough PCIE lanes for the GPU and 5 gigabit LAN.
@@DigitalJedi Yeah, I am retiring my 9700K. There are some sales on DDR5 right now that are making it an opportune time to upgrade.
@@LifeOfToyz I just bought in myself. I have a 13700K on order and 32GB of DDR5 with a Z690 board in my hands. Just waiting on my A770 16GB and PSU.
I just packed up the 8700K system for a friend to use. He's dropping in his own GPU, a 3060ti, so the sale of one very tired 1660ti has funded the build a little.
Competition that's what it's all about. I cannot wait for 13gen reviews.
Thanks for remaking the 5800x3D graph !
1:40 we needed a second for a laugh there, I'm glad I wasn't the only one
The Ryzen 9 7950X3D cache is rumoured to not be released until CES 2023 (Jan 5th-8th 2023). Having seen the HUGE boost in performance that the 3D cache gave to the 5800X3D, to the point where that last generation chip holds its own against the just released 7950X and the not yet released intel 13th gen flag ship, there is no way I would get a CPU before the 3D cache version, so I am praying the Ryzen 9 7950X3D is released sooner, just after intel 13th gen, so they crush it, and I don't have to wait even more months before building a brand new from scratch rig. Fingers crossed AMD have the sense to release the Ryzen 9 7950X3D straight after intel 13th gen so that AMD can corner the market before the Christmas purchasing rush!
In my opinion it's not worth the extra money to go all in for a complete platform upgrade to AMD given the AMD 7000 vs Intel 13th gen (where no upgrade needed) are too close in performance and Intel is CHEAPER, so the temptation is just upgrade to intel 13th gen now and leave it a few years. BUT, if AMD pull out the Ryzen 9 7950X3D cache BEFORE black Friday this year (so I can buy EVERYTHING in the sale), then I'm ready to go all in, full AMD because the performance of the Ryzen 9 7950X3D cache will leave everything else miles behind!
lets go, I am so excited to see Intel make another massive performance jump, and if the efficiency is as they claim I'll no longer have to worry about thermals on my custom loop!
For my 13900k and 14900k I have done the following: MCE off, PL1 and PL2 limit to 225, limit P-core boost to 5.5 GHz and E-core boost to 4.3GHz, and use balanced power profile in Windows (although I do disable core parking to keep system highly responsive). Oh and just XMP on the RAM. I didn’t change LLC value. I have set voltage offset at a modest -0.010v. I have disabled the C6&C7 C states and EIST. Lastly I have locked AVX at 0 offset. I have tested on P95, CB R23 and CB R15. All great and in a mid 20 degree room, no workload exceeds 80c on package or cores just using an NH-D15 with an airflow case. Very happy and benchmarks are very close to where they were before taming these beasts.
Update: I have now undervolted to -0.015v and set the Core limit to 300 Amps.
impressed by the pricing, in the current economy especially.
I still use an 8700k. Thing is easy to overclock, takes abuse and doesn't die, and has handled all the virtualization I needed when I needed to run multiple VMs in school. Barely have to do anything for it to hit 5Ghz. I'll have to replace it some time, but I love the thing.
I'm still on my Ryzen 7 2700x. I plan on upgrading to a 5700X3D when the price goes down for future proofing.
I literally ordered exactly the upgrades he's talking about at 10:15 yesterday. I'm doing 6700k->13700K, going with z690, keeping my cooler by adding a bracket, and will try to keep my current ram too. Great minds.
It's almost stupid how much bang for the buck you got with the 6700K. I still use it, but it lives on as a server when I moved to 5600X last year.
Great times. Insane power available for practically anyone. Sweet competition.
That was a very good video Linus. Brought up some points most reviewers wouldn't talk about and that got me thinking. Can't wait until the full review and comparison to ryzen 7000 comes out
I'm running a 6600K with 3600Mhz DDR4 and a 3070 and I've been wondering for a while how and when to modernize. The idea of a 13th gen chip with a new board, but keeping my 700watt PSU, RAM and AOI is really tempting.
I mean, I don't really think that it's fair to expect that Intel would've had access to AMD's newest Ryzen lineup the day the embargo was lifted and had a chance to test it against Raptor Lake... I don't really think that AMD is going to send Intel some QS samples so they can test against their latest gen against AMD's latest...
1:33 I had to pause to laugh, but when I was done Linus had to do the same exact thing
what happened here bro. I did not know about it.
I was looking for a comment like yours. I have a feeling I know exactly which writer wrote that line 😂
What is that image
eat hot chip
@@ANURAGMANDAL20 old meme
Im still waiting for my ltt screwdriver , speed up guys ! Greetings from Germany
My last Intel CPU was the legendary 2700k and switched to Ryzen CPU since then. I'm currently using a Ryzen 5950X CPU and won't be upgrading for a while. I'm looking forward more to what AMD can offer on the GPU side of things to compete with NVIDIA.
Intel's choice of memory for its comparison builds are the correct ones, as far as they are the "maximum" JEDEC speeds for each platform without XMP/AMP. DDR5 4800Mhz 12th gen, DDR5 5600Mhz13th gen and 3200Mhz for Ryzen 5950x. So, Intel is playing fair in that respect. As for the 5800x3D (at least they included it) I agree they should have made the graph clearer.
Plus Intel would not have had access to the new AMD 7000 series CPUs for comparison, nor would AMD have had 13th gen to compare their CPUs with.
1:19 yes it’s absent they never mentioned your sponsor or you for that matter!!!!
13th gen is looking quite promising so far. good to see competition ramping up hard now
arent most triple A games gpu bound while most competitive games, like csgo are cpu bound? bit of an unfair comparison for a cpu manufacturer right?
This video has so much potential. It was just seconds away from being 13:37 long.
I'm certainly looking forward to a potential 7800x3D
it will be a beast
@@juhotuho10 How can it not be? 5nm IPC gains, clock speed gains, DDR5 on top of the Vcache it's going to be a monster.
For gamers like me I'm waiting for AMDs new x3d chips. I think it's amazing the 5800x3d is still competitive in Intels own graphs for the 13 series so it stands to reason that the new x3d chips amd will be releasing will be that much more OP.
If amd chips are this expensive
Get ready to pay more for x3d
@@directlinkrexx4409 we'll see I suppose. I wouldn't doubt AMD is prepared to drop prices if sales are bad. The x3d chips though were always going to be expensive, I mean the 5800x3d was 449 at launch so I can't imagine it being much more expensive than that for a similar chip.
@@directlinkrexx4409 Intel's pricing is so much better than AMD's for Ryzen 7000 that AMD will be forced to lower prices by the time 3D cache 7000 series chips are released.
Thanks for timestamps so I can skip ads.
I still think I'm leaning AMD, not ready to jump on the E-core train yet, but I am SUPER happy that both teams have really great performing chips, with any luck, Intel's chips will be about as good as they are saying, and AMD may have to cut their prices a little to stay competetive.
Ohhh noooo not lower prices! 😏
I'm honestly probably gonna wind up switching to AMD for my next tower. I figure that by the time my current one isn't good for much more than being a home server, the "who's on top" cycle will *probably* be back to having AMD as the overlords of price to performance.
Plus, I'm probably never going Intel for a laptop ever again... I had two Intel laptops during the era where they were the only ones in the laptop game and *MAN* they got complacent on the laptop market because the first one turned into a potato when faced with Windows 10 and the second one overheated constantly.
And now I have a Ryzen laptop that makes butter look like sandpaper.
We finally came to a point when we have enough cores to shut hyper-threading off and have no more productivity performance drops because stuff ran on weak hyper-threads(25% performance of physical core). And Intel comes out with their e-cores, which are 25% performance of p-cores and are basically hyper-threading with extra steps, which you can't shut off and should entrust operating system to utilize efficiently(which will never happen, since the system is not a Nostradamus and can't predict the thread will be a heavy workload).
@@grumpy_cat1337 Look, I don't really want E-Cores for how I use my system, but they are clearly of value at times. Also neither the E-Cores or Virtual cores of Hyper threading operate at 25% the performance of a "real" core, maybe in the early days, but some workloads that truly benefit from parralellization are faster with hyper-threading running than not. Why turn it off? (outside of some VERY specific workloads/ scenarios)