Linus: A "mere" 28 cores.
Me: Looks in disgust at my Celeron dual core
XD
#SingleCorePowerHere
I just bought a 2200g. I was so proud man.
Lol same
:( Intel Atom dual core....
Linus: waiting for AMD to drop it's new 64-core CPU.
Us: waiting for Linus to drop AMD's new 64-core CPU.
Heheheh
@@catriona_drummond well we almost had "Rome is burning" due to him not using the proper fan header. so very close but it was not to be
hahaa i get it
@Montevsgaming It's cool, maybe Macron will buy him a new one instead of preventing homelessness
Linus uses wrong fan header.
Emperor Nero: I had great expectations of you!
The fact that you represented the cores with 6 rows of 10 and an extra row of 4 instead of an 8x8 square is really upsetting
Fuck yeah I’m triggered too!
@@MK73DS A 2.828x2.828x2.828x2.828 tesseract.
yeah i was even counting how many rows
1k like!
Motherboard: Do I even have a job now?
AMD: Sure... just cool my ass.
@ramiyan22 OK boomerrr
@Xenon the Noble Gas chill out sensitive guy
@Xenon the Noble Gas man's not cool
Motherboards don’t cool cpus
@@naveeniscool but a cooler is nothing without a motherboard
This Central Processing Unit is worth being called a Central Processing Unit.
@Katoshnik it isnt a cpu then, its a CPQU (central processing quadruple unit)
MPU (main processing unit).
An absolute unit.
unit.
UCPK-ULTRA Central processing KING
When it has got more cores than you get fps in most games
I own a gt 730 GPU, so I can relate.
Crying while playing pubg at 25 fps with gfx tool 😭
Dilpreet Singh Play Pubg Lite. It runs fine on i3 6006u with iGPU and 4GB single channel RAM
Yup. 4th gen i5, gt 750m, 8 GB ram and a 60 hz monitor.
Overwatch isnt doing to well
Hyper-threading will beat the 120 FPS too
Blender: Cosmic Laundromat render at 6k samples on Cycles.
AMD: I can take two of you, at the same time and not even heat up.
Stereoscopic rendering! 😁
That's what she said.
And to think, the new 3990x is even faster.
Higher clock
Yeah ghz doesn’t equal performance
SuperiorMusicProduction kind of higher clock is most of the times beter performance
@@adrenalize25 You do know that 3990x also has 64 cores?
EPYC is made for continuous load.
This takes up a whole inventory slot in Minecraft!
AYYYYY
A full stack of cores
You can have one core dedicated to each inventory slot in Minecraft.
@@MKhrome in bedrock, maybe
How should it take up LESS than a slot?
The editors throwing in visual memes is appreciated af.
Stupid meme trend for zombie generation
@@lakasngamatzko4523 shut up boomer
@@ghoulbuster1
Wrong, not a Boomer.
I just don't blindly follow retard trends.
Window Licker haha wtf chill.
Window Licker shut ur mouth boomer
I remember when people used to make fun of the "Ryzen" and "Epyc" names.
Now they're "Ryzen" "Epyc"ally above Intel 😂
the names are still pretty stupid
@Führer des Benutzers I am fine with Intel not trying to sound cool. If you try to sound cool you better succeed or else it is just silly. Intel's branding is safe, it is not going to rock anyone's world but it goes with their brand image of efficiency and technical acumen. AMD's branding (coming from an AMD user) makes it seem like something some 13 year old would come up with.
(edited for typos)
@Führer des Benutzers "GeForce" is either Gaming Experience Force or Graphical Force.
epyc = epic
Linus: rendering in seconds
Me: Takes 10 Secs to open task manager
Ctrl + Shift + Esc
Done!
@@milutin.mp4 no, when I open it, it gets stuck and shows 92% CPU usage for 10 seconds
@@ajayrandev4159 92%? Lucky...
@@milutin.mp4 yeah, my dual core overloads when I open Task Manager
@@milutin.mp4 Where are you, Milutin? There are Serbs here too
Intel: Am I a joke to you?
AMD: Yes.
Always
Eos performance in general. Intel is still a better choice strictly for gaming.
@@springbok4015 Imagine buying a computer to just play games. lol
Fuck you fanboys
YOU: Am I using a meme from 2010 that isn't that funny anymore?
ME: Yes.
That 2% of that CPU is my entire laptop's performance
Mine 0.02 %
@@PrdPak 1 core pentium with best oc lel
Idries IE when my 2003 Windows XP laptop with Pentium IV is WAY faster than my 2013 Windows 8.0 laptop...
Well to be fair it's a $7000 CPU. That's more than the price of many laptops out there (10x the price or more for some).
Edit: Looks like it's $5400 for the CPU + Motherboard on Newegg.
@@nitroraptor5316 If you haven't already done so, try replacing your drive with an SSD.
I am really bothered by the fact that when you show the little Rome guys, you didn't do an 8 x 8 square. 64 = 8 x 8.
indeed i am bothered too.
I think they were trying to form a flag
That bugged the hell out of me
Well, he hires people who lack common sense, and really aren't that skilled at anything but sitting in front of a screen playing games.
OCD?
Intel: 'So there is this bug in our processors that means our chips will now run 3-7% slower'
_AMD standing behind them with a crowbar labelled new CPU line..._
AMD has entered the fight!
AMD uses Rusty Pipe!
It's super effective!
It uses 12% cpu when running apex... My laptop uses 100% cpu on the home screen. Not joking it does
Actually it's due to OS problem
Fucking windows 10😂😂😂😂🤣🤣🤣🤣🤣
@@RichardGrim the anti malware is malfunctioning 😂😂😂
@@RichardGrim I have Linux.. Lol... 😂😂😂
@@RichardGrim :-D
Wait. What? Did they just... run Crysis without a GPU?
Yeah they did! :D
Integrated graphics...
yep, exactly this... this thing is a true behemoth of a CPU and Intel will seriously have to step up their game to counter this monster.
When you have more RAM than i have storage in my SSD.
When the CPU has more cache than my DDR sticks
Same
I use hdd
This CPU has more threads than my laptop has CUDA cores
shutup gay furry B)
They should call a group of 64 cores a “Stack CPU”
A stack of cores
A stack of cpu
*Minecraft intensifies*
*Minecraft intensifies*
The funny.
CPU: i have 64 cores
RAM: i have 256 gigs of ram
Motherboard: wtf is going on
CPU: I can play games on my own.
GPU: bollocks, prove it.
CPU: hold my beer......
Stephen Maley bUt CAn YoU pLaY cRySis?
yea, youre right, 256 is a small amount(compared to the usual amount data centers put into a system with epyc)
Mac Pro with 1.5 terabytes of RAM: am I a joke to you?
19" telescope monitor.
- what? OH fu..k !! I getting rape by those hobos...
I believe AMD has discovered a crashed UFO and is currently deploying alien technology to the public.
lol
now all we need is intel to find some aliens
They've taken an alien into custody and are torturing it to milk out all of the information on their next generation alien technology.
Intel eat my shit haha.....AMD on the f**king rise only after 10 yrs...
@marian roberto Sadly, there is a small portion of buyers who don't do any research. I know a guy who bought an i5-8600K during the Intel shortage with his tax refund.
3 years ago: A $40,000 Quad CPU Computer - HOLY $H!T Ep. 10
- 4 CPUs
- Can support 2 TBs of memory
- 64 cores in total
Today: 64 Core EPYC CPU - HOLY $H!T
- One CPU
- Can support 4 TBs of memory
- 64 cores
Crazy, right?
Yeah and cheaper too
Moore's law in action
You can still do dual socket and get the juicy 128C
Does Linus smoke, or has he always yelled so much that it literally damaged his vocal cords?
Moore's Law.
2100: AMD releases its new 64k core processor
Intel
@@At0mHeart
Intel: starts developing iGPU APUs, seems like there's a place they can still make a profit
@@AdminTechnopedia yep, but AMD is too far ahead in that race. I would like to see what Intel can really do if they try, though
2100: intel has finally developed the ground breaking 14nm+++++++++++++++++++++++++++++++++ CPU. It has two more ++ than last generation.
2100: Intel announces they're moving to 7nm 🤣
I just ripped off my intel sticker out of embarrassment.
i put mine on my old laptop to make it feel good
u probably play fortnite loser
Good work my friend now go and buy a Ryzen 3800x and GetUSumOfThatCrazyValue xd
look another mobile gamer
Fr18Nietzsche44ed pretty much all Apple laptops use Intel processors. It's an issue because Apple doesn't make cool-running (as in air cooling) designs and Intel makes hot CPUs, which is why I suspect Apple laptops will switch to ARM, like the iPad, within the next couple of years.
Does it run Crysis..CPU only. That's some flex.
YAS!
Weird flex but ok
Amd 64 core: Not bad Apex, you made me use 14% of my true power.
They should have Named it Beerus the Cinebench destroyer!
This is not even my final form.
shaggy's CPU
Zen 3 should be called ultra instinct goku.
“You might have noticed we are only running 4 RAM sticks...” Me to my single 8GB RAM stick: He didn't mean that...
i have a single 4 gig stick
everyone needs a friend. get your poor single stick a friend will you :(
i have a 2 gig stick of ddr3 at 1666mhz
@@mustlovecats442 woah really!!! im running 1 gig at 1333 mhz
I want to speak to whoever thought illustrating 64 cores in a 10x6 matrix would be a good idea
5:02 When Windows Task Manager has better aesthetics than a video editor...
Tell them to use 8x8 next time.
They just don’t math
For widescreens, ya it’s better than 8x8
why not use 72 cores with intel
Intel® Xeon Phi™ Processor 7290F
16GB, 1.50 GHz, 72 core
"we have more plans for this CPU."
Me: just don't drop it this time.
Underrated comment
dedicated wamm
Just never hand him the soap
*Had* Plans for this CPU
@@OhSoTiredMan ooooof
It’s like AMD found a spaceship and just copied their electronics
AMD has bought some stuff from Area 51
They co-operated with elon musk and made it even cheaper
A spaceship built by intel ! :D
@@austingreen6042 stfu fanboi
didn't they do this already in one of the Transformers movies?
AMD took the Area 51 meme seriously. They actually went in with the engineers and managed to escape with the technology
Problem Linus: "It's just using half of them!"
Solution Linus: "So we run TWO Blender instances"
hahahaha
Blender: Holy Crap!
"Gee linus.. how come Blender allows you to run TWO blenders?"
Big brain play
Tbh we do this in the industry and composite images together to achieve faster renders using multiple systems.
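For anyone curious what that looks like in practice, here's a rough Python sketch of the same trick Linus used: splitting an animation across two background Blender instances so every core stays busy, then assembling the frames afterwards. The file name and frame count are made up for illustration; real studio pipelines are far more elaborate.

```python
# Rough sketch: split a render across N background Blender instances so every
# core stays busy, then assemble/composite the finished frames afterwards.
# "project.blend" and the frame count are placeholders.
import subprocess

BLEND_FILE = "project.blend"
TOTAL_FRAMES = 240
INSTANCES = 2  # e.g. two instances on one 64-core EPYC, or two machines

chunk = TOTAL_FRAMES // INSTANCES
procs = []
for i in range(INSTANCES):
    start = i * chunk + 1
    end = TOTAL_FRAMES if i == INSTANCES - 1 else (i + 1) * chunk
    # -b: run headless, -s/-e: frame range, -a: render the animation
    procs.append(subprocess.Popen(
        ["blender", "-b", BLEND_FILE, "-s", str(start), "-e", str(end), "-a"]
    ))

for p in procs:
    p.wait()  # numbered frames land in the scene's output path, ready to assemble
```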
8 gamers 1 cpu again. They each get an 8 core 16 thread processor
Still 4x more than I have
@@fb5601 F
64 gamers 1 cpu 😎
@@MrInsertfunnyname Playing AoE or CS
@@jerome6383 AoE or CS 1.6 = 128 gamers 1 cpu
You do know that the human eye can only see 23.97 cores, right? There's literally no difference at all
Liem Solow That's a lie; just wait until you try a 64 core monitor - you can tell the difference
@@squidwardcommunitycollege3733 🤣
"It's cinematic"
that is not true...even I can see a difference between a 28 core and a 32 core monitor
@@squidwardcommunitycollege3733 I've looked at the screenshots and... no difference 🤔
Thank you so much for the "Can It Run Crysis" bit. Much appreciated 👍
You know you have serious power when it's the CryEngine that's sweating....
@@piotrd.4850 There is yet to be a CPU that CE2 won't bottleneck on a single thread. At this pace we might just get there with Ryzen 6000/7000.
it gonna beat crysis 10 ass
My god, Crysis running in Software mode. That is HARDCORE.
Linus: Can it run Crysis?
Me: What a stupid question, Linus. We all know Crysis is single threaded.
Linus: With a CPU renderer.
Me: Ooooh shit.
Nice
@Mark Donald Basically it's like running Crysis with the CPU working as the GPU.
@Mark Donald its running completely on the cpu, not the gpu
@Mark Donald for testing all core smh
@Mark Donald Yeah, but being able to do that on a CPU is proof of how powerful it is.
AMD must have brought back some shit from area 51 for their new chips.
Lisa Su is the Naruto running grandmaster.
but they released this chip BEFORE we raided Area 51; the next gen is going to do this:
AMD (EPYC 1st Gen): Am I a joke to you?
AMD (EPYC 2nd Gen): Yes. I now have 512 cores with 1024 threads. I can now render 100 Minecraft worlds each with 200+ mods at the same time.
AMD (EPYC 1st Gen): WHY?!
AMD (EPYC 2nd Gen): BECAUSE ALIEN TECH IS AWESOME!
AMD's whole R&D department were poached from area 51.
@@JoshsTechWorkshop Google: Summit IBM Power System AC922
area 51 is famous but all cool stuff are in area 52 :p
When you realize linus' setup is better than everyone who was playing apex during that match
Combined.
Better at workstation loads, sure, but not gaming. Games rarely use more than 8 cores, and the 64 core EPYC shown here was running a paltry 3GHz.
Even within AMD, a Ryzen 3900X would wipe the floor with this 64 core processor; the Ryzen has "only" 16 cores, but when you can only use 8 of them and the Ryzen's cores are almost twice as fast, the performance of the 64 core EPYC would be below average.
@@trapical buuut he has a 2080 ti
^ yeah, that beats most people who play apex..
@@trapical 3900x has 12 cores not 16
CPU: “I am the captain now, you shall call me daddyboard from now on”
Motherboard: “ok daddy”
Wtf lol
marilyn gill yeh lmao
@@blinkcatmeowmeow8484 nice pp and name
@@blinkcatmeowmeow8484 wtf i mean pfp
marilyn gill lmfaooooo 😂
More cores than my FPS in games.
More threads than storage space.
* cries hysterically *
And my ping is much more than that cpu price
more cache than what my first pc had of RAM
Pretty sure more cores than every electronic in my entire building...
it has more cores than my iq
4TB of memory too, that's just under double my hard drive storage, which is nuts for a CPU
The 64 Intel curse is back. First the AMD 64-bit architecture, now the 64-core AMD CPU.
Fuck, I didn't even think about that. 64-bit is sometimes referred to as 'amd64' for a reason. Of *course* AMD would beat Intel to the 64 core mark too.
Amd will be first to 64 ghz
@@michaelilie1629 bruh
Michael Ilie lets get to 6.4 GHz first
@@Hexagonaldonut yeah, they still own the AMD64 instruction set. Intel IA64 is a relic now (Intel licensed the AMD64 instruction set). I remember an old PC mag I used to read back in the 2000s that said Intel Itanium would be the standard for 64-bit server architecture. Oh how wrong it turned out to be
AMD is killing it now. Intel although a solid company, charges way too much compared to AMD. I switched over to AMD's Ryzen series.
Me too, 2 weeks ago.
I'm gonna get the ryzen 9 3000x
Me too, 1 week ago
Got the 3900X and the 5700 XT, it kills games, unbelievable
9900k-$500
Ryzen 3950X-$750 and slower than the 9900k in 90% of applications
Wow what a bargain
Epyc: 64 Cores
Intel: Embarrassed
Hotel: Trivago
...I don't get it.
@@deusexaethera "Hotel: Trivago" is from an ad, but people use it as a super funny punchline for their puns
@@omniyambot9876: Oh.
201st view
Did anyone hear about amd going to 4 threads per core with ryzen 4000?
Imagine a 64 core 256 thread cpu.
AMD gonna have to pump 512 MB of L3 cache into it at that point.
@@yancgc5098 Not at all. AMD's threads, and SMT in general, are not real SMT (same with Intel's hyperthreading, which is why they didn't call it SMT; neither should AMD have). It's fake, castrated SMT and all it does is feed empty pipeline stages. IBM's CPUs have real superscalar SMT where each thread actually is 90-95% as powerful as the physical one (there is overhead so it's not 100%)
Such a cpu exists, Linus has actually reviewed it, but it's one of those cpus that is meant as a coprocessor and with TRASH frequencies
There are already CPUs like that, but they're non-x86
after that, old Ryzen 3 processors will be like 5 bucks. Here in Turkey, I bought my R5 1400 for 850TL around 2 years ago and now you can buy an R5 2600 for that price. Ryzen 3 processors are cheaper than some Pentiums.
This is really going to hurt Intel. The server market is HUGE
And im glad
It's their fault for not innovating much for the past half a decade. I hope they get screwed like Kodak so other companies will know not to slack off in the future
But the server market operates very differently than consumer PCs.... intel will still be the biggest x86 supplier for servers... AMD will probably make a dent... but not much bigger than in the Laptop market
@@tupakveli 💯 I feel like they were literally holding mankind back, since almost all major technology makes use of CPUs. Just imagine how much further along we could be if they had been more productive
@@LoFiAxolotl They've taken 10% in just a couple months, if that's what you call a dent well... ok I guess.
normal people: 64 core
*Minecraft veterans*: a stack of cores
Minecraft Java Edition players: 63 idle cores.
If Minecraft supported more than 1 core, yeah, that'd be good, but not now..............
😂😂😂
@@fauxpasiii you can run minecraft java in multi-core mode, you just have to change some jvm arguments
*ah yes, an intellectual*
2014: AMD is way hotter than Intel.
2019: Intel is way hotter than AMD.
Vice versa lol
other way round
Dude, you ARE such a nerd that you haven't noticed the innuendo in your entire life... :P
Whowhowhowho Whowho are you talking to me
Nah, it was a silly nerd joke of mine :P
"Hey how many cores should we put in this processor?"
"All of them."
- "What do you mean all of them?"
- "ALL OOFF THEEM!!!!!"
th-cam.com/video/KBKXu3Kg4yg/w-d-xo.html
Squires no you should’ve said yes not all of them
Like A Lot XD
@@djsosonut lol, I say this all the time and people must think I'm off my hinges.
... my ninjas.
AMD: so how many cores do you need?
People: yes
AMD: yes
I'm the 64th like. This is perfect.
Lokesh Lutchmeenaraidoo I have a 4-CPU setup running quad EPYC 64 cores on 2048GB of RAM. My GPU is the RTX 2080 Super. Nothing mine can't handle. Even running as a cloud server. So to answer your question more directly.... I need 256 cores... 🤪
@@mikeclevland GTX?
Yes
Mikhail Kuznetsov why not an rtx titan
Uh, Linus aren't you meant to drop the CPU on the floor?
if he did we'd be at his funeral right now
tihzho i was expecting that to happen at least once, really disappointing
Wouldn't hurt it. LGA
Found the intel Rep
I think that’s why it was already in the motherboard. He didn’t have the opportunity...
Epyc during apex*
Epyc: "I'm only using 2% of my power"
Apex: "NANI??"
AMD : this is not even my final form
Didnt laugh not funny
@@stefanniculae3811 tower of god
AMD actually raided Area 51 and found the tech for this.
Plasminium AMD is an incredible company
You got the Secret of Amd 😂👍
Actually, no. They just finished studying at Hogwarts. Expecto UberCorem.
😱😱😱
Wow, AMD has not only come back swinging but has entered the ring with a chainsword instead of boxing gloves!
Chainswords are less "epyc" than you think they are.
@@robertt9342 what would be more ' epyc'
Ya ha.. But AMD sure has a BIG glass jaw!! They sure will fall!! Just give it time!!
In other words, AMD is getting disqualified by not following regulations and intel wins by default?
"and I have in my hand"
Come on Linus, put that down before it ends on the floor..
And then he throws it around...
Again
AMD CPU: *we have 64 core and we don't need gpu*
AMD GPU: *dude we're under same company*
AMD CPU: *yeah but different department*
😂😂
I don't find it funny, just foolish and boring. As usual, mediocre people always
😂🤣😂🤣🤣😂😂🤣😂🤣🤣🤣😂😂😂😂🤣😂😂😂dude this is so fucking funni you should do stand up
Yeah if it wasnt for Navi, Radeon might be dead
George Martinez sorry that we casual people offended you, super god human, with our really casual puns and jokes. Maybe you could teach us how to be funny and tell us a joke?
@@georgemartinezza 151 people think you are trying to get their attention by acting important.
EPYC to Cinebench R15: You are already rendered.
Ah, i see youre a man of culture as well.
NANI?!
Where does that reference come from?
Fist of the north star
@@PrinzEugen39 Fist of the north star.
Google "omae wa wo shinderu" or in English "you're already dead"
I love train.
Good Minecraft joke hehehe 😉
Its a Minecraft joke which makes it also a coding joke cause minecraft for those who get it.....
Hehe i get it.....
Hahahaha, exactly
AMD: "Yes"
64 Cores? What's the point? The human eye can't even see more than 8 cores per second!
speak for yourself lumpen
Hahaha
The Verge bad build
Wait.... Ain't that the braces and brackets dude that f'd up so hard in that PC build video...
You still made me laugh my ass off
lol speak for yourself, I can easily tell the difference when I'm over 30 cores per second let alone 64. You're not a real gamer!
Linus september 28th 2019: "holy shit a 64 core cpu"
AMD: "wanna see me do it again?"
2021: 96 core 192 thread beast of a CPU released on zen 3
@@michakrzyzanowski8554 Nah, Zen 3 Threadrippers/Epycs won't have any core count increases. Zen 4 will since it's on a smaller production node. Zen 3 is just a restructuring and improvements to single thread.
@@crylune 64 is still 16 times the cores we had a few years ago
Just the TDP and the operating temperatures make my jaw drop to the floor. "EPYC win" is a severe understatement. Well done AMD
Engineering team: "How many cores do you want?"
AMD: "Yes"
Lol. Hahahsha
And to drink, meatballs
I'm sick of these memes. lol. every fuckin video on youtube
blah blah "rah rah rah"
Someone : "yes"
@@davkdavk Good, then gtfo of social media. lol
OK next upgrade coming...
one day we will mount the motherboard on the CPU :)
This is literally already happening. Search it on YouTube
@@TotalGAMIX not really, the northbridge is on the CPU, but what about the southbridge? How will the CPU communicate with the rest of the components?
@@TheGentry000 bruh I don't know but it's happening. Just YouTube "motherboards are going obsolete" and go to the 3rd video
@@TotalGAMIX yeah I've seen those videos but the southbridge still exists. Probably in the next 10 years
@@TheGentry000 it will definitely take a while, yes. But it's definitely something that's coming in the future because of stuff like implants and all that
Apex is on the Source Engine, which was released in June of 2004. Very little has changed in it since, except for additional features to keep it up to date. That's why everyone has been so hyped about Source 2. To say Apex Legends is anything worthy of a benchmark for modern hardware is like saying going 20 mph in a Lamborghini is a benchmark of its top performance.
My first thought at the "can it run Crysis" question was: "and what's the catch?"... After that I thought: "shit, this CPU renders faster than the GPU on my old notebook. I think I should gear up..."
To be fair, this Epyc processor has about as many cores as a midrange GPU these days (stuff like "CUDA cores" are not cores, they're just pipelines *within* a core). It just has fewer pipelines in those cores but more latency optimizations, like bigger caches and out-of-order execution, allowing it to better deal with branchy code but making it less effective at running embarrassingly parallel workloads like this graphics workload compared to a *real* GPU.
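To illustrate the "embarrassingly parallel" part, here's a minimal Python sketch: every pixel can be computed independently, which is why a software renderer can keep all 64 cores busy. The resolution and the shading function are placeholders for illustration, not anything from the video.

```python
# Toy illustration of an embarrassingly parallel workload: each pixel is shaded
# independently, so throughput scales roughly with core count.
from multiprocessing import Pool
import os

WIDTH, HEIGHT = 640, 360  # hypothetical framebuffer size

def shade(pixel_index):
    # Stand-in for real per-pixel work (ray casting, shading, etc.)
    x, y = pixel_index % WIDTH, pixel_index // WIDTH
    return (x * y) % 256

if __name__ == "__main__":
    with Pool(os.cpu_count()) as pool:  # one worker per logical core
        framebuffer = pool.map(shade, range(WIDTH * HEIGHT))
    print(f"shaded {len(framebuffer)} pixels across {os.cpu_count()} logical cores")
```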
Linus: Apex isn't the most demanding game on the market
My CPU: *boils water*
I'm dying
No need to turn on your baseboard heat this winter lmao
My CPU: A white hot glowing lump of molten metal and charred silicon
Just wait for Microsoft flight simulator 2020
@@wartymender1544 Microsoft Flight Simulator 2020: your fans will rev so high it will really sound like a 747 taking off !!!
Wait, did Linus play Crysis only using the CPU? That is really impressive
Yes Crysis 1 on cpu only, that's not bad at all.
software rendering is back baby
motherfuckin software mode !! holy shit man
Sell your GPUs. The future is now!
Right!!! At first i was like why is it running like that?? But then i realized its raw processing power
Linus: This is madness
AMD: This is Sparta
best comment!!
AMD: Releases 64 core cpu
Linus: Let's put it in a Mac
Lmao what?
@@F17A Macs aren't really known for high end processors...
@@XavierXonora Ironically, back in the PowerPC CPU days they actually were. en.wikipedia.org/wiki/PowerPC and en.wikipedia.org/wiki/PowerPC_970
Back when they switched to the PowerPC architecture, it was meant to be the next generation of RISC CPU, a collaboration between IBM, Motorola and Apple that was technically superior to Intel's old-fashioned CISC x86 line, and, for a while, it was! Unfortunately, IBM and Motorola were already on the road to losing their technological leadership, and in the end just couldn't compete with the behemoth that was Intel.
@Z3U5 wait for that computer to burn up your house with a trash cooling system
Rishi Bappanad i mean you have to give them credit for the 2 latest products: the 16” MBP & the Mac Pro for having solid cooling
Linus realising the $100,000 6-editors PC is already obsolete
Hahaha Truth!
Linus should part it out and sell it.
Just like any high budget computer. Making a mid-range machine will always be the choice you don't regret.
Can he double it now?!
"Got our 2080ti which is inappropriate"
*Raises hand*
I'll take it.
🤣😂🤣😅😅
I'll take gtx 750
*Ryzes**
If you know what I mean
I'll take anything... I don't even have a GPU...
@@astr0b1t Ryzes :)
No one:
Linus: *Has more RAM than storage on my ssd*
it's a server, pretty normal RAM capacity
@@michakrzyzanowski8554 you’re about 9 months too late
And right now, Intel's marketing team, engineers, and shareholders must be in a boardroom exchanging punches.
@@oooooo-vn3gx we accept that. But if someone brings out a revolutionary product for lower costs, wouldn't you be surprised? Maybe try out their stuff?
@@cuy50 fuck off. a company is designed to make money. Not serve people. That's the government, and they also need to get money too.
@@oooooo-vn3gx I agree as much with that about that profile picture.
@Disappointment change your profile picture. it's inappropriate
You're talking about their brand on an amd video so they're probably happy.
upcoming: 16 PCs, 1 CPU x'D
got enough PCIe lanes after all....
Let's see them use one of these to run the new LAN setup in the lounge they put together.
16 PCs have more RAM
Allah Maadarchod Just put in bigger sticks. This maxes out at 4000GB or 250GB for each of 16 VMs .
@@johnfrancisdoe1563 i miscalculated. Thanks.
When a CPU has enough cache to store half of your OS installer
When your CPU has enough cache to store the entire OS
Better for crypto mining for sure!
@@pixels303at-odysee9 stop. Just stop.
@@johns3655 uh.. Stop? Did I offend someone?
@@pixels303at-odysee9 Us, gamers.
Reverse engineering alien technology, surely that's how AMD socked it to Intel.
Linus: we are only about 50 degrees on the cpu
Me: Watches my cpu brew coffee
Me: Watches my cpu ignite dragon2 fron spaceX
It's quite amazing how much more efficient CPUs are when you don't push frequency and voltage to the limit.
My 2700X sips only 65W at 3.2GHz vs 140W at 3.8GHz (Cinebench R20) while still getting 83% of the score.
83% of the score with 84% of the frequency at 46% of the power... efficiency is almost doubled.
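Quick sanity check on that claim, using only the figures quoted above (the 2700X numbers are the commenter's, not fresh measurements):

```python
# Efficiency comparison from the figures quoted above (2700X, Cinebench R20).
score_ratio = 0.83        # eco run scores 83% of the stock run
power_ratio = 65 / 140    # 65 W at 3.2 GHz vs 140 W at 3.8 GHz
print(f"points-per-watt ratio: {score_ratio / power_ratio:.2f}x")  # ~1.79x, i.e. nearly doubled
```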
@@TheNewTimeNetwork my 3900x runs at 3.7 GHz with an AVX256 load at 65W power consumption. It's insane.
@@Azraleee use a laptop cpu, you can get 3.9ghz on like 30 watts (if it dont thermal throttle)
my i3 also is at only 50 degrees
But because it's crap lol
I wonder how much TNT you can explode on Minecraft with this cpu?
this is a real question
this is a question we must all know
We need an answer
I dont need sleep I need answers
Minecraft is not very multithread friendly...
"More on that fan later..."
Me: Oh boy, Linus did an oopsie
Oopsies are tight.
Honestly, I assumed when I saw that that he didn't plug in the fan at all.
At least he didn't drop it. I think that's why they sent the CPU preinstalled on the MB 😂😂
We're also just gonna ignore the fact that Linus nearly won in Apex Legends with no microphone and no headphones?
"Dear Santa,"
... a 64 core leviathan that costs $7000 is just what I need to be happy
Santa's Elves: *proceeds to raid AMD Center just to get the CPU*
@@juparkjr1285 "speaking of raid "
@@AnjaliVerma-qu8zh Before we comment more jokes, let's thank our sponsor, Raid Shadow Legends
LOL, IKR? if only...
This CPU is so *E P Y C* , it can run crysis WITHOUT GPU
the future is now
damn impressive!
Ba Dum Tiss!
damn
son, whered you find this?
Intel: Can it run Crysis?
AMD: Hold my GPU!
6:16 cows when you slaughter their whole family
lmfao
THE COW BEHIND THE SLAUGHTER
nobody:
Intel: let's make the prices higher and only .5 percent better
@stillFLiP Even now Apple is changing its business strategy
Intel needs to do something
Intel: pay more to get hyperthreading
Intel: pay more to overclock
AMD: meh, we give all this in most of our processors and they are still cheap
@@moveonvillain1080 you forgot the better coolers
@@retroree5 yeah my friend AMD stock coolers are definitely better
@@moveonvillain1080 Yeah, there is no need to purchase separate coolers
Stock coolers are perfectly fine even with a little OC
LOL. Playable, software rendered Crysis. Now that's what I call a twist.
So what now then? 1 CPU, 64 gamers?
This baby can do dual socketed designs. One core per gamer isn't really enough so maybe "2 CPUs, 64 gamers" or "1 motherboard, 64 gamers."
@El-ahrairah lmao
Linus: “noooooooooo”
Also Linus: “I levelled up”
Cracked me up 😂
im about to comment this haha
he's kinda cute x3
Apparently you can now combine 4 OF THESE INTO ONE VIRTUAL CPU. HOLY SH
I’m somehow seeing a “16 Gamers - One CPU”...
perfect for fair multiplayer,
Yes
One word, 'Stadia'
All running Crysis, naturally, in CPU render mode.
They'd have too many cores lmao
When your CPU is so powerful that you don't even need a GPU.
HAHAHA that thing is gonna make a yuge shockwave on the GRAPHICS CARD MARKET!
J- Man and who the fucks gonna spend 7 times more than a 2080ti on a cpu
@@PorWik the 1%
@@menash8313 LMAO TRUEE
@@menash8313 HAHA!!
Linus: HOLY SH!T THIS IS POWERFUL
Me: HOLY SH!T THIS IS EXPENSIVE
to be fair, he's saying it's powerful compared to Intel for the price point - it's hella expensive, but much cheaper than anything comparable you can get from Intel
Lol where is your rationality bro
it's really not, not for server chips anyway. high-end Intel solutions are priced from 5 digits up to "If you had to ask you can't afford it". Intel still does have the most cores in a single server though, since AMD is limiting themselves to 2-way SMP and intel goes up to 8. Just have to decide between one of those and a nice house.
Just wait a year and can buy one for $1000 like anything else tech related that depreciates at obscene rates
@@Alex-us2vw not really, you have to go back a few generations before server chips get cheap. Especially the apex chip for a given socket, those cost a mint for years
It feels weird to know that in 10 years this is going to be probably a mid tier amount of cores and threads to have in a cpu
Yeah, it’s insane how exponential tech development is.
In ten years we'll be watching campfires made of plastic screens.
You think? I'm not so sure about that... Hopefully, we'll see.
Not gonna be 10 years, probably 20+. 10 years ago, 4 cores was high-end and 2 was mid. Now 4 cores is mid.
Not gonna happen. We have problems utilising over 4 cores 14 years after quad-core CPUs hit the mainstream market. Using 64 will not happen for a long, long time.
AMD: *Makes 64 core for $7000*
Intel: *Makes 28 core for $10000*
Stonks
Apple: makes 6 000 $ cheesegrater
StOiNkS x5
don't forget about the server 2019 license cost if you need that sort of thing @ $3900 US for the AMD 64core
Apple: makes a $1000 monitor stand
@@KevinSatterthwaite Dont forget about linux if you need that sort of thing for a whopping $0 US
Also that's $7k retail for AMD but you need to buy a thousand units to get the intel one for $10k
"Is anybody seeing intel?"
"I don't know, but I heard someone crying in the corner of the room"
"Found him then"
Well, they certainly lost some points in desktop and server CPUs, but they still got the best mobile CPUs. Also 9900k is a beast for gaming.
@@iowawalker Yeah, but you don't need it, because the performance gain is pretty low. It's just like "ok, paying 200€ more for 10 more fps sounds like a good deal".
Don't get me wrong, it's a very good processor, but the pricing is very bad
@@iowawalker But we are still waiting for the Ryzen 3950X, so perhaps this 16 core CPU will take on the 9900k. We will see.
..what we *NEED* is for those 3900X/3950Xs to be free to actually benefit from better cooling and reach a bit higher clocks. There's almost no use for a 240/360mm AIO right now besides just dropping temps.
@@hg-sf7yx we don't need the 3950x to take on the 9900k. The 3900x does that just fine, losing by a small margin on gaming but easily winning pretty much everywhere else. So if you do anything other than games only the 3900x is the better choice. The 3950x will have a tiny bit more turbo clock though, so that might even the field in gaming. But OC a 3900x a little and it'll probably be the same.
"Bras are supportive and they are popular."
Linus 2019
@Nick F Pretty sure that's a myth.
Anyone else hear Brandon say "I personally love them"?
@@imadecoy. have you seen those native indian bra-less titties?
And Linus quote of the year goes to...
*drumroll*
"THE BRAS COMMENT"
*Internet goes wild with rapturous applause*
*Everyone nods to each other knowingly*
Love seeing dudes talk about the effects of bras vs not like they know shit lmaooo
Did anyone else look at the thumbnail when the screen color was inverted and go ahh what happened to linus
"Mere 28 cores...."
Cries in Pentium IV single core
at least it has HT- if its new
It's single thread
download more cores
If you want it, I have an FX-8350 with a board, ram and a cooler just lying around. That would be a big improvement for you.
@Hillybilly Throttling is highly motherboard dependent with the Vishera Processors. And I have one that only throttles on IBT or Prime 95.
When pixar starts rendering cgi movies in real time.
Last time toy story 1 only took a few seconds per frame.
Soon it'll be measured in frames per second instead of seconds per frame.
@mr.1n5an_e its literally going to be a miracle, what year do you think this super fast technology is gonna be released in??
@@mictrongaming9064 nah more like 4040
@@aadisahni nah more like 4200
2019:
AMD: EPYC ROME is here
Intel: Working on 14+++ nm
3019:
AMD: EPYC Destroy is here (640 cores 1280T)
Intel: Working on 14++++++++++++++++++++++++++++++++++++++++++++++++++++++++ nm
loled
3019?
God, just imagine what computers will be at that point.
They might be Gods.
Stupid but true joke lmao.. made me spit my water.
@@elijahizere these things are basically already god with how much we rely on them. As soon as general ai hits there will be no question as to who's in charge anymore lol
@@elijahizere or we will have another world war and go back to stone age
0:28 who made these graphics? In the computing world, everything can be made into a perfect square. 64 is 8 by 8, not 6 by 10 with 4 left over...