My teacher once asked me the same question, but angled it differently. He looked at the bare clock speeds of the units and said: "why do you need a graphics card when resolution X color depth etc., clearly doesn't come close to the 3.0GHz clock speed of this CPU?" What a stupid f*** question. I should've shown him this video; it's clearly a fundamental difference. Too bad this was 6 years ago.
+GamingTurkey Yes but you have to buy the VRAM, and the only place you can get it is that dark alley downtown. Just look for the scary guy and be sure to bring lots of cash.
I use GPUs for computational fluid dynamics (CFD). One Nvidia A100 (250W) is as fast as ~7500 Xeon CPU cores (~100kW). What matters here is the much higher memory bandwidth. GPUs are a game-changer for computational physics, even cheaper gaming GPUs. Find simulations with my CFD software on my YT channel :)
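For anyone wondering why bandwidth rather than raw compute is the bottleneck here: CFD codes spend their time on stencil updates, where each cell does a handful of arithmetic operations but has to read several neighbors from memory. A toy 1D diffusion step in Python (purely illustrative — not the commenter's actual solver, and real codes run this over millions of cells on the GPU):

```python
# Sketch of a CFD-style stencil update: 1D heat diffusion.
# Per cell it reads 3 values and does ~5 flops, so arithmetic intensity
# is well under 1 flop per byte moved -> memory bandwidth dominates.

def diffuse(u, alpha=0.1):
    """One explicit time step of 1D diffusion with fixed boundaries."""
    n = len(u)
    new = u[:]  # copy; every cell updates independently (GPU-friendly)
    for i in range(1, n - 1):
        new[i] = u[i] + alpha * (u[i - 1] - 2 * u[i] + u[i + 1])
    return new

u = [0.0] * 10
u[5] = 1.0  # heat spike in the middle
for _ in range(3):
    u = diffuse(u)

print(u[5] > u[4] > u[3])  # True: the spike spreads outward each step
```

On a GPU every cell's update runs in parallel, so the only limit left is how fast VRAM can feed the cores — which is the bandwidth advantage the comment describes.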
Linus, thank you for making the learning of intricate computer components and function not only enjoyable, but easy, and interesting to follow. Your 10 minute videos only feel like 3 minutes because of how absorbed I am by your presentation skills.
Good point Linus, graphics processors and main processors are similar but very different! Back when MITS came out with the Altair 8800 in the '70s, the main processor did calculate the rendering of point graphics on the screen! It took about 5 minutes to draw a squiggly line from one side of the screen to the other! Of course, in brilliant 8-color graphics! I think they knew back then that another processor was needed to handle the load, freeing the main processor for housekeeping chores! A little trivia: Compaq, Dell, and Gateway all used to be mail-order computer companies when they were mom-and-pop shops! Compaq came out with the first portable computer, called in many businesses "the suitcase"! Compaq is now owned by Hewlett-Packard, and Gateway by Acer! Dell is now one of the major computer companies providing platforms for many applications, even NASA! A year of knowledge compacted into 6 min, HA!
Laf, he may not be for everyone, for sure…but at least u took the good. I just commented on this presentation after watching a near-identical one on ComputerPhile. He’s young, I’m not, but I could easily see him as my manager. Chuckle, or his editing/production mates.
Imagine once quantum computing becomes mainstream we will have a CPU, GPU, and QPU for random processes, heavy number crunching, and predictive processes!
i would love to see something like what they showed in the video where you could buy a gpu and cpu and stick them in like that. would make watercooling it a lot easier if you're like me and you're into the simplicity of AIO water coolers.
This really didn't go into enough detail. The entire video was: graphics do graphics and cpu does everything else but lengthened. Wish you would have gone into more detail about the architecture. Your contacts at companies could have helped.
so why does it NEED a pci port rather than just plopping it in another cpu like port? (i kind of know half of this but you posed the question and never really answered it)
+Toriel GPUs have many compute units within their chip too. We can't place all those compute units (usually 4-8) in CPU sockets, as that would take too much space, nor could we provide all of them with adequate cooling if they were placed directly on the motherboard. Put simply, a GPU includes more than just the one chip that makes it a GPU: it includes, but is not limited to, VRAM modules, compute units, the main chip, and, I'd guess, some power-delivery circuitry. That is what a GPU is, and you can't stack all of that into one CPU socket because you couldn't cool it all, and putting the parts in separate sockets on the motherboard would make cooling expensive and complex while making the whole thing a tedious job.
+Toriel Having a single standardized GPU socket on a MOBO would be no different than the current PCIe standard in terms of speed and connectivity issues. In fact, if anything it would negatively impact performance and make the whole set-up more complicated. EX: Power delivery would be handled by the MOBO, meaning it would be less flexible and customizable for different chips. The chip would also have to use system RAM or have separate RAM slots for faster vRAM, either reducing performance or complicating the set-up. Basically, whether the GPU is on a separate die in a separate socket or on the same die and socket as the CPU (integrated GPU, APU etc.) it will have the same speed, power and connectivity issues.
***** Adel Alqadi I gave it all a thought and now I understand all of it (maybe). GPUs are designed to work on one task per cycle, with all of their cores doing the same thing. In a CPU, all cores can work simultaneously on many different things in the same cycle. APUs are basically a GPU+CPU on the same chip (a marvel of engineering). A dedicated GPU is different from the GPU in an APU, as the APU uses the same RAM the CPU uses, causing a bandwidth and memory bottleneck if the RAM is slow. Also, APUs are roughly half of a GPU in terms of power and performance, and because of the APU design, the CPUs in them can't pull off great performance either. And anyway, all CPUs could do what GPUs do, but then they'd also have to do their own job, which is why their graphics performance is terrible — like the Intel HD xxxx in all Intel CPUs. APUs just compromise CPU performance for GPU performance, and that's all.
Now, why do we need the bigger GPUs separately, and why do they need to be on a PCIe slot? Answer: dedicated GPUs have a lot of complexity in them and, as I said before, include a processor and a lot of stuff on board that just can't be put on the motherboard because of circuit complexity, cooling, power consumption, and economy. GPUs use a PCIe lane because, firstly, PCIe is just another way of communicating with the motherboard; secondly, it seems better since nobody needs pins; and thirdly, a GPU runs at lower clock speeds than a CPU and also requires less data per core, so a CPU socket just isn't needed for a GPU (also, a CPU socket is square/rectangular, which would make an odd connector for a thing as large as a GPU). And dedicated GPUs are that big because they need much faster RAM, so they have their own small VRAM, which is much faster than system RAM; that makes the VRAM hot too, so it needs not only board space but also power and cooling. GPUs also have their own power-management modules and many other things.
In short, GPUs are big because they are an independent part of the system and only need data and power, nothing else. And we don't use CPU sockets because that wouldn't be necessary at all (CPU sockets are more complex than GPU sockets).
This makes me wonder if it would be technically possible (and sensible) to have a motherboard with a slot for the CPU and another slot for a GPU, with dedicated RAM slots for both system and graphics RAM. This would make it much easier to upgrade cooling on the GPU for better overclocking, or replace the GPU while keeping the VRAM, although would probably make access to the VRAM quite a bit slower (at least using current technology), and would definitely require a lot of cooperation between a lot of companies to bring it to the market.
00:08 . . . opening a new console feels better, for me at least. But sadly this feeling won't come back with newer consoles... they kind of went downhill with the 8th gen.
Guys, I'm new here!!! I recently installed a new GPU in my PC. The problem is, after I boot the PC, whenever I play videos I notice the GPU fan speeds up, and if I don't stop the video within a minute the monitor goes black (CPU still running). Then I have to reset it in order to boot again. The same happens when I open a game: GPU fan speeds up, and then boom! Monitor goes black again. Please help me. My system is: Intel Core i5 2400 3.10GHz, 8GB RAM, 1TB HDD, 600W true-rated PSU. The GPU is a GTX 560 Ti Twin Frozr II (I don't know if it's the OC version or not), 2GB 256-bit GDDR5. Some say the PSU isn't capable of, or falls short in, powering my GPU. (NOTE: when the display adapter is disabled in Device Manager, or in Safe Mode, the PC runs fine.)
+Reditry well i have an R7 370 4gb version with 1024 stream processors and 256-bit memory interface, the GTX 950 (same price) has 2gb, 768 cuda cores and 128-bit memory bus, nvidia is just a douche company so i am forced to buy a gtx 970 next month :)
if I have 2 monitors, 1 for gaming and another for just an extra screen and I plug in my gaming monitor to the GPU and the extra monitor to the Integrated graphics after enabling it in the bios will I suffer from performance loss on the gpu side?
You can't enable both at the same time. Even if this were somehow done, Windows wouldn't understand what you are trying to do; it would treat things as one process and be confused about which unit — the integrated graphics or the graphics card — to use. Thus, having a separate system would be more effective: a $30 motherboard, an $80 Ryzen 3 CPU, $40 of 8GB DDR4 2400MHz Corsair, and you have a streaming setup for $150.
"Techquickie". Not only rocket science and ultra advanced server stuff like you want, but other, more basic stuff. +I just asked myself about that sooo winning.
+Eric Pelayo This is still stuff you can Google and then TL;DR in like 1 sentence: "CPUs are designed to handle many workloads at once, whilst GPUs are specialized for one specific task: graphics"
Basically, they both can do the same things (GPUs have OpenCL and CPUs have software rendering), but they are MUCH better at different things (CPU = general computing/physics, GPU = anything that affects the display). You can't run a computer without a CPU, and you *can't run games without a GPU. If you aren't playing games, while a GPU may not be 100% necessary, it is HIGHLY recommended, and even the most basic phones/netbooks usually have an iGPU (integrated GPU, either on the same chip as the CPU or elsewhere on the MOBO) of some kind to help with video (AKA YouTube) and take some stress off the CPU. *Unless we're talking about old (mostly 90's and before) games with software rendering options.
Somewhat ironic?… maybe? Anyway, I JUST watched a 6:39 vid from ComputerPhile on the exact same subject. Each had its positives, and I learned more from both than from either alone. However, to me, the comparison indicated a caffeine-inspired genius can easily keep up with an Ivy League-inspired one. Which I find a relief, as this channel has a LOT of good-to-know topics. Like most, I imagine, I'm fussy with my new channel subscriptions, so when there's such a clear-cut comparison between one I already trust and one I'm new to, it gives high confidence to a new sub. Nice work guys.
2:12 She is literally watching some guy play Battlefield on YouTube
And her fingers aren't over the WASD keys...
she was apparently editing video
@@OneTrueKind you can't edit other people's videos on YouTube
I'm already Tracer
@@UltimateMTB she's pretennddinggg
2016: "GPUs not just good for gaming but also for mining"
2021: "GPUs not just good for mining but also for scalping"
2026: "GPUs not just good for scalping but also for causing civil wars"
2048: GPUs not only good for starting civil wars. But also good for blowing up solar systems.
2060:
GPU's aren't good for anything, cpus have taken over
bruh
@@TheEngineeringMonkey lol
I love the way he puts technical concepts aside to make it easier to understand for people who aren't tech savvy
i would like to see what is "as fast as possible" as fast as possible
Mind blowd
XDDDDD
ThangMD do you even *eNgRiSh*
You just explained what it is lol
it's "tech topics" explained in a manner that's moderately detailed "as fast as possible", presented in an attractive manner, as much as possible, in the title, thumbnail, and of course, the video. That means throwing in "tech cult" things in an interesting way — curiosity-driven content for "techy people", in a nutshell, sometimes in a "humoristic" manner. Oh, and there are salesmanship lessons in disguise too. Now if you like this comment, or are curious to know how to make an "as fast as possible style" comment, you don't need to buy anything; you need to practice summarizing things exhaustively, without missing important detail, while highlighting the things that might be of interest to people, in an interesting way :)
"info-tainment for tech-people"
I'd love to see Firmware As Fast As Possible.
yap
FAFAP
+DrearierSpider1 Low-level software that usually isn't updated or changed often.
+DrearierSpider1 This should totally be the next episode.
+Leuel48Fan Tell that to Cisco. Some equipment have released updates as soon as 3 months (during their active life cycle). Some do take a year though.
Next video - how to download more ram
how to connect electricity to google
TeamRespawn you can just add more
Just ram more downloads.
you dont download ram you install it into your computer
No, it's how to download a GTX 1080 Ti 😀😂😂😂😂
keep up the good work guys
ok
Yep
Recommended
They did lol
@@whit3rose good thing the guy told them to!
2:12 Playing games with YouTube? N1
LOL
cloud games
Well spotted. xD
Google got inspired from this and created Stadia
@@AnimMouse no
2:12 That girl is not playing the game... ;(
how did you know ??????
@@vatsalaykhobragade Because it's a YouTube video. See the red stripe at the bottom?
@@martinv1995 sharp eyes
Basically a CPU has a VERY limited number of geniuses solving complex math (complex algebra and AI programming)
and then you have your GPU, which is basically a lot of idiots solving basic math (mostly geometry and short math)
*hence you have few cores on ya CPU (solving complex stuff and splitting some of the calculations between the "smart" cores)
*and hence you have a GPU like a GTX 1070 with 1920 CUDA cores (that solves normal and easy math by splitting it between all of the "stupid" cores).
So, in other words, polymaths vs. plebeians. Or in the world of GoT, maesters vs. wildlings. :P
Computer engineer here. This is a pretty apt analogy. There is an additional point to be made as well that GPU cores are more isolated from each other and are optimized for simple computations (data flows in, some function is applied to it, and data flows out), while CPU cores are more tightly integrated, allowing for data to be easily passed back and forth between them and for more complex control flow that a GPU would struggle at (e.g. recursive algorithms, handling global mutable state, and manual memory management).
Each CUDA core is actually a specialized core for things like 3D, physics, shadows, etc., and a group of CUDA cores makes up an actual GPU core. So a CUDA core is basically a core with a ton of hyperthreading. They are also known as shader cores everywhere but Nvidia.
Thanks, this was very clear!
Much better explanation than the entire 5:59 video.
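The "few geniuses vs. many idiots" analogy and the engineer's follow-up above can be sketched in plain Python (a hypothetical illustration of the two kinds of work, not how either chip is actually programmed):

```python
# GPU-style work: one pure function applied to every element independently.
# No element needs another's result, so thousands of simple cores can each
# take a slice of the data and run the same instruction in lockstep.
def shade(pixel):
    return min(255, round(pixel * 1.2))  # the same tiny op for every pixel

pixels = [10, 100, 200, 250]
shaded = [shade(p) for p in pixels]  # a trivially parallel "map"

# CPU-style work: recursion plus shared mutable state. Each step depends on
# earlier results — exactly the control flow a GPU struggles with.
cache = {}  # global mutable state

def fib(n):
    if n < 2:
        return n
    if n not in cache:
        cache[n] = fib(n - 1) + fib(n - 2)  # depends on previous calls
    return cache[n]

print(shaded)   # [12, 120, 240, 255]
print(fib(20))  # 6765
```

The first half parallelizes across any number of cores with zero coordination; the second is inherently serial and branchy, which is why a handful of "genius" CPU cores handle it better.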
Linus, plz explain the components on the graphics card. Like texture units, shaders, ROPs, FP64. I want to understand them when I look at a spec sheet next time. And I don't understand why you haven't covered this already.
+Charlie Lee more compute units = better graphics card
+Charlie Lee Each unit does a job (polygons, shaders, textures, etc). Like they said, more is better, but please compare within the same architecture; don't mix shaders from Maxwell, Kepler and GCN, because they are not the same.
Unlike the GPU components above, FP64 is a measure (benchmark) of how many double-precision operations the GPU can perform per second. You can see the single- and double-precision performance of GPUs in the Wikipedia list of GPUs for each company. But I must say, those numbers only matter for computational purposes. For gaming, they are pretty much irrelevant.
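To make the FP32-vs-FP64 point concrete, here's a small sketch. Python floats are already 64-bit, and the standard `struct` module can round one through 32-bit storage to show how much precision single precision keeps:

```python
# FP32 vs FP64: how many significant digits survive in each.
import struct

def to_fp32(x):
    """Round a Python float (FP64) to the nearest FP32 value."""
    return struct.unpack("f", struct.pack("f", x))[0]

pi64 = 3.141592653589793        # ~15-16 significant digits (FP64)
pi32 = to_fp32(pi64)            # ~7 significant digits survive (FP32)

print(pi32)                     # 3.1415927410125732
print(abs(pi64 - pi32) < 1e-6)  # True: plenty accurate for gaming
print(pi64 == pi32)             # False: not enough for scientific compute
```

This is why gaming cards can ship with weak FP64 throughput and nobody notices, while compute cards advertise it prominently.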
This is my professional field of expertise. This explanation was amazing! Thanks. Just one more Techquickie video I will be linking to people who want to know more about technical subjects.
"Are those big Chunky GPUs Actually Necessary?"
Ryzen: Good Question
Wait does ryzen do?
VEGA 11
@@henrybutt6893 Ryzen 7 5700G Vega 7 lol
@@TON-ob7ib lol
2:10 my face when she is watching YouTube
+SaveRaptor Ikr, was looking for this
Same XD
Lol
Graphics cards, sound cards, network cards, SSDs, HDDs — all of them have a dedicated processor (often ARM-based) and RAM. So it's like a bunch of tiny mini PCs, orchestrated by the main CPU.
1:10 So we're not going to talk about how he's been carrying a knife all this while :).
Verge intensifies
The Verge keeps a Swiss Army knife on hand at all times, just in case a computer needs to be built
I think most American adults carry knives, regardless of profession.
3:38 this would be useful right about now
This guy is a complete book of electronics, both theoretical and practical. Amazing work
7 years later, GPUs are AI trainers
Most underrated comment
7 years later? It's always been that way.
1:10 IT specialist with folding knife! Bro! subscribing!
2:12 I love how that's a YouTube video of BF3
2:09 make Skype calls
Me: we don’t do that anymore
It’s almost as if this video is old....
I'm surprised that for GPGPU you didn't mention plug-ins in DAWs or special effects in video editing software! Having the GUI fully assigned to the GPU can be very beneficial under high CPU load.
I'm too drunk for this
Drinkyoghurt
I'm almost sober and I still don't understand
R u Slav
You should drink some yoghurt
It's very simple
Graphics = GPU
Versatile processing = CPU
And they are highly specialised, meaning that your game will probably run on only the CPU, but it will make you depressed.
I did this once by disabling my Intel HD graphics from Device Manager, and my laptop started to cap at 15 fps.
Running Skyrim at 800x600 was a bad idea.
The whole computer hung and I had to force-shut it down.
i'm too high for this
I love when Linus asks a question and says "Great question"
The last time I came this early, I had to pay child support
Mitchel Paulin as always
funny and original
x86/x64 vs ARM as quick as possible.
Next time: Oranges vs Bicycle Chains As Fast As Possible
+Seth Winfrey lolz, but still a great video. I thought that GPU processors are like normal processors.
+Seth Winfrey Gamers probably asked these stupid questions.
A more specific question would draw fewer viewers.
+AD KN Well, just look at the number of cores. My CPU has 4 but my GPU has 1920. That goes along with what Linus was saying about the GPU needing to do a lot more, but much simpler, calculations.
Sir_Sethery alright mr smart guy
@@blodstainer this sounds like a stupid question to someone who already knows that. But guess what, not everyone has your expertise.
Always love the contents. Helps me decide for the specs and realize which is best suited on how i use my laptop. Good job.
Watched 3 videos and understood nothing before stumbling on this. Thanks Techquickie... you helped a great deal.
I like to think my CPU and GPU are good mates and when my PC boots the two just have some excellent bro on bro sex and everything is wonderful with Crysis 3 running at max settings and, as unbelievable as it sounds, notepad running too.
Ha, gaaay
7th i guess
It's been 4 years and there is no reply to this comment
Yep
I would've loved it if APUs were talked about in the video even a little bit.
+TSLMachine Why? An APU is literally a CPU and a GPU put on the same die. They are still separate chips with different purposes. Every Intel processor in the i series (i3, i5, i7) is like that as well (with the exception of the i7 Extremes and some desktop Skylakes). The only real difference is that the CPU and GPU are right next to each other, and the GPU borrows some of the system RAM (usually 256MB) instead of having its own dedicated VRAM. (The PS4 in fact used GDDR5 for its system RAM, which is why it can handle higher resolutions more easily than the Xbox One, which used DDR3, even though they both use similarly powerful APUs.)
Alexander Pavel
Exactly.
Here, I'll sum it up for you: APUs = waste of money
+Fred Not at all.
+Fred not really
Taylor approximations and matrix calculations are the main mathematics that GPUs do, so they are designed around doing that as fast as possible.
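A rough illustration of that: a short Taylor polynomial for sin(x), the same fixed instruction sequence applied to a whole batch of inputs — exactly the kind of lockstep work a GPU is built for (toy Python, not real GPU code):

```python
# Taylor series of sin around 0: sum of (-1)^k * x^(2k+1) / (2k+1)!
import math

def sin_taylor(x, terms=6):
    """Approximate sin(x) with the first `terms` Taylor terms."""
    return sum((-1) ** k * x ** (2 * k + 1) / math.factorial(2 * k + 1)
               for k in range(terms))

# One short, branch-free instruction sequence, many data points:
# a GPU would evaluate every element of the batch in parallel.
xs = [0.0, 0.5, 1.0, 1.5]
approx = [sin_taylor(x) for x in xs]
exact = [math.sin(x) for x in xs]

print(all(abs(a - e) < 1e-6 for a, e in zip(approx, exact)))  # True
```

Hardware implementations use different tricks than a literal Taylor sum, but the shape of the work — a fixed small polynomial per element — is the point.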
I like listening to you at half speed. It's like you've been in a car accident and suffered a brain injury.
+inveterate Foreigner you are my savior, i never realized this
danggg
05:32 LOL! I hit the button right away!
I didn't know you used "sine qua non" even in Canada! I'm impressed.
That is the first word from Linus I've come across that I don't understand
Times were so simple then.
"Doing a lot of things quickly"
10 minutes took to load vide- 'Intel Dual Core'
I still own my pentium
Your analogies and vivid descriptions really help us to understand what these components do and how they function. My neurospicy brain really gets it now. Thanks 👍!
1:51 linus be like: "the f**k is that"
I genuinely enjoyed the Swiss Army knife example. It's very much interesting.
Lv U Linus!
So my CPU is 4 Isaac Newtons, and my GPU is 1792 manual laborers?
No, actually your CPU is 4 random know-it-all guys, while your GPU is 1792 Carl Friedrich Gausses! :D
love this video's pace, please make all other videos at this pace :)
I'm an obsessive gamer, but I think a lot of people might want this info answered in the most in-depth possible way, with no regard to how long the video is. If you can make an epic documentary, that is 1 hour, 2 hours, 3 hours, or whatever is necessary to get the correct points across, that has a rubric, so you could update it every 2 years or whatever you chose, or never update it again, but have it as a format of information for future PC consumers. There was so much information that I missed while creating my first PC, and there are probably more things that I am missing as well.
I gave a like, but why did you under-represent the impact that a CPU has on frames per second, by basically not talking about frames per second at all? The CPU influences gaming fps just as the GPU does. It can make massive differences, like 10-100 fps increases/decreases depending on the CPU you pair with your GPU. This even holds true with high-end GPUs.
I would like you guys to do a video on this concept: performance gains in fps on a massive scale. Meaning, a lengthy video showing all the inputs that create frames-per-second differences in gaming, and whether they matter more at 1080p, 2K, 4K, etc.
I.e., what percentage, even if small, do the operating system, RAM, CPU, GPU, motherboard speeds/components, SSDs/HDDs, RAID 0, or any other special feature contribute to fps? Basically, which CPU types do better for gaming versus server CPUs?
This would be a video for an enthusiast like myself, who knows some but is still ignorant, or who wants literally every single frame they can get — showing which components create:
A higher low fps
A higher average fps
A higher max fps
The least possible variation of frames/ least amount of dropped frames , etc.
You, my friend, are right. We all must hear him — he is speaking the truth. You are speaking the truth, mate.
That's impossible. How a GPU actually works, i.e. all the technical stuff and how it's manufactured and developed, is an industry secret that Nvidia and AMD won't tell us.
I love that this content is done in an easy-to-understand way, but it would be nice to reference (not explain) some of the processes/computing at one more level of detail, e.g. mention how linear algebra is used in graphics processing and requires doing the same operation 100x, while a CPU has to do 500 different operations 1 time each. Just something qualitative without going into the algorithmic details.
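The "same operation many times vs. many different operations once" contrast that comment describes can be sketched in Python with NumPy. This is only a toy illustration with made-up names and sizes, not how a driver actually dispatches work:

```python
import numpy as np

# GPU-style workload: the SAME operation applied to a huge batch of data.
# One scaling transform is applied to 100,000 hypothetical vertices at once.
vertices = np.random.rand(100_000, 3)       # made-up vertex positions
scale = np.diag([2.0, 2.0, 2.0])            # simple uniform-scale matrix
transformed = vertices @ scale              # one operation, every element

# CPU-style workload: several DIFFERENT operations done once each, where
# every step depends on the previous result (inherently sequential).
state = 1.0
for op in (np.sin, np.cos, np.tanh, abs, np.sqrt):
    state = op(state)                       # 5 distinct ops, strictly in order
```

The first pattern parallelizes trivially across thousands of simple cores; the second gains nothing from extra cores because each step waits on the last.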
Why does it feel like he already made this video.
consistently good presentation.
Usage of the same pictures from other videos.
This video helped me a lot. I haven't seen how different the inside of a computer is since I put one together in the '90s
Hi! I've been studying a bit about CPUs and GPUs and I came across the APUs (AMD's combination of a GPU and a CPU for consoles). What about them? Are they specialized or just better general purpose cards better suited for gaming? Are they more or less powerful when compared to separated GPUs and CPUs?
I thought they were SOCs, but I might be wrong (and I certainly am ^^)
Ryzen APUs are now better than a GPU. Go for a Ryzen 7 3700x and you won't be sorry. It's the real sweet spot of price vs. performance. You can get "used" ones for $230 on aliexpress. They're actually not used, they're brand new. You just save $100. I ordered one six months ago and it's legit.
"APU" is just a marketing term. Most Intel CPUs have integrated GPUs
Could you make a video explaining the different options for backing up a pc? Such as File History vs. Backup and Restore?
Linus. You should totally have your kid do one of these videos with you.
+Arlian Rubio (Arlian70) No.
+ohdogwow2 Why not? His son is adorable!!!
Chan Huynh Of course his son is adorable.
That's not the point. If I want adorable kids, I can watch a sitcom.
When watching a tech show, kids tend to make it run twice as long with half the content per minute compared to without them.
Linus Media Group, for lack of a better phrase, are professionals. I want to watch professional tech videos, not family happy time. Cute doesn't equal awesome unless cute is what you want first. That's not why I'm here. I'm here to learn tech first, as an adult.
I have seen other YT persons try to bring their kids on screen for tech stuff. It drags. The child usually looks like they are forced into doing it and is generally tiresome to watch.
Imagine you get in a car wreck and need the bone in your arm reset at the emergency room. Do you want the doctor showing her kid how to put on a cast with your arm? Especially when the kid doesn't understand why he or she is there and is thinking about the cartoons he or she is missing? No.
Same thing with tech news.
I did like the Swiss Army knife vs. surgical scalpel analogy to explain GPU vs. CPU, because it really made sense to me ❤
Next up Leafy vs h3h3 As Fast As Possible
+Alphawolfguy Idiotic fan bases attacking each other for a guy making fun of an autistic guy /end
+Alphawolfguy This channel is retarded enough.
+Vishal Shah No you're gay, basically what your argument is.
Replicated Wow gonna stoop to homophobia. Why the hell are you even watching the video if you think the channel is retarded?
+Vishal Shah You gonna stoop to this low? calling people retards? wow.
My teacher once asked me the same question, but angled it differently. He looked at the bare clock speeds of the units and said: "Why do you need a graphics card when resolution X color depth etc. clearly doesn't come close to the 3.0GHz clock speed of this CPU?" What a stupid f*** question. I should've shown him this video; it's clearly a fundamental difference.
Too bad this was 6 years ago.
Don't run your OS on a GPU.
A statement by Linus
Me:
I don't even run it on my CPU, I run it on my hard drive
You can't run an OS on a hard disk; OS modules are fetched from the hard drive into RAM and then executed on the processor.
Do you have a video on why some servers come with a GPU and what they do? Not a custom built server that one would run games on.
can you Just download more GPU?
+GamingTurkey Thio Joe can
+GamingTurkey Yes but you have to buy the VRAM, and the only place you can get it is that dark alley downtown. Just look for the scary guy and be sure to bring lots of cash.
Azahru Ddin
He is. What you cannot deny is how he and his followers expose how many dumb people there are
The GPU is the most important and in-demand tool here in 2024.
“Don’t expect for the gpu to go away anytime soon.” Apple Silicon: “Hold my beer.”
we already had APUs for years
I use GPUs for computational fluid dynamics (CFD). One Nvidia A100 (250W) is as fast as ~7500 Xeon CPU cores (~100kW). What matters here is the much higher memory bandwidth. GPUs are a game-changer for computational physics, even cheaper gaming GPUs.
Find simulations with my CFD software on my YT channel :)
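The bandwidth point above can be made concrete with a back-of-envelope roofline estimate. This is a minimal sketch assuming rough ballpark peak-bandwidth figures (illustrative, not measured specs) and a streaming update of the form y = a*x + y, typical of CFD-style kernels:

```python
# For y[i] = a*x[i] + y[i] in float64: read x, read y, write y = 24 bytes
# moved per element, for 2 floating-point ops (one multiply, one add).
flops_per_element = 2
bytes_per_element = 24
intensity = flops_per_element / bytes_per_element   # FLOPs per byte moved

# Assumed ballpark peak memory bandwidths (illustration only):
a100_bw = 2.0e12    # ~2 TB/s HBM on an Nvidia A100
xeon_bw = 0.2e12    # ~200 GB/s for a multi-socket Xeon node

# At such low arithmetic intensity both chips are memory-bound, so
# achievable throughput scales directly with bandwidth:
a100_gflops = intensity * a100_bw / 1e9
xeon_gflops = intensity * xeon_bw / 1e9
```

On these assumed numbers the GPU wins by exactly the bandwidth ratio (10x here), which is why memory-bound physics codes care far more about memory speed than raw core counts or clock speeds.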
dope Latin phrases in my techquickie
Kreygasm
4:33 paperwoerk
Linus, thank you for making the learning of intricate computer components and function not only enjoyable, but easy, and interesting to follow. Your 10 minute videos only feel like 3 minutes because of how absorbed I am by your presentation skills.
Good point Linus, graphics processors vs. machine processors are similar but very different! Back when MITS came out with the Altair 8800 in the '70s, the main processor did calculate the rendering of point graphics on the screen! It took about 5 minutes to draw a squiggly line from one side of the screen to the other! Of course, in brilliant 8-color graphics! I think they knew back then that the addition of another processor was needed to handle the load, freeing the main processor for housekeeping chores! A little trivia: Compaq, Dell, and Gateway all used to be mail-order computer companies when they were mom-and-pop shops! Compaq came out with the first portable computer, called in many businesses "the suitcase"! Compaq is now owned by Hewlett-Packard, and Gateway by Acer! Dell is now one of the major computer companies providing platforms for many applications, even NASA! A year of knowledge compacted into 6 min, HA!
I think a "Combing your hair" fast as possible should be the next video...
+Earnest Bunbury Agreed
I'd love to watch more of it but I just couldn't take the screaming! But I learnt what I wanted. Good stuff.
Laugh, he may not be for everyone, for sure… but at least you took the good. I just commented on this presentation after watching a near-identical one on ComputerPhile. He's young, I'm not, but I could easily see him as my manager. Chuckle, or his editing/production mates.
Why isn't it illegal to have a GPU without a back plate yet?
Because there is no reason to have one and they are sometimes too thick for SLI and Crossfire
Quietbut_Deadly Make it a fin backplate. It looks ugly otherwise and AMD manages fine :(
3:08 your wellcome.
Imagine once quantum computing becomes mainstream we will have a CPU, GPU, and QPU for random processes, heavy number crunching, and predictive processes!
I would love to see something like what they showed in the video, where you could buy a GPU and CPU and stick them in like that. It would make watercooling a lot easier if you're like me and into the simplicity of AIO water coolers.
I gotta say it - sometimes he looks sorta like David Tennant (from certain angles)
Luigi I think I see what you mean but that is really quite a stretch.
When he's being expressive and raising his eyebrows if you're squinting he does actually look like David Tennant TBH
1:30 The Verge just DID this. Used a Swiss Army knife that "hopefully" had the right screwdriver to build a PC.
This really didn't go into enough detail. The entire video was "graphics do graphics and the CPU does everything else", but lengthened. Wish you had gone into more detail about the architecture. Your contacts at companies could have helped.
1:45 "I'm the best at what I do, but what I do isn't very nice."
Except in this case it's the nicest feeling, unboxing a new graphics card.
So why does it NEED a PCIe port rather than just plopping it into another CPU-like socket? (I kind of know half of this, but you posed the question and never really answered it.)
+Toriel GPUs have many compute units within their chip too; we can't place all those compute units (usually 4-8) in CPU sockets (it would take too much space), nor could we provide all of them with adequate cooling if placed directly on the motherboard. Put in an understandable way, a GPU includes more than just the one chip that makes it a GPU: it includes, but is not limited to, VRAM modules, compute units, the main chip, and some power circuitry. That is what a GPU is, and you cannot just stack all of that into one CPU socket, since you couldn't cool all of it, and putting the parts in different sockets on the motherboard would make cooling expensive and complex while making the whole thing a tedious job.
+A SaGaR so why can't it be done the same way that APUs are pulled off?
+Toriel Having a single standardized GPU socket on a MOBO would be no different than the current PCIe standard in terms of speed and connectivity issues. In fact, if anything, it would negatively impact performance and make the whole setup more complicated. E.g., power delivery would be handled by the MOBO, meaning it would be less flexible and customizable for different chips. The chip would also have to use system RAM, or have separate RAM slots for faster VRAM, either reducing performance or complicating the setup. Basically, whether the GPU is on a separate die in a separate socket or on the same die and socket as the CPU (integrated GPU, APU, etc.), it will have the same speed, power, and connectivity issues.
***** Adel Alqadi I gave it all a thought and now I do understand all of it (maybe). GPUs are designed to work on one task per cycle, with all of their cores doing the same thing. In a CPU, all cores can work simultaneously on many different things in the same cycle. APUs are basically a GPU+CPU on one chip (a marvel of engineering). A dedicated GPU is different from the GPU in an APU, as the APU uses the same RAM the CPU uses, causing a bandwidth and memory bottleneck if the RAM is slow. Also, APUs have roughly half the power and performance of a GPU, and because of the APU, the CPU in it isn't able to pull off great performance either. And anyway, all CPUs could do what GPUs do, but then they also have to do their own job, and that's why their performance in graphics is terrible, like the Intel HD xxxx in all Intel CPUs. APUs just compromise CPU performance for GPU, and that's all.
Now why do we need the bigger GPUs separately, and why do they need to be on a PCIe slot?
Answer: dedicated GPUs have a lot of complexity in them, and as I said before, they include a lot of stuff onboard that just can't be put on the motherboard because of circuit complexity, cooling, power consumption, and economy. GPUs use a PCIe lane because, firstly, PCIe is just another way of communicating with the motherboard; secondly, it seems better as nobody needs pins; and thirdly, a GPU runs at lower clock speeds than a CPU and requires less data per core than a CPU, so a CPU socket just isn't required for a GPU (also, a CPU socket is square/rectangular, and that would make an odd connector for a thing as large as a GPU). And dedicated GPUs are that big because they need much faster RAM, so they have their own small VRAM, which is much faster than system RAM, and that makes the VRAM hot too; so not only do they need space, they also need power and cooling. GPUs also have their own power management modules and many other things.
In short, GPUs are big because they are an independent part of the system and only need data and power, nothing else. And we don't use CPU sockets because that would not be necessary at all (CPU sockets are more complex than GPU sockets).
I just had an image in my head of all these surgeons operating on someone with swiss army knives
It's just blah blah. Give a few comparison benchmarks with a high-end CPU and a low-end GPU, and vice versa.
This makes me wonder if it would be technically possible (and sensible) to have a motherboard with a slot for the CPU and another slot for a GPU, with dedicated RAM slots for both system and graphics RAM. This would make it much easier to upgrade cooling on the GPU for better overclocking, or replace the GPU while keeping the VRAM, although would probably make access to the VRAM quite a bit slower (at least using current technology), and would definitely require a lot of cooperation between a lot of companies to bring it to the market.
What about gpu chips with HBM on board plugging directly into the motherboard?
Armadillito I don't know much about HBM, but I'm sure it would be (theoretically) possible!
notification squad meme initiate
Yep
+Phoenix2079x not funny
Ayy, I click on one immediately whenever the notification pops up...
love your vids linus, love em
2:12 She's watching a video.
+eliasRrivera Bruh, she is. Dude, Linus finds the cheesiest photos ever, ahahaha. I call them out all the time and I did not notice this one, gg.
+dank pepe Hahaha I love finding these easter eggs XD
00:08 . . . opening a new console feels better, for me at least. But sadly this feeling won't come back with newer consoles... they kind of went downhill with the 8th gen.
Guys, I'm new here!!!
I recently installed a new GPU in my PC.
The problem is, after I boot the PC,
whenever I play videos, I notice that the GPU fan speeds up,
and if I don't stop the video within a minute, the monitor goes black (CPU still running).
Then I have to reset it in order to boot again.
The same happens when I open a game: the GPU fan speeds up, and then boom! The monitor goes black again. Same thing!
Please help me. My system is:
Intel Core i5 2400 3.10GHz
8GB RAM
1TB HDD
PSU 600 watts true rated
GPU is a GTX 560 Ti Twin Frozr II (I don't know if it's the OC version or not), 2GB 256-bit GDDR5
Some say it's the PSU not being capable of, or falling short in, powering my GPU.
(NOTE: when the display adapter is disabled in Device Manager, or in Safe Mode, the PC runs fine.)
Try getting updated drivers.
Same issue, still not working. I think old drivers will do.
chone nam See if a friend has an old GPU and/or a PSU that you can use to test with, and see if that fixes it.
He explained so well. Before I had to search multiple videos to actually understand it. But finally a video which explains it clearly.
2:12 "Play Games"
Nah man. she's just watching a youtube video.
I like the m4 with the drum magazine shooting at those tanks...
AMD's APUs: am I a joke to you?
As I'm watching this, my phone buzzes to say there's a new techquickie video. Sorry, phone, but I beat you to it.
Gotta mine those delicious bitcoins.
+The Hoax Hotel its worthless now right?
+cypher nop it's worth 415$ a btc sooo not so worthless
NOUE NOUREDINE so why doesnt anyone mine bitcoins?
+NOUE NOUREDINE For GPUs it is worthless as you'll get less money out of it than the price of electricity.
Henrath what if your dad pays the bills ? hehe
I would like to see slow-motion as fast as possible
AMD > NVIDIA , let the battle begin
Nvidia is a lot better
+Reditry Why, because it pays companies to optimise games for their GPUs? Nvidia is better because of that one fact :)
She's mine now John does that matter?
+Reditry Well, I have an R7 370 4GB version with 1024 stream processors and a 256-bit memory interface; the GTX 950 (same price) has 2GB, 768 CUDA cores, and a 128-bit memory bus. Nvidia is just a douche company, so I am forced to buy a GTX 970 next month :)
+Reditry sorry, forgot to mention the 950 has 5-10 more fps in games :)
If I have 2 monitors, 1 for gaming and another as just an extra screen, and I plug my gaming monitor into the GPU and the extra monitor into the integrated graphics after enabling it in the BIOS, will I suffer a performance loss on the GPU side?
you can't enable both at the same time
If this is even done somehow, Windows won't understand what you are trying to do. It would consider things as one process and would be confused about which unit, the integrated graphics or the graphics card, to use. Thus, having a separate system would be more effective.
A motherboard for $30,
a Ryzen 3 CPU for $80,
8GB of DDR4 2400MHz Corsair for $40, and you have a streaming setup for $150.
You're really running out of content, aren't you?
+Anon10W1z haha true
Not really. This channel was meant for all audiences anyway, meaning many people are still learning despite how "basic" the topics are.
"Techquickie". Not only rocket science and ultra-advanced server stuff like you want, but other, more basic stuff. Plus, I just asked myself about that, sooo winning.
+Eric Pelayo This is still stuff you can Google and then TL;DR in like 1 sentence: "CPUs are designed to handle many workloads at once, whilst GPUs are specialized for one specific task: graphics"
+Anon10W1z If you don't like it, don't watch.
Nicely done!! We're never gonna forget that example.
Basically, they both can do the same things (GPUs have OpenCL and CPUs have software rendering), but they are MUCH better at different things (CPU = general computing/physics, GPU = anything that affects the display).
You can't run a computer without a CPU, and you *can't run games without a GPU. If you aren't playing games, a GPU may not be 100% necessary, but it is HIGHLY recommended, and even the most basic phones/netbooks usually have an iGPU (an integrated GPU, either on the same chip as the CPU or elsewhere on the MOBO) of some kind to help with video (AKA YouTube) and take some stress off the CPU.
*Unless we're talking about old (mostly 90's and before) games with software rendering options.
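The split described above can be sketched in Python, with made-up function names (`brighten`, `count_files`) purely for illustration: a per-element operation with no dependencies (GPU territory) next to a recursive traversal with branching (CPU territory):

```python
import numpy as np

def brighten(pixels, amount):
    """GPU-friendly: identical arithmetic on every pixel, independently."""
    return np.clip(pixels + amount, 0, 255)

def count_files(tree):
    """CPU-friendly: recursion and branching over an irregular structure."""
    if isinstance(tree, dict):                  # a "directory"
        return sum(count_files(v) for v in tree.values())
    return 1                                    # a "file"

frame = np.array([[10, 250], [0, 128]])         # tiny made-up 2x2 "image"
bright = brighten(frame, 20)                    # clamps 270 down to 255
n = count_files({"docs": {"a.txt": 0, "b.txt": 0}, "readme": 0})
```

Every call to `brighten` could be split across thousands of simple cores, while `count_files` has data-dependent control flow that maps poorly onto a GPU's lockstep execution.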
This is a good way to put it, nice. Thanks man.
Thanks for this video, exactly what I was looking for!
Somewhat ironic? …Maybe? Anyway, I JUST watched a 6:39 vid from ComputerPhile on the exact same subject. Each had their positives, and I learned more from both than from either alone. However, to me, the comparison indicated a caffeine-inspired genius can easily keep up with an Ivy League-inspired one. Which I find a relief, as this channel has a LOT of good-to-know topics. Like most, I imagine, I'm fussy with my new channel subscriptions, so when there's such a clear-cut comparison between one I already trust and one I'm new to, it gives high confidence to a new sub. Nice work guys.
Hey- do you need a GPU and a CPU to work a computer?
the Swiss army knife and scalpel comparison did it for me
He explains it so well, thanks linus😄