where's my 7nm processor, intel?
Yet Apple has the first 5nm... Oh boy!
@@elpatriotaLX ARM tho, not x86.
@@pablogonzalezhermosilla4210 and it wasn't first, TSMC has been ramping 5nm for a few months before apple got any chips.
@@juliancumming6893 tsmc makes apple chips
@@juliancumming6893 also apple made a large investment into tsmc to get 5nm first
2020, intel, 7nm... pick two out of three
Considering their 10nm parts weren't even rumored to be out until 2021, I don't think 2020 will happen unless they've fixed their design and the issues it caused when going smaller. AMD's TSMC-made chips can keep shrinking and gaining performance for now, but once Intel's equivalent node becomes a reality, AMD's chips could end up behind once again because of their design. So really it's just fanboys taking turns laughing at each other every so often rather than actually paying attention to everything.
@@yulfine1688 then covid
@@techhelpportal7778 I mean, yeah. But Intel has to create a completely new design. AMD just uses TSMC because they really didn't have a choice.
@IHas Well, desktop 10nm we know is 2021, assuming COVID didn't derail everything, which is possible. It'll more than likely have issues similar to AMD's, and while it may be better than the 9900K, the difference will probably still be small. Unlike AMD, who chose TSMC as their manufacturer so they didn't basically die, Intel has to create something that doesn't break existing patents for its architecture and so on. Also, a 3700X or 3900X is basically just a normal 9900K; yeah, it's a bit cheaper, but the accompanying motherboards bring the total costs much closer. The real difference was supposed to be heat and power draw, which AMD chips are not fantastic at, and they still have some issues. Depending on whether you got screwed by the chip you were given, thanks to their questionable quality control and other issues, Intel's old chips can end up superior and more efficient compared to AMD's new chiplets.
Ok
3:45 "Intel 7nm chips that released around 2020 won't use Silicon" well they aren't wrong.
Evening John. You, ah, working again?
@@lucysluckyday Yup!
I came here to comment this haha
need context.
Lol
"Intels 7nm CPUs in 2020"
Boy that'll be the day!
*Dances in AMD*
Buhahahahahaha
Any day now... any day...
The Q 2020 calling - not till 2021 at the earliest.
yeah this didn't age well... :/
Update from the end of 2022: Intel is currently at 10nm. Ryzen stopped waiting for Intel at 7nm and has gone on to 5nm.
here it is 2019 and intel isn't even at 10nm lol
lol, and being beaten up by AMD
F
@@zombiechibixd AMD's "7nm" is two 7nm chiplets paired with a 14nm-class I/O die on the same package. Also, AMD's 7nm is roughly equivalent to Intel's 10nm; there is no real sizing standard for CPU nodes. Doesn't surprise me though, Intel's CPU design is much more difficult to work with compared to what AMD gets from TSMC, which gives AMD an advantage in producing smaller chips to match up against Intel's 14nm high-end parts. Hopefully they do more than just match them, or beat them by only like 4% in single-core work, considering the cost of the chips and motherboards... that kinda ruins it for many people who don't need multi-core performance. I mean, 16 cores sounds amazing in tasks that would benefit from it, but single-core performance will straight up dominate in many areas for probably another decade or so.
@@yulfine1688 but at least Intel is trying to work on their research. So they're not all bad
@@zombiechibixd No one said Intel is all bad, well, besides fanboys... I honestly don't care. Intel's main focus has moved to the mysterious science of quantum lol
We don't need 7nm, the eye can only see 700nm
GamingTurkey this comment gave me a headache
+Mfgcasa shut up and enjoy the fps joke ffs
GamingTurkey (Linus) why aren't CPUs made of something else... Like potato. (Me) but AMD parts are made of potato
The eye can't see 700nm, under a light microscope maybe.
yes. they say they are making a 7nm chip, but they are just going to sell an empty box.
U don't insult my potato pc Linus
Hero Acer 4K ProRes
Too late, Potato GLaDOS got to Linus
That moment when Linus teaches you more about science than your Chem teacher...
2moose4u a very specific area without much detail...
2moose4u Hopefully not this moment, as semiconductor does not mean it doesn't conduct well, but that it can either conduct or not conduct depending on other factors like temperature and artificial impurities
u have a shitty chem teacher
GamerABC 👌🏼
Why would your chem teacher teach you about transistors? Not to mention the video over simplified.
I think potato is pretty commonly used among console manufacturers.
CodeNameZ *cough* *cough* even apu's are powerful enough to beat consoles while requiring little to no power.....
batt3ryac1d Maybe if you are innovative.
Cereal killer is right, you Pesant
*cough* *cough* Consoles use APUs you fucking idiot
No shit sherlock
It also recycles really easily.
Germanium did in fact see a lot of use in the very early days of solid-state semiconductors. Many of the first transistors used germanium instead of silicon, though the applications of this material are fairly niche these days.
linus's hair is like "i just woke up" xD
Some Random Guy 666 whatareudoing this
Linus: "Intel has announced that for its 7nm chips that could be released around 2020 it'll be using something other than silicon"
2021 - Intel just starts releasing 10nm chips
2023: Intel still releasing 10nm chips
0:20
Umm, excuse me? My PC is 80% potato!
2:00 that aged well
Silicon is mega cheap, wdym? CPUs are expensive, but silicon is just the substance they're made of
You explained elements better than my science teacher ever could have.
Last time I was this early they announced Skylake
NJBoden Last time I was this early, They announced the first celeron.
Last time I was this early apple announced an innovative product, oh that never happened I must have been dreaming.
Jean-samuel boisvert the mouse
Joe Schmoe
they didn't invent the mouse, they were among the first to sell commercial mice to consumers.
Jean-samuel boisvert yup. But they announced an innovative product, didn't they?
Two questions: 1) If we can't make transistors smaller using silicon, is there any benefit to just making CPUs larger so that they can hold more transistors?
2) Does anyone else find the snap pointing a tad annoying?
Otherwise great job Linus.
SS S I know I'm 3 years late but linus made a vid about why we can't just double GPU sizes. Idk if you watched it already just wanted to help :)
@@HamuelPter tl;dr: a die twice the size gives you half as many chips per wafer, and each chip is roughly twice as likely to catch a defect and not work.
MetTy yeah basically. There were also other problems but this was the main one
@@metty3873 and more than 2x more expensive
@@metty3873 and power consumption and heat would be enormous
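To put rough numbers on the yield argument in this thread, here's a minimal back-of-envelope sketch using the common dies-per-wafer approximation and a simple Poisson yield model; the wafer size, die areas, and defect density are made-up illustrative values, not any real Intel or TSMC figures.

```python
import math

def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    """Common approximation: usable wafer area over die area, minus an edge-loss term."""
    r = wafer_diameter_mm / 2
    return (math.pi * r ** 2) / die_area_mm2 - (math.pi * wafer_diameter_mm) / math.sqrt(2 * die_area_mm2)

def poisson_yield(defects_per_mm2, die_area_mm2):
    """Fraction of dies expected to have zero killer defects (simple Poisson model)."""
    return math.exp(-defects_per_mm2 * die_area_mm2)

for area in (150, 300):  # a die and one twice its size, in mm^2 (illustrative)
    good = dies_per_wafer(300, area) * poisson_yield(0.002, area)
    print(f"{area} mm^2 die: ~{good:.0f} good dies per 300 mm wafer")
```

With these made-up numbers, doubling the die area cuts the number of good dies per wafer by roughly two thirds rather than just half, which is why bigger chips get disproportionately expensive.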
I want to get CPU and RAM implants
Well first you're going to need a wi-fi connection for a RAM implant, how else would you download it?
So basically CPU cache is on-die RAM. We have L1, L2, and L3; an L4 was thought to slow things down rather than speed them up when a single core design has to serve both consumer and server chips. After missing in the CPU caches, requests still have 2GB to 1.2TB of system RAM to go through, and the time spent checking a small L4 first is a slowdown for servers even if it's a speedup for consumers, so Intel dropped it: Broadwell (2014~2015) had it, and it was dropped with the Skylake core design.
You also need to ask whether you even need more CPU cache when the cache-to-core ratio is about 1.24 or 1.25 MB per core on a modern CPU. It's kept that small so the CPU can access it quickly and return data to the core. Beyond that, the engineers decided to improve the other parts not listed on the spec sheet, moving them around the reference chip design (Sandy Bridge for Intel, Bulldozer for AMD).
While a major redesign will improve performance most of the time, right now a higher on-die cache-to-core ratio will slow things down, not speed them up, when the same design is used for both consumer and server chips. IF they were to split the chip designs apart and focus on consumer needs for one and server needs for the other, then the socket on a server board would most likely not take a consumer chip and vice versa, and the price would go up too, since the R&D cost of doing two designs instead of one would no longer be shared, a cost saving which is thankfully passed on to the consumer.
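To illustrate the L4 trade-off described above, here's a minimal sketch of the standard average-memory-access-time (AMAT) calculation; every latency and hit rate in it is a made-up illustrative number, not a measured Broadwell or Skylake figure.

```python
# Toy AMAT model: an extra cache level only helps if its hit rate
# makes up for the extra lookup latency it adds for everything that misses.

def amat(levels):
    """levels = [(hit_time_ns, hit_rate), ...]; the last level (RAM) has hit_rate 1.0."""
    total, reach = 0.0, 1.0
    for hit_time, hit_rate in levels:
        total += reach * hit_time   # every request that reaches this level pays its latency
        reach *= (1.0 - hit_rate)   # the rest fall through to the next level
    return total

base    = [(1, 0.90), (4, 0.80), (20, 0.70), (90, 1.0)]                  # L1, L2, L3, RAM
cold_l4 = [(1, 0.90), (4, 0.80), (20, 0.70), (45, 0.30), (90, 1.0)]      # L4 that rarely hits
warm_l4 = [(1, 0.90), (4, 0.80), (20, 0.70), (45, 0.70), (90, 1.0)]      # L4 that often hits

for name, levels in (("no L4", base), ("cold L4", cold_l4), ("warm L4", warm_l4)):
    print(f"{name}: {amat(levels):.2f} ns")
```

With these numbers, a rarely-hit L4 makes the average access time worse while a frequently-hit one helps, which is the same trade-off the comment describes between server and consumer workloads.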
memes
***** im gonna download more space on my SSD so i can play minecraft with all of the sex and car mods.
grapezzzz19 You need a power supply too.
3:40 - yeah... about that...
Hi Techquickie, just to let you know I just love your videos. All of them: the length, the quality content, even the clever commercials! Keep it going and congrats!
35 atoms wide? When did we get so amazing at building computers? They didn't even exist 100 years ago.
5 nm probably 20 - 25 atoms, 3 nm 10 - 15 atoms maybe.
It's all over for shrinking
7nm is 35 atoms? MAKE IT SMALLER, that's very big..
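For anyone curious where figures like "7nm is about 35 atoms wide" come from, here's a rough back-of-envelope assuming silicon's covalent diameter (~0.22 nm) and half its lattice constant (~0.27 nm) as two reasonable yardsticks; also keep in mind that modern node names like "7nm" no longer literally describe a transistor dimension.

```python
# Roughly how many silicon atoms span an n-nanometre feature?
covalent_diameter_nm = 0.222    # about 2x silicon's covalent radius
plane_spacing_nm = 0.543 / 2    # half of silicon's lattice constant

for feature in (14, 10, 7, 5, 3):
    low = feature / plane_spacing_nm
    high = feature / covalent_diameter_nm
    print(f"{feature} nm is roughly {low:.0f}-{high:.0f} silicon atoms across")
```

That lands in the same ballpark as the video's "about 35 atoms"; the exact count just depends on which atomic spacing you use.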
Lol at the decision to add reverb to his voice when he was in space
1:41 actual picture of the iMac
trashcan mac pro go brrr
Your video editors are getting quite creative these days. LOL
problem is, I didn't know that cpus were made of silicon lol
Jeryl H. Then you obviously never went to school
joshua lotion honest question, does your school have a legit computer science class or something of the sort?
joshua lotion also, I understand the properties of silicon, it's just I never thought about cpus being made of it
Yeah learnt this in 11th grade chemistry
Most of the electronics now come down to the chips on the circuit board anyway; the rest of the device depends on what it's used for and how much the manufacturer wants to pay for its parts and pieces.
Can you do an episode on programming languages? Unless you've already done one that is.
Jared Prymont why?
There are a lot of languages out there. A rundown of what people should learn for employment, what a good entry point is, and a list of what popular programs were made with might help people who come to this channel looking for computer information. :)
good idea actually. depending on your goals the "right" language to start might differ. maybe JS for web applications, C# for desktop applications? Go is pretty cool, too.
+busTedOaS heh
C# for desktop applications
funny joke
Ryan Jones visual studio is free for learners and not that cluttered anymore.
Little does Linus know that in 2020 Intel would just be pushing the same CPU he showed at 3:33, only overclocked
Can't wait for Carbon Nano Tubes!!!
Many years ago I heard about a breakthrough in synthetic diamond manufacture that would have made them very cheap to mass-produce, and the possibility was raised that synthetic diamond could be used to replace silicon as the core for CPUs. Diamond was supposed to have greater heat tolerance than silicon, so a diamond CPU could run hotter, and since diamond is transparent it would allow IR to radiate through it to let heat out better, so it would actually heat up less. I guess that fell through, or maybe the DeBeers family had the guy who invented the cheap synthetic diamond process murdered.
No mention of carbon nano tubes? *Triggered*
Linus doesn't do future predictions for what is to come soon, as carbon nanotubes and graphene are 2 possibilities for what Intel will use in 2020, but they also have other materials they can use, or they could even make a new one in a lab to use in their chips; materials engineering is a field unto itself.
yumri4 Mic nor was referring to the joke used by NetLinked Daily host Keys. Linus was once a part of NetLinked Daily
CAAAAAARBON NAAAAANOOOTUUUUUUUUUUBES!
Riley/Keys anyone?
+Dieser 1337 Baka. whoosha
This video is about silicon. Not carbon.
All the hosts on this channel are awesome...
Intel: "7nm - 2020", sure intel..
You did not give the proper answer to why semiconductors are used. Transistors and diodes are made by doping semiconductors with chemicals. You said at 1:30 that if metal were used instead of semiconductors, it would turn on too many transistors, but the real problem is that it would not be possible to make the transistors in the first place without semiconductors.
Metalloids aren't true Metals, unlike Metallica or Slayer.
[Insert reference to The Metal by Tenacious D]
Lamb of God is the only sound good enough to enter my ear canals.
\m/
a m a z i n g
DJC9000 Productions Metallica ain't true metal anymore
I really love the editors of techquickie
Microsoft and Sony already use potatoes for their consoles
potatoes, yeah, because you can cook a whole BBQ on top.
OBX Sand Devil I read that in a squidward voice
On the other hand, nintendo uses toasters to power their console.
AMD uses baked potato
apple use apple
I think we don't have to worry about shrinking the transistors any more, but instead about stacking layers of transistors.
LOL 4:14 That's Minecraft RO
TinchoX Did I have a problem with you? Or you're a troll?
BeAnBeAn22 Romania (România).
That bit of hair sticking up, I can't help but stare.
But this script is well written, interesting and to the point... most 'quick tips' on YouTube tend to flounder about or use too little info. Here's a sub! Good on ya :)
They are probably going to use carbon nanotubes for smaller transistors.
Looking at how carbon nanotubes work, they are too big for a 7nm chip, but if Intel can use that concept at a smaller scale, something like that will be used. Carbon is the most likely element to be used for the chip, with germanium for the board connections, but both are more expensive than silicon, and if Intel prices itself out of the consumer market it will be bad.
Techquickie is very educational; perhaps he could also teach in front of a classroom, just an option. Thank you!
Watching this on my potato powered PC!
1:24 The electrons don't really move (or do so very slowly). What actually travels through the circuit at near light speed is the electromagnetic signal, not the individual electrons.
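A quick worked example of how slow that electron drift actually is, using textbook values for copper and an assumed 1 A current through a 1 mm² wire:

```python
# Drift velocity: v_d = I / (n * q * A)
I = 1.0        # current, amperes (assumed)
n = 8.5e28     # free-electron density of copper, per m^3 (textbook value)
q = 1.602e-19  # elementary charge, coulombs
A = 1e-6       # cross-section: 1 mm^2 in m^2 (assumed)

v_d = I / (n * q * A)
print(f"drift velocity is about {v_d * 1000:.3f} mm/s")   # roughly 0.07 mm/s
```

So the electrons themselves crawl along at a fraction of a millimetre per second; it's the signal that propagates near the speed of light.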
Lol, 2023 and we're still shrinking down silicon chips. Hopefully one day we move on from silicon to a better, maybe even synthetic, material
I know this is just a basic video, but I think it should have been emphasized more that SiO2 (silicon dioxide) is arguably THE most important reason for the choice of Si. It's not only a very good electrical insulator, but also has a stable interface with Si, is a good natural diffusion mask during doping, and has very good etching selectivity between Si and SiO2. Having such a good natural oxide made it the most compelling choice (among all the other reasons, like being cheap, easy to work with, etc.). Another huge aspect to Si that wasn't mentioned is the fairly well balanced electron and hole mobility, which makes CMOS technology viable in the first place.
Sand, it's just all sand
Not exactly. Sand is Silicon dioxide. The silicon used to make integrated circuits is extracted from sand but it isn't sand.
the main point here is a definitive example of just how common silicon is
someone learned something from playing Anno I guess.
it isn't just sand. The middle east is all sand but they will not make the simplest chip in the universe in a quintillion years
+ed tyler Intel has a fabrication plant in Israel, which is a middle-eastern country
Wow this one was actually good and informative ... *smashes like*
Holy crap, they got the color grading right.
NICE TO KNOW as an employee of a semiconductor company (Infineon). Very well explained
I'm from 2021
3:40 there's no such thing as a 7nm Intel chip
Some electronic properties of gallium arsenide are superior to those of silicon. It has a higher saturated electron velocity and higher electron mobility, allowing gallium arsenide transistors to function at frequencies in excess of *250 GHz.*
After 4 years I'm here, still waiting for Intel's 7nm
techquickie is like a stock photo powerpoint presentation
BRUH THE HAIR HAD ME FUCKED UP
Love this channel, its like a tech oriented SciShow
RIP 7nm Intel
Just realized that silicon and silicone are different things in English. Always wondered why you guys talk about silicone in CPUs like what the hell is that doing in there, because CPUs are made up of Silicium and in my language (German) there's only "Silikon" and "Silizium" and no silicon-silicium synonym whatsoever.
Silicon Wafers... YUM
There are some other dominant factors, including silicon's relatively low forward bias voltage for p-n junctions and its fast switching ability.
Could you imagine how cheap PC parts would be if they were made out of potato? @.@
But imagine how much more expensive they'd be in Ireland!
it would be rather cheap actually, considering that Ireland has one of the highest rates of potato consumption per capita in the world
wat is potatoe, is dat wat the politburo do to mens sent to gulag
Techquickie is a real specialist; keep going, and surely he could be a teacher
39 people are running on potatoes at home.
TheN0odles 139 now
2:06 the way he really steps into space and voice changes :p
"Californee"
Apparently one of the best materials for computer chips is gallium arsenide. It has much higher electron mobility and a direct band gap. The only reason silicon is still used is because it's cheap. Hopefully we soon see consumer gallium arsenide chips.
Ha! Who's watching this video in 2020???
I wish I could go back to 2020, at least.
Many materials scientists theorize that the next semiconductor material will be carbon. Some groups have already used diamond (a form of pure carbon) to store images. Carbon is also very common, in contrast to GaAs and Ge, which are the current substitutes for Si and are used in some electronic devices.
Well AMD has been using potatoes to make their chips for the last 4-5 years, let's hope they made the move back to silicon with Zen!
Up until the 70s, transistors and circuits were made with germanium, but they were fragile and very prone to failure. Researchers found silicon as the best alternative for circuits, as it was in the same element group as germanium and it worked much better for transistor applications.
Why do they need to make CPUs more compact? Couldn't they just make a CPU that is twice the size for desktops, or would that not actually increase performance?
Well, that wouldn't increase performance; it would just require more power and cooling for the same performance. Making CPUs smaller makes them more efficient.
gentuxable
Ok, why wouldn't that make them faster?
Birki gts
because the signal has to travel through more physical space than in a more compact CPU, and because there is a limit on how much a typical task can be parallelized, so even a big, highly optimized CPU would spend even more time waiting than our systems do now.
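A back-of-envelope for that "signal has to travel further" point: even at the speed of light, which is an optimistic upper bound since on-chip signals are considerably slower due to wire RC delays, a signal doesn't get very far within one clock cycle.

```python
c = 3.0e8  # speed of light in m/s, an upper bound for on-chip signal speed

for freq_ghz in (3, 4, 5):
    cycle_s = 1 / (freq_ghz * 1e9)
    print(f"{freq_ghz} GHz: at most ~{c * cycle_s * 100:.1f} cm per clock cycle")
```

At 5 GHz that's about 6 cm per cycle in the absolute best case, so spreading a CPU over a physically larger die eats straight into the timing budget.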
+patttiat if the games were to compromise on some other aspects.
Optimization is another word for "let's see what we can compromise on to achieve this other thing in the specific capabilities of this piece of hardware".
It doesn't always lead to miracles.
*****
Yes and no, your definition is right, but most optimizations compromise on aspects the user doesn't perceive, or perceives far less than "the other thing". So ideally you'd compromise where it doesn't matter, but I wouldn't say anything specific about the capabilities of particular hardware, because that would need verification to be correct.
Thanks for today's science lesson
Like and the 7950X will be made of Germanium and will cost 5000USD
7.111% faster than a i7 2600K
Pyjamas987 - Clash Royale and more nah we will be able to grow our own 7950x
"Intel has already announced, that for its 7nm chips that could be released around 2020."
Ouch
Don't fart in the apple store there are no windows 🦍💨
Why?
@@super-awesome-funplanet3704 r/woooosh
We're conducting research at my university in terahertz transistors. We're currently capping at 300-500 GHz with InP type. Cool stuff!
Mourneris now that's an OC lel
I'm only slightly bothered by how you pronounced my home state.......
I'm not even kidding, my computer science class showed us Techquickie for the vid
360p 4 life
this should've arrived much earlier. this is the exact question I had to answer in the exam!
Never heard of Si being common in our universe. I think H and He together should be like 99% of all the elements in the universe lol.
Tianchen Zheng yes, but silicon ranks quite high in the rankings
Silicon is the 8th most abundant element in the universe, and the 2nd most abundant element in the earth's crust.
en.wikipedia.org/wiki/Abundance_of_the_chemical_elements
en.wikipedia.org/wiki/Abundance_of_elements_in_Earth's_crust
Well, I know Iron is the heaviest element that can be made by fusion, so Iron is probably pretty common.
You can make heavier elements than iron (they were mostly made in supernovae and similar events), but all the fusion steps up to iron give off more energy than they take in, whereas fusing iron requires more energy in than it gives off
Is it just me, or does it surprise anyone else that we have more gold than silver, or iodine for that matter, which we eat every day in salt?
thank you so much this is the exact question i was asking today
7nm intel CPUs in 2020 LMAO 😂😂
It's good that they're getting away from silicon. It isn't that prone to temperature fluctuations, but it is sensitive to light; that's why it's used in solar cells and why ICs are kept very opaque. Hopefully the future replacement for silicon will be either translucent or transparent.
The future belongs to molecular computers!
3:18 Hi, Mr. W from the future here: silicon memory is now being used in gaming devices like the PlayStation 5 and Xbox Series X/S, and it's so much faster than you can imagine. 😎
Linus explains 8th grade science
2:05 In space, someone can hear you scream. ..Except with bathroom reverb!
Silicon Valley is the nickname of a street in Belgrade, where freshly 'upgraded' women look to find rich sponsors.
that's siliCONE rubber, not siliCON
One of the most important properties of silicon is that the surface of a silicon wafer is smooth at the atomic level, so it's good for building chips on.
3:38 "Exotic Materials" I believe you mean Exotic BUTTERS my friend :P
comment
Reaction on the previous comment
TR-8R overreaction to the first and second comment
reading the comments
Great bud luv the hoodies very retro
.
It's so cool that there's so much natural silicon in San Francisco. It makes sense that tech businesses would thrive there if all they needed to do was dig in their backyards to find processors!
That segue was brutal
The reason semiconductors are used is not that "they aren't quite as good at conducting electricity"; it's that they conduct electricity under some circumstances, and under others they don't. This allows transistors to be switched on and off, allowing them to perform calculations. Metals can't be made to not conduct electricity, therefore they can't be used for making transistors.
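A toy sketch of why that on/off switching is all you need: model each transistor as a voltage-controlled switch and you can build a NAND gate, and from NAND gates any digital logic. This is a simplification (it ignores analog behaviour, contention and floating outputs), not how a real circuit simulator works.

```python
def nmos(gate: bool) -> bool:
    """nMOS transistor as a switch: conducts when its gate is high."""
    return gate

def pmos(gate: bool) -> bool:
    """pMOS transistor as a switch: conducts when its gate is low."""
    return not gate

def cmos_nand(a: bool, b: bool) -> bool:
    pull_up = pmos(a) or pmos(b)        # two pMOS in parallel to the supply
    pull_down = nmos(a) and nmos(b)     # two nMOS in series to ground
    return pull_up and not pull_down    # simplified: exactly one network wins here

for a in (False, True):
    for b in (False, True):
        print(f"NAND({a}, {b}) = {cmos_nand(a, b)}")
```

A metal that always conducts would be like a switch stuck closed, so no gate like this could ever be built from it.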
Can you please make a Techquickie video about Kaby Lake processors as fast as possible?
excellent writing