I've lived my whole life hearing about vacuum tubes and never really knowing how they work. This was an amazing presentation connecting lightbulbs to transistors. I'm stunned.
Everyone understands mechanical computers, then school skips vacuum tubes because we don't use them anymore and jumps straight to digital circuits. Honestly, if I'd had this video back then, I would probably have gotten digital circuits much sooner.
If you've ever gotten into guitar amps, you'll still hear people say that tube amps sound warmer. Still plenty of people using tubes. Tubes are still often the best way to amplify very high wattage radio signals.
@@abhishekkushwaha3462 Truly outstanding point! I personally found this method of asking "how or why was this thing invented in the first place?" a really great approach to learning many new topics!
I personally think a better origin of computers and automation can be found in looms, especially Jacquard looms. And I'll also advise you not to take Veritasium seriously. I mean not even as good entertainment.
@@frostfamily5321 It doesn't take quantum physics to explain the operation of a transistor. Someone well-versed in quantum physics might have something to add to the conversation, but the operation is currently well understood.
I designed and built my first computer with vacuum tubes in 1957. Being a ham radio operator I knew a little about electronics. Also, I was a lazy math and physics major. There were 2 computers in town. One belonged the TVA. Being a part of the government, I was refused access. The other belonged to the largest bank in town. It took up the entire 3rd floor of one of their buildings downtown. They explained how it worked. Several friends contributed tubes. Large and not exactly cost efficient, it did less than my slide rule. It did give me a bit of a leg up years later in getting a job with an airline as an assembly language core programmer on IBM 360s.
Thank you for the kind words. I haven't considered such an effort as the major part of my work was steered into international relations and international economics. I am researching and writing two books on those subject aimed at college students, their professors and people involved with multinational corporations. I am close enough to my 85th birthday that I can see my wife counting the candles. I am just hoping to pass along my education and experience before I pass on.
My mind is constantly blown how far humans have come in the last 100 years. Edit: Great to see awesome comments here. The goal is to become a peaceful species to explore the cosmos. Let's overcome the great filter!
It's comforting to know that for 99% of the problems humanity faces today, an amelioration or even a straight fix is due in the next century. Really makes you optimistic for the future. Do not quote me on that number ;)
Seeing the progress of computers laid out in a timeline is one of the most fascinating things to me. I've probably seen/ read the story about a dozen times and it's still interesting
same!! i'm in my late 30s here and the first time i read about it was in david macauley's 'the way things work' - a book i got as a christmas gift when i was probably 8 or 9. i found the description of this early computer extremely fascinating.
I can barely understand the logic behind it all and *_I still_* find it interesting -until thinking about it gets too hard and something else grabs my attention.-
the progress just keeps going... the transistor and then the programming of the computers is probably our greatest achievement as a species. we are still in the phase of implementing this invention, we haven't seen "anything" yet. :)
I was a kid when solid-state electronics were replacing vacuum tubes in consumer products. I remember that radio and TV repair was a widespread cottage industry. The best in that field were able to adapt and stay afloat, until the advent of integrated circuits. Great video 👍👍
I enjoyed taking the back off our television, gathering all the tubes, and going down to the drug store to test them on the tube tester ... even when the tv was working fine.
If you could repair a TV you could work for a big-name company going to hospitals installing, repairing and maintaining high-end medical x-ray. You were a hero and even the doctors worshiped you. We were networking image transfers before they knew what to call it. We could do 3D imaging using analog video. You got the image in a day or two because computers were too slow. CT, computed tomography, was invented by an engineer who worked at the EMI recording studios. Then the companies handed out laptops and said figure it out. Dial up. The blood ran despite how hard we tried to reprogram those folks. Now the software and every other kind of repair are two disciplines. Benjamin Franklin was your last Renaissance man who could know enough of everything to function in any subject. It's only going to get worse. How long before you ask your computer "What's wrong?" and you have to decide if you can trust it. I'm hanging on by my fingernails.
@@terryscott5568 A lot of fields are requiring specialists to know so much more than ever before to function. Medicine is a field that suffers this from so much advancement. It is a blessing and a curse. To graduate medical school, doctors are required to know magnitudes more than doctors just 30 years ago due to the huge advancements in research. It is absurd really.
When you replace a vacuum tube with another of identical type (same code on the label), you can be sure that the properties are the same. This is an advantage of tubes. When you replace a transistor with another of identical type, the replacement should first be measured if matching the properties is paramount. Also, a sound amplifier with vacuum tubes does not have the background noise noticeable in those using transistors when the amplification is at maximum.
I have to give mad props to your editor/animator(s). They do such a tremendous job distilling your scripts into visual language even though we all know none of this is actually classical mechanics at its roots. The classicality of it is emergent and the art style helps with that even though it is not explicitly said.
Dude, I’ve watched so many of your videos, and you are one of my absolute favorite channels on TH-cam. Your team does such an amazing job between research, writing, producing, editing, etc… Veritasium makes GREAT content! Please keep doing what you’re doing! Thanks!
As a 3rd year Electrical Electronics Engineering student, I can say that this video is by far the best video that made me finally understand all these theoretical concepts we took in our lessons, you are a true genius
@@zefellowbud5970 come on, basic schoolbooks provide the same explanation. Maybe American schoolbooks are suffering from all the book banning. Turing was gay, so maybe he is forbidden as too woke 🤣
@bzuidgeest I think you hit the nail on the head. The American public education system keeps going further and further downhill. I’m a proud American, but even I know that our country is doomed if something doesn’t change soon.
As a guy who majored in computer science, I gotta say this is one of the coolest videos I've seen in the TH-cam science community in a while. I never made the connection between lightbulbs and the invention of vacuum tube based machines. Thank you Derek for putting together this amazing narrative for the fundamental turning point of electronic computer history!
@@thespacejedi eh, I think they just teach it to electrical engineers probably. They probably want more Software Engineers than Computer Scientists, so things like understanding the nitty gritty gets tossed aside (or I just haven't taken the relevant course. There is a course called digital circuits that I think I'm supposed to take.)
@@jpisello oh, transistors are covered? That's good. I've been trying to learn about them and read up on them, and it's been slow going. The only thing I know so far is that there are numerous kinds of transistors. So far, my understanding is that to turn off a transistor, the middle portion gets a counterbalancing voltage so that no voltage difference exists for a current to run across the transistor. Is this the case? That would mean keeping a transistor off would actually cost energy. Do transistors have a capacitor as the key to whether they're switched on or off? To switch one off, the part behind the dielectric (of air or maybe silicon) would be set to whatever charge makes no voltage difference exist for a current to flow across, and when they want it to turn on, they make a difference appear by changing the charge.
As someone who works for a commercial and industrial lighting agency, I love this. Such a great history lesson. This is the kind of Veritasium video I love to see!
🥰Thanks !!! Finally I understand how electronic works...such a simple, yet brilliant animation and presentation. This is my favorite education channel!!!
I was born in 1968. My mother was a Comptometer operator ( a mechanical adding machine), and my father was mad about electronics. I grew up surrounded by vacuum tubes, but I don't think I really understood them until watching this video! Thank you for your amazing content.
@unnamedchannel1237 I suppose that is a _possible_ interpretation of my statement. Another slightly more _plausible_ interpretation is that she was the *operator* of a mechanical adding machine.
I was also born in 1968. Specifically November 17. My dad first bought our (my) first computer on my 13th birthday. AI will be for STEM geeks in 2023 what BASIC was to computer geeks in 1981.
@@unnamedchannel1237 Yeah. My mother was in fact an actual mechanical adding machine, always busy with spending and budgets or something. My father called her "cranky" sometimes..
Hi Derek, I'm a semiconductor Electrical Engineer. I also look forward to your silicon video(s). I have often imagined how to animate a Bipolar Junction Transistor (BJT). The electron would be a circle of one color, the hole a different color. During recombination, the colors disappear into the background. I'm sure you will explain what took me some time to learn. The reason the charge carriers get through the base into the collector is diffusion! The importance of emitter injection efficiency might be out of scope. Another in the series might show how the photo-voltaic diode works, the light making electron-hole pairs (EHP)s, and that makes power how?
I remember my old professor teaching me the differences in semiconductor transistors... I never fell more in love with the MOSFET and its design although delicate in nature.
Setting aside your grammar (unless English is not your first language?), I agree: I, too, am enthusiastically looking forward to the follow-up video that, hopefully, will feature Derek explaining solid-state electronics in a similar manner!
I too would like to see a similar video on transistors. I have never quite understood how the small current at the junction can effectively amplify the current passing through the transistor. Really enjoyed this video. I went into electronics in 1974 via the United States Air Force. This was about the beginning of the end of vacuum tube usage. Still, in some cases it was more cost effective to maintain old equipment and perform preventive maintenance on it than to buy new. I had a piece of equipment made in 1940 that was still there when I left in 1982 😊 The transition from vacuum tubes to transistors necessitated a career change for some TV repairmen because some could not quite figure out how to troubleshoot transistors. I was always proud of my uncle Dave. He grew up during the Depression, and with a high school education taught himself enough electronics to start a radio and TV repair shop on the side. He kept on going even with the change. I talked to him about transistors. He said it was a bit difficult at first, but he was determined to understand them. He was one of the people other repairmen in the area called when they needed help with a particularly difficult problem.
I think it’s interesting to realize that Tesla’s invention of the radio would end up relying on an invention of Edison’s, the lightbulb, and on a discovery Edison only made because of his refusal to use Tesla’s AC. That discovery led first to a device to convert AC to DC, lol, and then to another device that amplified radio transmissions and was used to receive and play them on a radio’s speaker.
@@jaredf6205if those two had gotten along we might not have gotten as far. Ironically the competition of one upping each other’s inventions was the driving force for advancement. Like most things competition is good for advancement.
I know about them and have even "messed around" with them, because I work with audio/music related stuff. The audio and music industry still uses them; they can produce the same quality of audio as a transistor-based system, and they have a very "unique" kind of touch added to the sound. It's usually described as a warm, super subtle distortion that's very pleasant, and it's impossible to emulate through digital stuff. Even music from your phone going through a vacuum tube desk amplifier will sound very crispy in the most pleasant way there could be. I know it sounds exaggerated, but if you've got good ears and know what you're hearing, you'll see that it's different.
My father was a professor at Cornell University, and I have some memories from the early era of computers. My father acquired for his office in about 1960 a purely mechanical calculator made by Friden that could add, subtract, multiply, divide, and, COMPUTE A SQUARE ROOT. It was about twice the size of an electric typewriter, very noisy in operation, and cost the then-huge sum of $1300. I also remember being taken in about 1958, as a very small child, to see Cornell's first electronic computer, running on banks of vacuum tubes and pretty much filling a former engineering school machine shop.
The memories: In high school four of us tried to build a "computer" with pinball game relays. Loud & slow. We got it to add, subtract, & multiply. We graduated before getting it to divide. Later, as a college instructor, I built a spreadsheet to demonstrate how computers calculated. It still amazes me how computers can do anything with such a limited number of basic tricks. My head is hurting again!
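Those "basic tricks" really are enough: addition can be built out of nothing but AND, OR, and XOR, whether the gates are pinball relays, vacuum tubes, or spreadsheet cells. A minimal Python sketch of my own (not the commenter's actual spreadsheet) of a ripple-carry adder:

```python
# A full adder built from nothing but basic logic operations,
# the same trick relay and vacuum-tube computers used.

def full_adder(a, b, carry_in):
    """Add three bits; return (sum_bit, carry_out)."""
    s = a ^ b ^ carry_in                         # XOR produces the sum bit
    carry_out = (a & b) | (carry_in & (a ^ b))   # carry propagates to the next stage
    return s, carry_out

def add(x, y, bits=8):
    """Ripple-carry addition of two integers, one bit at a time."""
    result, carry = 0, 0
    for i in range(bits):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result

print(add(13, 29))  # → 42
```

Chain enough of these stages together and you have the arithmetic unit of any computer, relay or electronic.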
This makes me think about the people who built calculators and computers in Minecraft using the in-game “electricity” system called Redstone. It started as just making switches that could automatically open doors when you hit a button or stepped on a pressure plate to trigger it, but it eventually grew into more and more complicated electric systems until people eventually built calculators and even computers in the game. I remember seeing a video where someone built a computer in Minecraft that was running Minecraft itself in a scaled down version, on a screen made of Minecraft blocks. Someone even built a computer that was able to connect to the internet and they were able to order a pizza through the game that then was delivered to their house. I’m sure by now people have built huge and even more complex computing systems in the game and I have no idea what their capabilities even are at this point.
@@TheOriginalMacOS I’m aware they can’t connect directly to the internet through in-game stuff alone, but they still had to build the thing in the game to interface with it.
@@TheOriginalMacOS Pretty sure in the video I saw years ago they built a computer in game that used the mod to show the web browser and connect to the internet.
The Z3 was a German electromechanical computer designed by Konrad Zuse in 1938, and completed in 1941. It was the world's first working programmable, fully automatic digital computer. The Z3 was built with 2,600 relays, implementing a 22-bit word length that operated at a clock frequency of about 5-10 Hz.
The smart German guy who created the Z3 was very young, a fresh graduate. He was conscripted and sent to the front line! After a few months, it was realized that he was more useful back home as an engineer than on the front line as a soldier. Fortunately, he did not die during his deployment.
@@scopestacker9787 The Z3 was not designed to be Turing complete; they didn't need it. In 1998 it was discovered that the Z3 was actually Turing complete. As it wasn't designed to be Turing complete, coding branches is unintuitive and complicated.
The title made me think that the light bulb had some relevance in today's world apart from the historical progression thing. Does anyone know where this technology is still current? 😁 Apologies if I missed something, I promise I watched the whole thing.
@@captainoates7236 based on the video, he spent a lot of it showing how the light bulb was one of the milestone inventions along the path toward computers. He didn't show a lot of other ways it's used which is a shame because he could pick any other invention along the chain and do the same thing as this video and have the same basis to call that thing the greatest invention.
One thing I really appreciate about Veritasium is how it explores the history of science; highlighting the people who made these discoveries. It's a great reminder that our modern world didn't just randomly pop into existence one day.
I firmly believe that the best way to truly understand something is to learn its history, because that helps us understand its evolution and the reasons why things are the way they are. And I always love the way you take this approach in most of your videos and the bravery with which you approach and easily explain such complex topics. Thanks again Derek and team!
For the record I have worked in IT for over 30 years and this is the first explanation of how we got from light bulbs to circuits that actually made sense. Showing the model K went a long way to understanding it.
I look forward to the next video in this evolution, because after this comes the transistor. I think it could be argued that the biggest milestones in human history are the mastery of fire, the printing press, the discovery of penicillin, and the invention of the transistor. There are literally billions of transistors used in our everyday life, yet very few are aware of how much they have changed the world.
I would probably add internal combustion and agriculture (I’m not 100% certain here, but I believe it was the advent of crop rotation that first enabled long-term/perpetual human settlement), but yeah, you’re definitely not wrong!
When I was a kid in the early 1970's, our Zenith TV had a LOT of vacuum tubes in it. We even had a monthly (I think) visit from a technician who would check the condition of each of the vacuum tubes and replace the ones that were failing. I was very young and dumb and assumed that he was putting new television programs in the TV for us. I held this belief until I saw "Willy Wonka & the Chocolate Factory" (the one with Gene Wilder in the title role). That movie had a scene that explained how TV programs got to your TV and Wonka's invention of sending chocolate bars via a similar technology.
In those days hardware stores sold replacement tubes, and they would have a "self service tube tester." (I'll pause while you do an image search…) You would yank all the tubes out of your broken radio or TV, bring them into the store, test them, and of course, replace any the machine told you was "bad."
@@raffriff42 I recall a hardware store that had one of these giant testers in the hallway of the mall it was located in up to the early-to-mid 80s. I can't say I ever saw anyone use it back then, of course, as nobody used vacuum tube TVs by then. But the device sat there in disuse for quite a while.
Back in the day before things were made intentionally difficult or impossible to replace or fix. Nowadays, if something goes wrong in your TV even finding a repairman would be difficult. Most people just get a new TV.
I remember the headaches from understanding logic operators when I was a student, circa 2006. This is so beautifully explained and easy to understand that I wish I could have seen this back then.
As someone who programs, the title made absolute sense to me. Anyone who codes knows you almost never know what is actually going wrong when something breaks, so you write code that gives you cues about the point at which it failed; in a more analog design, using lightbulbs as status indicators makes a lot of sense.
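The lightbulb-as-status-indicator idea maps directly onto everyday debugging. A sketch in Python (the pipeline and its stage names are my own invention, purely for illustration), where each log line plays the role of a bulb lighting up as the code passes a checkpoint:

```python
# "Lightbulbs" in software: status cues that show how far the code
# got before something went wrong.

import logging

logging.basicConfig(level=logging.DEBUG, format="%(levelname)s %(message)s")
log = logging.getLogger("pipeline")

def process(records):
    log.debug("stage 1: received %d records", len(records))        # bulb 1 lights
    cleaned = [r.strip() for r in records if r.strip()]
    log.debug("stage 2: %d records after cleaning", len(cleaned))  # bulb 2 lights
    numbers = [int(r) for r in cleaned]                            # a bad record fails here
    log.debug("stage 3: parsed %d numbers", len(numbers))          # bulb 3 lights
    return sum(numbers)

print(process(["1", " 2 ", "", "3"]))  # → 6
```

If the run dies between bulbs 2 and 3, you know the parsing stage is at fault, exactly like reading a panel of indicator lamps on an old machine.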
Colossus was a set of vacuum tube based computers developed by British codebreakers in the years 1943-1945, BEFORE ENIAC. Colossus is regarded as the world's first PROGRAMMABLE, electronic, digital computer; it was programmed by switches and plugs.
That's true, but I note he carefully said "programmable" computer, and Colossus wasn't programmable in the modern, general-purpose sense. However, Colossus would have been worth a mention, simply because it was that endeavour that worked out how to reliably use thousands of valves in a single machine (i.e. never turn it off). I don't know if Eniac benefited from this or whether they had to work it out for themselves. Arguably, demonstrating that electronics could be reliable enough for practical use was as important as being the first electronic computer. Had it been built and been impractical because tubes kept blowing, maybe no one would have bothered to build another. What Colossus was, was fast. When the designs finally became public, sometime in the late 1990s I think, the first thing someone did was write a piece of software for a PC to emulate it. I can remember reading that it ran not significantly faster than a real Colossus, even though a mid-1990s PC had 50 years of computation development in its favour.
The term was around before computers were a thing. It had to do with buzzing interference noises on phone lines which sounded like buzzing insects. Debugging referred to fixing the interference.
This was a cool video. As a computer engineer who designed my own tube guitar amp in college, you basically just did a summary of my education experience. Very rad deep dive into the world of tubes!
I'm really happy that you'll be covering transistors. As soon as you mentioned the "grid" of the vacuum tube, I knew that was coming. This winter semester I had a course on electronics, and something I was wondering is why the "grid" input is called just that. This was really informative, well done!
Haven't watched the vid yet, but they are often literally a mesh/grid. Sometimes they consist just of a loop of wire, but usually it's a screen mesh/grid.
When I was a kid in the early 60's my father was an Air Force officer connected with electronic warfare and computer development. He had shoe boxes full of old, dead vacuum tubes. I loved to play with them; they made great spaceships. Kinda wish I had gotten him to explain to me how they worked. I was only 6 years old, so I guess I can forgive myself for seeing them as only toys. What I REALLY wish I had gotten him to explain was all the papers with programming on them.
@@whatshumor1 If you're referring to the fact that I use capital letters for emphasis instead of italics, it's because TH-cam doesn't enable italics if you enter text directly. Not all of us are willing to go to a third party app or learn some kind of obscure tag in order to make italics. In this case, I did go to a third party app and type with italics, but when I pasted it into TH-cam, the italics were gone. Kudos to you for figuring out how to do italics. That definitely makes you superior to the rest of us.
@@pickleballer1729 Use underlines, (_): _Hello World!_ Do not put characters before/after the symbols, it will break the effect: ._Hello_ Also works in WhatsApp and similar chatting applications. For bold, use (*): *Hello World!* Strikethrough (-): -Hello!-
In the early '70s a lot of the businesses in my town--restaurants, bars, grocery stores--were starting to replace their tube-based in-store music systems with solid-state systems, so they were happy for me to take their old machines off their hands so I could tinker with them. In a lot of cases, the tube systems had already failed, but since the owners already planned to replace them, they didn't bother to fix them. I took them back to my "workshop" (the shed in my parents' backyard) and worked on them myself. I made uncounted bike trips to the local drug store, which had a tube tester in the back at the end of a row of bins of different tubes, to get replacement parts. But I never knew until now exactly *how* the triode tube worked. Thanks for the great video!
While I was a kid, my dad had an old FM radio and a separate audio amplifier that used vacuum tubes. Today I learned how they worked. Simply amazing! I hope you do the follow up video on how this works in silicon.
Great video Derek! I love this story and you did a great job telling it! It's worth mentioning that Claude Shannon's 1937 thesis was originally to be about using Boolean algebra to simplify existing switching circuits, but by accident he discovered that these same circuits could be used to solve Boolean equations. Thus it all began.
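Shannon's observation is easy to demonstrate: switches wired in series behave like Boolean AND, switches in parallel like OR, so the laws of Boolean algebra become statements about circuits. A small Python sketch of my own (not from Shannon's thesis) checking the absorption law over every input:

```python
# Shannon's insight in miniature: switching circuits obey Boolean algebra.
# Two switches in series pass current like AND; in parallel, like OR.

def series(s1, s2):
    return s1 and s2   # current flows only if both switches are closed

def parallel(s1, s2):
    return s1 or s2    # current flows if either switch is closed

# Boolean algebra lets you simplify circuits: by the absorption law,
# A OR (A AND B) == A, so the series pair wired in parallel with A
# is redundant and can be removed from the circuit entirely.
for a in (False, True):
    for b in (False, True):
        assert parallel(a, series(a, b)) == a

print("absorption law holds for all inputs")
```

This is exactly the simplification direction Shannon started with, before noticing the correspondence runs both ways.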
always found early breadboards extremely fascinating. as someone who repairs electrical devices daily, and is used to circuit boards, i really appreciate all the work that went into old circuitry with vacuum tubes and electromechanical engineering. the best days are the days i get an old appliance from before i was even thought of.
The first electronic programmable computer was COLOSSUS, built by the British code breakers at Bletchley Park in 1943 to decipher the German Lorenz cypher. However, as the project was classified, the existence of COLOSSUS was kept secret until the 1970s.
Although Colossus was the first programmable electronic machine that was called a "computer" at the time, I don't think it was a computer in the sense that we use the word "computer" today, since it wasn't general purpose, and wasn't a stored-program machine. That said I don't think ENIAC was a computer in the sense we use the word today either, as although it was general purpose, it wasn't a stored-program machine either. I think the first computer, in the modern sense of the word, was the Small-Scale Experimental Machine (SSEM), also known as the Manchester Baby.
I guess it depends on which history books you read. Furthermore, the existence of Colossus was kept secret until the mid 1970s and it wasn't really public knowledge until the early 1980s. Therefore, the history books would not have known to mention Colossus when considering ENIAC as the first computer.
And again nobody thinks about the Zuse Z3, which could also arguably be the first computer. Yes, it wasn't natively Turing complete, but it could be with tricks. And it was a lot more programmable (you didn't need to rewire it for every program). Many Germans would say the Z3 was the first computer and not the ENIAC.
Thank you. Americans are always trying to rewrite history and say they invented the computer. Colossus was built by Tommy Flowers's team at Bletchley Park, where Alan Turing of Turing test fame also worked; it was built to crack the Lorenz cipher, while Turing's Bombe cracked Enigma, which it did amazingly well, another thing Americans like to claim they did. They only got away with it because it was classified.
I have an Associate degree in Electronics Technology and a Bachelor of Science in Industrial Engineering, and this is the video that finally made me understand how vacuum tubes work...
That was pretty cool to see UsagiElectric featured in this. His videos are pretty awesome and I am glad I discovered his channel awhile back. He is brilliant and has a lot of fascinating videos. Glad he was getting some well deserved praise and attention from this video.
Thank you! That was fascinating. As one who grew up in the 1960s & '70s, and who was an electronics hobbyist at that time (now an engineer), I played with a LOT of vacuum tubes. My first introduction to transistors in school - specifically field effect transistors - was through learning about triode vacuum tubes, as they translate very closely. Again, thank you. This brought back many neat memories.
hey, I'm 15 and considering becoming an engineer. I'm pretty good at math and physics, but I'm worried that being an engineer might be dull. What do you think?
@@thesnowman2509 "Dull" is not a word I would ever apply to electronics engineering. Fun, exciting, lucrative, challenging, ever-changing, and fun (yes, fun x2!) are far more applicable. If you have a passion for electronic circuit design now like I did when I was in my teens some 50 years ago - and I STILL DO - then every day of your engineering career will present new and intriguing challenges. At its root, engineering is just problem-solving using math, physics and electronics theory, which is really just a specialized area of physics. If you enjoy figuring out how things work, an engineering career might be for you. Because of engineers we've seen computers go from 27 tons filling a space the size of a medium-sized home to something that's 24,000 times faster and incomprehensibly more versatile that fits in our pocket...all in just about 75 years. With quantum computers just now emerging, the careers of new and future engineers are guaranteed to be a breathtakingly exciting ride!
@@roberttrautman2747 yeah, engineer sounds super cool, but I'm worried because: A. I don't want to end up in an office job like some engineers do. B. It's so fascinating but it's so complicated, and I don't see how I could ever retain that much knowledge in my head haha. I perform really well in school, but when seeing the work some engineers do it's kind of overwhelming. Did you ever have a point where you thought you wouldn't be able to continue? (Sorry to bug you)
@@thesnowman2509 Not a problem. I'm pleased to offer advice to those potentially interested in becoming engineers. I'm a frequent mentor at work. To address your concern about "ending up in an office job"...I'm not sure what you'd expected for the work environment of electronics engineers, but we (at least most of us) don't dwell in secret cavernous lairs. ::Evil Laugh:: Truthfully, I don't know why an "office job" would be considered a bad thing. It's warm and comfortable, and in most situations you'll have a semi-enclosed office, a private office, or a cubicle. Despite how the media portrays cubicles, there's certainly a large degree of privacy while supporting creative collaboration with those around you. It also depends on what you've been hired to do. If you're designing computer-like electronic equipment you'll very likely be in an office most of the time - although since COVID many companies now allow their engineers to either work exclusively from home, or in a hybrid home/office situation if they're not 100% in-plant. But, if your position is a field representative or you're working on high-voltage power substations or something, then you'll likely spend a fair amount of time on-location outside of your office. In reference to your concern about being able to fit so much complicated knowledge in your head, I won't pretend that there isn't a lot there. But, it's infused over the course of several years while you're at school, and even further once you're working under more experienced engineers, so it becomes MUCH easier to store away without becoming totally overwhelming. Also, while some circuits might at first look completely incomprehensible, once you start working with them (after you've learned the basics from school), they become almost like child's play.
That said, I will fully acknowledge that it's FAR simpler to design a complicated circuit totally from scratch myself than it is to try and figure out what some other engineer was thinking in an existing complicated circuit design - if, for instance, you need to troubleshoot it or change the design in some way. I began designing semiconductor circuits as a natural part of my electronics hobby when I was 13. I worked as an electronics technician from 16 through 21, when I graduated college. I've been working as an electronics engineer for 44 years. In ALL of that time the thought that I wouldn't be able to continue has NEVER EVEN ONCE entered my mind. I'm even more excited about electronics theory now than I ever was, since I now have a lot of experience of successful circuit designs to look back upon and to draw from.
@@roberttrautman2747 thanks for all the advice! It’s really cool that you were making circuit designs at 13, btw. I’m just a beginner, but you’ve helped me narrow what I’d like to do down to just a few things. You explained everything really well and now I’m super interested. Thanks again!🙏 P.S. I also noticed you watch Veritasium too haha, we have that in common :)
this has to be one of the most underrated videos on YT....amazing when you think about it! YT and entirety of modern life inc. social media would not be possible without it!
It's crazy to think that calculators and computers used to fill a room. Now they fit into our pockets and are way more powerful. On top of that, they come fitted with a camera, a flashlight, the ability to make international calls, and access to the world's knowledge without having to seek it in a library or be taught it in school, plus a whole lot more. All easily accessible in a few taps and swipes of a finger, and it hasn't even been 100 years. I can only imagine how fast the world seems to be changing for anyone born in the early 1900s.
When I was a kid I read about the world's fastest computer in the Guinness Book of World Records. The Cray-2. I wanted to have one when I grew up. No idea what I was going to use it for, but, you know, computers were exciting, right? Welp, IIRC the iPad 2 has about the same processing power as a Cray-2 supercomputer, so ... *lots* of people got my childhood wish.
@@taiguy53 Just checked the processor in my lil' Chromebook from 2017 that I take with me everywhere. With the throttles wide open it is 10 Cray-2s of CPU and 60 Cray-2s of integrated GPU, at a power drain of 6 watts.
@@taiguy53 Not exactly true. Quantum computing has fairly specific use cases, and limited number of real world applications. They also require incredibly low temperatures to work. It's therefore unlikely to ever replace conventional computers. You won't be playing games, typing emails or browsing the Web on a quantum computer.
13:17 was a nostalgia moment for me. I started my career in the early 90s, replacing old telephone switching systems in telephone company central offices. That constant clicking of the relays was what we were replacing. It is amazing to me that we were still using that technology so many decades later, and even more amazing to me how much it has changed in the 30 years since.
No mention of Konrad Zuse? He even built relay computers with floating-point numbers. See the Z2 from 1940 and the Z3, built 1938-1941. He made the Z1 mechanical computer in 1936-1938.
I noticed that you properly called ENIAC the first "general purpose" computer. Something outside the scope of this video is that the British government was using a "special purpose" computer to decrypt German radio communications as fast as or faster than the Enigma machine at the receiving end, prior to ENIAC. (A side note: the Poles had already broken that code by hand.) Excellent work. And don't forget that nearly every home in the first world still has at least one vacuum tube in occasional use: a magnetron in the microwave oven.
I’ve seen a reconstruction of the UK machine, which was known as Colossus, at the National Museum of Computing (TNMOC) at Bletchley Park. Astonishing stuff!
The first version of ENIAC was not a general purpose computer. It was specifically built to compute ballistic tables. In 1946, John von Neumann proposed a modification of ENIAC that repurposed the memory for storing tables of values for various functions (like the trigonometric functions) to store a small program. This modification of ENIAC (completed in 1947 iirc) was an intermediate step toward a proper stored-program computer, the EDVAC (completed in 1949). Another interesting thing about the second version of ENIAC is that it had the first Turing complete instruction set (also proposed by von Neumann).
Colossus was made to decrypt Lorenz ciphers, not Enigma. They used an electro-mechanical machine called a Bombe to decrypt Enigma. The Poles had not broken Enigma; they had captured an intact machine, which they eventually gave to Britain (after the Brits and the Yanks had initially turned it down, I believe). Colossus was the world's first electronic programmable computer, but that fact was kept very secret until the mid-1970s, which is why it was widely believed that ENIAC was the first.
@@Algernon1970 Thanks for this. I couldn't remember the name, and am a bit annoyed that it wasn't mentioned in the vid. Then again, not surprised; this channel has become the early 2000s version of Discovery Channel. Some interesting content, but rarely much below the surface.
@@Algernon1970 One minor point. The Poles did crack Enigma. They had captured (thanks to French intelligence) and reverse-engineered the first commercial version of Enigma, were able to produce replicas of it, and in doing so had learned how to crack the Enigma code of those first machines mathematically. The Germans then later added more security features to subsequent versions, along with better information-security practices, meaning that cracking Enigma by hand became an ever more intensive and impractical task. However, the Poles shared the replicas, plans, and means of cracking the early models with the Brits, significantly reducing the time they needed to get a more industrialised code-breaking operation set up.
You're making us Electrical Engineers so happy with these video topics - I cannot wait for the solid state video, whether it be germanium, BJT, FET, or anything in between.
In my humble opinion, the transistor (hinted at the end of the video) is among the most important inventions in human history. It's up there with the wheel, the steam engine, gunpowder, penicillin and the like. The lightbulb, too.
Personally I think it was one of the most destructive inventions. Humans were, on the whole, happier, healthier, and more connected before the invention of the computer, which led to the internet, which led to social media, which led to the destruction of real connection, localized groups, and culture, and to the creation of Photoshop, fake faces and bodies, and tons of misinformation. All of that together, in turn, led to fatter and unhealthier humans who are less connected, less motivated (due to dopamine overdrive), and far, far more depressed and anxious. Correlation does not equal causation, but given the obvious connections and cause/effect groupings I see, I believe it is the majority cause of each of those three main detriments to humanity and society, along with tons of other detriments. And while it is also the cause of TONS of good, I still think humanity would be better off with a FAR slower progression in computing power, so we had time to adapt, see the negatives, and set up defense mechanisms for everyone, especially the defenseless like kids, who have been ravaged by dopamine overdrive and the many negatives of social media.
@@ClipsCrazy__ I think you're romanticizing the past. Before computers, we had everything from witch hunts to nazism, racial segregation, oppression of women, slavery, torture and public executions. I think the present, even with all its flaws, is the best time for humanity overall. Computers have allowed countless advances in fields like medicine, and brought the world closer together with essentially free, instant global communications. It's not perfect (we're human, after all) but we're doing pretty well, compared to centuries past. The fact that your biggest worry seems to be social media rather supports my point.
@@ironcito1101 na I’m just referencing data. All that horrible stuff “before phones and computers” actually proves my point even more. Humans are more depressed now than EVER before. More depressed even though things are “better” than they’ve ever been in endless different ways.
@@ironcito1101 social media is definitely the worst thing to come from this advancement so far, but again, we're still hurtling ahead at record speed, and AI is the next thing on the horizon. Social media causing political polarization and/or deepfakes impersonating a political party/leader could literally be the start of the next civil/world war, which would create a world that without a doubt would have been better off without computers. But I'd argue we're already in a world that would have benefited greatly from throttling computer advancement to 1/4 or 1/10 of its pace 15 years ago.
I would highly suggest that you do a second part on this topic. It is very interesting, and it would explain how we achieved this level of computing power today.
this also shines a light on the lightbulb; it was a key invention enabling the modern world. It'd be great to see a long-form video, say 2 hours, covering key technologies from history, starting from speech/writing/fire via maths, zero, and metalworking, into some modern ones. Maybe making it a series would be better.
I hope you eventually collab with someone like Ben Eater to explain at least some of the basics of how a processor actually works in this "series" of topics. When I was a kid this was the biggest mystery for me and no one could really explain it well (mostly because it is a complicated topic), but now as an adult this is something that I think is not so complicated that child me could not understand, there just was a lack of easy to access and understand material. Ben Eater's videos really helped cement that knowledge and build an intuition for it even after taking college undergrad courses that touched on the subject.
As an EE student, it was amazing watching this video. „Ain‘t that a diode?“ Turns out to be its first concept. „Damn, and that‘s the same principle a transistor uses.“ Turns out to be its predecessor. Since we barely learned about the history behind any theory, it was great to watch this! Top content
Crikey Derek... You basically packed my Basic Electrical Theory, the very first block of my electronics engineering degree, into a 19-minute program... Your analogy for the vacuum tube is better than what my old professor taught me back then.
Nice video! But I would have liked a mention of the Zuse Z3, which was built in 1941 in Germany. In contrast to the design and use of the ENIAC, the Z3's design did not meet the later definition of a Turing-complete computer, and it was never used as such. But in 1998 it was shown that, from a theoretical point of view, it still has this property thanks to the tricky use of some detours. There were also the special-purpose machines Colossus (England, 1943), used for decryption, and the Turing Bombe (England, 1940), which was used to decrypt Enigma.
even the Z1 was years ahead of the Model K or 1, and it was released in 1938, one year after the Model K. The Z3 was, as far as I know, the first of its kind with RAM.
Around the same time, in 1937, Konrad Zuse created the first mechanical computer, aptly called the 'Z1'. It could process instructions, perform arithmetic using floating-point registers, and had input, output, and memory.
This is really dope. As a tube audio amplifier enthusiast, this was a great video to watch; you nailed this one, man. Usually when a channel covers a topic I do understand well, I see all kinds of flaws in how it's covered, and it ruins the channel for me, because you wonder what else they're getting wrong. This was the opposite experience, bravo dude! Also, listening to your videos on a tube amplifier is something special; your audio quality really shines through. It's truly like you're in the room explaining it with me, because tubes better replicate the natural harmonics of the human voice when compared to solid-state amplification.
You showcased that you still need to get your facts checked; I personally recommend you mostly just read Wikipedia instead of watching YouTube videos when it comes to looking for truthful information.
I don’t understand how it would sound better with tubes. The audio still comes as a digital signal from the YouTube video, and it’s still just an electromagnet speaker that produces the sound waves, so how could your sound experience be better than mine, for example?
Just love your videos - you have a knack of distilling the most complex things down with easy-to-understand concepts and articulating ideas with clear, concise language that everyone can understand.
I’m a pro computer nerd and naturally know all the history of computers, but I still feel fascinated every time I am reminded of it. Your video was one of the best ways I have ever learned about it; really a great presentation of a great history and of the most impressive inventions in the history of mankind. ❤
I am wholeheartedly waiting for part 2 ❤ Such a nice video, explaining a lot in just 18 mins. What context and what a way of explaining, and how much one can understand and gain from these 18 mins instead of watching brain-rotting reels. I have been searching for a video like this on this topic for a long time. I could write a book on this, but I don't have the time and no one would care 😊
The British Colossus should be considered the first, or even Konrad Zuse's Z1, which some consider the first computer. ENIAC gets all the glory simply because it was well-publicized and made a big splash in the papers. The British did not declassify their systems until well after ENIAC was a household name. There was another British computer after Colossus that had a programming language, a screen, and a keyboard, but I cannot remember its name, and it was also before ENIAC.
Most Excellent Vid, concise and very informative. Born in 1963 I grew up when vacuum tubes were still in use (but then again, so were silk top hats) and witnessed the transistor age come into being. This vid helped me to understand WHY. Many, many thanks! Well done!
Voicing a pet peeve ... the world's FIRST programmable, Turing-complete computer, the Z3, was created by Konrad Zuse in 1941 ... give the inventor of the modern computer the credit he deserves.
@@TheRenHoek Nah he requested funding from the NSDAP for his computers and initially got denied until they realised they could use it for primitive guided missiles. Bro wasn't drafted at gunpoint, he happily took their money and helped build weapons.
It's a bit of a stretch to describe the Z3 as Turing complete. It had no conditional branching, so could only be considered Turing complete if it propagated all branch results. I'm not sure how one then would choose the "correct" result of a program.
@Soken50: Alan Turing's work was used to kill more enemies efficiently, ENIAC was essential to complete the work at Los Alamos, and hence, the mass murder of Hiroshima and Nagasaki ... no one walks away from war unsullied. This is not about cheering on Nazis, but to acknowledge the intellectual achievement that Z3 embodies. Failing to do so, in my opinion, is to elevate fairy-tales above reality, propaganda above history, whitewashing lies above considered debate, ignorance above insight ... to me, that's the road to mindless bigotry, indifferent 'othering', and the self-righteous arrogance that birthed so many horrors our history is splattered with. We can do better than that, we deserve to do better than that, we ought to do better than that ... because we can BE better than that. @A Barratt: True, Z3 didn't have conditional branching, and Zuse didn't know anything about Turing-completeness when he built Z3, since he didn't know of Turing's work, and it wasn't until 1998 that Z3 was shown to have been Turing-complete. In a way that makes it even more remarkable THAT Z3 actually WAS Turing-complete. Again, credit where credit is due.
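The "propagate all branch results" idea mentioned above can be sketched in code. This is an illustrative toy in Python, not Zuse's actual mechanism: a machine with no conditional jump can still realize "if" behavior by computing both outcomes and selecting one arithmetically.

```python
# Illustrative sketch (hypothetical names, not the Z3's real instruction set):
# emulate a conditional without branching by evaluating BOTH outcomes and
# letting arithmetic pick the survivor.

def branchless_if(condition: bool, then_value: float, else_value: float) -> float:
    c = 1.0 if condition else 0.0   # condition encoded as a number
    # Both "branches" have already been computed by the caller;
    # multiplication and addition select between them.
    return c * then_value + (1.0 - c) * else_value

def branchless_abs(x: float) -> float:
    # abs(x) with the selection done purely by arithmetic.
    return branchless_if(x < 0, -x, x)
```

The cost, as the comment notes, is that every branch's work is always performed, which is exactly why calling such a machine Turing complete is a theoretical rather than practical claim.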
If you are not familiar with computing history, it's important to also see the role of women. Far too often these days, women are a minority in the field of programming. On the ENIAC, the hardware design was done by men, while the programming was done by women. And Rear Admiral Grace Hopper was the one who came up with a lot of concepts and created a lot of firsts.
Coder, not software engineer... Because if you haven't at least briefly read or heard about ENIAC, then I guess you also don't know anything about the work of Blaise Pascal, Charles Babbage, Konrad Zuse, John von Neumann, or John Nash. You probably just vaguely recognize names like George Boole or Alan Turing, but do not actually know more about their work either. So, no, you are not a true software engineer...
I'm disappointed that you made no mention of Colossus. It may not have been a programmable computer, but it was an electronic logic machine that used thousands of vacuum tubes to statistically analyse encrypted teleprinter messages at a very high speed. It came into service a whole year before Eniac and it made a very significant contribution to the success of the invasion of occupied France in June 1944.
When I learnt electronics, vacuum tubes/valves were still widespread, so we learnt to fix both solid-state and tube devices. The first thing I learnt was that it was rarely a tube that was at fault; often it was a resistor or a paper capacitor. But since the customers complained about how a small part could be so expensive, we also replaced a few tubes, even if they were fine, so they wouldn't complain too much. The customers didn't pay me for parts (unless it was a crazy expensive one); they paid for the knowledge I had accumulated over the years. It takes time to become a good electronics technician, and that's why I charged a certain price. I had the chance to learn both solid state and tubes, so I could fix both.
I found it fascinating back in the early 2000s when I worked on C-130 aircraft that still utilized vacuum tubes in the compass amplifiers for the navigation system. Your explanation of how they amplify signals, like earth's magnetic field, helped close the gap on my understanding of that navigation system. Those airplanes used a device called a Magnetic Azimuth Detector in the wing tips or in the top of the vertical stabilizer to sense magnetic heading and transmitted a very low voltage signal to the navigation computers for heading reference. Before the signal could be utilized effectively, though, it had to be amplified. Enter the humble light bulb 💡
The first production C-130A came off the production line in 1955. I know they have had several upgrades to the airframe and systems over the years, but it still amazes me that a plane designed and built in the 1950s is still in service today. The B-52 is another plane still in service that was first built in the 1950s. I worked in PMEL, which was renamed TMDE, when I was in the U.S.A.F., and I was shocked by how many pieces of test equipment we had in Germany that were built in the 1950s and 1960s and used vacuum tubes, with all the wires silver-soldered to ceramic bus strips for connection points. I was over there from 1982 to 1984. I was surprised when the government decided to make all the military calibration labs totally civilian contractor jobs in the 1990s.
Field effect transistors are the solid state analogue of the thermionic valve. I wonder if, for very high temperature applications in the vacuum of space, very tiny valves, without envelopes, might make a comeback.
I visited a DEW Line radar base just after they had switched from vacuum tube computers to new computers for the time (1980). They took us (Air Cadets) into a gymnasium-sized room filled with tightly packed rows of vacuum tubes reaching to the ceiling. Afterwards they took us into another room, showed us a small refrigerator-sized box, and told us that it did the same job as the gymnasium-sized computer. It was an amazing vision of the future of computer miniaturization.
Ahh, they should teach this in high school. I spent nearly 40 years doing CMOS design, but in my undergrad program the basic operation of vacuum tubes was not covered. I guess in the 80s they were considered obsolete.
Excluding the history, this was basically the first month of my intro to electrical engineering class. Not sure how much this would be needed for general education
How did the software used for CMOS design transition during your career? Nowadays mainly Cadence EDA, and then Synopsys Custom Compiler, Tanner EDA, etc., are used industrially. Back then, what was used?
@@Songfugel the history illustrates the law of unintended consequences. The greatest advance in human history began with a simulated candle flame that utilized electricity. People back then could not imagine smartphones or flat-screen TVs, or almost any of the things that these rudimentary devices would lead to. It makes me wonder what the next great advance will be, what accidents will lead up to it, and how long until it happens. There are no answers to these questions, of course, until it happens. But this history is real, and we are all living the results of it, every day and in every way.
My father had an electronics shop back in the '70s, and I would assist him in going to houses and doing in-home television repairs. We had a tube tester that let you put in the tube number, and it would tell you if the tube was good or bad. We also had a television picture tube rejuvenator, which heated up the picture tube filaments, burning off any buildup. It always amazed me how much heat came off these things. You did an awesome job of presenting this, by the way!
In addition, you should look at the work done by Alan M. Turing and Tommy H. Flowers (at the British Post Office research facility at Dollis Hill in London) during World War 2 to decipher German encrypted radio signals using the Colossus computers, which predate ENIAC. These machines used 1,000-2,000 thermionic valves (or vacuum tubes) and are recognised as the first semi-programmable computers. A rebuilt Colossus is on display at the National Museum of Computing at Bletchley Park in England, where it is operational. Wartime security meant that Dr. Turing and Dr. Flowers were not recognised for their contribution to computing until the 1980s, and even today some of their work remains classified.
My father worked for the California Department of Water Resources. When I was a kid, they were still operating a UNIVAC computer. It took up a large room which had to be independently air conditioned. It could only do one function at a time; in order to change functions, the programmers would have to come in and reprogram the thing. The funny thing was that their desktop calculators had many times the computing power of the "computer".
The difference with lightbulbs is that humans weren't afraid of them; we immediately embraced them. But with AI... with AI, everyone is sh***** their pants!
@@fagelhd we also control AI. Anyone who believes the crap that AI will go sentient needs to learn the basic fact that AI does what it is told to do. If it causes harm, it is because of the person who worked on it, or because someone managed to get access to it with malicious intent.
@@notjeff7833 The difference is that you can't use lightbulbs to flood the world with vast quantities of propaganda and misinformation and faked photorealistic images. Lightbulbs were never easy-to-use limitless garbage content factories that posed an existential threat to democracy and an accurately informed public.
There are several great YouTube channels, but Veritasium always, and I mean ALWAYS, delivers! It is amazing how he can make videos on such different subjects that are always on point! Never dull, never stretched out, never compromising. A real gem, this channel.
Absolutely incredible the way you managed to condense such a complex evolution into a video less than 20 minutes long. Thank you so much; you contribute beautifully to enriching our knowledge with each one of your contributions.
Yes, that's true. I wish there were a translation of this video into French and German so I could show it to my family. I have been an electronics technician for 45 years, and when I try to explain this to them, it's very rare that they understand.
That was excellent. While working on my 2 year Electrical and Computer Eng. degree, I had an Instructor that was really fond of vacuum tubes and I learned a lot from him. The origins of modern computing are fascinating.
Someone else said it’s because when both switches are flipped, no voltage difference exists for the solenoid to drive a current through it. I guess voltage difference is, if not *the* core of how binary machine logic works, then one of its cores.
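That two-switch behavior can be modeled in a few lines. This is a toy sketch in Python (the names and the 5 V level are illustrative, not from the video): the solenoid only energizes when the two switch outputs differ, which is exactly an exclusive-OR (XOR).

```python
# Toy model of the two-switch circuit discussed above: current flows
# through the solenoid coil only when there is a potential difference
# between its two ends, i.e. when exactly one switch is flipped.

def solenoid_energized(switch_a: bool, switch_b: bool) -> bool:
    # Each switch drives its side of the coil to high (5 V) or low (0 V).
    voltage_a = 5.0 if switch_a else 0.0
    voltage_b = 5.0 if switch_b else 0.0
    # A voltage difference means current flows and the solenoid pulls in.
    return voltage_a != voltage_b

# Print the full truth table: it matches XOR.
for a in (False, True):
    for b in (False, True):
        print(a, b, solenoid_energized(a, b))
```

With both switches up (or both down) the two ends of the coil sit at the same potential, so no current flows, which is the case the comment describes.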
Studying the history of a discipline is a must for people studying that discipline, because it puts them inside the heads of its pioneers, which in turn makes the learning materials and subjects easier to grasp. Understanding the way people thought about a subject to come up with amazing ideas makes learning those ideas more alive and more appreciated by anyone studying that subject. Whether it's the history of mathematics or of computer science, etc., students will always benefit from it.
New developments commonly come from thinking about things in novel ways. "Getting in the minds of the people" who built the fundamentals doesn't really do that. Knowing some of the building blocks is crucial, but recreating the wheel from the beginning doesn't seem like a great idea.
@@gw6667 >recreating the wheel from the beginning seems like not a great idea Do you think the speed at which the wheel is being recreated is the same? How do you think we can avoid recreating the wheel without studying the work of previous people? How can we effectively study what other came up with if we can't understand the circumstances and knowledge nuances that lead to those new ideas?
@@gw6667 In software development we are indeed taught from the proverbial invention of the wheel. You aren't considered competent until after you either learn of or make your very own compiler and also learn at least basic OS functionality. 99% of people won't even touch a compiler or the code of an OS, yet it's fundamental. Because it's the basis of everything we do.
ENIAC wasn't first with being programmable - Colossus, co-developed by Alan Turing, predated it by years and did a lot of heavy lifting for the Bletchley Park codebreakers during the war.
Colossus was not a general-purpose, Turing-complete, stored-program computer. ENIAC was. However, 10 Colossi clustered together could in theory be made Turing complete, according to Benjamin Wells at the University of San Francisco. Turing didn't really "co-develop" it; it was Tommy Flowers with Sidney Broadhurst and William Chandler. Colossus was pretty much an application-specific machine. Outside of the world of cryptography, its enduring contribution was to show that valve electronics could be reliable enough at that scale to be useful. Had it been a total failure in that regard, possibly no one else would have tried again until the transistor became a viable component. Arguably, those early days of valve computers (fixed-program or general-purpose) gave computer science a 10-year head start.
To be fair, the first electronic digital computer was the ABC (Atanasoff-Berry Computer), which was built 3 years before ENIAC, but history sometimes forgets this.
Light bulbs were such a good idea, they became the symbol for good ideas
🤣🤣🤣 that's cool
Galaxy brain comment
here before 10k likes
💡
Copied comment
I've lived my whole life hearing about vacuum tubes and never really knowing how they work. This was an amazing presentation connecting lightbulbs to transistors. I'm stunned.
Everyone understands mechanical computers, then school skips vacuum tubes because we don't use them anymore and jumps straight to digital circuits. Honestly, if I'd had this video, I would probably have gotten digital circuits.
@@simonhenry7867 Agreed. I knew vacuum tubes were the predecessor of the transistor and functioned very similarly, but never knew how they work.
If you've ever gotten into guitar amps, you'll still hear people say that tube amps sound warmer. Still plenty of people using tubes. Tubes are still often the best way to amplify very high wattage radio signals.
I think if you really want to understand something well, just start from its origin.. go to its history.
@@abhishekkushwaha3462 Truly outstanding point! Personally found this method of finding out 'how or why was this thing invented in the first place?' really great approach to learning many new topics!
As a Computer Engineer, I would like to thank you for illuminating the origins of my profession. This was an exceptional, historical documentary.
You’re welcome
Eyyyy illuminating! I get the joke! HAHAHA
What would also be illuminating is if you explain the quantum physics of transistors and maybe even laser keyboards!
I personally think a better origin of computers and automation can be found in looms, especially Jacquard looms. And I'd also advise you not to take Veritasium seriously; I mean, not even as good entertainment.
@@frostfamily5321 It doesn't take quantum physics to explain the operation of a transistor. Someone well-versed in quantum physics might have something to add to the conversation, but the operation is currently well understood.
I designed and built my first computer with vacuum tubes in 1957. Being a ham radio operator, I knew a little about electronics. Also, I was a lazy math and physics major. There were 2 computers in town. One belonged to the TVA; as it was part of the government, I was refused access. The other belonged to the largest bank in town. It took up the entire 3rd floor of one of their buildings downtown. They explained how it worked. Several friends contributed tubes. Large and not exactly cost-efficient, it did less than my slide rule. It did give me a bit of a leg up years later in getting a job with an airline as an assembly-language core programmer on IBM 360s.
Sir, it would be an honor to hear your perspective on the history of computing. Do you have plans to create videos on the subject?
Thank you for the kind words. I haven't considered such an effort, as the major part of my work was steered into international relations and international economics. I am researching and writing two books on those subjects, aimed at college students, their professors, and people involved with multinational corporations. I am close enough to my 85th birthday that I can see my wife counting the candles. I am just hoping to pass along my education and experience before I pass on.
@@crawfordharris4795 thanks for your response, eagerly waiting for your books
Why are you on YouTube? You’re like 490 years old
@@crawfordharris4795 You designed and built one of the first computers at just 18 years old? That's remarkable
My mind is constantly blown how far humans have come in the last 100 years.
Edit: Great to see awesome comments here. The goal is to become a peaceful species to explore the cosmos. Let's overcome the great filter!
Same 😅
Thats the power of communication
Don't read my name!...
@@Dont_Read_My_Picture 100 years of progress and we end up with *this*
It's comforting to know that for 99% of the problems humanity faces today, an amelioration or even a straight fix is due in the next century. Really makes you optimistic for the future.
Do not quote me on that number ;)
Seeing the progress of computers laid out in a timeline is one of the most fascinating things to me. I've probably seen/ read the story about a dozen times and it's still interesting
same!! i'm in my late 30s here, and the first time i read about it was in David Macaulay's 'The Way Things Work', a book i got as a christmas gift when i was probably 8 or 9. i found the description of this early computer extremely fascinating.
The computer saga
I can barely understand the logic behind it all and *_I still_* find it interesting -until thinking about it gets too hard and something else grabs my attention.-
Its even cooler trying to replicate it, like Usagi did.
the progress just keeps going... the transistor and then the programming of the computers is probably our greatest achievement as a species. we are still in the phase of implementing this invention, we haven't seen "anything" yet. :)
I was a kid when solid-state electronics were replacing vacuum tubes in consumer products.
I remember that radio and TV repair was a widespread cottage industry. The best in that field were able to adapt and stay afloat, until the advent of integrated circuits. Great video 👍👍
I enjoyed taking the back off our television, gathering all the tubes, and going down to the drug store to test them on the tube tester ... even when the tv was working fine.
If you could repair a TV you could work for a big-name company going to hospitals installing, repairing and maintaining high-end medical x-ray. You were a hero and even the doctors worshiped you. We were networking image transfers before they knew what to call it. We could do 3D imaging using analog video. You got the image in a day or two because computers were too slow. CT, computed tomography, was invented by an engineer who worked at the EMI recording studios. Then the companies handed out laptops and said figure it out. Dial-up. The blood ran despite how hard we tried to reprogram those folks. Now the software and every other kind of repair are two disciplines. Benjamin Franklin was your last Renaissance man who could know enough of everything to function in any subject. It's only going to get worse. How long before you ask your computer "What's wrong?" and you have to decide if you can trust it. I'm hanging on by my fingernails.
@@terryscott5568 A lot of fields are requiring specialists to know so much more than ever before to function. Medicine is a field that suffers this from so much advancement. It is a blessing and a curse. To graduate medical school, doctors are required to know magnitudes more than doctors just 30 years ago due to the huge advancements in research. It is absurd really.
When you replace a vacuum tube with another of identical type (same code on label), you can be sure that the properties are the same. This is an advantage of the tubes. When you replace a transistor with another of identical type, the replacement should be first measured, if matching the properties is paramount. Also, a sound amplifier with vacuum tubes does not have that background noise notable in those using transistors, when the amplification is at maximum.
I have to give mad props to your editor/animator(s). They do such a tremendous job distilling your scripts into visual language even though we all know none of this is actually classical mechanics at its roots. The classicality of it is emergent and the art style helps with that even though it is not explicitly said.
OK tHERE Mr critique
@Repent and believe in Jesus Christ Bad bot...
@Repent and believe in Jesus Christ goku solos
Speaking of, let me know if that water drop bit IS in fact Morse code, or if I'm losing my mind.
5
Dude, I’ve watched so many of your videos, and you are one of my absolute favorite channels on TH-cam. Your team does such an amazing job between research, writing, producing, editing, etc… Veritasium makes GREAT content! Please keep doing what you’re doing! Thanks!
Nice
Nice
Nice
Nice
Nice
Mad props to Veritasium for explaining such a complex subject in such a simplified manner. Brilliant!
Every. Single. Time.
You’re welcome
@@justingolden21 Oh hey! Its moist critical
.org!
Ikr
As a 3rd-year Electrical and Electronics Engineering student, I can say that this is by far the best video for finally making me understand all the theoretical concepts we covered in our lessons. You are a true genius.
how many replies!!!!!!!!!
I have never seen the development of computers explained this fundamentally before. Thank you.
Then you must have been born yesterday or missed a lot😂
@@bzuidgeest well sir not everyone is a nerd like us.
@@zefellowbud5970 come on, basic schoolbooks provide the same explanation. Maybe American schoolbooks are suffering from all the book banning. Turing was gay, so maybe he is forbidden as too woke 🤣
@bzuidgeest I think you hit the nail on the head. The American public education system keeps going further and further downhill. I’m a proud American, but even I know that our country is doomed if something doesn’t change soon.
@@bzuidgeest If you didn't know American school systems were bad, you must've been born yesterday 😂😂
As a guy who majored in computer science, I gotta say this is one of the coolest videos I've seen in the TH-cam science community in a while. I never made the connection between lightbulbs and the invention of vacuum tube based machines. Thank you Derek for putting together this amazing narrative for the fundamental turning point of electronic computer history!
A little odd that they didn't teach you this in computer science
@@thespacejedi eh, I think they just teach it to electrical engineers probably. They probably want more Software Engineers than Computer Scientists, so things like understanding the nitty gritty gets tossed aside (or I just haven't taken the relevant course. There is a course called digital circuits that I think I'm supposed to take.)
@@kintamas4425 Well, I _did_ take Digital Circuits (back in 1987), and we didn't learn about vacuum tubes (though we learned about transistors).
@@jpisello oh transistors are covered? That's good. I've been trying to learn about them/read up on them, and it's been slow going. The only thing I know so far is that there are numerous kinds of transistors. So far, my understanding is that to turn off a transistor, the middle portion gets a counterbalancing voltage so that no voltage difference exists for a current to run across the transistor. Is this the case? That would mean keeping a transistor off would actually cost energy.
Do transistors have a capacitor as the key to whether they're switched on or off? To switch them off, the part behind the dielectric (of air or maybe silicon) would be set to whatever charge makes no voltage difference exist for a current to flow across. And when they want it to turn on, they make a difference appear by changing the charge.
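For what it's worth, a common first-order way to think about this (a standard textbook idealization of an n-channel MOSFET, not the video's explanation and not a description of any specific device) is that the gate really does act like a tiny capacitor plate: below a threshold voltage the channel is off and essentially no steady current flows, so *holding* it off costs almost no energy. A purely illustrative Python sketch of that square-law model:

```python
# First-order (square-law) model of an n-channel MOSFET used as a switch.
# Textbook idealization for illustration only: below the threshold
# voltage V_th the channel is off (cutoff, no steady current); above it,
# current flows in either the triode or the saturation region.

def mosfet_drain_current(v_gs, v_ds, v_th=0.7, k=0.5e-3):
    """Return the idealized drain current (amps) for gate and drain voltages."""
    if v_gs <= v_th:
        return 0.0                       # cutoff: the switch is open
    v_ov = v_gs - v_th                   # overdrive voltage
    if v_ds < v_ov:                      # triode (linear) region
        return k * (v_ov * v_ds - v_ds ** 2 / 2)
    return 0.5 * k * v_ov ** 2           # saturation region

print(mosfet_drain_current(0.0, 1.0))      # 0.0 (off: gate below threshold)
print(mosfet_drain_current(2.0, 1.0) > 0)  # True (on: current flows)
```

In this idealization the "off" state dissipates nothing; real chips do leak a little, which is where much of a modern CPU's idle power goes.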
I felt exacly the same way
As someone who works for a commercial and industrial lighting agency, I love this. Such a great history lesson. This is the kind of Veritasium video I love to see!
without a doubt
@@JokeswithMitochondria funny username lol
@@JokeswithMitochondria ur username actually made me click on ur profile. Love ur content hahaha. Funny stuff
wrr
🥰 Thanks!!! Finally I understand how electronics work... such a simple, yet brilliant animation and presentation. This is my favorite education channel!!!
Nice
I was born in 1968. My mother was a Comptometer operator ( a mechanical adding machine), and my father was mad about electronics. I grew up surrounded by vacuum tubes, but I don't think I really understood them until watching this video! Thank you for your amazing content.
Your mother was a mechanical adding machine ?
@unnamedchannel1237 I suppose that is a _possible_ interpretation of my statement. Another slightly more _plausible_ interpretation is that she was the *operator* of a mechanical adding machine.
I was also born in 1968. Specifically November 17. My dad first bought our (my) first computer on my 13th birthday. AI will be for STEM geeks in 2023 what BASIC was to computer geeks in 1981.
My mom's best friend was also a Comptometer operator for Bacardi. That helped them get hotel rooms in Puerto Rico when others were turned away!
@@unnamedchannel1237 Yeah. My mother was in fact an actual mechanical adding machine, always busy with spending and budgets or something. My father called her "cranky" sometimes.
Hi Derek, I'm a semiconductor Electrical Engineer. I also look forward to your silicon video(s). I have often imagined how to animate a Bipolar Junction Transistor (BJT). The electron would be a circle of one color, the hole a different color. During recombination, the colors disappear into the background. I'm sure you will explain what took me some time to learn. The reason the charge carriers get through the base into the collector is diffusion! The importance of emitter injection efficiency might be out of scope. Another in the series might show how the photo-voltaic diode works, the light making electron-hole pairs (EHP)s, and that makes power how?
I remember my old professor teaching me the differences in semiconductor transistors... I never fell more in love with the MOSFET and its design although delicate in nature.
What’s your major?
Setting aside your grammar (unless English is not your first language?), I agree: I, too, am enthusiastically looking forward to the follow-up video that, hopefully, will feature Derek explaining solid-state electronics in a similar manner!
i only read till the first 5 words
I too would like to see a similar video on transistors. I have never quite understood how the small current at the junction can effectively amplify the current passing through the transistor. Really enjoyed this video. I went into electronics in 1974 via the United States Air Force. That was about the beginning of the end of vacuum tube usage. Still, in some cases it was more cost-effective to maintain old equipment and perform preventive maintenance on it than to buy new. I had a piece of equipment made in 1940 that was still there when I left in 1982 😊
The transition from vacuum tubes to transistors necessitated a career change for some TV repairmen because some could not quite figure out how to troubleshoot transistors. I was always proud of my uncle Dave. He grew up during the Depression, and with a high school education taught himself enough electronics to start a radio and TV repair shop on the side. He kept on going even with the change. I talked to him about transistors. He said it was a bit difficult at first, but he was determined to understand them. He was one of the people other repairmen in the area called when they needed help with a particularly difficult problem.
As an electronics student I knew what vacuum tubes are, but finding out the history behind them was super interesting.
I think it's interesting to realize that Tesla's invention of the radio would end up relying on an invention of Edison's, the lightbulb, and on a discovery Edison only made because of his refusal to use Tesla's AC. That discovery led first to a device to convert AC to DC, lol, and then to another device to amplify radio transmissions, which was then used to receive and play transmissions on a radio's speaker.
@@jaredf6205 If those two had gotten along, we might not have gotten as far. Ironically, the competition of one-upping each other's inventions was the driving force for advancement.
Like most things competition is good for advancement.
We found the way to connect the seemingly irrelevant pieces of the puzzle.
I came here to say this. I knew about vacuum tubes and I knew they were rudimentary BJTs. But it was awesome learning the history and the details.
i know about them and have even "messed around" with them, because i work with audio/music related stuff. The audio and music industry still uses them; they can produce the same quality of audio as a transistor-based system, and they add a very "unique" kind of touch to the sound. It's usually described as a warm, super subtle distortion that's very pleasant, and it's impossible to emulate through digital stuff. Even music from your phone going through a vacuum tube desk amplifier will sound very crispy in the most pleasant way there could be.
i know it sounds exaggerated, but if you've got good ears and know what you're hearing, you'll see that it's different.
Can it run doom ?
Every Comp can run Doom and Linux
it even can run you
0.0018 fps
Awesome question 😅
According to my calculations and observations you would need 19,120,000 of those computers shown in this scene 15:29 to run doom
My father was a professor at Cornell University, and I have some memories from the early era of computers. My father acquired for his office in about 1960 a purely mechanical calculator made by Friden that could add, subtract, multiply, divide, and, COMPUTE A SQUARE ROOT. It was about twice the size of an electric typewriter, very noisy in operation, and cost the then-huge sum of $1300. I also remember being taken in about 1958, as a very small child, to see Cornell's first electronic computer, running on banks of vacuum tubes and pretty much filling a former engineering school machine shop.
ur so lucky
U still have that calculator?
@@miyamoto900 Heavens no. To start with, it was always property of the university.
@@Pamudder can we steal it ? What would it take ? Please make a plan and inform me at earliest.
Yours truly,
Miyamoto.
@@miyamoto900 I have a tesla model 3. I'll drive getaway, deal?
The memories: In high school 4 of us tried to build a "computer" with pinball game relays. Loud & slow. We got it to add, subtract, & multiply. We graduated before getting it to divide. Later as a college instructor, I built a spreadsheet to demonstrate how computers calculated. It still amazes me how computers can do anything with such a limited number of basic tricks. My head is hurting again!
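That "limited number of basic tricks" really is all it takes. As a purely illustrative sketch (not the spreadsheet or the relay machine described above), here is addition built out of nothing but AND, OR, and XOR in Python, the same operations a relay or a vacuum tube can implement:

```python
# Sketch: ripple-carry addition built only from boolean operations,
# the same basic trick relay and vacuum-tube computers used.
# Illustrative only.

def full_adder(a, b, carry_in):
    """Add three bits using only AND/OR/XOR, returning (sum_bit, carry_out)."""
    sum_bit = a ^ b ^ carry_in
    carry_out = (a & b) | (carry_in & (a ^ b))
    return sum_bit, carry_out

def add_bits(x, y, width=8):
    """Add two non-negative integers bit by bit, like a chain of relays."""
    carry = 0
    result = 0
    for i in range(width):
        a = (x >> i) & 1
        b = (y >> i) & 1
        s, carry = full_adder(a, b, carry)
        result |= s << i
    return result

print(add_bits(23, 42))  # 65
```

Chain enough of these together (and add a way to store and route the bits) and you have the arithmetic unit of every machine in this video's timeline.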
well you tried and you failed
@@lavishlavon Bro got three out of 4 operations working. I won't assume, but I'm betting more than you could do as a highschooler.
my grandpa called them "confusers"
@@realtechhacks and whose fault is that? 3 out of 4..the guy failed and he failed hard. nothing to go bragging about
@@lavishlavon Lmao what??
This makes me think about the people who built calculators and computers in Minecraft using the in-game “electricity” system called Redstone. It started as just making switches that could automatically open doors when you hit a button or stepped on a pressure plate to trigger it, but it eventually grew into more and more complicated electric systems until people eventually built calculators and even computers in the game. I remember seeing a video where someone built a computer in Minecraft that was running Minecraft itself in a scaled down version, on a screen made of Minecraft blocks. Someone even built a computer that was able to connect to the internet and they were able to order a pizza through the game that then was delivered to their house. I’m sure by now people have built huge and even more complex computing systems in the game and I have no idea what their capabilities even are at this point.
the pizza thing was a mod called web displays. you cant connect to the internet using minecraft redstone
@@TheOriginalMacOS I’m aware they can’t connect directly to the internet through in-game stuff alone, but they still had to build the thing in the game to interface with it.
@@CrippledMerc it was just a portal frame thing, it was just like lighting a nether portal, the mod was what did the web browser
@@TheOriginalMacOS Pretty sure in the video I saw years ago they built a computer in game that used the mod to show the web browser and connect to the internet.
😮🤯😳
The Z3 was a German electromechanical computer designed by Konrad Zuse in 1938, and completed in 1941. It was the world's first working programmable, fully automatic digital computer. The Z3 was built with 2,600 relays, implementing a 22-bit word length that operated at a clock frequency of about 5-10 Hz.
The Japanese also developed a relay computer very early on
@@myid9876543 IIRC the Japanese one is the most advanced relay computer ever produced.
But not Turing complete
The smart German guy who created Z3 was very young, a fresh graduate. He was conscripted and sent to the front line! After a few months, it was realized that he was more useful back home, as engineer, than on the front line as soldier. Fortunately, he did not die during his deployment.
@@scopestacker9787 The Z3 was not designed to be Turing complete; they didn't need it.
In 1998 it was discovered that the Z3 was actually Turing complete. As it wasn't designed to be Turing complete, coding branches is unintuitive and complicated.
This is one of your videos that could have been 2 hours and it wouldn’t have felt long enough! This was amazing! Thank you so much.
Amen, yes. A three-hour movie would maybe be enough. It really was great.
Hmm...I felt he rambled on too much already
The title made me think that the light bulb had some relevance in today's world apart from the historical progression thing.
Does anyone know where this technology is still current?😁
Apologies if I missed something, I promise I watched the whole thing.
@@captainoates7236 based on the video, he spent a lot of it showing how the light bulb was one of the milestone inventions along the path toward computers. He didn't show a lot of other ways it's used which is a shame because he could pick any other invention along the chain and do the same thing as this video and have the same basis to call that thing the greatest invention.
Ikr
One thing I really appreciate about Veritasium is how it explores the history of science; highlighting the people who made these discoveries. It's a great reminder that our modern world didn't just randomly pop into existence one day.
I firmly believe that the best way to truly understand something is to learn its history, because that helps us understand its evolution and the reasons why things are the way they are. And I always love the way you take this approach in most of your videos, and the bravery with which you approach and easily explain such complex topics. Thanks again Derek and team!
The absolute best way to understand some technology is to build yourself a rudimentary version to play and tinker with if such a thing is plausible.
While it is not strictly necessary, I agree that it is a very good way to gain a solid understanding.
For the record I have worked in IT for over 30 years and this is the first explanation of how we got from light bulbs to circuits that actually made sense. Showing the model K went a long way to understanding it.
I look forward to the next video in this evolution, because after this comes the transistor. I think it could be argued that the biggest milestones in human history are the mastery of fire, the printing press, the discovery of penicillin, and the invention of the transistor. There are literally billions of transistors used in our everyday life, yet very few are aware of how much they have changed the world.
They have all certainly hastened the end of our trajectory on this planet. It is interesting that you left out the internal combustion engine.
About 10 sextillion transistors have been made since they were invented in 1947.
Don't forget the disco ball
Discrete transistors have nothing on integrated circuits.
I would probably add internal combustion and agriculture (I’m not 100% certain here, but I believe it was the advent of crop rotation that first enabled long-term/perpetual human settlement), but yeah, you’re definitely not wrong!
When I was a kid in the early 1970's, our Zenith TV had a LOT of vacuum tubes in it.
We even had a monthly (I think) visit from a technician who would check the condition of each of the vacuum tubes and replace the ones that were failing.
I was very young and dumb and assumed that he was putting new television programs in the TV for us.
I held this belief until I saw "Willy Wonka & the Chocolate Factory" (the one with Gene Wilder in the title role). That movie had a scene that explained how TV programs got to your TV and Wonka's invention of sending chocolate bars via a similar technology.
In those days hardware stores sold replacement tubes, and they would have a "self service tube tester." (I'll pause while you do an image search…) You would yank all the tubes out of your broken radio or TV, bring them into the store, test them, and of course, replace any the machine told you was "bad."
And people today complain that modern microprocessor-controlled electronics are unreliable.....🙄
@@raffriff42 I recall a hardware store that had one of these giant testers in the hallway of the mall it was located in up to the early to mid 80s. I can't say I ever saw anyone use it back then, of course, as nobody used vacuum tube TVs by then. But the device sat there in disuse for quite a while.
Back in the day before things were made intentionally difficult or impossible to replace or fix. Nowadays, if something goes wrong in your TV even finding a repairman would be difficult. Most people just get a new TV.
This story becomes a fanciful "grandpa tale" if Right to Repair fails =(
I remember the headaches from understanding logic operators when I was a student, circa 2006. This is so beautifully explained and easy to understand that I wish I could have seen this back then.
As someone who programs, the title made absolute sense to me. Anyone who codes knows you almost never know what's actually going wrong when something breaks, so you write code that gives you cues about where it fails. In a more analog design, using lightbulbs as status indicators makes a lot of sense.
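That habit of wiring in status indicators translates directly to software. As a purely illustrative sketch (all names hypothetical), a pipeline where each stage reports a pass/fail "light" might look like:

```python
# Sketch of "indicator light" style debugging: each stage of a pipeline
# reports whether it succeeded, so a failure pinpoints itself.
# Stage names and functions are hypothetical, for illustration only.

def run_with_indicators(stages, data):
    """Run each (name, function) stage in order, printing a status light per stage."""
    for name, fn in stages:
        try:
            data = fn(data)
            print(f"[OK]   {name}")
        except Exception as exc:
            print(f"[FAIL] {name}: {exc}")
            raise
    return data

stages = [
    ("parse",  lambda s: [int(x) for x in s.split(",")]),
    ("double", lambda xs: [2 * x for x in xs]),
    ("total",  lambda xs: sum(xs)),
]

result = run_with_indicators(stages, "1,2,3")
print(result)  # 12
```

A lightbulb on a relay panel and a print statement in a loop are doing the same job: telling you how far the signal got.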
Thank you Derek! This video filled the gaps I had in my knowledge of the history of the things I love the most. Big hug from Canada 🍁
Colossus was a set of vacuum-tube-based computers developed by British codebreakers in the years 1943-1945, BEFORE ENIAC. Colossus is regarded as the world's first PROGRAMMABLE electronic digital computer; it was programmed by switches and plugs.
That's true, but I note he carefully said "programmable" computer. Colossus wasn't programmable.
However, Colossus would have been worth a mention, simply because it was that endeavour that worked out how to reliably use thousands of valves in a single machine (i.e. never turn it off). I don't know if Eniac benefited from this or whether they had to work it out for themselves.
Arguably, demonstrating that electronics could be reliable enough for practical use was as important as being the first electronic computer. Had it been built and been impractical because tubes kept blowing, maybe no one would have bothered to build another.
What Colossus was was fast. When the designs finally became public, sometime in the late 1990s I think, the first thing someone did was write a piece of software for a PC to emulate it. I can remember reading that it ran not significantly faster than a real Colossus, even though a mid-1990s PC had 50 years of computation development in its favour.
It was kept secret until the 1970s though, so it didn't have as much of an impact on the development of computers as ENIAC
And before both was the Atanasoff-Berry Computer (or the ABC for short), built over at Iowa State University.
@@abarratt8869 Colossus is regarded as the world's first PROGRAMMABLE, electronic, digital computer, it was programmed by switches and plugs
Over the years I've designed circuits with vacuum tubes, transistors and integrated circuits.
It's incredible how rapidly technology has evolved.
Usagi! Hell yeah what a surprise. Love his channel
Interesting trivia: The first "computer bug" was a literal moth stuck in a relay in one of these relay calculators!
I've waited for the moment that this fact gets dropped in this video! Thx for mentioning :D
Nope. That's most likely a myth. Research it!
The name was probably around earlier. But the moth incident is most likely real.
It was Grace Hopper's team, and they taped the moth into the logbook as the "first actual case of bug being found".
The term was around before computers were a thing. It had to do with buzzing interference noises on phone lines which sounded like buzzing insects. Debugging referred to fixing the interference.
This was a cool video. As a computer engineer who designed my own tube guitar amp in college, you basically just did a summary of my education experience. Very rad deep dive into the world of tubes!
A great video, and yes Matthew Connolly... guitar amps!!! a fantastic use for tubes that continues to be used today to make the best sounding amps.
Thanks so much for coming to hang out, I had an absolute blast!
I'm really happy that you'll be covering transistors. As soon as you mentioned the "grid" of the vacuum tube, I knew that was coming.
This winter semester I had a course on electronics, and something I was wondering about is why the "grid" input is called just that. This was really informative, well done!
It is a quite literal grid haha
Haven't watched the vid yet, but they are often literally a mesh/grid. Sometimes they consist just of a loop of wire, but usually it's a screen mesh/grid.
@@sophiophile When i asked my professor why, he said it's a mix of practical and historical reasons. Now i know why.
I find it awesome that Stibitz built his circuit in his kitchen out of some spare parts he had lying around. The DIY spirit is what gets things done.
When I was a kid in the early 60's my father was an Air Force officer connected with electronic warfare and computer development. He had shoe boxes full of old, dead vacuum tubes. I loved to play with them; they made great space ships. Kinda wish I had gotten him to explain to me how they worked. I was only 6 years old, so I guess I can forgive myself for seeing them as only toys. What I REALLY wish I had gotten him to explain was all the papers with programming on them.
Brilliant fuckers ...
I wish you understood how stupid that sounds
@@whatshumor1 If you're referring to the fact that I use capital letters for emphasis instead of italics, it's because TH-cam doesn't enable italics if you enter text directly. Not all of us are willing to go to a third-party app or learn some kind of obscure tag in order to make italics. In this case, I did go to a third-party app and type with italics, but when I pasted it into TH-cam, the italics were gone. Kudos to you for figuring out how to do italics. That definitely makes you superior to the rest of us.
Given the circumstances, we should be happy that Kevin is even able to use TH-cam!
@@pickleballer1729 Use underscores (_):
_Hello World!_
Do not put characters before/after the symbols, it will break the effect:
._Hello_
Also works in WhatsApp and similar chatting applications.
For bold, use (*):
*Hello World!*
Strikethrough (-):
-Hello!-
In the early '70s a lot of the businesses in my town--restaurants, bars, grocery stores--were starting to replace their tube-based in-store music systems with solid-state systems, so they were happy for me to take their old machines off their hands so I could tinker with them. In a lot of cases, the tube systems had already failed, but since the owners already planned to replace them, they didn't bother to fix them. I took them back to my "workshop" (the shed in my parents' backyard) and worked on them myself. I made uncounted bike trips to the local drug store, which had a tube tester in the back at the end of a row of bins of different tubes, to get replacement parts. But I never knew until now exactly *how* the triode tube worked. Thanks for the great video!
While I was a kid, my dad had an old FM radio and a separate audio amplifier that used vacuum tubes. Today I learned how they worked. Simply amazing! I hope you do the follow up video on how this works in silicon.
Guitar amplifiers use vacuum tubes today.
@@RideAcrossTheRiver Only some of them. Others are solid state, including the one I have.
@@Sepherisal The current Fender 'Custom' models are all-tube. Superb amps.
Great video Derek! I love this story and you did a great job telling it! It's worth mentioning that Claude Shannon's 1937 thesis was originally to be about using Boolean algebra to simplify existing switching circuits, but by accident he discovered that these same circuits could be used to solve Boolean equations. Thus it all began.
well he didn't, so its not
@@lavishlavon what did Derek not do ? Lol
@@jasonhildebrand1574 mention what Claude Shannon's thesis was originally to be about
always found early breadboards extremely fascinating.
as someone who repairs electrical devices daily, and is used to circuit boards, i really appreciate all the work that went into old circuitry with vacuum tubes, and electromechanical engineering, the best days are the days i get a old appliance from before i was even thought of.
This is the best explanation of tube amplifiers, diodes and triodes that I've ever seen. Very nicely done.
The first electronic programable computer was COLOSSUS built by the British code breakers at Bletchley Park in 1943 to decipher the German Lorenz cypher. However as the project was classified, the existence of COLOSSUS was kept secret until the 1970’s.
Although Colossus was the first programmable electronic machine that was called a "computer" at the time, I don't think it was a computer in the sense that we use the word "computer" today, since it wasn't general purpose, and wasn't a stored-program machine. That said I don't think ENIAC was a computer in the sense we use the word today either, as although it was general purpose, it wasn't a stored-program machine either. I think the first computer, in the modern sense of the word, was the Small-Scale Experimental Machine (SSEM), also known as the Manchester Baby.
I guess it depends on which history books you read. Furthermore, the existence of Colossus was kept secret until the mid-1970s, and it wasn't really public knowledge until the early 1980s. Therefore, the history books would not have known to mention Colossus when considering ENIAC as the first computer.
@@jamesc3505 Made by one of the few who knew about Colossus, and who was able to take what he knew of Colossus with him.
And again nobody thinks about the Zuse Z3, which could also arguably be the first computer.
Yes, it wasn't naturally Turing complete, but it could be with tricks. And it was a lot more programmable (you didn't need to rewire it for every program).
Many Germans would say the Z3 was the first computer, not the ENIAC.
Thank you! Americans are always trying to rewrite history and say they invented the computer. Colossus was built at Bletchley Park, where Alan Turing (of Turing test fame) also worked on codebreaking; Colossus cracked the Lorenz cipher and Turing's Bombe machines cracked Enigma, which they did amazingly well, another thing Americans like to claim they did. They only got away with it because it was classified.
I have an Associate degree in Electronics Technology and a Bachelor of Science in Industrial Engineering, and this is the video that finally made me understand how vacuum tubes work...
Same here , but I don't have any degrees.
Yeah, first it was "oh that's like a diode" and then "oh just like a transistor" 😅
@ Actually at first it was a diode, and then it became a transistor...
…at last! 😂
So glad to see David getting attention for his awesome work!
Same, I did NOT expect this collab! This is great!
3:18 FULL BRIDGE RECTIFIER
That was pretty cool to see UsagiElectric featured in this. His videos are pretty awesome and I am glad I discovered his channel awhile back. He is brilliant and has a lot of fascinating videos. Glad he was getting some well deserved praise and attention from this video.
With my computer science background, this is truly one of my favourite episodes. Thank you so much.
Thank you! That was fascinating. As one who grew up in the 1960s & '70s, and who was an electronics hobbyist at that time (now an engineer), I played with a LOT of vacuum tubes. My first introduction to transistors in school - specifically field effect transistors - was through learning about triode vacuum tubes, as they translate very closely. Again, thank you. This brought back many neat memories.
hey, I'm 15 and considering becoming an engineer. I'm pretty good at math and physics, but I'm worried that being an engineer might be dull. What do you think?
@@thesnowman2509 "Dull" is not a word I would ever apply to electronics engineering. Fun, exciting, lucrative, challenging, ever-changing, and fun (yes, fun x2!) are far more applicable.
If you have a passion for electronic circuit design now like I did when I was in my teens some 50 years ago - and I STILL DO - then every day of your engineering career will present new and intriguing challenges.
At its root, engineering is just problem-solving using math, physics and electronics theory, which is really just a specialized area of physics. If you enjoy figuring out how things work, an engineering career might be for you.
Because of engineers we've seen computers go from 27 tons filling a space the size of a medium-sized home to something that's 24,000 times faster and incomprehensibly more versatile that fits in our pocket...all in just about 75 years.
With quantum computers just now emerging, the careers of new and future engineers are guaranteed to be a breathtakingly exciting ride!
@@roberttrautman2747 yeah engineer sounds super cool but I’m worried because:
A. Don’t want to end up in an office job like some engineers do
B. It's so fascinating, but it's so complicated, and I don't see how I could ever retain that much knowledge in my head haha. I perform really well in school, but when I see the work some engineers do, it's kind of overwhelming. Did you ever have a point where you thought you wouldn't be able to continue? (Sorry to bug you)
@@thesnowman2509 Not a problem. I'm pleased to offer advice to those potentially interested in becoming engineers. I'm a frequent mentor at work.
To address your concern about "ending up in an office job"...I'm not sure what you'd expected for the work environment of electronics engineers, but we (at least most of us) don't dwell in secret cavernous lairs. ::Evil Laugh::
Truthfully, I don't know why an "office job" would be considered a bad thing. It's warm and comfortable, and in most situations you'll have a semi-enclosed office, a private office, or a cubicle. Despite how the media portrays cubicles there's certainly a large degree of privacy while supporting creative collaboration with those around you. It also depends on what you've been hired to do. If you're designing computer-like electronic equipment you'll very likely be in an office most of the time - although since COVID many companies now allow their engineers to either work exclusively from home, or in a hybrid home/office situation if they're not 100% in-plant. But, if your position is a field representative or you're working on high-voltage power substations or something, then you'll likely spend a fair amount of time on-location outside of your office.
In reference to your concern about being able to fit so much complicated knowledge in your head, I won't pretend that there isn't a lot there. But, it's infused over the course of several years while you're at school, and even further once you're working under more experienced engineers so it becomes MUCH easier to store away without becoming totally overwhelming. Also, while some circuits might at first look completely incomprehensible, once you start working with them (after you've learned the basics from school), then they become almost like child's-play.
That said, I will fully acknowledge that it's FAR simpler to design a complicated circuit totally from scratch myself than it is to try and figure out what some other engineer was thinking in an existing complicated circuit design - if, for instance, you need to troubleshoot it or change the design in some way.
I began designing semiconductor circuits as a natural part of my electronics hobby when I was 13. I worked as an electronics technician from 16 through 21, when I graduated college. I've been working as an electronics engineer for 44 years. In ALL of that time the thought that I wouldn't be able to continue has NEVER EVEN ONCE entered my mind. I'm even more excited about electronics theory now than I ever was, since I now have a lot of experience of successful circuit designs to look back upon and to draw from.
@@roberttrautman2747 thanks for all the advice! It’s really cool that you were making circuit designs at 13 btw. I’m just a beginner but you’ve helped me narrow what I’d like to do down to just a few things. You explained everything really well and now I’m super interested. Thanks again!🙏
P.S. I also noticed you watch Veritasium too haha, we have that in common :)
this has to be one of the most underrated videos on YT....amazing when you think about it! YT and entirety of modern life inc. social media would not be possible without it!
As an eager mechanical engineering student, your videos always leave me in awe, feeling calm and hopeful. Your videos are beautiful and brilliant!
It's crazy to think that calculators and computers used to fill a room. Now they fit in our pockets and are way more powerful. On top of that, they come fitted with a camera, a flashlight, the ability to make international calls, and access to the world's knowledge without having to seek it in a library or be taught it in school, plus a whole lot more. All easily accessible in a few taps and swipes of a finger, and it hasn't even been 100 years. I can only imagine how fast the world seems to be changing for anyone born in the early 1900s.
When I was a kid I read about the world's fastest computer in the Guinness Book of World Records. The Cray-2. I wanted to have one when I grew up. No idea what I was going to use it for, but, you know, computers were exciting, right?
Welp, IIRC the iPad 2 has about the same processing power as a Cray-2 supercomputer, so ... *lots* of people got my childhood wish.
@@eritain That is insane
@@taiguy53 Just checked the processor in my lil' Chromebook from 2017 that I take with me everywhere. With the throttles wide open it is 10 Cray-2s of CPU and 60 Cray-2s of integrated GPU, at a power drain of 6 watts.
@@eritain Just wait til you see quantum computing. It's exponentially more powerful than the tech currently out in the market
@@taiguy53 Not exactly true. Quantum computing has fairly specific use cases, and limited number of real world applications. They also require incredibly low temperatures to work. It's therefore unlikely to ever replace conventional computers. You won't be playing games, typing emails or browsing the Web on a quantum computer.
13:17 was a nostalgia moment for me. I started my career in the early 90s, replacing old telephone switching systems in telephone company central offices. That constant clicking of the relays was what we were replacing. It is amazing to me that we were still using that technology so many decades later, and even more amazing to me how much it has changed in the 30 years since.
No mention of Konrad Zuse? He even built relay computers with floating-point numbers. See the Z2 from 1940 and the Z3 from 1938-1941. He made the Z1 mechanical computer in 1936-1938.
Let's say I'm a time traveler: how do I best explain the concept to ancient people (let's say medieval times)? Not the details, just the concept.
I noticed that you properly called ENIAC the first "general purpose" computer. Something outside the scope of this video is that the British government was using a "special purpose" computer to decrypt German radio communications as fast as or faster than the Enigma machine at the receiving end, prior to ENIAC. (A side note: the Poles had already broken that code by hand.) Excellent work. And don't forget that nearly every home in the first world still has at least one vacuum tube in occasional use: the magnetron in the microwave oven.
I’ve seen a reconstruction of the UK machine, which was known as Colossus, at the National Museum of Computing (TNMOC) at Bletchley Park. Astonishing stuff!
The first version of ENIAC was not a general purpose computer. It was specifically built to compute ballistic tables. In 1946, John von Neumann proposed a modification of ENIAC that repurposed the memory for storing tables of values for various functions (like the trigonometric functions) to store a small program. This modification of ENIAC (completed in 1947 iirc) was an intermediate step toward a proper stored-program computer, the EDVAC (completed in 1949). Another interesting thing about the second version of ENIAC is that it had the first Turing complete instruction set (also proposed by von Neumann).
Colossus was made to decrypt Lorenz ciphers, not Enigma. They used an electro-mechanical machine called a Bombe to decrypt Enigma. The Poles had not broken Enigma; they had captured an intact machine, which they eventually gave to Britain (after the Brits and the Yanks had initially turned it down, I believe).
Colossus was the world's first electronic programmable computer, but that fact was kept very secret until the mid-1970s, which is why it was widely believed that ENIAC was the first.
@@Algernon1970 Thanks for this. I couldn't remember the name, and am a bit annoyed that it wasn't mentioned in the vid. Then again, not surprised; this channel has become the early 2000s version of Discovery Channel. Some interesting content, but rarely much below the surface.
@@Algernon1970 One minor point. The Poles did crack Enigma. They had captured (thanks to French intelligence) and reverse-engineered the first commercial version of Enigma, were able to produce replicas of it, and in doing so had learned how to crack the Enigma code of those first machines mathematically. The Germans later added more security features to subsequent versions, along with better information security practices, meaning that cracking Enigma by hand became an ever more intensive and impractical task. However, the Poles shared the replicas, plans, and means of cracking the early models with the Brits, significantly reducing the time they needed to get a more industrialised code-breaking operation set up.
You're making us Electrical Engineers so happy with these video topics - I cannot wait for the solid state video, whether it be germanium, BJT, FET, or anything in between.
Yess, I collected a bunch of those germanium diodes, and their 0.3 V voltage drop is impressive! I am also waiting for when he will cover FETs, BJTs, etc.
And us Computer Engineers! 2s Complement was my bread and butter in college. As were BJTs, MOSFETs, and friends.
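Two's complement, mentioned above, is simple enough to sketch in a few lines of Python (a toy 8-bit example; the helper names are mine, just for illustration). An n-bit two's-complement word represents -x as 2**n - x, which is why ordinary unsigned addition "just works" for signed values.

```python
# Hedged sketch of two's-complement encoding (toy helpers, not from the video).

def twos_complement(value: int, bits: int = 8) -> int:
    """Encode a signed integer as an unsigned n-bit two's-complement word."""
    return value & ((1 << bits) - 1)  # keep only the low `bits` bits

def from_twos_complement(raw: int, bits: int = 8) -> int:
    """Decode an n-bit word: a set sign bit means the value is raw - 2**n."""
    if raw & (1 << (bits - 1)):
        return raw - (1 << bits)
    return raw

# -5 encodes as 0b11111011; adding the encoding of +5 wraps around to 0,
# so the same adder hardware handles both signed and unsigned arithmetic.
print(bin(twos_complement(-5)))
```

The wrap-around behaviour is the whole point: subtraction becomes addition of an encoded negative, so no separate subtractor circuit is needed.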
03:18
THE FULL BRIDGE RECTIFIER!!!
every child deserves a teacher like you 🙏
In my humble opinion, the transistor (hinted at the end of the video) is among the most important inventions in human history. It's up there with the wheel, the steam engine, gunpowder, penicillin and the like. The lightbulb, too.
Gunpowder is not a good invention.
Personally I think it was one of the most destructive inventions. Humans were, on the whole, happier, healthier, and more connected before the invention of the computer, which led to the internet, which led to social media, which led to the destruction of real connection, localized groups and culture, and to the creation of Photoshop, fake faces and bodies, and tons of misinformation. All of that in turn led to fatter and unhealthier humans who are less connected, less motivated (due to dopamine overdrive), and far, far more depressed and anxious. Correlation does not equal causation, but given the obvious connections and cause/effect groupings I see, I believe it is to blame for the majority of the causation of each of those three main detriments to humanity/society, along with tons of other detriments. And while it is also the cause of TONS of good, I still think humanity would be better off with a FAR slower progression in computing power, so we had time to adapt and see the negatives, and thus set up defense mechanisms for everyone, especially the defenseless like kids, who have been ravaged by dopamine overdrive and the many negatives of social media.
@@ClipsCrazy__ I think you're romanticizing the past. Before computers, we had everything from witch hunts to nazism, racial segregation, oppression of women, slavery, torture and public executions. I think the present, even with all its flaws, is the best time for humanity overall. Computers have allowed countless advances in fields like medicine, and brought the world closer together with essentially free, instant global communications. It's not perfect (we're human, after all) but we're doing pretty well, compared to centuries past. I'd say it supports my point that your biggest worry seems to be social media.
@@ironcito1101 Nah, I'm just referencing data. All that horrible stuff "before phones and computers" actually proves my point even more. Humans are more depressed now than EVER before, even though things are "better" than they've ever been in endless different ways.
@@ironcito1101 Social media is definitely the worst thing to come from this advancement so far, but again, we're still hurtling ahead in record time, and AI is the next thing on the horizon. Social media causing political polarization, and/or deep fakes impersonating a political party or leader, could literally be the start of the next civil or world war, which would create a world that without a doubt would have been better off without computers. But I'd argue we're already in a world that would have benefited greatly from throttling computer advancements to 1/4 or 1/10 of their pace 15 years ago.
I would highly suggest that you do a second part on this topic; it is very interesting, and it would explain how we achieved this level of computing power today.
This also sheds light on the lightbulb; it was a key invention that enabled the modern world. It'd be great to see a long-form video, say 2 hours, covering key technologies from history, starting from speech/writing/fire, via maths, zero, and metalworking, into some modern ones. Maybe making it a series would be better.
Up vote
Yes
UP vote
I hope you eventually collab with someone like Ben Eater to explain at least some of the basics of how a processor actually works in this "series" of topics. When I was a kid this was the biggest mystery for me and no one could really explain it well (mostly because it is a complicated topic), but now as an adult this is something that I think is not so complicated that child me could not understand, there just was a lack of easy to access and understand material. Ben Eater's videos really helped cement that knowledge and build an intuition for it even after taking college undergrad courses that touched on the subject.
As an EE student, it was amazing watching this video. "Ain't that a diode?" Turns out to be its first concept. "Damn, and that's the same principle a transistor uses." Turns out to be its predecessor. Since we rarely learned about a theory's history, it was great to watch this!
Top content
Usagi is awesome ❤ I’m so happy he’s featured in this video.
Came here to post the same thing!
It was so much fun hanging out with Derek!
@@UsagiElectric Wow it's him
Crikey Derek... You basically packed my basic electrical theory, the very first block of my electronics engineering degree, into a 19-minute program... Your analogy for the vacuum tube is better than what my old professor taught me back then.
Nice video! But I would have liked a mention of the Zuse Z3, which was built in 1941 in Germany. In contrast to the design and use of the ENIAC, the Z3's design did not meet the later definition of a Turing-complete computer, and it was never used as such. But in 1998 it was shown that, from a theoretical point of view, it still has this property, thanks to the tricky use of some detours.
There were also the special-purpose machines Colossus (England, 1943), used for decryption, and the Turing Bombe (England, 1940), which was used to decrypt Enigma.
Agree, the video would have benefited from a broader view rather than an American-only one.
Even the Z1 was years ahead of the Model K, and it was released in 1938, one year after the Model K. The Z3 was, as far as I know, the first of its kind with RAM.
@@joajojohalt The Z1 already had a 64-word RAM (I think 22 bits per word, as its arithmetic unit used 22-bit binary floating-point values).
Around the same time, in 1937, Konrad Zuse created the first mechanical computer, aptly called the 'Z1'. It could process instructions, perform arithmetic using floating-point registers, and had input, output, and memory.
... and was already based on binary system.
This is really dope. As a tube audio amplifier enthusiast, this was a great video to watch; you nailed this one, man. Usually when channels cover a topic I understand well, I see all kinds of flaws in how it's covered, and it ruins the channel for me, because you wonder what else they're getting wrong. This was the opposite experience, bravo dude! Also, listening to your videos on a tube amplifier is something special. Your audio quality really shines through; it's truly like you're in the room explaining it to me, because tubes better replicate the natural harmonics of the human voice.
Tube amp worship is like astrology.
@@JacksonKillroy Hating on someone else's hobby is like infidelity.
A dick move.
Liking a distortion and admitting it's there is OK; thinking that the thing that can't distort is inferior is less so.
You showcased that you still need to get your facts checked; I personally recommend reading Wikipedia rather than watching YouTube videos when it comes to looking for truthful information.
I don't understand how it would sound better with tubes. The audio still comes as a digital signal from the YouTube video, and it's still just an electromagnetic speaker that produces the sound waves, so how could your sound experience be better than mine, for example?
Just love your videos - you have a knack of distilling the most complex things down with easy-to-understand concepts and articulating ideas with clear, concise language that everyone can understand.
I’m a pro computer nerd and naturally know all the history of computers but I still feel fascinated every time I am reminded of it. Your video was one of the best ways I ever learned about it, really a great presentation of a great history and the most impressive inventions in the history of mankind. ❤
I am wholeheartedly waiting for part 2 ❤ Such a nice video, explaining a lot in just 18 minutes. What context and what a way of explaining; how much one can understand and how much knowledge one can gain from these 18 minutes instead of watching brain-rotting reels. I have been searching for a video like this on this topic for a long time. I could write a book on this, but I don't have the time, and no one would care 😊
You are a true gem of a channel. What you do for science and knowledge is incalculable. Thanks for this :)
The British Colossus should be considered the first, or even Konrad Zuse's Z1, which was considered the first electronic computer. ENIAC gets all the glory simply because it was well publicized and made a big splash in the papers. The British did not declassify their systems until well after ENIAC was a household name. There was another British computer after Colossus that had a programming language, screen, and keyboard, but I cannot remember its name, and it was also before ENIAC.
It's great how he didn't even care to mention Zuse in an 18 minute video about the history of computers.
Further evidence that it doesn't matter who gets there first ... just who markets the accomplishment better. (c.f., Rolex)
The Zuse was very much erased from history, as it was a german development. History is usually written by the victors.
No surprise there, typical American with an American version of history
@@bzuidgeest I thought Derek was born in / retains Australian citizenship!?
Most Excellent Vid, concise and very informative. Born in 1963 I grew up when vacuum tubes were still in use (but then again, so were silk top hats) and witnessed the transistor age come into being. This vid helped me to understand WHY. Many, many thanks! Well done!
Voicing a pet peeve ... the world's FIRST programmable, Turing-complete computer, the Z3, was created by Konrad Zuse in 1941 ... give the inventor of the modern computer the credit he deserves.
As a computer scientist, I had never heard of him. Thanks for the clue.
Nah, he can stay in his NSDAP infamy.
@@TheRenHoek Nah he requested funding from the NSDAP for his computers and initially got denied until they realised they could use it for primitive guided missiles. Bro wasn't drafted at gunpoint, he happily took their money and helped build weapons.
It's a bit of a stretch to describe the Z3 as Turing complete. It had no conditional branching, so could only be considered Turing complete if it propagated all branch results. I'm not sure how one then would choose the "correct" result of a program.
@Soken50: Alan Turing's work was used to kill more enemies efficiently, ENIAC was essential to complete the work at Los Alamos, and hence, the mass murder of Hiroshima and Nagasaki ... no one walks away from war unsullied. This is not about cheering on Nazis, but to acknowledge the intellectual achievement that Z3 embodies. Failing to do so, in my opinion, is to elevate fairy-tales above reality, propaganda above history, whitewashing lies above considered debate, ignorance above insight ... to me, that's the road to mindless bigotry, indifferent 'othering', and the self-righteous arrogance that birthed so many horrors our history is splattered with. We can do better than that, we deserve to do better than that, we ought to do better than that ... because we can BE better than that.
@A Barratt: True, Z3 didn't have conditional branching, and Zuse didn't know anything about Turing-completeness when he built Z3, since he didn't know of Turing's work, and it wasn't until 1998 that Z3 was shown to have been Turing-complete. In a way that makes it even more remarkable THAT Z3 actually WAS Turing-complete. Again, credit where credit is due.
As a software engineer for over 30 years I was fascinated that I just learned of ENIAC from you! Just proves that learning is 4ever. Thanks!
you'd like the book 'The Innovators'.
If you are not familiar with computing history, it's important to also see the role of women. Far too often these days women are a minority in the field of programming.
The hardware design was done by men, the programming was done by women on the ENIAC.
But Rear Admiral Grace Hopper was the one who came up with a lot of concepts and created a lot of firsts.
@@autohmae no one wants your identity politics here
Coder, not software engineer... Because if you haven't at least briefly read or heard about ENIAC, then I guess you also don't know anything about the work of Blaise Pascal, Charles Babbage, Konrad Zuse, John von Neumann, or John Nash. You probably just vaguely recognize names like George Boole or Alan Turing, but do not actually know more about their work either. So, no, you are not a true software engineer...
@@davidyoung2990 It's not identity politics; it's surprising how roles have changed.
I'm disappointed that you made no mention of Colossus. It may not have been a programmable computer, but it was an electronic logic machine that used thousands of vacuum tubes to statistically analyse encrypted teleprinter messages at a very high speed. It came into service a whole year before Eniac and it made a very significant contribution to the success of the invasion of occupied France in June 1944.
It was programmable.
@@PaulLemars01 It was a special-purpose machine built to do one job and one job only.
I'm very glad someone has mentioned Colossus, as I feel a lot of history is "America-washed", which is quite upsetting to see.
Atanasoff-Berry computer out-dates them both
@@gustcles22 Thanks for drawing my attention to this; I'd not heard of it before.
When I learnt electronics, vacuum tubes/valves were still widespread, so we learnt to fix both solid-state and tube devices. The first thing I learnt was that it was rarely a tube that was at fault; often it was a resistor or a paper capacitor. But since customers complained about how a small part was so expensive, we also replaced a few tubes, even if they were fine, so they wouldn't complain too much. The customers didn't pay me for parts, unless it was a crazy expensive one; they paid for the knowledge I had accumulated over the years. It takes time to become a good electronics technician; that's why I charged a certain price. I had the chance to learn both solid state and tubes, so I could fix both.
I found it fascinating back in the early 2000s when I worked on C-130 aircraft that still utilized vacuum tubes in the compass amplifiers for the navigation system. Your explanation of how they amplify signals, like earth's magnetic field, helped close the gap on my understanding of that navigation system. Those airplanes used a device called a Magnetic Azimuth Detector in the wing tips or in the top of the vertical stabilizer to sense magnetic heading and transmitted a very low voltage signal to the navigation computers for heading reference. Before the signal could be utilized effectively, though, it had to be amplified. Enter the humble light bulb 💡
Some FM transmitters still use 1 kW tubes for the output stage. Tubes are still good for RF applications.
The first production C-130A came off the production line in 1955. I know they have had several upgrades to the airframe and systems over the years, but it still amazes me that a plane designed and built in the 1950s is still in service today. The B-52 is another plane still in service that was first built in the 1950s. I worked in PMEL, which was renamed TMDE, when I was in the U.S.A.F., and I was shocked by how many pieces of test equipment we had in Germany that were built in the 1950s and 1960s and used vacuum tubes, with all the wires silver-soldered to ceramic bus strips for connection points. I was over there from 1982 to 1984. I was surprised when the government decided to make all the military calibration labs totally civilian contractor jobs in the 1990s.
One of the reasons for retaining valves was their resistance to EMP.
Field effect transistors are the solid state analogue of the thermionic valve. I wonder if, for very high temperature applications in the vacuum of space, very tiny valves, without envelopes, might make a comeback.
@@rogerphelps9939 The idea has definitely been investigated, the main problem being launch shock.
I visited a DEW Line radar base just after they had switched from vacuum tube computers to computers that were new for the time (1980). They took us (Air Cadets) into a gymnasium-sized room filled with tightly packed rows of vacuum tubes reaching to the ceiling. Afterwards they took us into another room, showed us a small refrigerator-sized box, and told us that it did the same job as the gymnasium-sized computer. It was an amazing vision of the future of computer miniaturization.
Ahh, they should teach this in high school. I spent nearly 40 years doing CMOS design, but in my undergrad school the basic operation of vacuum tubes was not covered. I guess in the '80s they were considered obsolete.
Excluding the history, this was basically the first month of my intro to electrical engineering class. Not sure how much this would be needed for general education
Semiconductor transistors are a trillion times better; there is absolutely no reason to teach any of this stuff outside of historical interest.
How did the software used for CMOS design transition during your career? Nowadays mainly Cadence EDA and then Synopsis custom compiler, Tanner EDA, etc are used industrially. Back then what was used?
@@Songfugel Vacuum tubes are still used in audio amplification and microwave ovens.
@@Songfugel The history illustrates the law of unintended consequences. The greatest advance in human history began with a simulated candle flame that utilized electricity. People back then could not imagine smartphones or flatscreen TVs, or almost any of the things that these rudimentary devices would lead to. It makes me wonder what the next great advance will be, what accidents will lead up to it, and how long until it happens. There are no answers to these questions, of course, until it happens. But this history is real, and we are all living its results, every day and in every way.
Excellent. A part 2 on the rise of microchips, in the same style as this, would be brilliant.
I've seen videos about vacuum tubes before, but the story at the beginning of this video got the idea across so intuitively!!
It's incredible how these two seemingly unrelated things are intertwined. Thank you Veritasium.
they are related because they are things
My father had an electronics shop back in the '70s, and I would assist him in going to houses and doing in-home television repairs. We had a tube tester that allowed you to put in the tube number, and it would tell you if the tube was good or bad. We also had a television picture tube rejuvenator, which heated the picture tube filaments up, burning off any buildup. It always amazed me how much heat came off these things. You did an awesome job of presenting this, by the way!
It never ceases to impress me what great quality content there is on YouTube. This was all kinds of awesome. 👏 🎉
The sad part is kids will not learn about this in schools anymore.
You mean biased and historically inaccurate?
In addition you should look at the work done by Alan M. Turing and Tommy H. Flowers (at the British Post Office research facility, Dollis Hill in London) during World War 2 to decipher German encrypted radio signals using the Colossus computers, which predate ENIAC. These machines used 1-2 thousand thermionic valves (or vacuum tubes) and are recognised as the first semi-programmable computers. A rebuilt Colossus is on display at the National Museum of Computing at Bletchley Park in England, where it is operational. Wartime security meant that Dr. Turing and Dr. Flowers were not recognised for their contributions to computing until the 1980s; even today some of their work remains classified.
Both the subject and its presentation were amazing. Very high quality content. Thank you.
My father worked for the California Department of Water Resources. When I was a kid, they were still operating a UNIVAC computer. It took up a large room which had to be independently air-conditioned. It could only do one function at a time; in order to change functions, the programmers would have to come in and reprogram the thing. The funny thing was that their desktop calculators had many times the computing power of the "computer".
From Lightbulb to AI, the so short yet such exquisite journey of human inventions.
"Exquisite" is not the word I would choose. The reality is too much for one word to describe.
The difference with lightbulbs is that humans weren't afraid of them; we immediately embraced them. But with AI... with AI everyone is sh***** their pants!
@@Argoon1981 Because we control the lightbulbs
@@fagelhd We also control AI. Anyone who believes the crap that AI will go sentient needs to learn the basic fact that AI does what it is told to do. If it causes harm, it is because of the person who worked on it, or because someone managed to get access to it with malicious intent.
@@notjeff7833 The difference is that you can't use lightbulbs to flood the world with vast quantities of propaganda and misinformation and faked photorealistic images.
Lightbulbs were never easy-to-use limitless garbage content factories that posed an existential threat to democracy and an accurately informed public.
This video covers relays, light bulbs, vacuum tubes, boolean algebra, radio, telephone, and computers.
Each of these topics deserves its own video !
There are several great YouTube channels, but Veritasium always, and I mean ALWAYS, delivers! It is amazing how he can make videos on such different subjects that are always on point. Never dull, never stretched out, never compromising. A real gem, this channel.
Absolutely incredible the way you managed to condense such a complex evolution into a video less than 20 minutes long. Thank you so much; you contribute beautifully to enriching our knowledge with each of your contributions.
(Translated from French:) Yes, that's true. I wish there were a translation of this video into French and German so I could show it to my family. I have been an electronics technician for 45 years, and when I try to explain these things to them, it's very rare that they understand.
That was excellent. While working on my 2 year Electrical and Computer Eng. degree, I had an Instructor that was really fond of vacuum tubes and I learned a lot from him. The origins of modern computing are fascinating.
Could you explain 9:15 to me? I don't get why closing both circuits would prevent a current from running through the solenoid.
Someone else said it's because when both switches are flipped, no voltage difference exists across the solenoid, so no current runs through it. I guess voltage differences are, if not *the* core of how binary machine logic works, then one of the cores.
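A minimal sketch of that behavior (assuming the circuit at 9:15 is wired so the solenoid sits between the two switch outputs, i.e. it behaves as an XOR; the function name is just an illustration, not from the video):

```python
def solenoid_energized(switch_a: bool, switch_b: bool) -> bool:
    """Model of a two-switch circuit where current flows through the
    solenoid only when the two switch outputs differ.

    With both switches flipped (or both unflipped), both solenoid
    terminals sit at the same potential, so no current flows.
    """
    return switch_a != switch_b  # exclusive OR

# Truth table for the circuit:
for a in (False, True):
    for b in (False, True):
        state = "current" if solenoid_energized(a, b) else "no current"
        print(f"A={a!s:5} B={b!s:5} -> {state}")
```

So closing both circuits gives the same result as closing neither: the solenoid only pulls in when exactly one switch is flipped.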
Vote for making video about silicon computers😊
Linus Tech Tips has hundreds of those.
Studying the history of a discipline is a must for anyone studying that discipline, because it puts them inside the heads of its pioneers, which in turn makes the learning materials easier to grasp. Understanding how people thought about a subject in order to come up with amazing ideas makes learning those ideas more alive, and more appreciated by anyone studying the subject.
Whether it's the history of mathematics, computer science, etc., students will always benefit from it.
New developments commonly come from thinking about things in novel ways. "Getting in the minds of the people" who built the fundamentals doesn't really do that. Knowing some of the building blocks is crucial, but reinventing the wheel from scratch doesn't seem like a great idea.
"Standing on the shoulders of giants" doesn't include becoming a giant yourself
@@gw6667 >recreating the wheel from the beginning seems like not a great idea
Do you think the speed at which the wheel is being recreated is the same? How do you think we can avoid reinventing the wheel without studying the work of previous people? How can we effectively study what others came up with if we can't understand the circumstances and nuances of knowledge that led to those new ideas?
@@gw6667 In software development we are indeed taught from the proverbial invention of the wheel.
You aren't considered competent until you either study or write your very own compiler and also learn at least basic OS functionality. 99% of people will never touch a compiler or the code of an OS, yet it's fundamental.
Because it's the basis of everything we do.
ENIAC wasn't the first programmable machine - Colossus, co-developed by Alan Turing, predated it by years and did a lot of heavy lifting for the Bletchley Park codebreakers during the war.
He didn't even care to mention Konrad Zuse, the father of the modern computer - not even ONCE. This is such a poorly researched video.
Colossus was not a general-purpose, Turing-complete computer. ENIAC was. However, ten Colossi clustered together could in theory be made Turing complete, according to Benjamin Wells at the University of San Francisco. And Turing didn't really "co-develop" it - it was Tommy Flowers, with Sidney Broadhurst and William Chandler.
Colossus was pretty much an application-specific machine. Outside of the world of cryptography, its enduring contribution was to show that valve electronics could be reliable enough at that scale to be useful. Had it been a total failure in that regard, possibly no one else would have tried again until the transistor became a viable component. Arguably, those early days of valve computers (fixed or general purpose) gave computer science a ten-year head start.
He didn’t say Turing complete. He said electronic, and programmable. Colossus was both of those.
Are you British?
Because British people seem to have this... tendency to find a way to claim credit for American inventions and innovations.
To be fair, the ABC (Atanasoff-Berry Computer), completed three years before ENIAC, was the first automatic electronic digital computer - though it wasn't programmable or general purpose - and history sometimes forgets this.