A virtual memory manager (VMM) is one commonly-used name for the operating system component that implements virtual memory. A person working on things like that can reasonably be called a VMM programmer.
It's been 25 years since I've thought about how a computer program uses actual physical memory. This was an entertaining refresher along with a history lesson that is new to me. Very well done.
You have definitely found your calling in life. Part of what makes computing so powerful is you can be working on top of many layers of brilliant innovations that are usually transparent to you; downside of that is pioneers can get forgotten, so thank you for doing such a good job of highlighting their contributions
You struck the perfect balance between informative content and entertaining visuals. I hope you are able to make a living creating your videos, you are truly gifted for this kind of work!
Really much of computer science has been about refining early concepts into efficiently usable functions and building hardware to efficiently support them. You can look at the failures of some systems (like the Pentium 4's prediction/cache miss penalties) to see this directly. And you can look at how old languages like Lisp already had capabilities similar to modern ones, minus hardware optimisations and libraries (Lisp dialects are still in use).
Most of computing is exactly the same as any logistics. The processing is a small part of the puzzle and is impossible at scale without the grunt work of ports, trucks, highways, warehousing, shelving, forklifts, and every other form of logistics you can think of. Those functions will never really change much, so the solutions of 50 years ago work the same today - even if they're performed faster.
@@guybolt The lambda calculus was developed in the 1930s and forms the bedrock of formal logic for modern computing. Turing's work in the 1930s was intertwined with it very heavily. Of course this was pre-dated by pioneers in programmable logic through mathematics and mechanical design a century beforehand, in the 1840s, by people like Lovelace and Babbage. TL;DR: there's no point in nitpicking over a decade when many "modern" computing concepts are minimally 180+ years old.
In the IBM world of computing, which I learned in 1972, there is an intermediate stage between fixed address programming and Virtual Storage. It's called relative addressing. Instead of using fixed addresses, the program would load the starting memory address into a base register. The program was compiled using a relative address from the base address. Thus, the program could start anywhere in memory. System/370 used Virtual Storage and was the successor to System/360, which was limited to physical memory. P.S. At my first programming job in 1974, the computer was a 370/125 and had only 92K of actual physical memory.
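The base/displacement scheme described above can be sketched in Python, the video's own demo language (the 64-word memory and the names here are invented purely for illustration):

```python
# Sketch of IBM-style base/displacement ("relative") addressing.
# The program is assembled with displacements from zero; at load time
# the OS picks a base address and loads it into a base register.

def load_program(memory, program, base):
    """Copy the program's words into memory starting at `base`."""
    for displacement, word in enumerate(program):
        memory[base + displacement] = word

def effective_address(base_register, displacement):
    """Every address the program uses is base + displacement."""
    return base_register + displacement

memory = [0] * 64
program = ["LOAD", "ADD", "STORE", "HALT"]

# The same program runs unchanged wherever it happens to be loaded:
load_program(memory, program, base=10)
assert memory[effective_address(10, 2)] == "STORE"

load_program(memory, program, base=40)
assert memory[effective_address(40, 2)] == "STORE"
```

The point of the sketch is the last four lines: nothing in the program changes between the two loads, only the base register.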
I'm a retired Computer and Electrical Engineer, and I gotta say, Laurie, you did an excellent job explaining this. 👍🤓 Also, I dig the '60s look! You pulled it off well. ❤ Brings back some memories (pun intended...sort of). I remember the days of paper punch tape, core memory, drum storage, the early Winchester disk drives, and when "monitors" were a teletype machine that physically burned the dots into the paper using a single-pin dot-matrix print head. As for programming, I've used binary-coded octal, BASIC (including BASIC-A, Q-BASIC, GW-BASIC, Visual BASIC, and Visual BASIC for Applications), Pascal and Turbo Pascal, C and C++ and Turbo C++, Fortran IV, Assembler, and Machine Code. Here's an old programmer's joke for you: "Do you know how to tell who is a C programmer? They're found in a dark basement corner, muttering to themselves as they slowly go mad." 😵💫
Engineers get a bad rap for a supposed inability to explain themselves, and some of that is deserved. I'll explain. They are usually interfacing with the requirements and materials folks et al., and it's customary to be very precise, very succinct. They assume people know the argot, because usually they do. And many just cannot believe everyone isn't fascinated by the detail of it all. This adds up to a custom that is a real liability when they need to break things down to the user/sponsor level. Rare is the individual who can effortlessly dwell in minutiae land and then abstract it to explain to the neophytes.
Holy moly, you really went all-in with this one! It's a rare occasion where one would do a video about such a low-level thing and actually dress up in era-appropriate style. Your videos are super interesting, well produced, and cute as heck to top it off!
your educational content on foundational computing concepts is simply the best there is. and on top of it the production quality is incredible. well done
I spent the better part of the 1990s teaching prospective systems engineers, and I have to say that this is one of the best illustrations I've seen on this topic. It brings back fond memories of those days. Thanks for that. You are a natural! Well done.
I never read it, but I bought it. I hadn't noticed it was from the wrong time frame; 1979 is a bit late for the '60s, which, as I've observed, started around 1964 and ended when the draft ended.
Funny how it's just her video that's a bit of fun. I'm sure she's being factually correct. These videos are brilliant, only let down by you clods coming into the comments saying the things you are. Watch the content for what it is. 🙄
It's incredible how far you go for a simple skit, from almost dying from smoking asbestos-filtered cigs, to cloning yourself and getting rid of the body afterwards.
A very good, but also very simplified explanation. For example: Between the old completely fixed memory programming and virtual memory another important concept was invented: Relocation. This meant that programs no longer had to directly specify physical addresses, but instead could specify a base address of zero and all other addresses in the program were relative to this base address. The operating system or program loader would then select a new base address and rewrite the program so all addresses became absolute before running it.
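Load-time relocation as described above can be reduced to a tiny Python sketch (the "image format" here, a list of words plus a relocation table naming which words hold addresses, is invented for illustration, not any real executable format):

```python
# Simplified load-time relocation (illustrative, not a real format).
# The image stores addresses relative to base 0, plus a relocation
# table listing which words contain addresses that need fixing up.

def relocate(image, relocation_table, base):
    """Return a copy of the image with absolute addresses patched in."""
    fixed = list(image)
    for index in relocation_table:
        fixed[index] += base   # rewrite relative address as absolute
    return fixed

# A tiny "program": words 1 and 3 hold addresses (relative to base 0).
image = [7, 0, 9, 2, 5]
relocation_table = [1, 3]

loaded_at_100 = relocate(image, relocation_table, base=100)
assert loaded_at_100 == [7, 100, 9, 102, 5]

# The same image can be loaded elsewhere just by re-running the fixup:
loaded_at_500 = relocate(image, relocation_table, base=500)
assert loaded_at_500 == [7, 500, 9, 502, 5]
```

Note the contrast with virtual memory: here the loader rewrites the program once, before it runs, rather than translating every access in hardware.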
I'm quite impressed with how skillfully Laurie has taken a relatively unknown/uninteresting subject and presented it in such an accessible, engaging, and creative way!!! This was a good refresher, and the visuals were memorable. Great production value too! Excellent content!
Thanks for remembering that it was the Atlas that first had virtual memory. However, there was a stage before that with memory segmentation, which was a simpler form of memory management. Each user program had its own address space which was mapped onto real memory using segmentation registers. The big difference was that segments tended to be fairly big and still had to be in main memory. Segments could be swapped out, but this was an expensive operation in terms of time. Atlas-style paged memory divided physical memory into equal-sized pages that could be read from and written to the paging store. As they were smaller, it was easier to fit multiple programs into a system concurrently. Btw, the Atlas was being used under the Titan operating system until the mid seventies. By that time multiple manufacturers were producing demand-paged systems. The first microprocessors with memory management followed the segmented route and didn't get proper paged memory management until much later.
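The paged translation described above, reduced to a toy Python sketch (page size and table contents are made up, and every page is assumed resident):

```python
# Toy paged address translation (4-word pages to keep numbers small).
PAGE_SIZE = 4

# page_table[virtual page] -> physical frame (all pages resident here)
page_table = {0: 5, 1: 2, 2: 7}

def translate(virtual_address):
    page, offset = divmod(virtual_address, PAGE_SIZE)
    frame = page_table[page]          # equal-sized pages map to frames
    return frame * PAGE_SIZE + offset

assert translate(0) == 20             # page 0, offset 0 -> frame 5
assert translate(6) == 10             # page 1, offset 2 -> frame 2
```

Because every page is the same size, any free frame can back any page, which is exactly why paging packs multiple programs into memory more easily than variable-sized segments.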
Except this video describes memory mapping and calls it virtual memory which is something else. It's a shame since it's explained well but uses the wrong terms and therefore is increasing the confusion instead of providing clarification.
Plot twist: Programming is just a side gig for Laurie. This whole TH-cam series is "just" a clever way to showcase her impressive video editing and acting skills to land a role in Hollywood. Yep, that must be it.
Hollywood doesn't deserve her, and she should stay as far away as possible as to not put herself in a situation to become manipulated and or corrupted. Just keep doing what you're doing Laurie!
Hollywood can't make any money any more. You won't make any money on YT either; I haven't in 15 years. Not one penny on a $50,000 investment of labor. You create because you yourself want to see what you create. To do otherwise, you're barking up the wrong tree on this platform.
hello laurie! this is the first video of yours I've ever seen, I'm amazed youtube didn't recommend you sooner because it's right up my alley. thanks for the awesome video, I'm going to go watch your whole back catalogue now
By the time core memory was the dominant memory type, the programmer was not selecting the load address on most batch systems; instead the link loader would fix up relocatable references, so multiple programs could be loaded into the same address space so long as they fit in physical memory. This was also common in pre-386 MS-DOS, where various utilities could load into RAM in the "terminate and stay resident" (TSR) model. This requires a relocation table as part of the executable image format. I wrote a lot of TSR programs, along with a multitasker for DOS, in the 1980s. Virtual memory allows sharing physical memory between multiple processes. I have been working on operating systems on and off since 1979, on computers with and without virtual memory management. In the 1980s, I wrote an overlay linker for the M68000 that included a runtime that could swap sections of code into the same physical memory areas to allow programs larger than available physical memory to run without an MMU. The PDP-11, on the other hand, had a 64 KiB virtual address space (16-bit addresses), but most models had an MMU that supported 4 to 256 times that for physical memory. Kind of the reverse of how today's virtual memory works.
The IBM 360 didn't have a relocatable loader. You could, however, do a link & go, but that required all associated libraries to be present. On the 360 you could define a BG & FG process, but your code had to have been linked in either partition in order to work. You could NOT run a job that was linked for BG in the FG partition; it would crash, as there was no relocatable loader. I've been programming since 1969 (1401, 360, 370, 4300, 390).
The early minutes of her video reminded me of the era of the late '80s into the early '90s, when the Macintosh "System" OS transitioned from "Finder" (unitasking, apart from the Desk Accessories, akin to TSRs) to "MultiFinder" (cooperative multitasking). Many of the Macs that System 6 & 7 could run on may not have had an MMU. You'd pull up the memory info window (I forget its proper title now, was it just "System Info"?) and it would show you the physical memory regions occupied by each program you'd launched. From MultiFinder, if you started Program A, then Program B, and later closed Program A and went to start Program C, if there wasn't a free region big enough after Program B's occupied area, or if the hole left from the closing of Program A wasn't big enough, MultiFinder would report to you that Program C couldn't start, because there was not enough memory! Just like her little demo (almost)! What really gets me, looking back, is that many of the later M68K-series CPUs (I think from the 020 onward) powering Macs of that era actually /had/ MMUs and could do virtual memory and paging operations... but the Mac OS of that era, while looking so good compared to Windows 3.11, was suffering from pretty severe tech debt by that point, and didn't utilize the hardware to its fullest potential. And then in her video, she displays a MacOS-y looking demo with the "Copland" codename displayed. I think this is the point when the Mac's OS finally gained virtual memory and paging support. I'd stopped using Macs on the regular after System 7.1. By then, after a brief period of hell using PCs under DOS 6.22 and Win3.11, I'd gotten into them under Win95 (which was just Mac-like enough to please me and help me not regret leaving the Motorola-powered Macs behind) and WinNT 3.51 (which looked like the homely basic Win3 PC, but was VMS-like underneath), and near-simultaneously discovered the new and novel Unix-like early Linux distributions for PC.
I stayed with Windows too long...until Vista. But I've been a Linux-man ever since 1996.
It was good to hear someone reference the key contributions of the Manchester University, England research teams to the development of modern computer systems. The UK built the first programmable computer and followed this work with the seminal series of MU Computing systems. It was truly remarkable that so many complex software systems could be built upon the crude hardware of the times. They were a nightmare... Hats off to MU, who showed what could be done. How about a look at the UK GEC systems next? IIRC they pioneered a highly secure OS design....
Great to see the new generation appreciating the venerable history of computer technology. This is a great set of videos for young people getting into computing, very nicely done.
As a 1970's graduate of EE, I can say I really appreciate this material. First encountered virtual addressing on a VAX-11/750. It was life changing, but I have to say I miss all the paging-out of portions of tasks to load another overlay :)
I used to be pretty good at coding ODL files for RSX11M . While it was a welcome change when VMS overtook RSX and made overlays obsolete, I do miss the challenge. It was like solving a puzzle.
@@njphilwt TKB Forever! and Ever and Ever and Ever and Ever... (I used to have a round button from a DECUS with this written in a circle around the edge.)😁
It sounds like you are talking about something similar to the overlays that many DOS programs used in the early '90s to swap portions of themselves in and out of memory at different points, since the whole program could not fit into memory at the same time, and that was NOT full-on paging. DOS Shell used something like that before memory paging too, so that you could swap between multiple programs that were all "loaded" at the same time, but only one was ever really running; the whole thing had to be saved to disk and another loaded into RAM when you wanted to switch between programs. The background program was fully suspended when swapped out, and only the one you were actively looking at was in RAM and running.
I was thinking about my time with an Interdata Model-70 computer, where I had to manually give the linker/loader instructions about which code segments were to be overlaid against others. It took a good bit of planning and code re-writing to make sure the module sizes were all compatible with the available memory. I learned a whole lot about memory management, which immediately became obsolete with the implementation of automatic hardware-handled virtual memory under VMS.
There are two concepts here which are co-mingled. First is virtual memory (each user process has its own memory space), and the second is demand paging (virtual memory can be marked "filled" or "not filled" with supporting bits in the hardware and an OS kernel service to change the state by "paging" from external storage such as a disk or ssd to backing physical memory).
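That separation can be shown with a toy Python sketch (structure and names are mine, purely illustrative; there's no eviction, and each page has a "present" bit that the fault handler flips when it fills the page from backing store):

```python
# Virtual memory = per-process translation via a page table;
# demand paging = a "present" bit plus a fault handler that fills pages.

PAGE_SIZE = 4

class PageTableEntry:
    def __init__(self):
        self.present = False   # the "filled" / "not filled" bit
        self.frame = None

def access(page_table, physical_memory, backing_store,
           virtual_address, free_frames):
    page, offset = divmod(virtual_address, PAGE_SIZE)
    entry = page_table[page]
    if not entry.present:                      # page fault
        entry.frame = free_frames.pop()        # (no eviction here)
        start = entry.frame * PAGE_SIZE
        physical_memory[start:start + PAGE_SIZE] = backing_store[page]
        entry.present = True                   # mark page as filled
    return physical_memory[entry.frame * PAGE_SIZE + offset]

backing_store = {0: list("ABCD"), 1: list("EFGH")}
page_table = {0: PageTableEntry(), 1: PageTableEntry()}
physical_memory = [None] * 16
free_frames = [3, 2]

assert access(page_table, physical_memory, backing_store, 5, free_frames) == "F"
assert page_table[1].present        # faulted in on first touch
assert not page_table[0].present    # never touched, never loaded
```

Translation alone would work with every page resident; the last assertion is the "demand" part, pages only get backed by physical memory when touched.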
That's a pretty neat coincidence, I have a test on this subject tomorrow. I figured I'd watch something on youtube after studying.. and welp. I guess Virtual Memory is my destiny lol. Nice watching this as an overview on the subject! Also, great production!
I nearly always speed up videos 2x - 3x on TH-cam, especially informational videos. The entertainment, engagement, and quality of the video is so good, I didn't even notice I was watching it at regular speed till I was over half way through. Great work.
Sometimes as high as 4x if the speaker has a slow cadence. There is a Chrome extension, Video Speed Controller, that puts an overlay on most video websites; it works on TH-cam and Udemy, which is where I use it most. Audible lets you listen at up to 3.5x speed, and I find that a lot of non-fiction is read fairly slowly, especially content like The Great Courses, where it often sounds like a college professor giving a lecture, speaking much slower with longer pauses for thinking.
The cool thing about this is the fact that we now don't have to worry about most of this low-level stuff. I went through my computer science degree without ever writing my own memory allocator and grew afraid of going this low-level, but it is in fact fun and even simpler than I thought. In the area I work in, every last call matters, so I often find myself doing hobby projects in which I work on bare metal, and I love to see the difference when you have no choice but to optimize every byte. I have made some simple stuff for the original Game Boy and Commodore 64 in pure 6502 asm, and it is impressive to see how much you can do with what you'd judge to be "so little"... accomplishing such projects simply feels good. And taking the opportunity of feeling good about something: I feel like (and hope) you are having a blast doing these totally unnecessary skits and theming around a very informative video. Your videos amaze me for the content and creativity. They are very well produced. I hope this is sustainable and you can keep talking about more tech stuff, as this is a rare find between being informative and entertaining. Please, keep it up.
@@Sven_Dongle I'm 100% self-taught in C/C++, and you'd think that a CS degree would require learning at least how to translate ASM to C and C to ASM. I wouldn't go as far as knowing how to design your own ISA and computer architecture layout, as well as providing an ABI interface for the kernel of some arbitrary operating system. I could see that within an Electronics Engineering degree, a Hardware Engineering degree, or a Software or Systems Engineering degree. As for a CS degree, that might depend on the level of the degree: an associate's or bachelor's degree, perhaps not; a master's degree, maybe; a Ph.D. or doctorate, why wouldn't it be part of it? I taught myself C++ because I taught myself the 3D graphics pipelines, including DirectX 9.0c, 10, and 11, legacy OpenGL 1.0, and modern OpenGL from v3.3 - 4.2, along with their respective shader languages HLSL and GLSL, then later Vulkan, to build a 3D game engine / physics simulation from the ground up. From there I decided to tackle making a hardware emulator in C++, and I chose to go with the NES 6502. During that project and my research, I came across one of my all-time favorite TH-cam channels, and that is none other than Ben Eater's. I was so intrigued by his series that I ended up finding, downloading, installing, and using Logisim to build his entire 8-bit breadboard CPU. I had to do a few workarounds to implement his bus transceivers, but I was able to get all of the signal lines linked. I even had LEDs displaying out of the custom-built registers made from basic logic gates, mimicking what he did in his videos. I didn't use built-in components except for the actual multiplexers and a ROM component to substitute for his EEPROM holding the microcontroller's binary instructions. Other than that, everything was fully wired component to component with the entire bus visible. I did not use Logisim's tunnels. I even implemented his 7-segment displays.
I was able to set the 8-bit CPU to program mode, input his binary instructions into their respective memory locations one at a time, verify that they were correct, then switch it to run mode and start the simulator. I was able to produce the same output, and I was able to get it to display to the series of 7-segment displays with the proper output. Very rewarding. Again, 100% self-taught, all personal hobby. And yet, I fully agree with you: script kiddies with a CS degree who don't even understand the fundamentals of pointer arithmetic and basic memory management, and we wonder why there are billions of lines of code with trillions of bugs...
A memory allocator manages the virtual address ranges given to a process. That is usually much more complex than this, because it has to deal with an unpredictable sequence of allocations, deallocations, as well as resizing of objects of varying size. It has to be able to keep track of every byte used, be able to quickly find a continuous range of free bytes of any given size, while avoiding fragmentation and wasting too much space. And it can get a lot more complex if garbage collection is also needed. Paging on the other hand just deals with blocks of fixed size in tables. That said, don't be discouraged. Writing an implementation of malloc is something a CS graduate should be able to handle.
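In that spirit, here is a bare-bones first-fit free-list sketch in Python (illustrative only; a real malloc also handles alignment, block headers, splitting/coalescing policy, and thread safety):

```python
# A minimal first-fit allocator over a fixed arena. The free list
# tracks (offset, length) runs of unused space, as in the comment
# above; everything else a real allocator does is omitted.

class Arena:
    def __init__(self, size):
        self.free_list = [(0, size)]   # (offset, length) of free runs

    def malloc(self, size):
        for i, (offset, length) in enumerate(self.free_list):
            if length >= size:
                # carve the allocation off the front of this free run
                if length == size:
                    del self.free_list[i]
                else:
                    self.free_list[i] = (offset + size, length - size)
                return offset
        return None                    # out of memory

    def free(self, offset, size):
        # naive: just return the run (real allocators coalesce)
        self.free_list.append((offset, size))

arena = Arena(32)
a = arena.malloc(8)
b = arena.malloc(8)
assert (a, b) == (0, 8)
arena.free(a, 8)
c = arena.malloc(16)   # freed 8-byte run is too small: takes the tail
assert c == 16
```

Even this toy shows where fragmentation comes from: the freed 8-byte hole at offset 0 can't satisfy the 16-byte request, so the allocator reaches past it.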
I loved the name and liked to say: 'TWH_TL_SKIP_CNT'. The parameter in VAX/VMS controlled the number of times the Translation Lookaside Buffer (TLB) was probed before causing a page fault. This parameter was part of the system's memory management settings, helping to optimize performance by reducing the frequency of page faults. I'm sure DEC shipped the parameter at a slightly low value because after you've been on a VAX VMS performance course it's one parameter you could up straight away to improve the speed and prove to your boss the course was worth it.
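The probe-then-walk idea behind a TLB can be sketched in Python (the sizes, the LRU policy, and the flat page table here are arbitrary choices for illustration, not how VAX hardware actually worked):

```python
# Sketch of a TLB in front of a page table: translations are cached so
# most accesses skip the slow table walk. Tiny LRU cache, no ASIDs.

from collections import OrderedDict

PAGE_SIZE = 4096
TLB_ENTRIES = 4

tlb = OrderedDict()                      # virtual page -> frame
page_table = {page: page + 100 for page in range(32)}
walks = 0                                # count slow page-table walks

def translate(virtual_address):
    global walks
    page, offset = divmod(virtual_address, PAGE_SIZE)
    if page in tlb:                      # TLB hit: fast path
        tlb.move_to_end(page)
        frame = tlb[page]
    else:                                # TLB miss: walk the table
        walks += 1
        frame = page_table[page]
        tlb[page] = frame
        if len(tlb) > TLB_ENTRIES:
            tlb.popitem(last=False)      # evict least recently used
    return frame * PAGE_SIZE + offset

translate(0)          # miss
translate(8)          # hit (same page 0)
translate(PAGE_SIZE)  # miss (page 1)
assert walks == 2
```

Tuning a parameter like the one described above amounts to trading time spent probing the fast path against time spent falling into the slow one.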
Old guy here. So refreshing to see a young woman do such a great job explaining a highly-technical subject. Your illustrations, demonstrations, and delivery are exceptional. Keep up the great work.
@@kayakMike1000 Context switching sounds like a topic for another video, since it concerns the CPU registers and is of course dictated by the OS scheduler. But for this video the topic was memory specifically.
To answer your first question, yes. A long time ago. I have a different take than the intro is giving, though. Memory paging is a tool, but not strictly necessary to solve the problem of "running two programs at once", and can be used on a single-process, single-threaded OS. The real hero is context switching, which enables multi-threading on a single-core CPU; it's still of course used on virtually any multi-processing system as well, but threading is the essential tool.
Indeed pre-emptive multitasking was a huge milestone when it comes to running more than one program at once. Anybody who's had the (pleasure?) of using a cooperative multitasking system like Windows 3.1 or a system like the Amiga will know the pain that comes when programs don't play nice with regards to yielding the CPU.
Holy production quality, Batman! First, thank you for the awesome job you have done exploring this topic and taking us with you on that journey. Second, LAIN!!! Lain was the first anime I ever watched and will forever hold a special place in my heart. You made me smile with this feeling of adoring nostalgia; thank you for that.
I just discovered your channel and this is the first video of yours I've seen. I'm a semi-retired systems admin and I absolutely loved this video, excellent presentation and extremely informative, and as others may have noted, you looked great as 1962 Laurie, you pulled that look off very well.
Yo, you're a fantastic host and communicator, explaining what could be a boring topic with such energy in a very concise way. Wish you all the success!
That’s some high quality content! I have been programming for over a decade now and this is one of the best explanations I’ve seen! Keep up the good work, Laurie!
I'm not a programmer, but I like history and loosely understanding how some of the technology we use and rely on every day works. This production was exquisite for the history of memory allocation in computers, and also highly entertaining to watch! You were able to tap into my curiosity, which (for me at least) is something required to actually learn. Well done 👏
I'm a programmer myself, hand-assembling Z-80 machine language back in 1980, then the 6502 and 68000, and now I'm pretty much x86-based. I still do javascript, php, bash scripting and linux stuff, visual studio, and I use IDA Pro from Hex-Rays now and then. I love your videos; they are well made. Laurie is an awesome lady and I'm impressed. P.S. Love your pigtails and the '60s retro hairstyle. Glad I found you, friend. 🙂
With love: You pronounce the word contiguous wrong (as far as I know) - which usually means you read it before hearing it out loud. And I give props to those who do that. good for you
The more explainer videos I watch on computer science, the more I remember my books from the 80s that had little people running around in the CPU doing little tasks to achieve an objective.
Really crazy level of quality in these videos, especially all the examples and editing in this one. So glad I found this channel even if I'm only beginning to understand some of the lessons here, it's really helpful.
What an awesome explanation of a complex idea. Great touch in writing programs to illustrate the issues! And as someone who was a teenager in the 60's, I can say that you completely nailed the 60's look Laurie!
I thought I understood paging and TLBs during uni, but your explanations and visual demonstrations made it WAAAAAY more digestible than my textbooks. Amazing explanation!
The new production quality is amazing and you're absolutely a natural when it comes to doing those skits. I personally think the part with the physical cubes was a little too dumbed down for your audience - I totally get it, it's very hard to find the right level of abstraction for explaining these things in an entertaining way. Very much looking forward to everything you make!
Where were you when I was learning IBM MVS back in the '80s? You offered wonderful explanations with simple props, and your presentation was beautifully done in an entertaining way. Consider yourself bookmarked!
Wow, was not expecting this… This was sooo well done... interesting and absolutely captivating! Many thanks to the LaurieWired Team. This is the very best of TH-cam, content creation, and science popularization🎉
Your videos are some AMAZING levels of production i cannot under-emphasize that! Major kudos to Laurie and her fantabulous editing and production teams!
The Amiga did it without paging... Carl Sassenrath was the genius who did it. The mechanism was called Exec, and the machine didn't have an MMU, so no virtual memory, but it did have a very useful dynamic RAM disk. Truly amazing for the time, as it was real pre-emptive multitasking, not like the cooperative multitasking used on the pre-Win95 Windows OSes of the time. It worked especially well if you had real Fast mem installed; not bad on a machine that was an MC68000. But it did have some amazing custom chips which offloaded a lot from the CPU.
What was the trick of exec compared to the virtual memory method? Anyway, the Amiga was an amazing piece of hardware way ahead of other computers of the time.
The Amiga sprang instantly to mind when Laurie mentioned paging enabling modern day multitasking. I know the Amiga didn't have paged memory, yet was still full pre-emptive. And all this in 1985 and with as little as 256k of RAM.
The story is certainly fascinating. Although I learned about most of the topics covered on this channel many years ago, I have rarely seen such good and detailed explanations, which means I'm still learning little details. Thank you, Laurie, for your great videos!
Legitimate question: is contigous a word? It seems like it should be, but when I try to Google it, I get "contiguous" and the pronunciation really emphasizes the U. 🤔 PS: Thank you for importing me.
@@jamesarthurkimbell It makes me question the authenticity of the presenter as an actual programmer instead of an actress pretending to be one. Clearly there is a production company behind all of these thirst trap style 'geek girl' videos in recent years. It looks like the strategy works though.
@@tr1p1ea Weird thing to just jump to… her authenticity as a programmer shouldn't be questioned just because she misused a single word. And even then, the amount of praise she's getting on this video and all her other videos, plus the catalogue of videos she made a while back, should be enough…
What a wonderful, splendid video! I am a >50-year-old guy from Austria who grew up with the Commodore 64, and machine-language programming fascinated me. But I had no memory management at that time in the mid 1980s. Although this was fascinating, because as a programmer I could have the entire machine to myself, doing all the memory managing yourself often led to mistakes and sometimes spectacular crashes. Along came the 286 and 386 and 486 with Intel's architecture and all kinds of memory paging (himem.sys, emm386.exe, and so on), and I first had to get used to the fact that the programmer no longer needs to know where in memory to put his/her programs and data exactly. I did NOT know that the concept of memory management went all the way back to the early 1960s. You young guys give me a refreshing, warmhearted impression that there are people around who really think about how the technology that surrounds them works. Great video, please continue!
Nice way to explain these underlying concepts. I started working in computers in the late 70’s when memory, program execution and disk space was tiny, unimaginable today.
Yes, but with a proper browser, such as anything that isn’t chrome, you get something called working memory “garbage collection”, and the problem disappears.
Thx for refreshing my... memory on this topic! If only much more content was so well prepared!!! research/scenario/cam/lighting/editing/styling and acting! Excellent work Laurie!
The thumbnail was kind of repelling at first glance but the video is top notch. I instantly subscribed. I love this channel now thank you so much for this video.
I'm one of the engineers that designs the TLBs on modern x86 processors. This is a fantastic explainer and I will absolutely forward this to folks that ask what I work on!
I applaud you for successfully dodging a minefield of legacy crap every day! I was just thinking that for the beginners, anything _but_ x86 might be a better place to start, because x86 has so much historical baggage.
So cool. I'm a master's student about to start a PhD in computer architecture (RISC-V). I found this video very entertaining even though I already knew everything, hahaha.
just curious: what was your major?
@@mal2ksc x86 is only really legacy on the surface. Under the hood, at the level where the MMU is working, it's a superscalar RISC-style core with an x86 translation layer grafted on top.
@@mal2ksc It's all the legacy baggage that kept Intel ahead, everything is backward compatible to x86 and (with a bit of translation) 8080. If Zilog had gotten the Z8000 out 6 months earlier, things might have been different.
My mind is positively blown by how high-effort this video is. 60s aesthetic, Python demos, physical models?! Absolutely insane. Add the clear and approachable explanations and you have top tier educational content on this platform. Splendid!
Came to the comments to say the same thing! I'm used to great working examples and explanations (and fun video effects!) but the extra details, wow.
Not to mention the servo control in the MMU block. Over The Top.
You said it so well. I agree :-)
the hair, the dress, the chair.. only the start of it all :D amazing effort
It’s almost like they made videos for the “Head First” books
Please don't stop releasing this kind of content. This is insanely good, Laurie. Seriously.
I do wonder if she's able to do any actual work as well then.. This must take a lot (!!) of time to make.
@@JaapVersteegh so Microsoft must have been wrong for almost 4 years ;d
Not only that but, and I mean this with complete and utter respect, she is as cute as a button and that makes her videos even more appealing, the right combination of personal charisma, details, fun and accessible!
I'd have paid a kidney for a teacher like her, when I was learning this stuff!
simp
@@michaeljones1686 I am absolutely a simp for computer science content.
The production value is insanely high on these, script is clear and concise and the line delivery on point. 10/10 would recommend
Come for the programming stay
For the pretty clothes omg girl ❤
As a virtual memory manager programmer from back in the 80's, I am very impressed with the depth and clarity of your description. Well done!
Back in my day I was a virtual memory consultant
@@moritz584 Is your consulting rate adjusted for inflation since the 80s?
What is a virtual memory manager programmer? I never heard that term used
@@rty1955 maybe you have, but you overwrote the memory!
A virtual memory manager (VMM) is one commonly-used name for the operating system component that implements virtual memory. A person working on things like that can reasonably be called a VMM programmer.
It's been 25 years since I've thought about how a computer program uses actual physical memory. This was an entertaining refresher along with a history lesson that is new to me. Very well done.
You have definitely found your calling in life. Part of what makes computing so powerful is you can be working on top of many layers of brilliant innovations that are usually transparent to you; downside of that is pioneers can get forgotten, so thank you for doing such a good job of highlighting their contributions
"If I have seen further it is by standing on the shoulders of Giants."
-Isaac Newton
"working on top of many layers of brilliant innovations" Could not have hit the proverbial nail on the head more accurately good sir.
You struck the perfect balance between informative content and entertaining visuals.
I hope you are able to make a living creating your videos, you are truly gifted for this kind of work!
I love how concepts from 50+ years ago are still relevant today.
PS: great 60s aesthetics, appreciate the editing effort next to nerdy content.
Really much of computer science has been about refining early concepts into efficiently usable functions and building hardware to efficiently support them. You can look at the failures of some systems (like the Pentium 4's prediction/cache miss penalties) to see this directly. And you can look at how old languages like Lisp already had capabilities similar to modern ones, minus hardware optimisations and libraries (Lisp dialects are still in use).
Most of computing is exactly the same as any logistics.
The processing is a small part of the puzzle and is impossible at scale without the grunt work of ports, trucks, highways, warehousing, shelving, forklifts, and every other form of logistics you can think of.
Those functions will never really change much, so the solutions of 50 years ago work the same today - even if they're performed faster.
@@myne00 Exactly
60+ years ago, 1962 is 62 years ago.
@@guybolt The lambda calculus was developed in the 1940's and forms the bedrock of formal logic for modern computing. Turing's work in the 1930's informed this very heavily. Of course this was pre-dated by pioneers in programmable logic through mathematics and mechanical design a century beforehand in the 1840's by people like Lovelace and Babbage.
TL;DR: There's no point in nitpicking over a decade when many "modern" computing concepts are minimally 180+ years old.
In the IBM world of computing, which I learned in 1972, there is an intermediate stage between fixed address programming and Virtual Storage. It's called relative addressing. Instead of using fixed addresses, the program would load the starting memory address into a base register. The program was compiled using a relative address from the base address. Thus, the program could start anywhere in memory.
System/370 used Virtual Storage and was the successor to System/360, which was limited to physical memory. P.S. At my first programming job in 1974, the computer was a 370/125 and had only 92K of actual physical memory.
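The base-register scheme described above can be sketched in a few lines of Python. This is a toy model of the idea, not IBM's actual instruction format: the program image stores only offsets, and the base address chosen at load time turns each offset into a physical address.

```python
# Toy model of base-register relative addressing (illustrative only):
# a program stores offsets; a base register chosen at load time
# converts each offset into a physical address.

def load_program(memory, program, base):
    """Copy a program's words into memory starting at `base`."""
    for offset, word in enumerate(program):
        memory[base + offset] = word

def fetch(memory, base, offset):
    """Base-register addressing: effective address = base + offset."""
    return memory[base + offset]

memory = [0] * 64
program = ["LOAD", "ADD", "STORE"]

# The same unchanged program image works at either base address.
load_program(memory, program, base=10)
load_program(memory, program, base=40)
assert fetch(memory, 10, 1) == "ADD"
assert fetch(memory, 40, 1) == "ADD"
```

The point is that nothing inside the program image had to change between the two loads; only the base register differs.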
60’s Laurie must be a time traveller if she’s reading my favourite book from ‘79 ❤
Didn't really think about that😅
Truly ahead of her time
We (including my friend) still think you are an AI or a transgender @@lauriewired
I thought it was '76, but as there is a copy 4 feet away from me, I checked and you are correct.
I saw the year 1979 and correctly guessed what book it would be. All time classic.
I'm a retired Computer and Electrical Engineer, and I gotta say, Laurie, you did an excellent job explaining this. 👍🤓 Also, I dig the '60s look! You pulled it off well. ❤ Brings back some memories (pun intended...sort of). I remember the days of paper punch tape, core memory, drum storage, the early Winchester disk drives, and when "monitors" were a teletype machine that physically burned the dots into the paper using a single-pin dot-matrix print head. As for programming, I've used binary-coded octal, BASIC (including BASIC-A, Q-BASIC, GW-BASIC, Visual BASIC, and Visual BASIC for Applications), Pascal and Turbo Pascal, C and C++ and Turbo C++, Fortran IV, Assembler, and Machine Code. Here's an old programmer's joke for you: "Do you know how to tell who is a C programmer? They're found in a dark basement corner, muttering to themselves as they slowly go mad." 😵💫
"Good programmers copy... great programmers steal..." ,-P
Engineers get a bad rap for a supposed inability to explain themselves. And some of that is deserved. I'll explain. They are usually interfacing with the requirements and materials folks et al., and it's customary to be very precise, very succinct. They assume people know the argot, because usually they do. And many just cannot believe everyone isn't fascinated by the detail of it all. This adds up to a custom that is a real liability when they need to break things down to the user-sponsor level. Rare is the individual who can effortlessly dwell in minutiae land, and then abstract it to explain to the neophytes.
@@boxsterman77 you evidently have that ability.
LOL! yes me too, old time main frame guy here. Still in the business 47 years later. Great content, earned a sub.
This video deserves a Turing award for production quality.
I feel like this video alone would get a high mark as a university thesis.
The video is first rate.
The audio has room for improvement.
It's better at 11:05
Most of the vocals sound a touch tinny and far away.
Agreed.
@@Cyba_IT she left out Lynn Conway, who developed and published how all this was supposed to work. Kind of a big hole for a university paper.
@@salerio61 Since she died a few weeks ago, maybe it's the best time to devote a whole video to her.
Holy moly, you really went all-in with this one! It's a rare occasion where one would do a video about such a low-level thing and actually dress up in era-appropriate style. Your videos are super interesting, well produced, and cute as heck to top it off!
your educational content on foundational computing concepts is simply the best there is. and on top of it the production quality is incredible. well done
First time viewer of your channel and I must say "wow!". You put A LOT of production time explaining these concepts. Very well done.
Outstanding content, a largely underrated channel that should be part of any computer science syllabus.
I spent the better part of the 1990s teaching prospective systems engineers, and I have to say that this is one of the best illustrations I've seen on this topic. It brings back fond memories of those days. Thanks for that. You are a natural! Well done.
Mind blowing how Laurie from 60s got access to Gödel, Escher, Bach!
I never read it, but I bought it. I had not noticed that it was in the wrong time frame; 1979 is a bit late for the 60s, which I'd say started around 1964 and ended when the draft ended.
She IS a time traveler after all... works both ways. no mystery.
Funny how it's just her video that's a bit of fun.
I'm sure she's not being factually correct.
These videos are brilliant, only let down by you clods coming into the comments saying the things you are.
Watch the content for what it is. 🙄
@@cgi2173
Seems factually correct. Did you see a real error that is significant?
she's just that good
It's incredible how far you go for a simple skit, from almost dying from smoking asbestos-filtered cigs, to cloning yourself and getting rid of the body afterwards.
The clone was a side-effect of the time-travel machine she used
The smoking laugh cough nearly killed me, that was fun to watch as someone who's quitting smoking. (Don't start, it's not worth it.)
Contiguous con tig you us. Otherwise great video!!!! I love this channel! Extremely clear explanations
Thank you. Thought I was watching AI....which I HATE.
My head was about to explode. I had to stop the video and look up the word to make sure I hadn't been living a lie all these years.
It repeatedly grated. Interesting exposition, though.
A very good, but also very simplified explanation. For example: Between the old completely fixed memory programming and virtual memory another important concept was invented: Relocation. This meant that programs no longer had to directly specify physical addresses, but instead could specify a base address of zero and all other addresses in the program were relative to this base address. The operating system or program loader would then select a new base address and rewrite the program so all addresses became absolute before running it.
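The rewrite-at-load-time step described above can be sketched with a toy relocation table. Names and the "executable format" here are invented for illustration, not any real loader: the image is compiled as if loaded at address 0, and the table lists which words hold addresses that must be patched once the real load address is known.

```python
# Toy model of load-time relocation (illustrative, not a real format):
# the relocation table records which words in the image are addresses.

def relocate(image, reloc_table, load_address):
    """Return a copy of `image` with every address word patched."""
    patched = list(image)
    for index in reloc_table:
        patched[index] += load_address
    return patched

# Word 1 is a jump target (address 5 relative to base 0); word 2 is data.
image = ["JMP", 5, 42]
reloc_table = [1]            # only word 1 contains an address

loaded = relocate(image, reloc_table, load_address=100)
assert loaded == ["JMP", 105, 42]   # jump target is now absolute
```

Note the data word 42 is untouched; without the table, the loader could not tell addresses apart from ordinary constants.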
60s aesthetics is just amazing!
I'm quite impressed with how skillfully Laurie has taken a relatively unknown/uninteresting subject and presented it in such an accessible, engaging, and creative way!!! This was a good refresher, and the visuals were memorable. Great production value too! Excellent content!
My god, this video deserves an award.
Thanks for remembering that it was the Atlas that first had virtual memory. However, there was a stage before that: memory segmentation, a simpler form of memory management. Each user program had its own address space which was mapped onto real memory using segmentation registers. The big difference was that segments tended to be fairly big and still had to be in main memory. Segments could be swapped out, but this was an expensive operation in terms of time. Atlas-style paged memory divided physical memory into equal-sized pages that could be read from and written to the paging store. As they were smaller, it was easier to fit multiple programs into a system concurrently. Btw, the Atlas was being used under the Titan operating system until the mid seventies. By that time multiple manufacturers were producing demand-paged systems. The first microprocessors with memory management followed the segmented route and didn't get proper memory management until much later.
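The contrast drawn in this comment can be shown side by side in a small sketch. Both table layouts and sizes are invented for illustration: a segment is a variable-sized contiguous region checked against a limit, while pages are fixed-sized and can land in any frame.

```python
# Rough sketch contrasting segmentation and paging (toy sizes).

PAGE_SIZE = 4  # absurdly small, to keep the example readable

def translate_segment(segment_table, seg, offset):
    base, limit = segment_table[seg]
    if offset >= limit:
        raise MemoryError("segmentation fault")
    return base + offset            # variable-sized, contiguous segment

def translate_page(page_table, vaddr):
    page, offset = divmod(vaddr, PAGE_SIZE)
    frame = page_table[page]        # fixed-size pages, frames in any order
    return frame * PAGE_SIZE + offset

seg_table = {0: (100, 8)}           # segment 0: base 100, length 8
page_table = {0: 7, 1: 2}           # virtual pages 0, 1 -> frames 7, 2

assert translate_segment(seg_table, 0, 3) == 103
assert translate_page(page_table, 5) == 9   # vaddr 5 -> frame 2, offset 1
```

The segment must be resident as one big block, which is exactly why swapping it was so expensive; each small page can come and go independently.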
A well paced informational youtube video that doesn't keep throwing info to your face along with hundreds of cuts in 2024? Count me in, great work!
Except this video describes memory mapping and calls it virtual memory which is something else. It's a shame since it's explained well but uses the wrong terms and therefore is increasing the confusion instead of providing clarification.
One of the most underrated channels on this entire platform. Excellent educational content presented at an unreasonably good production value too
Plot twist: Programming is just a side gig for Laurie. This whole YouTube series is "just" a clever way to showcase her impressive video editing and acting skills to land a role in Hollywood. Yep, that must be it.
Hollywood doesn't deserve her, and she should stay as far away as possible as to not put herself in a situation to become manipulated and or corrupted. Just keep doing what you're doing Laurie!
Hollywood can't make any money any more.
You won't make any money on YT either; I haven't in 15 years. Not one penny, on a $50,000 investment of labor. You create because you want to see what you create. To do otherwise is barking up the wrong tree on this platform.
what?
Reminds me of Hedy Lamarr.
@@niksto Yes! Page switching isn't that different from frequency hopping.
hello laurie! this is the first video of yours I've ever seen, I'm amazed youtube didn't recommend you sooner because it's right up my alley.
thanks for the awesome video, I'm going to go watch your whole back catalogue now
Computer science is the art of abstraction and studying it is like becoming disillusioned, but also amazed at the same time. It's a bit like magic.
A cpu is a flattened rock with stored lightning
its close enough 😂😂😅😂😅😂
OMGAWD, That 1960's voice over was SPOT ON ! I SMILED SO BIG ! That was pretty good !
By the time core memory was the dominant memory type, the programmer was not selecting the load address for most batch systems, instead the link loader would fixup relocatable references, so multiple programs could be loaded into the same address space so long as they fit in physical memory.
This was also common in the pre-386 MS-DOS where various utilities could load into RAM in the "terminate and stay resident" (TSR) model. This requires a relocation table as part of the executable image format.
I wrote a lot of TSR programs along with a multitasker for DOS in the 1980s.
Virtual memory allows sharing physical memory between multiple processes.
I have been working on operating systems on and off since 1979 on computers with and without virtual memory management. In the 1980s, I wrote an overlay linker for the M68000 that included a runtime that could swap sections of code into the same physical memory areas to allow programs larger than available physical memory to run without an MMU.
The PDP-11, on the other hand, had a 64 KiB virtual address space (16-bit addresses), but most models had an MMU that supported 4 to 256 times that much physical memory. Kind of the reverse of how today's virtual memory works.
The IBM 360 didn't have a relocatable loader. You could, however, do a link & go, but that required all associated libraries to be present. On the 360 you could define a BG & FG process, but your code had to have been linked for either partition in order to work. You could NOT run a job that was linked for BG in the FG partition. It would crash, as there was no relocatable loader.
I've been programming since 1969 (1401, 360, 370, 4300, 390).
The early minutes of her video reminded me of the era of the late 80s into the early 90s when the Macintosh "System" OS transitioned from "Finder" (unitasking, apart from the Desk Accessories, akin to TSRs), to "Multifinder" (cooperative multitasking). There, many of the Macs which System 6 & 7 could run on may not have had an MMU. You'd pull up the memory info window (I forget now its proper title, was it just "System Info"?) and it'd show you the physical memory regions occupied by each program you'd launched. From Multifinder, if you started Program A, then Program B, and later closed Program A and went to start Program C, if there wasn't a free region big enough after Program B's occupied area, or if the hole left from the closing of Program A wasn't big enough, Multifinder would report to you that Program C couldn't start, because there was not enough memory! -- Just like her little demo (almost)!
What really gets me, looking back, is that many of the later M68K series CPUs (I think from the 020 onward) powering Macs of that era actually /had/ MMUs and could do virtual memory and paging operations...but the Mac OS of that era, while looking so good compared to Windows 3.11, was suffering from pretty severe tech-debt by that point, and didn't utilize the hardware to its fullest potential.
And then in her video, she displays a MacOS-y looking demo with the "Copland" codename displayed. I think this is the point when the Mac's OS finally gained virtual memory and paging support.
I'd stopped using Macs on the regular after System 7.1. By then after a brief period of hell using PCs under DOS 6.22 and Win3.11, I'd gotten into them under Win95 (which was just enough Mac-like to please me and help me not regret leaving the Motorola-powered Macs behind) and WinNT 3.51 (which looked like the homely basic Win3 PC, but was VMS-like underneath), and near-simultaneously discovered the new and novel Unix-like early Linux distributions for PC. I stayed with Windows too long...until Vista. But I've been a Linux-man ever since 1996.
It was good to hear someone reference the key contributions of the Manchester University, England research teams to the development of modern computer systems. The UK built the first programmable computer and followed this work with the seminal series of MU Computing systems. It was truly remarkable that so many complex software systems could be built upon the crude hardware of the times. They were a nightmare...
Hats off to MU, who showed what could be done. How about a look at the UK GEC systems next? IIRC they pioneered a highly secure OS design....
Great to see the new generation appreciating the venerable history of computer technology. This is a great set of videos for young people getting into computing, very nicely done.
Contagious! vs! Contiguous! .. Fight!
So, I'm not the only one who was hearing "contagious"... Why is she touching those blocks of contagious memory with her bare hands?
Viral video becomes a contagious memory, when viewed contiguously.
Contentious vs Contiguous
No such word as contigious. She means contiguous.
con-TIG-you-us
@@davidgari3240 we know what she means... But she pronounces it as contagious, which is a word.
As a 1970's graduate of EE, I can say I really appreciate this material. First encountered virtual addressing on a VAX-11/750. It was life changing, but I have to say I miss all the paging-out of portions of tasks to load another overlay :)
I used to be pretty good at coding ODL files for RSX11M . While it was a welcome change when VMS overtook RSX and made overlays obsolete, I do miss the challenge. It was like solving a puzzle.
@@njphilwt TKB Forever! and Ever and Ever and Ever and Ever... (I used to have a round button from a DECUS with this written in a circle around the edge.)😁
It sounds like you are talking about something similar to the overlays that many DOS programs used in the early '90s to swap portions of themselves in and out of memory, since the whole program could not fit into memory at the same time, and that was NOT full-on paging. dosshell used something like that before memory paging too, so that you could swap between multiple programs that were all "loaded" at the same time, but only one was ever really running; the whole thing had to be saved to disk and another loaded into RAM when you wanted to switch between programs. The background program was fully suspended when swapped out, and only the one you were actively looking at was in RAM and running.
I was thinking about my time with an Interdata Model-70 computer, where I had to manually give the linker/loader instructions about which code segments were to be overlaid against others. It took a good bit of planning and code re-writing to make sure the module sizes were all compatible with the available memory. I learned a whole lot about memory management, which immediately became obsolete with the implementation of automatic hardware-handled virtual memory under VMS.
@@phillipsusi1791 Yes - I remember that I used to store the results from each overlay into global common, then chain to the next overlay....
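The overlay scheme this thread describes can be modeled in a few lines. Everything here is illustrative, not any real linker's runtime: one shared region of memory holds a single overlay at a time, and calling into a non-resident overlay swaps it in first.

```python
# Toy overlay manager: only one overlay is "resident" at a time,
# and calling into another one counts as a swap from disk.

class OverlayManager:
    def __init__(self, overlays):
        self.overlays = overlays    # name -> dict of functions "on disk"
        self.resident = None
        self.swaps = 0

    def call(self, overlay, func, *args):
        if self.resident != overlay:
            self.swaps += 1         # load the overlay into the shared region
            self.resident = overlay
        return self.overlays[overlay][func](*args)

mgr = OverlayManager({
    "math":   {"square": lambda x: x * x},
    "string": {"shout": lambda s: s.upper()},
})

assert mgr.call("math", "square", 4) == 16
assert mgr.call("math", "square", 5) == 25   # still resident: no swap
assert mgr.call("string", "shout", "hi") == "HI"
assert mgr.swaps == 2
```

Minimizing `swaps` is exactly the puzzle the ODL files solved by hand: grouping routines so calls rarely cross overlay boundaries.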
Oh, if only we had teachers and instructors of your calibre back in the 70s when I was learning this. Laurie, you are amazing!
There are two concepts here which are co-mingled. First is virtual memory (each user process has its own memory space), and the second is demand paging (virtual memory can be marked "filled" or "not filled" with supporting bits in the hardware and an OS kernel service to change the state by "paging" from external storage such as a disk or ssd to backing physical memory).
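The split this comment draws can be modeled directly: the per-process page table is the virtual-memory part, and the "present" bit plus a fault handler is the demand-paging part. A toy Python model follows; every name in it is invented for illustration.

```python
# Toy model: page table with a present bit, plus demand paging.

PAGE_SIZE = 4

class PageTable:
    def __init__(self):
        self.entries = {}           # page -> (frame, present)
        self.faults = 0

    def access(self, vaddr, allocate_frame):
        page, offset = divmod(vaddr, PAGE_SIZE)
        entry = self.entries.get(page)
        if entry is None or not entry[1]:
            self.faults += 1        # page fault: "read it in from disk"
            self.entries[page] = (allocate_frame(), True)
        frame, _ = self.entries[page]
        return frame * PAGE_SIZE + offset

frames = iter(range(100))
allocate = lambda: next(frames)
pt = PageTable()

pt.access(0, allocate)              # faults: page 0 not yet present
pt.access(1, allocate)              # same page, already present
pt.access(4, allocate)              # faults: page 1
assert pt.faults == 2
```

Drop the present bit and the fault handler, and you still have virtual memory (separate address spaces), just without demand paging, which is exactly the distinction made above.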
That's a pretty neat coincidence, I have a test on this subject tomorrow. I figured I'd watch something on youtube after studying.. and welp. I guess Virtual Memory is my destiny lol. Nice watching this as an overview on the subject!
Also, great production!
I nearly always speed up videos 2x - 3x on YouTube, especially informational videos. The entertainment, engagement, and quality of this video are so good, I didn't even notice I was watching it at regular speed till I was over halfway through. Great work.
3x???
Sometimes as high as 4x if the speaker has a slow cadence. There is a Chrome extension, Video Speed Controller, that puts an overlay on most video websites; it works on YouTube and Udemy, which is where I use it most. Audible lets you listen at up to 3.5x speed, and I find that a lot of non-fiction is read fairly slowly, especially for content like The Great Courses, where it often sounds like a college professor giving a lecture, speaking much slower with longer pauses for thinking.
the production quality of these videos is getting even more insane lately, keep up the great work
Wow. The production quality on this video is amazing. I was unaware that anyone else cared this much about how computers do what they do.
The cool thing about this is the fact that we now don't have to worry about most of this low-level stuff. I went through my computer science degree without ever writing my own memory allocator and grew afraid of going this low level, but it is in fact fun, and even simpler than I thought.
In the area I work, every last call matters so I often find myself doing hobby projects in which I put myself working on bare metal and I love to see the difference when you have no choice but optimize every byte. I have made some simple stuff for the original game boy and commodore 64 in pure 6502 asm and it is impressive to see how much you can do with what you judge to be "so little"... accomplishing such projects, simply feels good.
And taking the opportunity of feeling good about something: I feel like (and hope) you are having a blast doing these totally unnecessary skits and theme around a very informative video.
Your videos amaze me for the content and creativity. They are very well produced. I hope this is sustainable and you can keep talking about more tech stuff, as this is a rare find between being informative and entertaining.
Please, keep it up.
Pathetic that you can get a CS degree without learning this.
@@Sven_Dongle I'm 100% self-taught in C/C++, and you'd think that a CS degree would require learning at least how to translate ASM to C and C to ASM. I wouldn't go as far as knowing how to design your own ISA and computer architecture layout, as well as providing an ABI interface for the kernel of some arbitrary operating system. I could see that within an Electronics Engineering degree, a Hardware Engineering degree, and a Software Engineering or Systems Engineering degree. As for a CS degree, that might depend on the level of the degree: an associate or bachelor's degree perhaps not, a master's degree maybe, and for a Ph.D., why wouldn't it be a part of it? I taught myself C++ because I taught myself the 3D graphics pipelines, including DirectX 9.0c, 10, 11, legacy OpenGL 1.0 and modern OpenGL from v3.3 - 4.2 along with their respective shader languages HLSL and GLSL, then later Vulkan, to build a 3D game engine / physics simulation from the ground up. From there I decided to tackle making a hardware emulator in C++, and I chose to go with the NES 6502. During that project and my research, I came across one of my all-time favorite YouTube channels, none other than Ben Eater's. I was so intrigued by his series that I ended up finding, downloading, installing and using Logisim to build his entire 8-bit breadboard CPU. I had to do a few workarounds to implement his bus transceivers, but I was able to get all of the signal lines linked. I even had LEDs displaying output from the custom-built registers made of basic logic gates, mimicking what he did in his videos. I didn't use built-in components except for the actual multiplexers and a ROM component to substitute for his EEPROM holding the microcontroller's binary instructions. Other than that, everything was fully wired component to component with the entire bus visible; I did not use Logisim's tunnels. I even implemented his 7-segment displays.
I was able to set the 8-bit CPU to program mode, input his binary instructions into their respective memory locations one at a time, verify that they were correct, then switch it to run mode and start the simulator. I was able to produce the same output, and I got it to display to the series of 7-segment displays with the proper output. Very rewarding. Again, 100% self-taught, all personal hobby. And yet, I fully agree with you: script kiddies with a CS degree don't even understand the fundamentals of pointer arithmetic and basic memory management, and we wonder why there are billions of lines of code with trillions of bugs...
Back when hundreds of thousands of instructions per second was fast.
A memory allocator manages the virtual address ranges given to a process. That is usually much more complex than this, because it has to deal with an unpredictable sequence of allocations, deallocations, as well as resizing of objects of varying size. It has to be able to keep track of every byte used, be able to quickly find a continuous range of free bytes of any given size, while avoiding fragmentation and wasting too much space. And it can get a lot more complex if garbage collection is also needed. Paging on the other hand just deals with blocks of fixed size in tables. That said, don't be discouraged. Writing an implementation of malloc is something a CS graduate should be able to handle.
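The comment above can be made concrete with a deliberately small first-fit allocator. This is toy code along the lines described, not a production malloc: a sorted free list of (start, length) holes, split on allocation and coalesced on free.

```python
# Toy first-fit allocator over a sorted free list of (start, length) holes.

class Allocator:
    def __init__(self, size):
        self.free = [(0, size)]

    def malloc(self, n):
        for i, (start, length) in enumerate(self.free):
            if length >= n:                   # first hole that fits
                if length == n:
                    self.free.pop(i)
                else:
                    self.free[i] = (start + n, length - n)
                return start
        raise MemoryError("out of memory")

    def free_block(self, start, n):
        self.free.append((start, n))
        self.free.sort()
        merged = []                           # coalesce adjacent holes
        for s, l in self.free:
            if merged and merged[-1][0] + merged[-1][1] == s:
                merged[-1] = (merged[-1][0], merged[-1][1] + l)
            else:
                merged.append((s, l))
        self.free = merged

a = Allocator(100)
p = a.malloc(30)        # -> 0
q = a.malloc(30)        # -> 30
a.free_block(p, 30)
a.free_block(q, 30)     # coalesces back into one 100-byte hole
assert a.free == [(0, 100)]
```

Even this tiny version shows where fragmentation comes from: free a block in the middle and a request bigger than any single hole fails, even when the total free space would suffice.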
I loved the name and liked to say: 'TWH_TL_SKIP_CNT'. The parameter in VAX/VMS controlled the number of times the Translation Lookaside Buffer (TLB) was probed before causing a page fault. This parameter was part of the system's memory management settings, helping to optimize performance by reducing the frequency of page faults.
I'm sure DEC shipped the parameter at a slightly low value because after you've been on a VAX VMS performance course it's one parameter you could up straight away to improve the speed and prove to your boss the course was worth it.
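In the spirit of the parameter this thread discusses, here is a toy TLB sitting in front of a page table. It is purely illustrative; a real TLB is an associative hardware cache, not a Python dict.

```python
# Toy LRU TLB in front of a page table, counting hits and misses.

from collections import OrderedDict

class TLB:
    def __init__(self, capacity, page_table):
        self.cache = OrderedDict()            # page -> frame, LRU order
        self.capacity = capacity
        self.page_table = page_table
        self.hits = self.misses = 0

    def lookup(self, page):
        if page in self.cache:
            self.hits += 1
            self.cache.move_to_end(page)      # refresh LRU position
            return self.cache[page]
        self.misses += 1                      # walk the page table instead
        frame = self.page_table[page]
        self.cache[page] = frame
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)    # evict least recently used
        return frame

tlb = TLB(capacity=2, page_table={0: 5, 1: 6, 2: 7})
for page in [0, 0, 1, 0, 2, 0]:
    tlb.lookup(page)
assert (tlb.hits, tlb.misses) == (3, 3)
```

Tuning a knob like the one described amounts to trading lookup work against miss handling, which is why one parameter change could visibly move performance.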
Old guy here. So refreshing to see a young woman do such a great job explaining a highly-technical subject. Your illustrations, demonstrations, and delivery are exceptional. Keep up the great work.
I think a picture would have helped. There's a whole world in here that goes way beyond MMU and virtual memory. Context switching comes to mind...
Sounds like an idea for a future episode.
@@kayakMike1000 Context switching sounds like a topic for another video, since it concerns the CPU registers and is of course dictated by the OS scheduler. For this video the topic was memory specifically.
@@kayakMike1000 When I learned this in college the white board was completely full of boxes, tables and arrows.
WOW what am I looking at? A super awesome video edited to perfection and explained to perfection. This is a gem! Love at first sight!
To answer your first question, yes. A long time ago. I have a different take than the intro is giving though. Memory paging is a tool but not strictly necessary to solve the problem of "running two programs at once", and can be used on a single-process single-threaded OS. The real hero is context switching which enables multi-threading for a single core CPU, still of course used on virtually any multi-processing based system as well, but threading is the essential tool.
Indeed pre-emptive multitasking was a huge milestone when it comes to running more than one program at once. Anybody who's had the (pleasure?) of using a cooperative multitasking system like Windows 3.1 or a system like the Amiga will know the pain that comes when programs don't play nice with regards to yielding the CPU.
When she started describing the problem at the beginning, first thing I thought of was "context switching."
Context switching + linking loaders, where the OS managed load addresses and mapped internal references to hardware addresses, pre-runtime.
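The context switching this thread keeps coming back to can be sketched with Python generators standing in for saved register state. Everything here is illustrative: each `yield` is a task handing the CPU back, and the generator's suspended frame plays the role of the saved context.

```python
# Cooperative context switching sketched with generators.

def task(name, steps, log):
    for i in range(steps):
        log.append(f"{name}{i}")
        yield                      # hand the CPU back to the scheduler

def round_robin(tasks):
    """Run tasks one time slice each until all have finished."""
    while tasks:
        t = tasks.pop(0)
        try:
            next(t)                # "restore context" and run one slice
            tasks.append(t)        # generator keeps its state: "saved context"
        except StopIteration:
            pass                   # task finished; drop it

log = []
round_robin([task("A", 2, log), task("B", 2, log)])
assert log == ["A0", "B0", "A1", "B1"]
```

Preemptive multitasking differs only in who triggers the switch: a timer interrupt instead of the task's own `yield`, which is exactly why badly behaved programs could hang cooperative systems like Windows 3.1.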
I would never have expected such a highly polished video on this topic. Great job!
Holy production quality batman. First, thank you for the awesome job you have done exploring this topic and taking us with you on that journey. Second, LAIN!!! Lain was the first anime I ever watched and will forever hold a special place in my heart. You made me smile with this feeling of adoring nostalgia - thank you for that.
Great video.
Easy to follow, nicely paced, no more jargon than was necessary, and a great style.
You are very good at this.
Thanks.
First time viewer.
this felt on par with something like veritasium for production quality and explanation but for computer science. really well made!
I just discovered your channel and this is the first video of yours I've seen. I'm a semi-retired systems admin and I absolutely loved this video: excellent presentation and extremely informative. And as others may have noted, you looked great as 1962 Laurie; you pulled that look off very well.
Yo, you're a fantastic host and communicator, explaining what could be a boring topic with such energy in a very concise way. Wish you all the success!
Laurie is definitely working overtime, with a full-time job and these amazing videos with awesome edits. Truly great, thanks so much!
i can't believe i just found the best channel in the world
Who is this person?? Brilliantly explained, excellently edited, then she picks up godel escher bach?! I think im in love.
@@josugambee3701 we all do, bruh
She also cosplays as Asuka
IKR. She is like the perfect nerdy pixie dream girl.
@littlemeg137 That makes sense.
That’s some high quality content! I have been programming for over a decade now and this is one of the best explanations I’ve seen! Keep up the good work, Laurie!
Impressive introduction to memory management, great job! Saved in the long-term memory in my brain.
I am so happy that I came upon this in my recommended page, you reminded me that CS is more than JS frameworks
The production value here is insane. Great video!
I'm not a programmer, but I like history and loosely understanding how some of the technology we use and rely on every day works. This production was exquisite for the history of memory allocation in computers, and also highly entertaining to watch! You were able to tap into my curiosity, which is something I feel is required (for me at least) to actually learn something. Well done 👏
This is the first video of yours I’m seeing - love it! The production value, the pacing, the difficult concepts being smoothly explained
I'm a programmer myself - hand assembling Z-80 machine language back in 1980, then the 6502, 68000, and now pretty much x86 based. I still do JavaScript, PHP, bash scripting and Linux stuff, Visual Studio, and I use IDA Pro from Hex-Rays now and then. I love your videos, they are well made. Laurie is an awesome lady and I'm impressed. P.S. Love your pigtails and the 60's retro hairstyle. Glad I found you, friend. 🙂
With love: you pronounce the word contiguous wrong (as far as I know), which usually means you read it before hearing it out loud. And I give props to those who do that. Good for you.
It's AI
The more explainer videos I watch on computer science, the more I remember my books from the 80s that had little people running around in the CPU doing little tasks to achieve an objective.
Really crazy level of quality in these videos, especially all the examples and editing in this one. So glad I found this channel even if I'm only beginning to understand some of the lessons here, it's really helpful.
What an awesome explanation of a complex idea. Great touch in writing programs to illustrate the issues! And as someone who was a teenager in the 60's, I can say that you completely nailed the 60's look Laurie!
I thought I understood paging and TLBs during uni, but your explanations and visual demonstrations made it WAAAAAY more digestible than my textbooks.
Amazing explanation!
Amazing production quality on this video.. I could listen to you talk 24/7. You make a perfect 60's pinup too..
The new production quality is amazing and you're absolutely a natural when it comes to doing those skits. I personally think the part with the physical cubes was a little too dumbed down for your audience - I totally get it, it's very hard to find the right level of abstraction for explaining these things in an entertaining way. Very much looking forward to everything you make!
Where were you when I was learning IBM MVS back in the '80s? You offered wonderful explanations with simple props, and your presentation was beautifully done in an entertaining way. Consider yourself bookmarked!
Smoking a cigarette when you clearly don't smoke was funny as hell. Don't smoke, kids.
The ASD force runs strongly in her
@@Pgr-pt5ep what the heck kinda comment is that
@@javierflores09 Those who are ASD know, like myself. We are special.
@@forbidden-cyrillic-handle weird random comment?
@@javierflores09 nah they're right, im autistic as well lmao
kind of a meme with CS students at this point
I loved everything about this video. The editing, the costumes, the explanation, the presenter, the set...
Lol I think last time I checked the channel few months ago you had like 15k subs, congrats!
Wow, was not expecting this… This was sooo well done... interesting and absolutely captivating! Many thanks to the LaurieWired Team. This is the very best of YouTube, content creation, and science popularization🎉
Don't you just hate it when your blocks of memory become contagious?
I know, right? hahahaha
She has a math brain, not a linguistics one.
I know. Everything was amazing until the word and then it would cause a hiccup in the simulation of reality.
She's touching those contagious blocks of memory with her bare hands!
@@EricMnemonic No worries. She’s been allocated.
Well presented. There are so many innovations in the history of computing we are just not aware of and take for granted.
It's insane how much work you put into this. This looks like a documentary! You've earned at least one subscriber today
Your videos are some AMAZING levels of production - I cannot overstate that! Major kudos to Laurie and her fantabulous editing and production teams!
The Amiga did it without paging... Carl Sassenrath was the genius who did it. The mechanism was called Exec, and the machine didn't have an MMU, so no virtual memory, but it did have a very useful dynamic RAM disk. Truly amazing for the time, as it was real pre-emptive multitasking, not like the co-operative multitasking used on the pre-W95 Windows OSes of the era. It worked especially well if you had real fast mem installed. Not bad on a machine that was an MC68000, but it did have some amazing custom chips which offloaded a lot from the CPU.
I still have my Amiga 500 stored away. They were great for home use before PCs and Macs became affordable.
What was the trick of exec compared to the virtual memory method? Anyway, the Amiga was an amazing piece of hardware way ahead of other computers of the time.
The Amiga sprang instantly to mind when Laurie mentioned paging enabling modern day multitasking. I know the Amiga didn't have paged memory, yet was still full pre-emptive. And all this in 1985 and with as little as 256k of RAM.
I started watching this and was turned off, but by half way through, I admit, I'm going to have my kids watch this. Nice work, nice work.
The story is certainly fascinating. Although I learned about most of the topics covered on this channel many years ago, I have rarely seen such good and detailed explanations, which means I'm still learning little details. Thank you, Laurie, for your great videos!
Brilliant. This is how a subject should be taught
Legitimate question: is contigous a word? It seems like it should be, but when I try to Google it, I get "contiguous" and the pronunciation really emphasizes the U. 🤔
PS: Thank you for importing me.
"Contiguous" is right. She's probably just running into the "seen it in a book, never heard it" problem.
@@jamesarthurkimbell It makes me question the authenticity of the presenter as an actual programmer instead of an actress pretending to be one.
Clearly there is a production company behind all of these thirst trap style 'geek girl' videos in recent years.
It looks like the strategy works though.
@@tr1p1ea Click on her channel. Go back to her earlier videos. Look at her other work. I think she knows what she's talking about.
@@tr1p1ea weird thing to just jump to... her authenticity as a programmer shouldn't be questioned just because she misused a single word... and even then, the amount of praise she's getting on this video and all her other videos, plus the catalogue of videos she made a while back, should be enough...
Honestly, I wish I had videos like this back in high school or even the first year of college. It's like Bill Nye but for computer science.
The retro segment is absolute perfection!
Those retro scenes made me instantly think of *Austin Powers* movies. 😄
What a wonderful, splendid video! I am a >50-year-old guy from Austria who grew up with the Commodore 64, and machine-language programming fascinated me. But I had no memory management at that time in the mid 1980s. Although this was fascinating - because as a programmer I could have the entire machine to myself - doing all the memory managing yourself often led to mistakes and sometimes spectacular crashes. Along came the 286 and 386 and 486 with Intel's architecture and all kinds of memory paging (himem.sys, emm386.exe and so on), and I first had to get used to the fact that the programmer no longer needs to know exactly where in memory to put his/her programs and data. I did NOT know that the concept of memory management went all the way back to the early 1960s. You young guys give me a refreshing, warmhearted impression that there are people around who really think about how the technology that surrounds them works. Great video, please continue!
Godel Escher and Bach…
fractint and xscreensaver in the background..
You’re bringing back all the hits
This is the first video I've seen of yours, and I'm sold. This is great stuff. As an ol CS educator, I really appreciate what you're doing here.
The 1960s hairstyle really suits you!
Nice way to explain these underlying concepts. I started working in computers in the late 70’s, when memory, program execution and disk space were tiny - unimaginable today.
Insane production value and really well explained.
This is the first video I ever saw from you and I subscribed after a few minutes of watching. Super quality content!
00:05 no, my Chrome can only open so many tabs
Yes, but with a proper browser - such as anything that isn’t Chrome - you get something called “garbage collection” of working memory, and the problem disappears.
Thx for refreshing my... memory on this topic! If only much more content was so well prepared!!! research/scenario/cam/lighting/editing/styling and acting! Excellent work Laurie!
CON-TIG-YOU-US
Otherwise, a decent video on memory.
The thumbnail was kind of repelling at first glance but the video is top notch. I instantly subscribed. I love this channel now thank you so much for this video.
Very underrated channel.
GJ!
This is awesome!! Have wanted to understand this for decades. The production quality is unmatched.
I have no idea what the term "contigious" means, but I do know what con-ti-gu-ous means 😅