4. Assembly Language & Computer Architecture

  • Published on Nov 21, 2024

Comments • 303

  • @ModusPonenz • 2 years ago +329

    A friend and I were taking an x86 assembly language class together back when we were working on our Computer Science degrees. We worked at Intel at the time. At some point during the semester the instructor discovered that we were Intel employees. We worked in the Memory Components Group at that time and had nothing to do with CPU products. Even so, from that point on, whenever he was explaining some of the peculiarities of the x86 architecture, he jokingly glared at us as if it were our fault. He would say, "Why does this instruction work the way it does? Because Intel decided that's the way it is." And look directly at us. We both got an A in the class.

    • @Meiyourenn • 2 years ago +4

      Did you have a good understanding of MASM???

    • @ModusPonenz • 2 years ago +11

      @@Meiyourenn It's been a long time since I took this class, but yes we had a good understanding of MASM. We were more familiar with the assembler that Intel sold with their development systems. But the class was using the Microsoft assembler. Luckily the Intel and Microsoft assemblers weren't all that different. At least the operand order was the same.

    • @ndotl • 2 years ago +9

      I worked for IBM, and had an adjunct professor who worked at IBM across the hall from me. Needless to say, I got a B+. (Intel) Assembly was my favorite language, until I learned C the next semester. C was my favorite language until I learned C++ the next semester. Even after learning Java, C++ remained my favorite language. Currently learning C# and relearning C++, while coding in Apex on the Salesforce platform.
      P.S.: I got a B+ when I really deserved an A, but I really enjoyed the experience and could go back to it if I really had to. I was at a Salesforce conference in San Francisco a year or so before COVID, and met some guy on the bus coding in assembly on his laptop. In a way I kind of envied him, because whatever it was, it was going to be fast. I remember that in IBM assembly, which I did not code in, MVCL was considered by programmers one of the worst instructions ever invented.

    • @mr.incognito4640 • 1 year ago

      Can you please recommend some good books on assembly?

    • @mr.incognito4640 • 1 year ago

      @@ndotl Can you suggest some good books for assembly?

  • @jellyjams7217 • 3 years ago +95

    I do not have the prerequisites to be watching this video, but still watched the whole thing, learning a lot about something I know nothing about.

    • @justcurious1940 • 1 year ago +1

      Same with me, I enjoyed it.

    • @johndoe-g6f • 9 months ago +1

      Bro... stop it! I know you know at least something about computers... you're just being curious about what's happening inside the hardware and software you see on screen...

    • @saritshull3909 • 8 months ago +1

      this is the way.
      not my way though lol. I'm trying to pass

    • @elisheets772 • 3 months ago

      Congratulations, you have been promoted to management 🎉

  • @div6601 • 2 years ago +21

    Prof. Leiserson is an amazing instructor. I love watching his lectures. He never rushes through the material and always prioritizes quality over quantity. Thank you Prof. Leiserson and MIT OCW🙏🏽

  • @wmffmw • 1 year ago +8

    I used to teach assembly language, DOS, CPU architecture, and the instruction set timing and bus cycle operation for several systems, including the DEC PDP-11, Intel MCS-86 and MCS-51 and earlier. My class started with hardware operation from the first clock pulse after the release of the reset button, covering machine instruction and bus cycles and how instruction execution controlled hardware operation. We also introduced hardware emulators. To support the class, I wrote a textbook on the theory and operation of microprocessor hardware and software; the Z80/8080 CPUs were used as the example architecture. My class supported the NTX 2000 NATO Telephone Switching System back in 1980 for NATO & ITT, at the introduction of fully integrated digital switching systems for central office telephone switches: basically, 256 microprocessors operating in fully redundant pairs, all under the control of a custom minicomputer executing a real-time, multi-tasking operating system.

  • @icantollie • 6 months ago +7

    12:45: “My experience is that if you really want to understand something, you want to understand it to the level that’s necessary, and then one level below that; it’s not that you’ll necessarily use that one level below it, but that gives you insight as to why that layer is what it is, and what’s really going on”

  • @יונתןלוברסקי-ז9ז • 4 years ago +218

    I just love how you can see that he loves his job and that he truly enjoys teaching. Thank you for sharing this lecture!

    • @intuit13 • 2 years ago +3

      must be an amazingly charmed life to be super smart, loving academia. perfect life. :x i can only imagine

    • @muhammadsubhani7420 • 2 years ago +2

      Love his passion for explaining architecture for software development implications!

  • @leixun • 4 years ago +175

    *My takeaways:*
    1. Assembly language 0:22
    - Why we want to have a look at the assembly language 6:46
    - Expectation of students 10:05
    - If you really want to understand something, you should understand one level below what you normally need to use 12:20
    2. X86-64 ISA primer 13:28
    - 4 important concepts: registers 14:03, instructions 20:25, data types 28:15, and memory addressing modes 35:50
    - Assembly idioms 43:08
    3. Floating-point and vector hardware 47:54: SIMD
    4. Overview of computer architecture 56:55
    - Historically, computer architects have aimed to improve processor performance by exploiting parallelism (e.g. instruction-level parallelism (ILP), vectorization, multicore) and locality (caching)
    - ILP 1:00:35: pipelining, pipeline stall, hazards
    - Superscalar processing 1:09:08
    - Out-of-order execution 1:11:40: bypassing, from in-order to out-of-order, register renaming
    - Branch prediction 1:15:44: speculative execution

    • @wasif1809 • 3 years ago +7

      cool

    • @Kingda2_ • 2 years ago +1

      Thanks pal

    • @ktburger659 • 2 years ago

      thank you!

  • @madyogi6164 • 1 year ago +3

    Joyful to watch, even as entertainment.
    Assembly is very fun to do. Maybe a suicidal task if you're doing something for a normal OS, but pure joy when there is no such thing as an OS, no C, and you're starting bare from scratch - microcontrollers (uCs), for example!

  • @skywatchers9675 • 2 years ago +3

    Charles Eric Leiserson is a computer scientist, specializing in the theory of parallel computing and distributed computing, and particularly practical applications thereof. As part of this effort, he developed the Cilk language for multithreaded programming, which uses a provably good work-stealing algorithm for scheduling. He invented the fat-tree interconnection network, a hardware-universal interconnection network used in many supercomputers, including the Connection Machine CM-5, for which he was network architect. He helped pioneer the development of VLSI theory, including the retiming method of digital optimization with James B. Saxe and systolic arrays with H. T. Kung. He conceived of the notion of cache-oblivious algorithms, which are algorithms that have no tuning parameters for cache size or cache-line length, but nevertheless use cache near-optimally. Leiserson coauthored the standard algorithms textbook Introduction to Algorithms together with Thomas H. Cormen, Ronald L. Rivest, and Clifford Stein.
    Leiserson received a B.S. degree in computer science and mathematics from Yale University in 1975 and a Ph.D. degree in computer science from Carnegie Mellon University in 1981, where his advisors were Jon Bentley and H. T. Kung.
    He then joined the faculty of the Massachusetts Institute of Technology, where he is now a professor. In addition, he is a principal in the Theory of Computation research group in the MIT Computer Science and Artificial Intelligence Laboratory, and he was formerly director of research and director of system architecture for Akamai Technologies. He was Founder and chief technology officer of Cilk Arts, Inc., a start-up that developed Cilk technology for multicore computing applications. (Cilk Arts, Inc. was acquired by Intel in 2009.)
    Leiserson's dissertation, Area-Efficient VLSI Computation, won the first ACM Doctoral Dissertation Award. In 1985, the National Science Foundation awarded him a Presidential Young Investigator Award. He is a Fellow of the Association for Computing Machinery (ACM), the American Association for the Advancement of Science (AAAS), the Institute of Electrical and Electronics Engineers (IEEE), and the Society for Industrial and Applied Mathematics (SIAM). He received the 2014 Taylor L. Booth Education Award from the IEEE Computer Society "for worldwide computer science education impact through writing a best-selling algorithms textbook, and developing courses on algorithms and parallel programming." He received the 2014 ACM-IEEE Computer Society Ken Kennedy Award for his "enduring influence on parallel computing systems and their adoption into mainstream use through scholarly research and development." He was also cited for "distinguished mentoring of computer science leaders and students." He received the 2013 ACM Paris Kanellakis Theory and Practice Award for "contributions to robust parallel and distributed computing."
    WIKIPEDIA

  • @Ali-kl3ql • 2 years ago +2

    What a great mindset! 13:19: "Go one step beyond, and then you can come back!"

  • @WacKEDmaN • 1 year ago +2

    I'm learning assembly on the Z80, and now, after watching half of this course, I'm thinking about moving to C and hand-compiling the required optimised assembly... this playlist is great stuff... sure, a lot of it I won't need... but it's nice to see not much has changed in the x86-64 world...
    Big thanks to MIT for sharing this stuff with us all :)

  • @fluctura • 2 years ago +16

    Thank you SO MUCH for putting this online!! I'm writing an assembler by myself to actually really understand Assembly and machine code, and this one is an eye-opener in so many ways for me

    • @kewtomrao • 2 years ago

      How's the project going?
      Want to do it myself too but scared to start it :)

    • @MrChrisRP • 1 month ago

      Learn the most frequently used opcodes and you are already well started down the path. Memorize them and how many bytes follow each one. It's not too overwhelming.

  • @grzesiek1x • 2 years ago +1

    It became clearer to me when I started doing digital electronics projects myself, and what is amazing is that I keep coming up with ideas that someone has already invented - but it is amazing sometimes to invent them myself! It is like exploring history or archaeology, and finding out what is going on inside your computer.

  • @sikendongol4208 • 3 years ago +7

    12:46 My experience is that if you really want to understand something, you want to understand it to the level that's necessary and then one level below that.

    • @skilz8098 • 3 years ago +1

      I'm bi-directional when it comes to learning about computers and I'm 100% self taught. I like to peek into 2 levels below and 2 levels above. I'll go all the way down to the transistors and how they are made within the realm of chemistry and physics all the way up to scripting languages such as Python.
      If I was to design a Degree Program at an arbitrary University here's a road map or an outline for each level of abstraction:
      Mathematics - General fields: Algebra, Geometry, Trigonometry, Calculus 1 and 2 at a minimum and maybe some Linear Algebra
      Physics - Basic Newtonian up to some Quantum Mechanics including the beginning stages of Electricity and Simple Circuits
      Chemistry - Basic Chemistry at the College Level primarily focused on the various chemicals and compounds that can act as conductors, insulators and inductors. (No need for Organic Chemistry)
      This would cover the basics needed for Entry Level Electronic Engineering
      Mathematics - Analytical Geometry, Vector Calculus, more Linear Algebra, Lambda Calculus, Probability & Statistics, Logic - Boolean Algebra, Introduction to Truth Tables & Karnaugh Maps.
      Physics - Quantum Mechanics continued, deeper theories such as wave propagation
      Circuit Design - Ability to both read and build electric diagrams or schematics using both DC and AC type currents learning about the differences between circuits components that are either in parallel or in series, voltage and current dividers. Also covering both Analog and Digital Electronic design patterns leading us into our Logic Gates.
      Chemistry - Specialized on the components that make up your transistors, resistors, diodes, capacitors, rectifiers, and more...
      Still no programming yet...
      Mathematics - More on Boolean Algebra and Discrete Math, that is, base-2 mathematics (Binary Arithmetic and Logic calculations), Extended Truth Tables & Karnaugh Maps, State Machines covering both Mealy and Moore machines, Arbitrary Languages as Sets with a defined Alphabet.
      Physics - building various digital circuits and analyzing them with voltage meters and oscilloscopes.
      Chemistry - diving deeper into the components at the quantum levels
      Mathematics - Complex Number Analysis, Euler's numbers and formulas, Fourier Series, Laplace Transforms, Signal Analysis and FFTs(Fast Fourier Transforms)... and more
      Physics & Chemistry - maybe specialized fields at this point
      Bringing it all together, your first day as a programmer, well hardware / software engineer!
      Digital Circuits I - Combinational Logic - Integrated Circuits and Complex Circuit Design building an adder, a multiplexer and demux or selector, comparators, binary counters.
      Digital Circuits II - Sequential Logic - Feedback loops, SR Latch, D-Latch, Flip Flop, JK Flip Flop, Toggle, Timers, Registers
      Digital Circuits III - High and Low Logic, Synchronous vs Asynchronous, Bit Addressing, Memory and Data Paths
      Digital Circuits IV - Putting It All Together (ISA Design) - Refresher on Truth Tables, Karnaugh Maps & State Machines.
      *Introduction to HDL and Verilog
      **Any additional mathematics, physics or chemistry that is needed
      ISA I - Basic Turing Machines - Implementing one on a breadboard then move to either Proto Board and or a PCB
      ISA II - Pipelining
      ISA III - Caches
      ISA IV - Branch Prediction
      ISA V - Out of Order Execution
      **Any additional mathematics, physics or chemistry that is needed
      Transition from Hardware to Software:
      - Building an Assembler, Disassembler and Hex Editor.
      - Learning about the C Language and Building a basic C Compiler, Linker & Debugger
      - More Mathematics
      *Advanced Courses: GPU (Graphical Processing) - APU (Audio Processing) design, Storage Design, Monitor or Display Design, I/O Peripheral Design (Keyboards and Mice) - these would be basic and simplified versions of what you can buy on the market, something that could theoretically be done on a breadboard and easily be done on either a ProtoBoard, PCB, or FPGA... These may also include various classes related to Mechanical Engineering and other required mathematics, physics and chemistry courses.
      Operating System Design (OSD)
      OSD I - CMOS, Bios and or EFI/UEFI, Bootstrapper, the Kernel, the HAL and more basic features.
      OSD II - Vector Tables, Memory Addressing and Mapping, Virtual Memory, Processes and Threads, I/O handling and Interrupts, and more...
      OSD III - File Handling, Compression & Decompression, Encryption & Decryption
      OSD IV - Drivers and Peripheral Support
      OSD V - Graphics - Font and 2D Image Rendering, Audio and Networking (Ports)
      OSD VI - Generating an Installer
      OSD VII - Installing and Configuring the OS for First USE, extending your C compiler to support C/C++ based on your assembly language.
      Using your homebrew built computer with your own OS and compiler and if you elected to build your own or just interface with I/O, storage, graphic and audio devices... now it's time to use them and put it to good use. We can now jump into basic Game Design by starting off with implementing DOOM in a combination of ASM, C and or C++ to test out your hardware.
      The final stages: Designing your own high level dynamic scripting or interpreted language using your custom built CPU-PC to write high level languages within it.
      ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
      If you can build your own CPU and ISA from the ground up, build your own basic assembler and compiler that is enough to be able to write, build, and run DOOM on it, design your own high level scripting language to work on it... Then I could easily say you have graduated and become both a Hardware and Software Engineer!
      If I was coordinating a degree program at some university, something like this would be the outline for the Dual Bachelors/Masters Degree programs for both Software and Hardware Engineering. There may be some fields of mathematics, physics, etc. that I didn't cover here. Such as Data Sets, Algorithms, etc. which would be important yet I feel they would be more related towards the Computer Science side of things as opposed to the Engineering side of things. Yet to transition from the Bachelors to the Masters, the Bachelors in Computer Science side of things would be required.
      If one was to take every course available, they'd end up with a Masters for the following fields: Mathematics, Physics, Computational Chemistry, Computer Science, Software Engineering, Electrical-Electronic/Hardware Engineering, Operating System Design & Data Management and possibly a Bachelors in some of the following: Mechanical Engineering, Data Structures & Algorithms, Information Theory, Compiler & OS Design and a few more...

  • @Rfc1394 • 3 years ago +12

    When I went to City College back in the late '70s, assembly language was commonly taught. Today, nobody worries about assembler except people who need extreme performance, and compiler writers. I used to work at a shop that wrote almost all their software in assembly. Nobody does that anymore.

    • @psycronizer • 2 years ago +4

      It's a shame really, because it is right down at machine level, and not that difficult to learn; it just takes longer to get certain things done without relying on library calls etc. I loved it. Using my ZX Spectrum I wrote a Defender-type game; took me months, lots of coffee, all on paper, no assembler.

    • @mytech6779 • 2 years ago +7

      Nobody on x86.
      Asm is still needed for the kajillions of microcontrollers embedded in various products, where they still only have a kilobyte of space for cost savings and a low clock speed 8-bit processor for minimal power consumption. They may get the overall program compiled from C++ but then it needs adjustments. These specialty compilers do not have the broad support that the x86 compiler-optimizers enjoy.

  • @Jeremygee • 3 years ago +7

    By far the best intro to x86 assembly I’ve ever seen

    • @pauleveritt3388 • 3 years ago

      Come on! It's MIT, for Pete's sake. Would you expect any less?

  • @davidpanetta5492 • 2 years ago +3

    Interesting! I, much like the professor, started by doing most software projects in assembly code. Over the years I learned many; once you get the hang of one, others come more easily. I was reflecting back on one project I worked on where the code ran on an IBM 370. There were many groups involved, each writing code with only a specification sheet. I later learned it was a secret government project, so one group didn't know much about any other; all we ever saw of other groups' work was object code modules for the linker, which got linked to our code. Software breaks, and in several cases the code's authoring group no longer existed (legacy code), and if there was source code it was classified, so when we traced a fault to such an object module we often had to disassemble it to figure out what it was doing. Our group usually wrote in IBM 370 assembly, but we soon discovered that other groups had used many different languages - you can recognize compiler constructs after a while. Some code was written in C, FORTRAN, PL/1, COBOL, even FORTH and assembly. Once we figured out what was broken, we had to write a patch for that object module - remember, it was legacy code, no source available. Sometimes it was fun, sometimes it was stressful!

  • @KuldeepYadav-jw7jn • 9 months ago

    Prof. Leiserson is the professor every CS student on this earth deserves... I wish he had taught me

  • @paulmarkert5907 • 2 years ago +2

    This lecture is so nostalgic for me and reminds me of my first programming assignments, literally in Assembler (albeit mainframe). I enjoyed it.

  • @headoverbars8750 • 3 years ago +13

    Thank you so much MIT and Professor Leiserson!!
    I write primarily to a VM (the JVM via Kotlin) which compiles to Java bytecode... I am loving that I get almost 2 levels of abstraction down by working through these lectures in C... I mean I wasn't oblivious, but I didn't learn assembly as I should have.

    • @NazriB • 2 years ago

      Lies again? Hello DLC

  • @SphereofTime • 1 month ago +2

    19:00 If you change eax, you also change rax.
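    A minimal sketch of that aliasing (x86-64, AT&T syntax; the constants are made up for illustration): any write to %eax zero-extends into all of %rax.

        movabs $0x1122334455667788, %rax   # %rax = 0x1122334455667788
        mov    $9, %eax                    # a 32-bit write to %eax...
                                           # %rax is now 0x0000000000000009:
                                           # the upper 32 bits were zeroed, not kept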

  • @rustycherkas8229 • 2 years ago +1

    At the other end of my career, I caught a glimpse of what was then called "micro code"...
    An "instruction decode" cycle means that the instruction bits en-/dis-able circuitry of data signal pathways. (Eg: enable a shift register to multiple/divide by 2, or enable the block of 2-bit adders to sum bytes...) I envisioned this magic as a really complex railway switching yard. This is the coalface where machine code's 1s or 0s appear as 'high or low' electrical potentials.
    Then came learning about micro code, the embedded multi-step gating/latching operations that would occur (synchronously) within one or a few clock cycles. Way back then, the hardware I saw didn't have advanced micro code or machine code or Assembly to multiply two integers; multiply was done with many Assembly instructions (usually a library 'routine')...
    It helped when thinking how the effect of C's pre-/post-increment (decrement) instructions could be achieved, for instance.
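    For flavor, the kind of library multiply routine that era relied on looks roughly like this shift-and-add loop (sketched here in x86-64 AT&T syntax purely for illustration; old machines used their own ISAs):

        software_mul:                 # returns %rdi * %rsi in %rax
            xor   %eax, %eax          # result = 0 (also zeroes all of %rax)
        next_bit:
            test  %rsi, %rsi          # any multiplier bits left?
            je    done
            shr   $1, %rsi            # low bit of multiplier -> carry flag
            jnc   skip_add
            add   %rdi, %rax          # result += shifted multiplicand
        skip_add:
            shl   $1, %rdi            # multiplicand *= 2
            jmp   next_bit
        done:
            ret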

  • @kabel74 • 3 years ago +11

    Among my favourite computer science subjects when I was in college way back then. It tunes your mind to think really hard about optimizing your code for a given piece of hardware. Amazing lectures and brilliant labs by a similarly enthusiastic lecturer too 🙂

    • @jrwickersham • 2 years ago

      I loved the living crap out of the physics classes I took that used Fortran (91??)
      Especially when looking at the problems that looked into route optimization across a large matrix. Prof at the time was a former Honeywell employee. I tossed out a comment “wow, this looks like figuring out how to plan a route for a Tomahawk missile..”.
      There was a look and a wink, and a sly “yep..”

  • @GamerX84 • 4 years ago +26

    Since it wasn't mentioned, the older 8/16 bit systems often used bank switching to access more hardware resources within the 64KB address space. The CPU could only see 64KB at a time, but the hardware mappings within the 64KB could be changed on the fly as is possible with the Commodore 64.

    • @legion_prex3650 • 8 months ago

      It is obvious that you were game programming with (Turbo) Assembler on the C64 back in the '80s... :) Am I right?

  • @antonfernando8409 • 3 years ago +6

    Came here to understand some things about C++ optimizations, but this is another level altogether; it's been 30 years since I last wrote assembler code (68k Motorola to multiply matrices, it was fun), still not a bad idea to get a refresher on assembly. So loop decrementing is more efficient than counting up - interesting, now I know why (see the sketch below). That instruction pipeline stuff is just crazy shit, way out of my pay grade. Thanks prof, enjoyed it.
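    The short version of the count-down trick (a sketch, x86-64 AT&T syntax, loop bodies omitted): the decrement itself sets the zero flag, so the separate compare disappears.

        count_up:
            inc  %rcx              # i++
            cmp  %rdx, %rcx        # explicit compare of i against n
            jne  count_up

        count_down:
            dec  %rcx              # i-- already sets the zero flag
            jnz  count_down        # loop until i == 0, no cmp needed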

  • @kd1s • 3 years ago +13

    My introduction to assembly language was on a TRS-80 back in the day. Found I could speed things up by poking values into memory. Then got the assembler and life was fantastic. And I still remember the video addresses: 3C00 to 3FFF

    • @pauleveritt3388 • 3 years ago +1

      I wrote a program on my TRS-80 that would draw the X-Y axis centered on the screen and then plot cardioids using polar coordinates. Not bad for a 9 line BASIC program.

    • @jimmccusker3155 • 3 years ago

      @@pauleveritt3388 Remember this?
      LD HL, 3C00H   ; HL -> start of video memory
      LD DE, 3C01H   ; DE -> the next byte
      LD BC, 3FFH    ; 1023 bytes to copy
      LD (HL),20H    ; write a space (20H) to the first byte
      LDIR           ; block copy (HL)->(DE), propagating the space across the screen

    • @kd1s • 3 years ago +1

      @@pauleveritt3388 Ah very cool. Yeah I did all sorts of things, had a voice input module and a speech synthesizer hooked up to mine.

    • @psycronizer • 2 years ago

      @@jimmccusker3155 Was this some display buffer program? The load, increment, repeat instruction and LD (HL), 20H is the clue for me. Never used a TRS, but that was Z80A, just like my ZX Spectrum I believe. I wrote a Defender-type game in assembly, using paper and POKE statements; took me fucking ages! It didn't use sprites, just the larger character set attributes, but it ran! The Spectrum split up the display screen in a really weird way though.

    • @jimmccusker3155 • 2 years ago +1

      @@psycronizer Yes, the TRS-80's video was mapped from 3C00H - 3FFFH and those assembly instructions were the quickest way to clear the screen using a space character (20H).

  • @Bytheocean • 3 years ago +4

    I applaud you for teaching this elegant language. One point from a 40+ year programmer and fellow professor: everyone I have ever known has pronounced "Clang" as "C-Lang" or "See-Lang".

    • @lawrencemanning • 3 years ago +2

      Are you sure? Acronyms are nearly always made into short easily pronounced utterances. See: picofarad to puff, SSH to shush (fairly unusual admittedly) and even WWW to wah,wah,wah or dub,dub,dub. FWIW every time I've talked about Clang with other programmers it was: Klang.

    • @darrellee8194 • 3 years ago

      I've also never heard the option flag referred to as "minus" but always "dash"

  • @brucefelger4015 • 2 years ago +1

    The first program I ever wrote was for CS101, which we had to write in binary literals, so 0s and 1s. Learning from the bottom up has helped me more times than I can name. Thanks for this.

  • @therealb888 • 3 years ago +10

    13:10 The one level below & one step beyond learning philosophy is something I've followed as well.

    • @skilz8098 • 3 years ago +4

      I'm self taught and I have learned about every level of abstraction within computers from the science (mathematics, physics and chemistry side of things) all the way up the chain to logic gates, CPU and ISA design, to OS and compiler design and various other fields such as 3D Graphics - Game Engine design, to Hardware Emulation, all primarily in C/C++ and a little bit of assembly, and now learning about Neural Nets, A.I. Programming and Machine Learning within Python.
      I didn't start at the bottom though, nor the top. I started near the middle, tackling some of the hardest aspects of software engineering with C++ while learning DirectX and OpenGL. C++ is one of the hardest languages to become proficient and accurate with as a first language, and 3D Graphics and Game Engine Programming incorporates almost every and all other aspects of programming throughout a majority of all other industries. The Graphics Rendering is just one small part... You also have Audio Processing, Animation, Physics Engines, A.I. programming such as Path Finding and scripting dialogs for in game NPCs. Terrain Generation, Foliage, Weather and Environment Mapping and Generation, Various Tree Structures with Memory Management, Networking with Server and Client side applications that deal with sockets, packets, ip addresses and more if the game is meant to be Online, Compression & Decompression, and Security with encryption and decryption and much much more, as this is only the "game engine" side of things and doesn't include any "game logic" or "game rules"... Many Engines will have their own Parsers and some will even have their own built in scripting languages for that specific engine, which also leads into compiler or interpreter design...
      But this is where I started... I branched from here going in both directions. Going up the ladder of abstraction I am now learning Python, Java and JavaScript with Machine Learning Algorithms and Techniques as my target goal, and going down the ladder I'm learning more about Assembly and how it is designed through the implementation of its targeted hardware (ISA, CPU - Hardware Design).
      I never stop learning! That's why I'm watching this video... It's a refresher and I might pick up something new that I didn't know from before! But yes, that 1 step above and below is a very good method to follow.

    • @MJ-ur9tc • 3 years ago +1

      @@skilz8098 You must be kidding! How could somebody learn so many things just by self-studying? And if your age is below 30, it does not seem feasible to self-learn so many concepts by your age. Please enlighten me as to how you achieved this feat, if you have really learnt so much within a span of a few years.

    • @skilz8098 • 3 years ago +1

      @@MJ-ur9tc Well, I am 40 and I never stop reading into things! Also, when I was a teenager either in middle or high school, I still knew more then than most 40 year olds did.

    • @therealb888 • 3 years ago +1

      @@skilz8098 I mean, re-reading your comment just makes me pause. It's like word for word what I want to do. Like, are you trying to social engineer me? I'm not exaggerating, and I hope you're not as well, but the things you've mentioned here are everything I want to learn and get good at myself. Moving down & moving up the abstraction ladder is what I want to do as well, but it's not something that's easy. It's like EECS, ECE/TCE + IT/NE. Those are like 3-4 fields. You have different teams, hell, different departments for them in an organization. Damn! I thought I was overreaching and doing something like that is impossible, but you have done it.
      Could you please share a timeline of your learning too?

    • @skilz8098 • 3 years ago +1

      ​@@therealb888 The learning or timeline that you are referring to doesn't really exist, it's kind of random and what's of interest to me at the time... but it would look something like this...
      From about 2000 - 2018 I have taught myself C and C++ and my primary motivation was to learn how to build a 3D Graphics or 3D Game Engine from scratch. I wasn't so much interested in making a game even though that was on the todo list; I was more interested in learning how to make the actual engine. I wanted to know how lines, triangles, etc. are being drawn to the screen. How to simulate physics, motion and animation, even 3D Audio was a part of this. By learning how to build an actual Game Engine and all of the different aspects and components of the Engine it teaches you memory management, resource management and object lifetime. It teaches you databasing, file parsing, writing your own scripting language sort of like an assembler or compiler, etc... and this is just to open, read, load, write to files either being graphics, audio, vertex data, later on shaders, initialization files, configuration files, save files, etc. Then comes the Graphics Pipeline and all of its various stages. It teaches you how to setup up and how to both integrate and use a Graphics API such as DirectX, OpenGL, and now Vulkan and I've learned all 3. Well, I've learned DirectX from version 9.0c upto version 11.1 as I currently don't have a Windows 10 machine and don't have access to hardware that supports DirectX 12. The same goes for OpenGL. I started with Legacy OpenGL or v 1.0, I then skipped version 2... and went straight to modern OpenGL which was 3.3 at the time and now its 4.5 or 4.6... And even to this day I'm still interested in learning so I read through the documents, forums, help pages, watch videos, etc... just to remain current. Even your programming language such as C++ changes with time. When I first started I was using Visual Studio Express 2003 or 2005... My current latest compiler is Visual Studio 2017 and when I finally get my new rig, I'll have Visual Studio 2019 or newer for C++ 20... I can only do C++ 17 and there are some features that I can not, but for C++ 14 and 11, I can do just about all of them.
      Outside of just the graphics portion, I then started looking into and trying to learn how compilers are made, how assemblers and disassemblers are made, for I was diving into how the actual programming languages themselves were built. As for the assembly side of things, I started to learn this because of my newer interest in learning how Hardware Emulators are built or engineered. This is still the software side of things. Now at this point I had some knowledge of basic circuitry, more on the math side, but had never soldered a circuit board, though I've always had an interest in how they are made and how we are able to get them to behave in the manner in which we prescribe. So this led me into doing some research and I came across Ben Eater's YouTube page on building an 8-bit breadboard CPU. I've been following his channel ever since! I even went from there and started to watch videos on YouTube from either MIT, Stanford, etc... on Computer Architecture and Design and more. I even went and found a free program called Logisim that allows you to place icons on a grid to build circuits. And yes, you can build a CPU within it. I went this route because I didn't have the physical materials to follow along with Ben Eater's videos. And I was able to build my own implementation of his 8-bit breadboard CPU within Logisim, following and adhering to his ISA (Instruction Set Architecture). I can take the binary form of his own assembly language and run it through my CPU within Logisim and it will run the same functions that he demonstrates.
      Before this I already knew about the internal hardware components of a modern computer such as the CPU, the RAM, ROM (hard drives or external storage), the main bus, CMOS originally then BIOS, expansion cards, video cards, sound cards etc., and I've always been able to build, configure and install all of the hardware, and this goes as far back as the early to mid 90s. I was about 12 when I got my first PC and it had Windows 3.1 & DOS 6.0. When you turned it on, you got some writing on the screen about the connected hardware, the RAM, etc... then you'd get a command prompt! C:\ If you wanted to use Windows, you had to be at the root of the drive and type "cd Windows" without the quotes, this would then put you in the Windows directory, and then you had to type Windows or Windows.exe for it to load and run... CDs, never mind DVDs or BluRays, were not exactly new, but most computers then didn't have CD-ROM support; they still had the old floppies, both the smaller 3.5 and the larger flimsier 5.25 diskettes. The internet was around but it was both young and most didn't have it, as it was expensive and there weren't that many graphics, never mind videos. It wasn't until the late 90s that we first got internet, and even then (pre dot com boom) most websites had only a couple of graphics or images because their download times were slow. Over 90% of the internet back then was text and hypertext links, and many sites were still found on FTP sites.
      So I've always had a decent knowledge of the hardware but never knew how it was built through its circuitry. Due to my interest in wanting to learn Hardware Emulation Programming, this led me into CPU Architecture and Design as well as Electronic Engineering, beyond the integrated circuits right down to the logic gates themselves and their connections or pathways. And I didn't stop there; I even learned how the logic gates themselves are made from the actual components such as transistors, resistors, capacitors, etc... and around 2005 - 2008 I was trying to put myself through college to get a degree in both Software and Hardware or Electronic Engineering. And within my Calc Based Physics Class I learned more about Electricity and the various components. Not so much the circuit diagrams, but what chemicals they are made of, how they interact with each other, what happens to them when you apply a voltage or a charge, or put them near a magnetic field, etc...
      Now as for the math behind it all, my highest level of math from a classroom is Calc II. However, I was teaching myself Linear Algebra and Vector Calculus while I was learning Calc I. This was from writing the functions to build a working 3D Scene with a viewpoint and view frustum to have a working Camera, as well as applying transformations from one coordinate system to another, and applying transformations onto objects to simulate both physics and animations.
      When you start working with Audio especially 3D Audio this is an entire different ball game! Now you have to learn FFTs, bitrate samples, frequencies, compression and decompression algorithms etc...
      And it doesn't end there... Just recently, within the past year or two, I have an interest in two different topics: AI - Programming, Neural Networks, Machine Learning and Operating System Design. The first got me into learning Python and the latter refreshed my understanding of Assembly and C, as C++ is my primary language of choice. So even to this day I'm still learning!

  • @kbflom4500 • 2 years ago +6

    Thank you for sharing and giving some insight into some of the workload our computers handle under the hood. This class inspires me, and even if the topics are abstract, it reminds me of how lucky we are to have advanced human interface device applications - and it all scales down to processor Hz and memory - but all that is thanks to students and lecturers digging deep persistently, and manufacturers even deeper.

  • @Marius-vw9hp • 2 years ago

    I love his enthusiasm for the historical confusion X)

  • @matt-g-recovers • 2 years ago +4

    Great series!
    I operate at a couple of layers above this currently, but I love learning about the layers below and maybe one day will get to work a bit closer to the hardware on some performance critical projects.

  • @dpz34 • 3 years ago +8

    Thank God YouTube and MIT both exist

  • @chang-kp9sp • 3 years ago +2

    Surprised to see that current programmers do not learn much assembly language. It is very useful and insightful if they really want to know the inside of a computer.

    • @sbalogh53 • 2 years ago

      Most current programmers probably just patch together a bunch of library functions when writing their "program". This explains why so much of current software feels as if it is running on a 1980's computer instead of the super fast machines we use today. I have seen some insanely stupid code written by "young" programmers. For example, a customer counter was stored in an SQL database that was located in another machine in another building. Every time that counter was incremented (which was often in that application), it meant a series of SQL instructions over a local area network. They had code like that yet still wondered why their million dollar Sun servers could only handle 70-80 web requests a second. Then occasionally the network connection would time out and the program would fail. I tried to mention that this was a silly idea but was told to stop being so negative and be a "team player". I am so glad to be retired and not have to deal with these people, although I still have to suffer crappy software every day.

    • @toby9999 • 2 years ago

      @@sbalogh53 Spot on. Most of the developers I've worked with wouldn't have a clue about what's under the hood. It's mostly high level stuff these days.

    • @williamdrum9899 • 2 years ago +1

      Things I learned from assembly:
      * The compiler will avoid multiplication and division at all costs (except by powers of 2)
      * x % 256 = x & 255 (same for other powers of 2)
      * for(;;) is a goto
      * while(true) is a goto
      * in fact, every control structure is just a goto wearing a trench coat
      * The largest integer on any computer is -1
      * Arrays are secretly one-dimensional
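      A quick sketch of the x % 256 point, assuming x is unsigned (this is roughly what a compiler emits; x86-64 AT&T syntax, x passed in %rdi per the SysV convention):

          movzbl %dil, %eax    # return x & 255: just a byte mask, no division
          ret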

  • @georgealex19 • 2 years ago

    Great lecture. I graduated 6+ years ago and I already knew most of the stuff, but I just wanted to say I really appreciate the lecture and really liked the professor's attitude and presentation👌

    • @georgealex19 • 2 years ago

      Edit: I’m actually impressed I knew the answer to how many bytes the quad-word has and also why it historically has 8 😂 man, I’m getting old

  • @vetiarvind • 3 years ago +5

    Holy shit he's one of the authors of the CQRS book. We're in the presence of a legend.

    • @danman6669 • 2 years ago +1

      You mean CLRS. What's really ironic is the 'L' in CLRS stands for Professor Leiserson's last name, yet you got it wrong and wrote 'Q' instead, even though the whole point to your comment was to point out that he is one of the co-authors of the book.

  • @thomasclapton2010 • 2 years ago

    Thank you for sharing and giving some insight into some of the workload our computers handle under the hood.

  • @truehurukan • 2 years ago

    Assembly is a very nice programming language but damn it is complicated ^^ I'm a French-speaking programmer/teacher and I give all my positive feedback for this lecture. And yes... I did not need Google's subtitles to understand the instructor... that's great!!

  • @juanmamani2110 • 3 years ago

    Lecture effect: memory jumps to the 70s and 80s - Intel 8080, 8088, 286, 386, 486....
    Excellent lecture as an intro to asm instructions.

  • @fernandoochoaolivares8829 • 3 years ago +1

    One of the best coders I've seen in a while...

  • @RolandNSI • 3 years ago +2

    Assembly gives you power! The ultimate power over your machine. Use it wisely. (And please don't put viruses in the cracks you make)

    • @AntonySimkin • 3 years ago +3

      Just put some sleeping helpers in if you need some huge resources to access some servers or maybe calculate something lol

  • @dmpase • 2 years ago +3

    11:15 "Vector operations tend to be faster than scalar operations." Well, not as much as they used to. I have plots of DAXPY (y[i]=a*x[i]+y[i]) for different sizes of x and y on Intel and ARM architectures. Vector operations are much faster when x and y both fit into registers or L1. When they fit in L2, vector is still faster but not as much. When they only fit in L3 or memory, there is no difference in speed. Gather and scatter operations show no difference in speed (vector vs. scalar) regardless of cacheability. I haven't found an example where vector ops are slower, but I do have examples where vector and scalar ops are very similar in speed. Much different now than the CDC 7600, FPS-264, CRAY Y-MP and C-90 vector architectures of our younger days.
    BTW, I love your work Dr. Leiserson. I was very happy to find this lecture. This topic is not taught enough. Please don't take my comments as a criticism. I only wish to say that your example, "vector is faster", is not nearly as true today as it once was.
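    For reference, the inner loops being compared look roughly like this (a sketch in x86-64 AT&T syntax with AVX; it assumes a has been broadcast into %xmm0/%ymm0, %rsi points at x, %rdi points at y, and %rcx is the index):

        # scalar DAXPY step: one double per iteration
        vmovsd  (%rsi,%rcx,8), %xmm1          # load x[i]
        vmulsd  %xmm0, %xmm1, %xmm1           # a * x[i]
        vaddsd  (%rdi,%rcx,8), %xmm1, %xmm1   # + y[i]
        vmovsd  %xmm1, (%rdi,%rcx,8)          # store y[i]

        # vector DAXPY step: four doubles per iteration
        vmovupd (%rsi,%rcx,8), %ymm1          # load x[i..i+3]
        vmulpd  %ymm0, %ymm1, %ymm1           # a * x[i..i+3]
        vaddpd  (%rdi,%rcx,8), %ymm1, %ymm1   # + y[i..i+3]
        vmovupd %ymm1, (%rdi,%rcx,8)          # store y[i..i+3]

    Once x and y spill out of the last-level cache, both loops are limited by memory bandwidth rather than arithmetic, which is exactly the effect described above.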

    • @schmetterling4477 • 2 years ago

      If you are busting the cache, then you are a shite programmer. :-)

  • @MrChrisRP • 1 month ago

    The reason the opcode is "mov" rather than "copy" is that you can also just conjure up an integer, most notably 1 or 0, but anything that fits, out of the blue and from thin air. For instance, let's consider that we have 32 C0, which is xor al, al (anything at all XORed with itself creates a null value: 0). To make a program function in a more desirable way, perhaps changing that instruction to mov al, 1, which is B0 01 instead, will work. So in this case the 1 is not being copied from anywhere but literally just imagined and told.

  • @Anonnius • 1 year ago

    Thank you for making this lecture freely available!

  • @lucas404x • 4 years ago +108

    Great video. I'm trying to get this knowledge without being in college. Thanks Professor Leiserson! :)

    • @Uvisir • 3 years ago +6

      same here

    • @ian_b • 3 years ago +29

      Back in the old days when microcomputers first came out, many of us had to learn Assembly from a handful of books and magazine articles. Nowadays there are massive resources available online. If we could do it, you can! Best wishes for your learning.

    • @therealb888 • 3 years ago +3

      @@ServitorSkull Oh, Ben Eater, I knew I was familiar with it! He should have used the 8086 though.

    • @brucemunro8598 • 3 years ago +4

      @@therealb888 You can still buy new 6502 chips, and they are very cheap as well.

    • @Mr_ToR • 2 years ago

      Watch Ben Eater. First watch his videos about building a CPU, then his other stuff.

  • @JakeBechtold • 3 years ago +4

    Man I wish I could have had this professor in college! Guess that's why he's at MIT.

    • @picklerix6162 • 3 years ago +1

      I wish I had this guy in college. The professor who taught my assembly language class barely knew the topic but that didn’t stop me from learning.

    • @sbalogh53 • 2 years ago

      @@picklerix6162 ... I taught myself Honeywell Easycoder (its assembler language) and Z80 assembler. It was not that hard back then, although today's chips seem to be far more complex, so there may be a need to reference the manual more often.

  • @leonlao744 • 3 years ago +2

    I will absolutely give this lecture a thumbs up

  • @amyh4606 • 3 years ago +2

    I remember this class. It was interesting to know how programs work under the covers. That said, you wouldn't want to write a program in assembly

    • @sbalogh53 • 2 years ago +2

      I wrote many programs in Z80 assembler code back in the 1980s. I also reverse engineered large blocks of binary code stored in EPROMS, modified the code to suit new requirements, assembled the new code and burned it back into EPROMS. They were fun times. Not sure I would enjoy the experience with X86 code though.

    • @williamdrum9899 • 2 years ago

      These days, no. The x86 has a huge number of instructions, and the ARM compresses its immediate operands meaning that not every 32-bit value can be loaded into a register as a one-liner (usually on the ARM, constants are stored in nearby data blocks and you load from there instead). I don't mind writing in Motorola 68000 assembly tbh.

  • @alexandersviridov8682 • 4 years ago +12

    Great explanation. TY from Russia.

  • @naruto6918 • 2 years ago +2

    His book"introduction to algorithms" is just osm ❤️

  • @mostafar8514 • 1 year ago

    My only experience in assembly is Zachtronics' EXAPUNKS game; it led me here, and I understand most of it surprisingly well. To anyone who wants to learn assembly in a fun way, definitely check the game out

  • @allanrichardson9081 • 3 years ago

    The assembly language for this one-chip microprocessor is more complex than the assembly language for the 1964 models of System/360! Learned some very interesting information. Now I need to get the details somewhere without going BACK to college!

    • @raybod1775 • 2 years ago +1

      I’m a retired IBM mainframe programmer, nothing but respect for any assembler programmer who can write quality code. Assembler is completely unforgiving, takes so much diligence and concentration. I did write macro assembler code to run Cobol programs, but someone else had written the original code I modified and used. Great for getting a true feel about how computers operate.

    • @schmetterling4477 • 2 years ago

      @@raybod1775 Assembler is no more unforgiving than any other language. Poor code is poor code in any language. There is, unfortunately, a religious belief among younger programmers without a solid computer science background that computer languages and compilers make programming easier. That is not the case. A language is simply a collection of shortcuts to achieve certain side-effects. It is, sort of, an implicit library to a set of often used algorithms. The exact same results can be achieved with explicit libraries of assembly level functions. What will be missing are the (extensive) compiler optimizations. An assembler programmer would have to work much harder to get the same level of optimization of out the code as is possible with a modern compiler. Other than that no language can transform a hard programming problem into an easy one.

    • @schmetterling4477 • 2 years ago

      @@raybod1775 I have written 6502 assembler before you were even born, Ray.
      You can do absolutely everything on a well designed compiler, Ray. C, for instance, has a so called _asm_ statement. Guess what that does? Python lets you bind C code to your Python program directly. Just because you are chicken to use these facilities doesn't mean they don't exist and aren't being used by people who actually know how to use computers. You clearly don't. So what? So nothing except that a guy called Ray has to educate himself.

    • @rty1955 • 5 months ago

      @@raybod1775 I also wrote mainframe assembly code, starting in 1969. Assembly code is to programmers what a blank canvas is to artists. I can do things in assembly that other languages simply can not do.
      As far as being unforgiving goes, what most other commenters are talking about is that assembly code has no type checking, and I am glad for that, as type checking ties my hands. Memory is memory; you can put whatever you want in it: code, text, numbers (in any coding scheme you want).
      All these stupid special purpose registers are ridiculous. Just give me GENERAL purpose registers and let ME worry about what to put in them. While I'm on the subject of special purpose registers, why have a stack at all? It's just memory! IBM mainframes never had a stack and they worked just fine.
      I wrote millions of lines of assembly code (as well as 13 other languages); I also wrote microcode for IBM mainframes using special mylar punch cards on a 360. One of my modules in assembly was still running in 2013, many decades after I wrote it.
      Most of you will never experience the true joy of coding in assembly if you had no exposure to an IBM mainframe. It's still my fav language

  • @mariodrechsler2618 • 2 years ago

    I began with assembly in 1987 but sadly ended some lines of code later. I knew how almighty it is, but at that time I studied architecture. Today I sometimes think how helpful it would have been, all the passing time, to have skills like that... This is the real reason why Mies van der Rohe said "Less is more" ;)

  • @TranscendentBen • 3 years ago +1

    1:00:06 Here's a point where he misspoke on memory vs. registers: "I would say also the fact that you have a design with registers, that also reflects locality, cuz what the processor wants to do is fetch stuff from memory; it doesn't want to operate on it in memory, that's very expensive. It wants to fetch things into memory [he meant registers], get enough of them there that you can do some calculations, do a whole bunch of calculations, and then put them back out there [to memory]."
    This reminds me of the February 1984 BYTE Magazine, in an interview with the original Macintosh system-level programmers, one of them said of writing efficient code for the Mac's 68000 processor, "keep the registers full." The hit (time delay) of accessing main memory vs. a register wasn't nearly as bad back then, but keeping as much data in registers as you can keeps from having to swap out a lot of data to main memory.

    • @williamdrum9899 • 2 years ago +1

      I always feel like I don't have enough registers on the 68000 and yet I never feel this way with machines that have far fewer registers like the 6502. Sure you've got 8 on the 68k but it seems like I need to give up a data register any time I have to index an array so I can get the right offset

  • @okaro6595 • 3 years ago +1

    The 8086 could address 1 megabyte by using the segment registers. The weird memory architecture was created partially to make it compatible with the 8080.
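    For reference, real-mode 8086 addressing forms a 20-bit physical address as segment * 16 + offset; a small sketch with made-up values (16-bit GAS/AT&T syntax):

        # physical = segment * 0x10 + offset
        # e.g. 0x1234 * 0x10 + 0x5678 = 0x12340 + 0x5678 = 0x179B8
        mov  $0x1234, %ax
        mov  %ax, %ds        # segment registers are loaded via a general register
        mov  0x5678, %al     # byte load from physical address 0x179B8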

    • @williamdrum9899 • 2 years ago

      Yeah, I remember trying C for the 8086 and thought "OK, but how do I pick a segment to load from," not realizing I could just specify the full 20-bit address and the compiler did the rest... Lesson learned: registers are the compiler's job

  • @abevigoda3149 • 2 years ago

    When I needed to really optimize code, the only option was doing it in assembly, and I have no doubt it still applies today: knowing the target processor well (the instruction set and the bugs you can exploit) can dramatically increase performance in iterative code. It's not for lazy people though.

    • @NinjaRunningWild • 2 years ago

      True, it can be tedious. But, to quote Michael Abrash, "Rarely can you get more than a 2x performance increase by doing everything in Assembly Language."

    • @HarnessedGnat • 2 years ago

      Today, compilers can be VERY good at producing optimized code. Sometimes it's a matter of knowing how not to get in the compiler's way. Write readable code, and then maybe have a look to see what it did.

    • @abevigoda3149 • 2 years ago

      @@HarnessedGnat Sure, and compilers will get even better in the future, but knowing how not to get in the compiler's way requires knowledge of how the compiler works (the deeper the better), and not all programmers take the time to do that; they just compile their code and expect it to be thoroughly optimized. In some applications it is best to do things yourself (use your brain and natural intelligence) instead of relying on an automated, rule-following algorithm that will overlook even the obvious for its lack of intelligence; then you reap the benefits of assembly programming.

  • @nishthagupta1357 • 2 years ago +1

    Such an eloquent speaker he is! ❤

    • @danman6669 • 2 years ago

      "He is such an eloquent speaker!"

    • @nishthagupta1357 • 2 years ago

      @@danman6669 I tried complicated sentence formation technique. Unlike you, who did the simple kind.

  • @charlespackwood • 3 years ago +6

    Every computer nerd needs to learn Assembly Language.

  • @veramentegina
    @veramentegina 5 ปีที่แล้ว +20

    love this guy!! Great lecture really!!

  • @GoogleUser-ee8ro
    @GoogleUser-ee8ro 2 ปีที่แล้ว

    This lesson was filmed in 2018, which if I recall correctly was when Intel introduced AVX-512 to its Skylake architecture; if MIT were to film this course again, I wonder if it would update it to an ARM assembly version 🙂

  • @legion_prex3650
    @legion_prex3650 3 ปีที่แล้ว +5

    cool! Thanks Professor Leiserson! Great lecture! I still love assembly.

  • @magnuswootton6181
    @magnuswootton6181 3 ปีที่แล้ว +2

    Thanks for the ILP lesson! Now I know what it's called!!! ILP is the future, I think.

  • @lil-hooves
    @lil-hooves 5 ปีที่แล้ว +25

    Incredibly helpful, thank you!

  • @yashas9974
    @yashas9974 3 ปีที่แล้ว +6

    19:05
    "640KB ought to be enough for anyone" - Bill Gates (1981)
    "64bit address is a spectacular amount of stuff" - Charles Leiserson (2018)
    I wonder how long this will hold. (Some quick arithmetic after this thread puts 2^64 in perspective.)

    • @toby9999
      @toby9999 2 ปีที่แล้ว

      My first 'computer' back in the mid-'70s had 1K, so 64K seemed outrageous. All coding was done in hex. I can remember someone around the mid-'90s telling me they'd installed a 2GB hard drive. My first thought was: why would you need that much storage? Mine was ~300MB. It's all relative to the times.

    • @schmetterling4477
      @schmetterling4477 2 ปีที่แล้ว

      @@toby9999 Huh? 300MB was already rather small at the time. I used to run out of storage space all the time.
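
    Some quick arithmetic on why a 64-bit address space is "spectacular", assuming byte addressing:

        2^64 bytes = 18,446,744,073,709,551,616 bytes
                   = 16 × 2^60 bytes = 16 EiB (about 17 billion GiB)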

  • @laboratoriodojulio
    @laboratoriodojulio 3 ปีที่แล้ว

    For me, this is the best class... great. Best regards, professor.

  • @SphereofTime
    @SphereofTime หลายเดือนก่อน

    11:41 compiler intrinsic function

  • @DrewryPope
    @DrewryPope 2 ปีที่แล้ว

    A truly great overview, thank you. I've watched this multiple times.

  • @starriet
    @starriet 2 ปีที่แล้ว

    I don't know how YouTube recommended me this (maybe related to googling),
    but anyway, excellent job YouTube!!! Now you're working.

  • @saicharanmarrivada5077
    @saicharanmarrivada5077 18 ชั่วโมงที่ผ่านมา

    44:05 xor %eax, %eax clears the entire 64-bit register (a write to a 32-bit register zero-extends into the upper half) and is 1 byte shorter: it encodes as 31 C0, vs. 48 31 C0 for xor %rax, %rax.

  • @mhaddadi
    @mhaddadi 3 ปีที่แล้ว

    The 8087 did the floating-point calculations; it was the math coprocessor ("mathco") chip.

  • @erichlow3109
    @erichlow3109 3 ปีที่แล้ว

    Excellent intro course; maybe a further lesson could include a look at the ARM Cortex-M33 et al. architectures, which are very present in IoT use cases.

  • @jvolstad
    @jvolstad 3 ปีที่แล้ว +2

    I'm a retired COBOL developer. 👍

  • @davereid-daly2205
    @davereid-daly2205 2 ปีที่แล้ว +1

    Fantastic explanations and diagrams. Extremely helpful indeed, thank you SO much !!!!!!

  • @chiefolk
    @chiefolk 3 ปีที่แล้ว +3

    I wish I could hear the MIT students' questions/answers properly.

    • @w1d3r75
      @w1d3r75 3 ปีที่แล้ว +1

      Maybe in the course transcripts ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-172-performance-engineering-of-software-systems-fall-2018/lecture-videos/lecture-4-assembly-language-computer-architecture/#vid_playlist

    • @chiefolk
      @chiefolk 3 ปีที่แล้ว

      @@w1d3r75 thanks 🙂

  • @KingofUrukhai
    @KingofUrukhai ปีที่แล้ว

    C and C++ compilers have reached such a level of efficiency that coding in assembler is no longer needed, even to optimize code that might be time-critical...
    Developing control units in an automotive environment, I spent a lot of time with my customers, which were among the most prominent European car makers, mainly executing code reviews.
    Code reviews consisted of checks to verify that coding rules were met, and in particular that no assembler code was used!!!

  • @brahimd8683
    @brahimd8683 3 ปีที่แล้ว

    Quantum computers will be very fast; then they will be developed with complex algorithms so they can decide how to work and charge, etc. Then they will be developed into 3D, through a new technology that is still in the preliminary research stage; I call it 3D lights. These 3D lights will also be used in TVs and even smartphones, etc.

  • @NinjaRunningWild
    @NinjaRunningWild 2 ปีที่แล้ว

    Just CISC architecture. There are pros & cons to that. One con is a complicated & verbose instruction set.

  • @WeconTechnology
    @WeconTechnology 2 ปีที่แล้ว

    Very nice video about assembly language and computer architecture.

  • @Basieeee
    @Basieeee 3 ปีที่แล้ว +3

    6:55, me in my head: REVERSE ENGINEERING, REVERSE ENGINEERING.

  • @FreestateofOkondor
    @FreestateofOkondor 5 ปีที่แล้ว +8

    awesome video but why does the playlist have some videos in the wrong order?

    • @mitocw
      @mitocw  5 ปีที่แล้ว +26

      No idea why. We were sure they were in order before the playlist was made public... but it's been fixed. Thanks for your note!

  • @justcurious1940
    @justcurious1940 ปีที่แล้ว

    Thanks for the free lecture.

  • @laohu5511
    @laohu5511 2 ปีที่แล้ว

    Great video, great introduction.

  • @TomFynn
    @TomFynn 3 หลายเดือนก่อน

    Assembly is what makes playing Dragon's Lair look like casual gaming.

  • @jakefischer8281
    @jakefischer8281 3 ปีที่แล้ว

    Love his enthusiasm!

  • @engcre
    @engcre ปีที่แล้ว +1

    Liked it a lot 🇧🇷

  • @AndyWJP
    @AndyWJP 3 ปีที่แล้ว +11

    He terminates his sentences with the enum value OK (zero/null)

  • @DHorse
    @DHorse 3 ปีที่แล้ว

    What a great speaker.

  • @SphereofTime
    @SphereofTime หลายเดือนก่อน

    9:22 reveals what the compiler does and doesn't do

  • @nickmullen9510
    @nickmullen9510 6 หลายเดือนก่อน

    Love you, man. Great presentation. Love the shirt.

  • @RobWinchesterBoston
    @RobWinchesterBoston 3 ปีที่แล้ว +1

    2**128 addressing... well, I can sort of see databases using that eventually as SSD and RAM continue to blur (though for normal use I agree x64 will be around for a long, long time).

  • @andrewherrera7735
    @andrewherrera7735 3 ปีที่แล้ว

    59:02 All this is the duct tape holding together Moore's law.

  • @kahnfatman
    @kahnfatman หลายเดือนก่อน

    Dissertation paper: Apollo 11 - graduation will be on the Moon 🌕 💙
    But seriously, kids (8 years old and older) should be exposed to all this prior to doing Scratch and Python.

  • @pfever
    @pfever 3 ปีที่แล้ว +3

    Amazing professor! :)

  • @TranscendentBen
    @TranscendentBen 3 ปีที่แล้ว

    1:07:16 "FMA? Fused Multiply and Add." Is this Intel coming up with a new name for something that already existed? In DSP terminology (which goes back a few decades) it's MAC, for Multiply and Accumulate.
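
    The "fused" part is the usual distinction from a plain multiply-accumulate: the product is not rounded before the add, so the whole x*y + z sees a single rounding. A minimal C sketch using the standard fma() from <math.h> (C99; illustrative values chosen to make the difference visible):

        #include <math.h>
        #include <stdio.h>

        int main(void) {
            double x = 1.0 + 0x1p-52;           /* 1 + one ulp */
            double y = 1.0 - 0x1p-52;
            double separate = x * y - 1.0;      /* product rounds to 1.0 first */
            double fused    = fma(x, y, -1.0);  /* one rounding at the end */
            printf("separate: %a\nfused:    %a\n", separate, fused);
            return 0;                           /* link with -lm on some systems */
        }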

  • @rabbitcreative
    @rabbitcreative 3 ปีที่แล้ว +2

    *chuckles* He said, "similar-in-structure". How Korzybskian.

  • @pial2461
    @pial2461 5 ปีที่แล้ว +3

    Wow! Awesome content

  • @lashlarue59
    @lashlarue59 3 ปีที่แล้ว +1

    Wait a minute. At 11:45 he said that in the class you should be able to write assembly from scratch, but they wouldn't be doing that in class. How can you have an assembly class when the students aren't going to write assembly programs from scratch? You learn by doing it, screwing up a few thousand times, and fixing those mistakes.

  • @brahimd8683
    @brahimd8683 3 ปีที่แล้ว

    Quantum computing is the exploitation of collective properties of quantum states, such as superposition and entanglement, to perform computation. The devices that perform quantum computations are known as quantum computers. They are believed to be able to solve certain computational problems, such as integer factorization (which underlies RSA encryption), substantially faster than classical computers. The study of quantum computing is a subfield of quantum information science. Expansion is expected in the next few years as the field shifts toward real-world use in pharmaceutical, data security, and other applications.

    • @williamdrum9899
      @williamdrum9899 2 ปีที่แล้ว

      I don't think I want to see the assembly code for those things... good grief

    • @brahimd8683
      @brahimd8683 2 ปีที่แล้ว

      @@williamdrum9899 The more freedom of thought, the more invention

  • @arnolddalby5552
    @arnolddalby5552 4 ปีที่แล้ว +2

    Excellent, loved it.

  • @numeric.alphabet
    @numeric.alphabet 3 ปีที่แล้ว

    4k: assembly language and computer architecture