im in foo rwar xd
teehee im in main
int z = foo();
I like feet (I'm heterosexual)
}
How does it look?
I always recommend your channel to my students; the way you break down the basics makes for such valuable building blocks.
The content is often subtly wrong, but mostly fine.
That's a lowkey self-own. While it's good you recognize your own limitations in teaching, you're kind of getting in your students' way. The professors I've hated the most were the ones who couldn't be bothered to lecture coherently and put that burden on some YouTuber.
For me, as a beginner to assembly, this was a bit too fast-paced, but I believe this video is very good for people with more assembly experience.
The calling convention is to push values onto the stack to call a subroutine and have the subroutine pop the values off the stack. Personally, I like to put values into registers and/or into the data segment to call a subroutine, so my stack is filled only with return addresses for calling subroutines and returning to the caller's next instruction.
We can read the values in the data segment from inside any nested subroutine, multiple times, without caring where the stack pointer is pointing. PUSH and POP instructions were also slower than plain MOVs on x86 CPUs made before the Pentium 4.
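A small C sketch of the same trade-off, assuming 32-bit x86 and GCC (build with -m32 and compare the disassembly of the two calls); the names add_stack and add_regs are made up for illustration. The default cdecl convention passes arguments on the stack, while GCC's regparm attribute asks for them in registers instead:

/* default cdecl: the caller pushes the arguments onto the stack */
static int add_stack(int a, int b) {
    return a + b;
}

/* regparm(2): the first two integer arguments arrive in EAX and EDX */
static int __attribute__((regparm(2))) add_regs(int a, int b) {
    return a + b;
}

int main(void) {
    return add_stack(1, 2) + add_regs(3, 4);
}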
I've been programming for 5 years now. Recently I've been getting into embedded programming and computer graphics with OpenGL, and your videos have been really valuable for me to clarify some CS fundamentals. Thanks for the amazing job you do!
The heap grows up into increasing memory addresses and the stack grows from a high memory address down into decreasing memory addresses. It's a remnant of when an x86 program worked within one 16-bit segment of memory (64 KB), and don't forget about real mode, either. It would have been better if the stack diagrams in this video grew from the top down and if the stack registers were rsp and rbp, since everything is 64-bit today.
That is exactly how I was taught to think about the stack, for this very reason.
I thought the direction of stack growth depended on the processor architecture.
@@cyrilemeka6987 The focus of the video is only the x86 architecture.
@@DaveAxiom my bad, I replied to your comment before I got to watch the video
What I find interesting is that it seems to be convention to visually map memory vertically, with high addresses at the top. I'm more inclined to visualize it horizontally, with higher addresses to the right. I find the horizontal way makes it easier to visualize more complex memory structures and algorithms, because it's more like a number line.
I am a full stack engineer, but I can always appreciate a high quality video that makes it easy to understand low level concepts.
Maybe you should cover the UEFI or BIOS boot-loading process at some point.
UEFI is effectively used everywhere: newer x64 systems and even ARM.
BIOS is also used on ARM, but usually on very embedded systems, and of course on older x86-based systems.
I've also seen some ARM systems "emulating" UEFI by basically just wrapping BIOS calls in a UEFI layer.
YouTube's algorithm led me to this video for some reason. Anyway, a few things I think you should have added and talked about: caller vs. callee, arguments, locals, calling conventions, the return address, the return value, the EIP register, the red zone (x64), initialization of the area, cleaning up the area, and more...
The nice thing about having some surface-level assembly understanding and experience is that you kind of already understand how things like this work. I'm excited to see what more there is to learn, and what I missed.
4:13 my earrssss
The reason the SP "grows" toward the negative is that the stack is actually a reversed stack: its bottom is at higher addresses and it grows toward lower ones (a new variable's address is lower than an older variable's address).
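A quick (non-portable) way to see this from C: print the address of a local in the caller and in a callee. Comparing addresses from different frames isn't guaranteed by the C standard, so treat this as a demo of a typical x86/Linux build rather than a rule:

#include <stdio.h>

static void callee(void) {
    int inner = 0;
    printf("callee local: %p\n", (void *)&inner);   /* usually a lower address */
}

int main(void) {
    int outer = 0;
    printf("caller local: %p\n", (void *)&outer);
    callee();   /* on a descending stack, the callee's frame sits below the caller's */
    return 0;
}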
I spent a lot of time understanding this, but you clarified it in only 8 minutes.
I will be writing an exam soon on assembler and how it works with the stack, and this, together with your assembler vids, helped me understand. Thank you.
The stack is like a hotel: if the first floor is full you go to the next floor, but only the last floor to go in can come out.
You can push data in and also pop it out, but if you don't want the data you don't even have to pop; just move ESP somewhere else with ADD ESP, <however many bytes you need to clear>.
I've been watching your videos for some time but this particular one made me subscribe to your channel. Thanks for making these difficult videos.
Between you, ben eater, and chatGPT, I am having an unbelievable time learning about how computers work. This is so much fun omg
That’s awesome to hear I love that
Yeah, this channel is so great. I had one of my first "sparks" a few days ago when learning about memory and automatically linked the stack to the scope of variables and functions in my head. That felt so good that I had to go ask GPT whether my assumption was right, and it confirmed it. But now the chad just uploaded this, and I'm so happy to have more confirmation and a more in-depth explanation of it.
I guess I never paid attention to this stuff at college… but this low level stuff is really interesting to me nowadays.
I started with the low-level stuff out of my own interest, simply because it was interesting, logical, and intuitive. I had a Z80-based machine and quickly became frustrated with the limitations of BASIC, so I found a book on Z80 assembly language at the library and taught myself. The assembly was written and assembled to machine code with pen and paper. Nothing to do with school; I just enjoyed it. Computer science these days is much harder than it was 30 or 40 years ago, because there's too much to learn and it can be overwhelming.
Great video. I always wondered how the hell the stack pointer retraces its steps back to the previous stack frame; it turns out they just save the old ebp on the stack lol
You have the best videos, I love 'em. These kinds of videos especially.
I am very excited to learn that you have an entire video on the ret instruction.
Cool, I really enjoy your videos. I would recommend Compiler Explorer for a better representation of the assembly code, or make sure you use the -O option… Looking forward to the next one.
A full Stack Computer Architecture video?? Looks like the Haskell has hit hard
I have absolutely no idea what I just watched, but I won't be saying that in 6 months! 💪
another low level learning video, another happy day
0:13 that is Windows 10 with a Windows XP skin
I could be wrong, but the stack is just literally our main function, where it can call other functions to do their own things. If it needs to call another function, it saves the return address into main's flow so that after the called function has finished, it can go back to where main is in RAM. When calling a function, main stops and memory is allocated on the stack for that function to do its work; after it's done, that memory is deallocated automatically from the stack and control is given back to main to call other functions, until main exits.
The question of where the stack is for a particular program depends on the operating system. For example, if at a particular time there are 10 programs executing on a computer, that means there are 10 main functions in RAM, each calling its own functions, and all the stacks (the main functions) are aligned next to each other. If one of the main functions requires an allocation on the heap, the OS needs to find a memory address outside of this range; i.e., the heap region cannot overlap with the stack region, even if there is available memory in the stack region.
The stack can be anywhere. The OS preallocates a region, but there's nothing stopping you from malloc'ing some memory and setting esp to the last byte of that region.
In Go, for example, each goroutine has its own stack that is just allocated on the heap like any other allocation (IIRC it still uses the stack provided by the OS for signal handlers and cgo calls).
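A minimal sketch of "the stack can be anywhere", assuming Linux/glibc: the (obsolescent but still available) ucontext API runs a function on a stack you malloc'd yourself, so its locals end up inside the heap block:

#define _XOPEN_SOURCE 700   /* for the ucontext functions on some systems */
#include <stdio.h>
#include <stdlib.h>
#include <ucontext.h>

static ucontext_t main_ctx, co_ctx;

static void on_heap_stack(void) {
    int local = 42;
    printf("local lives at %p (inside the malloc'd block)\n", (void *)&local);
}   /* returning resumes main_ctx because uc_link points at it */

int main(void) {
    size_t size = 64 * 1024;                 /* 64 KiB is plenty for this demo */
    char *stack = malloc(size);              /* heap memory used as a call stack */
    getcontext(&co_ctx);
    co_ctx.uc_stack.ss_sp = stack;
    co_ctx.uc_stack.ss_size = size;
    co_ctx.uc_link = &main_ctx;              /* where to continue when the function returns */
    makecontext(&co_ctx, on_heap_stack, 0);
    printf("heap block: %p .. %p\n", (void *)stack, (void *)(stack + size));
    swapcontext(&main_ctx, &co_ctx);         /* switch onto the heap-backed stack */
    free(stack);
    return 0;
}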
That is Windows 10 with the XP wallpaper, an Open-Shell taskbar skin, and a Start button skin.
ebp is extended base pointer, not bottom
thanks for the video
I would have preferred you describe the stack the way it's laid out instead of symbolizing it. After all, you subtract from the stack pointer because when you're first given a stack you are pointing to the end of the stack. Every push instruction essentially subtracts the amount of space for whatever object you're about to add so that you can move it into place where the new position of the stack pointer points. If you really want to get in depth, it'd be nice to explain how the loader sets up the stack for your program.
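A minimal sketch of that "subtract first, then store" behaviour, using a plain byte array as a pretend stack; the names fake_stack, push32, and pop32 are made up for illustration, but the bookkeeping mirrors what push/pop do with ESP:

#include <stdint.h>
#include <stdio.h>
#include <string.h>

static uint8_t fake_stack[64];
static uint8_t *sp = fake_stack + sizeof fake_stack;   /* starts one past the end, like a fresh stack */

static void push32(uint32_t v) {
    sp -= sizeof v;              /* make room first: the stack grows toward lower addresses */
    memcpy(sp, &v, sizeof v);    /* then store at the new top */
}

static uint32_t pop32(void) {
    uint32_t v;
    memcpy(&v, sp, sizeof v);    /* read the current top */
    sp += sizeof v;              /* then release the space */
    return v;
}

int main(void) {
    push32(0x11111111);
    push32(0x22222222);
    printf("%#x then %#x\n", pop32(), pop32());   /* 0x22222222 comes out first: last in, first out */
    return 0;
}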
Okay, I didn't realize I needed to know assembly to understand this video...
You look tired but also explain things in much more detail than any books I have read. (not counting the assembly for dummies books :) )
More videos on assembly, please!
At 4:24, it's the function epilogue that collapses the stack frame.
By the way, the Exx registers are the 32-bit ones; since most systems today are 64-bit, the registers are named Rxx (RAX and so on).
The more I learn about low level programming, the more I believe that Magic the Gathering was designed by computer engineers. The Stack in MtG works almost exactly like The Stack in computers
"Also, why am I stuck here in Windows XP?"
I feel you.
Titan-tier thumbnail
Don't mind my comment, it's just to boost your popularity. Keep learning us at low level
Hey, I really like how your Vim looks. Can you share your vimrc configuration?
awesome video as always
Sick, dude. Thanks for the answer! Only recently did I hear that the stack might not be "real," and I was thoroughly surprised, because in the C++ classes I've been to, and when I hear people talk about C++, they talk as if the stack were a separate physical memory thing. This explained a lot of what's going on.
What about languages that don't have or use the stack the same way C does? I understand that happens as well.
And what about when you start a new program? I understand it requests a certain amount of memory filled with zeroes. Who does the clearing/writing of zeroes, and when? Can it happen that the memory is NOT actually clear when clean memory was requested, maybe due to a bug or a security issue? And where does that memory come from: the stack or the heap?
To answer your first question: the stack is a concept from computer architecture, not just from programming languages. Function calls in any language will typically use the stack, because the machine code they compile into relies on the stack to do calls. Try writing a simple program that calls a function and returns a value, then compile and disassemble it to see the language-agnostic use of the stack.
On to your second question: the OS is typically responsible for clearing the memory, although this doesn't apply in most of the embedded world.
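If you want to try that, here is a tiny example plus the kind of commands you'd use (the build flags are assumptions about a typical Linux/GCC setup); look for push %ebp / mov %esp,%ebp / sub in foo's prologue:

/* Build and inspect, for example:
 *   gcc -m32 -O0 -fno-omit-frame-pointer -c demo.c -o demo.o
 *   objdump -d demo.o
 */
int foo(void) {
    int x = 1;       /* locals that force the compiler to reserve stack space */
    int y = 2;
    return x + y;
}

int main(void) {
    return foo();    /* call/ret route the return address through the stack */
}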
3s into the video, “is that fucking windows xp?”
I prefer the XP vibe. idk it just really got my attention - like I've been waiting to see someone come around that hill for a long time
Bro gave major Big Tugg vibes at the beginning 👀
I'm curious: how does it know where to `push` to and where to `pop` values from? It seems like there should be some register storing that stack pointer, and it should update automatically on `push` and `pop`.
This video feels like a precursor to buffer overflows 👀 are we seeing that soon?
Oh whoops that already happened a year ago. Somehow I missed it.
th-cam.com/video/qpyRz5lkRjE/w-d-xo.htmlsi=YwzJCQFLYlp9B4en
At 5:57 wouldn’t ebp - 0xc be lower on the stack than ebp - 0x10? So why did you draw it corresponding to x, which is higher on the stack?
He represented it symbolically which is probably the wrong way to demonstrate it. The stack grows downward and every push operation subtracts from the stack pointer by the size of the operand and then moves it into place where the pointer now points.
As a js developer, I got lost after the windows background went away.
The Ben Eater of low-level programming
Phenomenal content
So, one question: does the stack pointer move for every push instruction? I'm not quite sure.
I died when you said you got banished here cuz you coded in haskell🤣😂🤣
It might have been worth it to include where the stack is located (physically), and maybe more importantly where it's not located, and what its limits are: both in how big it can get (why it might be a bad idea to throw heaps of data onto the stack :P) and in terms of how its parts are accessed (why it's actually called a stack).
As far as I know, it's just an allocated segment in memory (RAM). As for the size, I'm not sure; as far as I know it's between 10ko and 16ko, but I think it could theoretically go higher. It's just not necessary most of the time.
@@newkg3591 I meant as worthwhile mentioning in the video. (I don't know what "ko" is, can you clarify?) Stack size varies, but it's about 8 MB by default on Linux and 1 MB on Windows, yet L3 cache is just as big and main memory is much bigger. So where does the stack go? If the heap grows backward, what's its maximum, where do I get more, is it contiguous, and can you pick a spot? Why must the heap be accessed through a pointer if malloc chooses the address? You can write your own allocator, so is it fixed or up to the programmer, and is it language-dependent? Those are just some of the things I think a beginner would be curious about; I know I had to look up most of them at some point, and I still have many questions, hehe. Something I still haven't found a satisfying answer to: the CPU I program for in my head is still a Z80/6502 hybrid simplification, but it always runs at 100% no matter what. What makes a modern CPU sit at only 20% load on a core, or even clock down, when in my mind it should be processing the next instruction as fast as possible, especially since I don't program a lot of NOPs or have routines triggered by interrupts? For me it's black magic atm.
@@hoefkensj Tbh I honestly don't have concrete answers to your questions, sorry. I'm mostly familiar with the 8086 and the older 80-series architectures; those are the ones I actually used, and I'm not that into the modern ones. Your questions are interesting though, and I really hope he covers them in his videos!
@@newkg3591 Yeah, I've not found much either, which is weird because it seems kind of important somehow :)
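For the stack-size part of this exchange, a small sketch for Linux, where the per-process soft limit can be queried with getrlimit (the same number `ulimit -s` reports). Typical defaults are around 8 MiB on Linux and 1 MiB on Windows, but they are configurable, so check rather than assume:

#include <stdio.h>
#include <sys/resource.h>

int main(void) {
    struct rlimit rl;
    if (getrlimit(RLIMIT_STACK, &rl) != 0) {
        perror("getrlimit");
        return 1;
    }
    if (rl.rlim_cur == RLIM_INFINITY)
        printf("stack soft limit: unlimited\n");
    else
        printf("stack soft limit: %llu bytes\n", (unsigned long long)rl.rlim_cur);
    return 0;
}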
Why are the variables 'x' and 'y' saved on the stack rather than in the data segment? Is it so that the amount of memory needed for a function call is "allocated" dynamically for that call, or is there some other reason, such as to better allow for recursion?
I've only worked with Intel 8080 and Z80 assembly, both of which are much more minimal than these modern CPUs, so sorry if it's a dumb question.
No matter the reason, I found a way to create a function in i8080 assembly, though it takes a bit of work since the stack pointer is harder to interface with. Writing to the pointer is fine, we've got SPHL, but to read it you've got to use the double-add instruction (DAD SP) with HL set to 0.
Recreating the foo function in the video (minus the printing) could look like this. Also, x, y, z are going to be 8-bit values, for simplicity and because of the architecture.
foo:
; L = 0x01 = x variable (base stack offset of +0)
; H = 0x02 = y variable (base stack offset of +1)
LXI H, 0201H
; push (16bit) HL onto stack
PUSH H
; load variable x into Reg. A
; load the variable offset into HL, in this case 0
LXI H, 0
; move HL to the variable's position in the stack
DAD SP
; store value pointed to by HL into A
MOV A,M
; repeat for variable y and Reg. B
; load the variable offset into HL, in this case +1
LXI H, 1
; move HL to the variable's position in the stack
DAD SP
; store the value pointed to by HL into B
MOV B,M
; do our calculation (result stored in A)
ADD B
; move SP back to its original position by either method 1 or 2, depending on how many pushes you made at the beginning
; method #1 (one POP for each PUSH made at the beginning; speed varies with the number of POPs)
POP H
...
; method #2 ( constant speed, may be faster if many PUSHes were made in the beginning)
; load the total function offset (# of PUSHes at the beginning, multiplied by 2)
LXI H,function_offset
; calculate the new value of the stack pointer after "POP"ing the function's variables off the stack
DAD SP
; store the calculated value back into the stack pointer
SPHL
; And after all that, we return from the procedure ("function") call
RET
Please note this code is untested, don't quote me.
If they were saved in the data segment they would be shared by multiple calls, just like globals and variables declared static.
They are allocated on the stack so that each call has its own private copy of the local vars, even if the function is called recursively.
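A short C sketch of that difference: the static counter lives in the data segment and is shared by every call, while the local gets a fresh copy (and a fresh address) in each recursive call's frame:

#include <stdio.h>

static void visit(int depth) {
    static int shared = 0;      /* one copy for the whole program (data segment) */
    int local = depth * 10;     /* a fresh copy per call (this call's stack frame) */
    shared++;
    printf("depth=%d  &local=%p  local=%d  shared=%d\n",
           depth, (void *)&local, local, shared);
    if (depth < 3)
        visit(depth + 1);       /* each nested call gets its own 'local' */
}

int main(void) {
    visit(1);
    return 0;
}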
For a function like foo here, how "tall" is the stack frame it gets? Would it change if the local variables x and y were floating-point numbers instead of integers?
And if a function foo1 calls another function foo2 (both having their own local variables), would the stack frames just stack on top of each other? main > foo1 > foo2
I'm no expert, but I think in assembly the stack's size is that of the segment (I forget how big), and the stack elements are, I think, 32 or 64 bits by default (again, not an expert, I've just dabbled in it before). As for foo1 calling foo2, it's basically the same thing: foo2 returns to the address popped from the stack, back into foo1, and foo1's return goes back to main by popping its own return address from the stack.
I'm having difficulty getting how ebp knows where the bottom of the outer stack frame is.
If I've got three frames, how does it find the bottom of frame two after I return out of frame three?
Is it storing the previous ebp at the bottom of the stack frame?
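Yes, that's the usual scheme when frame pointers are kept. A rough, non-portable sketch (GCC/Clang extension; assumes -O0 and -fno-omit-frame-pointer on x86): the word at each frame's base holds the previous frame's base, so you can follow the chain:

#include <stdio.h>

static void walk(void) {
    void **fp = __builtin_frame_address(0);          /* this frame's ebp/rbp */
    for (int i = 0; i < 3 && fp != NULL; i++) {
        printf("frame %d at %p, saved caller frame = %p\n", i, (void *)fp, *fp);
        fp = (void **)*fp;                           /* follow the saved ebp/rbp */
    }
}

static void inner(void) { walk(); }
static void outer(void) { inner(); }

int main(void) {
    outer();
    return 0;
}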
Hi, I've been watching your videos and they're quite interesting, so I decided to start learning C again. Can you recommend any resources? Thank you.
You seem like you taught in the Harvard CS50 class too?
Do you have a video about the heap?
I wonder if anybody has done a stack frames lecture using Lego and address labels. 🙂
Is there any performance difference when accessing data held on the stack vs data held in heap memory? If so, which operations are fastest in which environment?
Repeat after me, "the stack grows down"
this channels is pure gold!
It seems that this only applies to x86 assembly and not to ARM.
5:55 - Why does it sub 0x14 and not 0x18? Two 64-bit registers and two 32-bit ints = 0x18 bytes. So esp points to y in your diagram?
I'm always intrigued by naming the function 'foo'. It should've been 'fool'.
Why is esp decremented by 0x14? Is there some 4-byte piece of data residing above variable x?
The fact that the stack is maintained using pointers, and that the data between them is not erased upon return, leaves a security hole that is expensive for the processor to clean up.
The reason the stack grows negative is that it used to start at the top of memory. The program sat at the bottom of RAM with the heap in between; the heap grew up while the stack grew down, to maximize memory usage.
If you want to prevent stack over/underflows from blowing up the process, just don't map a page of virtual memory at either end of the stack and put in an unmapped-memory-access exception handler.
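A sketch of the "not erased on return" point (reading an uninitialized buffer is undefined behaviour, so this only demonstrates what typically happens at -O0 with the stack protector disabled, not something guaranteed): one function leaves data in its frame, and the next one's uninitialized buffer often still contains it:

#include <stdio.h>
#include <string.h>

static void leave_secret(void) {
    char buf[32];
    strcpy(buf, "hunter2");     /* written into this frame and never cleared */
}

static void peek(void) {
    char buf[32];               /* deliberately left uninitialized */
    buf[31] = '\0';
    printf("leftover stack bytes: %.31s\n", buf);   /* often prints "hunter2" */
}

int main(void) {
    leave_secret();
    peek();                     /* reuses roughly the same stack area */
    return 0;
}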
Just a reverse queue
I'm quite new to this domain, but doesn't the stack grow downwards?
It's called a descending stack. Why it's implemented that way depends on the CPU architecture, but essentially one rationale was to simplify indexing into the stack.
I guess it's how you imagine the memory layout. I always imagine 0x000... at the top and 0xFFFF... at the bottom, so for me the graphic made sense.
@@VivekYadav-ds8oz fair enough, thanks everyone
I don’t like to use the calling convention.
The ebx value that is retained across the call to foo() - what is that? How does that relate to the code we see in main()?
ebx (the register) might have been modified by foo(); that's why you put ebx (the value that was stored on the stack) back into ebx (the register) before returning to main. main might have used ebx before calling foo() and might use it again after the call to foo().
This video is a bit confusing where he uses the same name for the registers and for their values copied onto the stack.
Why does main always get called in a C program? How does the OS give over control to the process?
The processor executes some assembler instructions in a routine usually called C startup to set up things like the stack pointer. This code then has a branch or jump to a label main, and during the link stage the linker inserts the address of the ‘C’ main; that is why we need a function in ‘C’ called main. It is just convention, really; you could branch to another C function if you modified the C-startup code written in assembler.
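A heavily simplified sketch of that idea on Linux with GCC (assumed build line: gcc -nostartfiles start_demo.c -o start_demo): drop the normal startup files, provide _start yourself, and branch to whatever function you like. The real C startup does far more (argc/argv, TLS, libc init), so this is illustration only:

#include <unistd.h>

static int my_entry(void) {                 /* nothing magic about the name "main" */
    static const char msg[] = "hello from my_entry\n";
    write(1, msg, sizeof msg - 1);
    return 0;
}

void _start(void) {
    _exit(my_entry());                      /* can't just return: there is no caller to return to */
}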
I'm assuming when you said "not only does it subtract esp to bring it where ebp is, it also pops ebp, which means now ebp points back to the start of the main", you meant now it sets the value of ebp to the popped value? Sorry if that sounds obvious but assembly is too nuanced for me to not ask such trivial but clarifying questions.
"So that why page is called StackOverflow..."... comments season started 😉
*bold* of you to assume I use the stack at all
I would also appreciate a deep dive into loading ELF executables, or Mach-O, or PE32+, although ELF is much better documented.
What determines the size of the stack frame?
The compiler does, based on the size of local variables.
You guys use stacks? As a Chad firmware engineer I inline everything.
whats ur font
windows xp hell yeah
Meh the only stack that matters is page 1, 256 bytes, all you need.
obligatory first... or whatever
Pls keep making low-level programming videos, you're the best content creator in this area.
Anyone know what font and colorscheme that is?
colorscheme is solarized dark
@@dflyboy420 No not the one you're using but the one from the video, which looks more like One Dark or some variation on it.
@@TheHeavenlyDemonSupreme he uses solarized in his i3 shell and the solarized vim colorscheme
I feel like this would have made more sense to people if you had explained it as growing down. Saying "as SP goes up, the value goes more negative" is just confusing. Just my preference, I guess.
If you think about it, why do stack overflows happen? If the address actually increased, then the size of the stack would be your entire memory, which makes no sense; stacks are very small.
@@Brad_Script Think you're replying to the wrong dude, my guy
Hi xp guy
bold font looks bad
👍👍
👍
0:22 what the Fuck was that? don’t do that.
bro is MAD
whats yer vim setup