This is what happens when a teacher actually knows what he is teaching. Amazing.
Whenever I learn something new from the book, I always polish it with your lectures
These lecture series are truly precious. MIT and all the professors involved in this monumental effort are to be commended for their superb contribution to the advancement and propagation of human knowledge. The same also goes for the anonymous financial contributors whose generous donations have made it possible for these lectures to be made available to the general public, free of charge. 👏🏻👏🏻👏🏻
Lies again? Hello DDF
Watching lectures with this guy is like watching a Netflix series, you’re eager to see what’s going on in the next episode
Happy retirement Professor Strang. Your commentaries are very clear.
The moment he explains that the derivative of the step function is zero except at one point, everything clicks and my mind was like 🤯🤯🤯; all the questions in my mind were answered. OMG, thank you so much, professor
His course on Linear algebra is fantastic and has helped me a lot.
I wish to thank him personally one day.
Thank You Prof. Strang
Thank you Professor Gilbert! May you have students as enthusiastic as your good explanations!
I can't thank you enough for sharing. I love the intuitive approach.
Such a pleasure to watch the explanation of the derivative of the Step function and your explanation of the Sifting Property of the Delta Function. It's very valuable when doing Laplace and Fourier Transforms.
11:41 The intuition behind that equation is brilliant ❤️❤️❤️
Coolest chalkboard I've ever seen. MIT does things right. No writing one equation across the entire room until it's too long to even follow.
The way he explains is amazing.
What an excellent way of teaching such complex concepts. I just wanna give him a hug
I can't believe that in just 1 minute I understood the step function! Best prof ever!! 🌹🌹🌹
Superb explanation of the Heaviside and Delta functions. No textbook I've read mentions the Heaviside function's relation to the Delta function. The common "explanation" is to state that the integral of the Delta function is defined as one!!!
7:26 this is how long it took for me to understand something my lecturer tried and failed to explain over the course of two hours. Wtf. I wish they could clone Gilbert Strang and have him teach in every institution.
Awesome video... 1 year on YouTube and no dislikes. Exceptional.
He has 7 on this video. I fully believe that it is your fault.......... wow
A great conceptual simplification and learning device: a derivative shows when something changes.
Professor Gilbert Strang painstakingly explains the problem with sincerity. Thank you, sir.
God, I love this guy. I feel like he was a better teacher than all my real-life teachers haha
Well, it is MIT. The best for the best, man.
5:47 "If we take derivatives, we get crazyness". I feel ya 100% bruh :D
The Dirac distribution is the Fourier transform of unity, and its sifting property is a special case of convolution: (delta * f)(x) = integral of delta(x - y) f(y) dy = f(x). If we imagine the gravitational interaction as a function g(x) and the electromagnetic interaction as a function f(y), then these forces (i.e. the lines of force) only interact when x equals y (the Dirac impulse).
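For reference, the standard identities behind that first claim, written out in one common convention (my restatement, not from the lecture; the factor of 2π moves around depending on the Fourier convention chosen):

```latex
% with the convention \hat{f}(k) = \int_{-\infty}^{\infty} f(x)\, e^{-ikx}\, dx :
\int_{-\infty}^{\infty} \delta(x)\, e^{-ikx}\, dx = 1 ,
\qquad
\int_{-\infty}^{\infty} 1 \cdot e^{-ikx}\, dx = 2\pi\, \delta(k) ,
\qquad
(\delta * f)(x) = \int_{-\infty}^{\infty} \delta(x - y)\, f(y)\, dy = f(x) .
```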
Such a gifted teacher, unbounded mind.
The best explanation of Delta function ever
1:53 H(t) = 1 if t ≥ 0? In several books it says t > 0. Greetings from my beloved homeland, Perú
It is a conventional thing. You can define it either way and still have the core properties discussed in this video; however, when you define H(0) = 1 instead of H(0) = 0 you get additional convenient properties:
H becomes right-continuous, which matches the convention used for cumulative distribution functions and Lebesgue-Stieltjes integration.
(The symmetry H(-x) + H(x) = 1 for *all* x, by contrast, only holds for the third convention H(0) = 1/2; with H(0) = 1 or H(0) = 0 it fails exactly at x = 0.)
And a few others.
TL;DR: the properties in this video are true regardless of which convention you use, but the convention that H(t) = 1 if t >= 0 has additional nice properties.
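A minimal numerical sketch of the convention point (my own, not from the lecture; it relies on numpy.heaviside, which takes the value at 0 as its second argument):

```python
import numpy as np

x = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])

# Compare the three common choices for H(0).
for h0 in (0.0, 0.5, 1.0):
    H = np.heaviside(x, h0)
    symmetry = np.heaviside(-x, h0) + H   # equals 1 everywhere only when h0 = 0.5
    print(f"H(0) = {h0}: H(x) = {H}, H(-x) + H(x) = {symmetry}")
```

Away from x = 0 the three versions agree, which is why the integral identities in the video don't care which convention you pick.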
Excellent presentation of the topics. Thanks DrRahul Rohtak India
The differential equation y'(t) = ay(t) + delta(t - T) can only hold when t != T since y'(t) is undefined at T. I guess delta functions are a shorthand for that idea?
Theorem: pick any x in [a, b]. The integral of f over [a, b] is independent of f(x).
Proof: consider any partition P of [a, b]. Now add x - 1/n and x + 1/n to that partition to get Q. Since Q is a finer partition than P (formally P is a subset of Q), the undersum-oversum gap of Q is no larger than that of P. But the oversum and undersum of Q equal those of P on every subinterval except in a small neighborhood around x. The contribution of that neighborhood can be made as small as we like by picking n large enough. Let g(y) = f(y) whenever y != x and arbitrary when y = x. Then the difference between the integral of g and the integral of f can be made smaller than any epsilon = 1/n, implying that those two integrals are equal. [The existence of such a g is exactly what I mean by "[...] independent of f(x)".]
But then if the delta function is 0 everywhere except when x = 0 it must have the same integral as the zero function (g(x) = 0 everywhere), which is zero.
Hence the delta "function" cannot be a function. But integrals and derivatives are only defined for functions.
So what goes on here is that we're adding some abstract symbol whose behavior is given by its definition [similar to the way a sequence is not a number, but arithmetic on its limit behaves like arithmetic on the sequence, so it's kinda-sorta like a number].
But we're not told the meaning of all expressions containing the delta. We are not shown the rules of its algebra, and they are not justified.
Note that in the final example, y(t) is discontinuous at T (limit 0 from below and 1 from above) and thus not differentiable at T, i.e. y'(t) is not defined when t = T. So the only sensible meaning I can make of the delta function is my initial statement: it's some abstract token which we use to pretend that y is differentiable everywhere.
In that way I guess it's like the infinity symbol: if two series diverge to positive infinity their sum also diverges to positive infinity, and in that sense (+oo) + (+oo) = (+oo). But note that this doesn't extend neatly to differences: the harmonic series minus itself converges to 0, but the harmonic series minus twice itself goes to (-oo). So is (+oo) - (+oo) equal to 0 or (-oo)? The question has no answer.
To understand algebra with the delta "function" and exactly what is permitted, we would need theorems characterizing it.
And they exist, but are just way more complicated. It's all part of distribution theory.
Actually my theorem is bogus, because if g(x) is arbitrary, it is not guaranteed that the oversum-undersum gap for g can be made to shrink to 0, specifically around x. For example, take the constant function f(x) = 0 and let g(x) = f(x) + 1 exactly when x is rational. By repeated application of my "theorem", g and f integrate equally, but we cannot even integrate g.
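For completeness, here is the compact distribution-theory statement alluded to above (standard definitions, not from the lecture): delta is defined by how it acts on smooth, compactly supported test functions, not by pointwise values, and the derivative of H is taken in that same weak sense.

```latex
% delta acts on test functions; it has no pointwise values:
\langle \delta, \varphi \rangle := \varphi(0)
   \quad \text{for every smooth, compactly supported } \varphi .

% weak (distributional) derivative of the step function:
\langle H', \varphi \rangle
   := -\langle H, \varphi' \rangle
    = -\int_{0}^{\infty} \varphi'(t)\, dt
    = \varphi(0)
    = \langle \delta, \varphi \rangle ,
   \quad \text{so } H' = \delta \text{ as distributions.}
```

Under these definitions every manipulation in the video (the sifting property, H' = delta, the jump in the ODE solution) has a precise meaning, even though delta is not a function.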
Thank you very much for your lecture. I am now writing about ethnomathematics in onion farming using the step function. God bless.
clearly explained, brilliant, what a professor
The Heaviside Step Function and the Dirac Delta Function are both extreme limits of their smoother, slightly more well-behaved counterparts, the Error Function and the Gaussian. The Gaussian is a smooth bell curve, exp(-x^2), and has finite area under it (it famously integrates to the square root of pi). The Error Function is the name given to the function whose derivative is the Gaussian (up to the normalizing factor 2/sqrt(pi)); it has no closed-form expression in x outside of its series expansion. The Error Function looks like a smooth version of the step function, one with a somewhat rounded-off, curved step. In the limit as the full width at half maximum of a unit-area Gaussian shrinks to zero, the Gaussian converges to the Dirac Delta Function, and in the same limit the rescaled Error Function (rising from 0 to 1) converges to the Heaviside Step Function. Since the Gaussian is the derivative of the Error Function (which can be checked from the series expansions of both), it stands to reason that the Dirac Delta Function should be the derivative of the Heaviside Step Function.
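A quick numerical check of that limiting picture (my own sketch, not from the lecture; it uses a unit-area Gaussian of width s and the matching smoothed step 0.5*(1 + erf(x/s))):

```python
import numpy as np
from math import erf, sqrt, pi

def gaussian(x, s):
    # Unit-area Gaussian of width s; exactly the derivative of smooth_step below.
    return np.exp(-(x / s) ** 2) / (s * sqrt(pi))

def smooth_step(x, s):
    # Rises from 0 to 1; tends to the Heaviside step as s -> 0.
    return 0.5 * (1.0 + erf(x / s))

x = np.linspace(-10.0, 10.0, 200001)
dx = x[1] - x[0]
f = np.cos(x)                                   # any smooth test function, f(0) = 1

for s in (1.0, 0.1, 0.01):
    sift = np.sum(gaussian(x, s) * f) * dx      # -> f(0) as the Gaussian narrows
    print(f"s = {s}: integral(gaussian * cos) = {sift:.6f}, "
          f"step at -0.5 / +0.5 = {smooth_step(-0.5, s):.4f} / {smooth_step(0.5, s):.4f}")
```

As s shrinks, the first number approaches cos(0) = 1 (the sifting property) and the smoothed step snaps to 0 on the left and 1 on the right.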
"Dr. Strange" is pure fiction but "Dr. Strang" is REAL; here he is.
He's a real-life SUPER HERO to me!
I have his book on Linear Algebra and have been following him now for decades.
Educators are the actual "super heroes", but never get the recognition they deserve.
South Korea treats (pays) their educators like "Rock Stars", so should we.
Ahhh, this was a great introductory lecture, but I really need to dig into the in-depth math of delta functions without the whole thing diverging into a thesis on distribution theory...
OMG, the integral sign that he draws is perfect!!!
Thank you, Professor Strang!
best prof of all time
A very heart-rending lecture. Inadequate explanation and unsatisfactory.
Love u, Mr. Strang
You are a great teacher.
only a minute in and he's great
This is one of those math things that make my brain happy
I had the absolute worst professor for this class and didn't learn anything from him. I learned differential equations via YouTube University.
Quick question: at 9:09, since the delta function is 0 everywhere except at x = 0, could we then put integral_{-a}^{a} diracdelta(t)f(t)dt = f(0) for any real a > 0?
Or does the region of integration need to be from -infty to +infty?
This is awesome!!!! Thanks, Professor!!
Thank you Dr. Strang.
6:41 Let us not treat the integral of the Dirac delta from -inf to +inf as the Heaviside function; here you will see the uncertainty.
1. The Dirac delta is not 1 at t = 0 (it is 1 there in the discrete-time case, i.e., the Kronecker delta, which is the discrete analog of the Dirac delta).
2. I cannot understand why d/dt(H(t)) at t = t0 equals delta(t0). How, when we do not know the value of the Dirac delta at t0?
3. How is the integral of the Dirac delta from -inf to +inf equal to 1? The Dirac delta is defined as a large value at exactly zero; how do you integrate such a quantity? If I split the integral from the lower limit to 0 and from 0 to the upper limit, the total comes out to 0. I really do not know how to do it, please explain.
I hope this will help u
th-cam.com/video/aPnBZG2y_UM/w-d-xo.html
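One way to make sense of point 3 (a sketch of my own, not from the lecture or the linked video): replace delta by the usual narrow-box approximation of height 1/eps on [0, eps). Its integral is (1/eps)*eps = 1 no matter how small eps is; the statement "integral of delta = 1" refers to the limit of these integrals, not to the integral of the pointwise limit.

```python
import numpy as np

def box_delta(t, eps):
    # Height 1/eps on [0, eps), zero elsewhere: total area is always 1.
    return np.where((t >= 0) & (t < eps), 1.0 / eps, 0.0)

t = np.linspace(-1.0, 1.0, 2_000_001)
dt = t[1] - t[0]
for eps in (0.1, 0.01, 0.001):
    print(eps, np.sum(box_delta(t, eps)) * dt)   # stays ~1 as eps shrinks
```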
Amazing teacher
For the last differential equation: I can't get why y = 0 when t is smaller than T. If I substitute y = exp(a(t-T)) into the equation, I still get equality when t < T, and a mismatch only at t = T.
Old question, but cannot resist answering ;-)
Yes, you do get equality upon substitution of the exponential into the differential equation, but you violate the initial condition y(0)=0. (Strang is assuming that the time of the 'kick' t=T actually occurs after t=0, so T>0.)
For t < T the equation is just y' = ay with y(0) = 0, and the only solution of that is y(t) = 0.
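If it helps, here's a sketch of my own (not Strang's) that checks this numerically: replace delta(t - T) with a tall, narrow pulse of area 1 and step the equation with forward Euler, then compare against the closed form y = 0 before the kick and y = e^{a(t-T)} after it.

```python
import numpy as np

a, T = 0.5, 2.0
eps, dt = 1e-2, 1e-4           # pulse width and time step (pulse spans many steps)
t = np.arange(0.0, 5.0, dt)

def pulse(s):
    # Height 1/eps on [T, T + eps): an area-1 stand-in for delta(s - T).
    return 1.0 / eps if T <= s < T + eps else 0.0

y = np.zeros_like(t)           # initial condition y(0) = 0
for k in range(len(t) - 1):    # forward Euler step for y' = a*y + pulse(t)
    y[k + 1] = y[k] + dt * (a * y[k] + pulse(t[k]))

exact = np.where(t < T, 0.0, np.exp(a * (t - T)))
for s in (1.0, 1.99, 2.5, 4.0):
    k = int(round(s / dt))
    print(f"t = {s}: numeric = {y[k]:.4f}, exact = {exact[k]:.4f}")
```

Before T the numerical solution stays at 0 (there is nothing to grow from), the kick lifts it to about 1, and from then on it tracks e^{a(t-T)}.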
thank you
I hope I'm like this when I get old.
Sir, H is not differentiable at t = 0, so why are we considering that point? And if H were a step function which jumped to, say, 2, would the integral then have equaled 2?
To be honest, I never knew this relation between the step function and the delta function, even back in my college days.
This is really great.
Great explanation!
For the example starting at 11:21, why can't y(t) = e^(at) - 1 up to t = T? That also satisfies the initial condition, does it not? Isn't 0, which Strang says is the value of y up to t = T, just the case a = 0 of the general solution I gave? If so, wouldn't it be better to be consistent, since he doesn't explicitly specify any value of a in the second piece of the solution and wishes to be general with respect to the variable a?
I don't see how that satisfies the range properly. It would only work precisely at t = 0 (the initial condition), but it would be wrong for every other value until t = T, unless I'm completely misunderstanding.
If you substitute y(t) = e^(at) - 1 into the equation, it doesn't satisfy it for all t: you end up requiring a = delta(t - T), which cannot hold.
6:11 let's just appreciate the perfect integration symbol
8:42 is what I wanted... I was sleepy the whole time, and then this moment woke me right up........... WOAAAHHHHHHHHH
Thank you, professor!
I have a question for the community/teacher about something I can't understand.
At @8:57 he has the integral of the Dirac delta function d(t) multiplied by some function f(t), i.e., the integral of d(t) * f(t) * dt, and he concludes that this is just f(0), because the integral of d(t) alone is H(t), which evaluated from -inf to +inf equals 1. I get that 1 times anything is that thing, so only at input 0 can there ever be an output.
However, what I'm not getting is this: shouldn't the answer, instead of f(0), actually be F(0), where F(x) = integral of f(t) dt?
If not, I don't see why the function f(t) doesn't get integrated.
That would have been the case if Int(f*g) = Int(f) * Int(g), but that is not true in general. One way to look at this problem is numerical integration: sum f(i)*g(i) for i = -inf to +inf. Now g is special: it's the Dirac delta, so its value is zero except at i = 0. Hence I can replace f(i) by f(0). Taking f(0) out of the sum, sum of f(i)*g(i) = f(0) * {sum of all the g(i)} = f(0) * 1 = f(0). Hope this helps.
f(t)*d(t) = 0 at all points except t = 0, so the integrand only contributes at 0. But f(t) = f(0) (i.e., constant) at t = 0, so f(0) can be taken out of the integral, giving f(0) * integral[d(t) dt] = f(0) * 1 = f(0).
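A tiny numerical version of the two replies above (my own sketch, not from the lecture): on a grid with spacing h, stand in for delta with a single spike of height 1/h at t = 0; the Riemann sum of f times that spike picks out f(0) directly.

```python
import numpy as np

h = 1e-3
t = np.arange(-5.0, 5.0, h)
f = np.sin(t) + 2.0                       # any test function; f(0) = 2

d = np.zeros_like(t)
d[np.argmin(np.abs(t))] = 1.0 / h         # spike of area h * (1/h) = 1 at t ~ 0

print(np.sum(f * d) * h)                  # ~ 2.0 = f(0): the sifting property
print(np.sum(d) * h)                      # ~ 1.0: the "integral of delta"
```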
thank you so much for that intuitive explanation!!
I am looking for the elementary properties of the delta function from which all its other properties can be deduced. The visualisation of the delta function does not make sense technically.
Great video
Thank you so much
Imagine him going to the bakery and lecturing you: "Imagine I deposit one dollar..." and then continuing with the Heaviside function and the Dirac delta...
this is amazing!!
Awesome video, like the others.
Thank you so much sir
Thanks a lot! God bless you!
I watched this simply because it was free. I have no idea what he was saying.
Great video! 🙏
At minute 13:00, I think it should be y(t=0) = 0, not y(t=T).
Excellent!
thank youuuu.
How can we take the derivative of a discontinuous function?
Your description that the integral of the Delta function equals H(t) is not correct.
If H'(t) = Delta(t) = 0 except at t = 0, then its integral is zero by Lebesgue integration theory and does not equal H(t).
Do you know that the derivative of the Cantor function is zero almost everywhere, and that its integral is also zero?
Yet the Cantor function is an increasing continuous function and not zero.
So how do you justify the relation between H(t) and the Delta function?
Distribution theory is needed.
11:01 me when I try to explain anything
Just out of curiosity: if the step function is defined as 1 for t >= 0, why do we say the slope at t = 0 is infinity? The slope should be 0 at t = 0. Actually, if the function is defined for all values of "t", then there should be no value of "t" where its derivative is anything other than 0. Does my reasoning make any sense?
If there’s no slope, how did the function go from 0 to 1?
@axelnils Well, interesting question, but isn't this why it is called the jump function?
Anyone else watching this during finals week because their prof uses Strang's textbook?
Is there anyone on the internet that can explain how to evaluate a delta function as an indefinite integral? I need it for my differential equations class but I can’t find any videos.
But the derivative does not exist at t=0!!! Why would you say it equals infinity?
big fan
Anyone else think the chalk marks on the right-hand side of the board @8:23 look like a guy in agony? lol
many a great lecturer has failed to adequately illuminate the delta function....but not the illustrious Mr. Strang.
100 % interest rate lol
Remember when finance PhDs were saying "negative interest rates lol"?
They are not losing so much now
super!!!!!
A blackboard in 2015?
Retro.
You say retro, I say timeless. Consider that your computer and phone will be totally obsolete in about 20 years.
control systems.... I almost failed this course
He looks like the cowboy from the Toy Story movie.
wow!!
Click!
Clearly another lecture for engineers🙄.
Yes, and I'm grateful for it
Sir, please send solutions for your Linear Algebra book by Gilbert Strang.
A possible alternative is linear algebra courses that have problem sets with solutions. You can filter for them on this list: ocw.mit.edu/courses/find-by-topic/#cat=mathematics&subcat=linearalgebra. We hope this helps!
bizarre concept
It's awesome!
I'm pretty sure this is not true.
I didn't fully understand it all, guys xd
but I'm going to study it properly, thanks for your attention
Great explanation!