I was waiting for the nice juicy example 🥺
And I was waiting for the "That's a good place to stop". 😉
18:47 Juicy example 🤔
To be fair - there are at least X hours work of research and exploration that could be generated by this single video
@@Alan-zf2tt Amen to that, lol
Unfortunately, 1/(n log(n)) does not work for this test :( as the second derivative does not exist.
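[For anyone checking this claim, a sketch with the natural choice f(x) = x/ln(1/x) for x > 0, f(0) = 0, so that f(1/n) = 1/(n ln n):]
$$f'(0) = \lim_{x \to 0^+} \frac{f(x)}{x} = \lim_{x \to 0^+} \frac{1}{\ln(1/x)} = 0, \qquad f'(x) = \frac{1}{\ln(1/x)} + \frac{1}{\ln(1/x)^2},$$
$$\frac{f'(x) - f'(0)}{x} = \frac{1}{x\ln(1/x)} + \frac{1}{x\ln(1/x)^2} \to \infty \quad (x \to 0^+),$$
so the (one-sided) second derivative at 0 indeed fails to exist.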
@@ironbutterfly3701 Guys, we are all getting superb content for free. And I've had crappy professors; I know exactly what it's like to be left to flounder with little or no guidance. I can count the number of math profs I've had who actually taught well - or even cared about teaching - on the fingers of one hand. I don't know if that's the case in just my country or it's like that elsewhere too, but resources like these are desperately needed, so let's not quibble. Thanks to this channel and others like it, the next gen of math students won't struggle as much as we did, at least not in the same ways we did.
Also, on a more mundane note, most convergence tests don't work for every series. That's why we have so many. Glad to have one more weapon in my arsenal.
@@music_lyrics-ni7ks We totally love the channel; I am also a Patreon supporter (under another name). I was at least trying to predict what a good example would have been :)
I know the author of that paper! He was my learning systems professor in grad school. Interesting to see him show up here!
Yaser Abu-Mostafa? He has some great online lectures on machine learning. They're fun, because he enjoys the cleverness of some of the methods so much that he almost laughs at them ... which makes the audience enjoy them the same way.
(Michael Penn is also very good at remarking on the cleverness of solutions.)
it would be interesting to see converging and diverging series where f''(0) doesn't exist, thereby showing the necessity of the assumption beyond just its role in the proof
@@yardenshani586 those are both cases where f''(0) exists
@@yardenshani586 That wouldn't quite work, though, because f'(0)=1 for the harmonic series.
@@yardenshani586 "where f''(0) doesn't exist"
@@yardenshani586 f''(0) exists in both of the cases you mentioned, so that's not really what @coreyyanofsky was saying at all.
You would need examples where f''(0) does NOT exist (such as a_n = n or n^2), with one of them converging while the other diverges.
Obviously both my examples diverge but I didn't say I knew the right functions 😂
The proof is pretty much as you'd expect but the result is pretty nice. Never seen it before
we never got to see the juicy example :(
Really cool Michael! I'm going to share it with my teacher
Great video. The trick of choosing 0
This is fascinating, thanks for sharing ✨
Professors are hiding this because the test is weak.
This is just the Limit Comparison theorem in disguise.
do not forget to include the assumption on the second derivative in the conditions of the theorem
sounds like a more flexible version of squeezing under 1/n² or above 1/n respectively
I haven't seen it either, thanks for sharing
What if f(0)=0 and f'(0)=0 but f''(0) doesn't exist? That case wasn't quite covered, for example x^p for 1 < p < 2
Small typo at 16:00: he meant lim x->0 f(x), or lim n->∞ f(1/n).
Juicy example?
That is not a good place to stop
On the first direction, could you use Taylor's theorem and say that a_n = O(1/n^2), which converges?
Just saw your comment after having put down mine. Yes, you can, but it needs to be proved with care. See my comment.
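[For reference, a sketch of the Taylor route, assuming only the theorem's hypotheses — f(0) = f'(0) = 0 and f''(0) exists — which is exactly what Taylor's theorem with the Peano remainder requires:]
$$f(x) = f(0) + f'(0)\,x + \tfrac{1}{2}f''(0)\,x^2 + o(x^2) = \tfrac{1}{2}f''(0)\,x^2 + o(x^2) \quad (x \to 0),$$
so there are constants $C$ and $N$ with $|a_n| = |f(1/n)| \le C/n^2$ for all $n \ge N$, and $\sum |a_n|$ converges by comparison with $\sum 1/n^2$.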
Seems pretty clear to me that f(0)=0 implies lim a_n = 0, but I'm interested to see how f'(0)=0 and f''(0) exists implies that it is absolutely convergent.
Very interesting! That immediately shows that the sum of 1/n^s converges if and only if s>1. Maybe you could show that the series converges more quickly depending on how many derivatives of f are 0?
EDIT: Apparently, I wasn't very careful in applying the theorem to 1/n^s. Still, I think the smoother f is near 0, the faster the series should converge.
Look closely, the case with 1 < s < 2 is not covered by the theorem.
@@motoroladefy2740 Why not? We'd have f(x)=x^s, which satisfies f'(0)=0 if and only if s>1.
@@MathFromAlphaToOmega Yes, but when s < 2, f''(0) does not exist.
Oh, oops... Okay, maybe it's not so straightforward.
Not only does it not really work for 1 < s < 2, but it would be a circular argument since the very same fact that 1/n^s converges for 1 < s < 2 is used in the proof.
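[Spelling out the obstruction for 1 < s < 2, a sketch with f(x) = x^s on x ≥ 0 and one-sided derivatives at 0:]
$$f(0) = 0, \qquad f'(0) = \lim_{x \to 0^+} \frac{x^s}{x} = 0 \ \ (s > 1), \qquad \frac{f'(x) - f'(0)}{x} = s\,x^{s-2},$$
which tends to $\infty$ as $x \to 0^+$ when $s < 2$. So $f''(0)$ exists only for $s \ge 2$, and the theorem is silent on $1 < s < 2$.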
My favorite somewhat obscure (not in many calculus books) but easy to prove convergence test is Cauchy's condensation test. en.wikipedia.org/wiki/Cauchy_condensation_test
This one is usually introduced in analysis textbooks and it's quite neat to use on the harmonic series
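[For anyone who hasn't met it: the test says that for a nonincreasing sequence a_n ≥ 0, the sum ∑ a_n converges iff ∑ 2^k a_{2^k} converges. Two quick instances:]
$$a_n = \frac{1}{n}: \quad \sum_k 2^k \cdot \frac{1}{2^k} = \sum_k 1 = \infty \ \Rightarrow\ \text{the harmonic series diverges};$$
$$a_n = \frac{1}{n^2}: \quad \sum_k 2^k \cdot \frac{1}{4^k} = \sum_k 2^{-k} < \infty \ \Rightarrow\ \sum 1/n^2 \text{ converges}.$$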
Looks like the existence of f''(0) can be weakened to boundedness of f'(x)/x for small positive x. Did I miss anything?
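[That weakening looks right; a sketch, assuming f(0) = f'(0) = 0 and |f'(t)| ≤ Mt for 0 < t ≤ δ, using the Mean Value Theorem rather than integrating a possibly non-integrable f': for 0 < x ≤ δ there is some c in (0, x) with]
$$|f(x)| = |f(x) - f(0)| = |f'(c)|\,x \le M c\, x \le M x^2,$$
so $|a_n| = |f(1/n)| \le M/n^2$ for large $n$, and the series converges absolutely by comparison.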
10:50 would it not be another L'Hospital's rule with 0/0? Is it because we only assumed f''(0) to exist and not that f''(x) exists in a neighborhood around 0?
You don't know if the second derivative is continuous, therefore you can't evaluate the limit with f''(x) as x goes to 0+.
@@SimsHacks ahh okay.
For L'Hôpital's rule, the existence of the derivative around 0 is necessary, but existence of the derivative at 0 is not. So the fact that f''(0) exists doesn't help us there. In fact, that's another reason we needed f''(0) to exist: it guarantees that f' exists in some neighborhood of 0, not just at 0, so L'Hôpital's rule can be used the first time.
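[To unpack both points — nothing here beyond the video's hypotheses: saying that f''(0) exists means f' is defined on a neighborhood of 0 and]
$$f''(0) = \lim_{x \to 0} \frac{f'(x) - f'(0)}{x} = \lim_{x \to 0} \frac{f'(x)}{x}$$
[using f'(0) = 0. At 10:50 the proof evaluates exactly this limit by definition, so no second application of L'Hôpital is needed, and none would be justified, since nothing is assumed about f'' away from 0.]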
Before looking, I think this amounts to: f(0) is lim a_n and f'(0) is lim n*a_n if the first limit is zero.
I am probably missing something, because the sum of a_n=[n loglog(n+100)]^(-1) should diverge.
EDIT: "where f''(0) exists" (!!!)
Ok then! 😅
Follow-up: from the proof, any such series must converge by direct comparison with n^(-p) for any choice of 1 < p ≤ 2.
You actually showed that it is enough if the one-sided second derivative at 0 exists (and both the function and its one-sided first derivative vanish at 0). In particular, f(x) need only be defined for x>=0. That might be useful in some examples.
Yes, another way of seeing it is that you can always just set f(x) = 0 for all negative x.
@@shirou9790 This might not be twice differentiable at 0 then, and as the proof shows you don't need that.
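[A concrete instance of that caveat — a hypothetical example, not from the video: extend f(x) = x^2 by zero to the left, i.e. f(x) = x^2 for x ≥ 0 and f(x) = 0 for x < 0. Then f'(0) = 0, but]
$$f'(x) = \begin{cases} 2x, & x > 0 \\ 0, & x \le 0 \end{cases}, \qquad \lim_{x \to 0^+} \frac{f'(x)}{x} = 2 \ne 0 = \lim_{x \to 0^-} \frac{f'(x)}{x},$$
so the two-sided f''(0) does not exist, even though the one-sided hypotheses hold with room to spare.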
Doing everything with one-sided derivatives is almost assumed in this video. Specifically, the powers that Michael is comparing to in a lot of cases have their functions only exist for x>=0. In other words, the theorem as stated would apply to the p-series with terms 1/n^2, but not the p-series with terms 1/n^(3/2), since f(x) = x^(3/2) is only defined for x>=0 and so f'(0) and f''(0) don't exist. If we allow one-sided derivatives, it will apply to all p-series with p>=2 as Michael claimed.
Clever. However, like many clever tricks, it is obvious though nontrivial after a bit of thought. It is essentially the quadratic term of the Taylor expansion, governed by the second derivative, with cautious treatment mindful of the fact that the second derivative may not exist away from 0. The quadratic term is O(1/n^2). The series converges quadratically and of course absolutely. It is not a very strong test of absolute convergence.
Good observation. Makes more sense now (and shows why it's not too useful).
In fact, thinking a bit more, it seems that any convergence you could get just as easily by limit comparison with 1/n^2, and any divergence by limit comparison with 1/n. Unless I'm missing something.
@@DarinBrownSJDCMath Not necessarily. Consider 1/n^{1+a}, a>0, and 1/(n ln n), for example.
@@nightrider1560 Well, sure. I meant any convergence or divergence that *THIS* test would show. I think this test would fail for those as well.
@@DarinBrownSJDCMath Oh, I missed when I wrote my previous comment your earlier phrase "just as easy by". Sorry. Yes, you are exactly right and that is equivalent to what I said.
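[For the record, the equivalence being described — a sketch under the theorem's hypotheses: if f(0) = f'(0) = 0 and f''(0) exists, then]
$$\lim_{n \to \infty} \frac{a_n}{1/n^2} = \lim_{x \to 0^+} \frac{f(x)}{x^2} = \frac{f''(0)}{2},$$
[a finite limit, so convergence follows by limit comparison with ∑ 1/n^2; and if f'(0) = c ≠ 0, then a_n/(1/n) = f(1/n)/(1/n) → c ≠ 0, so divergence follows by limit comparison with the harmonic series.]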
How do you know that you can use f'(x) in your limit? As far as I can see, no assumption about differentiability away from 0 was made.
For the second derivative to exist at 0, we have to have f' defined in an interval around 0.
@@JohnSmith-zq9mo thanks, that makes sense.
Perhaps the reason why your calculus professor did not use a differentiation test for convergence is that she was aiming to use limits to define the derivative and Riemann integration? Using this in that endeavour would be awfully circular. 😏😄
confusing at best
I have a question. When we changed variables in the limit (x = 1/n), doesn't that mean that x still only takes the values 1/n, so it doesn't take on all the values in some region of the domain of the function, and then L'Hôpital's rule may not be applicable? Correct me if I'm wrong.
If I'm not wrong (and I may be): if he proves that the limit of the function exists, then by the Heine definition of the limit, for every sequence in the domain of f that converges to 0, the sequence of function values converges to that same limit. Since 1/n approaches 0, the equality holds. I think if we wished to be rigorous we'd first consider the limit of the function and then state that, by the Heine definition, it equals the sequence limit.
The other comment is great, but a simpler version is: the "as x approaches" limit is a stronger condition than "the sequence approaches", like how limits in multiple variables are stronger than taking sequential limits.
Ahhhh, that's why (I'm more of a physicist, so I needed a simpler version). Thanks to all, though.
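[The fact being invoked, for reference — the sequential criterion for limits, a.k.a. the Heine definition:]
$$\lim_{x \to 0^+} f(x) = L \iff f(x_n) \to L \text{ for every sequence } x_n > 0 \text{ in the domain of } f \text{ with } x_n \to 0.$$
[Only the forward direction is needed here: the function limit is computed with L'Hôpital, and the particular sequence x_n = 1/n then inherits it.]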
Good stuff
neat proof!
In 4 hours Michael's video notched 3,315 views. That seems pretty impressive to me. Well done, Professor Penn!
The example was too juicy.
2nd comment: slaps forehead! Of course, if a discrete math object is extended into infinity, it must have an analogue function in the reals.
Where is Cantor when he is needed :-) And at 7:58 or thereabouts, the equivalence in direction seems a sublime observation.
By that I mean: if a discrete thing tends to infinity and becomes a real-valued-function thing, then that real-valued-function thing can be deconstructed to become discrete things. (Hint: we see it all the time with Euler's number e.)
So all of this happens precisely in and at an event boundary (surface? topological surface?) where a mathematical thing has representations in both finite and infinite realms.
Interim conclusion: Ouch! Michael! That made my head hurt! 🙂
This wasn't hidden. There are likely an infinite number of results that really don't matter that you could show your students. But why and where do you stop? How does this help them understand calculus? How does this make them better problem solvers?
Michael, are you a robot, an atheist, or an alien?