To motivate the selection of g(x) we see that the numerator of f'(x) is a difference of squares. Hence f'(x) = (x - f(x))(x + f(x))/(something positive). Since we want to show f'(x) > 0, this motivates looking at x - f(x) and x + f(x).
Thank you for this
It’s always frustrating when functions in solutions seem like they came out of a magical hat
Yes the difficulty in solving these problems is not so much in the transformations, but in *seeing* why this is a sensible idea. But the presenter blithely proceeds as if it has fallen out of the sky.
5:26 Note: This step is valid because the denominator is strictly bigger than 0 for all x ∈ (0, ∞), no matter what the value of f(x) is at any point x. Therefore, multiplying the inequality by it does not flip the direction of the inequality.
My solution uses the set A = {x : f(x)=x}. Since f'(x) = 0 for x in A and f' is continuous from the given equation, the set A is discrete. If cx for x>c and we're done, or f(x)
The video was amazing; I got motivation to do tough-looking math myself. Thanks for all the effort that you put in.
Contradiction is so stylish
From what the question asks, I presume there's no hope in hell of finding a closed form solution for f(x).
After some manipulations I was able to integrate the equation and bound the remaining integral to show x-C0 for x>C. Additionally, the left-hand side of our equation is unbounded, which implies f is unbounded. These two facts, together with the continuity of f, give the desired result. My solution also showed that x-C > f + f^3/3. Hence we know that f is asymptotic to (3x)^(1/3).
Interesting what you did. Is it possible to see your calculations?
Day 1 of asking for the Fourier Transform on finite groups.
At 12:01, since we already know that
• g(x) and h(x) are positive as x→infinity
• it implies that -x < f(x) < x
I think we may also need to mention that f’ is continuous which is true since it equals a composition of continuous functions with f and f is continuous since it’s differentiable. With f’ being continuous we can claim that the derivative of the limit of f equals the limit of the derivative of f.
Or is this not required and we can say that anyway, if so why are we allowed to pass through the limit here?
The DE is non-singular for finite x, so the only singularity may be x=infinity.
If |f(x)| <= x, then f'(x) >= 0, i.e. f(x) is non-decreasing. In particular, since f is unbounded, it can only increase to +infinity as x->infinity.
If f(x) > x, there is nothing to show (f may decrease, but only as long as f(x) > x; otherwise we are back to the case |f(x)| <= x). Now let x->infinity. Then to first order f(x) satisfies the differential equation f'(x) = 1/(1+f(x)^2) in the vicinity of infinity, i.e. f(x) + f(x)^3/3 = x + c for some constant c. Thus, to first order, f(x) ~ (3x)^{1/3} as x->infinity, f(x) = o(x) as required, and furthermore f(x) -> infinity as x->infinity.
Yes. A full series expansion in (inverse) powers of x^{1/3} is possible. If I've not messed up, the first terms should be: f(x) ~ (3x)^{1/3} - 1/(3x)^{1/3} + 3/x + O(x^{-5/3})
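The leading-order claim f(x) ~ (3x)^{1/3} is easy to poke at numerically. Below is a quick sketch (my own, not from the video or the thread) that integrates the ODE with a hand-rolled RK4 stepper; the starting value f(2) = 0 is an arbitrary assumption, since the asymptote shouldn't depend on it.

```python
# Numerical sanity check: integrate f'(x) = (x^2 - f^2) / (x^2 (1 + f^2))
# with a basic RK4 stepper and compare against the claimed asymptote (3x)^(1/3).

def fprime(x, f):
    return (x * x - f * f) / (x * x * (1.0 + f * f))

def rk4_solve(x, f, x_end, h=0.05):
    # March from x to x_end with fixed-step classical Runge-Kutta.
    while x < x_end:
        k1 = fprime(x, f)
        k2 = fprime(x + h / 2, f + h * k1 / 2)
        k3 = fprime(x + h / 2, f + h * k2 / 2)
        k4 = fprime(x + h, f + h * k3)
        f += h * (k1 + 2 * k2 + 2 * k3 + k4) / 6
        x += h
    return f

f_end = rk4_solve(2.0, 0.0, 1000.0)          # arbitrary initial value f(2) = 0
asymptote = (3 * 1000.0) ** (1 / 3)
print(f_end, asymptote, f_end / asymptote)   # ratio should be close to 1
```

The ratio lands within a few percent of 1 already at x = 1000, consistent with the O(x^{-2/3}) relative corrections in the expansion.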
Nice. For the editor: please add a drum sound when Michael says oooooooof..
Is there any way to find an explicit expression for f(x)? I suppose as it's a nonlinear differential equation it can be arbitrarily difficult.
I just took the 2023 putnam. I think I got one right!
isn't this technically a differential equation, where we're only concerned with the asymptotic behavior of the solution?
Precisely
17:15 is wrong. The limit of f' doesn't have to exist. A function f that converges to L can have derivative 1 on the sets [n, n + 1/n²] for natural numbers n. You can easily see (because the sum over 1/n² converges) that there is no contradiction with the fact that f converges to L. Obviously f' must become smaller and smaller outside of the intervals [n, n + 1/n²]. But even so, the limit of f' does not exist.
It doesn't ruin the proof, because you can find large enough x outside of the {f' = 1} intervals where f' becomes arbitrarily small. But to write lim f' is just wrong.
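The convergence fact this counterexample leans on — that the total height gained over all the slope-1 intervals [n, n + 1/n²] is finite — is just the convergence of Σ 1/n². A quick numeric check (my own sketch):

```python
# Each bump raises f by at most 1/n^2 (slope 1 over an interval of length 1/n^2),
# so the total rise is bounded by sum 1/n^2 = pi^2/6, and f converges even though
# f' keeps returning to 1.
import math

total_rise = sum(1.0 / (n * n) for n in range(1, 200000))
print(total_rise, math.pi ** 2 / 6)   # partial sums approach pi^2/6 from below
```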
9:23 Some additional assumption is needed, like g' being monotonic. Otherwise you can have an oscillating function.
And I would start by proving that the differential equation we start with has only solutions that don't blow up before x reaches infinity; otherwise none of this makes sense.
Do you mean g being monotonic?
That would make sense, I don’t think we need g’ to be monotonic though
We get g being monotonic because g' always being positive implies that g is monotonically increasing; therefore the limit of its derivative must be 0, since otherwise g would have to oscillate at some points, but g is increasing. This isn’t a rigorous argument, but that’s the intuition for me.
@@Happy_Abe
Yeah, you're right. I actually had g typed there and then I changed it to g', while it was correct with g. Although now I think g' could also be fine, but checking whether g is monotonic should be easier. With monotonic g', g should also be monotonic from some place to infinity. xD Maybe that's why I didn't notice right away how silly it is — because it's actually correct too, which makes it even sillier. Technically it's not really a stronger condition; unless for monotonic g we add the condition "from some place onward", in which case we get a weaker condition, so that would make it better.
Actually f and g should extrapolate to (0, infty), and then I think g could stop being monotonic, because we used the x>1 assumption to prove g'>0. Only later do all of the g's become monotonic from some place to infinity. So it's good to add that we mean monotonic from some place onward, not globally.
@@kokainum yes good last point about from some place and on. We can probably show this about g’ but I think for what we’re ultimately trying to say about f, saying this about g would be more helpful for our proof than g’ even if both are true
Actually the argument that g was monotonically increasing was true for all x in the domain.
But for f, we needed all x > x_0, so there we need it only after some point; so f may not be monotonic at first but becomes so, while g and h are always monotonic.
@11:38 how can you conclude that quickly that the whole thing approaches zero?
He explained in detail why it approaches zero. Did you miss some of his explanations, or didn't you understand his reasoning?
And can we know also the values of f(x) ?
There was a shorter way.
First, no finite limit, done at the end of the video.
Second, f' is within (-1, 1), easy.
Then, pick one x0 > 1. Consider the value of f(x0). 3 cases:
Is it such that -x0 < f(x0) < x0? Notice f'(x0) > 0. No finite limit and f is increasing, so deduce lim f.
Is it f(x0) > x0? Then f'(x0) < 0, and there is x1 such that we're back within the bounds; deduce. I'm even adding x1 < f(x0)...
Is it f(x0) < -x0? g = f + id has g' positive; f gets close to -id, but then f' gets close to 0 and g' close to 1, therefore not 0, and there's x2 such that g(x2) > 0 — back within the bounds.
I think there is a simpler demo by assuming that if f(x) is continuous and is not going to infinity it must be bounded, and then f' ends up being strictly positive.
I might not have understood you correctly, but I don't think that it can work. First of all, not going to infinity does not imply that it has a limit or that it is bounded. For example,
f(x) = x*sin(x)
is a continuously differentiable function on (1,inf) which isn't bounded, and doesn't have a limit.
Moreover, a strictly positive derivative does not contradict boundedness. For example, the function
g(x) = arctan(x)
is continuously differentiable, and its derivative is
g'(x) = 1/(1+x²)
which is strictly positive. Moreover,
-π/2 < arctan(x) < π/2
The subtle thing to consider here is that a monotonic function can approach a finite limit as x approaches infinity while maintaining a strictly positive/negative derivative. It just means that its slope must approach 0 fast enough for the function not to climb to infinity.
A classic example of a monotonic function whose slope approaches 0 but which isn't bounded is ln(x).
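The arctan example above can be sanity-checked directly — derivative strictly positive everywhere, values trapped in (−π/2, π/2). A small sketch:

```python
# arctan has strictly positive derivative 1/(1+x^2) yet stays bounded in
# (-pi/2, pi/2), illustrating that f' > 0 alone never forces unboundedness.
import math

xs = [10.0 ** k for k in range(8)]            # 1, 10, ..., 10^7
derivs = [1.0 / (1.0 + x * x) for x in xs]    # all positive
values = [math.atan(x) for x in xs]           # all below pi/2
print(max(values), math.pi / 2)
```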
You are right
Nevertheless there is a way out: assuming f is bounded leads to a contradiction, as f' stays above a positive number for large x. Then we can look at the mins and maxes of f: at these values, if they exist, f(x) = ±x, and wherever f(x) = 0 we get f'(x) = 1, so f cannot oscillate from negative to positive values (at most one transition from negative to positive).
So f is either always negative or is positive for large x; as f is not bounded, its limit is ±infinity.
I was sure he was going to use difference of squares in the numerator
If, by contradiction, the limit was L where L is finite, then lim f' = lim (x² - L²) / (x²(1+L²)) = 1/(1+L²) > 0.
But lim f' equal to a positive constant means that lim f can't be finite, a contradiction.
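For what it's worth, the limit here works out numerically to 1/(1+L²) — a positive constant (at most 1, not greater than 1), and positivity is all the contradiction needs. A quick check with an arbitrarily chosen L = 2:

```python
# If f(x) -> L finite, the ODE's right-hand side tends to 1/(1+L^2) > 0,
# which is incompatible with f leveling off. Numeric check for L = 2:
L = 2.0
target = 1.0 / (1.0 + L * L)
vals = [(x * x - L * L) / (x * x * (1.0 + L * L)) for x in (1e2, 1e4, 1e6)]
print(vals, target)   # values converge to target = 0.2
```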
You have to worry about the limit not existing or going to negative infinity
My proof is a lot simpler. Assume that the limit is a finite number A; then the derivative of the function must converge to zero as x approaches infinity. However, this cannot be true, given that f'(x) = (x² - f(x)²)/... does not converge to zero. Therefore, a finite limit is impossible, reductio ad absurdum. The limit can only be + or - infinity, and it is easy to figure out that the result must be positive. Therefore, + infinity is the answer.
Isn't the claim at 9:10 false? A counterexample is the function g(x) = sin(x^4)/(x^2+1), for which lim g(x) = 0 but lim g'(x) does not exist
Your comment is correct, it would be possible that lim g'(x) does not exist for x → +∞. However, the calculation presented by Penn shows that the limit of g'(x) for x → +∞ DOES exist, and it is 1 (in the case that g tends to any finite real number G). This is the contradiction.
@@tombalabombaism it is true for this case, but it is not true in general. Pick for example the strictly increasing function g(x) = x + sin(x), then its derivative g'(x) = 1 - cos(x) does not have a limit for x → +∞.
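That counterexample checks out numerically — sampled values of x + sin(x) never decrease, while 1 − cos(x) keeps revisiting values near both 0 and 2 arbitrarily far out. A quick sketch:

```python
# g(x) = x + sin(x) is (weakly) increasing since g'(x) = 1 - cos(x) >= 0,
# yet g' has no limit at infinity: it oscillates between 0 and 2 forever.
import math

xs = [k * 0.1 for k in range(1, 10001)]            # sample grid on (0, 1000]
g = [x + math.sin(x) for x in xs]
monotone = all(b >= a for a, b in zip(g, g[1:]))   # samples never decrease
derivs_far = [1.0 - math.cos(x) for x in xs if x > 500]
print(monotone, min(derivs_far), max(derivs_far))  # g' still spans almost [0, 2]
```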
@paolo: That counterexample doesn't work, since it was shown early in the proof that g is an increasing function.
@@bjornfeuerbacher5514The limit of f' (for an arbitrary f that converges) doesnt have to exist: A function f that converges to L can have derivative 1 in the sets [n , n+ (1/(n^2)) ] for natural numbers n. You can easily see that (because of the fact that the sum over 1/n^2 converges) there is no contradiction to the fact that f converges to L. Obviously f' must become smaller and smaller outside of the intervalls [n , n + (1/(n^2)) ]. But then still thw limit of f' does not exist
@@hakarraji5723 But could such a function even be differentiable everywhere?
This solution is incorrect, I think. "lim_{x-->infty} g(x) = G fails for every finite G, i.e. lim g(x) is non-finite" doesn't really imply lim_{x-->infty} g(x) = +infty, since we don't know for sure that g even converges to anything at infinity. There are gaps here.
Remember that he showed that g was strictly increasing before he got to that part, so it converges to its supremum (either infinity or a finite value).
He acknowledged that the step "g converges implies lim g' = 0" was left undone, but a slight modification of the way he presented things — which really amounts to the same thing — is maybe more obvious. The new order would be to first show that g converging implies g' converges to 1, which he did justify, and then claim that those two are impossible simultaneously. Again, that's not done, but it's a standard kind of Calc argument that should be at least intuitively obvious: you can't be converging to a finite value "at infinity" while simultaneously increasing at something close to a 45 degree angle.
So I agree with you that it's not entirely rigorous or thorough, but I think it's more than adequate to be persuasive as a "proof".
@@mathboy8188 I missed him talking about g being strictly increasing, ok. But it really is possible for g’ to be nonconvergent even though g converges. We can’t have a situation where we constantly approach at a 45 degree angle, but we can have situations where the derivative oscillates violently despite the function values converging. We’re done if we conclude g’ converges to _something_.
I need to stop trying to solve problems off the thumbnail. I keep wanting to see if I can solve it without his help. But then I run into videos like this where the thumbnail only has a differential equation and I think there's going to be some trick to get a nice solution, only to find out the problem in the video is just to show that the solution satisfies a particular condition.
what are the clues that x+f(x) is a good function to explore?
I guess he wanted to prove that x^2 - f(x)^2 > 0, which factors as (x+f(x))(x-f(x)) = g*h, and then prove that each factor is positive.
f’(x)= (x+f(x))(x-f(x))/ something positive
Imagine Catalin Zara is watching this video
Why have you chosen g(x) built in that way?
Wouldn't a straightforward proof by contradiction do the job? Suppose f(x) is bounded; take the limit and show that its derivative is finite (and positive, thus f(x) is always increasing) as x gets arbitrarily big.
Am I missing something?
What you are saying seems correct. Can we assume in this problem that there is an f(x) that satisfies the differential equation? I'm not sure what "let f(x) be such that" means. Does it mean that there exists such an f, or does it mean "if there is such an f, it is unbounded"?
Initially I thought this, too. It doesn't work. All this does is to show that f(x) is unbounded. The term x^2-f(x)^2 could oscillate in sign infinitely often, and the limit would not exist. x^2-f(x)^2 could be negative, and then the limit would be minus infinity.
For this proof, it's crucial to show that x^2-f(x)^2 is always positive asymptotically.
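A numeric illustration of that last point (my own sketch with one arbitrary initial value f(2) = 0, not a proof): integrating forward, x² − f(x)² stays positive throughout the sweep, so f' never changes sign along this trajectory.

```python
# Forward-Euler sweep of f'(x) = (x^2 - f^2) / (x^2 (1 + f^2)) from f(2) = 0,
# tracking the minimum of x^2 - f(x)^2 along the way.
def fprime(x, f):
    return (x * x - f * f) / (x * x * (1.0 + f * f))

x, f, h = 2.0, 0.0, 0.001
min_gap = x * x - f * f
while x < 200.0:
    f += h * fprime(x, f)
    x += h
    min_gap = min(min_gap, x * x - f * f)
print(min_gap)   # stays positive over the whole sweep
```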
But how can f(x) oscillate if its derivative is asymptotically lower bounded by a positive value (which is 1)? @@felipelopes3171
@@felipelopes3171 the oscillations are not allowed due to f'(x) being asymptotically nonnegative, since f' ~ 1/(1 + f^2). That misled me, but I can see it clearly now. Thanks!
The negation of lim_{x\to \infty} f(x) = \infty includes the case where f(x) has no limit at all, e.g. f(x) oscillates and is unbounded. So your contradiction does not work.
But what if the limits are -∞ inside the proofs??? I think u forgot that case...
Do you mean that x is going to -∞? That isn't possible because f, and therefore also g, is defined only on (1;∞). Or do you mean that g(x) is going to -∞? That's not possible either, because the proof shows early on that g is increasing.
I think the final argument only shows that the limit is not finite, but doesn't deal with the case where the limit is negative infinity.
f’ is strictly positive, so f is strictly increasing, and a strictly increasing function cannot have negative infinity as its limit. It was not necessary to treat that case.
Can someone explain to me the claim >-1? Thanks
The 3rd inequality he gets is true for all x in f’s domain:
• the left side is always greater than 2 (due to x^2 >= 1).
• the right side is always negative (due to the right factor being negative and the left one being positive).
• 2 is always strictly greater than any negative number.
Now notice that the 3rd inequality is equivalent to the 1st inequality. Therefore, if the 3rd inequality is true for all x in f’s domain, then the 1st inequality is true as well.
Hope that helped
g' = 1 + (a term > -1), so g' > 0 and g is increasing; x^2 > f^2, so f' > 0 and f is increasing (a limit, finite or infinite, exists); finally, f(inf) cannot equal any finite L, implying that f(x) tends to inf.
Not very much. He could have claimed >-2 and followed the same process. From this inequality, all the other inequalities follow. The domain is (1,oo) so why -1?
@@gp-ht7ug My bad i thought you didn’t understand the arguments not the choice of the inequality.
Let’s call the big square l(x). Imagine he claimed l(x)>-2.
Then g’(x) = 1 + l(x) with l(x) in ]-2,inf[.
If l(x) is in ]-2,-1], then g’(x) <= 0, and the argument that g is increasing breaks. He could claim l(x) > -1 or any bigger lower bound, but why bother when you can just choose -1.
After trying some parametrics, looks like a bottomless pit at zero, and somewhat discontinuous on the left side, ...so range takes precedence for presentation?
Without listening to theocuztics, inhale the fairy dust at zero, and sneeze for the solution.
Correct
18:14 there should be (1+f(x))^2 in the denominator, so finally limit will be 1/(1+2L+L^2). (Just sayin', it has no effect on the conclusion.)
No one talks about why any limits in this problem should even exist in the first place!
I tried to solve it and end-up with this ugly stuff:
On [1;+oo), we can observe that |x²-f(x)²| < x²+x²f(x)², so |f'| < 1 and lim |f(x)|/x < 1 (since -1 < f' < 1); x > a implies |f(x)/x| < r if we choose r and a suitably. And if f were bounded, say |f(x)| < q, then f'(x) > (1-(q/x)²)/(1+q²) > 0.25/(1+q²) for any x > 2q, so f tends toward infinity.
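The first bound in that attempt — |x² − f(x)²| ≤ x² + x²f(x)² for x ≥ 1, hence |f'| ≤ 1 — is at least easy to verify on random inputs. A quick sketch (f here is just an arbitrary real value, not a solution of the ODE):

```python
# For x >= 1: |x^2 - f^2| <= x^2 + f^2 <= x^2 + x^2 f^2, so |f'| <= 1.
# Random sweep over x in [1, 101] and f in [-100, 100]:
import random

random.seed(0)
ok = True
for _ in range(10000):
    x = 1.0 + random.random() * 100.0
    f = (random.random() - 0.5) * 200.0
    if abs(x * x - f * f) > x * x + x * x * f * f + 1e-9:
        ok = False
print(ok)
```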