This video is undeniable proof that x² = x² if proof need be.
😂😂😂
Nah, idk if that can be conclusively proven, it's too difficult.
Differentiate both sides (product rule): 2x = f(x) * integral(1/f(t) dt, 0->x) + 1/f(x) * integral(f(t) dt, 0->x)
Let F(x) be the integral of f(t) dt from 0 to x. The original equation says F(x) * integral(1/f(t) dt, 0->x) = x^2, so the first integral equals x^2/F(x):
2x = f(x)/F(x) * x^2 + F(x)/f(x)
This is a quadratic in g(x) = F(x)/f(x): g^2 - 2x*g + x^2 = (g - x)^2 = 0, so
g(x) = x
x*f(x) = F(x)
Differentiate both sides: f(x) + x*f'(x) = f(x), i.e.
x*f'(x) = 0
so f'(x) = 0 for every x ≠ 0, and (assuming f is continuous)
f(x) = C
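For anyone who wants to check this mechanically, here's a minimal sympy sketch (variable names are mine, not the video's) verifying the constant solution and the quadratic factorization used above:

```python
# A minimal sympy sketch of the argument above (variable names are mine):
# (1) a constant f = C satisfies int_0^x f * int_0^x 1/f = x^2, and
# (2) the quadratic 2x = x^2/g + g in g = F/f collapses to (g - x)^2 = 0.
import sympy as sp

x, t, C, g = sp.symbols('x t C g', positive=True)

# (1) Check the constant solution against the original equation.
lhs = sp.integrate(C, (t, 0, x)) * sp.integrate(1/C, (t, 0, x))
print(sp.simplify(lhs - x**2))            # -> 0

# (2) Clearing denominators in 2x = x^2/g + g gives g^2 - 2*x*g + x^2 = 0.
print(sp.factor(g**2 - 2*x*g + x**2))     # -> (g - x)**2
```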
Wildest use of LA I have seen in my fucking life
Hi,
"ok, cool": 0:58 , 2:18 , 4:19 ,
"terribly sorry about that " : 3:29 , 4:38 .
These are the statistics we need
I also like that he always says it in the same way haha.
there is no way he says it in the same exact wayy
I suggest a different approach: it is pretty obvious that f[x] must be a power function. Hence make the Ansatz
f[x] = a*x^p. Plug it into the integral equation to get ∫f * ∫1/f = 1/(1-p^2) * x^2. Hence p = 0 must hold, i.e. f[x] = a.
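Spelled out, that 1/(1-p^2) factor is just the two power integrals multiplied together (a minimal check, assuming lower limits of 0 and -1 < p < 1 so both integrals converge):

```latex
\int_0^x a t^p \, dt \cdot \int_0^x \frac{t^{-p}}{a} \, dt
  = \frac{a x^{p+1}}{p+1} \cdot \frac{x^{1-p}}{a(1-p)}
  = \frac{x^2}{1-p^2}
```

Setting x^2/(1-p^2) = x^2 then forces p = 0, as the comment says.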
It's really gratifying how unexpectedly Cauchy-Schwarz came to save our calculus problem. I would like to see it used in a not-so-trivial scenario. Can anybody come up with one?
2:06 Ermmmmm I think you meant to say linearly dependent 🤓
5:09 bro said it again
I leave easter eggs like these in the videos. They're different from typical errors like missing a negative or something and it's interesting whether or not people find them😂
@@maths_505 I can't believe you do that shit to us. 😂 I just thought I didn't hear it right and it must have to be something with my ears.
@@Grecks75 Easter eggs are fun though 😂
Linearly dependent.
thank you
Excellent. Simply wonderful. Beautiful application of LA. Ok cool!
I've tried with the hypothesis f(x) = ax^b and the result was pretty linear:
[a*x^(b+1)/(b+1) + C] * [x^(1-b)/(a(1-b)) + K] = x^2
A lot of equation systems and fun ... --> b = 0, a ≠ 0 --> f(x) = a
Checking: a*x * (x/a) = x^2
Delightfun, oops, delightful
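Here's a sympy sketch of the "equation systems" step (symbols are mine, with b = 0 already substituted): requiring the product to equal x^2 identically forces both integration constants to vanish, leaving f(x) = a.

```python
# Sketch: with b = 0, matching coefficients of F*G - x^2 = 0 in x
# forces C = K = 0, so the antiderivatives must be taken with zero
# integration constants.
import sympy as sp

x, a, b, C, K = sp.symbols('x a b C K')

F = a*x**(b + 1)/(b + 1) + C        # an antiderivative of a*x^b
G = x**(1 - b)/(a*(1 - b)) + K      # an antiderivative of 1/(a*x^b)

residual = sp.expand((F*G - x**2).subs(b, 0))
coeffs = sp.Poly(residual, x).coeffs()      # [a*K + C/a, C*K]
print(sp.solve(coeffs, [C, K]))             # -> [(0, 0)], i.e. C = K = 0
```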
Power functions to the rescue (again)
wouldn't the vectors be linearly dependent if they are collinear?
Assuming, of course, that f is greater than or equal to zero.
1/f
Very interesting problem. The solution is smart as usual, true to your talent.
Linearly independent implies you can describe one function as a linear combination of the other? It's just a mistake, but it happened at least twice in the video ;)
@@AriosJentu I left that as a little easter egg....good for comment farming too😂😂
I request you to do something like the silver ratio or the plastic ratio
When a quantum physicist is bored
Rare miss here, Maths 505. What exactly is your integral equation supposed to mean? I would normally think that it would mean \int^{x} f(t) \int^{t} du/f(u) dt = x^2, which is equivalent to 1/f = (2x/f)' with solution f(x) = C sqrt(x).
If you're simply multiplying the two indefinite integrals together, then yes it's clear f(t)=constant is a solution.
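A quick sympy check of that claim under the nested reading, with lower limits of 0 (my assumption, since the comment leaves them off):

```python
# Sketch: with f(t) = c*sqrt(t), the inner integral of du/f(u) from 0
# to t is 2*sqrt(t)/c, and the outer integral of f(t) times that from
# 0 to x is exactly x**2, matching the comment's solution.
import sympy as sp

x, t, u, c = sp.symbols('x t u c', positive=True)

f = lambda s: c*sp.sqrt(s)
inner = sp.integrate(1/f(u), (u, 0, t))          # -> 2*sqrt(t)/c
outer = sp.integrate(f(t)*inner, (t, 0, x))      # -> x**2
print(sp.simplify(outer - x**2))                 # -> 0
```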
Integral signs should always be accompanied by differentials. Otherwise the expression is nonsense. And that is what the original equation at 0:42 is. I'm sorry to be so blunt, but it is sloppy to leave this off. I deduct points from student exam scores when they do that.
Yes, it is a bit sloppy but not so much more than people (mathematicians, physicists, engineers) sometimes use elsewhere. By the same argument you must admit that y' = y is not a valid expression for a differential equation, because it does not specify that y is supposed to mean a function, it doesn't say of how many variables it is a function and what they are, it doesn't say which of those variables we take the derivative of (if the prime mark is to mean a derivative at all!). And it doesn't specify an initial condition.
For me at least it was clear that the notation "int f" was supposed to mean any fixed antiderivative of a function f (of a single argument x) we are looking for. The antiderivative of f is - in some historical mathematical texts - also written as f^(-1), meaning the (-1)-st derivative of f (not to be confused with the inverse function of f), relating to the fact that integration can be considered the inverse operation of differentiation (by the Fundamental Theorem of Calculus). However, that notation is not in use anymore.
Probably the most controversial thing I see here in the thumbnail image is that, because of the missing "dx", we do not see where the first integration operation ends, and also that we don't see the bounds of integration (i.e. *which* antiderivative to take). But both things were clarified right at the start of the video.
This was fun. Wondering if there's a solution involving the Gamma function (LOL)
I mean it's not really algebra, just a use of the Cauchy-Schwarz inequality applied to integrals; the inequality itself isn't really an algebraic result.
Still cool video
Can you do a video where you derive all of the equations of the gamma merch?
Aight
You’re glossing over the fact that you’re either treating x as a constant, or you’re considering an uncountable family of Hilbert spaces here. The latter is the better way to think of it.
You’re considering the vector space H_x consisting of all square-integrable functions defined on the interval [0,x] for some fixed x. On this space, a natural Hilbert space inner product maps a pair of functions (f, g) to the number obtained by integrating fg over the interval [0,x]. Your calculations show that if f satisfies the given integral equation for all (positive) x, then for each x, there is some \alpha(x) such that f(t)=\alpha(x) for (almost) all t in [0,x]. Of course, but I don’t see why you think that the value of \alpha MUST BE independent of x.
Clearly any constant (nonzero) function IS a solution of the given integral equation, so when you checked the solution, it worked. However, I don’t see why you think your work proves that EVERY solution of that given integral equation is a constant function, precisely because you glossed over the details behind the way in which you used the Cauchy-Schwarz Inequality.
…
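For reference, here is the Cauchy-Schwarz step under discussion, as I understand the video's argument (my reconstruction, not a quote):

```latex
x^2 = \left( \int_0^x \sqrt{f}\cdot\frac{1}{\sqrt{f}}\,dt \right)^{\!2}
    \le \int_0^x f\,dt \cdot \int_0^x \frac{1}{f}\,dt = x^2
```

Equality holds throughout, and equality in Cauchy-Schwarz forces \sqrt{f} and 1/\sqrt{f} to be linearly dependent on [0, x], i.e. f ≡ α(x) a.e. there - which is exactly the α(x) the comment above is questioning.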
Man... You just copied the very same solution development from him
th-cam.com/video/PPOtOIp-pfc/w-d-xo.htmlfeature=shared
@@giorgioripani8469 I didn't know of Dr Peyam's video, hence I could not have copied him. However, I'm unsurprised that he came up with the idea first. Definitely one of the most underrated math YouTubers around.
Bruh, seriously, at this point the proofs u r doing are almost the same as what textbook authors called "proof by magic" 😅
But a very interesting video 😊
First :)
I'm gunna guess there are an infinite number of solutions to that equation... LOL.... Cheers
AsymptoticSum[(-Log[1 - t x]/t)/z /. t -> n/z, {n, 1, z}, z -> Infinity]