Why {1,x,x²} Is a Terrible Basis

  • Published on 18 Dec 2024

Comments •

  • @MathTheBeautiful
    @MathTheBeautiful  4 years ago +1

    Go to LEM.MA/LA for videos, exercises, and to ask us questions directly.

  • @georgeorourke7156
    @georgeorourke7156 7 years ago +13

    Dear Prof Grinfeld, after this lecture I felt that you were "pulling a fast one on us" when it came to the decomposition of x²+x. When dealing with vectors, out of all the possible inner products we chose the dot product based on geometrical arguments (i.e. the cos(α) did what we needed it to do). On the other hand, when it came to polynomials you just presented one possible inner product, and hence we obtained "orthogonal" polynomials, but these were explicitly linked to the inner product you chose, ∫_{-1}^{+1} p(x)q(x) dx. Here is where I felt a little short-changed: could you comment on other sets of orthogonal polynomials that one could get using a different inner product, and then explain how mathematicians choose among the different orthogonal bases? Thank you, George

    • @MathTheBeautiful
      @MathTheBeautiful  7 years ago +15

      You are hitting the nail right on the head, George!
      First, your initial point: that the geometric inner product was "natural" and the polynomial inner product is "arbitrary". That's exactly right! We saw how great the natural dot product was and then, by extracting its three governing properties (commutativity, distributivity and positive definiteness), generalized it to arbitrary inner products. Yes, the inner product I used in this example is quite arbitrary, as are all of them, by definition. Similarly, most of the bases I used in the examples earlier in the course are arbitrary, again by definition. It is the specific problem that dictates the choice of bases (earlier) and inner products (now).
      When you watch the video that explains Gaussian quadrature, you will see that this particular inner product is natural for that problem. If the limits of integration changed, the inner product would change accordingly. If we were dealing with functions on the unit disc, then ∫_{-1}^{+1} r·p(r)·q(r) dr would be more natural. Chebyshev polynomials make a different choice for other reasons. And on and on. (A small worked sketch of this idea follows this thread.)
      Please let me know if this is helpful.

    • @georgeorourke7156
      @georgeorourke7156 7 years ago +8

      The single greatest thing I learned from mathematics is to know when I don't fully understand something (i.e. most of the time). I'm glad my question was not completely off the mark and I will now try to gain a more profound insight into how one can use different inner products to best adapt to the problem at hand. I will obviously start with your explanation of Gaussian quadrature. Большое спасибо (thank you very much)

    • @MathTheBeautiful
      @MathTheBeautiful  7 years ago +7

      На здоровье! (You're welcome!)
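
A minimal sketch (in sympy) of the construction discussed in this thread: Gram-Schmidt applied to {1, x, x²} under ⟨p, q⟩ = ∫_{-1}^{+1} p(x)q(x) dx recovers the Legendre polynomials up to scaling, and swapping in a different weight function yields a different orthogonal family. The function names here are illustrative, not taken from the video.

```python
import sympy as sp

x = sp.symbols('x')

def inner(p, q, w=sp.Integer(1)):
    """<p, q> = integral over [-1, 1] of w(x) * p(x) * q(x)."""
    return sp.integrate(w * p * q, (x, -1, 1))

def gram_schmidt(polys, w=sp.Integer(1)):
    """Orthogonalize a list of polynomials with respect to the inner product above."""
    basis = []
    for p in polys:
        q = p
        for e in basis:
            q -= inner(p, e, w) / inner(e, e, w) * e   # subtract the projection onto e
        basis.append(sp.expand(q))
    return basis

print(gram_schmidt([1, x, x**2]))                       # [1, x, x**2 - 1/3]: Legendre, up to scale
print(gram_schmidt([1, x, x**2], w=sp.sqrt(1 - x**2)))  # another weight, another orthogonal family
```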

  • @ryanlangman4266
    @ryanlangman4266 2 years ago +2

    I have 3 questions.
    1. I can see why the monomials are a terrible basis in this inner product space, but is there an inner product space where the monomials do form an orthogonal basis? It would probably be a useful inner product for studying Taylor series and analytic functions.
    2. What is the span of the set of all Legendre polynomials? Is it the set of all analytic functions just like the monomials which build Taylor series?
    3. I wouldn't think the Gram-Schmidt process could change the span of the basis vectors, but is it possible when you have an infinite-dimensional vector space?

    • @MathTheBeautiful
      @MathTheBeautiful  2 years ago +2

      All excellent questions, and they warrant separate discussions.
      1. Yes. In any space, take any basis b1, b2, b3, and define the inner product to be such that the basis is orthonormal. This is a valid definition and it defines a unique inner product. For example, what is the inner product of u = u1b1 + u2b2 + u3b3 and v = v1b1 + v2b2 + v3b3? (See the sketch after this reply.)
      2. You always need to be careful when discussing spans in infinite-dimensional spaces with infinite linear combinations. But I think in this case it is safe to say that the span is the same as for monomials since any monomial can be expressed in terms of a finite number of Legendre polynomials.
      3. This question is essentially a generalization of #2, so I would give the same answer for the same reason.
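
A small sketch of the answer to question 1 in the finite-dimensional case: declaring a chosen basis orthonormal pins the inner product down as the ordinary dot product of coordinate vectors taken with respect to that basis. The function name below is made up for illustration.

```python
import numpy as np

def inner_wrt_basis(u_coords, v_coords):
    """<u, v> when u = sum(u_i * b_i), v = sum(v_i * b_i) and {b_i} is declared orthonormal."""
    return np.dot(u_coords, v_coords)

# Example in P2 with the basis b1 = 1, b2 = x, b3 = x^2:
p = np.array([0.0, 1.0, 1.0])   # the polynomial x + x^2
q = np.array([1.0, 0.0, 0.0])   # the polynomial 1
print(inner_wrt_basis(p, q))    # 0.0: under *this* inner product, 1 and x + x^2 are orthogonal
print(inner_wrt_basis(p, p))    # 2.0: ||x + x^2||^2 = 1^2 + 1^2
```

Under this inner product the monomials are, by construction, an orthonormal basis; the trouble discussed in the video is specific to the integral inner product.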

  • @rookiecookie8258
    @rookiecookie8258 3 years ago +3

    Amazing work! Thanks to your lectures I really got the intuition behind these numerical analysis concepts, but I have one question: why do we limit the concept of orthogonality to the interval [-1,1]?

  • @ThemJazzyBeats
    @ThemJazzyBeats 1 year ago

    Around the 3:30 mark, when you give the "magnitude of error" argument to justify why one basis is better than the other, I could not help but think about the concept of continuity.
    It seems that for the first basis, where a small error causes [0,2] to go to [2.5,0], the "function" that associates an error of measurement with the vector's representation in the basis would be "less continuous" than that of the 2nd, orthogonal basis, since for the first basis, if I change the input slightly, it causes vast changes in the output.
    I'm also tempted to say that the first "function" would not be "continuous" at all, since it can happen that small changes swap two zeroes in the output vector's basis representation.
    To my intuition, continuity in this context should keep the 0 components of the vectors where they are and not swap them like the example did.
    I'm not sure if "continuity" is the correct terminology to communicate what I'm trying to say, but this notion just struck me while you were explaining it!
    Thanks for your videos, they are awesome!
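
The instability described in this comment can be quantified: recovering coefficients from inner products of a function with a basis amounts to solving a linear system whose matrix is the basis's Gram matrix, and the amplification of small input errors is governed by that matrix's condition number. A rough numerical sketch (the setup is an illustration, not taken from the video):

```python
import numpy as np

n = 10
i, j = np.meshgrid(np.arange(n), np.arange(n), indexing='ij')
# Gram matrix of the monomials on [-1, 1]: <x^i, x^j> = 2 / (i + j + 1) when i + j is even, else 0.
G = np.where((i + j) % 2 == 0, 2.0 / (i + j + 1), 0.0)

print(np.linalg.cond(G))          # large, and growing rapidly with n: small data errors get amplified
print(np.linalg.cond(np.eye(n)))  # 1.0: the Gram matrix of an orthonormal basis is perfectly conditioned
```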

  • @ozzyfromspace
    @ozzyfromspace 4 years ago +8

    This was so insightful! If ever I construct basis functions (Legendre or something else), this will be an additional reason to perform a Gram-Schmidt process 👏🏽 Funny how I only have a high school degree but I feel like I’m learning sooo much because of educators like you, on YouTube 😭🙏🏽🎊

  • @h2ogun26
    @h2ogun26 2 years ago +2

    Never thought of the approximation perspective! Thanks

  • @ferdinandoinsalata3949
    @ferdinandoinsalata3949 4 years ago +2

    Dear Prof Grinfeld, this was an amazing insight for me in understanding why orthogonal matrices are well-conditioned! Thanks.
    One quick question, if you can help: why did you say that you should have said "x^7 and x^9" instead of "x^7 and x^8" ? Just because x^7 and x^9 are very similar also in the interval (-1,0) or for some deeper reason?

    • @samwhite4284
      @samwhite4284 4 years ago +3

      I believe this was simply because, when considered over the larger range of -1 to +1 (instead of 0 to +1 as in his hand-drawn diagram), x^7 and x^8 do diverge drastically over the negative x values (x^7 goes down and x^8 goes up) :)

    • @ferdinandoinsalata3949
      @ferdinandoinsalata3949 4 years ago +2

      @@samwhite4284 thanks!
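
A quick check of the point in this exchange, under the inner product ⟨p, q⟩ = ∫_{-1}^{+1} p(x)q(x) dx: on [-1, 1], x^7 and x^8 turn out to be exactly orthogonal (the integrand x^15 is odd), while x^7 and x^9 are nearly parallel, which is consistent with the correction to x^9. Sketch in sympy:

```python
import sympy as sp

x = sp.symbols('x')

def cos_angle(p, q):
    """Cosine of the 'angle' between p and q under <p, q> = integral of p*q over [-1, 1]."""
    ip = sp.integrate(p * q, (x, -1, 1))
    return ip / sp.sqrt(sp.integrate(p**2, (x, -1, 1)) * sp.integrate(q**2, (x, -1, 1)))

print(cos_angle(x**7, x**8))         # 0: orthogonal on [-1, 1]
print(float(cos_angle(x**7, x**9)))  # about 0.993: nearly parallel
```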

  • @duckymomo7935
    @duckymomo7935 7 years ago +2

    With an orthogonal basis it's easier to find the coefficients of a linear combination.
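
That is the projection formula: with an orthogonal basis {e_i}, each coefficient is simply c_i = ⟨v, e_i⟩ / ⟨e_i, e_i⟩, and no linear system needs to be solved. A short sympy illustration using the x² + x example mentioned above and the orthogonal basis {1, x, x² - 1/3} (Legendre up to scaling):

```python
import sympy as sp

x = sp.symbols('x')
inner = lambda p, q: sp.integrate(p * q, (x, -1, 1))

orthogonal_basis = [sp.Integer(1), x, x**2 - sp.Rational(1, 3)]
v = x**2 + x
coeffs = [inner(v, e) / inner(e, e) for e in orthogonal_basis]   # one projection per coefficient
print(coeffs)   # [1/3, 1, 1], i.e. x^2 + x = (1/3)*1 + 1*x + 1*(x^2 - 1/3)
```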

  • @alexcwagner
    @alexcwagner 4 years ago +1

    What troubles me is that even though {1, x, x^2, ...} is a terrible basis, we still have to specify Legendre polynomials in terms of that basis. So, how do we know we can avoid some computer precision problems with Legendre polynomials if we're going to run into precision problems defining them in the first place?

    • @MathTheBeautiful
      @MathTheBeautiful  4 years ago +1

      Hi Alex, that's a good point. There does seem to be a logical flaw there somewhere. I can't quite put my finger on the reason why, but I think that Legendre polynomials are "clean". Perhaps it's because we did the decomposition symbolically rather than in the context of limited precision.

    • @alexcwagner
      @alexcwagner 4 years ago +1

      @@MathTheBeautiful Thank you for your reply! Also, as long as I'm thanking you for things, I should thank you for your Gaussian Quadrature videos. I'm working on a project for my master's thesis, and I have a 3D integral that I need to calculate repeatedly, so I need it to be fast. I figured that learning how Gaussian Quadrature works would at least be a first step in figuring how to choose my methods, and your videos did the trick. So, thanks!

    • @matthewgraham790
      @matthewgraham790 3 years ago

      Wouldn't a computer program store the polynomial as [c0, c1, c2, ..., cn], where cn is the coefficient of the nth Legendre polynomial, rather than storing the coefficients for each power?

    • @alexcwagner
      @alexcwagner 3 years ago

      @@matthewgraham790 Sure, you can store and manipulate the polynomial that way, but what I meant was when you want to evaluate the polynomial at some specific value. I'm finding that Python/NumPy/SciPy has a lot of functionality to make it easy to deal with many different flavors of orthogonal polynomials, but I suspect that underneath, when you ask it to evaluate it, it falls back to 1, x, x^2, ... and probably uses something like Horner's method to evaluate it.

    • @matthewgraham790
      @matthewgraham790 3 years ago

      @@alexcwagner Isn't the precision problem with 1, x, x^2, ... only about representing a polynomial in a given basis? It only comes about when trying to represent a polynomial in algebraic form as a linear combination of other polynomials in algebraic form. At no point does the polynomial need to be evaluated at any given x, and once the representation in a given basis has been found, the precision problem ceases to exist.
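
On the evaluation question raised above: a Legendre series can be evaluated directly from its Legendre coefficients via the three-term recurrence (n+1)·P_{n+1}(t) = (2n+1)·t·P_n(t) - n·P_{n-1}(t), with no detour through 1, x, x², ... NumPy exposes such an evaluator as numpy.polynomial.legendre.legval; the hand-rolled version below is only a sketch of the idea.

```python
import numpy as np

def legendre_series(coeffs, t):
    """Evaluate sum_n coeffs[n] * P_n(t) using the Legendre recurrence, never monomial powers."""
    t = np.asarray(t, dtype=float)
    p_prev, p_curr = np.ones_like(t), t.copy()          # P_0(t) and P_1(t)
    total = coeffs[0] * p_prev
    if len(coeffs) > 1:
        total = total + coeffs[1] * p_curr
    for n in range(1, len(coeffs) - 1):
        # (n+1) P_{n+1} = (2n+1) t P_n - n P_{n-1}
        p_prev, p_curr = p_curr, ((2 * n + 1) * t * p_curr - n * p_prev) / (n + 1)
        total = total + coeffs[n + 1] * p_curr
    return total

t = np.linspace(-1, 1, 5)
c = [0.5, 1.0, -0.25, 2.0]
print(legendre_series(c, t))
print(np.polynomial.legendre.legval(t, c))   # should match the hand-rolled version
```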

  • @duckymomo7935
    @duckymomo7935 7 years ago +1

    Are two functions said to be orthogonal if their points of intersection lie in the interval over which the orthogonality condition is defined?

    • @MathTheBeautiful
      @MathTheBeautiful  7 years ago

      Actually, no. The answer is in the playlist bit.ly/InnerProducts

    • @duckymomo7935
      @duckymomo7935 7 years ago +1

      oh okay, yea there are many types of orthogonal polynomials. It just depends on your inner product definition.

    • @MathTheBeautiful
      @MathTheBeautiful  7 years ago +1

      Exactly!

  • @iriskanter225
    @iriskanter225 1 year ago +1

    great video! thank you!

  • @ekandrot
    @ekandrot 7 years ago +1

    Do you have a png of the image in your final note available somewhere for download? Does the scaling you used imply that the ones in this chart are orthonormal?

    • @MathTheBeautiful
      @MathTheBeautiful  7 years ago +2

      Yes, I'll provide the PNG. They are not orthonormal: they are scaled so that p_n(1) = 1

    • @ekandrot
      @ekandrot 7 years ago +1

      Thanks!

  • @kadrikocer5021
    @kadrikocer5021 4 years ago +2

    Thank you for this great lecture.

  • @styx4947
    @styx4947 4 years ago +1

    Stable and linear: under small perturbations, small errors turn into linear functions of those small errors. "First-order analysis", in physics speak.

  • @ibi342
    @ibi342 4 years ago

    2:50

  • @MrWandalen
    @MrWandalen 4 years ago +1

    Amazing! Many thanks!

  • @dimitriosmenounos1009
    @dimitriosmenounos1009 7 years ago +1

    The graphic representation of the polynomial functions has nothing to do with the vector space that they create. So IMHO your argument at 4:45 is moot. Actually, the set {1,x,x^2} is the standard basis of the 3-dimensional polynomial vector space and as such is also orthonormal. I guess that makes it the perfect basis?

    • @tracyh5751
      @tracyh5751 7 years ago

      Orthonormal with respect to what inner product? B is certainly not orthonormal with respect to the inner product discussed in the video.

    • @dimitriosmenounos1009
      @dimitriosmenounos1009 7 years ago +2

      Looks like you are right. I admit I haven't done the calculus; however, I have read before that the monomial set is the standard basis for the vector space of polynomials. I have now found 2 references:
      "By definition, the standard basis is a sequence of orthogonal unit vectors. In other words, it is an ordered and orthonormal basis.. There is a standard basis also for the ring of polynomials in n indeterminates over a field, namely the monomials."
      en.wikipedia.org/wiki/Standard_basis
      "In P2, where P2 is the set of all polynomials of degree at most 2, {1, x, x^2} is the standard basis."
      en.wikipedia.org/wiki/Basis_(linear_algebra)
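
A concrete check of the point argued in this thread: "orthonormal" only means something relative to an inner product. Under the coordinatewise dot product (which is what "standard basis" implicitly refers to), {1, x, x^2} is orthonormal; under the video's inner product ⟨p, q⟩ = ∫_{-1}^{+1} p(x)q(x) dx it is not, as its Gram matrix shows. Sketch in sympy:

```python
import sympy as sp

x = sp.symbols('x')
basis = [sp.Integer(1), x, x**2]
# Gram matrix G[i, j] = <basis[i], basis[j]> under the integral inner product on [-1, 1].
gram = sp.Matrix(3, 3, lambda i, j: sp.integrate(basis[i] * basis[j], (x, -1, 1)))
print(gram)   # Matrix([[2, 0, 2/3], [0, 2/3, 0], [2/3, 0, 2/5]]), clearly not the identity
```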

  • @isaackay5887
    @isaackay5887 3 years ago +1

    Ahhh yes, Scientific Computing...where it all comes together.

  • @nhanNguyen-wo8fy
    @nhanNguyen-wo8fy 7 years ago

    Professor! This is really important.
    What is that joke?

  • @InnerMindCreations
    @InnerMindCreations 7 years ago +2

    Unbelievable!

  • @givemeArupee
    @givemeArupee 7 years ago +1

    What a clickbait title!

  • @AroundTheBest
    @AroundTheBest 7 years ago +4

    I appreciate the Trump joke.