The Subtle Reason Taylor Series Work | Smooth vs. Analytic Functions

  • Published May 20, 2024
  • Get Surfshark VPN at surfshark.deals/MORPHOCULAR and enter promo code
    MORPHOCULAR for a Holiday Special offer of 5 extra months for free with the Surfshark One
    package.
    Taylor series are an incredibly powerful tool for representing, analyzing, and computing many important mathematical functions like sine, cosine, exponentials, and so on, but in many ways, Taylor series really shouldn't work as well as they do, and there are functions out there that can't be represented with them. What are these functions? And what's so special about so many of our familiar functions that we can compute them with Taylor series?
    =Chapters=
    0:00 - How to calculate e^x
    4:16 - Surfshark ad
    5:15 - Why Taylor series shouldn't work
    6:54 - A pathological function
    8:25 - Taylor's Theorem
    10:48 - Analytic functions vs. smooth functions
    12:53 - The simplicity of complex functions
    14:10 - The uses of non-analytic smooth functions
    14:53 - See you next time!
    ===============================
    This video was generously supported in part by these patrons on Patreon:
    Marshall Harrison, Michael OConnor, Mfriend.
    ===============================
    CREDITS
    The music tracks used in this video are (in order of first appearance): Icelandic Arpeggios, Checkmate, Ascending, Rubix Cube, Orient
    The track "Rubix Cube" comes courtesy of Audionautix.com
    ===============================
    The animations in this video were mostly made with a homemade Python library called "Morpho". It's mostly a personal project, but if you want to play with it, you can find it here:
    github.com/morpho-matters/mor...
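The opening chapter's idea (computing e^x from its Taylor series) can be sketched in a few lines of Python. This is an illustrative sketch, not the video's code; `exp_taylor` is a hypothetical name.

```python
import math

def exp_taylor(x, terms=30):
    """Approximate e^x by the partial sum of its Taylor series at 0:
    the sum over k of x^k / k!."""
    total, term = 0.0, 1.0  # `term` holds x^k / k!, starting at k = 0
    for k in range(terms):
        total += term
        term *= x / (k + 1)  # update x^k/k! to x^(k+1)/(k+1)!
    return total

print(exp_taylor(1.0))   # converges to math.e
print(exp_taylor(-2.5))  # converges to math.exp(-2.5)
```

Updating the term incrementally avoids computing factorials and large powers separately.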

Comments • 444

  • @morphocular
    @morphocular  4 months ago +220

    Hey all. Just a few clarifications I'd like to make in response to comments I've seen. It seems I've had to do this a lot lately, huh? Nothing gets past you guys :)
    7:04 - Many have taken issue with the piecewise function g(x) being a satisfying example of a function that fails to equal its Taylor series because it's a "gluing" of two completely different functions, so it's natural to expect its Taylor series to behave incorrectly at the join. But the critical trait with this particular piecewise function that makes it different from most others is that the join is truly "seamless": despite being a "gluing" of two "different" functions, it's perfectly smooth (i.e. has derivatives of all orders) at the join point, which is not usually true of most piecewise functions you could construct. Because certainly if a function fails to be smooth at a point, its Taylor series will break there.
    0:05 - I was aware that many calculators do not actually employ Taylor series directly to compute sin, cos, and e^x. My intent there was just to make it as clear as possible that I'm talking about computing these functions at a truly arbitrary input (like a calculator can). In my defense, that's why I used the word "might" in that line, but I was probably asking that word to do too much work, so if I could go back, I'd rewrite that line. Apologies for any confusion!
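For readers who want to poke at this phenomenon themselves, here is a sketch (in Python; `f` is an illustrative name) of the textbook smooth-but-not-analytic function, exp(-1/x) for x > 0 glued seamlessly to 0, which has the same failure mode as the video's g(x):

```python
import math

def f(x):
    """The textbook smooth-but-not-analytic function:
    exp(-1/x) for x > 0, and 0 for x <= 0."""
    return math.exp(-1.0 / x) if x > 0 else 0.0

# Near 0 the function is astronomically flat: every derivative at 0 is 0,
# so its Taylor series at 0 is identically zero -- yet f itself is positive
# for every x > 0, so the series fails to represent f on any interval.
print(f(0.1))   # about exp(-10), ~4.5e-5
print(f(0.01))  # about exp(-100), ~3.7e-44
print(f(-1.0))  # exactly 0
```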

    • @Kapomafioso
      @Kapomafioso 4 months ago +8

      On a related note, a video on how these functions are actually implemented, would be awesome. But also that's more like computer engineering, so maybe not quite suitable for this channel. Anyway, I love your videos!

    • @patrolin
      @patrolin 4 months ago +2

      @@Kapomafioso they are implemented by polynomials - see Handmade Hero Day 440

    • @Sean-of9rs
      @Sean-of9rs 4 months ago

      Well the first point isn't really an issue, as there is the famous Fabius function, which is smooth everywhere and analytic nowhere. I don't know how, but I trust the math that it is.

    • @gigantopithecus8254
      @gigantopithecus8254 4 months ago

      @@Kapomafioso sometimes they use the AGM (arithmetic-geometric mean)

    • @typo691
      @typo691 4 months ago

      Did you mean to write unsatisfying?

  • @maxwellhunt3732
    @maxwellhunt3732 5 months ago +801

    I love Taylor's Theorem. It's one of those results that is so incredibly important, but is not at all obvious at first sight.

    • @anggalol
      @anggalol 5 months ago

      @@deltapi8859 Engineers use it a lot for approximation. Maybe you've already heard about sin(x) ≈ x. That is based on the Taylor series.

    • @leif1075
      @leif1075 5 months ago +6

      Isn't what he said around 6:40 not right? If you "contort" the function, it's no longer the same function anymore; it's no longer e^x except maybe in some small subsection. So isn't that wrong?

    • @joeyshi2114
      @joeyshi2114 5 months ago +1

      @@leif1075 what do you mean? He wanted to look at a different function with the property that a neighbourhood of points around x = 0 is the same as e^x. It illustrates that not all functions are analytic

    • @ExplosiveBrohoof
      @ExplosiveBrohoof 5 months ago +3

      Yeah, I was never taught it when I took calc in high school. I wonder if there's a nice visual proof of it somewhere on YouTube.

    • @sakshamsingh1778
      @sakshamsingh1778 5 months ago

      @@ExplosiveBrohoof There is a YouTube video titled "geometric interpretation of sinx= ..." from the Mathemaniac YouTube channel; you should check it out.

  • @user-ln2ri9nx8u
    @user-ln2ri9nx8u 5 months ago +239

    It's honestly wild how well he can explain these things using the visuals.

  • @angelofdeth94
    @angelofdeth94 4 months ago +277

    One interesting thing about analytic functions is they behave more like "infinite degree polynomials" than a general smooth function. Polynomials are very rigid. If you know the value at n+1 points of a degree-n polynomial, then you know the whole polynomial. So even though it might seem like you could express a lot of different shapes with a degree-4 polynomial, it only takes 5 points to completely pin it down. There's a theorem in complex analysis that says if you know the value of an analytic function at a sequence of points and at a limit point of the sequence, then you know the analytic function everywhere. For example, if you know the values at 1/n for every natural number n, and at 0, then you uniquely determine the analytic function. In retrospect, it's kind of "obvious" that analytic functions would act like infinite-degree polynomials, because that's basically what a power series is.
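The rigidity claim is easy to verify numerically: reconstruct a degree-4 polynomial from 5 of its points and check that the interpolant agrees with it everywhere. A Python sketch (illustrative names, using Lagrange's formula):

```python
def lagrange_eval(xs, ys, x):
    """Evaluate the unique degree-(len(xs)-1) polynomial through the
    points (xs[i], ys[i]) at x, via Lagrange's interpolation formula."""
    total = 0.0
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        basis = 1.0
        for j, xj in enumerate(xs):
            if j != i:
                basis *= (x - xj) / (xi - xj)
        total += yi * basis
    return total

def p(x):
    """A degree-4 polynomial to be reconstructed from 5 samples."""
    return 2*x**4 - 3*x**2 + x - 7

xs = [-2.0, -1.0, 0.0, 1.0, 2.0]   # any 5 distinct points pin p down
ys = [p(x) for x in xs]
# The interpolant agrees with p everywhere, not just at the 5 samples:
print(lagrange_eval(xs, ys, 3.7), p(3.7))
```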

    • @pyropulseIXXI
      @pyropulseIXXI 4 months ago +3

      f(x)= ax^2+bx+c
      Pretty obvious you need n + 1 = 3 points when n = 2; I fail to see the 'insight' here, and these two statements you made are not related at all:
      (1) One might think you could express a lot of different shapes with a degree-4 polynomial;
      (2) but it only takes 5 points to completely pin it down.
      You can express lots of shapes and having 5 points to give a unique curve for those 5 points does not limit the number of shapes one can generate with a degree-4 polynomial; The limit in the shapes one can generate is due to the nature of the polynomial itself, not the fact that having n+1 points gives a unique curve for those given points.
      That is, you could have a unique curve for any set of given points, yet still be able to draw literally any shape, which is why the two statements are literally irrelevant when taken together.

    • @cparks1000000
      @cparks1000000 4 months ago +19

      @@pyropulseIXXI Not sure what you're trying to say.

    • @nielskorpel8860
      @nielskorpel8860 4 months ago +3

      Nice that the theorem exists in the complex plane, but does it also exist on the real line?
      Complex derivatives are much more demanding objects, making complex definitions and theorems much stronger and narrower than their real analogs.

    • @scalesconfrey5739
      @scalesconfrey5739 4 months ago +9

      @@pyropulseIXXI
      "That is, you could have a unique curve for any set of given points, yet still be able to draw literally any shape"
      Your statement is patently false. A unique curve is a unique set of points, and any shape is defined by its points.
      Even if what you meant was that you can define any region with said curve as a boundary, that still means that you can't draw those regions that have a different boundary using that curve.

    • @simenjorissen5357
      @simenjorissen5357 4 months ago +1

      Wow that's actually a really cool result, what's the name of this theorem?
      However, aren't there some conditions that need to be placed on the sequence of points you're evaluating at? Because the way you phrased it, I could pick the sequence (aₙ)ₙ with aₙ = 0 for all n; obviously the limit is also 0. So that would mean that knowing a function at 0 is enough to know the whole function.

  • @billcook4768
    @billcook4768 4 months ago +156

    The crazy thing about analytic functions is that if you know everything that is going on in a “small” region around a point, you understand the entire function.

    • @pyropulseIXXI
      @pyropulseIXXI 4 months ago +4

      Knowing infinite derivatives is not a 'small' region; derivatives tell you how the function changes, so knowing infinite derivatives will obviously give you the exact function
      This is obvious and not crazy at all
      In fact, the moment I learned about derivatives and linear approximations, I instantly knew, via pure intuition, that if I took an 'infinite' amount of derivatives to 'approximate' the function, I would get the exact function but in polynomial infinite series form.

    • @marekkryspin8712
      @marekkryspin8712 4 months ago +32

      ​@@pyropulseIXXI Imho @billcook4768 discusses something different. What is surprising is the amount of information needed for a complete description of an analytic function. Indeed, it is sufficient to know "only" all the derivatives at a point to determine the entire function potentially over the entire real line. This means that a countable amount of local information (focused at a single point) provides a complete description of a function defined on a potentially big domain.

    • @freyc1
      @freyc1 4 months ago +35

      It's so obvious it's only true for a very particular kind of function... As the video explains perfectly. "Pure intuition" is just hasty reasoning, in that case, I'm afraid. @@pyropulseIXXI

    • @scalesconfrey5739
      @scalesconfrey5739 4 months ago +26

      @@pyropulseIXXI
      "In fact, the moment I learned about derivatives and linear approximations, I instantly knew, via pure intuition, that if I took an 'infinite' amount of derivatives to 'approximate' the function, I would get the exact function but in polynomial infinite series form."
      In that case, how do you explain bump functions? The existence of functions which have derivatives of all orders at the origin and yet fail to be analytic shoots your "intuition" out of the water. That's why mathematics relies on proof to determine truth, rather than insight and assumption.

    • @Czeckie
      @Czeckie 4 months ago +1

      It's not crazy. An analytic function is given by its Taylor series, which is described precisely by countably many numbers. Analytic functions are simple; that's why it works. What's crazy is that holomorphic = analytic. Usual proofs don't offer an intuitive reason for it.

  • @jacob_90s
    @jacob_90s 5 months ago +101

    I know this wasn't the primary point of the video, but I just wanted to note this because it's something I was very interested in when I first started programming, and I had a hard time learning it because every first-year calc student would just copy and paste the same damn explanation about Taylor series in every online forum. Most programming math libraries DO NOT use infinite series or continued fractions to calculate elementary functions (the exceptions generally being arbitrary-precision libraries). The issue with them is that they are in general too slow, and oftentimes require the intermediate calculations to be computed at a greater precision than the final result needs to be.
    Instead, when writing the math library, the developers will curve-fit either a polynomial or a rational function, which can compute the function within a certain range to the required level of precision. Additionally, identities are often used to reduce the input to a smaller range so that you don't have to compute values for all possible floating-point inputs; the trig functions are probably the best example of this: sin and cos are defined for all x from -infinity to +infinity, but since they just repeat, you can reduce the input value into the range -2pi to +2pi (depending upon the library, sometimes it will be reduced even further). Similar tricks can be used for exponential and logarithmic functions using the layout of floating-point numbers.
    For anyone who wants to read up more on this, I would suggest
    * Approximations for Digital Computers by Hastings (1955)
    * Computer Approximations by Hart (1968)
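The range-reduction idea above can be sketched in a few lines of Python. For clarity this uses plain Taylor coefficients on the reduced interval, whereas, as the comment says, real libraries fit minimax polynomial or rational approximations; function names are illustrative.

```python
import math

def sin_poly(t):
    """Odd degree-7 Taylor polynomial for sin, accurate to ~1e-4
    on [0, pi/2] (real libraries would use minimax coefficients)."""
    t2 = t * t
    return t * (1 - t2/6 * (1 - t2/20 * (1 - t2/42)))

def sin_reduced(x):
    """Evaluate sin(x) for any x by reducing into [0, pi/2] with
    periodicity and symmetry, then calling the small polynomial."""
    x = math.fmod(x, 2 * math.pi)      # periodicity: sin(x + 2pi) = sin(x)
    if x < 0:
        x += 2 * math.pi               # shift into [0, 2pi)
    if x >= math.pi:                   # antisymmetry: sin(x) = -sin(x - pi)
        return -sin_reduced(x - math.pi)
    if x > math.pi / 2:                # reflection: sin(x) = sin(pi - x)
        x = math.pi - x
    return sin_poly(x)

print(sin_reduced(10.0), math.sin(10.0))  # agree to ~4 decimal places
```

The point of the reduction is that the polynomial only ever sees arguments in [0, pi/2], so a short, fixed-degree fit suffices for every input.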

    • @ryanpitasky487
      @ryanpitasky487 5 months ago +7

      CORDIC is another commonly used algorithm.

    • @Kapomafioso
      @Kapomafioso 4 months ago +6

      Wouldn't it be enough to only consider the interval [0, pi/2] for trig functions? For sin, for example, if the value is between pi/2 to pi, the values are reflected. From pi to 2pi, the values are negative of those between 0 and pi. So the whole curve can be reconstructed from just the interval 0 to pi/2.

    • @johannbauer2863
      @johannbauer2863 4 months ago +3

      If the square root operation has its own instruction you can extend this further: you only need [0, pi/4] and use sin(x)^2 + cos(x)^2 = 1 to fill in the rest
      This was used, for example, by Mario 64 modders IIRC.

    • @TheFrewah
      @TheFrewah 3 months ago

      Fast inverse square root is pretty clever.

    • @samsamson6070
      @samsamson6070 3 months ago

      @@Kapomafioso just [0, pi/4] is enough! (Through reflection of that region of the circle over x = y)

  • @B_u_L_i
    @B_u_L_i 5 months ago +36

    THANK YOU. When I first heard about the Taylor expansions of e, sin and cos, the fact that they can be described by a polynomial exactly was so confusing to me. Like it's so random. But it makes a lot more sense now.

    • @ciceron-6366
      @ciceron-6366 5 months ago +5

      In fact it's not really a polynomial, because it has an infinite number of non-zero coefficients.
      But the infinite sum is equal.

    • @B_u_L_i
      @B_u_L_i 5 months ago +6

      @@ciceron-6366 I really, really don't give a damn.

    • @prod_EYES
      @prod_EYES months ago

      @@B_u_L_i😭

  • @exotic_sphere
    @exotic_sphere 5 months ago +33

    What I personally find surprising is the effectiveness of compactly supported smooth (or continuous) functions, which are not "good" functions if you have the naive idea that the "best" possible functions are the real-analytic ones. They range from being used to show all sorts of approximations in function spaces, while having a topological vector space structure that is extremely non-trivial, to having a dual that in the end is big enough to contain all kinds of weird "functions" people got as solutions of linear PDEs via heuristic methods. Such a beautiful theory. Also, a good showcase of how weird and different the world of complex calculus is.

  • @kingbeauregard
    @kingbeauregard 5 months ago +10

    Taylor Expansions are great. For concept, I recommend this: a given function f(x) is actually built out of a bunch of polynomial terms (ax, bx^2, cx^3, etc) but it does not readily admit to what the coefficients a, b, c, etc are for the various terms. So we need to torture the function into confessing each coefficient. The method of torture that works is taking the derivative the appropriate number of times for a given polynomial term, and then setting x equal to zero. It's brutal and harrowing work, but it's also brutally efficient.
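The "torture" described above can be acted out numerically: differentiate the appropriate number of times (here with a finite-difference stand-in for the derivative) and set x = 0 to extract one coefficient. A Python sketch with illustrative names:

```python
import math

def third_derivative(f, x, h=1e-2):
    """Central finite-difference estimate of f'''(x): the numerical
    stand-in for 'taking the derivative three times'."""
    return (f(x + 2*h) - 2*f(x + h) + 2*f(x - h) - f(x - 2*h)) / (2 * h**3)

# Torture sin into confessing its cubic coefficient: f'''(0) / 3!
c3 = third_derivative(math.sin, 0.0) / math.factorial(3)
print(c3)  # close to -1/6, the x^3 coefficient in sin's Taylor series
```

The same pattern with the k-th derivative and k! recovers any coefficient, which is exactly where the Taylor formula f^(k)(0)/k! comes from.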

  • @MisterTutor2010
    @MisterTutor2010 4 months ago +21

    If anyone tells you that math is boring, just shake it off.

    • @osmosisjohns5650
      @osmosisjohns5650 2 months ago +2

      😂 I see what you did there

  • @kevj8708
    @kevj8708 5 months ago +19

    Just another casual banger video. Very quickly becoming one of my favorite YouTube channels (not just math). Keep it up, you're killing it unbelievably hard.

  • @tylershepard4269
    @tylershepard4269 5 months ago +36

    This is a great video. Sadly we don’t use this method anymore. Bit-shifting and a very accurate representation of log(2) is the efficient route. Extend this concept and add an extra register for complex numbers. It’s been a while (5 years) since I’ve done any assembly in an x86, but if I recall these functions are built right into the hardware essentially.

    • @trogdorbu
      @trogdorbu 4 months ago +3

      I'm not making the connection between this and bit-shifting, although I am familiar with the latter. Can you expand on this?

    • @nuke_clear
      @nuke_clear 4 months ago +5

      @@trogdorbu I am guessing it's about how calculators calculate values of e^x and other such functions at any x.

    • @dkosolobov
      @dkosolobov 4 months ago +3

      This method is more relevant than it seems: Intel made a mistake in their fsin function and programmers had to implement the sine by hand usually using the Taylor expansion and a few tricks. The backward compatibility prevents a simple patch to the issue. See the article "Intel Underestimates Error Bounds by 1.3 quintillion" that explains the problem.

    • @Smitology
      @Smitology 15 days ago

      @@trogdorbu I think it was a comment on the motivation of taylor series established at the beginning of the video, to compute the values of exponential and trigonometric functions digitally

  • @calmkat9032
    @calmkat9032 5 months ago +38

    This is my #1 favorite subject! All of calculus feels like a narrative, but none more so than Taylor series. The way it starts with something as plain as approximating functions, then turns irrational, even transcendental, functions into these weird work-around ratios; it's just such a cool story!
    It even ends what I consider a years-long story arc in math. Since algebra 1, we learned about functions. Then we steered into the seemingly unrelated geometry. Then we alternate with algebra 2 and trigonometry. And it all comes together here at the end of calculus 2, when you turn sin(x) and cos(x) into plain algebra, and vice versa.
    And as a bonus, you learn that the taylor series of cos(x) + i*sin(x) is the same as e^x. Meaning trigonometry, algebra, and calculus all meet here. Add the cornerstone of geometry, pi, by making x=(i*pi), and boom. The one and only e^(i*pi)=-1
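That final step is easy to check numerically: summing the Taylor series of e^z with complex arithmetic at z = i*pi lands on -1. A Python sketch (illustrative names):

```python
import math

def exp_series(z, terms=40):
    """Partial sum of the Taylor series of e^z, using complex
    arithmetic so that z = i*pi can be plugged in directly."""
    total, term = 0 + 0j, 1 + 0j
    for k in range(terms):
        total += term
        term *= z / (k + 1)  # update z^k/k! to z^(k+1)/(k+1)!
    return total

val = exp_series(1j * math.pi)
print(val)  # very close to -1 + 0j: Euler's identity
```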

    • @stephenbeck7222
      @stephenbeck7222 4 months ago +2

      In a sense, Calc 1 and 2 is an adventure in approximating functions. Tangent line approximations are learned early in the course, which is a first order technique. Euler’s method is typically introduced with basic differential equations (which may be reserved for the separate course of differential equations, but the AP Calc BC curriculum does cover it), which is iterating on the first order approach. Then Taylor series come along and extend the first order tangent lines into polynomials of however many degrees you’d care to find. Then you can take more advanced courses and blow it all up with Fourier transforms.

    • @pyropulseIXXI
      @pyropulseIXXI 4 months ago +2

      This is an insane comment, which means I love it
      For one, pi is not the cornerstone of geometry; and Euler's identity/formula is not the meeting of trig, calculus and algebra.
      You are not turning sin(x) or cos(x) into "plain algebra," and the fact you think an infinite polynomial is "plain algebra" shows that you do not understand what algebra is
      Algebra, in its most basic sense, is balance and restoration; just having a variable being _x_ does not make something algebra, or an algebra. Just having variables does not make something an algebra.
      Lastly, cos(x) + i sin(x) is NOT equal to e^x; it is equal to e^(ix)
      This should be utterly obvious, as the graph of e^x is in no way the same as the graph of cos(x) + i sin(x); one isn't even in the same realm as the other, being in the complex plane whilst e^x is obviously in the 'real' plane.

    • @pyropulseIXXI
      @pyropulseIXXI 4 months ago +1

      @@stephenbeck7222 Fourier transforms and Fourier series are not 'more advanced.' It is literally just calculus I material
      I went to college thinking I would be around smart people; turns out, I was the only one who had taught myself calculus before taking the course, and everyone else was an absolute oaf who struggled.
      I was literally coming up with Taylor series on my own in Calc I, and it is utterly obvious and 100% intuitive, so I'm sick and tired of people saying this super obvious stuff is not intuitive.
      I also came up with Fourier series, as it is also utterly obvious.
      In fact, you can literally create any function with any arbitrary functions, provided you can choose the coefficients of those arbitrary functions and have an infinite amount of them.

    • @jasoncampbell1464
      @jasoncampbell1464 4 months ago +2

      @@pyropulseIXXI Yeah, you absolutely have no idea what you're talking about. You're probably good at crunching numbers and memorising rules, and you can get somewhere with that, but that's not what you need to discover knowledge, except by accident. If you've been exposed to even 32 people in your age group and have any of the intuition that basic statistics requires, you'll understand how to set proper expectations of people your age. And yet you came to college expecting that everyone had self-studied calculus, as if it's the only intelligent endeavour. You'll get to funny places for sure, but I'll never be impressed.

    • @pyropulseIXXI
      @pyropulseIXXI 4 months ago

      @@jasoncampbell1464
      People like you are so insecure; I went to UC Berkeley and double majored in physics and mathematics.
      I tell a story of how I derived Taylor series on my own and your reply is "I bet you just memorize rules."

  • @user-yb9ol8sz7o
    @user-yb9ol8sz7o 5 months ago +6

    Brilliant video on the error term in Taylor's Theorem.
    It's NEVER assumed that a Taylor series will converge for all x, though.
    IF it converges to F(x) at a point, the next question is on what interval (what neighborhood of x) it converges to F(x); that is, as you point out, IS there an open interval around x such that the series converges to F(x)?
    As you point out in the video, derivatives provide LOCAL information. Now, Polya's Theorem is a physical way of looking at complex functions using the DIV and CURL operators, and it can let you decide whether a function has a valid Taylor series (that means analytic) in some region/neighborhood using NON-LOCAL information. It's a physical (physics) way of doing Cauchy's Theorem. Amazing really.
    Brilliant video, thank you.

  • @jojo_125
    @jojo_125 3 months ago +3

    My jaw just dropped when I saw the g(x) function! I didn't know there were smooth but not analytic functions and I'm glad to hear that this isn't possible in the complex plane. This shows once again that the complex plane is much nicer and it's just incomplete to work only within real numbers.

  • @NoNTr1v1aL
    @NoNTr1v1aL 5 months ago +8

    Thought you were gonna dive into Schwartz's Theory of Distributions at the end there after you mentioned the bump function and its uses, then I remembered the video title and duration. Maybe it could be the topic of another video. Absolutely brilliant video! Can't wait for the next one.

  • @MagicGonads
    @MagicGonads 5 months ago +4

    The way it is shown to construct analytic functions from other analytic functions is a bit too vague.
    It is true that they form a ring, so addition, subtraction, and multiplication work.
    It's also true that composition works, however you have to carefully consider what happens to the domain where you can easily puncture it making it only piecewise-analytic on the original domain, and it's not as obvious as for the ring where we have the open intersection of the domains as the resulting domain.
    And specifically for division and inversion there are special conditions that need to be met and of course for inversion the domain totally changes.

  • @francescololiva5826
    @francescololiva5826 5 months ago +2

    Yes you're back with a new video! I'm going to watch it now, I know it will be great❤

  • @Kram1032
    @Kram1032 5 months ago +7

    another neat one (similar to the bump function) is the fabius function which has the property that its derivative is two rescaled copies of itself. Normally defined on the unit interval, it's also possible to extend it into a pseudoperiodic form that is positive or negative according to the Thue-Morse sequence. If you don't do so though, it's constantly zero for negative values and constantly one for any value beyond 1, and in between it takes rational values for any dyadic rational input.

  • @Audio_noodle
    @Audio_noodle 4 months ago +3

    Fantastic video: it filled the gaps in understanding I had with Taylor expansion, and kinda explained why Taylor series are such a powerful tool in physics :D

  • @05degrees
    @05degrees 5 months ago +8

    Also, analytic functions are quite rigid; usually you can't arbitrarily define them on several intervals at once (like the case with exp, sin and cos, which automatically define themselves on all of ℝ!), but non-analytic functions allow more freedom.
    Though analytic functions are still not the worst when trying to have one that’s _very much like_ zero even if not exactly zero outside a region: take the ever-present gaussian exp(−x²/2), for many purposes it’s very much zero outside, say, x ∈ [−10; +10]. Taking a larger power of x will make it even better at this, though then we’ll get a function that is less useful in many fields.
    Analytic functions are like “infinite-order polynomials” in a sense. Plain finite-order polynomials, on the other hand, are the top candidate for being the worst rigid class of functions that seems like a nice and large class at first. Has its upsides because of that, though.
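The "very much zero outside a region" claim above is easy to quantify in Python (assuming IEEE double precision):

```python
import math

# exp(-x^2/2) is analytic, so it is never exactly zero on an interval,
# but outside [-10, 10] its values are negligible for most purposes:
print(math.exp(-10**2 / 2))  # exp(-50), on the order of 1e-22

# Raising the power of x sharpens the cutoff dramatically: exp(-x^4/2)
# at x = 10 is exp(-5000), which underflows to 0.0 in double precision.
print(math.exp(-10**4 / 2))
```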

  • @mrtthepianoman
    @mrtthepianoman 4 months ago

    Thank you for making this video! I learned the concepts of smooth and analytic in the context of complex analysis where they are equivalent. As a result, I have always had a hard time remembering what the distinction is. This makes it clear by outlining where they diverge in the real numbers. Well done!

  • @nikkatalnikov
    @nikkatalnikov 4 months ago +2

    Great video, thank you!
    Bump functions are really important in studies of weak solutions / weak derivatives as support functions for distributions.

  • @latarte3931
    @latarte3931 5 months ago +1

    A gem amongst all mathematical channels, thank you for the insights

  • @ciCCapROSTi
    @ciCCapROSTi 4 months ago +2

    Thanks mate, I was fascinated by Taylor series since the first semester of calculus, but forgot a lot since then. Good, concise, informative video.

  • @Steindium
    @Steindium 5 months ago +4

    Awesome video. I always had my doubts with the Taylor series, so it's nice to see a video addressing them. In fact, coincidentally, I was just watching a video on Euler's identity and grunted when it was another proof using the Taylor expansion.

    • @wumbo_dot_net
      @wumbo_dot_net 4 months ago

      I also struggled with them in school, something about the *why* was always missing. I also made a video about Taylor series recently if you’re interested!

  • @TheJara123
    @TheJara123 4 months ago

    Again, wonderful, man!! Helps me out of my personal health pain by reminding me of the wonderful world of math!! Please keep posting often... you have a real, unique gift for explaining complex math concepts!!

  • @billgatesharmikropenls
    @billgatesharmikropenls 4 months ago

    This is quickly becoming my favorite channel

  • @GhostyOcean
    @GhostyOcean 5 months ago +88

    Complex analysis is probably my favorite subject to study. All the nasty things from real analysis get smoothed away.

    • @budderman3rd
      @budderman3rd 5 months ago +5

      Well, the complex numbers are more complete than just the reals.

    • @Tutor-i
      @Tutor-i 5 months ago +1

      What did you take first, complex analysis or real? I can choose to take complex or real next year but don't know which one to choose.

    • @ryanh7167
      @ryanh7167 5 months ago

      ​@@Tutor-i if you are in undergrad, your real analysis course and introductory complex analysis course (probably called something like "functions of complex variables") will be very different courses with a different focus.
      Introductory real analysis courses tend to focus on the basics of set theory, topology of metric spaces, and sequences/series of real numbers/vectors in metric spaces. Sometimes you'll get to derivatives and the beginnings of Riemann integration.
      Introductory complex functions courses tend to focus on the parts of complex analysis which can be handled with standard multivariable calculus. They'll walk you through the standard exponentials of complex functions, the basics of complex polynomials/the fundamental theorem of algebra, and then usually go towards talking about how to handle derivatives and integrals of well-behaved complex functions (functions that are locally equivalent to rotation and scaling in R2).
      Sorry for the novel, but I don't think you should really think of them as being in sequence for each other, because they tend to have a different purpose and focus.

    • @GhostyOcean
      @GhostyOcean 5 months ago +2

      @@Tutor-i I took complex first, but actually I took it concurrently with my intro to proofs class. Guess you could say I was smart enough to still be in the top of the class while learning proofs

    • @6funnys
      @6funnys 5 months ago +3

      Real is definitely more fundamental and will change the way you think about math, but it still kind of depends on your institution. Where I go to school, they offered a functions of a complex variable course that was in between the levels of a calculus course and an analysis course - we did a fair mix of proofs and computations. I absolutely loved that class, and it definitely came before real in the difficulty progression. But if you’ve got a lot of room in your schedule next semester, real is pretty awesome.

  • @ericdculver
    @ericdculver 3 months ago

    Great video! I have known about examples of smooth but not analytic functions for a long time, but I did not know why they failed to be analytic. This was very illuminating.

  • @Procyon50
    @Procyon50 5 months ago +16

    The fact that combinations of analytic functions are also analytic is so cool. This reminds me of how elements of groups stay in the group, when you multiply them together. Are these concepts related?

    • @epicwalrus1262
      @epicwalrus1262 5 months ago +17

      Yes, analytic functions on a given domain form a ring, which is a group with multiplication (but not necessarily division)
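The multiplication in that ring structure is concrete at the level of coefficients: it is the Cauchy product of power series. A quick Python check (illustrative names) that the series of e^x times itself gives the series of e^(2x):

```python
import math

def cauchy_product(a, b):
    """Coefficient list of the product of two power series:
    c_n = sum over k of a[k] * b[n - k]."""
    n = min(len(a), len(b))
    return [sum(a[k] * b[i - k] for k in range(i + 1)) for i in range(n)]

# The series of e^x multiplied by itself should be the series of e^(2x),
# whose n-th coefficient is 2^n / n!:
e_coeffs = [1 / math.factorial(k) for k in range(10)]
prod = cauchy_product(e_coeffs, e_coeffs)
print(prod[3], 2**3 / math.factorial(3))  # both equal 4/3
```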

    • @schweinmachtbree1013
      @schweinmachtbree1013 4 months ago +4

      Things staying in a given set when applying an operation (your 2 examples - 1: the set of analytic functions mapping into itself when applying addition, subtraction, multiplication, composition, etc., and 2: the set of elements of a group mapping into itself when applying the group multiplication) is called "the set being _closed_ under the operation". With this terminology your 2 examples are "(the set of) analytic functions being closed under +, -, *, ∘, etc." and "(the underlying set of) a group being closed under the group multiplication".

    • @schweinmachtbree1013
      @schweinmachtbree1013 4 months ago

      ​@@epicwalrus1262 To clarify a little, a ring _R_ is an abelian group _A_ = ( _A_ ; +, 0, -) (0 and - being the identity element and inverse operation for +) together with an associative multiplication × distributing over it (that is, (a×b)×c = a×(b×c) for all a,b,c in _A_ and also a×(b+c) = a×b + a×c and (a+b)×c = a×c + b×c for all a,b,c in _A_ ). Depending on where rings are being used, × is sometimes required to have an identity element, denoted 1, such rings being called "rings with 1", "rings with identity", or "unital rings".
      An example of a unital ring is yours, analytic functions on a domain _D_ (or any subset of *C* ), for which the standard notation is _R_ = C^ω( _D_ ), with multiplicative identity being the constant function _f_ ( _x_ ) = 1, and an example of a non-unital ring being the analytic functions on a domain _D_ with compact support (using the precise definition of a "domain" in complex analysis: a non-empty connected open subset of *C* ), denoted _R_ = C^ω_K( _D_ ): now the constant function _f_ ( _x_ ) = 1 is excluded by definition (since the support of _f_ is all of _D_ , but _D_ is open so not compact) and _R_ has no multiplicative identity.

    • @drdca8263
      @drdca8263 4 months ago

      @@epicwalrus1262a commutative group under addition, and closed under a multiplication operation, where that multiplication distributes over the addition, and is associative.
      Typically one also requires that there be a multiplicative identity, but I think some people don’t require that? But most people give a different name to the version of the idea without that requirement.

    • @jorgenharmse4752
      @jorgenharmse4752 3 months ago

      @@epicwalrus1262: Extend to meromorphic functions, and then you can do division. (Weierstrass factorisation implies that the field of meromorphic functions on a connected open set 'is' the field of fractions of the ring of holomorphic functions.)

  • @JourneyThroughMath
    @JourneyThroughMath 3 months ago

    As a teacher, I'm familiar with Taylor series, but I have never stopped to consider how they form. Thank you for this video!

  • @yds6268
    @yds6268 5 months ago +25

    Most calculators or computers don't use Taylor's theorem for trig functions and the exponential. It's very inefficient.

    • @ryanpitasky487
      @ryanpitasky487 5 months ago +3

      CORDIC!

    • @justafanoftheguywithamoust5594
      @justafanoftheguywithamoust5594 5 months ago +1

      Then what do they use?

    • @fullfungo4476
      @fullfungo4476 5 months ago +3

      @@justafanoftheguywithamoust5594 Some use lookup tables with further approximation techniques like Newton’s method.

    • @yds6268
      @yds6268 5 months ago

      @@ryanpitasky487 exactly, the CORDIC algorithm, amazing invention
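For anyone curious what CORDIC actually looks like, here is a rough floating-point sketch of the rotation-mode algorithm (real implementations use fixed-point shifts and adds; the function name and iteration count here are illustrative only):

```python
import math

def cordic_sin_cos(theta, n=32):
    """Rotation-mode CORDIC: rotate the vector (K, 0) toward angle theta by
    successive micro-rotations of atan(2^-i). Valid for |theta| <~ 1.74 rad."""
    angles = [math.atan(2.0 ** -i) for i in range(n)]
    # Pre-scale by the accumulated rotation gain: K = prod_i 1/sqrt(1 + 2^-2i)
    K = 1.0
    for i in range(n):
        K /= math.sqrt(1.0 + 2.0 ** (-2 * i))
    x, y, z = K, 0.0, theta
    for i in range(n):
        d = 1.0 if z >= 0 else -1.0          # rotate toward the residual angle
        x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
        z -= d * angles[i]
    return y, x  # (sin(theta), cos(theta))
```

Each micro-rotation costs only a shift and an add in fixed-point hardware, which is why calculators historically favored this over evaluating a Taylor polynomial.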

  • @coaster1235
    @coaster1235 4 months ago +3

    Would also be fun to learn about Padé approximants, and compare the priorities of the two approximations (at an arbitrarily close neighborhood of a point vs over an interval), and why Padé does better at the latter

  • @_unkown8652
    @_unkown8652 5 months ago +2

    Hey morphocular! Huge fan here! I would recommend using a darker colour palette for your vids, because the pink here is a little aggressive 😅

  • @mauisstepsis5524
    @mauisstepsis5524 2 months ago

    This is the most insightful discussion of Taylor series I have seen. Thanks a lot!

  • @jonathanbeeson8614
    @jonathanbeeson8614 4 months ago

    Just wanted to add my thanks and appreciation. My level of mathematical sophistication was well matched by your level of explanation !

  • @blitzkringe
    @blitzkringe 4 months ago

    Thanks, my struggle with the concept of complex analytic functions seemed almost hopeless until YouTube recommended this video to me

  • @mr_underscore7681
    @mr_underscore7681 5 months ago +1

    This is great, love this stuff. Keep it up!

  • @some1rational
    @some1rational 17 days ago

    Damn, I saved this in a playlist and put off watching it until now; I'm glad I got around to it. This is so amazing. As a math minor, some of these concepts eluded me during university, particularly everything after 13:00, but within 20 minutes you literally finally made me understand and truly appreciate analytic, holomorphic functions and their 'equivalence' in complex analysis

  • @v_i_e_w_e_r_405
    @v_i_e_w_e_r_405 5 months ago

    great Xmas gift! thank you!

  • @General12th
    @General12th 5 months ago

    Hi Morph!
    Nice writing!

  • @ElchiKing
    @ElchiKing 4 months ago

    An addition to 12:00: this holds not only for composition, inversion, division, multiplication, addition, and subtraction, but I think also for "most" solutions to equations:
    Suppose we have some equation of the form f(x,y)=0, and let f be analytic if we fix either x or y. Suppose further that we have some solution (x0,y0) such that df/dy(x0,y0) is nonzero. Then the implicit function theorem tells us that we can locally describe the set of solutions around (x0,y0) as the graph of a function g, i.e. near (x0,y0), all points satisfying the equation f(x,y)=0 have the form (x,g(x)). Furthermore, the same theorem also tells us that g is differentiable near x0, and if f is analytic, then so is g (at least in a small neighborhood of x0).
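A tiny numerical illustration of that implicit-function setup (the example f(x, y) = x² + y² − 1 and the base point (0, 1) are my own choice, purely for demonstration):

```python
import math

# f(x, y) = x^2 + y^2 - 1 is analytic in each variable, and df/dy = 2y != 0
# at (x0, y0) = (0, 1), so locally the solution set is a graph y = g(x).
def g(x, y=1.0):
    """Solve f(x, y) = 0 for y near y0 = 1 by Newton's method in y."""
    for _ in range(50):
        y -= (x * x + y * y - 1.0) / (2.0 * y)
    return y

print(g(0.3), math.sqrt(1.0 - 0.3 ** 2))  # both ~0.953939
```

Here the locally defined g recovers the upper unit-circle branch sqrt(1 − x²), which is indeed analytic near x = 0, matching the theorem's conclusion.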

  • @ashie.official
    @ashie.official 5 months ago

    that last line made me chuckle :) good video!!

  • @giacomocasartelli5503
    @giacomocasartelli5503 4 months ago

    Great video, complete and sound explanation of a profound concept

  • @AllemandInstable
    @AllemandInstable 5 months ago

    great video, would have loved to watch these when I was a student taking my first steps in analysis

  • @erikb.celsing4496
    @erikb.celsing4496 a month ago

    This video is completely AMAZING I am so thankful you made it!!

  • @joaopedrodiniz7067
    @joaopedrodiniz7067 5 months ago +1

    Wow, and once again you succeed to amaze me. Congratulations on the amazing video!

  • @Calcprof
    @Calcprof 4 months ago

    The numerical evaluation of transcendental functions is a fascinating field, and there are many subtleties. I particularly like the use of asymptotic but divergent series. Also, rational function approximations can be used, and these can converge "past" (on the other side of) singularities (poles).

  • @Waffle_6
    @Waffle_6 5 months ago

    please never stop making videos

  • @FredericoKlein
    @FredericoKlein 4 months ago +1

    I remember getting really spooked about this in college, thinking that if you could know every derivative of a path, you could calculate its future, and that future information would somehow be hidden in the higher-order derivatives. I kinda forgot about this, but I think it had to do with my incomplete understanding of Taylor's theorem and the limitations of Taylor expansions

  • @eriktempelman2097
    @eriktempelman2097 a day ago

    Thoroughly enjoyed this one ❤❤

  • @tangentfox4677
    @tangentfox4677 4 months ago

    I love that you describe the bump function as useful because it's not too bumpy.

  • @patturnweaver
    @patturnweaver 4 months ago

    wonderful.
    I now see why analytic functions are such a big deal.
    makes it much easier to find an approximating function, for one thing.
    I also see the advantages of working in the complex domain, where:
    if a function has derivatives for all n on an open domain, then the function is analytic.
    if a function is smooth, then it's analytic.
    if a function has a first derivative, then it has a derivative for all n.

  • @tryingintrovert1239
    @tryingintrovert1239 4 months ago

    We learnt to approximate the value of the sine function in our engineering programming class using Taylor series. It was a very interesting experience
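That exercise typically looks something like the following (a hedged sketch, not the commenter's actual coursework):

```python
import math

def taylor_sin(x, terms=10):
    """Partial sum of the Maclaurin series sin(x) = x - x^3/3! + x^5/5! - ..."""
    total, term = 0.0, x
    for n in range(terms):
        total += term
        # next odd-power term via the recurrence t_{n+1} = -t_n * x^2 / ((2n+2)(2n+3))
        term *= -x * x / ((2 * n + 2) * (2 * n + 3))
    return total

print(taylor_sin(1.0), math.sin(1.0))  # agree to roughly machine precision
```

The term-to-term recurrence avoids computing factorials from scratch, which is how such exercises are usually coded in practice.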

  • @that_guy4690
    @that_guy4690 4 months ago

    Thank you for your video. It made me view the Taylor series from a new perspective

  • @dariomartinezmartinez5422
    @dariomartinezmartinez5422 3 months ago

    I absolutely love your videos. I'm studying maths and I find your videos crystal clear; thank you very much for making such good content!

  • @cdenn016
    @cdenn016 5 months ago +6

    As a PhD physicist, I greatly appreciate this point

  • @fireballman31
    @fireballman31 4 months ago

    Great video: presented at the perfect level and reminded me of some of your best videos.

  • @danielbautista7
    @danielbautista7 4 months ago

    Thanks for refining my intuition

  • @michaelthompson5396
    @michaelthompson5396 4 months ago

    great explanation. very helpful.

  • @remekstepaniuk7820
    @remekstepaniuk7820 2 months ago

    I could listen to this guy explain all of mathematics to me. From axioms to complex functions, to topology and geometry ❤

  • @jackpisso1761
    @jackpisso1761 5 months ago

    Very nice presentation. Thank you!

  • @hqTheToaster
    @hqTheToaster 4 months ago

    I have the same notion. I used Taylor Series to approximate how I figure characters should be scaled to each other in Dreams (a game) to mimic how characters are aligned in Smash Bros. Series from 'one canon height structure' to another, but eventually, I had to settle for making a table of possible canon non-Smash Bros. heights to convert to canon Smash Bros. heights simply because the measurements like to fudge each other. Great video!

  • @Jaylooker
    @Jaylooker 5 months ago +3

    Holomorphic functions under more conditions are automorphic forms like modular forms.

    • @MagicGonads
      @MagicGonads 5 months ago +1

      they don't call them conformal maps for nothing

  • @yanceyward3689
    @yanceyward3689 a month ago

    An absolutely wonderful video of a concept I was wrestling with just a few weeks ago.

  • @whatitmeans
    @whatitmeans 4 months ago

    Nice video. I recently found out about the existence of smooth bump functions and how they aren't analytic. But there is much more that could be said about the failure of Taylor expansions:
    I learned that, due to the Identity Theorem, no non-piecewise-defined power series can match a constant value on an interval of nonzero measure, which means that an analytic function just cannot represent a phenomenon with a finite duration (i.e., one having a finite extinction time).
    As an example, the differential eqn:
    x' = -sgn(x) sqrt(|x|), x(0) = 1
    has as its unique solution
    x(t) = 1/4 (1 - t/2 + |1 - t/2|)^2
    which becomes exactly zero for t >= 2.
    No power series can approximate this simple solution for all t.
    This means that no 1st- or 2nd-order linear ODE, nor non-linear ODEs like Bessel's and others with power series solutions, can represent a finite extinction time: to do so, the ODE must have a non-Lipschitz point in time where uniqueness can be broken (so it must admit singular solutions).
    This could have deep meaning in physics: how could you accurately describe what "time" means if your models don't even know when the clock stopped ticking?
    Think about it. More on MathStackExchange, tag [finite-duration]
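The claimed solution is easy to verify numerically with finite differences (a quick sketch; the step size and sample points are arbitrary):

```python
def x_sol(t):
    """x(t) = 1/4 * (1 - t/2 + |1 - t/2|)^2: equals (1 - t/2)^2 for t < 2, and 0 for t >= 2."""
    u = 1.0 - t / 2.0
    return 0.25 * (u + abs(u)) ** 2

def rhs(v):
    """Right-hand side of the ODE x' = -sgn(x) * sqrt(|x|)."""
    sgn = (v > 0) - (v < 0)
    return -sgn * abs(v) ** 0.5

# Central-difference check that x_sol satisfies the ODE, plus the extinction at t = 2
h = 1e-6
for t in (0.0, 0.5, 1.5, 3.0):
    dx = (x_sol(t + h) - x_sol(t - h)) / (2.0 * h)
    print(t, x_sol(t), dx, rhs(x_sol(t)))  # dx matches rhs at each sample
print(x_sol(2.0), x_sol(7.0))  # exactly 0 from t = 2 onward
```

Note x_sol(0) = 1 satisfies the initial condition, and past t = 2 both sides of the ODE are identically zero, which is exactly the finite-extinction behavior no single power series can capture.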

  • @bradzoltick6465
    @bradzoltick6465 4 months ago

    Wonderful presentation.

  • @matteovassallo568
    @matteovassallo568 2 months ago

    Great work!

  • @mustafizurrahman5699
    @mustafizurrahman5699 4 months ago

    Superb, mesmerising... cannot thank you enough for such a lucid explanation

  • @haidarhaidar9092
    @haidarhaidar9092 4 months ago

    Thanks man that's a lot of real wisdom ❤

  • @GrandMoffTarkinsTeaDispenser
    @GrandMoffTarkinsTeaDispenser 5 months ago +1

    Great video thank you.

  • @peamutbubber
    @peamutbubber 21 days ago

    Excellent video!

  • @tomkerruish2982
    @tomkerruish2982 4 months ago +2

    I'd speculate that the reason analytic functions show up so much is because we're using them to solve (relatively simple) differential equations.

  • @frankreashore
    @frankreashore 4 months ago

    Fantastic video. Loved it!

  • @cloud_222
    @cloud_222 5 months ago +1

    Amazing video!

  • @deanrumsby
    @deanrumsby 5 months ago

    Great video and such great animations! :)

  • @tap9095
    @tap9095 5 months ago +2

    One smooth but not analytic function that I like is the Fabius function. It has the properties that f(0)=0, f(1)=1, and all derivatives at 0 and 1 are 0. So it works like the smoothest possible step function. But it's not analytic, so you can't really compute values for arbitrary points. It also has a fun functional differential equation, f'(x)=2f(2x).

  • @anketmohadikar8767
    @anketmohadikar8767 a month ago

    Very well explained, great video ;)

  • @blueheartorangeheart3768
    @blueheartorangeheart3768 3 months ago +1

    I was just explaining to a student how the Taylor series worked, and I realized I had no idea WHY it worked. Then I remembered this video showing up on my timeline

  • @anmoldesai6022
    @anmoldesai6022 3 months ago

    Hey! Love the video. I was also studying Laurent series and wanted to know if it solves exactly the problem you explained. Thanks!

  • @arielwen8040
    @arielwen8040 4 months ago

    explains a lot!!!! thanks!

  • @jimi02468
    @jimi02468 5 months ago +1

    This channel is like 3b1b with a different voice and I love it

  • @thejoojoo9999
    @thejoojoo9999 2 months ago

    Great video!

  • @MattMcIrvin
    @MattMcIrvin 4 months ago

    Bump functions are important in differential geometry--they're what allow us to define maps between any smooth manifold and a set of coordinate charts that are like Euclidean space (like a set of flat maps covering the geography of the round Earth), which don't disturb any of the derivatives of functions on the manifold. The bump functions define the cross-fade from one coordinate chart to another. That's useful for proving things.

  • @Sofialovesmath
    @Sofialovesmath 5 months ago +1

    Amazing!

  • @1timoasif
    @1timoasif 2 months ago

    Need a Complex Analysis video after this one 🙏

  • @ffdm
    @ffdm 5 months ago

    Wow. Your channel is FANTASTIC

  • @droidcelestial
    @droidcelestial 3 months ago

    Morphocular... more like, spectacular!

  • @TheFrewah
    @TheFrewah 3 months ago

    The Mathologer channel has a good video showing how these series work. It's beautiful! Maybe 3Blue1Brown as well. The animations are beautiful and open source

  • @Kwauhn.
    @Kwauhn. 4 months ago +7

    Of course complex functions make everything simpler! Their "complexity" is doing all the heavy lifting 😉

  • @ytbvdshrtnr
    @ytbvdshrtnr a month ago

    Are the inverse functions of analytic functions always analytic too?
    I ask because at 12:18 cos^-1(x) is in the composite function.

  • @Kapomafioso
    @Kapomafioso 4 months ago +1

    Fun fact: sin, cos, and exp are very closely related in an analytic sense; they're just linear combinations of each other for some well-chosen argument. So it is not surprising that all three have the same radius of convergence. On the other hand, analytic functions with poles will only have Taylor series with a finite radius, at most the distance from the point to the nearest pole. e^(-1/x) has a singularity at zero (an essential one, that is, the function hits every possible complex value arbitrarily close to x = 0), so its Taylor series around x = 0 has zero radius of convergence (it only reproduces the zero value, but even that is undefined in the complex sense).
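That degenerate behavior at x = 0 is easy to poke at numerically: the one-sided function e^(-1/x) shrinks faster than any power of x, so every derivative at 0 vanishes and every Taylor partial sum predicts 0 (a quick sketch; the sample points are arbitrary):

```python
import math

def bump(x):
    """Smooth but non-analytic: all derivatives at 0 are 0, yet f > 0 for x > 0."""
    return math.exp(-1.0 / x) if x > 0 else 0.0

# f(h) shrinks faster than ANY power of h as h -> 0+, so f^(k)(0) = 0 for every k ...
for h in (0.1, 0.05, 0.02):
    print(h, bump(h) / h ** 5)  # ratio tends to 0
# ... hence the Maclaurin series is identically 0 -- but the function isn't:
print(bump(1.0))  # e^-1 ~ 0.3679
```

So the Taylor series at 0 converges everywhere, just to the wrong function, which is the real-line face of the essential singularity described above.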

  • @user-qz6zu6ir4e
    @user-qz6zu6ir4e 5 months ago

    thank you so much

  • @Diaming787
    @Diaming787 5 months ago

    Great video. Something from this video got me wondering:
    Consider the set of all possible functions from R --> R.
    We have the elementary/analytic ones and the non-analytic smooth functions mentioned in the video. What about the integral of e^x^2 dx, whose elementary anti-derivative doesn't exist? We also have special functions, which are the solutions to common differential equations. What if we include solutions to the set of all possible ordinary differential equations? Does that "span" the entire set of all functions from R --> R? If not, what does, or is that even possible? How about Weierstrass functions?
    Also, just as non-analytic smooth functions are more common than analytic ones, which of the types of functions listed above is "dense" in this whole function space?

  • @TheIllerX
    @TheIllerX 27 days ago

    This information issue is part of a much larger general question about how much local information can be used to predict the behavior at other points.
    All information about an analytic function is contained at each point of the function.
    A function on the real line constructed where you just pick a random function value for each argument value would be the opposite. The information at each point says absolutely nothing about the value at other points. Most functions are in between those extreme cases.
    It would be interesting to quantify this information dependence between points in some general way.

  • @JeanYvesBouguet
    @JeanYvesBouguet 5 months ago +1

    This is truly one awesome video. I have always loved complex analysis and this might be the best introduction video to analytic functions and its relationship with smooth functions (or C-infinity). I have always been fascinated by this concept of smooth but not analytic. Thank you!

  • @hitarthk
    @hitarthk 3 months ago

    It's so cute that you make sure people don't hate non analytic functions by showing their utility ❤

  • @anvayjain4100
    @anvayjain4100 a month ago

    Gonna watch every sponsored second cuz the rest of the video is worth it.

  • @General12th
    @General12th a month ago

    Are there other bump functions than e^(-1/x)? Is that bump function the best of the bunch?