FAQ
Q1. When expressing trigonometric functions as vectors using the Taylor series (Maclaurin series), differentiation is employed to calculate the coefficients. Doesn't this mean that saying trigonometric functions can be differentiated by the matrix D is a form of circular reasoning?
A1. "D can differentiate" is an intuitive explanation. To be precise, "D is the representation matrix of differentiation," which means that "multiplying D by a vector representing a function yields the vector representing the derivative of that function." Therefore, using differentiation within the Taylor series to calculate the "vector representing the function" does not result in circular reasoning.
Q2. To define the matrix D representing differentiation, polynomials are differentiated. Similarly, to define the vectors 's' and 'c' representing sin x and cos x, differentiation is used within the Taylor series. Doesn't this imply that it is impossible to independently define the "vectors and the matrix" corresponding to "functions and differentiation"?
A2. That depends on the meaning of "independently." In the first half of the video, differentiation is used to explore what kinds of "vectors and a matrix" can correspond to "functions and differentiation." However, since vectors and matrices are merely collections of numbers, the latter half of the video argues that they can also be defined axiomatically. Specifically, for vectors, by defining the whole set of infinite-dimensional column vectors (≈ formal power series), one can explain that the column vectors corresponding to trigonometric functions already exist within this framework. This approach minimizes the impression of relying on arbitrary axiomatic definitions.
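(For the curious, a minimal numpy sketch of A1: D below is written down as a plain array of numbers, with no differentiation performed anywhere in the code, yet multiplying it by the coefficient vector of sin x yields the coefficient vector of cos x. The 10-term truncation and the slot-n = coefficient-of-x^n convention are illustrative choices.)

```python
import numpy as np
from math import factorial

N = 10
# D written down directly as numbers: 1, 2, 3, ... on the superdiagonal.
D = np.diag(np.arange(1.0, N), k=1)

# Maclaurin coefficient vectors (slot n = coefficient of x^n).
s = np.array([(-1) ** (n // 2) / factorial(n) if n % 2 else 0.0 for n in range(N)])       # sin x
c = np.array([(-1) ** (n // 2) / factorial(n) if n % 2 == 0 else 0.0 for n in range(N)])  # cos x

# Multiplying by D sends the sin vector to the cos vector,
# i.e. D "differentiates" without any calculus being performed here.
print(np.allclose((D @ s)[:-1], c[:-1]))  # True
```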
i was questioning this also, now it's clearrr
@@zunda-theorem-en Depending on the context (such as complex analysis), some functions are *defined* as their Taylor series. One could just say: define exp(z) = sum(z^n/n!) for any complex z. This removes any remaining circularity.
This was literally 3 paragraphs in my linear algebra textbook
Normals: *having creepy nightmares*
Zundamon: *having mathematical nightmares*
Normals? Vectors perpendicular to surfaces
Bedtime? BEDTIME IS NOT MORE IMPORTANT THAN ZUNDAMON’S THEOREMS!
How did you know
Homie decided to speak fax
i don't need sleep, i need answers 😂
Amen!!! 🤸🏻♀️🤸♂️🤸🤸🏾
I have a linear algebra exam in 1 hour💀
oof
hope it's going well mate!!!
good luck ahahaaa
lol
I did linear algebra in the first trimester???
Shedding some light on this:
Defining differentiation like this works for functions that are analytic, that is, whose power series representation converges to the function.
The underlying reason is mainly that such a matrix representation requires the vector space to have a countable basis. For example, holomorphic function spaces often admit a basis of polynomials because of the power series for complex functions. For real functions, though, differentiability is a much weaker condition than analyticity. It's cool, and for spaces of polynomials this still works, but be careful applying it to other function spaces: it fails for functions that aren't analytic, and even being infinitely differentiable isn't enough.
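To make that last point concrete, the standard counterexample: f(x) = e^(-1/x^2) for x ≠ 0, with f(0) = 0, is infinitely differentiable, yet f^(n)(0) = 0 for every n. Its Maclaurin coefficient vector is the zero vector even though f is not the zero function, so outside the analytic world the function-to-vector map stops being faithful.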
Indeed, thank you for noting this
the concept of using anime characters as fun math explainers is creative, especially when you choose them to be 2 AI girls, but I will have some explaining to do when my parents / anyone walks in ......
i knew i was meant to learn maths from zundamon
“I just woke up though.”
Mood.
Watching this instead of listening in class
I'm still learning
When peasant students learn the proper way to learn quantum mechanics:
this (me) peasant student who just finished UG quantum seeing this video:😳
I like that infinite here doesn't mean that you've contained an infinite amount of things in a matrix, it just means the matrix can be as big as you need it to be
Well, sort of. If you have any polynomial p of degree n, then you can work with an (n+1) x (n+1) matrix (one slot per coefficient) and you don't need anything infinite. But for sin, the power series/infinite polynomial is genuinely infinite, and you need the whole infinite matrix. In practice, when working with things like this, people use linear maps/transformations instead of matrices, as the two are pretty much the same thing but the former is slightly more general.
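As a minimal sketch of the finite case (numpy assumed; the slot-for-each-power convention follows the video):

```python
import numpy as np

# 2x^3 + x as a vector: slots hold the coefficients of x^0, x^1, x^2, x^3.
p = np.array([0.0, 1.0, 0.0, 2.0])

# 4x4 differentiation matrix: 1, 2, 3 on the superdiagonal.
D = np.diag(np.arange(1.0, 4.0), k=1)

print(D @ p)  # [1. 0. 6. 0.]  ->  1 + 6x^2, i.e. the derivative 6x^2 + 1
```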
Starting my morning with a new video of Zundamon's Theorem
Less than 30 seconds in and I already subscribed, this is the peak I yearn for
I have a strange feeling that this may have some ARG elements.
MATH is an ARG.
Damn, explained a linear algebra course in 13 minutes. Awesome work Zundamon and Metan.
So fascinating, so clear. One of the best math channels. Thank you for your work. I love it
Coincidentally, just yesterday I learned about finding closed forms of linear recursive sequences with generating functions, and surprisingly it's also extremely closely tied to linear algebra. Linear algebra just likes to sneak in everywhere.
I see Zundamon's Theorem, I watch.
Relatable
I love the soundtrack!
Saw this video after my maths exam and am just thankful this was not in my syllabus🗿☠️
By the way, your channel is the best mathematics channel on YouTube to date
Thank you for making a video on something related to my request; it was surprisingly easy to follow even though I'm very rusty on my math.
I like this format of lectures)
Can you do half derivatives with this method???
I would guess that you can’t, because square roots of matrices are not well defined so you couldn’t get D^0.5 , although i could be wrong
It's possible. I used it a while ago and it results in the coefficients of the Grünwald–Letnikov definition.
@@jadude378 Square roots of matrices do exist in certain cases, e.g. for symmetric positive semidefinite matrices. However, they are non-unique in general, so defining half-differentiation by a square root of a matrix requires selecting one of the roots, and it’s not clear which one you should choose. It also requires that the differentiation matrix have a square root in the first place, which is not guaranteed.
Modern Plato
This channel makes me think outside the box
I'd argue they put an infinite box around differentiation.
Wow, so cool! I like your channel and the many interesting math topics ❤
Having just passed my linear algebra exam I feel called out by this 😂
This channel is so fun, somehow
and of course we can raise the matrix to any arbitrary power, like 1/2, to get a fractional derivative.
now calculate the eigenvalues and eigenvectors of this differentiation matrix to perform higher-order differentiation
Eigenvector corresponds to e^(ax) and eigenvalue is a.
Is it doable tho? After all its determinant would be 0 right?
I think the reason D has no inverse may be related to how, when integrating, you have to add a constant at the end. Meaning that when you take a function, differentiate first and then integrate, you get f(x) + C. But when you integrate first and then differentiate, you get f(x) again.
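You can watch exactly that asymmetry with truncated matrices; a sketch (Dp stands in for the term-by-term integration matrix D' from the video):

```python
import numpy as np

N = 5
D  = np.diag(np.arange(1.0, N), k=1)         # differentiation
Dp = np.diag(1.0 / np.arange(1.0, N), k=-1)  # term-by-term integration, no +C

print(np.round(D @ Dp, 3))  # identity except the last diagonal entry (truncation artifact)
print(np.round(Dp @ D, 3))  # identity except the (0,0) entry: the constant term is lost
```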
Functional vector spaces and isomorphisms to R^n are so cool!
The world of matrices is truly beautiful
Who starts a conversation like this, I just woke up!
That formula for D' seems useful for integrating functions that don't have an elementary integral
But you need the Taylor series, and you can already integrate the Taylor series of every function
Still cool though
Another great video! Thank you!
Lilly and Lana: The Matrix
Zundamon and Metan: The Infinite Matrix
Well, getting linear algebra explained by two anime/game characters is a first for me.
But I do not remember ever seeing this before. It makes sense to me that it is possible; I had a harder time finding a good use case where I could apply it.
The D^2 could be very helpful.
So if I one day need to differentiate many very large polynomials super fast, I will think of this video and create D...D^n as sparse matrices. Thanks. 🥰
I might just have a look at some of your other videos. 😂
Will multiplying D' by a vector still give the integrated function but without +C?
With this channel you can study cute Zundamon, English, and math all at once, what a deal (hardcore)
My prediction coming from 2:16 - they will first solve this simple example, and after that they will represent functions using their infinite Taylor series polynomials to generate infinite vectors and then find the infinite matrix that represent d/dx
i'm claiming this channel as my underdog of 2025. i hope to see this channel grow. exponentially, tetratively, even. i will teach bacterial colonies the way of zundamon
oh and i also sent the channel link to my former lin alg professor, i think he'll appreciate it
Writing down infinite matrices on my cheat sheet...
imo, the most obvious extension of this representation would be to functions holomorphic on punctured domains in C, which are locally just sums of x^i for i in Z, i.e. Laurent series (so the matrix extends in all directions on the plane in the same manner).
Alternatively, if you want more generality: given any linearly independent set of differentiable functions, for example the collection {x^i} (technically, in this video the x^i are treated as a Schauder basis, i.e. infinite sums are allowed), linear algebra lets us extend it to a basis using Zorn's lemma. This is the key point, as Schauder bases do not always exist, so one needs to explicitly construct one for every case used. However, in the case of differentiable functions on R, this results in an uncountable basis and hence an uncountably infinite matrix. I'm not aware of a nice Schauder basis for the differentiable functions (there exist ones for smooth functions, or C1, but idk otherwise).
Absolute procrastination is not doing your calculus hw while watching this
Zundamon is more important than sleep at 1 am
"I like lore in anime" mf after watching this video of cute anime girls casually explaining deep math concepts. very wholesome tho ^^
A cool fact about using this "matrix multiplication" method of differentiating is that it can be used to show that d/dx[e^x] = e^x.
Take the Taylor series expansion of e^x, whose nth coefficient is 1/n!. Applying the matrix to this infinite vector shifts every coefficient down by one slot while multiplying it by its index, so each n cancels against the n! and the term becomes 1/(n-1)!, reproducing the original vector exactly.
There are other instances, but I think showing this one would have been a good way to demonstrate how the method can take derivatives of non-polynomial functions.
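A minimal numpy sketch of that check (the 12-term truncation is an arbitrary choice; the last slot is excluded because truncation cuts the series off):

```python
import numpy as np
from math import factorial

N = 12
D = np.diag(np.arange(1.0, N), k=1)
e = np.array([1.0 / factorial(n) for n in range(N)])  # coefficients of e^x

# (D @ e)[n] = (n+1)/(n+1)! = 1/n!, so the vector reproduces itself
# (except the last slot, where the truncation cuts the series off).
print(np.allclose((D @ e)[:-1], e[:-1]))  # True
```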
Hell Zundamon is so cute
Thank You
Who was the first mathematician to discover the representation of the derivative by a matrix?
Great video! A topic I had long only half-understood, and you explained it really well here)
My one question: to obtain the Taylor series (or its variants) you still have to differentiate the function to get the coefficient of x^n, so applying this matrix to, say, trigonometric functions looks slightly... convoluted. It turns out that to take a derivative, we need to... take a derivative.
As noted in the video, this method is used when the function is already represented as a vector, in other words expanded in a basis. In quantum mechanics, for example, this is usually the case. Besides, the function itself may be unknown, and finding it may be precisely your task.
In general, for engineering problems you want to represent everything as matrices; this is critically important.
That's all fascinating, but there is no way I will replace simple polynomial differentiation with anything that has to do with matrices
2:14 This has the same energy as 'I identify myself as an attack helicopter.' Unfortunately, Trump recently banned that :(
Proof that D is not invertible :
Suppose D is the derivative matrix. Since the derivative of any constant equals zero, (1,0,...,0) is a nonzero vector in the null space of D: D(1,0,...,0) = (0,0,...,0). A nontrivial null space means D is not injective (equivalently, it has a dependent or zero column), so D is not invertible.
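A quick numerical illustration of that proof with a truncated D (numpy assumed):

```python
import numpy as np

D = np.diag(np.arange(1.0, 4.0), k=1)
e1 = np.array([1.0, 0.0, 0.0, 0.0])  # the vector representing f(x) = 1

print(D @ e1)                    # [0. 0. 0. 0.] -- a nonzero vector sent to zero
print(np.linalg.matrix_rank(D))  # 3 < 4, so the truncated D is singular
```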
2:07 What!? >:3
UwU
great video
Nope, they're not completely different; you can treat one as the other because the infinite-dimensional matrix and the differential operator represent the same linear map over the set of smooth functions.
I think the set needs to be analytic functions, not smooth ones, smooth is too general
My favorite math visual novel.❤ Is that done with renpy by any chance?
Zundamon can see the future by dreaming. Another nice video!
Not going to lie...
I saw Zundamon and I immediately thought of Ui-mama's Uidamon.
Instead of working on my BSc Math coursework im watching zundamon videos, how wonderful!
In a nutshell this actually makes sense because d/dx(ax^b) = abx^(b-1). Here b is the index of the coefficient in the input vector. Because there is a 0 in front of the incrementing diagonal numbers, each coefficient, after being multiplied by its corresponding power (the ab in abx^(b-1)), is "shifted" back one slot (the (b-1) in abx^(b-1)).
I would be interested in seeing this recast for Fourier series. When I tried to look that up, I did see a note that not all Fourier series can be differentiated. Something about jump discontinuities.
MATlab flashbacks!
i love you both zundamon and metang
Good video!
Day 5 of requesting Zundamon and Metan kiss
Things get serious at 8:00.
That's fantastic!
Now we have an issue: x^(3/2), the square root of x cubed.
The derivative of x^(3/2) is (3/2)x^(1/2), i.e. 3/2 times the square root of x. You can't really represent all possible functions using this matrix, because powers are not limited to integers.
But I loved your idea :D. It's just too silly to use for real problems.
How do you construct the inverse differentiation matrix with a known constant C so that it is basically an integral? Is it by setting d^-1[1][1] = C?
Question - can one express partial differentiation with matrices?
Can one solve non-linear diff. equations with matrices?
This is so peak.
What about integration? Would it be the inverse matrix but with c somewhere?
6:53 the other girls’s face thinking of taylor series lol
A video about arithmetic geometry would be really interesting, maybe height theory.
I'm studying Electrical Engineering. Are you able to derive a matrix that maps the time domain to the Fourier domain, connecting the differentiation matrix to multiplication by iω?
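Not a full derivation, but a minimal numpy sketch of the discrete version of that idea (the DFT matrix is the time-to-frequency map, and in that basis differentiation is just multiplication by iω; the sample count and test signal are arbitrary choices):

```python
import numpy as np

N = 64
x = np.linspace(0.0, 2 * np.pi, N, endpoint=False)
f = np.sin(3 * x)  # band-limited test signal

w = 2 * np.pi * np.fft.fftfreq(N, d=2 * np.pi / N)  # angular frequencies
df = np.fft.ifft(1j * w * np.fft.fft(f)).real       # differentiate as multiplication by iw

print(np.allclose(df, 3 * np.cos(3 * x)))  # True
```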
It's a little hard to get used to listening to these videos in English after coming from only the Japanese versions! :)
So... can we go further and take the matrix exponential of D to get an e-th derivative?
If my skills at calling torch.matrix_exp are to be believed, it looks like the e-th derivative for 5:05 (2x^3+x) is 3+7x+6x^2+2x^3. Which feels kinda strange, as dx = 6x^2+1, dxdx = 12x, dxdxdx = 12.
That's because e^(d/dx) is actually the Shift Operator which acts on functions like so:
e^(d/dx) f(x) = f(x+1)
Useful for modelling translations. Applying this to f(x) = 2x³ + x results in:
2(x+1)³ + (x + 1) =
2(x³ + 3x² + 3x + 1) + x + 1 =
2x³ + 6x² + 6x + 2 + x + 1 =
2x³ + 6x² + 7x + 3
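A quick numerical check of the above (a sketch assuming scipy; since the truncated D is nilpotent, the matrix exponential is exact here up to rounding):

```python
import numpy as np
from scipy.linalg import expm

D = np.diag(np.arange(1.0, 4.0), k=1)  # 4x4 differentiation matrix
p = np.array([0.0, 1.0, 0.0, 2.0])     # 2x^3 + x

print(expm(D) @ p)  # [3. 7. 6. 2.]  ->  3 + 7x + 6x^2 + 2x^3 = f(x+1)
```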
Is this all building up to the Wrongstine (i can't spell)? Deviation and infinite matrices.
If so, I'd like a practical use for it.
Wronskian?
Thank you, very educational
I JUST WOKE UP THOUGH🗣️‼️‼️‼️
Yes but only when the function is analytic at that point. Being differentiable alone does not suffice. Eg. x*|x| at x=0.
This is a little different from the matrix I saw in DMT PARK's video
Can the matrix be generalized to d^n/dx^n?
1:23 pause here ittt cool
Did you use LaTeX to show the matrix?
Aw, no mention of right-inverses?
Can you raise d/dx to the power of i(√-1) ?
Do Berezin calculus next 🤩
This method is limited to the functions that have Taylor expansion at each point. Try this on tan(x). A Taylor series exists for a function if the function is infinitely differentiable at the expansion point, and it accurately represents the function only if the function is analytic at that point.
Just make it entire
I'm a math PhD and I love your content. Let me know if you need help with translating this to more languages, I could help with German and French. Lmk how to get in touch if interested.
Wait... does this work in reverse? I was taught that Integration has no universal method...
Doesn't this only apply for polynomials though? And it's already easy to integrate polynomials
@@TabletopShelf that's true
You can integrate infinite polynomials, but there's no telling whether you can return it to a closed form.
Hi, great video as usual !! Do you plan on translating the first four videos of the Japanese channel, which don't have subtitles?
That's an exercise left to the reader - er, I mean viewer.
Now please make one for eigenvalues and eigenvectors!
Give us the half derivatives and half integrals using the matrices!
AM2 and proud of it.
what about functions which don't have Taylor series expansions?
So does it mean you can integrate using a matrix?