ACT 2020 Tutorial: The Yoneda lemma in the category of matrices (Emily Riehl)

  • Published 13 Jan 2025

Comments • 52

  • @risavkarna
    @risavkarna 4 years ago +22

    I love how she describes the 'procedure called abuse of notation' at around 16:00.

    • @chris8443
      @chris8443 3 years ago +2

      Gerald Sussman has a great rant about this kind of thing. "Gerald Jay Sussman: The Role of Programming (Dan Friedman's 60th Birthday)"

    • @jsmdnq
      @jsmdnq 2 years ago

      Um, it's pretty common. It happens all over. Nothing special about what she said.

    • @jsmdnq
      @jsmdnq 2 years ago +2

      @@vaibhavsutrave2309 I don't think it is very subjective. It's done all the time in science. If everything we wrote had to be absolutely explicit and "objective" we wouldn't get anywhere. We are always "abusing notation". 99% of what we write down is leaving out 99% of the details. When you solve a polynomial equation on some math test, do you make sure to write down that the polynomial is from the polynomial ring over the complex numbers and that you use the evaluation homomorphism? Even then, do you write down every axiom of logic you are using in the process? Which logic?
      As long as it is understood what is meant, it's ok to leave common things out. Most math books mention that they will do such things, to be clear. Musicians do this too when they leave out accidentals that are understood. It's not subjective. The rules are clear: make a mental note of the context and insert it mentally.
      Maybe what is subjective is when to apply such things. It can be problematic when teaching people who are unfamiliar with the notation and keep forgetting the context. That is really a teaching issue, though, and not part of the "abuse of notation" idea. If mathematics had to state everything explicitly we'd never get anywhere, because at some point the explicit restatements of everything would be overwhelming in size.

    • @Bratjuuc
      @Bratjuuc 2 years ago +1

      This "notation abusing" approach bites you in Haskell, when compiler refuses to do implicit "fmap".
      An "h :: a -> b" is an "h :: a -> b", not "fmap h :: Functor f => f a -> f b"

    • @Bratjuuc
      @Bratjuuc 2 years ago +1

      @@jsmdnq We don't need to write "evaluation morphism" to solve a polynomial; the usual notation is already rigorous enough. But leaving the lifted A written as just A, as if she doesn't care, instead of adding a letter like "h", is just laziness that hurts rigor, and it only eases notation that is already trivial. And rigor is one of math's most valuable treasures.
      I would understand if it were a really complex formula instead of just the letter "A"; easing that notation would be somewhat justified. But it isn't, and it only teaches us that notation is so disregarded that it doesn't matter.

  • @jsmdnq
    @jsmdnq 2 years ago +3

    So, just to map some linear algebra onto her notation and CT: to any pxq matrix with p rows and q cols we can associate an arrow from one natural number to another, with p = domain and q = codomain. That is, an arrow from p to q is any matrix of size pxq. Strictly, this is not the category of natural numbers but the category of matrices: the objects are the natural numbers and the morphisms/arrows are matrices, or equivalently linear transformations.
    Hence hom(p,q) is a set of morphisms. In fact, it is the set of all pxq matrices over whatever field the matrix category is taken over, so hom(p,q) itself is literally a "set of matrices".
    Composition of arrows corresponds to matrix multiplication (at least, that is one way to get a valid composition structure turning Mat_F into a category).
    So what she is describing categorically is everything you learn in linear algebra, put into an "abstract" framework. Everyone who knows LA knows that you can multiply matrices when the sizes "match", that matrix multiplication is associative, and usually that matrices and linear transformations amount to the same thing. This just translates that language into the language of category theory: arrows, categories, hom-sets, and functors.
    h_k, the k-column functor, is a (contravariant) functor from Mat to Set that sends objects to objects but only keeps track of matrices with exactly k columns. It can be viewed as the contravariant hom-functor hom(-,k), where - is the variation parameter and corresponds to n. Effectively h_k(n) = hom(n,k) (the hom-set is a set, so the functor lands in Set rather than in Mat).
    That is, look at the category Mat and go to the object k. An arrow from n to k (or, in her op-notation, from k to n) is an nxk matrix. There are a number of them for each n (one for each matrix n -> k), and for each object p in the category there are arrows from p to k (or, in her op-notation, from k to p).
    h_k is essentially looking at the part of Mat with k fixed (it has just as many objects as Mat but only the arrows into k, so k acts a bit like a terminal object). h_k is just a long-winded way to think of _xk matrices (all matrices with k columns: 1xk, 2xk, 3xk, 4xk, etc.). Of course n is a variable here too, but k is fixed ("given n, do stuff, but pretend k is fixed").
    Around 18:46 she states h_k(m)
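
    (A rough, self-contained Haskell sketch of the above; the names and the choice of field are mine, not from the talk. Objects are natural numbers, an arrow p -> q is stored as a pxq matrix, composition is matrix multiplication with a size check, and h_k acts on an arrow A : m -> n by precomposition.)

      import Data.List (transpose)

      -- an arrow p -> q in Mat: p rows, q columns, entries from a chosen field (Double here)
      data Arrow = Arrow { domain :: Int, codomain :: Int, entries :: [[Double]] }
        deriving Show

      -- composing p -> q with q -> r is the matrix product; Nothing when the middle sizes disagree
      compose :: Arrow -> Arrow -> Maybe Arrow
      compose (Arrow p q a) (Arrow q' r b)
        | q /= q'   = Nothing
        | otherwise = Just (Arrow p r [ [ sum (zipWith (*) row col) | col <- transpose b ] | row <- a ])

      -- the contravariant action: h_k(A) for A : m -> n sends an element of h_k(n),
      -- i.e. an n x k matrix, to the m x k matrix obtained by precomposing with A
      hK :: Arrow -> Arrow -> Maybe Arrow
      hK a x = compose a x

      main :: IO ()
      main = print (hK (Arrow 1 2 [[1, 2]]) (Arrow 2 3 [[1, 0, 0], [0, 1, 0]]))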

    • @jsmdnq
      @jsmdnq 2 years ago +2

      Mathematicians love to create ways to say the same old things in a complex way because, well, it pays the bills. Sometimes it is useful. At the very least it provides some way to say something. Of course musicians and dancers pretty much do the same thing, as do judges and lawyers.
      Once you realize that category theory isn't a "new thing" but really just a more explicit and "universal" way to say the same old things, then you realize that on one hand it does nothing special and on the other it completely changes the entire game. Its power is precisely in its ability to provide a unified language that then unifies most of mathematics and everything else. Category theory is a sort of "ideal language". (It may not be perfect in its conception, but it provides at least one "ideal language" for us to use, which is better than none.)
      One could, hypothetically, try to translate everything into linear algebra terms. E.g., when you ice skate you somehow translate the "acts" into linear algebra (this probably gets into the physics side of linear algebra). Then you try to translate money into linear algebra... then driving a moped... then playing backgammon, etc. If you could do so then you would, in some sense, have a "universal language" using linear algebra to describe everything. One language fits all. Well, that is, for the most part, category theory.
      If humanity doesn't kill itself off and something better than category theory doesn't come along, chances are that within some number of centuries the most used language in humanity would be category theory. Why? Because we would have one language to talk about many things rather than many languages to talk about many things. One language that is designed to express expression (or structure structure) as optimally as possible. This would remove ambiguity (but not logic errors) and so allow for maximum efficiency and precision between communicators.

    • @samueldeandrade8535
      @samueldeandrade8535 7 months ago

      ​@@jsmdnq hehe. Do you like Category Theory?

  • @jsmdnq
    @jsmdnq 2 years ago +1

    Basically, if AB is the product of matrices A and B, written .(A,B), then for C some column operation we have .(A, C(B)) = C(.(A,B)). By the Yoneda lemma this is A B C_I, where C_I = C(I) is a special matrix that represents our column operation, which we can equivalently apply by multiplying on the right by it.
    This acts, in some sense, like a homomorphism: f(AB) = f(A)f(B) = ABF, where F is a representation of f and f(A) = A because f only touches columns. It's not quite a homomorphism, because if f(A) = A we would end up with the identity; it is really f(A,B) = A f(B) = ABF, with the two argument positions treated differently.
    That is, column operations "commute" through to the second argument of matrix multiplication. In the mirror/opposite situation, g(XY) = g(X)Y = GXY for row transformations, although some transposes (op's) are needed to make things work out. Let's see: define g(Z) = (f(Z^T))^T. Then g(XY) = (f((XY)^T))^T = (f(Y^T X^T))^T = (Y^T f(X^T))^T = (Y^T X^T F)^T = F^T X Y = GXY,
    with G = F^T, X = B^T, and Y = A^T, which gives the row version. E.g., if A is mxn, B is nxk, and f(AB) is mxk, then X is kxn, Y is nxm, and g(XY) is kxm.
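
    (A tiny self-contained Haskell check of the claim above, with matrices chosen arbitrarily by me: a column operation, realised as right multiplication by c, slides out of the product, i.e. A(BC) = (AB)C.)

      import Data.List (transpose)

      mult :: [[Double]] -> [[Double]] -> [[Double]]
      mult a b = [ [ sum (zipWith (*) row col) | col <- transpose b ] | row <- a ]

      a, b, c :: [[Double]]
      a = [[1, 2], [3, 4]]
      b = [[5, 6], [7, 8]]
      c = [[0, 1], [1, 0]]   -- the column operation: swap the two columns

      main :: IO ()
      main = print (mult a (mult b c) == mult (mult a b) c)   -- True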

  • @pmcgee003
    @pmcgee003 3 years ago +5

    This is an awesome example, and an awesome explanation.

  • @jmgimeno
    @jmgimeno 4 years ago +9

    A link to the slides: www.math.jhu.edu/~eriehl/matrices.pdf

  • @jsmdnq
    @jsmdnq 2 years ago +2

    When I was in the womb I always used the matrix multiplication operation on two matrices A:nxm and B:mxk by selecting random points from each with a uniform weighted probability of n/(n+k) for A and k/(n+k) for B and placing them randomly within a C:nxk matrix. Now I know why I always got the wrong answer.

  • @valentinussofa4135
    @valentinussofa4135 1 year ago +1

    Great lecture by Professor Emily Riehl. Thank you very much. 🙏

  • @jsmdnq
    @jsmdnq 2 years ago +2

    So the Yoneda lemma is just a high-level view of a "change of basis"... a sort of "change of functors"? It's expressed categorically but shows up everywhere. It seems quite powerful (because of the simple answer it gives), as it gives us a way to "change perspectives" in a universal way.

  • @Bratjuuc
    @Bratjuuc 2 years ago

    Nice presentation. It was a very intuitive example, illuminating the Yoneda lemma

  • @kaleevans1692
    @kaleevans1692 3 years ago +2

    Pretty amazing talk. Thank you!

  • @eternaldoorman5228
    @eternaldoorman5228 several months ago

    11:33 Is it because matrices are operators and matrix multiplication is composing operators?

    • @eternaldoorman5228
      @eternaldoorman5228 several months ago

      32:30 Ummmm, ... So this is one example of how you can talk about algebras in general in terms of functions and Natural Transformations?

    • @eternaldoorman5228
      @eternaldoorman5228 several months ago

      36:15 All those transformations you can express as matrix multiplication in terms of compositions of operations at lower grades? So it's a kind of induction principle for matrices?

    • @eternaldoorman5228
      @eternaldoorman5228 several months ago

      42:12 So I guess the whole of signals and systems being about LTI operators fits into this scheme. You analyse a Linear Time Invariant system by considering the response of the system in terms of a "unit impulse" which is a bunch of zeros with a one in it somewhere, shifted along the time axis.
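
      (A rough, self-contained Haskell illustration of that analogy; the impulse response is made up, not from the talk. A discrete LTI system is determined by its response to the unit impulse, and the output for any input is the convolution of the input with that response.)

        impulseResponse :: [Double]
        impulseResponse = [1, 0.5, 0.25]                -- hypothetical system

        -- y[n] = sum_k h[k] * x[n-k]: the input resolved into shifted, scaled impulses
        convolve :: [Double] -> [Double] -> [Double]
        convolve h x =
          [ sum [ h !! k * x !! (n - k)
                | k <- [0 .. length h - 1], n - k >= 0, n - k < length x ]
          | n <- [0 .. length h + length x - 2] ]

        main :: IO ()
        main = print (convolve impulseResponse [2, 0, 1])   -- [2.0,1.0,1.5,0.5,0.25]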

    • @eternaldoorman5228
      @eternaldoorman5228 several months ago

      1:01:19 So does that prove the "conservation of dimension" theorem for linear algebra without having to refer to bases at all? Surely not.

  • @minch333
    @minch333 4 years ago +1

    That was great! I've heard of this category before but not known much about it aside from its definition, which is weird because it seems important. It would be really interesting to see an example of the Yoneda lemma being applied to a natural transformation whose codomain is not a representable functor, as was briefly mentioned in the talk.

    • @keithharbaugh2594
      @keithharbaugh2594 4 years ago +3

      Consider a group operating on a set.
      In the Yoneda lemma context, the domain (representable) functor is the group operating on itself (sometimes called the "regular representation of the group").
      The codomain functor is the group operating on the set.
      In this example, the fact that the orbit maps are "natural" (equivariant)
      corresponds to the statement (forall x)(forall h)(forall g): (xg)h = x(gh).
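
      (A small self-contained Haskell check of this, using Z/3 rotating three labelled points; the particular group and the names are my own choice, not from the comment. By Yoneda, an equivariant map out of the regular action is determined by where it sends the identity, and it traces out an orbit.)

        type Z3 = Int                        -- 0, 1, 2 with addition mod 3

        rot :: Int -> Z3 -> Int              -- Z/3 acting on three labelled points
        rot v g = (v + g) `mod` 3

        -- the transformation picked out by a chosen point x0: g |-> x0 . g
        orbitMap :: Int -> Z3 -> Int
        orbitMap x0 g = rot x0 g

        -- naturality (equivariance): (x0 . g) . h == x0 . (g h) for all g, h
        natural :: Int -> Bool
        natural x0 = and [ rot (orbitMap x0 g) h == orbitMap x0 ((g + h) `mod` 3)
                         | g <- [0, 1, 2], h <- [0, 1, 2] ]

        main :: IO ()
        main = print (natural 1)             -- True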

  • @jonathanweston-dawkes287
    @jonathanweston-dawkes287 3 years ago +2

    I was a TA in 1978 for Prof. Linton when he taught introductory mathematics at Wesleyan University in Middletown, CT. No specific memories other than I enjoyed working for him.

  • @jsmdnq
    @jsmdnq 2 years ago +1

    Could one in any way say that naturality *IS* the thing that mathematics tries to pin down? I mean, it's so ubiquitous in math that surely it is not coincidental. Given that we can probably safely say there is some deeper connection that is independent of humans (surely if it were just "choice" or "preference" it wouldn't be so common), it could be more of a "law" of the universe, or at least some type of structure the universe itself contains.

  • @pablo_brianese
    @pablo_brianese 4 years ago +1

    Is it often the case that natural transformations form a vector space?

    • @jamesfrancese6091
      @jamesfrancese6091 2 years ago

      Typically not, no. However, much of category theory does end up looking a lot like "abstract (linear) algebra" in a way -- check out enriched categories! Various categories of vector spaces + linear transformations were traditionally used to develop homological algebra; eventually people realized you could do all the essential operations of homological algebra in any category whose morphisms between fixed objects formed an abelian group, together with a few additional properties. Whence abelian categories. People then realized that the category of small categories Cat is "enriched" over itself, so that natural transformations between two fixed categories can themselves be organized into a category, as morphisms between functors. Whence 2-categories. etc. And after all a generic category is simply an algebraic gadget of sorts: a monoid with many objects. So category theory can be said to resemble abstract algebra to a great extent, with Cat being a many-object 2-monoid.

  • @pmcgee003
    @pmcgee003 3 years ago

    "I'm gunna help you answer your own question" is savagely efficient black-belt mathematics. 🙂🙂

  • @jsmdnq
    @jsmdnq 2 years ago

    Beautiful! But what is the Yoneda Lemma of Yoneda Lemmas? After all, functors are functions too!

    • @Bratjuuc
      @Bratjuuc 2 years ago +2

      Functors live as morphisms in the category of categories, and I doubt this category is even locally small (I doubt the morphisms between any 2 objects form a set). We already know it's not small.
      P.S. Turns out there is the category of small categories, where the functors between two objects do form a set; but unlike with sets, you're not allowed to zoom into an object and pick something out of it (there is nothing to "pick" from an object in the category given by a partially ordered set, where the objects are numbers and the morphisms are "less than or equal" relations).
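
      (A small self-contained Haskell sketch, with names of my own, of the parenthetical above: a poset viewed as a category has at most one morphism between two objects and nothing "inside" an object to pick out.)

        -- a poset seen as a category: objects are integers, and there is at most one
        -- morphism m -> n, namely a witness that m <= n; objects have no "elements" inside
        data Leq = Leq Int Int deriving Show

        homSet :: Int -> Int -> Maybe Leq       -- the hom-"set" is empty or a single witness
        homSet m n = if m <= n then Just (Leq m n) else Nothing

        compose :: Leq -> Leq -> Maybe Leq      -- (m <= n) followed by (n <= p) gives (m <= p)
        compose (Leq m n) (Leq n' p) = if n == n' then Just (Leq m p) else Nothing

        main :: IO ()
        main = print (homSet 2 5, homSet 5 2)   -- (Just (Leq 2 5),Nothing)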

  • @Fetrose
    @Fetrose 2 years ago

    What are the applications of this talk? Any thoughts?

    • @newwaveinfantry8362
      @newwaveinfantry8362 29 days ago

      There are none. This is a simple example of a profound but very abstract theorem in pure mathematics, made so that people can understand what it is about.

  • @NoNTr1v1aL
    @NoNTr1v1aL 3 years ago

    Amazing video!

  • @NikolajKuntner
    @NikolajKuntner 4 years ago +3

    Thanks.

  • @fredeisele1895
    @fredeisele1895 4 years ago

    At 9:00 you talk about a matrix A. Are we to suppose that there is a set of m x n matrices of which A is an example? Or is there only one A?

    • @davtor33
      @davtor33 4 years ago

      It's true for any matrix A of that size.

  • @kingyinyan254
    @kingyinyan254 4 years ago

    Wonder if this special category could be understood as the category of linear transformations over some vector spaces? That could be more useful in applications :)

    • @austinalderete2730
      @austinalderete2730 4 years ago +3

      That's what this is though, no? Your natural numbers are the dimensions of your vector spaces (the objects), and the coefficients of the matrices come from the field underlying the vector spaces. The matrices (morphisms) are linear transformations.
      It's exactly what you said already.

    • @janouglaeser8049
      @janouglaeser8049 3 years ago +6

      No, they're not the same category, but they're what's called "equivalent categories". Actually, the category of matrices is what's known as a "skeleton" of the category of finite-dimensional vector spaces.

  • @jsmdnq
    @jsmdnq 2 years ago

    It's quite funny that when I took linear algebra in college it took me about 1-3 minutes to understand that matrix multiplication had a rule about how the row dimension and column dimension have to "line up" properly. But to understand it from the categorical perspective takes about 20-30 minutes. In "basic" math it was about 3 sentences and 5 examples; in "advanced" math it's like 2000 sentences and 0 examples. I get that the human brain thinks naturally in terms of functions, functors, and natural transformations, but it seems the more I become aware of just how fundamental these concepts are, the more things seem to become complexified. It's quite cool seeing the world as one huge category of vector spaces... but I get tired of having to transform my kinematical movements into a vector space just to fix my coffee now.
    When category theorists say "I know category theory is a bit abstract" they are absolutely lying! It's so abstract that it is the most concrete thing possible.

  • @jsmdnq
    @jsmdnq 2 years ago

    CompositionOrder(X) = DiagramOrder(X^(OP)) = DiagramOrder(X)^(OP)
    Life would be very simple if everyone used the same number of OP's!!! Duality, what a joke!

  • @samueldeandrade8535
    @samueldeandrade8535 7 months ago

    I kinda like this professor. But she should have picked some better subject to talk about. I mean, Yoneda Lemma for the Category of Matrices is for college students to lecture, not professors.

  • @jsmdnq
    @jsmdnq 2 years ago +1

    My problem with Emily is that she "harps" way too much on what people may or may not understand, rather than skipping the warnings she puts up at every turn. She should just state things as if everyone were essentially on her level. I have a feeling she's spent way too much of her life trying to explain things to people who don't even know the basics, and so she is constantly in the mindset that she is explaining things to novices. I believe this is the wrong approach.

    • @jsmdnq
      @jsmdnq 2 years ago

      My proof: Anyone who is confused by 0 and 1 will have no real idea what category theory is. Understanding integers is a prerequisite for understanding category theory from the mathematician's perspective. I don't try to run 5k marathons because I don't run. People with such a basic lack of mathematics shouldn't try to actually understand advanced mathematics. It's a waste of time. If a person can't solve a linear equation they shouldn't be in calculus. It's not ego or arrogance or superiority... it is just a fact, because learning is progressive.
      Hence if she explained things on "her level" and let people come up to it, it would actually be more fun for her, more fun for those who do understand, and it would help those who don't by not filling their heads with things they will never comprehend correctly. (I'm not saying that one can't simplify CT to explain it to beginners; I'm saying that simple explanations should be simple, not "complex" with a bunch of caveats like "Warning, you won't understand this", or trying to appease both the beginner and the expert in the same "tutorial".)
      Of course this should be true for most people. As they say, "Know thy audience". If someone doesn't know anything about group theory, linear algebra, calculus, topology, etc. then learning category theory isn't going to happen. Sure, something will be learned, but it is highly ineffective for everyone involved.
      I've found Emily's approach to teaching quite difficult to engage with. I have read her Category Theory in Context book and, while it has similar overtones or issues, it is actually quite good. In that book I find her presentation of a wide variety of "applications" very good, even though I'm weak in some of them, which makes it more difficult than it would be if I spent more time in those areas. That is my fault rather than hers, and because she did include "advanced" concepts it will make the next reading even better. I think her talks should follow a similar approach. Most people are not going to learn category theory, and many of the people who are trying will not be novices in math. She should target an audience near her level to achieve an optimal presentation. If people do not understand, it is more likely their fault than hers, and slowing the presentation down so "novices can keep up" makes it very hard to stay mentally engaged.
      I'm not saying this is a constant issue, but it seems to creep into the talks regularly, acting like speed bumps and pot holes.

    • @jamesfrancese6091
      @jamesfrancese6091 2 years ago +4

      This is an expository tutorial talk for a general or mixed academic audience, including people essentially outside math entirely. Also, this has little to do with "teaching" per se, it's more like a "continuing education talk for computer scientists, applied mathematicians, physicists, philosophers, and a general mathematical audience". Just curious, did you participate in ACT 2020?