This is such a fascinating topic when you abstract it out of the matrix view.
The Riesz Representation Theorem puts this into its proper context. Every continuous linear functional from a Hilbert space to the complex numbers is just an inner product against some vector that is uniquely associated with that functional.
When you take the dual, and then apply the natural matrix representation from the basis to the set of canonical column vectors, the inner product becomes the dot product, whose matrix representation is just a row matrix multiplied by a column matrix.
But now, linear transformations are viewed as operators built from a vector times a dual vector (outer products, and sums of them), whose representation using the same rule as before is a square matrix. So, matrix multiplication is a consequence of the Riesz Representation Theorem combined with the natural representation of the inner product on the canonical column basis.
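A minimal NumPy sketch of this idea (the vectors `w`, `v`, `u` are arbitrary example data, not anything from the video): the functional is a row (conjugated) vector, a rank-one operator is an outer product, and composing two such operators reproduces the matrix-multiplication rule.

```python
import numpy as np

# Riesz: the functional f(v) = <w, v> is represented in the canonical
# basis by the conjugated row vector w^H.
w = np.array([1 + 2j, 3 - 1j])
v = np.array([0.5 + 0j, 2 + 1j])

f_v = np.vdot(w, v)           # inner product <w, v> (conjugates w)
row_times_col = w.conj() @ v  # row matrix times column matrix
assert np.isclose(f_v, row_times_col)

# A rank-one operator v ↦ u <w, v> is the outer product u w^H:
# a square matrix under the same representation rule.
u = np.array([1 + 0j, 1j])
A = np.outer(u, w.conj())
assert np.allclose(A @ v, u * np.vdot(w, v))

# Composing two rank-one operators: (u1 w1^H)(u2 w2^H) = <w1, u2> u1 w2^H,
# i.e. matrix multiplication falls out of the inner-product representation.
u2 = np.array([2 + 1j, -1j])
w2 = np.array([0 + 1j, 1 + 0j])
B = np.outer(u2, w2.conj())
assert np.allclose(A @ B, np.vdot(w, u2) * np.outer(u, w2.conj()))
```

General operators are sums of such rank-one pieces, so the same bookkeeping extends by linearity.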
Those Hagoromo chalks 😎
When you say linear, do you mean bilinear?