4 Linear Transformations
“Why do my eyes hurt?” “You’ve never used them before.”
— Morpheus to Neo, when waking up from the Matrix for the first time
In most linear algebra courses, the curriculum is all about matrices, and in machine learning, we work with them all the time. Here is the thing: matrices don’t tell the whole story. It is hard to see the underlying patterns by looking at matrices alone. For instance, why is matrix multiplication defined in such a seemingly complicated way? Why are relations like $B = T^{-1}AT$ important? Why are some matrices invertible and some not?
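To make the second question concrete, here is a minimal sketch (assuming NumPy, with an arbitrarily chosen matrix A and invertible matrix T) of what $B = T^{-1}AT$ expresses: A and B describe the same transformation, only written in two different bases.

import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])          # a transformation, written in the standard basis

T = np.array([[1.0, 1.0],
              [0.0, 1.0]])          # columns of T form a new (invertible) basis

B = np.linalg.inv(T) @ A @ T        # the same transformation, written in the new basis

x = np.array([1.0, 2.0])            # coordinates of a vector in the new basis

# Transform directly in the new basis, or convert to standard coordinates,
# apply A there, and convert back: both routes give the same result.
print(np.allclose(B @ x, np.linalg.inv(T) @ (A @ (T @ x))))   # True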
To really understand what is going on, we have to look at what gives rise to matrices: linear transformations. As it was for Neo, this might hurt a bit, but it will reward us greatly down the line. Let’s get to it!