5.4 Summary
In this chapter, we have looked at matrices from the perspective of systems of linear equations, i.e., equations of the form

$$a_{11} x_1 + \dots + a_{1n} x_n = b_1, \quad \dots, \quad a_{m1} x_1 + \dots + a_{mn} x_n = b_m.$$
Not surprisingly, these are described by matrices, and the above is equivalent to the expression Ax = b. Solving linear equations is an ancient art, so why are we talking about it in the age of AI?
Remember: It’s only AI if you are talking to investors. Deep down, it’s linear algebra, calculus, and probability theory.
We wanted to solve linear equations, which led us to Gaussian elimination (well, led Gauss to Gaussian elimination). Which led us to the LU decomposition. Which led us to fast matrix inversion, and a bunch of other innovations on which our current technology is built. Let me tell you, fast matrix multiplication and inversion are the bread and butter of computational linear algebra, and they all stem from that aforementioned ancient art of solving linear equations.
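To make the recap concrete, here is a minimal sketch (not the book's own code) of Gaussian elimination with partial pivoting, solving a small dense system Ax = b in pure Python; the example matrix and right-hand side are arbitrary illustrations:

```python
def gaussian_solve(A, b):
    """Solve Ax = b by forward elimination plus back substitution."""
    n = len(A)
    # Work on copies, in an augmented matrix [A | b].
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        # Partial pivoting: swap in the row with the largest pivot entry.
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        # Eliminate the entries below the pivot.
        for r in range(col + 1, n):
            factor = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= factor * M[col][c]
    # Back substitution on the resulting upper-triangular system.
    x = [0.0] * n
    for r in reversed(range(n)):
        s = sum(M[r][c] * x[c] for c in range(r + 1, n))
        x[r] = (M[r][n] - s) / M[r][r]
    return x

# Solves 2x + y = 3, x + 3y = 5 (solution: x = 0.8, y = 1.4).
print(gaussian_solve([[2.0, 1.0], [1.0, 3.0]], [3.0, 5.0]))
```

The elimination phase is exactly what the LU decomposition records: the multipliers `factor` form the lower-triangular L, while the reduced rows form the upper-triangular U, so factoring once lets you solve for many right-hand sides cheaply.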
Let’s recap the feats that we’ve achieved...