2.3 Summary
In this chapter, we have learned that, besides the algebraic structure given by addition and scalar multiplication, vectors have a beautiful geometry that arises from the inner product. From the inner product, we have norms; from norms, we have metrics; and from metrics, we have geometry and topology.
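To make this chain concrete, here is a minimal sketch in NumPy, assuming the standard Euclidean inner product on real vectors (the function names are ours, chosen for illustration):

```python
import numpy as np

def inner(x, y):
    # The standard Euclidean inner product <x, y> on R^n.
    return np.dot(x, y)

def norm(x):
    # The norm induced by the inner product: ||x|| = sqrt(<x, x>).
    return np.sqrt(inner(x, x))

def distance(x, y):
    # The metric induced by the norm: d(x, y) = ||x - y||.
    return norm(x - y)

x = np.array([3.0, 4.0])
y = np.array([0.0, 0.0])
print(norm(x))         # 5.0
print(distance(x, y))  # 5.0
```

Each layer is defined purely in terms of the previous one, which is exactly the point: a single inner product is enough to recover lengths and distances.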
Distance, similarity, angles, and orthogonality all arise from the simple concept of inner products. These are all extremely useful in both theory and practice. For instance, inner products give us a way to quantify the similarity of two vectors via the so-called cosine similarity, but they also provide a means to find optimal bases through the notion of orthogonality.
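As a quick illustration of the cosine similarity, here is a short sketch following the formula cos(θ) = ⟨x, y⟩ / (‖x‖ ‖y‖); the vectors are made-up examples:

```python
import numpy as np

def cosine_similarity(x, y):
    # cos(theta) = <x, y> / (||x|| * ||y||)
    return np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y))

a = np.array([1.0, 0.0])
b = np.array([1.0, 1.0])
c = np.array([0.0, 1.0])

print(cosine_similarity(a, b))  # ~0.7071, vectors at a 45-degree angle
print(cosine_similarity(a, c))  # 0.0, orthogonal vectors
```

A cosine similarity of zero is precisely the condition of orthogonality, which is what makes it so convenient in practice.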
To summarize, we’ve learned what norms and distances are, how the inner product is defined, how inner products give rise to norms and angles, and why all of these are useful in machine learning.
Besides the basic definitions and properties, we’ve encountered our very first algorithm: the Gram-Schmidt process, which turns any set of linearly independent vectors into an orthonormal one.
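As a reminder of how the algorithm works, here is a minimal, unoptimized sketch of classical Gram-Schmidt in NumPy (in numerical practice, one would prefer the modified variant or a library routine such as np.linalg.qr):

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a list of linearly independent vectors.

    Each vector is stripped of its projections onto the
    previously computed orthonormal vectors, then normalized.
    """
    basis = []
    for v in vectors:
        # Subtract the projections onto the already-built basis.
        w = v - sum(np.dot(v, u) * u for u in basis)
        basis.append(w / np.linalg.norm(w))
    return basis

vectors = [np.array([1.0, 1.0, 0.0]),
           np.array([1.0, 0.0, 1.0])]
u1, u2 = gram_schmidt(vectors)
print(np.dot(u1, u2))                          # ~0.0: orthogonal
print(np.linalg.norm(u1), np.linalg.norm(u2))  # 1.0 1.0: unit length
```

Note how the algorithm uses nothing beyond the tools of this chapter: inner products for the projections, and the induced norm for the final normalization.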