3.2 Introducing the multi-layer perceptron
Deep neural networks are at the core of the deep learning revolution. The aim of this section is to introduce basic concepts and building blocks for deep neural networks. To get started, we will review the components of the multi-layer perceptron (MLP) and implement it using the TensorFlow
framework. This will serve as the foundation for other code examples in the book. If you are already familiar with neural networks and know how to implement them in code, feel free to jump ahead to the Understanding the problem with typical NNs section, where we cover the limitations of deep neural networks. This chapter focuses on architectural building blocks and principles and does not cover learning rules and gradients. If you require additional background on those topics, we recommend Sebastian Raschka's excellent Python Machine Learning book from Packt Publishing (in particular, Chapter 2, Training Simple Machine Learning Algorithms for Classification).
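Before turning to the TensorFlow implementation, it can help to see the MLP's core computation laid bare: each layer applies an affine transform followed by a nonlinearity, and layers are stacked by feeding one layer's output into the next. The following is a minimal NumPy sketch of that forward pass (not the book's TensorFlow code); the function and variable names, the ReLU activation, and the layer sizes are illustrative choices, not fixed by the text:

```python
import numpy as np

def relu(x):
    """Rectified linear unit, a common hidden-layer nonlinearity."""
    return np.maximum(0, x)

def mlp_forward(x, weights, biases):
    """Forward pass through an MLP: alternate affine transforms (a @ W + b)
    with ReLU, leaving the final (output) layer linear."""
    a = x
    for i, (W, b) in enumerate(zip(weights, biases)):
        z = a @ W + b
        a = relu(z) if i < len(weights) - 1 else z
    return a

rng = np.random.default_rng(0)
# A two-layer MLP: 4 inputs -> 8 hidden units -> 3 outputs
weights = [rng.normal(size=(4, 8)), rng.normal(size=(8, 3))]
biases = [np.zeros(8), np.zeros(3)]

# Batch of 2 input vectors; each row yields a 3-dimensional output
out = mlp_forward(rng.normal(size=(2, 4)), weights, biases)
print(out.shape)  # (2, 3)
```

Deeper networks are obtained simply by appending more weight/bias pairs to the lists; frameworks such as TensorFlow wrap this same pattern in reusable layer objects and add automatic differentiation for training.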