Summary

In this chapter, you learned about one of the most important models from the beginnings of the deep learning revolution, the Deep Belief Network (DBN). You saw that DBNs are constructed by stacking Restricted Boltzmann Machines (RBMs), and how these undirected models can be trained using contrastive divergence (CD).

The chapter then described a greedy, layer-wise procedure for pre-training a DBN by sequentially training each RBM in the stack, after which the whole network can be fine-tuned using the wake-sleep algorithm or backpropagation. We then explored practical examples of using the TensorFlow 2 API to create an RBM layer and a DBN model, illustrating the use of the GradientTape class to compute updates with CD.
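As a reminder of how those pieces fit together, here is a minimal sketch of a Bernoulli-Bernoulli RBM layer and a CD-1 update expressed through tf.GradientTape. The class and helper names (RBM, free_energy, gibbs_step, cd_update) are illustrative assumptions, not the exact layer from the chapter; the update uses the standard trick of differentiating the free-energy difference between the data and a one-step Gibbs sample.

```python
import tensorflow as tf

class RBM(tf.keras.layers.Layer):
    """Sketch of a Bernoulli-Bernoulli RBM (hypothetical names, not the book's exact layer)."""

    def __init__(self, n_visible, n_hidden, **kwargs):
        super().__init__(**kwargs)
        self.w = self.add_weight(name="w", shape=(n_visible, n_hidden),
                                 initializer=tf.random_normal_initializer(stddev=0.01))
        self.hb = self.add_weight(name="hb", shape=(n_hidden,), initializer="zeros")
        self.vb = self.add_weight(name="vb", shape=(n_visible,), initializer="zeros")

    def call(self, v):
        # Forward pass: probability of the hidden units given the visible units
        return tf.nn.sigmoid(tf.matmul(v, self.w) + self.hb)

    def free_energy(self, v):
        # F(v) = -v.b_v - sum_j softplus(c_j + (vW)_j)
        linear = tf.matmul(v, self.w) + self.hb
        return (-tf.reduce_sum(v * self.vb, axis=-1)
                - tf.reduce_sum(tf.math.softplus(linear), axis=-1))

    def gibbs_step(self, v):
        # One block Gibbs step v -> h -> v' for the negative phase
        p_h = tf.nn.sigmoid(tf.matmul(v, self.w) + self.hb)
        h = tf.cast(tf.random.uniform(tf.shape(p_h)) < p_h, tf.float32)
        return tf.nn.sigmoid(tf.matmul(h, self.w, transpose_b=True) + self.vb)


def cd_update(rbm, v_data, optimizer):
    """One CD-1 update: gradient of the free-energy difference, taken with GradientTape."""
    v_model = tf.stop_gradient(rbm.gibbs_step(v_data))  # negative-phase sample
    with tf.GradientTape() as tape:
        loss = (tf.reduce_mean(rbm.free_energy(v_data))
                - tf.reduce_mean(rbm.free_energy(v_model)))
    grads = tape.gradient(loss, rbm.trainable_variables)
    optimizer.apply_gradients(zip(grads, rbm.trainable_variables))
    return loss
```

In the greedy layer-wise scheme, cd_update would be run on the raw data for the first RBM, and on the previous RBM's hidden activations for each subsequent RBM in the stack.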

You also learned how, following the wake-sleep algorithm, we can compile the DBN as a normal deep neural network and perform backpropagation for supervised training. We applied these models to MNIST data and saw how an RBM can generate digits once training converges, learning features that resemble the convolutional filters described in Chapter...
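The supervised fine-tuning step can be summarized with the following sketch, which assumes two pre-trained RBM layers (rbm1 and rbm2, hypothetical names with illustrative sizes) whose weights seed ordinary Dense layers before the whole network is compiled and trained with backpropagation on MNIST.

```python
import tensorflow as tf

# Load and flatten MNIST
(x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, 784).astype("float32") / 255.0

# A DBN "compiled" as a standard feed-forward network; layer sizes are illustrative
model = tf.keras.Sequential([
    tf.keras.layers.Dense(256, activation="sigmoid", input_shape=(784,)),
    tf.keras.layers.Dense(64, activation="sigmoid"),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# Hypothetical: seed the Dense layers with the greedily pre-trained RBM weights, e.g.
# model.layers[0].set_weights([rbm1.w.numpy(), rbm1.hb.numpy()])
# model.layers[1].set_weights([rbm2.w.numpy(), rbm2.hb.numpy()])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=5, batch_size=128)
```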
