
You're reading from  Hands-On Deep Learning with TensorFlow

Product type: Book
Published in: Jul 2017
Reading Level: Beginner
Publisher: Packt
ISBN-13: 9781787282773
Edition: 1st
Author (1)
Dan Van Boxel

Dan Van Boxel is a data scientist and machine learning engineer with over 10 years of experience. He is most well-known for Dan Does Data, a YouTube livestream demonstrating the power and pitfalls of neural networks. He has developed and applied novel statistical models of machine learning to topics such as accounting for truck traffic on highways, travel time outlier detection, and other areas. Dan has also published research articles and presented findings at the Transportation Research Board and other academic journals.

Chapter 4. Introducing Recurrent Neural Networks

In the previous chapter, you learned about convolutional networks. Now, it's time to move on to a new type of model and problem—Recurrent Neural Networks (RNNs). In this chapter, we'll explain the workings of RNNs, and implement one in TensorFlow. Our example problem will be a simple season predictor with weather information. We will also take a look at skflow, a simplified interface to TensorFlow. This will let us quickly re-implement both our old image classification models and the new RNN. At the end of this chapter, you will have a good understanding of the following concepts:

  • Exploring RNNs

  • TensorFlow learn

  • Dense Neural Network (DNN)

Exploring RNNs


In this section, we'll explore RNNs. Some background information will start us off, and then we will look at a motivating weather modeling problem. We'll also implement and train an RNN in TensorFlow.

In a typical model, you have some X input features and some Y output you want to predict. We usually consider our different training samples as independent observations. So, the features from data point one shouldn't impact the prediction for data point two. But what if our data points are correlated? The most common example is that each data point, Xt, represents features collected at time t. It's natural to suppose that the features at time t and time t+1 will both be important to the prediction at time t+1. In other words, history matters.

Now, when modeling, you could just include twice as many input features, adding the previous time step to the current ones, and computing twice as many input weights. But, if you're going through all the effort of building a neural network...
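The recurrent alternative this paragraph is leading up to can be sketched framework-free: instead of doubling the input features, an RNN cell carries a hidden state forward and mixes it with each new time step. This is a minimal illustrative sketch, not the book's code; the names (`Wx`, `Wh`, `step`) and sizes are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
n_features, n_hidden = 4, 3  # illustrative sizes

# Two weight matrices: one for the current input, one for the carried-over state.
Wx = rng.normal(size=(n_features, n_hidden)) * 0.1
Wh = rng.normal(size=(n_hidden, n_hidden)) * 0.1
b = np.zeros(n_hidden)

def step(x_t, h_prev):
    """One RNN step: combine current features with the previous hidden state."""
    return np.tanh(x_t @ Wx + h_prev @ Wh + b)

# Run a short sequence of 5 time steps through the cell.
xs = rng.normal(size=(5, n_features))
h = np.zeros(n_hidden)
for x_t in xs:
    h = step(x_t, h)

print(h.shape)  # (3,)
```

Because `h` is fed back in at every step, the prediction at time t can depend on the whole history, with only one extra weight matrix rather than duplicated input features.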

TensorFlow learn


Just as Scikit-Learn is a convenient interface to traditional machine learning algorithms, tf.contrib.learn (https://github.com/tensorflow/tensorflow/tree/master/tensorflow/contrib/learn/python/learn), formerly known as skflow, is a simplified interface for building and training DNNs. Best of all, it comes free with every installation of TensorFlow!

Even if you're not a fan of the syntax, it's worth looking at TensorFlow Learn as the high-level API to TensorFlow, because it's currently the only officially supported one. That said, there are several alternative high-level APIs with arguably more intuitive interfaces. If interested, refer to Keras (https://keras.io/), TF-Slim (included with TensorFlow; see https://github.com/tensorflow/tensorflow/tree/master/tensorflow/contrib/slim), or TFLearn (http://tflearn.org/).

Setup

To get started with TensorFlow Learn, you only need to import it. We'll also import the estimators function, which will...

DNNs


While there are better ways to implement purely linear models, it is in simplifying DNNs with varying numbers of layers that TensorFlow Learn really shines.

We'll use the same input features, but now we'll build a DNN with two hidden layers, first with 10 neurons and then 5. Creating this model will only take one line of Python code; it could not be easier.

The specification is similar to our linear model. We still need SKCompat, but now it's learn.DNNClassifier. For arguments, there's one additional requirement: the number of neurons on each hidden layer, passed as a list. This one simple argument, which really captures the essence of a DNN model, puts the power of deep learning at your fingertips.

There are some optional arguments to this as well, but we'll only mention optimizer. This allows you to choose between different common optimizer routines, such as Stochastic Gradient Descent (SGD) or Adam. Very convenient!

# Dense neural net
classifier = estimator.SKCompat(learn.DNNClassifier...
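To make the two-hidden-layer architecture concrete without relying on the now-deprecated tf.contrib.learn API, here is a hypothetical, framework-free forward pass through the same shape of network (10 neurons, then 5, then a softmax over classes). All names, sizes, and the choice of ReLU activations are illustrative assumptions, not taken from the book:

```python
import numpy as np

rng = np.random.default_rng(1)
n_inputs, n_classes = 36, 5            # illustrative sizes
layer_sizes = [n_inputs, 10, 5, n_classes]  # two hidden layers: 10 then 5

# Random weights stand in for parameters the optimizer would learn.
weights = [rng.normal(size=(m, n)) * 0.1
           for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]
biases = [np.zeros(n) for n in layer_sizes[1:]]

def forward(x):
    """ReLU hidden layers followed by a softmax output."""
    for W, b in zip(weights[:-1], biases[:-1]):
        x = np.maximum(0, x @ W + b)        # hidden layers
    logits = x @ weights[-1] + biases[-1]   # output layer
    e = np.exp(logits - logits.max())       # numerically stable softmax
    return e / e.sum()

probs = forward(rng.normal(size=n_inputs))
print(probs.shape)  # (5,)
```

The `hidden_units=[10, 5]` argument to `DNNClassifier` is essentially shorthand for building the middle two layers of this stack for you.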

Summary


You learned a lot in this chapter, going from understanding how RNNs work to implementing one in a new TensorFlow model. We also looked at TensorFlow Learn, a simplified interface to TensorFlow, walked through DNNs, and revisited CNNs and weight extraction in detail.

In the next chapter, we will wrap up our look at TensorFlow, looking at how far we've come and where you can go from here.
