Building a conversation

This step is very similar to the training one, with two differences. First, we do not evaluate our predictions against expected outputs; instead, we use the input sentence to generate a reply. Second, we rely on the already trained set of variables to produce that result. You will see how this is done later in this chapter.
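Reusing the trained variables means restoring them from disk into the new inference graph before any replies are generated. The following is a minimal sketch of one way to do this in TensorFlow 1.x with tf.train.Saver; the checkpoint path is illustrative, and the chapter's own saving and loading mechanism may differ:

import tensorflow as tf

# Minimal sketch, assuming the trained weights were saved as a TF 1.x
# checkpoint (the path is illustrative). Run this after the inference
# graph defined below has been built, so the Saver can map the stored
# values onto the new variables.
sess = tf.Session()
saver = tf.train.Saver()
saver.restore(sess, "checkpoints/chatbot_model.ckpt")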

To make things clearer, we first initialize a new sequence-to-sequence model. Its purpose is to reuse the already trained weights and biases and make predictions for new inputs. This time we only need an encoder sequence and a decoder sequence: the encoder sequence holds the input sentence, while the decoder sequence is fed one word at a time. We define the new model's placeholders as follows:

encode_seqs2 = tf.placeholder(dtype=tf.int64, shape=[1, None], name="encode_seqs")
decode_seqs2 = tf.placeholder(dtype=tf.int64, shape=[1, None], name="decode_seqs")
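To show where these placeholders fit, the sketch below walks through a greedy decoding loop that feeds the decoder one word at a time. The names y_softmax, word2idx, idx2word, start_id, end_id, and unk_id are assumptions standing in for the model's output tensor and the vocabulary mappings built elsewhere in the chapter, not objects defined by this snippet; for simplicity it re-feeds the growing reply each step instead of carrying the recurrent state explicitly.

import numpy as np

def generate_reply(sess, y_softmax, sentence, word2idx, idx2word,
                   start_id, end_id, unk_id, max_len=30):
    # Greedy decoding sketch (illustrative names only): encode the input
    # sentence, then repeatedly ask the decoder for the next word until it
    # emits the end-of-sequence token or max_len is reached.
    encode_ids = [word2idx.get(w, unk_id) for w in sentence.lower().split()]
    reply_ids = [start_id]  # the decoder is primed with the start token
    for _ in range(max_len):
        # One decoding step: the encoder sees the full input sentence,
        # the decoder sees everything generated so far.
        probs = sess.run(y_softmax, feed_dict={
            encode_seqs2: [encode_ids],
            decode_seqs2: [reply_ids],
        })
        next_id = int(np.argmax(probs[-1]))  # pick the most likely next word
        if next_id == end_id:                # stop at the end-of-sequence token
            break
        reply_ids.append(next_id)
    return " ".join(idx2word[i] for i in reply_ids[1:])

With the restored session from above, calling generate_reply with an input sentence such as "how are you" would return the model's response as a plain string.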