Summary

In this chapter, we covered how to build sequence-to-sequence models from scratch. We learned how to code up our encoder and decoder components individually and how to integrate them into a single model that is able to translate sentences from one language into another.
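
To make this recap concrete, the following is a minimal PyTorch sketch of the encoder-decoder pattern described above. It is illustrative rather than a reproduction of the chapter's code: the GRU layers, the class names (Encoder, Decoder, Seq2Seq), the dimensions, and the teacher-forcing loop are assumptions chosen to keep the example short.

import torch
import torch.nn as nn

class Encoder(nn.Module):
    # Embeds the source tokens and compresses the sequence into a hidden state.
    def __init__(self, input_dim, emb_dim, hid_dim):
        super().__init__()
        self.embedding = nn.Embedding(input_dim, emb_dim)
        self.rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)

    def forward(self, src):                  # src: [batch, src_len]
        _, hidden = self.rnn(self.embedding(src))
        return hidden                        # [1, batch, hid_dim]

class Decoder(nn.Module):
    # Predicts one target token at a time, conditioned on the encoder state.
    def __init__(self, output_dim, emb_dim, hid_dim):
        super().__init__()
        self.embedding = nn.Embedding(output_dim, emb_dim)
        self.rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.fc_out = nn.Linear(hid_dim, output_dim)

    def forward(self, token, hidden):        # token: [batch, 1]
        output, hidden = self.rnn(self.embedding(token), hidden)
        return self.fc_out(output.squeeze(1)), hidden

class Seq2Seq(nn.Module):
    # Ties encoder and decoder together: encode once, then decode step by step.
    def __init__(self, encoder, decoder):
        super().__init__()
        self.encoder, self.decoder = encoder, decoder

    def forward(self, src, trg):
        hidden = self.encoder(src)
        token = trg[:, 0:1]                  # <sos> token starts decoding
        outputs = []
        for t in range(1, trg.size(1)):
            logits, hidden = self.decoder(token, hidden)
            outputs.append(logits)
            token = trg[:, t:t + 1]          # teacher forcing: feed ground truth
        return torch.stack(outputs, dim=1)   # [batch, trg_len - 1, output_dim]

# Example forward pass with random token IDs in place of a real corpus:
model = Seq2Seq(Encoder(1000, 64, 128), Decoder(1200, 64, 128))
src = torch.randint(0, 1000, (2, 7))         # two source "sentences"
trg = torch.randint(0, 1200, (2, 9))         # their target counterparts
print(model(src, trg).shape)                 # torch.Size([2, 8, 1200])

At inference time, where no ground-truth target is available, the last line of the loop would instead feed the decoder's own prediction back in, for example token = logits.argmax(1, keepdim=True).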

Although our sequence-to-sequence model, which consists of an encoder and a decoder, is useful for sequence translation, it is no longer state-of-the-art. In recent years, state-of-the-art performance has been achieved by combining sequence-to-sequence models with attention mechanisms.

In the next chapter, we will discuss how attention networks can be used in the context of sequence-to-sequence learning and show how the two techniques can be combined to build a chatbot.
