Natural Language Understanding with Python

Using BERT – a classification example

In this example, we’ll use BERT for classification on the movie review dataset we saw in earlier chapters. We will start with a pretrained BERT model and fine-tune it to classify movie reviews. This is the same process you can follow if you want to apply BERT to your own data.
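
Before fine-tuning, the review texts and their labels need to be in a form that TensorFlow can consume. As a minimal sketch (the book's earlier chapters describe their own copy of the movie review data), the publicly available IMDB reviews dataset from tensorflow_datasets can stand in; the dataset name and batch size here are illustrative:

import tensorflow as tf
import tensorflow_datasets as tfds

# Load the review texts and their positive/negative labels as (text, label) pairs.
train_ds, test_ds = tfds.load(
    "imdb_reviews",
    split=["train", "test"],
    as_supervised=True,
)

# Batch and prefetch so training keeps the hardware busy.
BATCH_SIZE = 32
train_ds = train_ds.batch(BATCH_SIZE).prefetch(tf.data.AUTOTUNE)
test_ds = test_ds.batch(BATCH_SIZE).prefetch(tf.data.AUTOTUNE)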

Using BERT for a specific application starts with one of the pretrained models available from TensorFlow Hub (https://tfhub.dev/tensorflow), which is then fine-tuned with training data specific to the application. It is recommended to start with one of the small BERT models, which have the same general architecture as BERT but fewer parameters and are therefore faster to train. Generally, the smaller models are less accurate, but if their accuracy is adequate for the application, it isn’t necessary to spend the extra time and compute resources that a larger model would require. Many models of various sizes can be downloaded from TensorFlow Hub.
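
As a sketch of what this looks like in code, continuing from the data-loading example above, a classifier can be assembled from a small BERT encoder, its matching preprocessing model, and a classification head. The specific TensorFlow Hub handles and hyperparameters below are illustrative; any of the small BERT encoders and their matching preprocessing models can be substituted:

import tensorflow as tf
import tensorflow_hub as hub
import tensorflow_text  # registers the ops needed by the preprocessing model

# Example handles for a small BERT encoder and its matching preprocessing model.
preprocess_handle = "https://tfhub.dev/tensorflow/bert_en_uncased_preprocess/3"
encoder_handle = "https://tfhub.dev/tensorflow/small_bert/bert_en_uncased_L-4_H-512_A-8/2"

def build_classifier():
    # Raw review text comes in as a string tensor.
    text_input = tf.keras.layers.Input(shape=(), dtype=tf.string, name="text")
    # The preprocessing model tokenizes the text into BERT's input format.
    preprocessed = hub.KerasLayer(preprocess_handle, name="preprocessing")(text_input)
    # trainable=True lets the BERT weights be fine-tuned, not just the classification head.
    encoder = hub.KerasLayer(encoder_handle, trainable=True, name="BERT_encoder")
    outputs = encoder(preprocessed)
    # pooled_output is a fixed-size representation of the whole review.
    net = tf.keras.layers.Dropout(0.1)(outputs["pooled_output"])
    # A single logit for the binary positive/negative decision.
    net = tf.keras.layers.Dense(1, name="classifier")(net)
    return tf.keras.Model(text_input, net)

model = build_classifier()
model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=3e-5),
    loss=tf.keras.losses.BinaryCrossentropy(from_logits=True),
    metrics=["accuracy"],
)
# train_ds and test_ds are the batched (text, label) datasets prepared earlier.
model.fit(train_ds, validation_data=test_ds, epochs=3)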

BERT models...
