Transformers for Natural Language Processing and Computer Vision - Third Edition

Product type: Book
Published: February 2024
Publisher: Packt
ISBN-13: 9781805128724
Pages: 728
Edition: 3rd Edition
Author: Denis Rothman

Table of Contents (24 chapters)

Preface
1. What Are Transformers?
2. Getting Started with the Architecture of the Transformer Model
3. Emergent vs Downstream Tasks: The Unseen Depths of Transformers
4. Advancements in Translations with Google Trax, Google Translate, and Gemini
5. Diving into Fine-Tuning through BERT
6. Pretraining a Transformer from Scratch through RoBERTa
7. The Generative AI Revolution with ChatGPT
8. Fine-Tuning OpenAI GPT Models
9. Shattering the Black Box with Interpretable Tools
10. Investigating the Role of Tokenizers in Shaping Transformer Models
11. Leveraging LLM Embeddings as an Alternative to Fine-Tuning
12. Toward Syntax-Free Semantic Role Labeling with ChatGPT and GPT-4
13. Summarization with T5 and ChatGPT
14. Exploring Cutting-Edge LLMs with Vertex AI and PaLM 2
15. Guarding the Giants: Mitigating Risks in Large Language Models
16. Beyond Text: Vision Transformers in the Dawn of Revolutionary AI
17. Transcending the Image-Text Boundary with Stable Diffusion
18. Hugging Face AutoTrain: Training Vision Models without Coding
19. On the Road to Functional AGI with HuggingGPT and its Peers
20. Beyond Human-Designed Prompts with Generative Ideation
21. Other Books You May Enjoy
22. Index
Appendix: Answers to the Questions

Summary

The paradigm shift triggered by ChatGPT compelled us to redefine what an NLP task is. We saw that ChatGPT, like other LLMs, can perform tasks it was not explicitly trained for, including many SuperGLUE tasks, through emergent abilities. We also explored the outputs of the attention heads to bridge the gap between the model's numerical calculations and the sequences of words it produces.
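As a minimal sketch of how attention-head outputs can be inspected (not the chapter's exact notebook; the checkpoint name and input sentence are placeholders), Hugging Face Transformers can return the attention weights directly:

```python
# Sketch: inspecting attention heads with Hugging Face Transformers.
# Assumes a BERT-style checkpoint; any model that exposes attentions works.
import torch
from transformers import AutoTokenizer, AutoModel

model_name = "bert-base-uncased"  # placeholder checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name, output_attentions=True)

inputs = tokenizer("Transformers map numbers to words.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# outputs.attentions is a tuple with one tensor per layer,
# each shaped (batch, num_heads, seq_len, seq_len).
attentions = outputs.attentions
print(f"{len(attentions)} layers, {attentions[0].shape[1]} heads per layer")
print(attentions[0][0, 0])  # attention matrix of layer 0, head 0
```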

We then explored how to measure the performance of multi-task transformers. Transformers' ability to obtain top-ranking results on downstream tasks is unique in NLP history. We went through the demanding SuperGLUE tasks that lifted transformers to the top of the GLUE and SuperGLUE leaderboards.
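For reference, this is a hedged sketch of how such scores can be computed with the Hugging Face evaluate library; the predictions and gold labels below are made-up placeholders:

```python
# Sketch: scoring BoolQ predictions the way SuperGLUE does (accuracy),
# assuming the `evaluate` library's super_glue metric configurations.
import evaluate

metric = evaluate.load("super_glue", "boolq")

# Placeholder yes/no predictions vs. gold labels (1 = yes, 0 = no).
predictions = [1, 0, 1, 1]
references = [1, 0, 0, 1]

print(metric.compute(predictions=predictions, references=references))
# -> {'accuracy': 0.75}
```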

BoolQ, CB, WiC, and the many other tasks we covered are by no means easy to process, even for humans. We worked through examples of several downstream tasks that show how difficult it is for transformer models to prove their effectiveness.
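To see why these tasks are demanding, here is a minimal sketch (assuming the Hugging Face datasets library and its super_glue configurations) that prints one BoolQ and one WiC sample:

```python
# Sketch: loading SuperGLUE's BoolQ and WiC tasks with the `datasets` library.
from datasets import load_dataset

boolq = load_dataset("super_glue", "boolq", split="validation")
wic = load_dataset("super_glue", "wic", split="validation")

sample = boolq[0]
print(sample["question"])        # a yes/no question
print(sample["passage"][:200])   # the passage it must be answered from
print(sample["label"])           # 1 = yes, 0 = no

pair = wic[0]
# WiC: does `word` carry the same sense in sentence1 and sentence2?
print(pair["word"], "|", pair["sentence1"], "|", pair["sentence2"], "->", pair["label"])
```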

Transformers have proven their value by outperforming the former...
