Transformers for Natural Language Processing and Computer Vision - Third Edition


Product type: Book
Published: February 2024
Publisher: Packt
ISBN-13: 9781805128724
Pages: 728
Edition: 3rd
Author: Denis Rothman

Table of Contents (24 chapters)

  • Preface
  • What Are Transformers?
  • Getting Started with the Architecture of the Transformer Model
  • Emergent vs Downstream Tasks: The Unseen Depths of Transformers
  • Advancements in Translations with Google Trax, Google Translate, and Gemini
  • Diving into Fine-Tuning through BERT
  • Pretraining a Transformer from Scratch through RoBERTa
  • The Generative AI Revolution with ChatGPT
  • Fine-Tuning OpenAI GPT Models
  • Shattering the Black Box with Interpretable Tools
  • Investigating the Role of Tokenizers in Shaping Transformer Models
  • Leveraging LLM Embeddings as an Alternative to Fine-Tuning
  • Toward Syntax-Free Semantic Role Labeling with ChatGPT and GPT-4
  • Summarization with T5 and ChatGPT
  • Exploring Cutting-Edge LLMs with Vertex AI and PaLM 2
  • Guarding the Giants: Mitigating Risks in Large Language Models
  • Beyond Text: Vision Transformers in the Dawn of Revolutionary AI
  • Transcending the Image-Text Boundary with Stable Diffusion
  • Hugging Face AutoTrain: Training Vision Models without Coding
  • On the Road to Functional AGI with HuggingGPT and its Peers
  • Beyond Human-Designed Prompts with Generative Ideation
  • Other Books You May Enjoy
  • Index
  • Appendix: Answers to the Questions
