Advanced Python Programming - Second Edition

Automatic vectorization for efficient kernels

You might remember from our discussions on NumPy that the library is efficient at applying numerical operations to all elements of an array, or to the elements along specific axes. Because the same operation is applied to many elements, NumPy can dispatch the work to optimized low-level code, making the computation much faster than performing the same task in an explicit Python loop. This process is called vectorization.
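For instance, squaring every element of a large array can be written either as a single vectorized NumPy expression or as an explicit Python loop; the illustrative snippet below (not taken from the book) contrasts the two approaches.

import numpy as np

# One million values to square.
data = np.arange(1_000_000, dtype=np.float64)

# Vectorized: NumPy applies the operation to every element in optimized low-level code.
squares = data ** 2

# Equivalent explicit Python loop: one element at a time, much slower.
squares_loop = np.empty_like(data)
for i in range(data.shape[0]):
    squares_loop[i] = data[i] ** 2

assert np.allclose(squares, squares_loop)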

When working with machine learning models, we would therefore like to vectorize a given function, rather than loop over an array or a matrix, to gain a performance speedup. Vectorization is typically not easy to do by hand and may involve clever tricks to rewrite the function we'd like to vectorize into another form that admits vectorization, as the example below illustrates.
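As an illustration of the kind of rewriting involved (this example is not from the book), computing all pairwise distances between points can be expressed with nested Python loops, or rewritten in terms of NumPy broadcasting so that the whole computation is vectorized:

import numpy as np

def pairwise_distances_loop(points):
    # Naive version: nested Python loops over all point pairs.
    n = points.shape[0]
    out = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            out[i, j] = np.sqrt(np.sum((points[i] - points[j]) ** 2))
    return out

def pairwise_distances_vectorized(points):
    # Rewritten form: broadcasting replaces both loops with array operations.
    diff = points[:, None, :] - points[None, :, :]   # shape (n, n, d)
    return np.sqrt(np.sum(diff ** 2, axis=-1))

points = np.random.rand(100, 3)
assert np.allclose(pairwise_distances_loop(points),
                   pairwise_distances_vectorized(points))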

JAX addresses this concern by providing a function transformation that automatically vectorizes a given...
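The excerpt is cut off at this point. The transformation being described is presumably jax.vmap; the following minimal sketch, using a hypothetical predict function, shows how vmap maps a single-example function over a batch dimension without rewriting the function by hand.

import jax
import jax.numpy as jnp

def predict(weights, x):
    # Hypothetical model applied to a single input vector of shape (3,).
    return jnp.tanh(jnp.dot(weights, x))

weights = jnp.ones((4, 3))
batch = jnp.ones((8, 3))          # a batch of 8 input vectors

# jax.vmap maps predict over the leading axis of `batch` while keeping
# `weights` fixed (in_axes=(None, 0)), producing a vectorized version
# of the function automatically.
batched_predict = jax.vmap(predict, in_axes=(None, 0))
print(batched_predict(weights, batch).shape)   # (8, 4)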
