You're reading from TinyML Cookbook - Second Edition

Published in: Nov 2023
Publisher: Packt
ISBN-13: 9781837637362
Edition: 2nd Edition

Author: Gian Marco Iodice

Gian Marco Iodice is a team and tech lead in the Machine Learning Group at Arm and co-created the Arm Compute Library in 2017. The Arm Compute Library is currently the most performant library for ML on Arm, and it's deployed on billions of devices worldwide, from servers to smartphones. Gian Marco holds an MSc degree with honors in electronic engineering from the University of Pisa (Italy) and has several years of experience developing ML and computer vision algorithms on edge devices. He now leads ML performance optimization on Arm Mali GPUs. In 2020, Gian Marco co-founded the TinyML UK meetup group to encourage knowledge sharing and to educate and inspire the next generation of ML developers on tiny and power-efficient devices.
Preface

This book is about tinyML, the technology that brings intelligence to low-power devices, such as microcontrollers, by running machine learning (ML) on them in a minimally intrusive way.

This technology has been around for many years, for example, in smartwatches, intelligent assistants, and drones, to name a few. Today, however, it is experiencing incredible growth across all market segments because of the continued success in reducing the complexity of ML model deployment, the proliferation of low-cost devices with extraordinary computing capabilities, and the invaluable contributions of the open-source community. Therefore, tinyML is not a niche technology designed by a few people to solve a few technological problems. Instead, it is a technology in the hands of many developers for solving big real-world problems.

tinyML is an exciting field full of opportunities. With a few tens of dollars, you can bring to life objects that interact intelligently with the environment and transform how we live for the better. However, this field can be challenging for those unfamiliar with microcontroller programming. Therefore, this book aims to break down these barriers and demonstrate, through practical examples, that tinyML is for everyone.

Whether you are new to this field or looking to expand your ML knowledge, this improved second edition of TinyML Cookbook has something for everyone. Each chapter is structured as a self-contained project that teaches you how to use some of the key tinyML technologies, such as Arduino, CMSIS-DSP, Edge Impulse, emlearn, the Raspberry Pi Pico SDK, TensorFlow, TensorFlow Lite for Microcontrollers, and Zephyr.

Your practical journey into tinyML will start with an introduction to this multidisciplinary field and get you up to speed with the fundamentals of deploying applications on microcontrollers. For example, you will tackle problems you may encounter while prototyping with microcontrollers, such as controlling an LED or reading the state of a push-button using the GPIO peripheral.

After preparing for microcontroller programming, you will focus on tinyML projects using real-world sensors. Here, you will employ the temperature, humidity, and three “V” sensors (Voice, Vision, and Vibration) to implement end-to-end smart applications in different scenarios and learn best practices for building models for memory-constrained microcontrollers.

This second edition includes new recipes featuring an LSTM neural network to recognize music genres and the Edge Impulse Faster-Objects-More-Objects (FOMO) algorithm for detecting objects in a scene. These will help you stay updated with the latest developments in the tinyML community.

Finally, you will take your tinyML solutions to the next level with TVM, Arm Ethos-U55 microNPU, on-device learning, and the scikit-learn model deployment on microcontrollers.

TinyML Cookbook is a practical book with a focus on the principles. Although most of the presented projects are based on the Arduino Nano 33 BLE Sense and Raspberry Pi Pico, this second edition also features the SparkFun RedBoard Artemis Nano to help you practice the learned principles on an alternative microcontroller.

Therefore, by the end of this book, you will be well versed in best practices and ML frameworks to develop ML applications easily on microcontrollers.

Who this book is for

Whether you are an enthusiast or professional with a basic familiarity with ML and an interest in developing ML applications on microcontrollers through practical examples, this book is for you.

TinyML Cookbook will help you expand your knowledge of tinyML by building end-to-end projects with real-world data sensors on the Arduino Nano 33 BLE Sense, Raspberry Pi Pico, and SparkFun RedBoard Artemis Nano.

While familiarity with C/C++, Python programming, and the command-line interface (CLI) is required, no prior knowledge of microcontrollers is necessary.

What this book covers

Chapter 1, Getting Ready to Unlock ML on Microcontrollers, provides an overview of tinyML, presenting the opportunities and challenges of bringing ML to extremely low-power microcontrollers. This chapter focuses on the fundamental elements of ML, power consumption, and microcontrollers that make this technology different from conventional ML in the cloud, on desktops, or even on smartphones.

Chapter 2, Unleashing Your Creativity with Microcontrollers, presents recipes covering the relevant microcontroller programming basics. We will look at code debugging and how to transmit data to the Arduino serial monitor. The transmitted data will be captured in a log file and uploaded to our cloud storage in Google Drive. Afterward, we will delve into programming the GPIO peripheral using the Arm Mbed API and use a solderless breadboard to connect external components, such as LEDs and push-buttons.

Chapter 3, Building a Weather Station with TensorFlow Lite for Microcontrollers, teaches us how to implement a simple weather station with ML to predict the occurrence of snowfall based on the temperature and humidity of the last three hours. In the first part, we will focus on dataset preparation and show how to acquire historical weather data from WorldWeatherOnline. After preparing the dataset, we will see how to train a neural network with TensorFlow and quantize the model to 8-bit with TensorFlow Lite. In the last part, we will deploy the model on the Arduino Nano 33 BLE Sense and Raspberry Pi Pico with TensorFlow Lite for Microcontrollers.
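To give a flavor of the 8-bit quantization step mentioned above, the following minimal NumPy sketch (the function names are illustrative, not the book's code) shows the affine mapping between float values and int8 that underpins TensorFlow Lite's post-training quantization:

```python
import numpy as np

def quantize_int8(values):
    # Affine (asymmetric) quantization: real ≈ scale * (q - zero_point)
    lo, hi = float(values.min()), float(values.max())
    lo, hi = min(lo, 0.0), max(hi, 0.0)  # the range must include zero
    scale = (hi - lo) / 255.0            # map [lo, hi] onto 256 int8 levels
    zero_point = int(round(-128 - lo / scale))
    q = np.clip(np.round(values / scale) + zero_point, -128, 127).astype(np.int8)
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    # Recover approximate float values from the int8 representation
    return scale * (q.astype(np.float32) - zero_point)

x = np.array([-1.0, 0.0, 0.5, 1.0], dtype=np.float32)
q, scale, zero_point = quantize_int8(x)
x_hat = dequantize(q, scale, zero_point)  # close to x, within one scale step
```

In practice, TensorFlow Lite's converter derives scale and zero-point parameters like these per tensor from a representative dataset, as shown in the Conventions used section later in this preface.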

Chapter 4, Using Edge Impulse and the Arduino Nano to Control LEDs with Voice Commands, shows how to develop an end-to-end keyword spotting (KWS) application with Edge Impulse and the Arduino Nano 33 BLE Sense board. The chapter will begin with dataset preparation, showing how to acquire audio data with a mobile phone and the built-in microphone on the Arduino Nano. Next, we will design a model based on the popular Mel Filterbank Energy (MFE) features for speech recognition. In these recipes, we will show how to extract these features from audio samples, train the machine learning (ML) model, and optimize the performance with the Edge Impulse EON Tuner. At the end of the chapter, we will concentrate on deploying the KWS application.

Chapter 5, Recognizing Music Genres with TensorFlow and the Raspberry Pi Pico – Part 1, is the first part of a project to recognize three music genres from recordings obtained with a microphone connected to Raspberry Pi Pico. The music genres we will classify are disco, jazz, and metal. Since the project offers many learning opportunities, it is split into two chapters to give as much exposure to the technical aspects as possible. Here, we will focus on the dataset preparation and the analysis of the feature extraction technique employed for classifying music genres: the Mel Frequency Cepstral Coefficients (MFCCs).

Chapter 6, Recognizing Music Genres with TensorFlow and Raspberry Pi Pico – Part 2, is the continuation of Chapter 5 and discusses how the target device influences the implementation of the MFCCs feature extraction. We will start our discussion by tailoring the MFCCs implementation for Raspberry Pi Pico.

Here, we will learn how fixed-point arithmetic can help minimize the latency and show how the CMSIS-DSP library provides tremendous support in employing this limited numerical precision in feature extraction. After reimplementing the extraction of the MFCCs using fixed-point arithmetic, we will design an ML model capable of recognizing music genres with a Long Short-Term Memory (LSTM) recurrent neural network (RNN). Finally, we will test the model accuracy on the test dataset and deploy a music genre classification application on Raspberry Pi Pico with the help of TensorFlow Lite for Microcontrollers.
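To give a flavor of the fixed-point arithmetic discussed in this chapter, here is a minimal Q15 sketch in plain Python (the helper names are illustrative; the book's recipes rely on the CMSIS-DSP library instead):

```python
# Q15 fixed-point: a real value x in [-1, 1) is stored as round(x * 2**15)
# in a 16-bit signed integer, so arithmetic needs only integer operations.

Q15_ONE = 1 << 15  # 32768

def to_q15(x: float) -> int:
    # Convert a float in [-1, 1) to Q15, saturating at the int16 limits
    return max(-32768, min(32767, int(round(x * Q15_ONE))))

def q15_mul(a: int, b: int) -> int:
    # Multiply two Q15 numbers: the 30-bit product is shifted back to Q15
    return (a * b) >> 15

def from_q15(q: int) -> float:
    return q / Q15_ONE

a = to_q15(0.5)       # 16384
b = to_q15(0.25)      # 8192
prod = q15_mul(a, b)  # 0.125 in Q15
```

Integer-only arithmetic like this is what makes feature extraction fast on chips such as the Raspberry Pi Pico's RP2040, whose Cortex-M0+ cores have no floating-point unit.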

Chapter 7, Detecting Objects with Edge Impulse using FOMO on the Raspberry Pi Pico, showcases the deployment of an object detection application on microcontrollers using Edge Impulse and the Faster Objects, More Objects (FOMO) ML algorithm. The chapter will begin with dataset preparation, demonstrating how to acquire images with a webcam and label them in Edge Impulse. Next, we will design an ML model based on the FOMO algorithm. In this part, we will explore the architectural features of this novel ML solution that allows us to deploy object detection on highly constrained devices. Subsequently, we will test the model using the Edge Impulse Live classification tool and then on the Raspberry Pi Pico.

Chapter 8, Classifying Desk Objects with TensorFlow and the Arduino Nano, demonstrates the benefit of adding sight to our tiny devices by classifying two desk objects with the OV7670 camera module in conjunction with the Arduino Nano 33 BLE Sense board. In the first part, we will learn how to acquire images from the OV7670 camera module. Then, we will focus on the model design, applying transfer learning with the Keras API to recognize two objects we typically find on a desk: a mug and a book. Finally, we will deploy the quantized TensorFlow Lite model on an Arduino Nano 33 BLE Sense with the help of TensorFlow Lite for Microcontrollers.

Chapter 9, Building a Gesture-Based Interface for YouTube Playback with Edge Impulse and the Raspberry Pi Pico, teaches us how to use accelerometer measurements with ML to recognize three hand gestures with Raspberry Pi Pico. These recognized gestures will then be used to play/pause, mute/unmute, and change YouTube videos on our PC. The development of this project will start by acquiring the accelerometer data to build the gesture recognition dataset. In this part, we will learn how to interface with the I2C protocol and use the Edge Impulse data forwarder tool. Next, we will focus on the Impulse design, where we will build a spectral-features-based feed-forward neural network for gesture recognition. Finally, we will deploy the model on the Raspberry Pi Pico and implement a Python script with the PyAutoGUI library to build a touchless interface for YouTube video playback.

Chapter 10, Deploying a CIFAR-10 Model for Memory-Constrained Devices with the Zephyr OS on QEMU, demonstrates how to build an image classification application with TensorFlow Lite for Microcontrollers for an emulated Arm Cortex-M3 microcontroller. To accomplish our task, we will start by installing the Zephyr OS, the primary framework used in this chapter. Next, we will design a tiny quantized CIFAR-10 model with TensorFlow. This model will be capable of running on a microcontroller with only 256 KB of program memory and 64 KB of RAM. Ultimately, we will deploy an image classification application on an emulated Arm Cortex-M3 microcontroller through Quick Emulator (QEMU).

Chapter 11, Running ML Models on Arduino and the Arm Ethos-U55 microNPU Using Apache TVM, explores how to leverage Apache TVM to deploy a quantized CIFAR-10 TensorFlow Lite model in various scenarios. After introducing Arduino CLI, we will present TVM by showing how to generate C code from an ML model and how to run it on a machine hosting the Colab environment. In this chapter, we will also discuss the ahead-of-time (AoT) executor, a crucial feature of TVM that can help reduce the program memory usage of the final application. Then, we will delve into running the model on the Arduino Nano 33 BLE Sense and Raspberry Pi Pico and discuss how to compile a sketch from the code generated by TVM. Finally, we will explore the model deployment on a Micro-Neural Processing Unit (microNPU).

Chapter 12, Enabling Compelling tinyML Solutions with On-Device Learning and scikit-learn on the Arduino Nano and Raspberry Pi Pico, aims to answer three likely questions you might be pondering to bring your tinyML projects to the next level. The first question will delve into the feasibility of training models directly on microcontrollers. In this part, we will discuss the backpropagation algorithm to train a shallow neural network. We will also show how to use the CMSIS-DSP library to accelerate its implementation on any microcontroller with an Arm Cortex-M CPU. After discussing on-device learning, we will tackle another problem: deploying scikit-learn models to microcontrollers. In this second part, we will demonstrate how to deploy generic ML algorithms trained with scikit-learn using the emlearn open-source project. The final question we will answer is about powering microcontrollers with batteries.

To get the most out of this book

For most of the chapters, with the exception of Chapter 10, Deploying a CIFAR-10 Model for Memory-Constrained Devices with the Zephyr OS on QEMU, you will need a computer (either a laptop or desktop) running Linux (preferably Ubuntu 20.04+), macOS, or Windows operating systems on an x86_64 architecture. Additionally, your computer should have a minimum of two USB ports.

In Chapter 10, you will specifically require a computer running either Linux (preferably Ubuntu 20.04+) or macOS on an x86_64 architecture.

It is worth noting that most projects can also be developed on Macs powered by Apple silicon, such as M1 or M2 chips. However, at the time of writing, there is no support for the SparkFun RedBoard Artemis Nano on Apple silicon devices.

The only software prerequisites for your computer are:

  • Python (Python 3.7+)
  • A text editor (for example, gedit on Ubuntu)
  • A media player (for example, VLC)
  • An image viewer (for example, the default image viewer in your OS)
  • A web browser (for example, Google Chrome)

During our tinyML journey, we will require different software tools to cover ML development and embedded programming. Thanks to Arduino, Edge Impulse, and Google, these tools are cloud-based, run in the browser, and offer a free plan for our usage.

You can develop projects on the Arduino Nano 33 BLE Sense and Raspberry Pi Pico directly in your web browser using the Arduino Web Editor (https://create.arduino.cc). However, at the time of writing, the Arduino Web Editor has a limit of 25 compilations per day. Therefore, you may consider upgrading to any paid plan or using the free local Arduino IDE (https://www.arduino.cc/en/software) to get unlimited compilations. For those interested in the free local Arduino IDE, we have provided the instructions to install the local Arduino IDE on GitHub (https://github.com/PacktPublishing/TinyML-Cookbook_2E/tree/main/Docs/setup_local_arduino_ide.md).

For projects involving the SparkFun RedBoard Artemis Nano, you must use the local Arduino IDE. You can find the setup instructions for developing projects on this microcontroller by following this link: https://github.com/PacktPublishing/TinyML-Cookbook_2E/blob/main/Docs/setup_sparkfun_artemis_nano.md.

The projects we will develop together require sensors and additional electronic components to build realistic tinyML prototypes and experience the complete development workflow. These components are listed at the beginning of each chapter and in the README.md file within the corresponding chapter folder on GitHub.

Since we will build real electronic circuits, we require an electronic components kit with at least a solderless breadboard, colored LEDs, resistors, push-buttons, and jumper wires. Don’t worry if you are a beginner in electronics. You will learn more about these components in the first two chapters of this book. Furthermore, we have prepared a beginner shopping list on GitHub so you know precisely what to buy: https://github.com/PacktPublishing/TinyML-Cookbook_2E/tree/main/Docs/shopping_list.md.

Download the example code files

The code bundle for the book is hosted on GitHub at https://github.com/PacktPublishing/TinyML-Cookbook_2E.

We also have other code bundles from our rich catalog of books and videos available at https://github.com/PacktPublishing/. Check them out!

Download the color images

We also provide a PDF file that has color images of the screenshots/diagrams used in this book. You can download it here: https://packt.link/gbp/9781837637362.

Conventions used

There are a number of text conventions used throughout this book.

CodeInText: Indicates code words in text, database table names, folder names, filenames, file extensions, pathnames, dummy URLs, user input, and Twitter handles. For example: “To do so, import the os Python module to use its listdir() method, which lists all the files in a specified directory.”

A block of code is set as follows:

def representative_data_gen():
  data = tf.data.Dataset.from_tensor_slices(x_test)
  for i_value in data.batch(1).take(100):
    i_value_f32 = tf.dtypes.cast(i_value, tf.float32)
    yield [i_value_f32]

Any command-line input or output is written as follows:

$ arduino-cli core install arduino:mbed_nano

Bold: Indicates a new term, an important word, or words that you see on the screen. For instance, words in menus or dialog boxes appear in the text like this. For example: “Scan the quick response (QR) code with your smartphone to pair the device with Edge Impulse.”

Warnings or important notes appear like this.

Tips and tricks appear like this.

Get in touch

Feedback from our readers is always welcome.

General feedback: Email feedback@packtpub.com and mention the book’s title in the subject of your message. If you have questions about any aspect of this book, please email us at questions@packtpub.com.

Errata: Although we have taken every care to ensure the accuracy of our content, mistakes do happen. If you have found a mistake in this book, we would be grateful if you reported this to us. Please visit http://www.packtpub.com/submit-errata, click Submit Errata, and fill in the form.

Piracy: If you come across any illegal copies of our works in any form on the internet, we would be grateful if you would provide us with the location address or website name. Please contact us at copyright@packtpub.com with a link to the material.

If you are interested in becoming an author: If there is a topic that you have expertise in and you are interested in either writing or contributing to a book, please visit http://authors.packtpub.com.

Share your thoughts

Once you’ve read TinyML Cookbook, Second Edition, we’d love to hear your thoughts! Please visit the Amazon review page for this book and share your feedback.

Your review is important to us and the tech community and will help us make sure we’re delivering excellent quality content.

Download a free PDF copy of this book

Thanks for purchasing this book!

Do you like to read on the go but are unable to carry your print books everywhere? Is your eBook purchase not compatible with the device of your choice?

Don’t worry, now with every Packt book you get a DRM-free PDF version of that book at no cost.

Read anywhere, any place, on any device. Search, copy, and paste code from your favorite technical books directly into your application. 

The perks don’t stop there. You can get exclusive access to discounts, newsletters, and great free content in your inbox daily.

Follow these simple steps to get the benefits:

  1. Scan the QR code or visit the link below:

https://packt.link/free-ebook/9781837637362

  2. Submit your proof of purchase
  3. That’s it! We’ll send your free PDF and other benefits to your email directly
