Device Control Using Gestures

Knobs, buttons, levers, and touchscreens have long dominated the controls world as the primary ways in which a user interacts with an embedded device. These tactile interfaces aren't the only way to interact with a device, though. In recent years, new sensors and technologies have made it possible to build touchless interfaces that rely on hand movements and gestures. These gesture-based controls can be a far more intuitive and natural way to interact with a device.

In this chapter, we will examine how to integrate a gesture controller into an embedded device so that we can control the device using gestures.

The following topics will be covered in this chapter:

  • An introduction to gesture controllers
  • Gesture controller requirements
  • Gesture controller hardware and software design
  • Constructing a gesture controller
  • Testing gesture controller applications...

Technical requirements

The example code for this chapter can be found at https://github.com/PacktPublishing/MicroPython-Projects/tree/master/Chapter07.

In order to run the examples, you will need to have the following hardware and software:

  • A MicroPython-supported development board
  • An Adafruit APDS-9960 breakout board
  • A prototyping breadboard
  • Wire jumpers
  • Four LEDs with appropriately sized resistors
  • A terminal application (PuTTY, RealTerm, Terminal, or one of many others)
  • A text editor, such as Sublime Text

Introducing gesture controllers

Gesture controllers give developers the ability to create unique interfaces for their embedded products, allowing users to interact with a device in a hands-free way.

Gesture technology can vary dramatically in both its capabilities and the technology that drives it. For example, a low-end system can take advantage of an infrared light-emitting diode (IR LED) and a photodiode for less than $10, whereas a higher-end system, such as the Leap Motion controller or the discontinued Microsoft Kinect, might cost several hundred dollars. High-end solutions often use several cameras, including an IR camera, to capture motion and then break it down into a gesture.

For most readers, integrating the Leap Motion controller, or another typically USB-based gesture controller, is going to be outside their price range and will also require quite a bit of development time...

Gesture controller requirements

The main purpose of this project is to build a cost-effective gesture controller that we can integrate into an embedded device and use to control the device. Our device should be able to control relays, send a message out over Wi-Fi or Bluetooth, or perform any number of other actions. With this project, we want to set up the building blocks by making each detected gesture turn an LED on for five seconds and then turn it off. The LED that lights up will correspond to the gesture that was detected, which will also be printed to the terminal. Let's now look at our hardware and software requirements.

Hardware requirements

The hardware requirements for our gesture controller are much stricter...

Hardware and software design

The requirements for this project give us a very concrete direction concerning the hardware and software, but at the same time, there is quite a bit of wiggle room as to how exactly we implement our architecture. In this section, we are going to develop the hardware and software architecture that we will use to build our gesture controller.

The gesture hardware architecture

There are just three major components that we need to be concerned with in the hardware architecture:

  • The MicroPython development board
  • The APDS-9960
  • The LEDs

Just as in the previous projects, we can power the MicroPython board through a USB connector, at least during development. As we saw in the last lab, if you are...
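Before moving on to the software, it is worth confirming the wiring with a quick check from the REPL. The sketch below simply scans the I2C bus and looks for the APDS-9960, which responds at address 0x39; the bus number here is an assumption and will depend on which pins you wired the breakout board to on your particular development board.

from machine import I2C

i2c = I2C(1)  # bus number and pins vary by board; I2C(1) is a common choice on STM32 boards

devices = i2c.scan()
print('I2C devices found:', [hex(addr) for addr in devices])

if 0x39 in devices:  # 0x39 is the APDS-9960's fixed I2C address
    print('APDS-9960 detected')
else:
    print('APDS-9960 not found - check the wiring and pull-up resistors')

If the scan comes back empty, there is no point in debugging the driver yet; fix the wiring first.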

Constructing the gesture controller

We are going to look at building the gesture controller in several different chunks. First, we are going to explore the theory behind how the APDS-9960 works. Once we understand how it works, we will then develop the APDS-9960 driver that is shown in the class diagram in The software architecture section. Finally, we will write our high-level application that uses the class. At that point, we will be ready to test the controller. Let's get started!
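The class diagram itself is not reproduced in this excerpt, so the skeleton below only suggests the general shape such a driver might take; the class name, register helpers, and method names are illustrative placeholders rather than the exact API developed in the chapter.

class APDS9960:
    """Illustrative gesture-sensor driver skeleton (placeholder API)."""

    ADDRESS = 0x39  # fixed I2C address of the APDS-9960

    def __init__(self, i2c):
        self._i2c = i2c

    def _write_reg(self, reg, value):
        # Write a single configuration byte to one of the device registers
        self._i2c.writeto_mem(self.ADDRESS, reg, bytes([value]))

    def _read_reg(self, reg, length=1):
        # Read one or more bytes back from a device register
        return self._i2c.readfrom_mem(self.ADDRESS, reg, length)

    def enable_gestures(self):
        """Power the device and enable the proximity and gesture engines."""
        ...

    def gesture_available(self):
        """Return True when the gesture FIFO holds valid data."""
        ...

    def read_gesture(self):
        """Drain the FIFO and return 'up', 'down', 'left', or 'right' (or None)."""
        ...

Keeping the register reads and writes behind private helpers like these is what lets the application code stay free of device-specific details.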

The APDS-9960 theory of operation

The APDS-9960 has four directional photodiodes, which are used to detect the reflected infrared light that is generated by integrated IR LEDs. The reflected light can be used to sense motion, such as...
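As a rough illustration of how those four photodiode channels can be turned into a direction, the simplified sketch below compares the first and last samples pulled from the sensor's gesture FIFO; the threshold and the sign conventions are illustrative only, and the driver developed later in the chapter does considerably more filtering.

def guess_direction(samples, threshold=13):
    """Rough direction estimate from a list of (up, down, left, right) samples.

    samples holds photodiode readings from the gesture FIFO, oldest first.
    The change in the up/down and left/right differences between the first
    and last samples hints at which way the hand moved across the sensor.
    """
    if len(samples) < 2:
        return None

    first, last = samples[0], samples[-1]
    ud_delta = (last[0] - last[1]) - (first[0] - first[1])
    lr_delta = (last[2] - last[3]) - (first[2] - first[3])

    if abs(ud_delta) >= abs(lr_delta):
        if abs(ud_delta) > threshold:
            return 'up' if ud_delta > 0 else 'down'
    elif abs(lr_delta) > threshold:
        return 'left' if lr_delta > 0 else 'right'
    return None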

Testing the gesture controller

The code for this project can again be found at https://github.com/PacktPublishing/MicroPython-Projects/tree/master/ch7.

Download the code and then copy it to your development board. If you aren't using an STM32L475 IoT Discovery node, you may need to modify the LED pins or the I2C bus you are using, but otherwise, the application should run without any other issues.

Once the application and the APDS-9960 module are copied to your MicroPython board, press Ctrl + D in the REPL. This will perform a soft reboot and start the application. You can now present the APDS-9960 with a gesture. If you swipe right, you should see Right! in the REPL, along with one of your LEDs turning on. If you swipe left, you'll see Left! and the LED associated with it will turn on. The LEDs should turn off within 5 seconds. If you find this is too long, change...
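If you just want to see where that five-second value lives, a stripped-down version of the application loop might look something like the sketch below. The apds9960 module name, the driver methods, and the LED pin names are placeholders for whatever the repository actually uses, and LED_ON_TIME_S is the constant to shorten if five seconds feels too long.

import time
from machine import I2C, Pin
from apds9960 import APDS9960        # placeholder module and class names

i2c = I2C(1)                         # use the bus your wiring calls for
sensor = APDS9960(i2c)
sensor.enable_gestures()

LED_ON_TIME_S = 5                    # how long an LED stays lit after a gesture
GESTURE_LEDS = {
    'left':  Pin('LED1', Pin.OUT),   # placeholder pin names; match your board
    'right': Pin('LED2', Pin.OUT),
    'up':    Pin('LED3', Pin.OUT),
    'down':  Pin('LED4', Pin.OUT),
}

while True:
    if sensor.gesture_available():
        gesture = sensor.read_gesture()      # for example, 'left' or 'right'
        led = GESTURE_LEDS.get(gesture)
        if led is not None:
            print(gesture)                   # the chapter's code prints Right!, Left!, and so on
            led.on()
            time.sleep(LED_ON_TIME_S)
            led.off()
    time.sleep_ms(50)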

Summary

In this chapter, we explored how to build a gesture controller using the Avago APDS-9960. We saw that the APDS-9960 is a very sophisticated device but, through a carefully crafted software architecture, we were able to abstract this complexity into a few simple calls in our application code. We also looked at how to parse incoming gesture data. You can easily expand upon our gesture controller to add additional functionality, such as light sensing and proximity detection.

In the next chapter, we will shift gears and look at how we can build an automation and control device with MicroPython and an Android-capable tablet.

Questions

  1. What are the technologies that are typically used in gesture control applications?
  2. What four main gestures were covered in this chapter?
  3. What three analog engines are provided in the APDS-9960?
  4. What is the difference between a driver and an integrated application module?
  5. What method was used to determine the gesture direction?