You're reading from Learn Robotics Programming - Second Edition

Product type: Book
Published in: Feb 2021
Publisher: Packt
ISBN-13: 9781839218804
Edition: 2nd Edition
Author: Danny Staple

Danny Staple builds robots and gadgets as a hobbyist, makes videos about his work with robots, and attends community events such as PiWars and Arduino Day. He has been a professional Python programmer, later moving into DevOps, since 2009, and a software engineer since 2000. He has worked with embedded systems, including embedded Linux systems, throughout the majority of his career. He has been a mentor at a local CoderDojo, where he taught how to code with Python. He has run Lego Robotics clubs with Mindstorms. He has also developed Bounce!, a visual programming language targeted at teaching code using the NodeMCU IoT platform. The robots he has built with his children include TankBot, SkittleBot (now the Pi Wars robot), ArmBot, and SpiderBot.

Chapter 17: Controlling the Robot with a Phone and Python

The robot we've been programming has many behaviors, but running some of them leaves it stopped on the other side of the room. You could write code to return it to you, but this could get complicated. We also have a neat camera giving visual feedback on what the robot is doing. Wouldn't it be fun to take control and drive the robot yourself sometimes?

We've been launching commands to drive our robot from a Secure Shell (SSH) terminal, but the robot will be more exciting and more comfortable to demonstrate if we can start the commands from a menu. We can build upon the web application programming interface (API) code you made in Chapter 15, Voice Communication with a Robot Using Mycroft.

In this chapter, we will see how to create a menu system, designed for a phone, to choose behaviors. We will then use the touch surface to build a control system, with the camera...

Technical requirements

For this chapter, you will need the following items:

  • Your Raspberry Pi robot with the camera set up and the code from previous chapters
  • A touchscreen device such as a phone with Wi-Fi
  • A wireless network

The GitHub code for this chapter is at https://github.com/PacktPublishing/Learn-Robotics-Programming-Second-Edition/tree/master/chapter17.

Use the 0_starting_point folder on GitHub to find the complete code from the previous chapters, and the full_system folder for this chapter's full code.

Check out the following video to see the code in action: https://bit.ly/2Kb7rp8

When speech control won't work – why we need to drive

In Chapter 15, Voice Communication with a Robot Using Mycroft, we built a Mycroft system to launch behaviors. If you have tried to build intents to make the robot stop in time or drive left or right, you will probably have noticed that it takes some time to respond, even with the clearest speech.

Speech control also only really works in a quiet room. If your robot is outside, which is exactly where you might like to drive it, this is not useful.

Mycroft is also utterly dependent on having access to the internet. It is one thing to have a small shared network for a robot and a controller; it's another to always require internet access, which can become tricky when not at your home, school, or lab.

Using an SSH session to log in to a robot and then typing commands to start and stop behaviors works well during testing, but it can be slow and cumbersome. In demonstration conditions, mistyping a command or just restarting...

Choosing a controller — how we are going to drive the robot, and why

We want to control our robot with something that is handheld and wireless; trailing a wire to our robot would make little sense. Having seen how our robot drives in Chapter 7, Drive and Turn – Moving Motors with Python, we will want a control system that directly affects the wheels.

One way to do this would be to use a Bluetooth joypad. There are a large number of these on the market, many of which require specialist drivers to read, and Bluetooth has a habit of dropping pairings at inopportune times.

Some joypads use a custom wireless dongle; these are far more reliable than Bluetooth but have a dongle that doesn't fit very nicely on the robot.

However, you already have a handheld device in your pocket: your phone. It has a touchscreen capable of reading finger movements. With the right code, you can display the video between control bars, creating a kind of robotic periscope...

Preparing the Raspberry Pi for remote driving — get the basic driving system going

Our Raspberry Pi can already run web services, using Flask to create a menu server and video servers. We can use image and control queues to let a behavior interact with a web server. We are going to reuse these capabilities. In the phone app, the slider controls will need some logic of their own. The next diagram shows the parts of our manual drive system:

Figure 17.8 – The system overview of a manual drive app

The dashed boxes in Figure 17.8 show where the code is running: the top dashed box is code running on the phone, and the lower box is code running on the Raspberry Pi in the robot. Inside the dashed boxes, the boxes with solid outlines are blocks of code or systems our code will need. At the bottom layer of Figure 17.8, the Robot box accepts the stop motors and set motor speed calls, which come from the Behavior box based on timeouts or the...
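To make the Behavior box in Figure 17.8 concrete, here is a minimal sketch of the queue-and-timeout idea. This is not the book's actual code: the class, the motor method names on the robot, and the one-second timeout are assumptions for illustration.

```python
import queue
import time


class DriveBehavior:
    """Drain drive instructions from a queue; stop the motors if none arrive in time."""

    TIMEOUT = 1.0  # seconds of silence before stopping (assumed value)

    def __init__(self, robot):
        self.robot = robot
        self.instructions = queue.Queue()
        self.last_seen = time.time()

    def handle_instruction(self, name, value):
        # Two illustrative commands; a real behavior would handle more.
        if name == "set_left":
            self.robot.set_left(value)
        elif name == "set_right":
            self.robot.set_right(value)

    def process_once(self):
        """One iteration of the control loop (separated out so it is testable)."""
        try:
            name, value = self.instructions.get(timeout=0.1)
            self.last_seen = time.time()
            self.handle_instruction(name, value)
        except queue.Empty:
            # Safety net: if the phone stops sending commands, stop the robot.
            if time.time() - self.last_seen > self.TIMEOUT:
                self.robot.stop_motors()

    def run(self):
        while True:
            self.process_once()
```

The web server's request handlers only ever put items on the queue, so the behavior loop stays the single owner of the motors, and a dropped Wi-Fi connection simply lets the timeout fire.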

Making the robot fully phone-operable

The goal here is to let us drive the robot entirely from the phone. We need to ensure that the robot is ready to run when we turn it on, and that the menu is usable from a phone. The menu we made earlier isn't very touch-friendly, and it will not successfully run any of the behaviors that use Flask for their displays. We will make the menu buttons bigger and more touch-friendly, using styles similar to our manual drive behavior. After clicking a behavior that has its own server, such as this one or the last chapter's visual tracking behaviors, the menu will also load that server's page.

Let's fix the Flask behaviors first.

Making menu modes compatible with Flask behaviors

If you've already tried running Flask-based behaviors (such as those with a camera) from the control server, you will have noticed some very odd behavior. Your behavior will appear to do the right thing with sensors on the robot, but the...
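A behavior that runs its own Flask server cannot comfortably share the menu server's process; one common pattern is to launch each behavior as a separate OS process and then send the browser on to that behavior's own page. The sketch below illustrates that pattern under assumptions: ProcessRunner and the python3 invocation are invented for the sketch, not taken from the book's code.

```python
import subprocess


class ProcessRunner:
    """Run one behavior at a time as a child process of the menu server."""

    def __init__(self):
        self.current = None

    def run(self, script_name):
        self.stop()  # only one behavior at a time
        # Each behavior (e.g. a manual drive script) is launched as its own
        # interpreter, so a Flask server inside it gets its own process and
        # does not clash with the menu's Flask instance.
        self.current = subprocess.Popen(["python3", script_name])

    def stop(self):
        if self.current:
            self.current.terminate()
            self.current.wait()
            self.current = None
```

The menu's /run handler would call run(), and a /stop handler would call stop() before returning to the menu page.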

Making the menu start when the Pi starts

You now have a menu system launching robot behaviors. Using SSH to log in is great for debugging, seeing problems, and fixing them. However, when you want to demonstrate your robot, an SSH session becomes inconvenient.

Ideally, you would turn on the robot, wait for a light to come on, and then point your phone browser at it to control it.

We are going to do two things to make this useful, as follows:

  • Use an LED to indicate that the robot is ready (in menu mode), so it can tell us it's up before our phone has connected to the page
  • Use systemd to start the menu Flask server automatically when we turn on the robot
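The systemd part might look something like the following unit file. This is a sketch, not the book's exact file: the service name, paths, and user are assumptions about where your code lives.

```ini
[Unit]
Description=Robot menu web server
# Wait until networking is up so the Flask server can be reached
After=network.target

[Service]
# Assumed layout: menu_server.py lives in /home/pi/robot
WorkingDirectory=/home/pi/robot
ExecStart=/usr/bin/python3 menu_server.py
User=pi
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

Saved as something like /etc/systemd/system/menu_server.service (an assumed name), it would be enabled with sudo systemctl enable menu_server.service so the server starts on every boot.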

Let's get stuck in with the lights.

Adding lights to the menu server

We don't want the whole robot class loaded in our menu server, but it can use the lights to indicate that our robot is ready. We will import the LED system, turn a light on as the server starts, and then turn it off and release it when the first /run request arrives...
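That start-up/first-request pattern could be sketched like this. The StatusLed class here is a stand-in for the book's real LED interface, which is an assumption for illustration; only the Flask wiring is the point.

```python
from flask import Flask

app = Flask(__name__)


class StatusLed:
    """Tiny stand-in for the robot's LED interface (assumed API)."""

    def __init__(self):
        self.lit = False

    def on(self):
        self.lit = True   # real code would light an LED on the robot here

    def off(self):
        self.lit = False  # and release the LED hardware here


led = StatusLed()
led.on()  # light up as soon as the menu server starts


@app.route("/run/<mode_name>")
def run_mode(mode_name):
    # A /run request means a phone has found the menu, so hand the LEDs back.
    led.off()
    return f"{mode_name} running"
```

Turning the light off on the first request matters because the behavior being launched may want the LEDs for itself.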

Summary

This chapter added a small menu system to our robot to start different modes from a connected web browser.

You've seen how to drive a robot from a mobile phone and how to create interesting-looking animated widgets with SVG and JavaScript.

Your robot has now gained the ability to be driven manually. It may take you a while to get used to handling it; manually correcting for veer (motors behaving slightly differently) is more challenging than when the PID systems correct it for you. Still, you will gain skill in driving it with your phone, and you can use the camera on the front of the robot to get a robot's-eye view of the world.

You've turned the control server into a menu server and then made that start automatically when you turn on the robot. You've also seen how to connect your menu server to the video-server apps such as manual driving, color-tracking, or face-tracking apps. By making the buttons more touch-friendly on the menu server, you...

Exercises

You could enhance the system in many ways. Here are some suggestions for building further:

  1. In the manual_drive.py file, the handle_instruction function uses a bunch of if statements to handle the instruction. If this list of command handlers exceeds five, you could improve it by using a dictionary (such as menu_modes) and then calling different handler methods.
  2. Could you change the touch interface into two circular pads—perhaps so the left controls motor movement and the right changes the camera position?
  3. What about creating phone-friendly interfaces for other behaviors to control their parameters?
  4. You could embellish the CSS by adding round buttons or putting spacing between the buttons.
  5. The menu still uses text buttons. Could you find a way to associate an image with each behavior and make a button grid?
  6. Adding a Shutdown menu button would let you shut down the Pi more gracefully; it would run the sudo poweroff command.
  7. ...
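For the first exercise, the dictionary-based dispatch could look something like this sketch; the handler names here are illustrative, not taken from manual_drive.py:

```python
class DriveBehavior:
    """Dispatch instructions via a dictionary instead of a chain of ifs."""

    def __init__(self):
        self.speed = 0
        # Map instruction names to bound handler methods, much as the
        # menu_modes dictionary maps menu entries to behaviors.
        self.handlers = {
            "faster": self.faster,
            "slower": self.slower,
        }

    def faster(self):
        self.speed = min(100, self.speed + 10)

    def slower(self):
        self.speed = max(0, self.speed - 10)

    def handle_instruction(self, name):
        handler = self.handlers.get(name)
        if handler is None:
            raise ValueError(f"Unknown instruction: {name}")
        handler()
```

Adding a new command then means writing one method and one dictionary entry, instead of growing an if/elif chain.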

Further reading

To find out more about the topics covered in this chapter, here are some suggestions:

  • I highly recommend the Flask API documentation (http://flask.pocoo.org/docs/1.0/api/), both to help understand the Flask functions we've used and to learn other ways to use this flexible web server library.
  • For a more guided look at the Flask web server, I suggest reading Flask By Example, Gareth Dwyer, Packt Publishing (https://www.packtpub.com/product/flask-by-example/9781785286933), showing you how to build more involved web applications using Flask.
  • I also recommend the book Mastering Flask, Jack Stouffer, Packt Publishing (https://www.packtpub.com/web-development/mastering-flask).
  • The HTML used in this chapter is elementary. To get a more in-depth look into how you could enhance the simple menu system, I recommend the e-learning video guide Beginning Responsive Web Development with HTML and CSS [eLearning], Ben Frain, Cord Slatton-Valle, Joshua Miller...