Chapter 9. Flying a Mission with Crazyflie
Robots are fun and sometimes frustrating to program. Quadrotors are particularly difficult to control because of the number of flight factors involved and the complexity of the flight software needed to manage them. Quadrotors are currently being tested as surveillance platforms and as delivery vehicles for packages and fast food. In this chapter, we will explore programming quadrotors to fly to specific destinations. This application may be handy for delivering coffee and paperwork around the office. We will begin by using a barebones quadrotor and an inexpensive depth camera to sense the quadrotor's location.
This chapter will highlight the use of ROS communication to coordinate the locations of the quadrotor and the target. A Kinect sensor will be used to visualize the environment and the position of the quadrotor in it to coordinate its landing at a marked location. ROS tf transforms and pose messages will be generated to identify the reference frames...
The components we will use in this mission include a Crazyflie 2.0 quadrotor, a Crazyradio PA, a Kinect for Windows v2 sensor, and a workstation computer. Chapter 7, Making a Robot Fly, describes the Crazyflie and Crazyradio and their operation. Chapter 4, Navigating the World with TurtleBot, is a good introduction to a depth sensor such as the Kinect v2. It is recommended to review these chapters before beginning this mission.
Kinect v2 is an infrared time-of-flight depth sensor that operates at a higher resolution than the Kinect for Xbox 360. The modulated infrared beam measures how long the light takes to travel to the object and back, providing a more accurate measurement. This sensor has improved performance in dark rooms and in sunny outdoor conditions. With a horizontal field of view (FOV) of 70 degrees and a vertical FOV of 60 degrees, the infrared sensor can accurately detect distances ranging from 0.5 to 4.5 meters (20 inches...
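As a quick check on these figures, simple trigonometry gives the size of the viewable area at a given depth. The function below is only an illustration of the geometry, not part of the mission code:

```python
import math

def view_extent(depth_m, fov_deg):
    """Width (or height) of the field of view at a given depth: 2 * z * tan(FOV / 2)."""
    return 2.0 * depth_m * math.tan(math.radians(fov_deg) / 2.0)

# At the Kinect v2's maximum rated depth of 4.5 m:
horizontal = view_extent(4.5, 70.0)  # roughly 6.3 m wide
vertical = view_extent(4.5, 60.0)    # roughly 5.2 m tall
```

This is why a single fixed Kinect v2 can cover a usefully large flight volume for an indoor quadrotor mission.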
Loading software for the mission
Part of the software needed to perform this cooperative mission has been installed in previous chapters: the ROS software installation of the ros-indigo-desktop-full configuration is described in the Installing and launching ROS section of Chapter 1, Getting Started with ROS, and the installation of Crazyflie ROS software is described in the Loading Crazyflie ROS software section of Chapter 7, Making a Robot Fly.
Software for the Kinect v2 to interface with ROS requires the installation of two items: libfreenect2 and iai_kinect2. The following sections provide the details of these installations.
The libfreenect2 software provides an open-source driver for Kinect v2. This driver does not support the original Kinect or the Kinect for Xbox 360; libfreenect is the driver for those sensors. Libfreenect2 provides for the image transfer of RGB and depth as well as the combined registration of RGB and depth. Image registration aligns the color and depth images for the same scene into one reference image...
For the Kinect, our workstation computer requires us to run Protonect prior to using the kinect2_bridge software. If you have trouble launching the kinect2_bridge software, use the following command before you begin:
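Protonect is typically run from the libfreenect2 build directory. The path below assumes libfreenect2 was cloned and built under your home directory, so adjust it to match your installation:

```shell
~/libfreenect2/build/bin/Protonect
```

Protonect should open windows showing the live color, depth, and IR streams from the sensor.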
Verify that Protonect shows color, depth, and IR images and that none of the screens are black. Be aware that Protonect has three optional parameters: cl (for OpenCL), gl (for OpenGL), or cpu (for CPU support). These options can be useful for testing the Kinect v2 operation.
If Protonect has successfully brought up the Kinect image, press Ctrl + C to close this window. The kinect2_bridge and kinect2_viewer should then work properly until the system is restarted.
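As a sketch of those next steps, the iai_kinect2 tools are typically started along these lines; the exact launch arguments depend on your setup, so treat the commands below as a starting point rather than a definitive recipe:

```shell
roslaunch kinect2_bridge kinect2_bridge.launch
rosrun kinect2_viewer kinect2_viewer
```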
Next, we must determine how to identify our robots within the frame of the Kinect image.
Detecting Crazyflie and a target
For our Crazyflie and target location, we have prepared markers to uniquely identify them in our lab environment. For the Crazyflie, we...
As you have seen throughout this book, the cmd_vel topic (a geometry_msgs/Twist message) is the common control method for ROS robots, whether they drive on the ground or fly in the air. For TurtleBot, the mobile_base_commands/velocity and cmd_vel_mux/input/navi topics are used to move the base. For Crazyflie, the crazyflie/cmd_vel topic is published to control the flight of the quadrotor.
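To make the message concrete, the sketch below uses a stand-in for geometry_msgs/Twist to show the fields a crazyflie/cmd_vel message carries. In the Crazyflie ROS driver the Twist fields are commonly repurposed for attitude control (linear.x for pitch, linear.y for roll, linear.z for thrust, angular.z for yaw rate); verify this mapping against the driver you are running:

```python
# Stand-in classes mirroring the shape of geometry_msgs/Twist,
# so this sketch runs without a ROS installation.
class Vector3:
    def __init__(self, x=0.0, y=0.0, z=0.0):
        self.x, self.y, self.z = x, y, z

class Twist:
    def __init__(self):
        self.linear = Vector3()
        self.angular = Vector3()

def hover_command(thrust):
    """A zero-attitude command with only thrust set (driver-specific units)."""
    msg = Twist()
    msg.linear.z = thrust
    return msg

cmd = hover_command(45000.0)
```

In the real system, a rospy publisher on crazyflie/cmd_vel would send such messages at a steady rate; the Crazyflie firmware expects a continuous command stream.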
Within the crazyflie_autonomous package, the crazyflie_controller node (control_crazyflie.py) determines the Crazyflie's control state and publishes the crazyflie/cmd_vel topic. To launch the crazyflie_controller node, the control_crazyflie.launch file is used. This launch file also launches the crazyflie_window node, which observes the Crazyflie and takes action when it flies near the edge of the Kinect image frame. The function of this node is described in the subsequent section, Using an observer mode.
The crazyflie_controller node has five states of flight...
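A controller like this is naturally organized as a small state machine: each state only permits certain transitions, and the controller publishes different cmd_vel behavior in each. The state names below are illustrative placeholders, not the exact names from the package:

```python
# A minimal sketch of a five-state flight state machine of the kind
# control_crazyflie.py implements. State names here are hypothetical.
TRANSITIONS = {
    "idle":    ["takeoff"],
    "takeoff": ["hover", "land"],
    "hover":   ["flight", "land"],
    "flight":  ["hover", "land"],
    "land":    ["idle"],
}

def next_state(current, requested):
    """Move to the requested state only if that transition is allowed."""
    return requested if requested in TRANSITIONS.get(current, []) else current
```

Encoding the allowed transitions in a table keeps the control loop simple: on each cycle the node checks for a requested transition, updates the state, and publishes the command appropriate to that state.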
Now we are finally ready to fly our mission. Making Crazyflie fly to a target first requires that the quadrotor can be controlled to hover in place. Once this task is successful, the next step is to fly to a stationary target. We will introduce the steps to accomplish these tasks in the next sections.
The first step in controlling Crazyflie's flight is demonstrating that the quadrotor can hover in one location. To start the process, use the launch command:
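One plausible invocation, using the launch file named earlier in this chapter (confirm the file name against your copy of the crazyflie_autonomous package):

```shell
roslaunch crazyflie_autonomous control_crazyflie.launch
```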
Then, turn on Crazyflie and let it run through its startup routine. When it is complete, type in a second terminal window:
The hover mission can be started by pushing the Takeoff (blue) button on the Xbox 360 controller. After Crazyflie has achieved takeoff, the quadrotor will begin to receive cmd_vel (geometry_msgs/Twist) messages to stay in its same location with respect to the...
The aim of this chapter was to stretch your knowledge of ROS by implementing an advanced practical experience that identifies and highlights some of ROS's advantages. A ROS system of nodes was created to visualize the environment in which a Crazyflie quadrotor was seen and controlled. The Kinect for Windows v2 depth camera was used to visualize this environment, and ROS nodes handled the detection of markers on the Crazyflie and the target. The location of the Crazyflie was identified in Cartesian coordinates (x, y, z), with the x and y values referring to the quadrotor's position in the image frame and z referring to its distance from the camera. These coordinates were converted into a tf transform and published. The target location was published in a message by a separate ROS node.
The advantage of ROS's layered tf and message passing is that lower-level details are left to be handled by other dedicated nodes. The tf transform for the Crazyflie was used by a controller node to apply PID control...
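The PID idea itself is compact: on each axis, the error between the target position and the Crazyflie's measured position drives a velocity command. The minimal controller below is a generic sketch with arbitrary example gains, not the tuned gains from the chapter's controller node:

```python
# A minimal PID controller of the kind applied per axis (x, y, z).
# Gains are illustrative; real flight requires careful tuning.
class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        """Return the control output for the current error and timestep."""
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

pid = PID(kp=1.0, ki=0.1, kd=0.05)
out = pid.update(error=0.5, dt=0.02)  # 0.5 m position error, 50 Hz loop
```

In practice, one such controller runs per axis inside the control loop, and the three outputs are packed into the Twist message published on crazyflie/cmd_vel.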