
Chapter 5. Teleoperate a Robot Using Hand Gestures

As you all know, robots can be controlled mainly in the following modes:

  • Manual: In manual control, the robot is operated directly by a human, typically using a remote controller or a teach pendant.

  • Semiautonomous: A semiautonomous robot has both manual and autonomous control. For simple tasks, it can work autonomously, but for complex tasks it may switch to manual mode.

  • Fully autonomous: An autonomous robot has complete control over its actions and can decide for itself. It can learn and adapt, and almost everything is handled by the robot itself.

We can choose the mode of robot control based on our application. In this chapter, we are mainly discussing the implementation of manual robot control, which we can call remote control or teleoperation. In teleoperation, the robot and the human operator can be far apart, and the operator may not be able to see the real robot moving but may get some visual feedback. Rather than manual control...

Teleoperating ROS Turtle using a keyboard


This section is for beginners who haven't worked with teleoperation in ROS yet. Here, we will see how to teleoperate a robot manually using a keyboard: the keyboard keys translate and rotate the robot. One of the basic examples demonstrating keyboard teleoperation is ROS turtlesim.

The following commands launch turtlesim with keyboard teleoperation. Run each command in a separate terminal.

Run roscore:

$ roscore

Run a turtlesim node using the following command. This command will launch the turtlesim window:

$ rosrun turtlesim turtlesim_node

Run the keyboard teleoperation node. We can change the turtle's position by pressing arrow keys on the keyboard:

$ rosrun turtlesim turtle_teleop_key

A screenshot of the turtle being moved with the arrow keys is shown here:

Figure 1: Turtlesim keyboard teleoperation
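
Under the hood, the turtle_teleop_key node simply publishes velocity commands as geometry_msgs/Twist messages on the /turtle1/cmd_vel topic, and the turtlesim node subscribes to them. The following minimal Python sketch (an illustration of the same idea, not code from the book) drives the turtle without a keyboard; it assumes roscore and turtlesim_node are already running:

    #!/usr/bin/env python
    # Minimal sketch: publish Twist messages on /turtle1/cmd_vel, which is what
    # turtle_teleop_key does when you press the arrow keys.
    import rospy
    from geometry_msgs.msg import Twist

    rospy.init_node('turtle_twist_demo')
    pub = rospy.Publisher('/turtle1/cmd_vel', Twist, queue_size=10)
    rate = rospy.Rate(10)  # publish at 10 Hz

    twist = Twist()
    twist.linear.x = 1.0   # forward speed
    twist.angular.z = 0.5  # turning rate

    while not rospy.is_shutdown():
        pub.publish(twist)
        rate.sleep()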

In ROS, most robot packages are bundled with a teleop node for manual control of the robot. This control can be either through...

Teleoperating using hand gestures


The idea of this project is to convert IMU orientation into linear and angular velocities for the robot. Here is the overall structure of the project.

Figure 2: Basic structure of the gesture teleop project

For the IMU, we are using the MPU-9250 (https://www.invensense.com/products/motion-tracking/9-axis/mpu-9250/). The IMU is interfaced with an Arduino board using the I2C protocol. The orientation values from the IMU are computed on the Arduino and sent to the PC through the rosserial protocol. The orientation values are received on the PC side as ROS topics and converted into twist messages by a ROS node.

Here is the project block diagram with the MPU 9250 and Arduino board:

Figure 3: Functional block diagram of the robot teleop project

We are using a hand glove in which an Arduino board is fixed in the palm area and an MPU-9250 is fixed on the finger area, as shown in the following image:

Figure 4: Hand glove with Arduino and MPU-9250...

Setting up the project


Let's set up the project. To complete it, you will need the following electronic components; the component names and purchase links are listed in the following table:

No | Name                             | Link
---|----------------------------------|------------------------------------------
1  | Arduino Mega 2560 with USB cable | https://www.sparkfun.com/products/11061
2  | MPU-9250 breakout                | https://amzn.com/B00OPNUO9U
3  | Male-to-female jumper wires      | https://amzn.com/B00PBZMN7C
4  | Hand glove                       | https://amzn.com/B00WH4NXLA

You can use any Arduino board that supports I2C communication. You can also use the MPU-6050 or MPU-9150, both of which are compatible with this project. A few words about the MPU-9250 IMU: it is a 9-axis motion-tracking device consisting of a gyroscope, an accelerometer, and a compass. The MPU-6050/9150/9250 models have an inbuilt Digital Motion Processor (DMP), which can fuse the accelerometer, gyroscope, and magnetometer values to get accurate 6-DOF/9-DOF motion components. In this project, we are only using the yaw and pitch rotation components.

Note

If you want...

Interfacing the MPU-9250 with the Arduino and ROS


The first step in this project is to interface the IMU with the Arduino to read the rotation values and send them to ROS. We are essentially making an Arduino-ROS node that receives IMU values and publishes the yaw, pitch, and roll, as well as the transform (TF) corresponding to the IMU movement, as ROS topics.

The following figure shows the interfacing of the IMU with the Arduino over the I2C protocol:

Figure 7: Interfacing the MPU-9250/9150/6050 with the Arduino

The connection from Arduino to MPU-9250 is shown in this table:

Arduino pins  | MPU-9250 pins
--------------|--------------
5V            | VCC
GND           | GND
SCL (21)      | SCL
SDA (20)      | SDA
Digital pin 2 | INT

To start working with IMU values in ROS, we have to create an Arduino-ROS node that receives the IMU values and publishes them as ROS topics. I hope you have already set up the Arduino IDE on your system. To run this code, you will need the Arduino library for the MPU-9250. Note that you can use the MPU...

Visualizing IMU TF in Rviz


In this section, we are going to visualize the TF data from the Arduino in Rviz. Here is the procedure to do that.

Plug the Arduino into the PC and find its serial port. To get topics from the Arduino-ROS node, we have to start a ROS serial server on the PC, listening on the Arduino's serial port. We did this in Chapter 4, Controlling Embedded Boards Using ROS, but let's look at the commands again here.

Start roscore first:

$ roscore

Start the ROS serial server:

$ rosrun rosserial_python serial_node.py /dev/ttyACM0

You can get the following topics when you run the previous node:

Figure 9: Listing ROS topics from Arduino

You can simply echo these topics or visualize the TF data in Rviz. You can run Rviz using the following command; the base_link frame is set as the fixed frame, and we can specify it on the command line itself.

$ rosrun rviz rviz -f base_link

The Rviz window will pop up, and if there is no TF option on the left-hand side of Rviz...

Converting IMU data into twist messages


If you are able to see the visualization in Rviz, you are done with the interfacing. The next step is to convert the IMU orientation into command velocities as ROS twist messages. For this, we have to create a ROS package and a Python script. You can get this package from chapter_5_codes/gesture_teleop; look for a script called gesture_teleop.py in the gesture_teleop/scripts folder.

If you want to create the package from scratch, here is the command:

$ catkin_create_pkg gesture_teleop rospy roscpp std_msgs sensor_msgs geometry_msgs
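
After creating the package, build and source your workspace so that the new package becomes visible to ROS. This assumes a standard catkin workspace at ~/catkin_ws; adjust the path to match your setup:

$ cd ~/catkin_ws
$ catkin_make
$ source devel/setup.bash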

Now let's look at the explanation of gesture_teleop.py, which performs the conversion from IMU orientation values to twist commands.

In this code, we basically subscribe to the /imu_data topic and extract only the yaw and pitch values. When these values change in the positive or negative direction, a step value is added to or subtracted from the linear and angular velocity variables. The resultant velocity is sent using...
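
Although the full gesture_teleop.py is available in the book's code, the following stripped-down sketch illustrates the core idea. It is an illustration only: it assumes the Arduino publishes the orientation on /imu_data as a geometry_msgs/Vector3 (y = pitch, z = yaw, in degrees) and uses made-up step and dead-band values; the actual script's message type, topic names, and parameters may differ:

    #!/usr/bin/env python
    # Illustrative sketch of the IMU-orientation-to-twist conversion.
    import rospy
    from geometry_msgs.msg import Twist, Vector3

    STEP = 0.02        # velocity increment per update (assumed value)
    DEAD_BAND = 10.0   # degrees of tilt ignored around the neutral pose (assumed)

    linear_vel = 0.0
    angular_vel = 0.0

    def imu_callback(msg):
        global linear_vel, angular_vel
        # Tilting the hand forward/backward (pitch) steps the linear velocity
        if msg.y > DEAD_BAND:
            linear_vel += STEP
        elif msg.y < -DEAD_BAND:
            linear_vel -= STEP
        # Rotating the hand left/right (yaw) steps the angular velocity
        if msg.z > DEAD_BAND:
            angular_vel += STEP
        elif msg.z < -DEAD_BAND:
            angular_vel -= STEP
        twist = Twist()
        twist.linear.x = linear_vel
        twist.angular.z = angular_vel
        pub.publish(twist)

    rospy.init_node('gesture_teleop_sketch')
    # The command velocity topic comes from the teleop_topic parameter set in the launch file
    teleop_topic = rospy.get_param('teleop_topic', '/cmd_vel')
    pub = rospy.Publisher(teleop_topic, Twist, queue_size=10)
    rospy.Subscriber('/imu_data', Vector3, imu_callback)
    rospy.spin()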

Integration and final run


We are almost done! But how do we test this teleop tool? We can create launch files that start all these nodes and work with a robot simulation. The gesture_teleop/launch folder has three launch files. Let's take a look at them.

The gesture_teleop.launch file is a generic launch file that can be used for any robot. The only thing we need to edit is the command velocity topic. Here is the definition of this launch file:

    <launch>
      <!-- Serial port of the Arduino board. The port argument is used below, so it
           must be declared; the default value here is an assumption -->
      <arg name="port" default="/dev/ttyACM0"/>

      <param name="teleop_topic" value="/cmd_vel"/>
      <rosparam command="load"
                file="$(find gesture_teleop)/config/teleop_config.yaml"/>

      <node name="rosserial_server_node" pkg="rosserial_python"
            type="serial_node.py" args="$(arg port)" output="screen"/>

      <node name="gesture_teleop_node" pkg="gesture_teleop"
            type="gesture_teleop.py" output="screen"/>
    </launch>

This launch file...
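
With the port argument shown above, a typical way to bring everything up (a usage sketch, assuming the Arduino is on /dev/ttyACM0) is:

$ roslaunch gesture_teleop gesture_teleop.launch port:=/dev/ttyACM0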

Teleoperating using an Android phone


If it is difficult to build the preceding circuit and set everything up, there is an easier way to do it with your Android phone. You can control the robot manually using either a virtual joystick or the tilt of the phone.

Here is the Android application you can use for this:

https://play.google.com/store/apps/details?id=com.robotca.ControlApp.

The application's name is ROS Control. You can also search for it on the Google Play Store.

Here is the procedure to connect your Android phone to a ROS environment:

First, connect both your PC and your Android device to the same local Wi-Fi network so that they can communicate with each other using their IP addresses.

After connecting to the same network, start roscore on the PC side. You can note the IP address of the PC by entering the ifconfig command.

Figure 13: Retrieving the IP address of a PC with ifconfig
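
It can also help to export the ROS networking variables on the PC before starting roscore, so that nodes running on the phone can reach the master and receive topics back. This is a sketch assuming the PC's address reported by ifconfig is 192.168.1.101; substitute your own address:

$ export ROS_IP=192.168.1.101
$ export ROS_MASTER_URI=http://192.168.1.101:11311
$ roscore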

  1. After obtaining the IP address of the PC, you can start the app and create a robot configuration...

Questions


  • What are the main modes of controlling a differential drive robot?

  • What is the twist message in ROS for?

  • What is DMP and what is the use of DMP in this project?

  • How can we teleoperate a robot from an Android phone?

Summary


This chapter was about building a gesture-based teleoperation project for a ROS-based robot. We used an IMU to detect gestures and interfaced it with the Arduino to read the IMU values. The Arduino is interfaced with ROS using the rosserial protocol. The PC runs a ROS node that converts the IMU orientation into linear and angular velocities and sends them as a twist message. This twist message can be used with any robot just by changing the teleop topic name. We also visualized the IMU orientation data in Rviz using the TF data from the Arduino. If building this circuit is too difficult, we can use an Android app called ROS Control, which can move the robot using the phone's inbuilt IMU.

In the next chapter, we'll be dealing with 3D object recognition using ROS.
