Chapter 6. Working with Robotic Sensors

In the previous chapter, we saw how to interface some of the actuators for our service robot. The next important topic we need to cover is the robotic sensors used in this robot.

We are using sensors in this robot to find the distance to obstacles, to get the robot's odometry data, and for robot vision and acoustics.

Ultrasonic distance sensors or IR proximity sensors are used to detect obstacles and avoid collisions; vision sensors such as the Kinect acquire 3D data of the environment for visual odometry, object detection, and collision avoidance; and audio devices such as speakers and mics are used for speech recognition and synthesis.

In this chapter, we are not covering the interfacing of vision and audio sensors, because we will discuss them and their interfacing in detail in the next chapter.

Working with ultrasonic distance sensors


One of the most important features of a mobile robot is navigation. Ideal navigation means that the robot can plan its path from its current position to the destination and move without hitting any obstacles. We use ultrasonic distance sensors in this robot to detect objects in close proximity that can't be detected using the Kinect sensor. A combination of the Kinect and ultrasonic sound sensors provides ideal collision avoidance for this robot.

Ultrasonic distance sensors work in the following manner: the transmitter sends out an ultrasonic sound that is not audible to human ears and then waits for an echo of the transmitted wave. If there is no echo, there are no obstacles in front of the robot. If the receiver picks up an echo, a pulse is generated on the receiver, and we can calculate the total time the wave takes to travel to the object and return to the receiver. If we get this time...
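
If we know this round-trip time and the speed of sound in air (roughly 340 m/s, or 0.034 cm per microsecond), the distance is simply speed × time / 2, because the wave travels to the object and back. The following minimal Energia-style sketch illustrates this calculation for a generic HC-SR04-type sensor; the pin names TRIG_PIN and ECHO_PIN and their numbers are hypothetical, and the book's own wiring and code may differ:

//Hypothetical pin assignments for an HC-SR04-type sensor
const int TRIG_PIN = 10;
const int ECHO_PIN = 9;

void setup()
{
  Serial.begin(115200);
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
}

void loop()
{
  //Send a 10 microsecond trigger pulse
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);

  //Measure the width of the echo pulse in microseconds
  unsigned long duration = pulseIn(ECHO_PIN, HIGH);

  //Distance = speed of sound * time / 2
  float distance_cm = (duration * 0.034) / 2.0;

  Serial.println(distance_cm);
  delay(100);
}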

Working with the IR proximity sensor


Infrared sensors are another way to find obstacles and measure their distance from the robot. The principle of an infrared distance sensor is based on the infrared light that is reflected from a surface when it hits an obstacle. An IR receiver captures the reflected light, and a voltage is measured based on the amount of light received.

One of the popular IR range sensors is the Sharp GP2D12; the product link is as follows:

http://www.robotshop.com/en/sharp-gp2y0a21yk0f-ir-range-sensor.html

The following figure shows the Sharp GP2D12 sensor:

The sensor sends out a beam of IR light and uses triangulation to measure the distance. The detection range of the GP2D12 is between 10 cm and 80 cm. The beam is 6 cm wide at a distance of 80 cm. The transmission and reflection of the IR light are illustrated in the following figure:

On the left of the sensor is an IR transmitter, which continuously emits IR radiation; after hitting an object, the IR light will...
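
The GP2D12 outputs an analog voltage that falls nonlinearly as the distance increases, so the ADC reading has to be converted to a distance in software. The following Energia-style sketch is a minimal sketch of this conversion; it assumes the sensor output is wired to a hypothetical analog pin A0 on a LaunchPad with a 12-bit ADC and a 3.3 V reference, and it uses a commonly quoted empirical power-law fit (distance ≈ 27.86 × V^-1.15 cm) rather than the exact datasheet curve:

const int IR_PIN = A0;    //hypothetical analog input pin

void setup()
{
  Serial.begin(115200);
}

void loop()
{
  //Read the raw ADC value (12-bit ADC and 3.3 V reference assumed)
  int raw = analogRead(IR_PIN);
  float volts = raw * (3.3 / 4095.0);

  //Approximate power-law fit for the GP2D12 response
  //(reasonable only within the 10 cm to 80 cm range)
  float distance_cm = 27.86 * pow(volts, -1.15);

  Serial.println(distance_cm);
  delay(100);
}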

Working with the Inertial Measurement Unit


An Inertial Measurement Unit (IMU) is an electronic device that measures velocity, orientation, and gravitational forces using a combination of accelerometers, gyroscopes, and magnetometers. IMUs have many applications in robotics; examples include balancing Unmanned Aerial Vehicles (UAVs) and robot navigation.

In this section, we discuss the role of the IMU in mobile robot navigation, look at some of the latest IMUs on the market, and see how to interface one with the Launchpad.

Inertial Navigation

An IMU provides acceleration and orientation relative to inertial space. If you know the initial position, velocity, and orientation, you can calculate the current velocity by integrating the sensed acceleration, and a second integration gives the position. To get the correct heading of the robot, its orientation is required; this can be obtained by integrating the angular velocity sensed by the gyroscope.
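
In discrete time, this dead reckoning amounts to accumulating the sensed values over each sampling interval. The following fragment is a simplified, one-dimensional sketch of the idea; the variable and function names are illustrative only, and a real implementation would also need bias compensation and gravity removal:

//Simplified dead reckoning from IMU samples (illustrative only)
float velocity = 0.0;   //initial velocity in m/s
float position = 0.0;   //initial position in m
float heading  = 0.0;   //initial orientation in radians

void updateOdometry(float accel, float gyroRate, float dt)
{
  //First integration: acceleration -> velocity
  velocity += accel * dt;

  //Second integration: velocity -> position
  position += velocity * dt;

  //Integrating the gyroscope's angular velocity gives orientation
  heading += gyroRate * dt;
}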

The following figure illustrates an inertial navigation...

Interfacing the MPU 6050 to Launchpad with DMP support using Energia


In this section, we will see the interfacing code for the MPU 6050 with the DMP activated, which gives us orientation values directly as a quaternion or as yaw, pitch, and roll. These values can be applied directly in our robotic applications.

The following section of code includes all the necessary header files for interfacing and creates an MPU6050 object, as in the previous code:

#include "Wire.h"
#include "I2Cdev.h"
#include "MPU6050_6Axis_MotionApps20.h"

//Creating MPU6050 Object
MPU6050 accelgyro(0x68);

The following code declares and initializes the variables used to handle the DMP:

//DMP options
//Set true if DMP initialization was successful
bool dmpReady = false;

//Holds actual interrupt status byte from MPU
uint8_t mpuIntStatus;

//return status after each device operation
uint8_t devStatus;

//Expected DMP packet size
uint16_t packetSize;

//count of all bytes currently in FIFO
uint16_t fifoCount;

//FIFO storage buffer
uint8_t fifoBuffer[64];
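
The rest of the listing sets up the DMP and reads orientation packets from the FIFO. As a sketch of how this typically looks with the i2cdevlib MPU6050 library (not the book's exact code, and with the interrupt handling simplified to polling the FIFO), the setup and loop could be written as follows:

Quaternion q;            //quaternion container [w, x, y, z]
VectorFloat gravity;     //gravity vector
float ypr[3];            //yaw, pitch, and roll in radians

void setup()
{
  Wire.begin();
  Serial.begin(115200);

  //Initialize the sensor and the DMP firmware
  accelgyro.initialize();
  devStatus = accelgyro.dmpInitialize();

  if (devStatus == 0)
  {
    accelgyro.setDMPEnabled(true);
    packetSize = accelgyro.dmpGetFIFOPacketSize();
    dmpReady = true;
  }
}

void loop()
{
  if (!dmpReady) return;

  fifoCount = accelgyro.getFIFOCount();

  if (fifoCount == 1024)
  {
    //FIFO overflow; reset it and wait for the next packet
    accelgyro.resetFIFO();
  }
  else if (fifoCount >= packetSize)
  {
    accelgyro.getFIFOBytes(fifoBuffer, packetSize);

    //Convert the DMP packet into yaw, pitch, and roll
    accelgyro.dmpGetQuaternion(&q, fifoBuffer);
    accelgyro.dmpGetGravity(&gravity, &q);
    accelgyro.dmpGetYawPitchRoll(ypr, &q, &gravity);

    Serial.print(ypr[0] * 180 / PI); Serial.print("\t");
    Serial.print(ypr[1] * 180 / PI); Serial.print("\t");
    Serial.println(ypr[2] * 180 / PI);
  }
}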

Questions


  1. What are ultrasonic sensors and how do they work?

  2. How do you calculate distance from the ultrasonic sensor?

  3. What is the IR proximity sensor and how does it work?

  4. How do you calculate distance from the IR sensor?

  5. What is an IMU and how do you get odometry data from it?

  6. What is the Aided Inertial Navigation system?

  7. What are the main features of MPU 6050?

Summary


In this chapter, we looked at some of the robotic sensors that can be used in our robot. The sensors we discussed are ultrasonic distance sensors, IR proximity sensors, and IMUs. These three sensors help in the navigation of the robot. We also discussed the basic code to interface these sensors with the Tiva C LaunchPad. We will see more about interfacing vision and audio sensors using Python in the next chapter.
