Chapter 6. Working with Robotic Sensors
In the previous chapter, we looked at interfacing some actuators for our service robot. The next important topic we need to cover is the robotic sensors used in this robot.
We are using sensors in this robot to measure the distance to obstacles, to obtain the robot's odometry data, and for robotic vision and acoustics.
The ultrasonic distance sensors and IR proximity sensors are used to detect obstacles and avoid collisions. Vision sensors, such as the Kinect, acquire 3D data about the environment and are used for visual odometry, object detection, and collision avoidance. Audio devices, such as speakers and microphones, are used for speech recognition and synthesis.
We are not covering the interfacing of vision and audio sensors in this chapter, because we will discuss them and their interfacing in detail in the upcoming chapter.
Working with ultrasonic distance sensors
One of the most important features of a mobile robot is navigation. Ideal navigation means that a robot can plan a path from its current position to its destination and follow it without hitting any obstacles. We use ultrasonic distance sensors in this robot to detect objects in close proximity that cannot be detected by the Kinect sensor. A combination of the Kinect and ultrasonic sensors provides ideal collision avoidance for this robot.
Ultrasonic distance sensors work in the following manner. The transmitter sends out a burst of ultrasonic sound, which is inaudible to human ears, and then waits for an echo of the transmitted wave. If no echo returns, there is no obstacle in front of the robot. If the receiver picks up an echo, a pulse is generated on the receiver, from which we can calculate the total time the wave took to travel to the object and return to the receiver. Given this time and the speed of sound in air, we can calculate the distance to the obstacle.
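The time-of-flight calculation described above can be sketched in a few lines. This is an illustrative Python sketch of the math only (the actual sensor interfacing runs as Energia code on the LaunchPad); the speed of sound used here is an assumed value for air at about 20 °C:

```python
# Speed of sound in air at ~20 degrees C (assumed value), in cm per microsecond.
SPEED_OF_SOUND_CM_PER_US = 0.0343

def echo_to_distance_cm(echo_duration_us):
    """Convert a round-trip echo time in microseconds to distance in cm.

    The wave travels to the obstacle and back, so the one-way distance
    is half of the total distance covered by the sound wave.
    """
    return (echo_duration_us * SPEED_OF_SOUND_CM_PER_US) / 2.0

# An echo pulse of about 580 microseconds corresponds to roughly 10 cm.
print(round(echo_to_distance_cm(580), 1))
```

Note that the speed of sound varies slightly with temperature, so a production implementation may want to compensate for that.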
Working with the IR proximity sensor
Infrared sensors are another way to detect obstacles and measure their distance from the robot. Infrared distance sensors work on the principle that infrared light is reflected back from a surface when it hits an obstacle. An IR receiver captures the reflected light, and a voltage is produced based on the amount of light received.
One of the popular IR range sensors is the Sharp GP2D12; the product link is as follows:
http://www.robotshop.com/en/sharp-gp2y0a21yk0f-ir-range-sensor.html
The following figure shows the Sharp GP2D12 sensor:
The sensor sends out a beam of IR light and uses triangulation to measure the distance. The detection range of the GP2D12 is between 10 cm and 80 cm. The beam is 6 cm wide at a distance of 80 cm. The transmission and reflection of the IR light are illustrated in the following figure:
On the left of the sensor is an IR transmitter, which continuously emits IR radiation. After hitting an object, the IR light is reflected back and captured by the IR receiver.
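Because of the triangulation geometry, the GP2D12's output voltage is a nonlinear function of distance, so the analog reading has to be linearized in software. The Python sketch below uses one commonly cited empirical power-law fit (distance ≈ 27.86 × V⁻¹·¹⁵); the coefficients vary between individual units, so treat them as assumptions that need per-sensor calibration:

```python
def gp2d12_distance_cm(voltage):
    """Approximate distance (cm) from the GP2D12 analog output voltage.

    Uses an empirical power-law fit; the coefficients are assumed and
    should be calibrated against the actual sensor. Returns None when
    the result falls outside the sensor's 10-80 cm detection range,
    where the reading is not reliable.
    """
    if voltage <= 0:
        return None
    distance = 27.86 * voltage ** -1.15
    if distance < 10 or distance > 80:
        return None  # outside the reliable detection range
    return distance

# At an output of 1.0 V, this fit gives about 27.9 cm.
print(round(gp2d12_distance_cm(1.0), 1))
```

A high voltage (object closer than 10 cm) or a very low voltage (object beyond 80 cm) yields `None`, which the obstacle-avoidance logic can treat as "no valid reading".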
Working with the Inertial Measurement Unit
An Inertial Measurement Unit (IMU) is an electronic device that measures velocity, orientation, and gravitational forces using a combination of accelerometers, gyroscopes, and magnetometers. IMUs have many applications in robotics, such as balancing Unmanned Aerial Vehicles (UAVs) and robot navigation.
In this section, we will discuss the role of the IMU in mobile robot navigation, some of the latest IMUs on the market, and their interfacing with Launchpad.
An IMU provides acceleration and orientation relative to inertial space. If you know the initial position, velocity, and orientation, you can calculate the current velocity by integrating the sensed acceleration, and a second integration gives the position. To get the correct heading of the robot, its orientation is required; this can be obtained by integrating the angular velocity sensed by the gyroscope.
The following figure illustrates an inertial navigation...
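The double integration described above can be sketched as a simple planar dead-reckoning loop. This is an illustrative Python sketch, not code from the robot; the function name and the simple Euler integration scheme are my own choices, and a real system would also have to correct for sensor drift:

```python
import math

def dead_reckon(x, y, heading, speed, samples, dt):
    """Integrate IMU samples into a 2D pose by simple Euler integration.

    samples is a sequence of (forward_accel_m_s2, yaw_rate_rad_s) pairs
    taken every dt seconds. Gyro and accelerometer biases are ignored
    here, so errors accumulate over time in practice.
    """
    for accel, yaw_rate in samples:
        heading += yaw_rate * dt              # integrate angular velocity -> orientation
        speed += accel * dt                   # first integration: acceleration -> velocity
        x += speed * math.cos(heading) * dt   # second integration: velocity -> position
        y += speed * math.sin(heading) * dt
    return x, y, heading, speed

# One second of constant 1 m/s^2 forward acceleration with no turning:
x, y, heading, speed = dead_reckon(0.0, 0.0, 0.0, 0.0, [(1.0, 0.0)] * 100, 0.01)
print(round(speed, 3), round(x, 3))  # speed reaches 1 m/s, x is about 0.5 m
```

Because every sample's noise is accumulated by the integration, dead reckoning alone drifts quickly; this is why IMU data is usually fused with odometry or vision in practice.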
Interfacing MPU 6050 to Launchpad with DMP support using Energia
In this section, we will see the interfacing code for the MPU 6050 with the DMP activated, which can directly give us orientation values as a quaternion or as yaw, pitch, and roll. These values can be applied directly in our robotic application.
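To make the DMP's quaternion output concrete, the following Python sketch shows the standard conversion from a unit quaternion to yaw, pitch, and roll (the aerospace Z-Y-X convention). The Energia MPU 6050 library provides its own equivalent helper on the LaunchPad; this sketch only illustrates the underlying math:

```python
import math

def quaternion_to_ypr(w, x, y, z):
    """Convert a unit quaternion (w, x, y, z) to (yaw, pitch, roll)
    in radians, using the aerospace Z-Y-X Euler angle convention."""
    yaw = math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
    # Clamp the asin argument to guard against rounding just outside [-1, 1].
    pitch = math.asin(max(-1.0, min(1.0, 2.0 * (w * y - z * x))))
    roll = math.atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y))
    return yaw, pitch, roll

# A rotation of 90 degrees about the vertical (z) axis:
half = math.pi / 4
yaw, pitch, roll = quaternion_to_ypr(math.cos(half), 0.0, 0.0, math.sin(half))
print(round(math.degrees(yaw)))  # 90
```

Quaternions avoid the gimbal-lock problem of Euler angles, which is why the DMP outputs them natively; the conversion above is only needed when human-readable angles are required.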
The following section of code imports all the necessary header files for interfacing and creates an MPU6050 object, as in the previous code:
The following code declares and initializes the variables used to handle the DMP:
In this chapter, we have seen some robotic sensors that can be used in our robot: ultrasonic distance sensors, IR proximity sensors, and IMUs. These three sensors help in the navigation of the robot. We also discussed the basic code to interface these sensors to the Tiva C LaunchPad. We will see more on interfacing vision and audio sensors using Python in the next chapter.